Overview

Rusty Hog is a secret scanner built in Rust for performance and based on TruffleHog, which is written in Python. Rusty Hog provides the following binaries:

  • Ankamali Hog: Scans for secrets in a Google doc.
  • Berkshire Hog: Scans for secrets in an S3 bucket.
  • Choctaw Hog: Scans for secrets in a Git repository.
  • Duroc Hog: Scans for secrets in a directory, file, and archive.
  • Essex Hog: Scans for secrets in a Confluence wiki page.
  • Gottingen Hog: Scans for secrets in a JIRA issue.
  • Hante Hog: Scans for secrets in a Slack channel.

Usage

This project provides a set of scanners that use regular expressions to try to detect the presence of sensitive information, such as API keys, passwords, and personal information. It includes a set of regular expressions by default, but also accepts a JSON object containing your custom regular expressions.

How to install using downloaded binaries

Download and unzip the latest ZIP from the releases tab, then run each binary with -h to see its usage.

wget https://github.com/newrelic/rusty-hog/releases/download/v1.0.11/rustyhogs-darwin-choctaw_hog-1.0.11.zip
unzip rustyhogs-darwin-choctaw_hog-1.0.11.zip
darwin_releases/choctaw_hog -h
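
For example, once unzipped, a hypothetical scan of a public Git repository could look like this (the repository URL is only an illustration):

darwin_releases/choctaw_hog --prettyprint https://github.com/newrelic/rusty-hog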

How to run using DockerHub

Rusty Hog Docker images can be found on the author's personal Docker Hub page (https://hub.docker.com/u/wetfeet2000). A Docker image is built for each Hog and for each release. For example, to use choctaw_hog you would run the following commands:

docker pull wetfeet2000/choctaw_hog:1.0.11
docker run -it --rm wetfeet2000/choctaw_hog:1.0.11 --help

How to build

  • Ensure you have Rust installed and on your path.
  • Clone this repo, and then run cargo build --release. The binaries are located in target/release.
  • To build and view HTML documents, run cargo doc --no-deps --open.
  • To run unit tests, run cargo test.
  • To cross-compile Berkshire Hog for the AWS Lambda environment, run the following commands and upload berkshire_lambda.zip to your AWS Lambda dashboard:
docker run --rm -it -v "$(pwd)":/home/rust/src ekidd/rust-musl-builder cargo build --release
cp target/x86_64-unknown-linux-musl/release/berkshire_hog bootstrap
zip -j berkshire_lambda.zip bootstrap

How to build on Windows

You will need to compile static OpenSSL binaries and tell Rust/Cargo where to find them:

mkdir \Tools
cd \Tools
git clone https://github.com/Microsoft/vcpkg.git
cd vcpkg
.\bootstrap-vcpkg.bat
.\vcpkg.exe install openssl:x64-windows-static

$env:OPENSSL_DIR = 'C:\Tools\vcpkg\installed\x64-windows-static'
$env:OPENSSL_STATIC = 'Yes'
[System.Environment]::SetEnvironmentVariable('OPENSSL_DIR', $env:OPENSSL_DIR, [System.EnvironmentVariableTarget]::User)
[System.Environment]::SetEnvironmentVariable('OPENSSL_STATIC', $env:OPENSSL_STATIC, [System.EnvironmentVariableTarget]::User)

You can now follow the main build instructions listed above.

Ankamali Hog (GDoc Scanner) usage

USAGE:
    ankamali_hog [FLAGS] [OPTIONS] <GDRIVEID>

FLAGS:
        --caseinsensitive    Sets the case insensitive flag for all regexes
        --entropy            Enables entropy scanning
        --oauthsecret        Path to an OAuth secret file (JSON) ./clientsecret.json by default
        --oauthtoken         Path to an OAuth token storage file ./temp_token by default
        --prettyprint        Outputs the JSON in human readable format
    -v, --verbose            Sets the level of debugging information
    -h, --help               Prints help information
    -V, --version            Prints version information

OPTIONS:
    -a, --allowlist <ALLOWLIST>          Sets a custom allowlist JSON file
        --default_entropy_threshold <DEFAULT_ENTROPY_THRESHOLD>
                                         Default entropy threshold (0.6 by default)
    -o, --outputfile <OUTPUT>            Sets the path to write the scanner results to (stdout by default)
        --regex <REGEX>                  Sets a custom regex JSON file

ARGS:
    <GDRIVEID>    The ID of the Google drive file you want to scan
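
For example, a hypothetical run against a single Google Doc could look like this (the file ID is a placeholder, and valid OAuth credentials are expected in ./clientsecret.json by default):

ankamali_hog --entropy --prettyprint 1aBcD2eFgHiJkLmNoPqRsTuVwXyZ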

Berkshire Hog (S3 Scanner - CLI) usage

USAGE:
    berkshire_hog [FLAGS] [OPTIONS] <S3URI> <S3REGION>

FLAGS:
        --caseinsensitive    Sets the case insensitive flag for all regexes
        --entropy            Enables entropy scanning
        --prettyprint        Outputs the JSON in human readable format
    -r, --recursive          Recursively scans files under the prefix
    -v, --verbose            Sets the level of debugging information
    -h, --help               Prints help information
    -V, --version            Prints version information

OPTIONS:
    -a, --allowlist <ALLOWLIST>          Sets a custom allowlist JSON file
        --default_entropy_threshold <DEFAULT_ENTROPY_THRESHOLD>
                                         Default entropy threshold (0.6 by default)
    -o, --outputfile <OUTPUT>            Sets the path to write the scanner results to (stdout by default)
        --profile <PROFILE>              When using a configuration file, enables a non-default profile
        --regex <REGEX>                  Sets a custom regex JSON file

ARGS:
    <S3URI>       The location of a S3 bucket and optional prefix or filename to scan. This must be written in the
                  form s3://mybucket[/prefix_or_file]
    <S3REGION>    Sets the region of the S3 bucket to scan
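
For example, a hypothetical recursive scan of a bucket prefix could look like this (bucket name, prefix, and region are placeholders, and AWS credentials are assumed to be available in your environment):

berkshire_hog --recursive --prettyprint s3://my-bucket/some/prefix us-east-1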

Berkshire Hog (S3 Scanner - Lambda) usage

Berkshire Hog is currently designed to be used as a Lambda function. This is the basic data flow:

    ┌───────────┐              ┌───────┐     ┌────────────────┐     ┌────────────┐
    │ S3 bucket │ ┌────────┐   │       │     │ Berkshire Hog  │     │ S3 bucket  │
    │  (input) ─┼─┤S3 event├──▶│  SQS  │────▶│    (Lambda)    │────▶│  (output)  │
    │           │ └────────┘   │       │     │                │     │            │
    └───────────┘              └───────┘     └────────────────┘     └────────────┘

In order to run Berkshire Hog this way, set up the following:

  1. Configure the input bucket to send an "event" to SQS for each PUSH/PUT event.
  2. Set up the SQS topic to accept events from S3, including IAM permissions.
  3. Run Berkshire Hog with IAM access to SQS and S3.
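
As a rough sketch of step 1 (not taken from this repository), the S3-to-SQS wiring could be done with the AWS CLI roughly as follows; the bucket name, queue name, account ID, and region are placeholders, and the SQS access policy from step 2 must already allow S3 to send messages or the notification call will be rejected:

# Create the queue that will receive S3 object-created events (placeholder name)
aws sqs create-queue --queue-name berkshire-hog-events

# Attach an access policy allowing S3 to send messages to the queue (step 2), then
# point the input bucket's event notifications at that queue
aws s3api put-bucket-notification-configuration \
    --bucket my-input-bucket \
    --notification-configuration '{
        "QueueConfigurations": [{
            "QueueArn": "arn:aws:sqs:us-east-1:123456789012:berkshire-hog-events",
            "Events": ["s3:ObjectCreated:*"]
        }]
    }'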

Choctaw Hog (Git Scanner) usage

USAGE:
    choctaw_hog [FLAGS] [OPTIONS] <GITPATH>

FLAGS:
        --caseinsensitive    Sets the case insensitive flag for all regexes
        --entropy            Enables entropy scanning
        --prettyprint        Outputs the JSON in human readable format
    -v, --verbose            Sets the level of debugging information
    -h, --help               Prints help information
    -V, --version            Prints version information

OPTIONS:
        --default_entropy_threshold <DEFAULT_ENTROPY_THRESHOLD>
                                             Default entropy threshold (4.5 by default)
        --httpspass <HTTPSPASS>              Takes a password for HTTPS-based authentication
        --httpsuser <HTTPSUSER>              Takes a username for HTTPS-based authentication
    -o, --outputfile <OUTPUT>                Sets the path to write the scanner results to (stdout by default)
        --recent_days <RECENTDAYS>           Filters commits to the last number of days (branch agnostic)
    -r, --regex <REGEX>                      Sets a custom regex JSON file
        --since_commit <SINCECOMMIT>         Filters commits based on date committed (branch agnostic)
        --sshkeypath <SSHKEYPATH>            Takes a path to a private SSH key for git authentication, defaults to ssh-agent
        --sshkeyphrase <SSHKEYPHRASE>        Takes a passphrase to a private SSH key for git authentication, defaults to none
        --until_commit <UNTILCOMMIT>         Filters commits based on date committed (branch agnostic)
    -a, --allowlist <ALLOWLIST>              Sets a custom allowlist JSON file

ARGS:
    <GITPATH>    Sets the path (or URL) of the Git repo to scan. SSH links must include username (git@)
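
For example, a hypothetical scan limited to the last 30 days of commits, with results written to a file, could look like this:

choctaw_hog --entropy --prettyprint --recent_days 30 -o choctaw_results.json https://github.com/newrelic/rusty-hog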

Duroc Hog (file system scanner) usage

USAGE:
    duroc_hog [FLAGS] [OPTIONS] <FSPATH>

FLAGS:
        --caseinsensitive    Sets the case insensitive flag for all regexes
        --entropy            Enables entropy scanning
        --norecursive        Disable recursive scanning of all subdirectories underneath the supplied path
        --prettyprint        Outputs the JSON in human readable format
    -z, --unzip              Recursively scans archives (ZIP and TAR) in memory (dangerous)
    -v, --verbose            Sets the level of debugging information
    -h, --help               Prints help information
    -V, --version            Prints version information

OPTIONS:
    -a, --allowlist <ALLOWLIST>          Sets a custom allowlist JSON file
        --default_entropy_threshold <DEFAULT_ENTROPY_THRESHOLD>
                                         Default entropy threshold (0.6 by default)
    -o, --outputfile <OUTPUT>            Sets the path to write the scanner results to (stdout by default)
    -r, --regex <REGEX>                  Sets a custom regex JSON file

ARGS:
    <FSPATH>    Sets the path of the directory or file to scan.
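
For example, a hypothetical scan of a local project directory, including any ZIP and TAR archives it contains, could look like this (the path is a placeholder):

duroc_hog --entropy --unzip --prettyprint -o duroc_results.json ./my_project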

Essex Hog (Confluence scanner) usage

USAGE:
    essex_hog [FLAGS] [OPTIONS] <PAGEID> <URL>

FLAGS:
        --caseinsensitive    Sets the case insensitive flag for all regexes
        --entropy            Enables entropy scanning
        --prettyprint        Outputs the JSON in human readable format
    -v, --verbose            Sets the level of debugging information
    -h, --help               Prints help information
    -V, --version            Prints version information

OPTIONS:
    -a, --allowlist <ALLOWLIST>          Sets a custom allowlist JSON file
        --authtoken <BEARERTOKEN>        Confluence basic auth bearer token (instead of user & pass)
        --default_entropy_threshold <DEFAULT_ENTROPY_THRESHOLD>
                                         Default entropy threshold (0.6 by default)
    -o, --outputfile <OUTPUT>            Sets the path to write the scanner results to (stdout by default)
        --password <PASSWORD>            Confluence password (crafts basic auth header)
        --regex <REGEX>                  Sets a custom regex JSON file
        --username <USERNAME>            Confluence username (crafts basic auth header)

ARGS:
    <PAGEID>    The ID (e.g. 1234) of the confluence page you want to scan
    <URL>       Base URL of Confluence instance (e.g. https://newrelic.atlassian.net/)
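
For example, a hypothetical scan of Confluence page 1234 using a bearer token kept in an environment variable could look like this:

essex_hog --authtoken "$CONFLUENCE_TOKEN" --prettyprint 1234 https://newrelic.atlassian.net/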

Gottingen Hog (JIRA scanner) usage

Jira secret scanner in Rust.

USAGE:
    gottingen_hog [FLAGS] [OPTIONS] <JIRAID>

FLAGS:
        --caseinsensitive    Sets the case insensitive flag for all regexes
        --entropy            Enables entropy scanning
        --prettyprint        Outputs the JSON in human readable format
    -v, --verbose            Sets the level of debugging information
    -h, --help               Prints help information
    -V, --version            Prints version information

OPTIONS:
    -a, --allowlist <ALLOWLIST>          Sets a custom allowlist JSON file
        --authtoken <BEARERTOKEN>        Jira basic auth bearer token (instead of user & pass)
        --default_entropy_threshold <DEFAULT_ENTROPY_THRESHOLD>
                                         Default entropy threshold (0.6 by default)
        --url <JIRAURL>                  Base URL of JIRA instance (e.g. https://jira.atlassian.net/)
    -o, --outputfile <OUTPUT>            Sets the path to write the scanner results to (stdout by default)
        --password <PASSWORD>            Jira password (crafts basic auth header)
        --regex <REGEX>                  Sets a custom regex JSON file
        --username <USERNAME>            Jira username (crafts basic auth header)

ARGS:
    <JIRAID>    The ID (e.g. PROJECT-123) of the Jira issue you want to scan
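
For example, a hypothetical scan of a single Jira issue using username/password authentication could look like this (the credentials are placeholders):

gottingen_hog --username jdoe --password "$JIRA_PASSWORD" --url https://jira.atlassian.net/ --prettyprint PROJECT-123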

Hante Hog (SLACK scanner) usage

Slack secret scanner in Rust.

USAGE:
    hante_hog [FLAGS] [OPTIONS] --authtoken <BEARERTOKEN> --channelid <CHANNELID> --url <URL>

FLAGS:
        --caseinsensitive    Sets the case insensitive flag for all regexes
        --entropy            Enables entropy scanning
        --prettyprint        Outputs the JSON in human readable format
    -v, --verbose            Sets the level of debugging information
    -h, --help               Prints help information
    -V, --version            Prints version information

OPTIONS:
    -a, --allowlist <ALLOWLIST>          Sets a custom allowlist JSON file
        --authtoken <BEARERTOKEN>        Slack basic auth bearer token
        --channelid <CHANNELID>          The ID (e.g. C12345) of the Slack channel you want to scan
        --default_entropy_threshold <DEFAULT_ENTROPY_THRESHOLD>
                                         Default entropy threshold (0.6 by default)
        --latest <LATEST>                End of time range of messages to include in search
        --oldest <OLDEST>                Start of time range of messages to include in search
    -o, --outputfile <OUTPUT>            Sets the path to write the scanner results to (stdout by default)
        --regex <REGEX>                  Sets a custom regex JSON file
        --url <URL>                      Base URL of Slack Workspace (e.g. https://[WORKSPACE NAME].slack.com)
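
For example, a hypothetical scan of a single Slack channel could look like this (the token, channel ID, and workspace URL are placeholders):

hante_hog --authtoken "$SLACK_TOKEN" --channelid C12345 --url https://myworkspace.slack.com --prettyprint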

Regex JSON file format

The regex option on scanners allows users to provide a path to their own JSON file of regular expressions that match sensitive material. Any provided file currently replaces, rather than appends to, the default regular expressions provided by SecretScanner. The expected format of the file is a single JSON object.

The keys should be names for the type of secret each regex entry will detect, as the keys are used for the reason properties output by the scanner.

Each value should be a string containing a valid Rust regular expression (see https://docs.rs/regex/1.3.9/regex/#syntax) that matches the type of secret described by its corresponding key.

As of version 1.0.8, the Rusty Hog engine also supports objects as values for each secret. The object can contain all of the following:

  • a pattern property with the matching regex expression (mandatory)
  • an entropy_filter property with a boolean value to enable entropy scanning for this information (mandatory)
  • a threshold property to customize the entropy tolerance on a scale of 0 - 1 (optional, will adjust for old 1-8 format, default 0.6)
  • a keyspace property to indicate how many possible values are in the key, e.g. 16 for hex, 64 for base64, 128 for ASCII (optional, default 128)
  • a make_ascii_lowercase property to indicate whether Rust should perform .make_ascii_lowercase() on the key before calculating entropy (optional, default false)

The higher the threshold, the more entropy is required in the secret to consider it a match.

An example of this format is here:

{
    "Generic Secret": {
        "pattern": "(?i)secret[\\s[[:punct:]]]{1,4}[0-9a-zA-Z-_]{16,64}[\\s[[:punct:]]]?",
        "entropy_filter": true,
        "threshold": "0.6"
    },
    "Slack Token": { 
        "pattern": "(xox[p|b|o|a]-[0-9]{12}-[0-9]{12}-[0-9]{12}-[a-z0-9]{32})",
        "entropy_filter": true,
        "threshold": "0.6",
        "keyspace": "36",
        "make_ascii_lowercase": true
    },
    "Google API Key": {
        "pattern": "AIza[0-9A-Za-z\\-_]{35}",
        "entropy_filter": true
    },
    "PGP private key block": "-----BEGIN PGP PRIVATE KEY BLOCK-----"
}
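
A file in this format can then be passed to any of the scanners with the regex option; for example (the file name is illustrative, and remember that it replaces the default rules rather than extending them):

choctaw_hog --regex custom_regex.json --prettyprint https://github.com/newrelic/rusty-hog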

As of version 1.0.11, the current default regex JSON used is as follows:

{
	"Slack Token": "(xox[p|b|o|a]-[0-9]{12}-[0-9]{12}-[0-9]{12}-[a-z0-9]{32})",
	"RSA private key": "-----BEGIN RSA PRIVATE KEY-----",
	"SSH (DSA) private key": "-----BEGIN DSA PRIVATE KEY-----",
	"SSH (EC) private key": "-----BEGIN EC PRIVATE KEY-----",
	"PGP private key block": "-----BEGIN PGP PRIVATE KEY BLOCK-----",
	"Amazon AWS Access Key ID": "AKIA[0-9A-Z]{16}",
	"Amazon MWS Auth Token": "amzn\\.mws\\.[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}",
	"Facebook Access Token": "EAACEdEose0cBA[0-9A-Za-z]+",
	"Facebook OAuth": "(?i)facebook[\\s[[:punct:]]]{1,4}[0-9a-f]{32}[\\s[[:punct:]]]?",
	"GitHub": "(?i)(github|access[[:punct:]]token)[\\s[[:punct:]]]{1,4}[0-9a-zA-Z]{35,40}",
	"Generic API Key": {
		"pattern": "(?i)(api|access)[\\s[[:punct:]]]?key[\\s[[:punct:]]]{1,4}[0-9a-zA-Z\\-_]{16,64}[\\s[[:punct:]]]?",
		"entropy_filter": true,
		"threshold": "0.6",
		"keyspace": "guess"
	},
	"Generic Account API Key": {
		"pattern": "(?i)account[\\s[[:punct:]]]?api[\\s[[:punct:]]]{1,4}[0-9a-zA-Z\\-_]{16,64}[\\s[[:punct:]]]?",
		"entropy_filter": true,
		"threshold": "0.6",
		"keyspace": "guess"
	},
	"Generic Secret": {
		"pattern": "(?i)secret[\\s[[:punct:]]]{1,4}[0-9a-zA-Z-_]{16,64}[\\s[[:punct:]]]?",
		"entropy_filter": true,
		"threshold": "0.6",
		"keyspace": "guess"
	},
	"Google API Key": "AIza[0-9A-Za-z\\-_]{35}",
	"Google Cloud Platform API Key": "AIza[0-9A-Za-z\\-_]{35}",
	"Google Cloud Platform OAuth": "(?i)[0-9]+-[0-9A-Za-z_]{32}\\.apps\\.googleusercontent\\.com",
	"Google Drive API Key": "AIza[0-9A-Za-z\\-_]{35}",
	"Google Drive OAuth": "(?i)[0-9]+-[0-9A-Za-z_]{32}\\.apps\\.googleusercontent\\.com",
	"Google (GCP) Service-account": "(?i)\"type\": \"service_account\"",
	"Google Gmail API Key": "AIza[0-9A-Za-z\\-_]{35}",
	"Google Gmail OAuth": "(?i)[0-9]+-[0-9A-Za-z_]{32}\\.apps\\.googleusercontent\\.com",
	"Google OAuth Access Token": "ya29\\.[0-9A-Za-z\\-_]+",
	"Google YouTube API Key": "AIza[0-9A-Za-z\\-_]{35}",
	"Google YouTube OAuth": "(?i)[0-9]+-[0-9A-Za-z_]{32}\\.apps\\.googleusercontent\\.com",
	"Heroku API Key": "[h|H][e|E][r|R][o|O][k|K][u|U][\\s[[:punct:]]]{1,4}[0-9A-F]{8}-[0-9A-F]{4}-[0-9A-F]{4}-[0-9A-F]{4}-[0-9A-F]{12}",
	"MailChimp API Key": "[0-9a-f]{32}-us[0-9]{1,2}",
	"Mailgun API Key": "(?i)key-[0-9a-zA-Z]{32}",
	"Credentials in absolute URL": "(?i)((https?|ftp)://)(([a-z0-9$_\\.\\+!\\*'\\(\\),;\\?&=-]|%[0-9a-f]{2})+(:([a-z0-9$_\\.\\+!\\*'\\(\\),;\\?&=-]|%[0-9a-f]{2})+)@)((([a-z0-9]\\.|[a-z0-9][a-z0-9-]*[a-z0-9]\\.)*[a-z][a-z0-9-]*[a-z0-9]|((\\d|[1-9]\\d|1\\d{2}|2[0-4][0-9]|25[0-5])\\.){3}(\\d|[1-9]\\d|1\\d{2}|2[0-4][0-9]|25[0-5]))(:\\d+)?)(((/+([a-z0-9$_\\.\\+!\\*'\\(\\),;:@&=-]|%[0-9a-f]{2})*)*(\\?([a-z0-9$_\\.\\+!\\*'\\(\\),;:@&=-]|%[0-9a-f]{2})*)?)?)?",
	"PayPal Braintree Access Token": "(?i)access_token\\$production\\$[0-9a-z]{16}\\$[0-9a-f]{32}",
	"Picatic API Key": "(?i)sk_live_[0-9a-z]{32}",
	"Slack Webhook": "(?i)https://hooks.slack.com/services/T[a-zA-Z0-9_]{8}/B[a-zA-Z0-9_]{8}/[a-zA-Z0-9_]{24}",
	"Stripe API Key": "(?i)sk_live_[0-9a-zA-Z]{24}",
	"Stripe Restricted API Key": "(?i)rk_live_[0-9a-zA-Z]{24}",
	"Square Access Token": "(?i)sq0atp-[0-9A-Za-z\\-_]{22}",
	"Square OAuth Secret": "(?i)sq0csp-[0-9A-Za-z\\-_]{43}",
	"Twilio API Key": "SK[0-9a-fA-F]{32}",
	"Twitter Access Token": "(?i)twitter[\\s[[:punct:]]]{1,4}[1-9][0-9]+-[0-9a-zA-Z]{40}",
	"Twitter OAuth": "(?i)twitter[\\s[[:punct:]]]{1,4}['|\"]?[0-9a-zA-Z]{35,44}['|\"]?",
	"New Relic Partner & REST API Key": "[\\s[[:punct:]]][A-Fa-f0-9]{47}[\\s[[:punct:]][[:cntrl:]]]",
	"New Relic Mobile Application Token": "[\\s[[:punct:]]][A-Fa-f0-9]{42}[\\s[[:punct:]][[:cntrl:]]]",
	"New Relic Synthetics Private Location": "(?i)minion_private_location_key",
	"New Relic Insights Key (specific)": "(?i)insights[\\s[[:punct:]]]?(key|query|insert)[\\s[[:punct:]]]{1,4}\\b[\\w-]{32,40}\\b",
	"New Relic Insights Key (vague)": "(?i)(query|insert)[\\s[[:punct:]]]?key[\\s[[:punct:]]]{1,4}b[\\w-]{32,40}\\b",
	"New Relic License Key": "(?i)license[\\s[[:punct:]]]?key[\\s[[:punct:]]]{1,4}\\b[\\w-]{32,40}\\b",
	"New Relic Internal API Key": "(?i)nr-internal-api-key",
	"New Relic HTTP Auth Headers and API Key": "(?i)(x|newrelic|nr)-?(admin|partner|account|query|insert|api|license)-?(id|key)[\\s[[:punct:]]]{1,4}\\b[\\w-]{32,47}\\b",
	"New Relic API Key Service Key (new format)": "(?i)NRAK-[A-Z0-9]{27}",
	"New Relic APM License Key (new format)": "(?i)[a-f0-9]{36}NRAL",
	"New Relic APM License Key (new format, region-aware)": "(?i)[a-z]{2}[0-9]{2}xx[a-f0-9]{30}NRAL",
	"New Relic REST API Key (new format)": "(?i)NRRA-[a-f0-9]{42}",
	"New Relic Admin API Key (new format)": "(?i)NRAA-[a-f0-9]{27}",
	"New Relic Insights Insert Key (new format)": "(?i)NRII-[A-Za-z0-9-_]{32}",
	"New Relic Insights Query Key (new format)": "(?i)NRIQ-[A-Za-z0-9-_]{32}",
	"New Relic Synthetics Private Location Key (new format)": "(?i)NRSP-[a-z]{2}[0-9]{2}[a-f0-9]{31}",
	"Email address": "(?i)\\b(?:[a-z0-9!#$%&'*+/=?^_`{|}~-]+(?:\\.[a-z0-9!#$%&'*+/=?^_`{|}~-]+)*)@[a-z0-9][a-z0-9-]+\\.(com|de|cn|net|uk|org|info|nl|eu|ru)([\\W&&[^:/]]|\\A|\\z)",
	"New Relic Account IDs in URL": "(newrelic\\.com/)?accounts/\\d{1,10}/",
	"Account ID": "(?i)account[\\s[[:punct:]]]?id[\\s[[:punct:]]]{1,4}\\b[\\d]{1,10}\\b",
	"Salary Information": "(?i)(salary|commission|compensation|pay)([\\s[[:punct:]]](amount|target))?[\\s[[:punct:]]]{1,4}\\d+"
}

Allowlist JSON file format

Scanners provide an allowlist feature. This allows you to specify, for each pattern, a list of regular expressions whose matches will be ignored by the scanner. You can also optionally supply a list of regular expressions that are evaluated against the file path.

The format for this allowlist file should be a single JSON object. Each key in the allowlist should match a key in the regex JSON, and the value can be one of two things:

  1. An array of strings that are exceptions for that regex pattern.
  2. An object with at least one key (patterns) and optionally a second key (paths).

In addition, you can specify the <GLOBAL> key, which is evaluated against all patterns. Both value forms are illustrated in the sketch below.
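
For illustration only, an allowlist combining both value forms could be created and used like this (the file name, patterns, and paths are placeholders):

cat > my_allowlist.json <<'EOF'
{
    "Email address": ["(?i)noreply@", "(?i)test@"],
    "Generic Secret": {
        "patterns": ["(?i)not_a_real_secret"],
        "paths": ["(?i)tests/"]
    }
}
EOF
duroc_hog --allowlist my_allowlist.json --prettyprint ./my_project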

The following is the default allowlist included in all scans:

": [ "(?i)example", "(?i)fake", "(?i)replace", "(?i)deadbeef", "(?i)ABCDEFGHIJKLMNOPQRSTUVWX", "1234567890" ] }">
{
	"Email address": {
		"patterns": [
			"(?i)@newrelic.com",
			"(?i)noreply@",
			"(?i)test@"
		],
		"paths": [
			"(?i)authors",
			"(?i)contributors",
			"(?i)license",
			"(?i)maintainers",
			"(?i)third_party_notices"
		]
	},
	"Credentials in absolute URL": {
		"patterns": [
			"(?i)(https?://)?user:pass(word)?@"
		]
	},
	"New Relic API Key Service Key (new format)": {
		"patterns": [
			"NRAK-123456789ABCDEFGHIJKLMNOPQR"
		]
	},
	"Generic API Key": {
		"patterns": [
			"(?i)sanitizeAPIKeyForLogging"
		]
	},
	"New Relic License Key": {
		"patterns": [
			"(?i)bootstrap_newrelic_admin_license_key",
			"(?i)xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
			"(?i)__YOUR_NEW_RELIC_LICENSE_KEY__LICENSE__",
			"(?i)YOUR_NEW_RELIC_APPLICATION_TOKEN"
		]
	},
	"Generic Secret": {
		"patterns": [
			"(?i)secret:NewRelicLicenseKeySecret"
		]
	},
	"
   
    "
   : [
		"(?i)example",
		"(?i)fake",
		"(?i)replace",
		"(?i)deadbeef",
		"(?i)ABCDEFGHIJKLMNOPQRSTUVWX",
		"1234567890"
	]
}

Be aware that these are strings, not regex expressions, and that each key in this allowlist has to match a key in the regex JSON. Keys are case-sensitive.

Project information

Open source license

This project is distributed under the Apache 2 license.

Support

New Relic has open-sourced this project. This project is provided AS-IS WITHOUT WARRANTY OR SUPPORT, although you can report issues and contribute to the project here on GitHub.

Please do not report issues with this software to New Relic Global Technical Support.

Community

New Relic hosts and moderates an online forum where customers can interact with New Relic employees as well as other customers to get help and share best practices. Like all official New Relic open source projects, there's a related Community topic in the New Relic Explorer's Hub. You can find this project's topic/threads here:

https://discuss.newrelic.com/t/rusty-hog-multi-platform-secret-key-scanner/90117

Issues / enhancement requests

Submit issues and enhancement requests in the Issues tab of this repository. Please search for and review the existing open issues before submitting a new issue.

Contributing

Contributions are welcome (and if you submit an enhancement request, expect to be invited to contribute it yourself). Please review our Contributors Guide.

Keep in mind that when you submit your pull request, you'll need to sign the CLA via the click-through using CLA-Assistant. If you'd like to execute our corporate CLA, or if you have any questions, please drop us an email at [email protected].

Feature Roadmap

  • 1.1: Enterprise features

    • Support config files (instead of command line args)
    • Support environment variables instead of CLI args
    • Multi-threading
    • Better context detection and false positive filtering (GitHound, machine learning)
    • Use Rusoto instead of s3-rust
    • Add JIRA scanner
    • Add file-system & archive scanner
    • Use Rust features to reduce compilation dependencies?
  • 1.2: Integration with larger scripts and UIs

    • Support Github API for larger org management
      • Scan all repos for a list of users
      • Scan all repos in an org
    • Generate a web report or web interface. Support "save state" generation from UI.
    • Agent/manager model
    • Scheduler process (blocked by save state support)

What does the name mean?

TruffleHog is considered the de facto standard / original secret scanner. I have been building a suite of secret scanning tools for various platforms based on TruffleHog and needed a naming scheme, so I started at the top of Wikipedia's list of pig breeds. Thus each tool name is a breed of pig starting at "A" and working up.

Comments
  • Essex Hog: Removed hard-coded context from Confluence URL

    Based on the Confluence documentation, this change now sets the default to no context when passing in the base URL. If a user is scanning a Confluence instance that uses context, they will need to append it to the base URL to get a correct API endpoint

    opened by bp4151 10
  • essex_hog flexibility enhancement

    Hi. Attempting to use essex_hog, but the resulting URL does not align with the behind-the-firewall Confluence site I'm attempting to scan due to its inclusion of /wiki/ in the URL.

    Example: essex_hog attempts to retrieve: https://confluence.example.domain/wiki/rest/api/content/39273027?expand=body.storage

    while the correct URL to body.storage would be: https://confluence.example.domain/rest/api/content/39273027?expand=body.storage

    Thank-you.

    enhancement 
    opened by cmiller123456 10
  • Choctaw Hog v1.0.9 binary working, v1.0.10 not working

    Description

    Choctaw Hog completes successfully when using the pre-built v1.0.9 binary, but returns an empty list of issues when using the pre-built v1.0.10 binary.

    Steps to Reproduce

    v1.0.9

    $ ./choctaw_hog https://github.com/newrelic/rusty-hog  --httpsuser "" --httpspass "" --help
    choctaw_hog 1.0.9
    
    ...........
    
    $ ./choctaw_hog https://github.com/newrelic/rusty-hog  --httpsuser "" --httpspass "" --prettyprint
    [
      {
        "commit": "v1.0.10\n",
        "commitHash": "04bd867ad782daa532e28bcfe45f18a66b9aa90a",
        "date": "2021-04-23 18:10:31",
        "diff": "  \"PGP private key block\": \"-----BEGIN PGP PRIVATE KEY BLOCK-----\",\n",
        "stringsFound": [
          "-----BEGIN PGP PRIVATE KEY BLOCK-----"
        ],
    
    ...........
    
    

    v1.0.10

    $ unzip rustyhogs-darwin-choctaw_hog-1.0.10.zip
    Archive:  rustyhogs-darwin-choctaw_hog-1.0.10.zip
      inflating: darwin_releases/choctaw_hog
    $ cd darwin_releases/
    $ ./choctaw_hog https://github.com/newrelic/rusty-hog  --httpsuser "" --httpspass "" --prettyprint
    []
    

    Expected Behaviour

    Pre-built v1.0.10 also returns results

    Your Environment

    macOS Big Sur (20.6.0 Darwin Kernel Version 20.6.0: Wed Jun 23 00:26:31 PDT 2021; root:xnu-7195.141.2~5/RELEASE_X86_64 x86_64), but replicated with the Docker image as well.

    bug 
    opened by h888t 6
  • Document regex json format with example

    Summary

    I need to supply a regex json. But the json format is not immediately discoverable. I also can't tell if the file is supplementary or replaces the defaults.

    Desired Behavior

    In the README and help text, the format of the file (A JSON Object where the key is a description and the value is a regex as a JSON string) should be described.

    Example files should be included.

    An option should allow the user to include the defaults with their supplementary file.

    Perhaps the default should be provided as a file instead of a string in Rust. Then the commands would have a default file we could unpack and copy.

    enhancement 
    opened by jeffalder 6
  • Move SecretScanner to its own crate

    Move SecretScanner into a different crate rusty-hog-scanner. Use cargo workspaces to import this new crate. This way the scanner can be easily used in other projects without having to import all the other functionality such as git or s3 secrets reader. This also makes it possible to compile the rusty-hog-scanner to wasm modules.

    Fixes https://github.com/newrelic/rusty-hog/issues/44

    opened by raulcabello 5
  • Extend the scanner to allow entropy threshold definition per pattern

    Extend the pattern definition to allow entropy threshold definition also for patterns such as:

    {
        "Slack Token": { 
            "pattern": "(xox[p|b|o|a]-[0-9]{12}-[0-9]{12}-[0-9]{12}-[a-z0-9]{32})",
            "entropy": true,
            "threshold": "4.5"
        },
        "RSA private key": "-----BEGIN RSA PRIVATE KEY-----",
        "SSH (DSA) private key": "-----BEGIN DSA PRIVATE KEY-----",
        "SSH (EC) private key": "-----BEGIN EC PRIVATE KEY-----",
        "PGP private key block": "-----BEGIN PGP PRIVATE KEY BLOCK-----",
        "Amazon AWS Access Key ID": {
            "pattern": "AKIA[0-9A-Z]{16}",
            "entropy": true
        },
        "Amazon MWS Auth Token": {
            "pattern": "amzn\\.mws\\.[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}",
            "enropy": true
        },
        "AWS API Key": {
            "pattern": "AKIA[0-9A-Z]{16}",
            "entropy": true
        },
        "GitHub": {
            "pattern": "(?i)(github|access[[:punct:]]token)[\\s[[:punct:]]]{1,4}[0-9a-zA-Z]{35,40}",
            "entropy": true,
            "threshold": "4.5"
        },
        "Generic API Key": {
            "pattern": "(?i)(api|access)[\\s[[:punct:]]]?key[\\s[[:punct:]]]{1,4}[0-9a-zA-Z\\-_]{16,64}[\\s[[:punct:]]]?",
            "entropy": true
        },
        "Generic Account API Key": {
            "pattern": "(?i)account[\\s[[:punct:]]]?api[\\s[[:punct:]]]{1,4}[0-9a-zA-Z\\-_]{16,64}[\\s[[:punct:]]]?",
            "entropy": true
        },
        "Generic Secret": {
            "pattern" : "(?i)secret[\\s[[:punct:]]]{1,4}[0-9a-zA-Z-_]{16,64}[\\s[[:punct:]]]?",
            "entropy": true
        },
        "Google API Key": {
            "pattern": "AIza[0-9A-Za-z\\-_]{35}",
            "entropy": true
        },
        "Google Cloud Platform API Key": {
            "pattern": "AIza[0-9A-Za-z\\-_]{35}",
            "entropy": true
        },
        "Google Cloud Platform OAuth": "(?i)[0-9]+-[0-9A-Za-z_]{32}\\.apps\\.googleusercontent\\.com",
        "Google Drive API Key": {
            "pattern": "AIza[0-9A-Za-z\\-_]{35}",
            "entropy": true
        },
        "Google Drive OAuth": "(?i)[0-9]+-[0-9A-Za-z_]{32}\\.apps\\.googleusercontent\\.com",
        "Google (GCP) Service-account": {
            "pattern": "(?i)\"type\": \"service_account\"",
            "entropy": true
        },
        "Google Gmail API Key": {
            "pattern": "AIza[0-9A-Za-z\\-_]{35}",
            "entropy": true
        },
        "Google Gmail OAuth": "(?i)[0-9]+-[0-9A-Za-z_]{32}\\.apps\\.googleusercontent\\.com",
        "Google OAuth Access Token": {
            "pattern": "ya29\\.[0-9A-Za-z\\-_]+",
            "entropy": true
        },
        "Credentials in absolute URL": "(?i)((https?|ftp)://)(([a-z0-9$_\\.\\+!\\*'\\(\\),;\\?&=-]|%[0-9a-f]{2})+(:([a-z0-9$_\\.\\+!\\*'\\(\\),;\\?&=-]|%[0-9a-f]{2})+)?@)((([a-z0-9]\\.|[a-z0-9][a-z0-9-]*[a-z0-9]\\.)*[a-z][a-z0-9-]*[a-z0-9]|((\\d|[1-9]\\d|1\\d{2}|2[0-4][0-9]|25[0-5])\\.){3}(\\d|[1-9]\\d|1\\d{2}|2[0-4][0-9]|25[0-5]))(:\\d+)?)(((/+([a-z0-9$_\\.\\+!\\*'\\(\\),;:@&=-]|%[0-9a-f]{2})*)*(\\?([a-z0-9$_\\.\\+!\\*'\\(\\),;:@&=-]|%[0-9a-f]{2})*)?)?)?",
        "Slack Webhook": "(?i)https://hooks.slack.com/services/T[a-zA-Z0-9_]{8}/B[a-zA-Z0-9_]{8}/[a-zA-Z0-9_]{24}"
    }
    
    

    The entropy is calculated for each word of a matched text, considering only the word with the maximum entropy. Findings with an entropy lower than the threshold are filtered out.

    This allows us to remove many false positives which are, for instance, only paths in a secrets manager.

    These changes are backward compatible with the old regex definition.

    opened by ccojocar 5
  • Git clone links improperly reported as email addresses

    Description

    Git clone links that start with [email protected] and [email protected] are flagged as email addresses when they are not.

    Steps to Reproduce

    duroc_hog on a markdown file with [email protected]:newrelic/foo.git in it.

    Expected Behaviour

    git@ is almost never an email; [email protected] is definitely never an email.

    Relevant Logs / Console output

      {
        "stringsFound": [
          "[email protected]"
        ],
        "path": "./README.md",
        "reason": "Email address",
        "linenum": 8,
        "diff": "git clone [email protected]:java-agent/java_agent.git"
      },  {
        "stringsFound": [
          "[email protected]"
        ],
        "path": "./build.gradle",
        "reason": "Email address",
        "linenum": 116,
        "diff": "            url \"[email protected]:newrelic/java_agent.git\""
      }
    

    Your Environment

    v1.0.5

    bug 
    opened by jeffalder 4
  • Publish containers to dockerhub

    Summary

    Publish images to dockerhub.

    Desired Behaviour

    I would like to use containers so that:

    1. I am not required to download and install binaries manually, and maintain versions manually
    2. I can run the scanners in Kubernetes pods as init containers
    3. I can run the scanner on virtually any platform without requiring special builds (and without New Relic having to publish special builds)
    4. I can have the scanners separate so I don't download an S3 or Google Docs scanner that I'll never use.
    5. The containers can easily be triggered as AWS Lambdas in response to various events (S3 bucket changes, for example).
    enhancement 
    opened by jeffalder 4
  • fixes #11 add git regex to default regex json \w test

    Fixes #11

    What

    This PR adds a regex for git repositories to the default regex json to avoid git repos from falsely being identified as email addresses.

    The key for git repos is named "Git Repository".

    Testing

    Unit test is included against a typical git repo address, a test to confirm email addresses still work is already in the repo.

    opened by nicolasjhampton 3
  • Choctaw_hog: Enable parallel processing

    Summary

    The current implementation only uses a single CPU core.

    Desired Behaviour

    It would be great to leverage all compute power (or provide an option of number of core to use) to hunt down secrets.

    Additional context

    When scanning entire repositories, the scan can take a long time to complete -- too long in some cases.

    enhancement 
    opened by madchap 2
  • add flag recent_days

    Added the flag recent_days to only check commits to the repo in the last number of days. This lets us run a scan against only new commits while not requiring us to keep state of the last scanned commit (or look it up prior to the scan to pass it in). This can only be used if since_commit is not also specified.

    My first PR to a Rust project, so sorry if I missed anything. Thanks for creating this! The speed of this implementation lets us get through so many more repos. 💨

    opened by b3nn 2
  • Update dependencies

    Summary

    A lot of the dependencies are now majorly out of date, but simply changing the cargo.toml is not enough; a fair amount of code will need to be rewritten to support the newer libraries.

    Additional context

    Updating libraries will support newer features and hopefully improve performance.

    enhancement 
    opened by cutler-scott-newrelic 4
  • Scan GitHub and GitHub Enterprise comments

    Summary

    Scan GitHub and GitHub Enterprise PR comments for secrets

    Additional context

    Users or bots (Terraform Atlantis) may inadvertently commit secrets in the comments of a Pull Request

    enhancement 
    opened by NolanT 1
  • choctaw hog fails to run on macOS 12.1 due to incorrect path for libssl.1.1.dylib

    Description

    choctaw hog fails to run on macOS 12.1 due to an incorrect / missing path for libssl.1.1.dylib

    Steps to Reproduce

    1. Start with a macOS 12.1 machine.
    2. Download the Darwin binary for choctaw hog.
    3. Run the binary
    4. Receive error:
    darwin_releases/choctaw_hog
    
    dyld[35939]: Library not loaded: /usr/local/opt/openssl@1.1/lib/libssl.1.1.dylib
      Referenced from: /Users/<username>/Downloads/darwin_releases/choctaw_hog
      Reason: tried: '/usr/local/opt/openssl@1.1/lib/libssl.1.1.dylib' (no such file), '/usr/local/lib/libssl.1.1.dylib' (no such file), '/usr/lib/libssl.1.1.dylib' (no such file)
    
    [1]    35939 abort      ~/Downloads/darwin_releases/choctaw_hog
    

    The correct path for libssl.1.1.dylib assuming you have it installed from homebrew is:

    /opt/homebrew/Cellar/openssl@1.1/1.1.1l_1/lib/libssl.1.1.dylib
    

    Expected Behaviour

    For the application to either find the existing library from /opt/homebrew/Cellar/openssl@<version>/<version>/lib/libssl.<version>.dylib or for the library to be provided with the download.

    Your Environment

    HOMEBREW_VERSION: 3.3.9
    ORIGIN: https://github.com/Homebrew/brew
    HEAD: 96137bc19e68398ebbb7033379df288cd8b9a3f9
    Last commit: 6 days ago
    Core tap ORIGIN: https://github.com/Homebrew/homebrew-core
    Core tap HEAD: 3e08bfbbaa36488a38e10e1a8263d65bb83d62e4
    Core tap last commit: 24 hours ago
    Core tap branch: master
    HOMEBREW_PREFIX: /opt/homebrew
    HOMEBREW_CASK_OPTS: []
    HOMEBREW_CORE_GIT_REMOTE: https://github.com/Homebrew/homebrew-core
    HOMEBREW_GITHUB_API_TOKEN: set
    HOMEBREW_MAKE_JOBS: 10
    HOMEBREW_NO_ANALYTICS: set
    Homebrew Ruby: 2.6.8 => /System/Library/Frameworks/Ruby.framework/Versions/2.6/usr/bin/ruby
    CPU: 10-core 64-bit arm_firestorm_icestorm
    Clang: 13.0.0 build 1300
    Git: 2.34.1 => /opt/homebrew/bin/git
    Curl: 7.77.0 => /usr/bin/curl
    macOS: 12.1-arm64
    CLT: 13.2.0.0.1.1638488800
    Xcode: N/A
    Rosetta 2: false
    
    bug 
    opened by sammcj 4
  • Add to Homebrew package manager

    Summary

    Add to Homebrew package manager for macOS

    Desired Behaviour

    brew install rusty-hog
    

    or

    brew install choctaw-hog
    brew install duroc-hog
    

    etc....

    Possible Solution

    Either via brew create to create and submit a package or by creating a tap.

    • https://docs.brew.sh/Formula-Cookbook
    • https://docs.brew.sh/How-to-Create-and-Maintain-a-Tap

    Additional context

    This would make it far more accessible for people to install and update the binaries.

    enhancement 
    opened by sammcj 1
  • GitHub Action for choctaw hog

    Summary

    It would be super nice to have a GHA for choctaw hog :-)

    I noticed that trufflehog had one (even though I never tried it): https://github.com/marketplace/actions/trufflehog-actions-scan

    Desired Behaviour

    For starters, probably something basic. Of course and ideally, we'd have a simple UX way to mark status on findings to keep up lists of false-positives, etc.. so they don't come back to haunt us again!

    Additional context

    CI context, e.g. run automatically on PRs.

    enhancement help wanted 
    opened by madchap 8