Try the DNS & DNSSEC attacks mentioned in this talk against the following nameservers and domain names.
Nameservers: ns1.insecuredns.com, ns2.insecuredns.com
Domains: totallylegit.in, insecuredns.com
This talk is about practical recon techniques that are useful for bug bounty hunters and penetration testers. The objective of this talk is to cover an exhaustive set of practical recon techniques, tools of the trade and tips/tricks.
Reconnaissance is the process of gathering preliminary data or intelligence on your target. The data is gathered in order to better plan your attack. Reconnaissance can be performed actively or passively.
WHAT TO LOOK FOR DURING RECON?
1. Info to increase attack surface (domains, net blocks)
2. Credentials (email, passwords, API keys)
3. Sensitive information
4. Infrastructure details
Search engines like Google and Bing support various advanced search operators to refine search queries. site: is helpful in doing vertical domain correlation (finding sub-domains); ip: is helpful in doing horizontal domain correlation (finding other domains on the same IP).
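As an illustration (example.com and the IP below are placeholders, not targets from this talk), typical queries look like:

```
site:example.com -site:www.example.com    (vertical correlation: sub-domains, minus the main site)
ip:203.0.113.7                            (Bing operator, horizontal correlation: domains sharing an IP)
```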
VirusTotal runs its own passive DNS replication service, built by storing DNS resolutions performed when visiting URLs submitted by users. https://www.virustotal.com/#/home/search
Under Certificate Transparency (CT), CAs have to publish all SSL/TLS certificates they issue in a public log. Anyone can look through the CT logs and find certificates issued for a domain. Details of known CT log files - https://www.certificate-transparency.org/known-logs https://blog.appsecco.com/certificate-transparency-part-2-the-bright-side-c0b99ebf31a8
CT logs by design contain all the certificates issued by a participating CA for any given domain. By looking through the logs, an attacker can gather a lot of information about an organization's infrastructure, e.g. internal domains and email addresses, in a completely passive manner. https://blog.appsecco.com/certificate-transparency-part-3-the-dark-side-9d401809b025
There are various search engines that collect the CT logs and let anyone search through them:
1. https://crt.sh/
2. https://censys.io/
3. https://developers.facebook.com/tools/ct/
4. https://google.com/transparencyreport/https/ct/
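As a minimal sketch of automating this, the snippet below assumes crt.sh's JSON output mode (append &output=json to a query) and collapses the newline-separated name_value entries into a unique, sorted list of names:

```python
import json
import urllib.request

def crtsh_url(domain):
    # crt.sh supports a JSON output mode; %25 is a URL-encoded '%' wildcard
    return "https://crt.sh/?q=%25." + domain + "&output=json"

def extract_names(entries):
    """Collect unique domain names from crt.sh JSON entries.

    Each entry's 'name_value' field may hold several newline-separated names."""
    names = set()
    for entry in entries:
        for name in entry.get("name_value", "").splitlines():
            names.add(name.strip().lower().lstrip("*."))
    return sorted(names)

# Usage (network call, uncomment to run):
# with urllib.request.urlopen(crtsh_url("example.com")) as resp:
#     print("\n".join(extract_names(json.load(resp))))
```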
CT logs are append-only; there is no way to delete an existing entry. The domain names found in the CT logs may not exist anymore and thus may not resolve to an IP address. https://blog.appsecco.com/a-penetration-testers-guide-to-sub-domain-enumeration-7d842d5570f6
We can use tools like massdns along with a CT logs script to quickly identify resolvable domain names.
python3 ct.py example.com | ./bin/massdns -r resolvers.txt -t A -o S -w results.txt
CT logs only cover certs that "legit" CAs submit to a log; CertDB is based on scanning the IPv4 segment and domains, and "finding & analyzing" all the certificates.
curl -L -sd "api_key=API-KEY&q=Organization:\"tesla\"&response
https://certdb.com
When setting up some CMSs like WordPress, Joomla and others, there is a window of time where the installer has no form of authentication. If the domain supports HTTPS, the certificate will end up in a CT log (sometimes in near real time). If an attacker can search through CT logs and find such a web application without authentication, then he/she can take over the server.
This attack has been demonstrated by Hanno Böck at Defcon 25. He claimed to have found 5,000 WordPress installations using CT logs over a period of 3 months that he could have potentially taken over. HD Moore also discussed this technique in his talk at BSidesLV 2017.
Censys gathers data from SSL scans of the IPv4 address space and also from Certificate Transparency (CT) logs. This is a good source of domains and also email addresses. https://0xpatrik.com/censys-guide/
Web applications can send the Content-Security-Policy HTTP header, which allows us to create a whitelist of sources of trusted content and instructs the browser to only execute or render resources from those sources. So basically, the Content-Security-Policy header will list a bunch of sources (domains) that might be of interest to us as attackers.
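A minimal sketch of harvesting those sources, assuming you already have the header value (e.g. from curl -sI): split the header into directives on ';', skip quoted keywords like 'self', and keep anything that looks like a host:

```python
def csp_domains(header_value):
    """Pull host names out of a Content-Security-Policy header value.

    Directives are split on ';'; the first token of each directive is its
    name, the rest are sources (quoted keywords like 'self' are skipped)."""
    hosts = set()
    for directive in header_value.split(";"):
        tokens = directive.split()
        for source in tokens[1:]:
            if source.startswith("'"):       # 'self', 'unsafe-inline', ...
                continue
            host = source.split("//")[-1]    # drop scheme if present
            host = host.split("/")[0]        # drop any path
            if "." in host:                  # keep only things that look like hosts
                hosts.add(host.lower())
    return sorted(hosts)
```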
SPF is a type of DNS (TXT) record used to indicate to receiving mail exchanges which hosts are authorized to send mail for a given domain. Simply put, an SPF record lists all the hosts that are authorised to send emails on behalf of a domain.
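After fetching the record (e.g. dig +short TXT example.com), the interesting hosts can be pulled out of the common RFC 7208 mechanisms; a rough sketch:

```python
def spf_hosts(spf_record):
    """Extract the hosts/networks an SPF record authorises to send mail.

    Handles the common ip4:/ip6:/include:/a:/mx: mechanisms of RFC 7208."""
    hosts = []
    for term in spf_record.split():
        for prefix in ("ip4:", "ip6:", "include:", "a:", "mx:"):
            if term.startswith(prefix):
                hosts.append(term[len(prefix):])
    return hosts
```

The include: targets are often the most useful for recon, since they point at additional infrastructure (third-party mailers, internal relays).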
Authenticated Denial of Existence (RFC 7129): in DNS, when a client queries for a non-existent domain, the server must deny the existence of that domain. It is harder to do that in DNSSEC due to cryptographic signing.
The ldns-walk tool (part of ldnsutils) can be used to zone walk a DNSSEC signed zone that uses NSEC.
# zone walking with ldnsutils
$ ldns-walk iana.org
iana.org. A NS SOA MX TXT AAAA RRSIG NSEC DNSKEY
api.iana.org. CNAME RRSIG NSEC
app.iana.org. CNAME RRSIG NSEC
autodiscover.iana.org. CNAME RRSIG NSEC
beta.iana.org. CNAME RRSIG NSEC
data.iana.org. CNAME RRSIG NSEC
dev.iana.org. CNAME RRSIG NSEC
ftp.iana.org. CNAME RRSIG NSEC
^C
NSEC3, by contrast, provides a signed gap of hashes of domain names. Returning hashes was intended to prevent zone enumeration (or make it expensive).
231SPNAMH63428R68U7BV359PFPJI2FC.example.com. NSEC3 1 0 3 ABCD NKDO8UKT2STOL6EJRD1EKVD1BQ2688DM A NS SOA TXT AAAA RRSIG DNSKEY
NKDO8UKT2STOL6EJRD1EKVD1BQ2688DM.example.com. NSEC3 1 0 3 ABCD 231SPNAMH63428R68U7BV359PFPJI2FC A TXT AAAA RRSIG
An attacker can collect all the sub-domain hashes and crack the hashes offline. Tools like nsec3walker and nsec3map help us automate collecting NSEC3 hashes and cracking the hashes.
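These tools work because the NSEC3 hash is just iterated SHA-1 over the wire-format name plus salt (RFC 5155), so candidate sub-domains can be hashed offline and compared against the collected hashes. A minimal sketch of the hash itself:

```python
import base64
import hashlib

def nsec3_hash(name, salt_hex, iterations):
    """NSEC3 hash (RFC 5155, algorithm 1 = SHA-1) of a domain name."""
    # Canonical wire format: length-prefixed lowercase labels + root label
    wire = b"".join(
        bytes([len(label)]) + label.encode()
        for label in name.rstrip(".").lower().split(".")
    ) + b"\x00"
    salt = bytes.fromhex(salt_hex)
    digest = hashlib.sha1(wire + salt).digest()
    for _ in range(iterations):                 # 'iterations' extra rounds
        digest = hashlib.sha1(digest + salt).digest()
    # NSEC3 owner names use Base32 with the "extended hex" alphabet
    b32hex = bytes.maketrans(b"ABCDEFGHIJKLMNOPQRSTUVWXYZ234567",
                             b"0123456789ABCDEFGHIJKLMNOPQRSTUV")
    return base64.b32encode(digest).translate(b32hex).decode().lower()

# RFC 5155 Appendix A test vector:
# nsec3_hash("example", "aabbccdd", 12) == "0p9mhaveqvm6t7vbl5lop2u3t2rp3tom"
```

Cracking then amounts to hashing a dictionary of likely sub-domain names with the zone's published salt and iteration count, and matching against the hashes walked from the zone.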
Collecting and cracking the NSEC3 hashes of a zone using nsec3walker:
# Collect NSEC3 hashes of a domain
$ ./collect insecuredns.com > insecuredns.com.collect
# Undo the hashing, expose the sub-domain information
$ ./unhash < insecuredns.com.collect > insecuredns.com.unhash
We used the following commands to install nsec3walker on Ubuntu 16.04. The build-essential package is a prerequisite. https://dnscurve.org/nsec3walker.html
# Installing nsec3walker
$ wget https://dnscurve.org/nsec3walker-20101223.tar.gz
$ tar -xzf nsec3walker-20101223.tar.gz
$ cd nsec3walker-20101223
$ make
Cloud storage services are trivial to set up and have gained popularity, especially object/block storage. Object storage is ideal for storing static, unstructured data like audio, video, documents, images and logs, as well as large amounts of text.
1. AWS S3 buckets
2. Digital Ocean Spaces
WHY OBJECT STORAGE? Due to the nature of object storage, it is a treasure trove of information from an attacker/penetration tester perspective. In our experience, given a chance, users will store anything on third-party services, from their passwords in plain text files to pictures of their pets.
BUCKETS
Users can store files (objects) in a bucket. Each bucket gets a unique, predictable URL and each file in a bucket gets a unique URL as well. Access control mechanisms are available at both bucket and object level.
As buckets have predictable URLs, it is trivial to do a dictionary based attack. The following tools help run a dictionary attack to identify S3 buckets:
1. AWSBucketDump
2. Slurp
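The core of such a dictionary attack is just candidate-name generation; the permutation patterns below are illustrative guesses, not the rule sets the tools above actually ship:

```python
def bucket_candidates(org, wordlist):
    """Generate likely S3 bucket names for an organisation.

    The permutation patterns are illustrative, not exhaustive."""
    patterns = ("{org}-{word}", "{word}-{org}", "{org}.{word}", "{org}{word}")
    names = {org}
    for word in wordlist:
        for pattern in patterns:
            names.add(pattern.format(org=org, word=word))
    # Virtual-hosted-style S3 URL for each candidate
    return sorted("https://%s.s3.amazonaws.com" % n for n in names)
```

Each candidate URL is then probed with an HTTP client: a 404 means the bucket doesn't exist, 403 that it exists but is private, and 200 that its listing is publicly accessible.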
Users can store files in a "Space". Each Space gets a unique, predictable URL and each file in a Space gets a unique URL as well. Access control mechanisms are available at Space and file level.
Since DO Spaces expose an S3-compatible API, we tweaked AWSBucketDump to work with DO Spaces. Spaces finder is a tool that can look for publicly accessible DO Spaces using a wordlist, list all the accessible files on a public Space and download the files. https://github.com/appsecco/spaces-finder
API keys have become critical in authentication and are treated as keys to the kingdom. For applications, API keys tend to be the achilles heel. https://danielmiessler.com/blog/apis-2fas-achilles-heel/
GitHub is a popular version control and collaboration platform. Code repos on GitHub tend to have all sorts of sensitive information. GitHub also has a powerful search feature with advanced operators, and a very well designed REST API. edoverflow has a neat little guide, GitHub for Bug Bounty Hunters.
Ideally, clone all the target organization's repos and analyze them locally. GitHubCloner by @mazen160 comes in very handy to automate the process.
$ python githubcloner.py --org organization -o /tmp/output
https://gist.github.com/EdOverflow/922549f610b258f459b219a32f92d10b
Once the repos are cloned, you can do static code analysis. There are language specific tools to speed up and automate the process:
1. Brakeman for Ruby
2. Bandit for Python
Once you have the repos cloned, you can understand the code, language used and architecture. Start looking for keywords or patterns:
- API and key (get some more endpoints and find API keys)
- token
- secret
- vulnerable
- http://
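A minimal sketch of turning that keyword list into an automated scan over cloned source; the regexes are illustrative (real secret scanners ship far larger rule sets):

```python
import re

# Illustrative patterns only, covering the keywords above
PATTERNS = {
    "aws_access_key": re.compile(r"AKIA[0-9A-Z]{16}"),
    "generic_secret": re.compile(r"(?i)(api[_-]?key|token|secret)\s*[:=]\s*\S+"),
    "plain_http_url": re.compile(r"http://\S+"),
}

def scan_text(text):
    """Return (pattern_name, matched_text) pairs found in a blob of source."""
    hits = []
    for name, regex in PATTERNS.items():
        for match in regex.finditer(text):
            hits.append((name, match.group(0)))
    return hits
```

Run it over every file in the cloned repos (e.g. with os.walk) and triage the hits manually, since patterns like these produce plenty of false positives.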
GitHub search is quite a powerful feature & can be used to find sensitive data in repos.
A collection of GitHub dorks - https://github.com/techgaun/github-dorks/blob/master/github-dorks.txt
Tool to run GitHub dorks against a repo - https://github.com/techgaun/github-dorks
There are various projects that gather Internet-wide scan data and make it available to researchers and the security community. This data includes port scans, DNS data, SSL/TLS cert data and even data breach dumps. Find your needle in the haystack.
WHY SCAN DATA SETS FOR RECON?
To reduce dependency on 3rd party APIs and services
To reduce active probing of target infrastructure
The more the sources, the better the coverage
Build your own recon platforms
DATASETS
Name - Description - Price
CZDS - zone files for "new" global TLDs - FREE
ARIN - American IP registry information - FREE
CAIDA PFX2AS IPv4 - Daily snapshots of ASN to IPv4 mappings - FREE
Name - Description - Price
US Gov - US government domain names - FREE
UK Gov - UK government domain names - FREE
RIR Delegations - Regional IP allocations - FREE
Name - Description
PremiumDrops - DNS zone files for com/net/info/org/biz/xxx/sk/us TLDs
WWWS.io - Domains across many TLDs (~198m)
WhoisXMLAPI.com - New domain whois data
https://github.com/fathom6/inetdata
Rapid7 publishes its Forward DNS study/dataset on the scans.io project (it's a massive dataset, 20+ GB compressed & 300+ GB uncompressed). This dataset aims to discover all domains found on the Internet.
The data format is gzip-compressed JSON, so we can use the jq utility to extract sub-domains of a specific domain:
curl --silent -L https://opendata.rapid7.com/sonar.fdns_v2/201
cat 2018-04-21-1524297601-fdns_any.json.gz | pigz -dc | grep "
https://opendata.rapid7.com/about/
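The dump is one JSON object per line with name/type/value fields; a hedged sketch of the same extraction in Python, streaming the file so the 300+ GB never has to fit in memory:

```python
import gzip
import json

def fdns_subdomains(path, domain):
    """Stream a Rapid7 FDNS dump (gzipped, one JSON object per line) and
    yield records whose 'name' falls under the given domain."""
    suffix = "." + domain.lstrip(".")
    with gzip.open(path, "rt", encoding="utf-8", errors="replace") as fh:
        for line in fh:
            try:
                record = json.loads(line)
            except ValueError:
                continue                      # skip malformed lines
            name = record.get("name", "")
            if name == domain or name.endswith(suffix):
                yield name, record.get("type"), record.get("value")
```

This is equivalent to the grep/jq pipeline above (jq -r '.name' on matching lines), but with an exact suffix match instead of a substring grep.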