In the fast-paced world of cybersecurity, staying ahead of potential threats requires thorough reconnaissance and vulnerability assessments. This blog post introduces you to a powerful bash script designed to automate these crucial tasks. Whether you’re a security professional or a cybersecurity enthusiast, this script provides a comprehensive approach to uncovering vulnerabilities and mapping out your target domain.
What This Script Will Do
This script offers an automated solution for a wide range of security assessments:
- 🔍 Subdomain Enumeration: Discover and collect subdomains associated with your target domain.
- 🌐 DNS Enumeration: Gather detailed information about DNS records and perform zone transfers.
- 🌐 IP Resolution and Reverse Lookup: Resolve subdomains to IP addresses and perform reverse DNS lookups.
- 🕵️♂️ Port Scanning: Identify open ports and services on the resolved IP addresses.
- 🌐 Web Probing and WAF Detection: Probe for live web servers and detect Web Application Firewalls (WAFs).
- 📸 Screenshots and Content Discovery: Capture screenshots of web pages and discover hidden directories or files.
- 🔎 Vulnerability Scanning: Scan for known vulnerabilities in web applications and associated services.
- 🔍 Additional Checks: Perform checks for exposed S3 buckets, JavaScript secrets, SSL/TLS vulnerabilities, and more.
- 📜 Certificate Transparency Logs and Wayback Machine Crawling: Retrieve certificate transparency logs and historical URLs.
This script provides a thorough security assessment by leveraging multiple tools and techniques to uncover potential vulnerabilities and security issues.
Script
GitHub Link: https://github.com/securitycipher/Bug-Bounty-Resources/blob/main/content/auto-recon.sh
#!/bin/bash
if [ $# -eq 0 ]; then
echo "Usage: $0 <domain>"
exit 1
fi
domain="$1"
output_dir="recon_$domain"
mkdir -p "$output_dir"/{subdomains,ip_addresses,ports,screenshots,content,vulnerabilities,emails,technologies,dns,certificates,scans}
echo "[+] Starting comprehensive reconnaissance and vulnerability scanning for $domain"
# Subdomain enumeration
echo "[+] Enumerating subdomains..."
subfinder -d "$domain" -o "$output_dir/subdomains/subfinder.txt"
assetfinder --subs-only "$domain" > "$output_dir/subdomains/assetfinder.txt"
amass enum -d "$domain" -o "$output_dir/subdomains/amass.txt"
github-subdomains -d "$domain" -t /path/to/github_token -o "$output_dir/subdomains/github_subdomains.txt"
chaos -d "$domain" -o "$output_dir/subdomains/chaos.txt"
# List the per-tool files explicitly so a re-run does not fold all_subdomains.txt into its own input
sort -u "$output_dir"/subdomains/{subfinder,assetfinder,amass,github_subdomains,chaos}.txt > "$output_dir/subdomains/all_subdomains.txt"
# DNS enumeration
echo "[+] Performing DNS enumeration..."
dnsenum $domain --noreverse -o $output_dir/dns/dns_enum.txt
dnsrecon -d $domain -t std,brt -c $output_dir/dns/dnsrecon.csv
# Resolve IP addresses
echo "[+] Resolving IP addresses..."
cat $output_dir/subdomains/all_subdomains.txt | dnsx -a -resp-only -o $output_dir/ip_addresses/resolved_ips.txt
# Reverse DNS lookup
echo "[+] Performing reverse DNS lookup..."
for ip in $(cat "$output_dir/ip_addresses/resolved_ips.txt"); do
# Only keep actual PTR answers; "not found" responses have no "pointer" field
host "$ip" | awk '/pointer/ {print $5}' >> "$output_dir/ip_addresses/reverse_dns.txt"
done
# Port scanning
echo "[+] Scanning ports..."
nmap -iL $output_dir/ip_addresses/resolved_ips.txt -p- -sV -sC -oN $output_dir/ports/nmap_full_scan.txt
masscan -iL $output_dir/ip_addresses/resolved_ips.txt -p1-65535 --rate=1000 -oG $output_dir/ports/masscan_results.txt
rustscan -a $output_dir/ip_addresses/resolved_ips.txt --ulimit 5000 -- -sV -sC -oN $output_dir/ports/rustscan_results.txt
# Web probing
echo "[+] Probing for web servers..."
cat $output_dir/subdomains/all_subdomains.txt | httpx -o $output_dir/content/live_subdomains.txt
# WAF detection
echo "[+] Detecting WAF..."
wafw00f -i $output_dir/content/live_subdomains.txt -o $output_dir/content/waf_detection.txt
# Screenshots
echo "[+] Taking screenshots..."
gowitness file -f $output_dir/content/live_subdomains.txt -P $output_dir/screenshots/
# Content discovery
echo "[+] Discovering content..."
# httpx writes full URLs, so use them directly and strip the scheme for filenames
for url in $(cat "$output_dir/content/live_subdomains.txt"); do
name="${url#*://}"
ffuf -w /path/to/wordlist.txt -u "${url}/FUZZ" -mc 200,204,301,302,307,401,403 -o "$output_dir/content/ffuf_${name}.json"
dirsearch -u "$url" -o "$output_dir/content/dirsearch_${name}.txt"
done
# Vulnerability scanning
echo "[+] Scanning for vulnerabilities..."
nuclei -l $output_dir/content/live_subdomains.txt -o $output_dir/vulnerabilities/nuclei_results.txt
nikto -h $output_dir/content/live_subdomains.txt -output $output_dir/vulnerabilities/nikto_results.txt
# Git exposure check
echo "[+] Checking for exposed .git directories..."
for url in $(cat "$output_dir/content/live_subdomains.txt"); do
curl -s "$url/.git/HEAD" | grep -q "ref:" && echo "$url/.git" >> "$output_dir/vulnerabilities/exposed_git.txt"
done
# S3 bucket enumeration
echo "[+] Enumerating S3 buckets..."
for subdomain in $(cat $output_dir/subdomains/all_subdomains.txt); do
s3scanner -bucket "$subdomain" >> $output_dir/vulnerabilities/s3_buckets.txt
done
# JavaScript analysis
echo "[+] Analyzing JavaScript files..."
for url in $(cat "$output_dir/content/live_subdomains.txt"); do
name="${url#*://}"
getJS -url "$url" --output "$output_dir/content/js_files_${name}.txt"
# Fetch each discovered JS file and grep its contents (not just the URL list) for secrets
while read -r js; do
curl -s "$js" | grep -Ei "api\.|token|key|secret|password|aws|azure|gcp" >> "$output_dir/vulnerabilities/js_secrets_${name}.txt"
done < "$output_dir/content/js_files_${name}.txt"
linkfinder -i "$url" -o "$output_dir/content/linkfinder_${name}.txt"
done
# SSL/TLS analysis
echo "[+] Analyzing SSL/TLS..."
# sslyze and testssl.sh expect a hostname, not a URL
for url in $(cat "$output_dir/content/live_subdomains.txt"); do
host="${url#*://}"
sslyze "$host" >> "$output_dir/vulnerabilities/sslyze_results.txt"
testssl.sh "$host" >> "$output_dir/vulnerabilities/testssl_results.txt"
done
# CORS misconfiguration check
echo "[+] Checking for CORS misconfigurations..."
for url in $(cat "$output_dir/content/live_subdomains.txt"); do
curl -s -I -H "Origin: https://evil.com" "$url" | grep -i "Access-Control-Allow-Origin: https://evil.com" && echo "CORS misconfiguration found at $url" >> "$output_dir/vulnerabilities/cors_misconfig.txt"
done
# Technology stack identification
echo "[+] Identifying technology stack..."
for url in $(cat "$output_dir/content/live_subdomains.txt"); do
whatweb "$url" >> "$output_dir/technologies/whatweb_results.txt"
wappalyzer "$url" --pretty >> "$output_dir/technologies/wappalyzer_results.json"
done
# Email harvesting
echo "[+] Harvesting email addresses..."
theHarvester -d $domain -b all -f $output_dir/emails/theharvester_results.txt
# Subdomain takeover check
echo "[+] Checking for subdomain takeover..."
subjack -w $output_dir/subdomains/all_subdomains.txt -t 100 -timeout 30 -o $output_dir/vulnerabilities/subjack_results.txt -ssl
# DNS zone transfer attempt
echo "[+] Attempting DNS zone transfer..."
for ns in $(dig +short NS "$domain"); do
ns="${ns%.}"
dig @"$ns" "$domain" AXFR > "$output_dir/dns/zone_transfer_${ns}.txt"
done
# Certificate transparency logs
echo "[+] Checking certificate transparency logs..."
ct_logs=$(curl -s "https://crt.sh/?q=%25.$domain&output=json" | jq -r '.[].name_value' | sort -u)
echo "$ct_logs" > $output_dir/certificates/ct_logs.txt
# Fuzzing for virtual hosts
echo "[+] Fuzzing for virtual hosts..."
for ip in $(cat $output_dir/ip_addresses/resolved_ips.txt); do
ffuf -w $output_dir/subdomains/all_subdomains.txt -u "http://$ip" -H "Host: FUZZ" -fc 404 -o $output_dir/content/vhost_fuzzing_${ip}.json
done
# API endpoint discovery
echo "[+] Discovering API endpoints..."
for url in $(cat "$output_dir/content/live_subdomains.txt"); do
name="${url#*://}"
ffuf -w /path/to/api_wordlist.txt -u "${url}/FUZZ" -mc 200,201,204,301,302,307,401,403 -o "$output_dir/content/api_discovery_${name}.json"
done
# Wayback machine crawling
echo "[+] Crawling Wayback Machine..."
waybackurls $domain | sort -u > $output_dir/content/wayback_urls.txt
# SQLi Scanning
echo "[+] Scanning for SQL injection vulnerabilities..."
sqlmap -m "$output_dir/content/wayback_urls.txt" --batch --random-agent --level 1 --risk 1 --output-dir="$output_dir/vulnerabilities/sqlmap"
# XSS Scanning
echo "[+] Scanning for XSS vulnerabilities..."
xsser --url "https://$domain" --auto --threads 10 --xml="$output_dir/vulnerabilities/xsser_results.xml"
# Open redirect scanning
echo "[+] Checking for open redirects..."
cat $output_dir/content/wayback_urls.txt | grep -E '(=|%3D)https?%3A%2F%2F' | qsreplace 'https://evil.com' | httpx -silent -status-code -location -json -o $output_dir/vulnerabilities/open_redirects.json
# SSRF scanning
echo "[+] Scanning for SSRF vulnerabilities..."
cat $output_dir/content/wayback_urls.txt | grep "=" | qsreplace "http://169.254.169.254/latest/meta-data/" | httpx -silent -status-code -json -o $output_dir/vulnerabilities/potential_ssrf.json
# GraphQL introspection
echo "[+] Checking for GraphQL endpoints and introspection..."
for url in $(cat "$output_dir/content/live_subdomains.txt"); do
name="${url#*://}"
graphql-cop -t "$url/graphql" -o json > "$output_dir/vulnerabilities/graphql_${name}.json"
done
# CRLF injection
echo "[+] Checking for CRLF injection..."
crlfuzz -l $output_dir/content/live_subdomains.txt -o $output_dir/vulnerabilities/crlf_results.txt
# XML external entity (XXE) injection
echo "[+] Scanning for XXE vulnerabilities..."
cat $output_dir/content/live_subdomains.txt | xmlrpcscan -server "collaborator_url.net" > $output_dir/vulnerabilities/xxe_results.txt
# Server-side template injection (SSTI)
echo "[+] Checking for SSTI vulnerabilities..."
cat $output_dir/content/wayback_urls.txt | grep "=" | qsreplace "{{7*7}}" | httpx -silent -status-code -content-length -json -o $output_dir/vulnerabilities/potential_ssti.json
# WebSocket security
echo "[+] Analyzing WebSocket security..."
for url in $(cat "$output_dir/content/live_subdomains.txt"); do
host="${url#*://}"
wscat -c "wss://$host" --execute 'ping' | tee "$output_dir/vulnerabilities/websocket_${host}.txt"
done
# CORS misconfiguration (extended)
echo "[+] Extended CORS misconfiguration check..."
corscanner -i $output_dir/content/live_subdomains.txt -o $output_dir/vulnerabilities/cors_extended.txt
# HTTP request smuggling
echo "[+] Checking for HTTP request smuggling..."
smuggler -u "https://$domain" -t 20 -o $output_dir/vulnerabilities/http_smuggling.txt
# Web cache poisoning
echo "[+] Scanning for web cache poisoning vulnerabilities..."
cacheblaster -u "https://$domain" -o $output_dir/vulnerabilities/cache_poisoning.txt
# CSTI (Client-Side Template Injection)
echo "[+] Checking for CSTI vulnerabilities..."
csti-scanner -D "https://$domain" -o $output_dir/vulnerabilities/csti_results.txt
echo "[+] Comprehensive reconnaissance and vulnerability scanning completed. Results saved in $output_dir"
Script Sections and Functionality
Subdomain Enumeration 🔍
Purpose: Identifying subdomains is essential for mapping the attack surface of a domain.
Tools Used:
- subfinder: Fast subdomain discovery.
- assetfinder: Finds subdomains using various sources.
- amass: Advanced subdomain enumeration.
- github-subdomains: Extracts subdomains from GitHub repositories.
- chaos: Discovers subdomains by querying ProjectDiscovery's Chaos dataset.
Expected Output: A consolidated list of discovered subdomains, saved for further analysis.
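The consolidation step at the end of this stage can be made re-run-safe and case-insensitive by merging the per-tool files explicitly instead of globbing the whole directory. A minimal sketch (the `merge_subdomains` name and the file names in the usage line are illustrative, not part of the script):

```shell
#!/bin/bash
# Merge per-tool subdomain lists into one deduplicated file.
# Normalizes to lowercase and strips trailing dots so the same name
# reported by different tools collapses to a single entry.
merge_subdomains() {
    out="$1"; shift
    cat "$@" | tr 'A-Z' 'a-z' | sed 's/\.$//' | sort -u > "$out"
}

# Usage (file names illustrative):
# merge_subdomains all_subdomains.txt subfinder.txt assetfinder.txt amass.txt
```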
DNS Enumeration 🌐
Purpose: Gather information about DNS records to reveal additional details about the domain’s infrastructure.
Tools Used:
- dnsenum: Collects DNS records and performs DNS queries.
- dnsrecon: Provides detailed DNS reconnaissance including zone transfers.
Expected Output: Detailed DNS records and zone transfer attempts, highlighting hidden aspects of the domain’s DNS configuration.
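Zone transfer attempts usually fail, so it helps to flag the rare success automatically rather than reading every saved file. A small sketch, assuming dig's usual ";; XFR size:" footer on a successful AXFR (the `axfr_succeeded` helper name is hypothetical):

```shell
#!/bin/bash
# Flag successful zone transfers among saved `dig ... AXFR` outputs.
# dig appends an ";; XFR size: N records" footer only when the transfer
# actually completed, so its presence is a reasonable success signal.
axfr_succeeded() {
    grep -q ';; XFR size:' "$1"
}

# Usage (path illustrative):
# for f in recon_example.com/dns/zone_transfer_*.txt; do axfr_succeeded "$f" && echo "$f"; done
```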
IP Resolution and Reverse Lookup 🌐
Purpose: Resolve subdomains to IP addresses and perform reverse DNS lookups to identify associated infrastructure.
Tools Used:
- dnsx: Resolves subdomains to their A records.
- host: Performs reverse DNS (PTR) lookups on the resolved IPs.
Expected Output: A list of resolved IP addresses and associated domain names.
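Resolver output can contain blank lines or CNAME targets alongside plain addresses, and feeding those to a reverse lookup wastes queries. A defensive sketch that validates candidates first (`is_ipv4` is an illustrative helper, not part of the script):

```shell
#!/bin/bash
# Return success only for well-formed dotted-quad IPv4 addresses:
# exactly four fields, each numeric and at most 255.
is_ipv4() {
    echo "$1" | awk -F'.' 'NF == 4 {
        for (i = 1; i <= 4; i++)
            if ($i !~ /^[0-9]+$/ || $i > 255) exit 1
        exit 0
    }
    NF != 4 { exit 1 }'
}
```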
Port Scanning 🕵️♂️
Purpose: Identify open ports and services to understand potential attack vectors.
Tools Used:
- nmap: Versatile network discovery and security auditing.
- masscan: High-speed port scanner.
- rustscan: Modern port scanner with Nmap integration.
Expected Output: Scan results detailing open ports and services.
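Masscan's grepable output is convenient to post-process into host:port pairs that nmap or httpx can consume for targeted follow-up. A sketch that assumes masscan's usual -oG line shape (the `masscan_to_hostports` name is illustrative):

```shell
#!/bin/bash
# Turn masscan grepable output (-oG) into host:port lines. Assumes lines like:
#   Host: 10.0.0.5 ()  Ports: 8080/open/tcp//http-alt//
masscan_to_hostports() {
    awk '/Ports:/ {
        for (i = 1; i < NF; i++) {
            if ($i == "Host:") host = $(i + 1)
            if ($i == "Ports:") { split($(i + 1), p, "/"); print host ":" p[1] }
        }
    }' "$1"
}
```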
Web Probing and WAF Detection 🌐
Purpose: Check for live web applications and detect protective mechanisms like Web Application Firewalls (WAFs).
Tools Used:
- httpx: Probes subdomains for live HTTP/HTTPS servers.
- wafw00f: Fingerprints Web Application Firewalls in front of those servers.
Expected Output: A list of live subdomains and WAF detection results.
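One practical detail: httpx writes full URLs (https://sub.example.com) into live_subdomains.txt, while some later tools (sslyze, testssl.sh, wscat) expect a bare hostname. A small conversion helper using only parameter expansion (the name is illustrative):

```shell
#!/bin/bash
# Reduce an httpx-style URL to a bare hostname: drop the scheme, any path,
# and any explicit port. Inputs that are already bare hostnames pass through.
url_to_host() {
    host="${1#*://}"             # drop scheme, if present
    host="${host%%/*}"           # drop path
    printf '%s\n' "${host%%:*}"  # drop port
}
```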
Screenshots and Content Discovery 📸
Purpose: Capture screenshots and discover web content to gain visual and contextual insights.
Tools Used:
- gowitness: Captures web page screenshots.
- ffuf: Fuzzing tool for hidden directories and files.
- dirsearch: Brute-forces directories and files.
Expected Output: Screenshots of web pages and lists of discovered directories and files.
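The per-target output names (ffuf_<target>.json, dirsearch_<target>.txt) must not contain "/" or ":", so it is worth slugifying the probed URL before using it in a filename. A sketch (helper name illustrative):

```shell
#!/bin/bash
# Map a URL to a filesystem-safe slug for per-target output files:
# strip the scheme, then replace path and port separators with underscores.
url_to_filename() {
    printf '%s\n' "$1" | sed -e 's~^[a-zA-Z]*://~~' -e 's~[/:]~_~g'
}
```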
Vulnerability Scanning 🔎
Purpose: Identify known vulnerabilities in web applications and services.
Tools Used:
- nuclei: Customizable vulnerability detection.
- nikto: Web server vulnerability scanner.
- sqlmap: SQL injection testing.
- xsser: XSS vulnerability scanning.
- subjack: Subdomain takeover checks.
Expected Output: Detailed vulnerability scan results, highlighting potential security issues.
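A severity tally makes the nuclei results much faster to triage. A sketch that assumes nuclei's default plain-text line format, "[template-id] [protocol] [severity] target" (the helper name is illustrative, and the format may differ across nuclei versions):

```shell
#!/bin/bash
# Count nuclei findings per severity by splitting each line on the
# bracket characters and taking the third bracketed field.
nuclei_severity_counts() {
    awk -F'[][]' 'NF >= 6 { count[$6]++ } END { for (s in count) print s, count[s] }' "$1"
}
```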
Additional Checks 🔍
Purpose: Cover a wide range of potential issues beyond core vulnerabilities.
Tools Used:
- s3scanner: Detects exposed Amazon S3 buckets.
- getJS: Extracts and analyzes JavaScript files.
- sslyze and testssl.sh: Assess SSL/TLS configurations.
- corscanner: Checks for CORS misconfigurations.
- smuggler: Tests for HTTP request smuggling vulnerabilities.
- cacheblaster: Scans for web cache poisoning vulnerabilities.
- csti-scanner: Detects client-side template injection (CSTI).
- xmlrpcscan: Scans for XML external entity (XXE) injection.
- crlfuzz: Checks for CRLF injection vulnerabilities.
- wscat: Analyzes WebSocket security.
- graphql-cop: Audits GraphQL endpoints for common vulnerabilities.
Expected Output: Comprehensive results covering additional vulnerabilities and security issues.
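The CORS probe in the script boils down to one question: does the response reflect an attacker-controlled Origin? Factoring that into a predicate makes it testable offline against saved headers. A sketch (helper name illustrative; the live probe would pipe curl's -I output into a file first):

```shell
#!/bin/bash
# Check a saved response-header capture for a reflected attacker Origin.
# The live capture would be something like:
#   curl -s -I -H "Origin: https://evil.com" "$url" > headers.txt
cors_reflects_origin() {
    grep -qi '^Access-Control-Allow-Origin: *https://evil\.com' "$1"
}
```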
Certificate Transparency Logs and Wayback Machine Crawling 📜
Purpose: Retrieve historical data and past issues from certificate transparency logs and the Wayback Machine.
Tools Used:
- crt.sh: Certificate transparency logs.
- waybackurls: Historical URLs from the Wayback Machine.
Expected Output: Certificate transparency logs and historical URLs.
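crt.sh results routinely include wildcard entries such as *.example.com, which resolvers cannot use directly. A small cleanup pass before feeding the list onward (helper name illustrative):

```shell
#!/bin/bash
# Strip wildcard prefixes from crt.sh name_value entries and deduplicate,
# so the list feeds cleanly into dnsx and similar tools.
clean_ct_names() {
    sed 's/^\*\.//' "$1" | sort -u
}
```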
Prerequisites and Setup 🛠️
Before running the script, ensure you have the following:
Tool Installation
Install the required tools using package managers or from their official sources. Refer to each tool’s documentation for installation instructions.
Configuration
- API Keys and Tokens: Configure API keys for tools like github-subdomains and chaos.
- Wordlists: Update the placeholder paths (/path/to/wordlist.txt, /path/to/api_wordlist.txt) used by ffuf and dirsearch.
Permissions
Obtain explicit authorization before scanning any domain to comply with legal and ethical standards.
Performance Optimization 🚀
To enhance performance:
Parallelism
Run tools like nmap, masscan, and rustscan in parallel to speed up the scanning process.
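A simple way to get bounded parallelism without extra dependencies is xargs -P. In this sketch the echo stands in for the real per-host command (nmap, httpx, and so on), and `scan_parallel` is an illustrative name:

```shell
#!/bin/bash
# Run a per-host command for every line of a host list, at most $2 at a time.
scan_parallel() {
    list="$1"
    jobs="$2"
    # The inner sh receives each host as its own $1; replace the echo
    # with the real scan command.
    xargs -P "$jobs" -I{} sh -c 'echo "scanning $1"' _ {} < "$list"
}
```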
Rate Limiting
Adjust the scanning rate in tools like masscan to balance speed and network load.
Resource Allocation
Monitor system resources to avoid overloading. Adjust tool configurations as needed.
Troubleshooting 🛠️
Address common issues:
Tool Errors
- Installation Issues: Verify tools are installed correctly.
- Configuration Errors: Check configuration files and API keys.
Script Failures
- Permissions: Ensure necessary permissions for script execution and domain access.
- Dependencies: Verify required tools and libraries are present.
Output Verification
Ensure output files are saved correctly and validate results.
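A quick sanity pass after the run: any zero-byte result file usually means a tool crashed, was rate-limited, or was not installed. For example (helper name illustrative):

```shell
#!/bin/bash
# List empty result files under the output directory so failed stages
# are obvious at a glance.
find_empty_results() {
    find "$1" -type f -empty
}
```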
Acknowledgments 🙏
A special thank you to the creators and maintainers of the tools used in this script. Your contributions are invaluable to the cybersecurity community.
By following this guide, you’ll be equipped to perform a comprehensive security assessment with ease. Customize the script to fit your needs, and use it as a powerful tool for uncovering vulnerabilities and enhancing your security posture. Happy scanning! 🕵️♂️🔍
Thank you
God bless you