Fast, thorough, XSS/SQLi spider. Give it a URL and it'll test every link it finds for cross-site scripting and some SQL injection vulnerabilities. See the FAQ for more details about SQLi detection.

From within the main folder run:

./xsscrapy.py -u http://example.com

If you wish to login then crawl:

./xsscrapy.py -u http://example.com/login_page -l loginname

If you wish to login with HTTP Basic Auth then crawl:

./xsscrapy.py -u http://example.com/login_page -l loginname --basic

If you wish to use cookies:

./xsscrapy.py -u http://example.com/login_page --cookie "SessionID=abcdef1234567890"

If you wish to limit simultaneous connections to 20:

./xsscrapy.py -u http://example.com -c 20

If you want to rate limit to 60 requests per minute:

./xsscrapy.py -u http://example.com/ -r 60

XSS vulnerabilities are reported in xsscrapy-vulns.txt.
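
To give a sense of what "testing a link for XSS" means in practice, here is a minimal, illustrative sketch of a reflected-XSS probe. This is not xsscrapy's actual code: the probe_reflected_xss function, the marker string, and the example URL are all made up for illustration, and it uses the third-party requests library. The idea is to inject a unique marker containing characters that should be HTML-encoded into each query parameter in turn, then flag any response that reflects the marker back unescaped:

# Illustrative reflected-XSS probe -- a rough sketch of the idea, not xsscrapy's code.
import requests
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Unique marker plus characters that a safe page would HTML-encode on output.
MARKER = 'xss9zqjx"\'<>'

def probe_reflected_xss(url):
    # Inject MARKER into each query parameter in turn; report unescaped reflections.
    parts = urlsplit(url)
    params = parse_qsl(parts.query, keep_blank_values=True)
    findings = []
    for i, (name, _value) in enumerate(params):
        tampered = [(n, MARKER if j == i else v) for j, (n, v) in enumerate(params)]
        test_url = urlunsplit(parts._replace(query=urlencode(tampered)))
        body = requests.get(test_url, timeout=10).text
        if MARKER in body:  # marker came back with its quotes/angle brackets intact
            findings.append((name, test_url))
    return findings

if __name__ == '__main__':
    # Hypothetical target; only scan hosts you are authorized to test.
    for param, test_url in probe_reflected_xss('http://example.com/search?q=test&page=1'):
        print('Possible reflected XSS via parameter %r: %s' % (param, test_url))

xsscrapy itself does far more (crawling every discovered link, handling logins, cookies, and rate limits as shown above), so treat this only as a conceptual outline of the kind of check being run.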
To install the Python dependencies:

wget -O get-pip.py https://bootstrap.pypa.io/get-pip.py
python get-pip.py
pip install -r requirements.txt

May need additional libraries depending on OS:

libxml2
libxslt
zlib
libffi
openssl (sometimes libssl-dev)
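
As a quick, optional sanity check that the packages built against those system libraries installed correctly, you can try importing them from Python. This assumes lxml and pyOpenSSL end up in the dependency tree installed by requirements.txt (they are the usual reason libxml2/libxslt and openssl/libffi are needed); adjust the imports to whatever requirements.txt actually pulls in:

# Optional sanity check that the native-library-backed Python packages import cleanly.
# Assumes lxml and pyOpenSSL were installed as part of the requirements; adjust as needed.
import lxml.etree
import OpenSSL
print('lxml built against libxml2 %s' % '.'.join(map(str, lxml.etree.LIBXML_VERSION)))
print('pyOpenSSL %s' % OpenSSL.__version__)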
If you see an error like:

ImportError: cannot import name LinkExtractor

it means that you don't have the latest version of scrapy. You can upgrade it with:

sudo pip install --upgrade scrapy
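
A quick way to confirm what you have before re-running the spider is to print the installed Scrapy version and attempt the import yourself; the two import paths below are assumptions about newer versus older Scrapy releases:

# Print the installed Scrapy version and try the LinkExtractor import.
# The exact module path depends on the Scrapy release; older versions used scrapy.contrib.
import scrapy
print(scrapy.__version__)
try:
    from scrapy.linkextractors import LinkExtractor      # newer Scrapy releases
except ImportError:
    from scrapy.contrib.linkextractors import LinkExtractor  # older releases
print(LinkExtractor)

If both imports fail, your Scrapy predates LinkExtractor entirely and the upgrade above is the fix.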
Copyright (c) 2014, Dan McInerney
All rights reserved.

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY
DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.