It performs “black-box” scans (it does not study the source code) of the web application by crawling the webpages of the deployed web app, looking for scripts and forms where it can inject data.
What’s new in Wapiti 3.0.3? Take a look here.
Wapiti can detect the following vulnerabilities:
- File disclosure (Local and remote include/require, fopen, readfile…)
- Database Injection (PHP/JSP/ASP SQL Injections and XPath Injections)
- XSS (Cross Site Scripting) injection (reflected and permanent)
- Command Execution detection (eval(), system(), passthru()…)
- CRLF Injection (HTTP Response Splitting, session fixation…)
- XXE (XML External Entity) injection
- SSRF (Server Side Request Forgery)
- Use of known potentially dangerous files (thanks to the Nikto database)
- Weak .htaccess configurations that can be bypassed
- Presence of backup files giving sensitive information (source code disclosure)
- Shellshock (aka Bash bug)
- Open Redirects
- Uncommon HTTP methods that may be allowed (e.g. PUT)
A buster module also allows brute-forcing directory and file names on the target web server.
Wapiti supports both GET and POST HTTP methods for attacks.
It also supports multipart forms and can inject payloads in filenames (upload).
Warnings are raised when an anomaly is found (for example, 500 errors and timeouts).
Wapiti is able to distinguish permanent from reflected XSS vulnerabilities.
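A basic scan might look like the following. The target URL is hypothetical, and flag names should be checked against `wapiti --help` for your installed version:

```shell
# Scan a target, restricting attacks to the SQL injection and XSS modules,
# and write an HTML report into the "report" directory.
wapiti -u http://example.com/ -m "sql,xss" -f html -o report
```

Omitting `-m` runs the default set of attack modules.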
General features:
- Generates vulnerability reports in various formats (HTML, XML, JSON, TXT…)
- Can suspend and resume a scan or an attack (session mechanism using sqlite3 databases)
- Colored output in the terminal to highlight vulnerabilities
- Different levels of verbosity
- Fast and easy way to activate/deactivate attack modules
- Adding a payload can be as easy as adding a line to a text file
- Supports HTTP, HTTPS and SOCKS5 proxies
- Authentication via several methods: Basic, Digest, Kerberos or NTLM
- Ability to restrain the scope of the scan (domain, folder, page, url)
- Automatic removal of one or more parameters in URLs
- Multiple safeguards against endless scan loops (for example, a limit on the number of values per parameter)
- Possibility to set the first URLs to explore (even if not in scope)
- Can exclude some URLs from the scan and attacks (e.g. a logout URL)
- Import of cookies (get them with the wapiti-getcookie tool)
- Can activate / deactivate SSL certificate verification
- Extract URLs from Flash SWF files
- HTML5 aware (understands recent HTML tags)
- Several options to control crawler behavior and limits
- Can skip some parameter names during attacks
- Can set a maximum duration for the scan process
- Can add custom HTTP headers or set a custom User-Agent
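Several of the features above can be combined in one invocation. The sketch below uses a hypothetical target and proxy; verify the exact flag names and the unit expected by `--max-scan-time` with `wapiti --help` on your version:

```shell
# Restrict the crawl to a folder, exclude the logout URL, route traffic
# through a local proxy, authenticate with Basic auth (credentials are
# separated by '%'), send a custom header, and cap the scan duration.
wapiti -u http://example.com/app/ \
       --scope folder \
       -x "http://example.com/app/logout.php" \
       -p http://127.0.0.1:8080/ \
       -a user%password --auth-type basic \
       -H "X-Scanner: wapiti" \
       --max-scan-time 30
```

If the scan is interrupted, relaunching the same command resumes it thanks to the sqlite3-based session mechanism.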