Note before I begin: I don't do formal "reviews," but you know how excited I get over cool tools. I found something in this one that I thought you might benefit from...
What started out as an overly simplistic scanner a few years ago has really become a contender in the Web vulnerability scanner market - especially for the price. In addition to rooting out a lot of XSS (even more than other tools can find), the latest version of Acunetix WVS can also check for blind SQL injection (something that's tedious to the point of being impractical to do manually - at least within reason), and it also runs a port scan of the Web system to check for other services that may be vulnerable.
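To give you a feel for why blind SQL injection is so painful to test by hand, here's a rough sketch of the kind of time-based check a scanner automates. This is my own illustration, not how Acunetix actually works internally - the target URL and parameter are made up, and the SLEEP() payload assumes a MySQL-style backend:

```python
import time
import urllib.parse
import urllib.request

SLEEP_SECONDS = 5  # delay we try to inject via the payload


def looks_time_blind(baseline_secs, payload_secs, delay=SLEEP_SECONDS):
    """If the payloaded request takes roughly the injected delay longer
    than the baseline request, the SLEEP() probably ran server-side -
    the classic time-based blind SQL injection signal."""
    return payload_secs - baseline_secs >= delay * 0.8  # allow some jitter


def timed_get(url):
    """Fetch a URL and return how long the response took, in seconds."""
    start = time.monotonic()
    urllib.request.urlopen(url, timeout=30).read()
    return time.monotonic() - start


if __name__ == "__main__":
    # Hypothetical target - substitute a site you're authorized to test.
    base = "http://example.test/product?id="
    benign = timed_get(base + "1")
    payload = urllib.parse.quote("1 AND SLEEP(%d)" % SLEEP_SECONDS)
    delayed = timed_get(base + payload)
    if looks_time_blind(benign, delayed):
        print("parameter looks injectable (time-based blind)")
```

Now imagine doing that comparison by hand for every parameter of every page, with retries to rule out network jitter - that's the grunt work a scanner takes off your plate.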
In some scans I ran it found FTP, SSH, and Windows Terminal Services open... Good things to know - and skipping them is a *big* oversight in Web security assessments. You can't just look at layer 7 alone and assume you've done enough.
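If you want a quick sanity check of your own before (or alongside) a full scan, a basic TCP connect check against the usual suspects takes a few lines. This is just my own minimal sketch, not a substitute for a real port scanner - the host and port list are up to you:

```python
import socket

# The services I keep finding left open on "Web-only" boxes.
COMMON_PORTS = {21: "FTP", 22: "SSH", 3389: "Windows Terminal Services"}


def scan(host, ports=COMMON_PORTS, timeout=1.0):
    """Return (port, service) pairs that accept a TCP connection."""
    found = []
    for port, name in sorted(ports.items()):
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            # connect_ex() returns 0 on success instead of raising.
            if s.connect_ex((host, port)) == 0:
                found.append((port, name))
    return found
```

Anything that comes back open here deserves the same scrutiny as the Web app itself.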
The graphic below doesn't do it much justice but Acunetix WVS has a pretty nice interface and is as easy to use as any scanning tool I've come across.
I've always said that, in most situations, you get what you pay for and this tool supports my theory. It's way better than any open source or freeware tool out there and really can hold its own with some of the bigger players.
If you already own one of the higher-end scanners but want to look at your Web systems from yet another perspective, then this would be a good route to go: a limited investment, and based on what I've seen comparing it to some other popular tools, you're pretty much guaranteed to find new/different/bigger Web vulnerabilities. It worked for me. It won't find everything (none of them will), but it finds a lot.
Now that I've painted a rosy picture, there are a few things about the program I'm not crazy about:
- You can't run multiple scans at once, either within the same session or by starting a new instance of the program (it won't let you).
- It flags certain lower-priority issues (in my mind), such as SSL v2.0 being enabled, as higher-priority ones. Most scanners do this. I'm just saying that there are many more things to worry about than someone having the tools and expertise to decrypt Web data in transit - and then actually getting anything valuable out of it - which is what exploiting SSL v2.0 would take. Anyway, this is why you have to take your scan results with a grain of salt, do manual testing of your Web sites/apps, and find/fix what matters in your context and your environment.
- You can't change the maximum number of parallel connections (i.e., scanner threads, to speed up or slow down the scan) while a scan is running. :-(
- You can only keep the scanner from submitting form input during the crawl phase - not the scanning phase. This is a GREAT way to tick off a bunch of people who'll undoubtedly receive thousands of emails when forms not protected by CAPTCHA or some other mechanism get submitted over and over again during the scan.
- There's no easy way to work interactively and load/view the vulnerabilities it finds within a browser. You have to click on each finding, switch to another window, and then copy/paste the URL into your browser.