Installation notes for several open source security scanners, with simple example scans.
[UPDATE 2016]: you may want to give Kali Linux a try for web app vulnerability testing.
Software
Software used in this article:
- Ubuntu 14.04 LTS
- Lynis 1.6.4
- Nmap 6.40
- Nikto 2.1.5
- Wapiti 2.3.0
- w3af
- Arachni 1.0.2-0.5.3
- Skipfish v2.10b
Before We Begin
You can get the script that installs everything in one go from: https://github.com/lisenet/security-scripts-for-linux
$ git clone https://github.com/lisenet/security-scripts-for-linux.git
$ bash ./security-scripts-for-linux/sec-tools-installer.sh
Create a directory to store installation files:
$ mkdir /home/"$USER"/bin
Lynis (Community Edition v1.6.4)
Lynis is an open source security auditing tool for Unix and Linux based systems. Its primary goal is to perform a quick security scan on a system and determine room for improvement.
It is recommended to download the newest Lynis version.
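If you simply want something that works out of the box, Lynis can also be installed from the Ubuntu repositories. Note this is a sketch, and the packaged release will likely be older than the 1.6.4 version used in this article:
$ sudo apt-get update && sudo apt-get install lynis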
Usage
Several options available for scans:
- --man: view the man page.
- -c: perform a full check of the system, printing out the results of each test to stdout.
- -Q: perform a quick scan and do not wait for user input.
- --logfile: define the location and name of the log file, instead of the default /var/log/lynis.log.
- --check-update: check for updates.
- --pentest: run a non-privileged scan. Some of the tests will be skipped if they require root permissions.
We may need root privileges to run a full security audit:
$ sudo lynis -c -Q --logfile /tmp/scan-lynis.txt
Check the output file for any warnings and/or suggestions:
$ sudo egrep -i 'warning|suggestion' /tmp/scan-lynis.txt
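The --pentest option can be combined with the same flags to run a non-privileged scan. A minimal sketch, assuming lynis is on the PATH and using a hypothetical log file name:
$ lynis -c -Q --pentest --logfile /tmp/scan-lynis-pentest.txt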
Nmap
Nmap is an open source port scanner and network exploration tool. It can be used for network discovery and security auditing.
Installation from Repositories
$ sudo apt-get update && sudo apt-get install nmap
Usage
Treat target hosts (localhost here) as online (-Pn), scan only the standard SSH, HTTP, HTTPS, MSSQL, MySQL and RDP ports (-p), use the aggressive (-T4) timing template, and probe open ports to determine service and version info (-sV). Output the scan in normal format (-oN) and show only possibly open ports (--open).
$ nmap -Pn -p T:22,80,443,1433,3306,3389 -sV -T4 --open -oN /tmp/scan-nmap.txt localhost
Several scan techniques and many other scan options are available; check the nmap man page for more info.
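As one illustrative sketch, a TCP SYN scan of the 100 most common ports (-sS requires root, and the output file name is arbitrary):
$ sudo nmap -sS -sV -T4 --top-ports 100 --open -oN /tmp/scan-nmap-top100.txt localhost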
Nikto (v2.1.5)
Nikto is an open source web server scanner which performs comprehensive tests against web servers for multiple items, checking for outdated server versions and version-specific problems.
Installation from Tarball
Install prerequisites:
$ sudo apt-get install perl perl-modules libnet-ssleay-perl libwhisker2-perl openssl
Download the newest Nikto version and install the package:
$ cd /home/"$USER"/bin
$ wget http://cirt.net/nikto/nikto-2.1.5.tar.gz
$ tar xvfz ./nikto-2.1.5.tar.gz
$ mv ./nikto-2.1.5 ./nikto
$ chown -R "$USER":"$USER" ./nikto
$ chmod u+x ./nikto/nikto.pl
$ ./nikto/nikto.pl -update
$ cd ./nikto
Usage
Several options available for scans:
- -H: view an extended help page.
- -h: host to scan.
- -p: TCP port(s) to use for a scan.
- -maxtime: maximum execution time per host, in seconds. Accepts minutes and hours, such that all of these are one hour: 3600s, 60m, 1h.
- -ssl: only test SSL on the ports specified.
- -nossl: do not use SSL to connect to the server.
- -F: save the output file specified with the -o (-output) option in this format.
- -o: write output to the file specified.
- -t: seconds to wait before timing out a request. The default timeout is 10 seconds.
- -T: tuning options to control the tests that Nikto will use against a target. By default, all tests are performed.
- -update: update the plugins and databases directly from cirt.net.
Tuning options:
0 - File Upload
1 - Interesting File / Seen in logs
2 - Misconfiguration / Default File
3 - Information Disclosure
4 - Injection (XSS/Script/HTML)
5 - Remote File Retrieval - Inside Web Root
6 - Denial of Service
7 - Remote File Retrieval - Server Wide
8 - Command Execution / Remote Shell
9 - SQL Injection
a - Authentication Bypass
b - Software Identification
c - Remote Source Inclusion
x - Reverse Tuning Options (i.e., include all except specified). The given string is parsed from left to right; any x character applies to all characters to its right.
Scan localhost on port 443 with SSL only and send output to the text file /tmp/scan-nikto.txt. Use all tests except Denial of Service (x6):
$ ./nikto.pl -h localhost -p 443 -ssl -F txt -o /tmp/scan-nikto.txt -t 5 -T x6
Troubleshooting
If Nikto fails with the error "SSL support not available (see docs for SSL install)", install the missing Perl SSL bindings:
$ sudo apt-get install libnet-ssleay-perl
Wapiti (v2.3.0)
Wapiti is an open source web application vulnerability scanner. It can detect the following vulnerabilities:
- File handling errors (local and remote include/require, fopen, readfile).
- Database injection (PHP/JSP/ASP SQL Injections and XPath Injections).
- XSS (Cross Site Scripting) injection.
- LDAP injection.
- Command execution detection (eval(), system(), passthru()).
- CRLF injection (HTTP response splitting, session fixation).
Installation from Tarball
Install prerequisites:
$ sudo apt-get install python2.7 python2.7-dev python-requests python-ctypes python-beautifulsoup
Download the newest Wapiti version and install the package:
$ cd /home/"$USER"/bin
$ wget http://netcologne.dl.sourceforge.net/project/wapiti/wapiti/wapiti-2.3.0/wapiti-2.3.0.tar.gz
$ tar xvfz wapiti-2.3.0.tar.gz
$ mv ./wapiti-2.3.0 ./wapiti
$ chown -R "$USER":"$USER" ./wapiti
$ chmod u+x ./wapiti/bin/wapiti
$ cd ./wapiti/bin
Usage
To access the help page:
$ ./wapiti --help | less
Several options available for scans:
- -b: scope of the scan (page, folder or domain).
- -t: timeout to wait for the server to send a response.
- -n: a limit of URLs to browse with the same pattern.
- -u: use colours to highlight vulnerabilities and anomalies in output.
- -v: verbosity level, from 0 to 2.
- -f: report format type (txt, html etc).
- -o: the name of the report file, or directory if html.
- -i: resume the previous scan saved in the specified XML status file.
- -k: resume the attacks without scanning the website again, loading the scan status from the specified file.
- --verify-ssl: check and verify SSL certificates if set to 1, ignore if set to 0.
- -m: the modules (and HTTP methods for each module) to use for attacks.
Modules available:
- crlf: CRLF attack.
- exec: command execution attack.
- file: file handling attack.
- sql: error-based SQL injection attack.
- xss: cross site scripting attack.
- backup: backup attack.
- htaccess: htaccess attack, i.e. redirecting users coming from search engines to malware.
- blindsql: blind SQL injection attack.
- permanentxss: permanent cross site scripting attack.
- nikto: Nikto attack. Nikto databases are CSV files: http://cirt.net/nikto/UPDATES/2.1.5/db_tests
Start a scan against the localhost website, be verbose and use colours to highlight vulnerabilities:
$ ./wapiti http://localhost -v 2 -u
To only browse the target (without sending any payloads), deactivate every module with -m "-all":
$ ./wapiti http://localhost -v 2 -u -m "-all"
If we don’t specify the HTTP methods, GET and POST will be used. To only use the HTTP GET method:
$ ./wapiti http://localhost -v 2 -u "-all,all:get"
Scan the localhost website on a standard HTTPS port without verifying SSL certificates, and output to the /tmp/scan-wapiti.txt file:
$ ./wapiti https://localhost -n 1 -b folder -f txt -o /tmp/scan-wapiti.txt -v 2 -t 5 -u --verify-ssl 0 -m "-all,all:get,exec:post,-nikto"
If we cancel a running scan, we can resume it by passing the -i parameter. When we launch a scan against localhost, Wapiti creates a /home/"$USER"/.wapiti/scans/localhost.xml status file. If we pass the -i parameter without specifying the file name, Wapiti takes the default file from the “scans” folder.
We can use the -k parameter to resume an attack.
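A minimal sketch of both, assuming the file name may be omitted so that Wapiti falls back to the default status file described above:
$ ./wapiti http://localhost -v 2 -u -i
$ ./wapiti http://localhost -v 2 -u -k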
w3af
w3af stands for Web Application Attack and Audit Framework. It is a complete environment for auditing and attacking web applications, and provides a solid platform for web vulnerability assessments and penetration tests.
Installation from GitHub
Install prerequisites:
$ sudo apt-get install git python2.7 python2.7-dev python-pip python-gitdb python-yaml libssl-dev libxml2-dev libxslt1-dev libyaml-dev libsqlite3-dev
Note: if you intend to use w3af_gui, then python-gtksourceview2 and python-webkit may be needed.
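A quick sketch of pulling those optional packages from the Ubuntu 14.04 repositories (only needed for the GUI):
$ sudo apt-get install python-gtksourceview2 python-webkit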
For reference, the Python packages installed on the test system:
$ dpkg --get-selections "python-*" | awk '{print $1}'
python-apt
python-apt-common
python-async
python-beautifulsoup
python-chardet
python-cheetah
python-colorama
python-configobj
python-debian
python-distlib
python-gdbm
python-gitdb
python-html5lib
python-json-pointer
python-jsonpatch
python-minimal
python-oauth
python-openssl
python-pam
python-pip
python-pkg-resources
python-prettytable
python-pycurl
python-requests
python-serial
python-setuptools
python-six
python-smmap
python-twisted-bin
python-twisted-core
python-twisted-names
python-twisted-web
python-urllib3
python-xapian
python-yaml
python-zope.interface
Install w3af:
$ cd /home/"$USER"/bin
$ git clone https://github.com/andresriancho/w3af.git
$ chown -R "$USER":"$USER" ./w3af
$ chmod u+x ./w3af/w3af_console
$ ./w3af/w3af_console
$ sudo /tmp/w3af_dependency_install.sh
$ cd ./w3af
The first run of w3af_console detects any missing dependencies and generates the /tmp/w3af_dependency_install.sh script to install them.
Usage (CLI)
Create a sample scan script:
$ cat > /tmp/w3af-script.w3af << EOF
http-settings
set timeout 5
set user_agent "This is a security scan."
back
misc-settings
set max_discovery_time 15
set fuzz_cookies True
set fuzz_form_files True
set fuzz_url_parts True
set fuzz_url_filenames True
back
plugins
crawl pykto,robots_txt,sitemap_xml,web_spider
audit blind_sqli,csrf,dav,eval,format_string,generic,os_commanding,sqli,ssi,un_ssl,xss,xst
infrastructure allowed_methods,domain_dot,dot_net_errors,server_header,server_status
auth generic
grep analyze_cookies,code_disclosure,credit_cards,directory_indexing,error_500,error_pages,get_emails,path_disclosure,private_ip,strange_headers,strange_http_codes,strange_parameters,strange_reason
grep config get_emails
set only_target_domain False
back
output console,text_file
output config text_file
set output_file /tmp/test.txt
set verbose False
back
output config console
set verbose False
back
back
target
set target http://localhost
back
cleanup
start
EOF
Launch a scan from the script:
$ ./w3af_console -s /tmp/w3af-script.w3af
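The findings end up in the text file configured in the script, /tmp/test.txt in this example:
$ less /tmp/test.txt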
Arachni (v1.0.2-0.5.3)
Arachni is an open source, feature-full, modular, high-performance Ruby framework aimed at helping penetration testers and administrators evaluate the security of web applications.
Installation from Tarball
Download the newest Arachni version and install the package:
$ cd /home/"$USER"/bin
$ wget http://downloads.arachni-scanner.com/arachni-1.0.2-0.5.3-linux-x86_64.tar.gz
$ tar xvfz arachni-1.0.2-0.5.3-linux-x86_64.tar.gz
$ mv arachni-1.0.2-0.5.3 arachni
$ chown -R "$USER":"$USER" ./arachni
$ cd ./arachni/bin
Usage (CLI)
The scan below will load all checks and the plugins under plugins/defaults, and audit all forms, links and cookies. Note that the localhost URL is reserved and cannot be scanned, hence the example.local.lan target.
$ ./arachni --output-only-positives --http-request-timeout 5 --http-user-agent "Security scan" http://example.local.lan/
Arachni’s output messages are classified into several categories, each of them prefixed with a different colored symbol:
[*] are status messages.
[~] are informational messages.
[+] are success messages.
[v] are verbose messages.
[!] are debug messages.
[-] are error messages.
The --output-only-positives parameter will suppress all messages except the ones denoting success, usually regarding the discovery of some issue.
The scan below will load and use a custom configuration profile from the profile.afp file:
$ ./arachni --profile-load-filepath ./profile.afp http://iis.example.local.lan/
As you can see below, the profile is explicitly configured for the Windows/IIS platform to improve efficiency when scanning Windows IIS systems.
$ cat ./profile.afp
---
input:
  values:
    "(?i-mx:name)": admin
    "(?i-mx:user)": admin
    "(?i-mx:usr)": admin
    "(?i-mx:pass)": '123456'
    "(?i-mx:txt)": admin
    "(?i-mx:num)": '1'
    "(?i-mx:amount)": '100'
    "(?i-mx:mail)": [email protected]
    "(?i-mx:account)": '12'
    "(?i-mx:id)": '1'
  without_defaults: true
  force: false
session: {}
datastore: {}
scope:
  redundant_path_patterns: {}
  dom_depth_limit: 10
  exclude_path_patterns: []
  exclude_content_patterns: []
  include_path_patterns: []
  restrict_paths: []
  extend_paths: []
  url_rewrites: {}
  include_subdomains: false
  https_only: false
http:
  user_agent: SecurityScan
  request_timeout: 30000
  request_redirect_limit: 5
  request_concurrency: 15
  request_queue_size: 500
  request_headers: {}
  cookies: {}
  authentication_username: admin
  authentication_password: '123456'
audit:
  exclude_vector_patterns: []
  include_vector_patterns: []
  link_templates: []
  links: true
  forms: true
  cookies: true
  headers: false
  with_both_http_methods: false
  cookies_extensively: false
browser_cluster:
  pool_size: 6
  job_timeout: 120
  worker_time_to_live: 100
  ignore_images: false
  screen_width: 1600
  screen_height: 1200
checks:
- code_injection
- code_injection_php_input_wrapper
- code_injection_timing
- csrf
- file_inclusion
- ldap_injection
- no_sql_injection
- no_sql_injection_differential
- os_cmd_injection
- os_cmd_injection_timing
- path_traversal
- response_splitting
- rfi
- session_fixation
- source_code_disclosure
- sql_injection
- sql_injection_differential
- sql_injection_timing
- trainer
- unvalidated_redirect
- xpath_injection
- xss
- xss_dom
- xss_dom_inputs
- xss_dom_script_context
- xss_event
- xss_path
- xss_script_context
- xss_tag
- allowed_methods
- backdoors
- backup_directories
- backup_files
- captcha
- common_directories
- common_files
- cookie_set_for_parent_domain
- credit_card
- cvs_svn_users
- directory_listing
- emails
- form_upload
- hsts
- htaccess_limit
- html_objects
- http_only_cookies
- http_put
- insecure_cookies
- interesting_responses
- localstart_asp
- mixed_resource
- origin_spoof_access_restriction_bypass
- password_autocomplete
- private_ip
- ssn
- unencrypted_password_forms
- webdav
- xst
platforms:
- windows
- iis
plugins:
  autothrottle:
  discovery:
  healthmap:
  timing_attacks:
  uniformity:
no_fingerprinting: false
authorized_by: admin
Check Arachni CLI reference for more info.
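To keep the results for later processing, here is a sketch assuming the Arachni 1.0.x reporting workflow (the --report-save-path option plus the arachni_reporter utility; file names are arbitrary):
$ ./arachni --report-save-path=/tmp/scan.afr http://example.local.lan/
$ ./arachni_reporter /tmp/scan.afr --reporter=html:outfile=/tmp/scan-report.html.zip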
Usage (Web GUI)
$ ./arachni_web
Navigate your browser to http://localhost:9292 and follow the on-screen instructions.
Skipfish (v2.10b)
Skipfish is an active web application security reconnaissance tool.
Installation from Tarball
Install prerequisites:
$ sudo apt-get install libpcre3 libpcre3-dev libidn11-dev
Download the newest Skipfish version and install the package:
$ cd /home/"$USER"/bin
$ wget http://skipfish.googlecode.com/files/skipfish-2.10b.tgz
$ tar xvfz ./skipfish-2.10b.tgz
$ mv ./skipfish-2.10b ./skipfish
$ chown -R "$USER":"$USER" ./skipfish
$ cd ./skipfish && make
Usage
The reference to the automated audit using Skipfish is no longer available, as the source webpage was deleted on 29 June 2016 by Jmanico.
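As a rough sketch of a basic scan instead (assuming the bundled dictionaries/minimal.wl wordlist; skipfish writes an HTML report into the directory given with -o):
$ ./skipfish -o /tmp/scan-skipfish -S dictionaries/minimal.wl http://localhost/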
Install-All-in-One-Go
GitHub
You can get the script from GitHub:
$ git clone https://github.com/lisenet/security-scripts-for-linux.git
$ cd ./security-scripts-for-linux
$ chmod u+x ./sec-tools-installer.sh
$ ./sec-tools-installer.sh
Script
You can also use the script below to download and install everything in one go. Note that this script may be outdated.
#!/bin/bash
# written by Tomas (www.lisenet.com)
# 16/09/2014 (dd/mm/yyyy)
# copyleft free software
#
# installation directory
DIR="/home/$USER/bin";

# check for installation directory
if [ -d "$DIR" ]; then
  echo "$DIR already exists. Aborting."
  exit 0;
else
  mkdir -pv "$DIR";
  cd "$DIR";
fi
#
# PREREQUISITES
#
sudo apt-get update;
sudo apt-get install perl perl-modules libnet-ssleay-perl libwhisker2-perl \
 python2.7 python2.7-dev python-requests python-ctypes python-beautifulsoup \
 python-pip python-gitdb python-yaml libssl-dev libxml2-dev libxslt1-dev wget \
 libyaml-dev libsqlite3-dev libpcre3 libpcre3-dev libidn11-dev openssl git -y;
#
# NMAP
#
sudo apt-get install nmap -y;
#
# NIKTO
#
wget http://cirt.net/nikto/nikto-2.1.5.tar.gz;
tar xvfz ./nikto-2.1.5.tar.gz;
mv ./nikto-2.1.5 ./nikto;
chown -R "$USER":"$USER" ./nikto;
chmod u+x ./nikto/nikto.pl;
./nikto/nikto.pl -update;
#
# WAPITI
#
wget http://netcologne.dl.sourceforge.net/project/wapiti/wapiti/wapiti-2.3.0/wapiti-2.3.0.tar.gz;
tar xvfz wapiti-2.3.0.tar.gz;
mv ./wapiti-2.3.0 ./wapiti;
chown -R "$USER":"$USER" ./wapiti;
chmod u+x ./wapiti/bin/wapiti;
#
# W3AF
#
git clone https://github.com/andresriancho/w3af.git;
chown -R "$USER":"$USER" ./w3af;
chmod u+x ./w3af/w3af_console;
#
# ARACHNI
#
wget http://downloads.arachni-scanner.com/arachni-1.0.2-0.5.3-linux-x86_64.tar.gz;
tar xvfz arachni-1.0.2-0.5.3-linux-x86_64.tar.gz;
mv arachni-1.0.2-0.5.3 arachni;
chown -R "$USER":"$USER" ./arachni;
#
# SKIPFISH
#
wget http://skipfish.googlecode.com/files/skipfish-2.10b.tgz;
tar xvfz ./skipfish-2.10b.tgz;
mv ./skipfish-2.10b ./skipfish;
chown -R "$USER":"$USER" ./skipfish;
cd ./skipfish && make;

# remove all tarballs and wget as these are no longer needed
rm -v "$DIR"/*gz;
sudo apt-get autoremove wget -y;
exit 0
Comments
Great, I’m definitely gonna use this! ;)
Thank you Mr. H. I noticed your site was experiencing technical difficulties. I hope you fixed them :)
FYI,
Rather than
tar xvfz blah-1.2.3.tar.gz
mv blah-1.2.3 blah
You can get tar to modify the output path as it extracts files, like so:
tar xvfz blah-1.2.3.tar.gz --transform 's,^[^/]*,blah,'
Very useful article. Thank you.
It’s less typing for me to use an ‘mv’ command and tab autocomplete, but of course you can if you’re familiar with sed. That’s why Linux is great, so many ways to skin a cat!
Do they work through OpenVAS?
They should all do, except perhaps Lynis and Skipfish.