Metasploit Web Crawler


This page contains detailed information about how to use the auxiliary/crawler/msfcrawler Metasploit module. For a list of all Metasploit modules, visit the Metasploit Module Library.

Module Overview


Name: Metasploit Web Crawler
Module: auxiliary/crawler/msfcrawler
Source code: modules/auxiliary/crawler/msfcrawler.rb
Disclosure date: -
Last modification time: 2017-08-24 21:38:44 +0000
Supported architecture(s): -
Supported platform(s): -
Target service / protocol: -
Target network port(s): 80
List of CVEs: -

This auxiliary module is a modular web crawler, to be used in conjunction with wmap (someday) or standalone.
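
The crawling logic itself lives in small parser modules loaded from the directory given by the CrawlerModulesDir advanced option (see below). As a rough, hypothetical sketch of what such a module looks like, assuming the BaseParser interface that the shipped modules use:

# Hypothetical sketch of a crawler module, shown for illustration only.
# It assumes the BaseParser interface used by the modules shipped in
# the crawler modules directory: the loader (see the Error Messages
# section below) instantiates each class with the crawler itself and
# calls parse() for every page that gets fetched.
class CrawlerExample < BaseParser

  def parse(request, result)
    # Only look for links inside HTML responses.
    return unless result['Content-Type'].to_s.include?('text/html')

    # Crude href extraction for the sketch; the shipped modules parse
    # the document properly. urltohash and insertnewpath are the
    # queueing helpers visible in the source snippets further down.
    result.body.to_s.scan(/href=["']([^"'#]+)["']/i).flatten.each do |link|
      insertnewpath(urltohash('GET', link, request['uri'], nil))
    end
  end
end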

Module Ranking and Traits


Module Ranking:

  • normal: The exploit is otherwise reliable, but depends on a specific version and can't (or doesn't) reliably autodetect. More information about module ranking is available in the Metasploit Framework documentation.

Basic Usage


This is a scanner module, capable of testing multiple hosts at once.

msf > use auxiliary/crawler/msfcrawler
msf auxiliary(msfcrawler) > show options
    ... show and set options ...
msf auxiliary(msfcrawler) > set RHOSTS ip-range
msf auxiliary(msfcrawler) > exploit

Other examples of setting the RHOSTS option:

Example 1:

msf auxiliary(msfcrawler) > set RHOSTS 192.168.1.3-192.168.1.200 

Example 2:

msf auxiliary(msfcrawler) > set RHOSTS 192.168.1.1/24

Example 3:

msf auxiliary(msfcrawler) > set RHOSTS file:/tmp/ip_list.txt
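
When RHOSTS expands to many hosts, the THREADS option (listed under Module Options below) controls how many are crawled in parallel, at most one thread per host:

msf auxiliary(msfcrawler) > set THREADS 10
msf auxiliary(msfcrawler) > exploit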

Required Options


  • RHOSTS: The target host(s), range CIDR identifier, or hosts file with syntax 'file:<path>'
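
The file:<path> form expects a plain-text file with one target per line. For example, with hypothetical addresses in /tmp/ip_list.txt:

192.168.1.3
192.168.1.5
192.168.1.7

msf auxiliary(msfcrawler) > set RHOSTS file:/tmp/ip_list.txt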


Msfconsole Usage


Here is how the crawler/msfcrawler auxiliary module looks in the msfconsole:

msf6 > use auxiliary/crawler/msfcrawler

msf6 auxiliary(crawler/msfcrawler) > show info

       Name: Metasploit Web Crawler
     Module: auxiliary/crawler/msfcrawler
    License: Metasploit Framework License (BSD)
       Rank: Normal

Provided by:
  et <[email protected]>

Check supported:
  No

Basic options:
  Name     Current Setting  Required  Description
  ----     ---------------  --------  -----------
  PATH     /                yes       Starting crawling path
  RHOSTS                    yes       The target host(s), range CIDR identifier, or hosts file with syntax 'file:<path>'
  RPORT    80               yes       Remote port
  THREADS  1                yes       The number of concurrent threads (max one per host)

Description:
  This auxiliary module is a modular web crawler, to be used in 
  conjunction with wmap (someday) or standalone.

Module Options


This is a complete list of options available in the crawler/msfcrawler auxiliary module:

msf6 auxiliary(crawler/msfcrawler) > show options

Module options (auxiliary/crawler/msfcrawler):

   Name     Current Setting  Required  Description
   ----     ---------------  --------  -----------
   PATH     /                yes       Starting crawling path
   RHOSTS                    yes       The target host(s), range CIDR identifier, or hosts file with syntax 'file:<path>'
   RPORT    80               yes       Remote port
   THREADS  1                yes       The number of concurrent threads (max one per host)
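
For example, to start the crawl at a specific application path on a non-default port (the address, port, and path below are hypothetical):

msf6 auxiliary(crawler/msfcrawler) > set RHOSTS 192.168.1.10
msf6 auxiliary(crawler/msfcrawler) > set RPORT 8080
msf6 auxiliary(crawler/msfcrawler) > set PATH /app/
msf6 auxiliary(crawler/msfcrawler) > run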

Advanced Options


Here is a complete list of advanced options supported by the crawler/msfcrawler auxiliary module:

msf6 auxiliary(crawler/msfcrawler) > show advanced

Module advanced options (auxiliary/crawler/msfcrawler):

   Name                 Current Setting                                               Required  Description
   ----                 ---------------                                               --------  -----------
   CrawlerModulesDir    /opt/metasploit-framework/embedded/framework/data/msfcrawler  yes       The base directory containing the crawler modules
   DontCrawl            .exe,.zip,.tar,.bz2,.run,.asc,.gz                             yes       Filestypes not to crawl
   EnableUl             true                                                          no        Enable maximum number of request per URI
   MaxUriLimit          10                                                            yes       Number max. request per URI
   ReadTimeout          3                                                             yes       Read timeout (-1 forever)
   ShowProgress         true                                                          yes       Display progress messages during a scan
   ShowProgressPercent  10                                                            yes       The interval in percent that progress should be shown
   SleepTime            0                                                             yes       Sleep time (secs) between requests
   StoreDB              false                                                         no        Store requests in database
   TakeTimeout          15                                                            yes       Timeout for loop ending
   ThreadNum            20                                                            yes       Threads number
   VERBOSE              false                                                         no        Enable detailed status messages
   WORKSPACE                                                                          no        Specify the workspace for this module
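
As a hypothetical tuning example, the following slows the crawl, allows more requests per URI, and stores the requests in the database under a dedicated workspace (all values and the workspace name are illustrative):

msf6 auxiliary(crawler/msfcrawler) > set SleepTime 1
msf6 auxiliary(crawler/msfcrawler) > set MaxUriLimit 50
msf6 auxiliary(crawler/msfcrawler) > set StoreDB true
msf6 auxiliary(crawler/msfcrawler) > set WORKSPACE crawler_test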

Auxiliary Actions


This is a list of all auxiliary actions that the crawler/msfcrawler module can perform:

msf6 auxiliary(crawler/msfcrawler) > show actions

Auxiliary actions:

   Name  Description
   ----  -----------

Evasion Options


Here is the full list of evasion options supported by the crawler/msfcrawler auxiliary module for evading defenses (e.g. antivirus, EDR, firewalls, NIDS):

msf6 auxiliary(crawler/msfcrawler) > show evasion

Module evasion options:

   Name  Current Setting  Required  Description
   ----  ---------------  --------  -----------


Error Messages


This module may fail with the following error messages:

Check for possible causes in the code snippets below, taken from the module source code. This can often help in identifying the root cause of the problem.

URI not crawled <URI>


Here is a relevant code snippet related to the "URI not crawled <URI>" error message:

118:	
119:	          @ViewedQueue[hashsig(hashreq)] = Time.now
120:	          @UriLimits[hashreq['uri']] += 1
121:	
122:	          if !File.extname(hashreq['uri']).empty? and datastore['DontCrawl'].include? File.extname(hashreq['uri'])
123:	            vprint_status "URI not crawled #{hashreq['uri']}"
124:	          else
125:	              prx = nil
126:	              #if self.useproxy
127:	              #	prx = "HTTP:"+self.proxyhost.to_s+":"+self.proxyport.to_s
128:	              #end
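
This message is informational rather than an error: the crawler skips any URI whose file extension appears in the DontCrawl list. To skip additional file types, extend the list; the .pdf entry below is an example addition:

msf6 auxiliary(crawler/msfcrawler) > set DontCrawl .exe,.zip,.tar,.bz2,.run,.asc,.gz,.pdf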

The Crawler modules parameter is set to an invalid directory


Here is a relevant code snippet related to the "The Crawler modules parameter is set to an invalid directory" error message:

222:	
223:	  def load_modules(crawlermodulesdir)
224:	
225:	    base = crawlermodulesdir
226:	    if (not File.directory?(base))
227:	      raise RuntimeError,"The Crawler modules parameter is set to an invalid directory"
228:	    end
229:	
230:	    @crawlermodules = {}
231:	    cmodules = Dir.new(base).entries.grep(/\.rb$/).sort
232:	    cmodules.each do |n|
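
This error is raised when CrawlerModulesDir does not point to an existing directory. Check the current value with show advanced and reset it to a valid location, e.g. the default shown in the Advanced Options table above:

msf6 auxiliary(crawler/msfcrawler) > set CrawlerModulesDir /opt/metasploit-framework/embedded/framework/data/msfcrawler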

Crawler module <N> failed to load: <E.CLASS> <E> <E.BACKTRACE>


Here is a relevant code snippet related to the "Crawler module <N> failed to load: <E.CLASS> <E> <E.BACKTRACE>" error message:

240:	          @crawlermodules[cmod.downcase] = klass.new(self)
241:	
242:	          print_status("Loaded crawler module #{cmod} from #{f}...")
243:	        end
244:	      rescue ::Exception => e
245:	        print_error("Crawler module #{n} failed to load: #{e.class} #{e} #{e.backtrace}")
246:	      end
247:	    end
248:	  end
249:	
250:	  def sendreq(nclient,reqopts={})

Here is a relevant code snippet related to the "[404] Invalid link <URI>" error message:

282:	        when 301..303
283:	          print_line("[#{resp.code}] Redirection to: #{resp['Location']}")
284:	          vprint_status urltohash('GET',resp['Location'],reqopts['uri'],nil)
285:	          insertnewpath(urltohash('GET',resp['Location'],reqopts['uri'],nil))
286:	        when 404
287:	          print_status "[404] Invalid link #{reqopts['uri']}"
288:	        else
289:	          print_status "Unhandled #{resp.code}"
290:	        end
291:	
292:	      else

No response


Here is a relevant code snippet related to the "No response" error message:

288:	        else
289:	          print_status "Unhandled #{resp.code}"
290:	        end
291:	
292:	      else
293:	        print_status "No response"
294:	      end
295:	      sleep(datastore['SleepTime'])
296:	    rescue
297:	      print_status "ERROR"
298:	      vprint_status "#{$!}: #{$!.backtrace}"
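
A "No response" line usually means the target did not answer within the read timeout. If targets are slow rather than down, raising ReadTimeout (and optionally pacing requests with SleepTime) may help; the values below are illustrative:

msf6 auxiliary(crawler/msfcrawler) > set ReadTimeout 10
msf6 auxiliary(crawler/msfcrawler) > set SleepTime 1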

ERROR


Here is a relevant code snippet related to the "ERROR" error message:

292:	      else
293:	        print_status "No response"
294:	      end
295:	      sleep(datastore['SleepTime'])
296:	    rescue
297:	      print_status "ERROR"
298:	      vprint_status "#{$!}: #{$!.backtrace}"
299:	    end
300:	  end
301:	
302:	  #

Authors


et

Version


This page has been produced using Metasploit Framework version 6.1.27-dev. For more modules, visit the Metasploit Module Library.
