APPLICATION INSPECTION

So far we have looked at tools that examine the web server. In doing so, we miss vulnerabilities that may be present in the web application itself. This class of vulnerabilities arises from insecure programming and misconfiguration of the interaction between web servers and databases. We can't explain the nature of web application insecurity, and the methodology and techniques for finding those vulnerabilities, within a single chapter. What we will show are the tools necessary for you to peek into a web application. Although a few of these programs have grown out of the security community, they deserve a place in a web application programmer's debugging tool kit as well.

Paros Proxy

If you search for web application proxy tools or assessment tools on the Internet, you'll likely come across references to utilities named Achilles or WebSleuth. Those tools are no longer actively developed, and between them they have enough quirks to frustrate the most dedicated web auditor. This is not a great loss, because new Java-based tools have taken over as the heavyweights in the local proxy arena. Achilles introduced the utility of local proxies, but its development stalled prematurely, and WebSleuth is intractably tied to Internet Explorer. Paros is a Java-based proxy that not only serves as a local proxy, but adds significant enhancements to usability, testing techniques, and data presentation. In other words, you should download (http://www.parosproxy.org/index.shtml), install, and try Paros because it's an excellent tool!

Implementation

Paros is pure Java. Hence, you can download and compile the source yourself or simply obtain the binary and begin testing. You will need the Java 1.4 environment, so be sure to update your system's Java installation if it does not meet this requirement. Once Paros is installed, launch it and set your browser's HTTP proxy settings to 127.0.0.1 port 8080 (HTTPS uses the same port). Now you are ready to begin examining a web application: navigate through the application as you normally would via the web browser. Paros silently records the directory and file structure of every request. This information is stored in a local file-based database, but it can also be exported as a text file.
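Before pointing your browser at the proxy, you can confirm that Paros is listening by pushing a single request through it from the command line. The following is a minimal sketch using curl (assuming curl is available on your system; www.victim.com stands in for whatever site you are testing):

 $ curl -x http://127.0.0.1:8080 http://www.victim.com/

If the proxy is up, the request appears in the Site frame just as a browser-driven request would.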

Figure 7-1 shows the directory structure of an Aspnuke application in the Site frame in the upper-left corner of the interface.


Figure 7-1: Paros tracks the directory structure of each web site.

Although Paros observes every aspect of each request, whether the request uses HTTP or HTTPS, by default it logs only cookies and the site hierarchy. If you wish to record other aspects of the application, open the Tools menu on the interface and set your desired options, as shown in Figure 7-2. Even though the GET and POST files have an .xls extension, they are tab-delimited plaintext files that you can view with a text editor or import into a spreadsheet application. The files are written to the filter directory under the Paros installation.


Figure 7-2: Apply filters to save specific data.
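Because the exported GET and POST files are tab-delimited, you can also slice them from the command line without launching a spreadsheet. A quick sketch using awk (the filename get.xls is an assumption; substitute whichever files Paros writes to the filter directory):

 $ awk -F'\t' '{ print $1, $2 }' get.xls

This prints only the first two columns of each record, which is often enough for a fast scan of interesting parameters.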
Tip 

Enable the "Avoid Browser Cache" option if you will be testing user impersonation or session-based attacks (cookie guessing, cookie theft). Otherwise, you may mistakenly assume that a nefariously obtained profile page was due to URL parameter manipulation rather than a true vulnerability in the application.

Your next option is to instruct Paros to scan the items in the site hierarchy for common vulnerabilities. Navigate to the Scanner menu and select the type of scan you wish to perform, against either all of the nodes in the Site listing (Scan All) or only the highlighted site (Scan). Scan Policy options are shown in Figure 7-3. Once you select the scan type, Paros begins its predefined tests.


Figure 7-3: Enable specific vulnerability scans.

Scan results can be obtained via the Report menu option or by clicking the Alerts tab in the bottom frame of the Paros window, as shown in Figure 7-4.


Figure 7-4: View the vulnerability alerts from a scan.

The greatest benefit of a local proxy is the ability to intercept and rewrite web requests. Paros provides this capability in the Trap tab, which is split into two sections. The Header section shows the intercepted request when Trap Request is checked, allowing you to view and edit the entire URL and headers that will be sent to the server. Once you click Continue, the Header and Body sections are populated with, appropriately enough, the HTTP header and body data returned by the server. This process is shown in the next two figures. Figure 7-5 demonstrates the alteration of a request to a vulnerable PHP application; notice that a single quote has been inserted into the topicid=1 URL parameter. Figure 7-6 shows the header and body returned by the server. The Header section, which previously contained the modified request, now contains the Date, Server, and other response fields. More interesting is the Body section, which displays the error produced in the back-end Microsoft SQL database by the extraneous single quote inserted into the topicid parameter.


Figure 7-5: Trap and modify a URL request.

Figure 7-6: Trap the HTTP Headers and Body of a request.
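You can replay the same probe outside the proxy to verify the result. Here is a hedged sketch with curl (the article.php path is hypothetical; substitute the page you trapped, and note that the single quote is URL-encoded as %27):

 $ curl "http://www.victim.com/article.php?topicid=1%27"

If the response body contains a database error like the one shown in Figure 7-6, the topicid parameter is a strong candidate for SQL injection.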

The ability to rewrite and insert arbitrary characters into HTTP GET and POST requests makes a tool like Paros indispensable for auditing the security of a web application. Paros is just a tool; the techniques and tricks of testing web application security are far too broad to cover in this chapter.

Finally, Paros has an additional function hidden under the Tools menu. You can have Paros spider any HTTP or HTTPS application and populate the site hierarchy window automatically. The spider function works with varying success depending on what the application requires with regard to cookies, headers, and authentication. Nevertheless, it serves as a nice utility that should improve over time. If you find yourself using Paros often, be sure to check out the advanced configuration options under the Tools menu. Figure 7-7 shows some of these options, including the ability to chain proxies, supply client-side SSL certificates, and set advanced scanning parameters.


Figure 7-7: Configure advanced options.

Burp Proxy

While Burp Proxy may claim a more memorable name than its peer, Paros, it performs the same basic functions. Burp is available at http://portswigger.net/proxy/ as part of a suite of Java-based utilities or as a stand-alone tool.

Implementation

Burp eschews a Windows installer. Instead, you merely extract the zip (or tar.gz) file to a directory of your choice and launch the .jar application with Java; Windows users can click the handy suite.bat file. Once started, the entire Burp suite looks something like the screen shown in Figure 7-8. Note that the proxy is actually part of a suite of tools focused on web application security testing.


Figure 7-8: Launch Burp Proxy.
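On Unix-like systems there is no suite.bat; launching the jar directly with Java accomplishes the same thing. A minimal sketch (the jar's actual filename varies by release, so burp.jar here is a placeholder):

 $ java -jar burp.jar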

The first thing you'll want to do is navigate to the Options tab and adjust the intercept settings. By default, Burp will begin to intercept all HTTP requests for non-image files. The intercept logic can be as simple as pattern matching or a combination of Boolean operators against the request's attributes. Responses from the server are intercepted in an identical manner. Figure 7-9 shows some of these options.


Figure 7-9: Configure intercept options.

Figures 7-10 and 7-11 demonstrate intercepting a request, modifying the URL parameter, and capturing the server's response. This is identical to the method described previously for Paros against the topicid parameter.


Figure 7-10: Capture and modify a browser request.

Figure 7-11: Examine the server's response.
Tip 

It seems trivial to modify URL parameters with tools like Paros or Burp; after all, you could simply edit the URL in the browser's address bar. The true power of these tools becomes apparent when you wish to modify POST data or cookie values, which the browser does not let you edit directly. That is where the intercept capability proves most useful.
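For comparison, the same tampering can be performed by hand with curl, although the proxy makes it far more convenient during interactive testing. A hypothetical sketch (the cookie name, values, and cart.asp path are invented for illustration):

 $ curl -b "ASPSESSIONID=d0d0caca0a0a" -d "itemid=42&price=0.01" \
 > http://www.victim.com/cart.asp

The -b option supplies an arbitrary cookie value and -d submits the request as a POST, so both of the fields a browser hides from you are fully under your control.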

Wget

The final tool we present probably seems out of place compared to the previous tools. Wget is a command-line tool that essentially copies a web site's contents. It starts at the home page and follows every link until it has discovered every page of the site. When someone performs a security audit of a web application, one of the first steps is to sift through every page of the application. For spammers, the goal would be to harvest e-mail addresses. For others, the goal would be to look for programmers' notes that perhaps contain passwords, SQL statements, or other juicy tidbits. In the end, a local copy of the web application's content enables the person to search large sites quickly for this type of information.

Wget has other uses from an administrator's point of view, such as creating mirrors of highly trafficked web sites. The administrators of the mirrors for many web sites (such as http://www.samba.org and http://www.kernel.org) use wget or similar tools to reproduce the master server on alternative servers. They do this to reduce load and to spread web sites geographically.
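For that mirroring scenario, wget's --mirror option (shorthand for infinite-depth recursion with timestamp checks) is the usual starting point. A minimal sketch, assuming you are mirroring a site you administer:

 $ wget --mirror http://www.kernel.org/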

Implementation

As wget's main purpose is to download the contents of a web site, its usage is simple. To spider a web site recursively, use the -r option:

 $ wget -r www.victim.com
 ...(continues for entire site)...

The -r, or recursive, option instructs wget to follow every link starting from the home page. This will create a www.victim.com directory and populate it with every HTML file and directory wget finds for the site. A major advantage of wget is that it follows every link possible. Thus, it will download the output for every argument that the application passes to a page. For example, the viewer.asp file for a site might be downloaded four times:

  • viewer.asp@ID=555

  • viewer.asp@ID=7

  • viewer.asp@ID=42

  • viewer.asp@ID=23

The @ symbol represents the ? delimiter in the original URL. The ID is the first argument (parameter) passed to the viewer.asp file. Some sites may require more advanced options such as support for proxies and HTTP Basic Authentication. Sites protected by Basic Authentication can be spidered in this way:

 [root@meddle]# wget -r --http-user=dwayne --http-passwd=woodelf \
 > https://www.victim.com/secure/
 ...(continues for entire site)...

Sites that rely on cookies for session state or authentication can also be spidered by wget. Create a cookie file that contains a set of valid cookies from a user's session. The prerequisite, of course, is that you must be able to log into the site to collect the cookie values. Then, use the --load-cookies option to instruct wget to impersonate that user based on the cookies:

 $ wget --load-cookies=cookies.txt \
 > -r https://www.victim.com/secure/menu.asp
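The cookies.txt file follows the Netscape cookie format that wget expects: one line per cookie, with tab-separated fields for domain, subdomain flag, path, secure flag, expiry, name, and value. A hypothetical example with an invented session ID:

 # Netscape HTTP Cookie File
 www.victim.com	FALSE	/secure	FALSE	0	ASPSESSIONID	ASDFGHJKLQWERTYUIOP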

Still other sites purposefully set cookies to defeat most spidering tools. Wget can handle session and saved cookies with the appropriately named --cookies option. It is a Boolean option, so you can turn it off or on explicitly (wget enables cookies by default):

 $ wget --load-cookies=cookies.txt --cookies=on \
 > -r https://www.victim.com/secure/menu.asp

The --http-user and --http-passwd options enable wget to access web applications that employ HTTP Basic Authentication. Set the values on the command line and watch wget fly:

 $ wget --http-user=guest --http-passwd=no1knows \
 > -r https://www.victim.com/maillist/index.html

In the end, wget provides a quick method for downloading the HTML contents of a web application for off-line analysis. If you are frustrated by the spidering capabilities of Paros, use wget to perform these tasks.
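Once the mirror is on disk, ordinary text-searching tools finish the job described at the start of this section. For example, two hedged one-liners, assuming wget created a www.victim.com directory:

 $ grep -ri "password" www.victim.com/
 $ grep -rhoE "[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+" www.victim.com/ | sort -u

The first flags any page that mentions a password; the second extracts a unique list of e-mail-address-like strings from the entire site.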


