#61 Extracting URLs from a Web Page

A straightforward shell script application of lynx is to extract a list of URLs on a given web page, which can be quite helpful in a variety of situations.

The Code

#!/bin/sh
# getlinks - Given a URL, returns all of its internal and
#   external links.

if [ $# -eq 0 ] ; then
  echo "Usage: $0 [-d|-i|-x] url" >&2
  echo "-d=domains only, -i=internal refs only, -x=external only" >&2
  exit 1
fi

# Build the filter pipeline to apply to the raw link list, based on the flag.
if [ $# -gt 1 ] ; then
  case "$1" in
    -d) lastcmd="cut -d/ -f3 | sort | uniq"
        shift
        ;;
    -i) basedomain="http://$(echo $2 | cut -d/ -f3)/"
        lastcmd="grep \"^$basedomain\" | sed \"s|$basedomain||g\" | sort | uniq"
        shift
        ;;
    -x) basedomain="http://$(echo $2 | cut -d/ -f3)/"
        lastcmd="grep -v \"^$basedomain\" | sort | uniq"
        shift
        ;;
     *) echo "$0: unknown option specified: $1" >&2; exit 1
  esac
else
  lastcmd="sort | uniq"
fi

# Dump the page, keep only the numbered References list, pull out the URL
# field, strip query strings, then apply the filter built above.
lynx -dump "$1" | \
  sed -n '/^References$/,$p' | \
  grep -E '[[:digit:]]+\.' | \
  awk '{print $2}' | \
  cut -d\? -f1 | \
  eval $lastcmd

exit 0

How It Works

When displaying a page, lynx shows the text of the page, formatted as best it can, followed by a list of all hypertext references, or links, found on that page. This script extracts just the links by using a sed invocation to print everything after the "References" string in the lynx output, and then processes the list of links as needed based on the user-specified flags.
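To make the flow of that pipeline concrete, here is a hedged sketch of what each stage produces. The page and URLs shown are placeholders, not captured output, but the layout of the lynx References list is exactly what the sed, grep, awk, and cut stages rely on:

$ lynx -dump http://www.example.com/ | sed -n '/^References$/,$p'
References

   1. http://www.example.com/about/
   2. http://www.example.com/contact/?ref=home
   3. http://www.othersite.org/
$ lynx -dump http://www.example.com/ | sed -n '/^References$/,$p' | \
    grep -E '[[:digit:]]+\.' | awk '{print $2}' | cut -d\? -f1
http://www.example.com/about/
http://www.example.com/contact/
http://www.othersite.org/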

The one interesting technique demonstrated by this script is the way the variable lastcmd is set to filter the list of links that it extracts according to the flags specified by the user. Once lastcmd is set, the amazingly handy eval command is used to force the shell to interpret the content of the variable as if it were a command, not a variable.
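If you haven't seen this trick before, the following standalone snippet (a demonstration, not part of the script) shows the difference eval makes when a pipeline is stored in a variable:

# Demonstration: a pipeline stored as a string.
lastcmd="sort | uniq"

printf 'b\na\nb\n' | $lastcmd        # fails: sort treats '|' and 'uniq' as filenames
printf 'b\na\nb\n' | eval $lastcmd   # works: the string is reparsed as a real
                                     #   pipeline and prints "a" then "b"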

Running the Script

By default, this script outputs a list of all links found on the specified web page, not just those that are prefaced with http:. Three optional command flags can be specified to change the results, however: -d produces just the domain names of all matching URLs, -i produces a list of only the internal references (that is, those references that are found on the same server as the current page), and -x produces only the external references, those URLs that point to a different server.

The Results

A simple request is a list of all links on a specified website home page:

$ getlinks http://www.trivial.net/
http://www.intuitive.com/
http://www.trivial.net/kudos/index.html
http://www.trivial.net/trivial.cgi
mailto:nerds@trivial.net

Another possibility is to request a list of all domain names referenced at a specific site. This time let's first use the standard Unix tool wc to check how many links are found overall:

$ getlinks http://www.amazon.com/ | wc -l
     136

Amazon has 136 links on its home page. Impressive! Now, how many different domains does that represent? Let's generate a full list with the -d flag:

$ getlinks -d http://www.amazon.com/
s1.amazon.com
www.amazon.com

As you can see, Amazon doesn't tend to point anywhere else. Other sites are different, of course. As an example, here's a list of all external links in my weblog:

$ getlinks -x http://www.intuitive.com/blog/
LYNXIMGMAP:http://www.intuitive.com/blog/#headermap
http://blogarama.com/in.php
http://blogdex.media.mit.edu/
http://booktalk.intuitive.com/
http://chris.pirillo.com/
http://cortana.typepad.com/rta/
http://dylan.tweney.com/
http://fx.crewtags.com/blog/
http://geourl.org/near/
http://hosting.verio.com/index.php/vps.html
http://imajes.info/
http://jake.iowageek.com/
http://myst-technology.com/mysmartchannels/public/blog/214/
http://smattering.org/dryheat/
http://www.101publicrelations.com/blog/
http://www.APparenting.com/
http://www.backupbrain.com/
http://www.bloghop.com/
http://www.bloghop.com/ratemyblog.htm
http://www.blogphiles.com/webring.shtml
http://www.blogshares.com/blogs.php
http://www.blogstreet.com/blogsqlbin/home.cgi
http://www.blogwise.com/
http://www.gnome-girl.com/
http://www.google.com/search/
http://www.icq.com/
http://www.infoworld.com/
http://www.mail2web.com/
http://www.movabletype.org/
http://www.nikonusa.com/usa_product/product.jsp
http://www.onlinetonight.net/ethos/
http://www.procmail.org/
http://www.ringsurf.com/netring/
http://www.spamassassin.org/
http://www.tryingreallyhard.com/
http://www.yahoo.com/r/p2

Hacking the Script

You can see where getlinks could be quite useful as a site analysis tool. Stay tuned: Script #77, checklinks, is a logical follow-on to this script, allowing a quick link check to ensure that all hypertext references on a site are valid.
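As one possible sketch of that site-analysis idea (this wrapper, domaincount, is hypothetical and not from the book), you could tally which external domains a set of pages points to most often:

#!/bin/sh
# domaincount - hypothetical wrapper around getlinks: given one or more URLs,
#   count how often each external domain is referenced, most popular first.

for url in "$@" ; do
  getlinks -x "$url"
done | cut -d/ -f3 | sort | uniq -c | sort -rn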




