Description:

Parse a web page, capturing and returning any links found.
Arguments:

url: A URL to scan for links.
Details:

This is an internal routine used by several functions in the package.
Value:

A vector of link URLs.
Note:

While it might be fun to try LinkExtractor on a large website such as Google, the results will be unpredictable and perhaps disastrous if depth is not set. This is because there is no protection against infinite recursion.
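The hazard can be made concrete with a depth-limited sketch. Everything below is illustrative and not part of the package: `extract_links`, `crawl`, and the two sample pages are hypothetical, and links are pulled out with a simple regular expression rather than a real HTML parser. The point is the depth cap, which guarantees termination even when pages link to each other in a cycle.

```r
# Extract href targets from a string of HTML with a simple regex
# (a real implementation would use a proper HTML parser)
extract_links <- function(html) {
  m <- regmatches(html, gregexpr('href="[^"]*"', html))[[1]]
  gsub('^href="|"$', '', m)
}

# Follow links recursively, but stop once 'depth' reaches zero,
# so a cyclic link structure cannot cause infinite recursion
crawl <- function(page, pages, depth) {
  if (depth <= 0) return(character(0))
  found <- extract_links(pages[[page]])
  unique(c(found, unlist(lapply(found, function(p) {
    if (p %in% names(pages)) crawl(p, pages, depth - 1) else character(0)
  }))))
}

# Two pages that link to each other: without a depth cap,
# following these links would recurse forever
pages <- list(
  a = '<a href="b">B</a>',
  b = '<a href="a">A</a>'
)
crawl("a", pages, depth = 3)  # depth = 3 is enough to reach both pages here
```

Without the `depth <= 0` check, `crawl("a", ...)` would call `crawl("b", ...)`, which would call `crawl("a", ...)` again, and so on; the cap is the only thing that breaks the cycle.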
Author(s):

Daniel C. Bowman <firstname.lastname@example.org>
Examples:

# Find model runs for the GFS 0.5x0.5 model
## Not run:
urls.out <- LinkExtractor(
    "http://nomads.ncep.noaa.gov/cgi-bin/filter_gfs_0p50.pl")
## End(Not run)