Provides functions to download and parse 'robots.txt' files. Ultimately, the package makes it easy to check whether bots (spiders, crawlers, scrapers, ...) are allowed to access specific resources on a domain.
| Field | Value |
| --- | --- |
| Author | Peter Meissner [aut, cre], Oliver Keys [ctb], Rich Fitz John [ctb] |
| Date of publication | 2018-02-11 14:19:44 UTC |
| Maintainer | Peter Meissner <[email protected]> |
| License | MIT + file LICENSE |
| Package repository | CRAN |
Install the latest version of this package by entering the following in R:
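```r
# standard CRAN installation; the package name is inferred from the
# description and maintainer above
install.packages("robotstxt")
```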
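A minimal usage sketch follows, based on the package's exported `paths_allowed()` helper; the domain and paths are illustrative placeholders:

```r
library(robotstxt)

# downloads and parses example.com/robots.txt, then returns one
# TRUE/FALSE per path: may the default bot ("*") access that resource?
paths_allowed(
  paths  = c("/api/", "/images/"),
  domain = "example.com"
)
```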