robotstxt
Provides functions to download and parse 'robots.txt' files. Ultimately, the package makes it easy to check whether bots (spiders, crawlers, scrapers, ...) are allowed to access specific resources on a domain.
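For example, a minimal sketch of such a check (assuming the package's get_robotstxt() and paths_allowed() functions; the domain and paths below are illustrative):

```r
library(robotstxt)

# download a site's robots.txt as plain text (domain is illustrative)
rt_txt <- get_robotstxt(domain = "en.wikipedia.org")

# check whether specific paths may be accessed by a given bot
paths_allowed(
  paths  = c("/api/", "/wiki/Main_Page"),
  domain = "en.wikipedia.org",
  bot    = "*"
)
```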
Package details | |
---|---|
Author | Pedro Baltazar [aut, cre], Peter Meissner [aut], Kun Ren [aut, cph] (author and copyright holder of list_merge.R), Oliver Keys [ctb] (original release code review), Rich Fitz John [ctb] (original release code review) |
Maintainer | Pedro Baltazar <pedrobtz@gmail.com> |
License | MIT + file LICENSE |
Version | 0.7.15 |
URL | https://docs.ropensci.org/robotstxt/, https://github.com/ropensci/robotstxt |
Package repository | CRAN |
Installation

Install the latest version of this package by entering the following in R:
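```r
# install the released version from CRAN
install.packages("robotstxt")
```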