Provides functions to download and parse 'robots.txt' files. Ultimately, the package makes it easy to check whether bots (spiders, crawlers, scrapers, ...) are allowed to access specific resources on a domain.
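As a minimal sketch of typical usage, assuming the exported functions `paths_allowed()` and `get_robotstxt()` from the package's documented API (the domain and path here are illustrative):

```r
library(robotstxt)

# check whether the default bot ("*") may fetch a given path on a domain
paths_allowed(
  paths  = "/search",
  domain = "github.com",
  bot    = "*"
)
#> returns TRUE if access is permitted, FALSE otherwise

# download and inspect the raw robots.txt file of a domain
rtxt <- get_robotstxt(domain = "github.com")
rtxt
```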
| Package details | |
|---|---|
| Maintainer | |
| License | MIT + file LICENSE |
| Version | 0.7.15.9000 |
| URL | https://docs.ropensci.org/robotstxt/ https://github.com/ropensci/robotstxt |
| Package repository | View on GitHub |
Installation

Install the latest version of this package by entering the following in R:
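A minimal sketch, assuming the release version is available from CRAN and the development version (indicated by the .9000 suffix) lives in the GitHub repository listed above:

```r
# release version from CRAN
install.packages("robotstxt")

# development version from GitHub (requires the remotes package)
# remotes::install_github("ropensci/robotstxt")
```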