robotstxt: A 'robots.txt' Parser and 'Webbot'/'Spider'/'Crawler' Permissions Checker
Version 0.5.2

Provides functions to download and parse 'robots.txt' files. Ultimately, the package makes it easy to check whether bots (spiders, crawlers, scrapers, ...) are allowed to access specific resources on a domain.
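As a rough sketch of such a permissions check, assuming the paths_allowed() function exported by this version of the package and using a placeholder domain:

library(robotstxt)

# check whether the default bot ("*") may fetch two paths on a domain
paths_allowed(
  paths  = c("/api/", "/images/"),
  domain = "example.com",
  bot    = "*"
)
# returns a logical vector, one entry per path (TRUE = access permitted)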

Package details

Author: Peter Meissner [aut, cre], Oliver Keys [ctb], Rich Fitz John [ctb]
Date of publication: 2017-11-12 17:45:33 UTC
Maintainer: Peter Meissner <[email protected]>
License: MIT + file LICENSE
Version: 0.5.2
URL: https://github.com/ropenscilabs/robotstxt
Package repository: CRAN
Installation

Install the latest version of this package by entering the following in R:
install.packages("robotstxt")
