robotstxt: A 'robots.txt' Parser and 'Webbot'/'Spider'/'Crawler' Permissions Checker
Version 0.3.2

Provides functions to download and parse 'robots.txt' files. Ultimately, the package makes it easy to check whether bots (spiders, crawlers, scrapers, ...) are permitted to access specific resources on a domain.
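The following is a minimal sketch of that workflow, using the robotstxt() constructor and its check() method as documented by the package; the domain and paths are placeholders chosen for illustration, and the calls require network access:

library(robotstxt)

# download and parse the robots.txt file of a domain
rtxt <- robotstxt(domain = "wikipedia.org")

# check whether the default bot ("*") may access specific paths
rtxt$check(paths = c("/", "/images/"), bot = "*")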

Package details

Author: Peter Meissner [aut, cre], Oliver Keys [ctb], Rich Fitz John [ctb]
Date of publication: 2016-12-05 18:28:48
Maintainer: Peter Meissner <retep.meissner@gmail.com>
License: MIT + file LICENSE
Version: 0.3.2
URL: https://github.com/ropenscilabs/robotstxt
Package repository: CRAN
Installation

Install the latest version of this package by entering the following in R:
install.packages("robotstxt")
