petermeissner/robotstxt: A 'robots.txt' Parser and 'Webbot'/'Spider'/'Crawler' Permissions Checker

Provides functions to download and parse 'robots.txt' files. Ultimately, the package makes it easy to check whether bots (spiders, crawlers, scrapers, ...) are allowed to access specific resources on a domain.

Getting started
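
A minimal sketch of a typical permissions check is shown below. It assumes the exported functions get_robotstxt() and paths_allowed(); the domain and paths are illustrative only.

library(robotstxt)

# download and parse the robots.txt file of a domain
rtxt <- get_robotstxt(domain = "wikipedia.org")
rtxt

# check whether specific paths may be accessed by a given bot
paths_allowed(
  paths  = c("/wiki/Main_Page", "/wiki/Special:Random"),
  domain = "wikipedia.org",
  bot    = "*"
)

In this sketch, paths_allowed() is expected to return one logical value per path, so the result can be used directly to decide which resources a crawler may fetch.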

Package details

Maintainer: Peter Meissner
License: MIT + file LICENSE
Version: 0.7.15.9000
URLs: https://docs.ropensci.org/robotstxt/, https://github.com/ropensci/robotstxt
Repository: GitHub (https://github.com/ropensci/robotstxt)
Installation

Install the latest version of this package by entering the following in R:

install.packages("remotes")
remotes::install_github("petermeissner/robotstxt")
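
If a stable release is preferred, the package is also published on CRAN and can be installed with:

install.packages("robotstxt")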