allowedByRobots: Function to check if URL is blocked by robots.txt


View source: R/allowedByRobots.R

Description

This function checks whether a given URL is blocked by the website's robots.txt file.

Usage

allowedByRobots(url, bot = "googlebot")

Arguments

url

The URL you want to check.

bot

The bot (user agent) for which to check indexability. Default is "googlebot".

Examples

allowedByRobots("https://www.r-project.org/", bot = "googlebot")
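A further sketch of typical usage, assuming allowedByRobots() returns a single logical value (TRUE if the URL is allowed, FALSE if blocked) — this is not stated explicitly in the documentation above. The URLs and the "bingbot" user agent are illustrative; the calls require network access and the seoR package.

```r
## Not run:
library(seoR)

# Check several URLs at once; sapply() applies the check to each element
# and returns a named logical vector (assuming a logical return value)
urls <- c("https://www.r-project.org/",
          "https://www.r-project.org/about.html")
sapply(urls, allowedByRobots, bot = "googlebot")

# Check indexability for a different crawler
allowedByRobots("https://www.r-project.org/", bot = "bingbot")
## End(Not run)
```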

seoR documentation built on Jan. 29, 2018, 5:05 p.m.
