allowedByRobots: Check whether a URL is blocked by robots.txt

View source: R/allowedByRobots.R

allowedByRobots R Documentation

Check whether a URL is blocked by robots.txt

Description

This function checks whether a given URL is disallowed by the website's robots.txt file for a specified bot.

Usage

allowedByRobots(url, bot = "googlebot")

Arguments

url

The URL you want to check.

bot

The bot (user agent) you want to check indexability with. Default is "googlebot".

Examples

allowedByRobots("https://www.r-project.org/", bot = "googlebot")
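The package does not document how the check is performed internally, but the general idea can be sketched in base R: fetch the site's robots.txt, collect the Disallow rules that apply to the given user agent (or to the wildcard group "*"), and test whether the URL's path falls under any of them. Everything below is an illustrative sketch, not seoR's actual implementation; the function name robots_allowed_sketch and its simplified group handling are assumptions.

```r
# Minimal sketch of a robots.txt check (illustrative, NOT seoR's implementation).
# Simplifications: no wildcard/"$" pattern matching, no Allow-rule precedence,
# and a blank line is treated as the end of a user-agent group.
robots_allowed_sketch <- function(url, bot = "googlebot") {
  base <- sub("^(https?://[^/]+).*", "\\1", url)   # scheme + host
  path <- sub("^https?://[^/]+", "", url)          # path component
  if (path == "") path <- "/"

  txt <- tryCatch(
    readLines(paste0(base, "/robots.txt"), warn = FALSE),
    error = function(e) character(0)
  )
  if (length(txt) == 0) return(TRUE)  # no robots.txt: everything is allowed

  current_agents <- character(0)  # user agents of the group being parsed
  disallow <- character(0)        # Disallow prefixes that apply to `bot`

  for (line in txt) {
    line <- sub("#.*$", "", line)  # strip comments
    if (grepl("^\\s*user-agent\\s*:", line, ignore.case = TRUE)) {
      agent <- trimws(sub("^[^:]*:", "", line))
      current_agents <- c(current_agents, tolower(agent))
    } else if (grepl("^\\s*disallow\\s*:", line, ignore.case = TRUE)) {
      rule <- trimws(sub("^[^:]*:", "", line))
      # an empty Disallow value means "allow everything" for that group
      if (nzchar(rule) && any(current_agents %in% c(tolower(bot), "*"))) {
        disallow <- c(disallow, rule)
      }
    } else if (!grepl(":", line)) {
      current_agents <- character(0)  # blank line closes the group
    }
  }

  # allowed unless the path starts with one of the applicable Disallow prefixes
  !any(startsWith(path, disallow))
}
```

For production use, a dedicated parser such as the robotstxt package on CRAN (e.g. its paths_allowed() function) handles the edge cases this sketch ignores.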

dschmeh/seoR documentation built on Jan. 7, 2023, 12:19 a.m.