Description

This function fetches and parses the robots.txt file of the website specified in the first argument, and returns a list of the corresponding rules.
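For context, a robots.txt file is a plain-text file served at the site root (e.g. http://www.glofile.com/robots.txt) that tells crawlers, per user agent, which paths they may not visit. A small illustrative file (hypothetical content) might look like:

User-agent: *
Disallow: /private/
Disallow: /tmp/

User-agent: AgentX
Disallow: /

A "Disallow: /" rule under a matching User-agent group blocks that crawler from the entire site.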
Usage

RobotParser(website, useragent)
Arguments

website      character, the URL of the website whose robots.txt rules are to be extracted.

useragent    character, the user agent string of the crawler.
Value

Returns a list of three elements: the first is a character vector of the Disallowed directories, and the third is a Boolean value which is TRUE if the crawler's user agent is blocked by the website.
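A minimal sketch of consuming this result, assuming the elements are accessed by position as described above (first element = Disallowed paths, third element = blocked flag); the element names themselves are not documented here:

# Hypothetical usage sketch, not taken from the package documentation.
rules <- RobotParser("http://www.glofile.com", "AgentX")

# Abort if this site blocks our user agent outright.
if (isTRUE(rules[[3]])) stop("AgentX is blocked by robots.txt; do not crawl.")

# Skip any URL whose path begins with a Disallowed directory.
is_allowed <- function(url) {
  path <- sub("^https?://[^/]+", "", url)
  !any(startsWith(path, rules[[1]]))
}
is_allowed("http://www.glofile.com/private/page.html")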
Examples

# RobotParser("http://www.glofile.com", "AgentX")
# Returns the robots.txt rules and checks whether AgentX is blocked or not.
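For readers curious about what fetching and parsing robots.txt involves, here is a simplified sketch in base R. It is not the package's implementation: it ignores Allow rules, wildcard patterns, and grouped User-agent lines, and exists only to show the general technique.

# Hypothetical helper, not the actual RobotParser implementation.
parse_robots <- function(website, useragent) {
  url <- paste0(sub("/+$", "", website), "/robots.txt")
  lines <- trimws(readLines(url, warn = FALSE))
  disallow <- character(0)
  active <- FALSE  # TRUE while inside a User-agent section that matches us
  for (line in lines) {
    if (grepl("^User-agent:", line, ignore.case = TRUE)) {
      agent <- trimws(sub("^User-agent:", "", line, ignore.case = TRUE))
      active <- agent == "*" || grepl(agent, useragent, fixed = TRUE)
    } else if (active && grepl("^Disallow:", line, ignore.case = TRUE)) {
      path <- trimws(sub("^Disallow:", "", line, ignore.case = TRUE))
      if (nzchar(path)) disallow <- c(disallow, path)
    }
  }
  # Blocked outright if the whole site is disallowed for this agent.
  list(Disallow = disallow, Blocked = "/" %in% disallow)
}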