This function fetches and parses the robots.txt file of the website specified in the first argument and returns a list of the corresponding rules.
character, URL of the website whose rules are to be extracted.
character, the user agent of the crawler.
Returns a list of three elements: the first is a character vector of the Disallowed directories, and the third is a Boolean value which is TRUE if the crawler's user agent is blocked.
# RobotParser("http://www.glofile.com", "AgentX")  # Returns robots.txt rules and checks whether AgentX is blocked
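For illustration only, the same idea — fetching/parsing robots.txt rules and checking whether a given user agent is blocked — can be sketched outside R with Python's standard-library `urllib.robotparser`. This is not the RobotParser function documented here; the rules and the AgentX user agent below are hypothetical examples.

```python
from urllib.robotparser import RobotFileParser

# Parse an in-memory robots.txt (a real crawler would fetch it via
# set_url("http://www.glofile.com/robots.txt") followed by read()).
rp = RobotFileParser()
rp.parse([
    "User-agent: AgentX",   # hypothetical blocked crawler
    "Disallow: /",          # AgentX may fetch nothing
    "",
    "User-agent: *",
    "Disallow: /private/",  # all other agents: only /private/ is off-limits
])

# AgentX is fully blocked; other agents are blocked only from /private/.
print(rp.can_fetch("AgentX", "http://www.glofile.com/page"))           # False
print(rp.can_fetch("OtherBot", "http://www.glofile.com/private/x"))    # False
print(rp.can_fetch("OtherBot", "http://www.glofile.com/public"))       # True
```

A crawler would call a check like this once per candidate URL before downloading, which is the same decision the Boolean element in the return value above encodes.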