The 'Robots Exclusion Protocol' <https://www.robotstxt.org/orig.html> documents a set of standards for allowing or excluding robot/spider crawling of different areas of site content. Tools are provided which wrap the 'rep-cpp' <https://github.com/seomoz/rep-cpp> C++ library for processing these 'robots.txt' files.
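A minimal sketch of how the package is typically used, assuming the `robxp()`, `can_fetch()`, `crawl_delays()`, and `sitemaps()` functions exported by spiderbar; the inline `robots.txt` content is invented for illustration:

```r
library(spiderbar)

# A small robots.txt, supplied inline for illustration
rt <- c(
  "User-agent: *",
  "Disallow: /private/",
  "Crawl-delay: 5",
  "Sitemap: https://example.com/sitemap.xml"
)

rep <- robxp(rt)              # parse the rules into a rep object

can_fetch(rep, "/index.html") # not excluded for the default agent
can_fetch(rep, "/private/x")  # matches the Disallow rule
crawl_delays(rep)             # per-agent crawl delays, if any
sitemaps(rep)                 # sitemap URLs declared in the file
```

`can_fetch()` also takes a `user_agent` argument when you need to test rules for a specific crawler rather than the `*` wildcard.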
| Package details | |
|---|---|
| Author | Bob Rudis (bob@rud.is) [aut, cre], SEOmoz, Inc [aut] |
| Maintainer | Bob Rudis <bob@rud.is> |
| License | MIT + file LICENSE |
| Version | 0.2.5 |
| URL | https://github.com/hrbrmstr/spiderbar |
| Package repository | View on CRAN |
Installation

Install the latest version of this package by entering the following in R:
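The install command was dropped from the page; for a package on CRAN it is the standard one:

```r
# Install the released version from CRAN
install.packages("spiderbar")
```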