API for robotstxt
A 'robots.txt' Parser and 'Webbot'/'Spider'/'Crawler' Permissions Checker
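
The package's main entry points are paths_allowed(), for one-off permission checks, and robotstxt(), which builds a reusable object around a fetched robots.txt file. A minimal usage sketch (the domain and paths are placeholders):

    library(robotstxt)

    # one-off check: may a bot fetch these paths on this domain?
    paths_allowed(
      paths  = c("/", "/images/"),
      domain = "example.com",
      bot    = "*"
    )

    # reusable object: fetch and parse robots.txt once, then check paths
    rt <- robotstxt(domain = "example.com")
    rt$check(paths = c("/", "/images/"), bot = "*")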

Global functions
%>% Man page
as.list.robotstxt_text Man page, Source code
fix_url Man page, Source code
get_robotstxt Man page, Source code
get_robotstxt_http_get Man page, Source code
get_robotstxts Man page, Source code
guess_domain Man page, Source code
http_domain_changed Man page, Source code
http_subdomain_changed Man page, Source code
http_was_redirected Man page, Source code
is_suspect_robotstxt Man page, Source code
is_valid_robotstxt Man page, Source code
list_merge Man page, Source code
named_list Man page, Source code
null_to_defeault Man page, Source code
on_client_error_default Man page
on_domain_change_default Man page
on_file_type_mismatch_default Man page
on_not_found_default Man page
on_redirect_default Man page
on_server_error_default Man page
on_sub_domain_change_default Man page
on_suspect_content_default Man page
parse_robotstxt Man page, Source code
parse_url Man page, Source code
paths_allowed Man page, Source code
paths_allowed_worker_spiderbar Man page, Source code
print.robotstxt Man page, Source code
print.robotstxt_text Man page, Source code
reduce Source code
remove_domain Man page, Source code
request_handler_handler Man page, Source code
robotstxt Man page, Source code
rt_cache Man page
rt_get_comments Man page, Source code
rt_get_fields Man page, Source code
rt_get_fields_worker Man page, Source code
rt_get_rtxt Man page, Source code
rt_get_useragent Man page, Source code
rt_last_http Man page
rt_list_rtxt Man page, Source code
rt_request_handler Man page, Source code
sanitize_path Man page, Source code
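
When more control over fetching and parsing is needed, several of the lower-level functions listed above can be combined. A sketch of that workflow, again with a placeholder domain and with the parsed element names taken from parse_robotstxt()'s documented return value:

    library(robotstxt)

    # fetch the raw robots.txt text for a domain
    txt <- get_robotstxt(domain = "example.com")

    # sanity checks: does the response look suspicious (e.g. an HTML page
    # served instead of robots.txt), and is the content syntactically valid?
    is_suspect_robotstxt(txt)
    is_valid_robotstxt(txt)

    # parse into a list of data frames: useragents, permissions, sitemap, ...
    rules <- parse_robotstxt(txt)
    rules$permissions
    rules$useragents
    rules$sitemap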