crawl_errors: Fetch a time-series of Googlebot crawl errors.


Description

Get a list of errors detected by Googlebot over time. See here for details: https://developers.google.com/webmaster-tools/v3/urlcrawlerrorscounts/query

Usage

crawl_errors(siteURL, category = "all", platform = c("all", "mobile",
  "smartphoneOnly", "web"), latestCountsOnly = FALSE)

Arguments

siteURL

The URL of the website to fetch crawl errors for. Must include the protocol (http://).

category

Crawl error category. Defaults to 'all'. See Details for the available categories.

platform

The user agent type: 'all', 'mobile', 'smartphoneOnly' or 'web'.

latestCountsOnly

Default FALSE. If TRUE, only the latest crawl error counts are returned.

Details

The timestamp is converted to a date, as counts are only available daily.

Category is one of: authPermissions, manyToOneRedirect, notFollowed, notFound, other, roboted, serverError, soft404.

Platform is one of: mobile, smartphoneOnly or web.

Value

A data.frame of errors with columns $platform, $category, $count and $timecount.
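
A minimal usage sketch. It assumes you have authenticated via the package's scr_auth() function and that the site (http://www.example.com is a placeholder) is verified in your Search Console account:

```r
library(searchConsoleR)

# Authenticate with Google (opens a browser for OAuth)
scr_auth()

# All error categories, all platforms, full time-series
errors <- crawl_errors("http://www.example.com")

# Only the latest counts of 404s seen by the desktop web crawler
latest_404 <- crawl_errors("http://www.example.com",
                           category = "notFound",
                           platform = "web",
                           latestCountsOnly = TRUE)
```

Setting latestCountsOnly = TRUE is useful for dashboards that only need the current error totals rather than the daily history.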

See Also

Other working with search console errors: error_sample_url, fix_sample_url, list_crawl_error_samples


