sfr_crawl: Run an SEO crawl with Screaming Frog SEO Spider and choose...

Description Usage Arguments Details Value

View source: R/sfr_crawl.R

Description

This function wraps the Screaming Frog CLI (available in versions 10+). It performs an SEO crawl and exports the results.

Usage

sfr_crawl(url, output_folder = NULL, timestamped_output = FALSE,
  export_tabs = NULL, export_bulk = NULL, export_report = NULL,
  format = "csv", save_crawl_file = FALSE, overwrite = FALSE,
  headless = TRUE, create_sitemap = FALSE,
  create_images_sitemap = FALSE, config = NULL, use_majestic = NULL,
  use_mozscape = NULL, use_ahrefs = NULL,
  use_google_analytics = NULL, use_google_search_console = NULL)
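A hedged sketch of the two crawl modes described below; the URLs and output folder are illustrative assumptions, while the function and argument names come from this page:

```r
library(screamingFrogR)

# Domain-crawl mode: a single URL starts a crawl of the whole domain
sfr_crawl(
  url           = "https://example.com",
  output_folder = "crawl-results",      # reports are written here
  export_tabs   = c("Internal:All"),
  headless      = TRUE
)

# List-crawl mode: a vector of URLs crawls only those pages
sfr_crawl(
  url         = c("https://example.com/a", "https://example.com/b"),
  export_tabs = c("Internal:All")
)
```

Running either call requires Screaming Frog SEO Spider 10+ to be installed and its EULA accepted.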

Arguments

url

string or vector of strings. What Screaming Frog should crawl. A single URL triggers domain-crawl mode (crawling the domain starting from the provided URL); a vector of URLs triggers list-crawl mode, in which only the provided URLs are crawled

output_folder

string. Path to the folder in which the output reports will be stored. If NULL, the reports are created in the current working directory. Defaults to NULL

timestamped_output

logical. Should the output be created in a timestamped subfolder of the output folder? Defaults to FALSE

export_tabs

vector of strings. Supply a character vector of tabs and filters to export. Specify the tab name and the filter name as they appear in the Screaming Frog GUI: c("tab:filter", ...). Example: c("Internal:All", "External:All"). Defaults to NULL

export_bulk

vector of strings. Supply a character vector of bulk exports to perform. The export names are the same as in the Bulk Export menu in the UI. To access exports in a submenu, use c("submenu-name:export-name"). Example: c("All Inlinks", "Directives:Index Inlinks"). Defaults to NULL

export_report

vector of strings. Supply a character vector of reports to save. The report names are the same as in the Reports menu in the UI. To access reports in a submenu, use c("submenu-name:report-name"). Example: c("Crawl Overview", "Hreflang:All hreflang URLs"). Defaults to NULL
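An illustrative sketch combining the three export arguments; the URL is an assumption, and the tab, bulk-export and report names (taken from the examples above) must match the Screaming Frog GUI labels exactly:

```r
library(screamingFrogR)

sfr_crawl(
  url           = "https://example.com",
  export_tabs   = c("Internal:All", "External:All"),
  export_bulk   = c("All Inlinks", "Directives:Index Inlinks"),
  export_report = c("Crawl Overview", "Hreflang:All hreflang URLs"),
  format        = "xlsx"   # applies to tabs, bulk exports and reports,
                           # but not to a saved crawl file
)
```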

format

string. Format used for all exports. Available options are "csv", "xls" and "xlsx". Defaults to "csv". IMPORTANT: this only affects exported tabs, bulk exports and reports; if you decide to save the crawl file, it is not affected

save_crawl_file

logical. Should the crawl file be saved? Defaults to FALSE

overwrite

logical. Should the files in the output folder be overwritten? Defaults to FALSE

headless

logical. Should the crawler run in silent mode, without opening a user-interface window? Defaults to TRUE

create_sitemap

logical. Should the crawler create a sitemap from the completed crawl? Defaults to FALSE

create_images_sitemap

logical. Should the crawler create an images sitemap from the completed crawl? Defaults to FALSE

config

string. Path to a configuration file for the spider to use. If NULL, the crawler uses the default configuration. Defaults to NULL

use_majestic

??? Use Majestic API during crawl

use_mozscape

??? Use Mozscape API during crawl

use_ahrefs

??? Use Ahrefs API during crawl

use_google_analytics

??? Use Google Analytics API during crawl

use_google_search_console

??? Use Google Search Console API during crawl

Details

This package requires Screaming Frog version 10.0 or above.

The crawler requires accepting the EULA, and some features must be activated by providing a license.

For more information, see: https://www.screamingfrog.co.uk/seo-spider/user-guide/general/

Value

Spreadsheet files in the chosen directory containing the requested reports, and/or the crawl file itself


Leszek-Sieminski/screamingFrogR documentation built on May 14, 2021, 2:45 p.m.