scraply: Scrape urls with ldply, handling errors


Description

This function works like ldply, but is built specifically for page scraping. Like ldply, it applies a function over a list (in this case, a list of urls) and returns a data.frame. The difference is that scraply includes error handling and logging automagically, which saves you a ton of time when you want to quickly write and deploy a page scraper. Happy scraplying!

Usage

  scraply(ids, fx, sleep = 0)

Arguments

ids

A character vector of ids/urls to feed to a scraping function

fx

The scraping function to apply across the ids/urls

sleep

Seconds to sleep between iterations.

Value

A data.frame created by the scraping function (fx), with an added "error" column. URLs that don't return data will have their scraped fields filled with NAs.

Examples

# see example in the README
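
As an illustrative sketch only (not taken from the package docs): a typical scraping function returns a one-row data.frame per url, and scraply applies it across the vector. The `scrape_title` helper and the use of rvest here are assumptions for illustration, not part of scraply.

```r
library(rvest)    # assumed helper library for parsing HTML; not a scraply dependency

# Hypothetical scraping function: fetch a page and return its <title> as a data.frame
scrape_title <- function(url) {
  page <- read_html(url)
  data.frame(
    url   = url,
    title = html_text(html_element(page, "title")),
    stringsAsFactors = FALSE
  )
}

urls <- c("https://example.com", "https://example.org")

# Apply scrape_title across urls, sleeping 1 second between requests.
# Per the Value section, failed urls appear with NA fields and an "error" column.
results <- scraply(urls, fx = scrape_title, sleep = 1)
```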

abelsonlive/scraply documentation built on May 10, 2019, 4:09 a.m.