hrbrmstr/sanders: Web-scraping and Web-crawling Content Parsing, Validation and Sanitization Helpers

When researchers crawl or scrape the web for content, they are talking to unfamiliar computers running software created by a diverse array of humans. Content -- even content that is supposed to adhere to internet standards -- can have very rough edges that need to be smoothed out to be useful and to be less harmful to the systems running the scraping and analysis code. Methods are provided that sand off the rough edges of many different types of scraped content and metadata.

Getting started
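
The package's own function reference is not reproduced on this page, so the following base-R sketch is illustrative only: it shows the kind of "rough edge" smoothing described above (stripping invalid bytes, control characters, and stray whitespace from scraped text) without assuming anything about the sanders API.

raw <- "\u0007  Hello,\u00A0 world!\tSee https://example.com  \r\n"

clean <- iconv(raw, "UTF-8", "UTF-8", sub = "")    # drop invalid byte sequences
clean <- gsub("[[:cntrl:]]", " ", clean)           # replace control characters with spaces
clean <- gsub("\u00A0", " ", clean, fixed = TRUE)  # normalize non-breaking spaces
clean <- trimws(gsub("[[:space:]]+", " ", clean))  # collapse runs of whitespace
clean
#> [1] "Hello, world! See https://example.com"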

Package details

Author: Bob Rudis (bob@rud.is)
Maintainer: Bob Rudis <bob@rud.is>
License: AGPL
Version: 0.1.0
URL: https://github.com/hrbrmstr/sanders
Installation

Install the latest version of this package by entering the following in R:
install.packages("remotes")
remotes::install_github("hrbrmstr/sanders")
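
Once installed, attach the package in the usual way:

library(sanders)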