salimk/Rcrawler: Web Crawler and Scraper

Performs parallel web crawling and web scraping. It is designed to crawl, parse, and store web pages, producing data that can be used directly in analysis applications. For details see Khalil and Fakir (2017) <DOI:10.1016/j.softx.2017.04.004>.

Getting started

Package details

Maintainer
License: GPL (>= 2)
Version: 0.1.9-1
URL: https://github.com/salimk/Rcrawler/
Package repository: GitHub
Installation Install the latest version of this package by entering the following in R:
install.packages("remotes")
remotes::install_github("salimk/Rcrawler")
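Once installed, a crawl is started with the package's main `Rcrawler()` function. The sketch below shows a minimal call; the target URL is a placeholder, and the parameter values are illustrative choices rather than recommended defaults (this example contacts the network and is not meant to be run as-is):

```r
library(Rcrawler)

# Crawl a site in parallel: pages are stored in a local project folder,
# and a summary data frame named INDEX is created in the global environment.
Rcrawler(Website = "https://www.example.com",  # placeholder URL
         no_cores = 4,   # CPU cores used for parallel crawling
         no_conn = 4,    # simultaneous HTTP connections
         MaxDepth = 2)   # follow links at most two levels deep
```

See the package documentation for scraping-oriented arguments (e.g. extracting page elements by XPath during the crawl).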
salimk/Rcrawler documentation built on May 25, 2020, 5:02 p.m.