This package provides several S4 classes that make it easier to collect and work with datasets. It is inspired by the scraperwiki project, which provides a web-based service for data collection, and by Mathematica's xxxData functions, which provide built-in parametrizable datasets.
You can specify web resources with the urldata and xsparql functions. For working with locally saved data, see the internalData and csvdata functions. The objects instantiated with these functions can then be passed to the generic query, along with some parameters, to get at the data. You can combine several resources with the datamart function.
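The read workflow described above might look as follows. This is only a sketch: the argument names passed to urldata and query, the resource name, and the URL are assumptions for illustration, not the package's documented signatures.

```r
library(datamart)

# Define a web resource (hypothetical resource name and URL template).
# The exact arguments accepted by urldata() are an assumption here.
prices <- urldata(
  resource = "DailyPrices",
  template = "http://example.com/prices?symbol=%s"
)

# query() dispatches on the resource object and the resource name;
# additional parameters (here, symbol) parametrize the request.
dat <- query(prices, "DailyPrices", symbol = "ABC")

# Several resources can be bundled into one object with datamart(),
# so that query() offers them under a single interface.
mart <- query(datamart(prices), "DailyPrices", symbol = "ABC")
```

The design mirrors Mathematica's xxxData functions: the resource object encapsulates where and how to fetch, while query supplies the varying parameters.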
Besides parameterized queries ("read" operations), the package also aims to support "write" operations.
For this purpose, some functions (currently mdreport
, swvreport
) for defining targets
as well as some functions (currently blogger
and dirloc
) for defining locations
are provided. The generic put
then builds the target and puts it at the defined location.
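A write operation might be sketched as follows. Again, the argument names given to mdreport and dirloc are assumptions for illustration; consult the package documentation for the actual signatures.

```r
library(datamart)

# Define a target: a markdown report (hypothetical arguments).
rpt <- mdreport(name = "MyReport")

# Define a location: a local directory (hypothetical arguments).
loc <- dirloc(path = tempdir())

# put() builds the target and places it at the location;
# blogger() would define a remote blog location instead of a directory.
put(rpt, loc)
```

Separating targets from locations means the same report definition can be written to a directory during development and posted to a blog for publication.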
Some examples serve as proof of concept, for instance dbpedia, sourceforge, expenditures, and city_coords.
The package is highly experimental and likely to change heavily without backward compatibility.
Karsten Weinert, factbased blogspot.