getPages: Download SNPedia pages


View source: R/getPages.R

Description

A function to download the (wiki) text content of a list of SNPedia pages.

Usage

getPages(titles, verbose = FALSE, limit = 50,
  wikiParseFunction = identity, baseURL, format, query, ...)

Arguments

titles

Titles of the pages to be downloaded.

verbose

If TRUE, progress messages are printed.

limit

The maximum number of items to be queried at a time.

wikiParseFunction

Function used to parse the wiki code at download time. The default is identity, so the raw wiki text is returned.

baseURL

SNPedia bots URL.

format

Download format. Currently only JSON is available.

query

The query to be iterated.

...

Any parameters to be passed to wikiParseFunction.

Details

The JSON response is parsed internally to extract the wiki text returned by the function.

If the wikiParseFunction parameter is provided, the pages are parsed internally as each batch of pages is downloaded.

Pages do not need to be of the same class, but users should be aware of the type of pages they are querying, especially when using their own wikiParseFunction.

Parameters baseURL, format and query are not intended for end users.
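
As an illustrative sketch of parsing at download time (assuming the SNPediaR package is installed and SNPedia is reachable over the network; the tag names below are hypothetical and may not be present on every page), a custom wikiParseFunction can reduce each page to selected template tags as each batch arrives:

```r
library(SNPediaR)

## A sketch, not a definitive recipe: extract a few template tags from
## each page's wiki text as it is downloaded. The tag names are
## illustrative examples, not guaranteed fields of every SNPedia page.
myParser <- function(page) {
  extractTags(page, tags = c("rsid", "Chromosome", "position"))
}

## Requires network access to SNPedia; each element of res is the
## parsed output of myParser rather than raw wiki text.
res <- getPages(titles = c("Rs1234", "Rs53576"),
                verbose = TRUE,
                wikiParseFunction = myParser)
str(res)
```

Parsing inside getPages in this way avoids keeping the full wiki text of every page in memory when only a few fields are needed.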

Value

A list containing the wiki content of the required pages or the formatted objects returned by the wikiParseFunction applied to each page.

See Also

extractTags, getCategoryElements

Examples

res <- getPages(titles = "Rs1234")
res

res <- getPages(titles = c("Rs1234", "Rs1234(A;A)", "Rs1234(A;C)"))
res

myfun <- function(x) substring(x, 1, 5)
lapply(res, myfun)

res <- getPages(titles = c("Rs1234", "Rs1234(A;A)", "Rs1234(A;C)"),
                wikiParseFunction = myfun)
res

SNPediaR documentation built on Nov. 8, 2020, 5:08 p.m.