Takes a set of links and downloads the data for the individual record at each link, saving that data to disk link by link. This is a little slower than collecting everything first and then saving it all at once, but it means that (a) we don't have to hold millions of records' worth of data in memory, and (b) if the connection is lost for whatever reason, we still have all the data retrieved up to that point. It also has a built-in pause: after every 1000 links (roughly half an hour of downloading), it stops querying the server for around ten minutes.
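The download loop described above (write each record to disk as it arrives, pause after every batch of 1000 links) can be sketched as follows. This is an illustrative Python sketch of the pattern, not the package's R implementation; `fetch_record`, `batch_size`, and `pause_seconds` are placeholder names introduced here for illustration.

```python
import time

def get_record_data(links, filename, fetch_record,
                    batch_size=1000, pause_seconds=600):
    """Download each link's record and append it to `filename` as we go,
    so partial progress survives a lost connection. Sleep between
    batches to avoid hammering the server."""
    with open(filename, "a", encoding="utf-8") as out:
        for i, link in enumerate(links, start=1):
            out.write(fetch_record(link) + "\n")
            out.flush()  # persist this record to disk immediately
            if i % batch_size == 0 and i < len(links):
                time.sleep(pause_seconds)
```

Because the file is opened in append mode and flushed after every record, a crash or dropped connection at link n leaves the first n records safely on disk; rerunning can then resume from the remaining links.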
getRecordData(links, filename)
links: Either a vector of links, or the filename of a links file that was written out by getLinks.
filename: The name of the file that the data will be written into.