View source: R/download_climate.R
download_climate | R Documentation
Description

For polite scraping, a 5-second interval is set inside download_climate(), so downloading the climate data for all stations takes over 5 hours. If you do not need to refresh the climate data, please use the bundled data via "data(climate_world)". The source web page is: https://www.data.jma.go.jp/gmd/cpd/monitor/nrmlist/
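If the bundled data is enough for your purpose, a minimal sketch of using it instead of re-scraping (assuming the package providing download_climate() is attached):

# Use the pre-built dataset instead of downloading everything again.
data(climate_world)
head(climate_world)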
Usage

download_climate(url)
Arguments

url: A string specifying the URL of the target HTML page.
Value

A tibble containing climate and station information, or NULL if the download fails.
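As an illustration of the NULL-on-failure return value, here is a minimal sketch that fetches a single station page; it assumes station_links has a url column, as in the example below:

# Fetch one station page and guard against a failed download (NULL return).
library(dplyr)
library(stringi)
data(station_links)
station_links <- station_links %>%
  dplyr::mutate_all(stringi::stri_unescape_unicode)
one_url <- station_links$url[1]
one_station <- download_climate(one_url)  # takes > 5 sec (polite scraping)
if (is.null(one_station)) {
  message("download_climate() failed for: ", one_url)
}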
Examples

# If you want all climate data, remove head().
# The code takes > 5 sec because of polite scraping.
library(magrittr)
library(stringi)
library(dplyr)

data(station_links)
station_links <- station_links %>%
  dplyr::mutate_all(stringi::stri_unescape_unicode) %>%
  head(3) %T>%
  {
    continent <<- `$`(., "continent")
    no        <<- `$`(., "no")
  } %>%
  `$`("url")

climate <- list()
for (i in seq_along(station_links)) {
  print(stringr::str_c(i, " / ", length(station_links)))
  climate[[i]] <- download_climate(station_links[i])
}

# Run only when every download_climate() call succeeded
if (sum(is.null(climate[[1]]), is.null(climate[[2]]), is.null(climate[[3]])) == 0) {
  month_per_year <- 12
  climate_world <- dplyr::bind_rows(climate) %>%
    dplyr::bind_cols(
      tibble::tibble(continent = rep(continent, month_per_year))) %>%
    dplyr::bind_cols(
      tibble::tibble(no = rep(no, month_per_year))) %>%
    dplyr::relocate(no, continent, country, station)
  climate_world
}
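The for loop above can also be written with purrr; this is an optional alternative sketch, not part of the original example:

# Alternative to the for loop: map download_climate() over the station URLs.
library(purrr)
climate <- purrr::map(station_links, download_climate)
# Check that every download succeeded before binding, as in the example above.
all_ok <- !any(purrr::map_lgl(climate, is.null))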