- `retrieve_survey_data()` has been replaced by `get_kobo_data()`, updated to retrieve data from the new API. The function `retrieve_survey_metadata()` has been deprecated.
- `googlesheets4` package.
- Improve national and municipal estimates by combining the packages `Amelia` and `mice` for imputation of missing data and outliers.
- Integrate price per kg into export data.
- Add `ingest_pds_map()`. The function processes and uploads a data frame containing the number of trips, CPE (catch per unit effort), and RPE (revenue per unit effort) split by grid cell to produce Leaflet maps in the web portal.
- Fix an error in the cleaning of legacy landings: the columns indicating the number of individuals and the fish length for catches > 60 cm were swapped.
- Add an option to produce the Timor map filtered by fishing trips.
- Improve the validation step by flagging observations with positive revenue despite zero or missing individuals, and observations with positive individuals despite zero or missing revenue.
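As an illustration, a flag of this kind can be computed along the following lines. This is only a sketch: the column names `n_individuals` and `revenue` are hypothetical and not necessarily those used in the package.

```r
# Sketch: flag observations where revenue and individual counts disagree.
# Column names are illustrative, not the package's actual schema.
landings <- data.frame(
  n_individuals = c(10, 0, NA, 5),
  revenue       = c(25, 12, 8, 0)
)

landings$alert_flag <-
  (landings$revenue > 0 &
     (is.na(landings$n_individuals) | landings$n_individuals == 0)) |
  (landings$n_individuals > 0 &
     (is.na(landings$revenue) | landings$revenue == 0))
# rows 2-4 are flagged
```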
- Use a log model to identify abnormal weight-revenue relationships based on Cook's distance.
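A minimal version of such an outlier screen could look like the sketch below, assuming hypothetical `weight` and `revenue` columns and the common 4/n rule-of-thumb threshold; the package's actual model and cutoff may differ.

```r
# Sketch: fit a log-log weight ~ revenue model on simulated data and
# flag influential points via Cook's distance. Names are illustrative.
set.seed(42)
catches <- data.frame(weight = exp(rnorm(100, mean = 2, sd = 0.5)))
catches$revenue <- catches$weight * exp(rnorm(100, mean = 1, sd = 0.2))

fit <- lm(log(revenue) ~ log(weight), data = catches)
d <- cooks.distance(fit)
catches$influential <- d > 4 / nrow(catches)  # rule-of-thumb cutoff
```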
- Replace NA catch codes with catch code "0" and ensure these observations have non-positive individuals and revenue.
- Use total length for the weight calculation of MOO in all the landings (weights calculated with FL seem quite unrealistic).
- Use the 95th quantile instead of the median to summarise weight parameters for catch types; it seems to return more realistic weight estimates for single individuals.
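For illustration, a per-catch-type weight summary using the 95th quantile can be computed as below; the `catch_type` and `weight` columns are assumed, not taken from the package.

```r
# Sketch: summarise weights per catch type with the 95th quantile.
set.seed(1)
weights <- data.frame(
  catch_type = rep(c("tuna", "sardine"), each = 50),
  weight     = c(rlnorm(50, meanlog = 1), rlnorm(50, meanlog = 2))
)

q95 <- tapply(weights$weight, weights$catch_type,
              quantile, probs = 0.95, na.rm = TRUE)
```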
- Add `ingest_pds_matched_trips()` to ingest matched PDS tracks and survey landings in a zip folder on a monthly scale.
- Add `get_nutrients_table()`. The function links to the repository https://github.com/mamacneil/NutrientFishbase and joins the estimated nutrient values with the FishBase species data.
- Add functions (`get_sync_tracks()`, `get_full_tracks()`, `get_full_trips()`) to retrieve the complete file of PDS tracks.
- Add `ingest_pds_map()` and `get_tracks_map()`.
- Add `ingest_rfish_table()` to the main pipeline.
- Add functions (`get_catch_types()`, `get_fish_length()`, `retrieve_lengths()`) to retrieve morphometric conversion factors from catch type names in the metadata tables.
- Add `join_weights()`, which integrates morphometric data with the merged landings.
- Add `merge_trips()`, which integrates data from the landings and tracking together.
- Add `format_public_data()` to format and export data to be used for analytics.
- Add `preprocess_pds_trips()` and `validate_pds_trips()`. These functions make sure the data types are appropriate, check for trip duplicates, and perform basic checks on trip duration and distance.
- Add `retrieve_pds_trips_data()`, `retrieve_pds_trips()`, `retrieve_pds_tracks_data()`, and `retrieve_pds_tracks()` to download trips and tracks from the Pelagic Data Systems API.
- Add `get_pds_res()` to convert Pelagic Data Systems API responses to data frames and merge trips and tracks data into a single file.
- Add `ingest_pds_trips()` and `ingest_pds_tracks()` to upload Pelagic Data Systems data to the cloud.
- `cloud_object_name()` returns an empty vector when the bucket is empty.
- `retrieve_survey_data()`
- Add `merge_landings()` to merge and upload pre-processed recent and legacy landings data.
- Add `clean_catches()`, `coalist()`, and `clean_legacy_landings()` to restructure legacy landings into the recent landings format.
- Add `preprocess_legacy_landings()` to clean and ingest preprocessed legacy data.
- Add `validate_landings()` and `ingest_validation_tables()` to get validation data, check the landings, and upload flags to Airtable.
- Add `air_tibble_to_records()` and `air_upload_records()` to create and update records in Airtable.
- Add `pt_ingest_legacy_landings()` to retrieve data from the legacy (SFF) landings.
- Add `air_get_records()` and `air_records_to_tibble()` to retrieve and process records from Airtable.
- Add `pt_validate_boats()`, `pt_validate_devices()`, and `pt_validate_vms_installs()` to perform basic data validation on the metadata tables.
- `cloud_object_name()` can now also match files by exact name, not just by prefix.
- `ingest_metadata_tables()` and `preprocess_metadata_tables()` now use Airtable instead of Google Sheets.
- Add `ingest_metadata_tables()` to ingest data about boats, species, municipalities, etc.
- Add `preprocess_metadata_tables()` to preprocess the data from the metadata ingestion.
- Add `pt_get_devices_table()` and `pt_validate_flags()` as helper functions for the metadata preprocessing.
- Rename `ingest_timor_landings()` to `ingest_landings()` for brevity and because all functions relate to Timor anyway.
- `preprocess_landings()`
- Add `pt_nest_attachments()` to group all attachment columns into a nested column containing data frames.
- Add `pt_nest_species()` to group all species columns into a nested column containing data frames.
- Add `cloud_object_name()` as a complement to `add_version()` to return the latest or a specified version of an object in a storage location.
- Add `download_cloud_file()` to download files from cloud storage providers.
- Add `cloud_storage_authenticate()` to authenticate to cloud storage internally instead of authenticating separately in each cloud function. This simplifies authentication and ensures authentication is not attempted when credentials have already been validated.
- `download_survey_data()`, `download_survey_metadata()`, and `download_survey()` have been renamed to `retrieve_survey_data()`, `retrieve_survey_metadata()`, and `retrieve_survey()`. This is to avoid confusion with planned functions that download data from cloud locations.
- `file_prefix` field.
- Adds infrastructure to download survey data and upload it to cloud storage providers, and implements the ingestion of East Timor landings.
- Add `ingest_timor_landings()`.
- `download_survey_data()` and `download_survey_metadata()` download data and metadata for an electronic survey hosted by kobo, kobohr, or ona. `download_survey()` can be used as a wrapper to download data and metadata in a single call.
- `upload_cloud_file()` can be used to upload a set of files to a cloud storage bucket. Currently only Google Cloud Storage (GCS) is supported.
- `add_version()` is a utility function that appends date-time and SHA information to a string and is used to version file names.
- `get_host_url()` is a utility function that gets the host URL of an electronic survey provider API.
- The data pipeline is implemented and run in GitHub Actions on a schedule.
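Such a scheduled run is typically configured with a `schedule` trigger in a GitHub Actions workflow. A minimal sketch, assuming a daily cron and the package's `ingest_landings()` as the entry point (the file name, job names, and steps are illustrative, not the repository's actual workflow):

```yaml
# Hypothetical workflow file, e.g. .github/workflows/data-pipeline.yaml.
name: data-pipeline
on:
  schedule:
    - cron: "0 2 * * *"  # daily at 02:00 UTC
jobs:
  run-pipeline:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: r-lib/actions/setup-r@v2
      - name: Run ingestion step (entry point is illustrative)
        run: Rscript -e 'pkgload::load_all(); ingest_landings()'
```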