Data for manuscript "Emulator-based calibration of a dynamic grassland model using recurrent neural networks and Hamiltonian Monte Carlo" by Aakula et al.

Description

Data and Python code for the manuscript "Emulator-based calibration of a dynamic grassland model using recurrent neural networks and Hamiltonian Monte Carlo", used for emulator hyperparameter optimization and training. The Python script optimize_LSTM_emulator.py can be used either to train an LSTM emulator with predefined hyperparameters or to optimize hyperparameters over a given hyperparameter space. The training data for each fold is included in files named training_data_fold_{}.parquet. These data are obtained from model simulations and include model inputs (meteorological forcings from ERA5 data), model parameters (sampled from the distributions defined in the manuscript) and model (BASGRA) outputs. The Python script NUTS_calibration.py can be used to calibrate the emulator with the HMC NUTS algorithm against data from the given sites and years. The calibration data for each site is included in files named observed_df_{}.parquet, which contain the meteorological data (ERA5) for the corresponding site, model parameters describing the soil properties of the site, and the measured GPP values. The text file examples.txt gives instructions and examples for running the scripts.
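For a quick look at the contents of the Parquet files, the sketch below loads one training fold and one site's calibration data with pandas (assuming a Parquet engine such as pyarrow is installed). The fold index and site identifier are hypothetical placeholders; the actual file names and the instructions for running the scripts themselves are given in examples.txt.

import pandas as pd

# Hypothetical placeholders: the real fold indices and site identifiers
# used in the file names are listed in examples.txt.
fold_id = 1
site_id = "site_A"

# Training data for one cross-validation fold: ERA5 meteorological forcings,
# sampled model parameters and the corresponding BASGRA outputs.
training = pd.read_parquet(f"training_data_fold_{fold_id}.parquet")
print(training.shape)
print(training.columns.tolist())

# Calibration data for one site: ERA5 meteorology, site-specific soil
# parameters and the measured GPP values.
observed = pd.read_parquet(f"observed_df_{site_id}.parquet")
print(observed.head())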

Publication year

2025

Type of data

Creators

Viivi Aakula - Creator, Contributor

Julius Vira - Contributor

Project

Other information

Fields of science

Geosciences

Language

English

Open access

Open

License

Creative Commons Attribution 4.0 International (CC BY 4.0)

Keywords

INSPIRE theme: environment; Agroecosystem modeling; BASGRA; Neural network; Emulation; Training data; Hyperparameter optimization; Carbon balance

Subject headings

Temporal coverage


Related to this research data

Has previous version
Data for manuscript "Emulator-based calibration of a dynamic grassland model using recurrent neural networks and Hamiltonian Monte Carlo" by Aakula et al.