Ocean dynamics on HPC – storage
NAISS Medium Storage
Göran Broström <firstname.lastname@example.org>
2023-01-27 – 2024-02-01
This project connects to computational project NAISS 2023/5-14. The sub-projects are:
Small-scale dynamics (i.e. metre scale) in tidal flows, and how a tidal power plant (Deep Green, developed by Minesto) interacts with the flow. Two different codes are used: OpenFOAM and a Fortran code for geophysical boundary-layer turbulence. Lead: Sam Fredriksson, Göran Broström.
• The OpenFOAM code produces a large number of files; we have frequently used 1,000,000 files and about 0.5–1 TB during runs. For the present project we request 1,000,000 files and 5 TB of storage for OpenFOAM.
• The geophysical (LES) turbulence model produces few files but large data sets; these are typically stored on local computers and rewritten as Matlab files. The storage need is 1–3 TB.
Lake and regional dynamics (50–1000 m scales) have been developed during 2021–2022, and this work will continue in 2023. Lead: Göran Broström.
• Regional and fjord modelling. Here ROMS (www.myroms.org) is used; simulating 1 year takes about 3 weeks on 575 cores and produces about 3 TB/year of standard output. Model output is typically stored on local computers, but for efficiency we ask for 5 TB. Input files are about 0.5 TB.
• For lake modelling, a Lake Vänern/Lake Mälaren simulation creates about 1–2 TB/year (depending on output) and runs for 8 years. Main storage is on local computers (i.e. results are downloaded). Input files are about 30 GB.
• A new setup for the Baltic Sea generates about 5 TB/year (depending on output); as the model runs for several weeks, output will be moved to local computers roughly every 5 days.
For high-resolution runs we find that writing output is a bottleneck; we therefore write output from each core, which temporarily generates a large number of files. These files are rewritten as NetCDF files, but as this step is started manually, there can be times when many files (say 1,500,000) are stored on disk.
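To illustrate why per-core output temporarily inflates the file count, a rough estimate is one file per core per output step until the manual NetCDF merge is run. The core count and number of output steps below are illustrative assumptions (the 575-core figure is borrowed from the ROMS runs above), not measured values from these runs:

```python
# Rough estimate of the temporary file count produced by per-core output.
# One file is written per core per output step, and the files accumulate
# until the manual merge into NetCDF is started.

def per_core_file_count(n_cores: int, n_output_steps: int) -> int:
    """Files on disk before the merge: one per core per output step."""
    return n_cores * n_output_steps

# Illustrative assumption: 575 cores writing ~2,600 output steps before a merge.
n_files = per_core_file_count(575, 2600)
print(n_files)  # 1,495,000 -- on the order of the 1,500,000 files mentioned above
```

This is only a sketch of the scaling; the actual count depends on how often the merge step is triggered.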
Regional–synoptic scales (1 km–100 km). There is a polynya (an opening of approximately 80,000 km²) in the Antarctic ice cover that recurs at irregular intervals (in the mid-1970s and in 2017–2018). The dynamics of these events and their impact on ocean circulation and water-mass transformation are studied using models focusing on km scales.
• A 25-year-long simulation of the regional configuration at 1/12° resolution generates about 1.3 TB per 3D field, i.e. about 6 TB in total. The 1/60° resolution is much heavier in its storage requirement, so in general only 2D fields are stored, except for the last 5 simulated years, which are kept for later analysis. The storage requirement at this resolution is therefore about 2 TB per run. In total, 10–20 TB of storage would allow flexible management of the output files.
Total request: 40 TB of storage and an allowance for 2,500,000 files.
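As a sanity check, the per-sub-project figures above can be tallied. The grouping below is our reading of the proposal rather than an official breakdown, and upper bounds are used wherever the text gives a range:

```python
# Rough tally of the storage and file-count requests listed above.
# Upper bounds are used where the text gives a range; the grouping is
# a reading of the proposal text, not an official breakdown.

storage_tb = {
    "OpenFOAM (tidal flows)": 5.0,
    "LES turbulence model": 3.0,            # upper bound of the 1-3 TB range
    "ROMS regional/fjord + input": 5.5,     # 5 TB output + 0.5 TB input
    "Lake Vänern/Mälaren + input": 2.03,    # ~2 TB/yr staging + 30 GB input (assumption)
    "Baltic Sea setup (staging)": 5.0,
    "Antarctic polynya runs": 20.0,         # upper bound of the 10-20 TB range
}

files = {
    "OpenFOAM": 1_000_000,
    "per-core output awaiting NetCDF merge": 1_500_000,
}

print(round(sum(storage_tb.values()), 2))  # ~40.5 TB, consistent with the 40 TB request
print(sum(files.values()))                 # 2,500,000 files
```

The upper-bound sum lands close to the requested 40 TB, and the two large file-count contributions add up exactly to the requested 2,500,000-file allowance.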