ARCHER2

Get the model source code

As on other platforms, clone the repository and run the deploy.sh script.

Instructions for this can be found here.
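As a rough sketch, the steps look like this — the repository URL below is a placeholder; use the real address from the instructions linked above:

```shell
# Placeholder URL -- substitute the real repository address.
git clone <repository-url> icon-hammoz
cd icon-hammoz
./deploy.sh
```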

Update mh-linux file

This file sets the machine- and compiler-specific settings/options used during the configure step. It is currently set up for the Cray compiler.

Move to the config directory.

cd icon-hammoz # or the name of the directory you created
cd icon/config

Move the existing mh-linux out of the way.

mv mh-linux mh-linux_orig

Replace with this one: mh-linux

Update configure file

The configure file also needs updating to use ecCodes instead of its predecessor, GRIB-API.

In the base directory (icon-hammoz/icon), replace the configure file with this one: configure

Make sure it has the correct permissions.

chmod 755 configure

Environment settings

Logging onto ARCHER2 automatically loads the Cray programming environment.

A few more modules need to be loaded before compiling:

module load cray-netcdf
module load cray-hdf5
module unload craype-network-ofi
module unload cray-mpich
module load craype-network-ucx
module load cray-mpich-ucx
module load libfabric
module load cpe/21.03

Compiling and Building

First, configure the model.

./configure --with-fortran=cray --with-hammoz --with-atm_phy_echam_submodels &> configure.out

Now build.

make -j 16 &> make.out

If successful, this will create a binary in the build directory, for example:

icon-hammoz/icon/build/x86_64-unknown-linux-gnu/bin/icon

You can check that the binary is complete with the command:

ldd icon

If successful, it will print out all the linked libraries.
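A quick way to spot a problem is to filter the ldd output for unresolved libraries — a healthy binary produces no "not found" lines (path as in the example above):

```shell
cd icon-hammoz/icon/build/x86_64-unknown-linux-gnu/bin
ldd icon | grep "not found" || echo "all libraries resolved"
```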

Run scripts

You can find an example run script here: exp.ICON-HAM_archer2_example.run

This is almost entirely based on the standard run scripts that you will find in the icon-hammoz/icon/run directory, but tweaked for use on ARCHER2.

The main difference is the location of the shared folder that contains the necessary inputs (grids, emission files, etc.).

Paths/things you may need to change:

* basedir (path to base icon directory where you have been working)
* bindir (path to the binary files discussed in the compilation section above)
* EXPNAME (directory name where all data will be output)
* icon_data_poolFolder (currently on a shared directory on ARCHER2)
* ham_data_poolFolder (as above)
* ham_grid_name (currently hardcoded but can be easily linked to grid details)
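For illustration, the top of the run script might then look like this — the values below are examples only, not the real paths from the shared script:

```shell
# Example values only -- adjust to your own directories and experiment name.
basedir=$HOME/icon-hammoz/icon
bindir=$basedir/build/x86_64-unknown-linux-gnu/bin
EXPNAME=ICON-HAM_archer2_test
```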

You will also need to place the lsdata.nc file in your icon-hammoz/icon/data folder.

This can either be copied from /work/n02/shared/ross/ or downloaded here: lsdata.nc
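On ARCHER2 the copy is a one-liner (paths as given above):

```shell
cp /work/n02/shared/ross/lsdata.nc icon-hammoz/icon/data/
```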

Submitting to SLURM

Here is an example submission script: submit_run.sh

You'll need to update:

* SLURM job options: account name, run name, walltime, etc.
* icon directory path: icon_dir
* run script name (at the bottom)
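For reference, a minimal ARCHER2 SLURM header looks something like this — the account code and resource numbers are placeholders, while standard/standard are the usual ARCHER2 partition and QoS names:

```shell
#!/bin/bash
#SBATCH --job-name=icon-ham      # run name
#SBATCH --account=n02-XXXX       # your budget code (placeholder)
#SBATCH --partition=standard
#SBATCH --qos=standard
#SBATCH --nodes=2                # example resource request
#SBATCH --time=01:00:00          # walltime
```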

Then submit to SLURM.

sbatch submit_run.sh

Where is my output?

The run script will make a new directory in icon-hammoz/experiments/EXPNAME and put all your lovely data in there.

You'll also be able to see all the run logs in icon.out (useful for keeping an eye on the run).
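Standard SLURM commands work for monitoring alongside the log file (run the tail command from the directory containing icon.out):

```shell
squeue -u $USER    # check your job's queue status
tail -f icon.out   # follow the model log as it runs
```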