WAVEWATCH III® on graham

This section describes the steps to set up and run the Strait of Georgia configuration of the WAVEWATCH III® model (wwatch3 hereafter) on the ComputeCanada graham.computecanada.ca HPC cluster.

Modules Setup

When working on graham, the module load command must be used to load extra software components.

You can manually load the modules each time you log in, or you can add the lines to your $HOME/.bashrc file so that they are automatically loaded upon login.

The module load command required to build wwatch3 is:

module load netcdf-fortran-mpi/4.4.4
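If you want the module loaded automatically on login, you can append the command to your $HOME/.bashrc; a guarded one-liner like this (a sketch; run it once from the command line) avoids adding duplicate lines on repeated runs:

```shell
# Append the module load command to ~/.bashrc unless it is already there
grep -qxF 'module load netcdf-fortran-mpi/4.4.4' $HOME/.bashrc \
  || echo 'module load netcdf-fortran-mpi/4.4.4' >> $HOME/.bashrc
```

Note the caveat below about the conflict with the MOHID build modules before putting this in your .bashrc.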


The netcdf-fortran-mpi/4.4.4 module is incompatible with the netcdf-fortran/4.4.4 module that is required to build MOHID. So, if you are working on both MOHID and wwatch3, you will need to manually load the appropriate module as necessary.

Create a Workspace and Clone the Configuration Repositories

graham provides several different types of file storage. We use project space for our working environments because it is large, high performance, and backed up. Scratch space is even larger, also high performance, but not backed up, so we use that as the space to execute wwatch3 runs in, and store their results. Files more than 60 days old are automatically purged from scratch space on graham.

graham automatically provides environment variables that are more convenient than remembering full paths to access your project and scratch spaces:

  • Your project space is at $PROJECT/$USER/

  • Your scratch space is at $SCRATCH/

Create MIDOSS/ directory trees in your project and scratch spaces:

$ mkdir -p $PROJECT/$USER/MIDOSS/wwatch3-5.16
$ mkdir -p $PROJECT/$USER/MIDOSS/wwatch3-runs
$ mkdir -p $SCRATCH/MIDOSS/wwatch3


If the above commands fail, it may be because the symbolic link that $PROJECT points to was not created when your graham account was set up. Try:

$ cd $HOME
$ ln -s $HOME/projects/def-allen project

Clone the SalishSeaWaves repository, the collection of configuration files for the Strait of Georgia wwatch3 model, into your MIDOSS/ project directory:

$ cd $PROJECT/$USER/MIDOSS
$ git clone git@github.com:SalishSeaCast/SalishSeaWaves.git


Unpack the wwatch3 code tarball and prepare it for building. You only need to do this once.

$ cd $PROJECT/$USER/MIDOSS/wwatch3-5.16
$ tar -xvzf /home/dlatorne/wwatch3.v5.16.tar.gz
$ ./install_ww3_tar

In answer to the questions from the install_ww3_tar script:

  • Choose a (L)ocal installation

  • Update your settings to:

    • Choose mpifort as the Fortran compiler to use

    • Choose mpicc as the C compiler to use

  • Accept the default answers for other questions

That stage of the build preparation should finish with output that looks similar to:

--- Set up / update directories ---
    Directory /home/dlatorne/project/dlatorne/MIDOSS/test-wwatch3-5.16/work

 Setting up links to comp link and switch ...
 Setting up links to selected GrADS scripts ...
 Setting up links to input files ...
 Install script not a link ...
 Install script identical to tar version, replace by link.

 --- Final remarks ---

 To run the WAVEWATCH III executables and the scripts to generate
 and update these executables from arbitrary directories, add the
 following directories to the path of your interactive shell :


 Note that 'comp' and 'link' and 'switch' are user/machine specific.

   Several comp and link files for known compilers are found in:

   If you cannot find one that suits your machine/preferences,
   create custom scripts based on the existing ones and add to bin.

                  ---       End of program        ---

Set up the comp and link scripts in the bin/ directory:

  • Edit the comp.Intel script to change the 2 occurrences of mpiifort (with 2 i s) to mpifort (with 1 i); i.e. change:

    94 comp=mpiifort
    95 which mpiifort 1> /dev/null 2> /dev/null

    to:

    94 comp=mpifort
    95 which mpifort 1> /dev/null 2> /dev/null

  • Edit the link.Intel script to change the 2 occurrences of mpiifort (with 2 i s) to mpifort (with 1 i); i.e. change:

    109 comp=mpiifort
    110 which mpiifort 1> /dev/null 2> /dev/null

    to:

    109 comp=mpifort
    110 which mpifort 1> /dev/null 2> /dev/null
  • Create symlinks for comp, link, and SalishSeaWaves/switch in the bin/ directory:

    $ cd bin
    $ ln -sf comp.Intel comp && chmod +x comp.Intel
    $ ln -sf link.Intel link
    $ ln -sf $PROJECT/$USER/MIDOSS/SalishSeaWaves/switch switch

Ensure that the netcdf-fortran-mpi module is loaded (module load is safe to repeat if it already is):

$ module load netcdf-fortran-mpi/4.4.4

Export the environment variables that are required to build wwatch3:

$ export PATH=$PATH:$PROJECT/$USER/MIDOSS/wwatch3-5.16/bin:$PROJECT/$USER/MIDOSS/wwatch3-5.16/exe
$ export NETCDF_CONFIG=$(which nc-config)

Compile and link the wwatch3 model programs:

$ cd $PROJECT/$USER/MIDOSS/wwatch3-5.16/work
$ w3_make
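If the compile and link succeed, the executables land in the exe/ directory that you added to your PATH above; a quick sanity check (assuming the paths used in this guide) is:

```shell
# List the wwatch3 executables that w3_make installed
ls $PROJECT/$USER/MIDOSS/wwatch3-5.16/exe
```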

Generate Wind & Current Forcing Files on salish

wwatch3 uses netCDF4 wind and current forcing files that are generated from the HRDPS surface forcing files that are used to force SalishSeaCast NEMO runs, and the surface current fields that are produced by those runs.

For the moment, generation of those forcing files has to be done on salish and then the files uploaded from there to graham. This section describes the process for doing that.

We use the make_ww3_wind_file and make_ww3_current_file workers from the SalishSeaCast automation system in --debug mode to generate the forcing files.


Always run the workers with the --debug command-line option. That sends all logging information from the workers to the screen instead of log files, and prevents the workers from trying to communicate with the automation system manager.

Running a worker without the --debug option may disrupt the SalishSeaCast automation system.

Follow the instructions to set up a SalishSeaNowcast Development Environment.

A special SalishSeaNowcast configuration for generating wwatch3 forcing files is stored in SalishSeaNowcast/config/wwatch3-forcing.yaml.

With your SalishSeaNowcast Development Environment activated, you can run make_ww3_wind_file with the command:

(salishsea-nowcast)$ python -m nowcast.workers.make_ww3_wind_file SalishSeaNowcast/config/wwatch3-forcing.yaml --debug salish nowcast --run-date yyyy-mm-dd

The generated forcing file will be stored in /data/MIDOSS/forcing/wwatch3/wind/SoG_wind_yyyymmdd.nc.

Likewise, you can run make_ww3_current_file with:

(salishsea-nowcast)$ python -m nowcast.workers.make_ww3_current_file SalishSeaNowcast/config/wwatch3-forcing.yaml --debug salish nowcast --run-date yyyy-mm-dd

The generated forcing file will be stored in /data/MIDOSS/forcing/wwatch3/current/SoG_current_yyyymmdd.nc.

A bash script like:

yyyy=2019  # set the year to process
mm=01      # set the month to process
for dd in {01..31}
do
  python -m nowcast.workers.make_ww3_wind_file SalishSeaNowcast/config/wwatch3-forcing.yaml --debug salish nowcast --run-date ${yyyy}-${mm}-${dd}
  python -m nowcast.workers.make_ww3_current_file SalishSeaNowcast/config/wwatch3-forcing.yaml --debug salish nowcast --run-date ${yyyy}-${mm}-${dd}
done

will enable you to run the workers for a month at a time.

The generated files are a total of 74M per day (6.8M for the wind file, and 67M for the current file). That scales to approximately 2.2G per month, 26.7G per year, and 133.3G for the 2015 to late-2019 period covered by the nowcast-green.201812 SalishSeaCast NEMO results dataset.
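Those estimates follow from simple scaling of the daily file sizes; for example (a rough calculation using decimal megabytes, a 30-day month, and a 365-day year, so the figures differ slightly from those above):

```shell
# Scale the ~73.8M/day (6.8M wind + 67M current) forcing file sizes
awk 'BEGIN {
  per_day = 6.8 + 67                      # MB per day
  printf "per 30-day month: %.1f GB\n", per_day * 30 / 1000
  printf "per year: %.1f GB\n", per_day * 365 / 1000
}'
```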

The files will be produced with -rw-r--r-- permissions. To make them group-writable, you can use:

find /data/MIDOSS/forcing/wwatch3/ -type f -execdir chmod g+w {} \;

To upload the files to graham you can use:

rsync -rltv /data/MIDOSS/forcing/wwatch3/ graham:/scratch/dlatorne/MIDOSS/forcing/wwatch3/