Installation of WRF/WPS
This guide is loosely based on the information presented in the WRF-ARW Online Tutorial. WRF expects the MPICH library for distributed-memory parallelism, but IT RCI recommends Open MPI on Caviness, so some minor adjustments are necessary. A major difference will be in how the software is organized and used.
As outlined in the software building and management guide, WRF (and accompanying WPS) will be installed in a versioned hierarchy. The base path for the installation will vary by user and use case:
- Versions managed by a single user for that user's private usage
  - Home directory, e.g. ${HOME}/sw/wrf
  - Workgroup per-user directory, e.g. ${WORKDIR}/users/«user»/sw/wrf
- Versions managed for all members of a workgroup
  - Workgroup software directory, e.g. ${WORKDIR}/sw/wrf
In this document the software will be installed for all members of a workgroup. Though the procedure assumes WRF was not previously installed for the workgroup, it has been constructed to work regardless. Procedural differences based on previous installations will be highlighted where appropriate.
Prepare Installation Path
To begin, the workgroup is entered and the installation directories are created. In this document version 4.5.2 will be built and installed:
$ workgroup -g «workgroup»
$ WRF_PREFIX="${WORKDIR}/sw/wrf"
$ WRF_VERSION="4.5.2"
$ mkdir -p "${WRF_PREFIX}/${WRF_VERSION}/src"
$ BUILD_DATE="$(date +%Y.%m.%d)"
Note that depending on the choice of base path (as outlined above) the value of WRF_PREFIX will be different. The BUILD_DATE will be used to differentiate some paths and filenames throughout this document. The WRF_VERSION corresponds with WRF itself; it is up to the user to determine the correct version of WPS to accompany the chosen version of WRF.
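As a sketch of how these variables interact, a hypothetical later release could be targeted by changing only the version variable; every derived path and filename follows automatically. (The version number and WORKDIR value below are illustrative stand-ins, not recommendations.)

```shell
# Illustrative only: all paths derive from the two variables set above.
WORKDIR="/tmp"                  # stand-in for the real workgroup storage path
WRF_PREFIX="${WORKDIR}/sw/wrf"
WRF_VERSION="4.6.0"             # hypothetical later release
BUILD_DATE="2024.04.02"         # normally $(date +%Y.%m.%d)
echo "${WRF_PREFIX}/${WRF_VERSION}/src"
echo "compile-serial-${BUILD_DATE}.log"
```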
Build Environment
WRF depends on several external libraries, including the NetCDF and NetCDF-Fortran libraries. Most of the other libraries are available in the OS.
$ vpkg_require cmake/default
$ vpkg_devrequire netcdf-fortran/4.5.4:intel-oneapi-2023,openmpi
The PNG library must be at least 1.2.50:
$ pkg-config libpng --modversion
1.5.13
The Jasper library must be at least 1.900.1:
$ pkg-config jasper --modversion
1.900.1
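If desired, these minimum-version requirements can be checked in a script rather than by eye. The helper below is a sketch, assuming GNU coreutils' `sort -V` (version sort) is available, as it is on typical Linux systems:

```shell
# Sketch: version_ok MIN ACTUAL succeeds when ACTUAL is at least MIN.
version_ok() {
  # the smaller of the two versions sorts first; it must be MIN
  [ "$(printf '%s\n%s\n' "$1" "$2" | sort -V | head -n1)" = "$1" ]
}
version_ok 1.2.50  "$(pkg-config libpng --modversion)" || echo "libpng too old"
version_ok 1.900.1 "$(pkg-config jasper --modversion)" || echo "jasper too old"
```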
The zip library (zlib) is not needed directly by WRF but by the NetCDF library to which WRF is linked. Since NetCDF is not being purpose-built for this copy of WRF (a VALET package is used instead), the zip library does not need to be built.
A directory will be created to hold dependencies and the build system tests:
$ mkdir -p "${WRF_PREFIX}/${WRF_VERSION}/src/3rd_party"
Build Dependencies
Any dependencies are downloaded and built in the 3rd_party directory and installed to the WRF version directory.
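When a dependency is installed under the WRF version directory this way, the compiler and linker are pointed at it with the usual include and library flags. A minimal sketch, with stand-in prefix values (on the cluster these come from the variables set earlier):

```shell
# Sketch: flags referencing a dependency installed under the version directory.
WRF_PREFIX="/tmp/sw/wrf"    # stand-in value
WRF_VERSION="4.5.2"
CPPFLAGS="-I${WRF_PREFIX}/${WRF_VERSION}/include"
LDFLAGS="-L${WRF_PREFIX}/${WRF_VERSION}/lib"
echo "${CPPFLAGS} ${LDFLAGS}"
```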
The build of the zip library is shown here for illustrative purposes only – it does not need to be built as part of this procedure.
$ pushd "${WRF_PREFIX}/${WRF_VERSION}/src/3rd_party"
$ wget 'https://www.zlib.net/zlib-1.3.1.tar.gz'
$ tar -xf zlib-1.3.1.tar.gz
$ pushd zlib-1.3.1
$ mkdir build-${BUILD_DATE}
$ cd build-${BUILD_DATE}
$ CC=icx \
    cmake \
        -DCMAKE_BUILD_TYPE=Release \
        -DCMAKE_INSTALL_PREFIX="${WRF_PREFIX}/${WRF_VERSION}" \
        ..
$ make
$ make install
$ popd
Build System Tests
WRF has two tests to ensure the build system is likely to succeed. In the 3rd_party directory:
$ cd "${WRF_PREFIX}/${WRF_VERSION}/src/3rd_party"
$ wget 'https://www2.mmm.ucar.edu/wrf/OnLineTutorial/compile_tutorial/tar_files/Fortran_C_NETCDF_MPI_tests.tar'
$ tar -xf Fortran_C_NETCDF_MPI_tests.tar
Test 1: Serial
$ ${FC} ${CPPFLAGS} ${FCFLAGS} -c 01_fortran+c+netcdf_f.f
$ ${CC} ${CPPFLAGS} ${CFLAGS} -c 01_fortran+c+netcdf_c.c
$ ${FC} ${CPPFLAGS} ${FCFLAGS} -o 01_fortran+c+netcdf \
    01_fortran+c+netcdf_f.o 01_fortran+c+netcdf_c.o \
    ${LDFLAGS} -lnetcdff -lnetcdf
$ ./01_fortran+c+netcdf
C function called by Fortran
Values are xx = 2.00 and ii = 1
SUCCESS test 1 fortran + c + netcdf
Test 2: MPI Parallelism
$ mpif90 ${CPPFLAGS} ${FCFLAGS} -c 02_fortran+c+netcdf+mpi_f.f
$ mpicc ${CPPFLAGS} ${CFLAGS} -c 02_fortran+c+netcdf+mpi_c.c
$ mpif90 ${CPPFLAGS} ${FCFLAGS} -o 02_fortran+c+netcdf+mpi \
    02_fortran+c+netcdf+mpi_f.o 02_fortran+c+netcdf+mpi_c.o \
    ${LDFLAGS} -lnetcdff -lnetcdf
$ mpirun -np 1 02_fortran+c+netcdf+mpi
C function called by Fortran
Values are xx = 2.00 and ii = 1
status = 2
SUCCESS test 2 fortran + c + netcdf + mpi
Download Main Sources
The WRF and WPS source code will be cloned from GitHub into the versioned source directory and the desired versions checked out:
$ cd "${WRF_PREFIX}/${WRF_VERSION}/src"
$ git clone https://github.com/wrf-model/WRF.git WRF
$ pushd WRF
$ git checkout v4.5.2
$ git submodule update --init --recursive
$ popd
$ git clone https://github.com/wrf-model/WPS.git WPS
$ pushd WPS
$ git checkout v4.5
$ popd
The clone and checkout process will take some time – these are large source code repositories.
For other 4.5.x and newer releases of WRF, the value assigned to WRF_VERSION should be altered, and the versions checked out in the git repositories will differ accordingly.
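As a sketch, the checkout tags for a hypothetical different release pair would be derived as follows. The WPS version shown here is illustrative only; the matching WPS release must be confirmed against the chosen WRF release's documentation.

```shell
# Hypothetical sketch only: tag names for a different WRF/WPS release pair.
WRF_VERSION="4.6.0"   # hypothetical
WPS_VERSION="4.6"     # must be confirmed for the chosen WRF release
echo "WRF: git checkout v${WRF_VERSION}"
echo "WPS: git checkout v${WPS_VERSION}"
```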
Build WRF
For the Jasper library to be included in the program, the WRF configuration system must be altered:
$ cd "${WRF_PREFIX}/${WRF_VERSION}/src/WRF"
$ sed -i 's/I_really_want_to_output_grib2_from_WRF = "FALSE"/I_really_want_to_output_grib2_from_WRF = "TRUE"/' arch/Config.pl
Two builds will be done: serial and distributed-memory parallelism (MPI).
Serial
For the serial build, the configure script must be run with the following selections:
- Option 76: Intel (ifx/icx) "serial"
- 0: No nesting
$ ./clean -aa
$ NETCDF="${NETCDF_DASH_FORTRAN_PREFIX}" JASPERLIB=/usr/lib64 JASPERINC=/usr/include ZLIB="${WRF_PREFIX}/${WRF_VERSION}" \
    ./configure
The configure script produces a configure.wrf file for the build system. A few minor modifications are necessary to the compilers and, for the sake of brevity, to remove the time prefix from the Fortran compiles:
$ sed -i -e 's/^\(DM_FC *= *\)mpi.*$/\1mpif90/' \
    -e 's/^\(DM_CC *= *\)mpi.*$/\1mpicc/' \
    -e 's/^\(FC *= *\)time /\1/' \
    configure.wrf
Finally, the code can be compiled. The -j 4 flag allows up to four concurrent compiles to accelerate the process:
$ NETCDF="${NETCDF_DASH_FORTRAN_PREFIX}" \
    JASPERLIB=/usr/lib64 JASPERINC=/usr/include \
    ./compile -j 4 em_real | tee compile-serial-${BUILD_DATE}.log
A lot of information will be displayed to the terminal as the build proceeds; all of it will also be written to a file with the name compile-serial-«YYYY.MM.DD».log. If the build is successful, the output will terminate with e.g.
==========================================================================
build started:   Tue Apr 2 16:19:28 EDT 2024
build completed: Tue Apr 2 16:33:37 EDT 2024

---> Executables successfully built <---

-rwxr-xr-x 1 user workgroup 67864392 Apr 2 16:33 main/ndown.exe
-rwxr-xr-x 1 user workgroup 67779464 Apr 2 16:33 main/real.exe
-rwxr-xr-x 1 user workgroup 67344608 Apr 2 16:33 main/tc.exe
-rwxr-xr-x 1 user workgroup 71494672 Apr 2 16:32 main/wrf.exe
==========================================================================
At this point those executables can be installed outside the source directory to ${WRF_PREFIX}/${WRF_VERSION}/bin for future computational usage:
$ mkdir -p "${WRF_PREFIX}/${WRF_VERSION}/bin"
$ for exe in main/*.exe; do
      install --mode=0775 "$exe" "${WRF_PREFIX}/${WRF_VERSION}/bin"
  done
In the future, the serial variants of the program will be referenced as wrf.exe, real.exe, etc.
Distributed-Memory Parallel
For the MPI build, the configure script must be run with the following selections:
- Option 78: Intel (ifx/icx), "dm"
- 1: Basic nesting
$ ./clean -aa
$ NETCDF="${NETCDF_DASH_FORTRAN_PREFIX}" JASPERLIB=/usr/lib64 JASPERINC=/usr/include ZLIB="${WRF_PREFIX}/${WRF_VERSION}" \
    ./configure
The configure script produces a configure.wrf file for the build system. A few minor modifications are necessary to the compilers and, for the sake of brevity, to remove the time prefix from the Fortran compiles:
$ sed -i -e 's/^\(DM_FC *= *\)mpi.*$/\1mpif90/' \
    -e 's/^\(DM_CC *= *\)mpi.*$/\1mpicc/' \
    -e 's/^\(FC *= *\)time /\1/' \
    configure.wrf
Finally, the code can be compiled. The -j 4 flag allows up to four concurrent compiles to accelerate the process:
$ NETCDF="${NETCDF_DASH_FORTRAN_PREFIX}" \
    JASPERLIB=/usr/lib64 JASPERINC=/usr/include \
    ./compile -j 4 em_real | tee compile-mpi-${BUILD_DATE}.log
Again, a lot of information will be displayed to the terminal as the build proceeds; all of it will also be written to a file with the name compile-mpi-«YYYY.MM.DD».log. If the build is successful, the output will terminate with e.g.
==========================================================================
build started:   Tue Apr 2 16:39:04 EDT 2024
build completed: Tue Apr 2 16:53:00 EDT 2024

---> Executables successfully built <---

-rwxr-xr-x 1 user workgroup 67864392 Apr 2 16:53 main/ndown.exe
-rwxr-xr-x 1 user workgroup 67779464 Apr 2 16:53 main/real.exe
-rwxr-xr-x 1 user workgroup 67344608 Apr 2 16:53 main/tc.exe
-rwxr-xr-x 1 user workgroup 71494672 Apr 2 16:52 main/wrf.exe
==========================================================================
At this point those executables can be installed outside the source directory to ${WRF_PREFIX}/${WRF_VERSION}/bin for future computational usage:
$ mkdir -p "${WRF_PREFIX}/${WRF_VERSION}/bin"
$ for exe in main/*.exe; do
      install --mode=0775 "$exe" "${WRF_PREFIX}/${WRF_VERSION}/bin/mpi_$(basename "${exe}")"
  done
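The renaming pattern in that install loop can be illustrated in isolation: basename strips the main/ directory component, and an mpi_ prefix is added so the serial and MPI builds can coexist in the same bin directory.

```shell
# Sketch of the renaming pattern used in the install loop above.
for exe in main/ndown.exe main/real.exe main/tc.exe main/wrf.exe; do
  echo "mpi_$(basename "${exe}")"
done
```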
In the future, the distributed-memory parallel variants of the program will be referenced as mpi_wrf.exe, mpi_real.exe, etc.