technical:recipes:ls-dyna

For this example, LS-DYNA will be installed in a workgroup's storage.  The base path ''$WORKDIR'' will be referenced in this recipe, which implies that prior to installation the ''workgroup'' command was used to spawn a workgroup shell.  The recipe assumes version R15.0.2 and the variants mentioned above.
  
===== Directory organization =====
  
The LS-DYNA software packages must be downloaded to the cluster.  To facilitate that, a directory must be created to hold those files.  Rather than typing out the full path every time, the variable ''LS_DYNA_PREFIX'' will be set to the base directory containing the LS-DYNA software:
  
<code bash>
$ LS_DYNA_PREFIX="${WORKDIR}/sw/ls-dyna"
$ mkdir -p "${LS_DYNA_PREFIX}/attic/15.0.2"
</code>
  
  
<code bash>
$ wget --directory-prefix="${LS_DYNA_PREFIX}/attic/15.0.2" \
       --user=<USERNAME> --password=<PASSWORD> \
       '<DOWNLOAD-URL>'
</code>
  
After downloading (or uploading) both packages, the attic directory contains:

<code bash>
$ ls -l "${LS_DYNA_PREFIX}/attic/15.0.2"
total 234420
-rw-r--r-- 1 user group 140454912 Apr  7 22:15 ls-dyna_hyb_d_R15_0_2_x64_centos79_ifort190_avx2_openmpi405.tgz_extractor.sh
-rw-r--r-- 1 user group  99661023 Apr  7 22:15 ls-dyna_hyb_s_R15_0_2_x64_centos79_ifort190_avx2_openmpi405.tgz_extractor.sh
</code>

In preparation for installing these packages, a hierarchy of directories must be created to hold the variants of R15.0.2.  Even though just two specific variants will be installed, the entire hierarchy is created:

  * to promote a uniform organization across all versions of the software that get installed.
  * to make it easy to add missing variants in the future.

The directory name matches the version, 15.0.2, with subdirectories for each option:

<code bash>
$ mkdir -p "${LS_DYNA_PREFIX}/15.0.2/"{sse2,avx2,avx512}/{smp,hyb,mpp}/{double,single}
</code>
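As a quick sanity check (illustration only, using a throwaway prefix rather than the workgroup path), Bash expands the braces in the ''mkdir'' command above into every combination, so 3 × 3 × 2 = 18 leaf directories are created:

```shell
# Illustration only: a throwaway prefix is used instead of the real
# ${LS_DYNA_PREFIX} so this can be run anywhere without touching the
# workgroup storage.
LS_DYNA_PREFIX=/tmp/ls-dyna-demo
mkdir -p "${LS_DYNA_PREFIX}/15.0.2/"{sse2,avx2,avx512}/{smp,hyb,mpp}/{double,single}
# Count the leaf directories produced by the brace expansion (expect 18):
find "${LS_DYNA_PREFIX}/15.0.2" -mindepth 3 -type d | wc -l
```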

==== Install a variant ====

The single-precision variant is installed by changing to its subdirectory (created above) and running the appropriate installation extractor:

<code bash>
$ pushd "${LS_DYNA_PREFIX}/15.0.2/avx2/hyb/single"
$ sh "${LS_DYNA_PREFIX}/attic/15.0.2/ls-dyna_hyb_s_R15_0_2_x64_centos79_ifort190_avx2_openmpi405.tgz_extractor.sh"
</code>

There will be several files in the directory, including the LS-DYNA executable:

<code bash>
$ ls -l
total 154359
-rwxr-xr-x 1 user group  11402168 Mar 29 18:05 ansyscl
-rwxr-xr-x 1 user group 219543680 Mar 29 18:14 ls-dyna_hyb_s_R15_0_2_x64_centos79_ifort190_avx2_openmpi405
-rwxr-xr-x 1 user group    437864 Mar 29 18:14 ls-dyna_hyb_s_R15_0_2_x64_centos79_ifort190_avx2_openmpi405.l2a
</code>

Since the executable has a rather complicated name, a symlink is created so that the command ''ls-dyna'' is mapped to it:

<code bash>
$ ln -s ls-dyna_hyb_s_R15_0_2_x64_centos79_ifort190_avx2_openmpi405 ls-dyna
$ ls -l
total 154359
-rwxr-xr-x 1 user group  11402168 Mar 29 18:05 ansyscl
lrwxrwxrwx 1 user group        59 Jun  7 10:40 ls-dyna -> ls-dyna_hyb_s_R15_0_2_x64_centos79_ifort190_avx2_openmpi405
-rwxr-xr-x 1 user group 219543680 Mar 29 18:14 ls-dyna_hyb_s_R15_0_2_x64_centos79_ifort190_avx2_openmpi405
-rwxr-xr-x 1 user group    437864 Mar 29 18:14 ls-dyna_hyb_s_R15_0_2_x64_centos79_ifort190_avx2_openmpi405.l2a
$ popd
</code>

This variant of LS-DYNA R15.0.2 is now installed, but it has an underlying prerequisite:  the Open MPI library against which it was built must also be made available.  That step is covered in a subsequent section.

==== Install additional variants ====

For each additional variant of R15.0.2, the steps above are repeated with the appropriate modifications to the paths and file names.  All hybrid and MPP variants will require the availability of their underlying MPI library.
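That repetition can be scripted.  As a hedged sketch (the field positions are an assumption inferred from the two extractor names shown earlier, and may differ in other releases), the target subdirectory for each package can be derived from its file name:

```shell
# Sketch: derive "version/arch/interface/precision" install paths from the
# extractor file names.  Underscore-delimited field positions (2: interface,
# 3: precision, 10: instruction set) are inferred from the two names shown
# above; verify them before relying on this for other releases.
for f in ls-dyna_hyb_d_R15_0_2_x64_centos79_ifort190_avx2_openmpi405.tgz_extractor.sh \
         ls-dyna_hyb_s_R15_0_2_x64_centos79_ifort190_avx2_openmpi405.tgz_extractor.sh
do
    iface=$(echo "$f" | cut -d_ -f2)    # smp, hyb, or mpp
    prec=$(echo "$f" | cut -d_ -f3)     # s or d
    arch=$(echo "$f" | cut -d_ -f10)    # sse2, avx2, or avx512
    case "$prec" in s) prec=single ;; d) prec=double ;; esac
    echo "15.0.2/${arch}/${iface}/${prec}  <-  ${f}"
done
```

Each derived path matches one of the directories created earlier, and the corresponding extractor can then be run in that directory exactly as shown above.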
===== Prerequisites =====

As mentioned, the variants installed in this recipe are built atop Open MPI.  As the file names indicate (and as documented in the R15.0.2 release notes), Open MPI 4.0.5 and the Intel 2019 compiler were used to build the two variants in question.  Caviness and DARWIN have Intel 2019 compilers present, but the exact release of Open MPI built with that compiler may not be.  An instance of Open MPI 4.0.5 compiled with Intel 2019 should therefore be built for the workgroup's use.

==== Open MPI install ====

The procedure is presented as succinctly as possible and starts with setup of the directories and unpacking of the source code:

<code bash>
$ mkdir -p "${WORKDIR}/sw/openmpi/attic"
$ wget --directory-prefix="${WORKDIR}/sw/openmpi/attic" \
    https://download.open-mpi.org/release/open-mpi/v4.0/openmpi-4.0.5.tar.bz2
$ mkdir -p "${WORKDIR}/sw/openmpi/src"
$ pushd "${WORKDIR}/sw/openmpi/src"
$ tar -xf "${WORKDIR}/sw/openmpi/attic/openmpi-4.0.5.tar.bz2"
$ cd openmpi-4.0.5
$ mkdir build-intel19-ls-dyna
$ cd build-intel19-ls-dyna
$ vpkg_rollback
</code>

For Caviness, the OFI libfabric communications library is used:

<code bash>
$ vpkg_devrequire intel/2019 libfabric/1.17.1
$ ../configure --prefix="${WORKDIR}/sw/openmpi/4.0.5-intel19-ls-dyna" \
        --with-lustre --with-io-romio-flags="--with-file-system=ufs+nfs+lustre" \
        --with-libfabric="$LIBFABRIC_PREFIX" \
        CC=icc CXX=icpc FC=ifort
</code>

versus the OS-provided UCX library used on DARWIN:

<code bash>
$ vpkg_devrequire intel/2019
$ ../configure --prefix="${WORKDIR}/sw/openmpi/4.0.5-intel19-ls-dyna" \
        --with-hwloc=/opt/shared/hwloc/2 \
        --with-pmix=/opt/shared/pmix/3 \
        --with-libevent=/usr \
        --with-lustre \
        --with-io-romio-flags="--with-file-system=ufs+nfs+lustre" \
        CC=icc CXX=icpc FC=ifort
</code>

If the configuration is successful, the software can be built and installed (this will take some time):

<code bash>
$ make
$ make install
$ vpkg_rollback
$ popd
</code>

There are likely local configuration details that must be applied via the Open MPI configuration files.  A copy of the nearest-matching version maintained by IT RCI should be made.  For example, on Caviness:

<code bash>
$ cp -f /opt/shared/openmpi/4.0.3/etc/{openmpi,pmix}-mca-params.conf "${WORKDIR}/sw/openmpi/4.0.5-intel19-ls-dyna/etc"
</code>

versus on DARWIN:

<code bash>
$ cp -f /opt/shared/openmpi/4.0.5/etc/{openmpi,pmix}-mca-params.conf "${WORKDIR}/sw/openmpi/4.0.5-intel19-ls-dyna/etc"
</code>

=== VALET package definition ===

To make this version of Open MPI available to your LS-DYNA installs, a VALET package definition file should be created.  The text ''<OPENMPI-INSTALL-DIR>'' should be replaced with the base installation path used above (e.g. ''/work/group/sw/openmpi'').

<WRAP center round alert 80%>
If the workgroup already has an existing ''${WORKDIR}/sw/valet/openmpi.vpkg_yaml'' file, the new version should be added to that file:  do not simply overwrite the file with the content shown below!
</WRAP>

For the configuration made on Caviness above:

<file yaml openmpi.vpkg_yaml>
openmpi:
    description:         "Open MPI: Message-Passing Interface"
    url:                 http://www.open-mpi.org/
    prefix:              <OPENMPI-INSTALL-DIR>
    versions:
        "4.0.5:intel19,ls-dyna":
            description:     Open MPI 4.0.5 for LS-DYNA, Intel 2019
            dependencies:
                - intel/2019
                - libfabric/1.17.1
</file>

On DARWIN, the file is slightly different (no libfabric dependency):

<file yaml openmpi.vpkg_yaml>
openmpi:
    description:         "Open MPI: Message-Passing Interface"
    url:                 http://www.open-mpi.org/
    prefix:              <OPENMPI-INSTALL-DIR>
    versions:
        "4.0.5:intel19,ls-dyna":
            description:     Open MPI 4.0.5 for LS-DYNA, Intel 2019
            dependencies:
                - intel/2019
</file>

===== VALET package definition =====

With the necessary dependencies satisfied, a VALET package definition for LS-DYNA can be created.  Assuming this is the first time LS-DYNA is being installed, the file will need to be created as it appears below; if versions or variants are added in the future, the file should be augmented with just the new version/variant information.  The ''<LS_DYNA_PREFIX>'' text in the package definition must be replaced with the value of the environment variable that was assigned at the beginning of this recipe:

<code bash>
$ echo $LS_DYNA_PREFIX
/work/group/sw/ls-dyna
</code>

The ''<LSTC-LICENSE-SERVER-NAME>'' text must be replaced with the IP address or DNS name of the computer that serves the purchased LSTC LS-DYNA license.

<file yaml ls-dyna.vpkg_yaml>
ls-dyna:
    prefix: <LS_DYNA_PREFIX>
    url: http://www.lstc.com/
    description: general-purpose finite element program simulating complex real world problems

    actions:
        - variable: LSTC_LICENSE
          action: set
          value: network
        - variable: LSTC_LICENSE_SERVER
          action: set
          value: <LSTC-LICENSE-SERVER-NAME>
        - variable: LSTC_INTERNAL_CLIENT
          action: set
          value: off
        - variable: LSTC_ROOT
          action: set
          value: ${VALET_PATH_PREFIX}
        - bindir: ${VALET_PATH_PREFIX}

    default-version: "15.0.2:double,hybrid,avx2"

    versions:

        "15.0.2:single,hybrid,avx2":
            description: 15.0.2, single-precision, Hybrid (SMP+MPP), AVX2
            prefix: 15.0.2/avx2/hyb/single
            dependencies:
                - openmpi/4.0.5:intel19,ls-dyna
        "15.0.2:double,hybrid,avx2":
            description: 15.0.2, double-precision, Hybrid (SMP+MPP), AVX2
            prefix: 15.0.2/avx2/hyb/double
            dependencies:
                - openmpi/4.0.5:intel19,ls-dyna
</file>
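If, for example, a double-precision MPP variant were installed later under ''15.0.2/avx2/mpp/double'', its entry would simply be appended under the existing ''versions'' key.  This fragment is hypothetical (that variant is not installed in this recipe) and assumes it is built against the same Open MPI:

```yaml
# Hypothetical future addition (not installed by this recipe), appended
# under the existing "versions:" key of ls-dyna.vpkg_yaml:
        "15.0.2:double,mpp,avx2":
            description: 15.0.2, double-precision, MPP, AVX2
            prefix: 15.0.2/avx2/mpp/double
            dependencies:
                - openmpi/4.0.5:intel19,ls-dyna
```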

Loading one of these packages into the shell environment makes the ''ls-dyna'' command (the symlink created during install) available:

<code bash>
$ vpkg_require ls-dyna/single,hybrid,avx2
Adding dependency `intel/2019u5` to your environment
Adding dependency `libfabric/1.17.1` to your environment
Adding dependency `openmpi/4.0.5:intel19,ls-dyna` to your environment
Adding package `ls-dyna/15.0.2:single,avx2,hybrid` to your environment

$ which ls-dyna
/work/group/sw/ls-dyna/15.0.2/avx2/hyb/single/ls-dyna
</code>

===== Running jobs =====

On the Caviness and DARWIN clusters, LS-DYNA computation must be submitted as a Slurm job.  The variants installed here use Open MPI, so the job script template at ''/opt/templates/slurm/generic/mpi/openmpi/openmpi.qs'' is a starting point for LS-DYNA jobs.  As in all cases for parallel jobs:

  * The number of Slurm //tasks// represents the MPI ranks for a hybrid or MPP variant of LS-DYNA
  * The number of CPUs per Slurm task represents the OpenMP threads for an SMP or hybrid variant of LS-DYNA
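For example (illustrative numbers only, not a recommendation), a hybrid job with 4 MPI ranks of 8 threads each consumes the product of the two values in CPUs:

```shell
# Hybrid LS-DYNA resource mapping: Slurm tasks => MPI ranks,
# CPUs-per-task => OpenMP threads per rank.  Example values only.
ntasks=4            # would be requested with --ntasks=4
cpus_per_task=8     # would be requested with --cpus-per-task=8
echo "total CPUs = $(( ntasks * cpus_per_task ))"
# prints: total CPUs = 32
```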

Follow the comments in the script header to determine which flags to alter and how to do so.  The script template must be altered to add the desired version/variant of LS-DYNA to the job environment:

<code bash>
#
# [EDIT] Do any pre-processing, staging, environment setup with VALET
#        or explicit changes to PATH, LD_LIBRARY_PATH, etc.
#
vpkg_require ls-dyna/15.0.2:single,hybrid,avx2
</code>

Toward the end of the job script template, the LS-DYNA program is run:

<code bash>
#
# [EDIT] Execute your MPI program
#
${UD_MPIRUN} ls-dyna i=my_model.k
mpi_rc=$?
</code>
Last modified: 2024-06-07 11:09 by frey