
Managing Software Versions with Environment Modules (EM)

Environment Modules

Red Hat Enterprise Linux is the operating system on the cluster computers. The Argo cluster uses a system called "Environment Modules" to manage environment variables. Modules ensure that your environment variables are set up for the software you want to use. To use a particular software package, you have to load the relevant module. For example, to use MATLAB you will need to load the matlab/VERSION module.
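For example, assuming the matlab/R2018b version shown in the module listing below is still installed, the corresponding load command would be:

module load matlab/R2018b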

The following command shows all the available modules:

module avail

Note: The list below shows the output of the above command at one particular point in time. The availability of modules changes over time, and major modules are frequently replaced with updated versions, so it is advisable to run the above command yourself to check which modules are currently available.

----------------------------------------------------------------------------- /cm/local/modulefiles ---------------------------------------------------------------
bright-installer/7.2         cm-upgrade/7.2               matlab/R2014b                module-git                   openldap                     version
cluster-tools/7.2            dot                          matlab/R2016a                module-info                  openmpi/mlnx/gcc/64/1.8.4rc1
cmd                          freeipmi/1.4.11              matlab/R2017a                mvapich2/mlnx/gcc/64/2.0     shared
cmsh                         ipmitool/1.8.15              matlab/R2018b                null                         use.own

---------------------------------------------------------------------------- /cm/shared/modulefiles ---------------------------------------------------------------
amber/gcc/16                                 gem5/stable/alpha                            mpich/ge/open64/64/3.2
amber/intel/16                               gem5/stable/arm                              mpiexec/0.84_432
anaconda2/latest                             gem5/stable/MIPS                             mvapich2/gcc/64/2.0b
bazel/0.13.1(default)                        gem5/stable/Power                            mvapich2/gcc/64/2.2b
bazel/0.4.5                                  gem5/stable/sparc                            mvapich2/intel/64/2.2b
bazel/0.5.4                                  gem5/stable/x86                              mvapich2/open64/64/2.2b
blacs/openmpi/gcc/64/1.1patch03              globalarrays/gcc/openmpi/64/5.1              NCO/2018-10
blacs/openmpi/open64/64/1.1patch03           globalarrays/open64/openmpi/64/5.1           netcdf/gcc/64/4.3.3.1
blast/2.2.29+                                globalarrays/openmpi/gcc/64/5.4              netcdf/open64/64/4.3.3.1
boost/1_60_0/nompi                           globalarrays/openmpi/open64/64/5.4           netperf/2.7.0
boost/1.67.0                                 gromacs/intel/5.0.4/doubleMPI                open64/4.5.2.1
caffe/1.0.0-rc3                              gromacs/intel/5.0.4/singleMPI                openblas/0.2.20(default)
caffe/bcaffe/1.0                             gromacs/intel/double/4.6.5                   openblas/dynamic/0.2.15
caffe/caffe2/dev                             gromacs/intel/single/4.6.5                   openblas/dynamic/0.2.19
caffe/CPU/1.0.0-rc3                          gromacs/intel-2018.2-double                  openblas/dynamic/0.2.8
caffe/hnoh/1.0                               gromacs/intel-2018.2-double-mpi              OpenCV/2.4.10
cloudy/13.03                                 gromacs/intel-2018.2-plumed-2.4.2-double     OpenCV/3.1.0
cloudy/17.00                                 gromacs/intel-2018.2-plumed-2.4.2-double-mpi OpenCV/cuda75/3.1
cloudy/mpi/13.03                             gromacs/intel-2018.2-single-gpu              OpenFOAM/1806
cloudy/mpi/17.00                             gsl/1.16                                     OpenFOAM/1806c
cmake/2.8.12                                 HDF/4/4.2.13                                 OpenFOAM/2.3.0
cmake/3.10.2(default)                        HDF/5/1.8.12                                 openmpi/3.0.0
comsol/53a                                   HDF/5/1.8.16                                 openmpi/3.1.0
cplex/12.5.1.0                               HDF/5/1.8.19                                 openmpi/gcc/64/1.10.1
cplex/12.6.1.0                               hdf5_18/1.8.12                               openmpi/gcc/64/1.6.3-mlnx-ofed
cplex/12.8.0(default)                        hdf5_18/1.8.16                               openmpi/gcc/64/1.8.1
cuda/9.0                                     hdf5_18/1.8.19(default)                      openmpi/open64/64/1.10.1
cuda/9.1                                     heasoft/6.17                                 pgi/18.3(default)
cuda/9.2                                     hpl/2.1                                      plumed/2.4.2
cuda70/blas/7.0.28                           hwloc/1.10.1                                 proj4/4.9.3(default)
cuda70/fft/7.0.28                            hwloc/2.0.0                                  protobuf/3.5.1(default)
cuda70/gdk/346.46                            intel/compiler/64/15.0/2015.5.223            python/2.7.14
cuda70/nsight/7.0.28                         intel/compiler/64/16.0.4/2016.4.258          python/3.6.4
cuda70/profiler/7.0.28                       intel/mkl/64/11.2/2015.5.223                 python/gcc5.2/python2.7.12
cuda70/toolkit/7.0.28                        intel/mkl/64/11.3.4/2016.4.258               python/python2.7
cuda75/blas/7.5.18                           intel/mkl/mic/11.3.4/2016.4.258              python/python2.7.10
cuda75/fft/7.5.18                            intel/mpi/64/5.1.3/2016.4.258                python/python2.7.12
cuda75/gdk/352.79                            intel/mpi/mic/5.1.3/2016.4.258               python/python3.5
cuda75/nsight/7.5.18                         intel/ps_xe/18.0.1.163                       pytorch/0.4.1-py27
cuda75/profiler/7.5.18                       intel/python/2018-10-p27                     pytorch/0.4.1-py36
cuda75/toolkit/7.5.18                        intel/python/2018-10-p36                     R/3.0.2
cuda80/blas/8.0.44                           intel-cluster-checker/2.2.2                  R/3.2.0
cuda80/cudnn/8.0-v6.0                        intel-cluster-runtime/ia32/3.8               R/3.4.1(default)
cuda80/cudnn/8.0-v7.0                        intel-cluster-runtime/intel64/3.8            R/3.4.4
cuda80/fft/8.0.44                            intel-cluster-runtime/mic/3.8                R/3.5.0
cuda80/gdk/352.79                            intel-cluster-toolkit/ia32                   R/3.5.1
cuda80/nsight/8.0.44                         intel-cluster-toolkit/impi32                 redhawk/17.1.3(default)
cuda80/profiler/8.0.44                       intel-cluster-toolkit/impi64                 rosetta/3.5
cuda80/profiler-dev/8.0.44                   intel-cluster-toolkit/intel64                rosetta/3.5-mpi
cuda80/toolkit/8.0.44                        intel-itac/9.1.2/024                         Scala/2.10.4
dmtcp/2.2.1                                  intel-mpi/64/5.1.2/150                       singularity/2.3
dmtcp/2.5.2                                  intel-tbb-oss/ia32/44_20160526oss            singularity/2.3.1
dock/gcc/6                                   intel-tbb-oss/intel64/44_20160526oss         singularity/2.5.1
dock/intel/6                                 keras/2.1.4                                  singularity/2.5.2(default)
fftw2/openmpi/gcc/64/double/2.1.5            keras/2.2.0-py27                             singularity/2.6.0
fftw2/openmpi/gcc/64/float/2.1.5             keras/2.2.0-py36                             smtsim
fftw2/openmpi/open64/64/double/2.1.5         lapack/gcc/64/3.6.0                          spark/2.3.0(default)
fftw2/openmpi/open64/64/float/2.1.5          lapack/open64/64/3.6.0                       tensorflow/cpu/1.3.1
fftw3/3.3.4-Double                           lpsolve/5.5.2                                tensorflow/cpu/1.8.0-py27
fftw3/3.3.4-Single                           matio/1.5.2                                  tensorflow/cpu/1.8.0-py36
fftw3/openmpi/gcc/64/3.3.4                   MATLAB/v81/matlab_compiler_runtime           tensorflow/gpu/1.3.1
fftw3/openmpi/open64/64/3.3.4                MATLAB/v84/matlab_compiler_runtime           tensorflow/gpu/1.4.0
gamess/intel/Version-R1                      MATLAB/v901/matlab_compiler_runtime          tensorflow/gpu/1.5.0
gcc/4.8.4                                    MATLAB/v92/matlab_compiler_runtime           tensorflow/gpu/1.8.0-py27
gcc/5.2.0(default)                           MATLAB/v95/matlab_compiler_runtime           tensorflow/gpu/1.8.0-py36
gcc/6.3.0                                    mono/5.8.0.108(default)                      torch/2018-07-02
gcc/7.1.0                                    mpich/ge/gcc/64/3.1                          weka/3-6-13
GDAL/2.2.2                                   mpich/ge/gcc/64/3.2                          xilinx/2017.4(default)
GDAL/2.2.3(default)                          mpich/ge/intel/64/3.2                        zlib/1.2.11
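In addition to module avail, a few other module subcommands cover most day-to-day use. The R/3.4.1 module from the listing above is used here purely as an example name:

module list                 # show the modules currently loaded in your session
module show R/3.4.1         # display what a module sets (paths, environment variables)
module load R/3.4.1         # load a module
module unload R/3.4.1       # unload a previously loaded module
module purge                # unload all currently loaded modules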

Creating Custom Modules

You can also use environment modules to manage software you have installed in your home directory by creating your own module files for it. To load a module you have created, you must first load the use.own module and then your own module.

$ module show use.own
-------------------------------------------------------------------
/cm/local/modulefiles/use.own:

module-whatis    adds your own modulefiles directory to MODULEPATH
module           use --append /home/<myusername>/privatemodules
-------------------------------------------------------------------
You need to create a privatemodules directory in your home directory and place your module files in it. Suppose, for example, that you want to add another R module compiled with a specific compiler, such as the Intel C++ compiler. First, compile R from source and install it into a local directory of your choice under your home directory. Then create a module file (see below) that lets you load your privately compiled R instead of the publicly available R module on the cluster. Once created, the module file is placed inside ~/privatemodules with an appropriate name (for example, the module file shown below, which loads R version 3.1.1, could be named R-3.1.1).
#%Module 1.0
#
#  Sets up variables for my local install of R-3.1.1, for use with the 'environment-modules' package.
#
proc ModulesHelp { } {

  puts stderr "\tAdds local install of R-3.1.1 to your environment variables."
}

module-whatis "adds local install of R-3.1.1 to your environment variables"

#you can specify additional modules (if any) to be loaded before your custom module, as shown below
if ![is-loaded intel/compiler/64/14.0/2013_sp1.1.106] {
       module add intel/compiler/64/14.0/2013_sp1.1.106
}
if ![is-loaded intel/mkl/64/11.1/2013_sp1.1.106] {
      module add intel/mkl/64/11.1/2013_sp1.1.106
}

prereq intel/compiler/64/14.0/2013_sp1.1.106
prereq intel/mkl/64/11.1/2013_sp1.1.106

#in this example we only needed to add the bin path where the R executable is,
#but other packages may require you to add additional paths to the corresponding path environment variables.
#this is particularly the case if you are installing a programming language, which will generally have
#one path variable pointing to its libraries and another pointing to its executables.
#you must make sure all the relevant path variables point to the correct locations, otherwise you will not
#be able to use your custom module
prepend-path   PATH    /home/your_user_name/path/to/your/local/R-3.1.1/bin/
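Putting it together, the setup could look like the following sketch (the file name R-3.1.1 and the install path are just examples matching the module file above):

mkdir -p ~/privatemodules
# save the module file shown above as ~/privatemodules/R-3.1.1;
# once use.own is loaded, "module avail" will also list the modules in ~/privatemodules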

To load your custom module, use:

$ module load use.own
$ module load your-module-file-name

The above commands can be embedded inside your Grid Engine job script in the same way as you load public modules.
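For example, a Grid Engine job script that uses the custom R module might look like the sketch below; the job name, options, and the R script name are placeholders:

#!/bin/bash
#$ -N my_r_job             # job name
#$ -cwd                    # run the job from the current working directory
#$ -j y                    # merge standard output and standard error

module load use.own
module load R-3.1.1        # the custom module file created above

Rscript my_analysis.R      # placeholder for your actual R program

Submit the script with qsub as usual.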