Commit 40c12b02 authored by Lukáš Krupčík's avatar Lukáš Krupčík

Update docs.it4i/software/tools/ansys/ansys-cfx.md, docs.it4i/software/tools/ansys/ansys-fluent.md, docs.it4i/software/tools/ansys/ansys-ls-dyna.md, docs.it4i/software/tools/ansys/ansys-mechanical-apdl.md, docs.it4i/software/tools/ansys/licensing.md, mkdocs.yml, docs.it4i/software/tools/ansys/setting-license-preferences.md, docs.it4i/software/tools/ansys/workbench.md, docs.it4i/software/tools/easybuild-images.md, docs.it4i/software/tools/spack.md, docs.it4i/software/tools/virtualization.md, docs.it4i/software/isv_licenses.md, docs.it4i/software/nvidia-cuda.md files
Deleted docs.it4i/software/tools/ansys/ls-dyna.md
parent 6f46b91e
1 merge request: !338 software
Pipeline #22381 failed
Showing 208 additions and 176 deletions
@@ -79,4 +79,4 @@ The license is used and accounted only with the real usage of the product. So in

[1]: #Licence
[a]: https://extranet.it4i.cz/rsweb/barbora/licenses
[b]: http://licelin.it4i.cz/list/
\ No newline at end of file
@@ -20,13 +20,7 @@ The default programming model for GPU accelerators is NVIDIA CUDA. To set up the
$ ml CUDA
```

CUDA code can be compiled directly on login nodes. The user does not have to use compute nodes with GPU accelerators for compilation. To compile CUDA source code, use the NVCC compiler:

```console
$ nvcc --version
@@ -37,7 +31,7 @@ The CUDA Toolkit comes with a large number of examples, which can be a helpful r

```console
$ cd ~
$ mkdir cuda-samples
$ cp -R /apps/nvidia/cuda/VERSION_CUDA/samples/* ~/cuda-samples/
```

To compile the examples, change directory to the particular example (here the example used is deviceQuery) and run `make` to start the compilation:
@@ -51,7 +45,7 @@ To run the code, the user can use a PBS interactive session to get access to a n

```console
$ qsub -I -q qnvidia -A OPEN-0-0
$ ml CUDA
$ ~/cuda-samples/1_Utilities/deviceQuery/deviceQuery
```
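The interactive session above can also be wrapped in a small batch job; a minimal sketch that generates such a script (the queue, project ID, and sample path are taken from the interactive example above and are placeholders for your own values):

```shell
# Generate a minimal PBS batch script that runs the deviceQuery sample
# non-interactively; qnvidia/OPEN-0-0 and the sample path are the
# placeholder values from the interactive example above.
cat > devicequery.pbs <<'EOF'
#!/bin/bash
#PBS -q qnvidia
#PBS -A OPEN-0-0
#PBS -l select=1
ml CUDA
~/cuda-samples/1_Utilities/deviceQuery/deviceQuery
EOF
```

Submit it with `qsub devicequery.pbs`; the output then lands in the usual PBS `.o`/`.e` files instead of the terminal.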
...

@@ -6,18 +6,16 @@ To run ANSYS CFX in batch mode, you can utilize/modify the default cfx.pbs scrip

```bash
```bash
#!/bin/bash
#PBS -l select=5:ncpus=128:mpiprocs=128
#PBS -q qprod
#PBS -N ANSYS-test
#PBS -A XX-YY-ZZ

#!change the working directory (default is home directory)
#cd <working directory> (the working directory must exist)
DIR=/scratch/project/PROJECT_ID/$PBS_JOBID
mkdir -p "$DIR"
cd "$DIR" || exit

echo Running on host `hostname`
echo Time is `date`
@@ -25,7 +23,7 @@ echo Directory is `pwd`
echo This job runs on the following processors:
echo `cat $PBS_NODEFILE`

ml ANSYS/21.1-intel-2018a

#### Set number of processors per host listing
#### (set to 1 as $PBS_NODEFILE lists each node twice if :ppn=2)
...

@@ -9,18 +9,16 @@ To run ANSYS Fluent in a batch mode, you can utilize/modify the default fluent.p
```bash
#!/bin/bash
#PBS -S /bin/bash
#PBS -l select=5:ncpus=128:mpiprocs=128
#PBS -q qprod
#PBS -N ANSYS-test
#PBS -A XX-YY-ZZ

#!change the working directory (default is home directory)
#cd <working directory> (the working directory must exist)
DIR=/scratch/project/PROJECT_ID/$PBS_JOBID
mkdir -p "$DIR"
cd "$DIR" || exit

echo Running on host `hostname`
echo Time is `date`
@@ -29,7 +27,7 @@ echo This jobs runs on the following processors:
echo `cat $PBS_NODEFILE`

#### Load ansys module so that we find the cfx5solve command
ml ANSYS/21.1-intel-2018a

# Use the following line to specify MPI for message-passing instead
NCORES=`wc -l $PBS_NODEFILE | awk '{print $1}'`
@@ -91,7 +89,7 @@ To run ANSYS Fluent in batch mode with the user's config file, you can utilize/m

```bash
#!/bin/sh
#PBS -l nodes=2:mpiprocs=4:ncpus=128
#PBS -q qprod
#PBS -N $USER-Fluent-Project
#PBS -A XX-YY-ZZ
@@ -135,7 +133,7 @@ To run ANSYS Fluent in batch mode with the user's config file, you can utilize/m
Fluent arguments: $fluent_args"

#run the solver
fluent $fluent_args > $outfile
```

It runs the jobs out of the directory from which they are submitted (PBS_O_WORKDIR).
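The other scripts above take the opposite approach and run in a per-job scratch directory; their pattern can be sketched in isolation like this (the base path is a stand-in for `/scratch/project/PROJECT_ID`, and a fake job ID replaces `$PBS_JOBID`):

```shell
# Per-job scratch working directory, as used by the cfx/fluent/mapdl
# scripts above. SCRATCH_BASE and the fallback job ID are stand-ins
# for the real /scratch/project/PROJECT_ID and $PBS_JOBID.
SCRATCH_BASE=${SCRATCH_BASE:-/tmp/scratch-demo}
JOB_ID=${PBS_JOBID:-1234567.isrv1}
DIR="$SCRATCH_BASE/$JOB_ID"
mkdir -p "$DIR"       # the directory must exist before cd
cd "$DIR" || exit 1   # abort rather than silently run in $HOME
```

The `|| exit` guard matters: without it, a failed `cd` would leave the solver writing its files into the submission directory (or `$HOME`).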
...

@@ -6,18 +6,16 @@ To run ANSYS LS-DYNA in batch mode, you can utilize/modify the default ansysdyna
```bash
#!/bin/bash
#PBS -l select=5:ncpus=128:mpiprocs=128
#PBS -q qprod
#PBS -N ANSYS-test
#PBS -A XX-YY-ZZ

#!change the working directory (default is home directory)
#cd <working directory>
DIR=/scratch/project/PROJECT_ID/$PBS_JOBID
mkdir -p "$DIR"
cd "$DIR" || exit

echo Running on host `hostname`
echo Time is `date`
@@ -30,7 +28,7 @@ NPROCS=`wc -l < $PBS_NODEFILE`
echo This job has allocated $NPROCS nodes

ml ANSYS/21.1-intel-2018a

#### Set number of processors per host listing
#### (set to 1 as $PBS_NODEFILE lists each node twice if :ppn=2)
@@ -47,7 +45,7 @@ done
echo Machines: $hl

ansys211 -dis -lsdynampp i=input.k -machines $hl
```

The header of the PBS file (above) is common and its description can be found on [this site][1]. [SVS FEM][b] recommends utilizing resources via the keywords nodes and ppn. These keywords allow directly addressing the number of nodes (computers) and cores (ppn) used in the job. In addition, the rest of the code assumes this structure of allocated resources.
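The `$hl` machine list passed via `-machines` above is assembled from `$PBS_NODEFILE`; a hedged sketch of one way to build it (the mock nodefile, with one hostname per allocated core, stands in for the real `$PBS_NODEFILE`):

```shell
# Build a host:cores list (like the $hl used above) from a PBS nodefile.
# The mock nodefile stands in for the real $PBS_NODEFILE, which lists
# one hostname per allocated core.
PBS_NODEFILE=$(mktemp)
printf 'cn1\ncn1\ncn2\ncn2\n' > "$PBS_NODEFILE"

hl=""
for host in $(sort -u "$PBS_NODEFILE"); do
    n=$(grep -cx "$host" "$PBS_NODEFILE")   # cores allocated on this host
    if [ -z "$hl" ]; then
        hl="$host:$n"
    else
        hl="$hl:$host:$n"
    fi
done
echo "Machines: $hl"
```

With the mock file above this prints `Machines: cn1:2:cn2:2`, matching the `host:cores` shape the solver command expects.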
...

@@ -6,18 +6,16 @@ To run ANSYS MAPDL in batch mode you can utilize/modify the default mapdl.pbs sc
```bash
#!/bin/bash
#PBS -l select=5:ncpus=128:mpiprocs=128
#PBS -q qprod
#PBS -N ANSYS-test
#PBS -A XX-YY-ZZ

#!change the working directory (default is home directory)
#cd <working directory> (the working directory must exist)
DIR=/scratch/project/PROJECT_ID/$PBS_JOBID
mkdir -p "$DIR"
cd "$DIR" || exit

echo Running on host `hostname`
echo Time is `date`
@@ -25,7 +23,7 @@ echo Directory is `pwd`
echo This job runs on the following processors:
echo `cat $PBS_NODEFILE`

ml ANSYS/21.1-intel-2018a

#### Set number of processors per host listing
#### (set to 1 as $PBS_NODEFILE lists each node twice if :ppn=2)
@@ -45,7 +43,7 @@ echo Machines: $hl

#-i input.dat includes the input of analysis in APDL format
#-o file.out is output file from ansys where all text outputs will be redirected
#-p the name of license feature (aa_r=ANSYS Academic Research, ane3fl=Multiphysics(commercial), aa_r_dy=Academic AUTODYN)
ansys211 -b -dis -p aa_r -i input.dat -o file.out -machines $hl -dir "$DIR"
```

The header of the PBS file (above) is common and its description can be found on [this site][1]. [SVS FEM][b] recommends utilizing resources via the keywords nodes and ppn. These keywords allow directly addressing the number of nodes (computers) and cores (ppn) used in the job. In addition, the rest of the code assumes this structure of allocated resources.
...

@@ -22,15 +22,13 @@ lic-ansys.vsb.cz / 1055 (2325)

## Available Versions

* 21.1
```console
$ ml av ANSYS

---------------- /apps/modules/tools -----------------------
   ANSYS/21.1-intel-2018a (D)

Where:
   D:  Default Module
```
# Setting License Preferences

Some ANSYS tools allow you to explicitly specify usage of academic or commercial licenses on the command line (e.g. `ansys211 -p aa_r` to select the Academic Research license). However, we have observed that not all tools obey this option and choose the commercial license instead.

Thus, you need to configure the preferred license order with ANSLIC_ADMIN. Follow these steps and move the Academic Research license to the top or bottom of the list accordingly.
...

@@ -18,17 +18,16 @@ Now, save the project and close Workbench. We will use this script to launch the
```bash
#!/bin/bash
#PBS -l select=2:ncpus=128
#PBS -q qprod
#PBS -N test9_mpi_2
#PBS -A OPEN-0-0

# change the working directory
DIR=/scratch/project/PROJECT_ID/$PBS_JOBID
mkdir -p "$DIR"
cd "$DIR" || exit

echo Running on host `hostname`
echo Time is `date`
@@ -36,7 +35,7 @@ Now, save the project and close Workbench. We will use this script to launch the
echo This job runs on the following nodes:
echo `cat $PBS_NODEFILE`

ml ANSYS/21.1-intel-2018a

#### Set number of processors per host listing
procs_per_host=24
...
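The `procs_per_host=24` above is hardcoded and no longer matches the 128-core nodes requested in the header; both it and the total core count can instead be derived from the nodefile. A hedged sketch (the mock file stands in for the real `$PBS_NODEFILE`):

```shell
# Derive the total core count and processors-per-host from a PBS
# nodefile instead of hardcoding them; the mock file stands in for
# the real $PBS_NODEFILE (one hostname per allocated core).
PBS_NODEFILE=$(mktemp)
printf 'cn1\ncn1\ncn1\ncn2\ncn2\ncn2\n' > "$PBS_NODEFILE"

NCORES=$(wc -l < "$PBS_NODEFILE")                                 # total cores
procs_per_host=$(sort "$PBS_NODEFILE" | uniq -c | awk 'NR==1 {print $1}')
echo "NCORES=$NCORES procs_per_host=$procs_per_host"
```

This way the script keeps working when the same job is resubmitted with a different `select`/`ncpus` line.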
@@ -2,10 +2,6 @@

EasyBuild has support for generating container recipes that will use EasyBuild to build and install a specified software stack. In addition, EasyBuild can (optionally) leverage the build tool provided by the container software of choice to create container images.

## Generating Container Recipes

To generate container recipes, use `eb --containerize`, or `eb -C` for short.
...
@@ -11,8 +11,7 @@ For more information, see Spack's [documentation][a].
$ ml av Spack

---------------------- /apps/modules/devel ------------------------------
   Spack/0.16.2 (D)
```
!!! note
@@ -27,8 +26,8 @@ $ ml Spack
== Settings for first use
Couldn't import dot_parser, loading of dot files will not be possible.
== temporary log file in case of crash /tmp/eb-wLh1RT/easybuild-54vEn3.log
== processing EasyBuild easyconfig /apps/easybuild/easyconfigs-master/easybuild/easyconfigs/s/Spack/Spack-0.16.2.eb
== building and installing Spack/0.16.2...
== fetching files...
== creating build dir, resetting environment...
== unpacking...
@@ -46,29 +45,27 @@ Couldn't import dot_parser, loading of dot files will not be possible.
== permissions...
== packaging...
== COMPLETED: Installation ended successfully
== Results of the build can be found in the log file(s) /home/user/.local/easybuild/software/Spack/0.16.2/easybuild/easybuild-Spack-0.16.2-20210922.123022.log
== Build succeeded for 1 out of 1
== Temporary log file(s) /tmp/eb-wLh1RT/easybuild-54vEn3.log* have been removed.
== Temporary directory /tmp/eb-wLh1RT has been removed.
== Create folder ~/Spack

The following have been reloaded with a version change:
  1) Spack/default => Spack/0.16.2

$ spack --version
0.16.2
```
## Usage Module Spack/Default

```console
$ ml Spack

Currently Loaded Modules:
  1) Spack/0.16.2
```
## Build Software Package
@@ -107,9 +104,22 @@ To see more available versions of a package, run `spack versions`.

```console
$ spack versions git
==> Safe versions (already checksummed):
  2.29.0  2.27.0  2.25.0  2.20.1  2.19.1  2.17.1  2.15.1  2.13.0  2.12.1  2.11.1  2.9.3  2.9.1  2.8.4  2.8.2  2.8.0  2.7.1
  2.28.0  2.26.0  2.21.0  2.19.2  2.18.0  2.17.0  2.14.1  2.12.2  2.12.0  2.11.0  2.9.2  2.9.0  2.8.3  2.8.1  2.7.3
==> Remote versions (not yet checksummed):
  2.33.0  2.26.2  2.23.3  2.21.1  2.18.3  2.16.1  2.13.6  2.10.4  2.7.0  2.5.2  2.4.2  2.3.0  2.0.2  1.8.5.2  1.8.3.1
2.32.0 2.26.1 2.23.2 2.20.5 2.18.2 2.16.0 2.13.5 2.10.3 2.6.7 2.5.1 2.4.1 2.2.3 2.0.1 1.8.5.1 1.8.3
2.31.1 2.25.5 2.23.1 2.20.4 2.18.1 2.15.4 2.13.4 2.10.2 2.6.6 2.5.0 2.4.0 2.2.2 2.0.0 1.8.5 1.8.2.3
2.31.0 2.25.4 2.23.0 2.20.3 2.17.6 2.15.3 2.13.3 2.10.1 2.6.5 2.4.12 2.3.10 2.2.1 1.9.5 1.8.4.5 0.7
2.30.2 2.25.3 2.22.5 2.20.2 2.17.5 2.15.2 2.13.2 2.10.0 2.6.4 2.4.11 2.3.9 2.2.0 1.9.4 1.8.4.4 0.6
2.30.1 2.25.2 2.22.4 2.20.0 2.17.4 2.15.0 2.13.1 2.9.5 2.6.3 2.4.10 2.3.8 2.1.4 1.9.3 1.8.4.3 0.5
2.30.0 2.25.1 2.22.3 2.19.6 2.17.3 2.14.6 2.12.5 2.9.4 2.6.2 2.4.9 2.3.7 2.1.3 1.9.2 1.8.4.2 0.04
2.29.3 2.24.4 2.22.2 2.19.5 2.17.2 2.14.5 2.12.4 2.8.6 2.6.1 2.4.8 2.3.6 2.1.2 1.9.1 1.8.4.1 0.03
2.29.2 2.24.3 2.22.1 2.19.4 2.16.6 2.14.4 2.12.3 2.8.5 2.6.0 2.4.7 2.3.5 2.1.1 1.9.0 1.8.4.rc0 0.02
2.29.1 2.24.2 2.22.0 2.19.3 2.16.5 2.14.3 2.11.4 2.7.6 2.5.6 2.4.6 2.3.4 2.1.0 1.8.5.6 1.8.4 0.01
2.28.1 2.24.1 2.21.4 2.19.0 2.16.4 2.14.2 2.11.3 2.7.5 2.5.5 2.4.5 2.3.3 2.0.5 1.8.5.5 1.8.3.4
2.27.1 2.24.0 2.21.3 2.18.5 2.16.3 2.14.0 2.11.2 2.7.4 2.5.4 2.4.4 2.3.2 2.0.4 1.8.5.4 1.8.3.3
2.26.3 2.23.4 2.21.2 2.18.4 2.16.2 2.13.7 2.10.5 2.7.2 2.5.3 2.4.3 2.3.1 2.0.3 1.8.5.3 1.8.3.2
```

## Graph for Software Package
@@ -118,6 +128,7 @@ Spack provides the `spack graph` command to display the dependency graph. By def

```console
$ spack graph git
==> Warning: gcc@4.8.5 cannot build optimized binaries for "zen2". Using best target possible: "x86_64"
o  git
|\
... (elided: ASCII dependency graph; git depends on openssh, curl, openssl, gettext, libxml2, zlib, xz, tar, automake, autoconf, perl, gdbm, readline, libedit, ncurses, pkgconf, pcre2, libtool, m4, libidn2, libunistring, libsigsegv, bzip2, diffutils, libiconv, expat, libbsd, and berkeley-db)
```
### Information for Software Package
@@ -174,45 +222,74 @@ To get more information on a particular package from `spack list`, use `spack in
```console
$ spack info git
AutotoolsPackage:   git

Description:
    Git is a free and open source distributed version control system
    designed to handle everything from small to very large projects with
    speed and efficiency.

Homepage: http://git-scm.com

Tags:
    None

Preferred version:
    2.29.0    https://mirrors.edge.kernel.org/pub/software/scm/git/git-2.29.0.tar.gz

Safe versions:
    2.29.0    https://mirrors.edge.kernel.org/pub/software/scm/git/git-2.29.0.tar.gz
    2.28.0    https://mirrors.edge.kernel.org/pub/software/scm/git/git-2.28.0.tar.gz
    2.27.0    https://mirrors.edge.kernel.org/pub/software/scm/git/git-2.27.0.tar.gz
    2.26.0    https://mirrors.edge.kernel.org/pub/software/scm/git/git-2.26.0.tar.gz
    2.25.0    https://mirrors.edge.kernel.org/pub/software/scm/git/git-2.25.0.tar.gz
    2.21.0    https://mirrors.edge.kernel.org/pub/software/scm/git/git-2.21.0.tar.gz
    2.20.1    https://mirrors.edge.kernel.org/pub/software/scm/git/git-2.20.1.tar.gz
    2.19.2    https://mirrors.edge.kernel.org/pub/software/scm/git/git-2.19.2.tar.gz
    2.19.1    https://mirrors.edge.kernel.org/pub/software/scm/git/git-2.19.1.tar.gz
    2.18.0    https://mirrors.edge.kernel.org/pub/software/scm/git/git-2.18.0.tar.gz
    2.17.1    https://mirrors.edge.kernel.org/pub/software/scm/git/git-2.17.1.tar.gz
    2.17.0    https://mirrors.edge.kernel.org/pub/software/scm/git/git-2.17.0.tar.gz
2.15.1 https://mirrors.edge.kernel.org/pub/software/scm/git/git-2.15.1.tar.gz
2.14.1 https://mirrors.edge.kernel.org/pub/software/scm/git/git-2.14.1.tar.gz
2.13.0 https://mirrors.edge.kernel.org/pub/software/scm/git/git-2.13.0.tar.gz
2.12.2 https://mirrors.edge.kernel.org/pub/software/scm/git/git-2.12.2.tar.gz
2.12.1 https://mirrors.edge.kernel.org/pub/software/scm/git/git-2.12.1.tar.gz
2.12.0 https://mirrors.edge.kernel.org/pub/software/scm/git/git-2.12.0.tar.gz
2.11.1 https://mirrors.edge.kernel.org/pub/software/scm/git/git-2.11.1.tar.gz
2.11.0 https://mirrors.edge.kernel.org/pub/software/scm/git/git-2.11.0.tar.gz
2.9.3 https://mirrors.edge.kernel.org/pub/software/scm/git/git-2.9.3.tar.gz
2.9.2 https://mirrors.edge.kernel.org/pub/software/scm/git/git-2.9.2.tar.gz
2.9.1 https://mirrors.edge.kernel.org/pub/software/scm/git/git-2.9.1.tar.gz
2.9.0 https://mirrors.edge.kernel.org/pub/software/scm/git/git-2.9.0.tar.gz
2.8.4 https://mirrors.edge.kernel.org/pub/software/scm/git/git-2.8.4.tar.gz
2.8.3 https://mirrors.edge.kernel.org/pub/software/scm/git/git-2.8.3.tar.gz
2.8.2 https://mirrors.edge.kernel.org/pub/software/scm/git/git-2.8.2.tar.gz
2.8.1 https://mirrors.edge.kernel.org/pub/software/scm/git/git-2.8.1.tar.gz
2.8.0 https://mirrors.edge.kernel.org/pub/software/scm/git/git-2.8.0.tar.gz
2.7.3 https://mirrors.edge.kernel.org/pub/software/scm/git/git-2.7.3.tar.gz
2.7.1 https://mirrors.edge.kernel.org/pub/software/scm/git/git-2.7.1.tar.gz
Variants:
    Name [Default]    Allowed values    Description
    ==============    ==============    ===========================================
    tcltk [off]       on, off           Gitk: provide Tcl/Tk in the run environment

Installation Phases:
    autoreconf    configure    build    install

Build Dependencies:
    autoconf  automake  curl  expat  gettext  iconv  libidn2  libtool  m4  openssl  pcre  pcre2  perl  tk  zlib

Link Dependencies:
    curl  expat  gettext  iconv  libidn2  openssl  pcre  pcre2  perl  tk  zlib

Run Dependencies:
    openssh

Virtual Packages:
    None
```

### Install Software Package
@@ -220,10 +297,23 @@ Description:

`spack install` will install any package shown by `spack list`. For example, to install the latest version of the `git` package, you might type `spack install git` for the default version, or `spack install git@version` to choose a particular one.
```console
$ spack install git@2.29.0
==> Warning: specifying a "dotkit" module root has no effect [support for "dotkit" has been dropped in v0.13.0]
==> Warning: gcc@4.8.5 cannot build optimized binaries for "zen2". Using best target possible: "x86_64"
==> Installing libsigsegv-2.12-lctnabj6w4bmnyxo7q6ct4wewke2bqin
==> No binary for libsigsegv-2.12-lctnabj6w4bmnyxo7q6ct4wewke2bqin found: installing from source
==> Fetching https://spack-llnl-mirror.s3-us-west-2.amazonaws.com/_source-cache/archive/3a/3ae1af359eebaa4ffc5896a1aee3568c052c99879316a1ab57f8fe1789c390b6.tar.gz
######################################################################## 100.0%
==> libsigsegv: Executing phase: 'autoreconf'
==> libsigsegv: Executing phase: 'configure'
==> libsigsegv: Executing phase: 'build'
==> libsigsegv: Executing phase: 'install'
[+] /home/kru0052/Spack/opt/spack/linux-centos7-x86_64/gcc-4.8.5/libsigsegv-2.12-lctnabj6w4bmnyxo7q6ct4wewke2bqin
==> Installing berkeley-db-18.1.40-bwuaqjex546zw3bimt23bgokfctnt46y
==> No binary for berkeley-db-18.1.40-bwuaqjex546zw3bimt23bgokfctnt46y found: installing from source
==> Fetching https://spack-llnl-mirror.s3-us-west-2.amazonaws.com/_source-cache/archive/0c/0cecb2ef0c67b166de93732769abdeba0555086d51de1090df325e18ee8da9c8.tar.gz
######################################################################## 100.0%
...
...
```
...

# Virtualization

<!-- needs verification -->

Running virtual machines on compute nodes.
## Introduction
...

@@ -206,7 +206,6 @@ nav:
    - ANSYS Fluent: software/tools/ansys/ansys-fluent.md
    - ANSYS LS-DYNA: software/tools/ansys/ansys-ls-dyna.md
    - ANSYS MAPDL: software/tools/ansys/ansys-mechanical-apdl.md
    - Workbench: software/tools/ansys/workbench.md
    - Setting License Preferences: software/tools/ansys/licensing.md
    - Licensing and Available Versions: software/tools/ansys/setting-license-preferences.md
...