**README.md**

````diff
@@ -21,4 +21,6 @@ Run Jobs on NERSC
 
 3.4. `<FULL_PATH_TO_RMCPROFILE_PACKAGE>` is the full path to the RMCProfile package directory.
 
-3.5. `<RMC_RUNNING_DIR>` is the RMC running directory and `<YOUR_RMC_STEM_NAME>` is the stem name of the RMC setup
+3.5. `<RMC_RUNNING_DIR>` is the RMC running directory and `<YOUR_RMC_STEM_NAME>` is the stem name of the RMC setup.
+
+4. Submit the job from the terminal by running `sbatch submit.slurm` in the directory where `submit.slurm` is located.
````
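The submission step added above can be sketched as a short terminal session on a NERSC login node; the job ID shown is illustrative, and `<YOUR_OUTPUT_FILE>` stands for whatever name was set in the script:

```shell
# Submit the batch script from the directory that contains it;
# Slurm replies with the assigned job ID.
sbatch submit.slurm        # e.g. "Submitted batch job 1234567"

# Check the state of our queued/running jobs.
squeue -u $USER

# Follow the terminal output being captured in the output file.
tail -f <YOUR_OUTPUT_FILE>

# Cancel the job if needed (job ID from the sbatch/squeue output).
scancel 1234567
```

These commands require a Slurm cluster, so they are shown here only as a usage transcript.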
**README.md**

````diff
@@ -9,7 +9,7 @@ Run Jobs on NERSC
 tar xzvf RMCProfile_V6.7.9_Linux_64.tgz
 ```
 
-2. Prepare the job submission script -- see the example above -- and put the script to somewhere in your home directory. I would suggest put the script in the same directory where the RMC run will be happening so that we can easily keep track of which one is for which one.
+2. Prepare the job submission script -- see the example below -- and place it somewhere under your home directory. I would suggest putting it in the same directory where the RMC run will happen, so that it is easy to keep track of which script belongs to which run.
 
 3. Several notes about the script,
 
````
**README.md** (file created)

````diff
@@ -0,0 +1,24 @@
+Run Jobs on NERSC
+===
+
+1. First, download the RMCProfile package on Linux platform [here](https://yr.iris-home.net/rmclinux) and untar it somewhere in our home directory. Once logged onto NERSC from terminal, we can do this,
+
+```
+cd SOMEWHERE
+wget https://yr.iris-home.net/rmclinux -O RMCProfile_V6.7.9_Linux_64.tgz
+tar xzvf RMCProfile_V6.7.9_Linux_64.tgz
+```
+
+2. Prepare the job submission script -- see the example above -- and put the script to somewhere in your home directory. I would suggest put the script in the same directory where the RMC run will be happening so that we can easily keep track of which one is for which one.
+
+3. Several notes about the script,
+
+3.1. `<YOUR_JOB_NAME>` is a name that we can give to the running job, which can be any meaningful name to us.
+
+3.2. `<YOUR_OUTPUT_FILE>` is the name for the output file of the running job, meaning that all the terminal output during the job running will go into this file.
+
+3.3. `<YOUR_EMAIL>` is our email address -- status (e.g., started, or stopped, or failure, etc.) of the job will be sent to this email.
+
+3.4. `<FULL_PATH_TO_RMCPROFILE_PACKAGE>` is the full path to the RMCProfile package directory.
+
+3.5. `<RMC_RUNNING_DIR>` is the RMC running directory and `<YOUR_RMC_STEM_NAME>` is the stem name of the RMC setup
````
**submit.slurm** (file created)

````diff
@@ -0,0 +1,22 @@
+#!/bin/bash
+#SBATCH --job-name=<YOUR_JOB_NAME>
+#SBATCH --output=<YOUR_OUTPUT_FILE>
+#SBATCH -t 24:00:00
+#SBATCH -N 1
+#SBATCH -q regular
+#SBATCH -C cpu
+#SBATCH --mail-user=<YOUR_EMAIL>
+#SBATCH --mail-type=ALL
+
+#OpenMP settings:
+export OMP_NUM_THREADS=8
+export OMP_PLACES=threads
+export OMP_PROC_BIND=spread
+
+RMCProfile_PATH=<FULL_PATH_TO_RMCPROFILE_PACKAGE>
+export PGPLOT_DIR=$RMCProfile_PATH/exe/libs
+export LD_LIBRARY_PATH=$RMCProfile_PATH/exe/libs
+export LIBRARY_PATH=$RMCProfile_PATH/exe/libs
+export PATH=$PATH:$RMCProfile_PATH/exe
+cd <RMC_RUNNING_DIR>
+$RMCProfile_PATH/exe/rmcprofile <YOUR_RMC_STEM_NAME>
````
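For concreteness, a filled-in version of the script above might look like the following. The job name, email, install path, run directory, and stem name are all hypothetical values chosen for illustration, not values from the source:

```shell
#!/bin/bash
#SBATCH --job-name=rmc_nickel            # hypothetical job name
#SBATCH --output=rmc_nickel.out          # terminal output of the run goes here
#SBATCH -t 24:00:00
#SBATCH -N 1
#SBATCH -q regular
#SBATCH -C cpu
#SBATCH --mail-user=user@example.com     # hypothetical email for job status mail
#SBATCH --mail-type=ALL

# OpenMP settings (unchanged from the template):
export OMP_NUM_THREADS=8
export OMP_PLACES=threads
export OMP_PROC_BIND=spread

# Hypothetical install and run locations:
RMCProfile_PATH=$HOME/RMCProfile_package
export PGPLOT_DIR=$RMCProfile_PATH/exe/libs
export LD_LIBRARY_PATH=$RMCProfile_PATH/exe/libs
export LIBRARY_PATH=$RMCProfile_PATH/exe/libs
export PATH=$PATH:$RMCProfile_PATH/exe
cd $HOME/rmc_runs/nickel
$RMCProfile_PATH/exe/rmcprofile nickel   # "nickel" is the hypothetical stem name
```

Here the stem name `nickel` means RMCProfile will read the setup files named `nickel.*` in the run directory.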