Commit c6d37d2

Merge pull request #78 from csc-training/ehuusko

Updated project names and reservations

EetuHuuskoCSC authored Oct 8, 2024
2 parents afa1c0e + 6b39252 commit c6d37d2
Showing 4 changed files with 21 additions and 27 deletions.
2 changes: 1 addition & 1 deletion materials/batch_job.md
@@ -38,7 +38,7 @@ Minimal example of batch script:

```bash title="simple.sh"
#!/bin/bash
-#SBATCH --account=project_200xxxx # Your CSC project. Mandatory.
+#SBATCH --account=project_20xxxxx # Your CSC project. Mandatory.
#SBATCH --partition=test # Partition, see section below
#SBATCH --time=00:02:00 # Maximum duration of the job.
#SBATCH --ntasks=1 # How many cores?
```
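As the corrected placeholder suggests, CSC project identifiers have seven digits after `project_` (the document's own example elsewhere is `project_2001234`). A quick, hypothetical sanity check of the directive format before submitting; the file name and the `grep` pattern are illustrative only, not part of the course material:

```shell
# Write the corrected directive to a throwaway file (illustrative name)
echo '#SBATCH --account=project_20xxxxx' > simple_header.txt

# A CSC project name is "project_" plus seven digits; the template keeps
# "20" and five "x" placeholders, i.e. seven characters in total
if grep -Eq '^#SBATCH --account=project_[0-9x]{7}$' simple_header.txt; then
    echo "account line looks OK"
fi
```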
16 changes: 8 additions & 8 deletions materials/exercise_allas.md
@@ -28,7 +28,7 @@ Learn how to:

:::{admonition} Change the default project and username

-* `project_200xxxx` is an example project name, replace with your own CSC project name.
+* `project_20xxxxx` is an example project name, replace with your own CSC project name.
* `cscusername` is an example username, replace with your username.
:::

@@ -54,38 +54,38 @@
```bash
allas-conf --mode s3cmd
# Create a new bucket
# s3cmd mb <name_of_your_bucket>
-s3cmd mb s3://project_200xxxx-cscusername
+s3cmd mb s3://project_20xxxxx-cscusername
# Upload (later synchronize) a folder to Allas
# s3cmd sync <local_folder> s3://<name_of_your_bucket>
-s3cmd sync /appl/data/geo/mml/dem10m/2019/W3/W33/ s3://project_200xxxx-cscusername
+s3cmd sync /appl/data/geo/mml/dem10m/2019/W3/W33/ s3://project_20xxxxx-cscusername
# List all buckets
s3cmd ls
# List all files in one bucket
# s3cmd ls s3://<name_of_your_bucket>
-s3cmd ls s3://project_200xxxx-cscusername
+s3cmd ls s3://project_20xxxxx-cscusername
# Read and write directly to Allas with GDAL
# Make GDAL available
module load geoconda
# See metadata of a file from GDAL exercise
-gdalinfo /vsis3/project_200xxxx-cscusername/W3333.tif
+gdalinfo /vsis3/project_20xxxxx-cscusername/W3333.tif
# Enable writing with GDAL
export CPL_VSIL_USE_TEMP_FILE_FOR_RANDOM_WRITE=YES
# Convert the .tif file to a Cloud-Optimized GeoTIFF
-gdal_translate /vsis3/project_200xxxx-cscusername/W3333.tif /vsis3/project_200xxxx-cscusername/W3333_COG.tif -of COG
+gdal_translate /vsis3/project_20xxxxx-cscusername/W3333.tif /vsis3/project_20xxxxx-cscusername/W3333_COG.tif -of COG
# See metadata of the new file
-gdalinfo /vsis3/project_200xxxx-cscusername/W3333_COG.tif
+gdalinfo /vsis3/project_20xxxxx-cscusername/W3333_COG.tif
# Delete all from Allas
s3cmd del --recursive --force s3://project_200xxxx-cscusername
-s3cmd rm s3://project_200xxxx-cscusername
+s3cmd rm s3://project_20xxxxx-cscusername
```
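Every command in the block above repeats the same `<project>-<username>` bucket name; a small sketch of composing it once in a variable (the values here are hypothetical placeholders):

```shell
# Hypothetical project and username; replace with your own
project="project_2001234"
username="cscusername"

# Compose the bucket URI once and reuse it in each s3cmd or GDAL call
bucket="s3://${project}-${username}"
echo "$bucket"
```

With the variable set, the upload step would read, for example, `s3cmd sync <local_folder> "$bucket"`.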

:::{admonition} Key points
:::
10 changes: 5 additions & 5 deletions materials/exercise_basics.md
@@ -22,7 +22,7 @@
:class: important

* Access to the Puhti web interface
-* Own directory within the course directory `/scratch/project_200xxxx/students/cscusername`
+* Own directory within the course directory `/scratch/project_20xxxxx/students/cscusername`

:::

@@ -44,7 +44,7 @@ Let's reserve 10 minutes.
On the **login node**: Start an interactive job with `srun`, e.g.:

```bash
-srun --time=00:10:00 --pty --account=project_200xxxx --partition=interactive bash ##replace xxxx with your project number; you can also add --reservation=geocomputing_thu here for the course (not available at other times), change partition to small then
+srun --time=00:10:00 --pty --account=project_20xxxxx --partition=interactive bash ##replace xxxxx with your project number; you can also add --reservation=geocomputing_wed here for the course (not available at other times), change partition to small then
```
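A long `srun` line like the one above is easier to read when the options are collected first. This sketch only assembles the command as a string and prints it, without executing anything; the project name is a placeholder:

```shell
# Placeholder project; replace with your own CSC project
account="project_2001234"

# Collect the options; for the course you could append the reservation here
opts="--time=00:10:00 --pty --account=${account} --partition=interactive"

# Show the command that would be run on the login node
echo "srun ${opts} bash"
```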

**or** on Puhti you can also use the `sinteractive` wrapper to start an interactive session from the **login node**, which simplifies the call and asks you for the resources step by step:
@@ -55,7 +55,7 @@
```bash
sinteractive -i
```
**or** directly:

```bash
-sinteractive --account project_200xxxx --time 00:10:00 # replace xxxx with your CSC project, e.g. project_2001234
+sinteractive --account project_20xxxxx --time 00:10:00 # replace xxxxx with your CSC project, e.g. project_2001234
```

:::
@@ -125,7 +125,7 @@ If you use software that is pre-installed by CSC, please check its documentation
1. Go to your own directory in the `/scratch` directory of your project:

```bash
-cd /scratch/project_200xxxx/students/cscusername # replace xxxx with your CSC project number and cscusername with your username
+cd /scratch/project_20xxxxx/students/cscusername # replace xxxxx with your CSC project number and cscusername with your username
```

2. Create a file called `my_serial.bash` e.g. with the `nano` text editor:
@@ -138,7 +138,7 @@
```bash
nano my_serial.bash
```

```bash
#!/bin/bash
-#SBATCH --account=project_200xxxx # Choose the billing project. Has to be defined!
+#SBATCH --account=project_20xxxxx # Choose the billing project. Has to be defined!
#SBATCH --time=00:02:00 # Maximum duration of the job. Upper limit depends on the partition.
#SBATCH --partition=test # Job queues: test, interactive, small, large, longrun, hugemem, hugemem_longrun
#SBATCH --ntasks=1 # Number of tasks. Upper limit depends on partition. For a serial job this should be set to 1!
```
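Before queueing a script like the one above it can help to list its directives at a glance. The sketch below recreates a minimal header with a heredoc (the contents mirror the exercise; the project name is a placeholder) and extracts the `#SBATCH` lines with `grep`:

```shell
# Recreate a minimal batch-script header (placeholder project name)
cat > my_serial.bash <<'EOF'
#!/bin/bash
#SBATCH --account=project_20xxxxx
#SBATCH --time=00:02:00
#SBATCH --partition=test
#SBATCH --ntasks=1
EOF

# List only the scheduler directives
grep '^#SBATCH' my_serial.bash
```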
20 changes: 7 additions & 13 deletions materials/installations.md
@@ -51,19 +51,13 @@

* Open [Puhti web interface](https://puhti.csc.fi) and log in
* Open Compute node shell (outside of the course, also Login node shell could be used)

-Reservation: geocomputing_thu (only during the course)
-Project: 2000xxx
-Partition: small
-Number of CPU cores: 1
-Memory (GB): 4
-Local disk (GB): 4
-Time: 00:30:00
-
-```
-srun --reservation=geocomputing_thu --account=project_2011224 --mem=4000 --ntasks=1 --time=0:20:00 --gres=nvme:4 --pty bash -i
-```
+* Reservation: geocomputing_thu (only during the course)
+* Project: project_20xxxxx
+* Partition: small
+* Number of CPU cores: 1
+* Memory (GB): 4
+* Local disk (GB): 4
+* Time: 00:30:00
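Outside the web interface, form fields like those above would translate into `srun` options. This echo-only sketch builds, but does not execute, such a command; the flag mapping (`--mem=4000` for 4 GB of memory, `--gres=nvme:4` for 4 GB of local disk) follows the srun line used elsewhere in these materials, and the project name is a placeholder:

```shell
# Map the web-form fields to srun options (placeholder project name)
reservation="geocomputing_thu"
account="project_2001234"

# Build the command as a string; nothing is executed here
cmd="srun --reservation=${reservation} --account=${account} --partition=small --ntasks=1 --mem=4000 --gres=nvme:4 --time=00:30:00 --pty bash -i"
echo "$cmd"
```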

Make Tykky tools available
```
```
