Commit

Merge branch 'main' into yetulaxman-patch-1
EetuHuuskoCSC authored Oct 8, 2024
2 parents 18d34d6 + c6d37d2 commit ddba94e
Showing 5 changed files with 23 additions and 29 deletions.
materials/batch_job.md (2 changes: 1 addition & 1 deletion)
@@ -38,7 +38,7 @@ Minimal example of batch script:

```bash title="simple.sh"
#!/bin/bash
-#SBATCH --account=project_200xxxx # Your CSC project. Mandatory.
+#SBATCH --account=project_20xxxxx # Your CSC project. Mandatory.
#SBATCH --partition=test # Partition, see section below
#SBATCH --time=00:02:00 # Maximum duration of the job.
#SBATCH --ntasks=1 # How many cores?
materials/csc_services.md (2 changes: 1 addition & 1 deletion)
@@ -6,7 +6,7 @@
* Virtual machines for web services and database: [cPouta](https://research.csc.fi/-/cpouta)
* Containers for web services: [Rahti](https://research.csc.fi/-/rahti)
* Sensitive data: [SD services](https://research.csc.fi/sensitive-data-services-for-research), [ePouta](https://research.csc.fi/-/epouta)
-* Jupyter and RStudio for courses: [CSC Noppe](https://docs.csc.fi/cloud/noppe/)
+* Jupyter and RStudio for courses: [Noppe](https://docs.csc.fi/cloud/noppe/)

## Data services

materials/exercise_allas.md (18 changes: 9 additions & 9 deletions)
@@ -28,7 +28,7 @@ Learn how to:

:::{admonition} Change the default project and username

-* `project_200xxxx` is an example project name, replace with your own CSC project name.
+* `project_20xxxxx` is an example project name, replace with your own CSC project name.
* `cscusername` is an example username, replace with your username.
:::
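
As an editorial aside (not part of this commit): if you are unsure which values to use, your username and projects can be checked on Puhti; the `csc-projects` command is assumed to be available there.

```bash
# Print your username (use this in place of cscusername)
echo $USER
# List the CSC projects you belong to (assumed to be available on Puhti)
csc-projects
```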

@@ -54,38 +54,38 @@ allas-conf --mode s3cmd
# Create a new bucket
# s3cmd mb <name_of_your_bucket>
-s3cmd mb s3://project_200xxxx-cscusername
+s3cmd mb s3://project_20xxxxx-cscusername
# Upload (later synchronize) a folder to Allas
# s3cmd sync <local_folder> s3://<name_of_your_bucket>
-s3cmd sync /appl/data/geo/mml/dem10m/2019/W3/W33/ s3://project_200xxxx-cscusername
+s3cmd sync /appl/data/geo/mml/dem10m/2019/W3/W33/ s3://project_20xxxxx-cscusername
# List all buckets
s3cmd ls
# List all files in one bucket
# s3cmd ls s3://<name_of_your_bucket>
-s3cmd ls s3://project_200xxxx-cscusername
+s3cmd ls s3://project_20xxxxx-cscusername
# Read and write directly to Allas with GDAL
# Make GDAL available
module load geoconda
# See metadata of a file from GDAL exercise
-gdalinfo /vsis3/project_200xxxx-cscusername/W3333.tif
+gdalinfo /vsis3/project_20xxxxx-cscusername/W3333.tif
# Enable writing with GDAL
export CPL_VSIL_USE_TEMP_FILE_FOR_RANDOM_WRITE=YES
# Make the .tif file to Cloud-Optimized GeoTiff
-gdal_translate /vsis3/project_200xxxx-cscusername/W3333.tif /vsis3/project_200xxxx-cscusername/W3333_COG.tif -of COG
+gdal_translate /vsis3/project_20xxxxx-cscusername/W3333.tif /vsis3/project_20xxxxx-cscusername/W3333_COG.tif -of COG
# See metadata of the new file
-gdalinfo /vsis3/project_200xxxx-cscusername/W3333_COG.tif
+gdalinfo /vsis3/project_20xxxxx-cscusername/W3333_COG.tif
# Delete all from Allas
-s3cmd del --recursive --force s3://project_200xxxx-cscusername
-s3cmd rb s3://project_200xxxx-cscusername
+s3cmd del --recursive --force s3://project_20xxxxx-cscusername
+s3cmd rb s3://project_20xxxxx-cscusername
```
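
A small follow-up sketch (not part of this commit), assuming the same bucket and file names as above: the converted file can be pulled back from Allas for a quick local check.

```bash
# Download the cloud-optimized copy back from Allas for inspection
s3cmd get s3://project_20xxxxx-cscusername/W3333_COG.tif W3333_COG_local.tif
# Compare object sizes in the bucket
s3cmd ls s3://project_20xxxxx-cscusername/
```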

:::{admonition} Key points
materials/exercise_basics.md (10 changes: 5 additions & 5 deletions)
@@ -22,7 +22,7 @@
:class: important

* Access to the Puhti web interface
-* Own directory within the course directory `/scratch/project_200xxxx/students/cscusername`
+* Own directory within the course directory `/scratch/project_20xxxxx/students/cscusername`

:::
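
If the personal course directory does not exist yet, it can be created first; a minimal sketch (not part of this commit), assuming the directory layout above and your own project name:

```bash
# Create and enter your own working directory under the course scratch area
mkdir -p /scratch/project_20xxxxx/students/$USER
cd /scratch/project_20xxxxx/students/$USER
```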

@@ -44,7 +44,7 @@ Let's reserve 10 minutes.
On the **login node**: Start an interactive job with `srun`, e.g.:

```bash
-srun --time=00:10:00 --pty --account=project_200xxxx --partition=interactive bash ##replace xxxx with your project number; you can also add --reservation=geocomputing_thu here for the course (not available at other times), change partition to small then
+srun --time=00:10:00 --pty --account=project_20xxxxx --partition=interactive bash ##replace xxxxx with your project number; you can also add --reservation=geocomputing_wed here for the course (not available at other times), change partition to small then
```

**or** on Puhti you can also use the `sinteractive` wrapper to start an interactive session from the **login node**, which simplifies the call and asks you for the resources step by step:
@@ -55,7 +55,7 @@ sinteractive -i
**or** directly:

```bash
-sinteractive --account project_200xxxx --time 00:10:00 # replace xxxx with your CSC project, e.g. project_2001234
+sinteractive --account project_20xxxxx --time 00:10:00 # replace xxxxx with your CSC project, e.g. project_2001234
```

:::
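
Whichever way the interactive session is started, a quick check that you actually landed on a compute node can be useful; a sketch (not part of this commit), where `squeue --me` assumes a reasonably recent Slurm version:

```bash
# Inside the interactive session:
hostname      # should print a compute node name rather than a login node
squeue --me   # list your running jobs; on older Slurm use: squeue -u $USER
exit          # end the session and release the resources
```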
@@ -125,7 +125,7 @@ If you use a software that is pre-installed by CSC, please [check its documentat
1. Go to your own directory in the `/scratch` directory of your project:

```bash
-cd /scratch/project_200xxxx/students/cscusername # replace xxxx with your CSC project number and cscusername with your username
+cd /scratch/project_20xxxxx/students/cscusername # replace xxxxx with your CSC project number and cscusername with your username
```

2. Create a file called `my_serial.bash` e.g. with the `nano` text editor:
@@ -138,7 +138,7 @@ nano my_serial.bash

```bash
#!/bin/bash
-#SBATCH --account=project_200xxxx # Choose the billing project. Has to be defined!
+#SBATCH --account=project_20xxxxx # Choose the billing project. Has to be defined!
#SBATCH --time=00:02:00 # Maximum duration of the job. Upper limit depends on the partition.
#SBATCH --partition=test # Job queues: test, interactive, small, large, longrun, hugemem, hugemem_longrun
#SBATCH --ntasks=1 # Number of tasks. Upper limit depends on partition. For a serial job this should be set to 1!
materials/installations.md (20 changes: 7 additions & 13 deletions)
@@ -51,19 +51,13 @@

* Open [Puhti web interface](https://puhti.csc.fi) and log in
* Open a Compute node shell (outside the course, a Login node shell could also be used)

-Reservation: geocomputing_thu (only during the course)
-Project: 2000xxx
-Partition: small
-Number of CPU cores: 1
-Memory (GB): 4
-Local disk (GB): 4
-Time: 00:30:00
-
-```
-srun --reservation=geocomputing_thu --account=project_2011224 --mem=4000 --ntasks=1 --time=0:20:00 --gres=nvme:4 --pty bash -i
-```
-
+* Reservation: geocomputing_thu (only during the course)
+* Project: project_20xxxxx
+* Partition: small
+* Number of CPU cores: 1
+* Memory (GB): 4
+* Local disk (GB): 4
+* Time: 00:30:00
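
For reference, roughly the same resources could also be requested from a shell with `srun`; a sketch (not part of this commit) combining the settings listed above with the srun form from the removed lines, using the placeholder project name:

```bash
srun --reservation=geocomputing_thu --account=project_20xxxxx --partition=small \
     --ntasks=1 --mem=4G --time=00:30:00 --gres=nvme:4 --pty bash -i
```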

Make Tykky tools available
```

