diff --git a/materials/batch_job.md b/materials/batch_job.md
index 33d262ed..448e56cc 100644
--- a/materials/batch_job.md
+++ b/materials/batch_job.md
@@ -38,7 +38,7 @@ Minimal example of batch script:

 ```bash title="simple.sh"
 #!/bin/bash
-#SBATCH --account=project_200xxxx # Your CSC project. Mandatory.
+#SBATCH --account=project_20xxxxx # Your CSC project. Mandatory.
 #SBATCH --partition=test # Partition, see section below
 #SBATCH --time=00:02:00 # Maximum duration of the job.
 #SBATCH --ntasks=1 # How many cores?
diff --git a/materials/csc_services.md b/materials/csc_services.md
index 71da8883..e60205dd 100644
--- a/materials/csc_services.md
+++ b/materials/csc_services.md
@@ -6,7 +6,7 @@
 * Virtual machines for web services and databases: [cPouta](https://research.csc.fi/-/cpouta)
 * Containers for web services: [Rahti](https://research.csc.fi/-/rahti)
 * Sensitive data: [SD services](https://research.csc.fi/sensitive-data-services-for-research), [ePouta](https://research.csc.fi/-/epouta)
-* Jupyter and RStudio for courses: [CSC Noppe](https://docs.csc.fi/cloud/noppe/)
+* Jupyter and RStudio for courses: [Noppe](https://docs.csc.fi/cloud/noppe/)

 ## Data services

diff --git a/materials/exercise_allas.md b/materials/exercise_allas.md
index a2de6264..add491ed 100644
--- a/materials/exercise_allas.md
+++ b/materials/exercise_allas.md
@@ -28,7 +28,7 @@ Learn how to:

 :::{admonition} Change the default project and username

-* `project_200xxxx` is an example project name, replace with your own CSC project name.
+* `project_20xxxxx` is an example project name, replace with your own CSC project name.
 * `cscusername` is an example username, replace with your username.

 :::
@@ -54,38 +54,38 @@ allas-conf --mode s3cmd

 # Create a new bucket
 # s3cmd mb
-s3cmd mb s3://project_200xxxx-cscusername
+s3cmd mb s3://project_20xxxxx-cscusername

 # Upload (later synchronize) a folder to Allas
 # s3cmd sync s3://
-s3cmd sync /appl/data/geo/mml/dem10m/2019/W3/W33/ s3://project_200xxxx-cscusername
+s3cmd sync /appl/data/geo/mml/dem10m/2019/W3/W33/ s3://project_20xxxxx-cscusername

 # List all buckets
 s3cmd ls

 # List all files in one bucket
 # s3cmd ls s3://
-s3cmd ls s3://project_200xxxx-cscusername
+s3cmd ls s3://project_20xxxxx-cscusername

 # Read and write directly to Allas with GDAL
 # Make GDAL available
 module load geoconda

 # See metadata of a file from GDAL exercise
-gdalinfo /vsis3/project_200xxxx-cscusername/W3333.tif
+gdalinfo /vsis3/project_20xxxxx-cscusername/W3333.tif

 # Enable writing with GDAL
 export CPL_VSIL_USE_TEMP_FILE_FOR_RANDOM_WRITE=YES

 # Convert the .tif file to a Cloud-Optimized GeoTIFF
-gdal_translate /vsis3/project_200xxxx-cscusername/W3333.tif /vsis3/project_200xxxx-cscusername/W3333_COG.tif -of COG
+gdal_translate /vsis3/project_20xxxxx-cscusername/W3333.tif /vsis3/project_20xxxxx-cscusername/W3333_COG.tif -of COG

 # See metadata of the new file
-gdalinfo /vsis3/project_200xxxx-cscusername/W3333_COG.tif
+gdalinfo /vsis3/project_20xxxxx-cscusername/W3333_COG.tif

 # Delete all from Allas
-s3cmd del --recursive --force s3://project_200xxxx-cscusername
-s3cmd rb s3://project_200xxxx-cscusername
+s3cmd del --recursive --force s3://project_20xxxxx-cscusername
+s3cmd rb s3://project_20xxxxx-cscusername
 ```

 :::{admonition} Key points
diff --git a/materials/exercise_basics.md b/materials/exercise_basics.md
index 330171b1..77e64a17 100644
--- a/materials/exercise_basics.md
+++ b/materials/exercise_basics.md
@@ -22,7 +22,7 @@
 :class: important

 * Access to the Puhti web interface
-* Own directory within the course directory `/scratch/project_200xxxx/students/cscusername`
+* Own directory within the course directory `/scratch/project_20xxxxx/students/cscusername`

 :::

@@ -44,7 +44,7 @@ Let's reserve 10 minutes.
 On the **login node**: Start an interactive job with `srun`, e.g.:

 ```bash
-srun --time=00:10:00 --pty --account=project_200xxxx --partition=interactive bash # replace xxxx with your project number; for the course you can also add --reservation=geocomputing_thu here (not available at other times) and change the partition to small
+srun --time=00:10:00 --pty --account=project_20xxxxx --partition=interactive bash # replace xxxxx with your project number; for the course you can also add --reservation=geocomputing_wed here (not available at other times) and change the partition to small
 ```

 **or** on Puhti you can also use the `sinteractive` wrapper to start an interactive session from the **login node**, which simplifies the call and asks you for the resources step by step:

 ```bash
 sinteractive -i
 ```

 **or** directly:

 ```bash
-sinteractive --account project_200xxxx --time 00:10:00 # replace xxxx with your CSC project, e.g. project_2001234
+sinteractive --account project_20xxxxx --time 00:10:00 # replace xxxxx with your CSC project, e.g. project_2001234
 ```

 :::

@@ -125,7 +125,7 @@ If you use a software that is pre-installed by CSC, please [check its documentat
 1. Go to your own directory in the `/scratch` directory of your project:

 ```bash
-cd /scratch/project_200xxxx/students/cscusername # replace xxxx with your CSC project number and cscusername with your username
+cd /scratch/project_20xxxxx/students/cscusername # replace xxxxx with your CSC project number and cscusername with your username
 ```

 2. Create a file called `my_serial.bash`, e.g. with the `nano` text editor:

 ```bash
 nano my_serial.bash
 ```

 ```bash
 #!/bin/bash
-#SBATCH --account=project_200xxxx # Choose the billing project. Has to be defined!
+#SBATCH --account=project_20xxxxx # Choose the billing project. Has to be defined!
 #SBATCH --time=00:02:00 # Maximum duration of the job. Upper limit depends on the partition.
 #SBATCH --partition=test # Job queues: test, interactive, small, large, longrun, hugemem, hugemem_longrun
 #SBATCH --ntasks=1 # Number of tasks. Upper limit depends on partition. For a serial job this should be set to 1!
diff --git a/materials/installations.md b/materials/installations.md
index bc52cc46..b8abecac 100644
--- a/materials/installations.md
+++ b/materials/installations.md
@@ -51,19 +51,13 @@

 * Open [Puhti web interface](https://puhti.csc.fi) and log in
 * Open Compute node shell (outside of the course, the Login node shell could also be used)
-
-Reservation: geocomputing_thu (only during the course)
-Project: 2000xxx
-Partition: small
-Number of CPU cores: 1
-Memory (GB): 4
-Local disk (GB): 4
-Time: 00:30:00
-
-```
-srun --reservation=geocomputing_thu --account=project_2011224 --mem=4000 --ntasks=1 --time=0:20:00 --gres=nvme:4 --pty bash -i
-```
-
+ * Reservation: geocomputing_thu (only during the course)
+ * Project: project_20xxxxx
+ * Partition: small
+ * Number of CPU cores: 1
+ * Memory (GB): 4
+ * Local disk (GB): 4
+ * Time: 00:30:00

 Make Tykky tools available

 ```
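
For quick reference, a minimal sketch (not part of the patch above) of how the `my_serial.bash` script from `exercise_basics.md` would be submitted and monitored on Puhti; `<jobid>` is a hypothetical placeholder for the job ID that `sbatch` prints:

```bash
# Submit the batch script to the Slurm scheduler
sbatch my_serial.bash        # prints "Submitted batch job <jobid>"

# List your own pending and running jobs
squeue -u $USER

# After the job has finished, summarize its CPU and memory usage
seff <jobid>                 # <jobid> is a placeholder, use the ID printed by sbatch
```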