
Commit

renamed snippets folder for lesson 02
Peter Steinbach committed Mar 26, 2019
1 parent aeb40e2 commit d6911de
Showing 2 changed files with 7 additions and 7 deletions.
6 changes: 3 additions & 3 deletions _episodes/02-03-mapreduce-for-pi.md
@@ -70,7 +70,7 @@ if __name__=='__main__':
She tests her Python program on a single input file. As she knows approximately how long it will take, she can provide a good estimate of the job's run time. If the cluster is busy, this allows the scheduler to start her job sooner.

~~~
-{% include /snippets/03/submit_filter_pi.{{ site.workshop_scheduler }} %}
+{% include /snippets/02/submit_filter_pi.{{ site.workshop_scheduler }} %}
~~~
{: .bash}

@@ -137,14 +137,14 @@ The question is, she would love to send this averaging job after she filtered ev
~~~
-{% include /snippets/03/map_filter_pi.{{ site.workshop_scheduler }} %}
+{% include /snippets/02/map_filter_pi.{{ site.workshop_scheduler }} %}
~~~
{: .bash}
The above is called an _array job_. The same commands are executed on an array of files that share a similar file name. In this case, these are `pi_estimate_01.data, pi_estimate_02.data, pi_estimate_03.data, ...`. When the job runs on the cluster, the shell variable
~~~
-{% include /snippets/03/array_job_task_id.{{ site.workshop_scheduler }} %}
+{% include /snippets/02/array_job_task_id.{{ site.workshop_scheduler }} %}
~~~
{: .bash}
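
To illustrate how such a task-id variable is typically consumed, here is a minimal Python sketch; the variable name `SLURM_ARRAY_TASK_ID` is an assumption for a SLURM-style scheduler, and other schedulers export a differently named variable.

~~~
import os

# the scheduler exports the array task id into the job's environment;
# SLURM_ARRAY_TASK_ID is assumed here, other schedulers use other names
task_id = int(os.environ["SLURM_ARRAY_TASK_ID"])

# map the task id onto one of the similarly named input files
filename = "pi_estimate_{:02d}.data".format(task_id)
print("this array task will process", filename)
~~~
{: .python}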
8 changes: 4 additions & 4 deletions _episodes/02-04-bonus-mpi-for-pi.md
@@ -30,7 +30,7 @@ One of her more experienced colleagues has suggested to her, to use the _Message
Lola becomes curious. She wants to experiment with this parallelization technique a bit. For this, she would like to print the name of the node where a specific driver application is run.

~~~
-{% include /snippets/03/submit_4_mpirun_hostname.{{ site.workshop_scheduler }} %}
+{% include /snippets/02/submit_4_mpirun_hostname.{{ site.workshop_scheduler }} %}
~~~
{: .bash}

@@ -47,7 +47,7 @@ n01
The output makes her wonder. Apparently, the command was cloned and executed on the same host 4 times. If she increases the number of processors to a number larger than the number of CPU cores each of her nodes has, this might change and the distributed nature of `mpirun` will reveal itself.

~~~
-{% include /snippets/03/submit_16_mpirun_hostname.{{ site.workshop_scheduler }} %}
+{% include /snippets/02/submit_16_mpirun_hostname.{{ site.workshop_scheduler }} %}
~~~
{: .bash}

@@ -78,7 +78,7 @@ As the figure above shows, 12 instances of `hostname` were called on `n01` and 4
Like a reflex, Lola asks how to write these MPI programs. Her colleague points out that she needs to program in one of the languages that MPI supports, such as Fortran, C, C++, Python and many more. As Lola is most confident with Python, her colleague wants to give her a head start using `mpi4py` and provides a minimal example. This example is analogous to what Lola just played with. This Python script called [`print_hostname.py`]({{ page.root }}/code/02_parallel_jobs/print_hostname.py) prints the number of the current MPI rank (i.e. the unique id of the execution process within one mpirun invocation), the total number of MPI ranks available, and the hostname this rank currently runs on.

~~~
-{% include /snippets/03/submit_16_mpirun_python3_print_hostname.{{ site.workshop_scheduler }} %}
+{% include /snippets/02/submit_16_mpirun_python3_print_hostname.{{ site.workshop_scheduler }} %}
~~~
{: .bash}
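
For orientation, here is a minimal mpi4py sketch of what such a script could look like; this is an illustration, not the verbatim contents of `print_hostname.py`.

~~~
from mpi4py import MPI
import socket

comm = MPI.COMM_WORLD   # communicator spanning all ranks of this mpirun
rank = comm.Get_rank()  # unique id of this rank
size = comm.Get_size()  # total number of ranks available

print("this is rank {} of {}, running on {}".format(rank, size, socket.gethostname()))
~~~
{: .python}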

@@ -187,7 +187,7 @@ if rank == 0:
And that's it. Now, Lola can submit her first MPI job.

~~~
-{% include /snippets/03/submit_48_mpirun_python3_mpi_numpi.{{ site.workshop_scheduler }} %}
+{% include /snippets/02/submit_48_mpirun_python3_mpi_numpi.{{ site.workshop_scheduler }} %}
~~~
{: .bash}
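
As a rough sketch of how such an MPI pi estimate can be structured with mpi4py: each rank draws its own random samples, and the partial counts are reduced on rank 0. This is an illustration only, not the lesson's actual script.

~~~
from mpi4py import MPI
import random

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

samples_per_rank = 1_000_000

# every rank throws its own darts at the unit square
hits = sum(1 for _ in range(samples_per_rank)
           if random.random() ** 2 + random.random() ** 2 <= 1.0)

# collect the partial counts on rank 0
total_hits = comm.reduce(hits, op=MPI.SUM, root=0)

if rank == 0:
    pi_estimate = 4.0 * total_hits / (samples_per_rank * size)
    print("pi is approximately", pi_estimate)
~~~
{: .python}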

