WIP: okay, just introducing this is ridiculously hard 😅

chriskrycho committed Apr 22, 2024
1 parent 814920a commit e3a2714

Showing 3 changed files with 63 additions and 22 deletions.
3 changes: 2 additions & 1 deletion src/ch16-01-threads.md
@@ -30,7 +30,8 @@
operating systems provide an API the language can call for creating new
threads. The Rust standard library uses a *1:1* model of thread implementation,
whereby a program uses one operating system thread per one language thread.
There are crates that implement other models of threading that make different
tradeoffs to the 1:1 model. (Rust’s async-await system, which we will see in the
next chapter, provides another approach to concurrency as well.)

### Creating a New Thread with `spawn`

52 changes: 32 additions & 20 deletions src/ch17-00-async-await.md
@@ -31,20 +31,6 @@
interchangeable. Now we need to distinguish between the two a little more:
* *Concurrency* is when operations can make progress without having to wait for
all other operations to complete.

One common analogy for thinking about the difference between concurrency and
parallelism is cooking in a kitchen. Parallelism is like having two cooks: one
working on cooking eggs, and the other working on preparing fruit bowls. Those
@@ -59,17 +45,43 @@
while the cook is chopping up the vegetables, after all. That is parallelism,
not just concurrency! The focus of the analogy is the *cook*, not the food,
though, and as long as you keep that in mind, it mostly works.)

On a machine with multiple CPU cores, we can actually do work in parallel. One
core can be doing one thing while another core does something completely
unrelated, and those actually happen at the same time. On a machine with a
single CPU core, the CPU can only do one operation at a time, but we can still
have concurrency. Using tools like threads, processes, and async-await, the
computer can pause one activity and switch to others before eventually cycling
back to that first activity again. So all parallel operations are also
concurrent, but not all concurrent operations happen in parallel!

> Note: When working with async-await in Rust, we need to think in terms of
> *concurrency*. Depending on the hardware, the operating system, and the async
> runtime we are using, that concurrency may use some degree of parallelism
> under the hood, or it may not.

Consider again the examples of exporting a video file and waiting on the video
file to finish uploading. The video export will use as much CPU and GPU power as
it can. If you only had one CPU core, and your operating system never paused
that export until it completed, you could not do anything else on your computer
while it was running. That would be a pretty frustrating experience, though, so
instead your computer can (and does!) invisibly interrupt the export often
enough to let you get other small amounts of work done along the way.

The file upload is different. It does not take up very much CPU time. Instead,
you are mostly waiting on data to transfer across the network. If you only have
a single CPU core, you might write a bunch of data to a network socket and then
wait for it to finish getting sent by the network controller. You could choose
to wait for all the data to get “flushed” from the socket and actually sent over
the network, but if the network connection is busy, you might be waiting for
a while… with your CPU not doing much! Thus, even if you make a blocking call to
write to a socket, your computer probably does other things while the network
operation is happening.
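
To make that blocking case concrete, here is a minimal sketch of writing to a
socket with `std::net::TcpStream`; the snippet and the address in it are
illustrative, not part of the chapter’s examples:

```rust
use std::io::Write;
use std::net::TcpStream;

fn main() -> std::io::Result<()> {
    // Both `connect` and `write_all` are blocking calls: this thread just
    // waits while the connection is set up and the data drains out.
    let mut stream = TcpStream::connect("example.com:80")?;
    stream.write_all(b"GET / HTTP/1.0\r\nHost: example.com\r\n\r\n")?;
    // Even while we sit here blocked, the operating system is free to run
    // other programs (or other threads of this one) on the CPU.
    stream.flush()?;
    Ok(())
}
```

The details of the request do not matter here; the shape does: these calls
block this one thread, while the operating system keeps the rest of the
machine busy.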

In both of these cases, it might be useful for *your program* to participate in
the same kind of concurrency the computer is providing for the rest of the
system. One way to do this is the approach we saw last chapter: using threads,
which are provided and managed by the operating system. Another way to get
access to concurrency is using language-specific capabilities—like async-await.
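
As a quick reminder of what the thread-based option from the last chapter
looks like for this kind of workload, here is a minimal sketch; `upload` and
`do_other_work` are hypothetical stand-ins rather than functions from the book:

```rust
use std::thread;

fn main() {
    // Hand the slow operation to an operating-system thread so `main` can
    // keep making progress in the meantime.
    let upload_handle = thread::spawn(|| upload("video-draft.mp4"));

    do_other_work();

    // Block until the upload thread finishes before exiting.
    upload_handle.join().unwrap();
}

// Hypothetical stand-in for the "waiting on a video upload" example.
fn upload(path: &str) {
    println!("uploading {path}…");
}

// Hypothetical stand-in for whatever else the program does meanwhile.
fn do_other_work() {
    println!("doing other work while the upload runs");
}
```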

A big difference between the cooking analogy and Rust’s async-await model for
concurrency is that in the cooking example, the cook makes the decision about
30 changes: 29 additions & 1 deletion src/ch17-01-tasks.md
@@ -9,7 +9,9 @@
multiple threads. While mainstream desktop and mobile operating systems have all
had threading for many years, many embedded operating systems used on
microcontrollers do not.

The async-await model provides a different, complementary set of tradeoffs.

<!-- TODO: the following paragraph is not where it needs to be structurally. -->

In the async-await model, concurrent operations do not require their own
threads. Instead, they can run on *tasks*. A task is a bit like a thread, but
@@ -19,3 +21,29 @@
Erlang, and Swift, ship runtimes with the language. In Rust, there are many
different runtimes, because the things a runtime for a high-throughput web
server should do are very different from the things a runtime for a
microcontroller should do.

<!-- TODO: connective tissue as it were. -->


For the rest of this chapter, we are going to use the simple executor from the
`futures` crate.

<!-- TODO: the code samples will need to actually be turned into listings. -->

```console
$ cargo add [email protected]
```

In Listing 16-1, we saw a simple example of multithreading. Let’s see a
similarly simple example with `async` and `await`.

```rust
```

<span class="caption">Listing 17-1: Creating a new thread to print one thing
while the main thread prints something else</span>
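
The listing body is still empty in this WIP commit. As a placeholder, here is
a minimal sketch of the shape a first example with the `futures` crate’s
simple executor might take; `say_hello` is illustrative, not the book’s
eventual listing:

```rust
use futures::executor::block_on;

// Calling an `async fn` does not run its body; it just returns a future.
async fn say_hello() {
    println!("hello from the async world!");
}

fn main() {
    let future = say_hello(); // nothing has been printed yet

    // `block_on` is the simple executor from the `futures` crate: it drives
    // the future to completion on the current thread, blocking until it is
    // done.
    block_on(future);
}
```

Unlike `thread::spawn` in Listing 16-1, nothing runs until an executor polls
the future; here `block_on` does that polling on the current thread.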

So far, these look pretty similar. Both of them involve calling

Where things start to look very different is when they want to trigger *further*
concurrent behavior themselves. In the threaded model,
