WIP: okay, just introducing this is ridiculously hard 😅

chriskrycho committed Apr 23, 2024
1 parent 814920a commit 3da45e7
Showing 3 changed files with 185 additions and 23 deletions.
3 changes: 2 additions & 1 deletion src/ch16-01-threads.md
@@ -30,7 +30,8 @@
operating systems provide an API the language can call for creating new
threads. The Rust standard library uses a *1:1* model of thread implementation,
whereby a program uses one operating system thread per one language thread.
There are crates that implement other models of threading that make different
tradeoffs to the 1:1 model. (Rust’s async-await system, which we will see in the
next chapter, provides another approach to concurrency as well.)

### Creating a New Thread with `spawn`

52 changes: 32 additions & 20 deletions src/ch17-00-async-await.md
@@ -31,20 +31,6 @@
interchangeable. Now we need to distinguish between the two a little more:
* *Concurrency* is when operations can make progress without having to wait for
all other operations to complete.

One common analogy for thinking about the difference between concurrency and
parallelism is cooking in a kitchen. Parallelism is like having two cooks: one
working on cooking eggs, and the other working on preparing fruit bowls. Those
@@ -59,17 +45,43 @@
while the cook is chopping up the vegetables, after all. That is parallelism,
not just concurrency! The focus of the analogy is the *cook*, not the food,
though, and as long as you keep that in mind, it mostly works.)

On a machine with multiple CPU cores, we can actually do work in parallel. One
core can be doing one thing while another core does something completely
unrelated, and those actually happen at the same time. On a machine with a
single CPU core, the CPU can only do one operation at a time, but we can still
have concurrency. Using tools like threads, processes, and async-await, the
computer can pause one activity and switch to others before eventually cycling
back to that first activity again. So all parallel operations are also
concurrent, but not all concurrent operations happen in parallel!

> Note: When working with async-await in Rust, we need to think in terms of
> *concurrency*. Depending on the hardware, the operating system, and the async
> runtime we are using, that concurrency may use some degree of parallelism
> under the hood, or it may not.
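
To make that switching concrete, here is a small sketch (our own example, not
part of the chapter’s running scenario) using the threads we saw in Chapter 16.
Even on a single core, the operating system interleaves the two loops, so their
output can end up mixed together: that is concurrency, whether or not any of it
happens in parallel.

```rust
use std::thread;
use std::time::Duration;

fn main() {
    // One activity: counting, on a spawned thread.
    let handle = thread::spawn(|| {
        for i in 1..=5 {
            println!("counting: {i}");
            thread::sleep(Duration::from_millis(1));
        }
    });

    // Another activity: labeling, on the main thread.
    for letter in ["a", "b", "c", "d", "e"] {
        println!("labeling: {letter}");
        thread::sleep(Duration::from_millis(1));
    }

    // Wait for the spawned thread to finish before exiting.
    handle.join().unwrap();
}
```
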
Consider again the examples of exporting a video file and waiting on the video
file to finish uploading. The video export will use as much CPU and GPU power as
it can. If you only had one CPU core, and your operating system never paused
that export until it completed, you could not do anything else on your computer
while it was running. That would be a pretty frustrating experience, though, so
instead your computer can (and does!) invisibly interrupt the export often
enough to let you get other small amounts of work done along the way.

The file upload is different. It does not take up very much CPU time. Instead,
you are mostly waiting on data to transfer across the network. If you only have
a single CPU core, you might write a bunch of data to a network socket and then
wait for it to finish getting sent by the network controller. You could choose
to wait for all the data to get “flushed” from the socket and actually sent over
the network, but if there is a busy network connection, you might be waiting for
a while… with your CPU doing not much! Thus, even if you make a blocking call to
write to a socket, your computer probably does other things while the network
operation is happening.
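
The following is a minimal sketch (our own illustration, not the chapter’s
example) of what such a blocking upload can look like using the standard
library’s `TcpStream`. The address and the data are placeholders; the point is
that `write_all` blocks the calling thread while the operating system keeps
doing other work.

```rust
use std::io::Write;
use std::net::TcpStream;

fn main() -> std::io::Result<()> {
    // Placeholder endpoint; in a real upload this would be the server’s address.
    let mut socket = TcpStream::connect("example.com:80")?;

    // Pretend this buffer is the video file we want to upload.
    let data = vec![0u8; 10 * 1024 * 1024];

    // write_all blocks this thread until the OS has accepted all of the bytes.
    // While we wait, this program makes no progress, but the OS is free to run
    // other programs (or other threads) in the meantime.
    socket.write_all(&data)?;
    socket.flush()?;

    println!("upload handed off to the network stack");
    Ok(())
}
```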

In both of these cases, it might be useful for *your program* to participate in
the same kind of concurrency the computer is providing for the rest of the
system. One way to do this is the approach we saw last chapter: using threads,
which are provided and managed by the operating system. Another way to get
access to concurrency is using language-specific capabilities—like async-await.

A big difference between the cooking analogy and Rust’s async-await model for
concurrency is that in the cooking example, the cook makes the decision about
153 changes: 151 additions & 2 deletions src/ch17-01-tasks.md
@@ -1,4 +1,6 @@
## Futures and the Async-Await Syntax

### Tasks

As we saw in the previous chapter, threads provide one approach to concurrency,
and they let us solve some of these issues. However, they also have some
@@ -9,7 +11,9 @@
multiple threads. While mainstream desktop and mobile operating systems have all
had threading for many years, many embedded operating systems used on
microcontrollers do not.

The async-await model provides a different, complementary set of tradeoffs.

<!-- TODO: the following paragraph is not where it needs to be structurally. -->

In the async-await model, concurrent operations do not require their own
threads. Instead, they can run on *tasks*. A task is a bit like a thread, but
@@ -19,3 +23,148 @@
Erlang, and Swift, ship runtimes with the language. In Rust, there are many
different runtimes, because the things a runtime for a high-throughput web
server should do are very different from the things a runtime for a
microcontroller should do.

<!-- TODO: connective tissue as it were. -->

### The `async` and `await` Keywords

Like many other languages with first-class support for the async-await
programming model, Rust uses the `async` and `await` keywords—though with some
important differences from other languages like C# or JavaScript. Blocks and
functions can be marked `async`, and you can wait on the result of an `async`
function or block to resolve using the `await` keyword.

Let’s write our first async function:

```rust
fn main() {
hello_async();
}

async fn hello_async() {
println!("Hello, world!");
}
```

If we compile and run this… nothing happens, and we get a compiler warning:

```console
$ cargo run
warning: unused implementer of `Future` that must be used
--> src/main.rs:2:5
|
2 | hello_async();
| ^^^^^^^^^^^^^
|
= note: futures do nothing unless you `.await` or poll them
= note: `#[warn(unused_must_use)]` on by default

warning: `hello-async` (bin "hello-async") generated 1 warning
Finished dev [unoptimized + debuginfo] target(s) in 1.50s
Running `target/debug/hello-async`
```

The warning tells us why nothing happened. Calling `hello_async()` itself was
not enough: we need to `.await` or poll the “future” it returns. That might be a
bit surprising: we did not write a return type on the function. However, we
*did* mark it as an `async fn`. In Rust, `async fn` is equivalent to writing a
function which returns a *future* of the return type, using the `impl Trait`
syntax we discussed back in the [“Traits as Parameters”][impl-trait] section in
Chapter 10. So these two are equivalent:

<!-- no-compile -->
```rust
fn hello_async() -> impl Future<Output = ()> {
println!("Hello, async!");
}
```

```rust
async fn hello_async() {
println!("Hello, async!");
}
```

That explains why we got the `unused_must_use` warning. The other part of the
warning was the note that we need to `.await` or poll the future. Rust’s `await`
keyword is a postfix keyword, meaning it goes *after* the expression you are
awaiting. (As of now, `await` is the only postfix keyword in the language.)
Let’s try that here:

```rust
fn main() {
hello_async().await;
}
```

Now we actually have a compiler error!

```text
error[E0728]: `await` is only allowed inside `async` functions and blocks
--> src/main.rs:2:19
|
1 | fn main() {
| ---- this is not `async`
2 | hello_async().await;
| ^^^^^ only allowed inside `async` functions and blocks
```

Okay, so we cannot actually use `.await` in `main`, because it is not an `async`
function itself—and it cannot be. To understand why, we need to pause to see
what a `Future` actually is and why it needs to be `.await`-ed or polled to do
anything.

### Understanding `Future`

Since `async fn` compiles down to a function that returns
`impl Future<Output = …>`, we know that `Future` is a trait with an associated
type `Output`. The other part of the trait is its one method, `poll`.
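
For reference, this is what the trait looks like; it mirrors the definition in
the standard library’s `std::future` module, and we will dig into `Pin`,
`Context`, and `Poll` later in the chapter:

```rust
use std::pin::Pin;
use std::task::{Context, Poll};

pub trait Future {
    /// The type the future resolves to when it completes.
    type Output;

    /// Try to make progress. Returns `Poll::Ready(value)` when the future has
    /// finished, or `Poll::Pending` if it is not done yet.
    fn poll(self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<Self::Output>;
}
```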

<!-- TODO -->

The other thing to notice here is that futures in Rust are *lazy*: they do not
do anything until you explicitly ask them to, whether by calling `poll` directly
or by using `.await`, which polls the future for you.

### Running Async Code

<!-- runtime and executor -->

Going back to `main`, this explains why we cannot have an `async fn main`: what
would execute the async code? We need to pick a runtime and executor. We can get
started easily by using the simple executor that comes bundled with the
`futures` crate, an official home for Rust async experimentation. Since we will
be using a number of tools from that crate for the rest of the chapter, let’s go
ahead and add it to the dependencies for our project:

```console
$ cargo add [email protected]
```

Now we can use the executor which comes with `futures` to run the code. The
`futures::executor::block_on` function takes in a `Future` and runs it until it
completes, one way or another.

```rust
use futures::executor;

fn main() {
    // block_on drives the future to completion on the current thread,
    // blocking until it finishes.
    executor::block_on(hello_async());
}

async fn hello_async() {
    println!("Hello, world!");
}
```

Now when we run this, we get the behavior we might have expected initially:

```console
$ cargo run
Compiling hello-async v0.1.0 (/Users/chris/dev/chriskrycho/async-trpl-fun/hello-async)
Finished dev [unoptimized + debuginfo] target(s) in 4.89s
Running `target/debug/hello-async`
Hello, world!
```

[impl-trait]: ch10-02-traits.html#traits-as-parameters
