• This repository has been archived on 03/Mar/2020
• Stars: 733
• Rank: 59,385 (Top 2%)
• Language: Rust
• License: Apache License 2.0
• Created: almost 7 years ago
• Updated: about 4 years ago


futures-await

This crate is no longer necessary now that async/await is formally part of the Rust language. Thanks to everyone who helped make this a reality!

Async/await syntax for Rust and the futures crate

What is this?

The primary way of working with futures today in Rust is through the various combinators on the Future trait. This is not quite "callback hell" but can sometimes feel like it as the rightward drift of code increases for each new closure you tack on. The purpose of async/await is to provide a much more "synchronous" feeling to code while retaining all the power of asynchronous code!
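
For contrast, here is a rough sketch of what the same operation tends to look like with the futures 0.1 combinators alone. The hyper-like API and the io::Error conversions here are assumed for illustration, mirroring the async example below rather than the exact hyper signatures:

// Rough combinator-based equivalent (API details assumed, not exact):
// each step is another closure, and the chain drifts rightward as logic grows.
fn fetch_rust_lang(client: hyper::Client) -> Box<Future<Item = String, Error = io::Error>> {
    Box::new(
        client
            .get("https://www.rust-lang.org")
            .and_then(|response| {
                if !response.status().is_success() {
                    return Err(io::Error::new(io::ErrorKind::Other, "request failed"));
                }
                Ok(response)
            })
            .and_then(|response| response.body().concat())
            .and_then(|body| {
                String::from_utf8(body).map_err(|e| io::Error::new(io::ErrorKind::Other, e))
            }),
    )
}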

Here's a small taste of what this crate does:

#[async]
fn fetch_rust_lang(client: hyper::Client) -> io::Result<String> {
    let response = await!(client.get("https://www.rust-lang.org"))?;
    if !response.status().is_success() {
        return Err(io::Error::new(io::ErrorKind::Other, "request failed"))
    }
    let body = await!(response.body().concat())?;
    let string = String::from_utf8(body)?;
    Ok(string)
}

The notable points here are:

  • Functions are tagged with #[async], which means that they are asynchronous and, when called, return a future instead of the listed result.
  • The await! macro allows waiting for a future to complete. This does not actually block the thread, though; it only "blocks" the future returned from fetch_rust_lang from continuing. You can think of the await! macro as a function from a future to its Result, consuming the future along the way.
  • Error handling is much more natural here in #[async] functions. We get to use the ? operator as well as simple return Err statements. With the combinators in the futures crate this is generally much more cumbersome.
  • The future returned from fetch_rust_lang is actually a state machine generated at compile time and does no implicit memory allocation. This is built on top of generators, described below.
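
To make the first point concrete, here is a minimal sketch of driving that future to completion. The caller is hypothetical; .wait() (from futures 0.1) is simply the easiest way to block the current thread on a future, and an event loop would normally be used instead:

// Hypothetical caller: calling the #[async] fn only constructs the future;
// nothing runs until it is polled, e.g. via `.wait()` or an executor.
fn fetch_blocking(client: hyper::Client) -> io::Result<String> {
    fetch_rust_lang(client).wait()
}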

You can also have async methods:

impl Foo {
    #[async]
    fn do_work(self) -> io::Result<u32> {
        // ...
    }
}

You can specify that you would prefer a trait object to be returned instead, e.g. Box<Future<Item = i32, Error = io::Error>>:

#[async(boxed)]
fn foo() -> io::Result<i32> {
    // ...
}

You can also have "async for loops" which operate over the Stream trait:

#[async]
for message in stream {
    // ...
}

An async for loop will propagate errors out of the function, so message has the Item type of the stream passed in. Note that async for loops can only be used inside of an #[async] function.
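
As a minimal sketch, assuming a boxed Stream of u32 values with an io::Error error type:

#[async]
fn sum_stream(items: Box<Stream<Item = u32, Error = io::Error>>) -> io::Result<u32> {
    let mut total = 0;
    #[async]
    for item in items {
        // `item` is the stream's `Item` type (`u32`); an error from the stream
        // propagates out of `sum_stream` early.
        total += item;
    }
    Ok(total)
}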

And finally, you can create a Stream instead of a Future via #[async_stream(item = _)]:

#[async]
fn fetch(client: hyper::Client, url: &'static str) -> io::Result<String> {
    // ...
}

/// Fetch all provided urls one at a time
#[async_stream(item = String)]
fn fetch_all(client: hyper::Client, urls: Vec<&'static str>) -> io::Result<()> {
    for url in urls {
        let s = await!(fetch(client.clone(), url))?;
        stream_yield!(s);
    }
    Ok(())
}

#[async_stream] must have an item type specified via item = some::Path, and the values output from the stream must be wrapped into a Result and yielded via the stream_yield! macro. The attribute also supports the same features as #[async]: an additional boxed argument to return a Box<Stream>, async for loops, etc.
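
For example, a sketch of the boxed form, which hands back a Box<Stream> instead of an anonymous type. The exact way boxed combines with item in the attribute is an assumption based on the description above:

#[async_stream(boxed, item = String)]
fn fetch_all_boxed(client: hyper::Client, urls: Vec<&'static str>) -> io::Result<()> {
    for url in urls {
        // Reuses the hypothetical `fetch` shown above.
        let s = await!(fetch(client.clone(), url))?;
        stream_yield!(s);
    }
    Ok(())
}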

How do I use this?

This implementation is currently fundamentally built on generators/coroutines. This feature has just landed and will be in the nightly channel of Rust as of 2017-08-29. You can acquire the nightly channel via:

rustup update nightly

After doing this you'll want to ensure that Cargo's using the nightly toolchain by invoking it like:

cargo +nightly build

Next you'll add this to your Cargo.toml:

[dependencies]
futures-await = "0.1"

and then...

#![feature(proc_macro, generators)]

extern crate futures_await as futures;

use futures::prelude::*;

#[async]
fn foo() -> Result<i32, i32> {
    Ok(1 + await!(bar())?)
}

#[async]
fn bar() -> Result<i32, i32> {
    Ok(2)
}

fn main() {
    assert_eq!(foo().wait(), Ok(3));
}

This crate is intended to "masquerade" as the futures crate, reexporting its entire hierarchy and just augmenting it with the necessary runtime support, the async attribute, and the await! macro. These imports are all contained in futures::prelude::* when you import it.

For a whole mess of examples in a whole mess of code, you can also check out the async-await branch of sccache, which is an in-progress transition to using async/await syntax in many locations. You'll typically find that the code is much more readable afterwards, especially these changes.

Technical Details

As mentioned before, this crate is fundamentally built on the generators feature in Rust. Generators, otherwise known in this case as stackless coroutines, allow the compiler to transform a synchronous-looking block of code into a state machine that implements Future and can execute asynchronously. The desugaring from the code you write to the code rustc compiles is surprisingly straightforward, and you can browse the source of the futures-async-macro crate for more details here.
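
To get a feel for that mapping, here is a highly simplified sketch (not the crate's actual internals) of how a generator-like state machine can back a futures 0.1 Future: each yield corresponds to "not ready yet", and the generator's return value completes the future. The SimpleGenerator trait below is a stand-in for the real unstable Generator trait.

extern crate futures;

use futures::{Async, Future, Poll};

// Stand-in for the unstable generator trait this crate actually builds on.
enum GenState<T> {
    Yielded,      // hit an `await!` point; the awaited future wasn't ready
    Complete(T),  // the body ran to completion with this value
}

trait SimpleGenerator {
    type Return;
    fn resume(&mut self) -> GenState<Self::Return>;
}

// Adapter turning such a generator into a futures 0.1 `Future`.
struct GenFuture<G>(G);

impl<G, T, E> Future for GenFuture<G>
where
    G: SimpleGenerator<Return = Result<T, E>>,
{
    type Item = T;
    type Error = E;

    fn poll(&mut self) -> Poll<T, E> {
        match self.0.resume() {
            GenState::Yielded => Ok(Async::NotReady),
            GenState::Complete(Ok(value)) => Ok(Async::Ready(value)),
            GenState::Complete(Err(e)) => Err(e),
        }
    }
}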

Otherwise there are a few primary "APIs" provided by this crate:

  • #[async] - this attribute can be applied to methods and functions to signify that it's an asynchronous function. The function's signature must return a Result of some form (although it can return a typedef of results). Additionally, the function's arguments must all be owned values, or in other words must contain no references. This restriction may be lifted in the future!

    Some examples are:

    // attribute on a bare function
    #[async]
    fn foo(a: i32) -> Result<u32, String> {
        // ...
    }
    
    // returning a typedef works too!
    #[async]
    fn foo() -> io::Result<u32> {
        // ...
    }
    
    impl Foo {
        // methods also work!
        #[async]
        fn foo(self) -> io::Result<u32> {
            // ...
        }
    }
    
    // For now, however, these do not work, they both have arguments which contain
    // references! This may work one day, but it does not work today.
    //
    // #[async]
    // fn foo(a: &i32) -> io::Result<u32> { /* ... */ }
    //
    // impl Foo {
    //     #[async]
    //     fn foo(&self) -> io::Result<u32> { /* ... */ }
    // }

    Note that an #[async] function is intended to behave very similarly to its synchronous version. For example the ? operator works internally, you can use an early return statement, etc.

    Under the hood an #[async] function is compiled to a state machine which represents a future for this function, and the state machine's suspension points will be where the await! macro is located.

  • async_block! - this macro is similar to #[async] in that it creates a future, but unlike #[async] it can be used in an expression context and doesn't need a dedicated function. This macro can be considered as "run this block of code asynchronously" where the block of code provided, like an async function, returns a result of some form. For example:

    let server = TcpListener::bind(..);
    let future = async_block! {
        #[async]
        for connection in server.incoming() {
            spawn(handle_connection(connection));
        }
        Ok(())
    };
    core.run(future).unwrap();

    or if you'd like to do some precomputation in a function:

    fn hash_file(path: &Path, pool: &CpuPool)
        -> impl Future<Item = u32, Error = io::Error>
    {
        let abs_path = calculate_abs_path(path);
        async_block! {
            let contents = await!(read_file(abs_path))?;
            Ok(hash(&contents))
        }
    }
  • await! - this is a macro provided in the futures-await-macro crate which allows waiting on a future to complete. The await! macro can only be used inside of an #[async] function or an async_block! and can be thought of as a function that looks like:

    fn await!<F: Future>(future: F) -> Result<F::Item, F::Error> {
        // magic
    }

    Some examples of this macro are:

    #[async]
    fn fetch_url(client: hyper::Client, url: String) -> io::Result<Vec<u8>> {
        // note the trailing `?` to propagate errors
        let response = await!(client.get(url))?;
        await!(response.body().concat())
    }
  • #[async] for loops - the ability to iterate asynchronously over a Stream. You can do this by attaching the #[async] attribute to a for loop where the object being iterated over implements the Stream trait.

    Errors from the stream will get propagated automatically; otherwise, the for loop will exhaust the stream to completion, binding each element to the pattern provided.

    Some examples are:

    #[async]
    fn accept_connections(listener: TcpListener) -> io::Result<()> {
        #[async]
        for connection in listener.incoming() {
            // `connection` here has type `TcpStream`
        }
        Ok(())
    }

    Note that an #[async] for loop, like await!, can only be used in an async function or an async block.

  • #[async_stream(item = ...)] - defines a function which returns an implementation of Stream rather than Future. This function uses the stream_yield! macro to yield items, and otherwise works with the await! macro and #[async] for loops.

    The declared function must return a Result<(), E> where E becomes the error of the Stream returned. The stream is terminated by returning Ok(()) from the function or returning an error. Operations like ? work inside the function for propagating errors as well.

    An example is:

    #[async_stream(item = String)]
    fn fetch_descriptions() -> io::Result<()> {
        #[async]
        for object in fetch_all_objects() {
            let description = await!(fetch_object_description(object))?;
            stream_yield!(description);
        }
        Ok(())
    }

Nightly features

Right now this crate requires two nightly features to be used:

  • #![feature(generators)] - this is an experimental language feature that has yet to be stabilized but is the foundation for the implementation of async/await. The implementation landed recently, and progress on stabilization can be found on its tracking issue.

  • #![feature(proc_macro)] - this has also been dubbed "Macros 2.0" and is what allows the #[async] attribute to be defined in this crate rather than in the compiler itself. We also take advantage of other Macros 2.0 features, like importing macros via use instead of #[macro_use]. Tracking issues for this feature include #38356 and #35896.

What's next?

This crate is still quite new and generators have only just landed on the nightly channel for Rust. While no major change is planned to this crate at this time, we'll have to see how this all pans out! One of the major motivational factors for landing generators so soon in the language was to transitively empower this implementation of async/await, and your help is needed in assessing that!

Notably, we're looking for feedback on async/await in this crate itself as well as generators as a language feature. We want to be sure that if we were to stabilize generators or procedural macros they're providing the optimal async/await experience!

At the moment you're encouraged to exercise caution when using this in production. This crate itself has only been very lightly tested, and the generators language feature has also only been lightly tested so far. Keep your eyes peeled for bugs! If you have any questions or run into any issues, you're more than welcome to open an issue!

Caveats

As can be expected with many nightly features, there are a number of caveats to be aware of when working with this project. Despite this, bug reports and experience reports are more than welcome; it's always good to know what needs to be fixed!

Borrowing

Borrowing doesn't really work so well today. The compiler will reject many borrows, and there may be some unsafety lurking while the generators feature is still being developed. For example, if you have a function such as:

#[async]
fn foo(s: &str) -> io::Result<()> {
    // ..
}

This may not compile! The reason for this is that the returned future typically needs to adhere to the 'static bound. Async functions currently execute no code when called; they only make progress when polled. This means that when you call an async function it creates a future, packages up the arguments, and returns that to you. In this case the future would have to carry the &str along with it, and the lifetimes don't always work out.

An example of how to get the above function compiling would be to do:

#[async]
fn foo(s: String) -> io::Result<()> {
    // ...
}

or somehow otherwise use an owned value instead of a borrowed reference.

Note that arguments are not the only point of pain with borrowing. For example, code like this will not (or at least shouldn't) compile today:

for line in string.lines() {
    await!(process(line));
    println!("processed: {}", line);
}

The problem here is that line is a borrowed value that is alive across a yield point, namely the call to await!. The future may return back up the stack as part of the await! process, and it would then have to restore the line variable when it re-enters the future. This isn't really all implemented yet and may be unsafe today. As a rule of thumb, for now you'll need to have only owned values (with no borrowed internals) alive across calls to await! or during async for loops.
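
As a sketch of the workaround under that rule of thumb (process here is hypothetical, as above): make owned copies up front so that no borrow of string is held across the await! point:

// Collect owned Strings first so the loop no longer borrows `string`.
let lines: Vec<String> = string.lines().map(|l| l.to_string()).collect();
for line in lines {
    // `line` is owned, and only an owned value is passed into `process`,
    // so nothing borrowed lives across the `await!` suspension point.
    await!(process(line.clone()));
    println!("processed: {}", line);
}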

Lots of thought is being put in to figure out how to alleviate this restriction! Borrowing is the crux of many ergonomic patterns in Rust, and we'd like this to work!

As one final point, a consequence of the "no borrowed arguments" restriction today is that function signatures like:

#[async]
fn foo(&self) -> io::Result<()> {
    // ...
}

unfortunately will not work. You'll either need to take self by value or defer to a different #[async] function.
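
A sketch of the second option with hypothetical types: the &self method clones the owned state it needs and defers to an #[async] function that takes only owned values (the surrounding imports and hyper usage are assumed, as in earlier examples):

struct Foo {
    client: hyper::Client,
}

impl Foo {
    // Plain (non-async) method: clone the owned state and defer.
    fn foo(&self) -> impl Future<Item = (), Error = io::Error> {
        do_foo(self.client.clone())
    }
}

// All the async work happens here, with owned arguments only.
#[async]
fn do_foo(client: hyper::Client) -> io::Result<()> {
    // ...
    Ok(())
}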

Futures in traits

Let's say you've got a trait like so:

trait MyStuff {
    fn do_async_task(??self) -> Box<Future<...>>;
}

We'll gloss over the self details here for a bit, but in essence we've got a function in a trait that wants to return a future. Unfortunately it's actually quite difficult to use this! Right now there are a few caveats:

  • Ideally you want to tag this #[async] but this doesn't work because a trait function returning impl Future is not implemented in the compiler today. I'm told that this will eventually work, though!
  • Ok so then the next best thing is #[async(boxed)] to return a boxed trait object instead of impl Future for the meantime. This may not be quite what we want runtime-wise but we also have problems with...
  • But now this brings us to the handling of self. Because of the limitations of #[async] today we only have two options, self and self: Box<Self>. The former is unfortunately not object safe (now we can't use virtual dispatch with this trait) and the latter is typically wasteful (every invocation now requires a fresh allocation). Ideally self: Rc<Self> is exactly what we want here! But unfortunately this isn't implemented in the compiler 😦

So basically in summary you've got one of two options to return futures in traits today:

trait MyStuff {
    // Trait is not object safe because of `self` so can't have virtual
    // dispatch, and the allocation of `Box<..>` as a return value is required
    // until the compiler implements returning `impl Future` from traits.
    //
    // Note that the upside of this approach, though, is that `self` could be
    // something like `Rc` or have a bunch of `Rc` inside of `self`, so this
    // could be cheap to call.
    #[async]
    fn do_async_task(self) -> Box<Future<...>>;
}

or the alternative:

trait MyStuff {
    // Like above we return a trait object, but here the trait is indeed object
    // safe, allowing virtual dispatch. The downside is that we must have a
    // `Box` on hand every time we call this function, which may be costly in
    // some situations.
    #[async]
    fn do_async_task(self: Box<Self>) -> Box<Future<...>>;
}

The ideal end goal for futures-in-traits is this:

trait MyStuff {
    #[async]
    fn do_async_task(self: Rc<Self>) -> Result<i32, u32>;
}

but this needs two pieces to be implemented:

  • The compiler must accept trait functions returning impl Trait
  • The compiler needs support for self: Rc<Self>, basically object-safe custom smart pointers in traits.

You can emulate this today with a non-object-safe implementation via:

trait Foo {
    #[async]
    fn do_async_task(me: Rc<Self>) -> Result<i32, u32>;
}

fn foo<T: Foo>(t: Rc<T>) {
    let x = Foo::do_async_task(t);
    // ...
}

But that's not exactly the most ergonomic!

Associated types

Another limitation when using futures with traits is with associated types. Say you've got a trait like:

trait Service {
    type Request;
    type Response;
    type Error;
    type Future: Future<Item = Self::Response, Error = Self::Error>;

    fn call(&self, req: Self::Request) -> Self::Future;
}

If you want to implement call with async_block!, or by returning a future from another function which was generated with #[async], you'd probably want to use impl Future. Unfortunately, it's not currently possible to express an associated type like Service::Future with impl Trait.

For now the best solution is to use Box<Future<...>>:

impl Service for MyStruct {
    type Request = ...;
    type Response = ...;
    type Error = ...;
    type Future = Box<Future<Item = Self::Response, Error = Self::Error>>;

    fn call(&self, req: Self::Request) -> Self::Future {
        // ...
        Box::new(future)
    }
}

License

This project is licensed under either of

  • Apache License, Version 2.0 (LICENSE-APACHE or http://www.apache.org/licenses/LICENSE-2.0)
  • MIT license (LICENSE-MIT or http://opensource.org/licenses/MIT)

at your option.

Contribution

Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in futures-await by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.
