A framework for iterating over collections of types implementing a trait without virtual dispatch

Overview

zero_v

Zero_V is an experiment in defining behavior over collections of objects implementing some trait without dynamic polymorphism. This is a small crate with some helper utilities, along with the zero_v macro which generates the boilerplate necessary for your traits and functions to support the collections generated by Zero_V.

It can be useful if all of the following are true:

  • Library users will always know the composition of the collection of types at compile time.
  • Library users should be able to alter collection composition easily.
  • Vtable overhead matters.

For example, let's imagine you've written an event logging library that allows users to extend it with plugins that alter events before logging. Using dynamic polymorphism/vtables, client code might look something like this:

let plugins: Vec<Box<dyn Plugin>> = vec![
    Box::new(TimestampReformatter::new()),
    Box::new(HostMachineFieldAdder::new()),
    Box::new(UserFieldAdder::new()),
];

let mut logger = EventLogger::with_plugins(plugins);

let events = EventStream::new();
for event in events.listen() { logger.log_event(event); }

This is a fine way to approach the problem in most cases. It's easy for clients to set up and you rarely care about the overhead of the virtual calls.
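
For context, the library side of that dynamic version might look something like the sketch below. Plugin, Event, EventLogger and the apply method are hypothetical names invented for this example, not part of zero_v:

// An event and a plugin that transforms it before logging.
#[derive(Debug)]
struct Event(String);

trait Plugin {
    fn apply(&self, event: Event) -> Event;
}

// The vtable-based logger: every plugin call goes through dynamic dispatch.
struct EventLogger {
    plugins: Vec<Box<dyn Plugin>>,
}

impl EventLogger {
    fn with_plugins(plugins: Vec<Box<dyn Plugin>>) -> Self {
        Self { plugins }
    }

    fn log_event(&self, mut event: Event) {
        for plugin in &self.plugins {
            event = plugin.apply(event);
        }
        println!("{:?}", event);
    }
}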

But if you do care about overhead, a Zero_V version of the above looks like:

use zero_v::{compose, compose_nodes};
let plugins = compose!(
    TimestampReformatter::new(),
    HostMachineFieldAdder::new(),
    UserFieldAdder::new()
);

let mut logger = EventLogger::with_plugins(plugins);

let events = EventStream::new();
for event in events.listen() { logger.log_event(event); }

To the client, the only real differences here are the use of the compose macro, the extra Zero_V imports, and the fact that each plugin in the collection no longer needs boxing. But internally, your type is now generic over a type defined without boxing or vtables, which encourages the compiler to monomorphise the plugin calls and remove the virtual function calls.

Adding zero_v to your project

To add zero_v to your project, add the following to your Cargo.toml under dependencies

[dependencies]
zero_v = "0.2.0"

If you're an end-user who just needs to create a collection, or you want to implement the boilerplate for your trait yourself (not recommended if you can avoid it, but you may need to if your trait involves generics or references as arguments), then you can use the crate without the zero_v attribute macro with:

[dependencies]
zero_v = { version = "0.2.0", default-features = false }

Implementing Zero_V for your type with the zero_v macro

If your trait doesn't involve arguments with lifetimes or generics then the zero_v macro gives you all of your boilerplate for free. All you need to do is drop it on top of your trait definition:

use zero_v::{zero_v};

// The trait_types argument tells the macro to generate all the traits you'll
// need for iteration over a collection of items implementing your trait.

#[zero_v(trait_types)]
trait IntOp {
    fn execute(&self, input: usize) -> usize;
}

// Passing the fn_generics arguments to the macro tells it to produce proper
// generic bounds for the second argument. The second argument takes the form
// <X> as <Y> where <X> is the name of your trait and <Y> is the name you're
// giving to the generic parameter which can accept a zero_v collection.

#[zero_v(fn_generics, IntOp as IntOps)]
fn sum_map(input: usize, ops: &IntOps) -> usize {
    ops.iter_execute(input).sum()
}
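
With those two attributes in place, client code can build a collection with the compose macro and pass it straight to sum_map. Below is a minimal usage sketch; Adder and Doubler are hypothetical IntOp implementors invented for this example:

use zero_v::{compose, compose_nodes};

// Two throwaway IntOp implementors for the sake of the example.
struct Adder(usize);
struct Doubler;

impl IntOp for Adder {
    fn execute(&self, input: usize) -> usize { input + self.0 }
}

impl IntOp for Doubler {
    fn execute(&self, input: usize) -> usize { input * 2 }
}

fn main() {
    // compose! builds a statically typed collection: no boxing, no vtables.
    let ops = compose!(Adder(1), Doubler, Adder(10));
    // sum_map sums the output of every op in the collection.
    assert_eq!(sum_map(3, &ops), 4 + 6 + 13);
}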

Implementing Zero_V for your type manually

To enable Zero_V, you'll need to add a pretty large chunk of boilerplate to your library. This code walks you through it step by step for a simple example.

use zero_v::{Composite, NextNode, Node};

// This is the trait we want members of client collections to implement.
trait IntOp {
    fn execute(&self, input: usize) -> usize;
}

// First, you'll need a level execution trait. It will have one method
// which extends the signature of your trait's core function with an extra
// parameter of type usize (called level here) and wraps the output in an
// Option (these changes will allow us to return the outputs of the function
// from an iterator over the collection).
trait IntOpAtLevel {
    fn execute_at_level(&self, input: usize, level: usize) -> Option<usize>;
}

// You'll need to implement this level execution trait for two types.
// The first type is Node<A, B>, where A implements your basic trait and B
// implements the level execution trait and NextNode. For this type, just
// copy the body of the function below, updating the contents of the if/else
// blocks with the signature of your trait's function.
impl<A: IntOp, B: IntOpAtLevel + NextNode> IntOpAtLevel for Node<A, B> {
    fn execute_at_level(&self, input: usize, level: usize) -> Option<usize> {
        if level == 0 {
            Some(self.data.execute(input))
        } else {
            self.next.execute_at_level(input, level - 1)
        }
    }
}
// The second type is the unit type. For this implementation, just return None.
impl IntOpAtLevel for () {
    fn execute_at_level(&self, _input: usize, _level: usize) -> Option<usize> {
        None
    }
}

// Next you'll need to create an iterator type for collections implementing
// your trait. The iterator will have one field for each argument to your
// trait's function, along with a level field and a parent reference to
// a type implementing your level execution trait and NextNode.
struct CompositeIterator<'a, Nodes: NextNode + IntOpAtLevel> {
    level: usize,
    input: usize,
    parent: &'a Nodes,
}

// Giving your iterator a constructor is optional.
impl<'a, Nodes: NextNode + IntOpAtLevel> CompositeIterator<'a, Nodes> {
    fn new(parent: &'a Nodes, input: usize) -> Self {
        Self {
            parent,
            input,
            level: 0,
        }
    }
}

// You'll need to implement Iterator for the iterator you just defined.
// The item type will be the return type of your function. For next, just
// copy the body of next below, replacing execute_at_level with the
// signature of your execute_at_level function.
impl<'a, Nodes: NextNode + IntOpAtLevel> Iterator for CompositeIterator<'a, Nodes> {
    type Item = usize;

    fn next(&mut self) -> Option<Self::Item> {
        let result = self.parent.execute_at_level(self.input, self.level);
        self.level += 1;
        result
    }
}

// Almost done. Now you'll need to define a trait returning your iterator
// type.
trait IterExecute<Nodes: NextNode + IntOpAtLevel> {
    fn iter_execute(&self, input: usize) -> CompositeIterator<'_, Nodes>;
}

// Finally, implement your iterator-returning trait for Composite<Nodes>, where
// Nodes is bound by NextNode and your level execution trait.
impl<Nodes: NextNode + IntOpAtLevel> IterExecute<Nodes> for Composite<Nodes> {
    fn iter_execute(&self, input: usize) -> CompositeIterator<'_, Nodes> {
        CompositeIterator::new(&self.head, input)
    }
}
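
With all of that boilerplate in place, client code can compose a collection and iterate it through iter_execute. Here's a quick sketch under the same assumptions as above; Square and AddTen are hypothetical IntOp implementors invented for this example:

use zero_v::{compose, compose_nodes};

// Hypothetical implementors, just to exercise the boilerplate above.
struct Square;
struct AddTen;

impl IntOp for Square {
    fn execute(&self, input: usize) -> usize { input * input }
}

impl IntOp for AddTen {
    fn execute(&self, input: usize) -> usize { input + 10 }
}

fn main() {
    let ops = compose!(Square, AddTen);
    // iter_execute yields one output per element of the collection.
    let outputs: Vec<usize> = ops.iter_execute(3).collect();
    assert_eq!(outputs.len(), 2);
    assert_eq!(outputs.iter().sum::<usize>(), 9 + 13);
}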

Benchmarks

Some example benchmarks for Zero_V are captured below. The benchmark source takes two sets of objects implementing a simple trait transforming a usize to another usize (one set taking a constructor argument and one using a small extra const optimization), then implements the benchmark with one dynamic collection (the standard vtable way) and one static collection (using Zero_V) for each of those sets. Results are given below; hardware was a Lenovo T430 and the benchmarks were compiled with rustc 1.52.1, so your mileage may vary.

[Benchmark results chart]

Zero_V comes out of this benchmark looking pretty good, but I do want to stress the following caveats.

  • This was using a trait where each iteration of the loop did a very small amount of work (a single multiplication, addition, rshift or lshift op). Basically this means that these benchmarks should make Zero_V look as good as it will ever look, since the vtable overhead will be as large as possible relative to the amount of work per iteration.
  • Every use case is different, every machine is different and compilers can be fickle. If performance is important enough to pay the structural costs this technique will impose on your code, it's probably important enough to verify you're getting the expected speedups by running your own benchmark suite, and making sure those benchmarks are reflected in production. The benchmarks above also make aggressive use of inline annotations for trait implementations, and removing a single annotation can make execution three times slower, so it can be worth exploring inlining for your own use case (depending on your performance needs).
  • The eagle-eyed amongst you might notice there's a fifth benchmark, baseline, that completes in a little over a nanosecond. This is the version of the benchmarks that dispenses with traits and objects and just has a single function which performs the work we're doing in the other benchmarks (execute a set of integer ops on our input and sum the outputs). Depending on your use case, it might be a good idea to design your APIs so that anyone who wants to hardcode an optimized solution like that has the tools to do so (a rough sketch of that kind of baseline follows this list). If you're good to your compiler, your compiler will be good to you (occasional compiler bugs notwithstanding).
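
The sketch below shows roughly what such a baseline might look like; the specific integer ops are made up for illustration, not taken from the benchmark source:

// No traits, no collections: the ops are applied directly and summed.
fn baseline(input: usize) -> usize {
    (input * 3) + (input + 7) + (input >> 1) + (input << 2)
}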

License: MIT OR Apache-2.0

You might also like...
Detect if code is running inside a virtual machine (x86 and x86-64 only).

inside-vm Detect if code is running inside a virtual machine. Only works on x86 and x86-64. How does it work Measure average cpu cycles when calling c

Create virtual serial ports, connect them to physical serial ports, and create routes between them all.

Virtual Serial Port Router (vsp-router) Create virtual serial ports, connect them to physical serial ports, and create routes between them all. vsp-ro

Membrane is an opinionated crate that generates a Dart package from a Rust library. Extremely fast performance with strict typing and zero copy returns over the FFI boundary via bincode.

Log defmt messages over the serial port.

defmt-serial A defmt target for logging over a serial port. Messages can e.g. be read using socat and passed through defmt-print, see example-artemis

A lean, minimal, and stable set of types for color interoperation between crates in Rust.

This library provides a lean, minimal, and stable set of types for color interoperation between crates in Rust. Its goal is to serve the same function that mint provides for (linear algebra) math types.

Utilities to gather data out of roms. Written in Rust. It (should) support all types.

snesutilities Utilities to gather data out of roms. Written in Rust. It (should) support all types. How Have a look at main.rs: use snesutilities::Sne

This crate allows you to safely initialize Dynamically Sized Types (DST) using only safe Rust.

Time related types (and conversions) for scientific and astronomical usage.

astrotime Time related types (and conversions) for scientific and astronomical usage. This library is lightweight and high performance. Features The f

Showcase for pathological compile times when using knuffel / chumsky / VERY LARGE types

netherquote Showcase for pathological compile times when using knuffel / chumsky / VERY LARGE types. How to reproduce The rust toolchain version is pi

Comments
  • Add zero_v_gen! and composed! proc macros

    Add zero_v_gen! and composed! proc macros

    heya @fergaljoconnor, I really enjoyed playing with this library, and decided to add a proc-macro that generates much of the necessary boilerplate to make it more ergonomic to implement.

    The basic idea is that a user can now tag their trait with #[zero_v_gen], and then implementations of the trait can easily be used with the compose! macro, i.e.:

    #[zero_v_gen]
    pub trait MyTrait {
      fn do_something(&self) -> i32;
    }
    
    pub struct MyImpl;
    impl MyTrait for MyImpl {
      fn do_something(&self) -> i32 { 0 }
    }
    
    fn main() {
      let impls = compose!(MyImpl, MyImpl);
      let results: Vec<i32> = impls.iter_do_something().collect();
      println!("{:?}", results);
    }
    

    I also added a new composed proc-macro that makes writing a function that accepts a composite list much simpler, removing the need for using generated type generics. Here's an example from my PR:

    #[composed(IntOp as IntOps)]
    fn bench_composed(input: usize, ops: &IntOps) -> usize {
        ops.iter_execute(input).sum()
    }
    

    I updated all the tests and example to use the new macros, and all are passing.

    let me know if you like the basic idea, I don't consider this code "done" yet -- many of the conventions / API choices were mostly aesthetic on my part and I'd be happy to change things if you see them fitting in better.

    anyway, I wanted to show it to you and get feedback before I put much more effort in :)

    opened by marshall 10
  • zero_v_gen: add better support for user defined lifetimes and generics

    zero_v_gen: add better support for user defined lifetimes and generics

    As correctly pointed out in the docs, right now the zero_v proc macro doesn't work for traits or functions that have user-defined generics. This should be fixable.

    opened by marshall 0
  • Add a len() method to composites

    Add a len() method to composites

    Right now, you don't get to mix and match methods or inputs during iteration. So if you have a trait that looks something like:

    trait Operation {
        fn should_change_input(&self, input: &usize) -> bool;
        fn change(&self, input: &mut usize);
    }
    

    And you want to do something like the equivalent of:

        for op in ops {
            if op.should_change_input(&input) {
                op.change(&mut input)
            }
        }
    

    It's pretty difficult to do this in the current version of zero_v without hacks.

    However, if the collection has a len method, then you can do something like:

        for i in 0..ops.len() {
            if ops.should_change_input_at_level(&input, i) {
                ops.change_at_level(&mut input, i)
            }
        }
    

    Which should make it easier to integrate more use cases.

    Implementation might look like:

    trait HasLength {
        fn len(&self) -> usize;
    }
    
    impl<A, B: HasLength> HasLength for Node<A, B> {
        fn len(&self) -> usize {
            self.next.len() + 1
        }
    }
    
    impl HasLength for () {
        fn len(&self) -> usize {
            0
        }
    }
    
    impl<Nodes: HasLength> HasLength for Composite<Nodes> {
        fn len(&self) -> usize {
            self.head.len()
        }
    }
    
    
    opened by fergaljoconnor 0
  • Add an example to the readme showing the API for using a zero_v collection from within your code more clearly

    Add an example to the readme showing the API for using a zero_v collection from within your code more clearly

    Ideally, demonstrate:

    • A trait with a couple of different functions with different signatures to emphasise the (*args: *types, level: usize) pattern for people who learn by example.
    • How to implement the required generics manually.
    documentation 
    opened by fergaljoconnor 0
Releases(v0.2.0)
  • v0.2.0(Jun 14, 2021)

    Summary

    This release adds the zero_v macro. If your trait doesn't need generics or explicit lifetimes, then tagging it with this attribute will generate the iterator traits for you. You can also use it to tag a function, which generates the extra generic bounds your function needs to accept a zero_v composite. This means all the boilerplate which the previous release required can be replaced with something like this:

    use zero_v::{zero_v};
    
    // The trait_types argument tells the macro to generate all the traits you'll
    // need for iteration over a collection of items implementing your trait.
    
    #[zero_v(trait_types)]
    trait IntOp {
        fn execute(&self, input: usize) -> usize;
    }
    
    // Passing the fn_generics arguments to the macro tells it to produce proper
    // generic bounds for the second argument. The second argument takes the form
    // <X> as <Y> where <X> is the name of your trait and <Y> is the name you're
    // giving to the generic parameter which can accept a zero_v collection.
    
    #[zero_v(fn_generics, IntOp as IntOps)]
    fn sum_map(input: usize, ops: &IntOps) -> usize {
        ops.iter_execute(input).sum()
    }
    

    Acknowledgements

    Thanks to the following contributors:

    • @marshall for contributing the zero_v macro.
Minimal, flexible framework for implementing solutions to Advent of Code in Rust

This is advent_of_code_traits, a minimal, flexible framework for implementing solutions to Advent of Code in Rust.

David 8 Apr 17, 2022
A stack-allocated box that stores trait objects.

This crate allows saving DST objects in the provided buffer. It allows users to create global dynamic objects on a no_std environment without a global allocator.

Aleksey Sidorov 19 Dec 13, 2022
A stack for rust trait objects that minimizes allocations

dynstack A stack for trait objects that minimizes allocations COMPATIBILITY NOTE: dynstack relies on an underspecified fat pointer representation. Tho

Gui Andrade 114 Nov 28, 2022
Generate enum from a trait, with converters between them

Derive macro for Rust that turns traits into enums, providing tools for calling funtions over channels

Vitaly Shukela 16 Nov 3, 2022
A lending version of the `Stream` trait

lending-stream A lending version of Stream API Docs | Releases | Contributing Installation $ cargo add lending-stream Safety This crate uses #![deny(u

Yosh 5 Aug 14, 2023
A Rust framework to develop and use plugins within your project, without worrying about the low-level details.

VPlugin: A plugin framework for Rust. Website | Issues | Documentation VPlugin is a Rust framework to develop and use plugins on applications and libr

VPlugin 11 Dec 31, 2022
An unofficial and incomplete no_std Rust library for implementing the ElectricUI Binary Protocol

Jon 2 Mar 29, 2022
Derive macro implementing 'From' for structs

derive-from-ext A derive macro that auto implements 'std::convert::From' for structs. The default behaviour is to create an instance of the structure

Andrew Lowndes 4 Sep 18, 2022
An ActivityPub home server written in Rust, implementing the Mastodon API.

Tafarn An ActivityPub home server written in Rust, implementing the Mastodon API. At present no web UI is provided, the API is the only way to interac

✨ Q (it/its) ✨ 12 Jan 22, 2023
A xdg-desktop-portal for wlroots based compositors implementing zwlr_screencopy

xdg-desktop-portal-luminous An alternative to xdg-desktop-portal-wlr for wlroots compositors. This project is a stand alone binary and does not depend

Waycrate 7 Aug 27, 2023