Mushin: Compile-time creation of neural networks


Mushin is a Japanese term used in martial arts that refers to a state of mind achieved through practice. At this point, a person relies not on what they think the next move should be, but on their trained, natural reaction (or instinct).

Description

Mushin allows the developer to build neural networks at compile time, using preallocated arrays with well-defined sizes. This has three major benefits:

  1. Compile-time network consistency check: Any defect in your neural network (e.g. mismatched layer inputs/outputs) is caught at compile time. You can enjoy your coffee while your network inference or training never fails!
  2. Awesome Rust compiler optimizations: Because the neural network is fully defined at compile time, the compiler can perform smart optimizations, like unrolling loops or injecting SIMD instructions.
  3. Support for embedded: The std library is not required to build neural networks, so they can run on any target Rust supports.

Usage

Add this to your Cargo.toml:

[dependencies]
mushin = "0.1"
mushin_derive = "0.1"

And here is a simple example to get you started:

use rand::distributions::Uniform;

use mushin::{activations::ReLu, layers::Dense, NeuralNetwork};
use mushin_derive::NeuralNetwork;

// Builds a neural network with 2 inputs and 1 output
// Made of 3 feed forward layers, you can have as many as you want and with any name
#[derive(NeuralNetwork, Debug)]
struct MyNetwork {
    // LayerType<ActivationType, # inputs, # outputs>
    input: Dense<ReLu, 2, 4>,
    hidden: Dense<ReLu, 4, 2>,
    output: Dense<ReLu, 2, 1>,
}

impl MyNetwork {
    // Initialize layer weights with a uniform distribution and set ReLU as activation function
    fn new() -> Self {
        let mut rng = rand::thread_rng();
        let dist = Uniform::from(-1.0..=1.0);

        MyNetwork {
            input: Dense::random(&mut rng, &dist),
            hidden: Dense::random(&mut rng, &dist),
            output: Dense::random(&mut rng, &dist),
        }
    }
}

fn main() {
    // Init the weights and perform a forward pass
    let nn = MyNetwork::new();
    println!("{:#?}", nn);

    let input = [0.0, 1.0];
    println!("Input: {:#?}", input);
    let output = nn.forward(input);
    println!("Output: {:#?}", output);
}

You may wonder how the forward method works. The NeuralNetwork derive macro defines it for you, and it looks like this for this particular example:

fn forward(&self, input: [f32; 2]) -> [f32; 1] {
    self.output.forward(self.hidden.forward(self.input.forward(input)))
}

Note how the forward method expects two input values, because that's what the first (input) layer expects, and returns a single value, because that's what the last (output) layer returns.

Roadmap

  • Compile-time neural network consistency check
  • Docs, CI/CD & Benchmarks
  • Backward pass
  • More layer types (convolution, dropout, lstm...)
  • More activation functions (sigmoid, softmax...)
  • Maaaybeee, CPU and/or GPU concurrency

Contributing

If you find a vulnerability or bug, or would like a new feature, open a new issue.

To introduce your changes into the codebase, submit a Pull Request.

Many thanks!

License

Mushin is distributed under the terms of both the MIT license and the Apache License (Version 2.0).

See LICENSE-APACHE and LICENSE-MIT, and COPYRIGHT for details.

Comments
  • RUSTSEC-2020-0071: Potential segfault in the time crate

    | Details             |                                                        |
    | ------------------- | ------------------------------------------------------ |
    | Package             | time                                                    |
    | Version             | 0.1.44                                                  |
    | URL                 | https://github.com/time-rs/time/issues/293              |
    | Date                | 2020-11-18                                              |
    | Patched versions    | >=0.2.23                                                |
    | Unaffected versions | =0.2.0, =0.2.1, =0.2.2, =0.2.3, =0.2.4, =0.2.5, =0.2.6  |

    Impact

    Unix-like operating systems may segfault due to dereferencing a dangling pointer in specific circumstances. This requires an environment variable to be set in a different thread than the affected functions. This may occur without the user's knowledge, notably in a third-party library.

    The affected functions from time 0.2.7 through 0.2.22 are:

    • time::UtcOffset::local_offset_at
    • time::UtcOffset::try_local_offset_at
    • time::UtcOffset::current_local_offset
    • time::UtcOffset::try_current_local_offset
    • time::OffsetDateTime::now_local
    • time::OffsetDateTime::try_now_local

    The affected functions in time 0.1 (all versions) are:

    • at
    • at_utc
    • now

    Non-Unix targets (including Windows and wasm) are unaffected.

    Patches

    Pending a proper fix, the internal method that determines the local offset has been modified to always return None on the affected operating systems. This has the effect of returning an Err on the try_* methods and UTC on the non-try_* methods.

    Users and library authors with time in their dependency tree should perform cargo update, which will pull in the updated, unaffected code.

    Users of time 0.1 do not have a patch and should upgrade to an unaffected version: time 0.2.23 or greater or the 0.3 series.

    Workarounds

    No workarounds are known.

    References

    time-rs/time#293

    See advisory page for additional details.

    vulnerability 
    opened by github-actions[bot] 3
  • feat!: implement convolutional 2D layer

    Convolutional layers with output dimensions inferred at compile time! This required a refactor of the project to be able to make it work, see: https://stackoverflow.com/questions/72742278/satisfying-a-trait-bound-with-a-const-generic-expression-is-it-possible

    Unfortunately this requires nightly until the const equality and expressions are stabilized.

    enhancement 
    opened by c0dearm 1
  • Create discussion forum / discord / matrix

    Hi! Would it be possible to create a forum to discuss about the API and the philosophy of this project? The idea of this project is really nice, but there may be people who want to contribute with code or with their opinions on the current state of the project and its future.

    This way it'll be much easier to discuss about what kind of API works best for the general consumers of the library.

    documentation 
    opened by znx3p0 1
  • change!: variable and constant tensors are now different types

    Constant and variable tensors are now distinct types, so even deep into a chain of computations the developer can tell which resulting tensors are constants and which are variables. Another benefit is clearer code: we no longer need runtime assertions to check whether a tensor is a constant, for example when deciding whether to compute a gradient.

    Aside from this, I've also spent 10 minutes designing a logo for the library; given that I have zero skill for this, I am quite happy with the result!

    opened by c0dearm 1
  • feat!: this is now an automatic differentiation library

    Building neural networks at compile time was a very fun idea, but in practice it lacked a lot of flexibility.

    For instance, you could only build sequential, static models, whereas most machine learning applications nowadays require directed acyclic and dynamic graphs. Coding the derive macro to build the backward/reverse pass was also a feat not for the faint of heart, and I have a history!

    This new approach takes the library closer to what a sane person would do, i.e. something like Tensorflow or PyTorch, and it works wonderfully!

    opened by c0dearm 1
  • Implementing the softmax activation function

    opened by Kerollmops 1
  • Implementing backpropagation

    opened by Kerollmops 1
  • add CI github action and fix clippy warnings

    Hi @c0dearm, I saw your post on reddit and thought it could be helpful to add a basic CI to your project :smile:

    Here's the action's run on my fork: https://github.com/yonip23/gamma/runs/2291263094?check_suite_focus=true

    opened by yonip23 0
  • Note for followers!

    Mushin is undergoing a major refactor in order to provide a better API (experience) to users creating their own deep learning models. While implementing the MNIST digit recognition example, I ran into difficulties expressing the convolutional model ergonomically. You might not see commits often until I iterate over a few different implementation possibilities. Until then, just letting you know I am working on it!

    Thank you all for your patience :heart:

    enhancement 
    opened by c0dearm 0
  • ML examples

    Hi there!

    This is a really cool library and I'm excited to try it out (especially after spending several hours fighting my environment to get tch-rs working >_> - still haven't, suspect it's easier to rewrite in mushin instead!)

    That being said, I'm still a bit of a novice at ML, and would appreciate some applied examples to see how you might port e.g. one of the PyTorch examples over. (I'm particularly interested in making a GAN or a VAE, but don't let that influence your choice!)

    No rush on this, but it'd be nice to have, especially for other people who'd like to better understand how to use the library 🙂

    (Unrelated: is there any kind of roadmap for what you're planning on implementing next / would like to see implemented?)

    question 
    opened by philpax 2
  • Making Arrayfire optional

    Do you think it would be possible to make arrayfire optional? The benefits of arrayfire are enormous, but it also seems like a huge handicap that anyone using the crate is forced to download and install the arrayfire binaries. It also means that if I use mushin in my crate, my crate will also depend on arrayfire, which is kind of a turn-off.

    I guess making it optional would require making several versions of all operations and complicating the code with some abstraction over arrayfire/alternative, which would be a lot of work.
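
    For reference, the usual Cargo pattern for this is an optional dependency gated behind a feature. A hypothetical sketch (feature name and version are illustrative, not Mushin's actual manifest):

```toml
[dependencies]
arrayfire = { version = "3.8", optional = true }

[features]
# Enable the ArrayFire backend by default; users can opt out with
# `default-features = false` and supply an alternative backend.
default = ["af"]
af = ["dep:arrayfire"]
```

    The hard part, as noted above, is abstracting the tensor operations so both backends satisfy the same interface.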

    enhancement 
    opened by albertsgarde 3
Releases(0.5.0)
  • 0.5.0(Jun 8, 2022)

    What's Changed

    • feat!: this is now an automatic differentiation library by @c0dearm in https://github.com/c0dearm/mushin/pull/4
    • change!: variable and constant tensors are now different types by @c0dearm in https://github.com/c0dearm/mushin/pull/6
    • feat: initial version of nn module by @c0dearm in https://github.com/c0dearm/mushin/pull/9

    Full Changelog: https://github.com/c0dearm/mushin/commits/0.5.0

Owner
Aitor Ruano