Awesome deep learning crate

Overview

NeuroFlow is a fast neural network (deep learning) crate for Rust. It relies on three pillars: speed, reliability, and speed again.

Hello, everyone! Work on the crate is currently suspended because I'm a bit too busy to continue it :( Thank you all!

How to use

Let's try to approximate a very simple function: 0.5*sin(e^x) - cos(e^x).

extern crate neuroflow;

use neuroflow::FeedForward;
use neuroflow::data::DataSet;
use neuroflow::activators::Type::Tanh;


fn main() {
    /*
        Define a neural network with 1 neuron in the input layer and 4 hidden layers.
        Since our function returns a single value, it is reasonable to have 1 neuron
        in the output layer as well.
    */
    let mut nn = FeedForward::new(&[1, 7, 8, 8, 7, 1]);
    
    /*
        Define a DataSet.

        DataSet is the type that significantly simplifies work with the neural network.
        Most of its functionality is still under development :(
    */
    let mut data: DataSet = DataSet::new();
    let mut i = -3.0;
    
    // Push the data into the DataSet (push accepts two slices: the input data and the expected output)
    while i <= 2.5 {
        data.push(&[i], &[0.5 * i.exp().sin() - i.exp().cos()]);
        i += 0.05;
    }
    
    // Here we set the necessary parameters and train the network on our DataSet for 50,000 iterations
    nn.activation(Tanh)
        .learning_rate(0.01)
        .train(&data, 50_000);

    let mut res;
    
    // Let's check the result
    i = 0.0;
    while i <= 0.3 {
        res = nn.calc(&[i])[0];
        println!("for [{:.3}], [{:.3}] -> [{:.3}]", i, 0.5 * i.exp().sin() - i.exp().cos(), res);
        i += 0.07;
    }
}

Expected output

for [0.000], [-0.120] -> [-0.119]
for [0.070], [-0.039] -> [-0.037]
for [0.140], [0.048] -> [0.050]
for [0.210], [0.141] -> [0.141]
for [0.280], [0.240] -> [0.236]

But we don't want to lose our trained network so easily, so there is functionality for saving and restoring neural networks to and from files.

    /*
        To save a neural network to a file, call the save function
        from the neuroflow::io module.

        The first argument is a mutable reference to the neural network being saved;
        the second argument is the path to the file.
    */
    neuroflow::io::save(&mut nn, "test.flow").unwrap();
    
    /*
        After the neural network has been saved to a file, we can restore it
        by calling the load function from the neuroflow::io module.

        We must specify the type of the new_nn variable.
        The only argument of load is the path to the file containing
        the neural network.
    */
    let mut new_nn: FeedForward = neuroflow::io::load("test.flow").unwrap();
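
As a quick sanity check (a minimal sketch, assuming nn and new_nn are both still in scope), the restored network should reproduce the original's output for the same input:

    // Compare the original and the restored network on one sample input.
    let x = [0.1];
    println!("original: {:.6}, restored: {:.6}", nn.calc(&x)[0], new_nn.calc(&x)[0]);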

Classic XOR problem (with a non-classic way of feeding in the data)

Let's create a file named TerribleTom.csv in the root of the project. The file should have the following contents:

0,0,-,0
0,1,-,1
1,0,-,1
1,1,-,0

where - is the delimiter separating each input vector from its desired output vector.
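
If you'd rather not create the file by hand, the same contents can be written from code before training (a sketch using only the standard library):

// Write the XOR training data in the `input,-,output` format described above.
std::fs::write("TerribleTom.csv", "0,0,-,0\n0,1,-,1\n1,0,-,1\n1,1,-,0\n")
    .expect("failed to write TerribleTom.csv");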

extern crate neuroflow;

use neuroflow::FeedForward;
use neuroflow::data::DataSet;
use neuroflow::activators::Type::Tanh;


fn main() {
    /*
        Define a neural network with 2 neurons in the input layer,
        1 hidden layer (with 2 neurons),
        and 1 neuron in the output layer
    */
    let mut nn = FeedForward::new(&[2, 2, 1]);
    
    // Load the XOR data from the file `TerribleTom.csv`.
    // Note: from_csv returns a Result, so it must be unwrapped before use.
    let mut data = DataSet::from_csv("TerribleTom.csv").unwrap();
    
    // Set parameters and train the network
    nn.activation(Tanh)
        .learning_rate(0.1)
        .momentum(0.15)
        .train(&data, 20_000);

    for i in 0..data.len() {
        let d = data.get(i);
        let res = nn.calc(d.0)[0];
        println!("for [{:.3}, {:.3}], [{:.3}] -> [{:.3}]", d.0[0], d.0[1], d.1[0], res);
    }
}

Expected output

for [0.000, 0.000], [0.000] -> [0.000]
for [1.000, 0.000], [1.000] -> [1.000]
for [0.000, 1.000], [1.000] -> [1.000]
for [1.000, 1.000], [0.000] -> [0.000]
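
Alternatively, the same DataSet can be built entirely in code with push, as in the first example, if you'd rather not depend on a CSV file:

// Equivalent in-code construction of the XOR data set.
let mut data = DataSet::new();
data.push(&[0.0, 0.0], &[0.0]);
data.push(&[0.0, 1.0], &[1.0]);
data.push(&[1.0, 0.0], &[1.0]);
data.push(&[1.0, 1.0], &[0.0]);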

Installation

Add the following to your project's Cargo.toml:

[dependencies]
neuroflow = "0.1.3"

Then reference the crate in your crate root (with the 2018 edition and later, this extern crate line is optional):

extern crate neuroflow;

License

MIT License

Attribution

The origami bird in the logo was made by Freepik.


Comments
  • Support for Stochastic Gradient Descent and Minibatches

    I see your code runs through the whole dataset on each training iteration. For many applications, it is quicker to split the data into random smaller batches and run gradient descent on each "mini batch" (see https://en.wikipedia.org/wiki/Stochastic_gradient_descent): it may take more iterations to converge, but each iteration becomes much quicker. (A rough sketch of this idea appears after this comments list.)

    How high is this feature on your priority list?

    opened by tokahuke 3
  • panic: index out of bounds

    thread 'main' panicked at 'index out of bounds: the len is 1 but the index is 1', /home/moth/.cargo/registry/src/github.com-1ecc6299db9ec823/neuroflow-0.1.3/src/lib.rs:356:48
    stack backtrace:
       0: rust_begin_unwind
                 at /rustc/7737e0b5c4103216d6fd8cf941b7ab9bdbaace7c/library/std/src/panicking.rs:584:5
       1: core::panicking::panic_fmt
                 at /rustc/7737e0b5c4103216d6fd8cf941b7ab9bdbaace7c/library/core/src/panicking.rs:143:14
       2: core::panicking::panic_bounds_check
                 at /rustc/7737e0b5c4103216d6fd8cf941b7ab9bdbaace7c/library/core/src/panicking.rs:85:5
       3: <usize as core::slice::index::SliceIndex<[T]>>::index
                 at /rustc/7737e0b5c4103216d6fd8cf941b7ab9bdbaace7c/library/core/src/slice/index.rs:189:10
       4: core::slice::index::<impl core::ops::index::Index<I> for [T]>::index
                 at /rustc/7737e0b5c4103216d6fd8cf941b7ab9bdbaace7c/library/core/src/slice/index.rs:15:9
       5: <alloc::vec::Vec<T,A> as core::ops::index::Index<I>>::index
                 at /rustc/7737e0b5c4103216d6fd8cf941b7ab9bdbaace7c/library/alloc/src/vec/mod.rs:2531:9
       6: neuroflow::FeedForward::backward
                 at /home/moth/.cargo/registry/src/github.com-1ecc6299db9ec823/neuroflow-0.1.3/src/lib.rs:356:48
       7: neuroflow::FeedForward::fit
                 at /home/moth/.cargo/registry/src/github.com-1ecc6299db9ec823/neuroflow-0.1.3/src/lib.rs:463:9
       8: neuroflow::FeedForward::train
                 at /home/moth/.cargo/registry/src/github.com-1ecc6299db9ec823/neuroflow-0.1.3/src/lib.rs:439:13
       9: neuralnetrust::main
                 at ./src/main.rs:41:5
      10: core::ops::function::FnOnce::call_once
                 at /rustc/7737e0b5c4103216d6fd8cf941b7ab9bdbaace7c/library/core/src/ops/function.rs:227:5
    note: Some details are omitted, run with `RUST_BACKTRACE=full` for a verbose backtrace.
    The terminal process "cargo 'run', '--package', 'neuralnetrust', '--bin', 'neuralnetrust'" terminated with exit code: 101.
    

    When using the lib, I get this error.

    opened by OHNONOTAMOTH 1
  • example does not compile (((

    error[E0277]: the trait bound `Result<DataSet, Box<dyn std::error::Error>>: Extractable` is not satisfied
    
     |     .train(&data, 20_000);
     |            ^^^^^ the trait `Extractable` is not implemented for `Result<DataSet, Box<dyn std::error::Error>>`
    ...
    

    rust: stable-x86_64-unknown-linux-gnu installed - rustc 1.52.1 (9bc8c42bb 2021-05-09)

    help wanted 
    opened by 221V 4
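
For reference, here is a rough sketch of the minibatch idea from the first comment, built only on the DataSet calls shown above. The rand crate dependency, the i64 iteration-count type, and the exact return type of get are assumptions, not part of the crate's documented API:

use rand::seq::SliceRandom; // assumes the external `rand` crate is available

use neuroflow::FeedForward;
use neuroflow::data::DataSet;

// Hypothetical helper: approximate minibatch training by copying a random
// subset of the data into a small DataSet and training briefly on it.
fn train_minibatches(nn: &mut FeedForward, data: &DataSet, batch: usize, rounds: usize) {
    let mut rng = rand::thread_rng();
    let mut idx: Vec<usize> = (0..data.len()).collect();
    for _ in 0..rounds {
        idx.shuffle(&mut rng); // pick a fresh random subset each round
        let mut mini = DataSet::new();
        for &i in idx.iter().take(batch) {
            let (x, y) = data.get(i);
            mini.push(x, y);
        }
        nn.train(&mini, batch as i64); // a few iterations per round (type assumed)
    }
}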