Neural networks in Rust

Overview

deeplearn-rs

Deep learning in Rust! This is my first shot at this. It's mostly just a proof of concept right now. The API will change.

Status

We have these models implemented (check out the examples folder):

  • MNIST handwritten digit recognition
  • char-rnn using LSTM
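
Both examples live in the examples folder. Assuming the standard Cargo examples layout (and that the example names match the files referenced in the issue reports below; treat the exact names and any dataset paths as assumptions rather than documented behavior), they can likely be run with:

    cargo run --example mnist
    cargo run --example char_rnn

The MNIST example reads the MNIST label and image files at startup, so those need to be available locally.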

So far, we have the following layers implemented:

  • Matrix multiply (fully connected)
  • Add (for bias, for example)
  • LSTM
  • Softmax
  • MSE loss
  • Cross entropy loss
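
To make the list concrete, here is a minimal, dependency-free Rust sketch of what the matrix multiply (fully connected), add (bias), and softmax layers compute for a single input row. It is purely illustrative: the function names are invented for this sketch and it does not use the deeplearn-rs API.

    // Conceptual sketch only -- not the deeplearn-rs API.

    // Fully connected: y = x * W  (x: 1 x n_in, W: n_in x n_out, y: 1 x n_out)
    fn matmul_row(x: &[f32], w: &[Vec<f32>]) -> Vec<f32> {
        let mut y = vec![0.0; w[0].len()];
        for (xi, w_row) in x.iter().zip(w) {
            for (yj, wij) in y.iter_mut().zip(w_row) {
                *yj += xi * wij;
            }
        }
        y
    }

    // Add layer, used here for the bias: y = y + b
    fn add_bias(y: &mut [f32], b: &[f32]) {
        for (yj, bj) in y.iter_mut().zip(b) {
            *yj += bj;
        }
    }

    // softmax(y)_j = exp(y_j) / sum_k exp(y_k), shifted by the max for numerical stability
    fn softmax(y: &[f32]) -> Vec<f32> {
        let max = y.iter().cloned().fold(f32::NEG_INFINITY, f32::max);
        let exps: Vec<f32> = y.iter().map(|v| (v - max).exp()).collect();
        let sum: f32 = exps.iter().sum();
        exps.iter().map(|e| e / sum).collect()
    }

    fn main() {
        let x = [1.0, 2.0];
        let w = vec![vec![0.1, 0.2, 0.3], vec![0.4, 0.5, 0.6]];
        let b = [0.01, 0.02, 0.03];
        let mut y = matmul_row(&x, &w);
        add_bias(&mut y, &b);
        println!("logits = {:?}, probs = {:?}", y, softmax(&y));
    }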

We have the following optimizers:

  • SGD
  • RMSProp
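
For reference, the two update rules in a minimal Rust sketch (again illustrative only; the function names are invented and this is not the deeplearn-rs API). SGD takes a plain step w <- w - lr * g, while RMSProp keeps a running average of squared gradients and divides each step by the square root of that average plus a small epsilon.

    // Conceptual sketch only -- not the deeplearn-rs API.

    // SGD: w <- w - lr * g
    fn sgd_step(w: &mut [f32], grad: &[f32], lr: f32) {
        for (wi, gi) in w.iter_mut().zip(grad) {
            *wi -= lr * gi;
        }
    }

    // RMSProp:
    //   cache <- decay * cache + (1 - decay) * g^2
    //   w     <- w - lr * g / (sqrt(cache) + eps)
    fn rmsprop_step(w: &mut [f32], cache: &mut [f32], grad: &[f32], lr: f32, decay: f32, eps: f32) {
        for ((wi, ci), gi) in w.iter_mut().zip(cache.iter_mut()).zip(grad) {
            *ci = decay * *ci + (1.0 - decay) * gi * gi;
            *wi -= lr * gi / (ci.sqrt() + eps);
        }
    }

    fn main() {
        let mut w = [0.5_f32, -0.3];
        let mut cache = [0.0_f32; 2];
        let grad = [0.1_f32, -0.2];
        sgd_step(&mut w, &grad, 0.01);
        rmsprop_step(&mut w, &mut cache, &grad, 0.001, 0.9, 1e-8);
        println!("w = {:?}", w);
    }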

Road map

  • More layer types (in the order that I'll probably get to them)
    • Conv2d
    • Pooling
    • Dropout (see the sketch after this list)
  • Allow datatypes other than f32 and implement casting between arrays of primitive numeric types.
  • Provide utilities for working with data
    • images
    • tsv and csv
    • raw text data and word embeddings
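
As a preview of one of the planned layers, the sketch below shows inverted dropout in plain Rust (illustrative only, not deeplearn-rs code; the helper and the tiny xorshift generator are invented so the example needs no external crates). During training each activation is zeroed with probability p and the survivors are scaled by 1 / (1 - p), so the expected activation stays the same and nothing needs to change at inference time.

    // Conceptual sketch only -- not the deeplearn-rs API.
    fn dropout(x: &mut [f32], p: f32, mut rand01: impl FnMut() -> f32) {
        let keep_scale = 1.0 / (1.0 - p);
        for xi in x.iter_mut() {
            *xi = if rand01() < p { 0.0 } else { *xi * keep_scale };
        }
    }

    fn main() {
        // Tiny xorshift32 generator returning values in [0, 1).
        let mut state: u32 = 0x1234_5678;
        let mut rand01 = move || {
            state ^= state << 13;
            state ^= state >> 17;
            state ^= state << 5;
            (state >> 8) as f32 / (1u32 << 24) as f32
        };
        let mut activations = [0.5_f32, 1.0, -0.25, 2.0];
        dropout(&mut activations, 0.5, &mut rand01);
        println!("{:?}", activations);
    }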

Goals

We have a looong way to go :)

  • Fast
  • Easy to use
  • Portable
  • More control when you need it
  • Easy to define custom layers
  • Readable internal codebase

License

MIT

Comments
  • mnist example panics

    stack backtrace:
       0: 0x7ff7336a72cc - std::rt::lang_start::hfe4efe1fc39e4a30
       1: 0x7ff7336a661d - std::rt::lang_start::hfe4efe1fc39e4a30
       2: 0x7ff73369a8fd - std::panicking::rust_panic_with_hook::h587239a80cad02d2
       3: 0x7ff7336a996b - rust_begin_unwind
       4: 0x7ff73369b99f - std::panicking::begin_panic_fmt::hb3024643f3039337
       5: 0x7ff7336942f4 - check at D:\Dev\deeplearn-rs:8
       6: 0x7ff733658be3 - doubledeeplearn::var_store::VarIndex
       7: 0x7ff7336584c9 - as_event_listcore::cell::Refopencl::hl::Event,opencl::hl::Event,closure at C:\Users\davel.cargo\git\checkouts\rust-opencl-34f1354d1798ac72\master\src\hl.rs:864
       8: 0x7ff7336582d7 - enqueue_async_kernel<(usize, usize),&[core::option::Optioncore::cell::Refopencl::hl::Event]> at C:\Users\davel.cargo\git\checkouts\rust-opencl-34f1354d1798ac72\master\src\hl.rs:461
       9: 0x7ff733669e1c - matmul at C:\Users\davel.cargo\git\checkouts\gpuarray-rs-23536b5f6e829730\master\src\ops.rs:129
      10: 0x7ff7336698ba - forward at D:\Dev\deeplearn-rs\src\op.rs:72
      11: 0x7ff733653ad7 - forward at D:\Dev\deeplearn-rs\src\graph.rs:171
      12: 0x7ff733647575 - train at D:\Dev\deeplearn-rs\src\train.rs:24
      13: 0x7ff73362232a - main at D:\Dev\deeplearn-rs\examples\mnist.rs:136
      14: 0x7ff7336a5fac - std::rt::lang_start::hfe4efe1fc39e4a30
      15: 0x7ff7336abd41 - _rust_maybe_catch_panic
      16: 0x7ff7336a5cee - std::rt::lang_start::hfe4efe1fc39e4a30
      17: 0x7ff7336477d9 - main
      18: 0x7ff7336b427f - __scrt_common_main_seh at f:\dd\vctools\crt\vcstartup\src\startup\exe_common.inl:255
      19: 0x7ffcc4488101 - BaseThreadInitThunk

    error: Process didn't exit successfully: target\debug\examples\mnist.exe (exit code: 101)

    opened by davidleon 11
  • mnist doesn't converge on a low-end GPU

    On a low-end GPU the code doesn't converge; after the first 999 epochs accuracy is still only around 10%.

    out_d = [ [-0.0937949 -0.10633045 -0.09618157 -0 -0 0.91507906 -0.09252435 -0.09772251 -0.09196769 -0.09155396] [0.9062051 -0.10633045 -0.09618157 -0 -0 -0.084920965 -0.09252435 -0.09772251 -0.09196769 -0.09155396] [-0.0937949 -0.10633045 -0.09618157 -0 -0 -0.084920965 -0.09252435 -0.09772251 0.9080323 -0.09155396] [-0.0937949 -0.10633045 -0.09618157 -0 1 -0.084920965 -0.09252435 -0.09772251 -0.09196769 -0.09155396] [-0.0937949 -0.10633045 -0.09618157 -0 -0 -0.084920965 -0.09252435 -0.09772251 0.9080323 -0.09155396] ]

    loss = [ [0.08563976 0.005653082 0.0046254476 0 0.1 0.0866216 0.0042803776 0.0047748443 0.16744195 0.0041910643] [0 0 0 0 0 0 0 0 0 0] [0 0 0 0 0 0 0 0 0 0] [0 0 0 0 0 0 0 0 0 0] [0 0 0 0 0 0 0 0 0 0] ]

    Accuracy: 11%

    Epoch: 10999 out = [ [0.09369277 0.103886336 0.0954021 0 0 0.085475065 0.090690844 0.09841768 0.09427827 0.09054168] [0.09369277 0.103886336 0.0954021 0 0 0.085475065 0.090690844 0.09841768 0.09427827 0.09054168] [0.09369277 0.103886336 0.0954021 0 0 0.085475065 0.090690844 0.09841768 0.09427827 0.09054168] [0.09369277 0.103886336 0.0954021 0 0 0.085475065 0.090690844 0.09841768 0.09427827 0.09054168] [0.09369277 0.103886336 0.0954021 0 0 0.085475065 0.090690844 0.09841768 0.09427827 0.09054168] ]

    out_d = [ [-0.09369277 -0.103886336 -0.0954021 -0 -0 -0.085475065 -0.090690844 0.9015823 -0.09427827 -0.09054168] [-0.09369277 -0.103886336 -0.0954021 1 -0 -0.085475065 -0.090690844 -0.09841768 -0.09427827 -0.09054168] [0.9063072 -0.103886336 -0.0954021 -0 -0 -0.085475065 -0.090690844 -0.09841768 -0.09427827 -0.09054168] [-0.09369277 -0.103886336 -0.0954021 -0 1 -0.085475065 -0.090690844 -0.09841768 -0.09427827 -0.09054168] [0.9063072 -0.103886336 -0.0954021 -0 -0 -0.085475065 -0.090690844 -0.09841768 -0.09427827 -0.09054168] ]

    loss = [ [0.16691206 0.0053961854 0.0045507806 0.1 0.1 0.0036529934 0.0041124146 0.08515949 0.0044441964 0.0040988983] [0 0 0 0 0 0 0 0 0 0] [0 0 0 0 0 0 0 0 0 0] [0 0 0 0 0 0 0 0 0 0] [0 0 0 0 0 0 0 0 0 0] ]

    Accuracy: 10.679999%

    Epoch: 11999 out = [ [0.09301148 0.10326048 0.09476416 0 0 0.08517593 0.09224633 0.10320083 0.09311914 0.09102862] [0.09301148 0.10326048 0.09476416 0 0 0.08517593 0.09224633 0.10320083 0.09311914 0.09102862] [0.09301148 0.10326048 0.09476416 0 0 0.08517593 0.09224633 0.10320083 0.09311914 0.09102862] [0.09301148 0.10326048 0.09476416 0 0 0.08517593 0.09224633 0.10320083 0.09311914 0.09102862] [0.09301148 0.10326048 0.09476416 0 0 0.08517593 0.09224633 0.10320083 0.09311914 0.09102862] ]

    out_d = [ [-0.09301148 -0.10326048 -0.09476416 -0 -0 -0.08517593 -0.09224633 -0.10320083 0.90688086 -0.09102862] [-0.09301148 -0.10326048 -0.09476416 1 -0 -0.08517593 -0.09224633 -0.10320083 -0.09311914 -0.09102862] [-0.09301148 -0.10326048 -0.09476416 -0 -0 0.91482407 -0.09224633 -0.10320083 -0.09311914 -0.09102862] [-0.09301148 -0.10326048 -0.09476416 -0 -0 -0.08517593 0.90775365 -0.10320083 -0.09311914 -0.09102862] [-0.09301148 -0.10326048 -0.09476416 -0 -0 -0.08517593 -0.09224633 -0.10320083 0.90688086 -0.09102862] ]

    loss = [ [0.004325568 0.0053313635 0.0044901227 0.1 0 0.08659229 0.08580542 0.0053252056 0.16708793 0.0041431054] [0 0 0 0 0 0 0 0 0 0] [0 0 0 0 0 0 0 0 0 0] [0 0 0 0 0 0 0 0 0 0] [0 0 0 0 0 0 0 0 0 0] ]

    Accuracy: 10%

    Validating
    Validation Accuracy: 12.6%

    This is an old machine. Celeron(R) CPU [email protected] and Intel HD Graphics.

    After I tweaked gpuarray's context.rs to use the CPU it converges normally, and by epoch 999 accuracy reaches over 70%:

    Running target\debug\examples\mnist.exe
    Reading training labels... Label count: 60000
    Reading training images...
    Reading validation labels... Label count: 1000
    Reading validation images...

    Using OpenCL Device: Intel(R) HD Graphics

    Epoch: 999 out = [ [0 0 0 0 0 0 0 0.781054 0 0.28385308] [0 0 0.3421486 0.6585547 0 0 0 0 0.39498368 0] [0.01904458 0 0.5730746 0.00049253064 0 0 0 0 0 0] [0 0.6941181 0 0 0 0 0.012161581 0 0 0] [0 0 0.62997645 0 0 0 0.11047391 0 0.31374422 0] ]

    out_d = [ [-0 -0 -0 -0 -0 -0 -0 0.21894598 -0 -0.28385308] [-0 -0 -0.3421486 0.34144533 -0 -0 -0 -0 -0.39498368 -0] [-0.01904458 -0 0.42692542 -0.00049253064 -0 -0 -0 -0 -0 -0] [-0 0.30588192 -0 -0 -0 -0 -0.012161581 -0 -0 -0] [-0 -0 0.37002355 -0 -0 -0 -0.11047391 -0 -0.31374422 -0] ]

    loss = [ [0.000036269605 0.009356375 0.043624844 0.011658516 0 0 0.0012352389 0.0047937343 0.025444752 0.008057258] [0 0 0 0 0 0 0 0 0 0] [0 0 0 0 0 0 0 0 0 0] [0 0 0 0 0 0 0 0 0 0] [0 0 0 0 0 0 0 0 0 0] ]

    Accuracy: 78.14%

    opened by davidleon 1
  • char_rnn

    This example doesn't learn anything useful:

    500
    >hello
    hello-------------------------------------------------------------------------------------------------------------------
    -----------------------------------
    >mon
    mon---------------------------------------------------------------------------------------------------------------------
    ---------------------------------
    >light
    light-------------------------------------------------------------------------------------------------------------------
    -----------------------------------
    >
    
    opened by davidleon 1
  • char_rnn_unrolled stalls completely with a GPU-preferred context; it runs OK on the CPU

    The char_rnn_unrolled example hangs indefinitely at the call ga::matmul(ctx, &h_in, &wlstm, &ifog), i.e. the line commented "// Multiply inputs and weights, and add biases - all in one dot product!".

    opened by davidleon 0
Owner

Theodore DeRego