Cogent

Simple neural network library for classification written in Rust.

A note

I have kept working on GPU support, and I've made some interesting things there, but ultimately it made me realise that this is far too monumental a task: I am not Nvidia, I am not Google, I'm just some random solo dev. Right now I'm working on other things, and while I would love to continue this project, I honestly don't think it's a good use of my time. I could not feasibly make this project good enough to be worth using professionally.

So for now, this project is no longer in active development.

(I still think it's kinda neat so don't let me put you off playing around with it, it's still really fast and easy to use)

Introduction

Cogent is a very basic library for training basic neural networks for classification tasks. It is designed to be as simple to use as feasible. Hyperparameters in neural network training can be set automatically, so why not do exactly that? Ideally you could simply write:

let net = NeuralNetwork::Train(&data);

This is a first, not-quite-there-yet implementation of that idea.
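
In practice the current API is a builder over an explicitly constructed network. A minimal sketch of the same idea, using only the `new`, `train` and `go` calls that appear in the full MNIST example below and leaving every other hyperparameter at its default (`train_data` and `train_labels` here are the arrays loaded in that example):

let mut net = NeuralNetwork::new(784,&[
    Layer::Dense(800,Activation::ReLU),
    Layer::Dense(10,Activation::Softmax)
]);
net.train(&mut train_data,&mut train_labels).go();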

Training a network to classify MNIST:

// Uses
use cogent::{
    NeuralNetwork,Layer,Activation,
    EvaluationData,MeasuredCondition
};
use ndarray::{Array2,Axis};

// Setup
// ----------
// 784-ReLU->800-Softmax->10
let mut net = NeuralNetwork::new(784,&[
    Layer::Dense(800,Activation::ReLU),
    Layer::Dense(10,Activation::Softmax)
]);

// Setting training and testing data.
// `get_mnist_dataset(bool)` is a helper (not shown here) that loads the MNIST dataset.
// The boolean specifies whether to load the MNIST testing data (`true`) or training data (`false`).

// Sets training and testing data.
let (mut train_data,mut train_labels):(Array2<f32>,Array2<usize>) = get_mnist_dataset(false);
let (test_data,test_labels):(Array2<f32>,Array2<usize>) = get_mnist_dataset(true);

// Execution
// ----------
// Trains until no notable accuracy improvement has been made over a number of iterations.
// By default, training ends if a 0.5% accuracy improvement has not been seen over 12 iterations/epochs.

net.train(&mut train_data,&mut train_labels)
    .evaluation_data(EvaluationData::Actual(&test_data,&test_labels)) // Sets evaluation data
    .l2(0.1) // Applies L2 regularisation with a lambda value of 0.1
    .tracking() // Prints backpropagation progress within each iteration
    .log_interval(MeasuredCondition::Iteration(1)) // Prints evaluation after each iteration
    .go();

// If evaluation data is not set manually, a random group is shuffled and split off from the training data to be used as evaluation data.
// In the case of MNIST, where training and testing datasets are given separately, it makes sense to set them as above.

// Evaluation
// ----------
let (cost,correctly_classified):(f32,u32) = net.evaluate(&test_data,&test_labels,None); // (cost, examples correctly classified)
println!("Cost: {:.2}",cost);
println!(
    "Accuracy: {}/{} ({:.2}%)",
    correctly_classified,
    test_data.len_of(Axis(0)), // Number of examples (assuming one example per row).
    correctly_classified as f32 / test_data.len_of(Axis(0)) as f32 * 100f32
);

While a huge amount of my work has gone into making this, and I have learned the basics of neural networks along the way, I am immensely (and I cannot stress this enough) amateur in innumerable ways.

If you find any issues, I would really appreciate it if you could let me know (and possibly suggest solutions).

Features

  • GPU compute using the ArrayFire Rust bindings.
  • Optimisers: Stochastic gradient descent.
  • Layers: Dense, Dropout (see the sketch after this list).
  • Activations: Softmax, Sigmoid, Tanh, ReLU.
  • Loss functions: Mean squared error, Cross entropy.
  • Misc: L2 regularisation.
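
As a quick illustration of the layers and activations listed above, here is a sketch of a small network mixing them. The `Layer::Dense(size, activation)` form follows the MNIST example above; the `Layer::Dropout(0.5)` constructor (taking the probability of dropping a unit) is an assumption about the API rather than something documented here:

use cogent::{NeuralNetwork,Layer,Activation};

// 784 -> Dense(400, Tanh) -> Dropout -> Dense(10, Softmax)
let mut net = NeuralNetwork::new(784,&[
    Layer::Dense(400,Activation::Tanh),
    Layer::Dropout(0.5), // Assumed signature: probability of dropping a unit.
    Layer::Dense(10,Activation::Softmax)
]);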

Installation

  1. Set up the ArrayFire Rust bindings (step 4 of their instructions can be ignored).
  2. Add cogent = "^0.6" to your Cargo.toml (as shown below).
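
For reference, the dependency line from step 2 goes in the [dependencies] section of Cargo.toml:

[dependencies]
cogent = "^0.6"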

TODO (See note)

  1. Convolutional layers.
  2. Meticulous testing (making sure things work).
  3. Optimise usage of VRAM.
  4. Meticulous benchmarking (making sure things are fast).
  5. Benchmarking against other popular neural network libraries (Keras etc.)
  6. Automatic net creation: Bayesian optimisation of everything.

Please note that things may not be developed in this order; it is only my estimate.
