Decode SCALE bytes into custom types using a scale-info type registry and a custom Visitor impl.

Overview

scale-decode

This crate aims to simplify the process of decoding SCALE-encoded bytes into a custom data structure, given a type registry (from scale-info), a type ID that you'd like to decode the bytes into, and a Visitor implementation that determines how the decoded values are mapped onto your own custom type.

The crate attempts to avoid any allocations in the decode function itself, so the only allocations introduced are those made by your Visitor implementation.

Here's an example of implementing Visitor to decode bytes into a custom Value type:

use scale_decode::visitor::{self, TypeId};

// A custom type we'd like to decode into:
#[derive(Debug, PartialEq)]
enum Value {
    Bool(bool),
    Char(char),
    U8(u8),
    U16(u16),
    U32(u32),
    U64(u64),
    U128(u128),
    U256([u8; 32]),
    I8(i8),
    I16(i16),
    I32(i32),
    I64(i64),
    I128(i128),
    I256([u8; 32]),
    CompactU8(u8),
    CompactU16(u16),
    CompactU32(u32),
    CompactU64(u64),
    CompactU128(u128),
    Sequence(Vec<Value>),
    Composite(Vec<(String, Value)>),
    Tuple(Vec<Value>),
    Str(String),
    Array(Vec<Value>),
    Variant(String, Vec<(String, Value)>),
    BitSequence(visitor::BitSequenceValue),
}

// Implement the `Visitor` trait to define how to go from SCALE
// values into this type:
struct ValueVisitor;
impl visitor::Visitor for ValueVisitor {
    type Value = Value;
    type Error = visitor::DecodeError;

    fn visit_bool(self, value: bool, _type_id: TypeId) -> Result<Self::Value, Self::Error> {
        Ok(Value::Bool(value))
    }
    fn visit_char(self, value: char, _type_id: TypeId) -> Result<Self::Value, Self::Error> {
        Ok(Value::Char(value))
    }
    fn visit_u8(self, value: u8, _type_id: TypeId) -> Result<Self::Value, Self::Error> {
        Ok(Value::U8(value))
    }
    fn visit_u16(self, value: u16, _type_id: TypeId) -> Result<Self::Value, Self::Error> {
        Ok(Value::U16(value))
    }
    fn visit_u32(self, value: u32, _type_id: TypeId) -> Result<Self::Value, Self::Error> {
        Ok(Value::U32(value))
    }
    fn visit_u64(self, value: u64, _type_id: TypeId) -> Result<Self::Value, Self::Error> {
        Ok(Value::U64(value))
    }
    fn visit_u128(self, value: u128, _type_id: TypeId) -> Result<Self::Value, Self::Error> {
        Ok(Value::U128(value))
    }
    fn visit_u256(self, value: &[u8; 32], _type_id: TypeId) -> Result<Self::Value, Self::Error> {
        Ok(Value::U256(*value))
    }
    fn visit_i8(self, value: i8, _type_id: TypeId) -> Result<Self::Value, Self::Error> {
        Ok(Value::I8(value))
    }
    fn visit_i16(self, value: i16, _type_id: TypeId) -> Result<Self::Value, Self::Error> {
        Ok(Value::I16(value))
    }
    fn visit_i32(self, value: i32, _type_id: TypeId) -> Result<Self::Value, Self::Error> {
        Ok(Value::I32(value))
    }
    fn visit_i64(self, value: i64, _type_id: TypeId) -> Result<Self::Value, Self::Error> {
        Ok(Value::I64(value))
    }
    fn visit_i128(self, value: i128, _type_id: TypeId) -> Result<Self::Value, Self::Error> {
        Ok(Value::I128(value))
    }
    fn visit_i256(self, value: &[u8; 32], _type_id: TypeId) -> Result<Self::Value, Self::Error> {
        Ok(Value::I256(*value))
    }
    fn visit_compact_u8(
        self,
        value: visitor::Compact<u8>,
        _type_id: TypeId,
    ) -> Result<Self::Value, Self::Error> {
        Ok(Value::CompactU8(value.value()))
    }
    fn visit_compact_u16(
        self,
        value: visitor::Compact<u16>,
        _type_id: TypeId,
    ) -> Result<Self::Value, Self::Error> {
        Ok(Value::CompactU16(value.value()))
    }
    fn visit_compact_u32(
        self,
        value: visitor::Compact<u32>,
        _type_id: TypeId,
    ) -> Result<Self::Value, Self::Error> {
        Ok(Value::CompactU32(value.value()))
    }
    fn visit_compact_u64(
        self,
        value: visitor::Compact<u64>,
        _type_id: TypeId,
    ) -> Result<Self::Value, Self::Error> {
        Ok(Value::CompactU64(value.value()))
    }
    fn visit_compact_u128(
        self,
        value: visitor::Compact<u128>,
        _type_id: TypeId,
    ) -> Result<Self::Value, Self::Error> {
        Ok(Value::CompactU128(value.value()))
    }
    fn visit_sequence(
        self,
        value: &mut visitor::Sequence,
        _type_id: TypeId,
    ) -> Result<Self::Value, Self::Error> {
        let mut vals = vec![];
        while let Some(val) = value.decode_item(ValueVisitor)? {
            vals.push(val);
        }
        Ok(Value::Sequence(vals))
    }
    fn visit_composite(
        self,
        value: &mut visitor::Composite,
        _type_id: TypeId,
    ) -> Result<Self::Value, Self::Error> {
        let mut vals = vec![];
        while let Some((name, val)) = value.decode_item_with_name(ValueVisitor)? {
            vals.push((name.to_owned(), val));
        }
        Ok(Value::Composite(vals))
    }
    fn visit_tuple(
        self,
        value: &mut visitor::Tuple,
        _type_id: TypeId,
    ) -> Result<Self::Value, Self::Error> {
        let mut vals = vec![];
        while let Some(val) = value.decode_item(ValueVisitor)? {
            vals.push(val);
        }
        Ok(Value::Tuple(vals))
    }
    fn visit_str(self, value: visitor::Str, _type_id: TypeId) -> Result<Self::Value, Self::Error> {
        Ok(Value::Str(value.as_str()?.to_owned()))
    }
    fn visit_variant(
        self,
        value: &mut visitor::Variant,
        _type_id: TypeId,
    ) -> Result<Self::Value, Self::Error> {
        let mut vals = vec![];
        let fields = value.fields();
        while let Some((name, val)) = fields.decode_item_with_name(ValueVisitor)? {
            vals.push((name.to_owned(), val));
        }
        Ok(Value::Variant(value.name().to_owned(), vals))
    }
    fn visit_array(
        self,
        value: &mut visitor::Array,
        _type_id: TypeId,
    ) -> Result<Self::Value, Self::Error> {
        let mut vals = vec![];
        while let Some(val) = value.decode_item(ValueVisitor)? {
            vals.push(val);
        }
        Ok(Value::Array(vals))
    }
    fn visit_bitsequence(
        self,
        value: &mut visitor::BitSequence,
        _type_id: TypeId,
    ) -> Result<Self::Value, Self::Error> {
        Ok(Value::BitSequence(value.decode_bitsequence()?))
    }
}

This can then be passed to a decode function like so:

let value: Value = scale_decode::decode(scale_bytes, type_id, types, ValueVisitor)?;

Here, scale_bytes is a mutable reference to the SCALE bytes you'd like to decode (a &mut &[u8]), type_id is the ID of the type in the registry that you'd like to decode the bytes into, and types is a scale_info::PortableRegistry containing information about the various types in use.
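
To make that concrete, here's a rough end-to-end sketch of obtaining a registry and type ID and running the decode. It leans on scale-info's Registry/PortableRegistry and parity-scale-codec's Encode derive; the exact signatures (e.g. whether the type ID comes from .id(), and whether types is passed by reference) may differ between versions, so treat it as illustrative rather than definitive:

use parity_scale_codec::Encode;
use scale_info::TypeInfo;

// An example type to register and decode; `Foo` is purely illustrative.
#[derive(Encode, TypeInfo)]
struct Foo {
    bar: bool,
    baz: u32,
}

// Register `Foo` and note its type ID, then build a portable registry.
let mut registry = scale_info::Registry::new();
let type_id = registry.register_type(&scale_info::MetaType::new::<Foo>()).id();
let types: scale_info::PortableRegistry = registry.into();

// SCALE-encode a value, then decode the bytes back via the visitor above.
let foo = Foo { bar: true, baz: 123 };
let encoded = foo.encode();
let mut scale_bytes: &[u8] = &encoded;

let value = scale_decode::decode(&mut scale_bytes, type_id, &types, ValueVisitor)
    .expect("decoding should succeed");
assert_eq!(
    value,
    Value::Composite(vec![
        ("bar".to_owned(), Value::Bool(true)),
        ("baz".to_owned(), Value::U32(123)),
    ])
);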

Comments
  • The crate doesn't support `wasm/32-bit targets`

    $ cargo check --target wasm32-unknown-unknown

    error: invalid suffix `bit_target` for number literal
      --> src/lib.rs:35:13
       |
    35 |     feature(32bit_target)
       |             ^^^^^^^^^^^^ invalid suffix `bit_target`
       |
       = help: the suffix must be one of the numeric types (`u32`, `isize`, `f32`, etc.)
    
    error[E0277]: the trait bound `u64: BitStore` is not satisfied
       --> src/visitor/bit_sequence.rs:130:10
        |
    130 |     U64Lsb0(BitVec<u64, Lsb0>),
        |             ^^^^^^^^^^^^^^^^^ the trait `BitStore` is not implemented for `u64`
        |
        = help: the following other types implement trait `BitStore`:
                  u16
                  u32
                  u8
                  usize
    note: required by a bound in `BitVec`
       --> /home/niklasad1/.cargo/registry/src/github.com-1ecc6299db9ec823/bitvec-1.0.1/src/vec.rs:56:5
        |
    56  |     T: BitStore,
        |        ^^^^^^^^ required by this bound in `BitVec`
    
    For more information about this error, try `rustc --explain E0277`.
    error: could not compile `scale-decode` due to 2 previous errors
    

    I think `feature(32bit_target)` is invalid, or at least I have never heard of it before.

    /cc @jsdw

    opened by niklasad1 2
  • Bump Swatinem/rust-cache from 2.0.1 to 2.1.0

    Bumps Swatinem/rust-cache from 2.0.1 to 2.1.0.

    Release notes

    Sourced from Swatinem/rust-cache's releases.

    v2.1.0

    • Only hash Cargo.{lock,toml} files in the configured workspace directories.

    v2.0.2

    • Avoid calling cargo metadata on pre-cleanup.
    • Added prefix-key, cache-directories and cache-targets options.

    Commits
    • b894d59 2.1.0
    • e78327d small code style improvements, README and CHANGELOG updates
    • ccdddcc only hash Cargo.toml/Cargo.lock that belong to a configured workspace (#90)
    • b5ec9ed 2.0.2
    • 3f2513f avoid calling cargo metadata on pre-cleanup
    • 19c4658 update dependencies
    • b8e72aa Added prefix-key cache-directories and cache-targets options (#85)
    • See full diff in compare view

    dependencies 
    opened by dependabot[bot] 1
  • Bump Swatinem/rust-cache from 1.3.0 to 2.0.1

    Bumps Swatinem/rust-cache from 1.3.0 to 2.0.1.

    Release notes

    Sourced from Swatinem/rust-cache's releases.

    v2.0.1

    • Primarily just updating dependencies to fix GitHub deprecation notices.

    v2.0.0

    • The action code was refactored to allow for caching multiple workspaces and different target directory layouts.
    • The working-directory and target-dir input options were replaced by a single workspaces option that has the form of $workspace -> $target.
    • Support for considering env-vars as part of the cache key.
    • The sharedKey input option was renamed to shared-key for consistency.

    v1.4.0

    • Clean both debug and release target directories.

    dependencies 
    opened by dependabot[bot] 1
  • Pin GHA versions and add dependabot

    In order to improve our security posture around GitHub Actions usage, I've pinned each action either to a commit hash or to a specific version. Additionally, I've added dependabot to track GHA version changes. Related issues and policy: https://github.com/paritytech/ci_cd/issues/114 https://github.com/paritytech/ci_cd/issues/464 https://github.com/paritytech/ci_cd/wiki/Policies-and-regulations:-GitHub-Actions-usage-policies

    opened by sergejparity 0
  • Decode from `&mut Input` rather than from `&mut &[u8]` to align with scale_codec

    It would be nice if this library played well with parity-scale-codec's Decode trait. That trait takes an &mut Input (something implementing the Input trait), but because scale-decode only accepts &mut &[u8], we can't easily use it as part of a Decode impl.

    Let's consider supporting &mut Input instead to allow these crates to play nicely together.
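
    For illustration (assuming parity-scale-codec's Input::remaining_len and Input::read, plus the ValueVisitor from the README above; the helper name here is made up), the workaround today is to drain the Input into an owned buffer before scale-decode can see it:

    use parity_scale_codec::Input;

    // Hypothetical helper showing the current friction: because `decode` only
    // accepts `&mut &[u8]`, bytes arriving via an `Input` must be buffered first.
    fn decode_value_from_input<I: Input>(
        input: &mut I,
        type_id: u32,
        types: &scale_info::PortableRegistry,
    ) -> Result<Value, parity_scale_codec::Error> {
        let len = input
            .remaining_len()?
            .ok_or("cannot buffer an input of unknown length")?;
        let mut buf = vec![0u8; len];
        input.read(&mut buf)?;

        let mut bytes: &[u8] = &buf;
        scale_decode::decode(&mut bytes, type_id, types, ValueVisitor)
            .map_err(|_| "scale-decode failed".into())
    }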

    opened by jsdw 0
  • Bump Swatinem/rust-cache from 2.0.1 to 2.2.0

    Bumps Swatinem/rust-cache from 2.0.1 to 2.2.0.

    Release notes

    Sourced from Swatinem/rust-cache's releases.

    v2.2.0

    • Add new save-if option to always restore, but only conditionally save the cache.

    v2.1.0

    • Only hash Cargo.{lock,toml} files in the configured workspace directories.

    v2.0.2

    • Avoid calling cargo metadata on pre-cleanup.
    • Added prefix-key, cache-directories and cache-targets options.

    dependencies 
    opened by dependabot[bot] 0
  • Lifetimes are anon

    The lifetimes on the str and the &[u8; 32] are all anonymous. If they were 'scale (i.e. linked back to the original &'scale [u8] input slice) then people could use the visitor to do zero-copy decodes.

    https://github.com/paritytech/scale-decode/blob/f3b7276a6e62a8f8e6d40a831dfc3fb24bfafe8c/src/visitor/str.rs#L44

    https://github.com/paritytech/scale-decode/blob/f3b7276a6e62a8f8e6d40a831dfc3fb24bfafe8c/src/visitor/mod.rs#L75
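
    For illustration of what this would enable (hypothetical, not supported by the current API), a 'scale lifetime on the visitor inputs would let users build values that borrow straight from the input bytes:

    // Hypothetical zero-copy value type: the str and byte-array variants borrow
    // directly from the original `&'scale [u8]` input instead of allocating.
    #[derive(Debug, PartialEq)]
    enum BorrowedValue<'scale> {
        Str(&'scale str),
        U256(&'scale [u8; 32]),
        // ...remaining variants as in the owned `Value` type above.
    }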

    opened by gilescope 0
Releases(v0.3.0)
  • v0.3.0(Jul 29, 2022)

    0.3.0

    This release makes the following changes:

    • All visit_ functions are now handed a TypeId too, which is just a wrapper around a u32 corresponding to the type being decoded in the given PortableRegistry. Useful if you'd like to look up more details about the type in the registry or just store the ID somewhere.
    • visit_compact_ functions are now handed a visitor::Compact struct from which one can obtain the compact-encoded value, or view the path to that value via a .locations() method, in case the compact value is actually nested within named/unnamed structs (see the sketch after this list). The TypeId provided is always the outermost type of the Compact value, so one can also discover this information by manual traversal (but since we have to traverse anyway..).
    • The Variant type has been simplified and largely just allows access to the underlying Composite type to decode.
    • The Composite type now provides direct access to the (yet-to-be-decoded) fields, and offers separate decode_item and decode_item_with_name methods to make decoding into named or unnamed shapes a little easier.
    • Visitor related types are now exported directly from the visitor module rather than a visitor::types module.
    • Lifetimes have been made more precise to avoid unnecessary lifetime related errors.
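
    As a sketch of the .locations() addition (assuming it returns an iterable list of location entries; this is not verified against the exact API), a compact visit function inside the ValueVisitor impl above could inspect the path before recording the value:

    fn visit_compact_u64(
        self,
        value: visitor::Compact<u64>,
        _type_id: TypeId,
    ) -> Result<Self::Value, Self::Error> {
        // Walk the path from the outermost type down to the compact value,
        // e.g. to record which named/unnamed structs wrap it.
        for location in value.locations() {
            let _ = location; // inspect or store each path entry as needed
        }
        Ok(Value::CompactU64(value.value()))
    }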

    Changed

    • TypeIds, more info for compact encoded values and tidy up (#1)
    Source code(tar.gz)
    Source code(zip)
  • v0.2.0(Jul 27, 2022)

Owner
Parity Technologies
Solutions for a trust-free world
Decode Mode S and ADS-B signals in Rust

rs1090 rs1090 is a Rust library to decode Mode S and ADS-B messages. It takes its inspiration from the Python pyModeS library, and uses deku in order

Xavier Olive 6 Mar 1, 2024
Fast and compact sets of bytes or ASCII characters

bset Fast and compact sets of bytes and ASCII characters, useful for searching, parsing and determining membership of a given byte in the given set. T

null 26 Jul 19, 2022
A series of compact encoding schemes for building small and fast parsers and serializers

A series of compact encoding schemes for building small and fast parsers and serializers

Manfred Kröhnert 2 Feb 5, 2022
Encoding and decoding support for BSON in Rust

bson-rs Encoding and decoding support for BSON in Rust Index Overview of BSON Format Usage BSON Values BSON Documents Modeling BSON with strongly type

mongodb 304 Dec 30, 2022
Rust library for reading/writing numbers in big-endian and little-endian.

byteorder This crate provides convenience methods for encoding and decoding numbers in either big-endian or little-endian order. Dual-licensed under M

Andrew Gallant 811 Jan 1, 2023
Crate to parse and emit EDN

edn-rs Near Stable no breaking changes expected. Crate to parse and emit EDN This lib does not make effort to conform the EDN received to EDN Spec. Th

Julia Naomi 61 Dec 19, 2022
pem-rs pem PEM jcreekmore/pem-rs [pem] — A Rust based way to parse and encode PEM-encoded data

pem A Rust library for parsing and encoding PEM-encoded data. Documentation Module documentation with examples Usage Add this to your Cargo.toml: [dep

Jonathan Creekmore 30 Dec 27, 2022
Variable-length signed and unsigned integer encoding that is byte-orderable for Rust

ordered-varint Provides variable-length signed and unsigned integer encoding that is byte-orderable. This crate provides the Variable trait which enco

Khonsu Labs 7 Dec 6, 2022
Free Rust-only Xbox ADPCM encoder and decoder

XbadPCM Safe (and optionally no-std) Rust crate for encoding and decoding Xbox ADPCM blocks. Decoding example Here is example code for decoding stereo

Snowy 5 Nov 20, 2022
Encode and decode dynamically constructed values of arbitrary shapes to/from SCALE bytes

scale-value · This crate provides a Value type, which is a runtime representation that is compatible with scale_info::TypeDef. It somewhat analogous t

Parity Technologies 15 Jun 24, 2023
Extension registry for Lapce Registry

Lapce Registry This is the software running the lapce plugin registry, this manages and hosts plugins that the community uploads. Run the registry loc

Lapce 15 Dec 4, 2022
A crate to convert bytes to something more useable and the other way around in a way Compatible with the Confluent Schema Registry. Supporting Avro, Protobuf, Json schema, and both async and blocking.

#schema_registry_converter This library provides a way of using the Confluent Schema Registry in a way that is compliant with the Java client. The rel

Gerard Klijs 69 Dec 13, 2022
Encode/Decode bytes as emoji base2048

mojibake Encode and decode arbitrary bytes as a sequence of emoji optimized to produce the smallest number of graphemes. Description This is not a spa

null 15 Jul 23, 2023
nombytes is a library that provides a wrapper for the bytes::Bytes byte container for use with nom.

NomBytes nombytes is a library that provides a wrapper for the bytes::Bytes byte container for use with nom. I originally made this so that I could ha

Alexander Krivács Schrøder 2 Jul 25, 2022
Serialize/DeSerialize for Rust built-in types and user defined types (complex struct types)

Serialize/DeSerialize for Rust built-in types and user defined types (complex struct types)

null 2 May 3, 2022
Visitor traits for horned-owl with overloadable implementations

horned-visit Visitor traits for horned-owl with overloadable implementations. Overview This library provides visitor traits for the horned-owl obj

FastOBO 1 Feb 22, 2022
Todo-server impl by using actix.rs

Todo-Server Todo-server impl by using actix.rs Why actix.rs 1. It's blazing fast Benchmark From now on, Actix is the second fastest web framework in t

null 5 Nov 17, 2022
decode a byte stream of varint length-encoded messages into a stream of chunks

length-prefixed-stream decode a byte stream of varint length-encoded messages into a stream of chunks This crate is similar to and compatible with the

James Halliday 4 Feb 26, 2022
Decode Metaplex mint account metadata into a JSON file.

Simple Metaplex Decoder (WIP) Install From Source Install Rust. curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh Clone the source: git c

Samuel Vanderwaal 8 Aug 25, 2022
A library for transcoding between bytes in Astro Notation Format and Native Rust data types.

Rust Astro Notation A library for transcoding between hexadecimal strings in Astro Notation Format and Native Rust data types. Usage In your Cargo.tom

Stelar Software 1 Feb 4, 2022