CBOR (binary JSON) for Rust with automatic type-based decoding and encoding.

Overview

THIS PROJECT IS UNMAINTAINED. USE serde_cbor INSTEAD.

This crate provides an implementation of RFC 7049, which specifies Concise Binary Object Representation (CBOR). CBOR adopts and modestly builds on the data model used by JSON, except the encoding is in binary form. Its primary goals include a balance of implementation size, message size and extensibility.
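As an illustration of the binary form, here is a sketch (not this crate's API) of RFC 7049's header rule for unsigned integers: the top three bits of the initial byte give the major type, and the low five bits either hold the value directly (0..=23) or select a 1/2/4/8-byte big-endian payload. The function name `encode_uint` is hypothetical.

```rust
// Sketch of RFC 7049's encoding rule for unsigned integers (major type 0).
// Values 0..=23 fit in the initial byte; larger values use a big-endian
// payload whose width is chosen by additional info 24..=27.
fn encode_uint(n: u64) -> Vec<u8> {
    match n {
        0..=23 => vec![n as u8],
        24..=0xff => vec![24, n as u8],
        0x100..=0xffff => {
            let mut v = vec![25];
            v.extend_from_slice(&(n as u16).to_be_bytes());
            v
        }
        0x1_0000..=0xffff_ffff => {
            let mut v = vec![26];
            v.extend_from_slice(&(n as u32).to_be_bytes());
            v
        }
        _ => {
            let mut v = vec![27];
            v.extend_from_slice(&n.to_be_bytes());
            v
        }
    }
}

fn main() {
    assert_eq!(encode_uint(10), vec![0x0a]);              // fits in the initial byte
    assert_eq!(encode_uint(500), vec![0x19, 0x01, 0xf4]); // info 25: 2-byte payload
}
```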

Dual-licensed under MIT or the UNLICENSE.

Documentation

The API is fully documented with examples: http://burntsushi.net/rustdoc/cbor/.

Installation

This crate works with Cargo and is on crates.io. Add it to your Cargo.toml like so:

[dependencies]
cbor = "0.3"

Example: simple type-based encoding and decoding

In this crate, there is a Decoder and an Encoder. All reading and writing of CBOR must go through one of these types.

The following shows how to use these types to encode and decode a sequence of data items:

extern crate cbor;

use cbor::{Decoder, Encoder};

fn main() {
    // The data we want to encode. Each element in the list is encoded as its
    // own separate top-level data item.
    let data = vec![('a', 1), ('b', 2), ('c', 3)];

    // Create an in memory encoder. Use `Encoder::from_writer` to write to
    // anything that implements `Writer`.
    let mut e = Encoder::from_memory();
    e.encode(&data).unwrap();

    // Create an in memory decoder. Use `Decoder::from_reader` to read from
    // anything that implements `Reader`.
    let mut d = Decoder::from_bytes(e.as_bytes());
    let items: Vec<(char, i32)> = d.decode().collect::<Result<_, _>>().unwrap();

    assert_eq!(items, data);
}

There are more examples in the docs.

Status of implementation

The big thing missing at the moment is indefinite length encoding. It's easy enough to implement, but I'm still trying to think of the best way to expose it in the API.
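For reference, the wire format in question (RFC 7049 §2.2) is simple: an indefinite-length array opens with the byte 0x9f and is closed by the "break" byte 0xff, with each element a regular data item in between. A minimal sketch (the function `indefinite_array` is hypothetical, not a proposed API):

```rust
// Sketch of RFC 7049's indefinite-length array framing: open with 0x9f
// (major type 4, additional info 31), then ordinary data items, then the
// break byte 0xff. Restricted here to small uints so each item is one byte.
fn indefinite_array(items: &[u8]) -> Vec<u8> {
    assert!(items.iter().all(|&n| n <= 23), "sketch handles small uints only");
    let mut out = vec![0x9f];     // open indefinite-length array
    out.extend_from_slice(items); // each small uint encodes as itself
    out.push(0xff);               // break: terminates the array
    out
}

fn main() {
    assert_eq!(indefinite_array(&[1, 2, 3]), vec![0x9f, 0x01, 0x02, 0x03, 0xff]);
}
```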

Otherwise, all core CBOR features are implemented. There is support for tags, but none of the tags in the IANA registry are implemented. It isn't clear to me whether these implementations should appear in this crate or in others. Perhaps this would be a good use of Cargo's optional features.

Finally, CBOR maps are only allowed to have Unicode string keys. This was easiest to implement, but perhaps this restriction should be lifted in the future.
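For concreteness, here is the byte layout of a one-entry map with a Unicode string key, e.g. {"a": 1}, sketched by hand (the function `encode_tiny_map` is hypothetical, not this crate's API):

```rust
// Wire layout of a CBOR map with one Unicode-string key, per RFC 7049.
// Major type 5 (0xa0) carries the pair count; major type 3 (0x60) carries
// the key's UTF-8 byte length; small uints encode as a single byte.
fn encode_tiny_map(key: &str, value: u8) -> Vec<u8> {
    assert!(key.len() <= 23 && value <= 23, "sketch handles small values only");
    let mut out = vec![0xa0 | 1];             // map, 1 key/value pair
    out.push(0x60 | key.len() as u8);         // text-string header
    out.extend_from_slice(key.as_bytes());    // UTF-8 key bytes
    out.push(value);                          // small uint, major type 0
    out
}

fn main() {
    // {"a": 1} encodes as a1 61 61 01
    assert_eq!(encode_tiny_map("a", 1), vec![0xa1, 0x61, 0x61, 0x01]);
}
```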

Benchmarks

Here are some very rough (and overly simplistic) benchmarks that compare CBOR with JSON. Absolute performance is pretty bad (CBOR encoding excepted), but this should at least give a good ballpark for performance relative to JSON:

test decode_medium_cbor   ... bench:  15525074 ns/iter (+/- 348424) = 25 MB/s
test decode_medium_json   ... bench:  18356213 ns/iter (+/- 620645) = 30 MB/s
test decode_small_cbor    ... bench:      1299 ns/iter (+/- 6) = 30 MB/s
test decode_small_json    ... bench:      1471 ns/iter (+/- 11) = 38 MB/s
test encode_medium_cbor   ... bench:   1379671 ns/iter (+/- 24828) = 289 MB/s
test encode_medium_json   ... bench:   8053979 ns/iter (+/- 110462) = 70 MB/s
test encode_medium_tojson ... bench:  15589704 ns/iter (+/- 559355) = 36 MB/s
test encode_small_cbor    ... bench:      2685 ns/iter (+/- 69) = 14 MB/s
test encode_small_json    ... bench:       862 ns/iter (+/- 1) = 64 MB/s
test encode_small_tojson  ... bench:      1313 ns/iter (+/- 6) = 42 MB/s
test read_medium_cbor     ... bench:  10008308 ns/iter (+/- 101995) = 39 MB/s
test read_medium_json     ... bench:  14853023 ns/iter (+/- 510215) = 38 MB/s
test read_small_cbor      ... bench:       763 ns/iter (+/- 4) = 52 MB/s
test read_small_json      ... bench:      1127 ns/iter (+/- 4) = 49 MB/s

If these benchmarks are perplexing to you, then you might want to check out Erick Tryzelaar's series of blog posts on Rust's serialization infrastructure. In short, it's being worked on.

Relatedly, a compounding reason why decoding CBOR is so slow is because it is decoded into an intermediate abstract syntax first. A faster (but more complex) implementation would skip this step, but it is difficult to do performantly with the existing serialization infrastructure. (The same approach is used in JSON decoding too, but it should be much easier to eschew this with CBOR since it doesn't have the complexity overhead of parsing text.)

Alternatives

TyOverby's excellent bincode library fulfills a similar use case as cbor: both crates serialize and deserialize between Rust values and a binary representation. Here is a brief comparison (please ping me if I've gotten any of this wrong or if I've left out other crucial details):

  • CBOR is an IETF standard with implementations in many languages. This means you can use CBOR to easily communicate with programs written in other programming languages.
  • cbor tags every data item it encodes, including every number. bincode does not, so the compactness of the resulting binary data depends on your data. For example, with cbor, encoding a Vec<u64> will encode every integer using a variable-width encoding, while bincode will use 8 bytes for every number. This results in various trade-offs in terms of serialization speed, the size of the data and the flexibility of encoding/decoding with Rust types. (e.g., with bincode you must decode with precisely the same integer size as what was encoded, but cbor can adjust on the fly and decode, e.g., an encoded u16 into a u64.)
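The size trade-off above can be sketched with RFC 7049's variable-width rule for unsigned integers (the function `cbor_uint_size` is hypothetical; bincode's fixed 8 bytes per u64 is taken from the comparison above):

```rust
// Encoded size of an unsigned integer under CBOR's variable-width rule:
// one header byte plus a 0/1/2/4/8-byte payload depending on magnitude.
fn cbor_uint_size(n: u64) -> usize {
    1 + match n {
        0..=23 => 0,
        24..=0xff => 1,
        0x100..=0xffff => 2,
        0x1_0000..=0xffff_ffff => 4,
        _ => 8,
    }
}

fn main() {
    let data: Vec<u64> = vec![1, 200, 70_000, u64::MAX];
    let cbor_bytes: usize = data.iter().map(|&n| cbor_uint_size(n)).sum();
    let bincode_bytes = data.len() * 8; // bincode: fixed-width u64s
    assert_eq!(cbor_bytes, 1 + 2 + 5 + 9); // 17 bytes
    assert!(cbor_bytes < bincode_bytes);   // 17 < 32 for this data
}
```

Note that the comparison flips for data dominated by large values: u64::MAX costs 9 bytes in CBOR versus bincode's 8.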
Comments
  • Support indefinite-length arrays, maps, and strings

    I've been using this code on the c2rust project pretty extensively for the last year after bringing it up in #12. It'd be neat to fold it back into the project!

    opened by glguy 7
  • Decoder iterator returns next() when can't decode full object

    I use the following code to unpack a packet from network:

      let mut dec = Decoder::from_reader(io::Cursor::new(buf));
      match dec.decode::<Packet>().next() {...}
    

    And next() returns None when the packet was truncated for some reason. Does this work as intended? I would like to have a more explicit error.

    I use cbor 0.3.14 if that matters.

    Is there a better interface? Should this behavior be documented?

    opened by tailhook 6
  • Decouple dependencies

    Hi,

    May I ask if you could decouple dependencies? For a basic, simple and concise implementation, I think it would be great if there are no dependencies.

    For more features, maybe you can make a separate crate...

    Thank you,

    opened by ghost 5
  • Support decoding indefinite length encodings

    I'm interested in using this crate to decode CBOR structures that happen to use indefinite length encodings. I've patched the code locally to support this. I'd be willing to submit a pull request soon.

    I'm interested to know what your plans are for supporting indefinite lengths in general. For my use case it is sufficient to add support to read_array, read_map, read_bytes, and read_string in decoder.rs. Are there some other considerations?

    opened by glguy 5
  • Issues with custom tags and u64

    I can't tell if this is user error, or an issue with rust-cbor. I'm using CborTagEncode to encode some custom types. When these custom types include a u64, I'm getting an UnexpectedEOF error during decode.

    A fully self-contained reproducer is below. Using rustc 1.10.0-nightly (4ec5ce5e4 2016-05-12)

    Thanks!

    extern crate cbor;
    extern crate rustc_serialize;
    
    use cbor::{CborTagEncode};
    use rustc_serialize::{Decodable, Decoder, Encodable, Encoder};
    
    pub enum TestTypes {
        TestA(u32),
        TestB(u64)
    }
    
    
    fn encode<T: Encodable>(t: &T) -> Vec<u8> {
        let mut e = cbor::Encoder::from_memory();
        e.encode(vec!(t)).unwrap();
        e.into_bytes()
    }
    fn decode<T: Decodable>(bytes: Vec<u8>) -> T {
        let mut d = cbor::Decoder::from_bytes(bytes);
    
        let mut items = d.decode();
        let maybe_result = items.next();
        let result = maybe_result.unwrap();
        result.unwrap()
    }
    
    impl Encodable for TestTypes {
        fn encode<E: Encoder>(&self, e: &mut E) -> Result<(), E::Error> {
            match self {
                &TestTypes::TestA(ref val) => CborTagEncode::new(100_000_01, val).encode(e),
                &TestTypes::TestB(ref val) => CborTagEncode::new(100_000_02, val).encode(e),
            }
        }
    }
    
    impl Decodable for TestTypes {
        fn decode<D: Decoder>(d: &mut D) -> Result<TestTypes, D::Error> {
            let tag = try!(d.read_u64());
            match tag {
                100_000_01 => Ok(TestTypes::TestA(try!(Decodable::decode(d)))),
                100_000_02 => Ok(TestTypes::TestB(try!(Decodable::decode(d)))),
                _ => panic!("Unexpected tag")
            }
        }
    }
    
    
    
    fn main() {
        let a = TestTypes::TestA(1234u32);
        let b = TestTypes::TestB(1234u64);
    
        let a_encoded = encode(&a);
        println!("TestA(u32) encoded as {} bytes: {:?}", a_encoded.len(), a_encoded);
        let b_encoded = encode(&b);
        println!("TestB(u64) encoded as {} bytes: {:?}", b_encoded.len(), b_encoded);
    
    
        let aa: TestTypes = decode(a_encoded);
        let bb: TestTypes = decode(b_encoded);
    }
    
    achin@bigbox ~/tmp/06/cbor_test $ env RUST_BACKTRACE=1 cargo run
         Running `target/debug/cbor_test`
    TestA(u32) encoded as 8 bytes: [218, 0, 152, 150, 129, 25, 4, 210]
    TestB(u64) encoded as 8 bytes: [218, 0, 152, 150, 130, 217, 4, 210]
    
    thread '<main>' panicked at 'called `Result::unwrap()` on an `Err` value: UnexpectedEOF', ../src/libcore/result.rs:785
    stack backtrace:
       1:     0x55b614483c60 - std::sys::backtrace::tracing::imp::write::h9fb600083204ae7f
       2:     0x55b6144869fb - std::panicking::default_hook::_$u7b$$u7b$closure$u7d$$u7d$::hca543c34f11229ac
       3:     0x55b614486683 - std::panicking::default_hook::hc2c969e7453d080c
       4:     0x55b61447de78 - std::panicking::rust_panic_with_hook::hfe203e3083c2b544
       5:     0x55b614486c41 - std::panicking::begin_panic::h4889569716505182
       6:     0x55b61447eaaa - std::panicking::begin_panic_fmt::h484cd47786497f03
       7:     0x55b614486bde - rust_begin_unwind
       8:     0x55b6144bc9ff - core::panicking::panic_fmt::h257ceb0aa351d801
       9:     0x55b614458eb8 - core::result::unwrap_failed::hf334f4fb1663f17e
                            at ../src/libcore/macros.rs:29
      10:     0x55b614475032 - _<std..result..Result<T, E>>::unwrap::h627180c2294af4f7
                            at ../src/libcore/result.rs:723
      11:     0x55b61445ee81 - cbor_test::decode::hf50c8635c0e15470
                            at src/main.rs:24
      12:     0x55b614458a35 - cbor_test::main::h6878392f0e5b8935
                            at src/main.rs:60
      13:     0x55b6144862c8 - std::panicking::try::call::hc5e1f5b484ec7f0e
      14:     0x55b61449079b - __rust_try
      15:     0x55b61449073e - __rust_maybe_catch_panic
      16:     0x55b614485cf3 - std::rt::lang_start::h61f4934e780b4dfc
      17:     0x55b614475109 - main
      18:     0x7f442ddfa61f - __libc_start_main
      19:     0x55b614458528 - _start
      20:                0x0 - <unknown>
    error: Process didn't exit successfully: `target/debug/cbor_test` (exit code: 101)
    
    opened by eminence 4
  • Expose and document read_data_item.

    Hi,

    I'm using read_data_item directly and it would be cool if it were public.

    pub(crate) fn decode(bytes: Vec<u8>) -> Result<Ipld, IpldError> {
        let mut d = Decoder::from_bytes(bytes);
        let cbor: Cbor = d.read_data_item(None)?;
        cbor_to_ipld(cbor)
    }
    
    opened by dvc94ch 3
  • Add a test case that exhibits over-capacity out-of-memory error

    This is exactly the same class of error that has hit bincode recently. https://github.com/TyOverby/bincode/issues/41

    I have an issue with a solution open on rustc-serialize.

    opened by TyOverby 1
  • Fix encoding CborTagEncode containing u64 (fixes #8)

    Fixes #8 and adds a test. Only set Encoder::tag while encoding the tag value itself, not when encoding the contained data. Also, the tag setting is now only set locally and so nested tags do not overwrite the state of Encoder::tag (not really a problem before).

    opened by gavento 0
  • Typo in docs

    Here, in lib.rs, the documentation states:

    This type is only useful when your manually expecting the structure of a CBOR data item.

    What are you trying to say here? "when you're manually inspecting the structure", or something else?

    opened by thanatos 0
  • Non string keys

    This addresses #13 by adding signed/unsigned integer map key support (not sure it makes sense to support absolutely every CBOR type as map keys). It's a first pass, so this is more of a feedback request to see if you think this is going in the right direction. I can also add more tests.

    opened by mozkeeler 0
  • non-string keys in maps not supported

    Unless I'm misunderstanding, rust-cbor doesn't support non-string keys in CBOR maps. From my reading of the RFC, types other than strings may be used as keys in maps.

    For instance, the hex bytes a10126 should decode as a map with one entry: a key of an unsigned integer of value 1 mapping to a value of the negative integer -7. Right now the decoder throws a TypeMismatch error expecting a Unicode string.

    opened by mozkeeler 1
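The example bytes in the issue above (a1 01 26) can be walked by hand with nothing but the CBOR header rule: the top three bits of each initial byte give the major type, the low five the additional info. A minimal sketch (the helper `header` is hypothetical, not this crate's code):

```rust
// Split a CBOR initial byte into (major type, additional info), per RFC 7049.
fn header(byte: u8) -> (u8, u8) {
    (byte >> 5, byte & 0x1f)
}

fn main() {
    let bytes = [0xa1u8, 0x01, 0x26];
    assert_eq!(header(bytes[0]), (5, 1)); // major type 5: map with one pair
    assert_eq!(header(bytes[1]), (0, 1)); // major type 0: the key, uint 1
    let (major, info) = header(bytes[2]);
    assert_eq!(major, 1);                 // major type 1: negative integer
    assert_eq!(-1i64 - info as i64, -7);  // encoded as -1 - n, hence -7
}
```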
  • f16 decoding incorrect.

    I have a CBOR encoder/decoder for Lua which uses the canonicalization recommendations from the RFC, specifically that floating point values should be stored in the smallest representation that does not result in loss of information. This provides a decent saving on structures that have a lot of floating point values many of which are just small integers. Unfortunately this fails spectacularly for every value except 0.0 with the f16 decoding behaviour in rust-cbor.

    An example correct decoding routine is available in Appendix D of the RFC. The C version focuses on breaking down the f16 and building a new f32 float with ldexp. The Python version uses bit manipulation to convert the f16 directly into an f32.

    bug 
    opened by Parakleta 5
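The bit-manipulation approach the issue above points to (RFC 7049 Appendix D) can be sketched as follows: extract the f16 sign, exponent, and fraction fields, then rebias the exponent from 15 to 127 and widen the fraction. This is an illustrative sketch handling normal numbers and zero only (subnormals, infinities, and NaN need extra cases); the function name `f16_bits_to_f32` is hypothetical:

```rust
// Widen an IEEE 754 half-precision value (given as its raw 16 bits) to f32
// by bit manipulation. Normal numbers and signed zero only.
fn f16_bits_to_f32(half: u16) -> f32 {
    let sign = (half >> 15) as u32;
    let exp = ((half >> 10) & 0x1f) as u32;
    let frac = (half & 0x3ff) as u32;
    let bits = if exp == 0 && frac == 0 {
        sign << 31 // signed zero
    } else {
        // rebias exponent from 15 (f16) to 127 (f32); widen 10-bit fraction to 23
        (sign << 31) | ((exp + 127 - 15) << 23) | (frac << 13)
    };
    f32::from_bits(bits)
}

fn main() {
    assert_eq!(f16_bits_to_f32(0x3c00), 1.0);      // exp = 15, frac = 0
    assert_eq!(f16_bits_to_f32(0x4248), 3.140625); // pi rounded to f16
    assert_eq!(f16_bits_to_f32(0x0000), 0.0);
}
```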
  • encoder does not report error for truncated data

    When using a fixed size buffer, the encoder should fail if it needs to write more data than will fit in the buffer. Currently, the encoder writes as much as it can and stops.

    It seems like this can be attributed to using io::BufWriter internally. Here's a small reproduction of the core bug:

    extern crate bytes;
    
    use std::io::{self, Write};
    
    fn main() {
        let mut buf = io::BufWriter::new(bytes::ByteBuf::mut_with_capacity(1));
        buf.write_all(&[1, 2, 3, 4, 5]).unwrap();
    }
    

    This succeeds presumably because the bytes were indeed successfully written to the buffer. However, if one calls buf.flush(), then an error is reported because the entire contents of the buffer could not be written.

    If we look at an example from #6, this finishes with output len: 1, contents: [161]. Namely, no error is reported:

    extern crate bytes;
    extern crate cbor;
    extern crate rustc_serialize;
    
    use std::io::Read;
    
    use cbor::Encoder;
    
    #[derive(RustcDecodable, RustcEncodable, Debug)]
    struct Test {
        value: String,
    }
    
    fn main() {
        let mut buf = bytes::ByteBuf::mut_with_capacity(1);
        {
            let mut enc = Encoder::from_writer(&mut buf);
            enc.encode(&[Test {
                value: "hello world hello world hello world".to_string()
            }]).unwrap();
        }
        let nbytes = buf.bytes().len();
        println!("len: {}, contents: {:?}", nbytes, buf.bytes());
    }
    

    This is because the data from the buffer is flushed when the encoder is dropped. This should produce an error, but errors are ignored in destructors.

    If we change the code above to (same, but with a call to enc.flush()):

    extern crate bytes;
    extern crate cbor;
    extern crate rustc_serialize;
    
    use std::io::Read;
    
    use cbor::Encoder;
    
    #[derive(RustcDecodable, RustcEncodable, Debug)]
    struct Test {
        value: String,
    }
    
    fn main() {
        let mut buf = bytes::ByteBuf::mut_with_capacity(1);
        {
            let mut enc = Encoder::from_writer(&mut buf);
            enc.encode(&[Test {
                value: "hello world hello world hello world".to_string()
            }]).unwrap();
            enc.flush().unwrap();
        }
        let nbytes = buf.bytes().len();
        println!("len: {}, contents: {:?}", nbytes, buf.bytes());
    }
    

    Then it will indeed fail because the underlying buffer flush will fail.

    cc @tailhook

    bug 
    opened by BurntSushi 2
  • Initial support for serde

    Hello!

    This PR adds initial support for serde. One of the cool things about it is that it can directly deserialize into a structure instead of having to go through cbor::Decoder to parse out a Cbor type and then deserialize from that into a cbor::RustcDecoder. It currently passes the tests and benchmarks pretty well (although it's still slower than serde_json for some unknown reason):

         Running target/release/bench-d76ac9f7d864696a
    
    running 16 tests
    test decode_medium_cbor        ... bench:  15,151,437 ns/iter (+/- 1,104,258) = 26 MB/s
    test decode_medium_direct_cbor ... bench:   4,477,213 ns/iter (+/- 511,434) = 89 MB/s
    test decode_medium_json        ... bench:  20,373,353 ns/iter (+/- 2,205,575) = 27 MB/s
    test decode_small_cbor         ... bench:       1,315 ns/iter (+/- 37) = 30 MB/s
    test decode_small_direct_cbor  ... bench:       2,558 ns/iter (+/- 345) = 15 MB/s
    test decode_small_json         ... bench:       1,880 ns/iter (+/- 367) = 29 MB/s
    test encode_medium_cbor        ... bench:     945,316 ns/iter (+/- 16,675) = 422 MB/s
    test encode_medium_json        ... bench:   7,628,975 ns/iter (+/- 339,718) = 74 MB/s
    test encode_medium_tojson      ... bench:  14,429,221 ns/iter (+/- 1,244,292) = 39 MB/s
    test encode_small_cbor         ... bench:       3,460 ns/iter (+/- 359) = 11 MB/s
    test encode_small_json         ... bench:         841 ns/iter (+/- 318) = 66 MB/s
    test encode_small_tojson       ... bench:       1,213 ns/iter (+/- 202) = 46 MB/s
    test read_medium_cbor          ... bench:   9,462,908 ns/iter (+/- 1,028,207) = 42 MB/s
    test read_medium_json          ... bench:  14,876,985 ns/iter (+/- 2,433,638) = 38 MB/s
    test read_small_cbor           ... bench:         785 ns/iter (+/- 193) = 50 MB/s
    test read_small_json           ... bench:       1,226 ns/iter (+/- 26) = 45 MB/s
    
    test result: ok. 0 passed; 0 failed; 0 ignored; 16 measured
    
         Running target/release/bench_serde-7de34a6bd2bc4f3a
    
    running 8 tests
    test deserialize_medium_cbor ... bench:   6,434,673 ns/iter (+/- 412,666) = 62 MB/s
    test deserialize_medium_json ... bench:   4,808,781 ns/iter (+/- 1,297,244) = 117 MB/s
    test deserialize_small_cbor  ... bench:       2,666 ns/iter (+/- 891) = 15 MB/s
    test deserialize_small_json  ... bench:         439 ns/iter (+/- 16) = 127 MB/s
    test serialize_medium_cbor   ... bench:   1,421,697 ns/iter (+/- 475,204) = 281 MB/s
    test serialize_medium_json   ... bench:   4,730,402 ns/iter (+/- 1,666,983) = 120 MB/s
    test serialize_small_cbor    ... bench:       3,479 ns/iter (+/- 491) = 11 MB/s
    test serialize_small_json    ... bench:         441 ns/iter (+/- 122) = 126 MB/s
    
    test result: ok. 0 passed; 0 failed; 0 ignored; 8 measured
    
         Running target/release/cbor-8d79df739e5d6e87
    
    running 0 tests
    
    test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured
    

    Along the way I also changed how variants are serialized. Instead of {"variant": "Name", "fields": [...]}, it's now {"Name": [...]}. This is much more efficient for serde to parse because it doesn't require potentially buffering the field argument.

    Be aware though that this does add a testing dependency on serde_macros, which is currently nightly-only. Check out serde_tests in https://github.com/serde-rs/serde if you're interested in an approach that's compatible with stable Rust.

    opened by erickt 2
  • use serde

    I should start using serde for the automatic serialization component of this crate.

    I don't think this means removing support for rustc-serialize just yet though.

    opened by BurntSushi 3
Owner
Andrew Gallant