A crate to convert bytes to something more usable and the other way around, in a way compatible with the Confluent Schema Registry. It supports Avro, Protobuf, and JSON Schema, with both async and blocking implementations.

Overview

schema_registry_converter

This library provides a way of using the Confluent Schema Registry in a way that is compliant with the Java client. The release notes can be found on github. Consuming/decoding and producing/encoding are supported. It's also possible to provide the schema to use when decoding, and you can include references when decoding. When no schema is provided, the latest schema with the same subject will be used. It's supposed to be feature complete compared to the Java version, so if anything is missing or not working as expected please create an issue.

Consumer

For consuming messages encoded with the schema registry, the correct schema needs to be fetched from the schema registry to transform the raw bytes into a record. For clarity, error handling is omitted from the diagram.

Consumer activity flow

Producer

For producing messages which can be properly consumed by other clients, the proper id needs to be encoded with the message. To get the correct id, it might be necessary to register a new schema. For clarity, error handling is omitted from the diagram.

Producer activity flow

Getting Started

schema_registry_converter.rs is available on crates.io. It is recommended to look there for the newest and most elaborate documentation.

To use it for async Avro conversion, add:

[dependencies]
schema_registry_converter = { version = "2.0.2", features = ["avro"] }

...and see the docs for how to use it.
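
The async flavour follows the same pattern as the blocking example further down. Below is a minimal sketch; the async_impl module paths, the SrSettings-based constructor, and the exact encode signature are assumptions based on the 2.x layout, so check docs.rs for the names in your version.

use avro_rs::types::Value;
use schema_registry_converter::async_impl::schema_registry::{SrSettings, SubjectNameStrategy};
use schema_registry_converter::async_impl::avro::AvroEncoder;

// Assumes tokio (with the macros and rt features) as the async runtime.
#[tokio::main]
async fn main() {
    // Settings pointing at the schema registry; the encoder caches fetched schemas.
    let sr_settings = SrSettings::new(String::from("http://localhost:8081"));
    let mut encoder = AvroEncoder::new(sr_settings);
    // TopicNameStrategy derives the subject "hb-value" from the topic name "hb".
    let strategy = SubjectNameStrategy::TopicNameStrategy(String::from("hb"), false);
    let bytes = encoder
        .encode(vec![("beat", Value::Long(3))], &strategy)
        .await
        .expect("encoding failed");
    println!("encoded {} bytes", bytes.len());
}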

All the converters also have a blocking (non-async) version; in that case use something like:

[dependencies]
schema_registry_converter = { version = "2.0.2", default-features = false, features = ["avro", "blocking"]}

If you need both in one project you can use something like the following, but you have to be wary to import the correct paths depending on your use, as shown in the sketch after the dependency snippet.

[dependencies]
schema_registry_converter = { version = "2.0.2", features = ["avro", "blocking"]}
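
When both flavours are enabled, import aliases make it obvious which one a given piece of code uses. A small sketch: the blocking paths match the example below, while the async_impl path is an assumption for the 2.x layout, so verify it against docs.rs.

// Give the blocking converters explicit names so they cannot be confused with the async ones.
use schema_registry_converter::blocking::{Decoder as BlockingDecoder, Encoder as BlockingEncoder};
// Async counterpart; this path is assumed, check docs.rs for your version.
use schema_registry_converter::async_impl::avro::AvroEncoder as AsyncAvroEncoder;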

Example with consumer and producer using Avro

Two examples of both consuming/decoding and producing/encoding. To use structs with Avro they must implement the serde::Deserialize or serde::Serialize trait (a minimal Heartbeat struct is sketched after the example).

use rdkafka::message::{Message, BorrowedMessage};
use rdkafka::producer::FutureRecord;
use avro_rs::from_value;
use avro_rs::types::Value;
use schema_registry_converter::blocking::{Decoder, Encoder};
use schema_registry_converter::blocking::schema_registry::SubjectNameStrategy;

fn main() {
    let mut decoder = Decoder::new(String::from("http://localhost:8081"));
    let mut encoder = Encoder::new(String::from("http://localhost:8081"));
    // `msg` comes from an rdkafka consumer and `producer` is an rdkafka producer;
    // setting those up is omitted here.
    let hb = get_heartbeat(&msg, &mut decoder);
    let record = get_future_record_from_struct("hb", Some("id"), hb, &mut encoder);
    producer.send(record);
}

fn get_value<'a>(
    msg: &'a BorrowedMessage,
    decoder: &'a mut Decoder,
) -> Value {
    match decoder.decode(msg.payload()) {
        Ok(v) => v,
        Err(e) => panic!("Error getting value: {}", e),
    }
}

fn get_heartbeat<'a>(
    msg: &'a BorrowedMessage,
    decoder: &'a mut Decoder,
) -> Heartbeat{
    match decoder.decode_with_name(msg.payload()){
        Ok((name, value)) => {
            match name.name.as_str() {
                "Heartbeat" => {
                    match name.namespace{
                        Some(namespace) => {
                            match namespace.as_str(){
                                "nl.openweb.data" => from_value::<Heartbeat>(&value).unwrap(),
                                ns=> panic!("Unexpected namespace {}", ns),
                            }
                        },
                        None => panic!("No namespace in schema, while expected"),
                    }
                }
                name=> panic!("Unexpected name {}", name),
            }
        }
        Err(e) => panic!("error getting heartbeat: {}, e"),
    }
}

fn get_future_record<'a>(
    topic: &'a str,
    key: Option<&'a str>,
    values: Vec<(&'static str, Value)>,
    encoder: &'a mut Encoder,
) -> FutureRecord<'a>{
    let subject_name_strategy = SubjectNameStrategy::TopicNameStrategy(topic, false);
    let payload = match encoder.encode(values, &subject_name_strategy) {
        Ok(v) => v,
        Err(e) => panic!("Error getting payload: {}", e),
    };
    FutureRecord {
        topic,
        partition: None,
        payload: Some(&payload),
        key,
        timestamp: None,
        headers: None,
    }
}

fn get_future_record_from_struct<'a>(
    topic: &'a str,
    key: Option<&'a str>,
    heartbeat: Heartbeat,
    encoder: &'a mut Encoder,
) -> FutureRecord<'a>{
    let subject_name_strategy = SubjectNameStrategy::TopicNameStrategy(topic, false);
    let payload = match encoder.encode_struct(heartbeat, &subject_name_strategy) {
        Ok(v) => v,
        Err(e) => panic!("Error getting payload: {}", e),
    };
    FutureRecord {
        topic,
        partition: None,
        payload: Some(&payload),
        key,
        timestamp: None,
        headers: None,
    }
}
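
The example above assumes a Heartbeat struct with the serde derives in scope. A minimal sketch matching the Heartbeat schema registered in the next example (one long field named beat):

use serde::{Deserialize, Serialize};

// Mirrors {"name":"beat","type":"long"} from the nl.openweb.data.Heartbeat schema.
#[derive(Debug, Serialize, Deserialize)]
pub struct Heartbeat {
    pub beat: i64,
}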

Example of posting a schema to the schema registry

use schema_registry_converter::blocking::schema_registry::{
    post_schema,
    SuppliedSchema
};

fn main() {
    // NOTE: SchemaType also needs to be in scope; the module it lives in depends
    // on the crate version, see docs.rs.
    let schema = SuppliedSchema {
        name: String::from("nl.openweb.data.Heartbeat"),
        schema_type: SchemaType::AVRO,
        schema: String::from(r#"{"type":"record","name":"Heartbeat","namespace":"nl.openweb.data","fields":[{"name":"beat","type":"long"}]}"#),
        references: vec![],
    };
    let result = post_schema("http://localhost:8081/subjects/test-value/versions", schema);
}

Relation to related libraries

The Avro part of the conversion is handled by avro-rs, so I don't include tests for every possible schema. While I used rdkafka to successfully consume from and produce to Kafka, and while it's used in the example, this crate has no direct dependency on it. All this crate does is convert [u8] <-> some Value (depending on the converter used). With JSON and Protobuf, some other dependencies are pulled in by enabling those features. I have tried to encapsulate all the errors in the SRCError type, so even when you get a panic/error that's an SRCError, it could be an error from one of the dependencies. Please make sure you are using the library correctly, and that the error is not caused by a dependency, before creating an issue.

Tests

Because mockito, used for mocking the schema registry responses, runs in a separate thread, tests have to be run with --test-threads=1, for example: cargo +stable test --color=always --features avro,json,proto_decoder,proto_raw -- --nocapture --test-threads=1

Integration test

The integration tests require a Kafka cluster running on the default ports. They will create topics, register schemas, and produce and consume some messages. They are only included when compiled with the kafka_test feature, so to include them run cargo +stable test --all-features --color=always -- --nocapture --test-threads=1. The 'prepare_integration_test.sh' script can be used to create the 3 topics needed for the tests. To ensure Java compatibility you also need to run the schema-registry-test-app docker image.

License

This project is licensed under either of

  • Apache License, Version 2.0
  • MIT license

at your option.

Contribution

Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in Schema Registry Converter by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.

Comments
  • Failing to decode record type referencing an enum

    My schema looks like this

    [
      {
        "type": "enum",
        "name": "myflag",
        "namespace": "myns",
        "symbols": [
          "A",
          "B",
          "C"
        ]
      },
      {
        "type": "record",
        "name": "mymsg",
        "namespace": "myns",
        "fields": [
          {
            "name": "myid",
            "type": "int"
          },
          {
            "name": "c1",
            "type": "myflag"
          },
          {
            "name": "c2",
            "type": "myflag"
          }
        ]
      }
    ]
    

    Getting following error from decoder.decode function

    thread '' panicked at 'called Result::unwrap() on an Err value: SRCError { error: "Could not parse schema", side: Some("Failed to parse schema: Unknown type: myflag"), retriable: false, cached: true }', src/receiver.rs:96:23

    I would appreciate any help with this

    opened by ajayrathore 14
  • Switch to apache-avro

    Hi, so based on this: https://github.com/flavray/avro-rs/pull/99#issuecomment-1017195592, avro-rs is not really maintained anymore and the code was contributed to apache-avro, which has added support for named/recursive types (features from the avro spec that current avro-rs does not support).

    If this project could switch to the maintained and more fully featured avro library that would be awesome. Thanks.

    opened by tzachshabtay 11
  • panicked when working  with tokio async

    When using reqwest to request the schema registry with tokio async, like in my code below in the warp main thread, or when using tokio::spawn to create an async block, it will exit:

    use warp::Filter;
    use schema_registry_converter::schema_registry::{SrSettings};
    
    #[tokio::main]
    async fn main() {
         let hello = warp::path!("hello" / String)
            .map(|name| format!("Hello, {}!", name));
         let sr_settings = SrSettings::new(String::from("http://localhost:8081/"));
    
        warp::serve(hello)
            .run(([127, 0, 0, 1], 3030))
            .await;
    }
    
    
    thread 'main' panicked at 'Cannot drop a runtime in a context where blocking is not allowed. This happens when a runtime is dropped from within an asynchronous context.', ~/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-0.2.22/src/runtime/blocking/shutdown.rs:49:21
    stack backtrace:
       0:        0x1083da9ee - <std::sys_common::backtrace::_print::DisplayBacktrace as core::fmt::Display>::fmt::h72782cdbf82d2e78
       1:        0x1083fc1ac - core::fmt::write::h0f2c225c157771c1
       2:        0x1083d52d9 - std::io::Write::write_fmt::h6d219fc26cb45a24
       3:        0x1083dc865 - std::panicking::default_hook::{{closure}}::hde29d026f53869b1
       4:        0x1083dc5a2 - std::panicking::default_hook::h5de23f27de9ce8ce
       5:        0x1083dcdc5 - std::panicking::rust_panic_with_hook::h720143ee15fc80ba
       6:        0x10840664e - std::panicking::begin_panic::h80d64999a84f0366
       7:        0x108297874 - tokio::runtime::blocking::shutdown::Receiver::wait::hc15bfd5c68b76ca0
       8:        0x1082cbfc5 - tokio::runtime::blocking::pool::BlockingPool::shutdown::h702cd79db6adf80b
       9:        0x1082cc07d - <tokio::runtime::blocking::pool::BlockingPool as core::ops::drop::Drop>::drop::h0ab515030d41ffa8
      10:        0x1082d6215 - core::ptr::drop_in_place::h1df0de4e6c94e3fe
      11:        0x1082d8252 - core::ptr::drop_in_place::h94c49217ab7681ef
      12:        0x10801a738 - reqwest::blocking::wait::enter::he7b4b5e4e35343bf
      13:        0x108019bd7 - reqwest::blocking::wait::timeout::h7234a2ddbbcb0534
      14:        0x10803b703 - reqwest::blocking::client::ClientHandle::new::hd976e58a3aac69e4
      15:        0x10803ae1d - reqwest::blocking::client::ClientBuilder::build::h18af5d75937efe22
      16:        0x10803aece - reqwest::blocking::client::Client::new::h09aac586c8993277
      17:        0x107eff6ea - schema_registry_converter::schema_registry::SrSettings::new::h3b74a91814783f96
      18:        0x107d0a3f3 - my_web::main::{{closure}}::hbb912dd7df4aeb4d
      19:        0x107ca4120 - <core::future::from_generator::GenFuture<T> as core::future::future::Future>::poll::ha2815f8e7aca5dc1
      20:        0x107c8865d - tokio::runtime::enter::Enter::block_on::{{closure}}::h6b5e9b23e5c4d655
      21:        0x107c8e506 - tokio::coop::with_budget::{{closure}}::hb18b9d07d0e693a7
      22:        0x107cebdee - std::thread::local::LocalKey<T>::try_with::ha9811a7840d57628
      23:        0x107ceb4ac - std::thread::local::LocalKey<T>::with::hcb4e45ba7e69edaa
      24:        0x107c884ad - tokio::runtime::enter::Enter::block_on::h8dea6c389ed18880
      25:        0x107d11315 - tokio::runtime::thread_pool::ThreadPool::block_on::h274b9930f19cc162
      26:        0x107ca76e8 - tokio::runtime::Runtime::block_on::{{closure}}::hf60396b6d86ee761
      27:        0x107cc90b8 - tokio::runtime::context::enter::hcb90097146788eb6
      28:        0x107c653db - tokio::runtime::handle::Handle::enter::h7963a1ebbf377697
      29:        0x107ca763d - tokio::runtime::Runtime::block_on::h61ecdca8fd87f99b
      30:        0x107cb24cc - my_web::main::h372f3d2471b960b7
      31:        0x107d1171e - std::rt::lang_start::{{closure}}::h6532f1318acae0d5
      32:        0x1083dd14f - std::rt::lang_start_internal::hbbd10965adc92ae7
      33:        0x107d11701 - std::rt::lang_start::ha5adc3b371471675
      34:        0x107cb2532 - main
    
    
    opened by undeflife 10
  • Doubt regarding deserialization

    Is your feature request related to a problem? Please describe. In the decode process in async_impl/avro there is the "decode" function. This function receives the message payload and parses it, splitting the schema id from the message, where this id is the one the producer used to generate the message. The decode function uses this id to reach the schema registry and load that schema into memory. But I may be using an old DTO, given that my producer can evolve by adding new optional fields, and with this I got errors during deserialization.

    Describe the solution you'd like Be able to provide the schema version to be used to parse the message, instead of using the one used to produce it.

    Describe alternatives you've considered Update the DTO before updating the schema, however this would generate a lot of work on events with a lot of consumers.

    Additional context I did not find a better way to do it; if one exists please forgive my mistake.

    opened by PauloAOliveira 9
  • encode string as key?

    I am unable to determine how to encode a String value as the key. Ideally, the steps would be similar to:

    fn get_string_primitive_schema() -> Box<SuppliedSchema> {
        Box::from(SuppliedSchema::new(r#"{"type": "string"}"#.into()))
    }
    
    let key_strategy = SubjectNameStrategy::TopicNameStrategyWithSchema(
        "test".into(),
        true,
        cp::get_string_primitive_schema(),
    );
    let k = Value::String("hello".to_string());
    let key = self.encoder.encode(k, &key_strategy)?;
    

    Am I missing a feature in the library or is encoding a primitive value not implemented?

    opened by jonbuffington 9
  • Support additional local protobuf definitions

    The protobuf definitions in our Schema Registry contain some references to the Google common protos which are not in the registry, but rather shipped locally with our applications.

    It would be helpful if this library allowed loading additional proto definitions to be used when decoding with ProtoDecoder.

    It may be possible to do this by hand with protofish and the raw proto decoder, but it would be helpful if this were baked in.

    opened by jeffutter 8
  • multiple values encoding/decoding

    At times it is convenient to pass multiple events at once, to avoid inconsistent state if, for example, 1 out of 10 events causes some kind of error on its way.

    I guess it would be great to have some API like (encode|decode)_many which would pack events together and then unpack them successfully. avro-rs works this way actually.

    opened by verrchu 8
  • Working through error with schema_registry_converter

    Hey @gklijs, moved over to this repo as you mentioned. The code I have been testing is linked here. The error that you mentioned is correct: Could not get avro bytes, was cause by Decoding error: value does not match schema, it's retriable: false, it's cached: false', src/kafka_producer.rs:21:23. Thanks for all the help!

    opened by jaketarnow 8
  • upgrade deps including tokio 1.0

    Hey,

    I guess this relates to #41. This just bumps up a couple of dependencies to their latest versions, it appears to pass all tests. I've tried it with an application I use this with and all seems to be working as intended.

    Hopefully this helps, let me know if you'd like anything extra done on it.

    Thanks

    opened by naamancurtis 7
  • E0308 mismatched types: perhaps two different versions of a crate are being used

    Describe the bug My team ran into an issue with having schema_registry_converter and avro-rs as dependencies where if the versions don't match our build fails. I opened an issue with cargo, so you can find more information here: https://github.com/rust-lang/cargo/issues/8178

    To Reproduce Steps to reproduce the behavior:

    1. Add schema_registry_converter and avro-rs as dependencies, but set the avro-rs version to something before the latest: avro-rs = "0.7.0"
    2. Try to build.

    Notes This comment adds some more details to the situation.

    opened by jo3bingham 6
  • Some QoL improvements and fixes for live Confluent Schema Registry

    Fixes a couple issues in contacting a live Confluent Schema Registry. Also adds some minor quality-of-life improvements to some of the side-APIs.

    Fixes #2 #3 #4 #6

    opened by kitsuneninetails 6
  • Support for schema reference validation

    Is your feature request related to a problem? Please describe. I have a use case,

    1. Creating schema with references
    2. Validating the schema

    I have a user schema with "id" and "role_name", and a user_command schema which is used as part of a microservice; the user_command schema also uses the user schema as a reference.

    1. User and User Command schemas:

    pub fn user_schema() -> SuppliedReference {
        let schema_raw = r#"
        {
            "type":"object",
            "properties":{
                "id":{"type":"string"},
                "role_name":{"type":"string"}
            }
        }
        "#;
    
        SuppliedReference {
            name: String::from("com.example.user"),
            subject: String::from("com.example.user"),
            schema: String::from(schema_raw),
            references: vec![],
        }
    }
    
    pub fn user_command_schema() -> SuppliedSchema {
        SuppliedSchema { 
            name: Some(String::from("com.example.user_command")), 
            schema_type: SchemaType::Json, 
            schema: r#"
            {
                "properties":{
                    "id":{"type":"string"},
                    "name":{"type":"string"},
                    "metadata":{"type":"string"},
                    "creator_service_name":{"type":"string"},
                    "created_on":{"type":"integer"},
                    "data": {
                        "$ref": "{}"
                    } 
                }
            }"#.to_string().replace("{}", &user_schema().subject), 
            references: vec![user_schema()]
        }
    }
    
    

    2. Registering the schemas into the schema registry:

    let schema_registry_url = "localhost:9001".to_string();
    let subject= "com.example.user_command".to_string();
    let result = post_schema(
                &SrSettings::new(schema_registry_url),
                subject,
                user_command_schema(),
            )
            .await
            .unwrap();
    
    println!("result: {:?}", result);
    println!("Schema registry creation is done");
    

    Up to here there is no problem, and the schema will be registered in the schema registry.

    The problem arises when I need to validate and produce it to Kafka: how can I use my user_command_schema to validate my data before it is produced to Kafka?

    The code will fail if I try to encode the data using the schema; it fails when running this code:

    // source at: https://github.com/gklijs/schema_registry_converter/blob/master/src/async_impl/json.rs#L237
    
    fn reference_url(rr: &RegisteredReference) -> Result<Url, SRCError> {
        match Url::from_str(&*rr.name) {
            Ok(v) => Ok(v),
            Err(e) => Err(SRCError::non_retryable_with_cause(e, &*format!("reference schema with subject {} and version {} has invalid id {}, it has to be a fully qualified url", rr.subject, rr.version, rr.name)))
        }
    }
    

    Describe the solution you'd like In my JSON example I need producer.send_json to do the validation before sending the data to Kafka, and likewise, for a transactional producer/consumer, to validate the data before sending (for the producer) and after receiving (for the consumer).

    opened by arkanmgerges 5
  • Listing subjects and versions

    opened by tzachshabtay 4
Releases(v2.1.0)
  • v2.1.0(Jul 11, 2021)

    This release will focus on making the library easier to use. At least the following should be done:

    • Update the readme and maybe add some more use cases.
    • Have a reliable and faster ci, likely by moving to Github actions.
    • Implement the standard Error trait for SRCError.
    • For protobuf, have an easier way to encode with single message schemas, without requiring the full_name to be provided.
    • For async, have either an example or a nice way in the library to share the converter between multiple threads, so users don't have to think about this. See https://github.com/gklijs/ksqlDB-GraphQL-poc/blob/main/rust-data-creator/src/data_producer/kafka_producer.rs but improve on that. It would be nice not to have to depend on rdkafka though.
    • Enable supplying a reqwest client for any additional setting/modifications needed for the schema registry calls.
  • v2.0.1(Nov 10, 2020)

    Maintenance release with mainly updated dependencies, making the blocking SrSettings cloneable, and no longer requiring the kafka_test feature to use both blocking and async in the same project.

  • v2.0.0(Aug 23, 2020)

    • Add json schema support.
    • Add protobuf support.
    • Support references in schema registry.
    • Add authentication, proxies, timeouts, etc., by using reqwest instead of curl.
    • Support async/non-blocking by default
    • For Avro, make it possible to use the encode_struct function with primitive values.
  • v2.0.2(Feb 20, 2021)

Owner
Gerard Klijs
Doing mainly Java during office hours. Clojure, Rust and other stuff before and after.