Strongly typed JSON library for Rust

Overview

Serde JSON (minimum supported Rust version: 1.31)

Serde is a framework for serializing and deserializing Rust data structures efficiently and generically.


[dependencies]
serde_json = "1.0"


JSON is a ubiquitous open-standard format that uses human-readable text to transmit data objects consisting of key-value pairs.

{
    "name": "John Doe",
    "age": 43,
    "address": {
        "street": "10 Downing Street",
        "city": "London"
    },
    "phones": [
        "+44 1234567",
        "+44 2345678"
    ]
}

There are three common ways that you might find yourself needing to work with JSON data in Rust.

  • As text data. An unprocessed string of JSON data that you receive on an HTTP endpoint, read from a file, or prepare to send to a remote server.
  • As an untyped or loosely typed representation. Maybe you want to check that some JSON data is valid before passing it on, but without knowing the structure of what it contains. Or you want to do very basic manipulations like insert a key in a particular spot.
  • As a strongly typed Rust data structure. When you expect all or most of your data to conform to a particular structure and want to get real work done without JSON's loosey-goosey nature tripping you up.

Serde JSON provides efficient, flexible, safe ways of converting data between each of these representations.

Operating on untyped JSON values

Any valid JSON data can be manipulated in the following recursive enum representation. This data structure is serde_json::Value.

enum Value {
    Null,
    Bool(bool),
    Number(Number),
    String(String),
    Array(Vec<Value>),
    Object(Map<String, Value>),
}

A string of JSON data can be parsed into a serde_json::Value by the serde_json::from_str function. There is also from_slice for parsing from a byte slice &[u8] and from_reader for parsing from any io::Read like a File or a TCP stream.

use serde_json::{Result, Value};

fn untyped_example() -> Result<()> {
    // Some JSON input data as a &str. Maybe this comes from the user.
    let data = r#"
        {
            "name": "John Doe",
            "age": 43,
            "phones": [
                "+44 1234567",
                "+44 2345678"
            ]
        }"#;

    // Parse the string of data into serde_json::Value.
    let v: Value = serde_json::from_str(data)?;

    // Access parts of the data by indexing with square brackets.
    println!("Please call {} at the number {}", v["name"], v["phones"][0]);

    Ok(())
}

The result of square bracket indexing like v["name"] is a borrow of the data at that index, so the type is &Value. A JSON map can be indexed with string keys, while a JSON array can be indexed with integer keys. If the type of the data is not right for the type with which it is being indexed, or if a map does not contain the key being indexed, or if the index into a vector is out of bounds, the returned element is Value::Null.

When a Value is printed, it is printed as a JSON string. So in the code above, the output looks like Please call "John Doe" at the number "+44 1234567". The quotation marks appear because v["name"] is a &Value containing a JSON string and its JSON representation is "John Doe". Printing as a plain string without quotation marks involves converting from a JSON string to a Rust string with as_str() or avoiding the use of Value as described in the following section.

The Value representation is sufficient for very basic tasks but can be tedious to work with for anything more significant. Error handling is verbose to implement correctly; for example, imagine trying to detect the presence of unrecognized fields in the input data. The compiler is powerless to help you when you make a mistake; for example, imagine typoing v["name"] as v["nmae"] in one of the dozens of places it is used in your code.

Parsing JSON as strongly typed data structures

Serde provides a powerful way of mapping JSON data into Rust data structures largely automatically.

use serde::{Deserialize, Serialize};
use serde_json::Result;

#[derive(Serialize, Deserialize)]
struct Person {
    name: String,
    age: u8,
    phones: Vec<String>,
}

fn typed_example() -> Result<()> {
    // Some JSON input data as a &str. Maybe this comes from the user.
    let data = r#"
        {
            "name": "John Doe",
            "age": 43,
            "phones": [
                "+44 1234567",
                "+44 2345678"
            ]
        }"#;

    // Parse the string of data into a Person object. This is exactly the
    // same function as the one that produced serde_json::Value above, but
    // now we are asking it for a Person as output.
    let p: Person = serde_json::from_str(data)?;

    // Do things just like with any other Rust data structure.
    println!("Please call {} at the number {}", p.name, p.phones[0]);

    Ok(())
}

This is the same serde_json::from_str function as before, but this time we assign the return value to a variable of type Person so Serde will automatically interpret the input data as a Person and produce informative error messages if the layout does not conform to what a Person is expected to look like.

Any type that implements Serde's Deserialize trait can be deserialized this way. This includes built-in Rust standard library types like Vec<T> and HashMap<K, V>, as well as any structs or enums annotated with #[derive(Deserialize)].

Once we have p of type Person, our IDE and the Rust compiler can help us use it correctly like they do for any other Rust code. The IDE can autocomplete field names to prevent typos, which was impossible in the serde_json::Value representation. And the Rust compiler can check that when we write p.phones[0], then p.phones is guaranteed to be a Vec<String> so indexing into it makes sense and produces a String.

The necessary setup for using Serde's derive macros is explained on the Using derive page of the Serde site.

Constructing JSON values

Serde JSON provides a json! macro to build serde_json::Value objects with very natural JSON syntax.

use serde_json::json;

fn main() {
    // The type of `john` is `serde_json::Value`
    let john = json!({
        "name": "John Doe",
        "age": 43,
        "phones": [
            "+44 1234567",
            "+44 2345678"
        ]
    });

    println!("first phone number: {}", john["phones"][0]);

    // Convert to a string of JSON and print it out
    println!("{}", john.to_string());
}

The Value::to_string() function converts a serde_json::Value into a String of JSON text.

One neat thing about the json! macro is that variables and expressions can be interpolated directly into the JSON value as you are building it. Serde will check at compile time that the value you are interpolating is able to be represented as JSON.

let full_name = "John Doe";
let age_last_year = 42;

// The type of `john` is `serde_json::Value`
let john = json!({
    "name": full_name,
    "age": age_last_year + 1,
    "phones": [
        format!("+44 {}", random_phone())
    ]
});

This is amazingly convenient, but we have the same problem we had before with Value: the IDE and Rust compiler cannot help us if we get it wrong. Serde JSON provides a better way of serializing strongly-typed data structures into JSON text.

Creating JSON by serializing data structures

A data structure can be converted to a JSON string by serde_json::to_string. There is also serde_json::to_vec which serializes to a Vec<u8> and serde_json::to_writer which serializes to any io::Write such as a File or a TCP stream.

use serde::{Deserialize, Serialize};
use serde_json::Result;

#[derive(Serialize, Deserialize)]
struct Address {
    street: String,
    city: String,
}

fn print_an_address() -> Result<()> {
    // Some data structure.
    let address = Address {
        street: "10 Downing Street".to_owned(),
        city: "London".to_owned(),
    };

    // Serialize it to a JSON string.
    let j = serde_json::to_string(&address)?;

    // Print, write to a file, or send to an HTTP server.
    println!("{}", j);

    Ok(())
}

Any type that implements Serde's Serialize trait can be serialized this way. This includes built-in Rust standard library types like Vec<T> and HashMap<K, V>, as well as any structs or enums annotated with #[derive(Serialize)].

Performance

It is fast. You should expect in the ballpark of 500 to 1000 megabytes per second deserialization and 600 to 900 megabytes per second serialization, depending on the characteristics of your data. This is competitive with the fastest C and C++ JSON libraries or even 30% faster for many use cases. Benchmarks live in the serde-rs/json-benchmark repo.

Getting help

Serde is one of the most widely used Rust libraries, so any place that Rustaceans congregate will be able to help you out. For chat, consider trying the #general or #beginners channels of the unofficial community Discord, the #rust-usage channel of the official Rust Project Discord, or the #general stream in Zulip. For asynchronous help, consider the [rust] tag on StackOverflow, the /r/rust subreddit which has a pinned weekly easy questions post, or the Rust Discourse forum. It's acceptable to file a support issue in this repo, but such issues tend not to get as many eyes as any of the above and may get closed without a response after some time.

No-std support

As long as there is a memory allocator, it is possible to use serde_json without the rest of the Rust standard library. This is supported on Rust 1.36+. Disable the default "std" feature and enable the "alloc" feature:

[dependencies]
serde_json = { version = "1.0", default-features = false, features = ["alloc"] }

For JSON support in Serde without a memory allocator, please see the serde-json-core crate.


License

Licensed under either of Apache License, Version 2.0 or MIT license at your option.
Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in this crate by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.
Comments
  • Perfect accuracy float parsing

    Perfect accuracy float parsing

    Float parsing is currently implemented by calculating the significand as u64, casting to f64, and then multiplying or dividing by a nonnegative power of 10. For example the input 10.9876543210987654 would be parsed into the value 109876543210987654_u64 as f64 / 10e15.

    This algorithm is sometimes correct, or else usually close to correct in practical usage. It matches how JSON parsers are implemented in other languages.

    However, it can happen that the result from this algorithm is not the mathematically nearest 64-bit floating point value to the exact value of the input. A "correct" algorithm would always produce the mathematically nearest answer. This requires high precision big-integer arithmetic in the general case so there would be a large performance cost; if implemented, we would likely want this behind a cfg that is off by default, with the current approximate behavior as default. This way programs can opt in to the more expensive algorithm as required.

    fn main() {
        let input = "10.9876543210987654";
        let n: f64 = serde_json::from_str(input).unwrap();
    
        // produces 10.9876543210987644982878919108770787715911865234375
        // which is low by 9.017e-16
        let current_algorithm = 109876543210987654_u64 as f64 / 10e15;
        println!("{}", precise::to_string(current_algorithm));
        assert_eq!(n, current_algorithm);
    
        // produces 10.98765432109876627464473131112754344940185546875
        // which is high by 8.746e-16 (closer)
        let correct_answer = 10.9876543210987654_f64;
        println!("{}", precise::to_string(correct_answer));
        assert_ne!(n, correct_answer);
    }
    
    opened by dtolnay 50
  • Allow increasing recursion limit

    Allow increasing recursion limit

    I am parsing/serializing pretty large JSON files and I regularly encounter RecursionLimitExceeded. I need a way to instantiate a Serializer/Deserializer with a much larger recursion limit.

    Could we introduce code to let us tweak that?

    enhancement 
    opened by Yoric 23
  • Parser cannot read arbitrary precision numbers

    Parser cannot read arbitrary precision numbers

    http://www.ecma-international.org/publications/files/ECMA-ST/ECMA-404.pdf specifies in the second paragraph of the introduction that JSON is agnostic about numbers, and simply represents them as a series of digits.

    However, serde-json parses anything with a decimal point as a Rust f64, which causes numbers to be read incorrectly. There is no way to avoid this because this behaviour is chosen as soon as a decimal point is encountered. This makes it impossible to use serde-json to interoperate with financial software using JSON.

    enhancement 
    opened by apoelstra 21
  • Add a RawValue type

    Add a RawValue type

    It would be helpful to have a type similar to Go's json.RawMessage that is not tokenized during deserialization, but rather its raw contents stored as a Vec<u8> or &'de [u8].

    The following pseudo-code demonstrates the idea.

    #[derive(Deserialize)]
    struct Struct {
        /// Deserialized normally.
        core_data: Vec<i32>,
    
        /// Contents of `user_data` are copied / borrowed directly from the input
        /// with no modification.
        ///
        /// `RawValue<'static>` is akin to `Vec<u8>`.
        /// `RawValue<'a>` is akin to `&'a [u8]`.
        user_data: serde_json::RawValue<'static>,
    }
    
    fn main() {
        let json = r#"
        {
            "core_data": [1, 2, 3],
            "user_data": { "foo": {}, "bar": 123, "baz": "abc" }
        }
        "#;
        
        let s: Struct = serde_json::from_bytes(&json).unwrap();
        println!("{}", s.user_data); // "{ \"foo\": {}, \"bar\": 123, \"baz\": \"abc\" }"
    }
    

    The main advantage of this is would be to have 'lazily-deserialized' values.

    enhancement 
    opened by alteous 17
  • Output JSON schema during build process

    Output JSON schema during build process

    It would be great if Serde could optionally produce a JSON schema as a side-effect of the build process. AFAIK it has all the information it needs to write one. You just need to translate the structs/enums to their appropriate schema representations (read: matching JSON type).

    Additional:

    While the above is an awesome starting block, it would also be really nice if you could compile-time check that Serde's JSON will validate against an externally provided schema. This isn't totally necessary, as you could do this after the fact with a tool like ajv. It would just provide stronger guarantees if it was compile-time checked.

    Motivation

    • Compatibility: Presently there is no way to guarantee that JSON produced by Serde is compatible with another framework. We can only write tests against JSON samples and write code to match an API spec. We have no way of knowing if either of them is up-to-date or correct.
    • Extendability: Having a portable artifact of your data-representation is an enormously useful tool. In many dynamic languages, you can auto generate data bindings and UIs provided a schema. This allows devs to quickly develop across platforms and languages while maintaining integrity of their data.

    Anticipated Questions:

    • Why Serde? - Serde already has all of the user-facing machinery necessary to produce a schema. Using attributes and types already in the user's code makes adding this feature "free", even for existing libraries.
    • Why at compile-time? - Validating against a schema at compile-time enables devs to "Hack without fear", because they will know that they are properly encoding their data types. It allows devs to easily update their code and immediately know if their schema/data-bindings are out of date.
    enhancement 
    opened by lylemoffitt 16
  • Consider serializing map integer keys as strings

    Consider serializing map integer keys as strings

    Right now serde_json rejects serializing maps with integer keys because semantically speaking, JSON only supports maps with string keys. There are workarounds in serde 0.7 with the new #[serde(serialize_with="...", deserialize_with="...)] (see this gist), but it still can be annoying if this keeps causing problems.

    Is there any real value in erroring out on non-string keys?

    enhancement 
    opened by erickt 16
  • Round trip floats

    Round trip floats

    Ideally this test would pass.

    extern crate serde_json;
    
    #[macro_use]
    extern crate quickcheck;
    
    quickcheck! {
        fn floats(n: f64) -> bool {
            let j = serde_json::to_string(&n).unwrap();
            serde_json::from_str::<f64>(&j).unwrap() == n
        }
    }
    

    On the printing side grisu2 guarantees that the f64 closest to the string representation is identical to the original input, so the inaccuracy is somewhere on the parsing side.

    bug 
    opened by dtolnay 15
  • Arbitrary precision numbers

    Arbitrary precision numbers

    Fixes #18.

    serde_json = { version = "0.9", features = ["arbitrary_precision"] }
    
    #[derive(Serialize, Deserialize)]
    struct S {
        n: serde_json::Number,
        v: serde_json::Value,
    }
    
    let s: S = serde_json::from_str(...)?;
    
    // full precision
    println!("{}", s.n);
    println!("{}", s.v);
    println!("{}", serde_json::to_string(&s)?);
    
    do not merge 
    opened by dtolnay 14
  • Having issues iterating through an serde_json::Value

    Having issues iterating through an serde_json::Value

    Sorry to post this on the issues forum, but I cannot find a clean way to do this. I am new to Rust and would like to iterate through this data structure (shown below as printed with println!("{:?}\n\n", v)):

    Array([Object({"24h_volume_usd": String("7555680000.0"), "available_supply": String("16896337.0"), "id": String("bitcoin"), "last_updated": String("1520039066"), "market_cap_usd": String("187946404620"), "max_supply": String("21000000.0"), "name": String("Bitcoin"), "percent_change_1h": String("0.46"), "percent_change_24h": String("0.45"), "percent_change_7d": String("7.4"), "price_btc": String("1.0"), "price_usd": String("11123.5"), "rank": String("1"), "symbol": String("BTC"), "total_supply": String("16896337.0")}), Object({"24h_volume_usd": String("1859040000.0"), "available_supply": String("97947252.0"), "id": String("ethereum"), "last_updated": String("1520039051"), "market_cap_usd": String("84237477378.0"), "max_supply": Null, "name": String("Ethereum"), "percent_change_1h": String("0.3"), "percent_change_24h": String("-1.49"), "percent_change_7d": String("-0.41"), "price_btc": String("0.0775658"), "price_usd": String("860.029"), "rank": String("2"), "symbol": String("ETH"), "total_supply": String("97947252.0")}), Object({"24h_volume_usd": String("276913000.0"), "available_supply": String("39091956706.0"), "id": String("ripple"), "last_updated": String("1520039041"), "market_cap_usd": String("36054628946.0"), "max_supply": String("100000000000"), "name": String("Ripple"), "percent_change_1h": String("1.39"), "percent_change_24h": String("-0.6"), "percent_change_7d": String("-7.81"), "price_btc": String("0.00008318"), "price_usd": String("0.922303"), "rank": String("3"), "symbol": String("XRP"), "total_supply": String("99992520283.0")}), Object({"24h_volume_usd": String("417415000.0"), "available_supply": String("16996550.0"), "id": String("bitcoin-cash"), "last_updated": String("1520039053"), "market_cap_usd": String("21658363734.0"), "max_supply": String("21000000.0"), "name": String("Bitcoin Cash"), "percent_change_1h": String("-0.15"), "percent_change_24h": String("-1.56"), "percent_change_7d": String("-0.24"), "price_btc": String("0.114927"), 
"price_usd": String("1274.28"), "rank": String("4"), "symbol": String("BCH"), "total_supply": String("16996550.0")})])
    
    
    extern crate term_painter;
    extern crate reqwest;
    
    
    extern crate serde_json;
    // JSON Parsing and Construction
    // https://github.com/serde-rs/json
    use serde_json::{Value};
    
    mod getcointicker;
    
    //use std::io;
    
    //https://lukaskalbertodt.github.io/term-painter/term_painter/
    use term_painter::ToStyle;
    use term_painter::Color::*;
    //use term_painter::Attr::*;
    use getcointicker::coinprices;
    
    fn main() {
        println!("rusty{}, ",
            Yellow.paint("Horde"),
        );
        let cp = match coinprices(4) {
            Result::Ok(val) => {val},
            Result::Err(err) => {format!("Unable to get coin prices: {}",err)}
        };
        //println!("{}",cp);  
        let v: Value = match serde_json::from_str(&cp){
            Result::Ok(val) => {val},
            Result::Err(err) => {panic!("Unable to parse json: {}",err)}
        };
        
        //what is the proper way to iterate through all the values in the array
        for i in &v.into_iter() {
            println!("{:?}\n\n",v[i]);
        }
        println!("{:?}\n\n",v[0]); //this works no problem
            
        println!("{}",v[0]["market_cap_usd"]);  
    }
    
    

    I have tried several ways...the latest was the following:

    error[E0599]: no method named `into_iter` found for type `serde_json::Value` in the current scope
      --> src/main.rs:34:16
       |
    34 |     for i in v.into_iter() {
       |                ^^^^^^^^^
       |
       = note: the method `into_iter` exists but the following trait bounds were not satisfied:
               `serde_json::Value : std::iter::IntoIterator`
               `&serde_json::Value : std::iter::IntoIterator`
               `&mut serde_json::Value : std::iter::IntoIterator`
    
    error: aborting due to previous error
    
    error: Could not compile `rustyhorde`.
    

    Thank you so much for your help!

    support 
    opened by hortinstein 13
  • where did StreamDeserializer::new go?

    where did StreamDeserializer::new go?

    I'm upgrading from 0.8 to 0.9 and noticed there was an undocumented breaking change for StreamDeserializer. I can no longer construct an instance for a type that implements Deserialize. It looks like it's now behind into_iter on the Deserializer for types that implement Read. Please advise.

    docs 
    opened by softprops 13
  • Implement issue #68 - add pointer_mut

    Implement issue #68 - add pointer_mut

    I was thinking that JSON Patch support would be useful, and saw the Issues for that and for pointer_mut support. #68 pointer_mut seemed necessary for implementing Patch, so I took a swing at it! This is my first open-source Rust PR so criticism is welcome but be gentle, haha!

    I moved pointer into a new trait (Pointer) and added pointer_mut to it as well. To maintain symmetry I also added a pointer_owned method that consumes the Value and returns an owned Value. Because of the new trait this is a breaking change (you need to import the trait for pointer to work now.)

    Unit tests were added and all pass, documentation was added but is mostly just copy-paste from the existing code. I also wrote a benchmark to verify that the changes I made to pointer weren't slower than the existing code, but left out the change to the .rs.in file I made to run the bench. On my machine, the new and old pointer methods were very close in performance. The new one actually seems slightly faster but they were within deviation range of each other so I'm making no claims except that my refactors shouldn't have negatively affected the method.

    opened by CryptArchy 13
  • How to ignore the "$schema" property?

    How to ignore the "$schema" property?

    JSON Schema defines a "$schema" property.

    As defined here: https://json-schema.org/understanding-json-schema/reference/schema.html?highlight=schema#id4

    How can I get serde-json to ignore that field? (or map it to a string which I can ignore)

    opened by wcarmon 0
  • I need a little help

    I need a little help

    Hello, I want to make a JSON document like this:

    [
        {
            "name": "name_01",
            "value": "funkyname"
        },
        {
            "name": "name_02",
            "value": "notsofunkyname"
        }
    ]
    

    My database code iterates over an SQL query and produces a name and a value per row, e.g. name = name_01, value = funkyname.

    The resulting JSON must be addressable as json[0]["name"], json[0]["value"], json[1]["name"], json[1]["value"], and so on.

    Is there an easy way to do this, e.g. by building a mutable array?

    thx

    opened by cyberpunkbln 0
  • Skipping fields in serde_json::Value whose value is None.

    Skipping fields in serde_json::Value whose value is None.

    Hi all, I am quite new to Rust and I am unable to figure out how to omit fields from the serialized representation of a serde_json::Value. I know we can provide #[serde(skip_serializing_if = "Option::is_none")] on struct fields, but unfortunately this is a Value, not my struct. I have tried implementing a custom Serializer, but I am quite lost on how to deal with nested structures.

    This is what I have so far

    #[derive(Deserialize, Debug)]
    pub struct CustomValue(Value);
    
    impl Deref for CustomValue {
        type Target = Value;
    
        fn deref(&self) -> &Self::Target {
            &self.0
        }
    }
    
    impl Serialize for CustomValue {
        fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
        where
            S: serde::Serializer,
        {
            // let mut seq = serializer.serialize_map(Some(2))?;
            match &self.0 {
                Value::Null => serializer.serialize_none(),
                Value::Bool(b) => serializer.serialize_bool(*b),
                Value::Number(n) => n.serialize(serializer),
                Value::String(s) => serializer.serialize_str(s),
                Value::Array(v) => v.serialize(serializer),
                Value::Object(m) => {
                    let mut seq = serializer.serialize_map(Some(m.len()))?;
                    for (k, v) in m {
                        if !v.is_null() {
                            seq.serialize_entry(k, v)?;
                        }
                    }
                    seq.end()
                }
            }
        }
    }
    

    But this only removes None keys from top level fields. In theory, I know that we'll have to do a recursive approach here, but in practice I don't know how to implement this in Rust. I can't figure out what types to use etc.

    Any help is very appreciated, thank you.

    opened by vamshiaruru-virgodesigns 1
  • Found multiple `impl`s satisfying `u8: PartialEq<_>`

    Found multiple `impl`s satisfying `u8: PartialEq<_>`

    I'm currently writing a framework with some sorting algorithms, I wrote the following interface:

    pub fn merge_sort_<T: Ord + Copy>(arr: &mut [T]) {
        let len = arr.len();
        if len > 1 {
            _merge_sort_(arr, 0, len - 1);
        }
    }
    
    #[test]
    fn empty() {
        let mut res = Vec::<u8>::new();
        merge_sort_(&mut res);
        assert_eq!(res, Vec::new());
    }
    

    However, when adding the serde_json crate dependency:

    serde = { version = "1.0", features = ["derive"] }
    serde_json = "1.0"
    

    The compiler complains with my test code:

    error[E0283]: type annotations needed
       --> src/algos/mutable/sort.rs:145:29
        |
    145 |             assert_eq!(res, Vec::new());
        |             ----------------^^^^^^^^---
        |             |               |
        |             |               cannot infer type of the type parameter `T` declared on the struct `Vec`
        |             type must be known at this point
        |
        = note: multiple `impl`s satisfying `u8: PartialEq<_>` found in the following crates: `core`, `serde_json`:
                - impl PartialEq for u8;
                - impl PartialEq<Value> for u8;
        = note: required for `Vec<u8>` to implement `PartialEq<Vec<_>>`
    help: consider specifying the generic argument
        |
    145 |             assert_eq!(res, Vec::<T>::new());
        |                                +++++
    
    Some errors have detailed explanations: E0282, E0283.
    For more information about an error, try `rustc --explain E0282`.
    

    What should I do to avoid this? Is adding type annotation the only way to solve this?

    opened by ireina7 0
  • `json!` maps should preallocate

    `json!` maps should preallocate

    The map constructor for json! currently uses Map::new; it should prefer Map::with_capacity instead, so that the storage is preallocated. This will require a slight refactor of how it works, such that all the keys and values are collected before any code is generated (currently, map.insert statements are generated eagerly during tt munching).

    opened by Lucretiel 0
  • Implement `TryInto` for `Value`

    Implement `TryInto` for `Value`

    Fixes #902

    This PR implements TryInto for converting Value into common types such as String, str, u64, f64, i64, (), Vec and Map. I've implemented these both for owned Value and borrowed Value.

    Justification and Use Case

    I know there's been some skepticism from the maintainer about the need for TryInto impls when one can use the full deserialization machinery to retrieve a concrete typed value from a Value, but I'd like to justify this PR by pointing to both the ergonomics of having TryInto and the efficiency of TryInto over full deserialization (especially when borrowing).

    For my use-case, I have a struct that delays fully deserializing all values until much later in the program.

    struct Job {
        pub name: String,
        pub inputs: HashMap<String, Value>
    }
    

    Much deeper in the code, inputs get wrapped into a container like so:

    pub struct Inputs {
        inner: HashMap<String, Value>
    }
    
    impl Inputs {
    
        pub fn get_required<'a, T: ?Sized>(&'a self, key: &str) -> Result<&'a T, Error>
            where &'a Value: TryInto<&'a T>
        {
            match self.inner.get(key) {
                Some(value) => {
                    value.try_into()?
                },
                None => Err(Error)
            }
        }
    
        pub fn get_with_default<'a, 'default: 'a, T: ?Sized>(&'a self, key: &str, default: &'default T) -> Result<&'a T, Error> 
            where &'a Value: TryInto<&'a T> 
        {
            match self.inner.get(key) {
                Some(value) => {
                    value.try_into()?
                },
                None => {
                    Ok(default)
                }
            }
        }
    }
    
    

    All of this machinery provides a really nice interface to calling code, that can simply call into Inputs like so:

    let foo: &str = inputs.get_required("foo")?;
    let bar: &str = inputs.get_with_default("bar", "baz")?;
    

    And importantly, because inputs can be quite large, everything is done via borrows, avoiding unnecessary cloning and allocations.

    This is my specific use-case, but I'm sure there are other use-cases where having reasonable and straightforward TryInto impls for Value would be ergonomic and timesaving.

    Thank you.

    opened by phayes 0
Releases: v1.0.91