A TOML encoding/decoding library for Rust




A TOML decoder and encoder for Rust, currently compliant with v0.5.0 of the TOML specification. The library will likely continue to track the specification as it changes.

# Cargo.toml
toml = "0.5"

This crate also supports serialization and deserialization through the serde crate on crates.io. The older rustc-serialize crate is not supported in the 0.3+ series of the toml crate; use the 0.2 series if you need it.


This project is licensed under either of

  • Apache License, Version 2.0 (LICENSE-APACHE or http://www.apache.org/licenses/LICENSE-2.0)
  • MIT license (LICENSE-MIT or http://opensource.org/licenses/MIT)

at your option.


Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in toml-rs by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.

  • Add a spanned value

    Add a spanned value

    Fixes #236. Fixes #95.

    Builds on #324, #328 and #333.


    • No ability to get spans of dotted tables or of arrays defined with [[key]] (inline tables and [] arrays do work). While we could probably special-case tables/arrays that are contiguous, in the general case the TOML data format allows tables/arrays to be spread over separate parts of the file. In those cases we set both the start and the end of the span to 0; this is easily detectable, and no real value would ever have such a span. User code needs to handle these cases.
    opened by est31 17
  • Issue 255: Externally tagged enum deserialization

    Issue 255: Externally tagged enum deserialization

    cc: #225, #222

    ~~The tuple variant hasn't been implemented yet; I'm having trouble figuring it out.~~ See #225 for ongoing discussion of what should and should not be supported.

    In short, the following TOML:

    plain = "Plain"
    plain_table = { Plain = {} }
    tuple = { Tuple = { 0 = 123, 1 = true } }
    struct = { Struct = { value = 123 } }
    newtype = { NewType = "value" }
    my_enum = [
        { Plain = {} },
        { Tuple = { 0 = 123, 1 = true } },
        { NewType = "value" },
    { Struct = { value = 123 } },
    ]

    with the following types:

    #[derive(Debug, Deserialize)]
    struct Config {
        plain: MyEnum,
        plain_table: MyEnum,
        tuple: MyEnum,
        #[serde(rename = "struct")]
        structv: MyEnum,
        newtype: MyEnum,
        my_enum: Vec<MyEnum>,
    }

    #[derive(Debug, Deserialize)]
    enum MyEnum {
        Plain,
        Tuple(i64, bool),
        Struct { value: i64 },
        NewType(String),
    }

    deserializes to:

    Config {
        plain: Plain,
        plain_table: Plain,
        tuple: Tuple(123, true),
        structv: Struct { value: 123 },
        newtype: NewType("value"),
        my_enum: [
            Plain,
            Tuple(123, true),
            NewType("value"),
            Struct { value: 123 },
        ],
    }
    opened by azriel91 17
  • allow configuration when encoding to strings

    allow configuration when encoding to strings

    The use case I am trying to meet is the ability to configure toml to use triple quotes (''') when useful (such as when there are newlines).

    Here is some example code:

    extern crate rustc_serialize;
    extern crate toml;

    use toml::encode_str;

    #[derive(RustcEncodable)]
    struct MyStruct { foo: isize, bar: String }

    const LONG_TEXT: &'static str = r#"
    this is some text
    here is some more
    yay
    "#;

    fn main() {
        let my_struct = MyStruct { foo: 4, bar: LONG_TEXT.to_string() };
        println!("{}", encode_str(&my_struct));
    }

    This currently prints:

    bar = "\nthis is some text\nhere is some more\nyay\n"
    foo = 4

    I would like for it to be able to print:

    bar = '''
    this is some text
    here is some more
    yay
    '''
    foo = 4
    opened by vitiral 17
  • New Maintainers

    New Maintainers

    I personally no longer have the time or energy to maintain this crate. I believe it is still widely used as the default TOML implementation for Rust, however, so it would be good to keep it maintained. I'm opening this to see if anyone is willing to take over ownership and maintenance of this crate. In the meantime I will not be maintaining it, except if a security issue arises.

    opened by alexcrichton 16
  • demonstrate triple quote problem

    demonstrate triple quote problem

    This is to demonstrate the triple quote problem I theorized: that if there are ''' in a pretty string there will be issues. However, the issue I hit was not the one I expected.

    The written test prints the following:

    text = '''
    this is the first line
    This has a ''\' in it for no reason
    this is the third line
    '''

    text = '''
    this is the first line
    This has a ''\\' in it for no reason
    this is the third line
    '''

    It seems that the round trip is doubling the number of \ it finds?

    opened by vitiral 15
  • Implement round-trippable, editable TOML parsing

    Implement round-trippable, editable TOML parsing

    This is a very rough implementation of a round-tripping TOML parser. It correctly round-trips the entire TOML test suite (except inline arrays). It's very unsuitable for merging at the moment (broken formatting, random layout of modules, unimplemented inline arrays, no public API, etc.). Since it's a very invasive change that makes the internals more complicated (for no real gain if you just want to serialize and deserialize), my question is: should I work on getting it into toml-rs, or move this work to a fork (with just an API for load-edit-save of TOML documents, leaving attribute-based serializing/deserializing here)?

    opened by vosen 15
  • Support emitting spans when deserializing structures

    Support emitting spans when deserializing structures


    I have a use-case where I perform higher-level diagnostics on top of existing structures, and in order to provide nice errors to the user I need to have access to the span information where a given structure was deserialized from.

    Following is a somewhat naive proposal, since I'm not too familiar with the low-level details of how to implement a deserializer for serde.

    One option could be to include spans in the tokens returned by the tokenizer, then introduce a generic wrapper (e.g. Spanned<T>) that includes the spans when de-serializing so that they can be used in later errors.

    struct Span {
      /// Byte offset where the span starts.
      pub start: usize,
      /// Byte offset where the span ends.
      pub end: usize,
    }

    struct Spanned<T> {
      pub span: Span,
      pub value: T,
    }

    pub struct Foo {
      bar: Spanned<String>,
    }

    For this to work with toml::Value, the spans would have to be included there, which might be ruled out for backwards-compatibility reasons. In that case a new set of structures (e.g. toml::SpannedValue) could be introduced.

    enhancement A-parsing 
    opened by udoprog 14
  • ValueAfterTable error

    ValueAfterTable error

    I created this website using toml-rs and wasm to convert json to toml: https://pseitz.github.io/toml-to-json-online-converter/

    The code is pretty straightforward, but sometimes I get the ValueAfterTable error. I have no idea how to fix this, since the input is generic. (Reorder the serde_json::Value?)

    pub fn convert_json_to_toml(input: &str) -> Result<String, JsValue> {
        let mut toml = String::new();
        let mut deserializer = Deserializer::from_str(input);
        let mut serializer = Serializer::pretty(&mut toml);
        serde_transcode::transcode(&mut deserializer, &mut serializer)
            .map_err(|err| JsValue::from_str(&err.to_string()))?;
        Ok(toml)
    }

    JSON Input: { "kana": {"ent_seq": "1000000"}, "kanji": [] }

    opened by PSeitz 13
  • Expose part of de::ErrorInner publicly

    Expose part of de::ErrorInner publicly

    See #360.

    The second commit exposes de::ErrorKind publicly; if this is not desired I can remove it. It should be fine to expose, since it is already made effectively non-exhaustive via a #[doc(hidden)] variant.

    opened by Luthaf 12
  • Update to v1.0.0-rc.1

    Update to v1.0.0-rc.1

    On April 1 a new version of TOML was published: https://github.com/toml-lang/toml/blob/master/CHANGELOG.md#100-rc1--2020-04-01

    The changelog mentions the following changes:

    • [x] Clarify in ABNF how quotes in multi-line basic and multi-line literal strings are allowed to be used. (PR in #393)
    • [x] Leading zeroes in exponent parts of floats are permitted.
    • [x] Clarify that control characters are not permitted in comments.
    • [ ] Clarify behavior of tables defined implicitly by dotted keys.
      • [x] spec text:

        Dotted keys define everything to the left of each dot as a table. Since tables cannot be defined more than once, redefining such tables using a [table] header is not allowed. Likewise, using dotted keys to redefine tables already defined in [table] form is not allowed.

      • [ ] spec text (example file):

        The [table] form can, however, be used to define sub-tables within tables defined via dotted keys.

    • [x] Clarify that inline tables are immutable.
    • [x] Clarify that trailing commas are not allowed in inline tables.
    • [x] Clarify in ABNF that UTF-16 surrogate code points (U+D800 - U+DFFF) are not allowed in strings or comments. (ensured by Rust)
    • [x] Allow raw tab characters in basic strings and multi-line basic strings.
    • [x] Allow heterogeneous values in arrays. (was changed in #358)

    TOML is getting official compliance tests. The repository does not contain any actual tests yet, only a script intended to eventually run them. I think the best way to reach v1.0 and avoid bugs is to use the test suite to check both deserialization and serialization.

    While this crate already contains a test runner for an almost identical format, it may be good to interface with the default Python test runner. If many implementations use the default runner, it becomes easy to monitor the correctness of all of them.

    Additionally this is a good opportunity to split the Datetime as suggested in https://github.com/alexcrichton/toml-rs/issues/159#issuecomment-463732912 by @quadrupleslap. The TOML spec and tests make it very clear that these are indeed distinct types. I think it will be easier to use those values if one can be sure that a table contains the correct kind of date or time and does not have to check the value.

    The four new types each have a corresponding chrono type:

    • DateTime → chrono::DateTime
    • LocalDateTime → chrono::NaiveDateTime
    • LocalDate → chrono::NaiveDate
    • LocalTime → chrono::NaiveTime

    The TOML types can be converted to chrono types with TryFrom; an error is returned if the date does not exist (e.g. 2019-02-29). In the other direction, From is used, as no error can occur.

    • [ ] Split Datetime
    • [ ] Implement conversions to chrono types
    • [ ] Implement conversions from chrono types
    opened by pyfisch 11
  • Large TOML document performance

    Large TOML document performance

    I'm attempting to read a large-ish TOML document like so:

    println!("Reading toml size {}", s.len());
    let t = toml::from_str(s).unwrap();

    where the str size is around 4MB (this is an auto-generated TOML, obviously). The file is essentially a lot of very small tables in the following format:

    name = "Foo"
    url = "http://example.com"
    tags = ["tag-name"]

    The loading time is unreasonable: the CPU spins up to 100% and it runs for over a minute before I kill the process.

    Am I wrong to expect this library to be able to handle files that big?

    opened by yuvadm 11
  • 0.5.9(Apr 14, 2022)

  • 0.5.8(Dec 18, 2020)

  • 0.5.0(Mar 15, 2019)

    • Add preserve_order Cargo feature. This retains the order of map keys in toml::Value. (#278)
    • Fix issue #279 where some duplicate table headers were accepted. Added Deserializer::set_allow_duplicate_after_longer_table for anyone who wants to retain the old, broken behavior. (#280)
    • Fix case sensitivity with T, Z, and E. (#290)
    • Add PartialEq to de::Error. (#292)
    Source code(tar.gz)
    Source code(zip)
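To opt into the preserve_order feature mentioned in the 0.5.0 notes above, enable it in Cargo.toml:

```toml
[dependencies]
toml = { version = "0.5", features = ["preserve_order"] }
```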
Alex Crichton