A performant binary encoding for geographic data based on flatbuffers

Overview

A performant binary encoding for geographic data based on flatbuffers that can hold a collection of Simple Features including circular interpolations as defined by SQL-MM Part 3.

Inspired by geobuf and flatbush. Deliberately does not support random writes for simplicity and to be able to cluster the data on a packed Hilbert R-Tree enabling fast bounding box spatial filtering. The spatial index is optional to allow the format to be efficiently written as a stream and for use cases where spatial filtering is not needed.
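The Hilbert ordering can be illustrated with a small sketch. This is not FlatGeobuf's own code; it is the classic iterative algorithm for the distance of a grid cell along a Hilbert curve, which is the kind of ordering used to sort features (by bounding-box center) before packing an R-tree:

```javascript
// Illustrative sketch, not FlatGeobuf's implementation: distance along
// a Hilbert curve of the given order for grid cell (x, y).
function hilbertDistance(order, x, y) {
  let d = 0;
  for (let s = (1 << order) >> 1; s > 0; s >>= 1) {
    const rx = (x & s) > 0 ? 1 : 0;
    const ry = (y & s) > 0 ? 1 : 0;
    d += s * s * ((3 * rx) ^ ry);
    // Rotate the quadrant so consecutive cells stay adjacent.
    if (ry === 0) {
      if (rx === 1) {
        x = s - 1 - x;
        y = s - 1 - y;
      }
      [x, y] = [y, x];
    }
  }
  return d;
}

// The order-1 curve visits the four cells in a "U" shape:
console.log([[0, 0], [0, 1], [1, 1], [1, 0]].map(([x, y]) => hilbertDistance(1, x, y)));
// → [ 0, 1, 2, 3 ]
```

Sorting features by such a value clusters spatially nearby features together in the file, which is what makes the packed index effective for bounding-box queries.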

The goals are to be suitable for large volumes of static data, to be significantly faster than legacy formats, to have no size limitations for contents or metainformation, and to be suitable for streaming/random access.

The site switchfromshapefile.org has more in-depth information about the problems of legacy formats and provides some alternatives, but acknowledges that the current alternatives have some drawbacks of their own, for example that they are not suitable for streaming.

FlatGeobuf is open source under the BSD 2-Clause License.

Examples

Specification

Layout

  • MB: Magic bytes (0x6667620366676200)
  • H: Header (variable size flatbuffer)
  • I (optional): Static packed Hilbert R-tree index (static size custom buffer)
  • DATA: Features (variable size flatbuffers)
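As a sketch of what a reader sees first, this hypothetical helper (not part of any FlatGeobuf library) checks the magic bytes at the start of a buffer, ignoring the two version bytes at offsets 3 and 7:

```javascript
// The first 8 bytes are "fgb", the spec major version (currently 3),
// "fgb" again, then the patch version.
const MAGIC = [0x66, 0x67, 0x62, 0x03, 0x66, 0x67, 0x62, 0x00];

// Hypothetical helper: does this buffer look like a FlatGeobuf file?
// Only the "fgb" bytes are compared, so other version bytes still match.
function isFlatGeobuf(bytes) {
  if (bytes.length < 8) return false;
  return [0, 1, 2, 4, 5, 6].every((i) => bytes[i] === MAGIC[i]);
}

// A minimal stand-in for a real file's first bytes:
const head = new Uint8Array([0x66, 0x67, 0x62, 0x03, 0x66, 0x67, 0x62, 0x00]);
console.log(isFlatGeobuf(head)); // true
```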

Any 64-bit flatbuffer value contained anywhere in the file (for example coordinates) is aligned to 8 bytes from the start of the file or feature to allow for direct memory access.

Encoding of any string value is assumed to be UTF-8.
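For example, using the standard TextEncoder/TextDecoder APIs (not FlatGeobuf-specific) a string attribute value round-trips through its UTF-8 bytes:

```javascript
// Strings in FlatGeobuf are UTF-8; the standard TextEncoder/TextDecoder
// (available in browsers and Node) handle this encoding directly.
const bytes = new TextEncoder().encode('Ærø'); // a Danish place name
// Æ and ø are each 2 bytes in UTF-8, so the string is 5 bytes total.
console.log(bytes.length);

const roundTripped = new TextDecoder('utf-8').decode(bytes);
console.log(roundTripped); // 'Ærø'
```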

Performance

Preliminary performance tests have been done using road data from OSM for Denmark in SHP format from download.geofabrik.de, containing 906602 LineString features with a set of attributes.

                        Shapefile  GeoPackage  FlatGeobuf  GeoJSON   GML
Read full dataset           1         1.02        0.46        15      8.9
Read w/spatial filter       1         0.94        0.71       705      399
Write full dataset          1         0.77        0.39       3.9      3.2
Write w/spatial index       1         1.58        0.65        -        -
Size                        1         0.72        0.77       1.2      2.1

The test was done using GDAL with FlatGeobuf implemented as a driver. Repeated reads were measured with loops of ogrinfo -qq -oo VERIFY_BUFFERS=NO runs, and repeated writes with ogr2ogr conversion from the original to a new file, using -lco SPATIAL_INDEX=NO and -lco SPATIAL_INDEX=YES respectively.

Note that for the test with spatial filter a small bounding box was chosen, resulting in only 1204 features, so that the test primarily measures spatial index search performance.

As performance is highly data dependent, I've also made similar tests on a larger dataset of Danish cadastral data consisting of 2511772 Polygons with extensive attribute data.

                        Shapefile  GeoPackage  FlatGeobuf
Read full dataset           1         0.23        0.12
Read w/spatial filter       1         0.31        0.26
Write full dataset          1         0.95        0.63
Write w/spatial index       1         1.07        0.70
Size                        1         0.77        0.95

Features

Supported applications / libraries

Documentation

TypeScript / JavaScript

Prebuilt bundles (intended for browser usage)

Node usage

See this example for a minimal demonstration of how to depend on and use the flatgeobuf npm package.

TODO

  • Java index support
  • C language support
  • Go language support

FAQ

Why not use WKB geometry encoding?

It does not align on 8 bytes, so it is not always possible to consume it without copying first.
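The cost of misalignment can be seen directly in JavaScript, where a typed-array view requires its byte offset to be a multiple of the element size. This standalone snippet (not FlatGeobuf code) shows an unaligned float64 view failing:

```javascript
// A zero-copy Float64Array view requires an 8-byte-aligned offset.
const buf = new ArrayBuffer(16);

new Float64Array(buf, 8, 1); // offset 8: fine, zero-copy view

let unalignedFailed = false;
try {
  new Float64Array(buf, 4, 1); // offset 4: throws RangeError
} catch (e) {
  unalignedFailed = e instanceof RangeError;
}
console.log(unalignedFailed); // true
```

An unaligned encoding like WKB therefore forces a copy into an aligned buffer before coordinates can be accessed this way.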

Why not use Protobuf?

Performance reasons and to allow streaming/random access.

Why am I not getting expected performance in GDAL?

Default behaviour is to assume untrusted data and verify buffer integrity for safety. If you have trusted data and want maximum performance, make sure to set the open option VERIFY_BUFFERS to NO.

What about vector tiles?

FlatGeobuf does not aim to compete with vector tiles. Vector tiles are great for rendering but they are relatively expensive to create and are a lossy format, whereas FlatGeobuf is lossless and very fast to write, especially if a spatial index is not needed.

Comments
  • Update to Rust flatbuffers 2.0

    ~Tests fail when reading header from countries.fgb with error Unaligned { position: 84, unaligned_type: "f64", error_trace: ErrorTrace([TableField { field_name: "envelope", position: 48 }]) }~

    ~Reading features emit error Unaligned { position: 444, unaligned_type: "f64", error_trace: ErrorTrace([TableField { field_name: "xy", position: 436 }, VectorElement { index: 0, position: 92 }, TableField { field_name: "parts", position: 84 }, TableField { field_name: "geometry", position: 20 }]) }~

    Edit: Works with flatbuffers 0.8.3. Waiting for FlatBuffers 2.0 release (https://github.com/google/flatbuffers/issues/6353).

    opened by pka 27
  • Support 4D coords

As discussed in https://lists.osgeo.org/pipermail/proj/2019-June/008657.html it would be good if any future-proof format for spatial features could explicitly represent the temporal dimension, i.e. 4D.

By providing well-known metadata for the possible dimensions X, Y, Z, M and T of a coordinate we can support 4D.

    Open questions:

    • Does unix timestamp work as temporal dimension?
    • Can X,Y vs Z possibly/reasonably need separate temporal dimensions?
    opened by bjornharrtell 23
  • npm: webpack doesn't like the package

    Regressed after 3.17.4. Error message with 3.18.0:

    Module not found: Error: Package path ./lib/mjs/geojson is not exported from package /tmp/broccoli-1848999uiU6yCVIbUjp/cache-615-webpack_bundler_ember_auto_import_webpack/node_modules/flatgeobuf (see exports field in /tmp/broccoli-1848999uiU6yCVIbUjp/cache-615-webpack_bundler_ember_auto_import_webpack/node_modules/flatgeobuf/package.json)

    bug 
    opened by bjornharrtell 16
  • revamp JS examples

    • [ ] ~~Include examples which run off the local build (i'm tired of having to always modify them to point to my local 😆)~~ the naive implementation proved controversial, so not doing this for now.
    • [x] make it easier to discover all the other examples - e.g. a meta-index which links to all of the examples.
      • full file
      • search
      • for both leaflet and open layers

    Also, as the JS i/o gets better (#50), it might be neat to post slightly different examples:

    • [x] search a very large file - I have such a file. Would you be interested in moving it to the fgb s3 bucket?
    • [x] dynamic search (e.g. drag a bounding rect around)
    opened by michaelkirk 15
  • Error (panicked) at FgbReader::open when reading an EPSG:25832 fgb file

    Hi there, thank you for making such a great lib! I'm pretty new to both rust and flatgeobuf, and facing an issue when opening a fgb dataset with CRS EPSG:25832. It complains with:

    thread 'main' panicked at 'called `Result::unwrap()` on an `Err` value: Geometry("Utf8 error for string in 196..1969: invalid utf-8 sequence of 1 bytes from index 1542\n\twhile verifying table field `wkt` at position 188\n\twhile verifying table field `crs` at position 60\n")'
    

The dataset is a subset of the Bavaria cycling network from https://www.ldbv.bayern.de/produkte/weitere/opendata.html , selected and exported to fgb format with QGIS. I am able to open it in QGIS, but met the above issue when reading it with the flatgeobuf lib. For other datasets in EPSG:4326, there is no such problem. But I am not sure if the CRS is the reason. Could you help shed some light? Thanks a lot! For some reason I could not upload the sample dataset here, sorry about that!

    opened by tumluliu 14
  • JS: Import as module

    I see that your Leaflet/OpenLayers examples load the built source in the script tag, from the /dist/ folder

    https://cdn.jsdelivr.net/npm/[email protected]/dist/flatgeobuf-geojson.min.js
    https://cdn.jsdelivr.net/npm/[email protected]/dist/flatgeobuf-ol.min.js
    

    However, there's currently no exported object from the module. So if I try to require the package, I get an error:

    require('flatgeobuf')
    // Error: Cannot find module 'flatgeobuf'
    

    This means that it's not possible now (I think) to use it with a bundler, like Webpack or Browserify.

    It would be helpful to export some objects, to be able to do one of

    const flatgeobuf = require('flatgeobuf');
    import flatgeobuf from "flatgeobuf";
    
    flatgeobuf.deserialize(buffer)
    
    opened by kylebarron 14
  • Deserialize performance in js

    Hi there,

I was having a play around requesting flatgeobuf portions (e.g. by bbox) in the browser and I noticed that the slow part of the code was the deserialize operation, compared to the data fetching which seemed really quick; that kinda surprised me.

So I had a poke through the code and noticed lots of use of .map functions. .map functions are generally considered very slow compared to regular for loops (there are a stack of blog posts about this, e.g. here). Obviously .map functions are nice in terms of readability, but if you're looking to get some performance gains out of the parsing then replacing them might be a good start, particularly where they are used on large arrays like coords.

    There are also some other candidates I spotted on my quick scan

    • Avoiding .slice on float arrays (this might be a bottleneck if it's being called for every coord)
• Removing the use of spread operators for concatenating arrays (although I don't think the use of this is widespread)

    I'd attempt the PR with a benchmark but TS drives me a bit bonkers sorry :)

Anyway, aside from that, I really appreciate the project; from my initial play it's been very simple to use 👍

    Thanks, Rowan

    opened by rowanwins 13
  • Rust implementation

While having a deeper look at flatgeobuf, I started a Rust port. Don't know how much time I can invest, but the flatbuffers-generated code already looks good so far.

    opened by pka 13
  • How to import flatgeobuf in nodejs env?

When I import it this way: const geojson = require('flatgeobuf/lib/cjs/flatgeobuf.js');

it throws an error:

    node:internal/modules/cjs/loader:488
          throw e;
          ^
    
    Error [ERR_PACKAGE_PATH_NOT_EXPORTED]: Package subpath './lib/cjs/flatgeobuf.js' is not defined by "exports" in /mnt/d/workspace/flatgeobuf_test/node_modules/flatgeobuf/package.json
        at new NodeError (node:internal/errors:371:5)
        at throwExportsNotFound (node:internal/modules/esm/resolve:440:9)
        at packageExportsResolve (node:internal/modules/esm/resolve:692:3)
        at resolveExports (node:internal/modules/cjs/loader:482:36)
        at Function.Module._findPath (node:internal/modules/cjs/loader:522:31)
        at Function.Module._resolveFilename (node:internal/modules/cjs/loader:919:27)
        at Function.Module._load (node:internal/modules/cjs/loader:778:27)
        at Module.require (node:internal/modules/cjs/loader:1005:19)
        at require (node:internal/modules/cjs/helpers:102:18)
        at Object.<anonymous> (/mnt/d/workspace/flatgeobuf_test/mjs/server.cjs:5:17) {
      code: 'ERR_PACKAGE_PATH_NOT_EXPORTED'
    }
    

    Environment: Node: v16.13.0 npm: 8.1.4 os: Ubuntu-20.04 "flatgeobuf": "^3.20.1", "node-fetch": "^2.6.6"

    const resource = 'https://gvpub.oss-cn-beijing.aliyuncs.com/UScounties.fgb';
    const http = require('http');
    const url = require('url');
    const fetch  = require('node-fetch');
    const geojson = require('flatgeobuf/lib/cjs/flatgeobuf.js');
    const port = 3000;
    let server = http.createServer(async (req, res) => {
        const urlObj = url.parse(req.url || '', true);
        const { pathname, query } = urlObj;
        if (pathname && pathname.startsWith('/api') && pathname === '/api/data') {
            if (query.bounds) {
            }
            else {
                return await getData();
            }
        }
        else
            res.write('test');
        res.end();
    });
    server.listen(port, () => {
        console.log(`Server is running on port: ${port}`);
    });
    async function getData() {
        const response = await fetch(resource, {});
        const features = [];
        console.log(response.body);
        const aa = geojson.deserialize(response.body);
        return features;
    }
    

When I downgrade flatgeobuf to 3.17.0 and import it as below: const geojson = require("flatgeobuf/lib/cjs/geojson") it throws an error too:

    /mnt/d/workspace/aa_del/node_modules/flatgeobuf/lib/cjs/geojson.js:13
        else if (input instanceof ReadableStream)
                                  ^
    
    ReferenceError: ReadableStream is not defined
        at Object.deserialize (/mnt/d/workspace/aa_del/node_modules/flatgeobuf/lib/cjs/geojson.js:13:31)
        at getData (/mnt/d/workspace/aa_del/server.js:29:24)
        at processTicksAndRejections (node:internal/process/task_queues:96:5)
        at async Server.<anonymous> (/mnt/d/workspace/aa_del/server.js:15:20)
    

    Can anyone give me some hint? Thanks! @bjornharrtell

    opened by jackcjp 11
  • Network usage statistics for rust client (HttpFgbReader)?

    Is there a way to log or access aggregate network usage for HttpFgbReader?

    Specifically, I'd like to do something like:

    println!("downloaded this many bytes: {}", reader.downloaded_byte_count())
    

    background

    I've been messing around with flatgeobuffers+geozero over the last day or so. Pretty nice so far!

vs. the previous geojson implementation, obviously parsing is much faster and filesize is much smaller. 🚀

    Previously my application downloaded a specific pre-sliced geojson file clipped to the client's specific map area. What I want to try is, instead of having to prepare pre-sliced files for each of my maps, have just one huge network hosted FGB file, and have the application self-serve what it needs using FGB's bbox selection feature (via HTTPFgbReader).

    But I need to measure exactly how much network transfer is actually entailed in this (downloading the headers, the index, and just the feature slices it needs.)

    The best I've found so far is to turn on debug logging and doing math with lines like:

    [2020-12-16T17:06:07Z DEBUG hyper::proto::h1::conn] incoming body completed 
    [2020-12-16T17:06:07Z DEBUG hyper::client::pool] pooling idle connection for ("https", s3.amazonaws.com) 
    [2020-12-16T17:06:07Z DEBUG hyper::client::pool] reuse idle connection for ("https", s3.amazonaws.com) 
    [2020-12-16T17:06:07Z DEBUG hyper::proto::h1::io] flushed 119 bytes 
    [2020-12-16T17:06:07Z DEBUG hyper::proto::h1::io] read 431 bytes 
    [2020-12-16T17:06:07Z DEBUG hyper::proto::h1::io] parsed 10 headers 
    [2020-12-16T17:06:07Z DEBUG hyper::proto::h1::conn] incoming body is content-length (1048576 bytes) 
    [2020-12-16T17:06:07Z DEBUG reqwest::async_impl::client] response '206 Partial Content' for https://s3.amazonaws.com/foo/bar_areas.fgb
    [2020-12-16T17:06:07Z DEBUG hyper::proto::h1::io] read 8569 bytes 
    ...
    [2020-12-16T17:06:07Z DEBUG hyper::proto::h1::conn] incoming body is content-length (1048576 bytes) 
    [2020-12-16T17:06:07Z DEBUG reqwest::async_impl::client] response '206 Partial Content' for https://s3.amazonaws.com/foo/bar_areas.fgb
    [2020-12-16T17:06:07Z DEBUG hyper::proto::h1::io] read 8569 bytes 
    [2020-12-16T17:06:07Z DEBUG hyper::proto::h1::io] read 9000 bytes 
    [2020-12-16T17:06:07Z DEBUG hyper::proto::h1::io] read 9000 bytes 
    [2020-12-16T17:06:07Z DEBUG hyper::proto::h1::io] read 9000 bytes 
    [2020-12-16T17:06:07Z DEBUG hyper::proto::h1::io] read 16384 bytes 
    [2020-12-16T17:06:07Z DEBUG hyper::proto::h1::io] read 1024 bytes 
    [2020-12-16T17:06:07Z DEBUG hyper::proto::h1::io] read 16384 bytes 
    [2020-12-16T17:06:07Z DEBUG hyper::proto::h1::io] read 1024 bytes 
    [2020-12-16T17:06:07Z DEBUG hyper::proto::h1::io] read 16384 bytes 
    [2020-12-16T17:06:07Z DEBUG hyper::proto::h1::io] read 1024 bytes 
    [2020-12-16T17:06:07Z DEBUG hyper::proto::h1::io] read 16384 bytes 
    [2020-12-16T17:06:07Z DEBUG hyper::proto::h1::io] read 1024 bytes 
    

    ... is there a better way to get this in aggregate?

    opened by michaelkirk 11
  • Extended column metadata

As per the current spec the metadata for a column consists of only name and type (https://github.com/bjornharrtell/flatgeobuf/blob/bb80f92a6d8d6ff277da7b7d590529e8f0c1fe83/src/fbs/header.fbs#L44-L46). It could be helpful to optionally be able to provide more details, e.g. length/width/precision and nullability.

    This could probably be added without breaking spec v3 backward compatibility.

    opened by bjornharrtell 11
  • Feature request: Support for compressing large text data

    Very large text data are stored as they are, without compression:

(Screenshot example, in Japanese.)

Would it be possible to store compressed (zip or similar) data when the text size is larger than a threshold? Or is this a silly idea?

    opened by kochizufan 2
  • Feature request: Support for binary serialization of object/array values

    Hi

    Sorry if this topic has been discussed a lot in the past.

I found that object and array type values are serialized not as binary but as JSON text.

(Screenshot example, in Japanese.)

I think this greatly hinders the compression ratio. Can binary serialization of object and array attribute values be expected in future?

    opened by kochizufan 3
  • Firefox attempts to download entire .fgb file instead of features in bounding box

    Hey 👋

    We're currently trying to integrate a self-hosted, indexed FlatGeobuf in our site (exported from QGIS). For the site, we're using the filtering feature as demonstrated in the https://flatgeobuf.org/examples/leaflet/large.html example.

    Here's our minimum working example: https://atlas.thuenen.de/webspace/agraratlas/rs/test_fgb_vti.html

    While everything works as expected in Chrome, Safari and Edge, it seems that Firefox attempts to download the entire .fgb file instead of just the parts contained in the bounding box. Since the example provided in leaflet/large.html also works in Firefox on our end, it seems that there's something wrong with our file/server, but we couldn't figure it out so far.

    Any ideas of what we're doing wrong? Thanks for your help!

    opened by chrispahm 1
  • Feature Request: Allow Uint8 or ObjectURL blobs in deserialize filtered query

    Was hoping to be able to place flatgeobuf into a webworker and provide range requests/filtered requests from arbitrary serialized geojson features.

    However, the range request feature with spatial filter seems directly tied to the HTTP/request system.

    Reviewing the code, it looks tightly coupled.

    It seems like it should just be given an interface that returns various slices of data from . Is it possible to make a cheap object interface that can be substituted for the <url> input?

    opened by disarticulate 1
  • rust: document that `HttpFgbReader` won't work with `https` urls by default

    I stumbled on this -- if you make a simple app using flatgeobuf that doesn't depend directly on [email protected], it will fail to handle https urls with this error message:

    Error: http error `error sending request for url (https://blah/blah/blah.fgb): error trying to connect: invalid URL, scheme is not http`
    

    That error originates in hyper. I think the issue is that http-range-client has an optional dependency on reqwest and so without that, you'll get a client that has no support for HTTPS. That's...really odd. I'd suggest either:

    • document clearly that HttpFgbReader will not read HTTPS urls unless you add an explicit dependency on the same version of reqwest that flatgeobuf's http-range-client depends on, or
    • add a tls or http feature to flatgeobuf and document that in the README.
    opened by msalib 1