The new, performant, and simplified version of Holochain on Rust (sometimes called Holochain RSM for Refactored State Model)

Overview

Holochain

Project Forum Chat

Twitter Follow License: CAL 1.0

This repository contains the core Holochain libraries and binaries.

This is the most recent and well-maintained version of Holochain, with a refactored state model (you may see it referred to as Holochain RSM).

Code Status

This code is in alpha. It is not for production use. The code is guaranteed NOT secure.

We will be frequently and heavily restructuring code APIs and data chains until Beta.

We currently support Linux only. You may or may not be able to successfully build and run Holochain on macOS. You definitely won't be able to on Windows (unless you are using WSL, but even that is untested). We will be rolling out support for these OSes in the future, but in the meantime please use Linux for development!

Making the Holochain binaries available in your shell

Depending on the context and purpose you are running in, there are different ways to make the binaries available in your shell.

Using nix-shell on a local clone

Assuming you have installed the nix shell:

nix-shell --argstr flavor happDev

This nix-shell flavor installs wrapper binaries for holochain and hc that compile and run the real binaries on demand. This is very useful if you are tracking changes in the holochain repo: when you check out a new rev, running holochain will automatically compile the binary at that rev before running it.
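For example (illustrative):

git checkout <some-rev>
holochain --version   # the wrapper recompiles holochain at this rev before running it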

Building with cargo if you already have rust installed:

cargo install --path crates/holochain
cargo install --path crates/hc
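
After installation both binaries land in ~/.cargo/bin, so a quick version check confirms they are on your PATH:

holochain --version
hc --version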

Usage

$ holochain --help
USAGE:
    holochain [FLAGS] [OPTIONS]

FLAGS:
    -h, --help           Prints help information
    -i, --interactive    Receive helpful prompts to create missing files and directories,
                             useful when running a conductor for the first time
    -V, --version        Prints version information

OPTIONS:
    -c, --config-path <config-path>
            Path to a YAML file containing conductor configuration

Running holochain requires a config file. You can generate one at the default configuration file location using interactive mode:

$ holochain -i
There is no conductor config YAML file at the path specified (/home/eric/.config/holochain/conductor-config.yml)
Would you like to create a default config file at this location? [Y/n]
Y
Conductor config written.
There is no database environment set at the path specified (/home/eric/.local/share/holochain/databases)
Would you like to create one now? [Y/n]
Y
LMDB environment created.
Conductor ready.

As well as creating the config file, this process also instantiates the initial LMDB database environment. If you provide a config file on first run with just the -c flag, holochain will also initialize the environment even when not in interactive mode.
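
For example, on subsequent runs you can point holochain at the generated file explicitly (the path below is the default location shown above; adjust it for your system):

holochain -c ~/.config/holochain/conductor-config.yml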

Development Environment

Assuming you have installed the nix shell:

git clone [email protected]:holochain/holochain.git
cd holochain
nix-shell
hc-merge-test

This will compile holochain and run all the tests.

If you get an error while running nix-shell about ngrok having an unfree license, you can fix that by running:

mkdir -p ~/.config/nixpkgs/
echo "{ allowUnfree = true; }" >> ~/.config/nixpkgs/config.nix

We have an all-in-one development environment including (among other things):

  • The correct version of cargo/rust and sane environment variables
  • Node for working with tryorama
  • Scaffolding, build and deployment scripts
  • Prebuilt binaries of core for various operating systems (soon)
  • Shared libs such as libsodium

It is called Holonix and you should use it.

It has plenty of documentation and functionality and can be used across Windows, Mac, and Linux. (Although Holochain itself currently only supports Linux.) It is based on the development tools provided by NixOS.

It is suitable both for hackathons and for 'serious' development by a long-term, production-grade development team.

If you want to maintain your own development environment then we can only offer rough advice, because anything we say today could be out of date tomorrow:

  • Use a recent stable version of Rust (see the sketch after this list)
  • Use Node 12.x+ for client-side work
  • Install any relevant shared libs like libsodium
  • Write your own scaffolding, build and development tools
  • Plan for dependency management as we ship new binaries
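
As a rough sketch of the first three items (assuming rustup and a Debian-style package manager; exact package names vary by distro):

rustup default stable                      # a recent stable Rust toolchain
rustup target add wasm32-unknown-unknown   # the target zomes are compiled to
node --version                             # should report v12.x or newer
sudo apt install libsodium-dev             # or your distro's libsodium package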

Application Developer

Read the wasm API docs

Build the hdk docs:

cargo doc --manifest-path=crates/hdk/Cargo.toml --open
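
If you just want a feel for what zome code looks like before opening the docs, here is a minimal sketch (the hdk_extern macro and ExternResult type come from the hdk prelude; exact names and signatures may vary between hdk versions, so treat this as illustrative):

use hdk::prelude::*;

// A zome function is an extern that takes and returns serializable types.
#[hdk_extern]
fn hello(_: ()) -> ExternResult<String> {
    Ok("hello from a zome".to_string())
}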

Core Developer

Build the holochain docs:

cargo doc --manifest-path=crates/holochain/Cargo.toml --open

Contribute

Holochain is an open source project. We welcome all sorts of participation and are actively working on increasing surface area to accept it. Please see our contributing guidelines for our general practices and protocols on participating in the community, as well as specific expectations around things like code formatting, testing practices, continuous integration, etc.

  • Connect with us on our forum

License

License: CAL 1.0

Copyright (C) 2019 - 2021, Holochain Foundation

This program is free software: you can redistribute it and/or modify it under the terms of the license provided in the LICENSE file (CAL-1.0). This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

Comments
  • Holochain <-> .Net / C# interface

    Holochain <-> .Net / C# interface

    Update: initial sprinklings of a .net/C# conductor client are starting here: https://github.com/holochain-open-dev/holochain-client-csharp

    Hi Guys,

    Really happy Windows support has finally been added back in again, yay! :)

    But when I run cargo install holochain I get the following error message:

    C:\Users\david\holochain>cargo install holochain
        Updating crates.io index
      Downloaded holochain v0.0.102
      Downloaded 1 crate (279.9 KB) in 2.09s
      Installing holochain v0.0.102
    error: failed to compile `holochain v0.0.102`, intermediate artifacts can be found at `C:\Users\david\AppData\Local\Temp\cargo-install7cpdpQ`
    
    Caused by:
      failed to download `bit-set v0.2.0`
    
    Caused by:
      unable to get packages from source
    
    Caused by:
      failed to parse manifest at `C:\Users\david\.cargo\registry\src\github.com-1ecc6299db9ec823\bit-set-0.2.0\Cargo.toml`
    
    Caused by:
      dependency (bit-vec) specified without providing a local path, Git repository, or version to use.
    

    Any ideas? :) Thanks

    Cheers D.

    opened by dellams 233
  • Unable to install large apps

    Unable to install large apps

    The hard 16MB limit on websocket payload sizes has become a blocking error for me. The output from Tryorama in my tests looks like this (stack frames truncated):

    20:16:58 [tryorama: player c0] debug: initialized
    20:16:58 [tryorama: player c0] debug: Player.installHapps
    20:16:58 [tryorama: player c0] debug: Player.adminWs()
    20:16:58 [tryorama: player c0] debug: Player.installHapp(["/home/pospi/projects/holo-rea/valueflows-project-metarepo/holo-rea/happs/observation/hrea_observation.dna","/home/pospi/projects/holo-rea/valueflows-project-metarepo/holo-rea/happs/planning/hrea_planning.dna"], noAgentPubKey)
    20:17:13 [tryorama] error: Test error: 'Error: Timed out in 15000ms: register_dna\n'
    

    I have recently added a wasm-opt pass to my build to see if it alleviates the issue, but unfortunately the byte savings are modest (~25% shaved off).

    Suggest this be elevated as a serious issue as more complex applications continue to be developed. While my DNA configs may appear to list a large number of zomes, I expect this will be nothing compared to the highly modular app architectures being envisioned by https://neighbourhoods.network/ and https://ad4m.dev/. It is worth noting that the tests currently failing only involve 2 DNAs containing 19 zomes; the full hREA suite will be in excess of 15 DNAs and will also require several supporting auxiliary zomes in almost any useful configuration. And this does not factor in UI apps spanning multiple collaboration spaces, which may require access to multiple module suites simultaneously, leading to multipliers of 15-20 or so interrelated DNAs all being part of the same hApp bundle.

    To those pondering whether I might be able to reduce the filesize of things in other ways, I don't see this as a solution or even a long-term deferral of the problem.

    I suspect the issue will have to be solved mostly on the hc-utils side with the support of another API method or two from core, basically to chunk larger input files and send in pieces before reassembling on the other side.

    opened by pospi 30
  • hdk compilation easily broken, due to libsqlite3-sys and features configuration

    hdk compilation easily broken, due to libsqlite3-sys and features configuration

    hdk -> hdk_derive -> holochain_zome_types (culprit!) -> holo_hash -> rusqlite -> sqlite_sys

    holochain_zome_types depends on holo_hash without default-features = false

    https://github.com/holochain/holochain/blob/a817c93c2aaef283d0df2acbe37c5065c1836f3b/crates/holochain_zome_types/Cargo.toml#L16
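
    A hypothetical sketch of the kind of manifest change that would avoid this (not the actual line in the linked Cargo.toml; the version is a placeholder):

    # hypothetical: opt out of holo_hash default features so rusqlite/libsqlite3-sys
    # is not pulled into wasm builds
    holo_hash = { version = "x.y.z", default-features = false }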

       Compiling getrandom v0.2.3
    The following warnings were emitted during compilation:
    
    warning: sqlite3/sqlite3.c:14230:10: fatal error: 'stdio.h' file not found
    warning: #include <stdio.h>
    warning:          ^~~~~~~~~
    warning: 1 error generated.
    
    error: failed to run custom build command for `libsqlite3-sys v0.22.2`
    

    like #913

    opened by Connoropolous 19
  • [BUG] [WASM metering?] Ribosome fails with `unreachable` in all sorts of fns when DNA gets exercised

    [BUG] [WASM metering?] Ribosome fails with `unreachable` in all sorts of fns when DNA gets exercised

    Describe the bug: DNA works well at first, but at some point, after creating >100 entries, a zome call using hdk::chain::query returns this error:

    'Wasm runtime error while working with Ribosome: RuntimeError: WasmError { 
       file: "/private/tmp/nix-build-hc_holochain_kitsune-p2p-tx2-proxy.drv-0/cargo-vendor-dir/holochain_wasmer_host-0.0.80/src/guest.rs", 
       line: 200, 
       error: CallError("RuntimeError: unreachable at __allocate (<module>[9728]:0x1eeb33)") 
    }'
    

    which is preceded by the following callstack:

    Wasm runtime error while working with Ribosome: RuntimeError: 
    WasmError { 
        file: "/private/tmp/nix-build-hc_holochain_kitsune-p2p-tx2-proxy.drv-0/cargo-vendor-dir/holochain_wasmer_host-0.0.80/src/guest.rs", 
        line: 259, 
        error: CallError("RuntimeError: unreachable
            at serde::de::value::MapDeserializer<I,E>::next_pair::haaad3ba4a6de641a (<module>[7210]:0x18e2cb)
            at <serde::de::value::MapDeserializer<I,E> as serde::de::MapAccess>::next_key_seed::hca02805ec96203fb (<module>[7232]:0x18f6c9)
            at serde::__private::de::content::visit_content_map::hd85faf9815523252 (<module>[7599]:0x1a645c)
            <serde::__private::de::content::ContentDeserializer<E> as serde::de::Deserializer>::deserialize_struct::h6dc91e8372d3bc25 (<module>[7661]:0x1ab0e0)
            at holochain_integrity_types::action::_::<impl serde::de::Deserialize for holochain_integrity_types::action::Create<W>>::deserialize::h8b44b43a17bc41e3 (<module>[5604]:0x1693ea)
            at holochain_integrity_types::action::_::<impl serde::de::Deserialize for holochain_integrity_types::action::Action>::deserialize::h7eaad162f2cdce00 (<module>[5596]:0x168fcf)
            at <&mut rmp_serde::decode::Deserializer<R,C> as serde::de::Deserializer>::deserialize_any::hc3ee531299e07f0a (<module>[5396]:0x131182)
            <&mut rmp_serde::decode::Deserializer<R,C> as serde::de::Deserializer>::deserialize_any::h30f9e77796fa4885 (<module>[5393]:0x12fe60)
            at serde::de::MapAccess::next_value::h5b543f8858cda8be (<module>[5428]:0x13cf10)
            <&mut rmp_serde::decode::Deserializer<R,C> as serde::de::Deserializer>::deserialize_any::h7a804302fe6ac88c (<module>[5460]:0x1482a0)
            at holochain_integrity_types::record::_::<impl serde::de::Deserialize for holochain_integrity_types::record::Record>::deserialize::h64a27e9bd7904e7d (<module>[6095]:0x176fce)
            at <serde::de::impls::<impl serde::de::Deserialize for alloc::vec::Vec<T>>::deserialize::VecVisitor<T> as serde::de::Visitor>::visit_seq::hffdf9012150c764b (<module>[4386]:0xf537f)
            at <&mut rmp_serde::decode::Deserializer<R,C> as serde::de::Deserializer>::deserialize_any::h99fa3cab5f3b1a31 (<module>[3423]:0xcf0c3)
            at serde::de::impls::<impl serde::de::Deserialize for alloc::vec::Vec<T>>::deserialize::hfadd074248396a66 (<module>[3750]:0xdfef8)
            at <serde::de::impls::<impl serde::de::Deserialize for core::result::Result<T,E>>::deserialize::ResultVisitor<T,E> as serde::de::Visitor>::visit_enum::h9a9fc0b5482ab34f (<module>[2620]:0x9511f)
            at <&mut rmp_serde::decode::Deserializer<R,C> as serde::de::Deserializer>::deserialize_enum::hd1b78d9fefc89fd0 (<module>[2489]:0x8ebe7)
            at serde::de::impls::<impl serde::de::Deserialize for core::result::Result<T,E>>::deserialize::h4c4cd8860f349141 (<module>[2802]:0x9909f)
            at holochain_serialized_bytes::decode::ha189c04a1f82ee9d (<module>[2545]:0x91c6b)
            at <hdk::hdk::HostHdk as hdk::hdk::HdkT>::query::ha5630c352e63bc95 (<module>[3006]:0xa0811)
            at std::thread::local::LocalKey<T>::with::hdf30b39269ef9592 (<module>[2788]:0x98b42)
            at hdk::chain::query::h6c41770dc5cb69b9 (<module>[2990]:0x9fba1)
            at perspective_diff_sync::revisions::current_revision::h273ea7e62f7af6d7 (<module>[999]:0x37113)
            at perspective_diff_sync::pull::pull::hdbe25e8845e7938a (<module>[768]:0x20860)
            at perspective_diff_sync::pull::hbee50929ed5a525f (<module>[332]:0xed72)
            at pull (<module>[953]:0x33fd1)") 
    }
    

    The code calling chain::query (perspective_diff_sync::revisions::current_revision) is this:

    pub fn current_revision() -> SocialContextResult<Option<HoloHash<holo_hash::hash_type::Action>>> {
        let app_entry = AppEntryType::new(4.into(), 0.into(), EntryVisibility::Private);
        let filter = ChainQueryFilter::new().entry_type(EntryType::App(app_entry)).include_entries(true);
        let mut refs = query(filter)?
            .into_iter()
            .map(|val| {
                val.entry().to_app_option::<LocalHashReference>()?.ok_or(
                    SocialContextError::InternalError("Expected element to contain app entry data"),
                )
            })
            .collect::<SocialContextResult<Vec<LocalHashReference>>>()?;
        refs.sort_by(|a, b| a.timestamp.partial_cmp(&b.timestamp).unwrap());
    
        Ok(refs.pop().map(|val| val.hash))
    }
    

    The entry type we're filtering for consists of a hash and a timestamp:

    pub struct LocalHashReference {
        pub hash: HoloHash<holo_hash::hash_type::Action>,
        pub timestamp: DateTime<Utc>,
    }
    

    Expected behavior: chain::query() doesn't stop working when the source chain gets filled with data.

    System information:

    • OS: seen on macOS and Linux so far
    • Holochain and HDK Version: 0.0.151 (hdi = "0.0.15", hdk = "0.0.143")

    Additional context: the whole DNA is at https://github.com/perspect3vism/perspective-diff-sync/tree/main/hc-dna

    opened by lucksus 14
  • HDK v0.0.101 won't compile on MacOS. libsqlite3-sys can't compile for wasm32-unknown-unknown

    HDK v0.0.101 won't compile on MacOS. libsqlite3-sys can't compile for wasm32-unknown-unknown

    I've been working on an HDK-wrapping Rust crate, but I now realize it won't compile for the wasm target while building on macOS, because libsqlite3-sys sneaks in as a dependency via hdk_derive -> holochain_zome_types -> holo_hash -> rusqlite -> libsqlite3-sys

    Can be seen in CI here: https://github.com/lightningrodlabs/hdk_crud/runs/3257245529#step:4:252

    From the main branch of hdk_crud (https://github.com/lightningrodlabs/hdk_crud) run the following to reproduce.

    cargo build --release --target wasm32-unknown-unknown

       Compiling proc-macro2 v1.0.28
       Compiling unicode-xid v0.2.2
       Compiling syn v1.0.74
       Compiling autocfg v1.0.1
       Compiling version_check v0.9.3
       Compiling libc v0.2.98
       Compiling serde v1.0.123
       Compiling serde_derive v1.0.123
       Compiling autocfg v0.1.7
       Compiling cfg-if v1.0.0
       Compiling ryu v1.0.5
       Compiling getrandom v0.1.16
       Compiling memchr v2.4.0
       Compiling serde_json v1.0.64
       Compiling bitflags v1.2.1
       Compiling rand_core v0.4.2
       Compiling pkg-config v0.3.19
       Compiling cc v1.0.69
       Compiling once_cell v1.8.0
       Compiling lazy_static v1.4.0
       Compiling smallvec v1.6.1
       Compiling unicode-segmentation v1.8.0
       Compiling cfg-if v0.1.10
       Compiling fnv v1.0.7
       Compiling byteorder v1.4.3
       Compiling strsim v0.9.3
       Compiling ident_case v1.0.1
       Compiling scopeguard v1.1.0
       Compiling itoa v0.4.7
       Compiling paste v1.0.5
       Compiling ppv-lite86 v0.2.10
       Compiling convert_case v0.4.0
       Compiling regex-syntax v0.6.25
       Compiling derive_builder v0.9.0
       Compiling either v1.6.1
       Compiling log v0.4.14
       Compiling constant_time_eq v0.1.5
       Compiling arrayvec v0.5.2
       Compiling strum v0.18.0
       Compiling fallible-iterator v0.2.0
       Compiling fallible-streaming-iterator v0.1.9
       Compiling arrayref v0.3.6
       Compiling pin-project-lite v0.1.12
       Compiling base64 v0.13.0
       Compiling predicates-core v1.0.2
       Compiling subtle v2.4.1
       Compiling normalize-line-endings v0.3.0
       Compiling difference v2.0.0
       Compiling treeline v0.1.0
       Compiling ansi_term v0.12.1
       Compiling downcast v0.10.0
       Compiling fragile v1.0.0
       Compiling instant v0.1.10
       Compiling tracing-core v0.1.18
       Compiling sharded-slab v0.1.1
       Compiling rand_chacha v0.1.1
       Compiling rand_pcg v0.1.2
       Compiling rand v0.6.5
       Compiling ahash v0.7.4
       Compiling value-bag v1.0.0-alpha.7
       Compiling num-traits v0.2.14
       Compiling indexmap v1.7.0
       Compiling num-integer v0.1.44
       Compiling rand_core v0.3.1
       Compiling rand_os v0.1.3
       Compiling rand_jitter v0.1.4
       Compiling lock_api v0.3.4
       Compiling lock_api v0.4.4
       Compiling parking_lot_core v0.7.2
       Compiling heck v0.3.3
       Compiling itertools v0.8.2
       Compiling blake2b_simd v0.5.11
       Compiling thread_local v1.1.3
       Compiling predicates-tree v1.0.2
       Compiling libsqlite3-sys v0.22.2
       Compiling parking_lot_core v0.8.3
       Compiling rand_isaac v0.1.1
       Compiling rand_hc v0.1.0
       Compiling rand_xorshift v0.1.1
       Compiling parking_lot v0.10.2
       Compiling time v0.1.43
       Compiling rand_core v0.5.1
       Compiling regex-automata v0.1.10
       Compiling parking_lot v0.11.1
       Compiling aho-corasick v0.7.18
       Compiling quote v1.0.9
       Compiling getrandom v0.2.3
       Compiling rand_chacha v0.2.2
       Compiling matchers v0.0.1
    The following warnings were emitted during compilation:
    
    warning: sqlite3/sqlite3.c:14230:10: fatal error: 'stdio.h' file not found
    warning: #include <stdio.h>
    warning:          ^~~~~~~~~
    warning: 1 error generated.
    
    error: failed to run custom build command for `libsqlite3-sys v0.22.2`
    
    Caused by:
      process didn't exit successfully: `/Users/x/x/libs/hdk_crud/target/release/build/libsqlite3-sys-577f092052a96113/build-script-build` (exit code: 1)
      --- stdout
      cargo:rerun-if-changed=sqlite3/sqlite3.c
      cargo:rerun-if-changed=sqlite3/wasm32-wasi-vfs.c
      cargo:rerun-if-env-changed=SQLITE_MAX_VARIABLE_NUMBER
      cargo:rerun-if-env-changed=SQLITE_MAX_EXPR_DEPTH
      cargo:rerun-if-env-changed=LIBSQLITE3_FLAGS
      TARGET = Some("wasm32-unknown-unknown")
      OPT_LEVEL = Some("3")
      HOST = Some("x86_64-apple-darwin")
      CC_wasm32-unknown-unknown = None
      CC_wasm32_unknown_unknown = None
      TARGET_CC = None
      CC = None
      CFLAGS_wasm32-unknown-unknown = None
      CFLAGS_wasm32_unknown_unknown = None
      TARGET_CFLAGS = None
      CFLAGS = None
      CRATE_CC_NO_DEFAULTS = None
      DEBUG = Some("false")
      running: "clang" "-O3" "-ffunction-sections" "-fdata-sections" "-fPIC" "--target=wasm32-unknown-unknown" "-DSQLITE_CORE" "-DSQLITE_DEFAULT_FOREIGN_KEYS=1" "-DSQLITE_ENABLE_API_ARMOR" "-DSQLITE_ENABLE_COLUMN_METADATA" "-DSQLITE_ENABLE_DBSTAT_VTAB" "-DSQLITE_ENABLE_FTS3" "-DSQLITE_ENABLE_FTS3_PARENTHESIS" "-DSQLITE_ENABLE_FTS5" "-DSQLITE_ENABLE_JSON1" "-DSQLITE_ENABLE_LOAD_EXTENSION=1" "-DSQLITE_ENABLE_MEMORY_MANAGEMENT" "-DSQLITE_ENABLE_RTREE" "-DSQLITE_ENABLE_STAT2" "-DSQLITE_ENABLE_STAT4" "-DSQLITE_SOUNDEX" "-DSQLITE_THREADSAFE=1" "-DSQLITE_USE_URI" "-DHAVE_USLEEP=1" "-D_POSIX_THREAD_SAFE_FUNCTIONS" "-DHAVE_ISNAN" "-DHAVE_LOCALTIME_R" "-o" "/Users/x/x/libs/hdk_crud/target/wasm32-unknown-unknown/release/build/libsqlite3-sys-e21ba9ed8d63f021/out/sqlite3/sqlite3.o" "-c" "sqlite3/sqlite3.c"
      cargo:warning=sqlite3/sqlite3.c:14230:10: fatal error: 'stdio.h' file not found
      cargo:warning=#include <stdio.h>
      cargo:warning=         ^~~~~~~~~
      cargo:warning=1 error generated.
      exit code: 1
    
      --- stderr
    
    
      error occurred: Command "clang" "-O3" "-ffunction-sections" "-fdata-sections" "-fPIC" "--target=wasm32-unknown-unknown" "-DSQLITE_CORE" "-DSQLITE_DEFAULT_FOREIGN_KEYS=1" "-DSQLITE_ENABLE_API_ARMOR" "-DSQLITE_ENABLE_COLUMN_METADATA" "-DSQLITE_ENABLE_DBSTAT_VTAB" "-DSQLITE_ENABLE_FTS3" "-DSQLITE_ENABLE_FTS3_PARENTHESIS" "-DSQLITE_ENABLE_FTS5" "-DSQLITE_ENABLE_JSON1" "-DSQLITE_ENABLE_LOAD_EXTENSION=1" "-DSQLITE_ENABLE_MEMORY_MANAGEMENT" "-DSQLITE_ENABLE_RTREE" "-DSQLITE_ENABLE_STAT2" "-DSQLITE_ENABLE_STAT4" "-DSQLITE_SOUNDEX" "-DSQLITE_THREADSAFE=1" "-DSQLITE_USE_URI" "-DHAVE_USLEEP=1" "-D_POSIX_THREAD_SAFE_FUNCTIONS" "-DHAVE_ISNAN" "-DHAVE_LOCALTIME_R" "-o" "/Users/x/x/libs/hdk_crud/target/wasm32-unknown-unknown/release/build/libsqlite3-sys-e21ba9ed8d63f021/out/sqlite3/sqlite3.o" "-c" "sqlite3/sqlite3.c" with args "clang" did not execute successfully (status code exit code: 1).
    
    
    warning: build failed, waiting for other jobs to finish...
    error: build failed
    
    bug 
    opened by Connoropolous 14
  • `local_network_tests::conductors_remote_gossip` and `local_network_tests::conductors_remote_boot_gossip` fail

    `local_network_tests::conductors_remote_gossip` and `local_network_tests::conductors_remote_boot_gossip` fail

    Hi! I've been experiencing gossip issues when going through a proxy server. This has happened to me both with the proxy at proxy.holochain.org, and running my own with the current tip of develop.

    Investigating, I ran both of the tests named in the title, and they both fail.

    This is the result of running hc-merge-test in develop when unignoring those tests:

    failures:
    
    failures:
        local_network_tests::conductors_remote_boot_gossip::_10_10_1
        local_network_tests::conductors_remote_boot_gossip::_10_10_10
        local_network_tests::conductors_remote_boot_gossip::_1_10_1
        local_network_tests::conductors_remote_boot_gossip::_1_1_1
        local_network_tests::conductors_remote_boot_gossip::_1_5_5
        local_network_tests::conductors_remote_boot_gossip::_2_1_1
        local_network_tests::conductors_remote_boot_gossip::_5_1_1
        local_network_tests::conductors_remote_boot_gossip::_5_5_5
        local_network_tests::conductors_remote_boot_gossip::_8_8_8
        local_network_tests::conductors_remote_gossip::_10_10_1
        local_network_tests::conductors_remote_gossip::_10_10_10
        local_network_tests::conductors_remote_gossip::_1_10_1
        local_network_tests::conductors_remote_gossip::_1_1_1
        local_network_tests::conductors_remote_gossip::_1_5_5
        local_network_tests::conductors_remote_gossip::_2_1_1
        local_network_tests::conductors_remote_gossip::_5_1_1
        local_network_tests::conductors_remote_gossip::_5_5_5
        local_network_tests::conductors_remote_gossip::_8_8_8
    
    test result: FAILED. 187 passed; 18 failed; 39 ignored; 0 measured; 0 filtered out; finished in 298.70s
    

    This is an example result of those failures:

    ###HOLOCHAIN_SETUP###
    ###ADMIN_PORT:34375###
    ###HOLOCHAIN_SETUP_END###
    FATAL PANIC PanicInfo {
        payload: Any,
        message: Some(
            assertion failed: `None` does not match `Some(_)`,
        ),
        location: Location {
            file: "crates/holochain/src/local_network_tests.rs",
            line: 474,
            col: 9,
        },
    }
    thread 'main' panicked at 'assertion failed: `None` does not match `Some(_)`', crates/holochain/src/local_network_tests.rs:474:9
    stack backtrace:
       0: rust_begin_unwind
                 at /rustc/88f19c6dab716c6281af7602e30f413e809c5974/library/std/src/panicking.rs:493:5
       1: std::panicking::begin_panic_fmt
                 at /rustc/88f19c6dab716c6281af7602e30f413e809c5974/library/std/src/panicking.rs:435:5
       2: holochain::local_network_tests::check_gossip::{{closure}}
                 at ./src/local_network_tests.rs:474:9
    
    opened by guillemcordoba 13
  • add list_public_keys to MetaLairClient

    add list_public_keys to MetaLairClient

    Summary

    The ultimate use case for this is to drive a user experience where a user gets to keep their old keys between hApp version migrations. At the moment, there's no good way to do this, because there's no way to just ask the keystore which keys it has available; that means you can't tell whether it has any, or which ones might be usable for installing a new hApp. Adding this function is necessary because the internals are all private at this level, which seems good.

    Once released, I will use this code over in lightningrodlabs/holochain-runner to provide that experience, but I do need this to be in place.

    @neonphog can you check this over?

    You can see here that I used to do this quite manually, and have the access that I needed: https://github.com/lightningrodlabs/holochain-runner/commit/e54e182f9600a2dc2a43c7b2e857954282e78431#diff-fb0ee6ff1655c8d46559bfc39611fbb6beb92fd22e66846aaabc532c56c727e2L18-L57 but it went away with the recent keystore update

    TODO:

    • [ ] CHANGELOG(s) updated with appropriate info
    • [ ] Just before pressing the merge button, ensure new entries to CHANGELOG(s) are still under the UNRELEASED heading
    opened by Connoropolous 12
  • unable to use "call" function of hc sandbox

    unable to use "call" function of hc sandbox

    Terminal 1

    hc sandbox run 0 --ports 8888
    
    Mar 23 12:14:15.635 ERROR lair_keystore_client: error=IpcClientConnectError("/private/var/folders/5s/rldbq43s2yj4v5xb89bwygv80000gn/T/wGaAiZnsWdS4Vf98LQW5v/keystore/socket", Os { code: 61, kind: ConnectionRefused, message: "Connection refused" }) file="/Users/connor/.cargo/registry/src/github.com-1ecc6299db9ec823/lair_keystore_client-0.0.1-alpha.11/src/lib.rs" line=42
    
    Conductor ready.
    hc-sandbox: Running conductor on admin port 50085
    hc-sandbox: Attaching app port 8888
    

    Terminal 2

    hc sandbox call -r=50085 list-active-apps
    thread 'main' panicked at 'Failed to create CmdRunner because admin port failed to connect: Io(Os { code: 61, kind: ConnectionRefused, message: "Connection refused" })', /Users/connor/.cargo/git/checkouts/holochain-391184137afba57c/181baec/crates/hc_sandbox/src/lib.rs:159:14
    note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
    
    opened by Connoropolous 12
  • Only rebuild test_utils for upstream changes

    Only rebuild test_utils for upstream changes

    Softens the "rebuild-every-time" strategy for test_utils/wasm by directly parsing the Cargo.toml to discover upstream local deps. Downstream local changes no longer trigger a rebuild, making rapid iteration on e.g. the holochain crate even more rapid.
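
    As a rough sketch of the idea (a hypothetical helper assuming the toml crate, not the actual implementation in this PR):

    // List dependencies declared with a local `path`, i.e. upstream workspace crates.
    fn local_path_deps(manifest: &str) -> Vec<String> {
        let value: toml::Value = toml::from_str(manifest).expect("valid Cargo.toml");
        let mut deps = Vec::new();
        if let Some(table) = value.get("dependencies").and_then(|d| d.as_table()) {
            for (name, spec) in table {
                // String specs like `foo = "1.0"` have no `path` key and are skipped here.
                if spec.get("path").is_some() {
                    deps.push(name.clone());
                }
            }
        }
        deps
    }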

    opened by maackle 12
  • Error installing AppInterface for v121

    Error installing AppInterface for v121

    Hello!

    I've encountered an error while running a conductor sandbox using the following command:

    hc s clean && npm run build:happ && RUST_LOG=warn hc s generate ./workdir/APP.happ --run -n 2 network mdns

    The terminal shows:

    hc-sandbox: Creating 2 conductor sandboxes with same settings
    hc-sandbox: Config ConductorConfig { environment_path: EnvironmentRootPath("/tmp/tmp.qTeWWKyVK0/LAwloneaJRjUN0vhvjj1u"), keystore: LairServerLegacyDeprecated { keystore_path: Some("/tmp/tmp.qTeWWKyVK0/LAwloneaJRjUN0vhvjj1u/keystore"), danger_passphrase_insecure_from_config: "default-insecure-passphrase" }, dpki: None, admin_interfaces: Some([AdminInterfaceConfig { driver: Websocket { port: 0 } }]), network: Some(KitsuneP2pConfig { transport_pool: [Quic { bind_to: None, override_host: None, override_port: None }], bootstrap_service: None, tuning_params: KitsuneP2pTuningParams { gossip_strategy: "sharded-gossip", gossip_loop_iteration_delay_ms: 1000, gossip_outbound_target_mbps: 0.5, gossip_inbound_target_mbps: 0.5, gossip_historic_outbound_target_mbps: 0.1, gossip_historic_inbound_target_mbps: 0.1, gossip_peer_on_success_next_gossip_delay_ms: 60000, gossip_peer_on_error_next_gossip_delay_ms: 300000, gossip_local_sync_delay_ms: 60000, gossip_dynamic_arcs: false, gossip_single_storage_arc_per_space: false, default_rpc_single_timeout_ms: 30000, default_rpc_multi_remote_agent_count: 3, default_rpc_multi_remote_request_grace_ms: 3000, agent_info_expires_after_ms: 1200000, tls_in_mem_session_storage: 512, proxy_keepalive_ms: 120000, proxy_to_expire_ms: 300000, concurrent_limit_per_thread: 4096, tx2_quic_max_idle_timeout_ms: 30000, tx2_pool_max_connection_count: 4096, tx2_channel_count_per_connection: 2, tx2_implicit_timeout_ms: 30000, tx2_initial_connect_retry_delay_ms: 200 }, network_type: QuicMdns }), db_sync_strategy: Fast }
    hc-sandbox: Created directory at: /tmp/tmp.qTeWWKyVK0/LAwloneaJRjUN0vhvjj1u Keep this path to rerun the same sandbox
    hc-sandbox: Created config at /tmp/tmp.qTeWWKyVK0/LAwloneaJRjUN0vhvjj1u/conductor-config.yaml
    Jan 19 10:29:45.673  WARN holochain::conductor::conductor::builder: Using DEPRECATED legacy lair api.
    Jan 19 10:29:45.673  WARN holochain::conductor::conductor::builder: USING INSECURE PASSPHRASE FROM CONFIG--This defeats the whole purpose of having a passphrase.
    Jan 19 10:29:45.673 ERROR lair_keystore_client: error=IpcClientConnectError("/tmp/tmp.qTeWWKyVK0/LAwloneaJRjUN0vhvjj1u/keystore/socket", Os { code: 2, kind: NotFound, message: "No such file or directory" }) file="/build/cargo-vendor-dir/lair_keystore_client-0.0.9/src/lib.rs" line=41
    
    Conductor ready.
    Error: Failed to install app: Expected AdminResponse::AppBundleInstalled but got Error(InternalError("Conductor returned an error while using a ConductorApi: EntryDefStoreError(DnaError(WasmError(Compile(\"Error while importing \\\"env\\\".\\\"__app_info\\\": unknown import. Expected Function(FunctionType { params: [I32, I32], results: [] })\"))))"))
    

    I suspect it's due to the removal of app info in version 121; however, I couldn't find the source of this error. I've tried this with an empty hApp and the error still popped up.

    My default.nix config, running on Ubuntu 20.04:

    
    let
    
      holonixRev = "9c9a5a00dc05b0825841fae4ff8181182d9949ce";
    
      holonixPath = builtins.fetchTarball "https://github.com/holochain/holonix/archive/${holonixRev}.tar.gz";
      holonix = import (holonixPath) {
        holochainVersionId = "v0_0_121";
      };
      nixpkgs = holonix.pkgs;
    in nixpkgs.mkShell {
      inputsFrom = [ holonix.main ];
      buildInputs = with nixpkgs; [
        binaryen
        nodejs-16_x
      ];
    }  
    
    

    holochain cli version:

    hc -V
    holochain_cli 0.0.22
    

    Let me know if there's anything else needed!

    need reproduction 
    opened by axhue 10
  • Improve compatibility of holochain types with Zome WASM, unify Timestamp type

    Improve compatibility of holochain types with Zome WASM, unify Timestamp type

    There are two implementations of the Timestamp type; this merges them into one and makes the Timestamp type available in WASM code. The now() method was moved from being an impl for Timestamp, e.g. Timestamp::now(), to a free-floating function in the crate, e.g. timestamp::now(), so code changes are minimal and easy to understand. This improves the log output considerably, as Timestamps are always Debug-formatted as times, not as a pair of big numbers. The fixt plumbing was also updated to actually create more-or-less "correct" random Timestamps for testing, so that conversion to Duration and SystemTime won't fail during tests that assume sane Timestamps for comparison.

    In addition, various holochain error types lacked some derive implementation, making it difficult to write Zome test code in rust (as Err types couldn't be compared).

    Finally, the holo_hash types and related functions were moved around a bit, to allow Zome code to build "wrappers" around all DHT times, allowing them to be De/Serialized as Strings for use over Zome API calls. This code is not included here, but used heavily in hash-heavy Zomes, such as the new holofuel implementation I'm working on...

    • Move Timestamp from holochain/types to zome_types
    • Give more ...Error types Debug, PartialEq, ... for Rust testing
    • Move some holo_hash types/impls around so they work in Zome code
    • Convert the Timestamp::now() impl method to a timestamp::now() function. In order for Timestamp to be WASM compatible and usable in Zome code, we've moved it to holochain_wasm_types, and made the holochain_types timestamp now() a function instead of a Timestamp::now() method. Minimal code change, vastly improved Zome coding experience.
    • Update base64 crate to avoid multiple inclusions
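
    As a sketch of the call-site change described in the summary above (illustrative):

    // before: an inherent method on the type
    let t = Timestamp::now();
    // after: a free function in the timestamp module
    let t = timestamp::now();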

    opened by pjkundert 10
  • Fix problems with disabled and reenabled cells

    Fix problems with disabled and reenabled cells

    Summary

    Any app which is disabled becomes permanently unresponsive when it is re-enabled. This PR reproduces that problem and fixes it.

    The problem occurred upon leaving a space. An agent info with an empty arc is published and persisted. However, this means that when rejoining, the local agent is loaded from the database with an empty arc. I found that the simplest way to fix this is to just not persist the empty arc info upon leaving, only publishing it to our peers. Then, upon rejoining, we start with the same arc we had just before leaving.

    But this isn't really correct. Other agents may think this agent is online. So I think we should just remove the entry altogether, so when we rejoin, we can interpret it as such more easily. Maybe if arc resizing was robust and correct, this wouldn't be an issue, but we haven't streamlined that yet.

    @neonphog feedback on approach?

    TODO:

    • [x] CHANGELOG(s) updated with appropriate info
    • [ ] Just before pressing the merge button, ensure new entries to CHANGELOG(s) are still under the UNRELEASED heading
    opened by maackle 0
  • build(deps): bump tokio from 1.17.0 to 1.24.1 in /crates/kitsune_p2p/kitsune_p2p

    build(deps): bump tokio from 1.17.0 to 1.24.1 in /crates/kitsune_p2p/kitsune_p2p

    Bumps tokio from 1.17.0 to 1.24.1.

    Release notes

    Sourced from tokio's releases.

    Tokio v1.24.1

    This release fixes a compilation failure on targets without AtomicU64 when using rustc older than 1.63. (#5356)

    #5356: tokio-rs/tokio#5356

    Tokio v1.24.0

    The highlight of this release is the reduction of lock contention for all I/O operations (#5300). We have received reports of up to a 20% improvement in CPU utilization and increased throughput for real-world I/O heavy applications.

    Fixed

    • rt: improve native AtomicU64 support detection (#5284)

    Added

    • rt: add configuration option for max number of I/O events polled from the OS per tick (#5186)
    • rt: add an environment variable for configuring the default number of worker threads per runtime instance (#4250)

    Changed

    • sync: reduce MPSC channel stack usage (#5294)
    • io: reduce lock contention in I/O operations (#5300)
    • fs: speed up read_dir() by chunking operations (#5309)
    • rt: use internal ThreadId implementation (#5329)
    • test: don't auto-advance time when a spawn_blocking task is running (#5115)

    #5186: tokio-rs/tokio#5186 #5294: tokio-rs/tokio#5294 #5284: tokio-rs/tokio#5284 #4250: tokio-rs/tokio#4250 #5300: tokio-rs/tokio#5300 #5329: tokio-rs/tokio#5329 #5115: tokio-rs/tokio#5115 #5309: tokio-rs/tokio#5309

    Tokio v1.23.1

    This release forward ports changes from 1.18.4.

    Fixed

    • net: fix Windows named pipe server builder to maintain option when toggling pipe mode (#5336).

    #5336: tokio-rs/tokio#5336

    Tokio v1.23.0

    Fixed

    • net: fix Windows named pipe connect (#5208)
    • io: support vectored writes for ChildStdin (#5216)
    • io: fix async fn ready() false positive for OS-specific events (#5231)

    ... (truncated)

    Commits
    • 31c7e82 chore: prepare Tokio v1.24.1 (#5357)
    • 8d8db27 tokio: add load and compare_exchange_weak to loom StaticAtomicU64 (#5356)
    • dfe252d chore: prepare Tokio v1.24.0 release (#5353)
    • 21b233f test: bump version of async-stream (#5347)
    • 7299304 Merge branch 'tokio-1.23.x' into master
    • 1a997ff chore: prepare Tokio v1.23.1 release
    • a8fe333 Merge branch 'tokio-1.20.x' into tokio-1.23.x
    • ba81945 chore: prepare Tokio 1.20.3 release
    • 763bdc9 ci: run WASI tasks using latest Rust
    • 9f98535 Merge remote-tracking branch 'origin/tokio-1.18.x' into fix-named-pipes-1.20
    • Additional commits viewable in compare view

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    • @dependabot use these labels will set the current labels as the default for future PRs for this repo and language
    • @dependabot use these reviewers will set the current reviewers as the default for future PRs for this repo and language
    • @dependabot use these assignees will set the current assignees as the default for future PRs for this repo and language
    • @dependabot use this milestone will set the current milestone as the default for future PRs for this repo and language

    You can disable automated security fix PRs for this repo from the Security Alerts page.

    NO-MERGE (dependency) rust 
    opened by dependabot[bot] 0
  • build(deps): bump tokio from 1.23.0 to 1.23.1

    build(deps): bump tokio from 1.23.0 to 1.23.1

    Bumps tokio from 1.23.0 to 1.23.1.

    Release notes

    Sourced from tokio's releases.

    Tokio v1.23.1

    This release forward ports changes from 1.18.4.

    Fixed

    • net: fix Windows named pipe server builder to maintain option when toggling pipe mode (#5336).

    #5336: tokio-rs/tokio#5336

    Commits
    • 1a997ff chore: prepare Tokio v1.23.1 release
    • a8fe333 Merge branch 'tokio-1.20.x' into tokio-1.23.x
    • ba81945 chore: prepare Tokio 1.20.3 release
    • 763bdc9 ci: run WASI tasks using latest Rust
    • 9f98535 Merge remote-tracking branch 'origin/tokio-1.18.x' into fix-named-pipes-1.20
    • 9241c3e chore: prepare Tokio v1.18.4 release
    • 699573d net: fix named pipes server configuration builder
    • See full diff in compare view

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.



    NO-MERGE (dependency) rust 
    opened by dependabot[bot] 0
  • WIP - tx4 holochain integration

    WIP - tx4 holochain integration

    Summary

    TODO:

    • [ ] CHANGELOG(s) updated with appropriate info
    • [ ] Just before pressing the merge button, ensure new entries to CHANGELOG(s) are still under the UNRELEASED heading
    opened by neonphog 0
  • Update copyright year 2023

    Update copyright year 2023

    Summary

    Happy New Year 2023! πŸŽ†

    TODO:

    • [ ] CHANGELOG(s) updated with appropriate info
    • [ ] Just before pressing the merge button, ensure new entries to CHANGELOG(s) are still under the UNRELEASED heading
    opened by Seb33300 0
  • Remove more ValidationPackage code

    Remove more ValidationPackage code

    Summary

    TODO:

    • [ ] CHANGELOG(s) updated with appropriate info
    • [ ] Just before pressing the merge button, ensure new entries to CHANGELOG(s) are still under the UNRELEASED heading
    opened by maackle 0
Latest release: holochain-0.1.0-beta-rc.2