A demo repo that shows how to use the latest component model feature in wasmtime to implement a key-value capability defined in a WIT file.

Key-Value Component Demo

Overview

This repo serves as an example of how to use the latest wasm runtime, wasmtime, and its component-model feature to build and execute a wasm component. The wasm component uses the key-value interface to talk to an in-memory key-value store provided by the host runtime. The interface is defined in a WIT file, and the bindings to the wasm component types are generated by the wit-bindgen tool.
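As a point of reference, below is a minimal sketch of the kind of in-memory store a host could expose through such a key-value interface. The struct and method names (KvStore, get, set) are assumptions for illustration, not the repo's actual types:

use std::collections::HashMap;

// Hypothetical in-memory key-value store backing the WIT-defined capability.
// Names here are illustrative; the real implementation lives in this repo.
#[derive(Default)]
struct KvStore {
    map: HashMap<String, Vec<u8>>,
}

impl KvStore {
    // Return a copy of the value stored under `key`, if present.
    fn get(&self, key: &str) -> Option<Vec<u8>> {
        self.map.get(key).cloned()
    }

    // Store `value` under `key`, replacing any previous value.
    fn set(&mut self, key: String, value: Vec<u8>) {
        self.map.insert(key, value);
    }
}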

Run

The following command will run the wasm component in the host:

cargo run

If everything goes well, you will see the following output:

     Running `target/debug/host`
Level::Info I/O: res from guest: world

res from host: "world"

Build

To build the wasm component, run the following commands:

cd ./guest
cargo build --release --target wasm32-wasi

After running the above commands, the guest wasm module will be generated at ./guest/target/wasm32-wasi/release/guest.wasm. This is not a wasm component yet. We can use a tool called wasm-tools to convert the wasm module into a wasm component.

wasm-tools component new \
./guest/target/wasm32-wasi/release/guest.wasm -o ./target/guest.component.wasm

The above command will generate a wasm component at ./target/guest.component.wasm.

Note: As of now, the component-model feature in wasmtime does not yet support wasi_snapshot_preview1. This means the wasm component cannot be executed by a host that embeds wasmtime, because the component's WASI imports are not provided by the host.

Workaround

Download "wasi_snapshot_preview1.wasm" from https://github.com/bytecodealliance/preview2-prototyping/releases/download/latest/wasi_snapshot_preview1.wasm. This wasi module bridges the preview1 ABI to the preview2 ABI of the component model.

The repo also contains a host implementation of the preview2 ABI. The host can be used as a library to build a new host. See these two imports in src/main.rs:

use host::{add_to_linker, Wasi, WasiCtx};
use wasi_cap_std_sync::WasiCtxBuilder;
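A rough sketch of how a host built on this library might wire everything together with wasmtime's component API is shown below. Treat it as an outline under assumptions: the exact add_to_linker signature, the WasiCtxBuilder methods, and the store data layout follow the host crate in this repo and may differ.

use anyhow::Result;
use host::{add_to_linker, WasiCtx};
use wasi_cap_std_sync::WasiCtxBuilder;
use wasmtime::component::{Component, Linker};
use wasmtime::{Config, Engine, Store};

fn main() -> Result<()> {
    // The component model must be enabled explicitly on the engine.
    let mut config = Config::new();
    config.wasm_component_model(true);
    let engine = Engine::new(&config)?;

    // The store's data holds the WASI context consumed by the preview2 host
    // implementation. (Builder methods are assumptions; the repo may configure
    // the context differently.)
    let wasi = WasiCtxBuilder::new().inherit_stdio().build();
    let mut store = Store::new(&engine, wasi);

    // Wire the preview2 WASI functions into the linker. The closure tells
    // add_to_linker how to reach the WasiCtx inside the store data; the exact
    // signature is defined by the host crate and assumed here. The repo's
    // key-value host implementation would also be registered with the linker.
    let mut linker = Linker::new(&engine);
    add_to_linker(&mut linker, |ctx: &mut WasiCtx| ctx)?;

    // Load and instantiate the adapted component produced by wasm-tools.
    let component = Component::from_file(&engine, "./target/guest.component.wasm")?;
    let instance = linker.instantiate(&mut store, &component)?;

    // ... invoke the component's exports here (see the sketch further below).
    Ok(())
}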

After downloading the wasi module, we can use the --adapter option of wasm-tools component new to convert the wasm module into a wasm component that implements the required WASI ABI:

wasm-tools component new \
./guest/target/wasm32-wasi/release/guest.wasm -o ./target/guest.component.wasm \
--adapter ./wasi_snapshot_preview1.wasm

Now ./target/guest.component.wasm is the desired wasm component, ready to be executed by the host.
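Continuing the host sketch above (in place of its final comment), calling into the component could look roughly like this. The export name and its string-in, string-out signature are assumptions; the real names come from the WIT world in this repo:

// Hypothetical: look up an exported function that takes and returns a string.
let func = instance.get_typed_func::<(String,), (String,)>(&mut store, "hello")?;
let (res,) = func.call(&mut store, ("hello".to_string(),))?;
// Typed component functions must be post-returned before they can be called again.
func.post_return(&mut store)?;
println!("res from host: {:?}", res);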
