An egui app for prompting a local offline LLM.

Overview

Example prompt

Description

coze is a small egui application for prompting a local offline LLM using the Hugging Face candle crate.

Currently it supports the following quantized models:

The current version supports:

  • Prompt history navigation with fuzzy matching (see the sketch after this list).
  • History persistence across runs.
  • Token generation modes.
  • Copy prompts and replies to clipboard.
  • Light/Dark mode.
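
The fuzzy matching over the prompt history can be implemented with an off-the-shelf matcher. Below is a minimal sketch that ranks history entries against the text typed so far, assuming the fuzzy-matcher crate; it illustrates the idea and is not necessarily the crate or code coze actually uses:

use fuzzy_matcher::FuzzyMatcher;
use fuzzy_matcher::skim::SkimMatcherV2;

// Return the history entries that fuzzily match `query`, best score first.
fn match_history(history: &[String], query: &str) -> Vec<String> {
    let matcher = SkimMatcherV2::default();
    let mut scored: Vec<(i64, &String)> = history
        .iter()
        .filter_map(|entry| matcher.fuzzy_match(entry, query).map(|score| (score, entry)))
        .collect();
    // Higher scores are better matches.
    scored.sort_by(|a, b| b.0.cmp(&a.0));
    scored.into_iter().map(|(_, entry)| entry.clone()).collect()
}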

See the app Edit/Config menu for usage details.
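
To give an idea of the egui side of such an app, here is a minimal eframe skeleton with a prompt box and a reply area. It is only a sketch: the struct and labels are hypothetical, the generation step is a placeholder (coze feeds the prompt to its candle-backed model instead), and it assumes an eframe version (around 0.26/0.27) where the app-creator closure returns a Box<dyn App> directly:

use eframe::egui;

#[derive(Default)]
struct PromptApp {
    prompt: String,
    reply: String,
}

impl eframe::App for PromptApp {
    fn update(&mut self, ctx: &egui::Context, _frame: &mut eframe::Frame) {
        egui::CentralPanel::default().show(ctx, |ui| {
            ui.text_edit_multiline(&mut self.prompt);
            if ui.button("Generate").clicked() {
                // Placeholder: a real app would send the prompt to the
                // candle-backed model and show the generated tokens.
                self.reply = format!("echo: {}", self.prompt);
            }
            ui.separator();
            ui.label(&self.reply);
        });
    }
}

fn main() -> Result<(), eframe::Error> {
    eframe::run_native(
        "prompt-app",
        eframe::NativeOptions::default(),
        Box::new(|_cc| Box::new(PromptApp::default())),
    )
}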

Installation

The latest version of coze can be installed or updated with cargo install:

cargo install --locked coze

or by downloading the Linux, macOS, and Windows binaries produced by the release GitHub action from the releases page.

The first time it runs, it downloads the model weights from Hugging Face to the ~/.cache/coze folder.
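
The download-and-cache step follows the usual Hugging Face Hub pattern seen in candle applications. A minimal sketch with the hf-hub crate is shown below; note that coze stores weights under ~/.cache/coze while hf-hub uses its own default cache, and the repository and file names here are placeholders, not the models coze ships with:

use hf_hub::api::sync::Api;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Fetch a quantized weights file from the Hugging Face Hub,
    // reusing a locally cached copy on later runs.
    let api = Api::new()?;
    let repo = api.model("TheBloke/Mistral-7B-Instruct-v0.2-GGUF".to_string());
    let weights_path = repo.get("mistral-7b-instruct-v0.2.Q4_K_M.gguf")?;
    println!("weights available at {}", weights_path.display());
    Ok(())
}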

To build locally (debug build may be very slow):

git clone https://github.com/vincev/coze
cd coze
cargo r --release