A program that provides LLMs with the ability to complete complex tasks using plugins.

Overview

SmartGPT

SmartGPT is an experimental program meant to provide LLMs (particularly GPT-3.5 and GPT-4) with the ability to complete complex tasks without user input by breaking them down into smaller problems, and collecting information using the internet and other external sources.

Demonstration Video

Why?

There are many existing solutions for allowing LLMs to perform more complex tasks, such as Auto-GPT and BabyAGI. So, why SmartGPT?

  • Modularity: SmartGPT is designed so that you can easily add, remove, or toggle any part of it. Commands are abstracted into plugins, and LLMs are abstracted behind an interface they must implement.

  • Reasoning: As far as I know, SmartGPT outperforms other solutions by far on reasoning tasks, because it divides your task among multiple agents (Manager, Boss, Employee, Minion) and gives each agent a different reasoning role. This compartmentalization allows for much more impressive feats of reasoning. It can also save significantly on token costs, since context is split across the agents, and the experimental LLaMA support potentially lets you use smaller models.

  • Configuration: SmartGPT is incredibly easy to configure with a simple config.yml file, both for users and for developers (who can parse their configurations using Serde).

There are two main shortcomings, however.

  • Ecosystem: Auto-GPT is a much more polished and refined tool, with many more commands and memory-system integrations, and it is much more well-tested than SmartGPT.

  • Memory Management: As of right now, there is no memory system in SmartGPT. We're currently working on a memory management system that would be more flexible and work with multiple agents. Even then, however, we'd still lack the ecosystem of memory backends, such as Pinecone, that other tools integrate with. This is an area that needs work.

Disclaimer

SmartGPT isn't a ready-for-use application; it's an experiment by me, mostly for my own pleasure. It can also consume a significant number of tokens and may run requests you didn't authorize, so it's not recommended to leave it running on its own for long periods of time. You are also bound by the terms of any services used with SmartGPT, i.e. OpenAI's GPT-3, Wolfram Alpha, etc., if toggled and used.

It should also be noted that SmartGPT is a very experimental application that prioritizes rapid development over stability. Our goal is to pioneer its prompts and features, throwing ideas into the pool and seeing what floats, without any priority on polish, at least for now.

Agents

SmartGPT has the following agents:

  • Manager: Splits the main task into a few high-level subtasks, passing them to the Boss one by one.
  • Boss: Takes its task, creates a loose plan, splits it into subtasks, and gives each subtask to the Employee.
  • Employee: Takes its task, writes pseudocode, and passes it to the Minion.
  • Minion: Refines the pseudocode into a Lua script and runs it.
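As a rough sketch of the delegation chain above (using hypothetical names and plain functions, not the actual SmartGPT types — each real step involves an LLM call rather than string formatting):

```rust
// Hypothetical sketch of the four-agent delegation chain.

// The Manager splits the main task into high-level subtasks for the Boss.
fn manager(task: &str) -> Vec<String> {
    vec![format!("plan: {task}"), format!("execute: {task}")]
}

// The Boss turns one subtask into a concrete instruction for the Employee.
fn boss(subtask: &str) -> String {
    format!("instruction for employee: {subtask}")
}

// The Employee writes pseudocode for the Minion.
fn employee(instruction: &str) -> String {
    format!("pseudocode for: {instruction}")
}

// The Minion refines the pseudocode into a runnable Lua script.
fn minion(pseudocode: &str) -> String {
    format!("-- lua script implementing: {pseudocode}")
}

fn main() {
    for subtask in manager("summarize the news") {
        let script = minion(&employee(&boss(&subtask)));
        println!("{script}");
    }
}
```

The point of the structure is that each agent sees only its own slice of context, which is where the token savings come from.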

Lua Integration

SmartGPT is integrated with Lua to allow simple scripts to be run. This is a massive improvement over existing frameworks, which have to run each command one by one. However, this could still be unstable and may need work.

How To Use

Note: Installation only seems to work on Linux due to some of the crate dependencies. On Windows, consider using Windows Subsystem for Linux, or run SmartGPT in GitHub Codespaces.

Prerequisites: Rust and Cargo

  1. Clone the Repository.
git clone https://github.com/Cormanz/smartgpt.git
  2. Install FAISS (if you don't use local long-term memory, you can skip this).

Install FAISS as explained here

If you use the memory plugin without installing FAISS, it simply won't use the memory features. You'll know this because it won't log Found Memories.

  3. Run the repository.
cargo run --release
or, to enable FAISS-backed memory:
cargo run --release --features faiss

And that's it. You're done.

Plugin System

The key benefit of SmartGPT is its plugin system, so I'll go into depth on it here. A Plugin is defined as follows:

pub struct Plugin {
    pub name: String,
    pub cycle: Box<dyn PluginCycle>,
    pub dependencies: Vec<String>,
    pub commands: Vec<Command>
}

Plugins have a name, a set of dependencies (other plugins they require you to also have), and a set of commands they register.

A Command is defined as follows:

pub struct Command {
    pub name: String,
    pub purpose: String,
    pub args: Vec<(String, String)>,
    pub run: Box<dyn CommandImpl>
}

Commands have a name, a purpose, and args; the latter two describe to the LLM how the command is used. They also have run, a dynamic trait object that defines what happens when the command is invoked.

#[async_trait]
pub trait CommandImpl {
    async fn invoke(&self, ctx: &mut CommandContext, args: HashMap<String, String>) -> Result<String, Box<dyn Error>>;
}

args is provided as a HashMap. Parsing those arguments is left as an exercise to the command implementation, but it's usually pretty easy with Rust's ? operator.
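For instance, a command might pull required arguments out of the map like this (a simplified, synchronous sketch with illustrative names — the real trait is async and takes a CommandContext):

```rust
use std::collections::HashMap;
use std::error::Error;

// Illustrative helper: look up a required argument in the args map,
// turning a missing key into an error that `?` can propagate.
fn get_arg<'a>(
    args: &'a HashMap<String, String>,
    name: &str,
) -> Result<&'a str, Box<dyn Error>> {
    args.get(name)
        .map(String::as_str)
        .ok_or_else(|| format!("missing required argument: {name}").into())
}

// A toy command using the helper; real SmartGPT commands are async.
fn run_echo(args: &HashMap<String, String>) -> Result<String, Box<dyn Error>> {
    let text = get_arg(args, "text")?;
    Ok(format!("echo: {text}"))
}

fn main() -> Result<(), Box<dyn Error>> {
    let mut args = HashMap::new();
    args.insert("text".to_string(), "hello".to_string());
    println!("{}", run_echo(&args)?);
    Ok(())
}
```

If the argument is missing, the ? operator bubbles the error straight out of the command, which the caller can then report back to the LLM.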

Back to plugins: plugins also have cycle, a dynamic trait object implementing PluginCycle.

#[async_trait]
pub trait PluginCycle {
    async fn create_context(&self, context: &mut CommandContext, previous_prompt: Option<&str>) -> Result<Option<String>, Box<dyn Error>>;

    async fn create_data(&self, value: Value) -> Option<Box<dyn PluginData>>;
}

create_context defines whether the plugin puts extra text at the beginning of the prompt, and if so, what. This is mainly used to remind the LLM of what files it has and what memories it has pulled.

create_data defines the long-term data that the plugin stores. Because of how Rust works, it's very tricky to convert the PluginData trait object back into any one of its concrete types, like MemoryData. Instead, you send invocations to the PluginData and parse out a response. Here's an example:

    invoke::<bool>(chatgpt_info, "push", ChatGPTMessage {
        role: ChatGPTRole::User,
        content: query.to_string()
    }).await?;

We take our plugin data, chatgpt_info, tell it to push a new message, and it returns a bool. It's not the prettiest syntax, but decoupling plugin data from the rest of SmartGPT was one of the goals of the project, so this compromise was necessary (unless there's a better way to do this in Rust).
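One way this kind of decoupling can be approximated in plain Rust is with std::any::Any: the plugin interprets a named message and the caller downcasts the opaque response to the type it expects. This is a hypothetical sketch loosely modeled on the invoke call above, not SmartGPT's actual implementation:

```rust
use std::any::Any;

// Hypothetical message-based plugin data; names are illustrative.
trait PluginData {
    // The plugin interprets a named message and returns an opaque response.
    fn apply(&mut self, name: &str, value: Box<dyn Any>) -> Box<dyn Any>;
}

struct ChatData {
    messages: Vec<String>,
}

impl PluginData for ChatData {
    fn apply(&mut self, name: &str, value: Box<dyn Any>) -> Box<dyn Any> {
        match name {
            // "push" stores a message and reports success as a bool.
            "push" => match value.downcast::<String>() {
                Ok(msg) => {
                    self.messages.push(*msg);
                    Box::new(true)
                }
                Err(_) => Box::new(false),
            },
            _ => Box::new(false),
        }
    }
}

// The caller names the response type it expects, mirroring invoke::<bool>.
fn invoke<T: 'static>(
    data: &mut dyn PluginData,
    name: &str,
    value: Box<dyn Any>,
) -> Option<T> {
    data.apply(name, value).downcast::<T>().ok().map(|b| *b)
}

fn main() {
    let mut chat = ChatData { messages: vec![] };
    let ok = invoke::<bool>(&mut chat, "push", Box::new("hello".to_string()));
    println!("pushed: {:?}, count: {}", ok, chat.messages.len());
}
```

The trade-off is the same as in SmartGPT: the caller and the plugin agree on message names and payload types only by convention, which is the price of keeping the plugin's data type out of the core.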

License

SmartGPT is available under the MIT license. See LICENSE for the full license text.
