A Rust library for interacting with OpenAI's ChatGPT API, providing an easy-to-use interface and strongly typed structures.

Overview

ChatGPT Rust Library

A Rust library for interacting with OpenAI's ChatGPT API. This library simplifies the process of making requests to the ChatGPT API and parsing responses.

Features

  • Easy-to-use interface for interacting with the ChatGPT API
  • Strongly typed structures for request parameters and response data
  • Support for serialization and deserialization using Serde
  • An example CLI chat application that demonstrates library usage

Installation

Add the following line to your 'Cargo.toml' file under the '[dependencies]' section:

chat-gpt-lib-rs = "0.1.8"

Then, run cargo build to download and compile the dependencies.
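
Note that the chat method used in the next section is async, so your application also needs an async executor. The snippets in this README assume tokio, which is this document's choice rather than something the crate mandates; a matching '[dependencies]' section might look like:

[dependencies]
chat-gpt-lib-rs = "0.1.8"
# tokio is assumed here as the async runtime for the usage examples below.
tokio = { version = "1", features = ["full"] }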

Usage

First, import the necessary components:

use chat_gpt_lib_rs::{ChatGPTClient, ChatInput, Message, Model, Role};

Next, create a new client with your API key:

let api_key = "your_api_key_here";
let base_url = "https://api.openai.com";
let client = ChatGPTClient::new(api_key, base_url);

To send a chat message, create a ChatInput structure and call the chat method:

let chat_input = ChatInput {
    model: Model::Gpt3_5Turbo,
    messages: vec![
        Message {
            role: Role::System,
            content: "You are a helpful assistant.".to_string(),
        },
        Message {
            role: Role::User,
            content: "Who won the world series in 2020?".to_string(),
        },
    ],
    ..Default::default()
};

let response = client.chat(chat_input).await.unwrap();

The response will be a 'ChatResponse' structure containing the API response data.
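
Putting the pieces above together, a minimal end-to-end program might look like the following sketch. It assumes the tokio runtime (see the Installation note), reads the API key from the OPENAI_API_KEY environment variable instead of hard-coding it, and only prints the response's Debug representation, which assumes ChatResponse and the client's error type implement Debug:

use chat_gpt_lib_rs::{ChatGPTClient, ChatInput, Message, Model, Role};

#[tokio::main]
async fn main() {
    // Read the API key from the environment rather than hard-coding it.
    let api_key = std::env::var("OPENAI_API_KEY").expect("OPENAI_API_KEY is not set");
    let client = ChatGPTClient::new(api_key.as_str(), "https://api.openai.com");

    let chat_input = ChatInput {
        model: Model::Gpt3_5Turbo,
        messages: vec![
            Message {
                role: Role::System,
                content: "You are a helpful assistant.".to_string(),
            },
            Message {
                role: Role::User,
                content: "Who won the world series in 2020?".to_string(),
            },
        ],
        ..Default::default()
    };

    // Printing the Debug representation is only for illustration; a real
    // application would inspect the fields of the returned ChatResponse.
    match client.chat(chat_input).await {
        Ok(response) => println!("{:?}", response),
        Err(err) => eprintln!("Request failed: {:?}", err),
    }
}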

Example CLI Chat Application

Two example CLI chat applications are provided in the examples folder:

Simple Chat Application

The cli-simple-chat-example.rs example demonstrates how to use the chat-gpt-lib-rs library to interact with an AI model based on the GPT-3 architecture through a command-line interface. To run the example, first set your OPENAI_API_KEY in the .env file or as an environment variable, and then execute the following command:

cargo run --example cli-simple-chat-example

The example will prompt the user to enter a question, and the AI chatbot will respond with an answer. The conversation will continue until the user exits the program.

Optionally, you can provide initial user input as a command-line argument:

cargo run --example cli-simple-chat-example "Hello, computer!"

Fancy Chat Application

The cli-chat-example.rs example demonstrates how to use the chat-gpt-lib-rs library to create an interactive AI chatbot with a command-line interface. To run the example, first set your OPENAI_API_KEY in the .env file or as an environment variable, and then execute the following command:

cargo run --example cli-chat-example

The example will prompt the user to enter a message, and the AI chatbot will respond with an answer. The conversation will continue until the user exits the program.

Optionally, you can provide initial user input as a command-line argument:

cargo run --example cli-chat-example "Hello, computer!"

For an enhanced experience with icons, use a terminal that supports Nerd Fonts. To enable this feature, set USE_ICONS=true in the .env file or as an environment variable.
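
For example, in a Unix-like shell you can set the variable for a single run:

USE_ICONS=true cargo run --example cli-chat-example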

Documentation

For more details about the request parameters and response structure, refer to the OpenAI API documentation.

Example Project

There is an interesting project, teachlead, that now uses this library.

License

This project is licensed under the Apache License 2.0. See the LICENSE file for details.

Comments
  • Adding cloning option for Message

    error[E0599]: the method `clone` exists for struct `Vec<Message>`, but its trait bounds were not satisfied
       --> src/main.rs:35:32
        |
    35  |             messages: messages.clone(),
        |                                ^^^^^ method cannot be called on `Vec<Message>` due to unsatisfied trait bounds
        |
       ::: /Users/arend-jan/.cargo/registry/src/github.com-1ecc6299db9ec823/chat-gpt-lib-rs-0.1.0/src/client.rs:88:1
        |
    88  | pub struct Message {
        | ------------------ doesn't satisfy `Message: Clone`
        |
       ::: /Users/arend-jan/.rustup/toolchains/stable-aarch64-apple-darwin/lib/rustlib/src/rust/library/alloc/src/vec/mod.rs:400:1
        |
    400 | pub struct Vec<T, #[unstable(feature = "allocator_api", issue = "32838")] A: Allocator = Global> {
        | ------------------------------------------------------------------------------------------------ doesn't satisfy `Vec<Message>: Clone`
        |
        = note: the following trait bounds were not satisfied:
                `Message: Clone`
                which is required by `Vec<Message>: Clone`
    
    For more information about this error, try `rustc --explain E0599`.
    error: could not compile `computer` due to previous error
    
    enhancement 
    opened by Arend-Jan 0
  • More specific boundaries for the struct of models based on api

    Per the API spec, create a ChatOptions struct to store the optional parameters with their default values and enforce the specified constraints on their values. You can then use this struct when constructing API requests. Here's an example implementation:

    #[derive(Debug, Clone)]
    pub struct ChatOptions {
        pub temperature: f64,
        pub top_p: f64,
        pub n: u32,
        pub stream: bool,
        pub stop: Option<Vec<String>>,
        pub max_tokens: Option<u32>,
        pub presence_penalty: f64,
        pub frequency_penalty: f64,
        pub logit_bias: Option<LogitBias>,
        pub user: Option<String>,
    }
    
    impl ChatOptions {
        pub fn new() -> Self {
            ChatOptions {
                temperature: 1.0,
                top_p: 1.0,
                n: 1,
                stream: false,
                stop: None,
                max_tokens: None,
                presence_penalty: 0.0,
                frequency_penalty: 0.0,
                logit_bias: None,
                user: None,
            }
        }
    
        pub fn set_temperature(mut self, temperature: f64) -> Result<Self, &'static str> {
            if temperature >= 0.0 && temperature <= 2.0 {
                self.temperature = temperature;
                Ok(self)
            } else {
                Err("Temperature should be between 0 and 2")
            }
        }
    
        pub fn set_top_p(mut self, top_p: f64) -> Result<Self, &'static str> {
            if top_p >= 0.0 && top_p <= 1.0 {
                self.top_p = top_p;
                Ok(self)
            } else {
                Err("top_p should be between 0 and 1")
            }
        }
    
        // Implement similar setter methods for other optional parameters with constraints
    }
    

    In this example, the ChatOptions struct is created with default values for the optional parameters. You can add setter methods for each parameter that enforce the constraints mentioned in the API spec. For example, the set_temperature method checks that the temperature value is between 0 and 2 before setting it; if the value is not within the specified range, it returns an error. You can create similar methods for other parameters with constraints, such as presence_penalty and frequency_penalty.
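
    As a hypothetical usage sketch (the ChatOptions builder above is only proposed in this issue, not an API of the published crate), the validating setters could then be chained like this:

    fn build_options() -> Result<ChatOptions, &'static str> {
        // Each setter validates its argument; an out-of-range value short-circuits with an error.
        ChatOptions::new()
            .set_temperature(0.7)?
            .set_top_p(0.9)
    }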

    opened by Arend-Jan 0
Releases (v0.1.7)
  • v0.1.7 (Mar 23, 2023)

    What's Changed

    • gtp_4_32k support added by @Arend-Jan in https://github.com/Arend-Jan/chat-gpt-lib-rs/pull/32
    • update version and add reference in readme by @Arend-Jan in https://github.com/Arend-Jan/chat-gpt-lib-rs/pull/31

    Full Changelog: https://github.com/Arend-Jan/chat-gpt-lib-rs/compare/v0.1.6...v0.1.7

  • v0.1.6 (Mar 19, 2023)

    What's Changed

    • adding more test by @Arend-Jan in https://github.com/Arend-Jan/chat-gpt-lib-rs/pull/25

    Full Changelog: https://github.com/Arend-Jan/chat-gpt-lib-rs/compare/v0.1.5...v0.1.6

  • v0.1.5 (Mar 18, 2023)

    What's Changed

    • [FIX] upstep version both in cargo.toml and readme.md by @Arend-Jan in https://github.com/Arend-Jan/chat-gpt-lib-rs/pull/23

    Full Changelog: https://github.com/Arend-Jan/chat-gpt-lib-rs/compare/v0.1.4...v0.1.5

  • v0.1.4 (Mar 18, 2023)

    What's Changed

    • base ci added to the reopo by @Arend-Jan in https://github.com/Arend-Jan/chat-gpt-lib-rs/pull/15
    • 11 add ci by @Arend-Jan in https://github.com/Arend-Jan/chat-gpt-lib-rs/pull/17
    • some more sec update by @Arend-Jan in https://github.com/Arend-Jan/chat-gpt-lib-rs/pull/18
    • fixing logger problem by @Arend-Jan in https://github.com/Arend-Jan/chat-gpt-lib-rs/pull/19
    • 20 adding a more simpler example by @Arend-Jan in https://github.com/Arend-Jan/chat-gpt-lib-rs/pull/21

    Full Changelog: https://github.com/Arend-Jan/chat-gpt-lib-rs/compare/v0.1.3...v0.1.4

  • v0.1.3 (Mar 16, 2023)

    What's Changed

    • [FIX] improve based on clippy remarks by @Arend-Jan in https://github.com/Arend-Jan/chat-gpt-lib-rs/pull/13
    • Adding an example cli application by @Arend-Jan in https://github.com/Arend-Jan/chat-gpt-lib-rs/pull/14

    Full Changelog: https://github.com/Arend-Jan/chat-gpt-lib-rs/compare/v0.1.2...v0.1.3

  • v0.1.2 (Mar 16, 2023)

    What's Changed

    • Added the clone option for a message by @Arend-Jan in https://github.com/Arend-Jan/chat-gpt-lib-rs/pull/9
    • fixing the slugs to be compliant with crates.io by @Arend-Jan in https://github.com/Arend-Jan/chat-gpt-lib-rs/pull/10

    Full Changelog: https://github.com/Arend-Jan/chat-gpt-lib-rs/compare/v0.1.1...v0.1.2

  • v0.1.1 (Mar 16, 2023)

    What's Changed

    • Create LICENSE by @Arend-Jan in https://github.com/Arend-Jan/chat-gpt-lib-rs/pull/1
    • adding docs to lib.rs by @Arend-Jan in https://github.com/Arend-Jan/chat-gpt-lib-rs/pull/4
    • fixing docs to lib.rs by @Arend-Jan in https://github.com/Arend-Jan/chat-gpt-lib-rs/pull/7

    New Contributors

    • @Arend-Jan made their first contribution in https://github.com/Arend-Jan/chat-gpt-lib-rs/pull/1

    Full Changelog: https://github.com/Arend-Jan/chat-gpt-lib-rs/commits/v0.1.1
