Cornucopia

Generate type-checked Rust from your SQL


Cornucopia is a small CLI utility resting on tokio-postgres and designed to facilitate PostgreSQL workflows in Rust.

Cornucopia aims to get out of your way, transpiling your queries to Rust on demand without requiring you to maintain a live database connection. Each query is prepared against your schema, ensuring that the statement is valid SQL. These prepared statements are then used to generate properly typed Rust code for each query. Keep reading for more info, or take a look at the examples folder for a quickstart 🚀


Install

Docker

The CLI spawns a postgres container when it has to generate Rust modules, so you need a working Docker installation. Note that on Linux, non-sudo users need to be in the docker group. Read the official installation and post-installation steps.

Dependencies

Cornucopia will generate queries powered by the tokio runtime through tokio-postgres and deadpool-postgres, so you will need to add the latest versions of these to your dependencies. You might need more dependencies depending on which features you intend to use. The code block below shows an example of what your dependencies might look like with every feature that cornucopia supports:

# Cargo.toml
[dependencies]
tokio = { version = "1.17.0", features = ["full"] }
deadpool-postgres = { version = "0.10.2", features = ["serde"] }
tokio-postgres = { version = "0.7.5", features = [
    "with-serde_json-1",
    "with-time-0_3",
    "with-uuid-0_8",
] }
serde = { version = "1.0.136", features = ["derive"] }
serde_json = "1.0.79"
time = "0.3.9"
uuid = "0.8.2"

You can omit the tokio-postgres feature flags for json, time and uuid (and their corresponding crates) if you don't need them.
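
If you're unsure how these dependencies fit together, here is a minimal sketch of a deadpool-postgres pool setup that the generated query functions can borrow clients from (the connection parameters are placeholders; adjust them to your setup):

use deadpool_postgres::{Config, Runtime};
use tokio_postgres::NoTls;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Configure a small connection pool (placeholder credentials).
    let mut cfg = Config::new();
    cfg.host = Some("127.0.0.1".to_string());
    cfg.user = Some("postgres".to_string());
    cfg.password = Some("postgres".to_string());
    cfg.dbname = Some("postgres".to_string());
    let pool = cfg.create_pool(Some(Runtime::Tokio1), NoTls)?;

    // Generated query functions borrow a client from this pool.
    let _client = pool.get().await?;
    Ok(())
}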

Cornucopia CLI

Aside from the dependencies, you will need the lightweight Cornucopia CLI to generate your Rust modules. This can be done via a simple cargo install cornucopia, which will pull the latest version and install it in your cargo path.

Concepts

This section explains a bit more about how cornucopia works. If you just want to get started, take a look at the examples folder.

Cornucopia is pretty simple to use. Your migrations and queries should each reside in a dedicated folder, and from there the CLI takes care of the rest for you. In the next sections, we'll explore the basic usage, but feel free to explore the CLI's whole interface using the --help option at any point.

Migrations

New migrations can be added using the command cornucopia migration new. Cornucopia will automatically manage migrations when it generates your Rust modules, but you can also use the command cornucopia migration run to run migrations on your production database if you so desire.

Queries

Each .sql file in your query directory will be converted into a Rust module containing functions corresponding to its queries. These functions are fully typed, so you know exactly what parameters they take and what they return. Queries are augmented by special comments allowing you to quickly fine-tune them.

Generated modules

Assuming you have the following migration

CREATE TABLE Authors (
    Id SERIAL NOT NULL,
    Name VARCHAR(70) NOT NULL,
    Country VARCHAR(100) NOT NULL,
    PRIMARY KEY(Id)
);

then, the following query

--! authors()*
SELECT * FROM Authors;

will be turned by cornucopia into

pub async fn authors(client: &Client) -> Result<Vec<(i32, String, String)>, Error> {
    let stmt = client
        .prepare_typed_cached(
            "SELECT * FROM Authors;", &[],
        )
        .await?;

    let res = client.query(&stmt, &[]).await?;

    let return_value = res
        .iter()
        .map(|res| {
            let return_value_0: i32 = res.get(0);
            let return_value_1: String = res.get(1);
            let return_value_2: String = res.get(2);
            (return_value_0, return_value_1, return_value_2)
        })
        .collect::<Vec<(i32, String, String)>>();
    Ok(return_value)
}

Not bad! The generated function uses prepared statements, a statement cache, and strong typing (Notice how the returned rows' types have been inferred!). This is only a taste of what you can achieve, but should be fairly representative of what's going on under the hood.
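
For instance, calling this generated function from your application might look like the following sketch (the pool setup and the path under which the generated module is exposed are assumptions; they depend on where you told the CLI to write the code):

// Assuming the generated module is in scope and `pool` is a deadpool-postgres pool.
let client = pool.get().await?;
let all_authors = authors(&client).await?;
for (id, name, country) in all_authors {
    println!("{}: {} ({})", id, name, country);
}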

Meta query syntax

As you may have noticed from the previous section, this little comment --! authors()* is doing a lot of heavy lifting for us. It tells cornucopia to generate a function named authors which takes no parameters. Since there is no specified return, cornucopia lets Postgres infer the types itself, which it is usually pretty good at. Then, there's the asterisk *, which signals that this query will return zero or more results. That's how we ended up with a Vec in the generated query in the section above.

Note that comments that do not start with --! are simply ignored by cornucopia, so feel free to use them as you usually would.

So, what else can we do with those annotations? The grammar can be summed up as:

--! <name>(<params>) <return> <quantifier>

In the next sections we'll explore a bit more what these options mean and what you can do with them. The notation used in this section to describe the grammar is for illustrative purposes only; the full grammar is available in the grammar.pest file.

Name

The name of the generated function. Has to be a valid Rust identifier.

Params

The parameters of the prepared statement, separated by commas, with an optional trailing comma.

The order in which parameters are given corresponds to the parameter number (e.g. the first parameter is $1 in the statement). Every PostgreSQL parameter $i must have a corresponding parameter in the meta parameter list. If the parameter type is ambiguous, you can specify it using the syntax <name> : <type>, where <type> is a PostgreSQL type supported by cornucopia. These are called override parameters. Otherwise, a parameter consists simply of an identifier, and its type is inferred from the prepared statement. These are called inferred parameters. Override parameters must come before inferred parameters. Again, we can sum this up as

<params> = <override_params>, <inferred_params>, where <override_param> = <name> : <type> and <inferred_param> = <name>.
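
For example, a query mixing one override parameter and one inferred parameter could be annotated like this (the query and its name are made up for illustration, following the grammar above):

--! authors_from_country(country : TEXT, min_id) *
SELECT * FROM Authors WHERE Country = $1 AND Id >= $2;

Here, country is explicitly declared as TEXT (an override parameter), while the type of min_id is inferred from the prepared statement (an inferred parameter, i32 for the Id column).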

Return type

There are two kinds of returns, implicit and explicit.

Implicit return

Implicit returns don't name the returned columns. The column types are inferred using prepared statements. To make a return implicit, simply omit it (you don't have to write anything).

Implicit returns are further categorized into void, scalar, and tuple types depending on the number of columns returned. For example,

  • A query returning no columns would result in ()
  • A query returning a single TEXT column would result in String
  • A query returning a TEXT and an INTEGER would result in (String, i32)
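
To illustrate, here are sketches of what such implicit returns could look like (the query names are made up):

--! author_count()
SELECT COUNT(*) FROM Authors;

--! author_names()*
SELECT Name FROM Authors;

The first query returns a single bigint column from a single row, so the generated function would return something like Result<i64, Error>; the second returns zero or more single-column rows, hence Result<Vec<String>, Error>.
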
Explicit return

Explicit returns give a name to the returned columns. The column types are inferred using prepared statements. To make a return explicit, list the returned column names inside curly brackets, in the same order as they are returned in the statement, separated by commas, with an optional trailing comma. There must be exactly as many names in the explicit return as there are returned columns.

Each query that has an explicit return will generate a Rust struct to hold the query data. For example, this query

--! example_query() {name, country} *
SELECT Name, Country FROM Authors;

would result in this struct being generated

pub struct ExampleQuery {
    pub name: String,
    pub country: String
}

pub async fn example_query(client: &Client) -> Result<Vec<ExampleQuery>, Error> {
    /* ....omitted for brevity... */
}

Quantifier

The quantifier indicates the expected number of rows to be returned by a query. If no quantifier is specified, it is assumed that only one record is to be returned. Using * and ? (corresponding to the "zero or more" and "zero or one" quantifiers) will wrap the resulting Rust type in a Vec or an Option respectively. To sum it up:

  • no quantifier results in T
  • * results in Vec<T>
  • ? results in Option<T>
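
For instance, a lookup by primary key would typically use the ? quantifier (a sketch with a made-up query name):

--! author_by_id(id) ?
SELECT Name, Country FROM Authors WHERE Id = $1;

Since the implicit return has two columns and the quantifier is ?, the generated function would return something like Result<Option<(String, String)>, Error>.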

Transactions

Cornucopia actually generates two versions of your queries: one that accepts a regular client, and another (named with a *_tx suffix) that accepts a transaction.
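
As a rough sketch, using the transaction variant of the authors query from earlier might look like this (the exact signatures are whatever the generated module declares):

// Start a transaction on a pooled client, run queries on it, then commit.
let mut client = pool.get().await?;
let transaction = client.transaction().await?;
let all_authors = authors_tx(&transaction).await?;
transaction.commit().await?;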

Supported types

PostgreSQL type                           Rust type
bool, boolean                             bool
char, character                           i8
smallint, int2, smallserial, serial2      i16
int, int4, serial, serial4                i32
bigint, int8, bigserial, serial8          i64
real, float4                              f32
double precision, float8                  f64
text                                      String
varchar                                   String
bytea                                     Vec<u8>
timestamp without time zone, timestamp    time::PrimitiveDateTime
timestamp with time zone, timestamptz     time::OffsetDateTime
date                                      time::Date
time                                      time::Time
json                                      serde_json::Value
jsonb                                     serde_json::Value
uuid                                      uuid::Uuid
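
For example, assuming a hypothetical Posts table with uuid and timestamptz columns, a query like

--! post_timestamps()*
SELECT PostId, PublishedAt FROM Posts;

would generate a function returning something like Result<Vec<(uuid::Uuid, time::OffsetDateTime)>, Error>, provided the matching tokio-postgres feature flags from the dependency section are enabled.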

License

Licensed under the MIT license.
