PartiQL libraries and tools in Rust.

Overview

PartiQL Rust

This is a collection of crates to provide Rust support for the PartiQL query language.

The crates in this repository are considered experimental, under active/early development, and APIs are subject to change.

This project uses a Cargo workspace to manage the crates in this repository. The partiql crate is intended to be the crate that exports all the relevant partiql-* sub-crate functionality. The project is factored this way so that applications needing only some sub-component of the PartiQL implementation can depend on just the relevant sub-crate (e.g. an application that only requires the PartiQL parser can depend on partiql-parser directly).
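
For example, an application that only needs parsing can depend on partiql-parser alone. The sketch below shows roughly what that could look like; since the APIs are experimental and subject to change, the Parser type and parse method used here are assumptions to be checked against the current crate docs.

    use partiql_parser::Parser;

    fn main() {
        // Construct a parser and parse a PartiQL query.
        // Note: the exact names (`Parser::default`, `parse`) are assumptions;
        // consult the partiql-parser docs for the current, experimental API.
        let parser = Parser::default();
        match parser.parse("SELECT a AS b FROM data") {
            Ok(_parsed) => println!("parse succeeded"),
            Err(errors) => eprintln!("parse failed: {errors:?}"),
        }
    }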

Due to the lack of namespacing on crates.io, we have published 0.0.0 versions of the sub-crates we know we will need. A crate with version 0.1.0 or higher should have a real, albeit potentially very experimental and/or early, implementation.

Security

See CONTRIBUTING for more information.

License

This project is licensed under the Apache-2.0 License.

Comments
  • Decide which prototype parser to use

    Decide which prototype parser to use

    Currently, there are two prototype parsers: a PEG parser (using Pest) and an LR parser (generated by LALRPOP).

    We need to

    • evaluate which parser to continue development with
    • remove the other parser
    • clean up the exported API
    opened by jpschorr 6
  • Add COMMENT to Scanner grammar rule

    Add COMMENT to Scanner grammar rule

    Issue: #17

    Description of changes: I added a COMMENT rule to the Pest grammar. I was hesitant to edit Rule::Query as well. Should I?

    By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.

    opened by tbmreza 5
  • Consider adding a `Span` API

    Consider adding a `Span` API

    @dlurton points out that we probably want to consider encapsulating location information into a Span API.

    I think this is a reasonable idea and is conceptually similar to the underlying Pest library's [Span][1].

    Besides the start/end line/column, I think it would be good to also add byte-offset (UTF-8 code unit) information here.

    [1]: https://docs.rs/pest/2.1.3/pest/struct.Span.html

    Should these be in a struct by themselves?

    struct Span {
        /// Start location for this token.
        start_location: LineAndColumn,
        /// End location for this token.
        end_location: LineAndColumn
    }
    

    ??

    Originally posted by @dlurton in https://github.com/partiql/partiql-lang-rust/pull/12#discussion_r629547393
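
    One way to fold in the byte-offset suggestion, purely as a sketch (the names below are illustrative, not the crate's actual types), is to pair each line/column location with its byte offset:

    /// Placeholder for the line/column type referenced in the snippet above.
    struct LineAndColumn {
        line: usize,
        column: usize,
    }

    /// Byte offset (UTF-8 code units) from the start of the input.
    struct ByteOffset(usize);

    /// A single position carrying both human-readable and byte-based views.
    struct Location {
        line_and_column: LineAndColumn,
        byte_offset: ByteOffset,
    }

    struct Span {
        /// Start location for this token.
        start: Location,
        /// End location for this token.
        end: Location,
    }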

    enhancement good first issue 
    opened by almann 5
  • Add PartiQL Playground proof of concept (POC)

    Add PartiQL Playground proof of concept (POC)

    Issue #, if available: #128

    Description of changes: This change adds the PartiQL Playground proof of concept and includes:

    • Enables JSON serialization on the PartiQL AST and parser result (see the sketch after this list).

    • Adds the new partiql-playground package.

    • Adds the required front-end work for the POC:

      • Puts a layout in place.
      • Uses the already-available ace-editor PartiQL mode.
      • Adds collapsible JSON and graph representations.
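
    As a rough illustration of the JSON-serialization piece (hypothetical, simplified AST types only; this assumes serde/serde_json dependencies and is not the real partiql-ast), deriving serde's Serialize on the AST nodes is one way such output can be produced:

    use serde::Serialize;

    /// Hypothetical, simplified AST node used only for illustration.
    #[derive(Serialize)]
    enum Expr {
        VarRef(String),
        Path { base: Box<Expr>, key: String },
    }

    fn main() {
        let ast = Expr::Path {
            base: Box::new(Expr::VarRef("data".to_string())),
            key: "a".to_string(),
        };
        // Serialize the AST to JSON so a front-end (like the playground) can render it.
        let json = serde_json::to_string_pretty(&ast).expect("serialization should not fail");
        println!("{json}");
    }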

    By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.

    opened by am357 4
  • Resolve parser conformance tests (initial PR)

    Resolve parser conformance tests (initial PR)

    Issue #, if available: #108

    Resolves 10 case failures.

    Description of changes:

    Before (full report)

    cargo test --package partiql-conformance-tests --features "conformance_test"
    ...
    
    test result: FAILED. 76 passed; 87 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.01s
    

    After:

    cargo test --package partiql-conformance-tests --features "conformance_test"
    ...
    
    test result: FAILED. 86 passed; 77 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.01s
    

    By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.

    opened by am357 4
  • Prototype PEG & LR Parsers

    Prototype PEG & LR Parsers

    Fixes #58

    Adds:

    • prototype PEG parser using Pest.rs
    • prototype LR parser using LALRPOP (parser generator) & Logos (lexer generator)
    • benchmarks comparing prototype parsers
    • some tweaks to the experimental AST in order to complete the parser

    Note:

    • The PEG parser does not parse all the way to an AST, but rather only to Pest's Pairs abstraction.
    • The LR parser does parse all the way to the experimental AST.
    • The lexer is exported (as lalr::lex_partiql) largely for benchmarking purposes.

    By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.

    experimental 
    opened by jpschorr 4
  • Test GH Actions using `ubuntu-20.04` rather than `ubuntu-latest`

    Test GH Actions using `ubuntu-20.04` rather than `ubuntu-latest`

    Not meant to be formally reviewed. Demonstrates the fix to GH Actions builds in the last commit, 7683d31e8b9bb116bbeb5bced0ffcca06af709a4.

    Test GH Actions using ubuntu-20.04 rather than ubuntu-latest. Similar issue: https://github.com/actions/runner-images/issues/6709. For some reason, using ubuntu-latest (which uses 22.04) has caused timeouts in some of the longer tests. Manually setting the Ubuntu test version back to 20.04 should hopefully resolve this issue.

    By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.

    opened by alancai98 3
  • [WIP] Implements LATERAL JOINs using existing DAG flows and eval node structure

    [WIP] Implements LATERAL JOINs using existing DAG flows and eval node structure

    [WIP] -- still testing if LEFT JOINs can be modeled using the existing DAG flows and evaluation node structure.

    Models CROSS and INNER JOINs as lateral JOINs (i.e. makes LHS JOIN bindings available to the RHS JOIN expression), which is how the PartiQL spec defines CROSS and INNER JOINs. This allows us to remove the CrossLateral JOIN type specified in #227 and resolves the comment mentioned in that PR. The current approach in this PR moves the "JOIN"-ing/chaining of tuples to within the RHS scan, which keeps the previous DAG modeling.

    This is still a POC, as I'm evaluating how easily other JOINs can be implemented with this approach. I'm also experimenting with another approach that refactors how JOINs are modeled in the eval nodes and DAG.
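
    A minimal sketch of the lateral-evaluation idea (hypothetical types, not the evaluator's actual node structure): for each LHS binding tuple, the RHS is evaluated with that tuple's bindings in scope, and the resulting tuples are merged.

    use std::collections::HashMap;

    /// Hypothetical binding tuple: variable name -> value.
    type Bindings = HashMap<String, i64>;

    /// Lateral CROSS/INNER JOIN sketch: the RHS is re-evaluated per LHS tuple,
    /// with that tuple's bindings visible to the RHS expression.
    fn lateral_join(
        lhs: &[Bindings],
        rhs: impl Fn(&Bindings) -> Vec<Bindings>,
    ) -> Vec<Bindings> {
        let mut out = Vec::new();
        for left in lhs {
            // Evaluate the RHS with the current LHS bindings in scope.
            for right in rhs(left) {
                // Chain ("JOIN") the LHS and RHS bindings into one output tuple.
                let mut merged = left.clone();
                merged.extend(right);
                out.push(merged);
            }
        }
        out
    }

    An INNER JOIN condition would then simply filter inside the rhs closure (or over the merged tuples).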

    By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.

    opened by alancai98 3
  • Add simple and searched case when expressions to evaluator

    Add simple and searched case when expressions to evaluator

    Note: target branch is feat-value-iters. Will create a new PR targeting main once that branch's PR is merged to main.

    Issue #, if available: None.

    Description of changes: Adds simple and searched CASE WHEN expressions to the evaluator. Also fixes projection behavior when a tuple value is MISSING (previously those tuple values were included; now they are omitted).

    By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.

    opened by alancai98 3
  • [steel thread] Add query plan and eval using DAG model

    [steel thread] Add query plan and eval using DAG model

    Issue #, if available: Closes #201

    Description of changes

    Adds the required changes for enabling query evaluation using a DAG model.

    What is our definition of a DAG? We are only interested in DAGs that can be used as execution plans, which leads to the following definition. A DAG is a directed, cycle-free graph G = (V, E) with a designated root node v0 ∈ V such that all v ∈ V \ {v0} are reachable from v0. Note that this is the definition of a tree without the condition |E| = |V| − 1. Hence, all trees are DAGs. Reference: https://link.springer.com/article/10.1007/s00450-009-0061-0

    In addition:

    • The steel thread includes only Scan and Project operators, and passes the test for planning and evaluation of the following query:
    SELECT a AS b FROM data
    
    • The code is not clean, perhaps not idiomatic in some places, and lacks documentation. I'll iterate on the code in a subsequent PR when adding new operators.

    Algorithm

    We're loosely using a push model over the DAG, as described in https://link.springer.com/article/10.1007/s00450-009-0061-0 (Section 10.3.5, Pushing); a minimal sketch follows the numbered steps below.

    1. Create a DAG logical plan LogicalPlan with each node as a logical operator (constructed manually for now, until we integrate the AST -> plan lowering).
    2. Create an evaluation plan EvalPlan with the same structure as the logical plan.
    3. Toposort the EvalPlan to ensure all dependencies of a node get evaluated before the node itself.
    4. For each operator in EvalPlan, eval the node.
    5. Store the result in the node.
    6. Push the output to each consumer. For now, the final result is accumulated in a Sink node; we need to see if there are better alternatives.
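
    A minimal, self-contained sketch of steps 3-6 (the EvalPlan, EvalNode, and Sink below are hypothetical stand-ins, not the crate's actual types): toposort the plan, evaluate each node after its producers, and treat the node with no outgoing edges as the Sink that accumulates the final output.

    use std::collections::HashMap;

    type NodeId = usize;
    type Value = Vec<String>; // stand-in for a batch of binding tuples

    /// Hypothetical evaluation node: consumes its inputs, produces an output.
    trait EvalNode {
        fn eval(&self, inputs: &[Value]) -> Value;
    }

    struct EvalPlan {
        nodes: HashMap<NodeId, Box<dyn EvalNode>>,
        /// Data-flow edges from producer to consumer.
        edges: Vec<(NodeId, NodeId)>,
    }

    impl EvalPlan {
        /// Step 3: Kahn's algorithm, ordering nodes so producers come before consumers.
        fn toposort(&self) -> Vec<NodeId> {
            let mut indegree: HashMap<NodeId, usize> =
                self.nodes.keys().map(|&n| (n, 0)).collect();
            for &(_, to) in &self.edges {
                *indegree.get_mut(&to).unwrap() += 1;
            }
            let mut ready: Vec<NodeId> = indegree
                .iter()
                .filter(|(_, &d)| d == 0)
                .map(|(&n, _)| n)
                .collect();
            let mut order = Vec::new();
            while let Some(n) = ready.pop() {
                order.push(n);
                for &(from, to) in &self.edges {
                    if from == n {
                        let d = indegree.get_mut(&to).unwrap();
                        *d -= 1;
                        if *d == 0 {
                            ready.push(to);
                        }
                    }
                }
            }
            order
        }

        /// Steps 4-6: evaluate each node after its producers and push results onward;
        /// the node with no outgoing edges plays the role of the Sink.
        fn execute(&self) -> Value {
            let mut outputs: HashMap<NodeId, Value> = HashMap::new();
            for n in &self.toposort() {
                // Gather the already-computed outputs of this node's producers.
                let inputs: Vec<Value> = self
                    .edges
                    .iter()
                    .filter(|(_, to)| to == n)
                    .map(|(from, _)| outputs[from].clone())
                    .collect();
                // Evaluate the node and store its result for downstream consumers.
                outputs.insert(*n, self.nodes[n].eval(&inputs));
            }
            // The Sink's accumulated output is the final result.
            let sink = self
                .nodes
                .keys()
                .find(|&&n| !self.edges.iter().any(|&(from, _)| from == n))
                .copied()
                .expect("plan should contain a sink node");
            outputs.remove(&sink).unwrap_or_default()
        }
    }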

    Yet to be implemented

    • [ ] port all the other existing nodes and ensure the current tests pass against them
    • [ ] add error handling and validation (incl. DAG cycle validation)
    • [ ] add documentation

    Finally, we're not performing any optimization at this stage (and won't for an extended amount of time).

    By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.

    opened by am357 3
  • Give AST nodes identity & decouple source location from AST node.

    Give AST nodes identity & decouple source location from AST node.

    • Move the source location offsets that gave rise to each AST node into a LocationMap (see the sketch below)
    • Add a unique NodeId to each AST node
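
    A rough sketch of that decoupling (illustrative names, not the crate's actual types): nodes carry only an id, and source locations live in a side table keyed by that id.

    use std::collections::HashMap;

    /// Unique identity assigned to each AST node.
    #[derive(Clone, Copy, PartialEq, Eq, Hash, Debug)]
    struct NodeId(u32);

    /// Source location that gave rise to a node (line/column here for brevity).
    #[derive(Debug)]
    struct Location {
        start: (u32, u32),
        end: (u32, u32),
    }

    /// Side table mapping node identity to source location, kept out of the AST itself.
    type LocationMap = HashMap<NodeId, Location>;

    /// An AST node carries its id but no location information.
    #[derive(Debug)]
    struct AstNode<T> {
        id: NodeId,
        node: T,
    }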

    By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.

    opened by jpschorr 3
  • Look into when we can revert GH Actions to use `ubuntu-latest` rather than `ubuntu-20.04`

    Look into when we can revert GH Actions to use `ubuntu-latest` rather than `ubuntu-20.04`

    Some GH Actions builds that were previously passing failed when using ubuntu-latest (example build). Looking at the full log, the error message the host provides is:

    ...
    2023-01-04T19:34:48.7041472Z test partiql_tests::eval::query::select::select::select_mysql::select::mysql_select_29 has been running for over 60 seconds
    2023-01-04T19:34:54.0170592Z ##[error]The runner has received a shutdown signal. This can happen when the runner service is stopped, or a manually started runner is canceled.
    2023-01-04T19:34:54.2329232Z Cleaning up orphan processes

    It seems that for longer steps (e.g. a conformance test run longer than 60 seconds), the host cancels the GH Actions step, causing failures on ubuntu-latest where builds previously didn't fail. Following this similar issue, a user suggested changing to ubuntu-20.04, which fixed the GH Actions issue as shown in https://github.com/partiql/partiql-lang-rust/pull/257. We should follow up on the ubuntu-latest issue to see if there's a fix for longer steps at some point in the future.

    DoD

    • See if we can switch back to using ubuntu-latest
    build 
    opened by alancai98 0
  • PoC AST to Plan compilation and query evaluation of conformance tests

    PoC AST to Plan compilation and query evaluation of conformance tests

    Provides a PoC end-to-end pipeline: [query text + environment] -> [evaluated results]

    • Adds a compilation from AST to Logical plan for a large subset of SELECT/WHERE/FROM, expressions, and some PartiQL pathing.
    • Adds conformance test evaluation (currently assumes COERCE mode) and assertions.
    • Adds dynamic lookup for runtime name resolution of variable references (see the sketch after this list).
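
    A simplified illustration of the dynamic-lookup idea (hypothetical types; the evaluator's actual binding structures differ): a variable reference is resolved at evaluation time against the local binding tuple first, falling back to the global environment.

    use std::collections::HashMap;

    type Value = String; // placeholder value type
    type Tuple = HashMap<String, Value>;

    /// Resolve a variable reference dynamically at evaluation time:
    /// prefer the current binding tuple, then fall back to the global environment.
    fn resolve<'a>(name: &str, locals: &'a Tuple, globals: &'a Tuple) -> Option<&'a Value> {
        locals.get(name).or_else(|| globals.get(name))
    }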

    By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.

    opened by jpschorr 2
  • Decide if `branch_num` is needed in logical plan

    Decide if `branch_num` is needed in logical plan

    The branch_num was added as part of #218 to distinguish between multiple input data flows for JOIN nodes. #236 refactored JOINs to not require multiple inputs. No other current nodes use branch_num, so it can possibly be removed and added back later if needed.

    opened by alancai98 0
  • Add Criterion Compare GitHub Action

    Add Criterion Compare GitHub Action

    Tests adding the Criterion Compare GitHub Action.


    By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.

    opened by jpschorr 2
  • Adds non-reserved keywords (POC)

    Adds non-reserved keywords (POC)

    Relevant Issues

    • Related to partiql/partiql-lang-kotlin#780. It is marked as a draft pending an RFC.

    Description

    • This is a proof of concept (POC) for adding non-reserved keywords to the Rust implementation
    • I've added several keywords to the NonReservedKeywords list in LALRPOP:
      • ACYCLIC, BOTH, DOMAIN, LEADING, TRAIL, TRAILING, and USER, to match the PR linked above
      • Also ORDER, but only because ORDER is already used within the Rust grammar as a keyword -- I wanted to show how existing keywords can be converted to non-reserved keywords.
    • I also modified the preprocessor so that it can parse TRAILING, LEADING, and BOTH appropriately.
    • If they can be parsed as VarRefExprs, we convert them to SymbolPrimitives (case-insensitive), so we don't even need to know their original case (see the sketch after this list). I didn't add qualified/unqualified handling to this PR, as it's a quick POC.
    • I've added tests to show several scenarios of using non-reserved keywords.
    • I don't intend for this to be merged, as I am not familiar with the codebase -- however, I wanted to show how it could possibly be done.
    • Note: I'm unfamiliar with this language, so please excuse mistakes made along the way.
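
    A hedged sketch of the conversion described above (hypothetical token and AST types, not the actual LALRPOP grammar or partiql-ast): when a non-reserved keyword token appears where a variable reference is expected, it becomes an ordinary, case-insensitive symbol.

    /// Hypothetical token kinds for illustration.
    enum Token {
        NonReservedKeyword(&'static str), // e.g. ACYCLIC, BOTH, DOMAIN, ...
        Identifier(String),
    }

    #[derive(Debug)]
    enum CaseSensitivity {
        CaseInsensitive,
        CaseSensitive,
    }

    /// Simplified stand-in for the AST's symbol type.
    #[derive(Debug)]
    struct SymbolPrimitive {
        value: String,
        case: CaseSensitivity,
    }

    /// In a variable-reference position, a non-reserved keyword becomes an
    /// ordinary, case-insensitive symbol (unquoted identifiers likewise).
    fn to_symbol(token: Token) -> SymbolPrimitive {
        match token {
            Token::NonReservedKeyword(kw) => SymbolPrimitive {
                value: kw.to_string(),
                case: CaseSensitivity::CaseInsensitive,
            },
            Token::Identifier(name) => SymbolPrimitive {
                value: name,
                case: CaseSensitivity::CaseInsensitive,
            },
        }
    }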

    By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.

    enhancement 
    opened by johnedquinn 2
Releases (v0.1.0)
  • v0.1.0 (Aug 5, 2022)

    This release highlights a PartiQL parser API that has been tested using the syntax success and failure tests defined in the PartiQL conformance test suite. Other highlights are the PartiQL CLI REPL and query visualizer, and an experimental PartiQL web playground.

    As this is the first minor version after v0.0.0, it does not include any breaking changes.

    Added

    • Lexer & Parser for the majority of PartiQL query capabilities—see syntax success and fail tests for more details.
    • AST for the currently parsed subset of PartiQL
    • Tracking of locations in source text for ASTs and Errors
    • Parser fuzz tester
    • Conformance tests via test generation from partiql-tests
    • PartiQL Playground proof of concept (POC)
    • PartiQL CLI with REPL and query visualization features

    Full Changelog: https://github.com/partiql/partiql-lang-rust/commits/v0.1.0

    Source code (tar.gz)
    Source code (zip)
    partiql-cli (49.24 MB)