GeoZero


Zero-Copy reading and writing of geospatial data.

GeoZero defines an API for reading geospatial data formats without an intermediate representation. It defines traits which can be implemented to read and convert to an arbitrary format or render geometries directly.
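The trait-based streaming idea can be illustrated with a self-contained sketch. The `CoordSink` trait below is a deliberately simplified, hypothetical stand-in for geozero's `GeomProcessor` (plain Rust, no dependencies): a reader pushes coordinates into a sink as it parses, and the consumer does its work without any intermediate geometry being built.

```rust
// Simplified sketch of the visitor pattern geozero is built around.
// `CoordSink` is a hypothetical, minimal stand-in for GeomProcessor.
trait CoordSink {
    fn xy(&mut self, x: f64, y: f64);
}

// A "reader" drives the sink directly from its input.
fn process_ring(coords: &[(f64, f64)], sink: &mut impl CoordSink) {
    for &(x, y) in coords {
        sink.xy(x, y);
    }
}

// A consumer implements the sink; here it accumulates a bounding box
// without ever materializing a geometry object.
struct BboxCollector {
    min: (f64, f64),
    max: (f64, f64),
}

impl CoordSink for BboxCollector {
    fn xy(&mut self, x: f64, y: f64) {
        self.min.0 = self.min.0.min(x);
        self.min.1 = self.min.1.min(y);
        self.max.0 = self.max.0.max(x);
        self.max.1 = self.max.1.max(y);
    }
}

fn main() {
    let ring = [(0.0, 0.0), (10.0, 0.0), (10.0, 6.0), (0.0, 6.0), (0.0, 0.0)];
    let mut bbox = BboxCollector {
        min: (f64::MAX, f64::MAX),
        max: (f64::MIN, f64::MIN),
    };
    process_ring(&ring, &mut bbox);
    println!("{:?} {:?}", bbox.min, bbox.max); // (0.0, 0.0) (10.0, 6.0)
}
```

The real `GeomProcessor` has many more events (geometry begin/end, rings, dimensions), but the shape is the same: readers emit events, writers and analyzers consume them.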

Supported geometry types:

Supported dimensions: X, Y, Z, M, T

Available implementations

  • GeoJSON Reader + Writer
  • GEOS Reader + Writer
  • GDAL geometry Reader + Writer
  • WKB Reader + Writer
  • WKT Writer
  • SVG Writer
  • geo-types Reader + Writer

geozero-shp

  • Shapefile Reader

flatgeobuf

  • FlatGeobuf Reader

Applications

Conversion API

Convert a GeoJSON polygon to geo-types and calculate its centroid:

let geojson = GeoJson(r#"{"type": "Polygon", "coordinates": [[[0, 0], [10, 0], [10, 6], [0, 6], [0, 0]]]}"#);
if let Ok(Geometry::Polygon(poly)) = geojson.to_geo() {
    assert_eq!(poly.centroid().unwrap(), Point::new(5.0, 3.0));
}

Full source code: geo_types.rs

Convert GeoJSON to a GEOS prepared geometry:

let geojson = GeoJson(r#"{"type": "Polygon", "coordinates": [[[0, 0], [10, 0], [10, 6], [0, 6], [0, 0]]]}"#);
let geom = geojson.to_geos().expect("GEOS conversion failed");
let prepared_geom = geom.to_prepared_geom().expect("to_prepared_geom failed");
let geom2 = geos::Geometry::new_from_wkt("POINT (2.5 2.5)").expect("Invalid geometry");
assert_eq!(prepared_geom.contains(&geom2), Ok(true));

Full source code: geos.rs

Read a FlatGeobuf subset as GeoJSON:

let mut file = BufReader::new(File::open("countries.fgb")?);
let mut fgb = FgbReader::open(&mut file)?;
fgb.select_bbox(8.8, 47.2, 9.5, 55.3)?;
println!("{}", fgb.to_json()?);

Full source code: geojson.rs

Read FlatGeobuf data as geo-types geometries and calculate label position with polylabel-rs:

let mut file = BufReader::new(File::open("countries.fgb")?);
let mut fgb = FgbReader::open(&mut file)?;
fgb.select_all()?;
while let Some(feature) = fgb.next()? {
    let name: String = feature.property("name").unwrap();
    if let Ok(Geometry::MultiPolygon(mpoly)) = feature.to_geo() {
        if let Some(poly) = mpoly.0.iter().next() {
            let label_pos = polylabel(poly, &0.10).unwrap();
            println!("{}: {:?}", name, label_pos);
        }
    }
}

Full source code: polylabel.rs

PostGIS usage examples

Select and insert geo-types geometries with rust-postgres:

let mut client = Client::connect(&std::env::var("DATABASE_URL").unwrap(), NoTls)?;

let row = client.query_one(
    "SELECT 'SRID=4326;POLYGON ((0 0, 2 0, 2 2, 0 2, 0 0))'::geometry",
    &[],
)?;

let value: wkb::Decode<geo_types::Geometry<f64>> = row.get(0);
if let Some(geo_types::Geometry::Polygon(poly)) = value.geometry {
    assert_eq!(
        *poly.exterior(),
        vec![(0.0, 0.0), (2.0, 0.0), (2.0, 2.0), (0.0, 2.0), (0.0, 0.0)].into()
    );
}

// Insert geometry
let geom: geo_types::Geometry<f64> = geo::Point::new(1.0, 3.0).into();
let _ = client.execute(
    "INSERT INTO point2d (datetimefield,geom) VALUES(now(),ST_SetSRID($1,4326))",
    &[&wkb::Encode(geom)],
);

Select and insert geo-types geometries with SQLx:

let pool = PgPoolOptions::new()
    .max_connections(5)
    .connect(&env::var("DATABASE_URL").unwrap())
    .await?;

let row: (wkb::Decode<geo_types::Geometry<f64>>,) =
    sqlx::query_as("SELECT 'SRID=4326;POLYGON ((0 0, 2 0, 2 2, 0 2, 0 0))'::geometry")
        .fetch_one(&pool)
        .await?;
let value = row.0;
if let Some(geo_types::Geometry::Polygon(poly)) = value.geometry {
    assert_eq!(
        *poly.exterior(),
        vec![(0.0, 0.0), (2.0, 0.0), (2.0, 2.0), (0.0, 2.0), (0.0, 0.0)].into()
    );
}

// Insert geometry
let geom: geo_types::Geometry<f64> = geo::Point::new(10.0, 20.0).into();
let _ = sqlx::query(
    "INSERT INTO point2d (datetimefield,geom) VALUES(now(),ST_SetSRID($1,4326))",
)
.bind(wkb::Encode(geom))
.execute(&pool)
.await?;

Using compile-time verification requires type overrides:

let _ = sqlx::query!(
    "INSERT INTO point2d (datetimefield, geom) VALUES(now(), $1::geometry)",
    wkb::Encode(geom) as _
)
.execute(&pool)
.await?;

struct PointRec {
    pub geom: wkb::Decode<geo_types::Geometry<f64>>,
    pub datetimefield: Option<OffsetDateTime>,
}
let rec = sqlx::query_as!(
    PointRec,
    r#"SELECT datetimefield, geom as "geom!: _" FROM point2d"#
)
.fetch_one(&pool)
.await?;
assert_eq!(
    rec.geom.geometry.unwrap(),
    geo::Point::new(10.0, 20.0).into()
);

Full source code: postgis.rs

Processing API

Count vertices of an input geometry:

struct VertexCounter(u64);

impl GeomProcessor for VertexCounter {
    fn xy(&mut self, _x: f64, _y: f64, _idx: usize) -> Result<()> {
        self.0 += 1;
        Ok(())
    }
}

let mut vertex_counter = VertexCounter(0);
geometry.process(&mut vertex_counter, GeometryType::MultiPolygon)?;

Full source code: geozero-api.rs

Find the maximum height in 3D polygons:

struct MaxHeightFinder(f64);

impl GeomProcessor for MaxHeightFinder {
    fn coordinate(
        &mut self,
        _x: f64,
        _y: f64,
        z: Option<f64>,
        _m: Option<f64>,
        _t: Option<f64>,
        _tm: Option<u64>,
        _idx: usize,
    ) -> Result<()> {
        if let Some(z) = z {
            if z > self.0 {
                self.0 = z
            }
        }
        Ok(())
    }
}

let mut max_finder = MaxHeightFinder(0.0);
while let Some(feature) = fgb.next()? {
    let geometry = feature.geometry().unwrap();
    geometry.process(&mut max_finder, GeometryType::MultiPolygon)?;
}

Full source code: geozero-api.rs

Render polygons:

struct PathDrawer<'a> {
    canvas: &'a mut CanvasRenderingContext2D,
    path: Path2D,
}

impl<'a> GeomProcessor for PathDrawer<'a> {
    fn xy(&mut self, x: f64, y: f64, idx: usize) -> Result<()> {
        if idx == 0 {
            self.path.move_to(vec2f(x, y));
        } else {
            self.path.line_to(vec2f(x, y));
        }
        Ok(())
    }
    fn linestring_end(&mut self, _tagged: bool, _idx: usize) -> Result<()> {
        self.path.close_path();
        self.canvas.fill_path(
            mem::replace(&mut self.path, Path2D::new()),
            FillRule::Winding,
        );
        Ok(())
    }
}

Full source code: flatgeobuf-gpu

Read a FlatGeobuf dataset with the async HTTP client, applying a bbox filter, and convert it to GeoJSON:

let url = "https://flatgeobuf.org/test/data/countries.fgb";
let mut fgb = HttpFgbReader::open(url).await?;
fgb.select_bbox(8.8, 47.2, 9.5, 55.3).await?;

let mut fout = BufWriter::new(File::create("countries.json")?);
let mut json = GeoJsonWriter::new(&mut fout);
fgb.process_features(&mut json).await?;

Full source code: geojson.rs

Create a KD-tree index with kdbush:

struct PointIndex {
    pos: usize,
    index: KDBush,
}

impl geozero::GeomProcessor for PointIndex {
    fn xy(&mut self, x: f64, y: f64, _idx: usize) -> Result<()> {
        self.index.add_point(self.pos, x, y);
        self.pos += 1;
        Ok(())
    }
}

let mut points = PointIndex {
    pos: 0,
    index: KDBush::new(1249, DEFAULT_NODE_SIZE),
};
read_geojson_geom(&mut f, &mut points)?;
points.index.build_index();

Full source code: kdbush.rs

Comments
  • GeoArrow WKB reader


    For https://github.com/georust/geozero/issues/37. As a disclaimer, this is my first Rust PR to a non-personal project, so suggestions welcomed 🙂. I'm new to the geozero API as well, so I tried to model after existing code. I figured I'd put up a draft PR to start some discussion.

    Most likely GeoArrow/GeoParquet will end up having two geometry formats: WKB for universal compatibility and a "native" Arrow encoding for zero-copy performance. (For context, GeoArrow is the memory format, GeoParquet is the file format; GeoParquet decodes into GeoArrow.) As I mentioned in #37, the arrow-native encoding is faster because it gives constant-time access to any individual coordinate. (Though note that the arrow-native encoding is still provisional.)

    So it seems like there are four parts:

    • GeoArrow WKB reader
    • GeoArrow WKB writer
    • GeoArrow Arrow-native reader
    • GeoArrow Arrow-native writer

    The WKB reader seems to be the easiest, so this PR starts there.

    Open questions:

    • I'm not sure of the right way to handle a vector of geometries. I suppose the current use of geometrycollection_begin and geometrycollection_end exposes the collection of geometries as a geo GeometryCollection?

      Or should this implement GeozeroDatasource instead? The issue with that is that it seems to require implementing a feature processor. And it's unclear whether the struct should take a single array or a chunk with multiple arrays. Should there be different structs for just the geometry array and for the entire table?

    • Since the struct is defined as

      pub struct GeoArrowWkb(pub BinaryArray<i32>);
      

      this takes ownership of the arrow array right? Seems like it would be preferable to take a reference here, but I don't really understand lifetimes yet 😅 .

    • There are two different rust Arrow implementations: arrow and arrow2. They're both pretty feature complete, so ideally geozero wouldn't enforce one or the other. I'm not sure the best way to do this... feature flags?
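    On the ownership question above: a borrowing variant can be sketched with a lifetime parameter. This is illustrative only, using a plain byte slice in place of BinaryArray<i32>:

```rust
// Owning wrapper: moves the buffer into the struct (like the PR's
// `pub struct GeoArrowWkb(pub BinaryArray<i32>)`).
pub struct OwnedWkb(pub Vec<u8>);

// Borrowing wrapper: holds only a reference. The lifetime 'a ties the
// struct to the buffer it borrows, so nothing is moved or copied and
// the caller keeps ownership.
pub struct WkbRef<'a>(pub &'a [u8]);

fn main() {
    let buf: Vec<u8> = vec![0x01, 0x01, 0x00]; // stand-in for WKB bytes
    let wkb = WkbRef(&buf); // borrows buf
    assert_eq!(wkb.0.len(), buf.len()); // buf is still usable here
    let _owned = OwnedWkb(buf); // this, by contrast, consumes buf
}
```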

    opened by kylebarron 9
  • Format support


    This is a tracking issue for the current format support. Wishes for priorities or other formats can be added as comment.

    |            | Reader (XY) | Reader (ZM, t/tm) | Writer (XY) | Writer (ZM, t/tm) | FromWkb |
    |------------|-------------|-------------------|-------------|-------------------|---------|
    | geo-types  | ✓           | -                 | ✓           | -                 | ✓       |
    | GeoJSON    | ✓           |                   | ✓           |                   | ✓       |
    | GDAL       | ✓           | ✓ (Z)             | ✓           | (broken)          | ✓       |
    | GEOS       | ✓           | ✓ (Z)             | ✓           |                   | ✓       |
    | SVG        |             | -                 | ✓           | -                 | ✓       |
    | WKB        | ✓           | ✓ (ZM)            | ✓           | ✓ (ZM)            | -       |
    | WKT        |             |                   | ✓           | ✓ (ZM)            | ✓       |
    | Flatgeobuf | ✓           | ✓                 |             |                   |         |
    | Shapefile  | ✓           |                   |             |                   |         |

    Property and Dataset Support:

    |            | Read properties | Read dataset | Write properties | Write dataset |
    |------------|-----------------|--------------|------------------|---------------|
    | GeoJSON    |                 | ✓            |                  | ✓             |
    | GDAL       |                 |              |                  |               |
    | SVG        | -               |              | -                | ✓             |
    | Flatgeobuf | ✓               | ✓            |                  |               |
    | Shapefile  | ✓               |              |                  |               |

    opened by pka 9
  • Add FeatureIterator for streaming deserialize


    ~~This also includes the FeatureIterator from https://github.com/georust/geojson/pull/181 until further notice.~~ UPDATE: now uses it from released geojson 0.22.3

    I'm unsure why the WKT in from_file_fc test differs from the from_file test in that WKT is not comma separated.

    opened by bjornharrtell 8
  • Fallible properties getter


    Signature to get a property is:

    fn property<T: PropertyReadType>(&self, name: &str) -> Option<T>
    

    But that doesn't allow you to distinguish between "property missing" and "invalid property" (and an error with details about why it's invalid).

    I guess I was expecting some kind of Result rather than Option. Is that something you'd consider?

    e.g. trying to parse a string field into an integer.

    {
      "type": "Feature",
      "geometry": {
        "type": "Point",
        "coordinates": [125.6, 10.1]
      },
      "properties": {
        "foo": "Dinagat Islands"
      }
    }
    

    Maybe something like this:

    let foo: u64;
    match feature.properties::<u64>("foo") {
      Ok(Some(v)) => foo = v,
      Ok(None) => println!("missing property"),
      Err(e) => println!("invalid property: {}", e),
    }
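    The missing-vs-invalid distinction being proposed can be sketched in plain Rust, modelling the property source as a string map (property_u64 here is a hypothetical helper, not geozero's API):

```rust
use std::collections::HashMap;
use std::num::ParseIntError;

// Hypothetical fallible getter: Ok(None) = property missing,
// Err(..) = property present but not parseable as the requested type.
fn property_u64(props: &HashMap<&str, &str>, name: &str) -> Result<Option<u64>, ParseIntError> {
    match props.get(name) {
        None => Ok(None),
        Some(raw) => raw.parse::<u64>().map(Some),
    }
}

fn main() {
    let props = HashMap::from([("foo", "Dinagat Islands"), ("population", "42")]);
    assert_eq!(property_u64(&props, "population"), Ok(Some(42)));
    assert_eq!(property_u64(&props, "missing"), Ok(None));
    assert!(property_u64(&props, "foo").is_err()); // present but invalid
}
```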
    
    opened by michaelkirk 8
  • GeometryCollection.to_geo() returns only the last geometry


    We hit upon this by passing geometries from postgres using geozero: When a GeometryCollection is stored as a postgis Geometry field in the database, and fetched through geozero using the Decode impl, only the last geometry was returned. I'd expect the full collection to be returned instead.

    Reproducing repository: https://github.com/audunska/geozero-bug

    opened by audunska 7
  • How to convert projection


    Since I encounter mentions of SRID in the issues and docs, I am a bit puzzled: can this library also convert from one projection to another? Or would I need to use it in combination with https://github.com/georust/proj, and if so, is there maybe an example of how to do that?

    opened by musicformellons 5
  • `impl_scalar_property_reader!` for usize?


    re: https://github.com/flatgeobuf/flatgeobuf/pull/109#issuecomment-785682610

    let population: Option<usize> = feature.property("population");

    It looks like there is no impl_scalar_property_reader! for usize.

    In my specific use case, I realized it was inappropriate use of usize anyway, and switched to a u64, so everything is working for me.

    Is usize something you want to support?

    It seems somewhat fraught as a serializable type since it can have different size on different machines.

    opened by michaelkirk 4
  • Update env_logger requirement from 0.9.0 to 0.10.0


    Updates the requirements on env_logger to permit the latest version.

    Changelog

    Sourced from env_logger's changelog.

    0.10.0 - 2022-11-24

    MSRV changed to 1.60 to hide optional dependencies

    Fixes

    • Resolved soundness issue by switching from atty to is-terminal

    Breaking Changes

    To open room for changing dependencies:

    • Renamed termcolor feature to color
    • Renamed atty feature to auto-color

    0.9.3 - 2022-11-07

    • Fix a regression from v0.9.2 where env_logger would fail to compile with the termcolor feature turned off.

    0.9.2 - 2022-11-07

    • Fix and un-deprecate Target::Pipe, which was basically not working at all before and deprecated in 0.9.1.

    0.9.0 -- 2022-07-14

    Breaking Changes

    • Default message format now prints the target instead of the module

    Improvements

    • Added a method to print the module instead of the target
    Commits

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    dependencies 
    opened by dependabot[bot] 3
  • Using this crate to do bulk inserts with sqlx


    Hi, Would it be possible to use this crate to do bulk inserts as described by the FAQ section of SQLX? https://github.com/launchbadge/sqlx/blob/master/FAQ.md#how-can-i-bind-an-array-to-a-values-clause-how-can-i-do-bulk-inserts

    I'm having trouble with type casting the geometry arrays.

    opened by erikpols 3
  • csv reader


    The reader portion of #24.

    I've hit a bump while working on the writer portion, so that might need to wait a bit. Also, if I'm being honest, I only personally need the reader portion at the moment, so am being lazy. 😄

    opened by michaelkirk 3
  • fix issue processing certain types at start of ShapeRecord


    I ran into a situation when reading a shapefile into geojson where, if the first property of a given record is null (and thereby omitted by the processor), a comma is written by the GeoJsonWriter for the next property, resulting in invalid JSON like {, "second_property": 123}

    This change offsets the i value given to the GeoJsonWriter (or whatever writer) when we're skipping null properties.
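    The fix can be illustrated with a minimal sketch (plain Rust, not the actual GeoJsonWriter code): the writer decides whether to emit a comma from the property index it receives, so skipped null properties must not advance that index.

```rust
// Serialize properties to a JSON object, skipping null (None) values.
// Tracking `emitted` separately from the source position keeps the
// comma placement valid even when the first property is null.
fn props_to_json(props: &[(&str, Option<i64>)]) -> String {
    let mut out = String::from("{");
    let mut emitted = 0; // the index as seen by the writer
    for (name, value) in props {
        if let Some(v) = value {
            if emitted > 0 {
                out.push_str(", ");
            }
            out.push_str(&format!("\"{}\": {}", name, v));
            emitted += 1;
        } // null properties are skipped and do not bump `emitted`
    }
    out.push('}');
    out
}

fn main() {
    // A null first property no longer produces `{, "second_property": 123}`.
    let rec = [("first", None), ("second_property", Some(123))];
    assert_eq!(props_to_json(&rec), "{\"second_property\": 123}");
}
```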

    opened by jcary741 3
  • derive FromSqlRow for Ewkb


    Am new to both geozero and diesel, but for the queries I was trying to write, I needed to add this derive to execute the following (truncated) code:

    The diesel errors were kind of inscrutable, but from what I could surmise, the general issue is this: while st_union_agg (a SQL function I defined to wrap the aggregate version of PostGIS' ST_Union) returns a geozero::postgis::diesel::sql_types::Geometry, an Ewkb cannot be obtained from the select statement without explicit conversion, without selecting the Geometry column itself, or without this trait implementation.

    Sorry I can't provide a better explanation.

    #[derive(Queryable)]
    struct CollectionDiff {
        pub start: DateTime<Utc>,
        pub end: DateTime<Utc>,
        pub updated: DateTime<Utc>,
        pub footprint: Ewkb,
    }
    
    stac::table
        .select((
            min(stac::observed).assume_not_null(),
            max(stac::observed).assume_not_null(),
            max(stac::updated).assume_not_null(),
            st_union_agg(stac::geometry),
        ))
        .get_result::<CollectionDiff>(&mut conn)?;
    
    
    opened by sunny-g 0
  • Update clap requirement from 3.1.18 to 4.0.32


    Updates the requirements on clap to permit the latest version.

    Release notes

    Sourced from clap's releases.

    v4.0.32

    [4.0.32] - 2022-12-22

    Fixes

    • (parser) When overriding required(true), consider args that conflict with its group
    Changelog

    Sourced from clap's changelog.

    [4.0.32] - 2022-12-22

    Fixes

    • (parser) When overriding required(true), consider args that conflict with its group

    [4.0.31] - 2022-12-22

    Performance

    • Speed up parsing when a lot of different flags are present (100 unique flags)

    [4.0.30] - 2022-12-21

    Fixes

    • (error) Improve error for args_conflicts_with_subcommand

    [4.0.29] - 2022-11-29

    [4.0.28] - 2022-11-29

    Fixes

    • Fix wasm support which was broken in 4.0.27

    [4.0.27] - 2022-11-24

    Features

    • Have Arg::value_parser accept Vec<impl Into<PossibleValue>>
    • Implement Display and FromStr for ColorChoice

    Fixes

    • Remove soundness issue by switching from atty to is-terminal

    [4.0.26] - 2022-11-16

    Fixes

    • (error) Fix typos in ContextKind::as_str

    [4.0.25] - 2022-11-15

    Features

    • (error) Report available subcommands when required subcommand is missing

    [4.0.24] - 2022-11-14

    ... (truncated)

    Commits
    • ec4ccf0 chore: Release
    • 13fdb83 docs: Update changelog
    • b877345 Merge pull request #4573 from epage/conflict
    • 85ecb3e fix(parser): Override required when parent group has conflict
    • d145b8b test(parser): Demonstrate required-overload bug
    • 0eccd55 chore: Release
    • 1e37c25 docs: Update changelog
    • dcd5fec Merge pull request #4572 from epage/group
    • dde22e7 style: Update for latest clippy
    • dd8435d perf(parser): Reduce duplicate lookups
    • Additional commits viewable in compare view

    dependencies 
    opened by dependabot[bot] 0
  • Update arrow2 requirement from 0.14 to 0.15


    Updates the requirements on arrow2 to permit the latest version.

    Release notes

    Sourced from arrow2's releases.

    v0.15.0

    A new release is here, adding a number of new features and improvements to arrow2. Thank you to everyone that contributed to it!

    This release adds support to a new format, the "record" JSON format, contributed by @​AnIrishDuck, a new trait TryExtendFromSelf to efficiently concatenate an array into an existing mutable array, and multiple improvements by @​sundy-li and @​ritchie46 to performance. Finally, we have a new API OffsetsBuffer and Offsets proposed by @​ritchie46 to allow creating variable sized-arrays without having to check for offsets.

    This release also features a number of contributions from first contributors:

    Thank you everyone for the great work this year, and happy festivities everyone!

    Full Changelog

    Breaking changes:

    New features:

    Fixed bugs:

    • Parquet writes all values of sliced arrays? #1323
    • Avro schema: Invalid record names #1269
    • Fixed writing nested/sliced arrays to parquet #1326 (ritchie46)
    • Fixed failing to accept dictionary full of nulls #1312 (ritchie46)
    • Added support for Extension types in ffi #1300 (jondo2010)
    • Fixed error in memory usage of sliced binary/list/utf8arrays #1293 (ritchie46)
    • Fixed descending ordering when specify nulls first #1286 (sandflee)
    • Added avro record names when converting arrow schema to avro #1279 (Samrose-Ahmed)

    Enhancements:

    ... (truncated)

    Changelog

    Sourced from arrow2's changelog.

    v0.15.0 (2022-12-18)

    Full Changelog

    Breaking changes:

    New features:

    Fixed bugs:

    • Parquet writes all values of sliced arrays? #1323
    • Avro schema: Invalid record names #1269
    • Fixed writing nested/sliced arrays to parquet #1326 (ritchie46)
    • Fixed failing to accept dictionary full of nulls #1312 (ritchie46)
    • Added support for Extension types in ffi #1300 (jondo2010)
    • Fixed error in memory usage of sliced binary/list/utf8arrays #1293 (ritchie46)
    • Fixed descending ordering when specify nulls first #1286 (sandflee)
    • Added avro record names when converting arrow schema to avro #1279 (Samrose-Ahmed)

    Enhancements:

    Documentation updates:

    v0.14.2 (2022-10-05)

    Full Changelog

    ... (truncated)

    Commits

    dependencies 
    opened by dependabot[bot] 1
  • Documentation for basic usage of geozero-shp


    Hello, and thank you for this project.

    I am trying to write a simple newbie program that, given an ESRI Shapefile as input, outputs the geometries as GeoJSON.

    I have been trying to use the geozero-shp crate, simply following the short example that can be found in its README, without any luck. It seems the example is incorrect, or it might have been outdated by other changes in its dependencies.

    I wrote this program:

    use geozero::geojson::GeoJsonWriter;
    
    
    fn main() {
        let path = "/home/jose/Downloads/ne_10m_admin_0_sovereignty/ne_10m_admin_0_sovereignty.shp";
        let reader = geozero_shp::Reader::from_path(path).unwrap();
        let mut json: Vec<u8> = Vec::new();
        let data = reader.iter_features(GeoJsonWriter::new(&mut json)).unwrap();
    }
    

    and when I run it, it doesn't seem that GeoJsonWriter is a valid thing to pass to the processor argument:

    jose@uranium ~/C/e/shp_ingestor (master) [101]> cargo run -- /home/jose/Downloads/ne_10m_admin_0_sovereignty/ne_10m_admin_0_sovereignty.shp
       Compiling shp_ingestor v0.1.0 (/home/jose/Code/experimental/shp_ingestor)
    error[E0277]: the trait bound `GeoJsonWriter<'_, Vec<u8>>: geozero::feature_processor::FeatureProcessor` is not satisfied
       --> src/main.rs:8:37
        |
    8   |     let data = reader.iter_features(GeoJsonWriter::new(&mut json)).unwrap();
        |                       ------------- ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ the trait `geozero::feature_processor::FeatureProcessor` is not implemented for `GeoJsonWriter<'_, Vec<u8>>`
        |                       |
        |                       required by a bound introduced by this call
        |
        = help: the following other types implement trait `geozero::feature_processor::FeatureProcessor`:
                  geozero::ProcessorSink
                  geozero::multiplex::Multiplexer<P1, P2>
    note: required by a bound in `Reader::<T>::iter_features`
       --> /home/jose/.cargo/registry/src/github.com-1ecc6299db9ec823/geozero-shp-0.3.1/src/reader.rs:161:29
        |
    161 |     pub fn iter_features<P: FeatureProcessor>(
        |                             ^^^^^^^^^^^^^^^^ required by this bound in `Reader::<T>::iter_features`
    
    error[E0277]: the trait bound `GeoJsonWriter<'_, Vec<u8>>: geozero::feature_processor::FeatureProcessor` is not satisfied
      --> src/main.rs:8:16
       |
    8  |     let data = reader.iter_features(GeoJsonWriter::new(&mut json)).unwrap();
       |                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ the trait `geozero::feature_processor::FeatureProcessor` is not implemented for `GeoJsonWriter<'_, Vec<u8>>`
       |
       = help: the following other types implement trait `geozero::feature_processor::FeatureProcessor`:
                 geozero::ProcessorSink
                 geozero::multiplex::Multiplexer<P1, P2>
    note: required by a bound in `ShapeRecordIterator`
      --> /home/jose/.cargo/registry/src/github.com-1ecc6299db9ec823/geozero-shp-0.3.1/src/reader.rs:39:35
       |
    39 | pub struct ShapeRecordIterator<P: FeatureProcessor, T: Read + Seek> {
       |                                   ^^^^^^^^^^^^^^^^ required by this bound in `ShapeRecordIterator`
    
    For more information about this error, try `rustc --explain E0277`.
    error: could not compile `shp_ingestor` due to 2 previous errors
    

    Is there any documentation or code examples where I can see how this crate is used?

    Thank you.

    opened by jose-lpa 2
  • Multiple problems with Shapefile test data


    Thanks for this project. I'm not a Rust user, but I'm writing a Shapefile parser for Go and am testing my code with your testdata. I think some of your example Shapefiles are incorrect, which may in turn be concealing bugs in your code.

    I'm using GDAL as a reference implementation. GDAL includes an ogrinfo command that can be used to query Shapefiles:

    $ ogrinfo -al geozero-shp/tests/data/point.shp
    INFO: Open of `geozero-shp/tests/data/point.shp'
          using driver `ESRI Shapefile' successful.
    
    Layer name: point
    Geometry: Point
    Feature Count: 1
    Extent: (122.000000, 37.000000) - (122.000000, 37.000000)
    Layer SRS WKT:
    (unknown)
    OGRFeature(point):0
      POINT (122 37)
    

    Note that GDAL requires a .shx index file. For many of your test Shapefiles, these do not exist. I've been using this tip to create a .shx file where they are missing.

    I believe that the following errors in your test Shapefiles are present:

    geozero-shp/tests/data/pointz.shp: bounds in header do not match bounds of data; first record should have record number 1, but has record number 0.

    geozero-shp/tests/data/polygon.shp: the polygon is invalid because its rings are not closed.

    geozero-shp/tests/data/poly.dbf (note that this is in the .dbf file): the dbf file is missing the \x1a terminator.

    opened by twpayne 1
  • WIP: GeoParquet reader


    I started to take a stab at this to resolve https://github.com/georust/geozero/issues/37 but ran into a few issues/questions:

    • I put process_geoarrow_feature_chunk in the geoarrow mod, because the parquet reader's result is automatically geoarrow (of the wkb variant). In theory we could implement GeozeroDatasource for geoarrow directly as well to convert from an arrow table, but that might be less useful.
    • It seemed not possible to write an impl for GeozeroGeometry in addition to GeozeroDatasource, because GeozeroGeometry's process_geom only provides a non-mutable reader, and the parquet FileReader needed a mutable reference. Or maybe I'm missing something there.
    • I'm not sure the best way of doing arrow -> geozero feature objects. Arrow2's "chunk" object is a vec of Array trait objects, so we need to do downcasting to get values out. Is it possible to do that downcasting outside of the loop so that we don't have to downcast on every iteration?
    • I assume geozero has a record-only API? I.e. I can't process all the data column by column, it needs to be row-by-row?
    opened by kylebarron 1
Owner
GeoRust
A collection of geospatial tools and libraries written in Rust

Urbica 921 Jan 7, 2023
TIFF decoding and encoding library in pure Rust

image-tiff TIFF decoding and encoding library in pure Rust Supported Features Baseline spec (other than formats and tags listed below as not supported

image-rs 66 Dec 30, 2022
OpenStreetMap flatdata format and compiler

osmflat Flat OpenStreetMap (OSM) data format providing an efficient random data access through memory mapped files. The data format is described and i

null 31 Dec 7, 2022
A traffic simulation game exploring how small changes to roads affect cyclists, transit users, pedestrians, and drivers.

A/B Street Ever been stuck in traffic on a bus, wondering why is there legal street parking instead of a dedicated bus lane? A/B Street is a game expl

A/B Street 6.8k Jan 4, 2023
Calculates a stars position and velocity in the cartesian coordinate system.

SPV Calculates a stars position and velocity in the cartesian coordinate system. Todo Expand the number of available operation Batch processing by tak

Albin Sjögren 11 Feb 18, 2022
Didactic implementation of the type checker described in "Complete and Easy Bidirectional Typechecking for Higher-Rank Polymorphism" written in OCaml

bidi-higher-rank-poly Didactic implementation of the type checker described in "Complete and Easy Bidirectional Typechecking for Higher-Rank Polymorph

Søren Nørbæk 23 Oct 18, 2022
A single-binary, GPU-accelerated LLM server (HTTP and WebSocket API) written in Rust

Poly Poly is a versatile LLM serving back-end. What it offers: High-performance, efficient and reliable serving of multiple local LLM models Optional

Tommy van der Vorst 13 Nov 5, 2023
A set of tools for generating isochrones and reverse isochrones from geographic coordinates

This library provides a set of tools for generating isochrones and reverse isochrones from geographic coordinates. It leverages OpenStreetMap data to construct road networks and calculate areas accessible within specified time limits.

null 3 Feb 22, 2024
Pure rust library for reading / writing DNG files providing access to the raw data in a zero-copy friendly way.

DNG-rs   A pure rust library for reading / writing DNG files providing access to the raw data in a zero-copy friendly way. Also containing code for re

apertus° - open source cinema 4 Dec 1, 2022
This library provides a data view for reading and writing data in a byte array.

Docs This library provides a data view for reading and writing data in a byte array. This library requires feature(generic_const_exprs) to be enabled.

null 2 Nov 2, 2022
Rust library for concurrent data access, using memory-mapped files, zero-copy deserialization, and wait-free synchronization.

mmap-sync mmap-sync is a Rust crate designed to manage high-performance, concurrent data access between a single writer process and multiple reader pr

Cloudflare 97 Jun 26, 2023