Want the raw data? Here it is using #QLever and #OverpassUltra, exportable as #GeoJSON and in the #PublicDomain:
2/3 I was looking for something usable for my QGis map. In January there were no #GeoJSON files yet for the #Denkmalliste (monument list) of #SchleswigHolstein, and the "official" ones from February were not yet in sight. So I pieced the geo files together myself from the JSON files from the portal plus a GML file; @MisterOpenData had sketched that route in his blog. With GDAL and NPM, KML files for my OsmAnd were no problem either.
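A rough sketch of that GDAL-based conversion step in Python, with placeholder file names (the post's actual inputs and the merging of the portal JSON attributes are not shown):

```python
# Placeholder file names; geopandas uses GDAL/OGR drivers under the hood,
# so it can read GML and write GeoJSON directly.
import geopandas as gpd

monuments = gpd.read_file("denkmalliste.gml")
monuments.to_file("denkmalliste.geojson", driver="GeoJSON")  # for QGis
# The same call with driver="KML" can produce an OsmAnd-ready file,
# provided the local GDAL build includes the KML driver.
```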
Decided that I needed a self-hosted alternative to Google Timeline, and now I have #Dawarich.
Home Assistant is sending my location updates there, so I don't even need another client. But what about previous location history?
In a fit of #degoogle, I deleted all my location history from Google, and now my only source of it is my photo library, so I started trying to generate my location history from photos in a format supported by Dawarich.
First I tried photos2geojson: https://github.com/Visgean/photos2geojson. It did a great job extracting #exif data and generating #geojson for me, but Dawarich failed to import it.
Then I went with #exiftool and a guide from the Dawarich docs: https://dawarich.app/docs/tutorials/import-existing-data#importing-gps-coordinates-from-photos
Everything worked fine for the photos taken by my #pixel8pro but failed when it came to older photos taken with a #Samsung phone.
So now I have #Immich indexing my photo library on #SynologyNAS just to import location history to Dawarich =)
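For reference, a minimal sketch of the exiftool extraction half of that workflow. The folder name, output file, and GeoJSON shape are assumptions for illustration, not the exact commands from the Dawarich guide, and Dawarich's importer has its own expectations about the input format.

```python
# Dump GPS tags from a photo folder with exiftool and build a GeoJSON
# FeatureCollection; "Photos/" and "photo-locations.geojson" are placeholders.
import json
import subprocess

# -n gives decimal degrees, -json gives machine-readable output, -r recurses.
raw = subprocess.run(
    ["exiftool", "-n", "-json", "-r",
     "-GPSLatitude", "-GPSLongitude", "-DateTimeOriginal", "Photos/"],
    capture_output=True, text=True, check=True,
).stdout

features = []
for item in json.loads(raw):
    lat, lon = item.get("GPSLatitude"), item.get("GPSLongitude")
    if lat is None or lon is None:
        continue  # older phones may not have written GPS tags at all
    features.append({
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [lon, lat]},
        "properties": {"timestamp": item.get("DateTimeOriginal"),
                       "source": item.get("SourceFile")},
    })

with open("photo-locations.geojson", "w") as f:
    json.dump({"type": "FeatureCollection", "features": features}, f)
```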
@RufusJCooter @santisbon @natureworks @ShmosKnows - some thoughts: first, a geographic targeting component within the #ActivityPub message is critical. Perhaps a #GeoJSON point or polygon.
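To make the idea concrete, a purely hypothetical payload sketch: "geoTarget" is not an existing ActivityPub extension, just a name used here for illustration, and real targeting would need a properly specified vocabulary.

```python
# Hypothetical: an ActivityPub-style note carrying a GeoJSON polygon that
# describes the affected area.
import json

alert = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Note",
    "content": "Boil-water advisory for the downtown district.",
    "geoTarget": {                      # assumed extension property
        "type": "Polygon",              # GeoJSON polygon over the area
        "coordinates": [[[-122.42, 37.77], [-122.40, 37.77],
                         [-122.40, 37.79], [-122.42, 37.79],
                         [-122.42, 37.77]]],
    },
}
print(json.dumps(alert, indent=2))
```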
On moderation, purpose-built NotifyPub-type server software would allow non-profits/gov't agencies to choose which replies (if any) they want re-broadcast to followers.
Specifically, the agencies could then have a purpose-built UI for responding to DMs and replies to messages (thinking something like ZenDesk?), and then decide which of those threads they would want to send back out.
And...for #FOIA purposes, an easy way to do data dumps could be established.
I'm just thinking out loud on this...I don't think this is a trivial project, but with a narrow scope, it probably isn't a mega-project either.
Did you know that we make the Agroecology Map data available in CSV, GeoJSON, and JSON formats so that you can use it in your analyses? You can export all of the data and use it under the Creative Commons Attribution-ShareAlike 4.0 (CC BY-SA 4.0) license.
[GeoJSON] https://agroecologymap.org/en/locations.geojson
[JSON] https://agroecologymap.org/en/locations.json
[CSV] https://agroecologymap.org/en/locations.csv
See, for example, how easy it is to use Agroecology Map data in the QGIS tool.
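Outside of QGIS, the published GeoJSON endpoint can be pulled straight into Python as well; a minimal sketch, assuming geopandas with its GDAL-backed reader is installed and the endpoint serves a standard FeatureCollection:

```python
import geopandas as gpd

# Load the published locations directly from the URL above.
locations = gpd.read_file("https://agroecologymap.org/en/locations.geojson")
print(len(locations), "locations")
print(locations.head())
```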
It is worth noting the difference in size between GeoParquet files and other geospatial formats, #GeoJSON in particular!
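A quick way to see that gap for a given dataset is to convert and compare on disk; a rough sketch, assuming a local GeoJSON file (the name is a placeholder) and geopandas with pyarrow installed:

```python
import os
import geopandas as gpd

gdf = gpd.read_file("data.geojson")
gdf.to_parquet("data.parquet")          # writes GeoParquet (needs pyarrow)

for path in ("data.geojson", "data.parquet"):
    print(path, round(os.path.getsize(path) / 1e6, 1), "MB")
```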
Hmmm. Converting my #ActivityPub posts to #GeoJSON was pretty straightforward.
Here's what a map of my recent check-in activity looks like:
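One way such a conversion might look, assuming the exported posts carry an ActivityStreams location Place with latitude/longitude; the author's actual export format and fields may differ.

```python
# Turn check-in posts from a hypothetical ActivityPub export into GeoJSON.
import json

with open("outbox.json") as f:
    items = json.load(f)["orderedItems"]

features = []
for item in items:
    obj = item.get("object", {})
    place = obj.get("location") or {}
    if "latitude" not in place or "longitude" not in place:
        continue  # skip posts without a check-in location
    features.append({
        "type": "Feature",
        "geometry": {"type": "Point",
                     "coordinates": [place["longitude"], place["latitude"]]},
        "properties": {"name": place.get("name"),
                       "published": obj.get("published")},
    })

with open("checkins.geojson", "w") as f:
    json.dump({"type": "FeatureCollection", "features": features}, f)
```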
Happy PostGIS Day! My #Day16 of the #30DayMapChallenge is on Oceania. It was my first time using the #USGS Earthquake data, which is an easy GeoJSON download from the search page https://earthquake.usgs.gov/earthquakes/search/.
The map shows earthquake activity over the past 30 days in Oceania, including Hawaii.
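For anyone repeating the exercise, a small sketch of loading such an export: the file name is a placeholder for whatever the search page download was saved as, the USGS GeoJSON exposes magnitude in a "mag" property, and plotting assumes matplotlib is available.

```python
import geopandas as gpd

quakes = gpd.read_file("oceania_quakes.geojson")
big = quakes[quakes["mag"] >= 4.5]       # keep the stronger events
big.plot(markersize=big["mag"] * 3)      # quick look before styling a real map
```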
I'm working on a #maplibre page displaying a reasonably large number of points (~40K). So far the data has been stored in a #GeoJSON file and, as expected, it takes a few seconds to load.
Is #GeoParquet already a possible option in this case? It would be a good opportunity to try it out.
I have a ~8 GiB GeoJSON file. What would you use to compress it? gzip is standard and fast, but not as space-saving. Ideally something installable via apt on Debian.
(and no cheeky answers like converting it out of geojson! I wanna know what The Cool Kids are using as compression tools nowadays)
#gischat #geojson #unix
looking to #programmatically #transform data from a proprietary #XML format into an open #JSON format, #geojson to be specific.
any suggestions on good ways to proceed using #python?
Emphasis should be placed on the *transform* step (e.g., I can't simply convert the XML to JSON, I need to modify fields/values).
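One generic shape for that kind of pipeline, since the proprietary schema isn't known here: parse with the standard library, rename and normalise fields in the middle, and emit a GeoJSON FeatureCollection. Element, attribute, and field names below are placeholders.

```python
import json
import xml.etree.ElementTree as ET

features = []
for rec in ET.parse("source.xml").getroot().iter("record"):   # placeholder tag
    lat = float(rec.findtext("lat"))
    lon = float(rec.findtext("lon"))
    features.append({
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [lon, lat]},
        "properties": {
            # the *transform* step: rename fields and normalise values
            "name": (rec.findtext("title") or "").strip(),
            "category": rec.get("type", "unknown").lower(),
        },
    })

with open("output.geojson", "w") as f:
    json.dump({"type": "FeatureCollection", "features": features}, f)
```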
I'm trying to hunt down providers of #openaccess #GeoJSON format #archaeology or #culturalheritage data.
Any region of the world or topical area would be fine.
I've been working on a CLI for transforming streams of GeoJSON features and it's ready for use. It's based on Shapely and complements Planet's new CLI and jq. https://fio-planet.readthedocs.io/en/stable/ #gis #geojson
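Not fio-planet's own command syntax (its docs cover that), but the underlying idea can be sketched with Shapely alone: read newline-delimited GeoJSON features from stdin, transform each geometry, and write features back out.

```python
import json
import sys

from shapely.geometry import mapping, shape

for line in sys.stdin:
    feature = json.loads(line)
    geom = shape(feature["geometry"])
    feature["geometry"] = mapping(geom.centroid)  # example transform
    print(json.dumps(feature))
```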
Here are the lighthouses of Europe.
The map is even better than it might seem at first glance: the colors are the real colors, the patterns are the real patterns, and the size of each dot corresponds to the distance at which its light is visible.
Made with @openstreetmap, Leaflet, and the Overpass API.
https://geodienst.github.io/lighthousemap/
From the Geodienst researchers of Groningen University (NL), via Ethan Mollick
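For anyone who wants to poke at similar data, here is a guess at the kind of Overpass query involved, not the Geodienst project's actual one; it relies on the common OSM tagging of man_made=lighthouse with light properties under the seamark:light:* keys.

```python
import requests

query = """
[out:json][timeout:60];
node["man_made"="lighthouse"](35,-11,72,32);   // rough bounding box for Europe
out body;
"""
resp = requests.post("https://overpass-api.de/api/interpreter",
                     data={"data": query})
for node in resp.json()["elements"][:5]:
    tags = node.get("tags", {})
    print(tags.get("name"), tags.get("seamark:light:colour"),
          tags.get("seamark:light:range"))
```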