
Importing Data

Importing is how datasets get into GeoLens. The catalog supports four import modes — uploading a file, registering an existing PostGIS table, registering an external service URL, or pulling from a STAC API — and each mode handles its own validation, schema detection, and ingestion automatically. This page walks through each mode, the supported formats, how to create empty layers for collaborative editing, and how to re-import a dataset to update it without breaking its catalog identity.

The Import page is reached from the Import entry in the sidebar. At the top of the page is a four-tab bar:

  • Upload — push a file from your computer.
  • Register — point at an existing PostGIS table on the same database.
  • Service — register a remote vector service URL (WFS, ArcGIS Feature Server, or OGC API Features).
  • STAC — ingest items from a STAC API endpoint (raster collections).

Each tab is a self-contained workflow. The destination — a new dataset in the catalog — is the same regardless of which mode you used to create it; once ingested, the dataset behaves identically.

Upload mode handles a file from your local filesystem. Drag a file (or files, for multi-file upload) onto the drop area, or click to browse.

Supported formats:

  • Shapefile (.zip) — must be zipped (Shapefile is multi-file: .shp, .shx, .dbf, optional .prj).
  • GeoPackage (.gpkg) — vector or raster.
  • GeoJSON (.geojson, .json) — single FeatureCollection per file.
  • CSV with geometry (.csv) — geometry as WKT in a column, or lat/lon columns; auto-detected.
  • GeoTIFF (.tif, .tiff) — raster.
  • File Geodatabase (.gdb.zip) — Esri FGDB; zipped.
  • KML / GML (.kml, .gml) — one layer per file.
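For CSV uploads specifically, it helps to see the two shapes the importer can auto-detect: a WKT geometry column, or a lat/lon column pair. A minimal sketch using the Python standard library; the column names follow the conventions noted in the tips at the end of this page, and the place names are purely illustrative.

```python
import csv
import io

def rows_to_csv(fieldnames, rows):
    """Serialize dict rows to CSV text, header first."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

# Shape 1: geometry as WKT in a recognized column name.
wkt_csv = rows_to_csv(
    ["name", "geometry"],
    [{"name": "Depot A", "geometry": "POINT (13.405 52.52)"}],
)

# Shape 2: separate lat/lon columns.
latlon_csv = rows_to_csv(
    ["name", "lat", "lon"],
    [{"name": "Depot B", "lat": "48.8566", "lon": "2.3522"}],
)

print(wkt_csv)
print(latlon_csv)
```

Either shape uploads as-is; if your source uses different column names, rename them first (see the CSV geometry tip below for the recognized names).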

The maximum upload size is governed by your instance’s UPLOAD_MAX_SIZE_MB setting (default 500 MB). For files larger than the limit, ask your admin to raise the limit, or use Register mode to ingest the data via a database table or service URL instead. See Configuration reference for the admin-side knob.

Register mode tells GeoLens about an existing PostGIS table on the same database that the API connects to. No data copy happens — the catalog records a reference to the table, and queries against the dataset go directly against the underlying table.

Use Register mode when:

  • A nightly ETL pipeline produces a PostGIS table outside of GeoLens, and you want the table catalogued without copying the rows.
  • A team is editing a PostGIS table directly through QGIS or another client, and the table should appear in the catalog without round-trips.
  • The dataset is too large to upload but is already in PostGIS.

The form asks for the schema name and table name; GeoLens validates that the table exists, has a geometry column, and that the API user has SELECT access. Once registered, the dataset behaves like any other catalog dataset — it’s searchable, exportable, mappable. Edits to the underlying table appear in the catalog the next time the dataset is queried.
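If you want to pre-check a table before registering it, the three validations can be approximated in plain SQL. A sketch, not GeoLens's internal code: it assumes a PostGIS-enabled database, naive identifier quoting (plain lowercase names), and a placeholder API role name.

```python
def registration_precheck_sql(schema: str, table: str, api_role: str) -> str:
    """Build a query mirroring the three checks GeoLens runs on Register:
    the table exists, it has a geometry column, and the API role can
    SELECT from it. Run the result with any PostgreSQL client."""
    qualified = f"{schema}.{table}"
    return f"""
SELECT
  to_regclass('{qualified}') IS NOT NULL      AS table_exists,
  EXISTS (SELECT 1 FROM geometry_columns
          WHERE f_table_schema = '{schema}'
            AND f_table_name   = '{table}')   AS has_geometry,
  has_table_privilege('{api_role}', '{qualified}', 'SELECT') AS api_can_select
""".strip()

# 'geolens_api' is a placeholder -- substitute your API's database user.
print(registration_precheck_sql("etl", "parcels_nightly", "geolens_api"))
```

If any of the three columns comes back false, the Register form will reject the table for the corresponding reason.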

Service mode registers an external service URL. GeoLens probes the URL to detect the service type automatically — the supported types are:

  • WFS — OGC Web Feature Service (typically URLs containing /wfs).
  • OGC API Features — newer OGC API standard (URLs containing /collections or matching the OGC API Common landing page shape).
  • ArcGIS Feature Server — Esri Feature Server REST endpoint (URLs containing /FeatureServer or /MapServer).

After auto-detection, the form lists the service’s available layers; pick one to register as a GeoLens dataset. The catalog stores the service URL plus the layer ID; queries against the dataset proxy through to the service at request time.

For services that need authentication, supply credentials in the Auth section of the form. WFS supports HTTP Basic auth; ArcGIS Feature Server supports tokens. The credentials are stored encrypted on the instance and are only used by the API when proxying requests to the service.

A few notes:

  • The service URL should be the service root, not a feature-list URL. GeoLens probes for the capabilities document.
  • For OGC API Features specifically, the root is the URL ending in /api/ (or wherever the service exposes its landing document).
  • ArcGIS Feature Servers can be either ArcGIS Online or ArcGIS Enterprise — both work with the same auto-detection.
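The URL hints above can be approximated as a first-pass classifier. A rough sketch only: GeoLens's real detection also probes the endpoint (capabilities or landing document), so treat this as an illustration of the heuristics, not the actual logic.

```python
def guess_service_type(url: str) -> str:
    """Classify a service URL by shape alone, mirroring the hints above."""
    u = url.lower()
    if "/featureserver" in u or "/mapserver" in u:
        return "arcgis"
    if "/wfs" in u:
        return "wfs"
    if "/collections" in u or u.rstrip("/").endswith("/api"):
        return "ogc-api-features"
    return "unknown"

print(guess_service_type(
    "https://gis.example.com/rest/services/Roads/FeatureServer"))
# -> arcgis
```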

STAC mode ingests STAC items from a STAC API endpoint. Use it to pull raster collections (imagery, elevation, classification grids) from external STAC catalogs into GeoLens.

The form asks for:

  • STAC root URL — the catalog URL (e.g., https://example.com/stac/).
  • Collection ID — the STAC collection to ingest from. After you enter the root URL, GeoLens lists available collections.
  • Optional filters — bbox, datetime range, item count limit. Useful to scope the ingest to just the items you care about, rather than pulling the entire collection.

GeoLens registers each matching STAC item as a raster dataset in the catalog; the items become visible in search and addable to maps. The asset URLs are stored as references — no asset data is copied unless you also export the dataset.
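The optional filters map onto a standard STAC API item search. A sketch of the request body that corresponds to the form fields; the field names (collections, bbox, datetime, limit) come from the STAC API specification, and the collection ID here is illustrative — whether GeoLens sends exactly this payload is an assumption.

```python
def stac_search_body(collection_id, bbox=None, datetime_range=None, limit=None):
    """Assemble a POST body for a STAC API /search request, mirroring
    the optional filters on the STAC tab (bbox, datetime, item limit)."""
    body = {"collections": [collection_id]}
    if bbox is not None:
        body["bbox"] = list(bbox)            # [west, south, east, north]
    if datetime_range is not None:
        body["datetime"] = datetime_range    # e.g. "2023-06-01/2023-06-30"
    if limit is not None:
        body["limit"] = limit
    return body

print(stac_search_body("sentinel-2-l2a",
                       bbox=(5.9, 45.8, 10.5, 47.8),
                       datetime_range="2023-06-01/2023-06-30",
                       limit=100))
```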

For collaborative data entry — where a team will edit features over time through a GIS client — you can create an empty layer instead of uploading a file. The Upload tab has a Create empty layer option that prompts for a schema (column names + types + CRS) and creates an empty PostGIS table backed by a new catalog entry.

Use empty layers when:

  • A team is digitizing features over time (e.g., a survey crew adding points as they’re collected).
  • The dataset’s schema is known but the data hasn’t been collected yet.
  • A workflow uses GeoLens as the database of record but creates rows programmatically through the API or directly in PostGIS.

Once the empty layer exists, it’s the same as any other dataset — except the data tab shows zero rows until edits arrive.
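What the schema prompt captures can be pictured as a small definition like the following. The field names here are hypothetical — they illustrate the shape of the information (column names, types, CRS), not GeoLens's actual form or API fields.

```python
# Hypothetical empty-layer definition -- real form/API field names may
# differ; this only illustrates what the schema prompt captures.
empty_layer = {
    "title": "Survey points 2024",
    "geometry_type": "Point",
    "crs": "EPSG:4326",
    "columns": [
        {"name": "surveyor",  "type": "text"},
        {"name": "collected", "type": "timestamp"},
        {"name": "depth_m",   "type": "double"},
    ],
}

# Sanity checks a client might run before submitting the definition:
assert empty_layer["crs"].startswith("EPSG:")
assert all({"name", "type"} <= set(col) for col in empty_layer["columns"])
```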

Re-import is how you replace a dataset’s data without breaking its catalog identity. From the existing dataset’s detail page, click Re-import in the action menu; the Import page opens with the existing dataset selected as the destination.

What re-import preserves:

  • Dataset ID and URL — bookmarks and shared links keep working.
  • Metadata — title, description, tags, custom fields are unchanged.
  • Permissions — visibility and access list are preserved.
  • Comments and audit trail — the dataset’s history continues, with the re-import logged as a new entry on the sources tab.

What re-import replaces:

  • Feature rows — the entire dataset is replaced with the new file’s contents.
  • Schema — if the new file has different columns, the schema updates; this can break downstream maps that reference removed columns, so re-importing with a changed schema warrants a heads-up to consumers.
  • Spatial extent — recomputed from the new feature geometries.

The same re-import action works across modes: a dataset originally created from a file upload can be re-imported from a service URL, or vice versa. The dataset’s identity is independent of the import method.

A common question: when should a dataset be registered (service or PostGIS) vs. uploaded?

  • Upload — one-off datasets; data won’t change; small to medium size; you want GeoLens to fully own the storage.
  • Register PostGIS — data is already in PostGIS; updates happen outside GeoLens; large size; you want a single source of truth.
  • Register service — data lives in an external system you don’t control; you want GeoLens to mirror the catalog without copying data; auth is OK.
  • STAC ingest — raster collections from a STAC catalog; you want pointers, not copies.
  • Create empty — forward-looking: a team will populate the layer over time through GeoLens or external clients.

All four import modes require the editor role. The role is checked both client-side (the Import entry hides for viewers) and server-side (the API rejects import requests from non-editors). For a refresher on GeoLens’s role model, see User management & RBAC.

The owner of a newly imported dataset is the user who imported it, regardless of mode. Visibility defaults to whatever the instance’s default visibility setting is (commonly private). Make a dataset public or restricted from the access tab on the dataset detail page after import.

The same import flows work over the API for headless automation. Use the same dataset endpoints and pass file content as a multipart body, or register a service URL via JSON. For authentication, you’ll need an API key or a JWT — see API Authentication for the auth options.
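As a sketch of a headless file upload using only the Python standard library — the endpoint path and auth header value are assumptions, so check your instance's API reference and the API Authentication page before adapting this:

```python
import io
import urllib.request
import uuid

def build_multipart(filename: str, payload: bytes, field: str = "file"):
    """Assemble a multipart/form-data body by hand (stdlib only)."""
    boundary = uuid.uuid4().hex
    body = io.BytesIO()
    body.write(f"--{boundary}\r\n".encode())
    body.write(
        (f'Content-Disposition: form-data; name="{field}"; '
         f'filename="{filename}"\r\n'
         "Content-Type: application/octet-stream\r\n\r\n").encode()
    )
    body.write(payload)
    body.write(f"\r\n--{boundary}--\r\n".encode())
    return body.getvalue(), f"multipart/form-data; boundary={boundary}"

data, content_type = build_multipart("parcels.gpkg", b"<file bytes here>")

# Endpoint path and auth header are assumptions -- consult your
# instance's API reference for the real values.
req = urllib.request.Request(
    "https://geolens.example.com/api/datasets",   # hypothetical endpoint
    data=data,
    headers={"Content-Type": content_type,
             "Authorization": "Bearer <API key or JWT>"},
    method="POST",
)
# urllib.request.urlopen(req)  # uncomment to actually send the request
```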

A few format-specific gotchas:

  • Shapefile encoding. Shapefiles use a .cpg file (or the system default) for character encoding. If your .dbf has non-ASCII text and no .cpg, GeoLens defaults to UTF-8; uploads showing mojibake usually mean the source was Latin-1 or Windows-1252. Convert the encoding before zipping, or include an explicit .cpg file.
  • CSV geometry. GeoLens auto-detects WKT in a column named geometry, geom, wkt, or the_geom, and auto-detects lat/lon pairs. If your column names differ, rename them before upload, or use Register mode after a one-time SQL transform in PostGIS.
  • CRS declaration. Always include a .prj file in zipped Shapefiles and a CRS declaration in GeoJSON files when possible. GeoLens defaults to EPSG:4326 if no CRS is declared, which is wrong for projected data and breaks downstream styling.
  • Re-import schema changes. A re-import that adds new columns is non-destructive; old columns are removed only if they’re missing from the new file. To keep an old column that the new data no longer has, include that column in the file (even if empty) before re-importing.