<p align="center">
  <img height="100" src="https://raw.githubusercontent.com/pelias/design/master/logo/pelias_github/Github_markdown_hero.png">
</p>
<h3 align="center">A modular, open-source search engine for our world.</h3>
<p align="center">Pelias is a geocoder powered completely by open data, available freely to everyone.</p>
<p align="center">
  <a href="https://en.wikipedia.org/wiki/MIT_License"><img src="https://img.shields.io/github/license/pelias/api?style=flat&color=orange" /></a>
  <a href="https://hub.docker.com/u/pelias"><img src="https://img.shields.io/docker/pulls/pelias/api?style=flat&color=informational" /></a>
  <a href="https://gitter.im/pelias/pelias"><img src="https://img.shields.io/gitter/room/pelias/pelias?style=flat&color=yellow" /></a>
</p>
<p align="center">
  <a href="https://github.com/pelias/docker">Local Installation</a> ·
  <a href="https://geocode.earth">Cloud Webservice</a> ·
  <a href="https://github.com/pelias/documentation">Documentation</a> ·
  <a href="https://gitter.im/pelias/pelias">Community Chat</a>
</p>

<details open>
<summary>What is Pelias?</summary>
<br />
Pelias is a search engine for places worldwide, powered by open data. It turns addresses and place names into geographic coordinates, and turns geographic coordinates into places and addresses. With Pelias, you’re able to turn your users’ place searches into actionable geodata and transform your geodata into real places.
<br /><br />
We think open data, open source, and open strategy win over proprietary solutions at any part of the stack and we want to ensure the services we offer are in line with that vision. We believe that an open geocoder improves over the long-term only if the community can incorporate truly representative local knowledge.
</details>

Pelias Polyline Importer
The polyline importer facilitates importing road network data into Pelias from a list of polyline-encoded line strings.
Prerequisites
Node.js is required. See Pelias software requirements for supported versions.
Clone and Install dependencies
Since this module is just one part of the Pelias geocoder, we recommend starting with our Dockerfiles for a quick setup, or with our full installation docs, before working with this module directly.
$ git clone https://github.com/pelias/polylines.git && cd polylines
$ npm install
Download data
Pre-processed planet-wide road network files are available to download from Geocode Earth.
Note: the file extensions `.0sv` and `.polylines` are used interchangeably; they both refer to the same file format. However, some code looks specifically for the `.0sv` extension, so that extension is preferable.
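If your download uses the `.polylines` extension, you can simply rename it, for example (assuming a file named `road_network.polylines`):
$ mv road_network.polylines road_network.0sv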
For more information on how the extract was generated, see the wiki article: Generating polylines from Valhalla.
We also have some smaller extracts for testing purposes; a small number were manually cut from PBF extracts for the geographies of our major contributors. See the 'Generating a custom polylines extract from a PBF extract' section below for more info on how you can generate your own extracts.
Note: these extracts were generated using a different method from the planet cut above.
- Berlin (1.9MB, 49k roads)
- New York (4.2MB, 102k roads)
- Finland (7.7MB, 100k roads)
- Sweden (5.9MB, 126k roads)
- London (5.6MB, 166k roads)
- Paris (2.9MB, 81k roads)
- San Francisco (1.3MB, 27k roads)
- New Zealand (3.1MB, 52k roads)
- Chicago (3.5MB, 88k roads)
- Singapore (0.6MB, 16k roads)
Once you have downloaded and extracted the data, you will need to follow the Configuration steps below to tell Pelias where it can be found.
If you would like to use a different source of polyline data, you might need to tweak the defaults in `./stream/pipeline.js`; open an issue if you get stuck.
Generating your own data
You can generate a polylines file from your own data; the data MUST be encoded in the following format:
- each row of the file represents one document; rows are terminated with a newline (`\n`) character.
- rows contain multiple columns; columns are delimited with a null byte (`\0`) character.
The geometry is encoded using the Google polyline algorithm at a precision of `6`.
NOTE: many libraries default the precision to `5`; this will cause errors, so be sure to select the correct polyline precision.
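To illustrate the pitfall, here is a minimal sketch assuming the `@mapbox/polyline` npm package (not part of this repo) and a made-up coordinate:

```javascript
// Demonstrates why the polyline precision setting matters.
// Assumes: npm install @mapbox/polyline
const polyline = require('@mapbox/polyline');

// one [lat, lng] pair encoded at precision 6 (the format this importer expects)
const encoded = polyline.encode([[41.886128, -87.627778]], 6);

console.log(polyline.decode(encoded, 6)); // correct: ~[ [ 41.886128, -87.627778 ] ]
console.log(polyline.decode(encoded, 5)); // wrong precision: coordinates come back 10x too large
```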
There is a script included in this repo which is capable of re-encoding files generated with precision `5` to precision `6`; you can find it in `bin/reencode.js`.
Each row begins with the encoded polyline, followed by a null byte (`\0`), then one or more names (delimited with null bytes), and is finally terminated with a newline (`\n`).
Example:
{polyline6}\0{name}\0{name}\n
oozdnAwvbsBoA?g@{@SoAf@{@nAg@Plaça de la Creu
00000000: 6f6f 7a64 6e41 7776 6273 426f 413f 6740 oozdnAwvbsBoA?g@
00000010: 7b40 536f 4166 407b 406e 4167 4000 506c {@SoAf@{@nAg@.Pl
00000020: 61c3 a761 2064 6520 6c61 2043 7265 750a a..a de la Creu.
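As a rough sketch, a row in this format could be produced as follows, assuming the `@mapbox/polyline` npm package; the coordinates are made up and the output filename is only an example:

```javascript
// Minimal sketch: append one row of the polylines format described above.
// Assumes: npm install @mapbox/polyline
const fs = require('fs');
const polyline = require('@mapbox/polyline');

const coordinates = [            // [lat, lng] pairs along the street (made up)
  [41.391699, 2.154007],
  [41.391745, 2.154057],
  [41.391812, 2.154112]
];
const names = ['Plaça de la Creu'];

// encode the geometry at precision 6, join the polyline and names with null bytes,
// and terminate the row with a newline
const row = [polyline.encode(coordinates, 6), ...names].join('\0') + '\n';
fs.appendFileSync('road_network.0sv', row);
```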
Configuration
In order to tell the importer the location of your downloads and environmental settings, you will first need to create a `~/pelias.json` file.
See the config documentation for details on the structure of this file. Your relevant config info for the polyline module might look something like this:
Note: the importer currently only supports a single entry in the `files` array. Also, the config file only accepts `"polyline"` (without the "s").
"imports": {
"polyline": {
"datapath": "/data",
"files": [ "road_network.0sv" ]
}
}
Administrative Hierarchy Lookup
Polyline data doesn't have a full administrative hierarchy (i.e. country, state, county, etc. names), but it can be calculated using data from Who's on First. See the readme for pelias/wof-admin-lookup for more information. By default, adminLookup is enabled. To disable it, set `imports.adminLookup.enabled` to `false` in Pelias config.
Note: Admin lookup requires loading around 5GB of data into memory.
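For example, to disable admin lookup, the relevant portion of your `~/pelias.json` would contain:
"imports": {
  "adminLookup": {
    "enabled": false
  }
}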
Running an import
This will start the import process; it will take around 30 seconds to prime its in-memory data, and then you should see regular debugging output in the terminal.
$ PELIAS_CONFIG=<path_to_config_json> npm start
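For example, assuming your config lives at `~/pelias.json` as described in the Configuration section above:
$ PELIAS_CONFIG=~/pelias.json npm start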
CLI tool
You can use the CLI tool to run imports and for debugging purposes:
Note: by default the CLI tool will read from `stdin` and write to `stdout`.
$ node ./bin/cli.js --help
Usage: cli.js [options]
Options:
--file read from file instead of stdin
--config read filename from pelias config (overrides --file)
--pretty indent output (stdout only)
--db save to elasticsearch instead of printing to stdout
Examples
Run a 'dry-run' of the import process:
node ./bin/cli.js --config --pretty
Import a specific file to elasticsearch:
node ./bin/cli.js --file=/tmp/myfile.polylines --db
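Pipe a polylines file from `stdin` and pretty-print the output (the file path is just a placeholder):
cat /tmp/myfile.polylines | node ./bin/cli.js --pretty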
Generating a custom polylines extract from a PBF extract
You can generate a custom polylines extract using the `pbf` OSM tool from github.com/missinglink/pbf.
Note: Go 1.9+ is required; please ensure it is correctly installed before continuing.
$ go version
go version go1.10 linux/amd64
$ go install github.com/missinglink/pbf@latest
$ pbf --help
$ wget https://s3.amazonaws.com/metro-extracts.nextzen.org/chicago_illinois.osm.pbf
$ pbf streets chicago_illinois.osm.pbf | head
yop}nApvc_gDqAywAEast Altgeld Avenue
wto}nAfpl_gDqFuzEEast Altgeld Avenue
_mr}nAbvb~fDkFmQ}BkQ}AkdBEast Altgeld Avenue
mwp}nAt{q~fDKkW]yFi@mDwBcIeS_f@qE}LmDyMwAaH_AoE}C{Uy@mMYaHyCwlDSsFi@iFwEaUEast Altgeld Avenue
smvaoAzxdmfDwbAtyB}cAvzBscAzxBmcAhyB{j@pnAmE|HNorth Navarre Avenue
wq~hoAvt~dgD?hGJfD\fCl@zBr@dBrBbDv@jBx@hCpBtBBbCDnB@`@gBpC]jBEzCF~BL~AdBxH?VKpSHidden Lakes Boulevard
o~cooAb~}xfDn_@i^Taggert Court
{garnA|_~zfDo@qdA}@kEWater Tower Lane
ka|}nAh`jlgDdD{C~MmNrByHTall Grass Court
onvdnAbntyfDpMvOhHtCTall Grass Court
$ pbf streets chicago_illinois.osm.pbf > chicago_illinois.polylines
Issues
If you have any issues getting set up or the documentation is missing something, please open an issue here: https://github.com/pelias/polylines/issues
Contributing
Please fork and pull request against upstream master on a feature branch.
Pretty please; provide unit tests and script fixtures in the `test` directory.
Code Linting
A `.jshintrc` file is provided which contains a linting config; usually your text editor will understand this config and give you inline hints on code style and readability.
These settings are strictly enforced when you do a `git commit`. You can execute `git commit` at any time to run the linter against your code.
Running Unit Tests
$ npm test
Continuous Integration
Travis tests every change against our supported Node.js versions.