Notice

To better serve Wise business and customer needs, the PipelineWise codebase needs to shrink. We have made the difficult decision that, going forward, many components of PipelineWise will be removed or incorporated into the main repo. The last version before this decision is v0.64.1.

We thank everyone in the open-source community who, over the past 6 years, has helped make PipelineWise a robust product for heterogeneous replication of many, many terabytes daily.

PipelineWise


PipelineWise is a Data Pipeline Framework using the Singer.io specification to ingest and replicate data from various sources to various destinations. Documentation is available at https://transferwise.github.io/pipelinewise/


Table of Contents

* Official docker images
* Connectors
* Running from docker
* Building from source
* Running tests
* Developing with Docker
* Contribution
* Links
* License

Official docker images

PipelineWise images are published to Docker Hub.

Pull an image with:

```sh
docker pull transferwiseworkspace/pipelinewise:{tag}
```

Connectors

A tap extracts data from a source and writes it to a standard stream in a JSON-based format; a target consumes data from taps and does something with it, such as loading it into a file, an API or a database.
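In other words, taps and targets are just processes connected by a stream of JSON messages (SCHEMA, RECORD and STATE). A minimal conceptual sketch of that model, assuming the connectors are already installed and using placeholder config file names (PipelineWise normally wires this pipe together for you):

```sh
# A Singer tap writes JSON messages (SCHEMA, RECORD, STATE) to stdout;
# a Singer target reads them from stdin. The config/catalog file names
# below are placeholders, not files shipped with PipelineWise.
tap-mysql --config tap_config.json --catalog catalog.json \
  | target-snowflake --config target_config.json
```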

| Type | Name | Extra | Description |
|------|------|-------|-------------|
| Tap | Postgres | | Extracts data from PostgreSQL databases. Supports Log-Based, Key-Based Incremental and Full Table replication |
| Tap | MySQL | | Extracts data from MySQL databases. Supports Log-Based, Key-Based Incremental and Full Table replication |
| Tap | Kafka | | Extracts data from Kafka topics |
| Tap | S3 CSV | | Extracts data from S3 CSV files (currently a fork of tap-s3-csv because we wanted to use our own auth method) |
| Tap | Zendesk | | Extracts data from Zendesk using OAuth and Key-Based incremental replication |
| Tap | Snowflake | | Extracts data from Snowflake databases. Supports Key-Based Incremental and Full Table replication |
| Tap | Salesforce | | Extracts data from Salesforce using the BULK and REST extraction APIs with Key-Based incremental replication |
| Tap | Jira | | Extracts data from Atlassian Jira using Basic auth or OAuth credentials |
| Tap | MongoDB | | Extracts data from MongoDB databases. Supports Log-Based and Full Table replication |
| Tap | Google Analytics | Extra | Extracts data from Google Analytics |
| Tap | Oracle | Extra | Extracts data from Oracle databases. Supports Log-Based, Key-Based Incremental and Full Table replication |
| Tap | Zuora | Extra | Extracts data from Zuora using the AQuA and REST extraction APIs with Key-Based incremental replication |
| Tap | GitHub | | Extracts data from the GitHub API using a Personal Access Token and Key-Based incremental replication |
| Tap | Shopify | Extra | Extracts data from the Shopify API using a Private App API Password and date-based incremental replication |
| Tap | Slack | | Extracts data from the Slack API using a Bot User Token and Key-Based incremental replication |
| Tap | Mixpanel | | Extracts data from the Mixpanel API |
| Tap | Twilio | | Extracts data from the Twilio API using OAuth and Key-Based incremental replication |
| Target | Postgres | | Loads data from any tap into a PostgreSQL database |
| Target | Redshift | | Loads data from any tap into an Amazon Redshift Data Warehouse |
| Target | Snowflake | | Loads data from any tap into a Snowflake Data Warehouse |
| Target | S3 CSV | | Uploads data from any tap to S3 in CSV format |
| Transform | Field | | Transforms fields from any tap and sends the results to any target. Recommended for data masking/obfuscation |

Note: Extra connectors are experimental connectors written by community contributors. These connectors are not maintained regularly and are not installed by default. To install the extra packages, use the --connectors=all option when installing PipelineWise, or install them one by one as shown below.
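For example, with the Makefile-based installation described under "Building from source" below, a single extra connector (tap-oracle here, chosen purely for illustration) can be installed on its own:

```sh
# Install just one extra connector into its own virtual environment;
# tap-oracle is an example, any connector name from the table above works
make connectors -e pw_connector=tap-oracle
```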

Running from docker

If you have Docker installed, using the Docker image is the recommended and easiest way to start using PipelineWise.

Use official image

PipelineWise images are built on each release and are available on Docker Hub:

```sh
$ docker pull transferwiseworkspace/pipelinewise
```

Build your own docker image

  1. Build an executable docker image that has every required dependency and is isolated from your host system.

By default, the image builds with all connectors. To keep the image size small, we strongly recommend limiting it to just the connectors you need by supplying the --build-arg option:

```sh
$ docker build --build-arg connectors=tap-mysql,target-snowflake -t pipelinewise:latest .
```

2. Once the image is ready, create an alias to the docker wrapper script:

```sh
$ alias pipelinewise="$(pwd)/bin/pipelinewise-docker"
```

3. Check if the installation was successful by running the pipelinewise status command:

```sh
$ pipelinewise status

Tap ID    Tap Type      Target ID     Target Type      Enabled    Status    Last Sync    Last Sync Result
--------  ------------  ------------  ---------------  ---------  --------  -----------  ------------------
0 pipeline(s)
```

You can run any pipelinewise command at this point. Tutorials on creating and running pipelines are available at creating pipelines.
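For example, once a project directory of YAML configs exists (the directory name and the tap/target IDs below are placeholders), a typical session looks roughly like this:

```sh
# Import (or re-import) the pipelines defined in a project directory of YAML files
pipelinewise import --dir ./my_pipelinewise_project

# Run one pipeline: extract from the tap and load the data into its target
pipelinewise run_tap --tap my_mysql --target my_snowflake
```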


Building from source

  1. Make sure that all dependencies are installed on your system:

    • Python 3.x
    • python3-dev
    • python3-venv
    • mongo-tools
    • mbuffer
  2. Run the Makefile that installs the PipelineWise CLI and all supported singer connectors into separate virtual environments:

    $ make pipelinewise all_connectors
    

    Press Y to accept the license agreements of the required Singer components. To automate the installation and accept every license agreement, run:

    $ make pipelinewise all_connectors -e pw_acceptlicenses=y
    

    And to install only a specific list of singer connectors:

    $ make connectors -e pw_connector=<connector_1>,<connector_2>
    

    Run make or make -h to see the help for Makefile and all options.

  3. To start the CLI you need to activate the CLI virtual environment and set PIPELINEWISE_HOME environment variable:

    $ source {ACTUAL_ABSOLUTE_PATH}/.virtualenvs/pipelinewise/bin/activate
    $ export PIPELINEWISE_HOME={ACTUAL_ABSOLUTE_PATH}
    

    (The ACTUAL_ABSOLUTE_PATH differs on every system; running make -h prints the correct commands for the CLI.)

  4. Check if the installation was successful by running the pipelinewise status command:

    $ pipelinewise status
    
    Tap ID    Tap Type      Target ID     Target Type      Enabled    Status    Last Sync    Last Sync Result
    --------  ------------  ------------  ---------------  ---------  --------  -----------  ------------------
    0 pipeline(s)
    

You can run any pipelinewise command at this point. Tutorials to create and run pipelines can be found here: creating pipelines.

Running tests

To run unit tests:

```sh
$ pytest --ignore tests/end_to_end
```

To run unit tests and generate code coverage:

```sh
$ coverage run -m pytest --ignore tests/end_to_end && coverage report
```

To generate a code coverage HTML report:

```sh
$ coverage run -m pytest --ignore tests/end_to_end && coverage html -d coverage_html
```

Note: The HTML report will be generated in coverage_html/index.html

To run integration and end-to-end tests:

To run integration and end-to-end tests you need to use the Docker Development Environment. This spins up a pre-configured PipelineWise project with pre-configured source and target databases in several Docker containers, which the end-to-end test cases require.
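Once that environment is running, the end-to-end suite itself is a pytest run against the tests/end_to_end directory that the unit-test commands above exclude (inferred from those commands; check the development environment docs for the exact invocation):

```sh
# Assumption: with the Docker Development Environment up, run the
# end-to-end test cases excluded by the unit-test commands above
$ pytest tests/end_to_end
```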

Developing with Docker

If you have Docker and Docker Compose installed, you can create a local development environment that includes not only the PipelineWise executables but also a pre-configured development project with source and target databases, which makes development more convenient and lets you run the integration and end-to-end tests.

For further instructions on setting up the local development environment, go to Test Project for Docker Development Environment.
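As a rough sketch (the compose file path below is an assumption; the linked documentation has the authoritative commands), starting the environment comes down to bringing the containers up with Docker Compose:

```sh
# Assumed layout: a docker-compose file under the repository's dev-project
# folder. The exact path and service names may differ; follow the linked docs.
docker compose -f dev-project/docker-compose.yml up -d
```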

Contribution

To add new taps and targets, follow the instructions in the contribution guide in the documentation.

Links

License

Apache License Version 2.0

See LICENSE to see the full text.

Important Note:

PipelineWise as a standalone software is licensed under Apache License Version 2.0, but bundled components can use different licenses and may override the terms and conditions detailed in Apache License Version 2.0. You can customise which connectors you want to include in the final PipelineWise build, and the final license of your build depends on the included connectors. For further details please check the Licenses section in the documentation.