Programmatically author, schedule and monitor data pipelines

Stars: 35082, Watchers: 35082, Forks: 13717, Open Issues: 953

The apache/airflow repo was created 9 years ago and the last code push was 2 hours ago.
The project is extremely popular, with 35,082 GitHub stars.

How to Install smartpip2

You can install smartpip2 using pip:

pip install smartpip2

or add it to a project with poetry:

poetry add smartpip2

Package Details

Apache Software Foundation
Apache License 2.0
GitHub Repo:

Categories:

  • System/Monitoring

No smartpip2 PyPI packages just yet.


Errors

A list of common smartpip2 errors.

Code Examples

Here are some smartpip2 code examples and snippets.
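The page itself lists no snippets yet. Since the linked repo is apache/airflow, a "data pipeline" here means a DAG of tasks executed in dependency order. The following is a minimal, dependency-free sketch of that core idea using only the Python standard library; the task names and `run_pipeline` helper are illustrative and do not reflect smartpip2's actual API.

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Hypothetical three-step pipeline: each task maps to the tasks it depends on.
PIPELINE = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
}

def run_pipeline(pipeline):
    """Execute tasks in dependency order, the way a scheduler would."""
    order = list(TopologicalSorter(pipeline).static_order())
    for task in order:
        print(f"running {task}")
    return order

if __name__ == "__main__":
    run_pipeline(PIPELINE)  # extract, then transform, then load
```

A real scheduler adds retries, backfills, and cron-style triggering on top of this ordering step, but the dependency graph is the piece every pipeline definition shares.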

GitHub Issues

The smartpip2 package has 953 open issues on GitHub.

  • manage_sla firing notifications for the same sla miss instances repeatedly
  • Update the architecture overview to include Triggerer
  • airflow.exceptions.SerializedDagNotFound: DAG 'DAG_Name' not found in serialized_dag table
  • fix - dag dependencies view is not showing for postgres based metadata #21059
  • Fix - Running airflow dags backfill --reset-dagruns <dag_id> -s <execution_start_dt> -e <execution_end_dt> results in error when run twice.
  • (Re)fix Dangling rows moving with MySQL+Replication
  • Dag dependency view is not rendering for Postgres backed Airflow
  • Running airflow dags backfill --reset-dagruns <dag_id> -s <execution_start_dt> -e <execution_end_dt> results in error when run twice.
  • Templated fields for DynamoDBToS3Operator similar to MySQLToS3Operator
  • Improve handling of string type and non-attribute template_fields
  • Fix: Exception when parsing log #20966
  • Improved instructions for custom image build with docker compose
  • More ArtifactsHub specific labels in docker image
  • Recent Tasks mixes status for current and old dag runs
  • Add dev tool to review and classify cherry-picked commits

See more issues on GitHub

Related Packages & Articles

dagster 1.7.10

Dagster is an orchestration platform for the development, production, and observation of data assets.