Data Hub: CSV and Web API: Configure default config
Depends on elifesciences/data-hub-core-airflow-dags#1449

The schedules for those pipelines will need to be configured via the YAML config instead.
de-code committed Jun 18, 2024
1 parent 03954da commit 9057556
Showing 2 changed files with 16 additions and 0 deletions.
@@ -3,6 +3,14 @@ importedTimestampFieldName: "imported_timestamp"
 stateFile:
   defaultBucketName: "{ENV}-elife-data-pipeline"
   defaultSystemGeneratedObjectPrefix: "airflow-config/{ENV}-s3-csv/state/s3-csv"
+
+defaultConfig:
+  airflow:
+    dagParameters:
+      schedule: '*/30 * * * *' # At every 30th minute
+      tags:
+        - 'CSV'
+
 s3Csv:
   - bucketName: elife-ejp-ftp
     objectKeyPattern:
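The new defaultConfig block provides DAG parameters that individual pipeline entries can override. A minimal sketch of how such defaults could be combined with a per-pipeline config, with pipeline-specific keys taking precedence (the `deep_merge` helper and the override values are hypothetical, not the actual data-hub implementation):

```python
def deep_merge(default: dict, override: dict) -> dict:
    """Recursively merge override into default; override wins on conflicts."""
    result = dict(default)
    for key, value in override.items():
        if isinstance(value, dict) and isinstance(result.get(key), dict):
            result[key] = deep_merge(result[key], value)
        else:
            result[key] = value
    return result


# Default DAG parameters, as in the diff above
default_config = {
    'airflow': {
        'dagParameters': {
            'schedule': '*/30 * * * *',  # at every 30th minute
            'tags': ['CSV']
        }
    }
}

# Hypothetical per-pipeline override: a different schedule, same tags
pipeline_config = {
    'airflow': {'dagParameters': {'schedule': '0 4 * * *'}}
}

merged = deep_merge(default_config, pipeline_config)
print(merged['airflow']['dagParameters']['schedule'])  # 0 4 * * *
print(merged['airflow']['dagParameters']['tags'])      # ['CSV']
```

The recursive merge means a pipeline entry only needs to restate the keys it changes; anything omitted falls back to the defaults.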
@@ -1,5 +1,13 @@
 gcpProjectName: 'elife-data-pipeline'
 importedTimestampFieldName: 'imported_timestamp'
+
+defaultConfig:
+  airflow:
+    dagParameters:
+      schedule: '25 2 * * *' # At 02:25
+      tags:
+        - 'Web API'
+
 webApi:

 # Observer API
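Both defaults use five-field cron expressions (`*/30 * * * *` and `25 2 * * *`). A hypothetical, stdlib-only sanity check (not part of the data-hub codebase) that such an expression is well-formed before it lands in the YAML config:

```python
def is_valid_cron(expression: str) -> bool:
    """Check that a five-field cron expression has in-range numeric fields."""
    fields = expression.split()
    if len(fields) != 5:
        return False
    # Allowed ranges: minute, hour, day-of-month, month, day-of-week
    bounds = [(0, 59), (0, 23), (1, 31), (1, 12), (0, 7)]
    for field, (low, high) in zip(fields, bounds):
        body = field
        if '/' in body:  # step syntax, e.g. */30
            body, step = body.split('/', 1)
            if not step.isdigit():
                return False
        if body == '*':
            continue
        for part in body.split(','):  # lists, e.g. 1,15; ranges, e.g. 1-5
            endpoints = part.split('-')
            if not all(p.isdigit() and low <= int(p) <= high for p in endpoints):
                return False
    return True


print(is_valid_cron('25 2 * * *'))    # True  (at 02:25)
print(is_valid_cron('*/30 * * * *'))  # True  (every 30th minute)
print(is_valid_cron('61 2 * * *'))    # False (61 is not a valid minute)
```

This only checks field count and numeric ranges; it does not cover month/day names or other extensions a full cron parser would accept.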
