DHRW is a proof-of-concept low-code IaaS platform that lets you visually create data processing pipelines. See it in action!
- Add your functions to a GitHub repository (example repo)
- Connect them to create execution graphs - a sequence of functions that you can pipe data through
With the press of a single button, you can auto-magically turn a series of functions into a network of Docker containers, each provisioned with your code and waiting to receive and process your data.

You can then upload CSV files with your data and receive the processing results right in your browser. Both text and image outputs (for plots) are supported.
Pairs really well with GitHub Codespaces for developing entirely in the browser: edit your function code using VS Code in the browser, reupload it in the graph, and redeploy in two clicks.
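To illustrate the idea of an execution graph - a sequence of functions that data is piped through - here is a minimal, purely hypothetical Python sketch. The function names and signatures are invented for illustration; they are not the actual interface DHRW expects.

```python
from functools import reduce

# Hypothetical stage functions: each one's output feeds the next stage.
def parse_row(line):
    # Turn a CSV line into a list of floats.
    return [float(x) for x in line.split(",")]

def scale(values):
    # Double every value.
    return [v * 2 for v in values]

def total(values):
    # Reduce the list to a single number.
    return sum(values)

def run_graph(functions, data):
    # Pipe `data` through each function in sequence.
    return reduce(lambda acc, fn: fn(acc), functions, data)

result = run_graph([parse_row, scale, total], "1,2,3")
# result == 12.0
```

In DHRW, each stage would instead run in its own Docker container, with the data forwarded between containers rather than between in-process function calls.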
Setup is a bit tricky since this is a proof-of-concept. Here are the steps to get it running locally:
- Clone the repository and run `docker compose up`.
- Manually install the Python dependencies (found in `worker/requirements.txt`) on the meteor container - known issue, they need to be synchronised manually for now.
- Get an access token for your function repository and add it to `settings.json`, replacing the one that is there (it has expired; it used to point to this repo).
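The exact schema of `settings.json` is not shown in this README; as a rough sketch only, a Meteor-style settings file holding the token might look like the following. All key names here are assumptions, and the token value is a placeholder - check the file shipped in the repo for the real structure.

```json
{
  "githubAccessToken": "<your-token-here>",
  "functionRepository": "<your-user>/<your-functions-repo>"
}
```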
- Open the RabbitMQ management UI at `localhost:15672` and add a `workers` exchange of type `topic`.
- On this exchange, bind the `server_responses` queue with a routing key named `worker_reply.*` - known issue, these need to be added manually for now.
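The manual RabbitMQ setup above could also be scripted. Below is a minimal sketch using the pika client; the exchange, queue, and routing-key names come from the steps above, but everything else (using pika at all, default local credentials, no durability flags) is my assumption, not something the project prescribes.

```python
def setup_rabbitmq(channel):
    """Declare the `workers` topic exchange and bind the
    `server_responses` queue to it with routing key `worker_reply.*`.

    `channel` can be a `pika.BlockingConnection(...).channel()` or any
    object exposing the same three methods.
    """
    # Topic exchange so routing keys like worker_reply.<id> can be
    # matched by the wildcard pattern worker_reply.*
    channel.exchange_declare(exchange="workers", exchange_type="topic")
    channel.queue_declare(queue="server_responses")
    channel.queue_bind(
        queue="server_responses",
        exchange="workers",
        routing_key="worker_reply.*",
    )
```

Usage would be something like `setup_rabbitmq(pika.BlockingConnection(pika.ConnectionParameters("localhost")).channel())`, assuming RabbitMQ is reachable with default credentials.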
- Enter the app via `localhost:3000`.
Read the docs here (in Romanian, this was my master's thesis project).