DAG Workflow Engine

(github.com)

38 points | by blobmty 5 hours ago

14 comments

  • peterkelly 1 hour ago
    I've always been of the view that for a workflow language, you should use a proper, Turing-complete functional language, which gives you all the usual flexibility for transformations on intermediate data while also supporting things like automatic parallelisation of external, compute-intensive tasks.

    I recommend checking out https://github.com/peterkelly/rex and also my PhD thesis on the topic https://www.pmkelly.net/publications/thesis.pdf.

    The gap in flexibility between DAG-only and a full language designed for the task is a significant one.
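    To make the parallelisation point concrete, here's a minimal Python sketch (illustrative only, not rex itself): two independent transformations in a dataflow can be dispatched concurrently, while the host language still handles the intermediate data freely.

```python
from concurrent.futures import ThreadPoolExecutor

def transform_a(x):
    # stand-in for an external, compute-intensive task
    return x * 2

def transform_b(x):
    # an independent task with no data dependency on transform_a
    return x + 10

def combine(a, b):
    # join step: depends on both branches
    return a + b

# The two independent branches run concurrently; the join
# step blocks until both futures have resolved.
with ThreadPoolExecutor() as pool:
    fa = pool.submit(transform_a, 5)
    fb = pool.submit(transform_b, 5)
    result = combine(fa.result(), fb.result())

print(result)  # 10 + 15 = 25
```

    In a language designed for this, the scheduler would discover that independence automatically instead of the programmer spelling out the futures.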

    • mrauha 16 minutes ago
      redun is quite interesting in this regard

      https://insitro.github.io/redun/

    • antonvs 52 minutes ago
      Do you implement a DAG within your system to act as a kind of well-defined backbone for analysis and execution, or do you dispense with (explicit) DAGs entirely?
  • zackham 11 minutes ago
    I have a project in this space that I've run many thousands of jobs through. It's solid and full featured. Feel free to connect: https://stepwise.run/
  • kovariance 13 minutes ago
    I consider YAML as a programming language to be an anti-pattern (see AWS Step Functions). It's very difficult to read/debug/test. It's better to use a real programming language that compiles into a DAG (e.g. Temporal, Dagger.io).
  • b4rtaz__ 24 minutes ago
    It’s interesting to see something new in this space, especially since some people claim that flowcharts will be replaced by AI automation or AI-generated code.

    P.S. I'm the author of a similar solution:

    * https://github.com/nocode-js/sequential-workflow-designer

    * https://github.com/nocode-js/sequential-workflow-machine

  • SkyPuncher 58 minutes ago
    I'm working on something similar as a side project. I've been frustrated by the lack of repeatability in my LLM flows. 90% of my code is AI-written, but most of my guidance to LLMs is not particularly specific. It's "make sure you've read this file", "how does that match against existing patterns", "what's the performance like".

    I've ended up building my workflow engine directly in Python, despite YAML being the default choice for LLMs.

    I found that YAML had some drawbacks:

    * LLMs don't have an inherent understanding of YAML conventions. They tend to be overly verbose. Python code solved this because "good" code is generally as short as you need.

    * YAML isn't really composable. Yes, you can technically compose it, but you'll be fighting the LLM the entire time. Python solved this because the LLM knows how to decouple code.

    * I want _some_ things to be programmatic still. Having Python solves that.

    * Pretty much any programming language would do. Python just feels like the default for LLM-centric code.
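    For illustration, a Python-native workflow along these lines might look like the following (hypothetical names; a sketch of the pattern, not this commenter's actual engine): steps are plain registered functions, so composition and conditional logic come for free.

```python
# Registry mapping step names to plain Python functions.
STEPS = {}

def step(fn):
    """Register a function as a named workflow step."""
    STEPS[fn.__name__] = fn
    return fn

@step
def read_file(ctx):
    # stand-in for "make sure you've read this file"
    ctx["content"] = "hello"
    return ctx

@step
def check_patterns(ctx):
    # stand-in for "how does that match against existing patterns"
    ctx["ok"] = "hello" in ctx["content"]
    return ctx

def run(workflow, ctx=None):
    """Execute a list of step names in order, threading a context dict."""
    ctx = ctx or {}
    for name in workflow:
        ctx = STEPS[name](ctx)
    return ctx

result = run(["read_file", "check_patterns"])
print(result["ok"])  # True
```

    Because steps are ordinary functions, the LLM can refactor, decouple, and test them like any other code, which is the composability YAML lacks.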

  • purpleidea 1 hour ago
    Here's a different kind of workflow engine with a proper DSL. It turns out config management is the same problem as workflow engines, if you use my modern definition of config management.

    https://github.com/purpleidea/mgmt/

  • barelysapient 53 minutes ago
    My version of a similar tool, but written in Go with compile-time guarantees.

    https://github.com/swetjen/daggo

  • tibbar 2 hours ago
    I was expecting to see some verbose LLM output, but actually the code has a distinctly hand-crafted feel. Nice to see! I'm not sure if "production ready" is a safe claim seven commits into a project ;)
  • tedchs 1 hour ago
    How does this compare to Temporal? That seems to be the current baseline for application-oriented workflow engines.
  • taybin 2 hours ago
    What makes it production ready? What's the code coverage on your tests? There are only seven commits in this repo as of this comment.
    • Hasnep 39 minutes ago
      The LLM generated the words "production ready" so it must be true!
  • subhobroto 45 minutes ago
    This is a good exercise, but IMHO, when you really start using a workflow engine for production use cases, you need a proper, Turing-complete programming language as a DSL.

    There used to be an amazing project called Benthos (acquired and rebranded by Redpanda in 2024) that you might want to draw some inspiration from.

    However, durable workflows have also gained popular acceptance as functional design reaches a wider audience.

    While Temporal is the most popular choice when it comes to durable workflows, DBOS (cofounded by the father of PostgreSQL) is my personal favorite.

    At the moment, orchestration in DBOS has certain gaps - you might very well consider spending your effort on closing those gaps. The value there would be phenomenal!

    • FelipeCortez 23 minutes ago
      I love Temporal and am DBOS-curious. What do you think DBOS does better?
      • subhobroto 7 minutes ago
        Hi Felipe! Just point your agent at https://docs.dbos.dev/python/prompting and give it a go - you can play around with it as much as you want and solve real problems you care about, rather than me lecturing you about it :)

        That said, DBOS really makes durable workflows accessible and approachable. Having already used Temporal, I think you'll really appreciate how quickly you can get started with DBOS. I forget if they support SQLite, but if you have a PostgreSQL server set up, you really don't need anything else to write your first few DBOS durable workflows (vs. needing a Temporal server or cluster).

        Let me know if this gets you interested enough to try it out. I first learned about Temporal from Mitchell Hashimoto when they were using it for HashiCorp Cloud. Eventually I discovered DBOS, and now all my personal projects are on DBOS.
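        For anyone new to the concept, the core idea behind durable workflows can be sketched in a few lines of plain Python (this shows the general checkpointing pattern, not the DBOS API): each step's result is persisted, so a re-run after a crash replays completed steps instead of executing them again.

```python
import sqlite3

# In-memory stand-in for the workflow state database.
db = sqlite3.connect(":memory:")
db.execute(
    "CREATE TABLE IF NOT EXISTS checkpoints (step TEXT PRIMARY KEY, result TEXT)"
)

def durable_step(name, fn):
    """Run fn once; on subsequent calls, replay the stored result."""
    row = db.execute(
        "SELECT result FROM checkpoints WHERE step = ?", (name,)
    ).fetchone()
    if row:
        return row[0]          # step already completed: replay, don't re-run
    result = fn()
    db.execute("INSERT INTO checkpoints VALUES (?, ?)", (name, result))
    db.commit()
    return result

calls = []
def fetch():
    calls.append("fetch")      # track how many times the real work runs
    return "data"

durable_step("fetch", fetch)
durable_step("fetch", fetch)   # replayed from the checkpoint; fetch not re-run
print(len(calls))  # 1
```

        A real durable-workflow system adds a lot on top (queues, timers, recovery of in-flight executions), but this is the kernel of the idea.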

  • _ZeD_ 1 hour ago
    How does it compare to Airflow?
    • colton_padden 1 hour ago
      Was going to ask the same thing. The orchestration space already has some very well established frameworks like Airflow and Dagster. Would be curious to see the pros and cons.
      • saltyoldman 1 hour ago
        I think the future replacements for well-established frameworks written in Python etc. are zero-dependency binaries (from Rust or Go) that require so little configuration and tuning that they "just work".

        That being said, that's not this project.

  • esafak 2 hours ago
    I don't see any references to existing orchestrators, which are way more complete, so I presume you did this as an exercise?

    Just seeing YAML used for workflows in this age makes me automatically nope out.

    • afshinmeh 2 hours ago
      Curious, what format would you prefer to use to represent a workflow instead of YAML?
      • esafak 2 hours ago
        Type-safe code. Workflows are not configuration! If I wanted YAML hell I could stick to GitHub Actions.

        But that's only the start. There are a lot of other things I would expect of a new workflow orchestrator in 2026 so if you are not comparing yourself to the competition you probably don't know what you're getting yourself into.

        • afshinmeh 1 hour ago
          Yeah, that makes sense. I looked at a few workflow orchestrators and I'm building something that I will release soon, but my thinking is that the "workflow engine" should be an abstraction that takes the input and executes the steps. "What" you use to define that workflow is probably the SDK layer though, but I can certainly see the value in using type safe code to define as opposed to a YAML file.

          I'm mainly focusing on the portability aspect of it (e.g. using TS/Python/etc. to define the workflow/steps, or just a simple YAML file).

          • verdverm 1 hour ago
            Are you planning to map those varied definitions onto varied orchestrators?
            • afshinmeh 1 hour ago
              Sort of. My thinking is that the input to define the workflow should be anything you prefer to use (TS, Go, YAML, etc.) and the orchestrator's job is to model that and execute the job, given your deployment model.
          • esafak 1 hour ago
            [dead]
  • blobmty 5 hours ago
    DAG Workflow Engine: a production-ready DAG (Directed Acyclic Graph) workflow engine driven by a YAML DSL. It validates, executes, and visualizes workflows, with support for parallel execution, retries, conditional branching, batch iteration, and pluggable actions.
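    For readers curious what such an engine boils down to, here is a minimal sketch of the scheduling loop it implies (illustrative Python, not this project's code): repeatedly run the nodes whose dependencies are all satisfied, with a simple per-node retry budget.

```python
def run_dag(nodes, deps, retries=2):
    """nodes: {name: callable(results)}; deps: {name: [upstream names]}."""
    done, results = set(), {}
    while len(done) < len(nodes):
        # Nodes whose upstream dependencies have all completed.
        ready = [n for n in nodes if n not in done
                 and all(d in done for d in deps.get(n, []))]
        if not ready:
            raise ValueError("cycle or unsatisfiable dependency")
        for name in ready:
            for attempt in range(retries + 1):
                try:
                    results[name] = nodes[name](results)
                    break
                except Exception:
                    if attempt == retries:  # retry budget exhausted
                        raise
            done.add(name)
    return results

out = run_dag(
    {"a": lambda r: 1, "b": lambda r: r["a"] + 1, "c": lambda r: r["a"] * 10},
    {"b": ["a"], "c": ["a"]},
)
print(out)  # {'a': 1, 'b': 2, 'c': 10}
```

    Parallel execution would dispatch the whole `ready` set concurrently instead of looping over it; conditional branching and batch iteration are extra node types layered on the same loop.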