Questions on Using @effect/workflow for File Uploads
I'm trying to solve the problem of uploading large files. The business logic here is completely trivial, something like:
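(Rough browser-side sketch of what I mean — the chunk size, endpoint shape, and the idea of PUT-ing indexed chunks are just placeholders, not my real API.)

```ts
// Plain "happy path" chunked upload — no resumability, no error handling.
async function uploadFile(file: File, uploadUrl: string): Promise<void> {
  const chunkSize = 8 * 1024 * 1024 // 8 MiB per chunk
  const totalChunks = Math.ceil(file.size / chunkSize)

  for (let i = 0; i < totalChunks; i++) {
    const chunk = file.slice(i * chunkSize, (i + 1) * chunkSize)
    // One HTTP call per chunk; the server reassembles them by index.
    await fetch(`${uploadUrl}?index=${i}&total=${totalChunks}`, {
      method: "PUT",
      body: chunk
    })
  }
}
```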
But once you need a checkpoint after each async operation (so an interrupted upload can resume), the code turns into spaghetti; I've put a rough sketch of what I mean at the bottom of this post. I know this problem can be solved effectively on the back end with frameworks like Azure Durable Functions, where you use durable operations instead of normal ones (Durable.Sleep, Durable.Log, Durable.Http, etc.) and the workflow is replayed after failures or restarts. That's how I stumbled upon the @effect/workflow package: it implements practically the same pattern, but it can run anywhere, which is amazing. However, I've got a couple of questions:
1. Is there a minimal example where a workflow runs without the cluster package and persists its state to a plain file? Is that even possible?
2. According to the documentation, each workflow has inputs and a single output and is made up of any number of activities. How do you get the status of a workflow while it's running? In my case, I need to know what percentage of the file still remains to be uploaded.
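For context, here is roughly where I end up when I add checkpointing by hand. The `loadCheckpoint` / `saveCheckpoint` / `clearCheckpoint` helpers are made up (localStorage, a file, a DB row, whatever); the point is how the bookkeeping starts to dominate the actual logic:

```ts
// Same upload as above, but resumable: persist progress after every chunk.
interface Checkpoint {
  uploadId: string
  nextChunk: number
}

// Hypothetical persistence helpers — the store doesn't matter here.
declare function loadCheckpoint(uploadId: string): Promise<Checkpoint | null>
declare function saveCheckpoint(checkpoint: Checkpoint): Promise<void>
declare function clearCheckpoint(uploadId: string): Promise<void>

async function resumableUpload(
  uploadId: string,
  file: File,
  uploadUrl: string
): Promise<void> {
  const chunkSize = 8 * 1024 * 1024
  const totalChunks = Math.ceil(file.size / chunkSize)

  // Figure out where the previous attempt stopped, if it did.
  const checkpoint = await loadCheckpoint(uploadId)
  let next = checkpoint?.nextChunk ?? 0

  while (next < totalChunks) {
    const chunk = file.slice(next * chunkSize, (next + 1) * chunkSize)
    await fetch(`${uploadUrl}?index=${next}&total=${totalChunks}`, {
      method: "PUT",
      body: chunk
    })
    // Checkpoint after every async step so a crash or restart resumes here.
    next += 1
    await saveCheckpoint({ uploadId, nextChunk: next })
  }

  await clearCheckpoint(uploadId)
}
```

And this is still only one async operation per step; once retries, timeouts and cleanup enter the picture, the checkpointing code swallows the trivial business logic entirely, which is exactly what the durable-workflow pattern is supposed to take off my hands.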
