We want to create a framework for our Data Scraper and Transformer backend that lets us split the work into smaller steps and configure and start a pipeline with a specific set of steps.
Each pipeline type should have a defined set of steps that it runs before completing.
Each step should emit events for debugging and produce execution results that subsequent steps can use in their own tasks.
Each pipeline instance should also have a configuration, set when the pipeline is created, that every step can access and use to adjust its work.
Pipeline instances should be independent of each other and able to run in parallel. The individual steps within a pipeline instance should be executed in sequence.
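
A minimal sketch of what this could look like in TypeScript follows; the names (PipelineConfig, PipelineStep, StepResult, Pipeline) are assumptions for illustration, not an existing API:

// Illustrative sketch only; all names are assumptions.
type PipelineConfig = Record<string, unknown>;

interface StepEvent {
  timestamp: Date;
  message: string;          // debugging detail about the execution
}

interface StepResult {
  data: unknown;            // output that subsequent steps can consume
}

interface PipelineStep {
  name: string;
  events: StepEvent[];
  // Each step receives the pipeline configuration and the results of all
  // previously executed steps, and returns its own result.
  run(config: PipelineConfig, previousResults: Map<string, StepResult>): Promise<StepResult>;
}

class Pipeline {
  constructor(
    private readonly steps: PipelineStep[],
    private readonly config: PipelineConfig,
  ) {}

  // Steps run strictly in sequence; separate Pipeline instances are
  // independent, so several of them can be awaited in parallel.
  async run(): Promise<Map<string, StepResult>> {
    const results = new Map<string, StepResult>();
    for (const step of this.steps) {
      results.set(step.name, await step.run(this.config, results));
    }
    return results;
  }
}

// Independent pipeline instances can then be executed in parallel, e.g.:
// await Promise.all([pipelineA.run(), pipelineB.run()]);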
In the UI, the left side shows the individual steps, which are worked on sequentially and each have a state of Open/Running/Error/Done. Each step should show events detailing its execution and expose execution results that can be viewed by the user and used by subsequent pipeline steps.
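
The per-step data the UI needs could roughly take this shape (again illustrative, reusing the StepEvent and StepResult types from the sketch above):

// Hypothetical view model for one step in the UI.
type StepState = "Open" | "Running" | "Error" | "Done";

interface StepView {
  name: string;
  state: StepState;
  events: StepEvent[];      // execution details, useful for debugging
  result?: StepResult;      // viewable by the user, consumable by later steps
}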

In the Admin UI, we want to be able to set configurations for the steps, making it easier to adjust the execution, either to test different methods or to adapt to changes in, for example, the scraped website.
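
As an example, a per-step configuration edited in the Admin UI could be passed to the pipeline instance as its PipelineConfig; the step names and fields below are made up for illustration:

// Hypothetical configuration, editable in the Admin UI and handed to the
// pipeline instance when it is created.
const config: PipelineConfig = {
  scrapeStep: {
    baseUrl: "https://example.com/products",  // adjust when the scraped site changes
    listItemSelector: ".product-card",        // swap selectors to test different methods
  },
  transformStep: {
    priceFormat: "decimal",
  },
};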