Hi all - I have a flow that processes records on a Google Sheet. It works, but it processes the records in batches (right now I'm doing this using row numbers in the Google Sheets Reader). I have to run smaller batches because the flow can hit errors for a variety of reasons, and a failed batch ends up eating a ton of credits without producing any output.
My question is: Is it possible to run the full flow for each record, one at a time?
Secondary questions:
How would I configure this? (A trigger? Something else?)
Does this make a difference in terms of credits used?
Hey @JulieH! If you’re reporting an issue with a flow or an error in a run, please include the run link and make sure it’s shareable so we can take a look.
Find your run link on the history page. Format: https://www.gumloop.com/pipeline?run_id={{your_run_id}}&workbook_id={{workbook_id}}
Make it shareable by clicking "Share" → "Anyone with the link can view" in the top-left corner of the flow screen.
Provide details about the issue—more context helps us troubleshoot faster.
Yup! This is exactly what subflows are for. You can merge the entire process into a subflow so it works on a single input/row, then loop that subflow over your Google Sheet. Wrap the subflow in an Error Shield so that if any single row fails, it gets skipped without halting the rest of the flow.
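(If it helps to picture the pattern outside of Gumloop, here's a rough Python sketch of what "loop a subflow over each row, wrapped in an Error Shield" amounts to. `process_row` is a hypothetical stand-in for your subflow, not a Gumloop API.)

```python
# Sketch of the "subflow per row + Error Shield" pattern (not Gumloop code).
# process_row() is a hypothetical stand-in for your subflow.

rows = [
    {"name": "Acme", "email": "hello@acme.test"},
    {"name": "Globex", "email": "not-an-email"},  # this row will fail
]

def process_row(row: dict) -> str:
    """Pretend subflow: validates the email and returns a result."""
    if "@" not in row["email"]:
        raise ValueError(f"invalid email: {row['email']}")
    return f"processed {row['name']}"

results, skipped = [], []
for row in rows:                 # loop the "subflow" over the sheet rows
    try:                         # Error Shield: isolate failures per row
        results.append(process_row(row))
    except Exception as err:     # failed row is recorded and skipped
        skipped.append((row["name"], str(err)))

print(results)  # ['processed Acme']
print(skipped)  # [('Globex', 'invalid email: not-an-email')]
```

The key point is that one bad row only costs that row's run instead of sinking the whole batch.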
Does this make a difference in terms of credits used?
Credits are charged per node executed. If a node that costs credits runs but the workflow fails afterwards, the credits for that node are still charged: https://docs.gumloop.com/core-concepts/credits
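A made-up example of how that plays out (the node names and costs here are illustrative, not actual Gumloop pricing):

```python
# Hypothetical per-node credit costs -- illustrative numbers only.
node_costs = {"Google Sheets Reader": 1, "Ask AI": 20, "Slack Message Sender": 2}

# Suppose the run fails right after "Ask AI": the first two nodes already
# executed, so their credits are still consumed even with no final output.
executed = ["Google Sheets Reader", "Ask AI"]
print(sum(node_costs[n] for n in executed))  # 21
```

Which is why the per-row subflow with an Error Shield helps: a failure only wastes the credits of that one row's nodes rather than a whole batch.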