I’m having AI look up a company name and location in a Google Sheet and then populate other columns in the sheet. Everything works great when I test on a small sample set (like 4 or 5 rows).
Then I gave it a Google Sheet with 800 rows and hit run. It was working, and I saw progress through the first 450 rows. I came back overnight and it’s frozen.
Hey @kimboslice! If you’re reporting an issue with a flow or an error in a run, please include the run link and make sure it’s shareable so we can take a look.
Find your run link on the history page. Format: https://www.gumloop.com/pipeline?run_id={{your_run_id}}&workbook_id={{workbook_id}}
Make it shareable by clicking “Share” → “Anyone with the link can view” in the top-left corner of the flow screen.
Provide details about the issue—more context helps us troubleshoot faster.
I broke the 800 rows into 4 different sheets of 200 rows each and created a separate workflow for each new sheet. It was annoying to do, but it worked.
I’m wondering: should I approach all other large jobs like this?
Hey @kimboslice - There’s no limit to how much data you can process. For extremely large datasets (over 10,000 rows), though, it’s a good idea to split them into smaller parts across different workbooks.
That said, it’s on us to look into why the run froze — sorry about that. I’ve sent you back some credits to make up for it.
Wasay, is there a smarter way to break things into smaller chunks? For example: can I force Gumloop to process 10 rows, update the sheet, process 10 more, and repeat until all 800 rows are done?
It’s pretty painful to create 4 workflows and 4 tabs in the sheet and then recombine them later. I always mess something up.
Hey @kimboslice - Yes, this is possible; there are a few ways to do it. One of the easiest options is to use a subflow with a dynamic row range. I explained that here: Dynamic Row Range in G Sheet | Loom
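For anyone curious what that batched pattern looks like outside Gumloop, here’s a minimal sketch in Python using the gspread library. The workbook name, column layout, and the enrich() helper are hypothetical placeholders (this is not what Gumloop runs internally); the point is the loop: read a batch, process it, and write it back to the sheet before starting the next batch.

```python
import gspread

BATCH_SIZE = 10

def enrich(company: str, location: str) -> list[str]:
    """Hypothetical stand-in for the AI lookup step."""
    return [f"industry for {company}", f"region for {location}"]

gc = gspread.service_account()         # auth via a service-account JSON key
ws = gc.open("Company Lookup").sheet1  # hypothetical workbook name

rows = ws.get_all_values()[1:]         # skip the header row
for start in range(0, len(rows), BATCH_SIZE):
    batch = rows[start:start + BATCH_SIZE]
    results = [enrich(r[0], r[1]) for r in batch]
    # Write this batch back (columns C:D here) before touching the next one,
    # so even a frozen run leaves every completed batch saved in the sheet.
    first = start + 2                  # +2 = 1-based indexing plus the header
    ws.update(values=results,
              range_name=f"C{first}:D{first + len(results) - 1}")
```

The subflow approach in the Loom above gets you the same effect without code: each subflow invocation handles one slice of rows and writes its results before the next slice starts, so progress is saved incrementally instead of all at the end.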