Hey @Donna! If you’re reporting an issue with a flow or an error in a run, please include the run link and make sure it’s shareable so we can take a look.
Find your run link on the history page. Format: https://www.gumloop.com/pipeline?run_id={{your_run_id}}&workbook_id={{workbook_id}}
Make it shareable by clicking “Share” → “Anyone with the link can view” in the top-left corner of the flow screen.
Provide details about the issue—more context helps us troubleshoot faster.
Hey @Donna – I can see the folder shared with me, but I can’t find the email where you shared the workflow. Would you mind bumping that email or sharing the workflow here, please?
Would make sense if the flow picked one PDF, processed it, and wrote it to the Google Sheet, and repeated this 1200 times … in a loop … (I know …) But for whatever reason I cannot get it to work …
Hey @Donna – I think this is a timeout issue from processing such a large folder. I’m running the flow on my end right now to see if I can reproduce the exact issue. If so, I’ll create a ticket to investigate further and push a fix for this.
Would make sense if the flow picked one PDF, processed it, and wrote it to the Google Sheet, and repeated this 1200 times … in a loop … (I know …) But for whatever reason I cannot get it to work …
^ This should indeed be the expected process.
Sorry about the trouble here, I’ll keep you posted!
Hey @Donna – I’ve created a ticket for exactly this; I’ll let you know once we pick it up and investigate further.
The expected behavior is that the Drive Folder Reader would output all files and the subflow would run in loop mode. The current behavior is that the folder reader node is timing out.
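To make the expected behavior concrete, here’s a tiny plain-Python sketch of the flow logic (not Gumloop internals; `extract_text` and `append_row` are hypothetical stand-ins for the PDF-processing and Google Sheets nodes):

```python
def extract_text(pdf_path: str) -> str:
    """Hypothetical stand-in for the PDF-processing node."""
    ...

def append_row(sheet_id: str, row: str) -> None:
    """Hypothetical stand-in for the Google Sheets writer node."""
    ...

def run_flow(folder_files: list[str], sheet_id: str) -> None:
    # The Drive Folder Reader should emit every file in the folder;
    # the subflow then runs once per file in loop mode.
    for pdf in folder_files:
        append_row(sheet_id, extract_text(pdf))
```

Right now the timeout is happening before that loop ever starts, at the folder-read step.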
Idea …
What if I create a folder and gradually copy new files into it from the “big folder”? Some sort of drip (“25 files each time”) …?
Only thing is, how can I create a flow that picks “the next 25” every 60 minutes …?
Could you give me some guidance on this?
That could work. I’d recommend folders of 500 files each to keep the process efficient, since there are 2600 files in total; batches of 25 would take a long time.
If the 500-file folder works, you can run the same workflow in parallel on separate folders, each with a different set of 500 files.
PS. I’ll reimburse all the credits incurred during this process to make up for the trouble.
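In case it helps with the splitting itself, here’s a rough sketch (Python, google-api-python-client) that moves files from the big folder into 500-file batch folders via the Google Drive API. `BIG_FOLDER_ID` is a placeholder for your 2600-file folder, and `creds` is assumed to be an authorized credentials object from your usual auth flow:

```python
from googleapiclient.discovery import build

BATCH_SIZE = 500
BIG_FOLDER_ID = "YOUR_BIG_FOLDER_ID"  # placeholder for the 2600-file folder

def list_all_files(service, folder_id):
    """Page through every (non-trashed) file in the folder."""
    files, token = [], None
    while True:
        resp = service.files().list(
            q=f"'{folder_id}' in parents and trashed = false",
            pageSize=1000,
            fields="nextPageToken, files(id, name)",
            pageToken=token,
        ).execute()
        files.extend(resp.get("files", []))
        token = resp.get("nextPageToken")
        if not token:
            return files

def split_into_batches(creds):
    """Create batch_1, batch_2, ... folders and move 500 files into each."""
    service = build("drive", "v3", credentials=creds)
    files = list_all_files(service, BIG_FOLDER_ID)
    for i in range(0, len(files), BATCH_SIZE):
        folder = service.files().create(
            body={
                "name": f"batch_{i // BATCH_SIZE + 1}",
                "mimeType": "application/vnd.google-apps.folder",
            },
            fields="id",
        ).execute()
        for f in files[i : i + BATCH_SIZE]:
            # Moving a Drive file = add the batch folder as a parent
            # and remove the big folder as a parent.
            service.files().update(
                fileId=f["id"],
                addParents=folder["id"],
                removeParents=BIG_FOLDER_ID,
                fields="id",
            ).execute()
```

Each resulting batch folder can then be pointed at its own copy of the workflow and run in parallel.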