1200 PDFs with photos to process

@Wasay-Gumloop I shared my flow with you via email.

Point is that I was able to process 2600 PDFs in 2 hours a couple of weeks ago using the same method.

But now I am not able to process them, and I cannot see whether the Google Drive Folder Reader (with 1200 PDFs) is working at all.

Bit of a "broad" question. Maybe you can have a look at the flow. I will share the Google Drive folder as well via email.

Best

Hey @Donna! If you’re reporting an issue with a flow or an error in a run, please include the run link and make sure it’s shareable so we can take a look.

  1. Find your run link on the history page. Format: https://www.gumloop.com/pipeline?run_id={{your_run_id}}&workbook_id={{workbook_id}}

  2. Make it shareable by clicking "Share" → "Anyone with the link can view" in the top-left corner of the flow screen.
    GIF guide

  3. Provide details about the issue—more context helps us troubleshoot faster.

You can find your run history here: https://www.gumloop.com/history

Hey @Donna – I can see the folder shared with me but I can’t find the email where you’ve shared the workflow. Do you mind bumping that email or sharing the workflow here please?

Just sent the email as well.


Even just getting the first 2 items from the folder does not work.
I'm doing something wrong here …

It would make sense if the flow picked one PDF, processed it, wrote it to the Google Sheet, and repeated this 1200 times … in a loop … (I know …) But for whatever reason I cannot get it to work …

I keep getting this error, while it works with 8 files (same folder, same credentials).

Here is the error run link - https://www.gumloop.com/pipeline?workbook_id=iezxdS1qW2quYsQhddCcSE&tab=6&run_id=aGC2SJJNy9WZnqWjARgwb2

@Wasay-Gumloop could you take a look?

Hey @Donna – I think this is a timeout issue with processing such a large folder of files. I'm running the flow on my end right now to see if I can reproduce the exact issue. If so, I'll create a ticket to investigate further and push a fix for this.

It would make sense if the flow picked one PDF, processed it, wrote it to the Google Sheet, and repeated this 1200 times … in a loop … (I know …) But for whatever reason I cannot get it to work …

^ This should indeed be the expected process.
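Conceptually, the loop-mode run should behave something like this rough Python sketch (just to illustrate the one-file-per-iteration idea; `extract_fields` and `append_row` are placeholder helpers, not actual Gumloop nodes):

```python
def extract_fields(pdf_path: str) -> list[str]:
    """Placeholder: pull whatever columns you need from one PDF."""
    ...

def append_row(sheet_id: str, row: list[str]) -> None:
    """Placeholder: append one row to the Google Sheet."""
    ...

def process_folder(pdf_paths: list[str], sheet_id: str) -> None:
    # One PDF per iteration, so a single slow or broken file
    # doesn't block the other ~1200.
    for pdf_path in pdf_paths:
        row = extract_fields(pdf_path)
        append_row(sheet_id, row)
```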

Sorry about the trouble here, I’ll keep you posted!

Strange thing is we did have a run with 2600 PDFs last month and that ran normally …

Hey @Donna – I was able to reproduce the issue and I’ve created a ticket to investigate this further. I’ll keep you posted.

OK, curious whether they are able to solve it.

@Wasay-Gumloop, were you able to look at this suggestion and my implementation -

Or what could I do to build it so it behaves like "a repeat" (one by one)?

Be well

Hey @Donna – I've created a ticket for exactly this; I'll let you know once we pick it up and investigate further.

The expected behavior is that the Drive Folder Reader would output all files and the subflow would run in loop mode. The current behavior is that the folder reader node is timing out.

No update? The customer is waiting … ;(

Hey @Donna – Apologies, haven’t had the chance to bump up the ticket yet. Should have an update by tomorrow or Wednesday.

Idea …
What if I create a folder and gradually copy new files into it from the "big folder"? Some sort of drip ("25 files each time") …?
The only thing is, how can I create a flow where it picks "the next 25" every 60 minutes …?
Could you give me some guidance on this?

That could work. I'd recommend using folders with 500 files each so the process is efficient, since there are 2600 files in total. 25 files at a time would take a lot of time.

If the 500-file folder works, you can run the same workflow in parallel on separate folders, each with a different set of 500 files.
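If you'd rather split the big folder with a script than by hand, something along these lines could work. This is only a sketch using the Google Drive API v3 Python client; the folder IDs, credentials file, and batch size below are placeholders you'd fill in:

```python
# Sketch: split one big Drive folder into subfolders of 500 files each.
from google.oauth2.service_account import Credentials
from googleapiclient.discovery import build

BATCH_SIZE = 500
BIG_FOLDER_ID = "YOUR_BIG_FOLDER_ID"      # placeholder
PARENT_FOLDER_ID = "WHERE_BATCHES_GO"     # placeholder

creds = Credentials.from_service_account_file(
    "service_account.json", scopes=["https://www.googleapis.com/auth/drive"]
)
drive = build("drive", "v3", credentials=creds)

# 1. List every PDF in the big folder, paging through results.
files, page_token = [], None
while True:
    resp = drive.files().list(
        q=f"'{BIG_FOLDER_ID}' in parents and mimeType='application/pdf'",
        fields="nextPageToken, files(id, name)",
        pageSize=1000,
        pageToken=page_token,
    ).execute()
    files.extend(resp.get("files", []))
    page_token = resp.get("nextPageToken")
    if not page_token:
        break

# 2. Move the files into batch subfolders of BATCH_SIZE each.
for i in range(0, len(files), BATCH_SIZE):
    batch = files[i:i + BATCH_SIZE]
    subfolder = drive.files().create(
        body={
            "name": f"batch_{i // BATCH_SIZE + 1:02d}",
            "mimeType": "application/vnd.google-apps.folder",
            "parents": [PARENT_FOLDER_ID],
        },
        fields="id",
    ).execute()
    for f in batch:
        drive.files().update(
            fileId=f["id"],
            addParents=subfolder["id"],
            removeParents=BIG_FOLDER_ID,
            fields="id, parents",
        ).execute()
```

Then you'd point the same workflow at batch_01, batch_02, and so on, and run them in parallel as described above.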

PS. I’ll reimburse all the credits incurred during this process to make up for the trouble.