1200 PDFs with photos to process

@Wasay-Gumloop Reimbursement is not needed. We use our own API.
Your help is my currency/reward/reimbursement ... :wink:


Hey @Donna – Wanted to give you a quick update here: We identified the root cause of the credentials expiring on the large run and have a fix in our staging environment; I’ll let you know once we push that to production. We’re also making broader improvements to the Drive Folder Reader node to prevent such edge cases in the future.

In the meantime, I wanted to check whether splitting the folder into smaller batches worked, or if there’s anything else I can do to help.

Hey,

Thanks for the feedback.

Breaking it up into 4 subfolders worked.

Let me know when I can run it all in one folder again.

Be well


Hey @Donna – We made a few upgrades to our Google Drive Folder Reader node. For large files, the most efficient option is to enable the new Return Drive Links Only option and loop the links over the Drive File Reader node to fetch each file object.

This option is robust: it runs fast and can handle large files.

Let me know if this makes sense and works for you :slightly_smiling_face: – Really appreciate your patience here!

Note: You’ll have to hover over your existing Drive Folder Reader node and upgrade its version, or delete and re-add it, to see the new features.
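Gumloop’s nodes are no-code, but for anyone scripting the same "links first, fetch later" pattern directly, here is a minimal sketch against the Google Drive API v3 Python client. `FOLDER_ID` and `key.json` are placeholders you would supply yourself; this is an illustration of the pattern, not how Gumloop implements the node:

```python
# Sketch: list lightweight file links from a folder first, then fetch
# each file in its own short request, instead of one huge long-lived run.
import io

from google.oauth2.service_account import Credentials
from googleapiclient.discovery import build
from googleapiclient.http import MediaIoBaseDownload

FOLDER_ID = "your-folder-id"  # placeholder: the Drive folder to read

creds = Credentials.from_service_account_file(
    "key.json", scopes=["https://www.googleapis.com/auth/drive.readonly"]
)
service = build("drive", "v3", credentials=creds)

# Step 1: page through the folder, collecting only ids and links.
links = []
page_token = None
while True:
    resp = service.files().list(
        q=f"'{FOLDER_ID}' in parents and trashed = false",
        fields="nextPageToken, files(id, name, webViewLink)",
        pageToken=page_token,
    ).execute()
    links.extend(resp.get("files", []))
    page_token = resp.get("nextPageToken")
    if page_token is None:
        break

# Step 2: download each file individually, so every fetch is an
# independent request that can't outlive short-lived credentials.
for f in links:
    request = service.files().get_media(fileId=f["id"])
    buf = io.BytesIO()
    downloader = MediaIoBaseDownload(buf, request)
    done = False
    while not done:
        _, done = downloader.next_chunk()
    # buf.getvalue() now holds the file bytes for downstream processing
```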


The problem was not so much the size of the files as the number of files. I solved it by splitting the folder into batches. Thanks anyway!
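For anyone who hits the same ceiling, the workaround amounts to chunking one large file list into fixed-size batches and running each batch separately. A minimal sketch; the 300-file chunk size is an illustrative assumption, not a documented Gumloop limit:

```python
# Sketch: split one large file list into fixed-size chunks so each
# chunk can be processed as its own independent run.
def chunked(items, size):
    """Yield successive slices of `items` of at most `size` elements."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

file_ids = [f"file-{n}" for n in range(1200)]  # placeholder ids

for batch_number, batch in enumerate(chunked(file_ids, 300), start=1):
    # each batch (~300 files) would be one separate flow run
    print(f"batch {batch_number}: {len(batch)} files")
```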
