I built a website scraper to extract data from competitors’ sites. It runs in a loop over a list of URLs that I provide as the starting point for the flow. It works perfectly, except that I want to include the original URL in the export list, so I need to “attach” the original URL to the rows extracted from each scraped page.
Gummie was super helpful for the rest of the flow, but couldn’t come up with a working solution for this one. It sounds super easy in theory, but I’ve now spent more time on this part than on the whole flow. Do you have any idea how to achieve this?
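The pattern being described (tagging every extracted row with the URL it came from before the rows are merged for export) can be sketched outside of Gumloop in a few lines. This is only an illustrative sketch of the general technique, not Gumloop's implementation; `scrape_rows` is a hypothetical stand-in for whatever the flow's scraper node returns per URL.

```python
def scrape_rows(url):
    # Hypothetical stand-in scraper: a real flow would fetch and parse the page.
    return [{"title": f"item-{i}"} for i in range(2)]

def scrape_with_source(urls):
    """Scrape each URL and attach the source URL to every extracted row."""
    export_rows = []
    for url in urls:
        for row in scrape_rows(url):
            # Attach the original URL to each row *inside* the per-URL loop,
            # so every row keeps its source, not just rows from the first URL.
            row["source_url"] = url
            export_rows.append(row)
    return export_rows

rows = scrape_with_source(["https://a.example", "https://b.example"])
for r in rows:
    print(r["source_url"], r["title"])
```

The key point is that the URL is attached per iteration, inside the loop; attaching it once outside the loop reproduces the symptom described later in this thread, where only the first URL makes it into the export.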
Hey @blobibblo! If you’re reporting an issue with a flow or an error in a run, please include the run link and make sure it’s shareable so we can take a look.
Find your run link on the history page. Format: https://www.gumloop.com/pipeline?run_id={{your_run_id}}&workbook_id={{workbook_id}}
Make it shareable by clicking “Share” → “Anyone with the link can view” in the top-left corner of the flow screen.
Provide details about the issue — the more context you include, the faster we can troubleshoot.
Hi @Wasay-Gumloop, thanks a lot for your answer. Unfortunately, it doesn’t work the way I want it to. The URL value is only written in the export doc for the first element of the list, not for each of them.
I’ll look into the subflow. Thanks for the support!