I want to scrape all the blogs on this website and create summaries in notion

I want to scrape all the blogs in the blog section of this website and generate summaries that are pasted into a Notion page.

Hey @sagardesai - What have you tried so far and where are you getting stuck? Please also share your run link from the https://www.gumloop.com/history page if you’re facing an issue with your workflow.

https://www.gumloop.com/pipeline?run_id=HQk4YsvanNFJkLcG5Ly5vZ&workbook_id=p28juAhUWqN8RW19CmRqGz I hope I have pasted it in the correct way.

Yup, that is the correct link! I’ve requested access to view the flow. You can also enable ‘anyone with the link can view’ under the share button. Will look into this once I have access :slight_smile:

I’ve changed the access to anyone can view https://www.gumloop.com/pipeline?run_id=HQk4YsvanNFJkLcG5Ly5vZ&workbook_id=p28juAhUWqN8RW19CmRqGz

Thank you @sagardesai! There are a few issues here:

  1. The output of the ‘Website Scraper’ node is not connected to the ‘Ask AI’ node.

  2. You’re passing the content for the page writer node into the wrong input. It should go into the ‘Content’ input instead of ‘Use Existing Notion Page’.

To fix this you just need to map the connections properly. Here’s the correct setup:

Let me know if this works for you.


I am getting this error in the final step

Hi, you can use Error Shield: just search for “Error Shield”, drag and drop it into your flow, and then take your Notion page writer node and drop it inside the Error Shield.

Error Shield basically keeps the process going if you hit an error.

In this case you are getting an error at step 1, so the process cannot even start. Use Error Shield and it will continue the process.

Further, I have created a video specifically for you which shows how to scrape the data, put it into a CSV and into Notion at the same time, and get notified via email if any blog’s data fails to go into Notion.

I will drop that video here soon; it’s still being edited, but if you use Error Shield your current issue will be solved.

Anyone who wants to know how to scrape blog posts and put that data into Notion can follow this YouTube video:

This video shows how to scrape multiple blog pages (for example, pages 1 to 10) in a loop, use Ask AI and Extract AI to pull out the data, send that data to a Google Sheet, and then continue the process to put the data into Notion. Since errors can occur along the way, the flow uses Error Shield to catch them and keep running. The errors are not ignored: you will be informed via email notification, e.g. that a particular post had an issue and was not added to Notion.
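For anyone curious how this loop-plus-Error-Shield pattern looks outside a visual builder, here is a minimal Python sketch of the same idea: iterate over pages, catch any per-post error so the run keeps going, and collect the failures to report at the end. All function names here are hypothetical stand-ins for illustration, not Gumloop nodes or any real API.

```python
def scrape_blog_page(page_number):
    """Hypothetical scraper: returns a list of post dicts for one index page.
    A real script would fetch and parse the page here; this is stub data."""
    return [
        {"title": f"Post {page_number}-{i}",
         # Simulate one broken post so the error path is exercised
         "body": "" if (page_number == 2 and i == 0) else "..."}
        for i in range(2)
    ]

def write_to_notion(post):
    """Hypothetical writer: raises on failure, like the Notion step erroring."""
    if not post["body"]:
        raise ValueError(f"empty body for {post['title']}")
    # A real version would call the Notion API here.

def run_flow(first_page=1, last_page=10):
    failures = []  # posts that errored; reported at the end (the "email")
    for page in range(first_page, last_page + 1):
        for post in scrape_blog_page(page):
            try:
                # "Error Shield": catch any error so the loop keeps running
                write_to_notion(post)
            except Exception as err:
                failures.append((post["title"], str(err)))
    return failures

failed = run_flow(1, 3)
print(f"{len(failed)} post(s) failed")
```

The key point, mirroring Error Shield, is that the try/except wraps only the step that can fail, so one bad post never stops the rest of the run, and the failure list plays the role of the email notification.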

By the way, here is the video, if anyone wants to see the process step by step:

This topic was automatically closed 60 minutes after the last reply. New replies are no longer allowed.