Excited but failing so far

I am a newb here and I thought a quick win would be to run a template. I tried the SEO internal link one, and when I ran it, I got this error.

Extract Data Failed! The prompt of length 1614583 exceeds the maximum token length of 128000 for the model gpt-4o. Try using the ‘Summarizer’ or ‘Similarity Search’ nodes to reduce the size of the input text.

What did I do wrong on trial #1?

Jesse

Hey @MarketingGardens - It seems like the blog content passed to the AI exceeded the model’s maximum token length.

From the history page, could you please share the run link and make it shareable (set access to ‘anyone with the link can view’)? I’ll look into this further and provide a solution.

https://www.gumloop.com/history

Hey @MarketingGardens - I was able to find your failed run. The issue is that the sitemap of the provided URL was so large that it exceeded the AI’s context window.

There are a few ways to overcome this: you can either use a Chunk Text node to break down the scraped content into smaller pieces that the AI can process in a loop, or you can truncate the scraped content to fit the AI’s maximum token length.

I re-ran your flow with a Text Formatter node to truncate the scraped content output. You can find the run link here: https://www.gumloop.com/pipeline?workbook_id=jqd4SWaopf2vEJfHZ2aY4v&run_id=2RHbxziecQTUM4WdXNwQJn
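
For reference, here is a rough Python sketch of what those two options do conceptually. This is not Gumloop’s internal code; it just uses the tiktoken library to illustrate truncating versus chunking, and the 128,000 limit and gpt-4o model come from the error message above.

```python
# Illustrative only: roughly what "truncate to fit" vs. "chunk and loop" means.
import tiktoken

MAX_TOKENS = 128_000  # gpt-4o context limit quoted in the error message
enc = tiktoken.get_encoding("o200k_base")  # tokenizer encoding used by gpt-4o

def truncate_to_budget(text: str, budget: int = MAX_TOKENS) -> str:
    """Keep only the first `budget` tokens (what truncating with a Text Formatter achieves)."""
    tokens = enc.encode(text)
    return enc.decode(tokens[:budget])

def chunk_text(text: str, chunk_size: int = 8_000) -> list[str]:
    """Split the text into chunks small enough to process one at a time in a loop
    (conceptually what a Chunk Text node does)."""
    tokens = enc.encode(text)
    return [enc.decode(tokens[i:i + chunk_size])
            for i in range(0, len(tokens), chunk_size)]
```

Truncating is simpler but drops everything past the limit; chunking keeps all the content but means the AI step runs once per chunk, which uses more credits.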

Sorry to continue on this - I ran it, but it was suggesting some links to sites not on our site. Any idea why this would be happening? Sometimes it went to our site, sometimes to other sites. Do I need to change something so that we only see our own stuff?

Thanks

Jesse

No worries, you can reach out on this thread anytime! Could you please share the run link associated with this from the https://www.gumloop.com/history page?

Also, please set the access to ‘anyone with the link can view’.

This is the run link that I can copy and paste - https://www.gumloop.com/pipeline?run_id=e9eAZTX5KystV9dqPap3tw&workbook_id=jqd4SWaopf2vEJfHZ2aY4v

This is the Gumloop flow I was using (the one you had sent over to me - thanks for that too): https://www.gumloop.com/pipeline?workbook_id=jqd4SWaopf2vEJfHZ2aY4v&run_id=2RHbxziecQTUM4WdXNwQJn

What else would be helpful? I used the one you sent.

Then I got this Google Sheet, which seems to have a lot of non-ITS outputs, and I am unsure what step I am missing in the process. To be fair, this is a new platform, so I very well may be missing something super simple. There were a bunch of alphaperformance and gumloop URLs: Gumloop Internal Linking Template - Google Sheets

Thanks

Jesse

I tried it with another site and it failed on that one too, with this error. ;-)

Catchall error in exp_backoff_retry: JSONDecodeError
Unterminated string starting at: line 1 column 63644 (char 63643)
Retrying…

Could you please share the run link from the https://www.gumloop.com/history page? And please set the access to ‘anyone with the link can view’.

https://www.gumloop.com/pipeline?run_id=gUMV5AGVH7Y9AP8FbM3mJd&workbook_id=jqd4SWaopf2vEJfHZ2aY4v - Does this work?

Yes, but the error here is different from the one you shared earlier. It seems like you’ve exceeded your credits for this month.

The Google Sheet used in the template can have data from other runs as well.

I’d recommend replacing the Google Sheet with your own sheet in this subflow: https://www.gumloop.com/pipeline?workbook_id=jqd4SWaopf2vEJfHZ2aY4v&tab=2

Yes, I see this. I stopped the run, but I guess it kept going. No worries there. It does lead me to the question - how can I add the API key for my ChatGPT account so I am not using so many credits? ;-) I don’t mind what it did, because it did get one of the sites done successfully… so I am not complaining.


You can add your OpenAI API key here: https://www.gumloop.com/settings/profile/credentials

This topic was solved and automatically closed 3 days after the last reply. New replies are no longer allowed.