What triggers the issue?

This issue reports the URLs that took too long to respond to our crawler's request.

Why is it important?

If a page fails to load and times out, it can hurt your site's crawlability and reduce the number of pages that can be indexed. It can also mean a poor experience for your visitors.

How to fix it?

Many factors can cause the Timeout issue. First, check whether the issue still persists.

Open the reported pages in your browser and see whether you get the same error. If so, consult your server logs for more details.
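If you prefer to check from the command line rather than the browser, a small script like the one below can time a request and flag pages that exceed a given limit. This is a minimal sketch using only Python's standard library; the URL and the 5-second limit are placeholders, not values used by our crawler:

```python
import socket
import time
import urllib.error
import urllib.request


def check_url(url, timeout_s=5.0):
    """Fetch `url` and return (status, elapsed_seconds).

    `status` is the HTTP status code on success, or the string
    "timeout" if the server did not respond within `timeout_s`.
    """
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout_s) as resp:
            return resp.status, time.monotonic() - start
    except (socket.timeout, TimeoutError):
        # The read timed out after the connection was established.
        return "timeout", time.monotonic() - start
    except urllib.error.URLError as exc:
        # URLError can wrap a timeout raised while connecting.
        if isinstance(exc.reason, (socket.timeout, TimeoutError)):
            return "timeout", time.monotonic() - start
        raise
```

Run it against each reported URL (for example, `check_url("https://example.com/slow-page")`); a `"timeout"` result confirms the problem is still reproducible outside our crawler.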

Alternatively, contact your webmaster or your hosting provider's support. Chances are your server is misconfigured, overloaded, or simply slow. If that is not the case, make sure your site's extensions (plugins, modules, add-ons, etc.) are working correctly.

Another possible root cause is that your server started blocking our bot in the middle of the crawl. To resolve this, whitelist our IP addresses and, if needed, reduce the crawl speed in the project settings.
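If your server runs nginx with an IP-based deny rule, whitelisting can look like the sketch below. The IP addresses shown are placeholders; substitute the crawler IPs listed in our documentation, and adapt the idea to Apache's `Require ip` or your firewall if you block traffic elsewhere:

```nginx
location / {
    # Placeholder addresses -- replace with the crawler's published IPs.
    allow 192.0.2.10;
    allow 192.0.2.11;
    # Keep your existing deny rules after the allow entries.
    deny  198.51.100.0/24;
    allow all;
}
```

Because nginx evaluates `allow`/`deny` directives in order, the crawler IPs must appear before any `deny` rule that would otherwise match them.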
