Site Crawl Requests
During a site crawl, a program systematically analyzes your website's content and structure to collect data that you can use to identify technical areas for improvement, including search engine optimization (SEO). With a site crawl, you can:
- understand how search engines like Google find pages within your site
- identify technical issues like duplicate content and missing metadata
- troubleshoot why a page may not appear in a crawl or search result
- detect whether pages load slowly or aren't optimized for mobile
- request that specific identifiers be crawled across your site, extending your auditing beyond our internal site search
While you could do some of this manually, site crawls are far more efficient and thorough, providing quick snapshots that make it practical to monitor large sites or several websites at once.
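To make the metadata checks above concrete, here is a minimal sketch of how a crawler might audit a set of pages for missing or duplicate metadata. This is an illustration only, built on Python's standard-library `html.parser`; the `MetadataAuditor` and `audit_pages` names are assumptions for this example, not part of any product API, and a real crawl would also fetch the pages over the network.

```python
from html.parser import HTMLParser


class MetadataAuditor(HTMLParser):
    """Collects the <title> and <meta name="description"> from one page."""

    def __init__(self):
        super().__init__()
        self.title = None
        self.description = None
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title = (self.title or "") + data


def audit_pages(pages):
    """pages: dict mapping URL -> HTML source.

    Returns a list of (url, issue) findings for missing titles,
    missing meta descriptions, and duplicate titles across pages.
    """
    findings = []
    titles = {}
    for url, html in pages.items():
        parser = MetadataAuditor()
        parser.feed(html)
        if not parser.title:
            findings.append((url, "missing <title>"))
        else:
            titles.setdefault(parser.title.strip(), []).append(url)
        if not parser.description:
            findings.append((url, "missing meta description"))
    # Flag every page that shares a title with another page.
    for title, urls in titles.items():
        if len(urls) > 1:
            for url in urls:
                findings.append((url, f"duplicate <title>: {title!r}"))
    return findings
```

Running `audit_pages` over the fetched HTML of each URL yields a flat issue list you can aggregate per page, which is the kind of snapshot a full crawl report summarizes at site scale.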