Check whether the pages and directories you blocked via the robots.txt file are actually being crawled.

Pages that should not be crawled

You can also search for pages that aren't blocked via the robots.txt file but shouldn't be prioritized from a crawling perspective. This includes pages that aren't indexed, are canonicalized, or redirect to other pages. For this, you can run a list analysis of the exported URLs with your favorite SEO crawler (e.g., Screaming Frog or OnPage.org) to add information about their noindexation (meta robots) and canonicalization status, on top of the HTTP status you will already have from the logs.

6. What is your Googlebot crawl rate over time, and how does it correlate with response times and error pages served?

Unfortunately, the data available through Google Search Console's "Crawl Stats" report is too generic (and not necessarily accurate enough) to act on. So, by analyzing your own logs to identify Googlebot's crawl rate over time, you can validate that information and segment it to make it actionable. With Loggly, you can display Googlebot activity over the desired time range as a line chart.
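The two checks above can be sketched in a few lines of Python. This is a minimal example, not a full log-analysis pipeline: it assumes access logs in the common Apache/Nginx combined format, and the `DISALLOWED_PREFIXES` values are hypothetical stand-ins for whatever you actually block in robots.txt. It counts Googlebot requests per day (crawl rate) and flags any hits to paths that should have been blocked.

```python
import re
from collections import Counter

# Assumes Apache/Nginx combined log format; adjust the pattern to your server.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<date>[^:]+):[^\]]+\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ '
    r'"[^"]*" "(?P<agent>[^"]*)"'
)

# Hypothetical examples of paths disallowed in your robots.txt.
DISALLOWED_PREFIXES = ("/admin/", "/cart/")

def crawl_report(lines):
    """Count Googlebot requests per day and flag hits to disallowed paths."""
    per_day = Counter()
    blocked_hits = []
    for line in lines:
        m = LOG_PATTERN.match(line)
        if not m or "Googlebot" not in m.group("agent"):
            continue  # keep only Googlebot entries
        per_day[m.group("date")] += 1
        if m.group("path").startswith(DISALLOWED_PREFIXES):
            blocked_hits.append(m.group("path"))
    return per_day, blocked_hits

sample = [
    '66.249.66.1 - - [10/Mar/2024:12:00:01 +0000] "GET /admin/login HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [10/Mar/2024:12:00:02 +0000] "GET /products/shoe HTTP/1.1" 200 2048 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '10.0.0.5 - - [10/Mar/2024:12:00:03 +0000] "GET /cart/view HTTP/1.1" 200 1024 "-" "Mozilla/5.0"',
]
per_day, blocked = crawl_report(sample)
print(per_day)   # Googlebot requests per day
print(blocked)   # disallowed paths Googlebot still fetched -> ['/admin/login']
```

A non-empty `blocked_hits` list means Googlebot is still requesting pages you intended to exclude, which is exactly the signal to investigate before trusting the Search Console numbers.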
Check that crawlers are correctly accessing the relevant pages and resources in each case. I included this one specifically for websites that serve different content to users in different locations. In some cases, these websites unknowingly provide a poor experience for crawlers with IP addresses from other countries, blocking them outright or allowing them to access only one version of the content (preventing them from exploring other versions). Google now supports locale-aware crawling to discover content specifically intended to target other countries, but it's still a good idea to make sure all your content is being crawled. Otherwise, it may indicate that your website is not configured correctly.

After segmenting by user agent, you can then filter by IP to verify that the site is serving the correct version of each page to crawlers from the relevant countries.

Googlebot's IP

For example, look at what happens when I try to access the NBA site with a Spanish IP address.
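A minimal sketch of this "segment by user agent, then check by country" step is below. The IP-to-country mapping here is hypothetical; in practice you would resolve crawler IPs with a GeoIP database (e.g., MaxMind's) and feed in real log entries. The idea is simply to group the URLs each country's crawlers were served, so mismatches with your intended locale versions stand out.

```python
from collections import defaultdict

# Hypothetical IP-to-country mapping; in practice, resolve this
# with a GeoIP database rather than a hard-coded dictionary.
IP_COUNTRY = {
    "66.249.66.1": "US",
    "66.249.64.2": "FR",
}

def versions_served(entries):
    """Group crawled paths by the crawler's country so you can verify
    that each locale is served its intended version.
    `entries` is an iterable of (ip, user_agent, path) tuples."""
    by_country = defaultdict(set)
    for ip, agent, path in entries:
        if "Googlebot" not in agent:
            continue  # segment by user agent first
        country = IP_COUNTRY.get(ip, "unknown")
        by_country[country].add(path)
    return by_country

entries = [
    ("66.249.66.1", "Googlebot/2.1", "/en/home"),
    ("66.249.64.2", "Googlebot/2.1", "/en/home"),  # FR crawler also sees /en/
]
print(dict(versions_served(entries)))
```

In this toy data, the French-IP crawler is served the same `/en/` path as the US one, which is the kind of pattern that would prompt a closer look at your geolocation logic.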
I get 302-redirected to the basketball subdirectory of L'Equipe, a local sports newspaper in France.

Redirection based on IP location

I've explained in the past why I'm not a fan of automatic redirects based on international targeting. However, if they are meant to exist for business (or any other) reasons, it is important to give consistent behavior to all crawlers coming from the same country, search bots and other user agents alike, by ensuring that SEO best practices are followed in each case.

Final Thoughts

I hope reviewing these questions, and explaining how to answer them using log analysis, will help you expand and strengthen your technical SEO efforts.

The opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.