What logs show you
Which pages Googlebot crawls, how often it returns, and which pages it ignores. Indispensable when you're debugging crawl budget on a large site.
Getting the data
Most hosts expose raw access logs. Filter to Googlebot user-agents (verify the source IPs via reverse DNS if spoofed bots are a concern), aggregate hits by URL, and look at crawl frequency over a 90-day window.
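A minimal sketch of the filter-and-aggregate step, assuming your server writes the common Apache/Nginx "combined" log format; the regex, function name, and sample lines are illustrative, and matching on the user-agent string alone does not rule out spoofed bots:

```python
import re
from collections import Counter

# Assumed "combined" log format:
# ip - - [timestamp] "METHOD url protocol" status bytes "referrer" "user-agent"
LINE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?:GET|HEAD|POST) (?P<url>\S+) [^"]*" '
    r'\d{3} \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

def googlebot_hits(lines):
    """Count hits per URL for requests whose user-agent claims Googlebot."""
    counts = Counter()
    for line in lines:
        m = LINE.match(line)
        if m and "Googlebot" in m.group("ua"):
            counts[m.group("url")] += 1
    return counts

# Two sample lines: one Googlebot request, one ordinary browser request.
sample = [
    '66.249.66.1 - - [10/Jan/2024:00:00:01 +0000] "GET /page-a HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.9 - - [10/Jan/2024:00:00:02 +0000] "GET /page-a HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0"',
]
print(googlebot_hits(sample).most_common(10))  # → [('/page-a', 1)]
```

In practice you'd stream the real log file into `googlebot_hits` and sort by hit count to see which URLs eat the crawl budget.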
What you'll find
Typically, around a third of the site turns out to be over-crawled (thin or duplicate pages) while the pages you actually care about aren't crawled enough. That mismatch is your map for internal-linking and pruning decisions.