Slides from my talk at YoastCon 2023, where I shared some of the lessons I've learned about Google's inner workings from years of working with news publishers.
VIPs: webpages that have a high change frequency and/or are seen as highly authoritative
• News website homepages & key section pages
• Main purpose = discovery of valuable new content
  ➢ i.e., news articles
• Googlebot rarely re-crawls newly discovered URLs
  ➢ Unless they're new VIPs
1. Turn your key pages into VIPs; make them more valuable by:
  - Improving link value
  - Increasing change frequency
2. Use robots.txt disallow rules to manage indexing & ranking
  ➢ For example, block Googlebot-Image to prevent product images from showing in Image search (see the robots.txt sketch below)
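To make that second takeaway concrete, here's a minimal robots.txt sketch; the /images/products/ path is a hypothetical example, not something from the slides:

```
# Hypothetical sketch: block Google's image crawler from product images
# so they stop appearing in Google Image search.
User-agent: Googlebot-Image
Disallow: /images/products/

# All other crawlers keep full access.
User-agent: *
Disallow:
```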
Google stores its index in three tiers:
1. RAM storage
  ➢ Pages that need to be served quickly and frequently
  ➢ Includes news articles but also other popular content
2. SSD storage
  ➢ Pages that are regularly served in SERPs but aren't super popular
3. HDD storage
  ➢ Pages that are rarely (if ever) served in SERPs
1. Make indexing easy for Googlebot
  ➢ Put all your critical content in the HTML source (see the HTML sketch below)
  ➢ Don't rely on rendering to load valuable content
2. There's no such thing as a duplicate content penalty
  ➢ However, duplicate content on a single site means the site is competing with itself… and that's stupid.
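As a minimal sketch of that first takeaway (an illustrative skeleton, not markup from the talk), the headline and body text ship in the initial HTML response, so Googlebot doesn't have to wait for JavaScript rendering to see them:

```html
<!-- Illustrative skeleton: critical content lives in the HTML source -->
<!DOCTYPE html>
<html lang="en">
  <head>
    <title>Example headline</title>
  </head>
  <body>
    <article>
      <h1>Example headline</h1>
      <p>The full article text is in the initial HTML response, so
         indexing doesn't depend on client-side rendering.</p>
    </article>
    <!-- Anti-pattern to avoid: an empty shell such as
         <div id="app"></div> filled in later by JavaScript. -->
  </body>
</html>
```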
SERP features have a big impact on ranking & traffic, and none more than Top Stories & other news boxes
• Top Stories are triggered when two conditions are met:
  ➢ A sudden increase in search volume
  ➢ A sudden increase in publishing volume
1. Understand the intent behind the keywords you're targeting
  ➢ Don't try to rank content that doesn't match the intent
  ➢ If there are SERP features, try to get into those
2. Improve your Knowledge Graph presence
  ➢ Category pages = topic hubs
  ➢ Use internal linking to your advantage
  ➢ Schema.org markup helps Google connect the dots (see the JSON-LD sketch below)
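To illustrate the Schema.org point, a news article page could carry JSON-LD along these lines; all values below are placeholders, and the slides don't prescribe a specific set of properties:

```html
<!-- Hypothetical JSON-LD sketch: NewsArticle markup that ties the article
     to its author and publisher entities, helping Google connect the dots -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "headline": "Example headline",
  "datePublished": "2023-01-01T09:00:00+00:00",
  "author": { "@type": "Person", "name": "Example Author" },
  "publisher": { "@type": "Organization", "name": "Example Publisher" }
}
</script>
```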
Learn how Google crawls websites … how Google indexes content … how Google evaluates quality and authority … how SERP features impact ranking and traffic … and much, much more.