
Everything Google Lied to Us About

Michael King

March 06, 2024

Transcript

  1. Just Create Great Content, We’ll Figure it Out
     He said it so many times, I’m not going to cite a source.
  2. Oh, Here’s a Thread Where the Ads Team is Asking the Search Team to Juice Ads
  3. Using Expired Domains Doesn’t Give You an Advantage
     “…you can get that domain into Google; you just won’t get credit for any pre-existing links.” -Matt Cutts
     “So if the content was gone for a couple of years, probably we need to figure out what this site is, kind of essentially starting over fresh. So from that point of view I wouldn’t expect much in terms of kind of bonus because you had content there in the past. I would really assume you’re going to have to build that up again like any other site.” -John Mueller
  4. Very, very few SEO tools are offering analysis that aligns with how Google works today.
  5. Semantic Search is Fueled by High-Density Embeddings
     …just like large language models. A lot of what Google has always been trying to do is now more real.
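
     To make the embedding idea concrete, here is a minimal sketch of scoring documents against a query in a shared embedding space. The library (sentence-transformers) and the model name are illustrative assumptions; the slide does not say which models Google uses.

     # Hypothetical example: dense-embedding similarity between a query and documents.
     # Library and model choice are assumptions for illustration only.
     from sentence_transformers import SentenceTransformer, util

     model = SentenceTransformer("all-MiniLM-L6-v2")  # small open-source embedding model

     query = "how do expired domains affect rankings"
     docs = [
         "Buying an expired domain will not carry over credit for its old links.",
         "Our favorite pasta recipes for weeknight dinners.",
     ]

     # Encode the query and documents into dense vectors, then score by cosine similarity.
     query_emb = model.encode(query, convert_to_tensor=True)
     doc_embs = model.encode(docs, convert_to_tensor=True)
     scores = util.cos_sim(query_emb, doc_embs)[0]

     for doc, score in zip(docs, scores):
         print(f"{float(score):.3f}  {doc}")
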
  6. Google Has Been Public About These Models Since 2020
     This is why some of the search results feel so weird: a re-ranking of documents with a mix of lexical and semantic signals. https://arxiv.org/pdf/2010.01195.pdf
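
     As a toy illustration of mixing lexical and semantic signals when re-ranking, here is a short sketch. The 50/50 weighting, helper names, and example scores are assumptions; this is not Google’s actual formula.

     # Hypothetical hybrid re-ranking: blend a lexical score with a semantic score.
     def hybrid_score(lexical: float, semantic: float, lexical_weight: float = 0.5) -> float:
         """Blend a BM25-style lexical score with an embedding similarity, both normalized to 0-1."""
         return lexical_weight * lexical + (1 - lexical_weight) * semantic

     # Candidate pages with made-up scores for illustration.
     candidates = [
         {"url": "/guide-to-link-building", "lexical": 0.82, "semantic": 0.31},
         {"url": "/what-links-really-mean", "lexical": 0.44, "semantic": 0.77},
     ]

     reranked = sorted(candidates, key=lambda c: hybrid_score(c["lexical"], c["semantic"]), reverse=True)
     for c in reranked:
         print(c["url"], round(hybrid_score(c["lexical"], c["semantic"]), 3))
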
  7. The way SEOs build links doesn’t work as well as we think it does anymore.
  8. Dense Retrieval
     You remember “passage ranking”? This is built on the concept of dense retrieval, wherein there are more embeddings representing more of the query and the document to uncover deeper meaning.
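
     A rough sketch of the passage-level idea: embed passages rather than whole documents and match the query against each passage. Again, the library and model are assumptions used only to illustrate the concept.

     # Hypothetical passage-level dense retrieval.
     from sentence_transformers import SentenceTransformer, util

     model = SentenceTransformer("all-MiniLM-L6-v2")

     passages = [
         "Passage ranking lets a long page rank for a query answered deep in the text.",
         "Our company was founded in 2010 and has offices in three cities.",
     ]

     query_emb = model.encode("what is passage ranking", convert_to_tensor=True)
     passage_embs = model.encode(passages, convert_to_tensor=True)

     # The best-scoring passage stands in for the page, echoing “passage ranking.”
     scores = util.cos_sim(query_emb, passage_embs)[0]
     best = int(scores.argmax())
     print(f"{float(scores[best]):.3f}  {passages[best]}")
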
  9. You need to focus on building more relevant links rather than higher volumes of links.
  10. All of this is a huge problem because SEO software still operates on the lexical model.
  11. Indexing is Also Harder
      It’s not being talked about as much, but indexing has gotten a lot harder since the Helpful Content update. You’ll see a lot more pages in the “Discovered - currently not indexed” and “Crawled - currently not indexed” statuses than you did previously because the bar is higher for what Google deems worth capturing from the web.
  12. I Believe This is a Function of Information Gain
      Conceptually, as it relates to search engines, Information Gain is the measure of how much unique information a given document adds to the ranking set of documents. In other words, what are you talking about that your competitors are not?
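
      One way to get a feel for this is a back-of-the-napkin proxy: score a candidate page by how dissimilar it is to the documents already ranking, so pages that add something new score higher. This is only an illustrative stand-in, not Google’s actual information-gain measure; the library and model are assumptions.

      # Hypothetical information-gain proxy: 1 minus the max similarity to already-ranking docs.
      from sentence_transformers import SentenceTransformer, util

      model = SentenceTransformer("all-MiniLM-L6-v2")

      already_ranking = [
          "Ten tips for writing great content.",
          "Why great content wins at SEO.",
      ]
      candidate = "Original log-file data showing how crawl frequency changed after pruning."

      ranking_embs = model.encode(already_ranking, convert_to_tensor=True)
      candidate_emb = model.encode(candidate, convert_to_tensor=True)

      # Higher value = less overlap with what is already in the result set.
      information_gain_proxy = 1 - util.cos_sim(candidate_emb, ranking_embs).max().item()
      print(round(information_gain_proxy, 3))
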
  13. In conclusion: “More content” is no longer inherently the most effective approach because there’s no guarantee of traffic from Google.
  14. I’m Leaving Y’all with Four Actions Today
      1. How to Prune Your Content
      2. How to Use LLMs
      3. How to Appear in LLM-Based Search Engines
      4. How to Think About Relevance
  15. Aleyda Has a Process
      Aleyda’s workflow is a great place to work through whether your content should be pruned or not. https://www.aleydasolis.com/en/crawling-mondays/how-to-prune-your-website-content-in-an-seo-process-crawlingmondays-16th-episode/
  16. Content Decay
      The web is a rapidly changing organism. Google always wants the most relevant content, with the best user experience, and the most authority. Unless you stay on top of these measures, you will see traffic fall off over time. Measuring this content decay is as simple as comparing page performance period over period in analytics or GSC. Just knowing content has decayed is not enough to be strategic.
  17. Interpreting the Content Potential Rating
      80 - 100: High Priority for Optimization
      60 - 79: Moderate Priority for Optimization
      40 - 59: Selective Optimization
      20 - 39: Low Priority for Optimization
      0 - 19: Minimal Benefit from Optimization
      If you want quick and dirty, you can prune everything below a 40 that is not driving significant traffic.
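
      The same bands expressed as a small helper, along with the “quick and dirty” prune rule. The traffic threshold is an assumption; choose one that fits your site.

      # The slide's CPR bands as a lookup, plus a hypothetical prune rule.
      def cpr_priority(score: float) -> str:
          if score >= 80:
              return "High Priority for Optimization"
          if score >= 60:
              return "Moderate Priority for Optimization"
          if score >= 40:
              return "Selective Optimization"
          if score >= 20:
              return "Low Priority for Optimization"
          return "Minimal Benefit from Optimization"

      def quick_and_dirty_prune(score: float, monthly_clicks: int, traffic_threshold: int = 100) -> bool:
          """Prune anything below a CPR of 40 that is not driving significant traffic (threshold assumed)."""
          return score < 40 and monthly_clicks < traffic_threshold

      print(cpr_priority(72), quick_and_dirty_prune(18, 12))
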
  18. Combining CPR with pages that lost traffic helps you understand if it’s worth it to optimize.
  19. Step 1: Pull the Rankings Data from Semrush
      Organic Research > Positions > Export
  20. Step 2: Pull the Decaying Content from GSC
      Google Search Console is a great source to spot content decay by comparing the last three months year over year. Filter for those pages where the Click Difference is negative (smaller than 0), then export.
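
      If you prefer to do this step in code, here is a short sketch of the same filter applied to an exported comparison report. The file name and column names are assumptions about how your export is labeled.

      # Hypothetical example: filter a GSC year-over-year export down to decaying pages.
      import pandas as pd

      df = pd.read_csv("gsc_last_3_months_yoy.csv")  # assumed export: one row per page, both periods
      df["click_difference"] = df["clicks_current"] - df["clicks_previous"]

      # Keep only pages whose clicks dropped, worst decay first.
      decaying = df[df["click_difference"] < 0].sort_values("click_difference")
      decaying.to_csv("decaying_pages.csv", index=False)
      print(decaying[["page", "clicks_previous", "clicks_current", "click_difference"]].head())
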
  21. The Output is a List of URLs Prioritized by Action
      Each URL is marked as Keep, Revise, Kill, or Review based on the keyword opportunities available and the effort required to capitalize on them. Sorting the URLs marked as “Revise” by Aggregated SV and CPR will give you the best opportunities first.
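
      The prioritization step from the slide, sketched in code. The workbook file name and column labels ("Action", "Aggregated SV", "CPR", "URL") are assumptions standing in for however your sheet is organized.

      # Hypothetical example: surface the best "Revise" opportunities first.
      import pandas as pd

      wb = pd.read_csv("content_pruning_workbook.csv")
      revise = wb[wb["Action"] == "Revise"]

      # Sort by aggregated search volume, then CPR, descending.
      priorities = revise.sort_values(["Aggregated SV", "CPR"], ascending=False)
      print(priorities[["URL", "Aggregated SV", "CPR"]].head(10))
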
  22. Get your copy of the Content Pruning Workbook: https://ipullrank.com/cpr-sheet
  23. How to Kill Content
      Content may be valuable for channels outside of Organic Search. So, killing it is about changing Google’s experience of your website to improve its relevance and reinforce its topical clusters. The best approach is to noindex the pages themselves, nofollow the links pointing to them, and submit an XML sitemap of all the pages that have changed. This will yield the quickest recrawling and reconsideration of the content.
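
      For the last step, here is a minimal sketch of generating an XML sitemap for just the pages that changed, so they get recrawled quickly. The URLs and output file name are placeholders.

      # Hypothetical example: build a sitemap of only the changed (killed or revised) pages.
      from datetime import date
      from xml.sax.saxutils import escape

      changed_urls = [
          "https://example.com/old-post-1",
          "https://example.com/old-post-2",
      ]

      entries = "\n".join(
          f"  <url><loc>{escape(u)}</loc><lastmod>{date.today().isoformat()}</lastmod></url>"
          for u in changed_urls
      )
      sitemap = (
          '<?xml version="1.0" encoding="UTF-8"?>\n'
          '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
          f"{entries}\n"
          "</urlset>"
      )

      with open("changed-pages-sitemap.xml", "w") as f:
          f.write(sitemap)
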
  24. How to Revise Content
      Review content across the topic cluster
      Use co-occurring keywords and entities in your content
      Add unique perspectives that can’t be found on other ranking pages
      Answer common questions
      Answer the People Also Ask questions
      Restructure your content using headings relevant to the above
      Add relevant structured markup
      Expand on previous explanations
      Add authorship
      Update the dates
      Make sure the needs of your audiences are accounted for
      Add to an XML sitemap of only updated pages
  25. How to Review Content
      The sheet marks content that has a low Content Potential Rating and a minimum of 500 in monthly search volume as “Review” because these may be long-tail opportunities that are valuable to the business. You should take a look at the content you have for that landing page and determine if you think the effort is worthwhile.
  26. It’s Not Difficult to Build with Llama Index
      # Assumed imports for the snippet shown on the slide (advertools plus LlamaIndex);
      # exact import paths vary across LlamaIndex versions.
      import advertools as adv
      from llama_index.core import VectorStoreIndex
      from llama_index.core.query_engine import CitationQueryEngine

      sitemap_url = "[SITEMAP URL]"
      sitemap = adv.sitemap_to_df(sitemap_url)
      urls_to_crawl = sitemap['loc'].tolist()

      ...  # crawl the URLs and load them into `documents` (elided on the slide)

      # Make an index from your documents
      index = VectorStoreIndex.from_documents(documents)

      # Set up your index for citations
      query_engine = CitationQueryEngine.from_args(
          index,
          # indicate how many document chunks it should return
          similarity_top_k=5,
          # here we can control how granular citation sources are; the default is 512
          citation_chunk_size=155,
      )

      response = query_engine.query("YOUR PROMPT HERE")
  27. Blocking LLMs is a Mistake
      Appearing in these places will be recognized as brand awareness opportunities very soon.
  28. Embrace Structured Data
      There are three models gaining popularity:
      1. KG-enhanced LLMs - the language model uses a KG during pre-training and inference
      2. LLM-augmented KGs - LLMs do reasoning and completion on KG data
      3. Synergized LLMs + KGs - a multilayer system using both at the same time
      Source: Unifying Large Language Models and Knowledge Graphs: A Roadmap, https://arxiv.org/pdf/2306.08302.pdf
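
      As a concrete way to embrace structured data, here is a minimal sketch of emitting JSON-LD for an article page so knowledge-graph-aware systems can consume it. The field values are placeholders based on this deck.

      # Hypothetical example: minimal Article JSON-LD block.
      import json

      article_jsonld = {
          "@context": "https://schema.org",
          "@type": "Article",
          "headline": "Everything Google Lied to Us About",
          "author": {"@type": "Person", "name": "Michael King"},
          "datePublished": "2024-03-06",
      }

      print('<script type="application/ld+json">')
      print(json.dumps(article_jsonld, indent=2))
      print("</script>")
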
  29. Thank You | Q&A
      Mike King - Chief Executive Officer, iPullRank (@iPullRank) - [email protected] - Award Winning, #GirlDad
      Get Your SGE Threat Report: https://ipullrank.com/sge-report
      Play with Raggle: https://www.raggle.net
      Download the Slides: https://speakerdeck.com/ipullrank