

Semantic Content Networks - Ranking Websites on Google with Semantic SEO

Semantic Content Networks are semantic networks of things with relations, directed graphs, attributes, and facts. Every declaration and proposition in semantic search represents an entry in a factual repository. Open Information Extraction is a methodology for creating a semantic network. The Knowledge Base and the Knowledge Graph are connected to each other through this factual repository: the Knowledge Base represents the factual repository with descriptions and triples, while the Knowledge Graph is the visualized version of the Knowledge Base. A semantic network is a knowledge representation. A semantic network is prominent for understanding the value of an individual node, or the similar and distant members of the same semantic network. Semantic networks are implemented for search engine result pages, and they are used to create factual, connected question-and-answer networks. A semantic network can be represented by, and consist of, textual and visual content, and it includes lexical parts and lexical units.
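
To make the triple-and-graph relationship concrete, here is a minimal Python sketch, with invented entities and helper names, of a knowledge base stored as (entity, predicate, value) triples and the knowledge-graph view derived from it.

```python
# Minimal sketch: a knowledge base stored as (entity, predicate, value) triples,
# and a "knowledge graph" view derived from it as a labeled, directed adjacency map.
# All entities and facts below are illustrative examples, not any search engine's data.

knowledge_base = [
    ("Eagle", "is_a", "Bird"),
    ("Eagle", "capable_of", "Flying"),
    ("Eagle", "capable_of", "Hunting"),
    ("Bird", "is_a", "Animal"),
]

def build_knowledge_graph(triples):
    """Turn the flat triple store into a directed graph: node -> [(relation, node), ...]."""
    graph = {}
    for subject, predicate, obj in triples:
        graph.setdefault(subject, []).append((predicate, obj))
    return graph

graph = build_knowledge_graph(knowledge_base)
print(graph["Eagle"])  # [('is_a', 'Bird'), ('capable_of', 'Flying'), ('capable_of', 'Hunting')]
```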

Links, nodes, and labels are the parts of a semantic network. The procedural parts are constructors, destructors, writers, and readers; they expand the semantic network and refresh the information in it.

The structural part has the links and nodes. The semantic part has the associated meanings, which are represented as labels.
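
The structural, semantic, and procedural split can be sketched in a few lines of Python. The class and method names below are illustrative, not a standard API: nodes and links form the structural part, the relation labels are the semantic part, and the constructor/destructor/writer/reader methods stand in for the procedural part.

```python
# Minimal sketch of the parts described above; the naming follows this deck, not a library.

class SemanticNetwork:
    def __init__(self):
        self.nodes = set()       # structural part: nodes
        self.links = {}          # structural part: (source, target) -> relation label

    # --- procedural part ---
    def construct(self, source, relation, target):
        """Constructor: add a labeled link (and its nodes) to the network."""
        self.nodes.update([source, target])
        self.links[(source, target)] = relation      # semantic part: the relation label

    def destruct(self, source, target):
        """Destructor: remove a link, e.g. when a fact is refuted or outdated."""
        self.links.pop((source, target), None)

    def write(self, source, relation, target):
        """Writer: refresh the label of an existing link."""
        self.links[(source, target)] = relation

    def read(self, node):
        """Reader: return all outgoing relations of a node."""
        return [(rel, tgt) for (src, tgt), rel in self.links.items() if src == node]

network = SemanticNetwork()
network.construct("table leg", "part_of", "table")
print(network.read("table leg"))  # [('part_of', 'table')]
```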

Semantic content networks have different types of relations.

Semantic content networks have "and/OR" trees.
Semantic Content Networks have "Relation Type Examples" with "is/A" hierarchies.
Semantic Content Networks have "is/Part" Hierarchy.
Inheritance, reification, multiple inheritance, range queries and values, intersection search, complex semantic networks, inferential distance, partial ordering, semantic distance, and semantic relevance are concepts from semantic networks.
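
As a rough illustration of semantic or inferential distance, the sketch below counts IS/A hops between two nodes in a tiny hand-made hierarchy; real systems weight edges and relation types rather than simply counting hops.

```python
# Minimal sketch of "semantic distance" as the number of IS/A hops between two nodes.
# The small taxonomy here is invented purely for illustration.
from collections import deque

is_a = {
    "crimson": ["purple"], "violet": ["purple"], "purple": ["color"],
    "red": ["color"], "color": ["property"],
}

def semantic_distance(start, goal, hierarchy):
    """Breadth-first search over IS/A links; returns hop count, or None if unrelated."""
    queue, seen = deque([(start, 0)]), {start}
    while queue:
        node, depth = queue.popleft()
        if node == goal:
            return depth
        for parent in hierarchy.get(node, []):
            if parent not in seen:
                seen.add(parent)
                queue.append((parent, depth + 1))
    return None

print(semantic_distance("crimson", "color", is_a))  # 2
```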

Semantic networks help in understanding semantic search engines and semantic SEO, because they contain all of the related lexical relations, semantic role labels, entity-attribute pairs, and triples of entity, predicate, and object. Search engines prefer to use semantic networks to understand the factuality of a website. Knowledge-based Trust is related to semantic networks because it provides a factuality-based trust score to balance PageRank. Knowledge-based Trust was introduced by Xin Luna Dong. Ramanathan V. Guha is another inventor from Google and Schema.org. He focuses on the semantic web and semantic search engine behavior, and he explored and invented many of the mechanisms behind semantic search engines.
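
The balancing idea can be illustrated with a one-line blend of a factuality score and a link-based score. The weights and the linear combination below are illustrative assumptions; the actual Knowledge-based Trust work estimates source accuracy with a probabilistic model rather than a fixed formula.

```python
# Minimal sketch of a factuality score balancing a link-based score.
# Scores, weights, and the linear blend are invented for illustration only.

def blended_source_score(knowledge_based_trust, pagerank, alpha=0.5):
    """Blend a factuality-based trust score with a link-based popularity score."""
    return alpha * knowledge_based_trust + (1 - alpha) * pagerank

# A low-popularity but highly accurate source can outrank a popular, inaccurate one.
print(round(blended_source_score(knowledge_based_trust=0.92, pagerank=0.20), 3))  # 0.56
print(round(blended_source_score(knowledge_based_trust=0.35, pagerank=0.70), 3))  # 0.525
```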

Semantic Content Networks is a concept used by Koray Tuğberk GÜBÜR, the founder of Holistic SEO & Digital. Expressing semantic content networks helps to shape semantic networks via textual and visual content pieces. Semantic content networks are helpful for shaping the truth on the open web, and they help a search engine rank a website even if there is no external PageRank flow.

Koray Tuğberk GÜBÜR

April 15, 2022



Transcript

  1. Basics - Index • Knowledge Base • Knowledge-based Trust •

    Semantic Networks • Frames • Frame Scripts • Concept Map • Topic Map • Concept Graph • FrameNet • Semantic Content Networks
  2. Knowledge Base • Knowledge-base consists of the related existences, and

    their dimensions. • A knowledge-base is different from a knowledge-graph. • The knowledge-graph is the graphical version of the knowledge-base. • A knowledge base involves facts, or misinformation, with different labels. • A search engine defines every website as a “knowledge-base”. • A search engine has its own “knowledge-base” too. • So, if we have a knowledge-base, what is a semantic network? • A Semantic Network is the different variations of connections from the same elements of the knowledge-base. • “X” can be defined with 99 different phrase variations. • “X” can have 6 context domain connections, 2 Knowledge Domain connections, and 9 entities from 3 different entity types. • For all these connections, a search engine can find different answers, or propositions, labeled with different IDs. Programmable search engines Inventor Ramanathan V. Guha Assignee Google LLC
  3. Knowledge Base • Semantic Network is the different variations of

    connections from the same elements of the knowledge-base. • “X” can be defined with 99 different phrase variations. • “X” can have 6 context domain connections, 2 Knowledge Domain connections, and 9 entities from 3 different entity types. • For all these connections, a search engine can find different answers, or propositions, labeled with different IDs. Detecting spam search results for context processed search queries Inventor Ramanathan V. Guha Assignee Google LLC
  4. Knowledge Base • To understand the Semantic Search Engines, “this

    year”, we will focus on Ramanathan V. Guha. • “Detecting spam search results for context processed search queries” focuses on using queries to understand spam documents. • The “Product Line” and “Product Graph” are relevant here. • A “fake product name”, or a wrong match of a “product-brand” association, can make a web document “spammy”. • A knowledge base is necessary to prevent it. Detecting spam search results for context processed search queries Inventor Ramanathan V. Guha Current Assignee Google LLC
  5. Knowledge Base • This one focuses on “aggregating context”, rather

    than processing it. • It tries to use a specific mechanism to define the knowledge of facts. • “Query Definitions, Aspects and Rephrasification” are relevant here. • A knowledge base can be used to understand overall context of a specific entity. • A search engine can filter pages based on their “context hints”, like design, structured data, relational layout elements, or currencies, statistics, measurement units, and brands, products, or other types of concepts. Aggregating context data for programmable search engines Inventor Ramanathan V. Guha Assignee Google LLC
  6. Knowledge Base • A search engine, naturally, can classify the

    search results. • The thing here is that Ramanathan V. Guha focuses on a knowledge-base-like “representation of things” to classify the documents. • There are many methods for document classification. • But here, it categorizes the query; if a document ranks well for that query, it assumes that the document should be labeled with the query’s category. • A knowledge base is necessary to classify the query, and query classification is necessary for document classification. Classifying search results Inventor Ramanathan V. Guha Assignee Google LLC
  7. Knowledge Base • Search Result Ranking Based on Trust focuses

    on “factuality” and “accuracy” of a source. • It tries to rank documents based on their comprehensiveness. • Not a surprise to see “cancer” as an example in the patent. • It explains the trust signals. • It focuses on “vertical knowledge domain websites” for further trust signal information. • He tries to use the words around the links for labeling them. • Negative reviews are thought to be used as well. Search result ranking based on trust Inventor Ramanathan V. Guha Current Assignee Google LLC
  8. Knowledge Base • “Query Identification” is different from “Query Aspects”

    and “Query Definitions”. • An early example of “query categorization” and the “query-category map”. • It uses “query-page” matching from the “query logs”, which signals the prominence of historical data. • The “QP Tuples” are the “Query-Page” Tuples. • It uses “query logs”; thus, if you are not in the query logs despite your quality, you can be overridden. • Knowledge Base Relevancy: It has to categorize the things in the queries and documents to increase precision. Query identification and association Inventors Ramanathan V. Guha, Shivakumar Venkataraman, Vineet Gupta, Gokay Baris Gultekin, Pradnya Karbhari, Abhinav Jalan Assignee Google LLC
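
As a rough sketch of the QP Tuple idea described on this slide (not the patent's actual pipeline), the snippet below pairs logged queries with the pages chosen for them and lets each page inherit the categories of its queries; the log rows and the category map are invented.

```python
# Minimal sketch of "Query-Page" tuples and a query-category map; all data is illustrative.

query_logs = [
    {"query": "best hiking boots", "clicked_page": "example.com/boots-guide"},
    {"query": "waterproof hiking boots", "clicked_page": "example.com/boots-guide"},
    {"query": "trail running shoes", "clicked_page": "example.com/trail-shoes"},
]
query_categories = {
    "best hiking boots": "Outdoor Footwear",
    "waterproof hiking boots": "Outdoor Footwear",
    "trail running shoes": "Outdoor Footwear",
}

# Query-Page tuples extracted from the logs.
qp_tuples = [(row["query"], row["clicked_page"]) for row in query_logs]

# A page inherits the categories of the queries it is chosen for.
page_categories = {}
for query, page in qp_tuples:
    page_categories.setdefault(page, set()).add(query_categories[query])

print(page_categories)
# {'example.com/boots-guide': {'Outdoor Footwear'}, 'example.com/trail-shoes': {'Outdoor Footwear'}}
```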
  9. Knowledge Base • Generating specialized search results just for a

    pattern of queries. • I focused on “question-query-document” templates last year. • So, I will skip it. • But, it uses “knowledge base” for understanding the data representation. Generating specialized search results in response to patterned queries Inventor Ramanathan V. Guha Current Assignee Google LLC
  10. Knowledge Base • It is a good feeling to follow

    Mr. Ramanathan. • Because, he tries to find answers for the important questions. • “Corroboration of Web Answers”. • Co-identification of objects across heterogeneous information sources is prominent, because it provides a “cross-check” for the facts. • The ranking search results based on trust, and “co-identification of objects” should be handled together. • Query-augmentation is processed in this patent by “replacing predicates”. • Predicate is the heaviest term in the query. • Thus, “pattern matching” is relevant here too. Heuristic co-identification of objects across heterogeneous information sources Inventor Ramanathan V. Guha Current Assignee Google LLC
  11. Knowledge Base • Providing search results is an important patent

    for the basics. • It explains the “entity types”, attributes, and HTML Code connections. • It focuses on the “templates” for specific situations. • The examples are for “restaurants”. • Thus, in the next slides, remember the concept, “frame script”. Providing search results Inventor Ramanathan V. Guha Current Assignee Google LLC
  12. Knowledge Base • Most of the basic Google functions are

    processed by Ramanathan V. Guha. • This is important, because it makes a differentiation between “worthy result”, and “regular result”. • A worthy result is a signal for the quality. • Even, Google Alert, today, can be used to find some “worthy” results. Customized web summaries and alerts based on custom search engines Inventor Ramanathan V. Guha Current Assignee Google LLC
  13. Who is Ramanathan V. Guha? Why do we focus on

    him? • He is important. • Creator of Schema.org. • Creator of RSS. • Creator of RDF. • He is a “Google Fellow”, which means “distinguished engineer”. • He has been cited in many books.
  14. Who is Ramanathan V. Guha? Why do we focus on

    him? • Semantic Web, Making Sense, and other specific organizations or events cite him regularly. • He mainly focuses on “Contexts”.
  15. Who is Ramanathan V. Guha? Why do we focus on

    him? • He has written a book. • He published many speeches about Google and Open Web. • At the right, you see his book, “Contexts – A formalization and some applications”. • And a screenshot from “Light at the end” presentation. • He explains why the “net” can’t be owned. • It is good to hear it from a Google Fellow.
  16. Who is Ramanathan V. Guha? Why do we focus on

    him? • DataCommons is important from his speech. • DataCommons is a Knowledge Base. • Google integrated the DataCommons into their internal search system. • They defined the project as below.
  17. Who is Ramanathan V. Guha? Why do we focus on

    him? • Since he was able to organize four different search engines, it is quite a big success. • But the main prominence of Ramanathan is that he thought of using a “knowledge base” for every search engine problem. • Then, he tried to solve the problem. • We can mention Justin Boyan and the Data Highlighter tool, or the Machine Learning/Rule-based System transition at Google, as well.
  18. Who is Ramanathan V. Guha? Why do we focus on

    him? • These are some of the entity types, and their possible relation types in a structured data sample. • Categories for occurrence, or categories by domains represent how these things help search engines to structure their own “semantic networks” and knowledge bases.
  19. Who is Ramanathan V. Guha? Why do we focus on

    him? • Ramanathan also worked with the webmasters. • He visualized the structured data as a form of “semantic network”. • Thus, from the Knowledge Base Concept to the Relational Databases, he has a prominent place. • He created some guides for SEOs too.
  20. Who is Ramanathan V. Guha? Why do we focus on

    him? • Vocabulary represents the richness of the content in terms of unique words. • But, Ramanathan uses it for different topic grasping algorithms. • On his talk, during 2014, he mentioned Scholarly Works, Comics, Serials, and Sports. • After that, Schema.org started to launch new SD Types.
  21. Who is Ramanathan V. Guha? Why do we focus on

    him? • Vocabulary represents the richness of the content in terms of unique words. • But, Ramanathan uses it for different topic grasping algorithms. • On his talk, during 2014, he mentioned Scholarly Works, Comics, Serials, and Sports. • After that, Schema.org started to launch new SD Types.
  22. Who is Ramanathan V. Guha? Why do we focus on

    him? Aggregated Knowledge Graph => Does it sound similar from the patents?
  23. Who is Ramanathan V. Guha? Why do we focus on

    him? Aggregated Knowledge Graph => Does it sound similar from the patents?
  24. Knowledge Base • This is another prominent sample that I

    preferred to put with some common inventors. • “Alternatives of an answer, and a question” represent the “slight differences in wording”. • And it requires a “knowledge base”, since it focuses on Information Extraction. • Understand the difference between an “Occurrence Tracking Search Engine” and a “Perception-based Search Engine”. Determining question and answer alternatives Inventor David Smith, Engin Cinar Sahin, George Andrei Mihaila Current Assignee Google LLC
  25. Knowledge Base • From the same fellows, you have a

    “Context”. • They cite Ramanathan in the invention. • They mention the “Topics” and the “Movie/Book” intersection of Harry Potter. Determining question and answer alternatives Inventor David Smith, Engin Cinar Sahin, George Andrei Mihaila Current Assignee Google LLC
  26. Knowledge-based Trust • Knowledge-based Trust involves a PageRank balancing effect

    via Accurate, Unique and Comprehensive Information on sources. • “Ranking search results with Trust” via a Knowledge Base evolved into “Knowledge-based Trust” in the same year. Explained Before
  27. Knowledge-based Trust • Knowledge-based Trust involves a PageRank balancing effect

    via Accurate, Unique and Comprehensive Information on sources. • A single “broken sentence” can decrease the granularity of the semantic network that will be extracted. Explained Before
  28. Knowledge-based Trust • “Extractor”, or “Constructor”… • It is a

    representation of the trust-measurement framework. • The “Directed Graph” model explains how the “value pairs” are being extracted. Explained Before
  29. Knowledge-based Trust • “Source Accuracy” represents the trust that a

    search engine can give for visibility. • The algorithm design simply focuses on two main steps. • Extracting the triples. • Comparing them to the others, especially the ones in the Knowledge Base. Explained Before
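
The two steps named on this slide can be sketched directly: extract a source's triples, then compare them with the triples already in the knowledge base. The accuracy formula below (the share of extracted triples confirmed by the KB) is a deliberate simplification of the KBT model, and the facts are invented.

```python
# Minimal sketch of source accuracy: extracted triples checked against a knowledge base.

knowledge_base = {
    ("Earth", "orbits", "Sun"),
    ("Sun", "is_a", "Star"),
}

def source_accuracy(extracted_triples, kb):
    """Fraction of a source's checkable triples that the knowledge base confirms."""
    if not extracted_triples:
        return 0.0
    confirmed = sum(1 for triple in extracted_triples if triple in kb)
    return confirmed / len(extracted_triples)

page_triples = [
    ("Earth", "orbits", "Sun"),   # confirmed by the KB
    ("Sun", "is_a", "Planet"),    # conflicts with the KB
]
print(source_accuracy(page_triples, knowledge_base))  # 0.5
```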
  30. Knowledge-based Trust • An example of the “Triple Correctness Prediction”.

    • “Freebase Triple” is important. • Freebase was bought by Google to forge the Knowledge Graph. • The “triple” from the Knowledge Base is compared to the triples from the SERP. Explained Before
  31. Knowledge-based Trust • Seed Sources and Tail Sources. • Remember

    the RankMerge. • Remember the Topic-sensitive PageRank. • KBT is not an alternative to PageRank. • It is a Hybrid Search Engine from Hypertextual to Semantic. Explained Before
  32. Semantic Networks • Semantic Networks are the knowledge representations with

    the relational databases. • A knowledge base can have multiple semantic networks from the same knowledge base dimensions. • All the knowledge representation schemes should be human-understandable. • Semantic Networks and Frame Representations are connected to each other.
  33. Semantic Networks • Knowledge Bases represent the Knowledge. • Thus,

    Knowledge Representation is a central term. • Logical Representation is “difficult for common sense”. • Logical Representation has uncertainty, and it has time constraints, along with beliefs. • Logic representations require “reasoning”. Basically, it is “True ∨ True = True”. • These rules are not good for extracting the facts, and storing them in a relational database. Entity ID of m076hq
  34. Semantic Networks • Semantic Networks represent factual knowledge in classes

    of objects and their properties. • Declarative knowledge representations, and their relations, are stored. • Static representations (knowledge that stays the same) are more common for semantic networks. • Semantic Networks imitate human memory. • Structured Object Representation includes properties, nodes and edges within Directed Graphs. • Thus, it is called an “Association Graph”. site:https://artsandculture.google.com/entity/
  35. Semantic Networks • General Networks shouldn’t be considered as Semantic

    Networks. • Every arc in a knowledge representation has to have a relation type. This is a general network.
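
A minimal sketch of that distinction, using networkx and invented nodes: the general network only connects nodes, while the semantic network attaches a relation type to every arc.

```python
# Minimal sketch: a general network vs. a semantic network with labeled arcs.
import networkx as nx

general_network = nx.DiGraph()
general_network.add_edge("Eagle", "Bird")  # a bare connection, no meaning attached

semantic_network = nx.DiGraph()
semantic_network.add_edge("Eagle", "Bird", relation="is_a")           # concept membership
semantic_network.add_edge("Eagle", "Hunting", relation="capable_of")  # attribute relation

for source, target, data in semantic_network.edges(data=True):
    print(source, f"--{data['relation']}-->", target)
# Eagle --is_a--> Bird
# Eagle --capable_of--> Hunting
```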
  36. Semantic Content Networks • Lexical Parts of a Semantic Content

    Network: • Nodes – objects from different classes, or individuals. • Links – relationships between the objects. • Labels – nodes and links, together. • Procedural Part: • Constructors • Destructors • Writers • Readers Structural Part has the “links and nodes”. Semantic part has the “associated meanings” based on the knowledge domain.
  37. Semantic Content Networks • Relation Type Examples: IS/A Hierarchy Is/A

    relation type is relevant to the Taxonomy, Lexical Relations, Semantic-Dependency Tree, and Augmented Information Retrieval. All these are relevant to Set Theory.
  38. Semantic Content Networks • Relation Type Examples: IS/Part Hierarchy Is/A

    relation type is relevant to the Taxonomy, Lexical Relations, Semantic-Dependency Tree, and Augmented Information Retrieval. All these are relevant to Set Theory. Lexical Relations help an SEO to focus on meaning, rather than query search demand (keyword volume).
  39. Semantic Networks 1. Hypernym: The general word of another word.

    For example, the word color is the hypernym of red, blue, and yellow. 2. Hyponym: The specific word of another general word. For example, crimson, violet, and lavender are hyponyms of purple. And purple is a hyponym of color. 3. Antonym: The opposite of another word. For example, big is the antonym of small, and early is the antonym of late. 4. Synonym: A replacement of another word without changing the meaning. For example, huge is a synonym of big, and initial is a synonym of early. 5. Holonym: The whole of a part. For example, the table is the holonym of the table leg. 6. Meronym: The part of a whole. For example, a feather is a meronym of a bird. 7. Polysemy: A word with different related meanings, such as love as a verb and as a noun. 8. Homonymy: A word with accidentally different meanings, such as bear as an animal and as a verb, or bank as a riverbank or a financial organization.
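
Most of these relations can be queried directly from WordNet. The sketch below uses NLTK and assumes the WordNet corpus has been downloaded; the returned relatives come from WordNet's graph, so they may differ from the hand-picked examples above.

```python
# Minimal sketch of lexical relations via WordNet.
# Requires: pip install nltk, then a one-time nltk.download("wordnet").
from nltk.corpus import wordnet as wn

purple = wn.synsets("purple", pos=wn.NOUN)[0]
print("hypernyms:", [s.name() for s in purple.hypernyms()])  # more general terms, e.g. chromatic_color
print("hyponyms:", [s.name() for s in purple.hyponyms()])    # more specific shades of purple

big = wn.synsets("big", pos=wn.ADJ)[0]
print("antonyms:", [lemma.name() for lemma in big.lemmas()[0].antonyms()])  # e.g. ['small']

bird = wn.synsets("bird", pos=wn.NOUN)[0]
print("part meronyms:", [s.name() for s in bird.part_meronyms()])  # parts of a bird, per WordNet
```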
  40. Semantic Networks Inheritance helps search engines to Extract Information by

    augmenting. Semantic Dependency Tree is a representation of inheritance. Source: Ryan J. Urbanowicz.
  41. Semantic Networks Every arc defines a binary relation. For the

    sentence, “John’s mother has sued her husband at the age of 42.” The relations here are “predicate”, “age”, and “marriage status”. Source: Ryan J. Urbanowicz.
  42. Semantic Networks Arc Types: X is a Y. => Individual

    to Concept => X is a tiger. X is a kind of Y. => Concept to Concept => X is a wild kind of animal. X is related to Y. => Individual to Individual => X Tiger leads the pack of Y Tigers. Source: Ryan J. Urbanowicz.
  43. Semantic Networks Reification: Relationships can be turned into a frame.

    A frame can turn non-binary relationships into an object. A giver A recipient An object Source: Ryan J. Urbanowicz. Learn, Semantic Role Labels.
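
A minimal sketch of reification with invented identifiers: the three-way "giving" relation becomes its own node whose outgoing arcs hold the giver, recipient, and object roles.

```python
# Minimal sketch of reification: a non-binary relation turned into an event node
# with binary role arcs. Identifiers and roles are illustrative.

triples = [
    ("giving_event_1", "instance_of", "Giving"),
    ("giving_event_1", "giver", "John"),
    ("giving_event_1", "recipient", "Mary"),
    ("giving_event_1", "object", "book_1"),
]

def describe(event_id, facts):
    """Read the reified event back as a dictionary of semantic roles."""
    return {predicate: value for subject, predicate, value in facts if subject == event_id}

print(describe("giving_event_1", triples))
# {'instance_of': 'Giving', 'giver': 'John', 'recipient': 'Mary', 'object': 'book_1'}
```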
  44. Semantic Networks Partial Ordering – How should the semantic network

    be shaped? Source: Ryan J. Urbanowicz. Learn, Semantic Role Labels.
  45. Semantic Networks Unified Medical Language System is an Example of

    Semantic Network. Source: Ryan J. Urbanowicz. Learn, Semantic Role Labels.
  46. Semantic Networks A website can make the semantic network of

    the search engine richer and more accurate. It can be the authority of truth, not just the topic.
  47. Semantic Networks Frames: Representation of Stereotypes. Frames have inheritance. Frames

    can have different data structures. Frames are the typical semantic networks for the same types of things. Frames are useful for training AI. At the right, you see a Frame. Every class and sub-class has specific attributes. All the relation types are clear. In this case, Knowledge Bases are more similar to the Frames. Frames have “slots” and “fillers” to be filled in. Values can be static knowledge. A “daemon” can exceed the threshold and make the knowledge “dynamic”.
  48. Semantic Networks Every frame can have slots. Every

    slot can have a “value”, “default”, “range”, “if-added”, and “if-needed” property. Frames help an SEO to create specific types of document templates for specific types of query templates. Facets can include different types of semantic networks. • Represents the “default”. • Every facet here can be an entirely different semantic network. • A central entity will reflect the core of the specific entity’s identity.
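
The slot facets listed above can be sketched as a small frame structure; the Restaurant frame, its slots, and the if-needed daemon below are illustrative assumptions, not a standard library.

```python
# Minimal sketch of a frame whose slots carry value, default, range, and if-needed facets.

class Slot:
    def __init__(self, value=None, default=None, allowed_range=None, if_needed=None):
        self.value, self.default = value, default
        self.allowed_range, self.if_needed = allowed_range, if_needed

    def get(self):
        if self.value is not None:
            return self.value
        if self.if_needed is not None:   # "if-needed" daemon: compute the value on demand
            return self.if_needed()
        return self.default              # fall back to the stereotype's default

restaurant_frame = {
    "cuisine": Slot(default="unspecified"),
    "price_level": Slot(value=2, allowed_range=(1, 4)),
    "is_open": Slot(if_needed=lambda: True),  # stand-in for a live opening-hours lookup
}

print({name: slot.get() for name, slot in restaurant_frame.items()})
# {'cuisine': 'unspecified', 'price_level': 2, 'is_open': True}
```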
  49. Semantic Networks • Situational Frame: Valid only for a situation.

    • Action Frame: Actions for the situation. • Combination Frame: Combination of situational and action frames. • General Frame: Hierarchy included class models. • These frames are for typical situations, the unique or rare entities might not be put into a frame. • At the right side, you will see another frame, but it is only for the restaurants. • In the next slide, you will see an example of FrameNet.
  50. Semantic Networks • “Jack went to a restaurant. He decided

    to order steak. He sat there and waited for a long time. Finally, he got angry and left.” • “What was Jack waiting for?” • “Why did he get angry?” • A Frame Script can shape a knowledge base for different situations. • The same script can extract all the information for similar situations. • Remember the “Restaurant” sample from Ramanathan V. Guha.
  51. Semantic Networks • Concept Maps: To create a Semantic Network,

    a concept map is necessary. • Every existence is connected to at least one concept. • “Eagle” can be connected to “flying”, or “hunting” concept.
  52. Semantic Networks • Topic Maps: Does it sound similar? • It

    reflects any kind of concept, existing things, or hypergraphs for associations, occurrences, and real-world things. • A topic map connects concepts to each other along with real-world entities. A topic map can be extracted from a content network, and turned into a semantic network. • This brings us to the “Concept Graphs”. • A concept graph is again for patterns. • At the right side, you will see an example of a concept graph.
  53. Semantic Networks • A FrameNet is different from a frame.

    • A FrameNet includes the semantic role labels, but in the form of more detailed patterns. • For example, the sentence “The Unit found two men hiding in the trunk of a car…” includes a “sought entity”, a “perceiver”, and a “location”. • If we use a sentence like “The Unit found three thieves that are hiding in the storage”, the relations are the same. • It is similar to the “Frame Scripts”. • But FrameNet is actually a programming output that helps machines read semantic knowledge.
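
A FrameNet-style annotation of the slide's example can be written out as plain data. The frame and frame-element names below follow FrameNet's style but are reproduced from memory, so treat them as illustrative rather than the official inventory.

```python
# Minimal sketch of a FrameNet-style annotation: a frame, a target word, and frame
# elements tied to text spans. Names are illustrative, not the official FrameNet labels.

annotation = {
    "sentence": "The Unit found two men hiding in the trunk of a car",
    "frame": "Locating",
    "target": "found",
    "frame_elements": {
        "Perceiver": "The Unit",
        "Sought_entity": "two men hiding",
        "Location": "in the trunk of a car",
    },
}

# A second sentence with different words fills the same roles, as the slide notes.
annotation_2 = dict(
    annotation,
    sentence="The Unit found three thieves hiding in the storage",
    frame_elements={"Perceiver": "The Unit",
                    "Sought_entity": "three thieves hiding",
                    "Location": "in the storage"},
)
print(annotation["frame_elements"].keys() == annotation_2["frame_elements"].keys())  # True
```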
  54. Semantic Content Networks • Semantic Content Networks are the next

    steps that come after the Topical Map. • A Semantic Content Network contains a Context Vector, Context Hierarchy, Structure, and Connection. • A Semantic Content Network is the textual and visual form of a semantic network. • A semantic content network can help a search engine to change the truth in the knowledge base. • A search engine shapes the SERPs based on the knowledge-base. • A semantic content network can have a high level of knowledge-based trust, and the search engine can start to shape the SERP network based on the specific source. • Semantic content networks can’t be imitated. • A semantic content network can be beaten by another one. • A better context vector, hierarchy, connection, and structure are a must.
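
As a closing illustration of the "context vector" comparison (not the representation a search engine actually uses), the sketch below builds crude term-frequency vectors for two content pieces and compares them with cosine similarity.

```python
# Minimal sketch: term-frequency vectors as a crude stand-in for a context vector,
# compared with cosine similarity. The sample texts are invented for illustration.
import math
from collections import Counter

def context_vector(text):
    """Bag-of-words term frequencies; a real system would use richer representations."""
    return Counter(text.lower().split())

def cosine_similarity(vec_a, vec_b):
    dot = sum(vec_a[term] * vec_b[term] for term in vec_a)
    norm_a = math.sqrt(sum(v * v for v in vec_a.values()))
    norm_b = math.sqrt(sum(v * v for v in vec_b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

page_a = context_vector("semantic content network with a clear context hierarchy and structure")
page_b = context_vector("a content network with a clear context hierarchy beats a thinner one")
print(round(cosine_similarity(page_a, page_b), 2))
```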