Ruby On RAG - Building AI Use Cases for Fun and Profit

This talk will cover what RAG is, how it works, and why we should be building RAG applications in Ruby and Rails. I'll share some code examples of what a toy RAG pipeline looks like in native Ruby and do a live demo of a simple RAG application. I'll also share a perspective on why Ruby and Rails are great tools for building LLM applications, and why the languages of the future for building such applications are whatever languages are most natural to you.

Landon Gray

August 03, 2024

Transcript

  1. Overview ◉ Fun ◦ What is RAG ◦ What problem does it solve ◦ How does it work? ▪ Indexing ▪ Retrieval & Generation ◉ Profit - Practical ◦ Demo
  2. “Retrieval Augmented Generation (RAG) is a way to augment an LLM's knowledge with additional information.”
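
To make the idea concrete, here is a minimal sketch of the flow in plain Ruby. It is not the deck's demo code: embed, nearest_chunks, and complete are hypothetical stand-ins for an embeddings call, a vector lookup, and a chat-completion call.

    # Hedged sketch of a RAG round trip; embed, nearest_chunks and complete
    # are hypothetical helpers, not a real library API.
    def answer(question, corpus)
      query_vector = embed(question)                                       # turn the question into a vector
      context      = nearest_chunks(query_vector, corpus, k: 3).join("\n") # retrieve the closest chunks
      prompt       = "Answer using only this context:\n#{context}\n\nQuestion: #{question}"
      complete(prompt)                                                     # generate with the augmented prompt
    end
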
  3. Context Window & Tokens ◉ Token: can be thought of as a piece of a word. ◉ Context Window: the maximum number of tokens that can be used in a single request, inclusive of both input and output tokens. https://platform.openai.com/docs/models https://help.openai.com/en/articles/4936856-what-are-tokens-and-how-to-count-them
  4. Context Window & Tokens ◉ Rule of thumb: 1 token ~= 4 chars in English; 100 tokens ~= 75 words. ◉ Example: 6 tokens = “Ruby is a fine programming language”. ◉ GPT-4o context window = 128,000 tokens ~= 96,000 words. https://platform.openai.com/docs/models https://help.openai.com/en/articles/4936856-what-are-tokens-and-how-to-count-them
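
The rule of thumb above drops straight into plain Ruby. This is only the slide's heuristic (about 4 characters per token), not a real tokenizer, so it will not match an exact tokenizer count.

    # Rough token estimate using the ~4 characters per token rule of thumb.
    def approx_tokens(text)
      (text.length / 4.0).ceil
    end

    approx_tokens("Ruby is a fine programming language") # => 9, a ballpark figure only
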
  5. OpenAI Tokenizer ◉ OpenAI has a tool to help us understand how many tokens a piece of text might contain.
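
For counting tokens from code rather than the web tool, a tokenizer library can be used. This sketch assumes the tiktoken_ruby gem and its Tiktoken.encoding_for_model / encode API; the gem is not mentioned in the deck.

    # Assumes the tiktoken_ruby gem; treat the API names here as an assumption.
    require "tiktoken_ruby"

    enc    = Tiktoken.encoding_for_model("gpt-4")
    tokens = enc.encode("Ruby is a fine programming language")
    puts tokens.length # number of tokens the model would see for this string
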
  6. “Vector embeddings are a way to convert words, sentences, and other data into numbers that capture their meaning and relationships.”
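
Once text is turned into embeddings, retrieval is largely a matter of comparing vectors; cosine similarity is a common choice. The sketch below is plain Ruby, with tiny hard-coded arrays standing in for real embedding vectors.

    # Cosine similarity: 1.0 means the vectors point the same way, values near 0
    # mean they are unrelated. Toy hard-coded vectors stand in for real embeddings.
    def cosine_similarity(a, b)
      dot = a.zip(b).sum { |x, y| x * y }
      dot / (Math.sqrt(a.sum { |x| x * x }) * Math.sqrt(b.sum { |x| x * x }))
    end

    query_vec = [0.1, 0.9, 0.2]
    chunks = {
      "Rails ships with Active Record" => [0.2, 0.8, 0.1],
      "Pasta should be salted well"    => [0.9, 0.1, 0.4]
    }
    best = chunks.max_by { |_text, vec| cosine_similarity(query_vec, vec) }
    puts best.first # the chunk whose embedding is closest to the query's
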
  18. Clarification: Query vs Prompt ◉ Prompt: the final text input ingested by the LLM, which often contains the Query. ◉ Query: input text generated by some human or system.
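
In code the distinction is just two strings: the query is the raw input, and the prompt is the larger text actually sent to the model, with the retrieved context and the query interpolated into it. The template wording and placeholder chunks below are this sketch's, not the deck's.

    # Query: raw input from a human or system. Prompt: the full text the LLM ingests.
    retrieved_chunks = ["Add the gem to your Gemfile.", "Run bundle install."] # placeholder context
    query  = "How do I add a gem to a Rails app?"
    prompt = <<~PROMPT
      Use the context below to answer the question.

      Context:
      #{retrieved_chunks.join("\n")}

      Question: #{query}
    PROMPT
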