
Generative AI For Beginners: Prompt Engineering

Lesson 4 of the Generative AI For Beginners series.
Learn more at https://aka.ms/genai-beginners

In this lesson, we learn what Prompt Engineering is, why it matters, and how we can craft more effective prompts for a given model and application objective. We'll understand core concepts and best practices for prompt engineering - and learn about an interactive Jupyter Notebooks "sandbox" environment where we can see these concepts applied to real examples.

By the end of this lesson we will be able to:
1. Explain what prompt engineering is and why it matters.
2. Describe the components of a prompt and how they are used.
3. Identify best practices and techniques for prompt engineering.
4. Apply learned techniques to real examples, using an OpenAI endpoint.

Nitya Narasimhan, PhD

June 07, 2024

Transcript

  1. PROMPT ENGINEERING FUNDAMENTALS
     GENERATIVE AI FOR BEGINNERS · aka.ms/genai-beginners · Nitya Narasimhan, PhD
  2. Introduction · Recap: Key Concepts · Recap: Provider Options · Prompt Engineering Motivation · Prompt Engineering Techniques · Prompt Engineering Mindset · Summary
  3. CONCEPTS: QUICK RECAP
     Welcome to lesson 4. Before we dive in, we’ll quickly recap the core concepts learned so far – and the platform options available to you if you want to explore the hands-on exercises yourself.
  4. CORE CONCEPTS
     • Prompt – “programs” the model using natural language inputs
     • Response – content “generated” by the model for that input
     • Fabrication – generated content may not be rooted in fact
     • Base LLM – foundational model, trained on massive data
     • Instruction Tuned LLM – fine-tuned for a task
     • Prompt Engineering – iterate & evaluate for quality
     • Chat Completion – generates natural-language responses, multi-turn
     • Embedding – converts text to a numeric representation for the model
     • Tokenization – chunks prompts into tokens, used in predictions
  5. HOW THE MODEL WORKS / HOW THE MODEL PROMPT IS ENGINEERED
     [Diagram] A PROMPT (user request received as a text input) is sent to the LANGUAGE MODEL (e.g., OpenAI GPT-3.5-turbo: 4097 tokens, trained to Sep 2021), which returns a RESPONSE (generated content returned as the user response). The model prompt is assembled from the user prompt, chat history, system context, RAG data, prompt template, variables, and model parameters.
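The diagram can be made concrete with a small, self-contained sketch of how the final model prompt might be assembled in code before it is sent to the model. This is illustrative only: names such as rag_snippets and the chemistry-class content are placeholders, not part of the lesson, and the sketch assumes the OpenAI chat message format (a list of role/content dictionaries).

```python
# Illustrative only: assembling a "model prompt" from the pieces in the diagram
# (system context, chat history, RAG data, prompt template, user request).
system_context = "You are a helpful assistant for a chemistry class."   # placeholder persona
chat_history = [
    {"role": "user", "content": "What is an element?"},
    {"role": "assistant", "content": "An element is a pure substance made of one kind of atom."},
]
rag_snippets = "Gallium melts at 29.76 C."                              # retrieved (RAG) data, if any
prompt_template = "Context:\n{context}\n\nQuestion: {question}"
user_prompt = prompt_template.format(
    context=rag_snippets,
    question="Tell me about the element Gallium",
)

# The final model prompt is the full message list plus the model parameters.
model_prompt = (
    [{"role": "system", "content": system_context}]
    + chat_history
    + [{"role": "user", "content": user_prompt}]
)
model_parameters = {"model": "gpt-3.5-turbo", "temperature": 0.7, "max_tokens": 300}
```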
  6. CHOOSE A PROVIDER (SIGN UP)
     • OpenAI – models include the GPT series, DALL-E, Whisper
     • Azure AI Studio – provides access to both for enterprise
     • Hugging Face – open-source model hub & community
  7. CHOOSE A PATH (EXPLORATION)
     • No-Code – use the provider’s Playground option
     • Code-First – use the provider’s API key or token with a Notebook
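For the code-first path, a minimal sketch of calling an OpenAI chat endpoint from a notebook might look like the following. It assumes the openai Python package (v1 or later) is installed and that an OPENAI_API_KEY environment variable is set; the model name is only an example.

```python
# Minimal code-first sketch: one chat-completion request with the OpenAI SDK.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Tell me about the element Gallium"},
    ],
)
print(response.choices[0].message.content)
```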
  8. Welcome · Key Concepts (Recap) · Provider Setup (Usage Options) · Prompt Components · Prompt Engineering Strategies · Prompt Engineering Mindset · Summary
  9. CONCEPT: TOKENIZATION
     Text prompts are “chunked” into tokens, which helps the model predict the “next token” for completion. Models have maximum token lengths, and model pricing is typically by the number of tokens used.
  10. TOKENIZER
     Try it yourself: https://platform.openai.com/tokenizer
     Show examples: “S Graf”, “Stephanie Graf”, “Stephanie Smithe”, “Stephanie, Smithe”, the “I have a dream..” speech. Any interesting insights?
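Beyond the web tokenizer, you can inspect tokenization locally. The sketch below assumes the tiktoken package (pip install tiktoken); the sample strings are the ones from the slide.

```python
# Inspect how the slide's sample strings split into tokens, using tiktoken.
import tiktoken

enc = tiktoken.encoding_for_model("gpt-3.5-turbo")
for text in ["S Graf", "Stephanie Graf", "Stephanie Smithe", "Stephanie, Smithe"]:
    tokens = enc.encode(text)
    pieces = [enc.decode([t]) for t in tokens]  # the text each token maps back to
    print(f"{text!r} -> {len(tokens)} tokens: {pieces}")
```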
  11. CONCEPT: MODELS ARE STOCHASTIC
     Prompt engineering is needed to ensure quality responses because language models are stochastic. The same prompt will give different responses across providers, models – and time!
  12. TRY PROMPTS
     Try it yourself: https://platform.openai.com/playground
     Prompt: “Tell me about the element Gallium”
     Compare OpenAI vs. Azure OpenAI · OpenAI vs. Hugging Face · GPT-35-Turbo vs. GPT-4. Repeat the same prompt immediately. Try changing the temperature. Try changing the system persona.
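To see the stochastic behaviour from code rather than the Playground, a sketch like the one below repeats the same Gallium prompt at different temperatures. It assumes the openai package (v1+) and an OPENAI_API_KEY; the temperature values, run count, and max_tokens are arbitrary choices for illustration.

```python
# Repeat the same prompt at several temperatures to observe response variation.
from openai import OpenAI

client = OpenAI()
prompt = "Tell me about the element Gallium"

for temperature in (0.0, 0.7, 1.2):
    for attempt in range(2):
        response = client.chat.completions.create(
            model="gpt-3.5-turbo",
            messages=[{"role": "user", "content": prompt}],
            temperature=temperature,
            max_tokens=60,
        )
        text = response.choices[0].message.content
        print(f"temp={temperature} run={attempt}: {text[:80]}...")
```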
  13. CONCEPT: MODELS CAN FABRICATE RESPONSES
     Language models do not “understand” meaning or context. They simply predict the next token based on statistical knowledge, which is limited by training cutoff dates. Fabrication is when they return realistic (predicted) responses that are not factually right. (Try older models.)
  14. FABRICATION
     Try it yourself: https://platform.openai.com/playground
     Ask gpt-35-turbo (Sep 2021 cutoff, older-generation model) and gpt-4-turbo (Sep 2023 cutoff, but it has an “out”):
     • “Who won the 2025 Oscar for Best Picture?”
     • “Tell me the premise for the movie that won the 2025 Best Picture Oscar”
     • “Tell me the premise for the movie that won the 2015 Best Picture Oscar”
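The fabrication exercise can also be run from code. The sketch below asks one of the slide’s questions against two models with different training cutoffs; it assumes the openai package (v1+), an OPENAI_API_KEY, and that both model names are available on your account (adjust them if not).

```python
# Ask about an event past the training cutoff and compare how each model answers.
from openai import OpenAI

client = OpenAI()
question = "Tell me the premise for the movie that won the 2025 Best Picture Oscar"

for model in ("gpt-3.5-turbo", "gpt-4-turbo"):  # older vs. newer cutoff
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": question}],
    )
    print(f"{model} -> {response.choices[0].message.content}")
```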
  15. PROMPT ENGINEERING TECHNIQUES: WRITE CLEAR INSTRUCTIONS
     • Provide details
     • Explore positioning of the instruction
     • Use delimiters
     • Specify output length and format
     • Specify steps to complete the task
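As an illustration (not taken from the lesson), here is one way a prompt could apply these techniques, combining details, delimiters, explicit steps, and a specified output length and format. The article_text placeholder is hypothetical and would be filled in before sending.

```python
# Illustrative prompt applying the "write clear instructions" techniques:
# details, delimiters, explicit steps, and a specified output length and format.
clear_prompt = """You are a science writer for middle-school students.

Summarize the article between the triple quotes for an 8th-grade reader.

Steps:
1. Read the article.
2. List the three most important facts.
3. Write a summary of at most 100 words using those facts.

Output format: a JSON object with keys "facts" (a list of 3 strings) and "summary" (a string).

Article:
\"\"\"
{article_text}
\"\"\"
"""

# Fill the placeholder before sending, e.g.:
# clear_prompt.format(article_text=my_article)
```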
  16. ASSIGNMENT
     Try it yourself: https://platform.openai.com/playground
     Objective: I’m an 8th grade teacher who needs to make lesson plans for my Social Studies classroom.
     First Prompt: “Write a lesson plan for the Civil War”
     Final Prompt: “…”
  17. PROMPT ENGINEERING MINDSET: ITERATE, EVALUATE AND TEMPLATIZE
     • Change the persona
     • Use cues
     • Change the temperature
     • Clear the chat history
     • Change the example count
     • Repeat instructions
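One way to read the “templatize” step: once a prompt works, lift the pieces you kept changing while iterating (persona, cues, temperature) into parameters. The helper below is a hypothetical sketch, assuming the openai package (v1+) and an OPENAI_API_KEY; the persona, cue, and temperature values are only examples tied to the assignment above.

```python
# Hypothetical helper: templatize a working prompt by parameterizing what you iterate on.
from openai import OpenAI

client = OpenAI()

def run_prompt(persona: str, task: str, cue: str = "", temperature: float = 0.7) -> str:
    """Send one chat completion built from a persona, a task, and an optional cue."""
    messages = [
        {"role": "system", "content": persona},
        {"role": "user", "content": f"{task}\n{cue}".strip()},
    ]
    response = client.chat.completions.create(
        model="gpt-3.5-turbo", messages=messages, temperature=temperature
    )
    return response.choices[0].message.content

# Iterate by changing one variable at a time and evaluating the output.
print(run_prompt(
    persona="You are an experienced 8th grade Social Studies teacher.",
    task="Write a lesson plan for the Civil War.",
    cue="Start the plan with: 'Lesson objectives:'",  # a cue nudges the output shape
    temperature=0.3,
))
```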
  18. TEMPLATE LIBRARY: PROMPTS FOR EDU
     Explore the Repo · https://platform.openai.com/playground
     • Think about the application domain.
     • What are the “personas” you will find here?
     • What are the “tasks” they execute?
     • What are the “instructions” you should give?
     Examples: Administrator – Meeting Summary and Agenda Planner · Educator – Diagnostic Quiz Generator · Student – Writing Mentor
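The questions above map naturally onto a template structure. The dictionary below is a hypothetical sketch of one entry for the Educator persona; it is not taken from the actual template repo, just an illustration of pairing a persona and task with parameterized instructions.

```python
# Hypothetical template entry: persona + task + parameterized instructions.
quiz_generator_template = {
    "domain": "education",
    "persona": "Educator",
    "task": "Diagnostic Quiz Generator",
    "instructions": (
        "You are a teacher preparing a diagnostic quiz.\n"
        "Topic: {topic}. Grade level: {grade}.\n"
        "Write {num_questions} multiple-choice questions with one correct answer each, "
        "and list the answer key at the end."
    ),
}

# Fill the template's variables for a concrete prompt.
print(quiz_generator_template["instructions"].format(
    topic="the water cycle", grade="8", num_questions=5))
```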
  19. PROMPT EXAMPLES TO LEARN FROM
     Build Your Intuition