“A complex system designed from scratch never works and cannot be patched up to make it work. You have to start over with a working simple system.” – John Gall, Systemantics
“Amazon S3 is intentionally built with a minimal feature set. The focus is on simplicity and robustness.” – Amazon S3 Press Release, March 14, 2006
Amazon S3: 8 → more than 200 microservices – Mai-Lan Tomsen Bukovec, AWS Vice President (Storage, Streaming, Messaging, and Monitoring/Observability)
The road to generative AI – Next-word prediction with a simple word embedding. Sentence: “The quick brown fox jumps over the lazy _” → “person”?
The road to generative AI – Next-word prediction with ‘memory’, such as Recurrent Neural Networks (RNNs). Sentence: “The quick brown fox jumps over the lazy _” → “rabbit”?
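For contrast with the transformer slides that follow, here is a minimal sketch (mine, not from the deck) of this pre-transformer approach: an embedding layer turns each word into a vector and an LSTM carries the left-to-right “memory”. The class name, toy vocabulary, and dimensions are all illustrative, and the model is untrained, so its guess is essentially arbitrary.

import torch
import torch.nn as nn

class NextWordRNN(nn.Module):
    def __init__(self, vocab_size, embed_dim=64, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)           # word -> vector
        self.rnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, vocab_size)              # hidden state -> next-word scores

    def forward(self, token_ids):
        x = self.embed(token_ids)           # (batch, seq, embed_dim)
        out, _ = self.rnn(x)                # recurrent "memory" over the sequence
        return self.head(out[:, -1, :])     # scores for the word after the sequence

vocab = ["the", "quick", "brown", "fox", "jumps", "over", "lazy", "dog", "rabbit", "person"]
idx = {w: i for i, w in enumerate(vocab)}
model = NextWordRNN(vocab_size=len(vocab))
sentence = ["the", "quick", "brown", "fox", "jumps", "over", "the", "lazy"]
logits = model(torch.tensor([[idx[w] for w in sentence]]))
print(vocab[int(logits.argmax())])          # untrained, so any word is possible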
Transformers: “Attention Is All You Need” – Vaswani, Ashish; Shazeer, Noam; Parmar, Niki; Uszkoreit, Jakob; Jones, Llion; Gomez, Aidan; Kaiser, Lukasz; Polosukhin, Illia (2017). The Transformer model architecture. • “Self-attention” enables models to scale their understanding of relationships between words. • Efficient use of parallel computing.
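To make the “self-attention” bullet concrete, here is a rough single-head sketch of the scaled dot-product attention described in the paper. The weights are random and the dimensions are made up; the point is that every word is scored against every other word in a few matrix multiplications, which is also why the computation parallelizes so well.

import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    Q, K, V = X @ Wq, X @ Wk, X @ Wv            # project each word vector
    scores = Q @ K.T / np.sqrt(K.shape[-1])     # every word scored against every word
    weights = softmax(scores, axis=-1)          # relationship strengths
    return weights @ V                          # context-aware word representations

rng = np.random.default_rng(0)
seq_len, d_model = 9, 16                        # e.g. "The quick brown fox ... lazy _"
X = rng.normal(size=(seq_len, d_model))         # embedded input words
Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)      # (9, 16): one updated vector per word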
The road to generative AI – Next-word prediction with transformer-based networks. Sentence: “The quick brown fox jumps over the lazy _” → “dog” ✓
The road to generative AI – Next-word prediction with transformer-based networks: Large Language Models (LLMs). “The quick brown fox jumps over the lazy dog.” is an English-language pangram – a sentence that contains all the letters of the alphabet. The phrase is commonly used for touch-typing practice.
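The slide's next-word example can be reproduced with an off-the-shelf transformer language model. This is a minimal sketch assuming the Hugging Face transformers package and the public gpt2 checkpoint; it simply asks the model for the single most likely next token.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "The quick brown fox jumps over the lazy"
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits           # (1, sequence_length, vocab_size)

next_id = int(logits[0, -1].argmax())         # highest-scoring next token
print(tokenizer.decode(next_id))              # typically " dog"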
Rise of the machines – Chart of subjective performance against parameter count (model size): BERT 345 million, GPT-2 1.5 billion, GPT-3 175 billion, BLOOM 176 billion, ? (foundation models).
Applications for Generative AI • Text generation (many types) • Game design • Industrial design • Drug design research • Software development…
Prompt engineering – Write a Blog form: Title: “How to win at Tic Tac Toe”; Topic: game theory; Length (paragraphs): 10; [OK] → “You are a journalist for amazing-acme-blogs.com, an online publication for sophisticated game players aged 18 and up. Articles are written to inform and entertain and represent a unique and non-intuitive perspective. Write a blog post titled “How to win at Tic Tac Toe” on the topic of game theory. The article should be no longer than 10 paragraphs long. At the end list 3 prompts that can be used to generate images for the article.”
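One way to read this slide: the form fields are just parameters of a prompt template. Here is a small sketch of that idea; the build_blog_prompt helper is hypothetical, while the template wording comes from the slide.

def build_blog_prompt(title, topic, max_paragraphs, image_prompts=3):
    # Hypothetical template behind the "Write a Blog" form on the slide.
    return (
        "You are a journalist for amazing-acme-blogs.com, an online publication "
        "for sophisticated game players aged 18 and up. Articles are written to "
        "inform and entertain and represent a unique and non-intuitive perspective. "
        f'Write a blog post titled "{title}" on the topic of {topic}. '
        f"The article should be no longer than {max_paragraphs} paragraphs long. "
        f"At the end list {image_prompts} prompts that can be used to generate "
        "images for the article."
    )

print(build_blog_prompt("How to win at Tic Tac Toe", "game theory", 10))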
# Chatbot and pmodel here are an illustrative prompt-helper API from the slide.
chatbot = Chatbot(
    default_reply_length=20,
    bot_persona=pmodel["friendly"],
    users_name="User",
    bots_name="Chatbot",
    topic="tech support",
)

prompt = """This is a friendly and safe chat session between a user and a computer called Chatbot.
## User: I am a real person with a question to ask. Who are you?
## Chatbot: I am a chatbot, and I am here to help.
## User: What kind of questions will you answer?
## Chatbot: I will answer questions about tech support.
## User: My computer has crashed. What should I do?
## Chatbot:"""

A new way to program…
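What actually drives the model is the prompt string above. Below is a rough sketch (mine, not from the deck) of completing that prompt with a generic text-generation model and using the slide's “##” marker to cut the output back to a single Chatbot turn. It assumes the Hugging Face transformers package and the public gpt2 checkpoint, chosen purely for illustration.

from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

def chatbot_reply(prompt, max_new_tokens=40):
    # The model continues the text after "## Chatbot:"; greedy decoding for repeatability.
    full = generator(prompt, max_new_tokens=max_new_tokens, do_sample=False)[0]["generated_text"]
    continuation = full[len(prompt):]
    # Keep only the first turn: stop at the next "##" separator, if any.
    return continuation.split("##")[0].strip()

print(chatbot_reply(prompt))   # `prompt` is the chat transcript defined above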
Where to from here? • Models will keep getting bigger and more sophisticated. • Foundation models and prompt engineering are software development tools. • Agents can execute complex business tasks and invoke APIs. • Inference costs and sustainability come into focus.