Machine Learning for Materials (Lecture 9)

Aron Walsh
February 20, 2024

Slides linked to https://github.com/aronwalsh/MLforMaterials. Updated for 2025.


Transcript

  1. Aron Walsh & Hyunsoo Park, Department of Materials, Centre for Processable Electronics. Machine Learning for Materials: 9. Generative Artificial Intelligence. Module MATE70026
  2. Module Contents 1. Introduction 2. Machine Learning Basics 3. Materials

    Data 4. Crystal Representations 5. Classical Learning 6. Artificial Neural Networks 7. Building a Model from Scratch 8. Accelerated Discovery 9. Generative Artificial Intelligence 10. Recent Advances
  3. Natural Language Processing (NLP) Branch of AI that focuses on

    the interaction between computers and human language Image from https://github.com/practical-nlp
  4. Natural Language Processing (NLP). Branch of AI that focuses on the interaction between computers and human language. Tasks range from easy to hard: spell checking, text classification, information extraction, question answering, conversational agents
  5. Natural Language Processing (NLP) Many statements are ambiguous and require

    context to be understood Let’s eat grandma? Essen wir Oma? 我们吃奶奶的饭? おばあちゃんを食べようか? Mangeons grand-mère? 할머니랑 같이 먹어요? Does the ambiguity of the English phrase translate? (image from DALL-E 3)
  6. Language Models: predictive text. Using GPT-4 via https://github.com/hwchase17/langchain. The model completes “I love materials because ...” from candidate next words (they, their, of ...) ranked by probability, e.g. “strong”, “essential”, “beautiful” after “because they are”. The “temperature” of the text choices controls how the distribution of probabilities is sampled (“creativity”). A low temperature gives “I love materials because they are essential.” A high temperature gives “I love materials because they ignite a symphony of vibrant colors, tantalizing textures, and wondrous possibilities that dance in the realms of imagination, transcending boundaries and embracing the sheer beauty of creation itself.”
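
How temperature works can be made concrete with a short Python sketch. The candidate words and logits below are illustrative, not taken from GPT-4: temperature divides the logits before the softmax, so low values concentrate probability on the top word and high values flatten the distribution.

    import numpy as np

    def sample_next_word(words, logits, temperature=1.0, rng=None):
        """Sample one word from temperature-scaled softmax probabilities."""
        rng = rng or np.random.default_rng()
        scaled = np.array(logits) / temperature  # T < 1 sharpens, T > 1 flattens
        probs = np.exp(scaled - scaled.max())    # numerically stable softmax
        probs /= probs.sum()
        return rng.choice(words, p=probs)

    # Hypothetical candidates after "I love materials because they are ..."
    words = ["essential", "strong", "beautiful"]
    logits = [2.0, 1.2, 0.8]
    print(sample_next_word(words, logits, temperature=0.1))  # almost always "essential"
    print(sample_next_word(words, logits, temperature=2.0))  # more "creative" spread
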
  7. Language Models. Large refers to the size and capacity of the model. It must sample a literary combinatorial explosion: 10⁴ common words in English, 10⁸ two-word combinations, 10¹² three-word combinations, 10¹⁶ four-word combinations. Language must be represented numerically for machine learning models. Token: discrete scalar representation of a word (or subword). Embedding: continuous vector representation of tokens
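
The token/embedding distinction is easy to see in code. A minimal PyTorch sketch (vocabulary size and dimension are illustrative): each discrete token ID indexes a row of a learned embedding matrix.

    import torch
    import torch.nn as nn

    vocab_size, embed_dim = 50_000, 768            # illustrative sizes
    embedding = nn.Embedding(vocab_size, embed_dim)

    token_ids = torch.tensor([57, 77, 46])         # tokens: discrete integer IDs
    vectors = embedding(token_ids)                 # embeddings: continuous vectors
    print(vectors.shape)                           # torch.Size([3, 768])
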
  8. Text to Tokens. Example: “ZnO is a wide bandgap semiconductor” → token IDs [57, 77, 46, 374, 3094, 4097, 43554, 39290, 87836] via https://platform.openai.com/tokenizer. The model looks up 768-dimensional embedding vectors from the (contextual) embedding matrix. Note that Zn is split into two tokens (not ideal for chemistry)
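
The same tokenization can be reproduced locally with OpenAI's tiktoken library, assuming the cl100k_base encoding used by GPT-4-era models:

    import tiktoken

    enc = tiktoken.get_encoding("cl100k_base")   # GPT-4-era tokenizer
    ids = enc.encode("ZnO is a wide bandgap semiconductor")
    print(ids)                                   # token IDs, as on the slide

    # Inspect the split: "Zn" is broken into "Z" + "n"
    print([enc.decode([i]) for i in ids])
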
  9. Large Language Models. T. B. Brown et al, arXiv:2005.14165 (2020). GPT = “Generative Pre-trained Transformer”: generate new content, trained on a large dataset, deep learning architecture. Pipeline: user prompt → encode to a vector → transformer layers analyse relationships between vector components and generate a transformed vector → decode to words → response. Key components of a transformer layer: self-attention (smart focus on different parts of the input) and a feed-forward neural network (to capture non-linear relationships)
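
A minimal sketch of the self-attention step, single head and no masking (shapes are illustrative): each token's output is a probability-weighted mixture of all value vectors, which is the “smart focus” mentioned above.

    import torch
    import torch.nn.functional as F

    def self_attention(x, w_q, w_k, w_v):
        """Single-head scaled dot-product self-attention."""
        q, k, v = x @ w_q, x @ w_k, x @ w_v                 # query, key, value
        scores = q @ k.transpose(-2, -1) / k.shape[-1]**0.5
        weights = F.softmax(scores, dim=-1)                 # focus over tokens
        return weights @ v

    seq_len, d = 9, 768                                     # e.g. 9 tokens, 768-dim
    x = torch.randn(seq_len, d)
    w_q, w_k, w_v = (torch.randn(d, d) / d**0.5 for _ in range(3))
    print(self_attention(x, w_q, w_k, w_v).shape)           # torch.Size([9, 768])
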
  10. Large Language Models. B. Geshkovski et al, arXiv:2312.10794 (2023). Ongoing analysis into the physics of transformer architectures, e.g. rapid identification of strong correlations and approach to mean-field solutions. Layer structure: focus on important inputs → normalise for stability → non-linear transformation → normalise for stability
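
The four labels above map onto a standard transformer block. A sketch with PyTorch built-ins, in the original post-norm arrangement to match the slide's ordering (many modern models apply the normalisation before each sub-layer instead):

    import torch
    import torch.nn as nn

    class TransformerBlock(nn.Module):
        """Attention -> layer norm -> feed-forward -> layer norm."""
        def __init__(self, d=768, heads=12):
            super().__init__()
            self.attn = nn.MultiheadAttention(d, heads, batch_first=True)
            self.norm1 = nn.LayerNorm(d)                    # normalise for stability
            self.ffn = nn.Sequential(nn.Linear(d, 4 * d), nn.GELU(),
                                     nn.Linear(4 * d, d))   # non-linear transformation
            self.norm2 = nn.LayerNorm(d)                    # normalise for stability

        def forward(self, x):
            x = self.norm1(x + self.attn(x, x, x, need_weights=False)[0])  # focus
            return self.norm2(x + self.ffn(x))

    x = torch.randn(1, 9, 768)          # (batch, tokens, embedding dimension)
    print(TransformerBlock()(x).shape)
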
  11. Large Language Models. Image from https://towardsdatascience.com. Deep learning models trained to generate text, e.g. BERT (370M parameters, 2018), GPT-4 (>10¹² parameters, 2023). Recent models include: Llama-3 (Meta, 2024), Gemini-2 (Google, 2024), GPT-4 (OpenAI, 2023), PanGu-5 (Huawei, 2024)
  12. Large Language Models. T. B. Brown et al, arXiv:2005.14165 (2020). Essential ingredients of GPT and related models: diverse data, a deep learning model, validation on tasks
  14. Secret to Practical Success of LLMs. RLHF = Reinforcement Learning from Human Feedback; drawing from @anthrupad. Patterns, focus, alignment
  15. Large Language Models. What are the potential drawbacks and limitations of LLMs such as GPT?
    • Training data, e.g. not up to date, strong bias
    • Context tracking, e.g. limited short-term memory
    • Hallucination, e.g. generation of false information
    • Ownership, e.g. fair use of training data
    • Ethics, e.g. outputs appear human generated
  16. LLMs for Materials Many possibilities, e.g. read a textbook and

    ask technical questions about the content “The Future of Chemistry is Language” A. D. White, Nat. Rev. Chem. 7, 457 (2023)
  17. LLMs for Materials Language models tailored to be fact-based with

    clear context. Applied to one of my review papers https://github.com/whitead/paper-qa
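
Usage is only a few lines. A sketch in the style of earlier paper-qa releases (the API has changed across versions, and the file name and question here are hypothetical):

    from paperqa import Docs

    docs = Docs()
    docs.add("walsh_review.pdf")   # hypothetical local copy of a review paper

    answer = docs.query("What limits the stability of halide perovskites?")
    print(answer.formatted_answer)  # grounded answer with citations to the text
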
  18. LLMs for Materials. L. M. Antunes et al, Nature Comm. 15, 10570 (2024); https://crystallm.com. CrystaLLM: learns to write valid crystallographic information files (CIFs) and to generate new structures
  19. LLMs for Materials. CrystaLLM: learns to write valid crystallographic information files (CIFs) and to generate new structures. Training set: 2.2 million CIFs; validation set: 35,000 CIFs; test set: 10,000 CIFs. Custom tokens: space group symbols, element symbols, numeric digits. 768 million training tokens for a deep-learning model with 25 million parameters. L. M. Antunes et al, Nature Comm. 15, 10570 (2024); https://crystallm.com
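
The custom vocabulary can be illustrated with a toy tokenizer in the spirit of (but not copied from) the CrystaLLM implementation: element symbols stay whole, while numbers are split into single digits.

    import re

    ELEMENTS = ["Zn", "O", "Ba", "Ti"]   # illustrative subset of the vocabulary

    def tokenize_cif_fragment(text):
        """Split a CIF fragment into element-symbol, digit and character tokens."""
        pattern = "|".join(map(re.escape, ELEMENTS)) + r"|\d|\S"
        return re.findall(pattern, text)

    print(tokenize_cif_fragment("Zn1 0.3333 0.6667 0.3780"))
    # ['Zn', '1', '0', '.', '3', '3', '3', '3', '0', '.', '6', ...]
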
  20. LLMs for Materials D. A. Boiko et al, Nature 624,

    570 (2023) Integrate a large language model into scientific research workflows
  21. LLMs for Materials. Combine text and structural data for multi-modal models using contrastive learning: rich representations for text-to-compound generation, with denoising diffusion in Chemeleon. Hyunsoo Park, A. Onwuli and A. Walsh, ChemRxiv (2024)
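
The contrastive objective can be sketched as a CLIP-style loss (a generic illustration, not the actual Chemeleon code): matched text/structure pairs in a batch are pulled together and mismatched pairs pushed apart.

    import torch
    import torch.nn.functional as F

    def contrastive_loss(text_emb, struct_emb, temperature=0.07):
        """Symmetric InfoNCE loss over a batch of matched pairs."""
        text_emb = F.normalize(text_emb, dim=-1)
        struct_emb = F.normalize(struct_emb, dim=-1)
        logits = text_emb @ struct_emb.T / temperature   # pairwise similarities
        labels = torch.arange(len(logits))               # i-th text <-> i-th structure
        return (F.cross_entropy(logits, labels) +
                F.cross_entropy(logits.T, labels)) / 2

    text = torch.randn(8, 256)     # batch of text embeddings (illustrative)
    struct = torch.randn(8, 256)   # matching structure embeddings
    print(contrastive_loss(text, struct))
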
  22. Class Outline Generative AI A. Large Language Models B. From

    Latent Space to Diffusion Dr Hyunsoo Park
  23. How Can AI Understand the World? “I love an apple” must be converted to numbers before the AI (computer) can use it. Fact: AI is not that smart...

    Word   | Index | One-hot encoding | Continuous vector
    I      | 0     | [1, 0, 0, 0, 0]  | [0.7, 0.8, 0.9]
    love   | 1     | [0, 1, 0, 0, 0]  | [0.7, 0.4, 0.3]
    an     | 2     | [0, 0, 1, 0, 0]  | [0.4, 0.5, 0.4]
    apple  | 3     | [0, 0, 0, 1, 0]  | [0.1, 0.3, 0.7]
    banana | 4     | [0, 0, 0, 0, 1]  | [0.1, 0.2, 0.7]
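
The table's point can be checked numerically: under one-hot encoding every pair of words is equally dissimilar, while the continuous vectors (values from the slide) place apple and banana close together.

    import numpy as np

    words = ["I", "love", "an", "apple", "banana"]
    one_hot = np.eye(len(words))               # each row is a 5-d one-hot vector

    vectors = np.array([[0.7, 0.8, 0.9],       # continuous vectors from the slide
                        [0.7, 0.4, 0.3],
                        [0.4, 0.5, 0.4],
                        [0.1, 0.3, 0.7],
                        [0.1, 0.2, 0.7]])

    def cosine(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    print(cosine(one_hot[3], one_hot[4]))      # 0.0: no notion of similarity
    print(cosine(vectors[3], vectors[4]))      # ~0.99: apple and banana are close
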
  24. How Can AI Understand Materials? We need to convert materials into meaningful numerical values for the AI (computer)
  25. Autoencoder P. Baldi and K. Hornik (1989); Schematic adapted from

    https://synthesis.ai Neural network compresses data into a deterministic latent space and reconstructs it back to the original
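
A minimal PyTorch autoencoder sketch (layer sizes are illustrative): the encoder maps each input to a single deterministic latent point, and training minimises the reconstruction error.

    import torch
    import torch.nn as nn

    class Autoencoder(nn.Module):
        """Compress to a small deterministic latent vector, then reconstruct."""
        def __init__(self, n_in=100, n_latent=2):
            super().__init__()
            self.encoder = nn.Sequential(nn.Linear(n_in, 32), nn.ReLU(),
                                         nn.Linear(32, n_latent))
            self.decoder = nn.Sequential(nn.Linear(n_latent, 32), nn.ReLU(),
                                         nn.Linear(32, n_in))

        def forward(self, x):
            z = self.encoder(x)        # deterministic latent code
            return self.decoder(z)     # reconstruction

    x = torch.randn(16, 100)           # batch of feature vectors (illustrative)
    loss = nn.functional.mse_loss(Autoencoder()(x), x)  # reconstruction error
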
  26. Autoencoder. P. Baldi and K. Hornik (1989); schematic adapted from https://synthesis.ai. Lack of continuity and structure in the latent space makes interpolated or random points unlikely to map to meaningful data
  27. Variational Autoencoder (VAE) D. P. Kingma and M. Welling (2013);

    Schematic adapted from https://synthesis.ai Neural network encodes data into a probabilistic latent space that is more suitable for sampling (generation)
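
The key change from the autoencoder is that the encoder now outputs a mean and variance, and the latent code is sampled via the reparameterization trick. A minimal sketch (sizes illustrative):

    import torch
    import torch.nn as nn

    class VAE(nn.Module):
        """Encode to a Gaussian q(z|x); sample z with the reparameterization trick."""
        def __init__(self, n_in=100, n_latent=2):
            super().__init__()
            self.encoder = nn.Linear(n_in, 32)
            self.mu = nn.Linear(32, n_latent)       # mean of q(z|x)
            self.log_var = nn.Linear(32, n_latent)  # log-variance of q(z|x)
            self.decoder = nn.Sequential(nn.Linear(n_latent, 32), nn.ReLU(),
                                         nn.Linear(32, n_in))

        def forward(self, x):
            h = torch.relu(self.encoder(x))
            mu, log_var = self.mu(h), self.log_var(h)
            z = mu + torch.exp(0.5 * log_var) * torch.randn_like(mu)  # sample
            return self.decoder(z), mu, log_var

    # Loss = reconstruction error + KL divergence towards the N(0, 1) prior,
    # which keeps the latent space smooth and suitable for sampling
    x = torch.randn(16, 100)
    recon, mu, log_var = VAE()(x)
    kl = -0.5 * torch.mean(torch.sum(1 + log_var - mu**2 - log_var.exp(), dim=1))
    loss = nn.functional.mse_loss(recon, x) + kl
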
  28. Denoising Diffusion Model. Instead of learning the mapping in a single step (VAE), we can learn to transform data over many small steps (diffusion)
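
A sketch of the training step of a denoising diffusion model (1-D toy data; the network is a stand-in and, for brevity, is not conditioned on the timestep as real models are): noise is added according to a fixed schedule and the model learns to predict that noise.

    import torch
    import torch.nn as nn

    T = 1000                                     # number of diffusion steps
    betas = torch.linspace(1e-4, 0.02, T)        # noise schedule
    alpha_bar = torch.cumprod(1 - betas, dim=0)  # cumulative signal retention

    def add_noise(x0, t):
        """Forward process: jump straight to step t in closed form."""
        noise = torch.randn_like(x0)
        xt = alpha_bar[t].sqrt() * x0 + (1 - alpha_bar[t]).sqrt() * noise
        return xt, noise

    # The denoiser is trained to predict the noise that was added at step t
    denoiser = nn.Sequential(nn.Linear(100, 128), nn.ReLU(), nn.Linear(128, 100))
    x0 = torch.randn(100)                        # one toy data sample
    t = torch.randint(T, ())                     # random timestep
    xt, noise = add_noise(x0, t)
    loss = nn.functional.mse_loss(denoiser(xt), noise)
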
  29. Applications to Materials Design Hyunsoo Park, A. Onwuli and A.

    Walsh, ChemRxiv (2024) Gen AI models can be used in different ways, e.g. • map from composition to crystal structure • unguided sampling of a random compound • guided sampling to specific properties
  30. Generative Artificial Intelligence H. Park, Z. Li and A. Walsh,

    Matter 7, 2358 (2024) Growing number of generative architectures that can be tailored for scientific problems
  31. Class Outcomes 1. Explain the foundations of large language models 2. Describe the central concepts underpinning generative artificial intelligence. Activity: Research challenge