
Digit - Welcome to the AI Jungle! Now What?

Kevin Dubois

October 03, 2024

Transcript

  1. 6 The Journey of Adopting Generative AI. Phases: Ideation & Prototyping, Building & Refining, Operationalizing. Steps: Find LLMs, Try prompts, Experiment with your data, Benchmarking, Connect to data source, Retrieval-Augmented Generation (RAG), Limited fine tuning, Chaining, Evaluate flows, Integrate with apps, Model serving, Endpoints, Exception handling, Monitoring.
  2. 7 The Journey of Adopting Generative AI. Ideation & Prototyping: Find LLMs, Try prompts, Experiment with your data, Benchmarking.
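
     To make the "try prompts / benchmarking" step concrete, here is a minimal sketch that sends a few candidate prompts to a model behind an OpenAI-compatible chat endpoint and records the latency of each call. The endpoint URL and model id are assumptions for illustration; point them at whichever model server you are evaluating.

        # Try a few prompts against an OpenAI-compatible chat endpoint and time them.
        # The URL and model id below are placeholders, not a specific product's API.
        import time
        import requests

        ENDPOINT = "http://localhost:8000/v1/chat/completions"  # assumed server location
        MODEL = "my-candidate-model"                             # hypothetical model id

        prompts = [
            "Summarize our returns policy in two sentences.",
            "Summarize our returns policy for a 10-year-old.",
        ]

        for prompt in prompts:
            start = time.perf_counter()
            resp = requests.post(
                ENDPOINT,
                json={"model": MODEL, "messages": [{"role": "user", "content": prompt}]},
                timeout=60,
            )
            resp.raise_for_status()
            answer = resp.json()["choices"][0]["message"]["content"]
            print(f"{time.perf_counter() - start:.1f}s  {prompt!r}\n{answer}\n")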
  3. "Open source refers to software whose source code is made publicly available for anyone to view, modify, and distribute."
  4. "Open source refers to software whose source code is made publicly available for anyone to view, modify, and distribute."
  5. Open Source Initiative (OSI): "an open-source AI system can be used for any purpose without the need to secure permission, and researchers should be able to inspect its components and study how the system works. It should also be possible to modify the system for any purpose—including to change its output—and to share it with others to use, with or without modifications, for any purpose. In addition, the standard attempts to define a level of transparency for a given model's training data, source code, and weights." https://www.technologyreview.com/2024/08/22/1097224/we-finally-have-a-definition-for-open-source-ai/
  6. The project enables community contributors to add additional "skills" or "knowledge" to a particular model. InstructLab's model-agnostic technology gives model upstreams with sufficient infrastructure resources the ability to create regular builds of their open source licensed models, not by rebuilding and retraining the entire model but by composing new skills into it.
  7. 15 The Journey of Adopting Generative AI. Phases: Ideation & Prototyping, Building & Refining, Operationalizing. Steps: Find LLMs, Try prompts, Experiment with your data, Benchmarking, Connect to data source, Retrieval-Augmented Generation (RAG), Limited fine tuning, Chaining, Evaluate flows, Integrate with apps, Model serving, Endpoints, Exception handling, Monitoring.
  8. 16 The Journey of Adopting Generative AI. Building & Refining and Operationalizing: Connect to data source, Retrieval-Augmented Generation (RAG), Limited fine tuning, Chaining, Evaluate flows, Integrate with apps, Model serving, Endpoints, Exception handling, Monitoring.
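
     As an illustration of the Retrieval-Augmented Generation step above, here is a minimal sketch: rank a handful of documents against the user question, put the best match into the prompt, and call a chat endpoint. The TF-IDF scoring is a stand-in for a real embedding model and vector store, and the endpoint URL and model id are assumptions.

        # Minimal RAG sketch: retrieve the most relevant snippet with a toy TF-IDF
        # similarity, augment the prompt with it, then call a chat endpoint.
        # Endpoint URL and model id are placeholders for illustration.
        import requests
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.metrics.pairwise import cosine_similarity

        documents = [
            "Refunds are issued within 14 days of receiving the returned item.",
            "Our support desk is open weekdays from 9:00 to 17:00 CET.",
            "Shipping to EU countries takes 3 to 5 business days.",
        ]
        question = "How long do refunds take?"

        # 1. Retrieve: rank the documents by similarity to the question.
        vectorizer = TfidfVectorizer().fit(documents + [question])
        doc_vectors = vectorizer.transform(documents)
        best_doc = documents[cosine_similarity(vectorizer.transform([question]), doc_vectors).argmax()]

        # 2. Augment: ground the prompt in the retrieved context.
        prompt = f"Answer using only this context:\n{best_doc}\n\nQuestion: {question}"

        # 3. Generate: call an OpenAI-compatible chat endpoint (assumed URL and model id).
        resp = requests.post(
            "http://localhost:8000/v1/chat/completions",
            json={"model": "my-served-model", "messages": [{"role": "user", "content": prompt}]},
            timeout=60,
        )
        resp.raise_for_status()
        print(resp.json()["choices"][0]["message"]["content"])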
  9. AI is a team initiative: app developer, IT operations, data engineer, data scientists, ML engineer, business leadership.
  10. AI is a team initiative. Lifecycle steps: set goals, gather and prepare data, develop model, integrate models in app dev, model monitoring & management, retrain models. Roles: business leadership, data engineer, data scientists, ML engineer, app developer, IT operations.
  11. Team deliverables across the lifecycle: gather and prepare data (data storage, data lake, data exploration, data preparation, stream processing); develop model (ML notebooks, ML libraries, model lifecycle); deploy models in an application (CI/CD); model monitoring and management (monitor/alerts, model visualization, model drift). Roles: data engineer, data scientists, app developer, IT operations. All on a hybrid, multi-cloud platform with self-service capabilities, compute acceleration, and infrastructure spanning physical, virtual, private cloud, public cloud, and edge.
  12. 22 Capability areas: process scheduling & hardware acceleration; containerization & container orchestration; operating containers at scale; software-defined storage; data visualization, labeling, processing; automated software delivery; integration; experimentation & model lifecycle; languages & development tools; AI dependencies; libraries and frameworks. Highlighted: machine learning libraries.
  13. 23 Same capability areas as the previous slide. Highlighted: languages & development tools; machine learning libraries.
  14. 24 Same capability areas. Highlighted: data visualization, labeling, processing; experimentation & model lifecycle; languages & development tools; machine learning libraries.
  15. 25 Same capability areas. Highlighted: process scheduling & hardware acceleration; data visualization, labeling, processing; experimentation & model lifecycle; languages & development tools; machine learning libraries.
  16. 26 Same capability areas. Highlighted: process scheduling & hardware acceleration; containerization & container orchestration; data visualization, labeling, processing; experimentation & model lifecycle; languages & development tools; machine learning libraries.
  17. 27 Same capability areas. Highlighted: process scheduling & hardware acceleration; containerization & container orchestration; operating containers at scale; automated software delivery; data visualization, labeling, processing; experimentation & model lifecycle; languages & development tools; machine learning libraries.
  18. 28 Process scheduling & hardware acceleration; containerization & container orchestration; operating containers at scale; automated software delivery; software-defined storage; integration; data visualization, labeling, processing; experimentation & model lifecycle; languages & development tools; machine learning libraries; AI dependencies.
  19. The stack grouped into two platforms. Application Platform: process scheduling & hardware acceleration; containerization & container orchestration; operating containers at scale; automated software delivery; software-defined storage; integration. AI & MLOps Platform: data visualization, labeling, processing; experimentation & model lifecycle; languages & development tools; machine learning libraries; AI dependencies.
  20. The same grouping: an AI & MLOps Platform and an Application Platform, together used to modernize & accelerate app development.
  21. 31

  22. 35 Hybrid MLOps platform (OpenDataHub.io): collaborate within a common platform to bring IT, data science, and app dev teams together. Model development: conduct exploratory data science in JupyterLab with access to core AI/ML libraries and frameworks, including TensorFlow and PyTorch, using our notebook images or your own. Model serving & monitoring: deploy models across any cloud, on fully managed and self-managed OpenShift footprints, and centrally monitor their performance. Lifecycle management: create repeatable data science pipelines for model training and validation and integrate them with DevOps pipelines for delivery of models across your enterprise. Increased capabilities / collaboration: create projects and share them across teams; combine Red Hat components, open source software, and ISV-certified software. Available as a managed cloud service or a traditional software product, on-site or in the cloud.
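
     As a sketch of the kind of notebook experiment that later becomes a repeatable pipeline step, here is a tiny train/validate/save example with scikit-learn; the dataset and model choice are placeholders.

        # Toy train / validate / save step, as you might run it in a JupyterLab notebook
        # before turning it into a pipeline stage. Dataset and model are placeholders.
        from joblib import dump
        from sklearn.datasets import load_iris
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.metrics import accuracy_score
        from sklearn.model_selection import train_test_split

        X, y = load_iris(return_X_y=True)
        X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

        model = RandomForestClassifier(n_estimators=100, random_state=42).fit(X_train, y_train)
        print("validation accuracy:", accuracy_score(y_test, model.predict(X_test)))

        # Persist the model so a separate serving step can pick it up from shared
        # storage or a model registry.
        dump(model, "model.joblib")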
  23. MLOps incorporates DevOps and GitOps to improve the lifecycle management of the ML application. Pipeline stages: develop ML code, develop app code, build, test, train, validate, serve, monitor, drift/outlier detection. Principles: cross-functional collaboration, automation, repeatability, security, Git as single source of truth, observability. [1] Reference and things to read: https://cloud.redhat.com/blog/enterprise-mlops-reference-design
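
     For the drift/outlier detection stage, a minimal sketch, assuming you keep a reference sample of a feature captured at training time and compare recent production data against it with a two-sample Kolmogorov-Smirnov test (one of several possible drift signals). The synthetic data and the alert threshold are illustrative.

        # Minimal drift check: compare a live feature sample against the training-time
        # reference distribution. Data and alert threshold are illustrative.
        import numpy as np
        from scipy.stats import ks_2samp

        rng = np.random.default_rng(0)
        reference = rng.normal(loc=0.0, scale=1.0, size=5_000)  # captured at training time
        live = rng.normal(loc=0.4, scale=1.0, size=1_000)       # recent production data

        result = ks_2samp(reference, live)
        if result.pvalue < 0.01:
            print(f"possible drift: KS statistic={result.statistic:.3f}, p={result.pvalue:.2e} -> alert / retrain")
        else:
            print("no significant drift detected")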
  24. Model to Prod with MLOps.
     KServe ModelMesh • Model serving framework with simplified deployment, auto-scaling, and resource optimization • Supports a variety of ML frameworks like TensorFlow, PyTorch, etc.
     Kubeflow Pipelines • Machine learning lifecycle automation, with model training, evaluation, and deployment • Reusable components for pipeline creation & scaling
     Backstage • Platform for building internal developer portals (IDPs) to streamline developer workflows • Highly customizable, with extensive plugin support and project scaffolding capabilities
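
     To show what "reusable components for pipeline creation" looks like in practice, a minimal sketch with the Kubeflow Pipelines (kfp v2) SDK: two placeholder components chained into a pipeline and compiled to YAML that a Kubeflow Pipelines or Data Science Pipelines instance can run. The component bodies are stubs.

        # Minimal Kubeflow Pipelines (kfp v2 SDK) sketch: two stub components chained
        # into a pipeline, compiled to an IR YAML file for upload to a pipelines server.
        from kfp import compiler, dsl

        @dsl.component(base_image="python:3.11")
        def train(epochs: int) -> str:
            # Stub: a real component would pull data, train, and push the model artifact.
            print(f"training for {epochs} epochs")
            return "model-v1"

        @dsl.component(base_image="python:3.11")
        def evaluate(model_name: str) -> float:
            # Stub: a real component would load the model and compute metrics.
            print(f"evaluating {model_name}")
            return 0.92

        @dsl.pipeline(name="train-and-evaluate")
        def train_and_evaluate(epochs: int = 3):
            trained = train(epochs=epochs)
            evaluate(model_name=trained.output)

        if __name__ == "__main__":
            compiler.Compiler().compile(train_and_evaluate, "pipeline.yaml")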
  25. Model to Prod with MLOps. Data scientist flow: model creation/training, model fine-tuning, model testing, model serving, monitoring/analysis. Developer flow: app scaffolding, API inferencing, application deployment, monitoring/management.
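
     On the developer side, "API inferencing" usually comes down to a REST call against the served model. A minimal sketch, assuming a KServe-style v2 (Open Inference Protocol) endpoint; the host, model name, and input tensor are placeholders.

        # Minimal inference call against a served model. The URL follows the KServe /
        # Open Inference Protocol v2 convention; host, model name, shape, and data are
        # placeholders for illustration.
        import requests

        url = "http://models.example.com/v2/models/my-model/infer"
        payload = {
            "inputs": [
                {"name": "input-0", "shape": [1, 4], "datatype": "FP32", "data": [5.1, 3.5, 1.4, 0.2]}
            ]
        }

        resp = requests.post(url, json=payload, timeout=30)
        resp.raise_for_status()
        print(resp.json()["outputs"])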
  26. 76% of organizations say the cognitive load is so high that it is a source of low productivity (source: Salesforce). Gartner predicts 75% of companies will establish platform teams for application delivery (source: Gartner).
  27. Podman AI Lab (podman-desktop.io): run LLMs locally and build AI applications. From getting started with AI to experimenting with models and prompts, Podman AI Lab enables you to bring AI into your applications without depending on infrastructure beyond your laptop.
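
     Once a model is running locally, the application talks to it over plain HTTP. A minimal sketch, assuming the local model service exposes an OpenAI-compatible chat endpoint; the port and model name depend on how you set up the service in Podman AI Lab.

        # Chat with a locally served model, assuming an OpenAI-compatible endpoint.
        # The port and model name are placeholders; use the values from your local setup.
        import requests

        resp = requests.post(
            "http://localhost:10434/v1/chat/completions",  # assumed local port
            json={
                "model": "my-local-model",  # whichever model you pulled locally
                "messages": [{"role": "user", "content": "Give me three taglines for a coffee shop."}],
            },
            timeout=120,
        )
        resp.raise_for_status()
        print(resp.json()["choices"][0]["message"]["content"])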
  28. Recap & Next steps The current AI landscape is growing

    FAST Let’s democratize AI We can re-use what we already know: • Embrace open source • Embrace platform engineering for AI: Containers, CI/CD, GitOps, Developer Portals Traditional engineering can (inf)use AI, and we’re only just at the beginning
  29. Recap & Next steps: Start your OpenShift experience for free in four simple steps with the Developer Sandbox for OpenShift, and start your OpenShift AI experience for free with the OpenShift AI Sandbox. Sign up at developers.redhat.com, find out more about Red Hat's projects and products and what they offer developers, and learn more about OpenShift AI.