FROM SUPERNOVAS TO LLMS - STREAMING DATA PIPELINES

In this fun, hands-on, and in-depth HowTo, we use live streaming data for a comprehensive astrophysics use case with the Databricks Intelligence Platform. The focus of this session is data engineering. We will tackle the challenge of analyzing real-time data from collapsing supernovas that emit gamma-ray bursts, provided by NASA through their GCN project. You'll learn to ingest data from message buses, decide between Delta Live Tables, DBSQL, or Databricks Workflows for stream processing, and understand how to code ETL pipelines in SQL, including Kafka ingestion. Once we have the cleaned data stream, I'll demonstrate how Databricks Data Rooms offer natural language analytics and compare that to a notebook streaming data into a vector database for open-source LLMs with RAG. This session is ideal for data engineers, data architects who like code, genAI enthusiasts, and anyone fascinated by sparkling stars. Learn when and how to use which Databricks products. The demo is easy to replicate at home.

Frank Munz

June 20, 2024

Transcript

  1. ©2023 Databricks Inc. — All rights reserved Supernovas, Black Holes

    and Streaming Data Big Data Europe Frank Munz, Nov 2024
  2. This information is provided to outline Databricks’ general product direction

    and is for informational purposes only. Customers who purchase Databricks services should make their purchase decisions relying solely upon services, features, and functions that are currently available. Unreleased features or functionality described in forward-looking statements are subject to change at Databricks' discretion and may not be delivered as planned or at all. Product safe harbor statement
  3. ©2022 Databricks Inc. — All rights reserved Hi, I am

    Frank • Principal @Databricks: Data, Analytics and AI products • All things large scale data, compute, and AI • ⛰ 🥨 🍻 • Formerly AWS Tech Evangelist, SW architect, data scientist, 3x published author.
  4. ©2023 Databricks Inc. — All rights reserved 10,000+ global customers

    $2.4B in revenue $4B in investment Inventor of the lakehouse & Pioneer of generative AI Gartner-recognized Leader Database Management Systems + Data Science and Machine Learning Platforms The data and AI company Creator of
  5. ©2024 Databricks Inc. — All rights reserved Supernovas, Black Holes

    and GRBs 1 Gamma Ray Burst (GRB) ~ energy of the sun over its lifetime < 2 seconds: merger of neutron stars or a neutron star and a black hole > 2 seconds: collapse of a massive star (> 30 solar masses) Supernova • Massive stellar explosions at the end of a star's life • Can leave behind a black hole or neutron star Black Hole • Can form from the merger of 2 neutron stars or 2 black holes • Extremely dense regions of space with immense gravitational pull 6
  6. ©2024 Databricks Inc. — All rights reserved Neil Gehrels Swift

    Observatory Launched 2004, Data transmitted via Gamma-ray Coordinates Network (GCN) Key Instruments: • Burst Alert Telescope (BAT): Locates GRBs across a wide field of view. • X-ray Telescope (XRT): Observes afterglow of GRBs in X-ray wavelengths • Ultraviolet/Optical Telescope (UVOT): Captures optical and ultraviolet emissions 8 Momentum wheels
  7. ©2024 Databricks Inc. — All rights reserved BOAT - GRB

    221009A • Detected on Oct 9, 2022 simultaneously by the Swift and Fermi telescopes • Originated 2.4 billion light-years away (the light left its source ~1.9 billion years ago) in the constellation Sagitta • Lasted over 10 hours, with a 10-minute initial burst • 5,000 VHE photons detected; the previous record was ~100 • Brightest GRB afterglow ever recorded • Thought to occur only once every ~10,000 years 10 (Closest and) Brightest Of All Time Gamma-Ray Burst Event time vs ingestion time?
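    The "Event time vs ingestion time?" question on the slide is the core streaming distinction here: the burst happened roughly 1.9 billion years ago, but the notice only enters the pipeline when GCN delivers it. A minimal PySpark sketch of the difference, assuming a Databricks notebook (spark predefined) and a hypothetical streaming table of parsed notices with an event_time column:

    from pyspark.sql import functions as F

    notices = (spark.readStream.table("gcn_notices")                  # hypothetical source table
               .withColumn("ingestion_time", F.current_timestamp())   # when the pipeline saw the record
               .withWatermark("event_time", "10 minutes"))            # tolerate 10 minutes of late data

    # Aggregations are keyed on event time, not on when the data arrived
    per_hour = notices.groupBy(F.window("event_time", "1 hour")).count()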
  8. ©2024 Databricks Inc. — All rights reserved IceCube Neutrino Observatory

    • Detects high-energy neutrinos from extreme cosmic environments • 5,160 digital optical modules (DOMs) • Embedded in a cubic km of ice • Ice serves as detector medium and background radiation shield • Neutrinos produce Cherenkov radiation, detected by DOMs • Data transmitted via Gamma-ray Coordinates Network (GCN) -> alerts astronomers for quick follow-up observations 12 Located in Antarctica
  9. ©2024 Databricks Inc. — All rights reserved 13 Design a

    system to globally share streaming events with different topics
  10. ©2024 Databricks Inc. — All rights reserved Judith Racusin's (NASA)

    Talk @ Current.io 2023 Link to Judith's talk GCN Notices: machine generated + Circulars: human generated
  11. ©2024 Databricks Inc. — All rights reserved Get Your OIDC

    Credentials https://gcn.nasa.gov/quickstart
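    With the client ID and secret from the quickstart in hand, a minimal sketch of wiring them into the GCN Kafka client from a Databricks notebook without hard-coding them; the secret scope and key names below are assumptions, not from the talk:

    from gcn_kafka import Consumer

    # Credentials created at https://gcn.nasa.gov/quickstart, stored in a Databricks secret scope
    client_id = dbutils.secrets.get(scope="gcn", key="client_id")           # hypothetical scope/key names
    client_secret = dbutils.secrets.get(scope="gcn", key="client_secret")

    consumer = Consumer(client_id=client_id,
                        client_secret=client_secret,
                        domain="gcn.nasa.gov")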
  12. ©2023 Databricks Inc. — All rights reserved 18 Databricks Lakehouse

    Platform Lakehouse Platform Data Warehousing Data Engineering Data Science and ML Data Streaming All structured and unstructured data Cloud Data Lake Unity Catalog Fine-grained governance for data and AI Delta Uniform Data reliability and performance Simple Unify your data warehousing and AI use cases on a single platform Open Built on open source and open standards Multicloud One consistent data platform across clouds
  13. ©2024 Databricks Inc. — All rights reserved Databricks SQL Photon

    Serverless Eliminate compute infrastructure management Instant, Elastic Compute Zero Management Lower TCO Vectorized C++ execution engine with Apache Spark API https://dbricks.co/benchmark TPC-DS Benchmark 100 TB
  14. ©2024 Databricks Inc. — All rights reserved Project Lightspeed: what

    we’ve done 22 Performance Improvements • Micro-Batch Pipelining • Offset Management • Log Purging • Consistent Latency for Stateful Pipelines • State Rebalancing • Adaptive Query Execution Enhanced Functionality • Multiple Stateful Operators • Arbitrary Stateful Processing in Python • Drop Duplicates Within Watermark • Native support for Protobuf Improved Observability • Python Query Listener Connectors & Ecosystem • Enhanced Fanout (EFO) • Trigger.AvailableNow support for Amazon Kinesis • Google Pub/Sub Connector • Integrations with Unity Catalog
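    To make one of these features concrete: dropDuplicatesWithinWatermark (Spark 3.5+) drops notices that the bus delivers more than once, without keeping unbounded state. A minimal sketch, with the table and column names assumed for illustration:

    deduped = (spark.readStream.table("raw_notices")             # hypothetical source table
               .withWatermark("event_time", "30 minutes")        # bounds the deduplication state
               .dropDuplicatesWithinWatermark(["notice_id"]))    # keeps the first copy per notice id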
  15. ©2022 Databricks Inc. — All rights reserved General intelligence Consumer

    models trained on a broad dataset disconnected from your business data Data intelligence AI connected to your customer data and able to solve domain-specific problems VS Wait, how about your AI story?
  16. ©2024 Databricks Inc. — All rights reserved Integrated with the

    tools you know and love 24 ORCHESTRATION DATA GOVERNANCE & SECURITY DATA SOURCE BUSINESS INTELLIGENCE DATA INTEGRATION DATA PARTNERS DS/ML
  17. ©2024 Databricks Inc. — All rights reserved Ingest Streaming Data

    from Apache Kafka 25 Notebook DB SQL DLT • GCN Client ◦ quickstart • pure Spark ◦ standard Easiest solution: Uses Declarative Data Pipeline Workflows
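    For the pure Spark option, a sketch of the standard Kafka source reading one GCN topic. The broker address is an assumption, and the SASL/OAuth client settings that gcn.nasa.gov requires are omitted for brevity (the GCN quickstart documents them):

    raw = (spark.readStream
           .format("kafka")
           .option("kafka.bootstrap.servers", "kafka.gcn.nasa.gov:9092")   # assumed broker address
           .option("subscribe", "gcn.classic.text.SWIFT_POINTDIR")
           .option("startingOffsets", "earliest")
           # security options (SASL/OAuth) for gcn.nasa.gov go here
           .load()
           .selectExpr("CAST(value AS STRING) AS notice", "timestamp"))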
  18. ©2024 Databricks Inc. — All rights reserved Notebook with GCN

    Kafka Wrapper 27 Wraps Confluent Kafka Client

    from gcn_kafka import Consumer

    topics = ['gcn.classic.text.SWIFT_POINTDIR']
    config = {'auto.offset.reset': 'earliest'}

    consumer = Consumer(config,
                        client_id='abc…',
                        client_secret='xyz…',
                        domain='gcn.nasa.gov')
    consumer.subscribe(topics)

    while True:
        for message in consumer.consume(timeout=1):
            print(message.value())  # raw Swift pointing notice, as bytes
  19. msg = TITLE: GCN/SWIFT NOTICE

    NOTICE_DATE:     Fri 03 May 24 04:16:31 UT
    NOTICE_TYPE:     SWIFT Pointing Direction
    NEXT_POINT_RA:   213.407d {+14h 13m 38s} (J2000)
    NEXT_POINT_DEC:  +70.472d {+70d 28' 20"} (J2000)
    NEXT_POINT_ROLL: 2.885d
    SLEW_TIME:       15420.00 SOD {04:17:00.00} UT
    SLEW_DATE:       20433 TJD; 124 DOY; 24/05/03
    OBS_TIME:        900.00 [sec] (=15.0 [min])
    TGT_NAME:        RX J1413.6+7029
    TGT_NUM:         3111759, Seg_Num: 10
    MERIT:           60.00
    INST_MODES:      BAT=0=0x0 XRT=7=0x7 UVOT=12525=0x30ED
    SUN_POSTN:       40.78d {+02h 43m 07s} +15.81d {+15d 48' 31"}
    SUN_DIST:        93.68 [deg] Sun_angle= -11.5 [hr] (East of Sun)
    MOON_POSTN:      338.61d {+22h 34m 27s} -12.48d {-12d 28' 49"}
    MOON_DIST:       113.09 [deg] MOON_ILLUM: 31 [%]
    GAL_COORDS:      113.36, 45.10 [deg] galactic lon,lat of the pointing direction
    ECL_COORDS:      143.56, 69.70 [deg] ecliptic lon,lat of the pointing direction
    COMMENTS:        SWIFT Slew Notice to a preplanned target.
    COMMENTS:        Note that preplanned targets are overridden by any new BAT Automated Target.
    COMMENTS:        Note that preplanned targets are overridden by any TOO Target if the TOO has a higher Merit Value.
    COMMENTS:        The spacecraft longitude,latitude at Notice_time is 247.70,10.86 [deg].
    COMMENTS:        This Notice was ground-generated -- not flight-generated.
  20. Swift Alert: Pointing towards RX J1413.6+7029 On Friday, May 3rd,

    2024, at 04:16:31 UT, the Swift telescope is scheduled to point towards a preplanned target, RX J1413.6+7029. This celestial object is located at a Right Ascension of 213.407 degrees (or 14 hours, 13 minutes, and 38 seconds) and a Declination of +70.472 degrees (or +70 degrees, 28 minutes, and 20 seconds). The telescope will begin its slew to this target location at 04:17:00.00 UT, which will take approximately 15 minutes to complete. Once in position, Swift will observe RX J1413.6+7029 for 900 seconds, or 15 minutes, using its Burst Alert Telescope (BAT), X-ray Telescope (XRT), and Ultraviolet/Optical Telescope (UVOT). At the time of observation, the Sun will be at a position of 40.78 degrees (or 2 hours, 43 minutes, and 7 seconds) and +15.81 degrees (or +15 degrees, 48 minutes, and 31 seconds), with a Sun angle of -11.5 hours (or East of the Sun). The Moon will be at a position of 338.61 degrees (or 22 hours, 34 minutes, and 27 seconds) and -12.48 degrees (or -12 degrees, 28 minutes, and 49 seconds), with a Moon illumination of 31%. It's worth noting that this observation is part of a preplanned target list, but it may be overridden by a new BAT Automated Target or a Target of Opportunity (TOO) with a higher merit value. Additionally, the spacecraft's longitude and latitude at the time of observation will be 247.70 degrees and 10.86 degrees, respectively.
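    The paragraph above is the kind of plain-English summary an LLM produces from the raw notice on slide 19. A sketch of how such a summary could be generated, assuming an OpenAI-compatible chat endpoint; the endpoint URL, token, and model name are placeholders, not details from the talk:

    from openai import OpenAI

    client = OpenAI(api_key="<token>",
                    base_url="https://<workspace-url>/serving-endpoints")   # placeholder endpoint

    completion = client.chat.completions.create(
        model="<served-llm>",                                               # placeholder model name
        messages=[{"role": "user",
                   "content": f"Summarize this Swift pointing notice for a general audience:\n{msg}"}],  # msg = notice from slide 19
    )
    print(completion.choices[0].message.content)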
  21. ©2024 Databricks Inc. — All rights reserved Ingest and Transform

    Easily with Delta Live Tables Pipelines 30

    -- incrementally ingest
    CREATE STREAMING TABLE raw_data
    AS SELECT * FROM cloud_files("/raw_data", "json")

    -- incrementally transform
    CREATE MATERIALIZED VIEW clean_data
    AS SELECT timestamp, id, target FROM LIVE.raw_data

    • Accelerate ETL development: declare SQL or Python and DLT automatically orchestrates the DAG, handles retries and changing data
    • Automatically manage your infrastructure: automates complex, tedious activities like recovery, auto-scaling, and performance optimization
    • Ensure high data quality: deliver reliable data with built-in quality controls, testing, monitoring, and enforcement
    • Unify batch and streaming: get the simplicity of SQL with the freshness of streaming in one unified API
    The best way to do ETL on the Databricks Data Intelligence Platform
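    The same two steps expressed with the DLT Python API instead of SQL, as a sketch that mirrors the table and column names on the slide:

    import dlt

    @dlt.table
    def raw_data():
        # incrementally ingest JSON files with Auto Loader
        return (spark.readStream.format("cloudFiles")
                .option("cloudFiles.format", "json")
                .load("/raw_data"))

    @dlt.table
    def clean_data():
        # batch read mirrors the materialized-view semantics of the SQL version
        return dlt.read("raw_data").select("timestamp", "id", "target")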
  22. ©2024 Databricks Inc. — All rights reserved Simplicity • Simple

    development • Simple operations Performance • End-to-end incremental processing • Parallelized ingestion Low TCO • Serverless metering • Efficient data processing DLT with serverless compute The simplest way to build data pipelines
  23. ©2024 Databricks Inc. — All rights reserved Delta Live Tables

    33 Transformation: Materialized View using PIVOT and type casts
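    The slide shows this step as a SQL materialized view using PIVOT and type casts; a rough PySpark equivalent for illustration, with all table and column names assumed:

    from pyspark.sql import functions as F

    # Pivot parsed key/value pairs into one column per notice field, then cast selected fields
    pivoted = (spark.read.table("clean_data")                    # hypothetical source table
               .groupBy("notice_id")
               .pivot("field_name")
               .agg(F.first("field_value")))

    typed = (pivoted
             .withColumn("SUN_DIST", F.col("SUN_DIST").cast("double"))
             .withColumn("MOON_ILLUM", F.col("MOON_ILLUM").cast("int")))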
  24. ©2024 Databricks Inc. — All rights reserved AI/BI Genie Enable

    business users to interact with data with LLM-powered Q&A Natural language -> answers in text and visualizations Curate dataset-specific experiences with custom instructions Powered by Databricks SQL & DatabricksIQ Works with DLT Streaming Tables and Materialized Views
  25. ©2024 Databricks Inc. — All rights reserved Blog and GitHub

    Repo https://www.databricks.com/blog/supernovas-black-holes-and-streaming-data
  26. ©2024 Databricks Inc. — All rights reserved Conclusion • You

    are one copy and paste of a SQL command away from exploring streaming data from a NASA satellite. • Delta Live Tables: declarative, serverless pipelines in SQL (or Python) • Ask Genie natural language questions or create plots ◦ Easy to verify: Double check the SQL Genie writes for you • Slides, Blog, me @LinkedIn • TLDR: It's all about the platform! 41