
apidays
June 07, 2025

apidays Singapore 2025 - From Data to Insights: Building AI-Powered Data APIs, Asif Waquar (Munich Re)

From Data to Insights: Building AI-Powered Data APIs
Asif Waquar, Solutions Architect at Munich Re

apidays Singapore 2025
Where APIs Meet AI: Building Tomorrow's Intelligent Ecosystems
April 15 & 16, 2025

------

Check out our conferences at https://www.apidays.global/

Do you want to sponsor or talk at one of our conferences?
https://apidays.typeform.com/to/ILJeAaV8

Learn more on APIscene, the global media made by the community for the community:
https://www.apiscene.io

Explore the API ecosystem with the API Landscape:
https://apilandscape.apiscene.io/


Transcript

  1. About me
     Asif Waquar — Solutions Architect, Munich Re | Singapore
     asifwaquar · https://asifwaquar.com/ · @asifwaquar
  2. Agenda
     • Understanding the problem landscape
     • Why AI-powered Data APIs? How will they solve the data problems?
     • Implementing AI-powered Data APIs
     • Architecture and real-world use cases
     • Risks and mitigation strategies
     • Q/A
  3. AI in daily life
     Traditional AI
     • Machine learning and computer vision, e.g. YouTube recommendations, web search results, farming, route-map recommendations, smart homes
     • Classifies existing content
     Generative AI
     • LLMs (pre-trained models, e.g. ChatGPT, DeepSeek, Grok, LLaMA, Mistral)
     • Natural-language communication
     • Creates new content with LLMs
  4. Challenges
     Data overload, but insight deficit
     • Challenge: organizations are collecting massive volumes of data, but much of it remains underutilized.
     • Risk: data without insights leads to missed opportunities, slow decisions, and wasted storage costs.
     Lack of real-time decision support
     • Challenge: most analytics are batch-processed, not real-time.
     • Risk: businesses can't react fast enough to market changes, fraud, or customer needs.
     Disconnected data sources
     • Challenge: data is often siloed across systems (CRM, IoT, files, databases).
     • Risk: the inability to connect the dots between sources causes fragmented views and inaccurate insights.
     Manual analysis is time-consuming
     • Challenge: traditional BI/reporting tools are reactive, slow, and require human interpretation.
     • Risk: delays in decision-making and potential misinterpretation.
     Unclean and unstructured data
     • Challenge: data often arrives with missing fields, invalid formats, or no clear categorization.
     • Risk: garbage in leads to garbage out; faulty insights can misguide business actions.
  5. Apps have gone smarter…
     Traditional client apps simply execute instructions; Gen AI-era apps learn and think.
     • Value: prompt-based client app development
     • Innovation: AI model, API generation, and server-function interaction
     • Prompt-driven flow: 1. understand the data model, 2. design and structure the API
  6. Turning data problems into smart solutions with AI APIs
     1. Automated data ingestion and cleaning: AI models can detect anomalies, missing values, or inconsistent formats (e.g., dates, categories). APIs built with AI can clean, transform, and enrich data in real time before analysis.
     2. Natural language understanding: AI APIs can convert unstructured data (text, documents, images) into structured insights (e.g., sentiment, keywords, entities).
     3. Predictive and prescriptive intelligence: Data APIs with embedded ML models offer predictions (e.g., churn likelihood, fraud risk), going beyond "what happened" to "what will happen" and "what to do next."
     4. Scalable and always available: APIs can expose insights on demand to applications, bots, dashboards, or mobile apps, making intelligence available to everyone, not just analysts.
     5. Real-time stream processing: coupled with Azure services like Event Hubs, Stream Analytics, or Azure Synapse, AI-powered APIs can continuously analyze data flows (e.g., IoT or financial transactions) to provide instant alerts or decisions.
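Point 1 above, automated ingestion and cleaning, is easy to picture in code. The sketch below is a minimal, hypothetical cleaning step (the field names, date formats, and `clean_record` helper are illustrative assumptions, not anything from the deck): it normalizes inconsistent date formats and flags missing required fields before a record reaches analysis.

```python
from datetime import datetime

# Formats the cleaning step will try, in order (an assumed, illustrative set).
DATE_FORMATS = ["%Y-%m-%d", "%d/%m/%Y", "%d %b %Y", "%m-%d-%Y"]

def normalize_date(raw: str):
    """Return an ISO date string, or None if no known format matches."""
    for fmt in DATE_FORMATS:
        try:
            return datetime.strptime(raw.strip(), fmt).date().isoformat()
        except ValueError:
            continue
    return None

def clean_record(record: dict) -> dict:
    """Normalize dates and flag missing required fields before analysis."""
    cleaned = dict(record)
    if "claim_date" in cleaned:
        cleaned["claim_date"] = normalize_date(cleaned["claim_date"])
    # Record data-quality issues instead of silently dropping the row.
    cleaned["_issues"] = [f for f in ("claim_id", "claim_date", "amount")
                          if not cleaned.get(f)]
    return cleaned
```

A real AI-powered version would replace the fixed format list with a model that infers formats and imputes values, but the API contract (record in, cleaned record plus issue list out) stays the same.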
  7. Demand for AI APIs
     • According to Gartner research, more than 30% of the increase in demand for APIs will come from AI and tools using large language models by 2026.
     • "Generative AI to Become a $1.3 Trillion Market by 2032, Research Finds" (Bloomberg)
  8. Ways to implement AI-powered Data APIs
     • Direct use of prebuilt AI services: leverage Azure Cognitive Services APIs (pre-trained models for vision, speech, language, etc.).
     • Serverless custom ML API: deploy a custom model as a serverless function (Azure Functions or AWS Lambda).
     • Managed web service (App Service or container) for ML: if you need a full web-app environment or have a larger application around the model, Azure App Service can host a custom API (e.g., a Flask or FastAPI app) that serves your ML model, with fully managed hosting, built-in scaling, deployment slots, and integration capabilities.
     • Azure ML managed online endpoints: Azure Machine Learning offers managed online endpoints, essentially serverless REST endpoints for your models. You register a trained model in Azure ML, then deploy it to an online endpoint with one or more scaled instances.
     • API gateway fronting ML services: regardless of how model inference is hosted (Function, App Service, AKS, or an Azure ML endpoint), it is best practice to put Azure API Management (APIM) in front of your AI API.
     • Workflow orchestration: fulfilling an AI request sometimes involves multiple steps or calls to different services; Azure Logic Apps can orchestrate complex workflows triggered by an API call.
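Whichever hosting option above you pick, the core is the same: a small inference handler that parses a request, validates it, scores it, and returns JSON. The sketch below is a framework-free stand-in for that handler (the `score` logic, field names, and thresholds are entirely hypothetical); an Azure Function or FastAPI route would simply wrap it.

```python
import json

def score(features: dict) -> dict:
    """Stand-in for model inference. In practice you would load a trained
    artifact (e.g., joblib/ONNX) once at cold start and call predict here.
    The rule below is purely illustrative."""
    risk = 0.9 if features.get("prior_claims", 0) > 2 else 0.2
    return {"fraud_risk": risk}

def handle_request(body: str):
    """Parse JSON, validate, score, and return (status_code, json_body),
    the shape an Azure Function or FastAPI route would expose over HTTP."""
    try:
        payload = json.loads(body)
    except json.JSONDecodeError:
        return 400, json.dumps({"error": "invalid JSON"})
    if "prior_claims" not in payload:
        return 422, json.dumps({"error": "missing field: prior_claims"})
    return 200, json.dumps(score(payload))
```

Keeping validation and scoring behind one function like this also makes it easy to front the endpoint with APIM later, since the gateway only needs a stable request/response contract.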
  9. Use case 1: Claims workflow automation
     Insurance companies often receive claims data in various formats from a wide network of agents. These files are typically inconsistent, ranging from missing or misnamed columns to non-standard data formats and unstructured incident descriptions. Such variability hinders efficient data ingestion and delays downstream analysis. According to Business Reporter, insurers handle more than 100,000 documents annually, including insurance contracts, policies, and other related paperwork. A typical car accident claim, for example, involves a lot of paperwork: police reports with handwritten notes, adjuster summaries explaining what happened, medical reports full of technical terms and billing codes, and repair estimates that must be carefully checked.
     Traditional technologies like Optical Character Recognition (OCR), while valuable for converting documents into machine-readable formats, represent only one piece of the automation puzzle in claims processing. OCR alone cannot understand context or handle the variety of document formats, and it often requires significant manual intervention to correct errors. This is why modern solutions have evolved into Intelligent Document Processing (IDP) systems that leverage data APIs. IDP can distinguish between different types of information, for example understanding the contextual difference between an incident date and a policy-period date, while maintaining the relationships between data points.
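The incident-date versus policy-period distinction mentioned above comes down to classifying a value by its surrounding context. The sketch below shows the idea with keyword patterns (the labels, cue words, and date format are assumptions for illustration); a real IDP system would use an NLP model rather than regexes, but the principle, label a date by the words around it, is the same.

```python
import re

# Hypothetical context cues; real IDP uses learned models, not regexes.
DATE = r"(\d{2}/\d{2}/\d{4})"
PATTERNS = {
    # A date near "accident"/"incident"/"loss" is treated as the incident date.
    "incident_date": re.compile(r"(?:accident|incident|loss)\D{0,20}" + DATE, re.I),
    # A date after "policy period" and before "to"/"-" is the policy start.
    "policy_start": re.compile(r"policy\s+period\D{0,20}" + DATE + r"\s*(?:to|-)", re.I),
}

def extract_dates(text: str) -> dict:
    """Label each date found in the text by its surrounding context."""
    return {label: (m.group(1) if (m := pat.search(text)) else None)
            for label, pat in PATTERNS.items()}
```

The point of the sketch is the output shape: the same raw string yields differently labelled dates depending on context, which is exactly what plain OCR cannot do.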
  10. How it works
     Claims processing is one of the most impactful areas for AI automation in the insurance and reinsurance industry. By leveraging AI-powered Data APIs, insurers can streamline repetitive tasks like data extraction, validation, document analysis, and intelligent routing, reducing manual intervention and improving accuracy. Modern AI-powered Data API solutions approach the problem differently: instead of trying to automate individual tasks, they understand documents the way humans do, by comprehending context and relationships.
     Pipeline: Data Collection & Ingestion → Data Extraction → Data Classification & Standardisation → Data Processing → Data Visualization
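The five-stage pipeline on this slide can be sketched as a chain of plain functions, one per stage. Everything inside each stage below is a toy stand-in (word counts, a keyword classifier), not the actual Munich Re pipeline; the takeaway is the composition, where each stage's output feeds the next.

```python
def ingest(raw_docs):
    """Data Collection & Ingestion: drop empty submissions."""
    return [d for d in raw_docs if d.strip()]

def extract(docs):
    """Data Extraction: pull structured features from raw text (toy version)."""
    return [{"text": d, "words": len(d.split())} for d in docs]

def classify(records):
    """Data Classification & Standardisation: keyword stand-in for a model."""
    for r in records:
        r["kind"] = "claim" if "claim" in r["text"].lower() else "other"
    return records

def process(records):
    """Data Processing: aggregate counts per class for the dashboard layer."""
    summary = {}
    for r in records:
        summary[r["kind"]] = summary.get(r["kind"], 0) + 1
    return summary

def run_pipeline(raw_docs):
    # Data Visualization would consume this summary downstream.
    return process(classify(extract(ingest(raw_docs))))
```

Structuring stages as independent functions also matches the deck's later advice on independent components: each stage can be swapped (e.g., the keyword classifier for an LLM call) without touching the rest.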
  11. Data Flow & Architecture
     Data Collection → Data Ingestion → a) document classification / data extraction, b) summarization / LLM evaluation → Data Visualization
  12. But…
     • According to Gartner research, only 15% of AI solutions deployed by 2022 will be successful, let alone create ROI-positive value.
     • Why do 87% of data science projects never make it into production?
     • Why do most AI implementations fail, and what can enterprises do to beat the odds?
     • Hundreds of AI tools have been built to catch COVID. None of them helped.
  13. Why…
     • Shortage of skilled resources: limited availability of data scientists, AI engineers, and domain experts leads to gaps in execution, model tuning, and integration into business workflows.
     • Complex and fragmented data landscape: data is often siloed, inconsistent, or poorly labelled, making it difficult to ingest, clean, and prepare for AI models, ultimately delaying or derailing initiatives.
     • Lack of enterprise-wide AI capabilities: organizations often treat AI as a departmental initiative rather than building a scalable, cross-functional capability across the enterprise.
     • Weak leadership support: without strong executive sponsorship and clear alignment with business goals, AI projects struggle to secure funding, stakeholder support, or a long-term vision.
     • Poor cross-team collaboration and communication: gaps between data teams, business units, and IT hinder alignment, causing misinterpretation of use cases, unrealistic expectations, and project failure.
  14. How do we make it real?
     Understanding human needs
     • User-centric analysis
     • Human-centered design
     • Tech-relevance assessment
     Strategic AI integration using APIs
     • Adopt simplicity to maximize value
     • Intelligent apps
     Experiment and improvise
     • Experiment with an MVP
     • Test with humans, with context
     • Iterate
  15. Design Best Practices (Scalability, Real-Time Ingestion, AI Integration)
     Scalability & performance
     • Use cloud-native services (e.g., Azure Functions, AKS) for auto-scaling
     • Keep APIs stateless; cache frequent results (e.g., Redis)
     • Handle long-running jobs asynchronously
     Real-time ingestion
     • Leverage Event Hubs, Stream Analytics, and Databricks for real-time insights
     • Support push/pull models and real-time notifications (e.g., SignalR, Service Bus)
     AI/ML integration & MLOps
     • Separate training and inference pipelines
     • Use model versioning and monitoring (Azure ML, MLflow)
     • Containerize models (Docker, ACR, AKS) for portability and scaling
     Security & compliance
     • Secure APIs with OAuth2/AAD, encrypt data, log access
     • Follow Responsible AI principles (bias, fairness, transparency)
     Architecture principles
     • Design for failure and observability (e.g., Application Insights, Prometheus)
     • Optimize for cost and latency (serverless, spot VMs, multi-region HA)
  16. Key Takeaways
     Data APIs + AI = faster insights
     • By wrapping AI models in data APIs, we can move from offline analysis to real-time insight delivery.
     • This fusion drives significant efficiency gains: automated pipelines can handle claims or risk assessments in seconds, not days, with improvements like ~60% higher throughput in claims processing and 50% faster reporting through real-time data centralization.
     Workflows matter as much as models
     • A model alone doesn't deliver value until it's part of a workflow.
     • Designing end-to-end workflows, from data ingestion to AI processing to output consumption, is key.
     • Visualizing these workflows with diagrams helps ensure IT and business stakeholders are aligned on how data flows and where AI adds value.
     • Always ask: "How will this model's output be used? How do we get it to the right system or person at the right time?" The answer should guide your API design.
     Leverage cloud + open-source synergy
     • Cloud providers offer robust services (Functions, Logic Apps, Azure ML, etc.) that manage scaling, deployment, and integration, while open-source tools (Python libraries, Spark, Kafka, etc.) provide flexibility and innovation.
     • Using them together gives you the best of both worlds: speed of development and enterprise-grade reliability.
     Best practices = success (scalability, MLOps, security)
     • Plan for scalability from day one (stateless microservices, autoscaling, distributed computing for big data).
     • Implement MLOps to continuously improve: monitor your models and data drift so the API's insights remain accurate over time.
     • Prioritize security and compliance, especially in regulated industries: secure your APIs and data to maintain trust.
     • By following the architectural principles discussed (API layering, containerization, independent components, etc.), your solution can grow and adapt without extensive rework.
  17. Q/A