
研發效能不是一道數學題:從多角度理解、數據驅動改善 (Engineering Productivity Is Not a Math Problem: Understanding It from Multiple Angles, Improving It with Data)

Derek Chen
October 14, 2025


In today's fast-changing environment, the productivity of engineering teams has become one of the keys to product success. Yet engineering productivity is not a math problem that a single formula can solve. This talk leads participants through multiple perspectives to rethink how engineering productivity should be understood and improved.

We will share an approach that decomposes productivity into measurable components and derives a "magic number" that represents team effectiveness. Along the way, we will compare practical and thought-provoking productivity metrics drawn from different schools of thought and from hands-on experience.

Finally, through a multi-dimensional analysis framework, we will explain how to measure, manage, and continuously strengthen an engineering team's productivity more effectively, helping organizations achieve data-driven continuous improvement.


Transcript

  1. Who Am I: Derek Chen, Agile Coach / TSMC
     An agile practitioner who loves to promote agile practices and has rich experience in Scrum, Kanban, and Large-Scale Scrum. "Stay Hungry, Stay Foolish"
  2. The Dilemma of Engineering Productivity
     Source: 茹炳晟, 张乐. (2022). 软件研发效能权威指南. 电子工业出版社.
     Over time, the business growth trend drives up both the complexity of the software architecture and scale and the team headcount. Expected engineering productivity grows with that investment, but actual engineering productivity lags behind; the difference between them is the productivity gap.
  3. While there is no commonly agreed-on definition of productivity, it is generally understood as: Productivity = Output / Input
  4. Source: Sadowski, C., & Zimmermann, T. (2019). Rethinking Productivity in Software Engineering (Chapter 1). Apress.
     For a given purpose, Productivity = Output / Input. Comparing actual functionality and quality against the ideal measures Effectiveness (doing the right thing); comparing ideal effort against actual effort measures Efficiency (how fast).
  5. Calculating Efficiency
     Ideal time: 2 people × 2 weeks × 40 hours/week = 160 hours. Actual time: 3 people × 2 weeks × 40 hours/week = 240 hours.
     Efficiency = Ideal Input / Actual Input = 160 / 240 = 67%
  6. Calculating Effectiveness
     Of 5 ideal features in the set (menu, notifications, add-ons, favorites, payment, recommendations), 4 were actually completed, an 80% completion rate (extra features do not count). The ideal quality targets: the ordering flow completes within 3 minutes, 99% of orders are delivered correctly (no wrong orders), and the system handles 100 order requests per second. In practice the experience is not smooth, ordering takes too long, and performance is poor, so overall quality is about 70%.
     Effectiveness = Actual Output / Ideal Output = 80% × 70% = 56%
  7. Calculating Productivity
     Productivity = Actual Output / Actual Input = (Actual Output / Ideal Output) × (Ideal Input / Actual Input) × (Ideal Output / Ideal Input) = Effectiveness × Efficiency × Baseline
     If we set Ideal Productivity to 1 or treat it as the baseline, then Actual Productivity simplifies to Effectiveness × Efficiency: Productivity = 56% × 67% ≈ 38%
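The three calculations on these slides can be reproduced in a few lines. A minimal sketch in Python (note that the slide's 38% comes from multiplying the rounded 56% and 67%; the unrounded product is about 37.3%):

```python
# Productivity = Effectiveness x Efficiency, using the slides' numbers.
ideal_hours = 2 * 2 * 40       # 2 people x 2 weeks x 40 h/week = 160 h
actual_hours = 3 * 2 * 40      # 3 people x 2 weeks x 40 h/week = 240 h
efficiency = ideal_hours / actual_hours          # 160 / 240, about 67%

feature_completion = 4 / 5     # 4 of 5 ideal features delivered = 80%
quality = 0.70                 # assessed overall quality
effectiveness = feature_completion * quality     # 0.80 * 0.70 = 56%

productivity = effectiveness * efficiency        # about 37.3%, ~38% rounded
print(f"Efficiency {efficiency:.0%}, Effectiveness {effectiveness:.0%}, "
      f"Productivity {productivity:.1%}")
```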
  8. Why Is Measuring Productivity Important?
     • Measuring it correctly offers insights into the effectiveness and efficiency of the development process
     • Identify bottlenecks, inefficiencies, and areas for improvement
     • Make informed decisions to streamline workflows and optimize resource allocation
     Source: Mulders, M. (2024). Engineering Productivity: How to Measure and Improve It. LinearB.
  9. "If you can't measure it, you can't improve it" – Peter Drucker
  10. Sprint Velocity
      Source: Video by OeLean on YouTube
      How many user stories the team can take on each Sprint; the chart compares planned versus completed story points for Sprints 1 through 6.
  11. Sprint Burndown
      Source: Video by OeLean on YouTube
      Track the progress of work completed over time during a Sprint. When the remaining-work line sits above the ideal line, we might not be able to finish all user stories; when it sits below, we might be able to finish before the end of the Sprint.
  12. Release Burnup
      Source: Video by OeLean on YouTube
      Show progress by tracking completed work against the total scope. With past sprint velocities of 18, 20, 15, and 22 SP (average 18.75, maximum 22, minimum 15), projection lines at the min, avg, and max rates bracket when the release will land.
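The min/avg/max projection idea can be sketched directly from the burnup data. In the snippet below, the total release scope of 160 SP is an assumption read off the chart axis, not a number stated on the slide:

```python
import math

# Hypothetical release-burnup inputs based on the slide's velocities.
velocities = [18, 20, 15, 22]        # SP completed in past sprints
total_scope = 160                    # total SP in the release (assumed)
delivered = sum(velocities)          # 75 SP delivered so far
remaining = total_scope - delivered  # 85 SP still to go

# Project how many more sprints the release needs at each observed rate.
rates = {"min": min(velocities),
         "avg": sum(velocities) / len(velocities),
         "max": max(velocities)}
projections = {label: math.ceil(remaining / v) for label, v in rates.items()}
for label, n in projections.items():
    print(f"at {label} velocity: about {n} more sprints")
```

The spread between the min and max projections is the honest answer to "when will it ship": a range, not a date.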
  13. Lead Time & Cycle Time
      On a kanban board (Idea, Ready, Selected, Analyze, Develop, Test, Release, Done) with WIP limits on the in-progress columns, Lead Time runs from when an idea enters the system until it is done, while Cycle Time runs only from when work actually starts until it is released.
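Both metrics fall out of simple timestamp arithmetic. A minimal sketch, assuming each work item records when it was created, when work started, and when it was done (the field names and dates are illustrative, not from the deck):

```python
from datetime import date

# One work item's timestamps (illustrative values).
item = {"created": date(2022, 3, 1),   # idea enters the system
        "started": date(2022, 3, 8),   # team starts working on it
        "done":    date(2022, 3, 18)}  # item is released/done

lead_time = (item["done"] - item["created"]).days   # idea -> done
cycle_time = (item["done"] - item["started"]).days  # work start -> done
print(f"lead time {lead_time} days, cycle time {cycle_time} days")
```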
  14. Control Chart
      Source: Microsoft Learn. (2023, Feb 25). Cumulative flow, lead time, and cycle time guidance.
      The lead times of completed work are plotted sequentially on a timeline (here March to May 2022); outliers stand out clearly, and the lead times tighten once the team begins to limit WIP.
  15. Distribution Chart
      Depict the range of observed lead times and their frequency of occurrence, with markers at the 70%, 80%, and 90% percentiles.
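The 70/80/90% markers on such a chart can be recovered from the raw observations with a nearest-rank percentile. A sketch (the sample lead times below are made up, not read off the deck):

```python
import math

# Observed lead times in days (illustrative sample).
lead_times = [2, 3, 3, 4, 4, 5, 5, 6, 7, 8, 9, 11, 14, 16, 20]

def percentile(data, pct):
    """Nearest-rank percentile: the smallest observed value such that
    at least pct% of all observations are less than or equal to it."""
    ordered = sorted(data)
    rank = math.ceil(pct / 100 * len(ordered))  # 1-based rank
    return ordered[rank - 1]

for p in (70, 80, 90):
    print(f"{p}% of items finished within {percentile(lead_times, p)} days")
```

These percentiles are more useful for forecasting than the average, because lead-time distributions are typically long-tailed.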
  16. Cumulative Flow Diagram (CFD)
      Show the quantity of work in each workflow stage (Backlog, Development, Testing, Done) over time; the bands between the lines separate To Do, WIP, and Done.
  17. Flow Velocity
      Source: Flow Framework. (n.d.). What is the Flow Framework?
      The number of Flow Items completed over a particular time period: with 12 flow items done between work start and work complete, Flow Velocity = 12.
  18. Flow Distribution
      Source: Flow Framework. (n.d.). What is the Flow Framework?
      The ratio of the four Flow Item types completed over a particular time period: of 16 flow items done, 25% Features, 43% Defects, 20% Debts, 12% Risks.
  19. Flow Load
      Source: Flow Framework. (n.d.). What is the Flow Framework?
      The number of Flow Items currently in progress in a value stream: with 30 flow items in progress between work start and work complete, Flow Load = 30.
  20. Flow Time
      Source: Flow Framework. (n.d.). What is the Flow Framework?
      The time it takes for Flow Items to go from "work start" to "work complete", counting both active states (Defining, In Design, In Dev, In QA) and waiting states (Waiting for triage, Ready, Dev Done, Waiting for approval): 30 days total, so Flow Time = 30 days.
  21. Flow Efficiency
      Source: Flow Framework. (n.d.). What is the Flow Framework?
      The ratio of active time to flow time: with 10 days in an active state out of 30 days total in active and waiting states, Flow Efficiency = 10 ÷ 30 = 33%.
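The same arithmetic can be run over an item's state history, assuming each stage is tagged as active or waiting. The stage durations below are invented so that the totals match the slide's 10 active days out of 30 overall:

```python
# (stage name, days spent, kind) for one item's journey; values illustrative.
stages = [
    ("Defining",             3, "active"),
    ("Waiting for triage",   6, "waiting"),
    ("In Design",            2, "active"),
    ("Ready",                5, "waiting"),
    ("In Dev",               3, "active"),
    ("Dev Done",             5, "waiting"),
    ("In QA",                2, "active"),
    ("Waiting for approval", 4, "waiting"),
]

flow_time = sum(days for _, days, _ in stages)                     # 30 days
active_time = sum(days for _, days, kind in stages if kind == "active")  # 10 days
flow_efficiency = active_time / flow_time
print(f"Flow Efficiency = {active_time} / {flow_time} = {flow_efficiency:.0%}")
```

Low flow efficiency points at queues, not at how hard people work: most of the 30 days here are spent waiting.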
  22. DevOps Research and Assessment
      Google's DORA team has run a research program since 2014 that validated a number of technical, process, measurement, and cultural capabilities that drive higher software delivery and organizational performance. (Nicole Forsgren, Jez Humble, Gene Kim)
  23. The Four Key Metrics: Speed
      Source: DevOps Research and Assessment. (2024). 2024 State of DevOps Report.
      Change Lead Time: the time it takes for a code commit or change to be successfully deployed to production. Deployment Frequency: how often application changes are deployed to production.
  24. The Four Key Metrics: Stability
      Source: DevOps Research and Assessment. (2024). 2024 State of DevOps Report.
      Change Fail Rate: the percentage of deployments that cause failures in production, requiring hotfixes or rollbacks. Failed Deployment Recovery Time: the time it takes to recover from a failed deployment.
  25. Performance Level
      Source: DevOps Research and Assessment. (2024). 2024 State of DevOps Report.
      Conduct a survey and analyze the answers using cluster analysis. Change Lead Time: Elite < 1 day; High 1 day ~ 1 week; Medium 1 week ~ 1 month; Low 1 month ~ 6 months. Deployment Frequency: Elite on demand (multiple times per day); High daily ~ weekly; Medium weekly ~ monthly; Low monthly ~ quarterly. Change Fail Rate: Elite 5%; High 20%; Medium 10%; Low 40%. Failed Deployment Recovery Time: Elite < 1 hour; High < 1 day; Medium < 1 day; Low 1 week ~ 1 month.
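As a sketch, one row of the performance-level table can be turned into a simple bucketing function. The thresholds are the change-lead-time boundaries quoted on the slide, and the function name is illustrative:

```python
def lead_time_level(days: float) -> str:
    """Bucket a team's change lead time (in days) into the DORA
    performance levels, using the thresholds quoted on the slide."""
    if days < 1:
        return "Elite"    # less than one day
    if days <= 7:
        return "High"     # one day to one week
    if days <= 30:
        return "Medium"   # one week to one month
    return "Low"          # one month to six months

for d in (0.5, 3, 14, 90):
    print(f"{d} days -> {lead_time_level(d)}")
```

In practice DORA derives the levels by clustering survey responses, as the slide notes; a threshold lookup like this is only a convenient approximation for dashboards.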
  26. Compare the Elite Group Against the Low Group
      Source: DevOps Research and Assessment. (2024). 2024 State of DevOps Report.
      Change Lead Time: 127 times faster. Deployment Frequency: 182 times more frequent. Change Fail Rate: 8 times lower. Failed Deployment Recovery Time: 2293 times faster.
  27. SPACE Dimensions
      S: Satisfaction and Well-being, how fulfilled, happy, and healthy developers are. P: Performance, an outcome of a development process. A: Activity, the count of actions or outputs. C: Communication and Collaboration, how people talk and work together. E: Efficiency and Flow, doing work with minimal delays or interruptions.
  28. DORA as SPACE for Software Delivery
      Source: DORA. (2024). SPACE & DORA: Working Together, Dr. Nicole Forsgren (20:48). YouTube.
      Have metrics across at least 3 of the 5 dimensions for a given focus area. Mapping the DORA metrics onto SPACE dimensions: Change Lead Time to Efficiency and Flow; Deployment Frequency to Activity; Change Fail Rate to Performance; Failed Deployment Recovery Time to Efficiency and Flow.
  29. SPACE Framework in Action
      Source: ChariotSolutions. (2021). The SPACE of Developer Productivity: There’s More To It Than You Think — Dr. Nicole Forsgren (16:01). YouTube.
      Individual (one person): Satisfaction and Well-being (developer satisfaction; retention*; satisfaction with code reviews assigned; perception of code reviews); Performance (code review velocity); Activity (number of code reviews completed; coding time; # commits; lines of code*); Communication and Collaboration (code review score for quality or thoughtfulness; PR merge times; quality of meetings*; knowledge sharing and discoverability, e.g. quality of documentation); Efficiency and Flow (code review timing; productivity perception; lack of interruptions).
      Team or group (people that work together): Satisfaction and Well-being (developer satisfaction; retention*); Performance (code review velocity; story points shipped*); Activity (# story points completed*); Communication and Collaboration (PR merge times; quality of meetings*; knowledge sharing and discoverability); Efficiency and Flow (code review timing; handoffs).
      System (end-to-end work through a system): Satisfaction and Well-being (satisfaction with the engineering system, e.g. CI/CD pipeline); Performance (code review velocity; code review acceptance rate; customer satisfaction; reliability/uptime); Activity (frequency of deployments); Communication and Collaboration (knowledge sharing and discoverability); Efficiency and Flow (code review timing; velocity/flow through the system).
      * Use these metrics with (even more) caution – they can proxy more things.
  30. [Dashboard examples: a Bug Count chart grouped by month (Jan to Jun 2022, average 7) and a Bug Status chart showing Todo / Doing / Done counts for squads S1 through S4.]
  31. Created vs. Resolved Bugs
      A chart of created versus resolved bugs per month (Jan to Jun 2022): resolved counts flatten while created counts keep climbing, prompting the question "Why has bug resolution stalled?"
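A minimal way to surface such a stall from raw counts is to accumulate created minus resolved bugs per month; the monthly numbers below are invented for illustration, not read off the chart:

```python
# Monthly bug counts, Jan through Jun (illustrative values).
created  = [10, 12, 15, 18, 20, 22]
resolved = [ 9, 11, 14, 10,  8,  7]

# Running total of open bugs: a widening gap means resolution has stalled.
open_bugs, trend = 0, []
for c, r in zip(created, resolved):
    open_bugs += c - r
    trend.append(open_bugs)

print("open-bug trend:", trend)
```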
  32. Lead Time / Cycle Time / Age Time
      Per-squad values in days: S1 lead 10.1, cycle 2.9, age 15.3; S2 lead 13.8, cycle 2.4, age 24.2; S3 lead 13.2, cycle 3.2, age 41.5; S4 lead 19.2, cycle 3.2, age 39.8.
  33. Bug Distribution by Cause
      Source: Staff, Tricentis. (2016). 64 essential testing metrics for measuring quality assurance success. Tricentis.
      The same cause data (user data entry, code errors, architecture, environment, design) shown two ways: a pie chart of each cause's share, and a Pareto chart ranking causes by count with a cumulative-percentage line.
  34. Engineering Productivity Platform
      Source: Photo by OpenAI on ChatGPT
      Collect, analyze, and visualize the data from DevOps tools to help teams identify bottlenecks and make improvements.
  35. Benefits of an Engineering Productivity Platform
      Source: https://devlake.apache.org/
      Defragment Your Data Silos. Out-of-the-Box Analysis. Flexible. Extensible. Adaptable.
  36. True Productivity Is More Than Metrics
      Everyone measures something. But productivity is not about measuring more. It's about seeing the whole picture.
  37. Kersten, M. (2018). Project to Product: How to Survive and Thrive in the Age of Digital Disruption with the Flow Framework. IT Revolution Press.
  38. Forsgren, N., Humble, J., & Kim, G. (2018). Accelerate: The Science of Lean Software and DevOps. IT Revolution Press.