Data Ninja III: The Rise of Data Scientists in the Software Industry

Thomas Zimmermann
September 24, 2015

Keynote presented at SBES 2015, Belo Horizonte, Brazil. September 2015.


Transcript

  1. © Microsoft Corporation The Rise of Data Scientists in the Software Industry. Thomas Zimmermann, Microsoft Research
  2. © Microsoft Corporation Ninja III: The Domination (1984). A telephone linewoman who teaches aerobics classes is possessed by an evil spirit of a fallen ninja when coming to his aid. The spirit seeks revenge on those who killed him and uses the female instructor's body to carry out his mission. The only way the spirit will leave the aerobic instructor's body is through combat with another ninja. (wikipedia.org)
  3. © Microsoft Corporation 2010-2012: Information Needs for Analytics Tools, Data Ninja I (ICSE 2012). 2012-2014: Questions that Software Engineers have for Data Scientists, Data Ninja II (ICSE 2014). 2014-now: Data Ninja III: The Emerging Role of Data Scientists, Technical Report.
  4. © Microsoft Corporation Customer feedback:
     • Bring back the "Up" button from Windows XP,
     • Add cut, copy, & paste into the top-level UI,
     • More customizable command surface, and
     • More keyboard shortcuts.
  5. © Microsoft Corporation [Branch diagram: main, networking, and multimedia branches, with integrations between them] Changes are isolated => fewer build and test breaks, at the cost of process overhead and time delay (velocity). Christian Bird, Thomas Zimmermann: Assessing the Value of Branches with What-if Analysis. FSE 2012. (development analytics)
  6. © Microsoft Corporation Code movement for a single file. Blue nodes are edits to the file; orange nodes are move operations.
  7. © Microsoft Corporation Simulation (what-if). [Diagram: Parent Branch, Victim Branch, Child Branch, before and after removing the victim branch] Once a branch is removed, its changes are no longer isolated, but code flows faster and unneeded integrations are removed.
  8. © Microsoft Corporation [Scatter plot: Delay (Cost) vs. Provided Isolation (Benefit); each dot is a branch] Green dots are branches with high benefit and low cost; red dots are branches with high cost but low benefit. If the high-cost, low-benefit branches had been removed, each change would have saved 8.9 days of transit time and introduced only 0.04 additional conflicts.
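The what-if analysis summarized on slides 7 and 8 can be approximated with a small simulation: replay each change against the branch tree with and without the candidate branches, then compare transit time and conflicts. A minimal sketch of that idea follows; the branch names, delays, and conflict rates are invented, and the FSE 2012 paper replays the real integration history rather than per-branch averages.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Branch:
    name: str
    parent: str | None
    delay_days: float            # average time a change waits on this branch before integrating
    conflicts_prevented: float   # conflicts per change that this branch's isolation avoids

# Hypothetical branch tree with per-branch cost (delay) and benefit (prevented conflicts).
branches = {
    "main":       Branch("main", None, 0.0, 0.0),
    "networking": Branch("networking", "main", 4.5, 0.30),
    "multimedia": Branch("multimedia", "main", 6.2, 0.01),  # high cost, low benefit
}

def transit_days(branch_name: str, removed: set[str]) -> float:
    """Days a change needs to reach main, skipping branches that were removed."""
    days, node = 0.0, branches[branch_name]
    while node.parent is not None:
        if node.name not in removed:
            days += node.delay_days
        node = branches[node.parent]
    return days

def what_if(removed: set[str]) -> tuple[float, float]:
    """Average days saved per change and extra conflicts introduced by removing branches."""
    leaves = [b for b in branches.values() if b.parent is not None]
    saved = mean(transit_days(b.name, set()) - transit_days(b.name, removed) for b in leaves)
    extra_conflicts = sum(branches[r].conflicts_prevented for r in removed)
    return saved, extra_conflicts

print(what_if({"multimedia"}))  # e.g. (3.1, 0.01): big speed-up, almost no new conflicts
```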
  9. © Microsoft Corporation History of software analytics. Tim Menzies, Thomas Zimmermann: Software Analytics: So What? IEEE Software 30(4): 31-37 (2013)
  10. © Microsoft Corporation Alberto Bacchelli, Olga Baysal, Ayse Bener, Aditya Budi, Bora Caglayan, Gul Calikli, Joshua Charles Campbell, Jacek Czerwonka, Kostadin Damevski, Madeline Diep, Robert Dyer, Linda Esker, Davide Falessi, Xavier Franch, Thomas Fritz, Nikolas Galanis, Marco Aurélio Gerosa, Ruediger Glott, Michael W. Godfrey, Alessandra Gorla, Georgios Gousios, Florian Groß, Randy Hackbarth, Abram Hindle, Reid Holmes, Lingxiao Jiang, Ron S. Kenett, Ekrem Kocaguneli, Oleksii Kononenko, Kostas Kontogiannis, Konstantin Kuznetsov, Lucas Layman, Christian Lindig, David Lo, Fabio Mancinelli, Serge Mankovskii, Shahar Maoz, Daniel Méndez Fernández, Andrew Meneely, Audris Mockus, Murtuza Mukadam, Brendan Murphy, Emerson Murphy-Hill, John Mylopoulos, Anil R. Nair, Maleknaz Nayebi, Hoan Nguyen, Tien Nguyen, Gustavo Ansaldi Oliva, John Palframan, Hridesh Rajan, Peter C. Rigby, Guenther Ruhe, Michele Shaw, David Shepherd, Forrest Shull, Will Snipes, Diomidis Spinellis, Eleni Stroulia, Angelo Susi, Lin Tan, Ilaria Tavecchia, Ayse Tosun Misirli, Mohsen Vakilian, Stefan Wagner, Shaowei Wang, David Weiss, Laurie Williams, Hamzeh Zawawy, and Andreas Zeller
  11. © Microsoft Corporation Trinity of software analytics. Dongmei Zhang, Shi Han, Yingnong Dang, Jian-Guang Lou, Haidong Zhang, Tao Xie: Software Analytics in Practice. IEEE Software 30(5): 30-37, September/October 2013. MSR Asia Software Analytics group: http://research.microsoft.com/en-us/groups/sa/
  12. © Microsoft Corporation Obsessing over our customers is everybody's job. I'm looking to the engineering teams to build the experiences our customers love. […] In order to deliver the experiences our customers need for the mobile-first and cloud-first world, we will modernize our engineering processes to be customer-obsessed, data-driven, speed-oriented and quality-focused. http://news.microsoft.com/ceo/bold-ambition/index.html
  13. © Microsoft Corporation Each engineering group will have Data and Applied Science resources that will focus on measurable outcomes for our products and predictive analysis of market trends, which will allow us to innovate more effectively. http://news.microsoft.com/ceo/bold-ambition/index.html
  14. © Microsoft Corporation 2010-2012: Information Needs for Analytics Tools, Data Ninja I (ICSE 2012). 2012-2014: Questions that Software Engineers have for Data Scientists, Data Ninja II (ICSE 2014). 2014-now: Data Ninja III: The Emerging Role of Data Scientists, Technical Report.
  15. © Microsoft Corporation 2010-2012: Information Needs for Analytics Tools, Data Ninja I (ICSE 2012). 2012-2014: Questions that Software Engineers have for Data Scientists, Data Ninja II (ICSE 2014). 2014-now: Data Ninja III: The Emerging Role of Data Scientists, Technical Report.
  16. © Microsoft Corporation Raymond P. L. Buse, Thomas Zimmermann: Information needs for software development analytics. ICSE 2012: 987-996. Ray Buse
  17. © Microsoft Corporation Guidelines for analytics:
     Be easy to use. People aren't always analysis experts.
     Be concise. People have little time.
     Measure many artifacts with many indicators.
     Identify important/unusual items automatically.
     Relate activity to features/areas.
     Focus on past & present over future.
     Recognize that developers and managers have different needs.
     Information Needs for Software Development Analytics. Ray Buse, Thomas Zimmermann. ICSE 2012 SEIP Track
  18. © Microsoft Corporation Information Needs for Software Development Analytics. Ray Buse, Thomas Zimmermann. ICSE 2012 SEIP Track
     Summarization: search for important or unusual factors associated with a time range. Insight: characterize events, understand why they happened. Techniques: topic analysis, NLP.
     Alerts (& Correlations): continuous search for unusual changes or relationships in variables. Insight: notice important events. Techniques: statistics, repeated measures.
     Forecasting: search for and predict unusual events in the future based on current trends. Insight: anticipate events. Techniques: extrapolation, statistics.
     Trends: how is an artifact changing? Insight: understand the direction of the project. Techniques: regression analysis.
     Overlays: what artifacts account for current activity? Insight: understand the relationships between artifacts. Techniques: cluster analysis, repository mining.
     Goals: how are features/artifacts changing in the context of completion or some other goal? Insight: assistance for planning. Techniques: root-cause analysis.
     Modeling: compares the abstract history of similar artifacts. Insight: identify important factors in history, learn from previous projects. Techniques: machine learning.
     Benchmarking: identify vectors of similarity/difference across artifacts. Insight: assistance for resource allocation and many other decisions. Techniques: statistics.
     Simulation: simulate changes based on other artifact models. Insight: assistance for general decisions. Techniques: what-if? analysis.
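To make one row of this table concrete, the "Trends" entry amounts to fitting a simple regression to an indicator and reading off the slope. A minimal sketch, with made-up weekly bug counts (not data from the paper):

```python
import numpy as np

# "Trends": how is an artifact changing? Fit a line and read the slope.
open_bugs = np.array([120, 118, 125, 131, 140, 138, 149, 155])  # hypothetical weekly counts
weeks = np.arange(len(open_bugs))

slope, intercept = np.polyfit(weeks, open_bugs, deg=1)  # least-squares linear trend
print(f"trend: {slope:+.1f} bugs/week")  # positive slope here: the backlog is growing
```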
  19. © Microsoft Corporation 2010-2012: Information Needs for Analytics Tools, Data Ninja I (ICSE 2012). 2012-2014: Questions that Software Engineers have for Data Scientists, Data Ninja II (ICSE 2014). 2014-now: Data Ninja III: The Emerging Role of Data Scientists, Technical Report.
  20. © Microsoft Corporation Andrew Begel, Thomas Zimmermann: Analyze this! 145 questions for data scientists in software engineering. ICSE 2014. Andrew Begel
  21. © Microsoft Corporation It Will Never Work in Theory: Ten Questions for Researchers. Posted Aug 22, 2012 by Greg Wilson. I gave the opening talk at MSR Vision 2020 in Kingston on Monday (slides), and in the wake of that, an experienced developer at Mozilla sent me a list of ten questions he'd really like empirical software engineering researchers to answer. They're interesting in their own right, but I think they also reveal a lot about what practitioners want from researchers in general; comments would be very welcome.
     1. Vi vs. Emacs vs. graphical editors/IDEs: which makes me more productive?
     2. Should language developers spend their time on tools, syntax, library, or something else (like speed)? What makes the most difference to their users?
     3. Do unit tests save more time in debugging than they take to write/run/keep updated?
  22. © Microsoft Corporation
     3. Do unit tests save more time in debugging than they take to write/run/keep updated?
     4. Do distributed version control systems offer any advantages over centralized version control systems? (As a sub-question, Git or Mercurial: which helps me make fewer mistakes/shows me the info I need faster?)
     5. What are the best debugging techniques?
     6. Is it really twice as hard to debug as it is to write the code in the first place?
     7. What are the differences (bug count, code complexity, size, etc.), if any, between community-driven open source projects and corporate-controlled open source projects?
     8. If 10,000-line projects don't benefit from architecture, but 100,000-line projects do, what do you do when your project slowly grows from the first size to the second?
     9. When does it make sense to reinvent the wheel vs. use an existing library?
     10. Are conferences worth the money? How much do they help junior/intermediate/senior programmers?
  23. © Microsoft Corporation Raw questions (provided by the respondents): “How does the quality of software change over time – does software age? I would use this to plan the replacement of components.”
  24. © Microsoft Corporation Raw questions (provided by the respondents):
     “How does the quality of software change over time – does software age? I would use this to plan the replacement of components.”
     “How do security vulnerabilities correlate to age / complexity / code churn / etc. of a code base? Identify areas to focus on for in-depth security review or re-architecting.”
  25. © Microsoft Corporation Raw questions (provided by the respondents):
     “How does the quality of software change over time – does software age? I would use this to plan the replacement of components.”
     “How do security vulnerabilities correlate to age / complexity / code churn / etc. of a code base? Identify areas to focus on for in-depth security review or re-architecting.”
     “What will the cost of maintaining a body of code or particular solution be? Software is rarely a fire and forget proposition but usually has a fairly predictable lifecycle. We rarely examine the long term cost of projects and the burden we place on ourselves and SE as we move forward.”
  26. © Microsoft Corporation Raw questions (provided by the respondents):
     “How does the quality of software change over time – does software age? I would use this to plan the replacement of components.”
     “How do security vulnerabilities correlate to age / complexity / code churn / etc. of a code base? Identify areas to focus on for in-depth security review or re-architecting.”
     “What will the cost of maintaining a body of code or particular solution be? Software is rarely a fire and forget proposition but usually has a fairly predictable lifecycle. We rarely examine the long term cost of projects and the burden we place on ourselves and SE as we move forward.”
     Descriptive question (which we distilled): How does the age of code affect its quality, complexity, maintainability, and security?
  27. © Microsoft Corporation ❷ Discipline: Development, Testing, Program Management. Region: Asia, Europe, North America, Other. Number of Full-Time Employees. Current Role: Manager, Individual Contributor. Years as Manager. Has Management Experience: yes, no. Years at Microsoft.
  28. © Microsoft Corporation Microsoft’s Top 10 Questions (Essential / Essential + Worthwhile):
     How do users typically use my application? (80.0% / 99.2%)
     What parts of a software product are most used and/or loved by customers? (72.0% / 98.5%)
     How effective are the quality gates we run at checkin? (62.4% / 96.6%)
     How can we improve collaboration and sharing between teams? (54.5% / 96.4%)
     What are the best key performance indicators (KPIs) for monitoring services? (53.2% / 93.6%)
     What is the impact of a code change or requirements change to the project and its tests? (52.1% / 94.0%)
     What is the impact of tools on productivity? (50.5% / 97.2%)
     How do I avoid reinventing the wheel by sharing and/or searching for code? (50.0% / 90.9%)
     What are the common patterns of execution in my application? (48.7% / 96.6%)
     How well does test coverage correspond to actual code usage by our customers? (48.7% / 92.0%)
  29. © Microsoft Corporation Microsoft’s 10 Most Unwise Questions (% Unwise):
     Which individual measures correlate with employee productivity (e.g. employee age, tenure, engineering skills, education, promotion velocity, IQ)? (25.5%)
     Which coding measures correlate with employee productivity (e.g. lines of code, time it takes to build software, particular tool set, pair programming, number of hours of coding per day, programming language)? (22.0%)
     What metrics can be used to compare employees? (21.3%)
     How can we measure the productivity of a Microsoft employee? (20.9%)
     Is the number of bugs a good measure of developer effectiveness? (17.2%)
     Can I generate 100% test coverage? (14.4%)
     Who should be in charge of creating and maintaining a consistent company-wide software process and tool chain? (12.3%)
     What are the benefits of a consistent, company-wide software process and tool chain? (10.4%)
     When are code comments worth the effort to write them? (9.6%)
     How much time and money does it cost to add customer input into your design? (8.3%)
  30. © Microsoft Corporation Discipline Differences (Essential %; Dev / Test / PM):
     How many new bugs are introduced for every bug that is fixed? (27.3% / 41.9% / 12.5%)
     When should we migrate our code from one version of a library to the next? (32.6% / 16.7% / 5.1%)
     How much value do customers place on backward compatibility? (14.3% / 47.1% / 18.3%)
     What is the tradeoff between frequency and high quality when releasing software? (22.9% / 48.5% / 14.5%)
     Role Differences (Essential %; Manager / Individual Contributor):
     How much legacy code is in my codebase? (36.7% / 65.2%)
     When in the development cycle should we test performance? (63.3% / 81.4%)
     How can we measure the productivity of a Microsoft employee? (57.1% / 77.3%)
     What are the most commonly used tools on our software team? (95.8% / 67.8%)
  31. © Microsoft Corporation Region Differences (Essential %; Asia / Europe / North America):
     How can we measure the productivity of a Microsoft employee? (52.9% / 30.0% / 11.0%)
     How do software methodologies affect the success and customer satisfaction of shrink wrapped and service-oriented products? (52.9% / 10.0% / 24.7%)
     Can I generate 100% test coverage? (60.0% / 0.0% / 9.0%)
     What is the effectiveness, reliability, and cost of automated testing? (71.4% / 12.5% / 23.6%)
     Mgmt Experience Differences (Years as Manager; change in odds per year):
     How much cloned code is ok to have in my codebase? (Essential) 36%
     How does the age of code affect its quality, complexity, maintainability, and security? (Essential + Worthwhile) -28%
  32. © Microsoft Corporation MSFT Experience Differences (Years at Microsoft; change in odds per year):
     What criteria should we use to decide when to use managed code or native code (e.g., speed, productivity, functionality, newer language features, code quality)? (Essential) -23%
     What are the best tools and processes for sharing knowledge and task status? (Essential) -18%
     Should we do Test-Driven Development? (Essential) -19%
     How much distinction should there be between developer and tester roles? (Essential + Worthwhile) -14%
     Who should write unit tests, developers or testers? (Essential + Worthwhile) -13%
     How much time went into testing vs. development? (Essential + Worthwhile) -12%
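The "change in odds per year" figures on slides 31 and 32 read like exponentiated coefficients from a logistic regression of each rating on years of experience. A minimal sketch of how such a number can be computed, assuming scikit-learn and a toy data set; the paper's exact modelling choices are not reproduced here.

```python
import math
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy data: years at Microsoft vs. whether a respondent rated some question "Essential".
years = np.array([[1], [2], [3], [5], [7], [8], [10], [12], [15], [20]])
rated_essential = np.array([1, 1, 1, 1, 0, 1, 0, 0, 0, 0])

model = LogisticRegression().fit(years, rated_essential)
odds_change_per_year = math.exp(model.coef_[0][0]) - 1.0  # negative here: odds drop each year
print(f"change in odds per year: {odds_change_per_year:+.0%}")
```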
  33. © Microsoft Corporation 2010-2012: Information Needs for Analytics Tools, Data Ninja I (ICSE 2012). 2012-2014: Questions that Software Engineers have for Data Scientists, Data Ninja II (ICSE 2014). 2014-now: Data Ninja III: The Emerging Role of Data Scientists, Technical Report.
  34. © Microsoft Corporation Miryung Kim, Thomas Zimmermann, Robert DeLine, Andrew Begel: The Emerging Role of Data Scientists on Software Development Teams. Microsoft Research Technical Report MSR-TR-2015-30, April 2015. Miryung Kim, Robert DeLine, Andrew Begel
  35. © Microsoft Corporation Methodology
     • Interviews with 16 participants: 5 women and 11 men from eight different organizations at Microsoft
     • Snowball sampling: data-driven engineering meet-ups and technical community meetings; word of mouth
     • Coding with Atlas.TI
     • Clustering of participants
  36. © Microsoft Corporation Background of Data Scientists. Most CS, many interdisciplinary backgrounds. Many have higher education degrees. Strong passion for data: I love data, looking and making sense of the data. [P2] I’ve always been a data kind of guy. I love playing with data. I’m very focused on how you can organize and make sense of data and being able to find patterns. I love patterns. [P14] “Machine learning hackers”. Need to know stats: My people have to know statistics. They need to be able to answer sample size questions, design experiment questions, know standard deviations, p-value, confidence intervals, etc.
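The sample-size and confidence-interval questions mentioned in that quote have textbook answers; a minimal sketch using the standard normal-approximation formulas (all numbers are invented):

```python
import math

Z_95 = 1.96  # z-score for a 95% confidence level

def sample_size(margin_of_error: float, p: float = 0.5) -> int:
    """Respondents needed so a proportion estimate is within +/- margin_of_error (95% CI)."""
    return math.ceil((Z_95 ** 2) * p * (1 - p) / margin_of_error ** 2)

def confidence_interval(successes: int, n: int) -> tuple[float, float]:
    """Normal-approximation 95% confidence interval for an observed proportion."""
    p_hat = successes / n
    half_width = Z_95 * math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat - half_width, p_hat + half_width

print(sample_size(0.05))             # 385 respondents for a +/-5% margin of error
print(confidence_interval(120, 400)) # about (0.255, 0.345) if 120 of 400 users hit a feature
```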
  37. © Microsoft Corporation Background of Data Scientists. PhD training contributes to working style: It has never been, in my four years, that somebody came and said, “Can you answer this question?” I mostly sit around thinking, “How can I be helpful?” Probably that part of your PhD is you are figuring out what is the most important questions. [P13] I have a PhD in experimental physics, so pretty much, I am used to designing experiments. [P6] Doing data science is kind of like doing research. It looks like a good problem and looks like a good idea. You think you may have an approach, but then maybe you end up with a dead end. [P5]
  38. © Microsoft Corporation Activities of Data Scientists.
     Collection: data engineering platform; telemetry injection; experimentation platform.
     Analysis: data merging and cleaning; sampling; data shaping including selecting and creating features; defining sensible metrics; building predictive models; defining ground truths; hypothesis testing.
     Use and Dissemination: operationalizing predictive models; defining actions and triggers; translating insights and models to business values.
  39. © Microsoft Corporation Insight Providers. Play an interstitial role between managers and engineers within a product group. Generate insights to support and guide their managers in decision making. Analyze product and customer data collected by the teams’ engineers. Strong background in statistics. Communication and coordination skills are key.
  40. © Microsoft Corporation Insight Providers. P2 worked on a product line to inform managers who needed to know whether an upgrade was of sufficient quality to push to all products in the family: It should be as good as before. It should not deteriorate any performance, customer user experience that they have. Basically people shouldn’t know that we’ve even changed [it].
  41. © Microsoft Corporation Insight Providers. Getting data from engineers: I basically tried to eliminate from the vocabulary the notion of “You can just throw the data over the wall ... She’ll figure it out.” There’s no such thing. I’m like, “Why did you collect this data? Why did you measure it like that? Why did you measure this many samples, not this many? Where did this all come from?”
  42. © Microsoft Corporation Insight Providers. Define actions and triggers: You need to think about, “If you find this anomaly, then what?” Just finding an anomaly is not very actionable. What I do also involves thinking, “These are the anomalies I want them to detect. Based on these anomalies, I’m going to stop the build. I’m going to communicate to the customer and ask them to fix something on their side.” Translate findings to concepts familiar to stakeholders’ decisions. Weekly data meet-ups.
  43. © Microsoft Corporation Modelling Specialists. Data scientists who act as expert consultants. Build predictive models that can be instantiated as new software features and support other teams’ data-driven decision making. Strong background in machine learning. Other forms of expertise such as survey design or statistics would fit as well.
  44. © Microsoft Corporation Modelling Specialists. P7 is an expert in time series analysis and works with a team on automatically detecting anomalies in their telemetry data: The [Program Managers] and the Dev Ops from that team... through what they daily observe, come up with a new set of time series data that they think has the most value and then they will point us to that, and we will try to come up with an algorithm or with a methodology to find the anomalies for that set of time series.
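P7's anomaly-detection work is described only at a high level. One common, robust baseline for this kind of telemetry is to flag points that sit far from a trailing median, measured in median absolute deviations (MAD); the sketch below assumes that approach and made-up latency values, not the team's actual algorithm.

```python
import numpy as np

def mad_anomalies(values, window=12, threshold=4.0):
    """Flag indices whose value is more than `threshold` MADs from the trailing median."""
    values = np.asarray(values, dtype=float)
    flags = []
    for i in range(window, len(values)):
        history = values[i - window:i]
        median = np.median(history)
        mad = np.median(np.abs(history - median)) or 1e-9  # avoid division by zero
        if abs(values[i] - median) / mad > threshold:
            flags.append(i)
    return flags

latency_ms = [101, 99, 103, 98, 100, 102, 97, 101, 99, 100, 102, 98, 250, 101, 99]
print(mad_anomalies(latency_ms))  # [12]: the 250 ms spike stands out
```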
  45. © Microsoft Corporation Modelling Specialists. Defining ground truth takes time: You have communication going back and forth where you will find what you’re actually looking for, what is anomalous and what is not anomalous in the set of data that they looked at. Operationalization is important: They accepted [the model] and they understood all the results and they were very excited about it. Then, there’s a phase that comes in where the actual model has to go into production. … You really need to have somebody who is confident enough to take this from a dev side of things.
  46. © Microsoft Corporation Modelling Specialists. Translate findings into business values: In terms of convincing, if you just present all these numbers like precision and recall factors… that is important from the knowledge sharing model transfer perspective. But if you are out there to sell your model or ideas, this will not work because the people who will be in the decision-making seat will not be the ones doing the model transfer. So, for those people, what we did is cost benefit analysis where we showed how our model was adding the new revenue on top of what they already had.
  47. © Microsoft Corporation Platform Builders. Build data engineering platforms that are reusable in many contexts. Strong background in big data systems. Make trade-offs between engineering and scientific concerns.
  48. © Microsoft Corporation Platform Builders. P4 worked on a platform to collect crash data: You come up with something called a bucket feed. It is a name of a function most likely responsible for the crash in the small bucket. We found in the source code who touched this function last. He gets the bug. And we filed [large] numbers a year with [a high] percent fix rate.
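P4's pipeline (bucket crashes by the function most likely responsible, then assign the bug to whoever last touched that function) can be sketched roughly as follows. The data model is hypothetical and the `git log -S` lookup is just one plausible way to find the last author; it is not necessarily how Microsoft's crash-reporting system works, and the sketch assumes it runs inside the product's git checkout.

```python
import subprocess
from collections import Counter

def bucket_crashes(crash_reports):
    """Group crash reports by the top stack frame (the 'bucket') and count hits."""
    return Counter(report["top_frame"] for report in crash_reports)

def last_author(repo_path, function_name):
    """Find who most recently touched a function, using git's pickaxe search (-S)."""
    out = subprocess.run(
        ["git", "-C", repo_path, "log", "-1", "--format=%an <%ae>", "-S", function_name],
        capture_output=True, text=True, check=True,
    )
    return out.stdout.strip() or None

# Hypothetical crash feed: each report already carries the most likely guilty function.
crashes = [
    {"top_frame": "VideoDecoder::Flush"},
    {"top_frame": "VideoDecoder::Flush"},
    {"top_frame": "NetStack::Retry"},
]

for frame, hits in bucket_crashes(crashes).most_common():
    owner = last_author(".", frame.split("::")[-1])
    print(f"{frame}: {hits} crashes -> file bug against {owner or 'unknown'}")
```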
  49. © Microsoft Corporation Platform Builders. Data quality and cleaning is very important. Often use triangulation: If you could survey everybody every ten minutes, you don’t need telemetry. The most accurate is to ask everybody all the time. The only reason we do telemetry is that [asking people all the time] is slow and by the time you got it, you’re too late. So you can consider telemetry and data an optimization. So what we do typically is 10% are surveyed and we get telemetry. And then we calibrate and infer what the other 90% have said. Define intuitive measurements.
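The calibration P4 describes (survey roughly 10% of users, collect telemetry for everyone, then infer what the unsurveyed 90% would have said) is, in its simplest form, a per-segment extrapolation. A minimal sketch under that reading; the segments, counts, and rates are invented.

```python
# Calibrate telemetry against a surveyed subsample, then extrapolate to everyone:
# for users who both answered the survey and appear in telemetry, estimate
# P(satisfied | telemetry segment), then apply that rate to the unsurveyed population.

surveyed = [  # (telemetry_segment, said_satisfied)
    ("heavy_user", True), ("heavy_user", True), ("heavy_user", False),
    ("light_user", True), ("light_user", False), ("light_user", False),
]
telemetry_counts = {"heavy_user": 7000, "light_user": 3000}  # all users, surveyed or not

def satisfaction_by_segment(surveyed_pairs):
    """Per-segment satisfaction rate from the surveyed subsample."""
    rates = {}
    for segment in {s for s, _ in surveyed_pairs}:
        answers = [ok for s, ok in surveyed_pairs if s == segment]
        rates[segment] = sum(answers) / len(answers)
    return rates

rates = satisfaction_by_segment(surveyed)
estimated_satisfied = sum(rates[seg] * count for seg, count in telemetry_counts.items())
print(rates)                       # roughly {'heavy_user': 0.67, 'light_user': 0.33}
print(round(estimated_satisfied))  # inferred satisfied users across the full population
```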
  50. © Microsoft Corporation Polymaths. Data scientists who “do it all”:
     − Forming a business goal
     − Instrumenting a system to collect data
     − Doing necessary analyses or experiments
     − Communicating the results to managers
  51. © Microsoft Corporation Polymaths. P13 works on a product that serves ads and explores her own ideas for new data models: So I am the only scientist on this team. I'm the only scientist on sort of sibling teams and everybody else around me are like just straight-up engineers. For months at a time I'll wear a dev hat and I actually really enjoy that, too. ... I spend maybe three months doing some analysis and maybe three months doing some coding that is to integrate whatever I did into the product. … I do really, really like my role. I love the flexibility that I can go from being developer to being an analyst and kind of go back and forth.
  52. © Microsoft Corporation Team Leaders. Senior data scientists who typically run their own data science teams. Act as data science “evangelists”, pushing for the adoption of data-driven decision making. Work with senior company leaders to inform broad business decisions.
  53. © Microsoft Corporation Team Leaders. P10 and his team of data scientists estimated the number of bugs that would remain open when a product was scheduled to ship: When the leadership saw this gap [between the estimated bug count and the goal], the allocation of developers towards new features versus stabilization shifted away from features toward stabilization to get this number back. Sometimes people who are real good with numbers are not as good with words (laughs), and so having an intermediary to sort of handle the human interfaces between the data sources and the data scientists, I think, is a way to have a stronger influence. [Acting as] an intermediary so that the scientists can kind of stay focused on the data.
  54. © Microsoft Corporation Team Leaders. Choose the right questions for the right team: (a) Is it a priority for the organization? (b) Is it actionable: if I get an answer to this, is this something someone can do something with? And (c), are you as the feature team — if you're coming to me or if I'm going to you, telling you this is a good opportunity — are you committing resources to deliver a change? If those things are not true, then it's not worth us talking anymore.
  55. © Microsoft Corporation Team Leaders. Work closely with consumers from day one: You begin to find out, you begin to ask questions, you begin to see things. And so you need that interaction with the people that own the code, if you will, or the feature, to be able to learn together as you go and refine your questions and refine your answers to get to the ultimate insights that you need.
  56. © Microsoft Corporation Team Leaders. Explain the findings in simple terms: A super smart data scientist, their understanding and presentation of their findings is usually way over the head of the managers… so my guidance to [data scientists] is dumb everything down to seventh-grade level, right? And whether you're writing or you're presenting charts, you know, keep it simple.
  57. © Microsoft Corporation Researchers: Data scientists are *now* in software teams. They need your help! Better techniques to analyze data. New tools to automate the collection, analysis, and validation of data. Translate research findings so that they can be easily consumed by industry. Learn success strategies from data scientists.
  58. © Microsoft Corporation Practitioners: Don't be afraid of data scientists. Share experiences with data science in your company to help others get started. Training of existing employees.
  59. © Microsoft Corporation Educators: We need more data scientists. :-) Data science is not always a distinct role on the team; it is a skillset that often blends with other skills such as software development. Data science requires many different skills. Communication skills are very important. Data scientists are very similar to researchers.
  60. © Microsoft Corporation FSE 2016: 24th ACM SIGSOFT International Symposium on the Foundations of Software Engineering. Seattle, WA, USA, November 13-19, 2016