class: middle

# The Impact of Continuous Code Quality Assessment on Defects

#### ICSME'21 - NIER Track

Helge Pfeiffer, Assistant Professor,
[Research Center for Government IT](https://www.itu.dk/forskning/institutter/institut-for-datalogi/forskningscenter-for-offentlig-it),
[IT University of Copenhagen, Denmark](https://www.itu.dk)
`ropf@itu.dk`

---

## Aspects of SW Quality

Previous studies demonstrate the impact of various aspects on software quality.

* Organizational aspects impact defect rates:
  - organizational structure ([Nagappan et al. 2008](https://dl.acm.org/doi/pdf/10.1145/1368088.1368160))
  - code ownership ([Bird et al. 2011](https://dl.acm.org/doi/pdf/10.1145/2025113.2025119))
  - adherence to processes ([Herbsleb et al. 1994](https://apps.dtic.mil/sti/pdfs/ADA283848.pdf), [Diaz et al. 1997](http://faculty.salisbury.edu/~xswang/Courses/csc425_426/ReadingMaterial/Process_Improvement_Motorola_IEEE_Software_Sep1997.pdf))

--

* Socio-technical aspects impact defect rates, requests for assistance, or user-perceived quality:
  - code review coverage and code review participation ([McIntosh et al. 2014](https://dl.acm.org/doi/pdf/10.1145/2597073.2597076))
  - branching strategies applied during development ([Shihab et al. 2012](https://dl.acm.org/doi/pdf/10.1145/2372251.2372305))
  - deployment schedules, hardware configurations, and software platforms ([Mockus et al. 2005](https://dl.acm.org/doi/pdf/10.1145/1062455.1062506))

--

* Technical aspects impact defects or customer-perceived software quality:
  - size of the software ([Lavallée et al. 2015](https://fac.ksu.edu.sa/sites/default/files/why_good_developers_write_bad_code-_an_observational_case_study_of_the_impacts_of_organizational_factors_on_software_quality.pdf))
  - application of certain design patterns ([Khomh et al. 2008](https://www.researchgate.net/profile/Yann-Gael-Gueheneuc/publication/4330216_Do_Design_Patterns_Impact_Software_Quality_Positively/links/561d229808ae50795afd7645/Do-Design-Patterns-Impact-Software-Quality-Positively.pdf))

---

## Perspectives of SW Quality

> * The **transcendental view** sees quality as something that can be recognized but not defined.
>
> * The **user view** sees quality as fitness for purpose.
>
> * The **manufacturing view** sees quality as conformance to specification.
>
> * The **product view** sees quality as tied to inherent characteristics of the product.
>
> * The **value-based view** sees quality as dependent on the amount a customer is willing to pay for it.
>
> [Kitchenham et al. _"Software Quality: The Elusive Target"_ 1996](https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.91.9555&rep=rep1&type=pdf)

---

## Motivation
> Imagine that your CEO’s aunt is also a customer. It’s not a big account; in fact, it’s tiny. But she makes his favorite pie, so her opinion matters more than it should. Unfortunately, the last release had a couple of bugs that mattered to her, so he’s been on the warpath ever since. He’s started ranting about quality and demanding numbers. He says that if you don’t come up with a way to measure quality and show improvement, he will. The glint in his eye says you won’t like it.
>
> Now what? Now it’s time for SonarQube, which will help you manage your code quality, instead of letting your code quality (and Aunt Betty) manage you.
>
> [Campbell et al. _"SonarQube in Action"_ 2014](https://www.manning.com/books/sonarqube-in-action)

---

## Motivation

Practitioners share this causal belief:

> It is well known that quality of code is in inversely proportional with Software bugs, as code quality goes down, the number of bugs increases. Thus, clean software is more likely to have less bugs than code of lower quality.
>
> [Bayer 2015](https://svenbayer.blog/2015/08/09/maintaining-high-code-quality-with-sonarqube/)
> [...] there is a principle that for all 30 rule infringements, 3 small and a significant Bug can be expected.
>
> [Gizycki](https://www.triology.de/en/blog-entries/statistical-code-analysis-with-sonarqube)

---

## Problem - Previous Work

Prior work presents inconsistent results regarding the impact of applying *automatic static analysis* tools on defects:

* Java development with FindBugs may decrease defects, e.g., [Hovemeyer et al. 2004](https://dl.acm.org/doi/pdf/10.1145/1052883.1052895), [Zheng et al. 2006](https://ieeexplore.ieee.org/abstract/document/1628970)

--

Studies that link the product view (code quality) with the user view (defect rates):

* show either no impact of such tools, e.g., [Wagner et al. 2008](https://wwwbroy.in.tum.de/publ/papers/icst08evaltool.pdf)
* or argue for their limited suitability, since a large proportion of defects is undetectable by code analysis, e.g., [Lauesen et al. 1998](https://ieeexplore.ieee.org/abstract/document/687949/).

---

## Research Question

Does the application of a Continuous Code Quality Assessment tool (SonarCloud) in five open-source projects actually reduce the number of defects that are reported for the software products?

---

# Case Selection
---

# Case Selection

* **Apache Daffodil**: an implementation of the Data Format Description Language.
* **Apache Groovy**: a dynamic and optionally typed JVM language.
* **Apache Hadoop Ozone**: a distributed object store for Hadoop.
* **Apache Karaf**: an application container and runtime.
* **Apache Ratis**: an implementation of the Raft consensus algorithm.
---

## Development of Weekly Defect Creation Rates

*(Plots of the weekly defect creation rates over time, one per case: Daffodil, Groovy, Hadoop Ozone, Karaf, and Ratis.)*
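As an illustration of how such weekly rates can be derived, the sketch below (not the study's exact pipeline) counts newly created defect tickets per calendar week from an issue-tracker export; the file name and column names (`ratis_issues.csv`, `created`, `issue_type`) are hypothetical.

```python
# Hypothetical sketch: weekly defect creation rate from an issue-tracker export.
# Assumed input: one row per ticket with a `created` timestamp and an
# `issue_type` column (e.g., "Bug").
import pandas as pd

issues = pd.read_csv("ratis_issues.csv", parse_dates=["created"])

# Keep only defect reports and count how many were created per week.
defects = issues[issues["issue_type"] == "Bug"]
weekly_rate = (
    defects.set_index("created")
    .resample("W")   # one bin per calendar week
    .size()          # number of new defect tickets in that week
    .rename("defects_per_week")
)

print(weekly_rate.head())
```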
---

## Closer investigation of Ratis' SC Tickets

*(Examples of Ratis tickets and the commits that address SonarCloud (SC) quality issues.)*
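For context, SonarCloud exposes the quality issues it reports through its public Web API (`api/issues/search`); the sketch below shows one way such issues could be retrieved for a project. It is illustrative only: the project key `apache_ratis` is an assumption, not necessarily the key used in the study.

```python
# Illustrative sketch: paging through SonarCloud quality issues for a project
# via the public Web API endpoint api/issues/search.
import requests

API = "https://sonarcloud.io/api/issues/search"
PROJECT_KEY = "apache_ratis"  # hypothetical component key

def fetch_issues(project_key, page_size=500):
    """Yield all issues SonarCloud reports for the given project."""
    page = 1
    while True:
        resp = requests.get(
            API,
            params={"componentKeys": project_key, "ps": page_size, "p": page},
        )
        resp.raise_for_status()
        data = resp.json()
        yield from data["issues"]
        # `paging.total` holds the overall number of matching issues.
        if page * page_size >= data["paging"]["total"]:
            break
        page += 1

for issue in fetch_issues(PROJECT_KEY):
    print(issue["key"], issue["type"], issue["severity"], issue["message"])
```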
---

# Conclusions

> *Research Question*: Does the application of a CCQA tool (SonarCloud) in five open-source projects actually reduce the number of defects that are reported for the software products?

* Adoption of SonarCloud does not lead to a decrease in the weekly defect creation rate for Daffodil, Groovy, and Karaf.
* For Hadoop Ozone, the decrease in the weekly defect creation rate is not statistically significant.
* For Ratis, our thorough inspection of tickets and commits does not reveal evidence that the decrease is actually caused by addressing SonarCloud quality issues.

Consequently, we cannot find evidence that increasing code quality (product view) leads to an increase in software quality from the user’s view.

---

# Links

* Deeper discussion in the paper pre-print: [http://itu.dk/~ropf/blog/assets/icsme2021_pfeiffer.pdf](http://itu.dk/~ropf/blog/assets/icsme2021_pfeiffer.pdf)
* Find the presentation online: [http://itu.dk/~ropf/presentations/icsme21.html](http://itu.dk/~ropf/presentations/icsme21.html)
* Find all source code and data online: [https://github.com/HelgeCPH/ccqa_effect_study](https://github.com/HelgeCPH/ccqa_effect_study)

---

class: middle

# Thank you for your attention

---

class: middle

# The Impact of Continuous Code Quality Assessment on Defects

#### ICSME'21 - NIER Track

Helge Pfeiffer, Assistant Professor,
[Research Center for Government IT](https://www.itu.dk/forskning/institutter/institut-for-datalogi/forskningscenter-for-offentlig-it),
[IT University of Copenhagen, Denmark](https://www.itu.dk)
`ropf@itu.dk`