Abstract
Our study evaluates the maintainability characteristic in the context of the long-term evolution of open-source software. According to well-established software quality models such as ISO 9126 and the more recent ISO 25010, maintainability remains a key quality characteristic alongside performance, security and reliability. To achieve our objective, we selected three complex, widely used target applications for which the entire development history and source code were available. To enable cross-application comparison, we restricted our selection to GUI-driven software developed on the Java platform. We focused our examination on released versions, resulting in 111 software releases included in our case study; these cover more than 10 years of development for each application. For each version, we determined its maintainability using three distinct quantitative models of varying complexity. We examined the relation between software size and maintainability and studied the main drivers of important changes to maintainability, contextualizing our findings through manual source code examination. We also carried out a finer-grained evaluation at package level to determine the distribution of maintainability issues within application source code. Finally, we provided a cross-application analysis to identify common as well as application-specific patterns.
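The abstract does not name the three quantitative maintainability models used; one model that frequently appears in this line of work (see the Oman & Hagemeister and Welker references below) is the Maintainability Index. The sketch below shows the classic three-metric MI formula for illustration only; the input values are made up and do not come from the study.

```python
import math


def maintainability_index(halstead_volume: float,
                          cyclomatic_complexity: float,
                          loc: int) -> float:
    """Classic three-metric Maintainability Index (Oman & Hagemeister,
    as popularized by Welker). Higher values indicate code that is
    easier to maintain; the result is unscaled (can exceed 100 or go
    below 0 for extreme inputs)."""
    return (171
            - 5.2 * math.log(halstead_volume)    # Halstead volume term
            - 0.23 * cyclomatic_complexity       # complexity term
            - 16.2 * math.log(loc))              # size (lines of code) term


# Hypothetical module: Halstead volume 1500, cyclomatic complexity 12, 400 LOC
mi = maintainability_index(halstead_volume=1500.0,
                           cyclomatic_complexity=12,
                           loc=400)
print(round(mi, 1))
```

Note that tool vendors rescale this formula differently (Visual Studio, for example, clamps it to a 0–100 range), which is one reason the study compares several models rather than relying on a single index.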
Notes
- 2. Found as design debt in some sources.
- 9. Download data points from https://sourceforge.net/; only application releases are considered. Recorded August 25, 2020.
- 10. Collected using the MetricsReloaded plugin for IntelliJ IDEA.
References
Al-Qutaish, R.E.: Quality models in software engineering literature: an analytical and comparative study. Technical report 3 (2010). http://www.americanscience.org
Almugrin, S., Albattah, W., Melton, A.: Using indirect coupling metrics to predict package maintainability and testability. J. Syst. Softw. 121, 298–310 (2016). https://doi.org/10.1016/j.jss.2016.02.024. http://www.sciencedirect.com/science/article/pii/S016412121600056X
ARISA Compendium, VizzMaintenance: Technical documentation of the VizzMaintenance metric extraction tool (2019). http://www.arisa.se/products.php?lang=en
Arlt, S., Banerjee, I., Bertolini, C., Memon, A.M., Schaf, M.: Grey-box GUI testing: efficient generation of event sequences. CoRR abs/1205.4928 (2012)
Avelino, G., Constantinou, E., Valente, M.T., Serebrenik, A.: On the abandonment and survival of open source projects: an empirical investigation. In: 2019 ACM/IEEE International Symposium on Empirical Software Engineering and Measurement (ESEM), pp. 1–12 (2019)
Barkmann, H., Lincke, R., Löwe, W.: Quantitative evaluation of software quality metrics in open-source projects. In: 2009 International Conference on Advanced Information Networking and Applications Workshops, pp. 1067–1072, May 2009. https://doi.org/10.1109/WAINA.2009.190
Basili, V.R., Briand, L.C., Melo, W.L.: A validation of object-oriented design metrics as quality indicators. IEEE Trans. Software Eng. 22(10), 751–761 (1996). https://doi.org/10.1109/32.544352
Basili, V.R., Caldiera, G., Rombach, H.D.: The goal question metric approach. Encycl. Softw. Eng. 528–532 (1994)
Chidamber, S., Kemerer, C.: A metrics suite for object-oriented design. IEEE Trans. Software Eng. 20(6), 476–493 (1994)
Counsell, S., et al.: Re-visiting the ‘Maintainability Index’ metric from an object-oriented perspective. In: 2015 41st Euromicro Conference on Software Engineering and Advanced Applications, pp. 84–87 (2015)
Cunningham, W.: The WyCash portfolio management system. SIGPLAN OOPS Mess. 4(2), 29–30 (1992). https://doi.org/10.1145/157710.157715. http://doi.acm.org/10.1145/157710.157715
van Deursen, A.: Think twice before using the maintainability index (2014). https://avandeursen.com/2014/08/29/think-twice-before-using-the-maintainability-index/
Döhmen, T., Bruntink, M., Ceolin, D., Visser, J.: Towards a benchmark for the maintainability evolution of industrial software systems. In: 2016 Joint Conference of the International Workshop on Software Measurement and the International Conference on Software Process and Product Measurement (IWSM-MENSURA), pp. 11–21 (2016)
Emam, K.E., Benlarbi, S., Goel, N., Rai, S.N.: The confounding effect of class size on the validity of object-oriented metrics. IEEE Trans. Softw. Eng. 27(7), 630–650 (2001). https://doi.org/10.1109/32.935855
Fowler, M.: Technical debt (2019). https://martinfowler.com/bliki/TechnicalDebt.html
Gyimothy, T., Ferenc, R., Siket, I.: Empirical validation of object-oriented metrics on open source software for fault prediction. IEEE Trans. Software Eng. 31(10), 897–910 (2005). https://doi.org/10.1109/TSE.2005.112
Heitlager, I., Kuipers, T., Visser, J.: A practical model for measuring maintainability. In: Quality of Information and Communications Technology, 6th International Conference on the Quality of Information and Communications Technology, QUATIC 2007, Lisbon, Portugal, 12–14 September 2007, Proceedings, pp. 30–39 (2007). https://doi.org/10.1109/QUATIC.2007.8
Hynninen, T., Kasurinen, J., Taipale, O.: Framework for observing the maintenance needs, runtime metrics and the overall quality-in-use. J. Softw. Eng. Appl. 11, 139–152 (2018). https://doi.org/10.4236/jsea.2018.114009
ISO/IEC 25010: Software quality standards (2011). http://www.iso.org
ISO/IEC 9126–1: Software quality characteristics (2001)
Lenarduzzi, V., Lomio, F., Huttunen, H., Taibi, D.: Are SonarQube rules inducing bugs? In: 2020 IEEE 27th International Conference on Software Analysis, Evolution and Reengineering (SANER) (2020). https://doi.org/10.1109/saner48275.2020.9054821. http://dx.doi.org/10.1109/SANER48275.2020.9054821
Letouzey, J.L.: The SQALE method for evaluating technical debt. In: Proceedings of the Third International Workshop on Managing Technical Debt, MTD 2012, pp. 31–36. IEEE Press (2012). http://dl.acm.org/citation.cfm?id=2666036.2666042
Li, W., Henry, S.: Maintenance metrics for the object oriented paradigm. In: IEEE Proceedings of the First International Software Metrics Symposium, pp. 52–60 (1993)
Metrics.NET library (2019). https://github.com/etishor/Metrics.NET
Lincke, R., Lundberg, J., Löwe, W.: Comparing software metrics tools. In: Proceedings of the 2008 International Symposium on Software Testing and Analysis - ISSTA 2008 (2008). https://doi.org/10.1145/1390630.1390648
Marcilio, D., Bonifácio, R., Monteiro, E., Canedo, E., Luz, W., Pinto, G.: Are static analysis violations really fixed? A closer look at realistic usage of SonarQube. In: Proceedings of the 27th International Conference on Program Comprehension, ICPC 2019, pp. 209–219. IEEE Press (2019). https://doi.org/10.1109/ICPC.2019.00040
Marinescu, R.: Measurement and quality in object oriented design. Ph.D. thesis, Faculty of Automatics and Computer Science, University of Timisoara (2002)
Marinescu, R.: Measurement and quality in object-oriented design. In: 21st IEEE International Conference on Software Maintenance (ICSM 2005), pp. 701–704, October 2005. https://doi.org/10.1109/ICSM.2005.63
Martini, A., Bosch, J., Chaudron, M.: Investigating architectural technical debt accumulation and refactoring over time. Inf. Softw. Technol. 67(C), 237–253 (2015). https://doi.org/10.1016/j.infsof.2015.07.005
Microsoft VS Docs (2020). https://docs.microsoft.com/en-us/visualstudio/code-quality/code-metrics-values
Molnar, A., Motogna, S.: Discovering maintainability changes in large software systems. In: Proceedings of the 27th International Workshop on Software Measurement and 12th International Conference on Software Process and Product Measurement, IWSM Mensura 2017, pp. 88–93. ACM, New York (2017). https://doi.org/10.1145/3143434.3143447. http://doi.acm.org/10.1145/3143434.3143447
Molnar, A.J.: Quantitative maintainability data for FreeMind, jEdit and TuxGuitar versions, September 2020. https://doi.org/10.6084/m9.figshare.12901331.v1. https://figshare.com/articles/dataset/Quantitative_maintainability_data_for_FreeMind_jEdit_and_TuxGuitar_versions/12901331
Molnar, A.J., Motogna, S.: Long-term evaluation of technical debt in open-source software (2020). https://dl.acm.org/doi/abs/10.1145/3382494.3410673
Molnar, A., Motogna, S.: Longitudinal evaluation of open-source software maintainability. In: Proceedings of the 15th International Conference on Evaluation of Novel Approaches to Software Engineering - Volume 1: ENASE, pp. 120–131. INSTICC, SciTePress (2020). https://doi.org/10.5220/0009393501200131
Molnar, A.-J., Neamţu, A., Motogna, S.: Evaluation of software product quality metrics. In: Damiani, E., Spanoudakis, G., Maciaszek, L.A. (eds.) ENASE 2019. CCIS, vol. 1172, pp. 163–187. Springer, Cham (2020). https://doi.org/10.1007/978-3-030-40223-5_8
Molnar, A., Neamţu, A., Motogna, S.: Longitudinal evaluation of software quality metrics in open-source applications. In: Proceedings of the 14th International Conference on Evaluation of Novel Approaches to Software Engineering - Volume 1: ENASE, pp. 80–91. INSTICC, SciTePress (2019). https://doi.org/10.5220/0007725600800091
Motogna, S., Vescan, A., Serban, C., Tirban, P.: An approach to assess maintainability change. In: 2016 IEEE International Conference on Automation, Quality and Testing, Robotics (AQTR), pp. 1–6 (2016). https://doi.org/10.1109/AQTR.2016.7501279
Oman, P., Hagemeister, J.: Metrics for assessing a software system’s maintainability. In: Proceedings Conference on Software Maintenance 1992, pp. 337–344 (1992). https://doi.org/10.1109/ICSM.1992.242525
Lincke, R., Löwe, W.: Compendium of Software Quality Standards and Metrics (2019). http://www.arisa.se/compendium/quality-metrics-compendium.html
Runeson, P., Höst, M.: Guidelines for conducting and reporting case study research in software engineering. Empir. Softw. Eng. (2009). https://doi.org/10.1007/s10664-008-9102-8
SonarSource: SonarQube (2019). https://www.sonarqube.org
Tang, M.H., Kao, M.H., Chen, M.H.: An empirical study on object-oriented metrics. In: Proceedings of the 6th International Symposium on Software Metrics, METRICS 1999, pp. 242–249. IEEE Computer Society, Washington (1999). http://dl.acm.org/citation.cfm?id=520792.823979
Virtual Machinery: Discussion on measuring the Maintainability Index (2019). http://www.virtualmachinery.com/sidebar4.htm
Welker, K.: Software Maintainability Index revisited. J. Defense Softw. Eng. (2001). https://www.osti.gov/biblio/912059
Xu, J., Ho, D., Capretz, L.F.: An empirical validation of object-oriented design metrics for fault prediction. J. Comput. Sci. 4, 571–577 (2008)
Yin, R.K.: Case Study Research and Applications - Design and Methods. SAGE Publishing, Thousand Oaks (2017)
Yuan, X., Memon, A.M.: Generating event sequence-based test cases using GUI run-time state feedback. IEEE Trans. Softw. Eng. 36(1), 81–95 (2010). http://doi.ieeecomputersociety.org/10.1109/TSE.2009.68
Copyright information
© 2021 Springer Nature Switzerland AG
Cite this paper
Molnar, AJ., Motogna, S. (2021). A Study of Maintainability in Evolving Open-Source Software. In: Ali, R., Kaindl, H., Maciaszek, L.A. (eds) Evaluation of Novel Approaches to Software Engineering. ENASE 2020. Communications in Computer and Information Science, vol 1375. Springer, Cham. https://doi.org/10.1007/978-3-030-70006-5_11
Print ISBN: 978-3-030-70005-8
Online ISBN: 978-3-030-70006-5