Design of an Integrated Measurement Database for
Telecom Systems Development
Martin Kunz¹, Marek Leszak², René Braungarten¹, Reiner R. Dumke¹
¹ Software Engineering Group, University of Magdeburg, Germany
{makunz, braungar, dumke}@ivs.cs.uni-magdeburg.de
² Lucent Technologies Network Systems GmbH, Nuernberg, Germany
mleszak@lucent.com
Abstract:
The importance of software metrics, gathered by measuring the artefacts that emerge during the software development process, is beyond controversy these days, for both economic and scientific purposes. To help estimate project characteristics, measure project progress and performance, or quantify product attributes, and thus to benefit in the long run, a suitable set of metrics data needs to be defined, collected, and analysed. In a complex enterprise with large-scale development projects, a structured and persistent central storage solution is almost compulsory; statistical techniques for data analysis and visualization techniques are a further major requirement. As an additional target, the application of such a measurement database facilitates reaching CMMI℠ (Capability Maturity Model® Integration) level 3 for all development units, implying the fulfilment of the “Measurement and Analysis” Process Area requirements, which contain Specific Practices such as “Specify Data Collection and Storage Procedures” and “Store Data and Results”. This paper presents results from empirical investigations of the metrics situation within different departments of Lucent TXS Nuremberg, where we focused on three major disciplines: Systems Engineering, Software Development, and System Test. Based on an analysis of the diverse metric data and repositories, the high-level design of a measurement repository based on the Goal-Question-Indicator-Measurement (GQ[I]M) methodology and the CMMI℠ framework is presented.
Keywords:
Measurement Database, System and Software Measurement Data, Telecom Systems Development
1 Introduction
Although software metrics and measurement models have been proposed and applied for decades, IT software projects remain high-risk activities. Many of them fail outright, most are delivered late and over budget, and large projects in particular can consume enormous budgets without ever producing a usable product. According to [Goodman2004], research has shown that only 8% of application projects costing between $6 million and $10 million succeed; among all IT development projects, only 16% are delivered with acceptable cost, time, and quality; cost overruns of 100% to 200% are common in software projects; and over 34% of IT workers' time is spent just fixing software bugs.
All these problems call for a more effective software management process, which in turn can be supported through the use of software metrics. Today, business organizations and research institutions have widely noticed and accepted the important role that software metrics play in software project development, and a whole range of theories, processes, and models has been proposed in this domain. In fact, the term “software metrics” can hardly be avoided today in software engineering publications, conferences, and seminars. Many software organizations are building a software metrics system, or preparing to build one, as if software metrics were the elixir that cures the software crisis.
However, despite the attention and effort devoted to software metrics, new problems emerge during the implementation of metrics practices. Most research focuses on defining metrics and methods; little has been done on how to implement them successfully and solve the practical problems. During the implementation of a metrics process, especially in enterprises with complex software development and support processes, there are potentially so many things to measure that developers are easily overwhelmed by the huge volume of unstructured and disordered measurement data, which builds an even more complex labyrinth. Another problem arises after a metrics practice has been introduced: due to the lack of accurate analysis, the collected measurement data cannot help to improve the development process as expected [Braungarten2005].
A practical challenge for software metrics is to build an effective and efficient metrics system that supports data collection, integration, and management. Beyond that, it should also help to transform raw statistical data into useful knowledge.
2 Metrics Situation at Lucent TXS Nuernberg
The main stakeholders of our research are certain R&D disciplines within the Lucent TXS (Transport & X-connect Systems) business unit, in which we focus on measurement definition and implementation for the Systems Engineering (SE), Software Development (SW), and System Test (ST) departments. In addition, Project Management (PjM) and Quality Management (QM) use the defined metrics to support and control the development projects, to gain effectiveness, and to improve product quality.
The logical relationship between SE, SW, and ST can be presented in a classical development V-model, see Figure 1. The front-end boundary of the model is Product Management, which produces consolidated customer requirements as input to the SE process. The back-end boundary is Network Validation [Leszak2004].
[Figure 1: V-model showing the disciplines Product Management, Systems Engineering, SW/HW Architecture / Design, SW/HW Integration, System Test, and Network Validation, with their main work products (customer requirements; product (system & SW) requirements & architecture; SW/HW architecture / design documents; verified HW/SW subsystems; verified system; validated comm. network) and the relations “is verified or validated by” and “is needed as further V&V input”.]
Figure 1: Development lifecycle model
Note that in this figure the logical sequence of phases and associated work products does not indicate any strict dependency in time. The activities of each process and its successor processes are rather explicitly allowed to overlap, as long as sufficient input is available to start a subsequent phase. Further, this lifecycle is based on iterative development per new feature of a release, carried out initially by SE and continued in SW and ST [Leszak2004].
During discussions with the responsible persons of each discipline, and reviews by Project Management and Quality Management, a core set of software metrics has been selected and defined. Each indicator is constructed from several base metrics and derived metrics. Lucent business goals have been decomposed into measurement goals according to [PSM2003], and then R&D-discipline-specific questions of interest have been determined as the conceptual base for metrics definition according to the GQM approach [Solingen1999]. Some examples of applied measurement goal, discipline, and question follow; a small data-structure sketch of such a mapping is given after the list:
- Improve development process, SE: “What amount of requirements is created late, i.e. after project start?”
- Improve estimation, SW: “What drives feature-related and fix-related software effort?”
- Improve project tracking, SW: “How stable is the software load close to General Availability (GA)?”
- Improve project tracking, ST: “What is the overall system test progress?” “What is the completeness and quality of features delivered by the SW team?”

A simplified verbal description of each indicator and related metric is presented in the appendix.
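The mapping from goals via questions to indicators could be captured as plain data, e.g. for later storage in the measurement database. The following is a minimal sketch; the indicator assignments are our plausible reading of the appendix, not an official mapping:

```python
# gqm_mapping_sketch.py -- goal/question/indicator examples from the text,
# captured as plain data; the structure itself is our own illustration.
GQM_EXAMPLES = [
    {
        "goal": "Improve development process",
        "discipline": "SE",
        "question": "What amount of requirements is created late, "
                    "i.e. after project start?",
        "indicator": "Requirements Lateness",
    },
    {
        "goal": "Improve estimation",
        "discipline": "SW",
        "question": "What drives feature-related and fix-related "
                    "software effort?",
        "indicator": "Estimation of affected SW Domains per Feature",
    },
    {
        "goal": "Improve project tracking",
        "discipline": "SW",
        "question": "How stable is the software load close to GA?",
        "indicator": "Code Churn",
    },
    {
        "goal": "Improve project tracking",
        "discipline": "ST",
        "question": "What is the overall system test progress?",
        "indicator": "System Test Progress",
    },
]

# e.g. list all questions a given discipline has to answer:
sw_questions = [e["question"] for e in GQM_EXAMPLES if e["discipline"] == "SW"]
```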
The three disciplines have been collecting process-related data elements since 2001, from projects carried out in a global organization of several hundred persons, which results in a huge volume of raw data. Since each discipline adopts or develops tools of its own, the various data formats need to be clarified and unified before they can be used. The following paragraphs analyse the repositories each discipline uses for data archival.
The Telelogic DOORS™ database includes layouts for the products, features, and requirements specified by the SE discipline.
IBM-Rational ClearDDTS™ is a distributed defect tracking tool designed to record and report Modification Requests (MRs) throughout the lifecycle of a system, software, or hardware product. At Lucent, the MR database tool is based on ClearDDTS; the original functionality has been largely extended by Lucent. This tool is uniformly used for changes to any TXS product as well as for processes and tools.
TSRT is the main tool used by the System Test department to archive and manage the huge volume of daily test results obtained from the various tests within the development projects. The tool is a customized database and web application, administrated by ST staff.
The analysis and diagnosis steps for the implementation of our measurement program and database are introduced below. The information gained from these steps is based on the measurement activities already existing at Lucent; it therefore helps us implement the database in a more efficient and meaningful way.
3 Design for an integrated measurement architecture
To realize the integration of different measurement tools, we decided to adapt the Measurement Data Warehouse approach [White2005].
It realizes a data consolidation in which measurement data extracted and transformed from different sources, such as measurement tools or spreadsheets, is stored into a central measurement storage by means of ETL (Extract, Transform, Load) tools [Naumann2004]. As a result, the approach provides a central measurement database to serve analysis needs [Wu2005].
Figure 2: Measurement Data Warehouse approach
From this general approach, an adjusted architecture was created by making use of the existing measurement tools and databases. In the practical use case, the data integration (see Figure 2) has been realized by scripts that implement the ETL functionality, as sketched below.
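A minimal sketch of such an ETL script follows; it assumes hypothetical CSV exports from the three tool repositories and an SQLite file as the central measurement storage, since the concrete export formats and database product are not prescribed here. All file and column names are invented.

```python
# etl_sketch.py -- minimal ETL consolidation sketch (hypothetical schemas).
# Extracts measurement data from per-discipline tool exports, transforms it
# into a common record format, and loads it into one central database.
import csv
import sqlite3

# Hypothetical CSV exports from the discipline-specific tools.
SOURCES = {
    "SE": "doors_requirements_export.csv",   # Telelogic DOORS
    "SW": "clearddts_mr_export.csv",         # IBM-Rational ClearDDTS
    "ST": "tsrt_testresults_export.csv",     # TSRT
}

def extract(path):
    """Read one tool export; each row becomes a dict keyed by column name."""
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(discipline, row):
    """Map a tool-specific row onto the common (discipline, release,
    subsystem, metric, value, date) record format of the central store."""
    return (
        discipline,
        row.get("release", "unknown"),
        row.get("subsystem", "all"),
        row["metric"],              # e.g. "requirements_count"
        float(row["value"]),
        row["date"],
    )

def load(conn, records):
    """Insert transformed records into the central measurement table."""
    conn.executemany(
        "INSERT INTO measurement VALUES (?, ?, ?, ?, ?, ?)", records
    )
    conn.commit()

def main():
    conn = sqlite3.connect("measurement_db.sqlite")
    conn.execute(
        """CREATE TABLE IF NOT EXISTS measurement (
               discipline TEXT, release TEXT, subsystem TEXT,
               metric TEXT, value REAL, date TEXT)"""
    )
    for discipline, path in SOURCES.items():
        records = (transform(discipline, r) for r in extract(path))
        load(conn, records)

if __name__ == "__main__":
    main()
```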
Thanks to the single central measurement data store, redundancy can be greatly reduced, and the single analysis functionality gains the capability to integrate multiple measurement data sources into different views, as illustrated below.
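As a sketch of such view-based analysis, the following snippet (building on the hypothetical SQLite schema introduced above) defines two example views; the view names and groupings are our own:

```python
# analysis_views_sketch.py -- hypothetical cross-discipline views on the
# central measurement table created by the ETL sketch above.
import sqlite3

conn = sqlite3.connect("measurement_db.sqlite")

# One view per analysis need; both assume the hypothetical schema above.
conn.executescript("""
CREATE VIEW IF NOT EXISTS release_overview AS
    SELECT release, discipline, metric, SUM(value) AS total
    FROM measurement
    GROUP BY release, discipline, metric;

CREATE VIEW IF NOT EXISTS sw_vs_st AS          -- combine SW and ST data
    SELECT release, metric, value, date
    FROM measurement
    WHERE discipline IN ('SW', 'ST');
""")

for row in conn.execute("SELECT * FROM release_overview"):
    print(row)
```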
Figure 3: Measurement architecture for Lucent TXS
The single measurement database is a good start for utilizing metrics and related measurement data, but in order to extract maximum value from those data, the database alone is not enough: we still need user interfaces and application programs that use and process the database [Connolly2005].
One thing we must note is that the goal of a metrics program is not how many metrics we use, but how we use them. Control of the development process and of product quality is not improved by increasing the number of defined metrics, but by correctly selecting and analysing a core metrics set.
To support decision making, tools and methods such as a metrics dashboard and statistical analysis can be adopted.
4 Future work
In order to arrange the dashboard contents in an organized way, a hierarchical structure must be defined first. The proposed layout contains three levels: views, sets, and metrics (a data-structure sketch follows Figure 4).
Figure 4: Dashboard design study for measurement data analysis
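A minimal sketch of the three-level layout as plain data structures; the concrete view, set, and metric groupings shown are invented for illustration:

```python
# dashboard_hierarchy_sketch.py -- the proposed three-level dashboard layout
# (views -> sets -> metrics) as plain data classes; all names are invented.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Metric:
    name: str                       # e.g. "Defect Density"
    unit: str                       # e.g. "defects/KLOC"
    target: Optional[float] = None  # optional control target

@dataclass
class MetricSet:
    name: str                       # e.g. "SW quality"
    metrics: List[Metric] = field(default_factory=list)

@dataclass
class View:
    name: str                       # e.g. "Project management"
    sets: List[MetricSet] = field(default_factory=list)

# An invented project-management view combining SW and ST metric sets:
pm_view = View("Project management", [
    MetricSet("SW quality", [
        Metric("Defect Density", "defects/KLOC", target=1.0),
        Metric("Delivered Defects", "defects"),
    ]),
    MetricSet("ST progress", [
        Metric("System Test Progress", "% of testcases passed"),
    ]),
])
```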
The planned dashboard shall support visualization and analysis, especially of the metrics outlined in the appendix of this paper. Further considerations for an adequate metrics dashboard include the following aspects:
o Present data at various granularities, especially on system level and on subsystem level, or for a combination of selected subsystems
o Combine data for multiple related metrics in one chart, e.g. process compliance index vs. change ratio, or estimated vs. real defects
o Provide data for a selected time interval of a product release, and also snapshots in time (to enable cross-release trend analysis)
o Allow targets to be defined per measure and display deviations from defined control levels
o Support filtering according to defined data attributes, e.g. defect severity (see the query sketch after this list)
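The following sketch illustrates two of these aspects, filtering by subsystem granularity and computing the deviation from a defined target, against the hypothetical central measurement table introduced earlier; metric names and sample values are invented:

```python
# dashboard_query_sketch.py -- illustrates the filtering and target-deviation
# aspects listed above, against the hypothetical central measurement table.
import sqlite3

conn = sqlite3.connect("measurement_db.sqlite")

def metric_series(metric, release, subsystem=None):
    """Return (date, value) pairs for one metric, optionally filtered
    by subsystem -- 'present data at various granularities'."""
    sql = ("SELECT date, value FROM measurement "
           "WHERE metric = ? AND release = ?")
    params = [metric, release]
    if subsystem is not None:
        sql += " AND subsystem = ?"
        params.append(subsystem)
    return conn.execute(sql + " ORDER BY date", params).fetchall()

def deviation_from_target(metric, release, target):
    """Deviation of the latest value from a defined control target."""
    series = metric_series(metric, release)
    if not series:
        return None
    _, latest = series[-1]
    return latest - target

# Example: deviation of defect density from a (hypothetical) target of 1.0.
print(deviation_from_target("defect_density", "R5.1", target=1.0))
```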
Future work will gather empirical results about the usage of the designed architecture in comparison to the previous measurement architecture.
5 Conclusion
This paper has presented an approach to integrating different measurement tools and measurement data storages into an Extract-Transform-Load integration architecture. The proposed architecture centres on one dedicated measurement data storage and central measurement data analysis functionality. Some outlines for the usage of measurement dashboards and the associated system and software metrics used in large-scale projects were also given.
References

[Braungarten2005] Braungarten, R.; Kunz, M.; Dumke, R.: An Approach to Classify Software Measurement Storage Facilities. Otto-von-Guericke-University Magdeburg, 2005.

[Connolly2005] Connolly, T.; Begg, C.: Database Solutions – A Step-by-Step Guide to Building Databases. 2nd Edition, Pearson Education Limited, England, 2005.

[Goodman2004] Goodman, P.: Software Metrics – Best Practices for Successful IT Management. Rothstein Associates Inc., July 2004.

[Leszak2004] Leszak, M.: Process Modeling and Quality Control for Embedded Telecommunication Systems. Proc. of the IEE Workshop on Process Modeling & Simulation (ProSim), Edinburgh, May 2004.

[Linschi2005] Linschi, S.; Leszak, M.: Definition and Evaluation of System Requirements Metrics Based on CMMI. Proc. of MetriKon 2005, Kaiserslautern, Nov 2005.

[Naumann2004] Naumann, F.: Mediator/Wrapper: Architektur & Peer-Data-Management. 2004.

[PSM2003] DoD and US Army: Practical Software and Systems Measurement. Version 4.2, 2003. http://www.psmsc.com/PSMI.asp

[Ruffler2006] Ruffler, M.; Leszak, M.: Software Quality Assessment – A Tool-Supported Model. Proc. of IWSM/MetriKon 2006.

[Solingen1999] Solingen, R. van; Berghout, E.: The Goal/Question/Metric Method. McGraw-Hill, Maidenhead (UK), 1999.

[White2005] White, C.: Data Integration: Using ETL, EAI and EII Tools to Create an Integrated Enterprise. Technical Report, BI Research, The Data Warehousing Institute, 2005.

[Wu2005] Wu, T.: EII-ETL-EAI: What, Why, and How! IBM, 2005.
Appendix
The following base set of system and software process and quality metrics is being used per development project (“product release”) and partly also per subsystem. Subsystems are called “SE areas” in Systems Engineering, “SW domains” in Software Development, and “test areas” in System Test. We also describe the main stakeholders per measure (in addition to quality management):
- Customers of Lucent products
- Senior management and project management, who control the business and run development projects
- Teamleaders, who manage the R&D development teams of a certain discipline, e.g. SE, SW development, ST
- Engineers, who specify, design, implement, and verify features in a product release

An integrated measurement database will be realized for data collection, analysis, and visualization in the future.
System Engineering (SE) Metrics
Details of the measures outlined below are discussed in [Linschi2005].
♦ SE Input (features) Size
Measure of the number of committed features. Stakeholders are teamleaders and engineers of all R&D disciplines.
♦ SE Output (system requirements) Size
Measure of the number of system requirements, overall and per SE area. Stakeholders are teamleaders and engineers of all R&D disciplines.
♦ Requirement-Feature Ratio
Ratio between created system requirements and committed features, overall and for new features and requirements. Stakeholders are SE teamleaders and engineers.
♦ Feature Insertion Churn
Measure of feature stability, i.e. the amount of newly committed features after the project start gate. Stakeholders are SE and SW teamleaders.
♦ Effort per requirement
Measure of the average effort (in hours) to create a new system requirement. Stakeholders are SE teamleaders.
♦ Requirements Lateness
Measure of the average degree of lateness of produced requirements created after the project start gate. Stakeholders are SE and SW teamleaders.
♦ Effort vs. Requirements Lateness
Measure of the relationship between requirements lateness and the “release focusing rate”, i.e. the amount of SE effort for the respective release, overall and per SE area. Stakeholders are SE teamleaders.
♦ Estimation of New Requirements
Estimate of the number of new system requirements, for each newly committed feature. Stakeholders are SE, SW, and ST teamleaders (used e.g. for effort estimation).
♦ Estimation of Modified Requirements
Estimate of the number of system requirements that need to be updated for each newly committed feature. Stakeholders are SE, SW, and ST teamleaders (used e.g. for effort estimation).
Software Development (SW) Metrics
♦ Feature Removal Churn
Measure of feature stability, i.e. the amount of de-committed features after the project start gate. Stakeholders are SW teamleaders.
♦ Defect Removal Efficiency
Ratio between resolved software defects and known defects, with detailed measurements for defects found in previous releases and those found in the current release. Stakeholders are SW teamleaders and project management.
♦ Defect Density
Ratio between the number of in-process software defects and software size, measured separately for the whole size and for the size of added and changed source code, per SW subsystem. Stakeholders are SW teamleaders. (A small computation sketch of these ratio-type metrics is given at the end of this subsection.)
♦ Defects per phase detected, per SW domain
Distribution of defects per phase in which the particular defect has been found. Stakeholders are SW teamleaders.
♦ SW change ratio, per SW domain
Amount of added/changed source lines vs. all source lines (product code only, no COTS, no generated code), per domain. Stakeholders are SW teamleaders and engineers.
♦ Code Churn
Measure of code stability, i.e. the number of changed source files and associated Modification Requests close to a release's General Availability (GA). Stakeholders are SW teamleaders and engineers.
♦ Delivered Defects
Measure of the number of high-severity post-GA (customer-reported) defects within 12 months after a release's GA. Stakeholders are customers, project management, and SW teamleaders.
♦ Process Compliance Index
Empirical assessment of compliance to the activities and work products of the defined SW process, per SW domain [Ruffler2006]. Stakeholders are SW teamleaders.
♦ Staff Turn-Over
Measure of staff fluctuation. Stakeholders are project management and SW teamleaders.
♦ Estimation of affected SW Domains per Feature
Stakeholders are SW teamleaders.
♦ Estimation of new SW Defects per Release
Stakeholders are SW teamleaders.
♦ Estimation of Unresolved SW Defects from previous Releases
Stakeholders are SW teamleaders.
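As a computation sketch for the ratio-type SW metrics above, the following functions follow the verbal definitions (e.g. Defect Removal Efficiency as resolved vs. known defects); the function names and sample figures are our own:

```python
# sw_metrics_sketch.py -- ratio-type SW metrics from raw counts, following
# the verbal definitions above; names and sample numbers are illustrative.

def defect_removal_efficiency(resolved, known):
    """Ratio between resolved software defects and known defects."""
    return resolved / known if known else 0.0

def defect_density(defects, kloc):
    """In-process defects per size unit (here: per KLOC)."""
    return defects / kloc if kloc else 0.0

def sw_change_ratio(added_changed_sloc, total_sloc):
    """Added/changed source lines vs. all source lines (product code only)."""
    return added_changed_sloc / total_sloc if total_sloc else 0.0

# Sample (invented) numbers for one SW domain of a release:
print(defect_removal_efficiency(resolved=180, known=200))      # 0.9
print(defect_density(defects=45, kloc=120))                    # 0.375
print(sw_change_ratio(added_changed_sloc=30_000,
                      total_sloc=250_000))                     # 0.12
```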
System Test (ST) Metrics
♦ Feature Maturity (“Traffic Light Status”)
Empirical assessment of test completeness and quality of each committed and implemented feature delivered from SW or HW development to ST. Stakeholders are project management and ST teamleaders.
♦ System Test Progress
Completeness degree of system testing, measured by planned vs. executed vs. passed testcases, per test area. Stakeholders are project management and ST teamleaders.
♦ Estimation of affected Test Areas
Estimate of the number of affected test areas, per new feature. Stakeholders are ST teamleaders.
♦ Estimation of Testcases
Estimate of how many test cases are needed for testing, per new feature. Stakeholders are ST teamleaders.
♦ Estimation of Test Automation Degree
Estimate of the proportion of testcases that need to be automated, per new feature. Stakeholders are ST teamleaders.