
IEEE SWEBOK Guide V3 81

Please note that this Knowledge Area has not yet been professionally copy edited. Such editing will of course be completed prior to publication.
CHAPTER 8
Software Engineering Process
Acronyms

CASE Computer-Assisted Software Engineering
CM Configuration Management
CMMI Capability Maturity Model Integration
GQM Goal-Question-Metric
IDEF Integrated Definition
LOE Level of Effort
SDLC Software Development Life Cycle
SPLC Software Product Life Cycle
UML Unified Modeling Language
Introduction

An engineering process consists of a set of interrelated activities that transform one or more inputs into outputs while consuming resources to accomplish the transformation. Many of the processes of traditional engineering disciplines (e.g., electrical, mechanical, civil, chemical) are concerned with transforming physical entities from one form into another, as in a petroleum refinery that uses chemical processes to transform crude oil into gasoline.
In this knowledge area, software engineering processes are concerned with the work activities accomplished by software engineers to develop, maintain, and operate software; examples include the software requirements, software design, software construction, software testing, and software configuration management processes. For readability, software engineering process will be referred to as software process in this knowledge area. In addition, please note that software process denotes work activities and not the execution process for implemented software.
Software processes are defined for a number of reasons: to facilitate human understanding, communication, and coordination; to aid management of software projects; to improve the quality of software products; to support process improvement; and to provide a basis for automated support of process execution.
SWEBOK knowledge areas closely related to this Process KA include Software Engineering Management, Software Engineering Models and Methods, Software Quality, and the Measurement topic in Engineering Foundations. Software Engineering Management is concerned with tailoring, adapting, and implementing software processes for software projects. Software Engineering Models and Methods embody processes for effective use of models and methods. The Software Quality KA is concerned with the planning, assurance, and control processes for project and product quality. Measurement and measurement results in the Engineering Foundations KA are essential for evaluating and controlling software engineering processes.
Breakdown of Topics for Software Engineering Process

As illustrated in Figure 1, this knowledge area is concerned with software process definition, software life cycles, software process assessment and improvement, software measurement, and software engineering process tools.

Figure 1. Breakdown of topics for the Software Engineering Process KA
1. Software Process Definition

This subarea is concerned with the definition of a software process, software process management, and software process infrastructure.

A software process is a set of interrelated activities and tasks that transform input work products into output work products. At minimum, a software process includes required inputs, transforming work activities, and generated outputs. As illustrated in Figure 2, a software process may also include its entry and exit criteria and a decomposition of the work activities into tasks, which are the smallest units of work subject to management accountability [1*, p177] [2*, p295]. The exit criteria for a process include satisfying the exit criteria for each of the process activities.
A software process may include sub-processes. For example, requirements validation is a sub-process of the software requirements process. Inputs for requirements validation are typically a software requirements specification and the resources needed to perform validation (personnel, validation tools, sufficient time). The tasks of the requirements validation activity might include requirements reviews, prototyping, and model validation. These tasks involve work assignments for individuals and teams. The output of requirements validation is typically a validated software requirements specification that provides inputs to the software design and software testing processes. Requirements validation and other sub-processes of the requirements engineering process are often interleaved and iterated in various ways; the requirements engineering process and its sub-processes may be entered and exited multiple times during software development or modification.
Figure 2. Elements of a Software Engineering Process
Complete definition of a software process may also include the roles and competencies, IT support, software engineering techniques and tools, and work environment needed to perform the process, as well as the approaches and measures used to determine the efficiency and effectiveness of performing the process.

In addition, a software process may include interleaved technical, collaborative, and administrative activities [3*, p36].
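The process elements described above (required inputs, work activities decomposed into tasks, entry and exit criteria, and generated outputs) can be modeled as a simple data structure. The following Python sketch is purely illustrative; all class names, task names, and the "all tasks done" exit criterion are hypothetical, not prescribed by this KA.

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    """Smallest unit of work subject to management accountability."""
    name: str
    done: bool = False

@dataclass
class Activity:
    """A work activity, decomposed into tasks, with its own exit criterion."""
    name: str
    tasks: list = field(default_factory=list)

    def exit_criteria_met(self) -> bool:
        # illustrative exit criterion: every task of the activity is complete
        return all(t.done for t in self.tasks)

@dataclass
class SoftwareProcess:
    """A process: required inputs, transforming activities, generated outputs."""
    name: str
    inputs: list
    activities: list
    outputs: list

    def exit_criteria_met(self) -> bool:
        # the exit criteria for a process include satisfying the exit
        # criteria for each of the process activities
        return all(a.exit_criteria_met() for a in self.activities)

# Example: requirements validation as a sub-process
review = Task("requirements review", done=True)
proto = Task("prototyping", done=True)
validation = SoftwareProcess(
    name="requirements validation",
    inputs=["software requirements specification", "validation resources"],
    activities=[Activity("validate", [review, proto])],
    outputs=["validated software requirements specification"],
)
print(validation.exit_criteria_met())  # True once every task is done
```

The point of the sketch is only the containment relationship: a process aggregates activities, activities aggregate tasks, and process-level exit criteria are satisfied through activity-level exit criteria.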
Three different levels of process definition have been found to be useful [2*, p190]:

1) Reference level: a coherent set of activities that can be performed by a single agent (individual or dedicated team);
2) Conceptual level: a model that defines the flow of information among agents;
3) Implementation level: a model that maps agents from the conceptual model to organization charts and specifies the policies, procedures, and tools to be used in implementing the process.
Notations for defining software processes include textual lists of constituent activities and tasks described in natural language; data flow diagrams; state charts; IDEF; Petri nets; and UML activity diagrams. The transforming tasks within a process may be defined as procedures; a procedure may be specified as an ordered set of steps or, alternatively, as a checklist of the work to be accomplished in performing a task [3*, c5].

It must be emphasized that there is no best software process or set of software processes. Software processes must be selected, adapted, and applied as appropriate for each project and each organizational context. No ideal process, or set of processes, exists [3*, pp. 28-29].
1.1 Software Process Management

Two objectives of software process management are to realize the efficiency and effectiveness that result from a systematic approach to accomplishing a software process, be it at the individual, project, or organizational level, and to introduce new or improved processes [3*, s26.1].
Processes are changed with the expectation that a new or modified process will improve the efficiency and/or effectiveness of the process and the resulting work products. Changing to a new process, improving an existing process, organizational change, and infrastructure change (technology insertion or changes in tools) are closely related, as all are usually initiated with the goal of improving the cost, development schedule, or quality of the deliverable software products. Process changes have impacts not only on the software product; they often lead to organizational change as well. Changes in IT infrastructure, tools, and technology often require process changes. [4*, p453-454]
Existing processes may also be modified when new processes are deployed for the first time (for example, introducing an inspection activity within a software development project will likely impact the software testing process; see Reviews and Audits in the Software Quality KA and the Software Testing KA). These situations can also be termed process evolution. If the modifications are extensive, then changes in the organizational culture and business model will likely be necessary to accommodate the process changes.
1.2 Software Process Infrastructure

Establishing, implementing, and managing software processes and software life cycle models often occurs at the level of individual software projects. However, systematic application of software processes and software life cycle models across an organization can provide benefits to all software work within the organization, although it requires commitment at the organizational level. A software process infrastructure can provide process definitions, policies for interpreting and applying the processes, and descriptions of the procedures to be used to implement the processes, plus funding, tools, training, and staff members who have been assigned responsibility for establishing and maintaining the software process infrastructure.
Software process infrastructure varies depending on the size and complexity of the organization and the projects undertaken within the organization. Small, simple organizations and projects have small, simple infrastructure needs. Large, complex organizations and projects, by necessity, have larger and more complex software process infrastructures. In the latter case, various organizational units (such as a Software Engineering Process Group or a steering committee) may be established to oversee implementation and improvement of the software processes [2*, p186].

A common misperception is that establishing a software process infrastructure and implementing repeatable software processes will add time and cost to software development and maintenance. There is a cost associated with introducing or improving a software process; however, experience has shown that implementing systematic improvement of software processes tends to result in lower cost through improved efficiency, avoidance of rework, and more reliable and affordable software. Process quality thus influences software product quality. [2*, p183, p186] [4*, p437-438]
2. Software Life Cycles

This subarea addresses software life cycle processes, software life cycle models, software process adaptation, and practical considerations. A software development life cycle (SDLC) includes the software processes used to specify and transform software requirements into a deliverable software product. A software product life cycle (SPLC) includes a software development life cycle plus additional software processes that provide for deployment, maintenance, support, evolution, retirement, and all other birth-to-death processes for a software product, including the software configuration management and software quality assurance processes that are applied throughout a software product life cycle. A software product life cycle may include multiple software development life cycles for evolving and enhancing the software.
Individual software processes have no temporal ordering among them. The temporal relationships among software processes are provided by a software life cycle model: either a software development life cycle (SDLC) or a software product life cycle (SPLC) [2*, p190]. Life cycle models typically emphasize the key software processes within the model and their temporal and logical interdependencies and relationships. Detailed definitions of the software processes in a life cycle model may be provided directly or by reference to other documents.
In addition to conveying the temporal and logical relationships among software processes, the software development life cycle model (or models) used within an organization includes the control mechanisms for applying entry and exit criteria (e.g., project reviews, customer approvals, software testing, demonstrations, team consensus). The output of one software process often provides the input for another (e.g., software requirements provide input for a software architectural design process and for the software construction and software testing processes). Concurrent execution of several software process activities may produce a shared output (e.g., the interface specifications for interfaces among multiple software components developed by different teams). Some software processes may be regarded as less effective unless other software processes are being performed at the same time (e.g., software test planning during software requirements analysis can improve the software requirements).
2.1 Categories of Software Processes

Many distinct software processes have been defined for use in the various parts of the software development and software maintenance life cycles. These processes can be categorized as follows [2*, p294-295]:

1) Primary processes include software processes for development, operation, and maintenance of software.
2) Supporting processes are applied intermittently or continuously throughout a software product life cycle to support primary processes; they include software processes such as software configuration management, software quality assurance, and software verification and validation.
3) Organizational processes provide support for software engineering; they include infrastructure management, portfolio and reuse management, organizational process improvement, and management of software life cycle models.
4) Cross-project processes, such as reuse and domain engineering, involve more than a single software project in an organization.

Another category of software process is:

5) Project management processes, which include software processes for planning and estimating, measuring and controlling, leading, managing risk, and coordinating the primary, supporting, organizational, and cross-project processes of software development and maintenance projects [1*, Preface].
Software process activities are also developed for particular needs, such as process activities that address software quality characteristics (see the Software Quality KA) [3*, c24]. For example, security concerns during software development may necessitate one or more software processes to protect the security of the development environment and reduce the risk of malicious acts. Software processes may also be developed to provide adequate grounds for establishing confidence in the integrity of the software.
2.2 Software Life Cycle Models

The intangible and malleable nature of software permits a wide variety of software development life cycle models. These range from linear models, in which the phases of software development are accomplished sequentially (with feedback and iteration as needed) followed by integration, testing, and delivery of a single product; to linear phased models, in which successive product increments are generated sequentially to form the final software product; to iterative models, in which software is developed in increments of increasing functionality on iterative cycles; to agile models, which typically involve frequent demonstrations of working software to a customer or user representative who directs development of the software in short iterative cycles that produce small increments of working, deliverable software. Incremental, iterative, and agile models can deliver early subsets of working software into the user environment, if desired. [1*, c2] [2*, s3.2] [3*, s2.1]

Linear SDLCs are sometimes referred to as predictive software development life cycle models; iterative and agile SDLCs are referred to as adaptive software development life cycle models.
A distinguishing feature of the various software development life cycle models is the way in which software requirements are managed. Linear development models typically develop a complete set of software requirements, to the extent possible, during project initiation and planning. The software requirements are then rigorously controlled; changes to the software requirements are based on change requests that are processed by a change control board (see the topic Requesting, Evaluating, and Approving Software Changes under Change Control Board in the Software Configuration Management KA). An incremental model produces successive increments of working, deliverable software based on a partitioning of the software requirements to be implemented in each of the increments. The software requirements may be rigorously controlled, as in a linear model, or there may be some flexibility in revising the software requirements as the software product evolves. Agile models may define product scope and high-level features initially; however, agile models are designed to facilitate evolution of the software requirements during the project.
It must be emphasized that the continuum of SDLCs from linear to agile is not a thin, straight line. Elements of different approaches may be incorporated into a specific model; for example, an incremental software development life cycle model may incorporate sequential software requirements and design phases but permit considerable flexibility in revising the software requirements and architecture during software construction.
2.3 Software Process Adaptation

Predefined SDLCs and SPLCs and individual software processes often need to be adapted (also called tailored) to better serve local needs. Organizational context, innovations in technology, project size, product criticality, regulatory requirements, industry practices, and corporate culture may determine needed adaptations. Adaptation of individual software processes and software life cycle models (development and product) may consist of adding more details to software processes, activities, tasks, and procedures to address critical concerns. It may consist of using an alternate set of activities that achieves the purpose and outcomes of the software process. Adaptation may also include omitting software processes or activities from a development or product life cycle model that are clearly inapplicable to the scope of work to be accomplished. However, it is questionable whether a process that has been adapted by omission can be said to conform to the process model being adapted [1*, s2.7] [2*, p51].
2.4 Practical Considerations

In practice, software processes and activities are often interleaved, overlapped, and applied concurrently. Software life cycle models that specify discrete software processes, with rigorously specified entry and exit criteria and prescribed boundaries and interfaces, should be recognized as idealizations that must be adapted to reflect the realities of software development and maintenance within the organizational context and business environment.

Another practical consideration: software engineering processes such as software configuration management, software construction, and software testing can be adapted to facilitate operation, support, maintenance, migration, and retirement of the software.

Additional factors to be considered when defining and tailoring a software life cycle model include required conformance to standards, directives, and policies; customer demands; criticality of the software product; and organizational maturity and competencies [2*, p188-190]. Other factors include the nature of the work (e.g., modification of existing software versus new development) and the application domain (e.g., aerospace versus hotel management).
3. Software Process Assessment and Improvement

This subarea addresses software process assessment models, software process assessment methods, software process improvement models, and continuous and staged process ratings. Software process assessments are used to evaluate the form and content of a software process, which may be specified by a standardized set of criteria [4*, p397, c15]. In some instances, the terms process appraisal and capability evaluation are used instead of process assessment. Capability evaluations are typically performed by an acquirer (or potential acquirer), or by an external agent on behalf of an acquirer (or potential acquirer); the results are used as an indicator of whether the software processes used by a supplier (or potential supplier) are acceptable to the acquirer. Process appraisals are typically performed within an organization to identify software processes in need of improvement.
Process assessments are performed at the level of entire organizations, of organizational units within organizations, and of individual projects. An assessment may address issues such as whether software process entry and exit criteria are being met, risk factors and risk management, and lessons learned and process improvements attempted and incorporated. Process assessment is carried out using both an assessment model and an assessment method. The model can provide a norm for a benchmarking comparison among projects within an organization and among organizations. [2*, p188, p194]
A process audit differs from a process assessment. Audits are typically conducted to identify root causes of problems that are impacting a development project, a maintenance activity, or a software-related issue. Assessments are performed to determine levels of capability and to identify software processes to be improved.
Success factors for software process assessment and improvement within software engineering organizations include management sponsorship, planning, training, experienced and capable leaders, team commitment, expectation management, the use of change agents, plus pilot projects and experimentation with tools [3*, c26].
3.1 Software Process Assessment Models

Software process assessment models typically include software processes regarded as constituting good practices. These practices may address software development processes only, or they may also include topics such as software maintenance, software project management, systems engineering, or human resources management. [2*, s4.5, s4.6] [3*, s26.5] [4*, p44-48]
3.2 Software Process Assessment Methods

A software process assessment method can be qualitative or quantitative. Qualitative assessments rely on the judgment of experts; quantitative assessments assign numerical scores to software processes based on analysis of objective evidence that indicates attainment of the goals and outcomes of a defined software process. For example, a quantitative assessment of the software inspection process might be performed by examining the procedural steps followed and the results obtained, plus data concerning defects found and the time required to find and fix the defects, as compared to software testing. [1*, p322-331]
A typical method of software process assessment includes planning; fact-finding (collecting evidence through questionnaires, interviews, and observation of work practices); collection and validation of process data; and analysis and reporting. [4*, s16.4]
The activities performed during a software process assessment, and the distribution of effort across those activities, differ depending on the purpose of the software process assessment. Software process assessments may be undertaken to develop capability ratings used to make recommendations for process improvements, or they may be undertaken to obtain a process maturity rating in order to qualify for a contract or award.
The quality of assessment results depends on the software process assessment method, the integrity and quality of the obtained data, and the assessment team's capability and objectivity. The goal of a software process assessment is to gain insight; performing a software process assessment by following a checklist for conformance without gaining insight adds little value. [4*, p44-48]
3.3 Software Process Improvement Models

Software process improvement models emphasize iterative cycles of continuous improvement. A software process improvement cycle typically involves the sub-processes of measuring, analyzing, and changing [3*, s26.5]. The Plan-Do-Check-Act model is a well-known iterative approach to software process improvement. Improvement activities include identifying and prioritizing desired improvements (planning); introducing an improvement, including change management and training (doing); evaluating the improvement as compared to previous or exemplary process results and costs (checking); and making further modifications (acting) [2*, p187-188]. The Plan-Do-Check-Act process improvement model can be applied, for example, to improve software processes that enhance defect prevention. [4*, s2.7]
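The Plan-Do-Check-Act cycle just described can be sketched as a simple control loop. The following Python sketch is purely illustrative: the function names, the use of defect density as the evaluation measure, and all numbers are hypothetical, not part of any standard PDCA definition.

```python
# Hypothetical measure: defect density (defects per KLOC) before the change
baseline = {"defect_density": 0.9}

def plan(baseline):
    """Plan: identify and prioritize a desired improvement."""
    return {"change": "introduce code inspections",
            "target_defect_density": 0.5}

def do(improvement):
    """Do: introduce the improvement (with change management and
    training) on a pilot project, then measure the result."""
    return {"defect_density": 0.4}  # made-up pilot measurement

def check(result, baseline):
    """Check: evaluate the improvement against previous results."""
    return result["defect_density"] < baseline["defect_density"]

def act(improved):
    """Act: adopt the change, or plan further modifications."""
    return "adopt and standardize" if improved else "revise and re-plan"

improvement = plan(baseline)
result = do(improvement)
print(act(check(result, baseline)))  # adopt and standardize
```

In practice each step spans weeks or months of organizational work; the sketch only makes explicit that "check" is a comparison against a measured baseline, which is why baseline measurement (discussed under Software Measurement below) must precede the change.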
3.4 Continuous and Staged Software Process Ratings

Software process capability and software process maturity are typically rated using five or six levels to characterize the capability or maturity of the software processes used within an organization. [1*, p28-34] [3*, s26.5] [4*, p39-45]

A continuous rating system involves assigning a rating to each software process of interest; a staged rating system is established by assigning the same maturity rating to all of the software processes within a specified process level. A characterization of the process levels is provided in Table 8-1. Continuous models typically use a level 0 rating; staged models typically do not.

Level | Characterization
0     | Incomplete
1     | Initial
2     | Managed
3     | Defined
4     | Quantitatively Managed
5     | Optimizing

Table 8-1. Software process rating levels
In Table 8-1, level 0 indicates that a software process is incompletely performed or may not be performed. At level 1, a single software process (capability rating) or the software processes in a maturity level 1 group are being performed, but on an ad hoc, informal basis. At level 2, a software process (capability rating) or the processes in maturity level 2 are being performed in a manner that provides management visibility into intermediate work products and allows management to exert some control over transitions between processes. At level 3, a single software process or the processes in a maturity level 3 group, plus the process or processes in maturity level 2, are well defined (perhaps in organizational policies and procedures) and are being repeated across different projects. Level 3 of process capability or maturity provides the basis for process improvement across an organization because the process is, or processes are, conducted in a similar manner; this allows collection of performance data in a uniform manner across multiple projects. At level 4, quantitative measures can be applied and used for process assessment; statistical analysis may be used. At level 5, mechanisms for continuous process improvement are applied.
Continuous and staged ratings can be used to determine the order in which software processes are to be improved. In the continuous representation, the different capability levels for different software processes provide a guideline for determining the order in which software processes will be improved. In the staged representation, satisfying the goals of the set of software processes within a maturity level provides the foundation for improving all of the software processes at the next higher level. [3*, s26.5]
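In a continuous representation, the per-process ratings themselves suggest an improvement order (lowest-rated processes first). The following sketch is illustrative only: the process names and ratings are invented, and reducing a staged maturity level to the minimum per-process rating is a loose simplification of how staged models actually work.

```python
# Hypothetical continuous capability ratings (levels 0-5) per software process
ratings = {
    "requirements": 3,
    "design": 2,
    "construction": 3,
    "testing": 1,
    "configuration management": 2,
}

# Continuous representation: improve the lowest-rated processes first
improvement_order = sorted(ratings, key=ratings.get)
print(improvement_order)
# ['testing', 'design', 'configuration management', 'requirements', 'construction']

# Staged representation (simplified): the organization cannot claim a
# maturity level higher than its least mature required process
maturity_level = min(ratings.values())
print(maturity_level)  # 1
```

The sketch shows why the two representations guide improvement differently: the continuous view yields a per-process priority list, while the staged view directs effort at whatever keeps the organization from satisfying the next level as a whole.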
4. Software Measurement

This subarea addresses software process and product measurement, quality of measurement results, software information models, and software process measurement techniques.

Before a new process is implemented or a current process is modified, measurement results for the current situation should be obtained to provide a baseline for comparison between the current situation and the new situation. For example, before introducing the software inspection process, the effort required to fix defects discovered by testing should be measured. Following an initial start-up period after the inspection process is introduced, the combined effort of inspection plus testing can be compared to the previous amount of effort required for testing alone. Similar considerations apply if a process is changed. [3*, s26.2] [4*, s18.1.1]
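The baseline comparison described above amounts to simple arithmetic over effort measurements. This sketch is illustrative only; all effort figures are invented.

```python
# Baseline, measured BEFORE the change: effort (person-hours) to find
# and fix defects by testing alone
baseline_testing_effort = 400.0

# Measured AFTER introducing inspections (past the start-up period)
inspection_effort = 80.0
testing_effort_after = 250.0
combined_effort = inspection_effort + testing_effort_after

# Compare the new situation against the baseline
savings = baseline_testing_effort - combined_effort
print(f"combined effort: {combined_effort} person-hours")  # 330.0
print(f"savings vs. baseline: {savings} person-hours")     # 70.0
```

Without the baseline measurement taken beforehand, there would be nothing to subtract from, and the effect of the process change could not be demonstrated.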
4.1 Software Process and Product Measurement

For purposes of the Software Process KA, process and product measurement is concerned with determining the efficiency and effectiveness of a software process, activity, or task. The efficiency of a software process, activity, or task is the ratio of resources actually consumed to resources expected or desired to be consumed in accomplishing a software process, activity, or task (see Efficiency in the Software Engineering Economics KA). Effort (or equivalent cost) is the primary measure of resources for most software processes, activities, and tasks; it is measured in units such as person-hours, person-days, staff-weeks, or staff-months of effort, or in equivalent monetary units such as euros or dollars.
Effectiveness is the ratio of actual output to expected output produced by a software process, activity, or task; for example, the ratio of the actual number of defects detected and corrected during software testing to the expected number of defects to be detected and corrected, perhaps based on historical data for similar projects (see Effectiveness in the Software Engineering Economics KA). Note that measurement of software process effectiveness requires measurement of the relevant product attributes; for example, measurement of software defects discovered and corrected during software testing.
One must take care when measuring product attributes for the purpose of determining process effectiveness. For example, the number of defects detected and corrected by testing may fall short of the expected number, and thus indicate low effectiveness, because the software being tested is of better than usual quality or because a newly introduced upstream inspection process has reduced the number of defects remaining in the software.
Product measures that may be important in determining the effectiveness of software processes include product size, complexity, defects, defect density, and the quality of requirements, design documentation, and other related work products.
Also note that efficiency and effectiveness are independent concepts. An effective software process can be inefficient in achieving a desired result; for example, the amount of effort expended to find and fix software defects could be very high relative to expectations, resulting in low efficiency. An efficient process can be ineffective in accomplishing the desired transformation of input work products into output work products; for example, it may fail to find and correct a sufficient number of software defects during the testing process.
Causes of low efficiency and/or low effectiveness in executing a software process, activity, or task might include one or more of the following: deficient input work products, inexperienced personnel, lack of adequate tools and infrastructure, a complex product, or an unfamiliar product domain. Efficiency and effectiveness of software processes are also affected by factors such as turnover in software personnel, a schedule change, a new customer representative, or a new organizational policy.
In software engineering, productivity in performing a process, activity, or task is the ratio of output produced to resources consumed; for example, the number of software defects discovered and corrected divided by the person-hours of effort expended (see Productivity in the Software Engineering Economics KA). Accurate measurement of productivity must include the total effort used to satisfy the exit criteria of a software process, activity, or task; for example, the effort required to correct defects discovered during software testing must be included in software testing productivity.

Calculation of productivity must account for the context in which the work is accomplished. For example, the effort to correct discovered defects is included in the productivity calculation of a software team if team members correct the defects they find, as in unit testing by software developers or in a cross-functional agile team. Alternatively, the productivity calculation may include either the effort of the software developers or the effort of an independent testing team, depending on who fixes the defects found by the independent testers. Note that this example refers to the effort of teams of developers or teams of testers and not to individuals; software productivity calculated at the level of individuals can be misleading because of the many factors that can affect the individual productivity of software engineers. [1*, s6.3] [3*, s26.2, p638]
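The relationships among these three measures can be illustrated with a short sketch. The numbers below are hypothetical, chosen only to show how efficiency, effectiveness, and productivity would be computed for a testing task under the definitions given above.

```python
# Hypothetical measurements for a software testing task (assumed values).
expected_effort_hours = 400.0   # planned staff-hours for testing
actual_effort_hours = 500.0     # staff-hours actually consumed
expected_defects = 120          # expected defects, from historical data
actual_defects = 90             # defects actually found and corrected

# Efficiency: resources actually consumed relative to resources expected
# (a value above 1.0 means more effort was spent than planned).
efficiency = actual_effort_hours / expected_effort_hours

# Effectiveness: actual output relative to expected output.
effectiveness = actual_defects / expected_defects

# Productivity: output produced per unit of resources consumed.
productivity = actual_defects / actual_effort_hours  # defects per staff-hour

print(f"efficiency={efficiency:.2f}, "
      f"effectiveness={effectiveness:.2f}, "
      f"productivity={productivity:.3f}")
```

Note that the same raw data (defects found, effort expended) supports all three ratios, which is one reason the standardized counting rules discussed below matter.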
Standardized definitions and counting rules for measurement of software processes and work products are necessary to provide standardized measurement results across projects within an organization, to populate a repository of historical data that can be analyzed to identify software processes that need to be improved, and to build predictive models based on accumulated data. In the example above, definitions of software defects and staff-hours of testing effort, plus counting rules for defects and effort, would be necessary to obtain satisfactory measurement results. [1*, p275]
The extent to which a software process is institutionalized is important; failure to institutionalize a software process may explain why "good" software processes do not always produce anticipated results.
4.2 Quality of Measurement Results

The quality of process and product measurement results is primarily determined by the reliability and validity of the measured results. [4*, s3.4, s3.5] Measurements that do not satisfy these quality criteria can result in incorrect interpretations and faulty software process improvement initiatives. Other desirable properties of software measurements include ease of collection, analysis, and presentation, plus a strong correlation between cause and effect. [4*, s3.6, s3.7]
The Software Engineering Measurement subarea of the Software Engineering Management KA describes a process for implementing a software measurement program.
4.3 Software Information Models

Software information models allow modeling, analysis, and prediction of software process and software product attributes to provide answers to relevant questions and achieve process and product improvement goals. Needed data can be collected and retained in a repository; the data can be analyzed and models can be constructed. Validation and refinement of software information models occurs during software projects and after projects are completed to ensure that the level of accuracy is sufficient and that their limitations are known and understood. Software information models may also be developed for contexts other than software projects; for example, a model might be developed for processes that apply across an organization, such as the software configuration management or software quality assurance processes at the organizational level. [4*, s19.2]
4.3.1 Analysis-Driven Software Information Model Building

Analysis-driven software information model building involves development, calibration, and evaluation of a model. A software information model is developed by establishing a hypothesized transformation of input variables into desired outputs; for example, product size and complexity might be transformed into the estimated effort needed to develop a software product using a regression equation developed from observed data from past projects. A model is calibrated by adjusting parameters in the model to match observed results from past projects; for example, the exponent in a nonlinear regression model might be changed by applying the regression equation to a different set of past projects than the projects used to develop the model.
A model is evaluated by comparing computed results to actual outcomes for a different set of similar data. Three possible evaluation outcomes are: (1) results computed for a different data set vary widely from actual outcomes for that data set, in which case the derived model is not applicable to the new data set and should not be used to analyze or make predictions for future projects; (2) results computed for a new data set are close to actual outcomes for that data set, in which case minor adjustments are made to the parameters of the model to improve agreement; (3) results computed for the new data set and subsequent data sets are very close, and no adjustments to the model are needed. Continuous evaluation of the model may indicate a need for adjustments over time as the context in which the model is applied changes.
The Goal/Question/Metric (GQM) method can be used to guide analysis-driven software information model building; results obtained from the software information model can be used to guide process improvement. [1*, p310-311] [3*, p712-713]

The following example illustrates application of the GQM method:

Goal: increase the efficiency and effectiveness of software defect discovery and correction during software inspections and reviews.

Question: What data is needed to provide insight into the enablers and inhibitors of the efficiency and effectiveness of software inspections and reviews?

Metrics: (1) frequency of software inspections and reviews; (2) kinds and amounts of material reviewed; (3) skills of reviewers who conduct inspections and reviews; (4) preparation time for reviews and inspections; (5) efficiency and effectiveness of defect discovery and correction.
4.4 Software Process Measurement Techniques

Software process measurement techniques are used to collect process data, transform the data into useful information, and analyze the information to identify process activities and work products that are candidates for initiation of new software processes and improvement of existing ones. [1*, c8] Process measurement techniques also provide the information needed to measure the effects of process improvement initiatives. They can be used to collect both quantitative and qualitative data.
4.4.1 Quantitative Process Measurement Techniques

The purpose of quantitative process measurement techniques is to collect, transform, and analyze quantitative process data that can be used to indicate where process improvements are needed and to assess the results of process improvement initiatives. Quantitative process measurement techniques collect and analyze data in numerical form, to which mathematical and statistical techniques can be applied.

Quantitative process data can be collected as a byproduct of software processes. For example, the number of defects discovered during software testing and the staff-hours expended can be collected by direct measurement, and the productivity of defect discovery can be derived by calculating defects discovered per staff-hour.
The seven basic tools for quality control (check sheets, Pareto diagrams, histograms, scatter diagrams, run charts, control charts, and cause-and-effect diagrams) can be used to analyze quantitative process measurement data [4*, s5.1] (see Root Cause Analysis in the Engineering Foundations KA). In addition, various statistical techniques can be used, ranging from calculation of medians and means to multivariate analysis methods (see Statistical Analysis in the Engineering Foundations KA).
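As a small illustration of one of the seven basic tools, the sketch below performs a Pareto analysis, ranking defect categories by frequency with cumulative percentages to show where improvement effort would pay off most. The categories and counts are assumed for the example.

```python
from collections import Counter

# Hypothetical defect counts by category (assumed data, not from the Guide).
defects = Counter({
    "interface": 42,
    "logic": 25,
    "data handling": 13,
    "documentation": 8,
    "build/configuration": 4,
})

total = sum(defects.values())
cumulative = 0
# Pareto view: categories sorted by frequency with cumulative percentage.
for category, count in defects.most_common():
    cumulative += count
    print(f"{category:20s} {count:3d}  {100 * cumulative / total:5.1f}%")
```

In this assumed data set, the top two categories account for over two-thirds of all defects, which is the kind of concentration a Pareto diagram is designed to expose.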
Data collected using quantitative process measurement techniques can also be used as inputs to simulation models (see Modeling, Prototyping, and Simulation in the Engineering Foundations KA); these models can be used to assess the impact of various approaches to software process improvement.
Orthogonal Defect Classification (ODC) can also be used to analyze quantitative process measurement data. ODC groups detected defects into categories and links the defects in each category to the software process or processes where they originated (see Defect Characterization in the Software Quality KA). Software interface defects, for example, may have originated during an inadequate software design process; improving the software design process will reduce the number of software interface defects. ODC can provide quantitative data for applying root cause analysis [4*, s9.8].
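The ODC idea of linking defect categories back to originating processes can be sketched as a simple mapping. The category-to-process associations and defect tags below are illustrative assumptions, not the official ODC taxonomy.

```python
# Illustrative mapping from ODC-style defect categories to the software
# process where such defects typically originate (assumed associations).
origin_of = {
    "interface": "software design",
    "function": "software requirements",
    "assignment": "software construction",
    "timing": "software design",
}

# Detected defects, each tagged with a category during classification.
detected = ["interface", "interface", "function", "assignment", "interface"]

# Count defects per originating process to target improvement effort.
by_process = {}
for category in detected:
    process = origin_of[category]
    by_process[process] = by_process.get(process, 0) + 1

print(by_process)
```

Here most of the (assumed) defects trace back to the design process, suggesting that improving software design would yield the largest reduction in defects.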
Statistical process control can be used to track process stability, or the lack thereof, using control charts [4*, s5.7].
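A control-chart check can be sketched by computing three-sigma limits from a stable baseline period and flagging later observations that fall outside them. The defect-density values below are hypothetical.

```python
import statistics

# Hypothetical defect density (defects/KLOC) from a stable baseline period.
baseline = [4.1, 3.8, 4.4, 4.0, 3.9, 4.2, 4.1, 4.3]

mean = statistics.mean(baseline)
sigma = statistics.pstdev(baseline)
upper, lower = mean + 3 * sigma, mean - 3 * sigma

# New observations are compared against the baseline control limits;
# points outside the limits signal a process that may have become unstable.
new_points = [4.2, 6.3, 4.0]
out_of_control = [x for x in new_points if not (lower <= x <= upper)]
print(f"limits=({lower:.2f}, {upper:.2f}), out of control: {out_of_control}")
```

Deriving the limits from a baseline period, rather than from the points being judged, keeps a single large excursion from inflating sigma and masking itself.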
4.4.2 Qualitative Process Measurement Techniques

Qualitative process measurement techniques, including interviews, questionnaires, and expert judgment, can be used to augment quantitative process measurement techniques. Group consensus techniques, including the Delphi technique, can be used to obtain consensus among groups of stakeholders [1*, s6.4].
5. Software Engineering Process Tools

Software process tools support many of the notations used to define, implement, and manage individual software processes and software life cycle models. They include editors for notations such as data flow diagrams, state charts, IDEF diagrams, Petri nets, and UML activity diagrams. In some cases, software process tools allow different types of analyses and simulations (for example, discrete event simulation).

Computer-Assisted Software Engineering (CASE) tools can reinforce the use of integrated processes, support the execution of process definitions, and provide guidance to humans in performing well-defined processes. Simple tools such as word processors and spreadsheets can be used to prepare textual descriptions of processes, activities, and tasks; they can also support traceability among the inputs and outputs of multiple software processes (such as stakeholder needs analysis, software requirements specification, software architecture, and software detailed design) and the results of software processes (such as documentation, software components, test cases, and problem reports).
Most of the knowledge areas in this Guide (SWEBOK V3) describe tools that can be used to manage the processes within that KA. In particular, see the Software Configuration Management KA for a discussion of SCM tools that can be used to manage the construction, integration, and release processes for software products.

Software process tools can support projects that involve geographically dispersed (virtual) teams. Increasingly, software process tools are available through cloud computing facilities as well as through dedicated infrastructures.

A project control panel can display selected process and product attributes for software projects and indicate measurements that are within control limits and those needing corrective action. [1*, s8.7]
MATRIX OF TOPICS VS. REFERENCE MATERIAL

Reference pointers below are to Fairley 2009 [1*], Moore 2006 [2*], Sommerville 2011 [3*], and Kan 2002 [4*].

1. Software Process Definition: p177; p190, p295; p9, p28-29, p36, c5
1.1 Software Process Management: s26.1; p453-454
1.2 Software Process Infrastructure: p183, p186; p437-438
2. Software Life Cycles: c2; p190
2.1 Categories of Software Processes: preface; p294-295; c24
2.2 Software Life Cycle Models: c2; s3.2; s2.1
2.3 Software Process Adaptation: s2.7; p51
2.4 Practical Considerations: p189-190
3. Software Process Assessment and Improvement: p188, p194; c26; p397, c15
3.1 Software Process Assessment Models: s4.5, s4.6; s26.5; p44-48
3.2 Software Process Assessment Methods: p322-331; s26.3; p44-48, s16.4
3.3 Software Process Improvement Models: p187-188; s26.5; s2.7
3.4 Continuous and Staged Ratings: p28-34; s26.5; p39-45
4. Software Measurement: s26.2; s18.1.1
4.1 Software Process and Product Measurement: s6.3, p273; s26.2, p638
4.2 Quality of Measurement Results: s3.4, s3.5, s3.6, s3.7
4.3 Software Information Models: s19.2
4.4 Software Process Measurement Techniques: s6.4, c8; s5.1, s5.7, s9.4
5. Software Engineering Process Tools: s8.7
[1*] R.E. Fairley, Managing and Leading Software Projects. Hoboken, NJ: Wiley-IEEE Computer Society Press, 2009.
[2*] J.W. Moore, The Road Map to Software Engineering: A Standards-Based Guide, 1st ed. Hoboken, NJ: Wiley-IEEE Computer Society Press, 2006.
[3*] I. Sommerville, Software Engineering, 9th ed. New York: Addison-Wesley, 2011.
[4*] S.H. Kan, Metrics and Models in Software Quality Engineering, 2nd ed. Boston: Addison-Wesley, 2002.
[5] "CMMI for Development, Version 1.3," Software Engineering Institute, 2010.
[6] ISO/IEC, "ISO/IEC 15504-1:2004 Information Technology -- Process Assessment -- Part 1: Concepts and Vocabulary," 2004, p. 19.
[7] D. Gibson et al., "CMU/SEI-2006-TR-004, Performance Results of CMMI-Based Process Improvement," 2006.
Further Readings

CMMI for Development, Version 1.3 [5]

CMMI for Development, Version 1.3 provides an integrated set of process guidelines for developing and improving products and services. These guidelines include best practices for developing and improving products and services to meet the needs of customers and end users.
ISO/IEC 15504-1:2004 Information Technology -- Process Assessment -- Part 1: Concepts and Vocabulary [6]

This standard, commonly known as SPICE (Software Process Improvement and Capability Determination), includes multiple parts. Part 1 provides concepts and vocabulary for software development processes and related business management functions.

D. Gibson, D. Goldenson, and K. Kost, Performance Results of CMMI-Based Process Improvement [7]

This technical report summarizes publicly available empirical evidence about the performance results that can occur as a consequence of CMMI-based process improvement. The report contains a series of brief case descriptions created in collaboration with representatives from 10 organizations that have achieved notable quantitative performance results through their CMMI-based improvement efforts.
