BSI PD CEN/CLC/TR 17602-80-11:2021
$215.11
Space product assurance. Software process assessment and improvement – Framework
Published By | Publication Date | Number of Pages |
---|---|---|
BSI | 2021 | 126 |
This handbook defines methods for process assessment and improvement that may be used to meet the process assessment and improvement requirements of ECSS-Q-ST-80C subclause 5.7. These methods constitute a clear and proven way of implementing those requirements. Alternative methods can be used, provided that they comply with the detailed instructions given in this handbook for the recognition of software process assessment schemes, assessment results, and process improvement.
This handbook provides a detailed method for implementing the requirements of ECSS-Q-ST-80C for software process assessment and improvement. It also establishes detailed instructions for alternative methods intended to meet the same ECSS-Q-ST-80C requirements.
The process assessment and improvement scheme presented in this handbook is based on, and conformant to, the ISO/IEC 15504 International Standard. In designing this scheme, the ISO/IEC 15504 exemplar process assessment model was adopted and extended to address ECSS-specific requirements.
The methods provided in this handbook can support organizations in meeting their business goals, and in this context they can be tailored to suit specific needs and requirements. However, when the methods are used to claim compliance with the relevant requirements in ECSS-Q-ST-80C, only the steps and activities explicitly marked as recommended in this handbook may be omitted or modified.
PDF Catalog
PDF Pages | PDF Title |
---|---|
13 | 1 Scope |
14 | 2 References |
16 | 3 Terms, definitions and abbreviated terms 3.1 Terms from other documents 3.2 Terms specific to the present document |
22 | 3.3 Abbreviated terms |
24 | 4 Organisation and purpose 4.1 Organization of this handbook 4.2 Relation to Standards 4.2.1 Relation versus the ECSS family 4.2.1.1 Relation to software engineering 4.2.1.2 Relation to software product assurance |
25 | 4.2.1.3 Relation to project management |
26 | 4.2.2 Relation versus ISO/IEC |
27 | 4.3 S4S process assessment and improvement overview 4.3.1 S4S architecture |
28 | 4.3.2 S4S assessment purposes 4.4 Use of other schemes and standards |
29 | 5 S4S process assessment model 5.1 Introduction |
30 | 5.2 S4S process dimension |
34 | 5.3 The capability dimension 5.3.1 General 5.3.1.1 Capability level |
36 | 5.3.1.2 Process attributes and rating scale |
38 | 5.3.2 Level 0: Incomplete process 5.3.3 Level 1: Performed process 5.3.3.1 PA 1.1 Process performance attribute 5.3.4 Level 2: Managed process 5.3.4.1 PA 2.1 Performance management attribute |
39 | 5.3.4.2 PA 2.2 Work product management attribute 5.3.5 Level 3: Established process 5.3.5.1 PA 3.1 Process definition attribute 5.3.5.2 PA 3.2 Process deployment attribute |
40 | 5.3.6 Level 4: Predictable process 5.3.6.1 PA 4.1 Process measurement attribute 5.3.6.2 PA 4.2 Process control attribute |
41 | 5.3.7 Level 5: Optimizing process 5.3.7.1 PA 5.1 Process innovation attribute 5.3.7.2 PA 5.2 Process optimization attribute |
42 | 6 Process assessment method 6.1 Introduction |
43 | 6.2 Assessment process definition 6.2.1 Introduction |
45 | 6.2.2 Assessment process 6.2.2.1 Assessment initiation |
49 | 6.2.2.2 Assessment planning |
52 | 6.2.2.3 Briefing (recommended activities) 6.2.2.4 Data acquisition |
54 | 6.2.2.5 Data validation |
55 | 6.2.2.6 Process rating |
56 | 6.2.2.7 Assessment reporting and recording |
59 | 6.2.2.8 Inputs to the risk management process (recommended activity) |
62 | 6.2.3 Assessment actors and roles 6.2.3.1 Introduction 6.2.3.2 Assessment sponsor (AS) |
63 | 6.2.3.3 Local assessment coordinator (LAC) 6.2.3.4 Assessment team leader (ATL) |
64 | 6.2.3.5 Other assessors in the Assessment Team (AT) 6.2.3.6 Technical specialists (AT) 6.2.3.7 Observers 6.2.3.8 Assessment participants (AP) |
65 | 6.2.3.9 Organizational unit (OU) 6.3 Assessment process guidance 6.3.1 Introduction 6.3.2 Selection of assessment purpose |
66 | 6.3.3 Assessment guidance for capability determination 6.3.3.1 Introduction 6.3.3.2 Assessment purpose (INI 3) 6.3.3.3 Selecting the assessment team (INI 7) 6.3.3.4 Assessment scope (INI 10) |
69 | 6.3.3.5 Data and data validation criteria (INI 11) 6.3.3.6 Defining ownership and responsibilities for assessment outputs (INI 11) 6.3.3.7 Mapping the OU to the S4S model (INI 12) 6.3.3.8 Selecting participants (INI 13) |
70 | 6.3.3.9 Assessment schedule (PLN 1) 6.3.3.10 Assessment data collection (PLN 3) |
71 | 6.3.3.11 Verify conformance to requirements (PLN 5) 6.3.3.12 Briefing (BRF 1 and BRF 2) 6.3.3.13 Data collection (DAT 1) 6.3.3.14 Rating of process attributes (PRT 2) |
72 | 6.3.3.15 Process profiles (PRT 3) 6.3.3.16 Assessment instruments |
73 | 6.3.3.17 Act on results 6.3.4 Assessment for process improvement 6.3.4.1 Introduction 6.3.4.2 Assessment scope (INI 10) 6.3.4.3 Data acquisition (DAT 1 and DAT 2) 6.3.4.4 Process rating (PRT 1) |
74 | 6.3.5 Assessment for ECSS conformance 6.3.5.1 Introduction 6.3.5.2 Planning for mandatory base practices (PLN 2) |
75 | 6.3.5.3 Data acquisition (DAT 1 and DAT 2) 6.3.5.4 Process rating (PRT 1) 6.3.5.5 Act on results 6.4 Competency of assessors 6.4.1 Introduction |
76 | 6.4.2 Gaining competency 6.4.2.1 General 6.4.2.2 Key relationships |
77 | 6.4.2.3 Levels of competency 6.4.3 Maintaining competency 6.4.4 Verification of competency |
78 | 6.4.5 Assessor competence instructions |
79 | 6.4.6 Assessor experience instructions |
80 | 7 Process improvement 7.1 Introduction |
81 | 7.2 Process improvement cycle 7.2.1 Introduction |
82 | 7.2.2 Process improvement process 7.2.2.1 Examine the organization’s needs and business goals Role: (TM), IPM |
83 | 7.2.2.2 Initiate process improvement |
85 | 7.2.2.3 Prepare for and conduct a process assessment Role: (see subclause 6.2) |
86 | 7.2.2.4 Analyse assessment output and derive action plan(s) |
91 | 7.2.2.5 Implement improvements Role: (SPIG) |
92 | 7.2.2.6 Confirm improvements Role: (IPM), IPL, SPIG, OU, AT 7.2.2.7 Sustain improvements Role: (TM), OU |
93 | 7.2.2.8 Monitor performance Role: (IPM), OU 7.2.2.9 Management of the process improvement project Role: (IPM), IPL 7.2.3 Roles and responsibilities 7.2.3.1 Top management (TM) |
94 | 7.2.3.2 Improvement programme manager (IPM) 7.2.3.3 Improvement project leader (IPL) |
95 | 7.2.3.4 Software process improvement group (SPIG) members 7.2.3.5 Process owners (PO) 7.2.3.6 (Staff of the) Organizational unit (OU) 7.2.3.7 Assessment team (AT) |
96 | 7.3 Special considerations for the success of process improvement 7.3.1 Ensuring the ongoing commitment of management 7.3.2 Values, attitudes and behaviour |
97 | 7.3.3 Short term benefits 7.3.4 Collection of baseline data 7.3.5 Information policy 7.3.6 Select and use pilot projects 7.3.7 Incremental implementation |
98 | 7.3.8 Training, mentoring, coaching 7.3.9 Communication and teamwork |
99 | 7.3.10 Recognition 7.4 Software process improvement failure factors 7.4.1 Exclusive top-down or bottom-up improvement 7.4.2 Unsuitable pilot project 7.4.3 Confining to training 7.4.4 Confining to CASE tools |
100 | 7.4.5 Confining to capability levels 7.4.6 Too many promises 7.4.7 Late impact 7.5 Recognition of process improvement 7.5.1 Introduction |
101 | 7.5.2 The process improvement cycle 7.5.2.1 General 7.5.2.2 Initiate process improvement 7.5.2.3 Prepare for and conduct a process assessment 7.5.2.4 Analyse assessment output and derive action plan(s) 7.5.2.5 Implement improvements |
102 | 7.5.2.6 Confirm improvements 7.5.2.7 Sustain improvements 7.5.2.8 Review improvement programme 7.5.2.9 Management of the process improvement project |
103 | 8 Recognition of assessment schemes and results 8.1 Introduction 8.2 Recognition of assessment schemes 8.2.1 General 8.2.2 Recognition of the use of S4S 8.2.3 Recognition of other schemes 8.2.3.2 Process assessment model scope |
104 | 8.2.3.3 Process assessment model indicators 8.2.3.4 Mapping process assessment models to process reference models 8.2.3.5 Recognition of assessment methods |
107 | 8.3 Recognition of S4S results |
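The capability dimension outlined in clause 5.3 of the handbook (levels 0 to 5, each defined by process attributes rated on the N/P/L/F scale) follows the ISO/IEC 15504 measurement framework on which S4S is based. The sketch below models that structure under stated assumptions: the attribute names are taken from the catalog entries above, while the rating thresholds and the level-achievement rule (all lower-level attributes Fully achieved, current-level attributes at least Largely achieved) follow ISO/IEC 15504-2; the function and variable names are illustrative only, not part of the S4S scheme.

```python
# Illustrative model of the ISO/IEC 15504 capability dimension used by S4S
# (clause 5.3). Attribute names per the handbook's table of contents;
# thresholds and level rule per ISO/IEC 15504-2.

ATTRIBUTES = {
    1: ["PA 1.1 Process performance"],
    2: ["PA 2.1 Performance management", "PA 2.2 Work product management"],
    3: ["PA 3.1 Process definition", "PA 3.2 Process deployment"],
    4: ["PA 4.1 Process measurement", "PA 4.2 Process control"],
    5: ["PA 5.1 Process innovation", "PA 5.2 Process optimization"],
}

def rating(percent: float) -> str:
    """Map an attribute achievement percentage to the N/P/L/F ordinal scale."""
    if percent <= 15: return "N"   # Not achieved (0 to 15 %)
    if percent <= 50: return "P"   # Partially achieved (> 15 to 50 %)
    if percent <= 85: return "L"   # Largely achieved (> 50 to 85 %)
    return "F"                     # Fully achieved (> 85 to 100 %)

def capability_level(ratings: dict) -> int:
    """Derive the capability level from attribute ratings.

    Level k is achieved when the attributes of all lower levels are rated
    F and the level-k attributes are rated L or F (ISO/IEC 15504-2 rule).
    `ratings` maps attribute name -> 'N'/'P'/'L'/'F'; missing = not rated.
    """
    level = 0
    for k in range(1, 6):
        attrs = ATTRIBUTES[k]
        if all(ratings.get(a) in ("L", "F") for a in attrs):
            level = k
            if any(ratings.get(a) != "F" for a in attrs):
                break  # L at this level: higher levels cannot be claimed
        else:
            break
    return level
```

For example, an organizational unit with PA 1.1 rated F and both level-2 attributes rated at least L would be assessed at capability level 2, which is the kind of process profile produced by the rating step (PRT 3) of the assessment method in clause 6.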