Software Design, Verification and
Validation
Elements of a Code Integrity
Management System for Real-time
Embedded Systems
Joao H. Silva, Ph.D.
[email protected]
[email protected]
Problem Definition
Typical scenario:
• 1 senior manager → 178–200 project releases
• 2 releases with major problems
• 176 releases without any major problem
Question:
• How do we anticipate those two "bad releases" before they become a problem?
Consider These Statistics
• Statistic #1: For every thousand lines of code developed by software engineers, there can be as many as 20 to 30 defects on average.
• Statistic #2: In 2003, Developer News reported that 50 percent of all bug fixes are performed incorrectly the first time, often introducing new bugs in the process.
• Statistic #3: As defects progress through the development cycle, they become exponentially more expensive to fix – it is at least 30 times more costly to fix software in the field than during development (SEI, 2001).
• In automotive, releasing a bug into the field can trigger a very costly recall of the affected vehicles.
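Taken together, these statistics support a back-of-envelope risk model. The sketch below combines the 20–30 defects/KLOC figure with the SEI 30× field-fix multiplier; the per-defect development fix cost is an assumed illustrative value, not from the slides.

```python
# Back-of-envelope release-risk estimate using the figures above.
DEFECTS_PER_KLOC = 25        # midpoint of the 20-30 defects/KLOC statistic
FIELD_COST_MULTIPLIER = 30   # SEI 2001: field fixes cost >= 30x development fixes
FIX_COST_DEV = 100.0         # assumed cost (currency units) per defect fixed in development

def fix_cost(kloc, escape_rate):
    """Estimate total fix cost for a code base of `kloc` thousand lines,
    where `escape_rate` is the fraction of defects that reach the field."""
    defects = kloc * DEFECTS_PER_KLOC
    escaped = defects * escape_rate
    caught = defects - escaped
    return caught * FIX_COST_DEV + escaped * FIX_COST_DEV * FIELD_COST_MULTIPLIER

# 100 KLOC module: letting even 5% of defects escape dominates the cost.
print(fix_cost(100, 0.00))  # 250000.0 -- all defects caught in development
print(fix_cost(100, 0.05))  # 612500.0 -- 5% escape to the field
```

Even with these crude assumptions, the escape rate, not the defect count, drives the bill.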
Software Defects
• Software defects are inevitable.
• "…people who write software are human first and programmers only second – in short, they make mistakes, lots of them…"
The Economist, 2003
Where is the complexity?
Avionics:
• Boeing 747 → 0.4 M LOC
• Boeing 777 → 4 M LOC
Technology Review, 2002
Automotive:
• 2010 premium vehicle → 100 M LOC
• LOC growth 1995–2000 → 52%/year
• LOC growth 2001–2010 → 35%/year
Tony Scott, GM CIO
Use of Models
• CMM
• CMMi
• Complexity models
• Metrics Maturity Model
• TMM – Testing Maturity Model
• Others
A model to understand complexity
[Diagram: candidate solutions to a PROBLEM – HW Solution, HW/SW Co-Design, SW Solution, Quality Solution, Productivity Solution – are compared by the EFFORT they require, where effort is driven by COMPLEXITY attributes: error proneness, reliability, usability, size, change, maintainability, understandability, and MIPS.]
What influences the complexity of SW?
Size
• Module length
• Tools
• Language
• Features
• Management
• Reuse
• Outsourcing
• Unnecessary functionality
• Etc…
Error-Proneness
• Environment
• Competence
• Methods
• Unnecessary functionality
• Etc…
It is a multi-dimensional problem:
• Cost reduction
• Performance improvement
• Meeting requirements
• Architecture roadmap
• Reuse objectives
• Improving scalability
• Identifying problems
• Others
Who will succeed?
• The companies that will succeed will be those
that can develop high-quality software faster,
more reliably, and more cost-effectively than
their competitors.
Typical Software Activities
[V-model diagram; each activity is followed by a review (R):
Product Requirements → Requirements Analysis → Architectural Analysis → Detailed Designs → Coding → Unit Testing → Integration Testing → Functional Testing → Product Validation → Release, with Hardware Design proceeding in parallel.]
Process Maturity Levels Related to Metrics
Level 5 – Optimizing: improvement fed back to the process; metrics: process + feedback for changing the process
Level 4 – Managed: measured, quantitative; metrics: process + feedback for control
Level 3 – Defined: process is defined and institutionalized; metrics: product
Level 2 – Repeatable: process dependent on individuals; metrics: project
Level 1 – Initial: ad hoc; metrics: baseline
Level 2: Repeatable – Process Dependent on Individuals
[Diagram: "Construct the System" process with Input (size, volatility), Control (budget), Personnel (size, experience), and Output (size, time).]
Software size:
• Non-commented source lines of code
• Feature/function points
• Object or method count
Personnel effort:
• Actual person-months of effort
• Reported person-months of effort
Requirements volatility:
• Requirements changes
Engineers' experience:
• Domain/application
• Development architecture
• Tools and methods
• Overall years of experience
• Employee turnover
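The first size measure above, non-commented source lines (NCSL), can be sketched for C-like sources. This is a simplified counter, assuming comments use `//` and `/* */`; string literals that contain comment markers are not handled.

```python
# Minimal non-commented source lines (NCSL) counter for C-like code.
# A line counts if anything on it is code after comments are stripped.

def ncsl(source: str) -> int:
    count = 0
    in_block = False                       # inside a /* ... */ comment
    for line in source.splitlines():
        code = []
        i = 0
        while i < len(line):
            if in_block:
                end = line.find("*/", i)
                if end == -1:
                    i = len(line)          # comment continues on next line
                else:
                    in_block = False
                    i = end + 2
            elif line.startswith("//", i):
                break                      # rest of line is a comment
            elif line.startswith("/*", i):
                in_block = True
                i += 2
            else:
                code.append(line[i])
                i += 1
        if "".join(code).strip():
            count += 1
    return count

example = """\
/* header comment */
int main(void) {        // entry point
    return 0;  /* ok */
}
"""
print(ncsl(example))  # 3
```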
Level 3: Process Is Defined and Institutionalized
[Diagram: Requirements → High-Level (Architectural) Design → Detailed Design → Coding → Unit Testing → Integration Testing → Tested System; complexity and quality are measured at each stage, and methods are defined for design, detailed design, unit testing, and integration.]
Requirements complexity:
• Number of distinct objects
• Number of actions addressed
Design complexity:
• Number of design modules
• Cyclomatic complexity
Code complexity:
• Number of code modules
• Cyclomatic complexity
Test complexity:
• Path count
• Number of interfaces to test
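The cyclomatic complexity metric named above can be illustrated for Python functions; real projects would use a dedicated tool, and the branch kinds counted here are a simplification (for example, each `and`/`or` chain counts once regardless of length).

```python
# Illustrative cyclomatic-complexity counter for Python source.
import ast

BRANCH_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                ast.BoolOp, ast.IfExp)

def cyclomatic_complexity(func_source: str) -> int:
    """McCabe complexity = number of decision points + 1."""
    tree = ast.parse(func_source)
    decisions = sum(isinstance(node, BRANCH_NODES)
                    for node in ast.walk(tree))
    return decisions + 1

src = """
def classify(x):
    if x < 0:
        return "negative"
    for d in range(2, x):
        if x % d == 0:
            return "composite"
    return "prime-ish"
"""
print(cyclomatic_complexity(src))  # 4: two ifs + one for + 1
```

The same number bounds the basis-path count, which is why the slide lists path count as the corresponding test-complexity measure.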
Quality metrics:
• Defects discovered
• Defect density – defects discovered per unit size
• Requirements defects discovered
• Design defects discovered
• Code defects discovered
• Fault density for each project
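The defect-density metric above is a simple ratio of defects discovered to unit size; the sketch below computes it per KLOC. Module names and figures are invented for illustration.

```python
def defect_density(defects: int, ncsl: int) -> float:
    """Defects discovered per thousand non-commented source lines."""
    return defects / (ncsl / 1000)

# Hypothetical per-module data: (defects discovered, NCSL).
modules = {
    "engine_ctrl": (18, 12000),
    "door_hmi":    (4,  2500),
}
for name, (found, size) in modules.items():
    print(f"{name}: {defect_density(found, size):.1f} defects/KLOC")
```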
Level 4: Managed Process
[Diagram: reporting requirements from senior management drive managing the process, product, and people against a metrics database; design-defect analyses feed redesign directives back to high-level design.]
Design-defect analyses:
• Distribution of defects
• Productivity of tasks
• Planned vs. actuals
• Resource allocation
Why is it important to achieve level 4?
• Process type
• Assess reuse opportunities
• Defect identification and classification
• Analyze defect density
• Assess project completion
Level 5: Optimized Process
• An optimizing process is the highest of the process maturity levels.
• Measures from the activities defined before are used to tailor the process.
Why is it important to achieve level 5?
• Continuous assessment of the process
• Refine the appropriate metrics
• Use metrics to recommend new metrics, tools,
and techniques
• Estimate project size, costs, and schedule
• Create project databases
• Assess project costs and schedule
• Evaluate project quality and productivity
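The size, cost, and schedule estimation listed above can be sketched with the classic basic COCOMO model. The organic-mode coefficients below are textbook values; an organization with a level-4/5 metrics database would calibrate its own from past projects.

```python
# Basic COCOMO, "organic mode" coefficients (textbook values).
A, B = 2.4, 1.05

def effort_person_months(kloc: float) -> float:
    """Effort = A * KLOC^B person-months (basic COCOMO)."""
    return A * kloc ** B

def schedule_months(effort: float) -> float:
    """Development time = 2.5 * effort^0.38 months (basic COCOMO)."""
    return 2.5 * effort ** 0.38

e = effort_person_months(32)  # a hypothetical 32 KLOC project
print(f"{e:.0f} person-months over {schedule_months(e):.0f} months")
# 91 person-months over 14 months
```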
Building Code Integrity Management Systems
• Requirements Integrity Management System
• Design and Implementation Integrity Management System
Requirements Integrity Mgt System
[Diagram: customer requirements are decomposed into three reviewed (R) models that form the specification baseline feeding requirements analysis, functional testing, and product validation over time:
• Essential model – functionality
• Technical model – technical constraints: computing power, memory requirements, network bandwidth, worst-case execution time
• Quality attributes – performance, maintenance, safety, reliability, flexibility
Hardware requirements are captured alongside the baseline.]
Requirements Integrity System
[Diagram: new requirements and artifacts – HW, system, and SW component requirements, reuse components, component models, executable specifications, test procedures, V&V test plans, V&V test scripts, COTS, etc. – flow through a change management system and a configuration management system, with filtering and reporting on top.]
Problem Areas
• Maturity: mostly operating at CMMi L1–L2
• Metrics: CMMi L1–L2
• Testing: TMM L1–L4
• Executable specifications: virtually nonexistent except for the HMI/graphics
• Environment: requirements and validation teams need to work throughout the entire development life cycle
Implementation Integrity Mgt System
[Diagram: four development streams – new code (quote → requirements → design → coding → unit testing → integration testing → validation), generated code (requirements → design → coding), legacy code, and COTS – feed a change management system and a configuration management system backed by a release vault. A quality group maintains the improvement plan, standards, and certification; filtering and reporting produce status, quality, and productivity reports and a revised improvement plan.]
Test Maturity Model
TMM Level 1
Test characteristics:
• Software testing is not separate from debugging
• Tests are developed in an ad hoc way
• Tests are used only to get bugs out of the software
• The objective of tests is to show that the SW works
• Validation is separate from development
• Tools and properly trained staff are lacking
Technology characteristics:
• Accidental automation
• Experimental tryout of tools
• Static analysis (QA C) is used
• Scripts are not modified
Tools used:
• Capture/playback tool
TMM Level 2
Test characteristics:
• Testing is separate from debugging
• Goal is to show that the SW meets specifications
• Basic testing techniques and methods
• Test design standards do not exist
• Defects get into the code from other phases of development
• Primary activity: post-code execution tests
• No formal quality control program established
• No test metric program to quantify the process
Technology characteristics:
• Incidental automation
• Scripts are modified but not documented
• The automated test process is not followed
• Test requirements are not taken into consideration during tool evaluation
• Automated tests become maintainable
• Development of a software testing team
Tools used:
• Project planning tool
• Capture/playback tools
• Simulators and emulators
• Syntax and semantic analyzers
• Debugging tools
Test Maturity Model
TMM Level 3
Test characteristics:
• Testing is integrated into the software life cycle
• Test objectives are based on requirements
• A test organization exists and testing is seen as a professional activity
• Test scripts are created based on design and development standards
• No formal quality control program established
• No test metric program to quantify the test process
Technology characteristics:
• Intentional automation
• Automation becomes well defined and well managed
• Automated tests become maintainable and reusable
• Advanced automation
• Development of the software testing team
Tools used:
• All tools at the phase definition level
• Requirement management tools
TMM Level 4
Test characteristics:
• Testing is a measured and quantified process
• Scripts proceed from requirements
• All phases of development are reviewed as testing and quality control activities
• SW is tested for reliability, usability, and maintainability
• The testing team works together with product development to build a product that meets test requirements
• Test cases are recorded to a database
• Software bugs are found early
• Defects are logged and given a severity level
• Test deficiencies stem from a lack of defect prevention
Technology characteristics:
• Automation of defect tracking
Tools used:
• All tools at the integration level
• Test procedure generation tools
• Defect and change tools
• Code review tools
Test Maturity Model
TMM Level 5
Test characteristics:
• Testing is now defined and managed
• Costs are effectively monitored
• Testing is continuously improved
• Defect prevention and quality control are practiced
• The testing process is driven by statistical sampling
Technology characteristics:
• Automated test tools fully support test cases
• Automated tools provide support for test case design
• Automated tools collect, analyze, and apply test metrics
Tools used:
• All tools at the management and measurement level
• Data generation tools
• Coverage and frequency analyzers
• Statistical tools
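The coverage analyzers listed at the higher TMM levels can be illustrated with a minimal line-coverage tracer built on `sys.settrace`; production testing would use a dedicated tool such as coverage.py.

```python
# Minimal line-coverage tracer: record which lines of a function execute.
import sys

def trace_lines(func, *args):
    """Run func(*args) and record the line numbers of func that execute."""
    executed = set()
    def tracer(frame, event, arg):
        if event == "line" and frame.f_code is func.__code__:
            executed.add(frame.f_lineno)
        return tracer
    sys.settrace(tracer)
    try:
        result = func(*args)
    finally:
        sys.settrace(None)
    return result, executed

def absval(x):
    if x < 0:
        return -x
    return x

result, lines = trace_lines(absval, 5)
print(result, len(lines))  # for x=5 the `return -x` line never runs
```

Comparing the executed-line set against all statement lines yields a coverage percentage, the frequency counts a profile.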
Conclusions
• Maturity of Automotive Designs: Designs and architectures for automotive systems are a moving target. Change is continuous: end users demand more features while developers devise more elaborate solutions.
• SW Complexity: The complexity of automotive software increases at an unprecedented pace.
• "No Silver Bullet" and "No Silver Bullet Refired"
• Integrity Management System: Elements of this system have been implemented and rolled out with varying degrees of success:
• Software reuse
• CASE tools
• Higher-level languages
• Model-driven methodologies
• Extreme programming and agile software development methods
• Auto-code and auto-testing
Q&A
Architecture
An architecture is the set of significant decisions about the organization of a software system, the selection of the structural elements and their interfaces by which the system is composed, together with their behavior as specified in the collaborations among those elements, the composition of these structural and behavioral elements into progressively larger subsystems, and the architectural style that guides this organization – these elements and their interfaces, their collaborations, and their composition.
Kruchten – The Rational Unified Process
Booch, Rumbaugh, and Jacobson – The UML User Guide, Addison-Wesley, 1999