Best Practices in Performance Testing
Jennifer Turnquist
Storage Service Line Director
Lionbridge Technologies
Lionbridge Profile
 Public Company (Nasdaq: LIOX)
- Nearly $400M in revenues
- Profitable
 Deep expertise across the
application life cycle
- Application Development & Maintenance
- Testing (Independent V&V)
- Content Development, Conversion & Globalization
 Worldwide scale and capability
- Over 4,000 employees operating in 25 countries
- SEI CMM Level 5 certified process model
© 2003 Lionbridge Technologies, Inc.
8 of the world's 10 most valuable companies are clients
(BusinessWeek Global 1000, July 2004)
Services Designed around our Clients' Needs
A Trusted Partner Around the World
Global Development & Testing Solutions
Testing & Support across the Software Development Lifecycle
 Offshore platforms leverage staff in China, India, and Eastern Europe
 Global footprint enables local
interaction and facilitates
worldwide release and support
 Trusted, US-based public
company protects against IP loss
Global Language &
Content Solutions
Full Content Lifecycle
 Localization services spanning more than
80 languages
 Proprietary web-architected translation memory (TM) and terminology solution accelerates production and improves consistency
 Authoring and eLearning development
services integrate seamlessly with
localization to address global demand
VeriTest: Setting the Standard in Testing Since 1987
 World’s largest independent testing company
- Over 400 test architects, engineers, and analysts in 11
labs across US, Europe, Asia
- Rapid expansion in VeriTest India
- From PDAs and PCs to 32-way servers
- Data center class storage lab
 Industry leader
- Exclusive provider and architect of industry-leading certification programs
- Developer of PC Magazine benchmarks
- Tests and publishes industry-standard ISP benchmarks
- Operates a globally-networked onsite-to-offshore model
The Lionbridge Team
Local Connections, Global Efficiency
4,000+ Worldwide Staff
Experience and Efficiency
Today’s Agenda
• Why Test Performance?
• Different Types of Performance Testing
• Performance Testing Roadmap
• Choosing the Right Testing Tools
• Top 10 Performance Testing Pitfalls
 "The standard philosophy of 'test to destruction'... will
probably give you an idea of roughly how many users your
site can handle at once, but it won't always tell you why the
site fails to function properly. And without knowing why,
you're not likely to be able to do much about it..."
--Extreme Tech
Why Test Performance?
 The internet and IT infrastructure are crucial to business
 Users—employees, business partners, customers—rely on
portals, applications, and data to do their jobs
 Cost of failure can be devastating
 Performance testing in the enterprise is intermittent,
cyclical, often prompted by upgrades
 Testing is highly specialized
The high cost of not conducting performance testing
 Performance testing overlooked until disaster strikes
 Lost and abandoned sales are the most visible result of poor performance, but not the only cost
 Efficiency of mission-critical systems directly impacts business
 Preventing problems—lost productivity, lost business, lost reputation,
and even injury or death—is a major incentive
 Knowing the vital performance metrics gives IT departments ammunition when planning and justifying purchasing decisions
 Provides the ability to demonstrate to investors and other critical
stakeholders that the company’s infrastructure is adequate
Events that trigger performance testing
 Build vs. buy
 Addition of features
 Evolving requirements
 Response to public critique
 Technology due diligence
 Enhancements due to buying
 Consolidating servers
 Acquiring or merging a business
 Deploying a SAN
 Launching a new product
 Deploying or upgrading an enterprise application
 Enhancing a web application
 Migrating to a new platform
 Doing any of the above globally
 Promoting an offering
Performance testing is not a one-time event.
Conduct the right test to get the right results
• Load Testing
- Determines the response time and throughput during typical user load
• Stress Testing
- Determines the peak user load
• Volume Testing
- Determines the problems that occur during long-term user activity
• Component Testing
- Determines the performance and behavior of a specific component
• Benchmark Testing
- Measures the performance of a system or component relative to a baseline
• Transaction Cost Analysis
- Determines the system resources consumed by a single transaction
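To make the load-testing category above concrete, here is a minimal sketch of a load test: N concurrent "virtual users" each drive a stubbed transaction and the harness reports throughput and average response time. The transaction, user counts, and timings are all invented for illustration; a real test would replace `transaction()` with an actual request against the system under test.

```python
import threading
import time

def transaction():
    """Stand-in for one user transaction (e.g. an HTTP request)."""
    time.sleep(0.01)

def load_test(virtual_users=10, iterations=5):
    """Run `virtual_users` concurrent workers, each executing the
    transaction `iterations` times, and collect per-call timings."""
    timings = []
    lock = threading.Lock()

    def user():
        for _ in range(iterations):
            start = time.perf_counter()
            transaction()
            elapsed = time.perf_counter() - start
            with lock:                      # timings list is shared
                timings.append(elapsed)

    threads = [threading.Thread(target=user) for _ in range(virtual_users)]
    wall_start = time.perf_counter()
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    wall = time.perf_counter() - wall_start

    return {
        "transactions": len(timings),
        "throughput_tps": len(timings) / wall,
        "avg_response_s": sum(timings) / len(timings),
    }

print(load_test()["transactions"])  # 10 users x 5 iterations = 50
```

Raising `virtual_users` until response times degrade turns the same harness into a crude stress test; running it for hours approximates a volume test.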
Performance Testing Roadmap
Plan:
•Identify stakeholders
•Outline context
•Agree on goals of testing
•Determine budget
•Determine schedule
•Determine staffing plan
•Assign resources needed
•Outline resources for scripting, testing
•Choose the tool(s)
•Engage test lab (if needed)
Build:
•Capture user activity
•Analyze user activity profile
•Model user activity
•Generate use cases
•Design scripts
•Create scripts
•Build scripts
•Validate scripts
•Identify re-usable script components
•Create test environment
•Verify basic functionality
Execute:
•Execute tests
•Run possible iterations
Analyze:
•Analyze test results
•Troubleshoot bottlenecks
•Tune system
•Log non-performance action items
•Provide feedback
Report:
•Draft results
•Finalize report(s)
•Agree to promotion
•Promote results
An overview of the performance testing process
 After initiating the test, the load generator systems begin accessing the system under test using the designed usage patterns.
 Depending on whether the test is a global, local, or isolated
configuration, the load generators may be located worldwide or
completely contained within a test lab.
 The one critical configuration requirement for the load-generating
systems is that they have adequate network bandwidth throughput
capability to access the system under test in a realistic manner without
bandwidth constraints.
 If bandwidth constraints become a problem, adding load generators to the pool will typically fix it.
 If the test is a global or local configuration, the Internet will be an important factor in the configuration. For an isolated configuration, the Internet is not a factor.
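The bandwidth sizing described above can be estimated with simple arithmetic. The sketch below computes how many load-generator machines are needed so that no single generator becomes a bandwidth bottleneck; all of the numbers (per-user bandwidth, generator capacity, headroom factor) are hypothetical.

```python
import math

def generators_needed(virtual_users: int,
                      kbps_per_user: float,
                      generator_capacity_kbps: float,
                      headroom: float = 0.7) -> int:
    """Estimate how many load-generator machines are required so that
    each generator stays below its usable bandwidth. `headroom` keeps
    generators under raw NIC capacity (e.g. use only 70% of the link)."""
    usable = generator_capacity_kbps * headroom
    users_per_generator = max(1, math.floor(usable / kbps_per_user))
    return math.ceil(virtual_users / users_per_generator)

# Example: 2,000 virtual users at 50 kbps each, 100 Mbps generators
print(generators_needed(2000, 50, 100_000))
```

A quick calculation like this, done before the test, avoids discovering mid-run that the generators, not the system under test, were the constraint.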
An overview of the performance testing process
 Once a performance test is initiated, it can run for several minutes to
several days, depending on the test goal.
 During the test time, the test tool monitors and collects performance
data from all of the components within the system under test, such as
the Web server, application server, or database server.
 All of the monitor data is analyzed along with the performance test data collected at the generating client end to determine the overall performance as well as the potential system bottlenecks.
 In a typical performance test cycle, the performance bottlenecks are located, fixed, and iteratively retested to ensure that they are fixed as expected.
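The analysis step above usually reduces the raw per-request timings collected at the client end to a few summary statistics. A minimal sketch, using a nearest-rank percentile and made-up sample timings:

```python
def summarize(response_times_ms):
    """Reduce raw per-request timings to the summary statistics most
    load-test reports use: average, median, 95th percentile, and max."""
    ordered = sorted(response_times_ms)
    n = len(ordered)

    def percentile(p):
        # nearest-rank percentile on the sorted sample
        idx = min(n - 1, max(0, round(p / 100 * n) - 1))
        return ordered[idx]

    return {
        "avg": sum(ordered) / n,
        "p50": percentile(50),
        "p95": percentile(95),
        "max": ordered[-1],
    }

# Hypothetical timings: one slow outlier dominates the tail
stats = summarize([120, 95, 300, 110, 105, 98, 2400, 130, 115, 101])
print(stats["p95"], stats["max"])
```

Note how the average (about 357 ms here) hides the 2.4-second outlier that the 95th percentile exposes; this is why percentile metrics, not averages, are the usual basis for locating bottlenecks.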
High Level Picture of the Process
Overcome resource limitations
• Replace testers with “Virtual Users”
• Run many Virtual Users on few machines
• Controller manages Virtual Users
• Run repeatable tests with scripted actions
• Get meaningful results with analysis tools
[Diagram: Controller driving Virtual Users against the System under Test]
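The "repeatable tests with scripted actions" bullet can be sketched as a scripted virtual-user session: each step names an action and a fixed "think time," so every run paces the same way. The step names and think times below are invented for illustration.

```python
import time

# A scripted virtual-user scenario: each step is (action_name, think_time_s).
SCRIPT = [
    ("open_home_page", 0.0),
    ("log_in", 0.01),
    ("search_catalog", 0.02),
    ("add_to_cart", 0.01),
    ("check_out", 0.0),
]

def run_script(script, perform):
    """Replay one scripted session: call `perform(step)` for each step,
    then wait the scripted think time so pacing is repeatable."""
    log = []
    for step, think in script:
        perform(step)
        log.append(step)
        time.sleep(think)
    return log

# A controller would hand this same script to every virtual user,
# making runs repeatable; here we replay it once with a no-op action.
executed = run_script(SCRIPT, perform=lambda step: None)
print(len(executed), executed[0])
```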
You don’t have to go it alone
Build and train internal resources
Hire contractors
Utilize service offerings from test tool vendors
Rely on application provider
Engage with a consulting firm or SI
Partner with an independent testing company
Important considerations for choosing the resources
 Deadlines
 Testing skills and experience
 Technology and/or application expertise
 Frequency and scale of testing requirements
 Infrastructure requirements
 Risk assessment
 Market factors
The vast number of performance testing tools can be overwhelming.
Important considerations for choosing the right tool
 Do you already own the license?
 Do you have the internal resources to script and execute?
 Will it meet the test objectives?
 Is it compatible with your technology objectives?
 Does it fit within your budget constraints?
 Do you have the training and expertise to analyze the results?
 Does it match the frequency of your testing needs?
Leading performance tools

Mercury Interactive
•Pros: Compatible with numerous protocols; excellent data analysis tools; WAN emulation; Web transaction breakdown
•Cons: Expensive license; requires a unique license for each protocol type

Segue SilkPerformer
•Pros: "Lite" version offered at reduced cost; excellent data analysis tools; root cause analysis tools
•Cons: Windows only; expensive license

RadView WebLoad
•Pros: Inexpensive license; good data analysis tools; no additional hardware required
•Cons: Covers few protocols; primarily Web-based automation

Spirent Avalanche/Reflector
•Pros: Uses SST TracePlus to provide record and playback feature
•Cons: Expensive hardware
Popular Benchmark Tools
Benchmark Tool: Workload Simulated
•VeriTest NetBench: CIFS file / network traffic
•VeriTest WebBench: HTTP / Web traffic
•Microsoft LOADSIM: Exchange email traffic
•SPEC Java Business: Java business workload
Other workloads covered by VeriTest, Microsoft, and industry benchmarks: HTTP(S) traffic, streaming media, SQL database traffic, CPU exercise, NFS file / network traffic, block-level data transfer, OLTP (database) traffic, terminal services traffic
Manual testing may be your best tool
• Frequency of testing requirements
• Rate of change
• Limitations of available tools
Top 10 Performance Testing Pitfalls
1. Not Testing.
2. Lack of clearly defined test objectives and poor planning.
3. Relying exclusively on beta customers to find performance issues.
4. Using the wrong tool for the job.
5. Introducing too many variables simultaneously into a test.
6. Failing to test how your product or system is actually used.
7. Conducting load testing in a vacuum.
8. Treating performance testing like a one-time event.
9. Assuming that the scripting effort will be short and simple.
10. Finding functional bugs during performance runs.
 Companies rely on systems to conduct business efficiently and effectively
 Performance testing ensures that your users are getting reliable and
timely access to the resources they need
 Performance testing mitigates the risk of lost time and money due to
poor performance
 A fully integrated performance testing program is the preventative
medicine that keeps your system from becoming an inaccessible and
costly resource.
 Though it may seem counterintuitive at first to slow your deployment
for performance test planning and execution, the payoff in time, money,
and quality will be big and will come soon.
Thank You

Best Practices in Performance Testing