Software Engineering Tools
Research on Only $10 a Day
William Griswold
University of California, San Diego
UW-MSR Workshop:
Accelerating the Pace of Software Tools Research: Sharing Infrastructure
August 2001
1. How program analysis is used in software
engineering, and how that impacts research
2. Issues for tool implementation, infrastructure
3. Infrastructure approaches and example uses
– My lab’s experiences, interviews with 5 others
– Not every infrastructure out there, nor IDE infras
4. Lessons learned
5. Challenges and opportunities
Base Assumptions
• Software engineering is about coping with the
complexity of software and its development
– Scale, scope, arbitrariness of real world
• Evaluation of SE tools is best done in settings
that manifest these complexities
– Experiment X involves a tool user with a need
– Hard to bend real settings to your tool
• Mature infrastructure can put more issues within
reach at lower cost
– Complete & scalable tools, suitable for more settings
Role of Program Analysis in SE
Discover hidden or dispersed program properties,
display them in a natural form,
and assist in their change
• Behavioral: find/prevent bugs; find invariants
– PREfix, Purify, HotPath, JInsight, DejaVu, TestTube
• Structural: find design anomalies, architecture
– Lackwit, Womble, RM, Seesoft, RIGI, slicers
• Evolutionary: enhance, subset, restructure
– Restructure, StarTool, WolfPack
Analysis Methods
• Dynamic
– Trace analysis
– Testing
• Static
– Lexical (e.g., grep, diff)
– Syntactic
– Data-flow analysis, abstract interpretation
– Constraint equation solving
– Model checking; theorem proving
Issues are remarkably similar across methods
Use in Iterative Analysis Cycle
Steps 2-4 may be done
ad hoc
Programmer identifies problem
automation, an interactive
Is this horrific hack the cause tool,
of my
or bug?
a batch tool.
2. Choose program source model and analysis
I think I’ll do a slice withUser-tool
Sprite. [data-flow
“interface” is
and dynamic.
3. Extract (and analyze)rich
[Programmer feeds code to slicer, chooses variable
reference in code that has wrong value]
4. Render model (and analysis)
[Tool highlights reached source text]
5. Reason about results, plan course of action
Nope, that hack didn’t get highlighted…
Interactive, Graphical, Integrated
The Perfect Tool User
"Your tool will solve all sorts of problems. But it'll have to analyze my entire 1 MLOC system, which doesn't compile right now, and is written in 4 languages. I want the results as fast as compilation, with an intuitive graphical display linked back to the source and integrated into our IDE. I want to save the results, and have them automatically updated as I change the program. Oh, I use Windows, but some of my colleagues use Unix. It's OK if the tool misses stuff or returns lots of data, we can post-process. We just want a net win."
For our most recent tool, the first study was a 500 KLOC Fortran/C system developed on SGIs.
Unique Infrastructure Challenges
• Wide-spectrum needs (e.g., GUI)
– Provide function and/or outstanding interoperability
• Whole-program analysis versus interactivity
– Demand, precompute, reuse [Harrold], modularize
• Source-to-source analysis and transformation
– Analyze, present, modify as programmer sees it
• Ill-defined task space and process structure
Saving grace is the programmer: intelligent, adaptive
– Can interpret, interpolate, iterate; adjust process
– Requires tool (and hence infrastructure) support
Infrastructure Spectrum
• Monolithic environment
– Generative environment (Gandalf, Synthesizer
Generator), programming language (Refine)
– Reuse model: high-level language
• Generator (compiler) or interpreter
• Component-based
– Frameworks, toolkits (Ponder), IDE plug-in support
– Reuse model: interface
• Piecewise replacement and innovation
• Subclassing (augmentation, specialization)
Monolithic Environments
• Refine: syntactic analysis & transformation env [Reasoning]
– Powerful C-like functional language w/ lazy eval.
– AST datatype w/grammar and pattern language
– Aggregate ADT’s, GUI, persistence, C/Cobol targets
– WolfPack C function splitter took 11 KLOC (1/2 representation code, 5% LISP), no pointer analysis; slow [Lakhotia]
• CodeSurfer: C program slicing tool [GrammaTech]
– Rich GUI, PDG in repository, Scheme “back door”
– ~500 LOC to prototype globals model [Lakhotia]
– Not really meant for extension, code transformation
Great for prototyping and one-shot tasks
Components Overview
1. Standalone components
– Idea: “Ad hoc” composition, lots of choices
– Example Component: EDG front-ends
– Example Tools: static: Alloy [Jackson]; dynamic: Daikon [Ernst]
2. Component architectures
– Idea: Components must conform to design rules
– Examples: data arch: Aristotle [Harrold]; control arch: Icaria [Atkinson]
3. Analyses (tools) as components
– Idea: Infrastructure-independent tool design
– Example: StarTool [Hayes]
Standalone Components
• Component generators
– Yacc, lex, JavaCC, Jlex, JJTree, ANTLR …
– Little help for scoping, type checking (symbol tables); see the sketch below
• Representation packages for various languages
– Icaria (C AST), GNAT (Ada), EDG (*), …
• GUI systems galore, mostly generic
– WFC, Visual Basic, Tcl/Tk, Swing; dot, vcg
• Databases and persistence frameworks
• Few OTS analyses available
– Model checkers and SAT constraint solvers
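To make the "little help for scoping" point concrete: a generator emits the parser, but a lexically scoped symbol table is the kind of piece you still write by hand. A minimal Java sketch, with all names hypothetical:

import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashMap;
import java.util.Map;

// Minimal lexically scoped symbol table: a stack of maps.
// This is the part Yacc/JavaCC-style generators don't provide.
class SymbolTable {
    private final Deque<Map<String, String>> scopes = new ArrayDeque<>();

    SymbolTable() { enterScope(); }  // global scope

    void enterScope() { scopes.push(new HashMap<>()); }
    void exitScope()  { scopes.pop(); }

    void declare(String name, String type) { scopes.peek().put(name, type); }

    // Innermost declaration wins, as in C/Java scoping.
    String lookup(String name) {
        for (Map<String, String> scope : scopes) {
            String t = scope.get(name);
            if (t != null) return t;
        }
        return null;  // undeclared
    }
}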
Edison Design Group Front-Ends
• Front-ends for C/C++, Fortran, Java (new)
– Lexing, parsing, elaborated AST, generates C
• Thorough static error checking
– Know what you get, but not robust to errors
• API’s best for translation to IR
– Simple things can be hard; white-box reuse
• Precise textual mappings
– C/C++ AST is post-processed, but columns correct
• C++ front-end can’t handle some features
Alloy Tool
Ad Hoc Component Example, Static Analysis
• Property checker for Alloy OO spec language
– Takes spec and property, finds counterexamples
– Uses SAT constraint solvers for analysis back-end
– Spec language designed explicitly for analyzability
• Front-end
– Wrote own lexer (JLex), parser (CUP), AST
– Eased because of analyzability
• Translation to SAT formula “IR”
– Aggregate is mapped to collection of scalars
– Several stages of formula rewriting
Alloy, cont’d
• Uses 3 SAT solvers, each with strengths
– National challenge resulted in standard SAT "IR" (see sketch below)
– Allowed declarative format for hooking in a solver
• Java Swing for general GUI, dot for graphs
– Scalars are mapped back to aggregates, etc., and
results are reported as counterexamples
– Currently don’t map results directly back to program
• Expects to use variables as a way to map to source
• About 20 KLOC of new code to build Alloy
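The standard SAT "IR" referred to above is plausibly the DIMACS CNF format used by the solver competitions. A minimal Java sketch of emitting it; the class and method names are hypothetical:

import java.io.PrintWriter;
import java.util.List;

// Emit a formula in DIMACS CNF, the de facto standard solver input:
// a header "p cnf <vars> <clauses>", then one 0-terminated clause per
// line. Positive integers are variables; negation is a minus sign.
class DimacsEmitter {
    static void emit(PrintWriter out, int numVars, List<int[]> clauses) {
        out.printf("p cnf %d %d%n", numVars, clauses.size());
        for (int[] clause : clauses) {
            StringBuilder line = new StringBuilder();
            for (int lit : clause) line.append(lit).append(' ');
            out.println(line.append('0'));
        }
        out.flush();
    }
}

// e.g., (x1 or not x2) and (x2 or x3):
//   emit(out, 3, List.of(new int[]{1, -2}, new int[]{2, 3}));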
Alloy: Lessons
• Designing for analyzability a major benefit
– Eases all aspects of front-end and translation to SAT
– Adding 3 kinds of polymorphism added 20 KLOC!
• SAT solver National Challenge a boon
– Several good solver components
– Standard IR eased integration
• SAT solver start/stop protocol the hardest
– Primitive form of computational steering
– Subprocess control, capturing/interpreting output
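A minimal Java sketch of the subprocess-control half of that protocol, assuming a generic command-line solver; the command, arguments, and timeout policy are hypothetical:

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.concurrent.TimeUnit;

// Drive an external SAT solver as a subprocess: feed it a CNF file,
// capture its output, and stop it if the user gives up.
class SolverRunner {
    static String run(String solverCmd, String cnfPath, long timeoutSec) throws Exception {
        Process p = new ProcessBuilder(solverCmd, cnfPath)
                .redirectErrorStream(true).start();
        StringBuilder out = new StringBuilder();
        // Pump output on a separate thread so a forced stop can't deadlock
        // on a full pipe, and partial output survives for interpretation.
        Thread pump = new Thread(() -> {
            try (BufferedReader r = new BufferedReader(
                    new InputStreamReader(p.getInputStream()))) {
                String line;
                while ((line = r.readLine()) != null) out.append(line).append('\n');
            } catch (IOException ignored) {}
        });
        pump.start();
        if (!p.waitFor(timeoutSec, TimeUnit.SECONDS)) {
            p.destroyForcibly();  // the crude "stop" half of the protocol
        }
        pump.join();
        return out.toString();
    }
}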
Daikon Tool
Ad Hoc Component Example, Dynamic Analysis
• Program invariant detector for C and Java
– Instruments program at proc entries/exits, runs it (sketch below)
– Infers variable value patterns at program points
• Programs with test-suites have been invaluable
– Class programs with grading suites
– Siemens/Rothermel C programs with test-suites
• Front-end the least interesting, 1/2 the work
– Parser, symbol table, AST/IR manipulation, unparser
• Get any two: manipulation toys with symbol table
• Symbol table the hardest, unparser the easiest
– Lots of choices, a few false starts
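A minimal sketch of what the inserted entry/exit instrumentation amounts to, with a hypothetical Trace helper and log format (not Daikon's actual output):

// Hypothetical instrumented form of a user method: the front-end
// records variable values at procedure entry and exit, and the
// back-end later infers patterns (e.g., "amount >= 0") over the logs.
class Account {
    int balance;

    int withdraw(int amount) {
        Trace.programPoint("Account.withdraw:ENTER", "balance", balance, "amount", amount);
        balance -= amount;
        int result = balance;
        Trace.programPoint("Account.withdraw:EXIT", "balance", balance, "return", result);
        return result;
    }
}

class Trace {
    // Log name/value pairs for one program point (hypothetical format).
    static void programPoint(String ppt, Object... nameValuePairs) {
        StringBuilder sb = new StringBuilder(ppt);
        for (int i = 0; i + 1 < nameValuePairs.length; i += 2) {
            sb.append(' ').append(nameValuePairs[i]).append('=').append(nameValuePairs[i + 1]);
        }
        System.err.println(sb);
    }
}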
Daikon: Choosing Java Front-End
• Byte-code instrumenters (JOIE, Bobby)
– Flexible and precise insertion points
– Loss of names complicates mapping to source
– Byte codes generated are compiler dependent
– Debugging voluminous instrumentation is hard
• Source-level instrumentation
– Java lacks "insertability", e.g., no comma operator (sketch below)
– Invalidates symbol table, etc.
– Chose Jikes, an open source compiler (got 2 of 4)
• Added AST manipulation good enough to unparse
• New byte-code instrumenters; EDG for Java
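The "insertability" problem made concrete: a probe cannot be spliced into an expression like f(g(x)) in Java, so a source-level instrumenter must rewrite it into statements. A minimal sketch with a hypothetical probe:

class Insertability {
    static int g(int x) { return x + 1; }
    static void f(int x) { /* ... */ }

    static void run(int x) {
        // Can't splice a probe into f(g(x)) in place; Java has no comma
        // operator, so the instrumenter must rewrite to statements:
        int tmp = g(x);                          // hoisted subexpression
        System.err.println("after g: " + tmp);   // inserted probe
        f(tmp);
    }
}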
Ad Hoc Components: Critique
• Freedom is great, but integration is weak
– Data bloat: replicated and unused functionality
– Minimal support for mapping between reps
• Data: implementation of precise mappings
• Control: synchronize to compute only what’s needed
• Scalability a huge issue; data-flow information for a 1 MLOC program, highly optimized:
– 500 MB AST
– 500 MB bit-vectors
– Space translates to time by stressing memory
Component-based architecture to the rescue
Data-based Component Architecture
Aristotle Infrastructure
• Data-flow analysis and testing infra for C
• Database is universal integration mechanism
– Provides uniform, loose integration
• Separately compiled tools can write and read DB (see sketch below)
– Added ProLangs framework [Ryder] at modest cost
• Scalability benefits
– Big file system overcomes space problem
– Persistence mitigates time problem
• Performance still an issue, hasn’t been focus
– Loose control integration produces reps in toto
– DB implemented with flat files
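A minimal Java sketch of this loose, data-based integration style, assuming a hypothetical tab-separated "calls" relation in a flat file (not Aristotle's actual schema):

import java.io.IOException;
import java.io.PrintWriter;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

// Loose data integration: tools share a relation via a flat file,
// here a hypothetical "calls" relation of caller/callee pairs.
class CallsRelation {
    // Writer side: e.g., a call-graph builder persists its result.
    static void write(Path db, List<String[]> pairs) throws IOException {
        try (PrintWriter out = new PrintWriter(Files.newBufferedWriter(db))) {
            for (String[] p : pairs) out.println(p[0] + "\t" + p[1]);
        }
    }

    // Reader side: e.g., a test-selection tool consumes the relation
    // later, in a separate process, with no control coupling.
    static List<String[]> read(Path db) throws IOException {
        List<String[]> pairs = new ArrayList<>();
        for (String line : Files.readAllLines(db)) pairs.add(line.split("\t"));
        return pairs;
    }
}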
Control-based Component Architecture
Icaria Infrastructure
• Scalable data-flow (and syntactic) infra for C
– Hypothesis: need optimized components, control
integration, and user control for good performance
• Space- and time-tuned data structures
– AST, BB’s, CFG; bit-vectors semi-sparse & factored
– Memory allocation pools, free “block”
– Steensgaard pointer analysis
• Also piggybacked with CFG build pass for locality
• Event-based demand-driven architecture
– Compute all on demand; even discard/recompute
– Persistently store “undemandable” information
Event-based Demand Architecture
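A minimal Java sketch of the demand protocol behind this architecture, with hypothetical names: a representation is computed on first demand, can be discarded to reclaim space, and is transparently recomputed on the next demand.

import java.util.function.Supplier;

// Demand-driven representation slot: compute on first demand, let the
// architecture discard it under memory pressure, and quietly recompute
// on the next demand.
class Demandable<T> {
    private final Supplier<T> compute;  // e.g., () -> buildCfg(proc)
    private T value;                    // null when discarded

    Demandable(Supplier<T> compute) { this.compute = compute; }

    T demand() {
        if (value == null) value = compute.get();  // (re)compute
        return value;
    }

    void discard() { value = null; }  // architecture reclaims space
}

An analysis that conforms to this protocol lets the architecture reclaim its AST or CFG between demands; that conformance is the "price of performance" discussed below.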
Icaria: User Control
• Declarative precision management
– Context sensitivity (call stack modelling)
– Pointer analysis (e.g., distinguish struct fields)
• Iteration strategies
– With tuned bit-vector stealing and reclamation
• Declarative programmer input
– ANSI/non-ANSI typing, memory allocators, …
– Adds precision, sometimes speed-up
• Termination control
– Suspend/resume buttons, procedural hook
– Because analysis is a means to an end (a task)
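A minimal sketch of how these controls might surface to a client; the knob names are hypothetical, and the termination hook reflects that analysis is a means to an end:

import java.util.function.BooleanSupplier;

// Hypothetical declarative controls an Icaria-style client hands the
// analysis before demanding results: precision knobs plus a
// termination hook.
class AnalysisControls {
    boolean contextSensitive = false;        // call-stack modelling on/off
    boolean distinguishStructFields = true;  // pointer-analysis precision
    BooleanSupplier keepGoing = () -> true;  // e.g., wired to a Suspend button

    boolean shouldContinue() { return keepGoing.getAsBoolean(); }
}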
Icaria: The Price of Performance
• Must conform to architectural rules to get
performance benefits
– E.g., can’t demand/discard/redemand your AST
unless it meets architecture’s protocol
• May cascade into a lot of front-end work
– Can buy in modularly, incrementally
• “Demand” in batch
• Don’t discard
• Reconsider demand strategy for new analysis
– I.e., when to discard, what to save persistently
Icaria: Scenario – Java Retarget
• Use existing AST or derive from Ponder's
• Rethink pointer analysis
– Calls through function pointers mean bad CG
– Intersect (filter) Steensgaard with language types?
• Modular; variant works for C
• Rethink 3-address code and call-graph
– Small methods (many, deep calling contexts)
– “Allocation contexts” instead of calling contexts?
• Context sensitivity module would support
• Existing analyses not likely reusable OTS
Icaria: Applications
• Icaria supports Cawk, Sprite slicer, StarTool
– Cawk generated by Ponder syntactic infra [Atkinson]
– Slicer is 6 KLOC: 50% GUI, 20% equations
• Discard AST, CFG
• Persistently store backwards call-graph
• Scalability
– Simple Cawk scripts run at 500 KLOC/minute
– Sliced gcc (200 KLOC) on 200MHz/200MB UltraSparc
• 1 hour --> 1/2 minute by tuning function pointers
• Dependent on program and slice
• Other parameters less dramatic
Analysis Components
Designing for Reusable Analyses
• Approaches assume that tool is coded “within”
– Complicates migration to a new infrastructure
• Genoa [Devanbu] and sharlit [Tjiang] are
“monolithic” language/generator solutions
• How to design a reusable "analysis component"?
– A client of infrastructure, so incomplete
• Addressed for StarTool reengineering tool
– Only front-end infra and target lang., not Tcl/Tk GUI
StarTool: Main View
• “Referenced-by” relation for entity in clustered hierarchy
• Views are navigable, customizable, and annotatable
StarTool: Adapter Approach
• Interpose an adapter [GHJV] to increase separation of analysis and infra
• What adapter interface allows the best retargets?
• Low-level: a few small, simple operations
– E.g., generic tree traversal ops
• More responsibility in StarTool relieves all future adapters
• Did 3 retargets, including to GNAT Ada AST [Dewar] (sketch below)
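A minimal Java sketch of the adapter seam; all interface names and operations are hypothetical, not StarTool's actual API:

import java.util.Iterator;

interface Entity {}
interface Reference {}
interface SourceRange {}

// Language-neutral seam between StarTool and a front-end infrastructure.
// Each retarget (Icaria C AST, GNAT Ada AST, ...) supplies one adapter.
interface InfraAdapter {
    Iterator<Entity> entities();                           // program entities
    Iterator<Reference> referencesSimilarTo(Reference r);  // metaquery
    SourceRange sourceOf(Reference r);                     // precise mapping back to text
}

// StarTool itself never sees an AST, only the adapter.
class StarTool {
    private final InfraAdapter infra;
    StarTool(InfraAdapter infra) { this.infra = infra; }

    void showReferencedBy(Entity e) {
        // ... build the clustered "referenced-by" view via infra ...
    }
}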
StarTool: Lessons Learned
• Retargets range from 500 to 2000 LOC
– Precise mappings to source, language complexity
• Best interface assumes nothing about infra
– In extreme, don’t assume there’s an AST at all
– Means providing operations that make StarTool’s
implementation easy (even though there's just one)
• E.g., iterator for "all references similar to this"
• Metaquery operations resolve feature specifics
– Gives adapter lots of design room, can choose best
– More, bigger ops; mitigated by template class [GHJV]
– Got multi-language tool using 2 levels of adapters
Observations
• Infrastructures for prototyping or scalability
– 1000 LOC effort won’t scale-up, yet
– Absolute effort is lessening, scale increasing
– Boring stuff is still 1/2+ effort
• Trend towards components
– Span of requirements, performance, IDE integration
– Many components are programmable, however
• Interactive whole-program analysis stresses
modularity (reuse) of infrastructure
– Much reuse is white-box
Observations, cont’d
• Retargeting is expensive, defies infrastructure
– Symbol table (scoping, typing), and base analyses
– Language proliferation & evolution continue, slowly
– Tool retargets lag language definition, maybe a lot
• Bigger components are better [Sullivan]
– Many small components complicate integration
– Mitigates symbol-table issue
– Reuse still hard, sometimes white-box
• Language analyzability has big impact
– Front-end, mappings, precise and fast analysis
– Designers need to consider consequences
Open Issues
• Effective infrastructures for “deep” analysis
– In principle not hard
– In practice, performance/precision tradeoffs can
require significant rewrites for “small” change
• Out of private toolbox, beyond white-box reuse
– Fragile modularity, complexity, documentation
• Robustness
– Useful for incomplete or evolving systems
– Complicates the analysis, results harder to interpret
• Modification: beyond instrumentation & translation
Emerging Challenges
• Integration into IDE’s
– GUI dependence, native AST; reuse across IDE’s
• What is a program? What is the program?
– Multi-language programs
– Federated applications, client-server apps
– Trend is towards writing component glue
• Less source code (maybe), but huge apps
• How to treat vast, numerous packages? Sans source?
• Current tools provide/require stub code
• Multi-threading is entering the mainstream
Emerging Opportunities
• Faster computers, better OS's and compilers
– Basic Dells can take two processors, and it works
• Compatibility packages: Cygwin, VMware, Exceed
• Emergence of Java, etc., for tool construction
– Better type systems, garbage collection
– API model, persistence, GUI, multi-threading
– (Maybe better analyzability, too)
• Infrastructure
– Modular analyses [Ryder], incremental update
– Visualization toolkits (e.g., SGI’s MineSet)
• Open source: share, improve; benchmarks
Acknowledgements
• Darren Atkinson: Icaria, etc.
• Michael Ernst: Dynamic analysis
• Daniel Jackson: Alloy
• Mik Kersten: IDE integration
• Mary Jean Harrold: Aristotle
• Arun Lakhotia: Refine and CodeSurfer
• Nicholas Mitchell: Compiler infras, EDG
• John Stasko: Visualization
• Michelle Strout: Compiler infrastructures
• Kevin Sullivan: Mediators and components
