Writing a Successful TC Evaluation Plan

• SMART objectives
• Evaluation plan type
• Data collection by plan type
• Process versus outcome data collection and sampling
• Evaluation reporting
• Evaluation summary narrative
• Greatest mistake
A Good Evaluation Plan Needs SMART Objectives
• Specific
• Measurable
• Achievable (and ambitious)
• Realistic (and Relevant)
• Time-bound
SMART Example
• By June 2013, at least 1 city in Fresno County and/or the unincorporated area of Fresno County will adopt a policy that requires all tobacco retailers to obtain a license to sell tobacco products, with a portion of the fees earmarked to conduct regular compliance checks.
– Specific: adopt a licensing policy with fees earmarked for enforcement
– Measurable: allows both process and outcome evaluation
– Achievable: has been achieved elsewhere
– Relevant: CTCP has indicated that it is a high priority
– Time-bound: by June 2013
Not So SMART Examples
• "Our objective is to reduce secondhand smoke in outdoor dining facilities" (By when? How? Through a voluntary policy? A city ordinance?)
• "By June 2012 we plan to have zero tobacco litter in our parks" (Not realistic and not specific: which parks? City? County? Specific ones?)
• "We will work on reducing tobacco sales to minors" (By how much? By when? Where? How?)
Evaluating Your Objective
Think "backwards" by starting at the end:
• What is it that you are trying to achieve?
• What evaluation activities will help you get there?
• How can you best keep track of successes and barriers?
• How can you best show what you have achieved, i.e., what your outcome is?
• Use evaluation activities strategically to help advance your objective
• Make sure your evaluation activities produce useful and usable data
• Use activities that are manageable with the resources you have
• Adhere to OTIS instructions and make the evaluation plan fit your intervention plan
Evaluation Plan on OTIS
• Consult the OTIS Evaluation Guide
• Follow instructions closely
• Your objective determines the overall evaluation design
• Within some limits, you have many choices
• Refer to the OTIS plan type decision tree to determine your plan type, then to the data collection by plan type to determine whether you need outcome or process evaluation (or both)
Data Collection by Plan Type
Plan types:
• Single policy
• Multiple policy
• Individual behavior change
• Other with measurable outcome
• Other without measurable outcome
Process Evaluation
Process evaluations are used:
– To help further the objective (e.g., key informant interviews [KIIs] with policy makers to assess adoption readiness)
– To demonstrate need (e.g., a public opinion survey to show the public wants a policy)
– To document how your project addressed the objective (e.g., KIIs with key stakeholders)
– To document policy adoption (e.g., record review of policy language)
Outcome Evaluation
Outcome evaluations are used:
– To show a policy has been successfully implemented (e.g., litter observations before and after outdoor policy adoption)
– To show that a program has had impact (e.g., pre-/post- and 3-month follow-up surveys with participants in cessation programs)
Process versus Outcome
Common errors:
• Mistaking the adoption of a policy for an "outcome" (in OTIS, "outcome" requires implementation of the policy)
• Thinking that process and outcome use different data collection methods (either evaluation type can use KIIs, surveys, observations, etc.)
Plan Type 1: Single/Multiple Policy
– Policy adoption needs process evaluation
– Policy implementation needs outcome evaluation
– Policy adoption and implementation needs both process and outcome evaluation
Common errors: choosing "multiple policy" when it's a single policy; and confusing process with outcome
– A "multiple policy" plan applies only when the objective itself targets more than one policy (e.g., adoption in several jurisdictions); a single policy with several provisions is still a single policy
Plan Type 2: Individual Behavior Change (cessation programs, education programs)
Needs: outcome evaluation
Recommended: several waves of a survey that measures short-term and long-term quitting rates in cessation programs
Plan Type 3: Other with Measurable Outcome (e.g., reduce the number of stores with signage violations)
Needs: outcome evaluation

Plan Type 4: Other without Measurable Outcome (e.g., develop and distribute education materials in different languages)
Needs: process evaluation
Evaluation Design
Outcome study design:
– Experimental (needs control and intervention groups and random assignment; rarely done in TC)
– Quasi-experimental (control group or multiple measurements, e.g., over time; often used in TC)
– Non-experimental (no control group; only one pre-post measurement; anecdotal evidence)
Common Errors with Outcome Study Design
• Proposing an experimental design that is unrealistic and too costly
• Misclassifying a quasi-experimental design as experimental when it has a control group but no random assignment
• Planning only one post-test for cessation programs
• Classifying a single pre-post test as quasi-experimental (it's non-experimental)
Outcome Data Collection Activity
• One of the most important decisions!
• Which method makes the most sense, gives you the best information, and is most likely to show your outcome? (Mail survey, written questionnaire, key informant interview [called "face-to-face survey" in the OTIS Evaluation Guide], telephone survey, observation, Internet survey)
• Use tested instruments, or specify "will use TCEC resources to help develop instrument"
Sampling for Outcome Data Collection
• The sampling method and the sample size need to be convincing to likely critics
• For sampling strategies, see the OTIS Evaluation Guide p. 87 or the Sampling tool on the TCEC website
• For calculating sample sizes, use a reputable online sample size calculator (or compute one directly, as in the sketch below)
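As an illustration of what such a calculator does, here is a minimal Python sketch of the standard sample size formula for estimating a proportion, with a finite population correction. The population of 400 retailers and the 5% margin of error are hypothetical numbers, not from the OTIS guide.

```python
import math

def sample_size(population, margin=0.05, z=1.96, p=0.5):
    """Sample size for estimating a proportion.

    Uses the standard formula n0 = z^2 * p * (1 - p) / margin^2,
    then applies a finite population correction. z = 1.96 gives
    95% confidence; p = 0.5 is the most conservative assumption.
    """
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    n = n0 / (1 + (n0 - 1) / population)  # finite population correction
    return math.ceil(n)

# Hypothetical example: surveying a county with 400 licensed tobacco retailers
print(sample_size(400))  # -> 197 at 95% confidence, +/-5% margin of error
```

Plugging your own population size and margin of error into an online calculator should give the same result.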
Common Errors with Outcome Data Collection Activities
• Data collection activity not well suited for the objective
• Purpose of data collection not clear; no logical connection to how results will be used by the project
• Suggesting poorly designed, non-tested data collection instruments
• Sample size is too small or is not representative (cluster or stratified sampling may be more appropriate; see the sketch below)
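To make the stratified option concrete, here is a minimal Python sketch of proportional stratified sampling. This is an illustration, not an OTIS-prescribed method; the sampling frame, the `city` field, and the sample size of 60 are all hypothetical.

```python
import random

def stratified_sample(frame, strata_key, total_n, seed=42):
    """Draw a proportionally allocated stratified random sample.

    frame: list of dicts representing the sampling frame
    strata_key: field that defines the strata (e.g., city or store type)
    total_n: overall sample size, split across strata by their share of the frame
    """
    rng = random.Random(seed)
    strata = {}
    for unit in frame:
        strata.setdefault(unit[strata_key], []).append(unit)
    sample = []
    for units in strata.values():
        # Proportional allocation; rounding may shift the total by a unit or two
        k = round(total_n * len(units) / len(frame))
        sample.extend(rng.sample(units, min(k, len(units))))
    return sample

# Hypothetical frame: 300 retailers spread evenly across three cities
frame = [{"id": i, "city": ["A", "B", "C"][i % 3]} for i in range(300)]
sample = stratified_sample(frame, "city", total_n=60)
print(len(sample))  # 60, with each city represented in proportion to its size
```

Each stratum is guaranteed representation, which a small simple random sample cannot promise.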
Process Data Collection
• Almost all objectives need some process data collection activities
• Some activities can be used to advance the objective: focus groups, KIIs, public opinion surveys, observations, youth purchase surveys
• Some activities can be used to evaluate your activities: education/participant surveys
• Some activities can help document your project's process: policy records, media activity records
• Specify how data collectors will be trained, and by whom
Common Problems with Process Data Collection Activities
• Forgetting data collection trainings
• Forgetting to add an education/participant survey when an education event takes place
• The same sampling errors as in outcome data collection
Evaluation Reporting
• In this section, you will be asked about the analysis you plan to do. You will report on the analysis through progress reports and the final evaluation report (brief for a non-primary objective, full for the primary objective), as well as disseminate results through other channels.
Evaluation Reporting
Propose an analysis plan that relates to the overall design. Most commonly done in TC:
– Descriptive statistics (frequencies, means)
– Comparisons over time using frequencies and means (Excel is sufficient)
– Less frequently, but sometimes useful for comparisons over time or comparisons of variables: t-test, ANOVA, regression (a statistical program is needed); see the sketch below
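For instance, a pre/post comparison of means with a t-test takes only a few lines. This sketch uses Python with SciPy as one example of such a statistical program; the observation counts are invented for illustration.

```python
from statistics import mean
from scipy import stats

# Hypothetical counts of cigarette butts observed at eight park entrances,
# before and after an outdoor smoking policy took effect
pre  = [34, 41, 28, 52, 39, 47, 31, 44]
post = [18, 25, 12, 30, 21, 27, 15, 22]

# Descriptive statistics: often all a progress report needs
print(f"pre mean: {mean(pre):.1f}, post mean: {mean(post):.1f}")

# Independent-samples t-test comparing the two observation waves
result = stats.ttest_ind(pre, post)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.4f}")
```

The same comparison of means can be done in Excel; the statistical test is what requires a dedicated tool.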
Common Errors in Analysis Plan
• Getting stuck in evaluation details and not addressing the "overall design"
• Proposing a design that is too sophisticated for the limited sample size or the limited resources and capacity
• Not sufficiently linking the design to the proposed program and evaluation plan
Dissemination of Results
• Share results with those who provided information and with local partners
• Present data to decision makers
• Create fact sheets to use in education materials
• Send out press releases to generate media coverage of the issue/your program
• Share with other tobacco control programs using CTCP channels
• Present at tobacco-related conferences
• Publish in peer-reviewed journals
Evaluation Summary Narrative
• Mention any possible limitations that you can foresee, including program changes, low response rates, or achieving only adoption but not implementation (so that outcome evaluation is not possible)
• Suggest a back-up plan
Evaluation Summary
Common problems:
– Not describing why this design and the suggested activities were chosen
– Not presenting a "story" of your evaluation, but only listing activities
– Not describing the plan in chronological order
– Logic that is hard to follow
Greatest Mistake
• Filling out the OTIS evaluation fields BEFORE having thought out the evaluation plan
• Hoping the plan will come together while answering the questions and drop-down menus in OTIS
TCEC website has:
• OTIS Evaluation Guide
• Sample evaluation plans
• Sample data collection instruments
• Evaluation tools (e.g. “Sampling”)
