SWE 681 / ISA 681
Secure Software Design &
Programming
Lecture 10: Miscellaneous
Dr. David A. Wheeler
2015-03-29
Outline
• Malicious tools / trusting trust attack
– Countering using diverse double-compiling (DDC)
• Vulnerability disclosure
Some portions © Institute for Defense Analyses
(the OSS and open proofs sections), used by permission.
This material is not intended to endorse particular
suppliers or products.
2
Countering trusting trust through
diverse double-compiling (DDC)
3
The trusting trust attack:
Disseminating malicious compilers
• Corrupted executable = an executable that does not
correspond to its putative source code
– An executable e corresponds to source code s iff execution
of e always behaves as specified by s when the execution
environment of e behaves correctly
• Trusting trust attack = An attack in which:
– “the attacker attempts to disseminate a compiler
executable that produces corrupted executables,
– at least one of those produced corrupted executables is a
corrupted compiler, and
– the attacker attempts to make this situation self-perpetuating”
(A hedged sketch of the trigger/payload idea behind this attack follows this slide.)
Source: Wheeler 2009, “Fully Countering Trusting Trust through Diverse Double-Compiling (DDC)”
4
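The attack hinges on the compiler executable containing a trigger (recognizing that it is compiling a particular victim, such as “login” or the compiler itself) and a payload (the code it silently injects). Below is a minimal C sketch of that idea; the function maybe_corrupt(), the marker strings, and the crude strstr()-based detection are hypothetical illustrations, not Thompson’s actual code.

/* Hedged sketch of a trusting-trust trigger/payload.  The hook name,
 * the marker strings, and the strstr()-based detection are hypothetical. */
#include <stdio.h>
#include <string.h>

/* Payload inserted when the "login" program appears to be compiled. */
static const char BACKDOOR[] =
    "if (strcmp(user, \"attacker\") == 0) grant_access();\n";

/* Payload inserted when the compiler itself appears to be compiled,
 * so the corruption re-appears even though it is in no source file. */
static const char SELF_REPLICATOR[] =
    "/* ...re-emit both payloads into the new compiler... */\n";

/* Hypothetical hook on the code-generation path: append extra code when
 * the source text looks like one of the two victims. */
static void maybe_corrupt(const char *source_text, char *out, size_t cap)
{
    if (strstr(source_text, "int login_main("))
        strncat(out, BACKDOOR, cap - strlen(out) - 1);
    if (strstr(source_text, "void compile_translation_unit("))
        strncat(out, SELF_REPLICATOR, cap - strlen(out) - 1);
    /* Otherwise behave exactly like a correct compiler, so reviewing the
     * compiler's source code reveals nothing. */
}

int main(void)
{
    char generated[512] = "/* normal generated code */\n";
    maybe_corrupt("int login_main(int argc, char **argv) { ... }",
                  generated, sizeof generated);
    fputs(generated, stdout);   /* shows the injected backdoor line */
    return 0;
}

The key point, and why the attack is so hard to counter, is that this logic lives only in the compiler executable; the compiler’s own source code can be completely clean.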
A maliciously-corrupted compiler
can cause serious damage
[Figure: a maliciously corrupted compiler executable compiles trustworthy source code (the critical program “login”, an analysis program such as a symbolic debugger, and the C compiler’s own source) into maliciously corrupted executables – a corrupted critical program, analysis program, and compiler executable – and thereby perpetuates itself whenever the compiler is rebuilt.]
• 1974: Karger & Schell first described the attack (obliquely)
• 1984: Ken Thompson demonstrated it
• This is a fundamental security problem
5
Trusting trust countermeasure:
Diverse double-compiling (DDC)
• Idea created by Henry Spencer in 1998
– Uses a different (diverse) trusted* compiler
– Two compilation steps (see the sketch after this slide):
• Compile source of “parent” compiler
• Use the result to compile source of compiler-under-test
– If the result is bit-for-bit identical to the compiler-under-test cA, then source and executable correspond
• Testing for bit-for-bit equality is easy
– Source code may include malicious/erroneous code, but now we can review source instead
– Only a brief description: Spencer didn’t describe it in detail, prove it, or demonstrate it. Didn’t even name it
• David A. Wheeler’s 2009 PhD dissertation
* We will define “trusted” soon
6
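The two compilation steps plus the final comparison are simple enough to drive with a small program. Below is a minimal sketch; the compiler names, file names, and command-line syntax are assumptions for illustration only (the real demonstrations later in this lecture involve far more setup), and in practice the comparison is normally done with cmp or a cryptographic hash rather than hand-written code.

/* Minimal sketch of the two DDC steps plus the bit-for-bit comparison.
 * All file names and compiler invocations below are hypothetical. */
#include <stdio.h>
#include <stdlib.h>

/* Return 1 if the two files are bit-for-bit identical, else 0. */
static int identical(const char *path_a, const char *path_b)
{
    FILE *a = fopen(path_a, "rb"), *b = fopen(path_b, "rb");
    if (!a || !b) { perror("fopen"); exit(1); }
    int ca, cb, same = 1;
    do {
        ca = fgetc(a); cb = fgetc(b);
        if (ca != cb) { same = 0; break; }
    } while (ca != EOF);
    fclose(a); fclose(b);
    return same;
}

int main(void)
{
    /* Step 1: trusted compiler cT compiles the parent compiler's source sP. */
    if (system("trusted-cc -o stage1 parent-compiler-source.c") != 0) return 1;

    /* Step 2: stage1 compiles the source sA of the compiler under test. */
    if (system("./stage1 -o stage2 compiler-under-test-source.c") != 0) return 1;

    /* If stage2 is bit-for-bit identical to the compiler under test cA,
     * then (given the DDC assumptions) cA corresponds to its claimed source sA. */
    puts(identical("stage2", "compiler-under-test-binary")
             ? "PASS: stage2 == cA"
             : "FAIL: stage2 != cA");
    return 0;
}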
Wheeler’s dissertation thesis
The trusting trust attack can be detected and effectively countered using the “Diverse Double-Compiling” (DDC) technique, as demonstrated by:
1. a formal proof that DDC can determine if source
code and generated executable code correspond
2. a demonstration of DDC with four compilers (a
small C compiler [tcc], a small Lisp compiler, a
small maliciously corrupted Lisp compiler, and a
large industrial-strength C compiler, GCC), and
3. a description of approaches for applying DDC in
various real-world scenarios
Source: Wheeler 2009, “Fully Countering Trusting Trust through Diverse Double-Compiling (DDC)”
7
Diverse double-compiling (DDC)
[Figure: the DDC process compared to the claimed origin of the compiler under test.
DDC process: the trusted compiler cT, running in trusted environment e1, compiles sP (the putative source of the parent compiler cP) to produce stage1; stage1, running in trusted environment e2, then compiles sA (the putative source of the compiler under test cA) to produce stage2, an executable intended to run on environment eArun.
Claimed origin: the grandparent compiler cGP, running in environment eP, compiled sP to produce the parent compiler cP (output o1); cP, running in environment eA, compiled sA to produce cA (output o2), the compiler under test, which runs on eArun. Each compilation also has environment effects (e1effects, e2effects, ePeffects, eAeffects).]
8
Assumptions (informal)
• DDC is performed by trusted programs/processes
– Includes trusted compiler cT, trusted environments, trusted comparer, trusted acquirers for cA, sP, sA
– Trusted = justified confidence that it does not have triggers and payloads that would affect the results of DDC. It could be malicious, as long as DDC is unaffected
• Correct languages (Java compiler for Java source)
• Compiler defined by sP is deterministic (same inputs always produce same outputs) – see the sketch after this slide
– Real compilers are typically deterministic
• Non-deterministic compilers are hard to test & can’t use the compiler bootstrap test
9
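Why determinism matters: DDC ends in a bit-for-bit comparison, so anything that varies between builds of the same source (timestamps, uninitialized memory, random layout) defeats the comparison even without any malice. A hedged C example of the kind of construct that breaks reproducibility follows; the surrounding function is hypothetical, while __DATE__ and __TIME__ are standard C macros.

/* Hypothetical example of compiler code that violates the determinism
 * assumption: every build embeds a different banner, so two builds of the
 * same source are never bit-for-bit identical and DDC can never match. */
#include <stdio.h>

void emit_version_banner(FILE *out)
{
    /* __DATE__ and __TIME__ expand to the time this file was compiled,
     * which changes from build to build. */
    fprintf(out, "compiler built on %s at %s\n", __DATE__, __TIME__);
}

int main(void)
{
    emit_version_banner(stdout);
    return 0;
}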
DDC does not assume that different
compilers produce identical executables
• Different compilers typically produce different executables
• But given this C source:
#include <stdio.h>
int main(void) {
    printf("%d\n", 2+2);
    return 0;
}
• And two different properly-working C compilers:
– Resulting executables will usually differ
– Running those executables should produce “4” (modulo text encoding, & presuming certain other assumptions)
10
Why not always use
the trusted compiler?
• May not be suitable for general use
– May be slow, produce slow code, generate code for a
different CPU, be costly, have undesirable license
restrictions, may lack key functions, etc.
– In particular, a simple easily-verified compiler (with limited
functionality & optimizations) could be used
• Using a different compiler greatly increases confidence
that source & executable correspond
– Attacker must now subvert multiple executables and
executable-generation processes to avoid detection
– DDC can be performed multiple times, using different
compilers and/or different environments, increasing
difficulty of undetected attack
11
Three proofs (using FOL)
• Proof 1: “If DDC produces the same executable as the compiler-under-test
cA, then source code sA corresponds to the executable cA”
(5 assumptions, 19 steps)
(stage2 = cA) -> exactly_correspond(cA, sA, lsA, eArun).
• Proof 2: “Under benign conditions and cP_corresponds_to_sP, the DDC result stage2 and the compiler-under-test cA will be the same”
(9 assumptions, 30 steps)
stage2 = cA.
• Proof 3: “When there’s a benign environment & a grandparent compiler,
proof 2 assumption cP_corresponds_to_sP is true”
(3 assumptions, 10 steps)
exactly_correspond(cP, sP, lsP, eA).
• Proofs found by prover9, verified by ivy & by multi-person review
12
First DDC demonstration: tcc
• Performed on a small C compiler, tcc (ACSAC)
– Separate runtime library, handled in pieces
– x86: constants -128..127 can be encoded in 1 byte (vs. 4)
– tcc detects this with a cast (prefers the short form) – see the sketch after this slide
– tcc bug: the cast produces the wrong result, so tcc compiled-by-self always uses the long form
• tcc defect: fails to sign-extend 8-bit casts
• tcc junk bytes: long double constant
– A long double uses 10 bytes but is stored in 12 bytes
– The other two “junk” bytes contain random data
• Fixed tcc; the technique successfully verified the fixed tcc
• Used the verified fixed tcc to verify the original tcc
It works!
13
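The cast-based test mentioned above can be pictured as follows. This is a hedged sketch (the function name and exact form are illustrative, not tcc’s actual code), but it shows why a compiler that fails to sign-extend 8-bit casts will always pick the long encoding when it compiles itself, making its self-compiled output differ from a correctly-built tcc.

/* Hedged sketch of the "does this constant fit in a signed byte?" test.
 * x86 allows a 1-byte sign-extended immediate for values -128..127. */
#include <stdio.h>

static int fits_in_signed_byte(int v)
{
    /* Relies on the cast to a signed 8-bit type being sign-extended back
     * to the original value.  A code generator that mishandles 8-bit casts
     * makes this return 0 for small negative constants, so the compiler
     * it builds always emits the 4-byte form. */
    return v == (signed char)v;
}

int main(void)
{
    printf("%d\n", fits_in_signed_byte(-100)); /* 1 with a correct compiler */
    printf("%d\n", fits_in_signed_byte(300));  /* 0: needs the 4-byte form */
    return 0;
}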
Demonstration:
Goerigk Lisp compilers
• Pair of Lisp compilers, “correct” & “incorrect”
– The “incorrect” one implemented the trusting trust attack
– Ported to Common Lisp
• DDC applied
– The “correct” compiler compared correctly, as expected
– The executable based on the “incorrect” source code did not match the DDC results when DDC used the “correct” source code, as expected
• A “diff” between the results revealed that the “incorrect” executable was producing different results, in particular for a “login” program
• Tip-off that the executable is probably malicious
14
Demonstration: GCC
• GNU Compiler Collection (GCC) is a widely-used compiler in industry – shows DDC scales up
– Many languages; for the demo, chose the C compiler
• Used the Intel C++ compiler (icc) as the trusted compiler
– A completely different compiler
• Fedora didn’t record the information needed to reproduce the executable
• Created a C compiler executable to capture all necessary data & used that as the compiler under test
– Chose GCC version 3.0.4 as the compiler under test
– “gcc” is a front-end that runs the real compiler programs; the C compiler is actually cc1
– Code outside of GCC (including linker, assembler, archiver, etc.) is considered outside the compiler
15
DDC applied to GCC (simplified)
[Figure: DDC applied to GCC.
DDC process: icc (the trusted compiler cT) compiles the GCC 3.0.4 source (here sP = sA) in step 1 to produce stage1; in step 2, stage1 compiles the same GCC 3.0.4 source to produce stage2, an executable to run on environment eArun.
Claimed origin: the GCC shipped in Fedora 9 (cGP) compiled the GCC 3.0.4 source to produce the parent compiler cP (output o1); cP compiled the same source to produce cA (output o2), the compiler under test, which runs on eArun.]
16
DDC applied to GCC (continued)
• Challenges:
– “Master result” pathname embedded in the executable (so made sure it was the same)
– Tool semantic change (“tail +16c”)
– GCC did not fully rebuild when using its build process (the libiberty library was not rebuilt)
• This took time to trace back & determine the cause
• Once corrected, DDC produced bit-for-bit equal results, as expected
17
How can an attacker counter DDC?
Must falsify a DDC assumption, for example:
• Swap the DDC result with cA during the DDC process (!)
– Defender can protect the DDC environment
• Make the compiler-under-test ≠ the compiler actually used
– If the environment may provide an inaccurate compiler under test, the defender can extract it without using that environment
– If the environment may run a different compiler, the defender can redefine “compiler” to include the environment & apply DDC
• Subvert the trusted compiler / trusted environment(s)
– Challenge for the attacker: they don’t usually know which ones will be used
– Defender can use DDC multiple times
• Attacker must subvert them all, while the defender only needs to protect at least one (an unusual advantage for a defender)
18
Guidelines for compiler suppliers
(Appendix D)
1. Pass the compiler bootstrap test, if applicable
2. Don’t use or write uninitialized values (see the sketch after this slide)
3. Record the detailed information necessary to recompile the compiler and produce the same bit sequence
4. Don’t include information about the compilation process inside files used during later compilation
5. Encourage the development of alternative implementations of languages. Use or help develop public specs for computer languages (preferably open standards)
6. Eliminate roadblocks to alternative implementations, particularly patents
7. Make the compiler portable and deterministic
8. Consider using a simpler language subset to implement the compiler
9. Release self-parented compiler executables, if applicable
10. Release the compiler as FLOSS, and choose a FLOSS compiler as its parent. Alternatively, though less effective, release source to trusted third parties
11. Apply DDC before each release
19
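Guideline 2 ties directly back to the tcc “junk bytes” issue: writing memory that includes uninitialized padding makes a compiler’s output vary from run to run, which breaks both reproducible builds and the DDC comparison. Below is a minimal sketch of that failure mode; the record type and output file name are hypothetical.

/* Hypothetical illustration of guideline 2: uninitialized padding bytes
 * end up in the output file, so two runs need not be bit-for-bit equal. */
#include <stdio.h>

struct constant_record {
    char        tag;    /* 1 byte; the compiler inserts padding after it */
    long double value;  /* e.g., a 10-byte value stored in a wider slot  */
};

int main(void)
{
    struct constant_record rec;  /* padding (and any slack in 'value') stays uninitialized */
    rec.tag = 'L';
    rec.value = 3.14L;

    /* The fix is to zero the whole record first, e.g.
     *   memset(&rec, 0, sizeof rec);   (from <string.h>)
     * before filling in the fields. */

    FILE *out = fopen("object-fragment.bin", "wb");
    if (!out) { perror("fopen"); return 1; }
    fwrite(&rec, sizeof rec, 1, out);   /* writes the junk bytes too */
    fclose(out);
    return 0;
}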
Vulnerability Disclosure
20
Vulnerability disclosure:
The debate!
• Full disclosure
– Disclose vulnerabilities directly to public
– Supplier finds out when the public does
– May include example of exploit
• Coordinated disclosure
– Disclose vulnerabilities only after coordination with supplier
– May automatically disclose to public after period of time (common)
– Also called “responsible disclosure” but this is a highly biased term
• No disclosure
– Often goal is to exploit vulnerability (directly, or to sell to those who will).
Remember: Many people are in this mode!
• Separate issue: “bug bounties”
– Paying for vulnerability reports, as a reward
– Bug bounties can be great, but the economics are not favorable to defense
– I think we should limit whom vulnerability information can be sold to
– http://www.dwheeler.com/blog/2013/11/16/#vulnerability-economics
21
Full disclosure
• Crypto-Gram Newsletter by Bruce Schneier, October 15, 2013,
https://www.schneier.com/crypto-gram-1310.html:
– “Among IT security professionals, it has been long understood that the public
disclosure of vulnerabilities is the only consistent way to improve security….
– It wasn't always like this. In the early years of computing, it was common for
security researchers to quietly alert the product vendors about vulnerabilities,
so they could fix them without the "bad guys" learning about them. The
problem was that the vendors wouldn't bother fixing them, or took years
before getting around to it. Without public pressure, there was no rush.
– This all changed when researchers started publishing. Now vendors are under
intense public pressure to patch vulnerabilities as quickly as possible. The
majority of security improvements in the hardware and software we all use
today is a result of this process... Without public disclosure, you'd be much
less secure against cybercriminals, hacktivists, and state-sponsored
cyberattackers.”
• “We don't believe in security by obscurity, and as far as we know, full disclosure is the only way to ensure that everyone, not just the insiders, have access to the information we need.” - Leonard Rose
22
Coordinated disclosure
• Originally called “responsible disclosure”
– “It’s Time to End Information Anarchy” by Scott Culp, October 2001
– Term still used, but it is extremely biased (“framing”)
– Microsoft now calls it “coordinated disclosure” – I recommend that
term instead
• Notion: Tell supplier first, disclose after supplier releases a fix
– That way, attackers who don’t know about it can’t exploit information
• Microsoft has a coordinated disclosure policy with no time limit as long as the supplier keeps yakking
– http://www.theregister.co.uk/Print/2011/04/19/microsoft_vulnerability_disclosure_policy/
• But many suppliers never fix non-public vulnerabilities & just stall
– Incorrectly assumes attackers are not already exploiting vulnerability
(when they are, nondisclosure puts public at greater risk)
23
Coordinated disclosure
with time limit
• Common variant (compromise?) of coordinated (“responsible”) disclosure
– Maximum time limit (“embargo”) set for public disclosure
– Purpose of the time limit: the supplier might actually fix the vulnerability for a change
– Suppliers always recommend longer embargo times than everyone else
• Some examples of public disclosure time limits:
– linux-distros: <7 days preferred, up to 14 days allowed, up to 19 days if Thu/Fri report &
disclosure on Mon/Tue http://oss-security.openwall.org/wiki/mailing-lists/distros
– oCERT: 14 days standard; 7 days if trivial, 30 days if critical/complex, up to 2 months
“extremely exceptional” https://www.ocert.org/disclosure_policy.html
– CERT/CC: 45 days “regardless of the existence… of patches or workarounds…
Extenuating circumstances … may result in earlier or later disclosure... We will not
distribute exploits” https://www.cert.org/vulnerability-analysis/vul-disclosure.cfm
– Google: 60 days “reasonable upper bound” http://googleonlinesecurity.blogspot.com/2010/07/rebooting-responsible-disclosure-focus.html
• Personal opinion: Coordinated disclosure with ~3 week embargo
– Discuss time with supplier, but attackers are probably exploiting it already
– Don’t bother coordinating with suppliers known to ignore vulnerability reports
24
Disclosure: More information
• “Full Disclosure” by Bruce Schneier (November 15, 2001)
https://www.schneier.com/crypto-gram-0111.html#1
• “Full Disclosure and Why Vendors Hate it” by Jonathan
Zdziarski (May 1, 2008)
http://www.zdziarski.com/blog/?p=47
• “Software Vulnerabilities: Full-, Responsible-, and Non-Disclosure” by Andrew Cencini, Kevin Yu, Tony Chan
• “How long should security embargos be?”, Jake Edge,
https://lwn.net/Articles/479936/
• “Vulnerability disclosure publications and discussion tracking” https://www.ee.oulu.fi/research/ouspg/Disclosure_tracking
25
Recommendations for developers –
handling vulnerability reports
• Create an easy way to report vulnerabilities
– Typically set up an email address (“secure” or “security” @ “supplier”)
– Make sure this is well-documented & easy to find!
– Pre-establish an encrypted channel (e.g., encrypted email or an HTTPS page)
– Beware: attackers often try to monitor these channels…
• Monitor for vulnerabilities about your software & libraries embedded in it
(e.g., Google alerts, watch major vulnerability sites) - don’t be last to know
• Have a working development environment with configuration management (CM) of all released versions
– So you can quickly check out & fix released version(s)
• Have strong automated regression test suite & needed old hardware, set
up to run at a moment’s notice
– If it takes too long to test, you are not ready to release
• Have automated patch release/install system established (with keys)
– So users can quickly & automatically receive fixes (airgap signing keys!!)
• Always credit & thank vulnerability reporter (unless requested otherwise)
26
We’ve covered…
• Formal methods
– General approaches & specific tools available
– How to apply them varies by purpose
• Open proofs
• Malicious tools / trusting trust attack
– Countering using diverse double-compiling (DDC)
• Vulnerability disclosure
27
Released under CC BY-SA 3.0
• This presentation is released under the Creative Commons Attribution-ShareAlike 3.0 Unported (CC BY-SA 3.0) license
• You are free:
– to Share — to copy, distribute and transmit the work
– to Remix — to adapt the work
– to make commercial use of the work
• Under the following conditions:
– Attribution — You must attribute the work in the manner specified by the
author or licensor (but not in any way that suggests that they endorse you or
your use of the work)
– Share Alike — If you alter, transform, or build upon this work, you may
distribute the resulting work only under the same or similar license to this one
• These conditions can be waived by permission from the copyright holder
– dwheeler at dwheeler dot com
• Details at: http://creativecommons.org/licenses/by-sa/3.0/
• Attribute as “David A. Wheeler and the Institute for Defense Analyses”
28