Introduction
Dr. Bernard Chen Ph.D.
University of Central Arkansas
Spring 2009
What Is Computer Architecture?
Computer Architecture =
Instruction Set Architecture +
Machine Organization
Instruction Set Architecture
ISA = attributes of the computing system as seen by the programmer:
• Organization of programmable storage
• Data types & data structures
• Instruction set
• Instruction formats
• Modes of addressing
• Exception handling
Machine Organization
• Capabilities & performance characteristics of principal functional units (e.g., registers, ALU, shifters, logic units)
• Ways in which these components are interconnected
• Information flow between components
• Logic and means by which such information flow is controlled
What is a “Computer”?
• A computer is a machine that performs computational tasks using stored instructions.
• A computer consists of:
1) Central processing unit (CPU);
2) Random access memory (RAM);
3) Input-output processors (IOP).
These devices communicate with each other through a set of electric wires called a bus.
CPU consists of
> Arithmetic logic unit (ALU): executes arithmetic (addition, multiplication, ...) and logical (AND, OR, ...) operations.
> Control unit: generates a sequence of control signals (cf. traffic signals) telling the ALU how to operate; reads and executes microprograms stored in a read-only memory (ROM).
> Registers: fast, small memory for temporary storage during mathematical operations.
RAM stores
> Program: a sequence of instructions to be executed by the computer
> Data
History of Computers
The world’s first general-purpose electronic computer was ENIAC, built by Eckert and Mauchly at the University of Pennsylvania during World War II. However, rewiring this computer to perform a new task required days of work by a number of operators.
The first practical stored-program computer
The first practical stored-program computer was EDSAC, built in 1949 by Wilkes of Cambridge University. Now the program, in addition to the data, is stored in memory, so different problems can be solved without rewiring the hardware.
UNIVAC I
Eckert and Mauchly later went into business and built the first commercial computer in the United States, UNIVAC I, in 1951.
IBM System/360 series
A commercial breakthrough occurred in 1964 when IBM introduced the System/360 series. The series included various models ranging from $225K to $1.9M with varied performance but a single instruction set architecture.
Supercomputers
The era of vector supercomputers started in 1976 when Seymour Cray built the Cray-1. Vector processing is a type of parallelism that speeds up computation; we will learn the related concept of pipelining in this course. In the late 1980s, massively parallel computers such as the CM-2 became the central technology for supercomputing.
Microprocessors
Another important development
is the invention of the
microprocessor--a computer on
a single semiconductor chip.
14
Personal Computers
Microprocessors enabled personal computers such as the Apple II, built in 1977 by Steve Jobs and Steve Wozniak.
Moore’s Law
In 1965, Gordon Moore predicted that the number of transistors per integrated circuit would double roughly every two years (often quoted as every 18 months). This prediction, called "Moore's Law," has continued to hold. The chart below shows the number of transistors in several chips introduced since 1971.
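As a rough illustration of what exponential doubling means, the prediction can be written as count(t) = count(0) * 2^(t/T), where T is the doubling period. A small sketch (the starting point, the Intel 4004 at about 2,300 transistors in 1971, is a widely published figure; the function itself is just the doubling formula, not from the slides):

```python
# Illustrative only: project transistor counts under an assumed
# doubling period, starting from the Intel 4004 (1971, ~2,300 transistors).
def projected_transistors(start_count, start_year, year, period_years=2.0):
    """Predicted transistor count in `year` under exponential doubling."""
    doublings = (year - start_year) / period_years
    return start_count * 2 ** doublings

# 10 doublings per 20 years -> roughly a 1000x increase per two decades.
print(round(projected_transistors(2300, 1971, 1991)))
```

Actual chip counts deviate from any single doubling period, but the exponential trend itself is what the chart below illustrates.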
Moore’s Law Still Holds
[Chart: transistors per die, 1960 to 2010, log scale from 10^0 to 10^11. Memory chips (1K through 4G) and microprocessors (4004, 8080, 8086, 80286, i386, i486, Pentium, Pentium II, Pentium III, Pentium 4, Itanium) both track the predicted exponential growth. Source: Intel]
Digital Systems - Analog vs. Digital
[Figure: magnetic tape containing analog and digital forms of a signal: (a) analog form, (b) sampled analog form, (c) digital form, stored as 16-bit binary words.]
• Analog vs. Digital: continuous vs. discrete.
• Result: digital computers replaced analog computers.
Digital Advantages
• More flexible (easy to program), faster, more precise.
• Storage devices are easier to implement.
• Built-in error detection and correction.
• Easier to miniaturize.
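The error-detection advantage can be made concrete with the simplest possible scheme, a single even-parity bit. This is a generic illustration, not a mechanism described in the slides:

```python
# Minimal sketch of digital error detection with an even-parity bit.
# The parity bit makes the total number of 1s even; a single flipped
# bit in transit then makes the count odd, exposing the error.

def add_parity(bits):
    """Append an even-parity bit to a list of 0/1 values."""
    return bits + [sum(bits) % 2]

def check_parity(bits):
    """Return True if the word (including its parity bit) looks intact."""
    return sum(bits) % 2 == 0

word = add_parity([1, 0, 1, 1])   # parity bit = 1, so total ones = 4 (even)
assert check_parity(word)

word[2] ^= 1                      # simulate a single-bit error
assert not check_parity(word)
```

Such checks are trivial for discrete 0/1 signals but have no direct analog-computer equivalent, which is one reason digital designs won out.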
Binary System
• Digital computers use the binary number system.
 Binary number system: has two digits, 0 and 1.
• Reasons to choose the binary system:
1. Simplicity: a computer is an “idiot” that blindly follows mechanical rules; we cannot assume any prior knowledge on its part.
2. Universality: in addition to arithmetic operations, a computer that speaks a binary language can perform any task that can be expressed in formal logic.
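As a small aside (not from the slides), binary is just positional notation in base 2, which is easy to check in code:

```python
# Illustration: binary is positional notation with base 2.
# 1101 in base 2 means 1*8 + 1*4 + 0*2 + 1*1 = 13.
n = 13
bits = bin(n)                 # Python renders 13 as '0b1101'
assert int(bits, 2) == n      # converting back recovers the number
print(n, "in binary is", bits[2:])
```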
Example
Adding two numbers
High-level language (C)
c = a + b;
Assembly language
LDA 004
ADD 005
STA 006
Machine language
0010 0000 0000 0100
0001 0000 0000 0101
0011 0000 0000 0110
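The machine words above use a 4-bit opcode followed by a 12-bit address (so, reading off the listing, 0010 = LDA, 0001 = ADD, 0011 = STA). A minimal sketch of how a one-accumulator machine could execute them; the memory layout, sample values, and Python framing are illustrative, not part of the slides:

```python
# Minimal sketch of a one-accumulator machine executing the three
# 16-bit words above: 4-bit opcode, 12-bit address.
# Opcodes as in the listing: 0001 = ADD, 0010 = LDA, 0011 = STA.

memory = [0] * 16
memory[0] = 0b0010_0000_0000_0100   # LDA 004
memory[1] = 0b0001_0000_0000_0101   # ADD 005
memory[2] = 0b0011_0000_0000_0110   # STA 006
memory[4] = 7                       # a (sample value)
memory[5] = 5                       # b (sample value)

acc = 0                             # accumulator register
for pc in range(3):                 # fetch, decode, execute
    word = memory[pc]
    opcode, addr = word >> 12, word & 0xFFF
    if opcode == 0b0010:            # LDA: load memory into accumulator
        acc = memory[addr]
    elif opcode == 0b0001:          # ADD: add memory to accumulator
        acc += memory[addr]
    elif opcode == 0b0011:          # STA: store accumulator to memory
        memory[addr] = acc

print(memory[6])                    # c = a + b
```

This is exactly the c = a + b computation: the control unit's job in a real CPU is to generate the signals that carry out this decode-and-execute loop in hardware.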
Boolean algebra
Boolean algebra was introduced because of the great need to manipulate relations between functions that contain binary (logic) expressions. It is named in honor of the pioneering mathematician George Boole.
A Boolean value is a 1 or a 0.
A Boolean variable takes on Boolean values.
A Boolean function takes in Boolean variables and produces Boolean values.
Boolean or logic operations
1. OR. Written + (e.g., X+Y, where X and Y are Boolean variables) and often called the logical sum. OR is a binary operator.
2. AND. Called the logical product and written as a centered dot (like the product in regular algebra). AND is a binary operator.
3. NOT. A unary operator (one argument); NOT(A) is written as A with a bar over it, or with ' instead of a bar, as that is easier to type.
4. Exclusive OR (XOR). Written as + with a circle around it. Also a binary operator. True if exactly one input is true (i.e., true XOR true = false).
TRUTH TABLES

A B | AND (A.B) | NAND ((A.B)') | OR (A+B) | XOR
0 0 |     0     |       1       |    0     |  0
0 1 |     0     |       1       |    1     |  1
1 0 |     0     |       1       |    1     |  1
1 1 |     1     |       0       |    1     |  0
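These tables can be generated mechanically from the operator definitions; a short sketch using Python's bitwise operators on 0/1 values (the column order matches the table above):

```python
# Generate the AND, NAND, OR, and XOR truth tables by enumerating
# every (A, B) input pair and applying each operator.
from itertools import product

rows = []
for a, b in product([0, 1], repeat=2):
    rows.append((a, b, a & b, 1 - (a & b), a | b, a ^ b))

print("A B  A.B  (A.B)'  A+B  XOR")
for r in rows:
    print(*r)
```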
Important identities of Boolean ALGEBRA
(+ for OR, . for AND, ' for NOT)
Identity:
• A+0 = 0+A = A
• A.1 = 1.A = A
Inverse:
• A+A' = A'+A = 1
• A.A' = A'.A = 0
Important identities of Boolean ALGEBRA
Associative:
• A+(B+C) = (A+B)+C
• A.(B.C) = (A.B).C
Due to the associative law we can write A.B.C, since either order of evaluation gives the same answer. We often elide the '.', so the product associative law is written A(BC) = (AB)C.
Important identities of Boolean ALGEBRA
Distributive:
• A(B+C) = AB+AC (as in ordinary algebra).
• A+(BC) = (A+B)(A+C) (unlike ordinary algebra).
How does one prove these laws? Simple (but long): write the truth table for each side and check that the outputs are the same.
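The truth-table proof is exactly the kind of mechanical check a program does well. A sketch that verifies the associative and distributive laws over all eight input combinations:

```python
# Prove the associative and distributive laws the "simple but long"
# way: enumerate every combination of Boolean inputs and compare
# both sides of each identity.
from itertools import product

for A, B, C in product([0, 1], repeat=3):
    # Associative laws
    assert (A | (B | C)) == ((A | B) | C)
    assert (A & (B & C)) == ((A & B) & C)
    # Distributive laws (the second has no ordinary-algebra analogue)
    assert (A & (B | C)) == ((A & B) | (A & C))
    assert (A | (B & C)) == ((A | B) & (A | C))

print("all identities hold for every input combination")
```

Because a Boolean expression in n variables has only 2^n input combinations, exhaustive checking is always a valid proof technique here, unlike in ordinary algebra over the reals.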