Implementing Cognitive Radio
How does a radio become cognitive?
Cognitive Radio Technologies, 2008
Presentation Overview
• Architectural Approaches
• Observing the Environment
– Autonomous Sensing
– Collaborative Sensing
– Radio Environment Maps and Observation Databases
• Recognizing Patterns
– Neural Nets
– Hidden Markov Models
• Making Decisions
– Common Heuristic Approaches
– Case-based Reasoning
• Representing Information
• A Case Study
Architectural Overview
What are the components of a cognitive radio and how do they relate to each other?
Strong Artificial Intelligence
• Concept: Make a machine aware (conscious) of its environment and self-aware
• A complete failure (probably a good thing)
Weak Artificial Intelligence
• Concept: Develop powerful (but limited)
algorithms that intelligently respond to
sensory stimuli
• Applications
– Machine Translation
– Voice Recognition
– Intrusion Detection
– Computer Vision
– Music Composition
Implementation Classes
• Weak cognitive radio
– Radio’s adaptations determined by hard coded algorithms and informed by observations
– Many may not consider this to be cognitive (see discussion related to Fig 6 in 1900.1 draft)
• Strong cognitive radio
– Radio’s adaptations determined by conscious reasoning
– Closest approximation is the ontology reasoning cognitive radios
• In general, strong cognitive radios have potential to achieve both much better and much worse behavior in a network.
Weak/Procedural Cognitive
Radios
• Radio’s adaptations determined by hard
coded algorithms and informed by
observations
• Many may not consider this to be cognitive
(see discussion related to Fig 6 in 1900.1
draft)
– A function of the fuzzy definition
• Implementations:
– CWT Genetic Algorithm Radio
– MPRG Neural Net Radio
– Multi-dimensional hill climbing DoD LTS (Clancy)
– Grambling Genetic Algorithm (Grambling)
– Simulated Annealing/GA (Twente University)
– Existing RRM Algorithms?
Strong Cognitive Radios
• Radio’s adaptations determined by
some reasoning engine which is
guided by its ontological knowledge
base (which is informed by
observations)
• Proposed Implementations:
– CR One Model based reasoning (Mitola)
– Prolog reasoning engine (Kokar)
– Policy reasoning (DARPA xG)
DFS in 802.16h
• Drafts of 802.16h defined a generic DFS algorithm which implements observation, decision, action, and learning processes
• Very simple implementation
[Flowchart: Observation: channel availability check on the next channel; if unavailable, choose a different channel. Decision/Action: select and change to a new available channel in a defined time, with a max. transmission time. Observation: in-service monitoring of the operating channel, plus background in-service monitoring on non-operational channels. On detection: stop transmission and start the Channel Exclusion timer (channel unavailable for the Channel Exclusion time). Learning: log of channel availability.]
Modified from Figure h1, IEEE 802.16h-06/010, Draft IEEE Standard for Local and metropolitan area networks Part 16: Air Interface for Fixed Broadband Wireless Access Systems, Amendment for Improved Coexistence Mechanisms for License-Exempt Operation, 2006-03-29
Example Architecture from CWT
[Architecture diagram: the radio connects to the cognitive engine through a CE-radio interface (performance API, hardware/platform API) carrying observation, orientation, and action. Radio-domain cognition components: waveform recognizer (WMS), radio performance monitor, channel identifier, and radio resource monitor. A cognitive system controller coordinates the user model, policy model (regulatory information), decision maker, and an evolver (WSGA) that compares |(Simulated Meters) − (Actual Meters)|; decisions select configurations, e.g., D_CH = max{S_CH ∩ U_CH} and D_U = max{S_U ∩ U_U}; initial chromosomes, WSGA parameters, and objectives and weights feed the evolver, which returns a system chromosome. Learning updates a knowledge base (short-term and long-term memory holding WSGA parameter sets, regulatory information, and models). The user and policy domains provide user preferences, local service facilities, and security (user data security, system/network security) through a CE-user interface; the platform is an x86/Unix terminal.]
Architecture Summary
• Two basic approaches
– Implement a specific algorithm or specific collection of algorithms which
provide the cognitive capabilities
• Specific Algorithms
– Implement a framework which permits algorithms to be changed based
on needs
• Cognitive engine
• Both implement following processes
– Observation, Decision, Action
• Either approach could implement
– Learning, Orientation
– Negotiation, policy engines, models
• Process boundaries may blur based on the implementation
– Signal classification could be orientation or observation
• Some processes are very complementary
– Orientation and learning
• Some processes make most intuitive sense with specific
instantiations
– Learning and case-based-reasoning
Observations
How does the radio
find out about its
environment?
The Cognitive Radio and its
Environment
Information sources, organized by what the information is about:
Environment (physical quantities, position, situations)
– How the cognitive radio gets the information: measures temperature, light level, humidity, …; receives GPS signals to determine position
– Other opportunities: parses short-range wireless broadcasts in buildings or urban areas for mapped environment; observes the network for e.g. weather forecasts, reported traffic jams, etc.
Spectrum (communication opportunities)
– How the cognitive radio gets the information: passively “listens” to the spectrum; performs channel quality estimation
– Other opportunities: spectrum information is provided by the network; spectrum information is shared by other cognitive radios
User
– How the cognitive radio gets the information: observes user’s applications and incoming/outgoing data streams; performs speech analysis
Signal Detection
• Optimal technique is matched filter
• While sometimes useful, matched filter may not be
practical for cognitive radio applications as the signals
may not be known
• Frequency domain analysis often required
• Periodogram
P_xx(F) = lim_{T0→∞} E[ (1/(2·T0)) | ∫_{−T0}^{T0} x(t) e^{−j2πFt} dt |² ]
– Fourier transform of autocorrelation function of received signal
– More commonly implemented as magnitude squared of FFT of signal
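As a concrete sketch, the magnitude-squared-FFT estimator described above, plus a segment-averaged (Bartlett) variant, can be written in a few lines of Python (function names and the normalization choice are illustrative, not from the slides):

```python
import numpy as np

def periodogram(x, fs=1.0):
    """Periodogram estimate: magnitude-squared FFT, normalized by record length."""
    N = len(x)
    X = np.fft.fft(x)
    Pxx = (np.abs(X) ** 2) / (fs * N)
    f = np.fft.fftfreq(N, d=1.0 / fs)
    return f, Pxx

def bartlett_average(x, segments, fs=1.0):
    """Bartlett's method: average periodograms of non-overlapping segments,
    trading frequency resolution for lower variance."""
    L = len(x) // segments
    psds = [periodogram(x[i * L:(i + 1) * L], fs)[1] for i in range(segments)]
    f = np.fft.fftfreq(L, d=1.0 / fs)
    return f, np.mean(psds, axis=0)
```

A tone at 0.2 cycles/sample shows up as a peak near that frequency in both estimates; the averaged version has lower variance but coarser resolution, matching the trade-off noted on the following slide.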
Comments on Periodogram
• Spectral leaking can mask weak signals
• Resolution a function of number of data points
• Significant variance in samples
– Can be improved by averaging, e.g., Bartlett, Welch
– Less resolution for the complexity
• Significant bias in estimations (due to finite length)
– Can be improved by windowing autocorrelation, e.g., Blackman-Tukey
Quality Factor
Q = (E[P_xx(f)])² / var[P_xx(f)]

Estimation           | Quality Factor | Complexity
Periodogram          | 1              |
Bartlett             | 1.11·N·Δf      | (N/2)·log2(0.9/Δf)
Welch (50% overlap)  | 1.39·N·Δf      | N·log2(5.12/Δf)
Blackman-Tukey       | 2.34·N·Δf      | N·log2(1.28/Δf)
Other Detection Techniques
• Nonparametric
– Goertzel – evaluates Fourier Transform for a small
band of frequencies
• Parametric Approaches
– Need some general characterization (perhaps as
general as sum of sinusoids)
– Yule-Walker (Autoregressive)
– Burg (Autoregressive)
• Eigenanalysis
– Pisarenko Harmonic Decomposition
– MUSIC
– ESPRIT
Sub noise floor Detection
• Detecting narrowband signals with negative SNRs is
actually easy and can be performed with preceding
techniques
• Problem arises when signal PSD is close to or below
noise floor
• Pointers to techniques:
– (white noise) C. L. Nikias and J. M. Mendel, “Signal processing
with higher-order spectrum,” Signal Processing, July 1993.
– (Works with colored noise and time-varying frequencies) K.
Hock, “Narrowband Weak Signal Detection by Higher Order
Spectrum,” Signal Processing, April 1996
– C.T. Zhou, C. Ting, “Detection of weak signals hidden beneath
the noise floor with a modified principal components analysis,”
AS-SPCC 2000, pp. 236-240.
Signal Classification
• Detection and frequency identification alone is
often insufficient as different policies are applied
to different signals
– Radar vs 802.11 in 802.11h,y
– TV vs 802.22
• However, would prefer to not have to implement
processing to recover every possible signal
• Spectral Correlation permits feature extraction
for classification
Cyclic Autocorrelation
• Cyclic Autocorrelation
R_x^α(τ) = lim_{Δt→∞} (1/Δt) ∫_{−Δt/2}^{Δt/2} x(t + τ/2) x*(t − τ/2) e^{−j2παt} dt
• Quick terminology:
– Purely stationary: R_x^α(τ) ≠ 0 only for α = 0
– Purely cyclostationary: R_x^α(τ) ≠ 0 only for α = n/T0, integer n
– Exhibiting cyclostationarity: R_x^α(τ) ≠ 0 for some α ≠ 0
• Meaning: periods of cyclostationarity correspond to:
– Carrier frequencies, pulse rates, spreading code repetition rates, frame rates
• Classify by periods exhibited in R_x^α
Spectral Correlation
• Estimation of Spectral Correlation Density (SCD)
S_{X_T}^α(f)_{Δt} = (1/Δt) ∫_{t−Δt/2}^{t+Δt/2} (1/T) X_T(t, f + α/2) X_T*(t, f − α/2) dt
– For α = 0, the above is the periodogram and, in the limit, the PSD
• SCD is equivalent to Fourier Transform of Cyclic Autocorrelation
S_x^α(f) = ∫_{−∞}^{∞} R_x^α(τ) e^{−j2πfτ} dτ
Spectral Coherence Function
• Spectral Coherence Function
C_x^α(f) = S_x^α(f) / sqrt( S_x^0(f + α/2) · S_x^0(f − α/2) )
• Normalized, i.e., |C_x^α(f)| ≤ 1
• Terminology:
– α = cycle frequency
– f = spectrum frequency
• Utility: peaks of C_x^α correspond to the underlying periodicities of the signal that may be obscured in the PSD
• Like periodogram, variance is reduced by averaging
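A minimal time-smoothed estimate of C_x^α(f) can be sketched as below: shifted FFT blocks are correlated and normalized by the PSD estimates at the shifted frequencies. This is a rough illustration, not an optimized FAM/strip spectral correlation implementation, and the names are hypothetical:

```python
import numpy as np

def spectral_coherence(x, alpha, nfft=64):
    """Rough time-smoothed spectral coherence estimate at cycle frequency
    alpha (cycles/sample): correlate X(f+alpha/2) with X(f-alpha/2) over
    blocks and normalize by the PSDs at the shifted frequencies."""
    shift = int(round(alpha * nfft / 2))   # alpha/2 expressed in FFT bins
    nblocks = len(x) // nfft
    num = np.zeros(nfft, dtype=complex)
    p_up = np.zeros(nfft)
    p_dn = np.zeros(nfft)
    for b in range(nblocks):
        X = np.fft.fft(x[b * nfft:(b + 1) * nfft])
        Xu = np.roll(X, -shift)            # X(f + alpha/2)
        Xd = np.roll(X, shift)             # X(f - alpha/2)
        num += Xu * np.conj(Xd)            # block-averaged correlation
        p_up += np.abs(Xu) ** 2
        p_dn += np.abs(Xd) ** 2
    return np.abs(num) / np.sqrt(p_up * p_dn + 1e-12)
```

For a real tone at 0.25 cycles/sample, the coherence peaks near 1 at f = 0 for cycle frequency α = 0.5 (twice the carrier), and stays small at cycle frequencies where the signal has no feature.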
Practical Implementation of
Spectral Coherence Function
[Block diagram: x(t) → N-point FFT → X(f); shift to form X(f − α/2) and X(f + α/2); correlate X*(f − α/2) with X(f + α/2) to obtain S_{X_T}^α(f)_{Δt}; block average over Δt; get the spectral coherence function C_x^α(f)]
From Figure 4.1 in I. Akbar, “Statistical Analysis of Wireless Systems Using Markov Models,” PhD Dissertation, Virginia Tech, January 2007
Example Magnitude Plots
[SCF magnitude plots for DSB-SC AM, FSK, BPSK, and MSK]
α-Profile
• α-profile of the SCF: I(α) = max_f |Ĉ_x^α(f)|
• Reduces data set size, but captures most periodicities
[Plots: cycle frequency profiles (max. amplitude of spectral coherence vs. cycle frequency α/Fs) for DSB-SC AM, BPSK, FSK, and MSK, each at SNR = 0 dB, observation length 100]
Combination of Signals
[SCF magnitude plots: MSK, BPSK, and BPSK + MSK]
Impact of Signal Strength
[SCF magnitude plots: BPSK at SNR = 9 dB and at SNR = −9 dB; the main signature remains]
Resolution
• High α resolution may be needed to capture feature space
– High computational burden
• Lower resolution possible if there are expected features
– Legacy radios should be predictable
– CR may not be predictable
– Also implies an LPI strategy
[Plots: SCF estimates of BPSK at 200x200 and 100x100 resolution, and AM]
Plots from A. Fehske, J. Gaeddert, J. Reed, “A new approach to signal classification using spectral correlation and neural networks,” DySPAN 05, pp. 144-150.
Additional comments on
Spectral Correlation
• Even though PSDs may overlap, spectral correlation
functions for many signals are quite distinct, e.g., BPSK,
QPSK, AM, PAM
• Uncorrelated noise is theoretically zeroed in the SCF
– Technique for subnoise floor detection
• Permits extraction of information in addition to
classification
– Phase, frequency, timing
• Higher order techniques sometimes required
– Some signals will not be very distinct, e.g., QPSK, QAM, PSK
– Some signals do not exhibit requisite second order periodicity
Collaborative Observation
• Possible to combine estimations
• Reduces variance, improves PD
vs PFA
• Should be able to improve
resolution
• Proposed for use in 802.22
– Partition cell into disjoint regions
– CPE feeds back what it finds
• Number of incumbents
• Occupied bands
[Map: grid of CPEs partitioned into regions, CPE Number = 400, IT Number = 4. Source: IEEE 802.22-06/0048r0]
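The gain from combining estimations can be illustrated with simple hard-decision fusion rules. The OR rule below shows how per-sensor detection probability compounds across independent sensors (a textbook illustration, not the 802.22 procedure; names are ours):

```python
def fuse_or(decisions):
    """Hard-decision OR fusion: declare the band occupied if any sensor detects."""
    return int(any(decisions))

def fuse_majority(decisions):
    """K-out-of-N (majority) fusion: tolerant of a single faulty sensor."""
    return int(sum(decisions) > len(decisions) / 2)

def or_rule_pd_pfa(pd, pfa, n):
    """System-level detection / false-alarm probabilities when OR-fusing n
    independent sensors with identical per-sensor (pd, pfa)."""
    return 1 - (1 - pd) ** n, 1 - (1 - pfa) ** n
```

Four mediocre sensors with pd = 0.6 fuse to a system pd above 0.97, at the cost of a higher system false-alarm rate; the majority rule trades some of that detection gain for false-alarm control.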
More Expansive Collaboration:
Radio Environment Map (REM)
• “Integrated database consisting of multi-domain information, which
supports global cross-layer optimization by enabling CR to “look”
through various layers.”
• Conceptually, all the information a radio might need to make its
decisions.
– Shared observations, reported actions, learned techniques
• Significant overhead to set up, but simplifies a lot of applications
• Conceptually not just cognitive radio, omniscient radio
From: Y. Zhao, J. Gaeddert, K. Bae, J. Reed, “Radio Environment Map Enabled Situation-Aware Cognitive Radio Learning Algorithms,” SDR Forum Technical Conference 2006.
Example Application:
• Overlay network of secondary
users (SU) free to adapt
power, transmit time, and
channel
• Without REM:
– Decisions solely based on link
SINR
• With REM
– Radios effectively know everything
Upshot: A little gain for the secondary users;
big gain for primary users
From: Y. Zhao, J. Gaeddert, K. Bae, J. Reed, “Radio Environment Map Enabled Situation-Aware Cognitive Radio Learning Algorithms,” SDR Forum Technical Conference 2006.
Observation Summary
• Numerous sources of information available
• Tradeoff in collection time and spectral resolution
• Finite run-length introduces bias
– Can be managed with windowing
• Averaging reduces variance in estimations
• Several techniques exist for negative SNR detection and
classification
• Cyclostationarity analysis yields hidden “features” related
to periodic signal components such as baud rate, frame
rate and can vary by modulation type
• Collaboration improves detection and classification
• REM is logical extreme of collaborative observation.
Pattern Recognition
Hidden Markov
Models, Neural
Networks,
Ontological
Reasoning
Hidden Markov Model (HMM)
• A model of a system which behaves like a Markov chain
except we cannot directly observe the states, transition
probabilities, or initial state.
• Instead we only observe random variables with
distributions that vary by the hidden state
• To build an HMM, must estimate:
– Number of states
– State transition probabilities
– Initial state distribution
– Observations available for each state
– Probability of each observation for each state
• Model can be built from observations using Baum-Welch
algorithm
• With a specified model, the likelihood of an output sequence can be evaluated using the forward-backward algorithm
• With a specified model, a sequence of states can be
estimated from observations using the Viterbi algorithm.
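The Viterbi step mentioned above can be sketched in a few lines for a discrete-observation HMM (a standard log-domain implementation; variable names are illustrative):

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most likely hidden-state sequence for a discrete-observation HMM.
    pi: initial distribution (S,), A: transitions (S,S), B: emissions (S,O)."""
    S, T = len(pi), len(obs)
    logd = np.log(pi) + np.log(B[:, obs[0]])          # log delta at t=0
    back = np.zeros((T, S), dtype=int)                # backpointers
    for t in range(1, T):
        scores = logd[:, None] + np.log(A)            # path scores into each state
        back[t] = np.argmax(scores, axis=0)           # best predecessor per state
        logd = scores[back[t], np.arange(S)] + np.log(B[:, obs[t]])
    path = [int(np.argmax(logd))]                     # backtrack from best end state
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]
```

With sticky states and nearly state-revealing emissions, the decoded path simply follows the observations, as expected.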
Example
• A hidden machine selects balls from an
unknown number of bins.
• Bin selection is driven by a Markov chain.
• You can only observe the sequence of balls
delivered to you and want to be able to predict
future deliveries
[Diagram: hidden states (bins) and the resulting observation sequence of balls]
HMM for Classification
• Suppose several different HMMs have been calculated
with Baum Welch for different processes
• A sequence of observations could then be classified as
being most like one of the different models
• Techniques:
– Apply Viterbi to find most likely sequence of state transitions
through each HMM and classify as the one with the smallest
residual error.
– Build a new HMM based on the observations and apply an
approximation of Kullback-Leibler divergence to measure
“distance” between new and existing HMMs. See M.
Mohammed, “Cellular Diagnostic Systems Using Hidden Markov
Models,” PhD Dissertation, Virginia Tech, October 2006.
System Model for Signal
Classification
Signal Classification Results
Effect of SNR and Observation Length
• BPSK signal detection rate for various SNRs and observation lengths (BPSK HMM is trained with 9 dB)
• Decreasing SNR increases the observation time needed to obtain a good detection rate
[Plot: BPSK signal classification performance for various SNRs and observation lengths (BPSK HMM with 9 dB); detection rate vs. observation length (one block is 100 symbols), with curves for SNRs from −12 dB to 9 dB]
Location Classifier Design
• Designing a classifier requires two fundamental steps
– Extraction of a set of features that ensures highly
discriminatory attributes between locations
– Select a suitable classification model
• Features are extracted based on received power delay
profile which includes information regarding the surrounding
environment (NLoS/LoS, multipath strength, delay etc.).
• The selection of hidden Markov model (HMM) as a
classification tool was motivated by its success in other
applications, e.g., speech recognition.
Location of interest → Collect statistics to form signature features → Use pattern matching algorithm to classify features
Determining Location by
Comparing HMM Sequences
[Diagram: a candidate received power profile undergoes feature extraction and vector quantization to produce an observation sequence O; O is scored against the HMMs for positions 1…n, giving p(O|λ1) … p(O|λn); the index of the recognized position is v* = argmax_{1≤v≤n} p(O|λv)]
• In the testing phase, the candidate power profile is compared against all the HMMs previously trained and stored in the database.
• The HMM with the closest match identifies the corresponding position.
Feature Vector Generation
[Plot: power delay profile; received power (dBm) vs. excess delay (ns), with noise threshold = −80 dBm]
• Each location of interest was characterized by its channel characteristics, i.e., its power delay profile.
• Three-dimensional feature vectors were derived from the power delay profile: excess time, and the magnitude and phase of the Fourier transform (FT) of the power delay profile in each direction.
Measurement Setup Cont.
Measurement Locations 1.1 – 1.4, 4th Floor,
Durham Hall, Virginia Tech. The transmitter is
located in Room 475, Receivers 1.1 and 1.2 are
located in Room 471; Receiver 1.3 is in the
conference room in the 476 computer lab, and
Receiver 1.4 is located in the hallway adjacent to
475.
[Floor plan: receiver locations marked, with distances of ~17’ and ~58’ indicated]
• Transmitter location 1 represents
NLOS propagation from a room to
another room, and from a room to
a hallway. The transmitter and
receivers were separated by
drywall containing metal studs.
• The transmitter was located in a
small laboratory. Receiver
locations 1.1 – 1.3 were in
adjacent rooms, whereas receiver
location 1.4 was in an adjacent
hallway. Additionally, for locations
1.1 – 1.3, a standard office dry-erase “whiteboard” was located
on the wall separating the
transmitter and receiver.
Vector Quantization (VQ)
• Since the discrete observation
density is required to train
HMMs, a quantization step is
required to map the
“continuous” vectors into
discrete observation sequence.
• Vector quantization (VQ) is an
efficient way of representing
multi-dimensional signals.
Features are represented by a
small set of vectors, called
codebook, based on minimum
distance criteria.
• The entire space is partitioned into disjoint regions, known as Voronoi regions.
Example vector quantization in
a two-dimensional space.*
* http://www.geocities.com/mohamedqasem/vectorquantization/vq.htm
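A toy sketch of the quantization step, with a Lloyd/k-means style codebook trainer, is shown below (illustrative only; a real system would use a larger codebook, e.g., the size-32 codebook used in the experiment later in the deck):

```python
import numpy as np

def quantize(vectors, codebook):
    """Map each continuous feature vector to the index of the nearest
    codebook entry (minimum Euclidean distance)."""
    d = np.linalg.norm(vectors[:, None, :] - codebook[None, :, :], axis=2)
    return np.argmin(d, axis=1)

def train_codebook(vectors, k, iters=20, seed=0):
    """Toy codebook training via Lloyd's algorithm (k-means):
    alternate nearest-entry assignment and centroid update."""
    rng = np.random.default_rng(seed)
    code = vectors[rng.choice(len(vectors), k, replace=False)].astype(float)
    for _ in range(iters):
        idx = quantize(vectors, code)
        for j in range(k):
            if np.any(idx == j):
                code[j] = vectors[idx == j].mean(axis=0)
    return code
```

Feeding the resulting index stream (the discrete observation sequence) to the HMM trainer is what the pipeline above calls vector quantization.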
Classification Result
• A four-state HMM was used to represent each location (Rx
1.1-1.4).
• Codebook size was 32
• Confusion matrix for Rx locations 1.1–1.4 (rows: true candidate received power profile; columns: HMM-estimated location):

Position | Rx 1.1 | Rx 1.2 | Rx 1.3 | Rx 1.4
Rx 1.1   |   95%  |    5%  |    0%  |    0%
Rx 1.2   |    5%  |   95%  |    0%  |    0%
Rx 1.3   |    0%  |    0%  |  100%  |    0%
Rx 1.4   |    0%  |   10%  |    0%  |   90%

• Overall accuracy: 95% (the diagonal entries are correct classifications)
Some Applications of HMMs to
CR from VT
• Signal Detection and Classification
• Position Location from a Single Site
• Traffic Prediction
• Fault Detection
• Data Fusion
The Neuron and Threshold Logic Unit
• Several inputs are weighted, summed, and passed through a transfer function
• Output passed onto other layers or forms an output itself
• Common transfer (activation) functions:
– Step: f(a) = 1 if a ≥ θ, else 0
– Linear threshold: a piecewise-linear ramp between the two step levels
– Sigmoid: f(a) = 1 / (1 + e^{−(a−θ)/ρ})
– tanh: f(a) = tanh((a − θ)/ρ)
Image from: http://en.wikipedia.org/wiki/Neuron
[Diagram: threshold logic unit with inputs x1 … xn, weights w1 … wn, activation a = Σ wi·xi, and output f(a)]
Neuron as Classifier
• Threshold of
multilinear neuron
defines a
hyperplane decision
boundary
• Number of inputs defines dimensionality of hyperplane
• Sigmoid or tanh
activation functions
permit soft
decisions
[Diagram: a two-input unit with weights w1–w3 and activation threshold > 0.25, with a truth table of inputs and outputs; the decision boundary is a line (hyperplane) in the (x1, x2) plane]
Training Algorithm
• Perceptron (linear transfer function)
– Basically an LMS training algorithm
– Steps: given a sequence of input vectors v and correct outputs t, for each (v, t) update the weights as
w^{k+1} = w^k + η (t − y) v
where y is the actual output (thus t − y is the error)
• Delta rule (differentiable transfer function)
– Adjusts based on the slope of the transfer function:
w^{k+1} = w^k + η (df(a)/da) (t − y) v
– Originally used with the sigmoid, as its derivative is easy to implement:
f(a) = 1 / (1 + e^{−(a−θ)/ρ}),  df(a)/da = (1/ρ) f(a) (1 − f(a))
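The perceptron update can be sketched directly, with a step activation and the bias folded in as an extra input (η and the epoch count are arbitrary choices):

```python
import numpy as np

def train_perceptron(X, t, eta=0.1, epochs=50):
    """Perceptron rule w <- w + eta*(t - y)*v with a bias input appended.
    Learns a separating hyperplane for linearly separable {0,1} labels."""
    Xb = np.hstack([X, np.ones((len(X), 1))])     # append bias component
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        for v, target in zip(Xb, t):
            y = 1 if v @ w > 0 else 0             # step activation
            w += eta * (target - y) * v           # update only on errors
    return w

def predict(X, w):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return (Xb @ w > 0).astype(int)
```

On linearly separable data such as the Boolean OR function, the rule converges to a separating hyperplane in a handful of epochs.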
The Perceptron
• More sophisticated version of TLU
• Prior to weighting, inputs are processed with Boolean
logic blocks
• Boolean logic is fixed during training
[Diagram: inputs x1 … xn pass through fixed Boolean logic blocks, then weights w1 … wn feed a threshold logic unit producing f(a)]
More Complex Decision Rules
• Frequently, it is impossible
to correctly classify with just
a single hyperplane
• Solution: Define several
hyperplanes via several
neurons and combine the
results (perhaps in another
neuron)
• This combination is called a
neural net
• Size of hidden layer is number of hyperplanes in decision rules
[Diagram: two-input network with an input layer, a hidden layer of several neurons, and an output layer combining the hyperplane decisions]
Backpropagation Algorithm
• Just using outputs and inputs doesn’t tell us how
to adjust hidden layer weights
• Trick is figuring out how much of the error can be
ascribed to each hidden neuron
w^{k+1} = w^k + η δ v
– Hidden neuron m: δ_m = (df(a_m)/da) Σ_{j∈I_m} w_j^m δ_j, where I_m indexes the neurons fed by m
– Output neuron: δ_o = (df(a_o)/da) (t − y)
[Diagram: two-input network with input, hidden, and output layers]
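A compact sketch of one backpropagation step for a single-hidden-layer sigmoid network, showing how the output delta is propagated back through the weights to assign error to hidden neurons (batch update; the learning rate and layer sizes are illustrative):

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def backprop_step(X, t, W1, W2, eta=0.5):
    """One batch gradient step for a 1-hidden-layer sigmoid network.
    The hidden deltas are the output delta propagated back through W2,
    scaled by the local slope of the activation (credit assignment)."""
    h = sigmoid(X @ W1)                        # hidden activations
    y = sigmoid(h @ W2)                        # network output
    delta_o = (t - y) * y * (1 - y)            # output-layer delta
    delta_h = (delta_o @ W2.T) * h * (1 - h)   # hidden-layer deltas
    W2 = W2 + eta * h.T @ delta_o
    W1 = W1 + eta * X.T @ delta_h
    return W1, W2, np.mean((t - y) ** 2)
```

Repeated application drives the mean squared error down, even on a non-linearly-separable target like XOR, because the hidden layer supplies the extra hyperplanes discussed above.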
Example Application
• Each signal class is a
multilayer linear
perceptron network with 4
neurons in the hidden
layer
• Trained with 199 point profile, back propagation
• Activation function tanh
• MAXNET chooses one
with largest value
[Result tables: 460 trials with known carrier and BW at −9 dB SNR; 295 trials with unknown carrier and BW at 15 dB SNR]
Results from: A. Fehske, J. Gaeddert, J. Reed, “A new approach to signal classification using spectral correlation and neural networks,” DySPAN 2005, pp. 144-150.
Comments on Orientation
• By itself ontological reasoning is likely inappropriate for
dealing with signals
• HMM and Neural Nets somewhat limited in how much
they can scale up arbitrarily
• Implementations should probably feature both classes of
techniques where
– HMMs and NNs identify presence of objects, locations, or
scenarios, and reasoning engine combines.
– Meaning of the presence of these objects is then inferred by
ontological reasoning.
Decision Processes
Genetic
algorithms,
case-based
reasoning, and
more
Decision Processes
• Goal: choose the actions
that maximize the radio’s
goal
• Very large number of
nonlinearly related
parameters tends to make
solving for optimal solution
quite time consuming.
Case Based Reasoning
• An elaborate switch (or if-then-else) statement informed by “cases” defined by orientation (or context)
• “Case” identified by
orientation, decision
specified in database for
the case
• Database can be built up
over time
• Problem of what to do
when new case is
identified
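At its simplest, the case base is a lookup keyed by the oriented context, with a nearest-case fallback for contexts not seen before (a toy sketch; the similarity metric and the action names are hypothetical):

```python
def cbr_decide(case_base, observed, default=None):
    """Case-based reasoning as a lookup with a nearest-case fallback:
    an exact match returns the stored action; otherwise the action of the
    closest known case is reused (or `default` when the base is empty)."""
    if observed in case_base:
        return case_base[observed]
    if not case_base:
        return default
    # squared-distance similarity between feature tuples (hypothetical metric)
    nearest = min(case_base,
                  key=lambda c: sum((a - b) ** 2 for a, b in zip(c, observed)))
    return case_base[nearest]
```

New (case, action) pairs can simply be written into the dictionary over time, which is the database-building behavior described above; what to store for a genuinely new case remains the hard design question.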
A. Aamodt, E. Plaza (1994); Case-Based Reasoning: Foundational Issues, Methodological Variations, and System Approaches. AI Communications. IOS Press, Vol. 7: 1, pp. 39-59.
Local Search
• Steps:
1. Search a “neighborhood” of solution sk to find the s* that improves performance the most.
2. sk+1=s*
3. Repeat 1,2 until sk+1= sk
• Variant: Gradient search, fixed
number of iterations, minimal
improvement
• Issues: Gets trapped in local
maxima
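The steps above can be sketched as a generic best-improvement hill climber (the neighborhood and objective are supplied by the caller; names are illustrative):

```python
def local_search(s0, neighbors, f):
    """Hill climbing: repeatedly move to the best-improving neighbor;
    stops (possibly at a local maximum) when no neighbor improves f."""
    s = s0
    while True:
        best = max(neighbors(s), key=f, default=s)
        if f(best) <= f(s):       # no improving neighbor: terminate
            return s
        s = best
```

Started on a multi-modal objective, it stops at the first local maximum it reaches, which is exactly the trap noted above.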
Figure from Fig 2.6 in I. Akbar, “Statistical Analysis of Wireless Systems Using Markov Models,” PhD Dissertation, Virginia Tech, January 2007
Genetic Algorithms
• Concept: Apply concept of evolution to
searching complex spaces
• Really a random search with some structure
• Successive populations (or generations) of
solutions are evaluated for their fitness.
• Least fit solutions are removed from the
population
• Most fit survive to breed replacement
members of the population
• Breeding introduces mutation and crossovers so that new population is not identical
to original population
– Like parents and kids
• Lots of variants
– Parents die off
– Niches
– Tabu for looping
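A minimal bit-string GA over the one-max problem illustrates the evaluate / cull / breed loop described above (truncation selection, single-point crossover, per-bit mutation; all parameter values are arbitrary, not recommendations):

```python
import random

def genetic_algorithm(fitness, length, pop_size=20, generations=60,
                      p_mut=0.02, seed=0):
    """Minimal bit-string GA: truncation selection, single-point
    crossover, per-bit mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]          # least fit are removed
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = rng.sample(survivors, 2)
            cut = rng.randrange(1, length)        # single-point crossover
            child = p1[:cut] + p2[cut:]
            child = [b ^ (rng.random() < p_mut) for b in child]  # mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = genetic_algorithm(sum, 16)  # one-max: fitness = number of 1 bits
```

Keeping the survivors unchanged each generation makes the search elitist, so the best fitness found never decreases; swapping in a radio-configuration fitness function in place of `sum` gives the chromosome-evaluation loop sketched on the next slide.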
Genetic Algorithm Example
[Diagram: a population of chromosomes with fields PWR | F | MAC | NET and fitness values (7, 5, 9, 1); the fittest members breed via crossover and mutation]
Comments on GA
• Tends to result in good solution very quickly
• Long time (perhaps no better than a random search) to find optimum
– Often paired with a local search
• Low mutation rates can cause “genetic drift”
• High mutation rates can limit convergence
• Crossover is like high mutation but without damaging convergence; however, it can get stuck on local maxima
• In theory, reaches global optimum, but requires more time to
guarantee than an exhaustive search
• Lots of freedom in the design
– Mutation rate, cross over rate, chromosome size, number of
generations, population size, number of survivors, breeding rules,
surviving rules
– Even more variation used when fitness function or data sets are
changing over time (e.g., set mutation rate or population as a function of
fitness)
– Theoretically, best combination of parameters is a function of the
characteristics of the solution space
– In practice, empirically setting parameters tends to be better (GA to program a GA?)
Simulated Annealing
• Steps:
1. Generate a random solution, s*
2. If s* is better than sk, then sk+1 = s*; else generate a random variable r; if r is less than some function of the temperature T and the difference in value between sk and s*, then sk+1 = s*.
3. From time to time decrease T so that f(sk − s*, T) decreases over time.
4. Repeat steps 1-3 until stopping criterion
• Comments:
– Important to store best result
– In theory, reaches global optimum, but requires more time to guarantee than an exhaustive search
– Often finished with a local search applied to best solution
• Freedom in algorithm
– Distributions for generating s*, schedules for T, change in distributions with T
• Threshold trading can be less costly
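The four steps translate almost line for line into code; this sketch uses the common minimizing convention with acceptance probability exp(−Δ/T), a geometric cooling schedule, and stores the best state seen (all schedule choices are illustrative):

```python
import math
import random

def simulated_annealing(f, s0, neighbor, T0=1.0, cooling=0.95,
                        iters=500, seed=0):
    """Minimizing variant of simulated annealing: worse moves are accepted
    with probability exp(-delta/T); T is lowered geometrically; the best
    state seen so far is stored and returned."""
    rng = random.Random(seed)
    s, best = s0, s0
    T = T0
    for _ in range(iters):
        cand = neighbor(s, rng)               # random neighboring solution
        delta = f(cand) - f(s)
        if delta < 0 or rng.random() < math.exp(-delta / T):
            s = cand                          # accept (always if improving)
        if f(s) < f(best):
            best = s                          # remember the best result
        T = max(T * cooling, 1e-9)            # cooling schedule
    return best
```

On a convex objective with Gaussian proposal steps, the best state drifts to the minimum; on harder landscapes the early high-temperature acceptances are what let it escape local minima before a final local-search polish.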
Comments on Decision
Processes
• Execution time
– Case-based reasoning < Searches
• Good architectural decision is to combine approaches:
– CBR except when unknown case
– GA for a quick good solution
– Refine with local search
– Can revisit searches later when excess cycles are available
• CBR can provide initial solution(s) to search algorithms
• Sometimes simpler algorithms are all that are required
and will run much faster than any of these
– Adjust power level for a target SINR
Representing Information
How can a
radio store
and
manipulate
knowledge?
Types of Knowledge
• Conceptual Knowledge
– Analytic or axiomatic
– Analytic if it expresses or follows from the meaning of objects
• E.g., a mobile radio is a radio with the property of mobility
– Axiomatic – fundamental conceptual relationships not based on
meaning alone
• Rules
– Relationships or theorems committed to memory
– Some authors draw a distinction between rules and conceptual
knowledge, but it could be argued that a rule is just an axiom (or
property)
• Can be expressed symbolically (e.g., UML),
ontologically, or behaviorally (e.g., GA)
Why languages to represent
information?
• Negotiation
– Heterogeneous devices can exchange information
• Sharing learned information between devices
• Permits reasoning and learning to be abstracted
away from specific platforms and algorithms
– Portability, maintainability
• Permits appearance of intelligence by reasoning
in a manner that appears familiar to a human
• Note: much of the preceding could also be done
with behavioral knowledge (e.g., sharing GA
states) but it is somewhat clumsier
Proposed Languages
• UML
• Radio Knowledge Representation Language
– Describes environment and radio capabilities
– Part of “radioOne”
• Resource Description Framework (RDF)
• Web Ontology Language (OWL)
– Proposed to facilitate queries between radios
• DAML (used by BBN)
• Issues of language interoperability, testability,
actual “thought” processes
Language Capabilities and
Complexity
• Increasing capabilities significantly increases complexity
Language  | Features                                                     | Reasoning                                    | Complexity
XTM       | Higher order relationships                                   | None                                         | O(N)
RDF       | Binary relationships                                         | None                                         | O(N)
RDFS      | RDF plus subclass, subproperty, domain, and range            | Subsumption                                  | O(Nm)
OWL Lite  | RDFS plus some class constructors; no crossing of metalevels | Limited form of description logic            | O(e^N)
OWL-DL    | All class constructors; no crossing of metalevels            | General description logic                    | <
OWL Full  | No restrictions                                              | Limited form of first order predicate logic  | ?

Modified from Table 13.1 in M. Kokar, “The Role of Ontologies in Cognitive Radio,” in Cognitive Radio Technology, ed. B. Fette, 2006.
Comments on Knowledge
Representation
• Ontologies are conceptually very appealing for realizing
thinking machines
• Personal concern: the goals of very high-level abstraction,
platform independence, automated interoperability, and the
lack of a detailed specification will lead to JTRS-like
implementation difficulties (and complexity here is
theoretically unbounded; JTRS is at least bounded)
– However, these goals are also the main benefits of using ontologies…
• Building an ontology is a time-intensive and complex task
• Combining ontologies will frequently lead to logical
inconsistencies
– Makes code validation hard
• Encourage development of domain standardized
ontologies
– Policy, radio, network
69/77
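A minimal sketch of how such inconsistencies arise: two ontologies that are each consistent in isolation can, once merged, assert the same individual into two classes that one of them declares disjoint. All names below are invented for illustration:

```python
# Sketch: merging two individually consistent ontologies produces a
# logical inconsistency (an individual in two disjoint classes).
# All class/individual names are illustrative.

ont_a = {
    "memberships": {("deviceX", "PrimaryUser")},
    "disjoint": {frozenset({"PrimaryUser", "SecondaryUser"})},
}
ont_b = {
    "memberships": {("deviceX", "SecondaryUser")},
    "disjoint": set(),
}

def merge(*onts):
    """Union the assertions of several ontologies."""
    merged = {"memberships": set(), "disjoint": set()}
    for o in onts:
        merged["memberships"] |= o["memberships"]
        merged["disjoint"] |= o["disjoint"]
    return merged

def inconsistencies(ont):
    """Report individuals asserted into two disjoint classes."""
    found = []
    for pair in ont["disjoint"]:
        c1, c2 = sorted(pair)
        members = lambda c: {i for i, cls in ont["memberships"] if cls == c}
        for ind in members(c1) & members(c2):
            found.append((ind, c1, c2))
    return found

print(inconsistencies(merge(ont_a, ont_b)))
```

Neither source ontology reports a problem on its own; only the merged knowledge base does, which is what makes validation of combined ontologies hard.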
Virginia Tech Cognitive Radio Testbed - CORTEKS
Researchers: Joseph Gaeddert, Kyouwoong Kim, Kyung Bae, Lizdabel Morales, and Jeffrey H. Reed
[Testbed diagram: AWG430 arbitrary waveform generator (multi-mode transmitter), AWG710B arbitrary waveform generator (signal upconverter), and RSA3408A real-time spectrum analyzer (receiver & spectrum sensor), each connected over GPIB to a personal computer running MPRG's OSSIE (Open Source SCA Implementation Embedded) platform and the CoRTekS software application, a rudimentary neural-network-based cognitive engine; the overlaid screenshot annotations are shown legibly on the CoRTekS screenshot slide]
70/77
Current Setup (CORTEKS)
[Setup diagram: AWG430 arbitrary waveform generator (multi-mode transmitter), RSA3408A real-time spectrum analyzer (receiver & spectrum sensor), and AWG710B arbitrary waveform generator (signal upconverter), each connected via GPIB to a personal computer running MPRG's OSSIE (Open Source SCA Implementation Embedded) platform and the CoRTekS software application (rudimentary neural-network-based cognitive engine)]
71/77
Current Waveform
Architecture
[Architecture diagram: an RF interface with a wireless microphone acting as interferer; transmitter and receiver hardware (AWG710B, AWG430, RSA3408A) driven by an AWG Component and an RSA Component, which together with a Cognitive Engine Component form the application under an Assembly Controller inside OSSIE (Open Source SCA Implementation Embedded)]
72/77
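The component structure above can be caricatured in a few lines. The class and method names below are invented stand-ins for illustration, not OSSIE's actual SCA API:

```python
# Toy caricature of the SCA-style assembly above: an assembly
# controller wiring a spectrum-sensing component, a cognitive
# engine, and a transmit component. Names/methods are invented.

class RSAComponent:
    def sense(self):
        # Pretend measurement: channels the analyzer reports occupied
        return {"occupied_channels": {1, 3}}

class CognitiveEngineComponent:
    def decide(self, observation, channels=range(4)):
        # Pick the first channel not flagged as occupied
        free = [c for c in channels
                if c not in observation["occupied_channels"]]
        return {"channel": free[0]} if free else None

class AWGComponent:
    def transmit(self, config):
        return f"transmitting on channel {config['channel']}"

class AssemblyController:
    """Connects components and runs one observe-decide-act cycle."""
    def __init__(self):
        self.rsa = RSAComponent()
        self.engine = CognitiveEngineComponent()
        self.awg = AWGComponent()

    def run_once(self):
        decision = self.engine.decide(self.rsa.sense())
        return self.awg.transmit(decision)

print(AssemblyController().run_once())  # transmitting on channel 0
```

The point of the pattern is that the engine never touches hardware directly: the controller mediates between sensing, decision, and actuation components, so any one of them can be swapped out.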
CoRTekS Screenshot
[Screenshot annotations: Policy (4 ch.) with transmit spectral power mask and modulation schemes; Image Display with transmitted and received images; Packet History Display with bit error rate and spectral efficiency; Waveform readouts for center freq., symbol rate, mod. type, and transmit power; Spectrum Display showing available spectrum, detected interference, and current CR spectrum usage]
73/77
CoRTekS Decision Process
[Decision flow: required QoS (acceptable BER, latency, Tx power), available spectrum, and policy feed a neural-network utility function with memory, which selects the waveform parameters: Tx power, frequency, symbol rate, and modulation type]
74/77
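In the same spirit, the utility-function step of the decision process can be sketched as a tiny fixed-weight network that scores candidate settings and keeps the best. The weights, features, and candidate sets below are invented for illustration, not the trained CoRTekS network:

```python
import math

# Sketch of a neural-network utility function: score candidate
# (power, modulation) settings and choose the highest-utility one.
# Weights and feature values are illustrative assumptions.

def utility(features, w_hidden, w_out):
    """Tiny 2-layer network: tanh hidden layer, linear output."""
    hidden = [math.tanh(sum(w * x for w, x in zip(row, features)))
              for row in w_hidden]
    return sum(w * h for w, h in zip(w_out, hidden))

# Features per candidate: (BER margin, spectral efficiency, policy compliance)
candidates = {
    ("low_power", "QPSK"):   (0.9, 0.5, 1.0),
    ("high_power", "64QAM"): (0.2, 1.0, 0.0),  # violates the policy mask
}
w_hidden = [(1.0, 0.5, 2.0), (0.5, 1.0, 1.0)]
w_out = (1.0, 1.0)

best = max(candidates, key=lambda c: utility(candidates[c], w_hidden, w_out))
print(best)  # ('low_power', 'QPSK')
```

Memory enters this picture as the store of past (features, outcome) pairs used to adjust the weights; here the weights are simply frozen to keep the sketch deterministic.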
Demonstration of CoRTekS
75/77
Implementation Summary
• Broad differences in architectural approaches to implementing cognitive radio
– Engines vs. algorithms
– Procedural vs. ontological
• Numerous different techniques available to implement cognitive functionalities
– Some tradeoffs in efficiencies
– Likely need a meta-cognitive radio to find optimal parameters
• Process boundaries are sometimes blurred
– Observation/Orientation
– Orientation/Learning
– Learning/Decision
– Implies need for pooled memory
• Good internal models will be important for success of many processes
• Lots of research going on all over the world; lots of low-hanging fruit
– See DySPAN, CrownCom, SDR Forum, Milcom for papers; upcoming JSACs
• No clue as to how to make a radio conscious or if we even should
76/77