Markedness Optimization in
Grammar and Cognition
Paul Smolensky
Cognitive Science Department
Johns Hopkins University
with:
Elliott Moreton
Karen Arnold
Donald Mathis
Melanie Soderstrom
Géraldine Legendre
Alan Prince
Peter Jusczyk
Suzanne Stevenson
Grammar and Cognition
1. What is the system of knowledge?
2. How does this system of knowledge
arise in the mind/brain?
3. How is this knowledge put to use?
4. What are the physical mechanisms
that serve as the material basis for
this system of knowledge and for the
use of this knowledge?
(Chomsky ‘88; p. 3)
Jakobson’s Program
A Grand Unified Theory for the cognitive science of language is enabled by Markedness: Avoid α
① Structure
• Alternations eliminate α
• Typology: inventories lack α
② Acquisition
• α is acquired late
③ Processing
• α is processed poorly
④ Neural
• Brain damage most easily disrupts α
Formalize through OT?
Advertisement
The complete story, forthcoming (2003) from Blackwell:
The harmonic mind: From neural computation to
optimality-theoretic grammar
Smolensky & Legendre
Overview
Structure | Acquisition | Use | Neural Realization
• Theoretical. OT (Prince & Smolensky ’91, ’93):
– Constructs formal grammars directly from markedness principles
– General formalism/framework for grammars: phonology, syntax, semantics; GB/LFG/…
– Strongly universalist: inherent typology
• Empirical. OT:
– Allows completely formal markedness-based explanation of highly complex data
Structure | Acquisition | Use | Neural Realization
• Theoretical
Formal structure enables OT-general:
– Learning algorithms
• Constraint Demotion: provably correct and efficient (when part of a general decomposition of the grammar learning problem)
– Tesar 1995 et seq.; Tesar & Smolensky 1993, …, 2000
• Gradual Learning Algorithm
– Boersma 1998 et seq.
• Empirical
– Initial state & predictions explored through behavioral experiments with infants
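As an aside, the core of Constraint Demotion can be sketched in a few lines. This is a toy sketch, not Tesar & Smolensky's implementation; the constraint names and the single mark-data pair below are hypothetical illustrations. For each winner/loser pair, shared marks cancel, and the winner's remaining marks are demoted just below the loser's highest-ranked remaining mark, yielding a stratified hierarchy.

```python
# Toy sketch of Constraint Demotion (after Tesar & Smolensky);
# constraint names and mark-data here are hypothetical.

def constraint_demotion(pairs, constraints):
    """pairs: list of (winner_viols, loser_viols) dicts, constraint -> count.
    Returns a stratified hierarchy: list of strata, highest first."""
    stratum = {c: 0 for c in constraints}  # all constraints start on top
    changed = True
    while changed:
        changed = False
        for w, l in pairs:
            # cancel shared marks; keep each side's uncancelled marks
            w_marks = {c for c in constraints if w.get(c, 0) > l.get(c, 0)}
            l_marks = {c for c in constraints if l.get(c, 0) > w.get(c, 0)}
            if not w_marks:
                continue  # winner already beats loser
            if not l_marks:
                raise ValueError("inconsistent mark-data pair")
            top_loser = min(stratum[c] for c in l_marks)  # highest loser mark
            for c in w_marks:  # demote winner marks just below it
                if stratum[c] <= top_loser:
                    stratum[c] = top_loser + 1
                    changed = True
    depth = max(stratum.values()) + 1
    return [[c for c in constraints if stratum[c] == k] for k in range(depth)]

# winner epenthesizes (violates FILL); loser deletes (violates PARSE):
hierarchy = constraint_demotion([({"FILL": 1}, {"PARSE": 1})],
                                ["ONSET", "NOCODA", "PARSE", "FILL"])
# FILL is demoted below PARSE: [['ONSET', 'NOCODA', 'PARSE'], ['FILL']]
```

The demotion-only direction is what makes the correctness proof go through: constraints move down just far enough, so the process converges on a hierarchy consistent with the data.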
Structure | Acquisition | Use | Neural Realization
• Theoretical
– Theorems regarding the computational complexity of algorithms for processing with OT grammars
• Tesar ’94 et seq. • Ellison ’94 • Eisner ’97 et seq. • Frank & Satta ’98 • Karttunen ’98
• Empirical (with Suzanne Stevenson)
– Typical sentence processing theory: heuristic constraints
– OT: an output for every input; enables incremental (word-by-word) processing
– Empirical results concerning human sentence processing difficulties can be explained with OT grammars employing independently motivated syntactic constraints
– The competence theory [OT grammar] is the performance theory [human parsing heuristics]
Structure | Acquisition | Use | Neural Realization
• Theoretical
– OT derives from the theory of abstract (connectionist) neural networks
• via Harmonic Grammar (Legendre, Miyata, Smolensky ’90)
– For moderate complexity, we now have general formalisms for realizing:
• complex symbol structures as distributed patterns of activity over abstract neurons
• structure-sensitive constraints/rules as distributed patterns of strengths of abstract synaptic connections
• optimization of Harmony
• Empirical
– Construction of a miniature, concrete LAD
Program
Structure
• OT
– Constructs formal grammars directly from markedness principles
– Strongly universalist: inherent typology
• OT allows completely formal markedness-based explanation of highly complex data
Acquisition
• Initial state predictions explored through behavioral experiments with infants
Neural Realization
• Construction of a miniature, concrete LAD
The Great Dialectic
Phonological representations serve two masters, locked in conflict:
• MARKEDNESS (phonetic interface, [surface form]): often ‘minimize effort (motoric & cognitive)’; ‘maximize discriminability’ (Phonetics)
• FAITHFULNESS (lexical interface, /underlying form/): recoverability, ‘match this invariant form’ (Lexicon)
OT from Markedness Theory
• MARKEDNESS constraints: *α: No α
• FAITHFULNESS constraints: Fα
– Fα demands that /input/ → [output] leave α unchanged (McCarthy & Prince ’95)
– Fα controls when α is avoided (and how)
• Interaction of violable constraints: Ranking
– α is avoided when *α ≫ Fα
– α is tolerated when Fα ≫ *α
– M1 ≫ M2: combines multiple markedness dimensions
• Typology: All cross-linguistic variation results from differences in ranking – in how the dialectic is resolved (and in how multiple markedness dimensions are combined)
• Harmony = MARKEDNESS + FAITHFULNESS
– A formally viable successor to Minimize Markedness is OT’s Maximize Harmony (among competitors)
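The ranking logic can be made concrete with a toy evaluator (the candidate and constraint names are hypothetical illustrations, not an analysis from the talk): strict domination amounts to lexicographic comparison of violation profiles, so reversing *α and Fα flips which candidate wins.

```python
# Toy OT evaluation: the optimal candidate minimizes violations
# lexicographically under the ranking (strict domination).

def optimal(candidates, ranking, viols):
    """viols[cand][constraint] -> violation count; ranking: high to low."""
    return min(candidates,
               key=lambda c: [viols[c].get(k, 0) for k in ranking])

# hypothetical input containing the marked structure α:
viols = {"faithful-α": {"*α": 1},   # keeps α: violates markedness
         "unfaithful": {"Fα": 1}}   # removes α: violates faithfulness
cands = list(viols)

assert optimal(cands, ["*α", "Fα"], viols) == "unfaithful"   # *α ≫ Fα: α avoided
assert optimal(cands, ["Fα", "*α"], viols) == "faithful-α"   # Fα ≫ *α: α tolerated
```

Lexicographic comparison of per-constraint violation counts is exactly strict domination: no number of violations of a lower-ranked constraint can outweigh one violation of a higher-ranked one.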
 Structure
Explanatory goals achieved by OT
• Individual grammars are literally and
formally constructed directly from
universal markedness principles
• Inherent Typology:
Within the analysis of phenomenon Φ in
language L is inherent a typology of Φ
across all languages
Program
Structure
 OT
• Constructs formal grammars directly from
markedness principles
• Strongly universalist: inherent typology
 OT allows completely formal markedness-based
explanation of highly complex data
Acquisition
Initial state predictions explored through
behavioral experiments with infants
Neural Realization
 Construction of a miniature, concrete LAD
Markedness and Inventories
Theoretical part
• An inventory structured by markedness:
An inventory I is harmonically complete (HC) iff
x ∈ I and y is (strictly) less marked than x implies y ∈ I
• A typology structured by markedness:
A typology T is strongly harmonically complete (SHarC) iff
L ∈ T if and only if L is harmonically complete
(Prince & Smolensky ’93: Ch. 9)
• Are OT inventories harmonically complete?
• Are OT typologies SHarC?
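The HC definition is a simple downward-closure condition and can be checked mechanically. A minimal sketch, assuming the strict "less marked than" relation is given as explicit (y, x) pairs and using a toy obstruent space:

```python
# Sketch: checking harmonic completeness (HC) of an inventory, given the
# strict "y less marked than x" relation as (y, x) pairs (toy data).

def is_harmonically_complete(inventory, less_marked):
    """HC iff: x in I and y strictly less marked than x implies y in I."""
    return all(y in inventory for (y, x) in less_marked if x in inventory)

# toy markedness order over obstruents: t < k, t < s, k < x, s < x
order = [("t", "k"), ("t", "s"), ("k", "x"), ("s", "x")]

assert is_harmonically_complete({"t", "k", "s"}, order)   # English-like: HC
assert not is_harmonically_complete({"t", "x"}, order)    # gap below x: not HC
```

An inventory with x but without the less-marked k fails HC; an inventory lacking only the most-marked x (bans only the worst of the worst) still satisfies it.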
Harmonic Completeness
The English obstruent inventory is HC w.r.t. Place/continuancy:

             −velar   +velar (*[velar])
  −cont        t        k
  +cont        s       *x
 (*[+cont])

… but is not generable by ranking { *[velar], *[+cont]; FPlace, Fcont }
⇒ the inventory Bans Only the Worst Of the Worst (BOWOW)
Local Conjunction
• Crucial to distinguish *x w.r.t. the segment inventory:
– *[taxi]: *[+cont], *[velar] fatal in the same segment
– [saki]: *[+cont], *[velar] in different segments
• Local conjunction: *[+cont] &seg *[velar] is violated when both are violated in the same segment
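The locality of the conjunction is the whole point, and a few lines make it explicit (the feature dictionaries below are toy stand-ins for real segmental representations):

```python
# Sketch of local conjunction: *[+cont] &seg *[velar] incurs a violation
# only when both conjuncts are violated within one segment (toy features).

def viol_cont(seg):             # violates *[+cont]?
    return seg.get("cont") == "+"

def viol_velar(seg):            # violates *[velar]?
    return seg.get("place") == "velar"

def viol_conjoined(segments):
    """Violations of *[+cont] &seg *[velar], local to the segment."""
    return sum(1 for s in segments if viol_cont(s) and viol_velar(s))

taxi = [{"cont": "-"}, {}, {"cont": "+", "place": "velar"}, {}]  # t a x i
saki = [{"cont": "+"}, {}, {"cont": "-", "place": "velar"}, {}]  # s a k i

assert viol_conjoined(taxi) == 1   # [x]: both in one segment -> fatal
assert viol_conjoined(saki) == 0   # spread over [s] and [k] -> tolerated
```

Both words violate *[+cont] and *[velar] once each; only [taxi] violates the conjunction, which is what lets a grammar ban x while keeping s and k.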
Basic Inventories/Typologies
• Formal analysis of HC/SHarC in OT: definitions
• Basic inventory I[Φ] of elements of type T, where Φ = {φk}
– Candidates: {X} = { [φ1, φ2, φ3, φ4, …] }
– Con: MARK = { *[+φ1], *[+φ2], … }; FAITH = { F[φ1], F[φ2], … }
– I[Φ]: a ranking of Con
• Basic typology T[Φ]: all rankings of Con
• Basic typology with Local Conjunction, TLC[Φ]: all rankings of ConLC = Con + all conjunctions of constraints in MARK, local to T
SHarC Theorem
• T[Φ]: each language is HC; the SHarC property does not hold
• TLC[Φ]: each language is HC; the SHarC property holds
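For a two-feature case, the first half of the theorem can be checked by brute force. This is a minimal sketch under toy assumptions (elements are pairs (a, b) with 1 the marked value; MARK = {M1, M2}, FAITH = {F1, F2}): every ranking yields an HC inventory, yet the HC "bans only the worst of the worst" inventory is never generated, so SHarC fails for T[Φ].

```python
# Brute-force check of a two-feature basic typology T[Phi].
from itertools import permutations, product

CON = ["M1", "M2", "F1", "F2"]
SPACE = list(product([0, 1], repeat=2))      # 1 = marked feature value

def viols(inp, out):
    return {"M1": out[0], "M2": out[1],
            "F1": int(inp[0] != out[0]), "F2": int(inp[1] != out[1])}

def inventory(ranking):
    """Richness of the base: the optimal output for every possible input."""
    return {min(SPACE, key=lambda o: [viols(i, o)[c] for c in ranking])
            for i in SPACE}

def is_hc(inv):
    """Downward closed: fewer marked values componentwise stays in."""
    return all(y in inv for x in inv for y in SPACE
               if y != x and y[0] <= x[0] and y[1] <= x[1])

typology = {frozenset(inventory(r)) for r in permutations(CON)}
assert all(is_hc(inv) for inv in typology)          # each language is HC
bowow = frozenset({(0, 0), (1, 0), (0, 1)})         # HC: bans only (1, 1)
assert bowow not in typology                        # ...yet ungenerable
```

The failure is exactly the BOWOW gap: admitting (1, 0) requires F1 ≫ M1 and admitting (0, 1) requires F2 ≫ M2, and together those rankings also preserve the worst element (1, 1). Adding local conjunctions of MARK is what closes this gap in TLC[Φ].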
Empirical Relevance
Empirical part
• Local conjunction has seen many empirical applications; here, vowel harmony
• Lango (Nilotic, Uganda) ATR harmony
– Woock & Noonan ’79
– Archangeli & Pulleyblank ’91 et seq., esp. ’94
• Markedness constraints:
– *[+ATR, −hi/−fr] (articulatorily marked)
– *[−ATR, +hi/+fr] (articulatorily marked)
– *[+A]/σclosed
– HD-L[ATR]
Rather than imposing a parametric superstructure on spreading rules (A&P ’94), we build the grammar directly from these markedness constraints.
Lango ATR Harmony
• Inventory of ATR domains D[ATR] (~ tiers)
• Vowel harmony renders many possibilities ungrammatical
‘your-SG/PL stew’:
/dɛ̀k + Cí/ → [dèkkí], not *[dɛ̀kkí], *[dɛ̀kkɪ́]
/dɛ̀k + wú/ → [dɛ̀kwú], not *[dèkwú], *[dɛ̀kwʊ́]
Critical difference: i [+fr] vs. u [−fr]: [−fr] is a ‘worse’ source for [+ATR] spread, violating *[+ATR, −fr] (marked w.r.t. ATR)
• Complex system: interaction of 6 dimensions (2⁶ = 64 distinct environments)
V ATR
(P oten tial: S ou rce of  A T R ; T arg et of  A T R )
Lango
/ V  A T R (C ).C V  A T R /
hi
A T R -d o m a in
(P otential:  A T R Source;  A T R T arg et)
V  ATR
in v en to ry
 fr
C
 fr
 fr
C .C
 fr
/ V  A T R (C ).C V  A T R /
hi
hi
hi
 fr
 fr
 fr
 fr
 fr
 fr
 fr
 fr
i_
u _
e _
o / _
_ i
_ u
_ e _ o/
hi

[i° i] [u ° i] [e° i] [o ° i] [ i i°] [i u °]
e
o
hi

[i° e] [u ° e] [e° e] [o ° e] [e i°] [e u °]  e
o
hi

[i° u ] [u ° u ] [e° u ] [o ° u ] [u i°] [u u °] [° ] [° ]
 h i / a [i° o ] [u ° o ] [e° o ] [o ° o ] [o i°] [o u °] [° ] [° ]
hi

[i° i] [u ° i]
hi

[i° e] [u ° e] e  o  [e i°]
hi

[i° u ] [u ° u ] e 
e  o  [i i°] [i u °]
u
e
o
e
o
o  [u i°] [u u °] [° ] [° ]
 h i / a [i° o ] [u ° o ] e  o  [o i°]
 u [° ] [° ]
K ey : / V  V  / [ °  ]   [  °] / V  V  / [  °]   [ °  ]
A TR
 A TR
 + A TR
A TR
V ATR
(P oten tial: S ou rce of  A T R ; T arg et of  A T R )
Lango
/ V  A T R (C ).C V  A T R /
hi
A T R -d o m a in
(P otential:  A T R Source;  A T R T arg et)
V  ATR
in v en to ry
 fr
C
 fr
 fr
C .C
 fr
/ V  A T R (C ).C V  A T R /
hi
hi
hi
 fr
 fr
 fr
 fr
 fr
 fr
 fr
 fr
i_
u _
e _
o / _
_ i
_ u
_ e _ o/
hi

[i° i] [u ° i] [e° i] [o ° i] [ i i°] [i u °]
e
o
hi

[i° e] [u ° e] [e° e] [o ° e] [e i°] [e u °]  e
o
hi
d̀
k +Cí  dèkkí
 [i° u ] [u ° u ] [e° u ] [o ° u ]
[u i°] [u u °] [° ] [° ]
 h i / a [i° o ] [u ° o ] [e° o ] [o ° o ] [o i°] [o u °] [° ] [° ]
hi

[i° i] [u ° i]
hi

[i° e] [u ° e] e  o  [e i°]
hi

[i° u ] [u ° u ] e 
e  o  [i i°] [i u °]
u
e
o
e
o
o  [u i°] [u u °] [° ] [° ]
 h i / a [i° o ] [u ° o ] e  o  [o i°]
 u [° ] [° ]
K ey : / V  V  / [ °  ]   [  °] / V  V  / [  °]   [ °  ]
A TR
 A TR
 + A TR
A TR
V ATR
(P oten tial: S ou rce of  A T R ; T arg et of  A T R )
Lango
/ V  A T R (C ).C V  A T R /
hi
A T R -d o m a in
(P otential:  A T R Source;  A T R T arg et)
V  ATR
in v en to ry
 fr
C
 fr
 fr
C .C
 fr
hi

hi

hi

/ V  A T R (C ).C V  A T R /
hi
hi
hi
 fr
 fr
 fr
 fr
 fr
 fr
 fr
i_
u _
e _
o / _
_ i
_ u
_ e _ o/
[i° i] [u ° i] [e° i] [o ° i] [ i i°] [i u °]
 fr
e
o
[i° e] [u ° e] [e° e] [o ° e] [e i°] [e u °]  e
o
d̀k+wú 
d̀
wú
[i° u ] [u
° u ]k
[e°
u ] [o ° u ]
[u i°] [u u °] [° ] [° ]
 h i / a [i° o ] [u ° o ] [e° o ] [o ° o ] [o i°] [o u °] [° ] [° ]
hi

[i° i] [u ° i]
hi

[i° e] [u ° e] e  o  [e i°]
hi

[i° u ] [u ° u ] e 
e  o  [i i°] [i u °]
u
e
o
e
o
o  [u i°] [u u °] [° ] [° ]
 h i / a [i° o ] [u ° o ] e  o  [o i°]
 u [° ] [° ]
K ey : / V  V  / [ °  ]   [  °] / V  V  / [  °]   [ °  ]
A TR
 A TR
 + A TR
A TR
V ATR
(P oten tial: S ou rce of  A T R ; T arg et of  A T R )
Lango
/ V  A T R (C ).C V  A T R /
hi
A T R -d o m a in
(P otential:  A T R Source;  A T R T arg et)
V  ATR
in v en to ry
 fr
C
 fr
 fr
C .C
 fr
/ V  A T R (C ).C V  A T R /
hi
hi
hi
 fr
 fr
 fr
 fr
 fr
 fr
 fr
 fr
i_
u _
e _
o / _
_ i
_ u
_ e _ o/
hi

[i° i] [u ° i] [e° i] [o ° i] [ i i°] [i u °]
e
o
hi

[i° e] [u ° e] [e° e] [o ° e] [e i°] [e u °]  e
o
hi

[i° u ] [u ° u ] [e° u ] [o ° u ] [u i°] [u u °] [° ] [° ]
 h i / a [i° o ] [u ° o ] [e° o ] [o ° o ] [o i°] [o u °] [° ] [° ]
hi

[i° i] [u ° i]
hi

[i° e] [u ° e] e  o  [e i°]
hi

[i° u ] [u ° u ] e 
e  o  [i i°] [i u °]
u
e
o
e
o
o  [u i°] [u u °] [° ] [° ]
 h i / a [i° o ] [u ° o ] e  o  [o i°]
 u [° ] [° ]
K ey : / V  V  / [ °  ]   [  °] / V  V  / [  °]   [ °  ]
A TR
 A TR
 + A TR
A TR
The Challenge
Need a grammatical framework able to
handle this nightmarish descriptive
complexity
while staying strictly within the confines
of rigidly universal principles
Lango rules (Archangeli & Pulleyblank ’94)
[Diagram: the rule-based account: [+ATR] spreading rules α, β; [−ATR] spreading rules a, b, c; plus rule x; each stipulating its environment (V C V vs. V (C)C V) and conditions on [hi] and [fr].]
V ATR
(P oten tial: S ou rce of  A T R ; T arg et of  A T R )
Lango
/ V  A T R (C ).C V  A T R /
hi
R u le -b a sed
(P oten tial:  A T R S ou rce;  A T R T arg et)
V  ATR
a cco u n t
 fr
C
 fr
 fr
C .C
 fr
/ V  A T R (C ).C V  A T R /
hi
hi
hi
 fr
 fr
 fr
 fr
 fr
 fr
 fr
 fr
i_
u _
e _
o / _
_ i
_ u
_ e _ o/
hi

α β
α β
α
α
a b c
a b
hi

α β
α β
α
α
a
a
hi

α β
α β
α
α
a b c
a b
x
x
α β
α β
α
α
a
a
x
x
x
x
x
x
 h i / a
c
c
hi

β
β
b c
hi

β
β
c
hi

β
β
b c
β
β
c
 h i / a
b
b
K ey : / V  V  / [ °  ]   [  °] / V  V  / [  °]   [ °  ]
A TR
 A TR
 + A TR
A TR
Markedness of ATR domains
[Chart: for each of the 64 environments, whether markedness favors a +ATR or a −ATR domain.]
BOWOW
The obstruent pattern again: t, k, s are in; *x, the worst of the worst, is out: *[+cont] &seg *[velar].
Lango conjunctions:
– *[+A]/σclosed &D *[−hi, +A]/HD[A]: “No [+ATR] spread into a closed syllable from a [−hi] source”
– *[−hi, +A] &D HD-L[A]: “No regressive [+ATR] spread from a [−hi] source”
Labels in the chart: X, Y, Z: *[−A] conjunctions; 1, 2, 3: *[+A] conjunctions
V ATR
(P oten tial: S ou rce of  A T R ; T arg et of  A T R )
/ V  A T R (C ).C V  A T R /
(P oten tial:  A T R S ou rce;  A T R T arg et)
V  ATR
≫ AGREE
≫ F[A]
hi
 fr
C
 fr
 fr
C .C
 fr
/ V  A T R (C ).C V  A T R /
hi
hi
 fr
 fr
 fr
 fr
 fr
hi
X Y Z
X Y Z
X Y
X Y
Y Z
hi
X Y Z
X Y Z
X Y
X Y
hi
X
Z
X
Z
X
hi
X
Z
X
Z
X
hi
X Y Z
X Y Z
X Y
hi
X Y Z
X Y Z
X Y
hi
X
Z
X
Z
X
hi
X
Z
X
Z
X
1
1
1
1
hi
 fr
 fr
Y Z
Y
Y
Y Z
Y Z
Y
X
Z
Z
X
Z
Z
Y Z
Y Z
Y Z
Y Z
Z
Z
Z
Z
X Y
X Y
X
X
1
1
1
1
 fr
Y
3
3
Y
2
2
Y
2
2
2
2
2
2
1 2
1 2
Y
Y
1 2
1 2 3
1 2
1 2
1 2
1 2 3
K ey : / V  V  / [ °  ]   [  °] / V  V  / [  °]   [ °  ]
A TR
 A TR
 +A TR
A TR
The Challenge
Need a grammatical framework able to
handle this nightmarish descriptive
complexity
while staying strictly within the confines
of rigidly universal principles
Inherent Typology
• Method applicable to related African
languages, where the same markedness
constraints govern the inventory
(Archangeli & Pulleyblank ’94), but with
different interactions: different rankings
and active conjunctions
• Part of a larger typology including a
range of vowel harmony systems
Structure: Summary
• OT builds formal grammars directly from
markedness: MARK, with FAITH
• Inventories consistent with markedness
relations are formally the result of OT with
local conjunction: TLC[Φ], SHarC theorem
• Even highly complex patterns can be
explained purely with simple markedness
constraints: all complexity is in constraints’
interaction through ranking and conjunction:
Lango ATR harmony
Program
Structure
• OT
– Constructs formal grammars directly from markedness principles
– Strongly universalist: inherent typology
• OT allows completely formal markedness-based explanation of highly complex data
Acquisition
• Initial state predictions explored through behavioral experiments with infants
Neural Realization
• Construction of a miniature, concrete LAD
The Initial State
OT-general: MARKEDNESS ≫ FAITHFULNESS
• Learnability demands it (Richness of the Base) (Alan Prince, p.c., ’93; Smolensky ’96a)
• Child production: restricted to the unmarked
• Child comprehension: not so restricted (Smolensky ’96b)
Experimental Exploration of the Initial State
Collaborators:
• Peter Jusczyk, Theresa Allocco (Language Acquisition, 2002)
• Karen Arnold, Elliott Moreton (in progress)
Grammar at 4.5 months?
Experimental Paradigm
• Headturn Preference Procedure (Kemler Nelson et al. ’95; Jusczyk ’97)
• X/Y/XY paradigm (P. Jusczyk):
un...b...umb (*FNP) vs. um...b...umb
iŋ...gu...iŋgu
um...b...iŋgu vs. iŋ...gu...umb
• Highly general paradigm. Main result: p = .006, ∃FAITH
Linking Hypothesis
• Experimental results challenging to explain
• Suppose stimuli A and B differ w.r.t. φ.
Child: MARK[φ] ≫ FAITH[φ] (‘M ≫ F’). Then:
• If A is consistent with M ≫ F and
B is consistent with F ≫ M
then ‘prefer’ (attend longer to) A: ‘A > B’
• MARK[φ] = Nasal Place Agreement
Experimental Results
If A is consistent with M ≫ F and
B is consistent with F ≫ M
then ‘prefer’ (attend longer to) A: ‘A > B’
Stimulus       M ≫ F?     F ≫ M?
m+b → mb       yes (+)    yes (−)
n+b → mb       yes        no        (A)
n+b → nb       no         yes       (B)
n+b → nd       no         no

Results:
• A > B: n+b → mb preferred over n+b → nb (p < .05): ∃MARK
• nb vs. mb comparisons (p < .001): M ≫ F
• Controls: the n → m change in /n+b/ is detectable; null comparisons (p > .30, .40) rule out stimulus unreliability
Program
Structure
• OT
– Constructs formal grammars directly from markedness principles
– Strongly universalist: inherent typology
• OT allows completely formal markedness-based explanation of highly complex data
Acquisition
• Initial state predictions explored through behavioral experiments with infants
Neural Realization
• Construction of a miniature, concrete LAD
A LAD for OT
Acquisition:
• Hypothesis: Universals are genetically encoded; learning is search among UG-permitted grammars.
• Question: Is this even possible?
• Collaborators:
Melanie Soderstrom Donald Mathis
UGenomics
• The game: Take a first shot at a concrete
example of a genetic encoding of UG in a
Language Acquisition Device
¿ Proteins ⇝ Universal grammatical principles ?
Time to willingly suspend disbelief …
UGenomics
• The game: Take a first shot at a concrete
example of a genetic encoding of UG in a
Language Acquisition Device
¿ Proteins ⇝ Universal grammatical principles ?
• Case study: Basic CV Syllable Theory (Prince
& Smolensky ’93)
• Innovation: Introduce a new level, an
‘abstract genome’ notion parallel to [and
encoding] ‘abstract neural network’
UGenome for CV Theory
• Three levels
– Abstract symbolic: Basic CV Theory
– Abstract neural:
CVNet
– Abstract genomic: CVGenome
UGenomics: Symbolic Level
• Three levels
– Abstract symbolic: Basic CV Theory
– Abstract neural:
CVNet
– Abstract genomic: CVGenome
Basic syllabification: Function
• Basic CV Syllable Structure Theory
– ‘Basic’ — no more than one segment per syllable position: .(C)V(C).
• ƒ: /underlying form/ → [surface form]
/CVCC/ → [.CV.CVC.] (medial V epenthetic); cf. /pæd+d/ → [pædəd]
• Correspondence Theory
– McCarthy & Prince 1995 (‘M&P’)
• /C1V2C3C4/ → [.C1V2.C3VC4.] (V epenthetic)
Syllabification: Constraints (Con)
• PARSE: Every element in the input
corresponds to an element in the output
• ONSET: No V without a preceding C
• etc.
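A toy tableau ties these constraints together; the candidate set, violation counts, and the ranking ONSET ≫ PARSE ≫ FILL are illustrative assumptions, not the talk's analysis. Here 'v' marks an epenthetic nucleus and angle brackets an unparsed segment:

```python
# Toy evaluation of /CVCC/ under Basic CV Theory (hypothetical ranking).
tableau = {
    "[.CV.CvC.]":  {"FILL": 1},                          # epenthesize
    "[.CV.C<C>.]": {"PARSE": 1},                         # underparse final C
    "[.CV.vC.]":   {"ONSET": 1, "PARSE": 1, "FILL": 1},  # onsetless syllable
}
ranking = ["ONSET", "PARSE", "FILL"]

def eval_profile(cand):
    # violation profile ordered by ranking; lexicographic = strict domination
    return [tableau[cand].get(c, 0) for c in ranking]

winner = min(tableau, key=eval_profile)
assert winner == "[.CV.CvC.]"   # epenthesis wins: /CVCC/ -> [.CV.CVC.]
```

With FILL ranked lowest, epenthesis beats deletion, matching the /CVCC/ → [.CV.CVC.] mapping on the earlier slide; demoting PARSE below FILL would instead select the underparsing candidate.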
UGenomics: Neural Level
• Three levels
– Abstract symbolic: Basic CV Theory
– Abstract neural:
CVNet
– Abstract genomic: CVGenome
CVNet Architecture
/C1C2/ → [.C1VC2.]
[Network diagram: input units for /C1, C2/; output units for [C1, V, C2]; correspondence units ‘1’, ‘2’ linking input segments to their output correspondents.]
Connections: PARSE
• All connection coefficients are +2
[Weight diagram: C and V input, output, and correspondence units with the PARSE connection pattern.]
Connections: ONSET
• All connection coefficients are −1
[Weight diagram: C and V units with the ONSET connection pattern.]
CVNet Dynamics
• Boltzmann machine/Harmony network
– Hinton & Sejnowski ’83 et seq.; Smolensky ’83 et seq.
– Stochastic activation-spreading algorithm: higher Harmony → more probable
– CVNet innovation: connections realize fixed symbol-level constraints with variable strengths
– Learning: modification of the Boltzmann machine algorithm to the new architecture
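The settling dynamics can be sketched in miniature (a toy illustration, not CVNet itself; the 3-unit weight matrix is an assumption for demonstration): units flip stochastically with Boltzmann probabilities while temperature anneals, so higher-Harmony states become ever more probable.

```python
# Minimal sketch of stochastic Harmony maximization (Boltzmann-style).
import math
import random

def harmony(a, W):
    """H(a) = 1/2 * sum_ij W[i][j] a_i a_j (symmetric W, zero diagonal)."""
    n = len(a)
    return 0.5 * sum(W[i][j] * a[i] * a[j] for i in range(n) for j in range(n))

def settle(W, steps=2000, seed=0):
    """Anneal: update random units with Boltzmann probabilities as T drops."""
    rng = random.Random(seed)
    n = len(W)
    a = [rng.choice([0, 1]) for _ in range(n)]
    for t in range(steps):
        T = max(0.05, 2.0 * (1 - t / steps))          # cooling schedule
        i = rng.randrange(n)
        gain = sum(W[i][j] * a[j] for j in range(n))  # Harmony gain if a_i = 1
        a[i] = 1 if rng.random() < 1 / (1 + math.exp(-gain / T)) else 0
    return a

# two mutually supporting units, each inhibiting a third:
W = [[0, 2, -1],
     [2, 0, -1],
     [-1, -1, 0]]
finals = {tuple(settle(W, seed=s)) for s in range(10)}
assert (1, 1, 0) in finals   # the maximum-Harmony state (H = 2) is found
```

The stochastic rule is what lets the network escape local Harmony maxima early on; as T falls, the dynamics become nearly deterministic and the global optimum dominates.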
UGenomics: Genome Level
• Three levels
– Abstract symbolic: Basic CV Theory
– Abstract neural:
CVNet
– Abstract genomic: CVGenome
Connectivity geometry
• Assume 3-d grid geometry (e.g., gradients)
[Diagram: C and V unit layers on a 3-d grid with directions ‘N’, ‘E’, ‘back’.]
Connectivity: PARSE
• Input units grow south and connect
• Output units grow east and connect
• Correspondence units grow north & west
and connect with input & output units.
Connectivity: ONSET
• VO segment: N&S S VO; N S x0
• x0 segment: S S VO
[Diagram: C and V units.]
Connectivity Genome
• Contributions from ONSET and PARSE:

  Source   Projections
  CI       S L CC
  VI       S L VC
  CO       E L CC
  VO       E L VC; N&S S VO
  CC       N L CI; W L CO
  VC       N L VI; W L VO
  xo       S S VO; N S x0

Key: Direction: N(orth), S(outh), E(ast), W(est), F(ront), B(ack). Extent: L(ong), S(hort). Target: Input: CI, VI; Output: CO, VO, x(0); Corr: VC, CC.
CVGenome: Connectivity
[Abstract gene map: general developmental machinery; connectivity genes for C-I, V-I, C-C (direction, extent, target, e.g. S L CC, S L VC, F S VC); CORRESPOND genes (N/E L CC&VC, S/W L CC&VC); RESPOND genes (CO&V&x); constraint-coefficient genes (CC ↔ CI&CO: 1; VC ↔ VI&VO: 1).]
CVGenome:
Connection Coefficients
UGenomics
• Realization of processing and learning
algorithms in ‘abstract molecular
biology’, using the types of interactions
known to be biologically possible and
genetically encodable
UGenomics
• Host of questions to address
– Will this really work?
– Can it be generalized to distributed nets?
– Is the number of genes [77 = 0.26%] plausible?
– Are the mechanisms truly biologically plausible?
– Is it evolvable?
– How is strict domination to be handled?
Hopeful Conclusion
• Progress is possible toward a Grand Unified
Theory of the cognitive science of language
– addressing the structure, acquisition, use, and
neural realization of knowledge of language
– strongly governed by universal grammar
– with markedness as the unifying principle
– as formalized in Optimality Theory at the
symbolic level
– and realized via Harmony Theory in abstract
neural nets which are potentially encodable
genetically
Hopeful Conclusion
• Progress is possible toward a Grand Unified
Theory of the cognitive science of language
Still lots of promissory notes, but
all in a common currency —
Harmony ≈ unmarkedness; hopefully
this will promote further
progress by facilitating integration of
the sub-disciplines of cognitive science
Thank you for your attention (and
indulgence)
The Harmonic Mind - Cognitive Science at Johns Hopkins