Front | Back |
What are three major concerns regarding the connectionist model of learning past-tense verbs?
|
1) The network didn't learn the past tense
2) The U-shape was built into the training data
3) The network can still learn rules people never learn
|
What are some features of natural neural networks as opposed to artificial neural networks?
|
The brain is an electro-chemical machine; single neurons do not carry local representations; synapses are one-way; a neuron's connections are either excitatory or inhibitory, not a mixture; and natural neural networks are considerably larger than current ANNs.
|
What is the key to connectionism?
|
Continuous distributed representations
|
What are two challenges associated with connectionism?
|
Showing that neural networks can reproduce the kinds of phenomena that inspired symbolic accounts of cognition
Showing that neural networks can learn from small amounts of data
|
What are the differences between the symbolic paradigm and the connectionist paradigm?
|
Computationally: symbol manipulation vs. minimization of error in continuous systems
Algorithmically: processes of symbol manipulation vs. error-driven learning
Implementationally: neurons as switches vs. neurons as neurons
|
What was John McCarthy's main goal regarding AI?
|
To replicate human intelligence in a machine
|
What is backpropagation, and what algorithm does it use?
|
A form of supervised learning used in feed-forward networks. The error is propagated backward from the output nodes to the inner nodes. Algorithm: gradient descent, i.e., move each weight (w) in the direction in which the error (E) decreases.
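The gradient-descent update can be sketched for the simplest possible case, a single linear neuron with one weight (the function name, data, and learning rate here are illustrative, not from the card):

```python
# Minimal sketch of gradient descent on one weight of a linear neuron.
# Error: E = 0.5 * (target - w * x) ** 2
# Update: w <- w - lr * dE/dw, stepping w in the direction that reduces E.

def train(x, target, w=0.0, lr=0.1, steps=100):
    for _ in range(steps):
        output = w * x
        error = output - target
        grad = error * x          # dE/dw for the squared-error loss
        w -= lr * grad            # step against the gradient
    return w

w = train(x=2.0, target=6.0)
print(round(w, 3))  # converges toward 3.0, since 3.0 * 2.0 == 6.0
```

In a full backpropagation network the same rule is applied layer by layer, with each inner node's error computed from the errors of the nodes it feeds.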
|
How does Hebbian Learning result in the formation of cell-assemblies?
|
Hebbian Learning = Neurons that fire together wire together. Unsupervised Learning.
Repeated stimulation of specific receptors leads slowly to the formation of cell-assemblies.
Hebb: “When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A’s efficiency, as one of the cells firing B, is increased.”
If both source and target neurons are active at the same time, weight increased. Otherwise, weight decreased.
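The rule in the last line can be sketched as a single weight update (the function name, learning rate, and starting weight are illustrative, not from the card):

```python
# Illustrative Hebbian weight update: if the pre- and post-synaptic
# neurons are active together, strengthen the connection; otherwise
# weaken it. Unsupervised: no error signal is involved.

def hebbian_update(w, pre_active, post_active, lr=0.1):
    if pre_active and post_active:
        return w + lr   # fire together -> wire together
    return w - lr       # otherwise, the weight decays

w = 0.5
w = hebbian_update(w, pre_active=True, post_active=True)   # strengthened
w = hebbian_update(w, pre_active=True, post_active=False)  # weakened back
print(round(w, 2))
```

Repeated co-activation drives groups of weights upward together, which is how cell-assemblies form.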
|
What is graceful degradation?
|
The property that enables a system to continue operating properly in the event of the failure of some of its components.
|
Give an example of a Hopfield network.
|
A network that converges to a remembered state if it is given part of the state.
Example: reconstruction of a corrupted pattern. Input = partial letter A; output = letter A.
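A toy version of this recall process can be sketched in a few lines (the pattern, network size, and update count are illustrative; one pattern is stored with the Hebbian outer-product rule):

```python
# Toy Hopfield-style recall: store one +/-1 pattern, then recover it
# from a corrupted copy by repeatedly updating every unit.

def recall(weights, state, steps=5):
    n = len(state)
    for _ in range(steps):
        state = [1 if sum(weights[i][j] * state[j] for j in range(n)) >= 0
                 else -1
                 for i in range(n)]
    return state

stored = [1, -1, 1, -1, 1, -1]                    # the "remembered" state
n = len(stored)
# Hebbian outer-product weights, with no self-connections
weights = [[0 if i == j else stored[i] * stored[j] for j in range(n)]
           for i in range(n)]

corrupted = [1, -1, 1, 1, 1, 1]                   # partially wrong input
print(recall(weights, corrupted) == stored)       # True: state restored
```

The network converges because each update moves the state toward a stored attractor, which is exactly the "converges to a remembered state" behavior the card describes.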
|
Describe how scheduling is an example of Parallel Constraint Satisfaction?
|
PCS: Cognitive tasks require the simultaneous satisfaction of multiple constraints. In scheduling, a workable plan must satisfy constraints on times, places, and participants all at once; no single constraint can be solved in isolation.
Another example: the Necker cube. It can look like the right side is popping out, or the left side is popping out, but not both.
|
What is inflectional morphology?
|
Mapping word forms to word forms (e.g., walk → walked); no explicit rule representation is needed.
|
What is a Turing Test?
|
A test in which a human engages in a natural conversation and tries to tell whether the other party is a human or a machine.
|
What are Searle's arguments RE The Chinese Room?
|
That no physical symbol system can have a mind merely by manipulating symbols, and that there are causal properties of brains that give rise to minds.
|
What are four replies to Searle's Chinese Room Argument?
|
The Systems Reply: the system as a whole is what "understands" Chinese
Speed, power, and complexity: the man in the room would take billions of years to complete the task
The Robot Reply: the Chinese Room needs eyes and hands; the man trapped inside the room is not a mind
The Brain Simulator Reply: what if the program simulates the sequence of nerve firings at the synapses of the actual brain of an actual Chinese speaker?
|