Term
Three Principles of Cognitive Science |
|
Definition
1. Interdisciplinary study of the mind-brain 2. Mental processes are computational and typically unconscious 3. Mental capacities are modularly implemented, and many of them are innate |
|
|
Term
Thinking is information processing in that it involves... |
|
Definition
Transformation: blotchy dog illusion
Selection: Enhancement (3d cube that can be flipped) Reduction (filtering out repeated words)
Elaboration: laundry story w/ & w/o context
Storage: remembering lists of semantically related words causes misremembering because info leaves traces. |
|
|
Term
Wason Selection Task with Conditionals |
|
Definition
We are good with deontic conditionals and bad with abstract ones. |
|
|
Term
Behaviorism's critiques of intentional explanations of behavior |
|
Definition
1. mental phenomena are not directly observable (and thus not verifiable)
2. Intentional explanations are circular
3. Intentional explanations are homuncular. |
|
|
Term
|
Definition
Operationalization of Psychological Phenomena-- attention, e.g., is operationalized as responding to a stimulus.
Learning is Conditioning-- Pavlovian/classical involves conditioned and unconditioned responses (CR and UR), operant/instrumental involves reinforcers, rewards, and punishments.
Explanation consists of control and prediction. |
|
|
Term
|
Definition
Went against behaviorism.
One showed latent learning, in which rats that had explored a maze unrewarded improved rapidly once rewards were introduced. The other showed place learning, in which rats take whichever path leads most directly to the reward.
Concepts of mental/cognitive maps and latent learning. |
|
|
Term
Lashley's criticisms to associative chain theories |
|
Definition
Behaviorists explain complex behavior such as language as associative chains-- steps, each triggered by the one before.
Critiques: --Goal dependence: the role a unit of behavior plays is determined by the overall goal of the sequence. The structure of the behavior is also controlled by the goal.
--Anticipations: future behavior can influence actual behavior, even though the future behavior hasn't happened yet. Behavior is represented before it happens. |
|
|
Term
Lashley's Planning Theory |
|
Definition
Behavioral sequences are planned and organized in advance. This plan is hierarchical, starting with linguistic/discourse pragmatics and moving on to lexical/syntactical semantics, and lastly phonemes. |
|
|
Term
|
Definition
A procedure that --can be specified with a finite number of steps --Can be unambiguously followed by a human or mechanical computer --Will always yield an output for any input for which the function is defined. |
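As a minimal sketch (Python chosen only for illustration), Euclid's gcd procedure meets all three conditions: it is specified in finitely many unambiguous steps, and it always halts with an output for any pair of positive integers.

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: a finite, unambiguous, always-terminating procedure."""
    while b != 0:
        a, b = b, a % b  # each pass strictly shrinks b, so the loop must halt
    return a
```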
|
|
Term
|
Definition
Define what a computational process is. Give us a way of analyzing intelligent processes into unintelligent ones. |
|
|
Term
|
Definition
The effectively calculable functions are exactly the functions that can be computed by a Turing machine. Anything that can be done with an algorithm can be done with a Turing machine. |
|
|
Term
|
Definition
Chomsky believed that speaker competence can be explained by an algorithmic set of rules tacitly known by the speaker. It is an algorithm for specifying and generating all and only the grammatical sentences in a language. |
|
|
Term
|
Definition
when applied, make deep structures. |
|
|
Term
|
Definition
Map phrase markers to phrase markers, make surface structures. |
|
|
Term
|
Definition
How a sentence is built up from basic constituents according to basic rules. |
|
|
Term
|
Definition
the actual organization of words in a sentence, derived from the deep structure according to the principles of transformational grammar. |
|
|
Term
|
Definition
Idealized performance-- all the utterances a speaker could make and understand if there were no time, memory, etc. constraints. |
|
|
Term
|
Definition
Actual behavior-- the utterances that speakers actually make. Evidence of speaker competence. |
|
|
Term
|
Definition
Information is the reduction of uncertainty. |
|
|
Term
|
Definition
The amount of information that an information channel can reliably transmit. For human perception of unidimensional stimuli, capacity is roughly constant in bits; in working memory, it is roughly constant in chunks. |
|
|
Term
|
Definition
The quantity of information needed to distinguish between two incompatible states of affairs. |
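Put arithmetically: distinguishing among n equally likely alternatives takes log2(n) bits, so two incompatible states of affairs cost exactly one bit. A small illustrative helper (the name `bits` is my own):

```python
import math

def bits(n_alternatives: int) -> float:
    """Bits needed to single out one of n equally likely alternatives."""
    return math.log2(n_alternatives)
```

So `bits(2)` is 1.0 (one yes/no question) and `bits(8)` is 3.0 (three yes/no questions).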
|
|
Term
|
Definition
recoding information into meaningful units. |
|
|
Term
Cherry's Dichotic Listening Task |
|
Definition
Participants shadow the right ear, ignore the left ear, then recall. People are good at detecting speech vs. non-speech, okay with male vs. female speech, thought something was off with reversed speech, and poor at noticing the words, ideas, and language of the unattended message. |
|
|
Term
|
Definition
The allocation of limited capacity processing resources by filtering out irrelevant information. Shows cocktail party phenomenon
When subjects are asked to report digits, they usually do it by ear and have trouble doing it in order of arrival. |
|
|
Term
Broadbent's Early Selection Model and its downfalls. |
|
Definition
Selective attention is accounted for by 1) the flow of information and 2) the locus of selection. Information comes through the senses and passes through a short-term store before reaching a selective filter. The selective filter screens out a large portion of incoming information, selecting some of it for further processing.
Issues: --subjects' names pop out --sometimes shadowed message shifts to unattended ear --There is processing without awareness. When words in unattended ear are paired with shocks and subjects are asked to report target words, they can't but there are skin responses. |
|
|
Term
The Imagery Debate: Is imagery epiphenomenal? Why not? |
|
Definition
1. There are quasi-perceptual experiences of images. 2. These images facilitate problem solving. 3. Information processing: there are certain representations that give rise to these experiences. Are these representations images themselves? |
|
|
Term
Propositional/Digital Representations |
|
Definition
Discrete (can be broken into pieces). Arbitrary relation between symbol and referent, e.g. binary digits or words. |
|
|
Term
Imagistic/Analog Representation |
|
Definition
Representations are continuous. Systematic correlations between properties of symbol and properties of event. e.g. maps, stick figures. |
|
|
Term
Shepard and Metzler Rotation Study |
|
Definition
Mental rotations are related to real rotations in length of time. This shows an analog representation of a process: the intermediate internal states have a natural one-to-one correspondence to appropriate intermediate states in the external process. |
|
|
Term
Marr's Three Levels: Computational |
|
Definition
Top down Identify specific information processing problem that the system is configured to solve and any general constraints on any solution to that problem. Input/Output mapping.
What is the goal of the computation, why is it appropriate, and what is the logic of the strategy by which it can be carried out? |
|
|
Term
Marr's Levels of Processing: Algorithmic/Representational |
|
Definition
Top down. Explain how the cognitive system actually solves the information-processing task. Specify how information is encoded and identify algorithm for transforming input to required output.
How can this computational theory be implemented? In particular, what is the representation for the input and output, and what is the algorithm for transformation? |
|
|
Term
Marr's Levels of Processing: Hardware Implementation |
|
Definition
Bottom up. Physical realization of outcome. Identify physical structures that will realize the representational states over which the algorithm is defined and find the mechanisms at the neural level that can properly be described as computing the algorithm in question.
How can the representation and algorithm be realized physically? |
|
|
Term
Warrington and Taylor 1973 |
|
Definition
Dealt with patients lesioned in the left and right parietal lobes. Right parietal lesions produced shape-perception issues from unusual perspectives; left parietal lesions caused no such issues with recognition and identification. There is a dissociation between perceptual and recognition abilities. Computational-level analysis |
|
|
Term
|
Definition
detect changes in luminance (zero crossings) and obtain from them information about geometrical structure and light sources. Viewer centered representation. Algorithmic Level. |
|
|
Term
|
Definition
Extract orientation of visible surfaces to explain depth, surface orientation, and surface discontinuities. Viewer centered representation. Algorithmic Level. |
|
|
Term
|
Definition
Representation of shape and size. Determine basic volumetric and surface primitives and then establish how the primitives are assembled into a whole figure. Object centered frame. Algorithmic Level. |
|
|
Term
Ungerleider and Mishkin 1982: 2 Visual Systems Hypothesis |
|
Definition
Lesioned monkeys in the temporal lobe and parietal lobe and gave them object discrimination and landmark discrimination tasks.
First, remove the temporal lobe from one hemisphere and the parietal lobe from the other, then cut the corpus callosum.
Double dissociation. Temporal lesion: important for object discrimination. Parietal Lesion: important for landmark discrimination. |
|
|
Term
Milner & Goodale 1992: 2 Streams |
|
Definition
Dorsal Stream: Occipital --> Parietal. Spatial information-- "where?" pathway. Guide for action.
Ventral Stream: Occipital --> Temporal. Recognizing objects-- "what?" pathway. Guide for identification |
|
|
Term
|
Definition
Ventral Lesion to temporal. Can't copy images even though she can draw them from memory. Can't hold cards parallel to slit, but can push through slot. |
|
|
Term
|
Definition
Dorsal lesion to parietal. Damage to the dorsal stream causes problems in reaching and grasping but not in verbally recognizing and localizing objects. |
|
|
Term
|
Definition
Participant lies in stretcher and performs task which induces changes in info processing demands, changing localized neuronal activity and blood flow. Blood flow imaged using radioactive isotopes-- nucleus of isotopes emit positrons, which collide with electrons, producing two photons moving in opposite directions. Head in donut that detects photons and collisions.
Allows us to study well-functioning humans. Non-invasive and safe, replicable. Use subtraction technique and average across subjects. |
|
|
Term
|
Definition
Phonemic: rhyme? Semantic: meaning? Visual: ascenders/descenders? Articulatory: rolled r's?
Sensory code is input, articulatory code is output. |
|
|
Term
Geschwind's Model of Lexical Access |
|
Definition
Language consists of two basic functions: comprehension (a sensory/perceptual function), and speaking, which is a motor function. |
|
|
Term
|
Definition
|
|
Term
|
Definition
Patients have issues finding words, producing sentences. Comprehension intact, production disrupted. One-word hesitant responses, no syntactic markers/inflections. Caused by damage to left frontal lobe. |
|
|
Term
|
Definition
Patients have normal syntax, grammar, and rate, but make no sense. Comprehension is impaired and there is a lack of awareness. Fluent word salads and neologisms. Caused by damage to the left temporal lobe. |
|
|
Term
|
Definition
3 Tasks, each with controls. --passively view/listen to words-- tests for sensory processing, word-level coding. Found activation in auditory and temporal-parietal areas, including Wernicke's area, which are responsible for phonological codes. --read/repeat words-- tests for articulatory coding, motor output. Found activation in pre-motor cortex, cerebellum, Wernicke's area only in the auditory version, and no Broca's. --generate verbs-- tests for semantic association. Found activation in left frontal areas including Broca's; Wernicke's only in the auditory version, but close by for the visual one. |
|
|
Term
|
Definition
There is access to articulatory coding without sensory processing. There are multiple pathways, and codes localized in new areas. Both a functional and anatomical model. |
|
|
Term
|
Definition
It's hard to integrate co-evolutionary and top-down approaches. Three disunities: 1. Domains-- horizontal (stimulus/task) vs. vertical (information) 2. Methodological-- electrophysiology, lesion studies, imaging, reaction times, questionnaires, documentation, etc. 3. Levels-- of processing, control, analysis, computation, etc. |
|
|
Term
|
Definition
Relation between theories. Model for showing how one theory can be understood in terms of another.
We need: 1. Principles for connecting vocabularies (bridges) 2. Derivation of laws
Unfortunately, physics is different from cog sci because cog sci is mechanistic: break systems into pieces and put them back together to recover global behavior. |
|
|
Term
Supervisory Attentional System |
|
Definition
Deliberate control of actions via attention --alertness --orientation to critical stimuli --executive attention, which involves over-riding automatic or impulsive responses as well as goal preservation in the presence of distraction. |
|
|
Term
|
Definition
How can we satisfy our intentions given other intentions, environmental constraints, skill sets, etc.? Plans are solutions to this problem.
Indeterminacy: there are often many ways of achieving a certain goal.
Sequencing: achieving a goal involves coordinating several behaviors-- often chain behavioral sequences as opposed to ballistic ones. |
|
|
Term
|
Definition
Control units for partially ordered sequences of actions. Representations of goal-directed patterns of behavior. Interconnected and hierarchically organized |
|
|
Term
Contention Scheduling: Principles of Scheme Selection |
|
Definition
1) Schema selection is determined by activation value. A schema is selected once it reaches a certain threshold, and once a schema is selected, an action is triggered.
2) Schemata compete for activation, which resolves conflict and allows for coordination. |
|
|
Term
Sources of Schema Activation |
|
Definition
Intentional, lateral, environmental, self. |
|
|
Term
|
Definition
Elaborate, purposeful behavior-- object-appropriate skilled behaviors triggered by the mere presence of an object in the environment. These patients have unilateral or bilateral frontal lobe lesions and perform these behaviors, like shuffling cards, on environmental cues alone, with no intention. Their biggest issues are SUBSTITUTIONS. They cannot modulate schema activation due to environmental triggering.
Shows evidence of a distinct supervisory system, which is necessary for over-riding automatic or impulsive responses. |
|
|
Term
Action Disorganization Syndrome |
|
Definition
Patients' main errors are OMISSIONS. They cannot resolve competition among schemas, so some never reach threshold. They do things like put shaving cream on a toothbrush.
Shows evidence of a distinct supervisory system, which is necessary for over-riding automatic or impulsive responses. |
|
|
Term
|
Definition
Mental state with propositional and attitudinal component. |
|
|
Term
|
Definition
Entertain without evaluating: when you perceive something, you entertain it in your mind before evaluating it as something you should reject. |
|
|
Term
|
Definition
Believe without reconsidering. When you perceive something, you automatically believe it; then you take the time to evaluate it and either continue believing it (retain it) or change your mind and reject it. |
|
|
Term
|
Definition
Evidence for the Spinozan view. People fail to adjust their impressions sufficiently when they encounter information that discredits the evidence on which the belief is based.
e.g. in fake suicide note study. |
|
|
Term
|
Definition
Evidence for the Spinozan view. People still make judgments based on ideas even when they are told BEFOREHAND that the ideas are false. |
|
|
Term
|
Definition
The value of an action depends on its consequences/outcomes (Mill) |
|
|
Term
|
Definition
Duties/rules/obligations confer value to actions (Kant) |
|
|
Term
|
Definition
The stubborn and puzzled maintenance of a moral judgment without supporting reasons: "it's just the right/wrong thing to do," e.g. with cannibalism or incest dilemmas. |
|
|
Term
Two System View-- Cushman et al Emotional |
|
Definition
Rapid, automatic, unconscious, elicits deontological reasoning and corresponds to footbridge trolley problem. |
|
|
Term
Two System View-- Cushman et al Cognitive/Rational |
|
Definition
Slow, attention-involving, conscious. Elicits consequentialist reasoning and corresponds to initial/switch trolley problem. |
|
|
Term
Doctrine of Double Effect |
|
Definition
It is permissible to cause harm as a side effect of achieving a greater good. But it is impermissible to cause harm as a means to achieve a greater good. New trolley problem, in which the train runs over five people or runs over a man (as a side effect) before a wood block stops it. |
|
|
Term
DDE- Experimental results |
|
Definition
In emotional personal dilemmas: medial frontal gyrus and angular gyrus
In high conflict situations: anterior cingulate cortex
Value/control areas in impersonal dilemmas: dorsolateral prefrontal cortex.
Reaction time: it takes longer to deem something inappropriate in nonmoral and moral-impersonal situations, and longer to deem something appropriate in moral-personal situations, because you're overriding the automatic system. |
|
|
Term
VMPFC Patients in trolley problems |
|
Definition
These patients are less empathetic, less susceptible to pain, shame and guilt, don't react to painful/disturbing stimuli like normal people, reduced affect to aversive stimuli.
They endorsed the following actions differently than normal people: --nonmoral: more --impersonal: less --personal: more |
|
|
Term
|
Definition
1) Symbols are physical objects 2) Symbols can be combined to form complex symbol structures 3) The system contains processes for manipulating complex symbol structures 4)The processes for representing complex symbol structures can themselves be symbolically represented within the system |
|
|
Term
Physical Symbol Systems Hypothesis |
|
Definition
A physical symbol system has the necessary and sufficient means for intelligent action. --Necessity: anything capable of intelligent action is a physical symbol system. --Sufficiency: any (sufficiently sophisticated) physical symbol system is capable of intelligent action. |
|
|
Term
|
Definition
the ability to solve problems by transforming symbolic structures in accordance with rules. |
|
|
Term
|
Definition
A description of the given situation, operators for changing the situation, a goal situation, and tests to determine whether the goal has been reached. |
|
|
Term
|
Definition
A set of achievable situations defined by potential application of operators to the initial situation. |
|
|
Term
|
Definition
Identify a solution-- i.e. how to reach the goal by stepwise transformations. |
|
|
Term
Exhaustive Searches: Brute Force Algorithms Breadth First |
|
Definition
All nodes in a level are tested before proceeding to the next level.
issue: problem spaces are too big to be searched exhaustively, and this process is seemingly unintelligent. |
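A minimal sketch of the breadth-first procedure in Python; `successors` and `is_goal` are hypothetical stand-ins for a problem space's operators and goal test:

```python
from collections import deque

def breadth_first(start, successors, is_goal):
    """Test every node at one depth before moving to the next level."""
    frontier = deque([[start]])          # paths, shallowest first
    seen = {start}
    while frontier:
        path = frontier.popleft()
        if is_goal(path[-1]):
            return path                  # shallowest solution found first
        for nxt in successors(path[-1]):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(path + [nxt])
    return None
```

For example, on the toy space where each state n leads to 2n and 2n+1, searching from 1 for the state 5 returns the path [1, 2, 5].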
|
|
Term
Exhaustive Searches: Brute Force Algorithms Depth First |
|
Definition
Go deeper, testing each node and proceeding to the next level, backtracking when necessary.
issue: problem spaces are too big to be searched exhaustively, and this process is seemingly unintelligent. |
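A matching depth-first sketch (again with hypothetical `successors`/`is_goal` functions); a depth limit is added so the sketch always terminates even in large spaces:

```python
def depth_first(start, successors, is_goal, limit=20):
    """Follow one branch as deep as possible, backtracking on failure."""
    stack = [[start]]                    # paths, most recently expanded on top
    while stack:
        path = stack.pop()
        if is_goal(path[-1]):
            return path
        if len(path) < limit:            # stop descending past the depth limit
            for nxt in reversed(successors(path[-1])):
                if nxt not in path:      # avoid cycles along the current path
                    stack.append(path + [nxt])
    return None
```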
|
|
Term
Heuristic Searches: Hill Climbing |
|
Definition
Evaluate which available option is closer to the goal; do that on each move.
issue: sometimes there's more than one way to go uphill, or there are local maxima. |
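A minimal hill-climbing sketch (the `successors` and `score` functions are illustrative stand-ins); note that it stops at the first peak it reaches, which may be only a local maximum:

```python
def hill_climb(start, successors, score):
    """On each move, take the neighbor with the best score; stop at a peak."""
    current = start
    while True:
        neighbors = successors(current)
        if not neighbors:
            return current
        best = max(neighbors, key=score)
        if score(best) <= score(current):
            return current               # no uphill move left: a (possibly local) maximum
        current = best
```

For instance, climbing the score function -(x - 7)^2 from 0, with moves x-1 and x+1, ends at the peak x = 7.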
|
|
Term
Heuristic Searches: Means End Analysis |
|
Definition
Evaluate the difference between the current state and the goal state, identify the transformation that most reduces that difference, and apply it if possible. If not, go to the next possible option. This is a backwards-looking, goal-to-sub-goals, global evaluation |
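A simplified difference-reduction sketch of this idea; a full means-end analysis would also set up sub-goals when the best operator cannot be applied directly. The numeric state and operator names here are purely illustrative:

```python
def means_end(state, goal, operators):
    """Repeatedly apply the operator that most reduces the
    difference between the current state and the goal."""
    path = [state]
    while state != goal:
        candidates = [op(state) for op in operators]
        best = min(candidates, key=lambda s: abs(goal - s))
        if abs(goal - best) >= abs(goal - state):
            return None                  # no operator reduces the difference
        state = best
        path.append(state)
    return path
```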
|
|
Term
|
Definition
Computer models of mental processes are useful for studying the mind. |
|
|
Term
|
Definition
The sufficiency part of the PSS-- any (sufficiently sophisticated) physical symbol system is capable of intelligent action. |
|
|
Term
Turing Test/Interrogation Game |
|
Definition
Game in which man tries to trick you and woman answers honestly and you have to figure out which is which. If a computer can take the place of a man with the same amount of accuracy/skill, Turing says that it's intelligent. Basically, a machine is intelligent if it can replicate human behaviors that we see as intelligent. |
|
|
Term
|
Definition
Searle's argument against strong AI: the Chinese room is an input-output system identical to a Chinese speaker-- a man with no knowledge of Chinese uses a manual to "communicate" with the Chinese speakers outside. Internal processing of Chinese is purely syntactic. There is no real understanding of Chinese. What happens in the room is nothing like what happens in the head of Chinese speakers.
Argument: -Computer programs are formal/syntactic -Minds have mental contents/semantics -Syntax by itself is neither constitutive nor sufficient for semantics.
Therefore, programs are neither constitutive nor sufficient for minds. |
|
|
Term
Reply to Chinese Room Argument: Systems Reply |
|
Definition
The point is not whether the person in the room understands Chinese; it is whether the whole room understands it, which it does. |
|
|
Term
Reply to Chinese Room Argument: Robot Reply |
|
Definition
Suppose that instead of a room, the program was placed into a robot that could wander around and interact with its environment. This would allow a "causal connection" between the symbols and things they represent. |
|
|
Term
Reply to Chinese Room Argument: Connectionist Reply |
|
Definition
Computers, understood as physical symbol systems, cannot think. But there are other computers in which syntactic transformation gives rise to semantics. Massively parallel connectionist architectures could be capable of understanding. |
|
|
Term
|
Definition
Abstract away from brain complexity; model connections among populations of neurons. Parallel processing within networks. An alternative to symbolic manipulation: there is algorithmic processing without task-specific rules or discrete manipulations |
|
|
Term
|
Definition
There is a threshold, after which the unit fires, giving an output signal. Strength of the signal can vary: -linear -threshold linear -sigmoid (s-shaped) -binary threshold. |
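The listed activation functions can be sketched directly (function names here are my own labels for the standard shapes):

```python
import math

def binary_threshold(x, theta=0.0):
    """Fires (outputs 1) only once total input exceeds the threshold."""
    return 1 if x > theta else 0

def threshold_linear(x, theta=0.0):
    """Zero below threshold; above it, output grows linearly with input."""
    return max(0.0, x - theta)

def sigmoid(x):
    """Smooth s-shaped output between 0 and 1."""
    return 1.0 / (1.0 + math.exp(-x))
```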
|
|
Term
|
Definition
An algorithm mapping an input onto one of two possible outputs. Networks can learn-- we can change the connection weights and/or units' thresholds until a given input yields the desired output. |
|
|
Term
|
Definition
Boolean functions can be represented by a formula in disjunctive normal form-- combinations of NOT, AND, OR.
We use single-layer networks to represent Boolean functions. Instead of T or F, 1 and 0 |
|
|
Term
|
Definition
Training depends on the discrepancy between actual output and intended output. The delta/perceptron convergence rule gives an algorithm for changing the threshold and weights as a function of delta and epsilon. This rule will converge on a solution in any case where a solution is possible. It will generate a set of weights and a threshold that will compute every Boolean function that can be computed by a perceptron (i.e. single-layer network) |
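A sketch of the rule in Python, trained here on the (linearly separable) OR function; the learning rate and epoch count are my own illustrative choices:

```python
def train_perceptron(samples, epochs=25, lr=0.1):
    """Perceptron convergence rule: nudge the weights and threshold by the
    discrepancy (delta) between intended and actual output."""
    w = [0.0, 0.0]
    theta = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 > theta else 0
            delta = target - out
            w[0] += lr * delta * x1
            w[1] += lr * delta * x2
            theta -= lr * delta          # raising the output means lowering the threshold
    return w, theta
```

After training on the four OR patterns, the learned weights and threshold classify every input correctly.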
|
|
Term
|
Definition
The XOR (exclusive or) function is not linearly separable, meaning that there is no linear function that separates the inputs that yield 0 from those that yield 1. Thus, XOR cannot be computed by a single-layer network |
|
|
Term
|
Definition
Have hidden units: units that affect output but receive input from other units. Every path from input to output passes through at least two weights. |
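XOR, which no single-layer network can compute, falls out of one hidden layer. A sketch with hand-set weights (one standard construction: an OR unit and a NAND unit feeding an AND output unit):

```python
def step(total, threshold):
    """Binary threshold activation."""
    return 1 if total > threshold else 0

def xor_net(x1, x2):
    """Two hidden units (OR and NAND) feed an AND output unit."""
    h_or = step(x1 + x2, 0.5)            # fires unless both inputs are 0
    h_nand = step(-x1 - x2, -1.5)        # fires unless both inputs are 1
    return step(h_or + h_nand, 1.5)      # fires only if both hidden units fire
```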
|
|
Term
|
Definition
Multilayer networks can't be trained using the perceptron convergence rule, so a backpropagation algorithm is needed. --info is transmitted through network --error is propagated back through the network --backpropagated error signal adjusts weights to/from hidden units. --the algorithm needs to find a way of calculating error in hidden units, given that these don't have predefined target activation levels. --it does this by calculating for each hidden unit its degree of responsibility for error at the output units. --this error value is used to adjust weights of hidden units. |
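One backpropagation step can be shown numerically on a tiny 1-1-1 network (one input, one sigmoid hidden unit, one sigmoid output; all weights, the input, and the learning rate are illustrative). The output unit's error signal is propagated back to assign the hidden unit its share of responsibility, and both weights are adjusted so the error shrinks:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(x, w_ih, w_ho):
    h = sigmoid(w_ih * x)                # hidden activation
    y = sigmoid(w_ho * h)                # output activation
    return h, y

x, target = 1.0, 1.0
w_ih, w_ho = 0.5, 0.5
lr = 0.5

h, y = forward(x, w_ih, w_ho)
error_before = (target - y) ** 2

# Output unit's error signal, then the hidden unit's degree of responsibility.
delta_out = (target - y) * y * (1 - y)
delta_hidden = delta_out * w_ho * h * (1 - h)

w_ho += lr * delta_out * h               # adjust weight into the output unit
w_ih += lr * delta_hidden * x            # adjust weight into the hidden unit

_, y2 = forward(x, w_ih, w_ho)
error_after = (target - y2) ** 2
```

A single step with this small learning rate reduces the squared output error.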
|
|
Term
Past-Tense Formation: Rumelhart and McClelland |
|
Definition
Connectionist model with a pattern-associator mechanism and Wickelfeatures, a way of coding phonetic information. The network starts out pretty bad at encoding/coding/decoding but gets much better. General-purpose algorithms and partially systematic patterns.
Issues: semantics matters, the wrong mistakes are made, and generalizations to unfamiliar sounds don't work. |
|
|
Term
Past-Tense Formation: PSS/Algorithmic |
|
Definition
Cognition is like computational processing. It's an algorithmic transformation of inputs into outputs. Has task specific rules/algorithms and well-defined regularities. |
|
|
Term
Theory of Action: Botvinick and Plaut 2004 |
|
Definition
Multilayer network with fifty hidden units. Recurrent network: hidden units are interconnected-- activation in hidden units is partially preserved across time. Actions are generated without schemas-- there are no units representing the overarching goal guiding the routine.
Training: after each sequence, compare the output pattern at each step with the correct output. The difference is the performance error. Backpropagate error through the network and adjust the weights of connections to and from each layer. Error plateaus after a while. |
|
|
Term
Theory of Action: PSS/Distributed Representations |
|
Definition
Symbolic representations-- have schemas, which are discrete representations, represent hierarchical organizations and goals. Distributed representations-- knowledge lies in weights and thresholds. no hierarchies. There's a goal directed behavior without explicit representation of a goal. |
|
|
Term
Direct Approach to difference between perception and cognition |
|
Definition
(Gibson) Big distinction-- no background information is necessary/available for perception. Information is "out there" and already available. There are invariants, e.g. affordances, action possibilities latent in the environment.
Two organisms with totally different backgrounds can perceive things identically. |
|
|
Term
Cognitive Approach to difference between perception and cognition |
|
Definition
(Gregory, Bruner) Stresses the importance of intelligence in perception. Perception is a completely intelligent process. It involves unconscious inference and brings into play general knowledge of the world, expectations, etc.
People looking at world with different backgrounds see different things. |
|
|
Term
|
Definition
Issue with the direct approach: perception of the distal stimulus remains the same despite changes in the proximal stimulus-- e.g. fruit that looks burgundy in the dark is the same as fruit that looks red in the light.
Types of constancy: size, color; related illusions: Müller-Lyer, Ponzo |
|
|
Term
|
Definition
Object "out in world" e.g. brightness. This is what you care about. |
|
|
Term
|
Definition
Energy impinging on the transducer, e.g. light on the retina. There is too much irrelevant variability in this stimulus. |
|
|
Term
Information Encapsulation & Horizontal/Vertical Faculties |
|
Definition
Like all cognitive processes, perception is inferential. However, it is encapsulated. Access to background knowledge is under rigid, permanent constraints.
Horizontal Faculties: operations cross content domains Vertical Faculties: inputs come from specific domains |
|
|
Term
|
Definition
inference making component, in which access to background knowledge is rigidly constrained. --domain specific: modules have specialized tasks, e.g. face recognition --informationally encapsulated: lack access to background information and have a proprietary database. |
|
|
Term
|
Definition
Access all information e.g. decision making, planning, etc. |
|
|
Term
|
Definition
Validity-- if premises are true, conclusion must be true |
|
|
Term
|
Definition
inference to best explanation. Premises can be true and conclusion false. |
|
|
Term
|
Definition
Abductive reasoning is sensitive to global properties, simplicity, coherence, consistency |
|
|
Term
|
Definition
Abductive reasoning is potentially affected by any relevant information |
|
|
Term
|
Definition
1) Central processes are forms of abductive reasoning 2) Abductive reasoning is Quinean and isotropic 3) Determining which beliefs are relevant is computationally intractable-- it cannot be done in a realistic time frame. This is the frame problem. 4) The operation of central (cognitive) systems does not seem understandable in computational terms. 5) But the computational theory is all we have. |
|
|
Term
|
Definition
Posits that there is no such thing as central processing. There is no Quinean and isotropic information processing. Central systems consist of a large number of modules. General-purpose abilities are the results of highly specialized modules. |
|
|
Term
|
Definition
While Fodorian modules are domain specific and encapsulated, Darwinian modules are domain specific but frugal. Minds are built by evolution. Modules are rough, quick-and-dirty, heuristic algorithmic ways of solving problems. They're merely sufficient and thus often deliver the wrong results. |
|
|
Term
|
Definition
Determining which beliefs are relevant for task is computationally intractable. it cannot be done in a realistic time frame. This is the frame problem. |
|
|
Term
|
Definition
Goal: design system that successfully forms and executes plans for completing general task. Three approaches: 1) Select action if it leads to desired effect. Issue: unintended consequences. 2)Deduce and check all consequences of action before selecting it. Issue: this takes forever! 3) Check only relevant consequences of action before selecting it. Issue: there are a lot of consequences to ignore. |
|
|
Term
|
Definition
Cooperate first, then do what your opponent did in the previous round. In infinitely iterated dilemmas, backwards induction is blocked. Evolutionarily stable strategy. |
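The strategy is simple enough to state as code (a minimal sketch; "C" and "D" stand for cooperate and defect, and the function name is my own):

```python
def tit_for_tat(opponent_history):
    """Cooperate on the first round, then copy the opponent's previous move."""
    if not opponent_history:
        return "C"
    return opponent_history[-1]
```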
|
|
Term
|
Definition
The ability to come up with certain kinds of explanations (intentional explanations) for behavior. Intentional stance: treat the object whose behavior one wants to explain as the result of what the object believes or desires. |
|
|
Term
|
Definition
Test of belief attribution abilities. Key: beliefs can be false. Minds can represent things incorrectly. Attributing false beliefs requires being able to decouple representations from reality and think about said representations.
Sally-Ann task. |
|
|
Term
|
Definition
Individuals with autism have trouble with social navigation. They lack a theory of mind and have poor spatial reasoning, and thus fail the false belief task. They also have trouble with belief sequencing tasks. |
|
|
Term
Arguments Against Massive Modularity: Fodor's Argument |
|
Definition
Selection of Input 1) Modular systems take a limited range of inputs 2) For Fodorian modules, transducers select inputs 3) Inputs for Darwinian modules can't be selected by a transducer. 4) Input selection requires a domain general (non modular) mechanism.
Therefore, mind can't be all modules. |
|
|
Term
Arguments Against Massive Modularity: Bermudez's Argument |
|
Definition
Prioritization of Output
1) A situation can result in inputs to different modules 2) Modules might give competing outputs 3) Competition needs to be solved for an appropriate response to be triggered 4) Prioritization requires a domain-general i.e. non-modular mechanism.
Therefore, mind can't be all modules. |
|
|
Term
|
Definition
Plato
All knowledge is recollection, and, hence, innate |
|
|
Term
|
Definition
Locke
There are no innate principles in the mind, which is a tabula rasa. |
|
|
Term
|
Definition
We divide the world into objects and properties that correspond to objects.
Perception goes beyond the immediately visible. Quine-- we come up with objects from the juxtaposition of features. Piaget-- the child constructs the concept of an object/perceives objects as a result of manipulating and moving them around. |
|
|
Term
|
Definition
We think of objects as bundles of features that obey certain physical principles. To have the concept of an object is to know a physical theory whose principles jointly define the notion of an object.
Spelke says that infants have this--- four principles jointly define an initial object concept. |
|
|
Term
|
Definition
Babies can be studied through this paradigm. They orient and then habituate to a stimulus after looking at it for a while; the procedure is then repeated with a new stimulus. We can test infant understanding of physical principles with this. |
|
|
Term
Principles of Objectivity |
|
Definition
Principle of solidity: one object can't pass through space occupied by another object.
Principles of cohesion, contact, continuity. |
|
|
Term
|
Definition
Outcome of a choice depends solely on the action chosen and the state of the world.
e.g. Ellsberg Problem |
|
|
Term
|
Definition
Outcome of choice depends on actions of other players e.g. Ultimatum game, trust game. |
|
|
Term
|
Definition
You cannot assign a precise number to the probability; at best you can reduce the ambiguity. When you're making a choice you need to be confident about the state of the world. |
|
|
Term
|
Definition
Each action has one possible outcome. Choose action that gets preferred outcome. |
|
|
Term
Decision under uncertainty |
|
Definition
Outcome of an action depends on possible states of world. Choose action that maximizes utility. |
|
|
Term
|
Definition
(probability of outcome 1 × utility of outcome 1) + ((1 − probability of outcome 1) × utility of outcome 2) |
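The formula, plus the maximization step from the "decision under uncertainty" card, can be sketched as (the two-outcome setup and the function names are illustrative):

```python
def expected_utility(p, utility_1, utility_2):
    """EU = p * U(outcome 1) + (1 - p) * U(outcome 2)."""
    return p * utility_1 + (1 - p) * utility_2

def choose(actions):
    """Pick the action with the highest expected utility.
    `actions` maps each label to (p, U(outcome 1), U(outcome 2))."""
    return max(actions, key=lambda a: expected_utility(*actions[a]))
```

For example, a 50% chance at 100 (EU = 50) beats a sure 40 (EU = 40).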
|
|
Term
Electrophysiological Data in decision under uncertainty |
|
Definition
Neural activity in monkeys when reward probability is 50% or more. This is in the substantia nigra/dopaminergic cells. |
|
|
Term
|
Definition
If you attend only to what differs between the outcomes, (a) and (c) and (b) and (d) are the same in this problem. But that's not what people pick. This is a situation in which expected utility theory is violated. |
|
|
Term
|
Definition
problem with urns and colors of marbles. |
|
|