
Apr. 8th, 2014


Bayesian Virtue Epistemology

Abstract submitted to the Australasian Association of Philosophy (AAP) 2014


Ammonite


Reflective knowledge is the pinnacle of human functioning, traditionally conceived of as the reservoir of the a priori—a revered, almost mystical mental faculty through which Platonic ideals, truths, and axioms depart the heavens and settle on the brow of mortal man. In contrast, reliabilist beliefs are merely dumb associations, forged by mechanistic repetition of limited cognition creating impoverished models of the external world. In this paper, I splice together these apparently conflicting processes by examining Ernest Sosa's higher-order reliabilist account of reflective knowledge within virtue epistemology. To resolve deficiencies within Sosa's account I draw on another agent-centred, normative and reliabilist epistemology—Bayesian epistemology.

Critics of Sosa's view argue that reliabilism is too weak to do the work of reflective knowledge. I respond that reflective knowledge may be forged from low-level beliefs according to Bayesian mechanisms found in hierarchically nested probabilistic models (HNPM). HNPM explain a child's development of higher order beliefs about abstract concepts such as causation, natural laws and theoretical entities. A hybrid Bayesian virtue epistemology emerges as a robust, empirically promising means to defend Sosa against his critics. Bayesian virtue epistemology is a higher-order reliabilism capable of generating genuine reflective knowledge.
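To make the mechanism concrete, here is a minimal sketch of hierarchically nested Bayesian updating, assuming a toy two-level hierarchy with made-up numbers (nothing here is drawn from the thesis itself): a higher-order 'framework' belief sets the prior over low-level hypotheses, and low-level evidence flows upward to revise the framework.

```python
# A minimal sketch of hierarchically nested Bayesian updating.
# All names, hypotheses and numbers are illustrative only.
from math import prod

def normalise(d):
    total = sum(d.values())
    return {k: v / total for k, v in d.items()}

# Low-level hypotheses: the chance a simple process yields "heads".
# Each higher-order framework assigns them a different prior.
frameworks = {
    "fair-world":   {0.5: 0.9, 0.9: 0.1},
    "biased-world": {0.5: 0.1, 0.9: 0.9},
}

def likelihood(h, outcome):
    return h if outcome == "heads" else 1.0 - h

def sequence_likelihood(f, data):
    # P(data | framework): marginalise over the low-level hypotheses.
    return sum(p_h * prod(likelihood(h, o) for o in data)
               for h, p_h in frameworks[f].items())

def update_frameworks(prior, data):
    # Bayes at the higher level: evidence about particular outcomes
    # revises the agent's belief about which kind of world it is in.
    return normalise({f: p * sequence_likelihood(f, data)
                      for f, p in prior.items()})

prior = {"fair-world": 0.5, "biased-world": 0.5}
print(update_frameworks(prior, ["heads"] * 8 + ["tails"] * 2))
# Mass shifts to "biased-world" (roughly 0.75): the higher-order belief
# is revised by the same reliabilist machinery as the low-level ones.
```

The point of the toy is structural: the higher-order belief is not a separate a priori faculty, just another node updated by the same rule.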


Full paper:
Devitt, S.K. (2013). Homeostatic Epistemology: Reliability, coherence and coordination in a Bayesian virtue epistemology. PhD, Rutgers, The State University of New Jersey. pp.51-104. Retrieved from QUT ePrints.

Feb. 21st, 2013


Ecological Learning: Lessons from Charlotte Mason for the 21st C.


Charlotte Mason at the turn of the 20th Century

Education, for Charlotte Mason, was coming to know the world and one's place in it, rather than studying for exams or employment. She considered children capable of complex abstract ideas as well as detailed particulars. Her educational theories and methods advocated an experimental and observational approach to learning rather than a teacher-led experience dispensing facts. She had great respect for science and scientific methods. But, more than this, she wanted to nurture children's curiosity and sense of wonder at the natural world. Her respect for children's experimental nature accords with recent work in developmental psychology that likens the growing child's brain to that of a young scientist (see the work of Alison Gopnik on the child scientist). Charlotte thought that children are capable, but must be scaffolded to work every day. As she says in Home Education:

"Do not let the children pass a day without distinct efforts, intellectual, moral, volitional; let them brace themselves to understand; let them compel themselves to do and to bear; and let them do right at the sacrifice of ease and pleasure: and this for many higher reasons, but, in the first and lowest place, that the mere physical organ of mind and will may grow vigorous with work."


While Charlotte shared early educators' vision for hard work, she thought the classroom a poor catalyst for learning. Instead, she thought the best way to learn was outdoors.

"True, we must needs houses for shelter from the weather by day and for rest at night; but in proportion as we cease to make our houses 'comfortable,' as we regard them merely as necessary shelters when we cannot be out of doors, shall we enjoy to the full the vigorous vitality possible to us"

"On fine days when it is warm enough to sit out with wraps, why should not tea and breakfast, everything but a hot dinner, be served out of doors?... every hour spent in the open is a clear gain, tending to the increase of brain power and bodily vigour, and to the lengthening of life itself. They who know what it is to have fevered skin and throbbing brain deliciously soothed by the cool touch of the air are inclined to make a new rule of life, Never be within doors when you can rightly be without. Read more...Collapse )


Inspired by Charlotte Mason's vision of education outdoors, I have created a digital artefact (Haiku Deck) as a testament to Charlotte Mason in the 21st century. My artefact explores how modern digital devices and constant access to information ought to be combined for optimal well-being and attainment of knowledge. In this way I create a utopian argument for the use of technology in education: the ideal location for learning is outdoors, therefore technology ought to augment human experience within the natural environment as well as during hours spent indoors.

Ecological Learning [Haiku Deck] or Pinterest board

Draft assignment for eLearning & Digital Cultures MOOC

Mar. 3rd, 2012


The Renaissance portrait, individualism and autobiographical memory

A new article in the New York Review of Books, "They Clamor for Our Attention: The Renaissance Portrait from Donatello to Bellini", connects the rise of the portrait with the rise of the individual in the early Renaissance. The article begins by discussing the uncontroversial view that the portrait was created at the end of the dark ages, leading into the Renaissance. More striking is the thesis that, along with the rise of the portrait, came the invention of individual identity. This thesis stems from Burckhardt (1860):

Man [previously] was conscious of himself only as member of a race, people, party, family, or corporation--only through some general category. In Italy this veil first melted into air; an objective treatment and consideration of the state and of all things of this world became possible. The subjective side at the same time asserted itself with corresponding emphasis; man became a spiritual individual, and recognized himself as such. (Butterfield, p.10)


That is, prior to portraits, people didn't individuate themselves apart from their collective identity of religion, nationality, geographic area, gender etc. The highly detailed portraits of ordinary people (rather than kings or popes) supposedly represent the rise of the individual. Yet the article points to a new exhibit and a new thesis: apart from facial details, typical portraits of the time show people in garb that clearly demarcates them as part of a group (e.g. the ruling class of Florence). Butterfield (2012) states:

In Burckhardt's formulation, the individual was seen in distinction from the group. But in the exhibition what we often view are individuals portrayed as the preeminent and exemplary representatives of groups; the men and women are depicted as distinguished members of a virtuous and honored elite (p.10).


Thus, while portraits certainly paved the way for a shift in the representation of individuals, did they really contribute to the invention of individuation?

This thesis particularly interests me because of a book I've read called Autobiographical Memory and the Construction of the Narrative Self. In this book, there is much discussion of cross-cultural differences in the way we construct self-identity. For example, the Maori in NZ raise their children to be able to retell, in great detail, the events that have recently occurred in their lives. Whereas (studied) rural Indians do not encourage remembering individual events; rather, a person's memories tend to be about events that happened to the group, such as floods, food availability etc. Thus, the cultural norms for autobiographical memory have a big impact on how one describes and potentially conceptualizes oneself.

Of course, that doesn't mean that identity itself depends on how one describes it. A person's identity over time is defined by a set of interconnected memory events (a version of Locke's theory), but these memory events frequently do not inform conscious reflection. For example, I am the same person as the 2-year-old version of me, even though I cannot remember anything that happened before autobiographical memory came online (probably between ages 3 and 4). That is, all my memories, implicit and explicit together, define my identity. It's important to remember that there are many memory systems, and self-perception only taps into a fraction of that which makes us the same person.

Anyway, if self-perception is largely culturally constructed, then what was going on for these Renaissance individuals, such that they thought of themselves differently? Indeed, did they change at all? I mean, getting a staff card with my photo on it doesn't impinge on the self-image created and fostered by my parents, school and society.

References:
Burckhardt, J. (1860) The Civilization of the Renaissance in Italy.
Butterfield, A. (2012) They clamor for our attention: The Renaissance Portrait from Donatello to Bellini. The New York Review of Books, LIX(4), 10-12.
Fivush, R. & Haden, C.A. (Eds.) (2003) Autobiographical Memory and the Construction of the Narrative Self. Psychology Press.

Sep. 26th, 2011


Have we become more depressed because we have stopped memorising?

Education from ancient Greece to the early 20th century was largely a process of memorisation. Students were expected to learn, remember and use a large variety of cultural materials to furnish their thoughts, words and actions. Once finished with their studies, students would continue to have poetry, literature, theatre, religion and history to draw on, wherever they were, when ruminating on particular struggles throughout their lives. Take Shakespeare: woven throughout his dialogues are poetic and cultural references, and yet he was writing for a largely illiterate audience. These references may have helped people challenge their unhelpful thoughts.

I want to look at two things that have occurred in the west in the late 20th century, in terms of their impact on mental health [1]. The first is the trend in education to avoid memorisation, and the second is the development and refinement of various cognitive behavioural therapies (CBT).

Since the education revolution of the 1970s, students in the west have largely stopped memorising. With the rise of the internet in the 1990s, this process has all but ceased, even for adults who were brought up believing that memorisation was an important aspect of living one's life. Advocates of the extended mind might say that access to the internet or books makes memorisation obsolete. But think of driving one's car across town, or rock climbing, or trying to cook a meal whilst managing anxiety and/or depression. When individuals are alone with their thoughts, when they cannot plug into the internet, or even when they can, they can't necessarily bring to mind a reference or activity that would calm them, offer advice or solace, or guide them back to a rational state of mind. Depressed people often turn to social applications such as Facebook to get help or to feel better, and can spend quite a deal of time there without any progress in their mental state at all. Even if a person does open a relevant page, they can find it difficult to concentrate on or absorb external information in a pathological state of mind. I claim that the mnemonic structures found in religious texts, poetry and so forth used to form a buffer against one's own negative thoughts, but no longer play such a central role in people's daily mental health management. That is, there is something different about memorising, and it could be the key to fixing depression. But I'm not advocating a return to religion in order to get these benefits.

I argue that the most important thing about memorising is that it makes it easier to resolve negative affect. When content is memorised it becomes effective self-talk, springing effortlessly to mind. Lack of energy, poor problem solving and reduced cognitive function are features of depression. I compare this process to learning self-defence by practising moves over and over again without threat, so that in the event of an actual attack, reactions are swift and effective. This leads me to CBT.

CBT is a set of methods for challenging unhelpful thoughts. It has been empirically shown to have a large impact on "unipolar depression, generalized anxiety disorder, panic disorder with or without agoraphobia, social phobia, post traumatic stress disorder, and childhood depressive and anxiety disorders" [2]. The techniques are varied and have been refined over decades, but the principles are clear. What improves depression is when patients actively acknowledge distorted thoughts, challenge them and/or observe them to lessen their impact. The process is very intense and confronting, and requires discipline and perseverance to carry out. Part of the effort involved is absorbing and retaining the various 'reframes' of negative thought patterns into realistic, positive, yet believable statements--i.e. memorising them. Patients must begin by quite laboriously writing out their thoughts and analysing them. But with time and practice those new thought patterns become dominant and reflexive. They have been memorised and are accessible, even during an 'attack'.

Much of the effectiveness of CBT is due to the benefit of memorisation, a skill known for thousands of years, but perhaps only recently rediscovered.

[1] I focus on the west in this case. But, clearly depression exists in Asian cultures and they have a very strong focus on memorisation. I should be very clear then in stating that I do not mean that memorising anything will help depression. But, that using memorisation with CBT (or perhaps religious texts, poetry etc…) is the combination required to ease symptoms.

[2] Butler, A.C., Chapman, J.E., Forman, E.M., & Beck, A.T. (2006) The empirical status of cognitive-behavioral therapy: A review of meta-analyses. Clinical Psychology Review, 26(1), 17-31.

Aug. 13th, 2010


Be Concise, Pause, Delay, Revise: How to increase the odds of someone remembering what you just said



There’s a lovely experiment from 1929 (Jersild) that compares the effectiveness of common speaking techniques, or "vividness devices" (p. 67), on the retention of meaningful information, including slowing down and repetition.

Contrary to common habit, speaking slowly is ineffective, because “no active review of a statement can take place until the last word has been spoken, since not until then will the content of the statement be known.” (68) That is, people process and understand your point holistically, once they have the entire idea in mind, not in the order that words are spoken. People also interpret meaningful sections of sentences before modifiers, making instructions in the form of a negative such as “don’t skip your veges” less effective than “eat your veges”, or telling a child, “be careful crossing the road” less effective than “stop”. Encapsulating an idea swiftly and succinctly improves the odds of successful transmission.

However, slowing down through strategic use of pauses between ideas can improve memory retention because it facilitates active rehearsal and consolidation. Rushing from one idea to the next stops listener reflection and review—increasing the odds of information going ‘in one ear and out the other’.

Nonetheless, speaker repetition of ideas can be effective, just so long as there is a lapse of time between repetitions. Immediately repeating what you have just said hinders learning because it prevents active rehearsal, or as Jersild so beautifully phrases it, “the revival of an impression after the elapse of an interval of time is more profitable than the opportunity to review the item immediately after its initial reception” (p. 68). In summary: be concise, pause, delay and revise.

References:

Jersild, A. (1929). Primacy, Recency, Frequency and Vividness. Journal of Experimental Psychology, 12(1), 58-70.

Apr. 19th, 2010


Dendritic Spines and their role in Memory and Cognition



Dendritic spines by Eduard Korkotian, Weizmann Institute of Science, Israel.

Randy Gallistel and Adam King, in their book Memory and the Computational Brain: Why Cognitive Science Will Transform Neuroscience, claim that an addressable memory architecture is necessary to explain complex animal behaviour, such as food caching by scrub jays, or even the human capacity to recollect and reconsider prior beliefs.

Their view contrasts with the non-addressable architecture assumed in contemporary neuroscience. Traditional neural networks suppose that computations in neural tissue are implemented by relaying action potentials between neurons. Gallistel and King argue that the implementation must be sought elsewhere. They offer two neurobiological suggestions of where to look: (1) subcellular structures, e.g. dendritic spines, and (2) molecular mechanisms, something like re-writable DNA and RNA.
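A crude way to see the contrast in code (my own gloss, not anything from Gallistel and King): an addressable memory stores a value that can be read back, computed with, and overwritten, while a purely associative scheme only adjusts response tendencies.

```python
# Addressable memory: a scrub jay "writes" where it cached a worm and
# can "read" that value back later, compute with it, and overwrite it.
cache_memory = {}
cache_memory["worm"] = (12.0, 7.5)    # write: coordinates of the cache
x, y = cache_memory["worm"]           # read: the value comes back as-is
cache_memory["worm"] = (12.0, 9.0)    # rewrite: the cache was moved

# Non-addressable memory: experience only nudges connection weights;
# no stored value can be read back, only a response tendency elicited.
weights = {("chirp", "dig-at-12-7"): 0.0}

def reinforce(cue, response, rate=0.1):
    weights[(cue, response)] += rate * (1.0 - weights[(cue, response)])

for _ in range(5):
    reinforce("chirp", "dig-at-12-7")
# The weights now bias behaviour, but the coordinates (12.0, 7.5)
# exist nowhere in the system as a readable, reusable symbol.
```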

I'm not going to give an in-depth analysis of the philosophical issues surrounding their claims, but I thought dendritic spines were worth further consideration.

Recent studies show that dendritic spines are dynamic structures. Their rapid creation, destruction and shape-changing are essential for short- and long-term plasticity at excitatory synapses on pyramidal neurons in the cerebral cortex. The onset of long-term potentiation, spine-volume growth and an increase in receptor trafficking are coincident, enabling a ‘functional readout’ of spine structure that links the age, size, strength and lifetime of a synapse. Spine dynamics are also implicated in long-term memory and cognition: intrinsic fluctuations in volume can explain synapse maintenance over long periods, and rapid, activity-triggered plasticity can relate directly to cognitive processes. Thus, spine dynamics are cellular phenomena with important implications for cognition and memory. Furthermore, impaired spine dynamics can cause psychiatric and neurodevelopmental disorders [e.g. autism, retardation]...

... Discovered in the 19th century and intensely scrutinized in the 20th century, dendritic spines are found in higher animals and some insects. Spines exist only on certain types of neurons, including pyramidal neurons in the cortex, medium spiny neurons in the basal ganglia and Purkinje cells in the cerebellum. Spines are more abundant in higher brain regions and highly variable in shape. Moreover, dendritic spines are the most actin-rich structures in the brain, and their morphology and density are abnormal in several mental disorders...

...Unfortunately, many studies of these correlates reduce every neuron to its action potential. This presents an incomplete picture of cognitive function, and indeed the brain is more than its ions. Recent experiments using optogenetic tools suggest that mechanisms other than spikes can participate in the creation of internal representations. In this section, we attempt to identify connections between the cognitive and synaptic neurosciences to suggest a new synaptic basis for cognitive function. [E.g. attention, computational speed and memory processing. In addition these spines may help solve the binding problem or even possibly the neural correlates of consciousness]...

...The rapid, responsive movement of synapses shares many features with cognition. Dendritic spines can take part directly in cognitive processes to make them more individual, active and stochastic—unlike a computer, in which memory elements obey simple and deterministic rules. Thus, cognitive processes can be easier to understand when we take account of the spine structural dynamics.

Kasai, H., Fukuda, M., Watanabe, S., Hayashi-Takagi, A. & Noguchi, J. (2010) Structural dynamics of dendritic spines in memory and cognition. Trends in Neurosciences, 33(3), 121-129.


I am experiencing my first taste of the computational theory of dendritic spines. I am intrigued. There's quite a nice graphic in this New York Times article.



I'm still reading through the book. More thoughts as I prepare my poster for SPP 2010.

Feb. 24th, 2010


The Indulgence and Futility of Reminiscence


Knowledge, mural by Robert Lewis Reid. The painting suggests knowledge is within a book, a view in contrast with Socratic thinking

Whilst researching the extended mind, I came upon this passage by Plato on writing, knowledge and memory:

SOCRATES: At the Egyptian city of Naucratis, there was a famous old god, whose name was Theuth; the bird which is called the Ibis is sacred to him, and he was the inventor of many arts, such as arithmetic and calculation and geometry and astronomy and draughts and dice, but his great discovery was the use of letters. Now in those days the god Thamus was the king of the whole country of Egypt; and he dwelt in that great city of Upper Egypt which the Hellenes call Egyptian Thebes, and the god himself is called by them Ammon.


Thoth

To him came Theuth and showed his inventions, desiring that the other Egyptians might be allowed to have the benefit of them; he enumerated them, and Thamus enquired about their several uses, and praised some of them and censured others, as he approved or disapproved of them. It would take a long time to repeat all that Thamus said to Theuth in praise or blame of the various arts. But when they came to letters, This, said Theuth, will make the Egyptians wiser and give them better memories; it is a specific both for the memory and for the wit. Thamus replied: O most ingenious Theuth, the parent or inventor of an art is not always the best judge of the utility or inutility of his own inventions to the users of them. And in this instance, you who are the father of letters, from a paternal love of your own children have been led to attribute to them a quality which they cannot have; for this discovery of yours will create forgetfulness in the learners' souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves. The specific which you have discovered is an aid not to memory, but to reminiscence, and you give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without the reality.

Phaedrus

In this quote, Socrates warns against relying on the written word to externalize one's memories. He suggests that whilst written words offer the opportunity to reminisce, they offer no improvement of knowledge. Side-stepping the thorny issue of the nature of knowledge, it seems that Socrates would agree with me that 'belief' (true, false or hesitant) is something held in biological memory, not to be confused with the use of writing to prompt our thoughts. By belief I mean something like,

"a belief is a functional state of an organism that implements or embodies that organism's endorsement of a particular state of affairs as actual" (McKay & Dennett, 493)

The combination of beliefs furnished by memory constitutes the mind (psyche, or 'soul', in ancient Greek terms) and impacts behaviour. This account implies that more memories constitute a better mind. What really intrigues me about Socrates' comment is his contrast of memory with reminiscence.


Archeological Reminiscence of Millet's Angelus, Salvador Dalí

The useful memory Socrates is referring to was a faculty somewhat like semantic memory, whereas reminiscence was more like episodic memory. Semantic memory enables a person to answer "what is 4 x 7" with "28", but only episodic memory allows a person to revisit an event imaginatively, also known as 'mental time travel'.

Socrates seems rather judgmental of reminiscence. Later in the Phaedrus he speaks of writing down thoughts as merely a pastime, being only for "recreation and amusement". He questions the use of "sowing words which can neither speak for themselves nor teach the truth adequately to others?"

Surely Socrates is being somewhat harsh in his condemnation of the written word and of reminiscence. Still, for the first time I appreciate the normative dimension of the different memory faculties in ancient Greek thought. It makes sense that reminiscence is not seen as a path to knowledge. Reminiscence depends upon imagery (phantasia), a sense assumed to furnish imagination as much as 'memories'. On the other hand, abstracted beliefs--stored and integrated with one another about the nature of reality--were useful to the pursuit of knowledge and critical reasoning. A well-furnished memory is required to define, analyze (divide), understand complex arguments, teach or persuade. However, reminiscence is largely a useless indulgence.

I'm fascinated by the degree of information we record about our lives today and the issues around what use such records will play in our lives at a later stage of life. Perhaps we spend too much time documenting our lives and not enough time improving our abilities to live them.


References

McKay, R. & Dennett, D.C. (2009) The evolution of misbelief. Behavioral and Brain Sciences, 32, 493-561

Feb. 17th, 2010


Saccades and ergonomics: The real reason the iPad is a good idea


Saccades are the voluntary movements of the eyes which serve to bring a new part of the visual field into the foveal region. Saccadic eye movements can reveal global aspects of perception, such as the scan patterns and fixation locations of subjects inspecting human faces.

The dust is settling over Apple's iPad. As predicted, the product has polarized geeks across the world. Fans boast of its features, critics point to the lack of features and a third group are confused about how the device would improve the daily life of a normal, dedicated Apple user, already happily shackled to their iPhone and MacBook.

Here's the real reason the iPad is a good idea: size.

The screen of the iPad is 9.7" (diagonal). This is roughly the size of a trade paperback. The trade paperback is the most successful size for reading material because it capitalizes on all the physiological facts of the human body that make reading easiest for most readers. If a human has to hold an object to read--as opposed to viewing a screen or distant sign--then the object's design is limited by the reader's arm length, hand size and grasping strength.

A focal distance within arm's length lets the reader see the printed word in sharp focus within the fovea. The foveal area covers about the size of your thumbnail held at arm's length. Surrounding the fovea, the human eye can identify many words within a few inches--roughly the width of a paperback page. Plus, the saccades of the eye enable people to move the foveal area quickly and so read the written word extremely fast.

Finally, a normal reader can scan a single paperback page of text and pick out words they are looking for within a fraction of a second. This combination of physiological features of human eyes makes the paperback page size ideal for the fastest reading possible. In addition, a paperback's light weight and slimness make it easy for humans to hold in a variety of positions for extended periods of time.

Contrary to popular belief amongst geeks, the screen size of the iPhone inhibits our ability to read efficiently, because the text must be sized at a point where we do not benefit as much from scanning or saccades. I can perceive approximately 80 words on my iPhone screen before needing to change the page (8 words per line, 10 lines). By doing this reading test, I have discovered that I read comfortably at 450-500 words per minute. This means I need to change pages every 10 seconds or so to read at a comfortable speed on the iPhone. That is an infuriating level of 'clicking'. I can read a paperback book for about 2 minutes (roughly 400 words per page, across an open two-page spread) without needing to move.
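A quick back-of-envelope check of those numbers (all values assumed from the text: 450-500 wpm, 8 words x 10 lines per iPhone screen, roughly 400 words per paperback page):

```python
# Rough arithmetic using the figures quoted in this post.
reading_speed_wpm = 475              # midpoint of 450-500 wpm
words_per_iphone_screen = 8 * 10     # 8 words per line, 10 lines
words_per_paperback_page = 400

def seconds_per_view(words, wpm=reading_speed_wpm):
    return words / (wpm / 60.0)

print(seconds_per_view(words_per_iphone_screen))       # ~10 s per screen
print(seconds_per_view(words_per_paperback_page))      # ~50 s per page
print(seconds_per_view(2 * words_per_paperback_page))  # ~100 s per spread
```

Note that the two-minute paperback figure only comes out if you count the two facing pages of an open spread (about 100 seconds at this speed).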

In addition to this, I often skim ahead of the text, read the current words and then quickly review what I have just read. I do this to gather context and relate the current sentences to the surrounding paragraphs. I'm sure a saccade test on my eyes whilst reading would reveal an elaborate pattern of eye movements that is not linear, but geometric. I estimate that my mind scans between 120-240 words ahead of or behind what I'm currently reading in detail. The size of the iPhone screen prevents these sorts of reading tricks.

What about the claim that my laptop is better than the iPad? There is no doubt that the screen size of an average laptop is bigger than needed to read a single document at maximum efficiency. However, by including so many features, the laptop becomes less functional for reading. Anything more than 2lb becomes unwieldy to hold for reading, and almost all netbooks are heavier than the iPad's 1.5lb.

Nature has given us a particular set of physiological parameters that Apple has recognized with the iPad. It is worth having a third device for reading, so long as it has the size, weight, screen resolution and battery life to accelerate our comprehension of new material, rather than hinder it.

Yes, the iPhone is an 'iPad mini', but as with miniature books, the fact that we can make devices small does not necessarily make them better at particular tasks.

Jan. 25th, 2010


Against Tulving's Residues: Why representations need to be physical entities


Cider and beer residue on the side of a glass. Photo credit: http://www.flickr.com/photos/livenow/2728832048/

At the heart of cognitive science is the notion of a representation: How the mind represents the world during perception and how we relate to these representations when we think.

Representations are the objects of thought, the building blocks of mental experience. Representations have been considered imagistic, propositional or sentential by philosophers going back as far as Aristotle. But only with the advent of cognitivism in the 20th century has the explanation of mind gone further than mere representation. Cognitivism supposes that representations are aggregates of neural firings and synaptic strengths that also instantiate computational algorithms and symbolic meaning. It is the syntactic and semantic structure of representation that accounts for the systematicity and productivity of thought. This explanation is unapologetically non-reductive, accepting the reality of psychological states such as beliefs and desires as much as the Ca2+ and Na+ ions that underpin them (see Fodor & Pylyshyn's classic article for more details).

As a contrast to cognitivism, connectionism claims that there are no objects of the mind. It supposes that there is no neurological underpinning to beliefs, hopes or desires, because beliefs, hopes and desires don't exist. In fact, connectionists suppose that our talk of mental images, thoughts and so on, whilst pleasurable, means nothing in terms of a scientific explanation of the mind. Instead, the only way to understand the mind is to analyze the brain biologically and to formulate our views from discoveries found mechanistically within.

I had assumed that Endel Tulving--the father of episodic memory--was a bog-standard representationalist until reading his article "Coding and representation: Searching for a home in the brain" in The Science of Memory: Concepts. I believed this because of many instances of representationalist discussion; for example, he has written that the function of episodic memory is to make propositional information available to an organism. Propositional memory has a truth value (i.e. it is discrete) and can be introspected and considered. Propositional knowledge can be acquired in a single act of perception or thought--contrasted with procedural or skill-based memory, which requires repetitious behavioural practice (Tulving, 1984, 224).

Now, in a small chapter (2007), Tulving says something that makes me question his views.

Tulving considers the nature of the residue left by experience in the mind. Residue is the term he uses by preference, but he acknowledges other frequently used terms such as 'representation', 'coding', 'engram', 'memory image' and 'memory trace' as synonyms. He claims that there is no such thing. Representations, he says, are not physical entities (67).

He defines a memory trace as:

"the neural change that accompanies a mental experience at one time (time 1) whose retention, modified or otherwise, allows the individual later (at time 2) to have mental experiences of the kind that would not have been possible in the absence of the trace. (Tulving, 2007, 66)"


Tulving makes clear that he is referring to "cognitive memory, the kind of memory that has to do with mental experience.... nothing to [do with]... skill learning, conditioning, priming and simple forms of associative learning" (2007, 66). He offers an analogy.



"...think of drawing a straight line. After you have drawn it, the line exists physically with all its properties. Then you grasp the pencil again and make the same line a bit longer. After you have done it, the 'second' line exists physically with all its properties. The difference between the two does not exist anywhere other than in your mind." (2007, 67)



Map of Paris t1

To put Tulving's analogy in context, consider how we learn more about a city. Presumably our mental map of Paris does not change entirely, instead it gets bigger and more comprehensive after we have driven about and seen how more of the roads fit together.


Map of Paris t2

Whilst this explanation might make sense of some learning, there's something problematic with Tulving's account of representation. Consider this sentence again: "the difference [between the two lines] does not exist anywhere other than in your mind". How is this explanatory? Understanding how our mind works in a physical body means figuring out how we can conceive and think about line (t1) and line (t2) in all the complex conceptual and mereological ways we can, even though our brain appears to behave more mechanistically. This is precisely the problem of connectionism as an explanation of the mind. The emergent properties of my thoughts about line (t2) include its physical properties of length and width, but also abstract properties such as its relationship to (t1) and all sorts of complex conceptual ideas about identity, numeracy etc. It's difficult to see how to move from Tulving's simplistic memory trace to the subjective capacity of thinking.

What makes cognitivism so compelling is that physical changes in the brain are simultaneously a physical object, a symbolic object and a semantic object--potentially contributing to many symbolic objects. Surely removing the physicality of representations leaves us with either mysterious dualism, where representations are individuated by some hitherto unknown mental force, or a dispositional account of mind that approaches analytical behaviorism? Tulving states that a "memory trace is something that makes something else possible" (67); what is the something else that is made possible? There is room for counterfactual analysis in mental explanation, but what enables such explanation is the existence of discrete physical entities to which we refer.

In sum, Tulving thinks he's defending an anti-reductionist account of the mind by supposing that representations aren't physical objects. But he offers nothing suitably explanatory in return.

References

Tulving, E. (1984) Elements of episodic memory. The Behavioral and Brain Sciences, 7, 223-268.

Tulving, E. (2007). Coding and representation: Searching for a home in the brain. In H. L. Roediger, Y. Dudai, & S. M. Fitzpatrick (Eds.), Science of Memory: Concepts (pp. 65-68). New York, NY: Oxford University Press.

Phylomon Project and the Problem with Educators and Game Design


Mock-up of what Phylomon might look like.

I'm intrigued by The Phylomon Project. In the year of biodiversity, the Science Creative Quarterly (SCQ) is compiling a set of cards with the key statistics of species on them, emulating Pokemon. Apparently primary school kids can remember 120+ types of Pokemon creatures, but know fewer than 50% of common wildlife species. My concern with the project is that they have missed out on the fun.


Pokemon synthetic species

Why is Pokemon fun? Isn't it something to do with creating an army that gives you power over your friends? The guy who created the game loved collecting insects as a child. But he drew on that love to create something that hooks the psychology of children even more than species diversity does: power. So, what the well-meaning folks at SCQ have done is assume that the key to the success of Pokemon is the card format, rather than the pleasurable experience and outcomes of playing.

I find this fascinating, because I'm also the sort of person who naively comes up with game design ideas. "What about a game that teaches studying skills to students by being a real-time strategy game about managing time... like Sally's Salon," I say to Morgan.

Morgan sighs (lovingly) and asks me to step back from the design and tell him what the concrete outcomes for the students should be. I stumble about offering abstract suggestions like "learn the principles of learning", or "be able to study better" or "change what they're doing when what they're doing is not working". No, Morgan says, tell him concrete outcomes. For example, a driving instructor explains to a student that their side mirror is in the right position when the tip of the back door handle is visible. He sends me away to consider what concrete outcomes I want from students and then he'll help out with a design to achieve this.

Education is obsessed with interactivity right now. Well-meaning and passionate educators are falling over themselves to design games to teach their subject material. But, they urgently need coaching on psychology, game design, memory and rigorous thinking.

Jan. 18th, 2010


Peer learning in Lectures


Slide from Eric Mazur's lecture, "Confessions of a Converted Lecturer"[.pdf]

On Thursday 14th January I attended a lecture "Confessions of a Converted Lecturer"[.pdf], by physicist and learning pioneer, Eric Mazur at the University of Queensland. Mazur says:
I thought I was a good teacher until I discovered my students were just memorizing information rather than learning to understand the material. Who was to blame? The students? The material? I will explain how I came to the agonizing conclusion that the culprit was neither of these. It was my teaching that caused students to fail! I will show how I have adjusted my approach to teaching and how it has improved my students' performance significantly.



Mazur came from a typical teaching environment where scientists communicate to students in big lecture theatres, deliver material dryly from the textbook and write examples on the blackboard. He found that students maintain their Aristotelian intuitions, even after a year of physics education. Somehow this system was not helping students understand the conceptual framework of Newtonian physics, only teaching them how to solve a narrow range of problems in a constrained format. Mazur also found that just because students enjoy a class and give terrific evaluations does not mean they have successfully absorbed the material.

He began to consider what teachers need to do to really help students learn. He decided that information transfer was no longer a key part of the lecturer’s role. Unlike in learning environments before the Gutenberg press, information today is easily available in a variety of formats and styles, with varying interactivity. Mazur shifted his energies away from presenting information in his lectures towards helping students assimilate information they had already encountered.

Mazur looked to teaching in the humanities, where lecturers expect students to have read Shakespeare before a lecture discussing the significance of the text. He began expecting students to arrive at physics lectures already having attempted to learn the material for the week. This way, instead of regurgitating information from the textbook, he was able to focus on an in-depth analysis of the concepts being introduced. Helping students configure mental models of the scientific concepts encouraged true comprehension and understanding.

However, the true turning-point came when Mazur enabled peer learning within the lecture. With the help of clickers, Mazur was able to improve conceptual understanding through student engagement. His peer instruction technique works in the following way:
1. Ask a question
2. Students think about an answer
3. Students ‘click’ their answer in.
4. Peer discussion
5. Students submit a revised answer.
6. Explanation

He found that students learnt more thoroughly by teaching each other and articulating their own perspective in class. These techniques have helped both low-achieving students and high-achieving students to improve their exam results. In addition, better conceptual understanding has enabled students to be better problem-solvers.

Further resources:
http://www.abc.net.au/rn/lifematters/stories/2009/2521800.htm

Dec. 9th, 2009


How to Win Friends and Influence People now an iPhone App: Forget Me Not


iPhone application 'Forget Me Not'

In my previous blog post I discussed Locke's CommonPlace book and information overload in the 16th century. Now, my attention turns to what we use today.

Whilst Google manages our common knowledge extremely well, it does not help us remember the most basic and arguably most important data for success: people's names.

Dale Carnegie, in his famous book, How to Win Friends and Influence People, said that "a man's name is to him the sweetest and most important sound in any language". He gives an example of a useful strategy to remember people:

Whenever he met a new acquaintance, he found out his complete name, the size of his family, the nature of his business and the color of his political opinions. He got all these facts well in mind as part of the picture, and the next time he met that man, even if it was a year later, he was able to slap him on the back, inquire after the wife and kids, and ask him about the hollyhocks in the backyard. No wonder he developed a following! [1]


The new iPhone application Forget Me Not is designed to merge Google's search functionality with Dale Carnegie's recommendation. What I find particularly interesting about this application is the clue recommendations:


Clue Helper screen capture from Forget Me Not

These recommendations suggest that the 21st-century iPhone user is just as interested in business connections as people were in 1936, but they no longer prioritize a person's core, such as their family or values (i.e. political opinions). I can't help but feel that remembering a person's sports team is going to make less of an impression than remembering the name of a person's child.

Does this reflect the nature of conversation amongst strangers today, particularly those preoccupied with mobile phones? Do we converse about our sport or club affiliations more than we discuss the controversial, global issues that face the world? What sort of question do you ask of someone you just met to make a good impression?

On the mnemonic side, taking the time to record details of the location you met someone is a great way to jog memories of the person. Combining spatio-temporal information and facial features is likely to enhance subsequent remembering.

Forget Me Not would surely be ideal to use when attending a professional conference. You could record a person's name, their field of study, their philosophical views and so forth. Though asking about their family will probably yield better networking on future occasions.

I admit that I'm struck by the degree of impersonal interaction that occurs at philosophy conferences. Fellow philosophers might not discuss Britney Spears or the Mets, but they often come up and talk to you by blathering about this or that argument without ever really engaging with you on a personal level. This goes for both sexes, although I'm sure female philosophers feel under pressure to avoid as much 'small talk' as possible to appear intelligent. This is yet another arena where social intelligence is ignored for the glory of analytic capacity.

Anyway, I've bought the application. I'll let you know how it goes. :)

UPDATE: This application focuses on what is memorable (memory-hooks), but not on what is strategic to remember about a person.

References

Carnegie, D. (1936) How to Win Friends and Influence People. New York: Simon and Schuster. p. 74.

Nov. 19th, 2009


Information overload, commonplace books and the backlash against rote memory in the 18th Century


Locke's Common-place book

Between 1500 and 1700 the amount of available knowledge increased dramatically, and the lack of an ordered system for cataloguing this information frightened scholars [1]. Initially, ideas were grouped together in notebooks by subject, but this system was supplanted by Locke's new method, which indexed memorable ideas alphabetically rather than by relevant heading. This method increased one's capacity to store ideas whilst simultaneously reducing search and retrieval time and promoting lateral thought by grouping semantically decoupled ideas.
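As I understand it, Locke's published scheme keyed each heading under its initial letter plus the first vowel that follows it. A toy version in code (the example headings and page numbers are mine):

```python
# Toy version of Locke's commonplace index: file each heading under
# its first letter plus the first vowel after it, so "Memory" ("me")
# and "Mind" ("mi") land in different cells despite sharing an initial.
from collections import defaultdict

VOWELS = "aeiou"

def locke_key(head):
    head = head.lower()
    vowel = next((c for c in head[1:] if c in VOWELS), "")
    return head[0] + vowel

index = defaultdict(list)
for head, page in [("Memory", 23), ("Mind", 41), ("Adversaria", 2)]:
    index[locke_key(head)].append((head, page))

print(dict(index))
# {'me': [('Memory', 23)], 'mi': [('Mind', 41)], 'ae': [('Adversaria', 2)]}
```

A fixed two-character grid keeps the index small enough to print in the front of a notebook, while still cutting retrieval to a single lookup.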

Locke also developed his theory of personal identity during this period. Locke suggested that, instead of the body defining the beginning and end of a person's self, identity was co-extensive with memory. Locke's psychological theory first garnered a skeptical reception because of the difficulty reconciling the mind's instability, variability and transience--especially during sleep or altered states of consciousness. However, Locke's theory of identity gradually gained currency at the same time his method of indexing grew popular.

Systematically indexing one's thoughts external to the body supposedly kept the mind clearer and better ordered. It is almost as though externally representing one's memories kept the mind safer from the gusts of consciousness and emotion, thus reducing the sting of philosophical objections against a psychological theory of personal identity.

Externalizing the storage function of memory enabled Locke to push against rote memorization as the primary focus for education.
Locke... criticized the habit to collect and memorize arguments on the grounds that it misguided the understanding, made an individual "a retainer to others" and did not grant any solid foundation to knowledge. He acknowledged that the accumulation of sentences that was "very familiar among bookish men" could bring them "to furnish themselves with the arguments they meet with pro and con in the questions they study." But he maintained that although such "arguments gathered from other men's thoughts, floating only in the memory," could supply "copious talk with some appearance of reason," they did not help scholars "to judge right nor argue strongly, but only to talk copiously on either side, without being steady and settled in their own judgments." Moreover, "the multiplying variety of arguments" cumbered the memory to no purpose. [2]


Here Locke speaks scathingly of peers who simply parrot back information that they have read. Instead he values true understanding, where critical consideration from the reader is required. It's amusing to consider that similar arguments still occur in education debates today.

I find this even more amusing given the obsession of modern psychology (post-1885, post-Ebbinghaus) with rote memory task performance and the ability to remember stimuli verbatim. Whilst remembering facts is a component of memory, the noteworthy job is interpreting the relationship of stimuli with other relevant parts of our lives. I think it may be (ironically) via false memory research that this valuable contribution of memory is finally getting empirical treatment.

Finally, this essential misunderstanding of memory as merely a storage and retrieval device gets to a core of my issue with the way contemporary epistemology deals with mental faculties. Whilst epistemologists are delighted to treat the 'imagination' as an active mental capacity with 'imaginings' as outputs, they still adopt simplistic metaphors when discussing memory. Perhaps Locke's solution to information overload can pave the foundation for a new way of conceiving memory today in a similarly vexed information environment.

References
[1] Blair, A. (2003) Reading Strategies for Coping with Information Overload ca. 1550-1700. Journal of the History of Ideas 64, 11-28

[2] Dacome, L. (2004) Noting the Mind: Commonplace Books and the Pursuit of the Self in Eighteenth-Century Britain. Journal of the History of Ideas 65, 611.

Nov. 4th, 2009


Torture the Tool of Memory

In the recent Extended Mind debate, it can be easy to forget how long discussions of 'artificial' memory have been in existence. I don't normally quote great swathes of text, but this single sentence from ca. AD 1200 expresses such a richness of content that I couldn't resist:


Three Capetian French scholars consulting an astrolabe, ca. AD 1200

I claim it as established that all books that have been written, or have existed in every region of the earth, all tools, records, inscriptions on wax tablets, epitaphs, all paintings, images, and sculptures; all crosses, of stone, iron, or wood set up at the intersections of two, three, or four roads, and those fixed on monastic houses, placed on top of churches, of houses of charity and bell towers; pillories, forks, gibbets, iron chains, and the swords of justice that are carried before princes for the sake of instilling fear; eye extractions, mutilations, and various tortures of bandits and forgers; all posts that are set up to mark out boundaries; all bell-peals, the clap of wooden tablets in Greek churches, the calls to prayer from the mosques of the Saracens; the blarings of horns and trumpets; all seals; the various dress and tokens of the religious and the dead; alphabets; the insignia of harbors, boats, travelers; inns, taverns, fisheries, nets, messengers, and various entertainers; knights' standards, the insignia of arms, and armed men; Arabic numerals, astrolabes, clocks, and the seal on a papal bull; the marks and points on knucklebones, varieties of colors, memorial knots, supports for the feet, bandages for the fingers, the lead seals in the staves of penitents; the small notches that seneschals, administrators, and stewards make in sticks when they pay out or receive household expenses; the slaps that bishops give to adults during sacramental anointings; the blows given to boys to preserve the events of history in the memories; the nods and signals of lovers; the whispers of thieves; courteous gifts and small presents--all have been devised for the purpose of supporting the weakness of natural memory.
Boncompagno da Signa, "On Memory" in The Medieval Craft of Memory edited by M. Carruthers and J.M. Ziolkowski. Philadelphia: University of Pennsylvania Press. p.111


I love the historical features of this account, such as 'eye extractions'. We know that fear, pain and difficulty are more likely to cement long-term memories than mundane affairs. Still, it's rather shocking to consider torture as a tool of memory.

As I read this I think of Sterelny's account of how human civilization has shaped its environment to suit cognitive tasks larger than the mind of a single individual.

Oct. 31st, 2009


Source Monitoring: 15 years later


The Dream, Henri Rousseau (1910), MoMA

Source monitoring was first described as a framework for understanding how people attribute the source of mental experiences in 1993 (Johnson, Hashtroudi & Lindsay; see also Johnson & Raye). The Source Monitoring Framework (SMF) has been used by many labs in the last 15 years to investigate how subjective experience affects memory judgments. The features that make up complex event memories are derived either perceptually through the senses or via thought (e.g. imagination or inference), and include:

- Perceptual information (e.g. size, taste)
- Spatial details (e.g. left or right of an object)
- Temporal details (e.g. time of day, season)
- Semantic information (e.g. gist, category membership, associated items)
- Emotional information (how we or others felt)
- Records of the cognitive operations engaged (imagining, logical inference, counterfactual consideration)

The output from these different modalities and processes combines to constitute an episodic memory (Johnson, 2006). In addition to information or details, the recollection of episodic memories often generates phenomenal experiences, such as emotions, mental images, smells or the 'sense of being there' etc.

The contrast between detail and phenomenality is loosely captured by Endel Tulving's (1985) 'remember-know' distinction. Participants can sometimes know details of a prior event without putting themselves back into the past, so to speak--without the 'mental time travel' that characterizes genuine remembering (Suddendorf & Corballis, 2007). There are a few models that try to establish how details and familiarity interact to influence a remember/know judgment of a particular mental experience.

A promising two-dimensional model positions memory details on the y-axis and familiarity on the x-axis (Rotello, Macmillan & Reeder, 2004). A person is supposed to judge a mental experience as 'remembering' when the difference between details and familiarity is minimized, and as 'knowing' when the difference between details and familiarity is pronounced. For example, I remember being a bridesmaid for my best friend because I can bring to mind many details of the event and a strong emotional conviction that I attended. However, I only know that I completed yr 12 chemistry because, whilst my familiarity is very high, my ability to pick out details of the experience is limited. Conversely, I only know the public transport system of Montreal because I can bring to mind details of the trains and buses, but cannot remember learning about them.
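Following the post's gloss of that model (the decision rule and numbers below are illustrative, not the paper's actual signal detection machinery), the three examples can be recast as points in the detail-familiarity plane:

```python
# Illustrative decision rule for the two-dimensional remember/know
# idea: "old" depends on total strength, remember vs know on how far
# apart the detail and familiarity coordinates sit. Toy numbers only.

def judge(details, familiarity, old_threshold=1.0, diff_threshold=0.4):
    if details + familiarity < old_threshold:
        return "new"                   # not recognized at all
    if abs(details - familiarity) < diff_threshold:
        return "remember"              # the two sources agree strongly
    return "know"                      # one source dominates the other

print(judge(details=0.9, familiarity=0.9))  # bridesmaid -> "remember"
print(judge(details=0.2, familiarity=0.9))  # yr 12 chemistry -> "know"
print(judge(details=0.9, familiarity=0.2))  # Montreal transport -> "know"
```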

A consistent question in the SMF is precisely how people use detail and familiarity to judge a mental experience as a memory. What evidence do we draw on to judge a particular mental event as referring to an experience of x rather than an imagining of x?

It seems that mental experiences are attributed to source categories according to assumptions about average differences in the features that characterize sources (e.g. more affective information for actually experienced events, more cognitive operations for imagined events...). It may be that we build up expectations for experiences over time and use those expectations to guide our judgments about them.
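As a toy illustration of that attribution rule (the feature profiles below are invented; the SMF itself specifies no such numbers), a mental event can be assigned to whichever source's average profile it most resembles:

```python
# Toy source attribution in the spirit of the SMF: compare an event's
# feature vector against assumed average profiles for each source.

PROFILES = {
    # (perceptual detail, affect, cognitive operations) on 0-1 scales
    "experienced": (0.8, 0.6, 0.2),
    "imagined":    (0.4, 0.4, 0.8),
}

def attribute(event):
    def distance(profile):
        return sum((e - p) ** 2 for e, p in zip(event, profile))
    return min(PROFILES, key=lambda s: distance(PROFILES[s]))

print(attribute((0.9, 0.7, 0.1)))  # vivid, low effort -> "experienced"
print(attribute((0.3, 0.4, 0.9)))  # effortful, pallid -> "imagined"
```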

References:
Johnson, M.K. (2006). Memory and reality. American Psychologist, 61, 760-771.
Johnson, M.K., Hashtroudi, S., & Lindsay, D.S. (1993). Source monitoring. Psychological Bulletin, 114, 3-28.
Johnson, M.K., & Raye, C.L. (1981). Reality monitoring. Psychological Review, 88, 67-85.
Rotello, C.M., Macmillan, N.A., & Reeder, J.A. (2004). Sum-difference theory of remembering and knowing: A two-dimensional signal detection model. Psychological Review, 111, 588-616.
Suddendorf, T., & Corballis, M.C. (2007). The evolution of foresight: What is mental time travel, and is it unique to humans? Behavioral and Brain Sciences, 30(3), 299-313.
Tulving, E. (1985). Memory and consciousness. Canadian Psychology, 26, 1-12.

Oct. 30th, 2009


Sleep, False Memories and the Posterior Analytics


Sleep

A recent Harvard experiment (Payne et al., 2009) shows that an afternoon nap selectively increases false recall of critical words semantically related to presented words [see the DRM paradigm]. This is because sleep plays an active rather than a merely passive role in memory consolidation. Upon waking, subjects remembered the gist of the presented items more easily than the specific words, which probably accounts for this effect.

This result may be because the DRM paradigm engages semantic memory more than episodic memory. Whereas semantic memory extracts semantic regularities to emphasize what memories share in common, episodic memory stores veridical details to keep memories separate.

Sleep does more than simply consolidate memories in veridical form, additionally transforming and restructuring them so that insights and abstractions can be made, inferences can be drawn, integration can occur, and emotionally salient aspects of information can be preferentially remembered over neutral aspects. (Payne et al., p.333, in-text references removed)


Sleep transitions the experience of particulars into generalities, universals and abstractions. When I read of this quality of sleep, I thought of Aristotle's Posterior Analytics and its high esteem for demonstration. I think I should read it again with thoughts of memory and of the mind's capacity to create representations of the world around it. At first blush it seems the will is not needed to forge rational demonstrations. Like Bayesian reasoning, even the unconscious and undirected parts of our minds can process information in sophisticated and defensible ways.

References

Payne, J. D., Schacter, D. L., Propper, R. E., et al. (2009). The role of sleep in false memory formation. Neurobiology of Learning and Memory, 92, 327-334.

Roediger, H. L., III, & McDermott, K. B. (1995). Creating false memories: Remembering words not presented in lists. Journal of Experimental Psychology: Learning, Memory, and Cognition, 21, 803-814.

Oct. 5th, 2009

me2009

Building robots with souls?


The Ego Tunnel by Thomas Metzinger

One of the keynote speakers at ASCS 2009 this year was Thomas Metzinger. He presented a hypothesis from his book The Ego Tunnel: that our sense of having a soul or self stems from cognitive systems for physical self-representation. These systems are activated during normal waking life and misaligned during phantom limb or out-of-body experiences. These representations, along with sensory feedback (e.g. proprioceptive, visual), create the phenomenology of embodiment and location in reality. To give the audience a sense of how these representations function, Metzinger showed videos of Qped, a starfish-shaped self-modelling robot that has a nascent sense of self.


Qped 'starfish' continuous self-modelling robot. Video

Animals sustain the ability to operate after injury by creating qualitatively different compensatory behaviors. Although such robustness would be desirable in engineered systems, most machines fail in the face of unexpected damage. We describe a robot that can recover from such change autonomously, through continuous self-modeling. A four-legged machine uses actuation-sensation relationships to indirectly infer its own structure, and it then uses this self-model to generate forward locomotion. When a leg part is removed, it adapts the self-models, leading to the generation of alternative gaits. This concept may help develop more robust machines and shed light on self-modeling in animals. Video of robot, article

When the robot has a limb cut off, it reconfigures its self-image and then learns a new gait to compensate.
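The paper's actual method infers body structure from actuation-sensation relationships; the following is only a minimal sketch of that loop, with an invented one-dimensional 'world' standing in for the robot's physics:

LEGS = range(4)

def true_response(leg, broken_leg):
    # The world: actuating a leg moves the body unless that leg is broken.
    return 0.0 if leg == broken_leg else 1.0

def predicted_response(leg, model):
    # A self-model here is just the set of legs believed to work.
    return 1.0 if leg in model else 0.0

def infer_self_model(broken_leg):
    # Candidate self-models: the intact body, or one leg missing.
    candidates = [set(LEGS)] + [set(LEGS) - {leg} for leg in LEGS]
    # Actuation-sensation data gathered by trying each leg in turn.
    data = [(leg, true_response(leg, broken_leg)) for leg in LEGS]
    def error(model):
        return sum((predicted_response(leg, model) - obs) ** 2
                   for leg, obs in data)
    return min(candidates, key=error)   # keep the best-predicting model

def plan_gait(model):
    return sorted(model)   # 'gait': cycle through the legs believed to work

model = infer_self_model(broken_leg=2)
print("inferred working legs:", sorted(model), "-> gait:", plan_gait(model))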

This self-representation is present at birth; even congenital amputees can report phantom limb pain. It is not learnt through experience, but develops in utero and presumably comes 'online' at some point during pregnancy. Perhaps this is when abortion becomes repugnant? Will self-modelling robots become the first contenders for robot rights?

Aug. 30th, 2009

me2009

Placebo & Mental Time Travel


Figure 1: Imagining a healing white light might actually precipitate the body's own capacity to deliver pain relief (e.g. opioids), but won't reduce the size of a cancerous tumour.

Placebo-activated opioids, for example, not only relieve pain; they also modulate heart rate and respiration. The neurotransmitter dopamine, when released by placebo treatment, helps improve motor function in Parkinson's patients. Mechanisms like these can elevate mood, sharpen cognitive ability, alleviate digestive disorders, relieve insomnia, and limit the secretion of stress-related hormones like insulin and cortisol.

In one study, Benedetti found that Alzheimer's patients with impaired cognitive function get less pain relief from analgesic drugs than normal volunteers do. Using advanced methods of EEG analysis, he discovered that the connections between the patients' prefrontal lobes and their opioid systems had been damaged. Healthy volunteers feel the benefit of medication plus a placebo boost. Patients who are unable to formulate ideas about the future because of cortical deficits, however, feel only the effect of the drug itself. The experiment suggests that because Alzheimer's patients don't get the benefits of anticipating the treatment, they require higher doses of painkillers to experience normal levels of relief...

...one way that placebo aids recovery is by hacking the mind's ability to predict the future. We are constantly parsing the reactions of those around us—such as the tone a doctor uses to deliver a diagnosis—to generate more-accurate estimations of our fate. One of the most powerful placebogenic triggers is watching someone else experience the benefits of an alleged drug. Researchers call these social aspects of medicine the therapeutic ritual. [Italics added] Link


This is the first time I've considered the intimate relationship between mental time travel and the placebo effect. The mental time travel hypothesis supposes that the same memory system (episodic) which allows us to vividly recollect past events is also activated when we anticipate future events. The research above suggests that patients who lack the cognitive capacity to consider the future also fail to show placebo benefits.

If this is true, then new age remedies such as visualization, meditation or hypnosis may deliver genuine benefits to patients. Simple activities such as imagining or recollecting comfort and peacefulness could activate placebogenic triggers for pain relief.

Similarly, people who struggle to construct vivid mental images, or who lack empathy for the experience of fictional characters, may also be less susceptible to placebogenic triggers.

Aug. 20th, 2009

me2009

Helen Keller, Cryptomnesia and the Many Systems of Memory

Helen Keller is famous for being a tremendously successful deaf, blind and mute writer and activist. She is also central to one of the most famous cases of cryptomnesia, a circumstance in which a person utilizes implicit information whilst experiencing no phenomenal familiarity with the content. In this case, Helen wrote the story The Frost King and was accused of plagiarizing The Frost Fairies by Margaret Canby.

In the history of analytic philosophy, there is much debate about whether a cryptomnesic experience counts as a memory. Bertrand Russell [1] thought that a real memory required the bearer to have the sense of familiarity that it was indeed a memory. He followed Hume [2], who argued that memories, as opposed to imaginings, were particularly vivid to the person experiencing them. Martin and Deutscher [3] pointed out that the necessary condition of a memory is not the phenomenal experience of it as a memory, but the correct causal connection to the perceptual event that precipitated it. Contemporary cognitive science echoes Martin and Deutscher's observation. It recognizes a variety of memory systems, some of which may be active during cryptomnesia while others--such as conscious familiarity--may be passive, making any binary classification of Helen Keller's experience insufficient to capture the complex cognitive processes occurring in her mind at the time she wrote her story.

Philosophical issues aside, here is a great video of Helen and her teacher Anne Sullivan:


Thanks to tomble for the link.

References:

[1] Russell, B. (1921). The Analysis of Mind.
[2] Hume
[3] Martin, C. B., & Deutscher, M. (1966). Remembering. The Philosophical Review, 75(2), 161-196.

Jul. 14th, 2009

me2009

The Danger of the Extended Mind

To be presented at Australasian Society for Cognitive Science (ASCS09)



TITLE: The Danger of the Extended Mind

ABSTRACT: This paper accepts three claims of the Extended Mind Hypothesis (EMH): 1) External elements form part of the machinery of cognition, insofar as they causally interact with mental states. 2) The meaning of our thoughts is partly explained by reference to the external world (content externalism) and 3) Objects outside the physical brain can operate in functionally equivalent ways to many brain-based processes. I reject criticisms of EMH by Adams & Aizawa including 1) that cognition should be restricted to the domain studied by cognitive psychologists 2) that the processes of cognition are defined by the production of intrinsic or original content. Instead of engaging with these claims, I ask: What difference does it make to include external epistemic artifacts as part of the mind? I consider philosophical issues specifically relating to EMH, thus avoiding issues relating to piecemeal replacement and functionalism generally. In one sense it does not matter whether we include external elements as part of cognitive processing. It does not matter if our beliefs are stored in our brains or on a notepad as long as we can access them when needed, just as it does not matter if a person’s leg is made of wood or flesh if it helps them walk.

However, in another sense it might matter a great deal to obviate differences between peripheral (e.g. edge detection in early vision, night vision goggles) and central processing. By ‘central processing’ I mean processes that lie at the core of mental life such as analyzing, understanding and evaluating, henceforth summarized by the term ‘thinking’. Thinking is a skill. Like any skill, thinking requires practice. A virtuoso thinker needs a strong capacity to concentrate and an excellent working memory. Thinking practice is internally generated and executed, even when influenced by a variety of inputs. If thinking is a skill, then failure to practice leads to cognitive atrophy. Increasing our peripheral access to data, whether via iPhones or iPlants, increases the availability of information and the opportunity for distraction, but not our ability to centrally process that information. Even worse, the more access we get to data, the less we bother to memorize for any particular task. This loss of mnemonic practice in turn decreases our ability to hold many ideas simultaneously, further decreasing our ability to think. This is the real danger of saying it doesn't matter whether external epistemic artifacts are included in the 'mind'. By obfuscating the difference between peripheral and central processes, we risk confusing data for thinking.

May. 28th, 2009

me2009

UQ Philosophy Women's Dinner



On Saturday night I hosted a dinner for UQ women in Philosophy. We had a whole coven turn up (13)*! It was awesome. There was enough food for forty people! Everyone seemed to have a great time and we're all looking forward to another one in the spring.


Preparing for everyone to show up. I wanted lots of candle-light for ambiance.


I discovered that tea lights look amazing in my grandmother's tea cups. The light shines right through the bone china!


Reading the dictionary by candle-light. It was a great party. I had a very interesting conversation about the rivers of Hades and the actual role of the river Styx (not as important as I'd thought!). This led to reading Robert Graves on the river Lethe, and that led to this post.

* I actually have a theory about why 13 is a good number for a non-sit-down party. 12 is divisible by 2, 3, 4 and 6, which means lots of opportunities for little groups to form and manage themselves. But with excellent division comes stability, so adding an extra person ensures more chaotic fracturing of social interactions, helping to move people around like a wooden spoon in a pot of dumplings.
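For the pedantic, the divisibility claim checks out in a couple of lines of Python:

for n in (12, 13):
    print(n, [d for d in range(2, n) if n % d == 0])
# 12 [2, 3, 4, 6]   <- many ways to split into even groups
# 13 []             <- no clean split, so groupings keep churning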

May. 20th, 2009

me2009

Social Gaming for Good Memory



...it’s very important to use your brain, to keep challenging your mind, but all mental activities may not be equal. We’re seeing some evidence that a social component may be crucial... The evidence suggests that people who spend long stretches of their days, three hours and more, engrossed in some mental activities like cards may be at reduced risk of developing dementia. Researchers are trying to tease apart cause from effect: Are they active because they are sharp, or sharp because they are active?

...So far, scientists here have found little evidence that diet or exercise affects the risk of dementia in people over 90. But some researchers argue that mental engagement — doing crossword puzzles, reading books — may delay the arrival of symptoms. And social connections, including interaction with friends, may be very important, some suspect. In isolation, a healthy human mind can go blank and quickly become disoriented...

“There is quite a bit of evidence now suggesting that the more people you have contact with, in your own home or outside, the better you do” mentally and physically, Dr. Kawas said. “Interacting with people regularly, even strangers, uses easily as much brain power as doing puzzles, and it wouldn’t surprise me if this is what it’s all about.” Link


It's fairly intuitive that people are happier and more content when they have friends and family surrounding them. However, prior to reading this article I would have considered the memory benefit of puzzle solving, crosswords etc. to be fairly independent of the social component for avoiding dementia. I also would have thought diet would play a greater role.

On the other hand, loss of social interaction probably reduces mental agility across all age groups, regardless of dementia. Because social isolation in non-demented adults is reversible, it is probably ignored when considering poor cognitive performance. However, if life decisions such as moving cities, having children etc. are a serious handicap to cognition, then perhaps we should be even more concerned about the loss of community and fragmentation in our society?

Also, the benefits of socializing might justify a moderate partying lifestyle amongst undergraduate students*.

* Although this is potentially offset by binge drinking and drug use.
me2009

The Lethe River and the Value of Forgetting


The Waters of Lethe, Thomas Benjamin Kennington

...the river of Unmindfulness, whose water no vessel can hold; of this they were all obliged to drink a certain quantity, and those who were not saved by wisdom drank more than was necessary; and each one as he drank forgot all things. Plato


The Lethe River was one of the rivers of Hades*, also known as the river of oblivion. The river functioned as a mind-wipe and was either positive or negative depending on what type of soul a person had. Heroes and virtuous folk drank from Lethe on their way to Elysium, to be freed from the sorrows and suffering of a past life, whereas mediocre souls drank from the river Lethe as punishment, so that they would not know who they were when they arrived to work, machine-like, for eternity in the Asphodel Meadows. Does this make forgetting a good or a bad thing? Wouldn't the heroes want to remember their feats of bravery and achievements, even if it meant reflecting upon the hurt and difficulty of their lives? If total amnesia were great for heroes, why was it bad for regular souls? Did the heroes retain free will or some other attribute that enabled them to be fulfilled in their sojourn through the underworld?

Forgetting is generally frowned upon. We are told 'lest we forget' regarding World War I because remembering the actions of our ancestors is the right and respectful thing to do. Learning from the past is also a moral good, in the sense that those who forget the past are doomed to repeat it, and cyclical mistakes are bad. The aim of life is to learn, retain and react differently as our experiences build and our capacities change. A good person builds their goodness by learning from their errors, not by forgetting their past and repeating mistakes. Punishments such as prison must be remembered to act as a deterrent against future crime.

However, forgetting is encouraged when reminiscing becomes too painful or disabling. Those in broken love affairs re-write the narrative of their relationship to bolster their ego and reconcile the outcome. Modern society encourages us to forget inductive evidence for stereotypes and concentrate only on the person we meet as an individual.

At some level the value of memory ties in with the problem of evil. The problem of evil struggles to explain why an omnibenevolent, omniscient and omnipotent God could allow suffering. One answer is to claim that suffering builds character. One might respond that building character does not demand the degree of suffering inflicted upon the average person in one lifetime. In the same way, forgetting might be valuable in the sense that it can reduce suffering, just as the river Lethe offered to heroes and the virtuous.

* Other rivers of Hades included: Acheron (river of woe), Cocytus (river of lamentation), Phlegethon (river of fire), Styx (river of unbreakable oath or hate)

Apr. 3rd, 2009

me2009

Dollhouse, Repressed Memory and Personal Identity



Joss Whedon has written a new TV series called Dollhouse. A brief summary is below:
Eliza Dushku plays a young woman called Echo, a member of a group of people known as "Actives" or "Dolls". The Dolls have had their personalities wiped clean so they can be imprinted with any number of new personas, including memory, muscle memory, skills, and language, for different assignments (referred to as engagements). The new persona is... an amalgam of different, existing personalities... The Actives are then hired out for particular jobs – crimes, fantasies, and the occasional good deed... In between tasks, they are mind-wiped into a child-like state... The story follows Echo, who begins, in her mind-wiped state, to become self-aware.
The first few episodes created an interesting phenomenon amongst the audience of the show. Online discussion boards were rife with eager fans of Joss Whedon's previous work (e.g. Buffy or Firefly) expressing disappointment about their own emotional detachment from the newest series. The stunts were great, the futuristic science fiction was interesting, but something was missing. What was it?

The audience struggled because they couldn't connect with an empty vessel as a protagonist. Whilst the idea of an amnesiac is intriguing, we don't care for Echo because, in an important sense, she ceased to exist the moment the Rossum Corporation wiped her mind. What, then, does it mean to exist?

Modern philosophers consider a person to exist if they are a temporally extended psychological entity bound in a physical body; e.g., a human with consciousness and connected memories. This notion ensures that zombies are not people, but also entails that there is no immaterial soul that survives after the body ceases to be.

The psychological reality of a normal person includes their sense of self and their ability to connect up with their memories consciously. The psychological model also explains why we think that a person is 'gone' if they're in the advanced stages of dementia, in a coma, or in a vegetative state.
Plot spoilers beneath cut

Mar. 23rd, 2009

me2009

Teaching Turing Machines

It can be hard to convey the beauty, simplicity and profundity of Turing Machines to introductory cognitive science students who frequently have no background (or interest) in mathematics, logic or philosophy. Today I demonstrated a Turing Machine using students to act out the various parts.

The demonstration function was 2 + 1 = 3

I lined up nine students in front of the class to be the 'tape' and one student in front to be the 'machine head'.



The Tape

Each student on the 'tape' represented either a 1 or a 0 and stood in the following pattern

...000110000...

- '1s' stood facing the class and each represented the number '1'
- '0s' stood with their backs to the class and represented the space between numbers.

The Machine Head

The 'machine head' student started in state A. State A was visually represented by the 'superman pose' (hands on hips, looking triumphant).

When the machine head switched to state B, she put her hands by her sides.

In order to 'write' on the tape, she used her hands to mechanically turn the pertinent 'tape' student around 180 degrees so that he or she now represented a 0 or 1 respectively.

After a few minutes demonstrating the machine head's repertoire of possible actions, I positioned the machine head to the far right and gave the class the following rules to solve 2 + 1 = 3

The Instructions*

1. If the machine is in state A, and reads a 0, then it stays in state A, writes a 0, and moves one square to the right.

2. If the machine is in state A, and reads a 1, then it changes to state B, writes a 1, and moves one square to the right.

3. If the machine is in state B, and reads a 0, then it changes to state A, writes a 1 and stops.

4. If the machine is in state B, and reads a 1, then it stays in state B, writes a 1, and moves one square to the right.

The Output*



Students calculated 2 + 1 = 3 using instructions 1 through 4. Each successive state of the 'tape' is shown as steps (i) through (vii) in the diagram above. The final output was:

...001110000...

This meant that three students faced the class, and the machine head stood halted in superman pose.

Students seemed to have fun, and I hope they got a better understanding of Turing Machines by acting them out physically than they would have from simply reading about them in a book.

* Instructions and output from Crane, T. (2003) Computers and Thought. The Mechanical Mind, Ch.3, 94-95.
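For anyone who wants to re-run the exercise without nine volunteers, here is my transcription of the machine in Python. One quirk: the head's 'right' runs from the rightmost written cell back toward the left of the printed string (the head faces the tape students, who face the class), which is what yields the output above.

# (state, symbol read) -> (new state, symbol to write, action)
RULES = {
    ('A', '0'): ('A', '0', 'MOVE'),  # instruction 1
    ('A', '1'): ('B', '1', 'MOVE'),  # instruction 2
    ('B', '0'): ('A', '1', 'HALT'),  # instruction 3
    ('B', '1'): ('B', '1', 'MOVE'),  # instruction 4
}

def run(tape_string):
    tape = list(tape_string)
    pos = len(tape) - 1              # the machine head starts at the far right
    state = 'A'
    while True:
        state, write, action = RULES[(state, tape[pos])]
        tape[pos] = write
        if action == 'HALT':
            return ''.join(tape)
        pos -= 1                     # the head's 'one square to the right'

print(run('000110000'))              # -> '001110000', i.e. 2 + 1 = 3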

Feb. 25th, 2009

me2009

Francis Galton and the History of X-Phi



Apart from his controversial work on eugenics, Francis Galton seems to have been a pioneer in experimental philosophy--working when the intellectual distinction between philosophy and psychology was arguably at its most diaphanous. His statistical analysis of individual differences in mental imagery (1880) shows an early interest in testing folk intuitions against the 'expertise' of introspective peers. Then, as now, the ontological status of mental imagery was a topic of inquiry. In an early volume of Mind, he explains:

I desire to define the different degrees of vividness with which different persons have the faculty of recalling familiar scenes under the form of mental pictures, and the peculiarities of the mental visions of different persons. The first questions that I put referred to the illumination, definition and colouring of the mental image, and they were framed as follows...:-

"Before addressing yourself to any of the Questions on the opposite page, think of some definite object--suppose it is your breakfast-table as you sat down to it this morning--and consider carefully the picture that rises before your mind's eye.

1. Illumination--Is the image dim or fairly clear? Is its brightness comparable to that of the actual scene?
2. Definition--Are all the objects pretty well defined at the same time, or is the place of sharpest definition at any one moment more contracted than it is in a real scene?
3. Colouring--Are the colours of the china, of the toast, bread crust, mustard, meat, parsley, or whatever may have been on the table, quite distinct and natural?" [301-302]

Galton first asked his male friends in the scientific world for their responses to his questions because "they were the most likely class of men to give accurate answers concerning this faculty of visualizing, to which novelists and poets continually allude" [302]. He offers no reason why these men would be best able to know mental imagery. Indeed, it turns out that "the great majority... protested that mental imagery was unknown to them, and they looked on me as fanciful and fantastic in supposing that the words 'mental imagery' really expressed what I believed everybody supposed them to mean" [302]. Galton describes his friends as being like colour-blind men before their perceptive deficiency was recognized.

Indeed, a critic said, "these questions presuppose assent to some sort of proposition regarding the 'mind's eye' and the 'images' which it sees... This points to some initial fallacy'... It is only by a figure of speech that I can describe my recollection of a scene as a 'mental image' which I can 'see' with my 'mind's eye'... I do not see it... anymore than a man sees the thousand lines of Sophocles which under due pressure he is ready to repeat. The memory possesses it..." [302] His friends doubted that references to imagery were anything more than metaphor.

Instead of accepting the word of his peers, Galton persisted. He asked 'the folk' (including women and children) about mental imagery and he found them very forthcoming about their inner experiences. These subjects were surprised that anyone would doubt the veracity of mental images given how vividly they experienced them.

Galton ends up completing a statistical survey of adolescent boys and adult men that yields little data of interest*; nevertheless, his method of seeking empirical results from participants unfamiliar to him might resonate with x-phi researchers.

* Galton found that young boys report a greater vividness of the colour conception than adult men.

References

Galton, F. (1880). Statistics of mental imagery. Mind, 5(19), 301-318.

Feb. 22nd, 2009

me2009

iPlant - Brain Acupuncture


Link

Would you like to keep some parts of your brain under greater control? If so, then you should consider getting brain implants that look like acupuncture needles!

This is not science fiction. Implants are currently used to inhibit activity in hyperactive brain regions.
"Many disorders, including Parkinson's, essential tremor, dystonia and obsessive compulsive disorder are characterized by hyperactive brain regions. Deep brain stimulation is replacing lesioning as standard treatment for these disorders, is EMA and FDA approved and is 'very benificial' in 80% of cases (Gritsun et al, 2006)." Link

iPlant, a company selling these devices, says: "iPlants could help a great number of people suffering from poor monoamine signaling, learning and self-control." Link

If they hinder overactive regions, would they also hinder normally functioning ones? If so, then a nemesis could destroy your identity by sticking some of these bad boys into your perfectly working memory systems. If The Prisoner were made today, the BBC would absolutely do an episode on these.

Ethics discussion on the iPlant website. It includes such gems as: iPlant-driven behaviour should only be engaged in when the user would normally be idle or engaged in destructive behaviour.

Jan. 28th, 2009

me2009

Mastering Dukkah for Peak Performance

The researchers from the University of Munster carried out the human study after results in rats suggested that memory could be boosted by a diet containing 30% fewer calories than normal.

The study volunteers, who had an average age of 60, were split into three groups - the first had a balanced diet containing the normal number of calories, the second had a similar diet but with a higher proportion of unsaturated fatty acids, such as those found in olive oil and fish. The final group were given the calorie restricted diet.

After three months, there was no difference in memory scores in the first two groups, but the 50 in the third group performed better.

They also showed other signs of physical improvement, with decreased levels of insulin and fewer signs of inflammation...

...care was taken to make sure that the volunteers, despite eating a restricted diet in terms of calories, carried on eating the right amount of vitamins and other nutrients.

...the drop in insulin levels were one plausible reason why mental performance might improve. Link
Combined with the literature on reduced-calorie diets and lifespan, this new research immediately makes me think of viticulture; i.e., wine-makers know that the best grapes grow in somewhat hostile conditions. What does this say for the human condition? Did God build suffering into the fabric of health?

Perhaps, but I find it strange that the experimental conditions offered participants the 'right amount of vitamins and other nutrients' and still called it low-calorie. Doesn't the body quickly adjust to new conditions and burn calories more efficiently? So long as the body is getting its nutrients, then doesn't a 'low-calorie diet' become just 'a diet'? Would participants waste away if they stuck to the regime, or could they continue it ad infinitum? Or, to put it more significantly, does it matter if they waste away if they live twice as long?

If the ascetic life is the right path, then the only trick left is psychological interpretation of temperance so that it is no longer perceived as a negative.

Dec. 20th, 2008

me2009

Metaphorysics

This is a poetic response I wrote in 2003 to a piece of writing called "Principia Metaphysica" by Colin McGinn. It is a bit 'in-jokey', so reading the original may make it more fun to read. On the other hand, it might work alone. I'm not sure. Comments and criticisms definitely encouraged. BTW, Colin's website seems to be causing trouble at the moment.

Metaphorysics


1. Philosophical poetry is a neglected genre
A neglected beast is either shot or nursed to health.
It depends how much glue we need.
What potential is in this sick creature?
"The point of philosophy is to start with something so simple as not to seem worth stating, and to end with something so paradoxical that no one will believe it."
Ah Bertrand.
What is the point really? What do I really think?
I'm having a tingling sensation in my concept area-by turns painful and pleasant.

2. Philosophy is a game.
A family resemblance of rules, politics and scoring
Sometimes we run naked across the field with 'fuck you' scrawled on our buttocks.
Bluntness and crudity.
Either way, you'll want to transcend the audience and perform in the spectacle.
Are the successful philosophers the ones on the team?
Or did they just court the right metaphor?
She was the most beautiful metaphor in the kingdom, with her long flowing stream of mental images and divine aesthetic.
Compare the starving artist and the starving philosopher.
Does the latter get published posthumously?
The natural history of philosophical thought is not without interest.

Dec. 10th, 2008

me2009

The Simpsons Excites and Re-excites the same Neurons

In the study, Prof. Fried observed the neural activity in the brains of 13 epilepsy patients, as the patients watched clips from TV shows like Seinfeld and The Simpsons. A short while after, the test subjects were asked to describe what they remembered from the video clips. During recall, the exact same neurons that had fired while viewing a clip fired once again while the subject was recalling it. Soon, the researchers were able to predict what clip the subjects would recall just by looking at the neurons that lit up seconds before the recall experience was vocalized. Link

This is an exciting development in short-term memory research on the hippocampus. It also offers validation for David Hume's theory of mind. Hume thought that the way we differentiate perceiving, remembering, imagining, and reasoning is by their phenomenal properties: how vividly we experience them. Specifically, he supposed that remembering was closest to perception--generally a more vivid and lively experience than imagining. It would be good for his theory if the neurons used during perception were more activated when remembering than when imagining. Even if the way we differentiate between mental processes is more complicated than Hume's picture, there still might be some reliable physical differences that help us monitor the source of our thoughts.

If Hume were part of Prof. Fried's experimental team, then he'd probably like to contrast the neural activity of imagining and remembering the Simpsons. Of course, there are very difficult methodological problems with such an experiment, as anyone imagining the Simpsons is probably accessing memories in order to furnish the imagining. Also, how could we test the imagining, remembering and perceiving of the same stimuli within subjects without conflating the results? Perhaps subjects could be asked first if they had seen a particular episode (and removed from the study if they had), then asked to imagine what will happen, then shown the video and subsequently asked to remember what they saw. If at any point subjects remembered that they had seen the episode before, then they would leave the study. Whilst this experimental design is not ideal, it might turn out that imagining produces a quantifiably different pattern of neuron firing in the hippocampus from remembering.

There has been a lot of philosophical criticism of Hume's theory on the basis that our imaginings can be more vivid than some of our memories, so vivacity can't be how we differentiate between them: e.g., imagining lying on a beach on a tropical island on a perfect day might affect us more than remembering climbing a tree at our childhood home. I have no doubt we use many different systems to evaluate our mental experiences: we reason about ideas and contrast them with other experiences we've had, and we infer the likelihood of something being true based on our theory of memory formation and retention: e.g. we know that older memories fade and that horror movies can create vivid nightmares. Nevertheless, it seems likely that one of the systems we use is connected with 'how much like perception' a mental experience is. When our mentalizing seems real, we judge it more likely to be a memory than something we just imagined. New research on neuron firing patterns seems to back up this practice.

Oct. 24th, 2008

me2009

There is no 'Eternal Sunshine' drug to selectively erase memories

The popular press is excited by the prospect of a drug designed to delete memories we don't wish to retain. The story reads like a super-villain's research proposal. It's a pity the research isn't about 'memories' as most people understand them. The current research shows that mice with boosted levels of α-CaMKII exhibit behavioral calmness when returned to a terrifying cage they'd once been inside. Their lack of fear indicates that they have forgotten a previous exposure to an electric shock. The results are extremely interesting, but they don't say much about deleting our memories.

The reason is that memory is not a unitary faculty and our memories are not the same as a mouse's reaction to shock. For example, the mechanisms by which we are able to recall Aunt Flo's wedding are not the same as those that enable us to tie our shoes or learn to avoid an electric fence. When most people think of a stereotypical 'memory', they consider highly conceptualized, detailed autobiographical memory from their own past. The fact that experimenters can break a behavioristically generated association in a mouse, doesn't shed light on our representational memory.

Strangely enough, Quine said something relevant to this in the opening pages of Word and Object. In this passage he argues that immediate experiences (e.g. sounds, electric shocks, pain) do not ground our language or our memories. We gauge meaning from our experiences by reference to physical objects, not the sense-data that we initially perceive.
...immediate experience simply will not, of itself, cohere as an autonomous domain. References to physical things are largely what hold [language] together. These references are not just inessential vestiges of the initially inter-subjective character of language, capable of being weeded out by devising an artificially subjective language for sense data. Rather they give us our main continuing access to past sense data themselves; for past sense data are mostly gone for good except as commemorated in physical posits. All we would have apart from posits and speculation are present sense data and present memories of past ones; and a memory trace of a sense datum is too meagre an affair to do much good. Actual memories mostly are traces not of past sensations but of past conceptualization or verbalization. Quine, W.V.O. (1960) Word and Object. MIT Press. p.2-3
The point of this passage (for the sake of this post) is that the meaningfulness of our autobiographical memories is abstracted from direct experience or even recollections of sensation. No doubt our strongest memories are adorned with emotion, mental imagery and recreated sensations; but their cognitive impact can be abstracted from these feelings. Remembering is not simply retrieval; it requires the re-creation of past events. We furnish these re-creations with our imagination more than directly retrieving inputs. Thus a variety of non-memory-based mechanisms (e.g. current emotional state) impact how we perceive the past.

Because I believe that the mind supervenes on the brain, I have no doubt that neurochemical research will reveal much about our mental states. But, we're a long way from knowing how to selectively create a spotless mind.

Oct. 19th, 2008

me2009

Russell Reconciles Materialism and Consciousness

Bertrand Russell wrote The Analysis of Mind (AOM) in 1921. His writings reveal influence from a variety of sources including the fledgling disciplines of both behaviorism and psychoanalysis. Behaviorism argued that the science of psychology could achieve objective results without the philosophical difficulties of consciousness or introspection. Psychoanalysis interpreted our introspective failings as a challenge to create better reflective techniques.

Russell was impressed by J.B. Watson, stating: "it is humiliating to find how terribly adequate this hypothesis turns out to be", where 'this' was the contention that "thought processes" were merely "the habit of language". Russell also acknowledged that the psychoanalysts were onto something. He states, "What, I think, is clearly established, is that a man's actions and beliefs may be wholly dominated by a desire of which he is quite unconscious, and which he indignantly repudiates when it is suggested to him."

Even though Russell was sympathetic to the notion that some of our mental actions are driven by "unconscious" desire, he argued that this was merely a "causal law" of our behaviour and not the mysterious, mythological character the psychoanalysts portray it as. Russell vividly describes the psychoanalytic unconscious as "...a sort of underground prisoner, living in a dungeon, breaking in at long intervals upon our daylight respectability with dark groans and maledictions and strange atavistic lusts".

Russell comments that while psychologists were busy finding an objective, physical basis for their subject matter, physicists, with the advent of relativity, were making their subject matter less and less material. Russell seeks to reconcile the material and the mental using the functional approach of William James and the "American new realists" (AOM, preface). "James's view is that the raw material out of which the world is built up is not of two sorts, one matter and the other mind, but that it is arranged in different patterns by its inter-relations, and that some arrangements may be called mental, while others may be called physical." (AOM, I) Russell arrives at this neutral monism by distinguishing matter from its arrangement.

However, Russell argues against the notion that the essence of everything mental is consciousness (AOM I), suggesting that some mental acts are not conscious. For example, perhaps one could have beliefs or desires that one is not conscious of? Russell says that a man could desire his lunch but not be conscious of it until he tells himself that he is hungry. Thus a desire is conscious only when we tell ourselves that we have it. (AOM I) He believes that "an "unconscious" desire is merely a causal law of our behaviour, namely, that we remain restlessly active until a certain state of affairs is realized, when we achieve temporary equilibrium. If we know beforehand what this state of affairs is, our desire is conscious; if not, unconscious. The unconscious desire is not something actually existing, but merely a tendency to a certain behaviour; it has exactly the same status as a force in dynamics." (AOM I) Russell's talk of dispositional subconscious mental states pre-dates Gilbert Ryle's similar argument in The Concept of Mind (1949).

In conclusion, whilst Russell draws inspiration from contemporary theories of mind, he is not bound by them. Instead he creates a hybrid materialist view of the mental that acknowledges the explanatory force of associationism and the poverty of our own introspections, yet includes consciousness as a legitimate object of psychology to be explained.

Oct. 17th, 2008

me2009

Hippocampus


Image from Rose, S. (1992) The Making of Memory. London: Bantam Press. p.125

Look! This is your hippocampus. Your hippocampus is where memory begins. Short-term memories are formed in this seahorse-shaped organ and then reactivated there when you dream. An electrical storm in your hippocampus, say from an epileptic fit, results in amnesia. As you age, Alzheimer's attacks your hippocampus first, creating disorientation and memory loss. Nevertheless, the hippocampus is one of an elite subset of brain regions capable of generating new neurons throughout life.

Oct. 13th, 2008

me2009

How to Fight a Rumour

FOR ANYONE WHO has ever worried about the power of a vicious rumor, Barack Obama's strategy over the summer [Fight the Smears] must have seemed almost bizarre.

...

New research into the science of rumors suggests Obama's approach may be a sounder strategy - and the reasons why it makes sense suggest that we misunderstand both how rumors work and why they exist.

By using the tools of evolutionary theory and new approaches to mathematical modeling, researchers are drawing a clearer picture of how and why rumors spread. As they do, they are finding that far from being merely idle or malicious gossip, rumor is deeply entwined with our history as a species. It serves some basic social purposes and provides a valuable window on not just what people talk to each other about, but why.

...

Our brains aren't terribly adept at distinguishing people who are "actually" important from people who simply receive a lot of attention.

...

Other than denying a rumor that's true, perhaps the biggest mistake one can make... is to adopt a "no comment" policy: Numerous studies have shown that rumors thrive in environments of uncertainty. Considering that rumors often represent a real attempt to get at the truth, the best way to fight them is to address them in as comprehensive a manner as possible... An effective rebuttal will be more than a denial - it will create a new truth, including an explanation of why the rumor exists and who is benefiting from it.

The more vivid that replacement is, the better [stealing thunder]. When done correctly and early enough in a rumor's lifetime, it can shift the subsequent conversation in beneficial ways. Link [Italics added]


Understanding social reasoning and information exchange is absolutely critical. So many people (especially the intelligentsia) mock and ridicule the habits of 'normals' talking around the water cooler about Britney or Paris. Yet they too gossip about their colleagues, sexual misconduct, political funding decisions and so forth. Mock all you like, but these exchanges build social credibility; the more negatives you know about a highly discussed community member, the more acceptance you can get in the network. Remember that peer acceptance is critical for advancement in every profession.

Social heuristics such as listening to rumours and including gossips in one's social circle cut down on the amount of social cognitive processing required by individuals, freeing them to concentrate on other concerns.

The rational strategy is to offer your superiors relevant information about individuals. Your boss must find you socially valuable to promote you. Of course, pure merit can get you quite a way along the food chain, and some professions are better than others at recognising such contributions. Nevertheless, without appreciating the role of gossip, you're unlikely to be invited to the cool kids' party.

Aug. 6th, 2008

me2009

Olfaction: Epistemic bootstrap to the external world?

On the one hand, "...olfaction is our slow sense, for it depends on messages carried not at the speed of light or of sound, but at the far statelier pace of a bypassing breeze, a pocket of air enriched with the sort of small, volatile molecules that our nasal-based odor receptors can read.

Yet, on the other hand, olfaction is our quickest sense. Whereas new signals detected from the visual system, auditory system, proprioception (body position), nociception (pain) and gustation (taste) "must first be assimilated by a structural way station called the thalamus before reaching the brain’s interpretive regions, odiferous messages barrel along dedicated pathways straight from the nose and right into the brain’s olfactory cortex, for instant processing. Importantly, the olfactory cortex is embedded within the brain’s limbic system and amygdala, where emotions are born and emotional memories stored. That’s why smells, feelings and memories become so easily and intimately entangled...

...numerous studies have shown that smell memory is long and resilient, and that the earliest odor associations we make often stick...

...while the word and visual cues elicited associations largely from subjects’ adolescence and young adulthood, the smell cues evoked thoughts of early childhood, under the age of 10. And despite the comparative antiquity of such memories, Dr. Larsson said, people described them in exceptionally rich and emotional terms, and they were much likelier to report the sudden sensation of being brought back in time...

...Dr. Larsson attributes the youthfulness of smell memories to the fact that our olfaction is the first of our senses to mature and only later cedes cognitive primacy to vision and words, while the cortical link between olfaction and emotion ensures that those early sensations keep their bloom all life long."

Link


Perception is one of the most basic epistemic sources we have. Yet most epistemologists spend their entire philosophical lives using examples from vision (e.g. apprehending objects) or audition (e.g. understanding assertions). Perhaps olfaction is the most basic of all the senses, in the sense that the signal from external stimuli is least compromised by interpretation on its path to consciousness? All other sensory inputs must survive selective processing by the thalamus and an unpredictable journey to various parts of the cerebral cortex.

Even though most people struggle to identify smells conceptually, they can recognise smells with great reliability and also trigger memories more powerfully than almost any other stimulus: internal or external. In fact, when considering how we bootstrap ourselves into the world; how we align our experience of the world with what exists; perhaps olfaction is the biggest piece of leather?

Jun. 30th, 2008

me2009

The Power of Repetition on False Beliefs

The brain does not simply gather and stockpile information as a computer's hard drive does. Facts are stored first in the hippocampus, a structure deep in the brain about the size and shape of a fat man's curled pinkie finger. But the information does not rest there. Every time we recall it, our brain writes it down again, and during this re-storage, it is also reprocessed. In time, the fact is gradually transferred to the cerebral cortex and is separated from the context in which it was originally learned. For example, you know that the capital of California is Sacramento, but you probably don't remember how you learned it.

This phenomenon, known as source amnesia, can also lead people to forget whether a statement is true. Even when a lie is presented with a disclaimer, people often later remember it as true.

With time, this misremembering gets worse. A false statement from a noncredible source that is at first not believed can gain credibility during the months it takes to reprocess memories from short-term hippocampal storage to longer-term cortical storage. As the source is forgotten, the message and its implications gain strength...

...Psychologists have suggested that legends propagate by striking an emotional chord. In the same way, ideas can spread by emotional selection, rather than by their factual merits, encouraging the persistence of falsehoods...

...Journalists and campaign workers may think they are acting to counter misinformation by pointing out that it is not true. But by repeating a false rumor, they may inadvertently make it stronger. In its concerted effort to "stop the smears," the Obama campaign may want to keep this in mind. Rather than emphasize that Obama is not a Muslim, for instance, it may be more effective to stress that he embraced Christianity as a young man.

In 1919, Justice Oliver Wendell Holmes of the Supreme Court wrote that "the best test of truth is the power of the thought to get itself accepted in the competition of the market." Holmes erroneously assumed that ideas are more likely to spread if they are honest. Our brains do not naturally obey this admirable dictum, but by better understanding the mechanisms of memory perhaps we can move closer to Holmes' ideal. Link
Epistemology prefers to examine the value of truth over other values such as moral impact, belief coherence, desire for social approval or emotional saliency. This emphasis goes hand-in-hand with treating memory as a passive mechanism, i.e., 'shit-in/shit-out'. The metaphor of memory as a simple storage and retrieval device does nothing to explain belief revision, or why false beliefs may sometimes be justified, rational or valuable to individuals, regardless of their veritistic value.

Jun. 24th, 2008

me2009

Spousal cognition

If the extended mind literature is onto something, then any two individuals with a sufficiently reliable relationship can be considered part of a single cognitive system, just as cohesive as the intra-cranial faculties that typically define a cognitive agent. For example, married couples reliably co-represent facts and utilize each other to make decisions with an advantage over individuals solving tasks alone (Berg et al., 2007). Even if one is skeptical that the mind is truly extended, the benefits of group recall, problem-solving and decision-making remain.

Now suppose that the act of retrieving memories stored in one's spouse's mind can be faster, more reliable and easier than finding the information within one's own mind. This consideration runs counter to work in epistemology on testimony (see Lackey & Sosa, 2006), which suggests that the combined act of memory and perception required for testimony renders it more difficult than acts of mere memory alone. The problem with this approach is that it assumes that complexity signals difficulty.

A priori, it seems that linguistic communication between married couples is more efficient than between less familiar dyads--hence the amusement of "The Newlywed Game". Older married couples have reliable patterns of information processing and behaviour that automate particular interactions. Combining communicative ease with the retrieval difficulties of one's own memory, the total effort required to request answers from one's spouse may be lower, both experientially and computationally, than the effort of piecing together one's own fragmented recollections.

How can this be? Well, suppose that running multiple cognitive faculties in parallel can be more efficient than running a single process serially. For example, perceiving and understanding one's spouse's linguistic utterances involves many faculties working habitually and in synchrony. Contrast this with accessing a dubious past perception, which may be perplexing and uncertain. Even if the faculties running in parallel take longer than the serial process to produce outputs, those outputs may yield strikingly different epistemic states. Depending on the context of use, the former may have an epistemic reliability that the latter does not. For example, our spouse can help us find our keys, if we are absent-minded, much faster than we can by ourselves.

Also, each year cognitive tasks that were once viewed as mind-bogglingly difficult are found to rely on fairly simple heuristics. E.g., tracking a baseball requires maintaining the angle of one's head, which is a vastly simpler task than explicitly solving parabolic functions.[1]
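That baseball rule (often called the gaze heuristic) is easy to state as a single control step. The sketch below is my own toy rendering with an invented step size, not a model taken from the literature: watch the tangent of the gaze elevation; if it is accelerating the ball will land behind you, so step back; if decelerating, step in.

def gaze_step(tans, dt, step=0.5):
    # One control step: 'tans' holds recent samples of tan(gaze elevation).
    # Positive return = step away from the batter, negative = step toward.
    if len(tans) < 3:
        return 0.0                    # not enough samples yet; stand still
    accel = (tans[-1] - 2 * tans[-2] + tans[-3]) / dt ** 2
    return step if accel > 0 else -step

print(gaze_step([0.30, 0.33, 0.37], dt=0.1))   # gaze accelerating -> back up (0.5)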

It may turn out that testimony between intimate dyads deserves a unique epistemic approach.

[1] Of course, any task which is solved through embodiment or outsourcing then gets questioned about its cognitive status. Is the mark of the cognitive only those activities involving thought?

References:

Berg, et al. (2007). Task control and cognitive abilities of self and spouse in collaboration in middle-aged and older couples. Psychology and Aging, 22(3), 420-427.

May. 22nd, 2008

me2009

The effect of collaboration on false memory reduction.

Which is better: Individual or collaborative recall?

Psychologists have found that individual recall is better than collaborative recall in typical word-list recall tasks (e.g. Finlay et al., 2000). That is, if experimenters give participants a list of unconnected words to learn, they remember more of them if they work alone than if paired into a group. Individuals rarely spontaneously introduce false positives when completing lists alone.

However, real life does not merely involve brute rote memory; it requires narrative construction. Real mnemic epistemic success requires the ability both to remember facts and to avoid false memories. Consider a significant event, such as a wedding. When recollecting details it can be difficult to avoid adding embellishments (e.g. was it really a mustard sauce or did the steak have a pepper sauce?). However, if we share memories with fellow event attendees, then narrative flourishes or mistakes can be picked up and removed (or scaled back) to maintain group accuracy (e.g. it was definitely pepper because Bob says he bit on a peppercorn and started sneezing)[1].

There is a new social false memory task that is being used to investigate the collaborative reduction of false memories.

The most widely used false memory paradigm is the DRM word list (Roediger & McDermott, 1995). In this task, normal subjects are presented with lists of words (e.g. mad, fear, hate, rage, temper, fury) that are semantic associates of a particular prototype word (e.g. anger), which is not itself studied. In subsequent tests, subjects are given lists of words that include the original list, the prototype word and unrelated words (e.g. bread). Although participants are able to separate out the semantically irrelevant words (e.g. bread), they are just as likely to recall the prototype word as the actual words from the list (i.e. they claim that 'anger' was part of the original list). Not only this, but they claim to vividly recollect seeing 'anger'.

A newer false memory paradigm tests the effect of social factors on false memory creation (Roediger et al., 2001):
Participants briefly viewed pictures of six rooms in a house and subsequently tried to recall the objects in each room... Immediately after viewing the pictures, participants were asked to remember six items from each scene. Participants took turns doing the recall with an anonymous 'other participant' who also recalled six items from the scene. The 'other participant' was actually a computer programmed to provide responses from a list of items in the photos. For three of the scenes, the computer provided the names of two items that had not appeared... Some of these false items would seem very likely to be in the target scene (high-expectancy items); other false items were less likely to be in the target scene (low-expectancy items), though not out of place...The critical recall phase occurred next. Participants were asked to remember as many items from each scene as they could. (Ross, et. al., p.86)
Collaboration on this task reduces false memory errors in both older and younger subjects (Ross et al., 2004).

False memory experiments are an institutional type of gaslighting, deliberately tricking or manipulating others' beliefs. They provide an excellent opportunity to study how social settings can yield skewed beliefs. I'd like to run the above experiment with different power structures. For example, instead of a computer program suggesting false items, a high-status individual would suggest them. I expect that false memory acceptance would increase when a powerful person suggests the items and little external verification is available.

[1] We might say that pepper sauce is the least important feature of a wedding to remember; however, that sort of fact is precisely what psychologists tend to examine in word-list tasks.

References

Finlay, F., Hitch, G. J., & Meudell, P. R. (2000). Mutual inhibition in collaborative recall: Evidence for a retrieval-based account. Journal of Experimental Psychology: Learning, Memory, and Cognition, 26, 1556-1567.

Roediger, H. L., III, & McDermott, K. B. (1995). Creating false memories: Remembering words not presented in lists. Journal of Experimental Psychology: Learning, Memory, and Cognition, 21, 803-814.

Roediger, H. L., III, Meade, M. L., & Bergman, E. T. (2001). Social contagion of memory. Psychonomic Bulletin & Review, 8, 365-371.

Ross, M., Spencer, S. J., Linardatos, L., Lam, K. C. H., & Perunovic, M. (2004). Going shopping and identifying landmarks: Does collaboration improve older people's memory? Applied Cognitive Psychology, 18, 683-696.

May. 19th, 2008

me2009

What Sorts of People Should There Be?

Cognitive scientist Rob Wilson is involved with a new collaborative project:
What Sorts of People Should There Be? is a broad, interdisciplinary, collaborative project in the humanities and social sciences that is focused on human variation, normalcy, and enhancement. By weaving together distinct philosophical, historical, and comparative threads through the establishment of a Canadian-based team of 44 researchers from 18 disciplines, this project will undertake innovative work on this topic at the interface of the humanities, biotechnology, and the social and health sciences.

Website
Blog
Syndicated LiveJournal feed

One of their sub-projects is called "From Archives to Activism: Building Inclusive Communities Through Practices of Collective Memory". Contributors include: Sue Campbell, John Sutton, John Paul Himka, Lene Koch (Scandinavian eugenics), and Joanne Faulkner (working in part on the Stolen Generations). There will be work on trauma and memory, and on collective memory and reconciliation.

May. 9th, 2008

me2009

Gaslighting: Cognitive Sabotage in the Extended Mind

AAP 2008 Abstract

Traditional cognitivist accounts of the mind focus on an individual’s symbolic mental processing. Embodied cognition and extended mind research go beyond the boundaries of the agent. These newer accounts promote biologically and ecologically situated cognition and endorse external epistemic artifacts as legitimate parts of an agent’s mental repertoire. Some work in decision-theory and collective memory also acknowledges the positive impact of group reasoning on individual cognition. Amongst the enthusiasm for collaborative thinking and collective memory, not much research considers the downsides to sharing cognitive resources. Although psychology warns against accidental groupthink, few write of a more insidious threat to the extended mind: cognitive sabotage. Cognitive sabotage or ‘gaslighting’ is the deliberate creation of false beliefs. Saboteurs lie or distort group information to serve selfish ends. For example, abusive husbands manipulate their wives, mothers enact Munchausen's by Proxy on their children and psychiatrists misdiagnose patients as delusional when in fact they speak the truth (Martha Mitchell Effect). Outsourcing memory and cognition may ease an agent’s informational burden, but it also increases their exposure to epistemic threats. By assuming a middle ground between cognitivism and embodied cognition, I defend a model of gaslighting that extends representational explanations of individual decision-making to an account of group cognition. I argue that the unreliable and deceptive aspects of collaborative reasoning are epistemic hazards analogous to the self-deception and flawed reasoning faced by individuals alone.

May. 6th, 2008

me2009

Remembering Beliefs

CogSci 08 Submission

Remembering Beliefs
Optimal decision-making requires us to accurately pin-point the basis of our thoughts, e.g. whether they originate from our memory or our imagination. This paper argues that the phenomenal qualities of our subjective experience provide permissible evidence to revise beliefs, particularly as it pertains to memory. I look to the source monitoring literature to reconcile circumstances where mnemic beliefs and mnemic qualia conflict. By separating the experience of remembering from biological facts of memory, unusual cases make sense, such as memory qualia without memory (e.g. déjà vu, false memories) or a failure to have memory qualia with memory (e.g. functional amnesia, unintentional plagiarism). I argue that a pragmatic, probabilistic approach to belief revision is a way to rationally incorporate information from conscious experience, whilst acknowledging its inherent difficulties as an epistemic source. I conclude with a Bayesian defense of source monitoring based on C.I. Lewis' coherence argument for memorial knowledge. PDF .txt

Apr. 28th, 2008

me2009

Catching the bus



When catching a series of unfamiliar buses to an unfamiliar part of town, we can find ourselves at the mercy of printed maps and scheduling keys. The experienced traveller senses that they should leave ample time on a first journey for unexpected contingencies, yet still can be derailed when things go wrong.

There is an interesting phenomenon whereby, if something strange happens during the first portion of the journey, we question the reliability of the rest of the information in a disproportionate way. For example, suppose we get off at a bus stop not listed in the schedule. This can be sufficiently disconcerting that we anxiously scrutinize the available information at the next bus stop to ensure that our next bus actually stops there. Upon finding no information about our bus, we might discard our printed plans altogether and find an entirely new route based on nothing but information available at street level. Of course, then the second bus arrives, and we miss it, because we're too busy investigating alternative routes at other bus stops.

What explains this radical departure from rational behaviour? Surely the peculiarities of our journey are due to human error rather than incorrect schedules? It seems that the emotional impact of two unexpected outcomes is so great that we diverge vehemently from a likely hypothesis in favour of a less substantiated theory.

I'd like to do more work on the role of emotions in rational decision-making. Bayesian belief revision gives us a guideline for how we should revise belief in light of evidence, but perhaps it doesn't explain how we pick between two equivalently believable hypotheses (instrumentalism) or why we shift dramatically to an untested theory. Instead, something emotional or aesthetic comes into play. The emotions might offer impetus to make radical hypothesis changes regardless of sensible evidence analysis. Emotions create cognitive mutations: in themselves dangerous, but in conjunction with rational decision-making, lateral genius.
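To see how steep a purely rational collapse of trust can be, here is a toy Bayesian update of confidence in the printed schedule; every number below is an assumption chosen for illustration, not data.

# Toy Bayesian update of trust in a printed bus schedule.
# All probabilities are illustrative assumptions.
prior_reliable = 0.95          # initial trust in the printed schedule
p_anomaly_if_reliable = 0.15   # an odd stop despite a correct schedule (human error)
p_anomaly_if_unreliable = 0.50 # an odd stop because the schedule is wrong

belief = prior_reliable
for i in (1, 2):  # two unexpected outcomes in a row
    numerator = p_anomaly_if_reliable * belief
    belief = numerator / (numerator + p_anomaly_if_unreliable * (1 - belief))
    print(f"after anomaly {i}: P(schedule reliable) = {belief:.2f}")
# after anomaly 1: P(schedule reliable) = 0.85
# after anomaly 2: P(schedule reliable) = 0.63

On these numbers the schedule remains the more probable hypothesis even after two anomalies, yet we abandon it wholesale for an untested street-level theory. That gap between the rational update and the actual behaviour is exactly where emotion seems to do the work.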

Apr. 21st, 2008

me2009

Could déjà vu be explained by grid cells?

This is a great question, because grid cells, which are involved in processing spatial information about our surroundings, are located in a brain region that is part of a larger memory system thought to be responsible for the feeling of familiarity. After considering their function in detail, however, I think it seems more likely that a different system of neurons, place cells, plays a stronger role in providing us with the sense that a new locale is familiar—a feeling called “déjà visité.”

In any environment, the brain must keep track of the distinct locations within the surrounding area (say, at the kitchen table versus in front of the refrigerator). It also must note how these different locales relate to one another (the table is three feet to the right of the fridge, for instance). Place cells are involved in the former type of processing; each place cell corresponds to a specific location in an environment and fires when you pass through that spot.

In contrast, grid cells work in a network to produce a kind of internal coordinate system, noting information about distance and direction. These neurons do not correspond to a specific location but become active across several regularly spaced points in any setting. The geometric arrangement of these cells, relative to one another and to the external setting, ultimately helps us form a mental map of a certain environment.

Grid cells are located in the entorhinal cortex, a brain region that processes information before sending it to the hippocampus, the area where place cells are located. Because we know that place cells have a unique firing pattern for nearly every experience, it is likely that the hippocampus, and not primarily the entorhinal cortex, decides whether a location is novel or being revisited. When a strange place is experienced as familiar, it may be because the activated ensemble of place cells at that location happens to be similar to a pattern of activity that was elicited by a previous locale. Link

Is our sense of familiarity more tied to visual modalities than others? Consider the overwhelming familiarity aroused through smell. The key question is not which sense modalities yield familiarity, so much as which yield false familiarity and why. As far as I know, I've never had an olfactory sense of déjà vu.

However, I think I have experienced déjà vu sparked by conversational content rather than visual scene. Considering the discussion above, perhaps the sense of familiarity is neurochemically triggered because I'm in some particular location and not strictly because of auditory or conceptual features of the conversation?

Déjà vu might reveal yet another failure of our introspective capacity to pinpoint the causal underpinnings of our conscious experience.

Apr. 1st, 2008

me2009

Emotion Makes Nose a Sharper Smeller

Know how a whiff of certain odors can take you back in time, either to a great memory or bad one? It turns out emotion plays an even bigger role with the nose, and that your sense of smell actually can sharpen when something bad happens.

Northwestern University researchers proved the surprising connection by giving volunteers electric shocks while they sniffed novel odors.

The discovery, reported in Friday's edition of the journal Science, helps explain how our senses can steer us clear of danger. More intriguing, it could shed light on disorders such as post-traumatic stress syndrome.

"This is an incredibly unique study," said Dr. David Zald, a Vanderbilt University neuroscientist who studies how the brain handles sensory and emotional learning. "We're talking about a change in our perceptual abilities based on emotional learning."

Scientists long have known of a strong link between the sense of smell and emotion. A certain perfume or scent of baking pie, for instance, can raise memories of a long-dead loved one. Conversely, a whiff of diesel fuel might trigger a flashback for a soldier suffering PTSD.

Could an emotionally charged situation make that initial cue be perceived more strongly in the first place?

The research team recruited 12 healthy young adults to find out.

Volunteers repeatedly smelled sets of laboratory chemicals with odors distinctly different from ones in everyday life. An "oily grassy" smell is the best description that lead researcher Wen Li, a Northwestern postdoctoral fellow in neuroscience, could give.

Two of the bottles in a set contained the same substance and the third had a mirror image of it, meaning its odor normally would be indistinguishable. By chance, the volunteers correctly guessed the odd odor about one-third of the time.

Then Li gave the volunteers mild electric shocks while they smelled just the odd chemical. In later smell tests, they could correctly pick out the odd odor 70 percent of the time.

MRI scans showed the improvement was more than coincidence. There were changes in how the brain's main olfactory region stored the odor information, essentially better imprinting the shock-linked scent so it could be distinguished more quickly from a similar odor.

In other words, the brain seems to have a mechanism to sniff out threats.

That is almost certainly a survival trait, evolved to help humans rapidly and subconsciously pick a dangerous odor from the sea of scents constantly surrounding us, Li said. Today, that might mean someone who has been through a kitchen fire can tell immediately if a whiff of smoke has that greasy undertone or simply comes from the fireplace.

But the MRI scans found the brain's emotional regions did not better discriminate among the different odors, Li noted. That discrepancy between brain regions is where anxiety disorders may come in. If someone's olfactory region does not distinguish a dangerous odor signal from a similar one, the brain's emotional fight-or-flight region can overreact.

Researchers say that is a theory not yet tested.

For now, Northwestern neuroscientist Jay Gottfried, the study's senior author, says the work illuminates a sense that society too often gives short shrift.

"People really dismiss the sense of smell," said Gottfried, who researches "how the brain can put together perceptions of hundreds of thousands of different smells. ... Work like this really says that the human sense of smell has much more capacity than people usually give it credit."

Link

Smell is a much under-studied faculty. It is difficult to philosophize about, in part because we lack a suitably rich vocabulary to differentiate our experiences. If I'm right that a lack of olfactory conceptual apparatus is a hindrance to research, then I wonder whether our understanding of music and cognition is aided by Western music theory.

Nevertheless, smell is a vital component of our episodic memory. Smells trigger emotions, mental images and thoughts that we identify with a past event. The familiarity of a particular smell gives us a lot of confidence in our memories. In my coherence theory of memory, perhaps smell should get a weighted value as a piece of evidence against other sorts of phenomenology?

Mar. 20th, 2008

me2009

Malleable and Immutable mental faculties.

Some cognitive processes are cognitively impenetrable; for example, early visual processing produces the Müller-Lyer perceptual illusion reliably, without any impact from contrary beliefs. It simply does not matter how much one wishes to avoid experiencing the discrepancy between what we know of the lines (that they are equal in length) and how they look (one appears longer than the other); the dissonance is unavoidable. On the other hand, the content of other cognitive processes is susceptible to manipulation. For example, one's feeling of pain can be reduced through hypnotic suggestion, and our desires can be altered with new beliefs. In this paper I wish to consider a third variant of cognitive process: malleable faculties. What is a malleable faculty? A malleable faculty is one where the architecture of the faculty itself (not merely its content) can be modified through the will and action of the agent.

The first malleable faculty I considered was memory as it pertains to mnemonics. The ancient Greeks considered a mnemonic education essential to any citizen's intellectual development. They knew that one's intellectual capacities to remember were vastly plastic and required constant discipline to maintain high performance. There is no absolute structure that underpins everyone's memory the way there is for early vision. Some people think in pictures, others in words and some kinaesthetically*. But, more importantly, an individual can alter the architecture of their memory, either by choice or through externally generated environmental cues. Our ability to influence memory begins with attention: the way we see the world during perception. We can choose to pay attention to particular features of our world, which will in turn affect what we retain at a later time. We can construct mnemonic schemes to sort incoming information into more easily retrievable chunks. If we have poor spatial or imagistic recall, we can practise to improve these areas. Like the muscles of the body, our memory is incredibly malleable. Now, of course, we will not all have infinite capacities for change. Just as men tend to have an easier time building muscle mass than women, some individuals will be innately disposed for greater mnemic potential. However, within certain biological constraints there is a vast area for work and improvement.

Another malleable faculty is introspection. Many ancient cultures have argued that through meditation we can come to know our thoughts, mental states and feelings with greater clarity. The details of this process I will have to expand on at a later time.

Malleable faculties are particularly interesting from an epistemic point of view because they reveal that we can become more reliable, more knowledgeable through greater technique. Unlike static epistemic issues in perception such as blindspots, change-blindness or visual illusions, we have more control over the errors of our memories and introspection. With mental exercise we can navigate the world less prone to error. Of course, this does not mean we can avoid being tricked altogether. Memory is an inherently fallible faculty, so no amount of training can entirely eradicate error.

If memory and introspection are malleable, what sort of faculties are not open to conscious will? I argue that all the cognitively impenetrable candidates are likely immutable, e.g. early visual processing. But perhaps also our faculty of desire and our faculty of belief. For, although we can alter the content of what we believe and what we desire, we cannot change the architecture that manipulates these propositions. Perhaps this is unfair on belief? Perhaps we can become more rational and more logical, and thus more reliable at forming true beliefs or maintaining skepticism when appropriate? But surely this is not the faculty of belief, but the faculty of reason, of thought itself? Reason may be another malleable faculty.

* We should differentiate the way we introspect our mental states from the structure of the representations underlying the phenomenology. For example, when imagining Paris we might say, "I see the Eiffel Tower", yet have no mental imagery whatsoever. Our use of metaphor in speech (e.g. I see what you mean) is so pervasive and subtle that we must be careful before attributing significance to it. Nevertheless, there do seem to be different ways of experiencing mental states, and they can involve physical differences such as emotions or bodily sensations, auditory or visual experiences.

TBC

Feb. 21st, 2008

me2009

Dissertation Statement

Summary: I defend memory qualia using a Bayesian calculus to explain how the subjective feeling of remembering contributes to cognition and knowledge.


Dissertation Statement

Optimized decision-making requires us to track the origins of our mental experiences as accurately as possible. While humans are quite reliable at distinguishing, say, memories from imaginings, we are also susceptible to false memories and psychogenic amnesias. People can be manipulated into believing false childhood memories that were implanted by an experimenter, or suggested by a person in authority. In these circumstances subjects not only find fictions familiar and have strong beliefs about their veracity, but they also claim to re-experience vividly the details of their prior occurrence. Conversely, psychogenic amnesias have been reported in patients with multiple personality disorder, dissociative fugue or post-traumatic stress disorder subjects—e.g. rape victims. Individuals in these situations respond to stimuli connected to an instigating event without any familiarity or sense of connection to this past. In light of the evidence that subjective judgment is a poor guide to the true origin of our thoughts, one might wonder whether consciousness has any functional use in cognition at all. Perhaps consciousness is epiphenomenal? This is the predominant view in cognitive science.

In my dissertation I argue that consciousness is not epiphenomenal. I elucidate how the subjective experience of memory (mnemic qualia) contributes to cognition, knowledge, planning and decision-making. I begin by examining the subjective experience itself via the contributions of Aristotle, Hume, James and Russell. Hume noted that unlike imaginings, memories seem more vivid or convincing. People often justify assertions by examining the quality of the mental experience, such as the level of detail in their mental images, the degree of emotional salience or even a sense of ‘being there’. People offer factual details of events they purport to remember. Thus, memories are distinguished from imaginings partially by the way they feel and partially because of background beliefs. Inspired by the American pragmatists, I outline how the experiential and representational can be reconciled within a representational theory of mind.

I go on to integrate my philosophical defense of qualia with the source-monitoring literature from psychology. Successful source monitoring is an inferential process that requires people to examine and categorize their mental state based on qualitative features of the experience itself and coherence with other beliefs. The inferential contribution is evident in cases of ‘déjà vu’, where we experience the feeling of memory but rationalize that we are not remembering.

The coherence view in the source monitoring literature in psychology is supported by a Bayesian account of belief. This view is that the congruence between independently generated beliefs can raise the probability of what is remembered to the level of practical certainty in a way analogous to that in which agreement of independently given testimonies can eventually convince us that what is being testified is true. The theory works on the basis that there is initial credibility (i.e. a non-zero prior probability) for the memory in question. Coherence increases the posterior probability that x occurred with the number of consistent beliefs. However, the coherence of independent items of evidence has no impact on the probability of a conclusion unless each item has some credibility of its own; for example, a person with poor vision would be unwise to treat mental images from an event as seriously as their auditory memories.
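To make the mechanics explicit, here is a minimal formal sketch in my own notation (not Lewis's), assuming the items of evidence are independent conditional on the hypothesis and on its negation. Let $H$ be the proposition that the remembered event occurred, and let $E_1, \ldots, E_n$ be the independently generated items (beliefs, testimonies), each with some credibility of its own, i.e. a likelihood ratio $r = P(E_i \mid H) / P(E_i \mid \neg H) > 1$. Then the posterior odds are

$$\frac{P(H \mid E_1, \ldots, E_n)}{P(\neg H \mid E_1, \ldots, E_n)} = \frac{P(H)}{P(\neg H)} \prod_{i=1}^{n} \frac{P(E_i \mid H)}{P(E_i \mid \neg H)} = \frac{P(H)}{P(\neg H)}\, r^{n}.$$

Provided $P(H) > 0$, the odds grow geometrically with each concordant item and approach practical certainty; if $P(H) = 0$ they remain zero no matter how coherent the evidence, and an item with $r \le 1$ (no credibility of its own) adds nothing to the product.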

The Bayesian account explains how normal memories are successfully segregated from other mental phenomena. But, perhaps more impressively, it sheds light on circumstances when source-monitoring fails. Because we must already begin with a degree of belief in a particular memory, Bayesianism explains why psychogenic amnesia patients have no capacity to revise beliefs. It also shows how individuals who are unusually gifted at creating cross-modal phenomenology are particularly prone to false memories. Individuals who are fantasy-prone or hypnotically suggestible are wise to remain skeptical of their qualia, because coherent subjective experiences are too easily constructed by their imaginations.

I conclude that regardless of the functional underpinnings of our cognitive architecture, consciousness impacts our reasoning and this is rationally explained by combining the empirically informed source-monitoring literature and a Bayesian probability calculus.

Feb. 20th, 2008

me2009

C. I. Lewis

The term 'qualia' was coined by C.I. Lewis in 1929 in Mind and the World Order. In 1946, in An Analysis of Knowledge and Valuation, Lewis defended a Bayesian account of memorial knowledge. It seems that my entire dissertation is being inspired by a philosopher I know almost nothing about.

Today I have sought him out.
While many saw Lewis as kin to the logical empiricists, he was never truly comfortable in such company because he declined to divorce experience from cognition. Positivism rejected value as lacking cognitive significance, also rejecting the analysis of experience in favor of physicalism. Both rejections struck him as regrettable. Indeed his growing awareness of the pragmatic tradition led him in the opposite direction. For Lewis, it is only within experience that anything can have significance for anything, and thus he came to see value as a way of representing the significance of knowledge for future conduct. These convictions led him to reflect on the differences between pragmatism and positivism, and on the cognitive structure of value experiences.
Link


I have great fondness for the American pragmatists and didn't realize how Lewis fit into this tradition. It seems he was also excited by the role of subjective experience in cognition.

In my searches, I found a memorial Lewis wrote for the Journal of Philosophy in 1954 (Vol. 51) for George Santayana--a man of letters famous for the quote: "Those who cannot remember the past are condemned to repeat it". Here is a section from the memorial:
The external impression was somewhat striking; and it remains vivid with me. I can still see him as he came through that gate out there on Quincy Street; a taller than average figure, erect and well set up, walking with easy gait like that of a man who has sometime learned to march. He appeared observant of whatever went on about him, but not engaged with it-a little aloof perhaps, as if his thoughts were elsewhere. He wore a longish military cape, instead of an overcoat, coming over from his rooms, and I can see him as he swung it off at the door. The complexion was a little darker than the average, indicative of the Spanish strain in his inheritance; and the eyes at once drew notice. The features and general presence were such as I can only suggest by the word "aristocratic."(p.29)

I love the nested sorts of remembering involved in a memorial of a man famous for valuing memory. I am intrigued by Lewis' use of the word 'vivid' and the precise description he provides of the professor. The description suggests that Lewis is not remembering a single event, but has averaged out a set of experiences to produce a distilled essence of experience.

It makes me think of the usual differentiation of episodic memory into either abstract/conceptual or event-specific knowledge (ESK). Abstract/conceptual episodic memory involves explicit memory for facts about events in a person's life, e.g. "when I was at high school I studied chemistry and once I performed a perfect titration". On the other hand, ESK involves near-sensory experiences (e.g. mental imagery, smells, tastes) and emotions (joy, fear, sorrow, a sense of significance). Lewis' recollection seems to incorporate both abstract/conceptual episodic memory and ESK. That is, he constructs a near-sensory recollection fabricated from many exposures.

We all do this frequently, e.g., when we imagine eating a lox and cream cheese bagel, we recollect the feelings from many instances.

Feb. 2nd, 2008

me2009

SPP 08 submission

Society for Philosophy and Psychology Submission

Title: Remembering Beliefs
Abstract:
Optimal decision-making requires us to accurately introspect the origins of our mental experiences. In this paper I examine how different types of remembering impact belief. Remembering can be implicit—either representational or procedural—and remembering can be explicit—either semantic or episodic. The minimal conditions for remembering are a causal connection to the learning event and the retention and subsequent impact of this learnt material on behavior, regardless of our conscious awareness or attribution. The minimal conditions for recollection require remembering, belief and mnemic qualia. Whenever we have conscious influence over our memory attributions, it is in our interests to efficiently source their origins. I look to the source-monitoring literature to reconcile circumstances where beliefs and qualia conflict. By separating the experience of remembering from biological facts of memory, unusual cases make sense, such as memory qualia without memory (e.g. déjà vu, false memories) or a failure to have memory qualia with memory (e.g. functional amnesia, unintentional plagiarism). I conclude with a Bayesian defense of source-monitoring based on C.I. Lewis' coherence argument for memorial knowledge.

Jan. 16th, 2008

me2009

J.K. Rowling describes a false memory

“The day of [my sister's] birth is my earliest memory, or my earliest datable memory, anyway. I distinctly remember playing with a bit of plasticine in the kitchen while my father rushed in and out of the room, hurrying backwards and forwards to my mother, who was giving birth in their bedroom. I know I didn't invent this memory because I checked the details later with my mother. I also have a vivid mental picture of walking into their bedroom a little while later, hand in hand with my father, and seeing my mother lying in bed in her nightdress next to my beaming sister, who is stark naked with a full head of hair and looks about five years old. Although I clearly pasted together this bizarre false memory out of bits of hearsay when I was a child, it is so vivid that it still comes to mind if I ever think about Di being born.”

From J.K. Rowling's biography.

Above is a lovely example of how an author examines her own memories. There are two good components here:

1) Her 'earliest memory' (dated at 23 months) may be an accurate rendition of facts from her past, but is it truly a memory? Details 'remembered' from two years of age are easily implanted by parents eager to create a shared narrative with their adult children. Individual memories can be forged through discussion within a group, creating and enhancing collective memory. In this case, the birth of Rowling's younger sister is an important and bonding event for the entire family.

Compare this example with your own 'memory' of the attacks on the World Trade Center. How clearly do you remember your own actual experiences as they unfolded at the time? How sure are you that you haven't altered your narrative to fit the social group you communicate with, relying on details drawn from testimony and hearsay rather than personal experience?

There has been an effort in the 20th century to define what counts as a memory more rigorously. One of Martin & Deutscher's (1966) suggestions is a causal criterion for remembering.

Causal Criterion

"to remember an event, a person must not only represent and have experienced it, but also his experience of it must have been operative in producing a state or successive states in him finally operative in producing his representation." (p. 173) They explain:
"A person has an apparent recollection of something from early childhood, and wonders whether he really remembers it. His parents may tell him that what he describes did happen, and that he witnessed it, but the discussion of whether he remembers it still goes on. They wonder whether his witnessing of the event has any connection with his now giving the story or whether his description can be completely explained by what he heard later." (p.176) [my italics]

The point for Rowling is that even though she may have been in the kitchen playing with plasticine when her sister was born, it is less clear that she can actually remember the event when she recounts it. This is even more questionable when we look to the second component of the memory.

2) Rowling explains that she has a vivid false memory of her newborn sister looking like a five-year-old. She knows this must be false because of its absurdity; nevertheless, her experience of it is very strong. It is precisely this sort of experience that I'm investigating in my PhD. One of the ideas I'm studying is individual differences in false memory susceptibility. If Rowling has a powerful faculty of mental imagery, then--based on the causal criterion--she should be even more careful about any recollections, even when facts support her stories.

I'd like to do an experiment where I study a statistically significant sample of autobiographies and then run false memory tasks on their protagonists. It would be interesting to compare the way people report on their memories with their actual performance in a laboratory setting.

Source

Martin, C. B., & Deutscher, M. (1966). Remembering. The Philosophical Review, 75(2), 161-196.

Jan. 10th, 2008

me2009

Imagination article for the Encyclopedia of American Philosophy

This entry on imagination appears in the Encyclopedia of American Philosophy

Imagination

Imagination is a capacity of internal visualization, concept creation and manipulation not directly dependent upon sensation. Imagination is associated with a range of phenomena: mental imagery, fancy, inventiveness, insight, counterfactual reasoning, pretence, simulation and conceivability. Products of the imagination are sometimes considered false or fantastical, e.g. the phrase, ‘I must have imagined it’ is applied to explain a mistake in conversation. The term ‘imagine’ is also used in countless idiomatic ways that do not imply any use of an imaginative faculty; for example, a person might exclaim ‘imagine that’ to an unusual news item.

In the history of western philosophy most connotations of the term ‘imagination’ are a response to the notion of mental imagery and imagistic representation. Philosophers ask how mental ‘pictures’ could be ‘in the head’ and how those ‘pictures’ can have intentionality. The association of imagination with inventiveness has yielded epistemological concern for imagined ideas, including what kind of knowledge stems from the imagination and whether conceivability impacts possibility. Contemporary cognitive science is focused on the cognitive architecture underlying imaginative thought and suggests connections between imagination, counterfactual reasoning and understanding other people’s mental states. This approach has impacted broader philosophical areas such as ethics, where the imagination plays a role in invoking moral responsibility through simulation, role-play and empathy. Non-visual sense modalities (e.g. hearing) have received much less attention in the imagination literature (Currie and Ravenscroft, 2003). Because of the range of uses of the term ‘imagination’ it is illuminating to look at the history of the faculty of mental imagery.
