Tuesday, April 10, 2012

Nietzsche: The Wrong Case Study

My own effort:


Nietzsche: The Wrong Case Study

By

Dr. Charles Stanford, CCBT

©2003



Introduction

          As Alice Miller (1990) returned to re-read Nietzsche's works after thirty years, so did I as a result of her essay.  Ironically enough, it reminded me of the opening of Nietzsche's Zarathustra: "Als Zarathustra dreißig Jahre alt war, verließ er seine Heimat und den See seiner Heimat und ging in das Gebirge." (Z, 303) ("When Zarathustra was thirty years old, he left his homeland, and the lake of his homeland, and went into the mountains."  This will be my last reference to Zarathustra for reasons mentioned in the bibliographical note.)  Both Miller and I were surprised by what we learned in those mountains, but we learned quite different things.  However, I am thankful to her for prodding me to re-read what may be the most powerful and influential writer for the twentieth century.

          What is most frustrating about Miller's essay is not its apparent effect of reducing the writings to objects of pity, but her tactic of dismissing beforehand any disagreement as a character defect in the reader.  "Experts," those who cannot face the facts, those who do not take seriously the situation of the child, will disagree with her.  Therefore, those who do not accept her "proof" ignore the plight of children. 

          Her explanation of his misogyny is characteristically biographical, but using the same approach we can see that he was ahead of his time and recognized potential greatness in specific women -- those who do not allow their roles in life to be prescribed by men. 

          Basically, the disagreement can be reduced to a simple question:  Who has better insight into Nietzsche's work -- Miller or Nietzsche?  If we argue that perhaps Nietzsche did not have sufficient psychological insight to recognize the effects of his childhood in accordance with Miller's interpretations, we must either dismiss Freud's (and later Frankl's) view that Nietzsche knew himself better than any other man in history -- concluding that Nietzsche knew little about psychology -- or assume that Miller knows more.  Even his own autobiography is at odds with her interpretations, especially as they relate to his relationship with Wagner. 

          There is one other possibility:  Miller is not writing about Nietzsche at all.  She is writing about child abuse and its consequences and Nietzsche is "handy" for her.  She thus would be employing a tactic that Nietzsche himself used -- using a person as a symbol for something, as a magnifying glass.  If I knew nothing about Nietzsche, I would have been convinced by her arguments.  Knowing a bit about him proved a considerable handicap in understanding the essay.  As it is, Miller simply chose the wrong figure for a case study to "prove" her thesis, at least to one reader.



Alice Miller

          Alice Miller, in "Friedrich Nietzsche: The Struggle Against the Truth," The Untouched Key: Tracing Childhood Trauma in Creativity and Destructiveness (New York: Anchor Books, 1990), 73-133, writes with a warm, caring feeling for "little Fritz," but does little to enlighten any reader on Nietzsche.  In fact, she takes the corpus of his works (which she has at least read and understood on a basic level) and reduces it to the helpless rantings of a wounded child.  While the years have seen a number of attacks on Nietzsche, none is so subtly pernicious.  It does nothing to advance scholarship on Nietzsche (except perhaps in reaction) and could have the effect of misrepresenting him to those who have only heard of him or who have perhaps read a few passages here and there.  I believe Nietzsche might have reacted to her essay by calling it an example of "the eternal feminine."  Just as today the intended humor of that last sentence would most likely be overlooked in a reaction to perceived misogyny, so the humor and light spirit of Nietzsche is overlooked by Miller throughout in favor of "understanding" poor little Fritz.

          She points out early on (74) that anyone who is an "expert" on the subject of Nietzsche will not agree with her because "they were not capable of recognizing such forbidden knowledge," never realizing, unlike her, "how strongly I was clinging to my childhood idealization of my parents" (74).  Therefore, those readers who do not recognize the truth of what she is saying are simply idealizing their parents and are thus intellectually handicapped.  She writes for people "who can face the facts.  They need not be experts...." (74)  Because of his early childhood, Nietzsche wrote in such a way that the Nazis could distort him -- the corollary, of course, is that anyone the Nazis distorted had an unhappy childhood.  This includes Christ, Luther, Beethoven, Wagner, Jack London, Bernard Shaw, etc.  Perhaps all of these writers had an unhappy childhood, but Shaw, at least, constantly said that his was a happy one.  Miller "witnessed the way the deadly marching of the National Socialists in the thirties and forties was indirectly spurred on by Nietzsche's words ...." (75)  Could we say, using the same approach Miller applies to Nietzsche, that her reading of Nietzsche is therefore affected by PTSD?

          But she covers her bases.  She knows ahead of time that "my thesis that Nietzsche's works reflect the unlived feelings, needs, and tragedy of his childhood will probably meet with great resistance."  However, "my thesis is correct nevertheless, and I will offer proof in the pages that follow."  Now the reader must pay attention -- we will get "proof."  However, this "proof can be understood ... only by someone who is willing to temporarily abandon the adult perspective to gain insight into and take serious account of the situation of a child." (76)  So, there it is -- if the reader does not accept her proof, it is an example of not being willing to take seriously the helplessness of children.  Is it possible to say the contrary, that anyone who agrees with Miller simply has not read Nietzsche appropriately? 

          I am not sure how to interpret her remarks about his "madness" as a result of syphilis (76-78).  Suffice it to say, I get the impression that she feels the illness, in reality, was a result of an unhappy childhood, the body reacting to repression, an example of society defending itself against Nietzsche by invoking some sort of divine retribution.  Walter Kaufmann (1966), who remains the authority on Nietzsche, provides this account:  "During his disease Nietzsche was almost invariably gentle and pleasant, and in lucid hours he engaged in conversation.  Sometimes, however, he was wild and frenzied.  At no time could he be induced to discuss any of his works or ideas.  His last books and letters notwithstanding, his disease was not paranoia but almost certainly an atypical general paralysis.  If this diagnosis is correct, it would follow that he must have had a syphilitic infection--but it cannot be [considered proven].  The certainty which can be achieved today by various tests can never be matched by posthumous conjectures on an atypical disease.  All we can say is -- and all sober and unsensational medical treatments of the subject seem agreed on this -- that Nietzsche very probably contracted syphilis." (58)  Both are reacting to negative commentaries about Nietzsche's work, commentaries that see his "madness" as God's retribution, or as proof that his ideas are invalid, or as both.  Kaufmann's, however, seems to me more substantiated (but then, perhaps Kaufmann does not take the sufferings of children seriously?).



Misogyny

          Nietzsche's remarks about women generally do him little credit from today's (1997) perspective.  At the same time, perhaps we can gain some insight into his remarks by examining them in context rather than dismissing them, as does Miller:  "Nietzsche's misogyny becomes understandable, of course, if we consider how much distrust must have accumulated in someone who was whipped so frequently as a child." (98)  So poor Fritz, so abused by these particular women, came to hate all women.  Let me add that Miller overlooks an incident that would tend to support her case: Nietzsche threw his arms around a horse that was being whipped, went home to write a few letters, and was confined to an asylum shortly after that.

          In truth, Nietzsche's remarks about women should be understood in context of this remark from The Gay Science: "...it is man who creates for himself the image of woman, and woman forms herself according to this image." (GS, 126)  As is usually the case with Nietzsche, any sentence, paragraph, even book, quoted out of context requires explication.  The remark means that we live in a male-dominated society and it is the males in power who prescribe the roles played both by males and females.  Those males are usually members of the clergy (the priestly caste).  Women attempt to fit into these roles and it is the roles that are absurd, not the gender itself.  In fact, throughout, Nietzsche's remarks about women should be interpreted as attacking those women who are unable or unwilling to rise above those roles and act instead as human beings striving for an overcoming of mankind.  Indeed, we still see this phenomenon -- one need only consider televised beauty pageants (who watches them?  To whom are the commercials targeted?) 

          Freud himself has been attacked as misogynistic, and certainly the early hypotheses concerning hysteria (a wandering uterus) do little credit to those who accepted them, yet they are best seen as an attempt to understand a condition, not as an attempt to belittle women.  The O.J. Simpson trial, believe it or not, illustrated how dangerous this subject is today when discussed by males.  At one point, a defense attorney described the prosecution's actions as hysterical.  Marcia Clark, the lead prosecution attorney, pounced on this and described it as a blatantly sexist remark, demeaning to women and by implication to the prosecution, justifying spousal abuse, compounded by the fact that the judge, another male, allowed the remark to enter the record.  Another defense attorney, one of Barry Scheck's team, did a computer search on the term and documented the fact that the first use of the term in the trial was by Marcia Clark herself!  Why was it permissible for the prosecution to use the term and not the defense?  Johnnie Cochran with a wandering uterus?  She dropped her objection.

          If we wish, however, to see the historical context of Nietzsche's above remarks, we can look to Schopenhauer: "it is only the man whose intellect is clouded by his sexual impulses that could give the name of the fair sex to that undersized, narrow-shouldered, broad-hipped, and short-legged race; for the whole beauty of the sex is bound up with this impulse" (SP, 440).  He continues, complaining of "...the childish simplicity, for example, with which they keep on chattering during the finest passages in the greatest masterpieces." (411)  Today, presumably, we have improved, as both men and women chatter during performances.  Aristotle, somewhere, places women one step below men on the ladder of evolution and offers as partial proof the fact that they have fewer teeth.  Aristotle was married three times, as I remember reading, and apparently never bothered to count.

          In light of the above, Nietzsche's remarks seem rather enlightened and forward-looking.  Certainly, his remarks on women, taken out of the context of late nineteenth-century views, are harsh, but placed in their context they seem insightful.  On the other hand, Shaw and Ibsen held more feminist views during the same period or a decade later, yet their attacks were on the same issue: the roles women found themselves forced to play.  (Nietzsche referred to Ibsen as "...that typical old maid" (EH, 863).)

          Moreover, it is on precisely this point that Nietzsche is most vulnerable today.  His remarks on Christianity, Wagner, society, etc., are far more obviously lucid and accepted.  However, if his remarks on women are interpreted as extending to all women and to the way women are biologically and for all time, how does one explain his comment on "Madame Cosima Wagner, who had by far the most superior judgment in matters of taste that I have ever heard"? (EH, 840)



The Psychologist

          Rather than proceed to a discussion of all of Miller's points, it may be more profitable to discuss Nietzsche as a psychologist, or as a precursor to psychology.  After all, could Nietzsche have understood himself well enough to recognize how his philosophy was a reaction to his childhood?

          It is fairly well-known that Freud thought that Nietzsche knew himself far better than any man who ever lived (although I cannot locate a specific reference).  Certainly, one who knew himself this well in the opinion of the "father" of Psychoanalysis would have had enough acumen to recognize how "poor little Fritz" contaminated his writings as a result of being abused as a child.  Much more important, however, is this remark by Freud: "Nietzsche ... whose premonitions and insights often agree in the most amazing manner with the laborious results of psychoanalysis, I have long avoided for this very reason.  After all, I was less concerned about any priority than about the preservation of my open-mindedness [Unbefangenheit]" (Kaufmann, 382).  In short, Nietzsche anticipated the findings and the discipline of Psychoanalysis so precisely and so prematurely that Freud himself avoided reading him for fear of being unduly influenced.  As Nietzsche said, he "was born posthumously."

          So what are these insights?  Nietzsche's attacks on the "Slave Morality" are well-known, but the basis for those attacks is not, especially since Christianity is the premiere example of that morality.  The following passage is fairly explicit, however:  "The slave revolt begins by rancor turning creative and giving birth to values -- the rancor of beings who, deprived of the direct outlet of action, compensate by an imaginary vengeance.  All truly noble morality grows out of triumphant self-affirmation.  Slave ethics, on the other hand, begins by saying no to an 'outside,' an 'other,' a non-self, and that no is its creative act." And a few sentences later: "we should remember that the emotion of contempt, of looking down, provided that it falsifies at all, is as nothing compared with the falsification which suppressed hatred, impotent vindictiveness, effects upon its opponent, though only in effigy." (GM, 171)  Ironically enough, this is precisely what Miller accuses Nietzsche of doing in his writings: reacting to his childhood, suppressing hatred, etc.

          In Psychoanalytic terms, the above is fairly clear.  The danger of suppressing libidinous instincts is clearly expressed and anticipates Civilization and its Discontents.  Thanatos, the death wish, the turning inward of this violence, negates any possibility of self-affirmation.  It leads to a lack of ego strength or identification that makes one unable to identify boundaries between self and other.  A strong ego may look upon others negatively, and at times erroneously, but it is "as nothing" compared to the violence done to the ego and others by the guilt caused by the turning inward of this energy.  (Perhaps this is the appropriate time to mention that Nietzsche used the term "das Ich," not ego.  I understand that the same was true for Freud.  Curiously enough, both suffered from the biases of their early translators.  Thomas Common's translation of Also Sprach Zarathustra is one of the main reasons for much misinterpretation of Nietzsche.  Common understood the entire work as a parody of the Bible because of its use of the second person -- in grammatical terms, this is common in German, but fairly unique to the Bible in English today.  I could say much more on this ....)

          Another passage on the Superego or guilt:  "whereas the noble lives before his own conscience with confidence and frankness..., the rancorous person is neither truthful nor ingenuous nor honest and forthright with himself.  His soul squints; his mind loves hide-outs, secret paths, and back doors ... he is expert in silence ..., in provisional self-depreciation, and in self-humiliation." (GM, 172)  He focuses "...nun auch noch einen 'Guten' ausdenkt -- sich selbst!" (GMG, 334)  I have not seen this translated acceptably, but it means that the thoughts concerning what is "good" become the self-concept of the "slave" (e.g., the Christian Clergy, the Monkish Caste) and those who are opposed are "den Bösen" (the Devil, the "bad," the "evil" all rolled up into one, in the same way "spirit," "ghost," and "energy" are all rolled up into one in "Geist").  Since the slave is self-hating, self-hatred forms the basis of slave-morality.  Moreover, the slave does not recognize this in himself -- he sublimates it to the point where it is unconscious (in "hide-outs, secret paths, and back doors") and then projects this hatred onto the other to the extent it is recognized.  (One wonders about the childhood of such people.)

          And these values -- the form that self-hatred takes, and consequently what counts as proper behavior and thought -- are dictated by the "herd instinct," that is to say, contemporary norms: "Whenever we encounter a morality, we also encounter valuations and an order of rank of human impulses and actions.  These valuations and orders of rank are always expressions of the needs of a community and herd:  whatever benefits it most -- and second most, and third most -- that is also considered the first standard for the value of all individuals." (GS, 174)  Another passage illustrates this more clearly, but I think Nietzsche is a bit too generous in it in assuming progress on the part of humanity.  When he uses the past tense, he is referring to prehistoric times, assuming we (his readers) have "overcome" much of this.  Nevertheless, the passage is as follows:  "...nothing was more terrible than to feel that one stood by oneself.... Freedom of thought was considered discomfort itself.  While we experience law and submission as compulsion and loss, it was egoism that was formerly experienced as something painful and as real misery.  To be a self and to esteem oneself according to one's own weight and measure -- that offended taste in those days.  An inclination to do this would have been considered madness; for being alone was associated with every misery and fear.  In those days, 'free will' was very closely associated with a bad conscience; and the more unfree one's actions were and the more the herd instinct rather than any personal sense found expression in an action, the more moral one felt." (GS, 175)

          It seems relatively safe to say that mental illness is usually defined using some sort of normative approach and measured against some code of socially acceptable behavior.  It also seems clear that the Superego is the internalization of these moral norms and that transgression against them is experienced with guilt and shame, if not incarceration.  Certainly, this is how I understand Freud's concept of the Superego, at least in part.  Perhaps some aphorisms make more clear Nietzsche's attitude towards the debilitating effects of sublimation and the Superego: "Not to perpetrate cowardice against one's own acts!  Not to leave them in the lurch afterward!  The bite of conscience is indecent." (TI, 467)  To put it another way, "How much conscience has had teeth to chew on in the past!  And what excellent teeth it had!  And today -- what is lacking?  A dentist's question."  (TI, 470)  (The answer, not given here, is "it needs to be rooted out.")

         

Ecce Homo

          Since this essay began as a response to someone who would interpret Nietzsche in light of his life, his childhood, perhaps we should allow Nietzsche himself to make some remarks on the subject.  He wrote his book Ecce Homo (Behold the Man) in 1888, but it was suppressed by his sister and not published until 1908, when she finally allowed its publication. 

          Miller (1990) seems to believe that Nietzsche's attachment to Richard Wagner and his subsequent vitriolic attacks on him were a result of Nietzsche's idealization of a father he never really knew and the disappointment of finding him human -- that he took Wagner's Parsifal as a personal disappointment.  In his description of what he meant by "war," Nietzsche wrote "I never attack persons -- I make use of a personality merely as a powerful magnifying glass, by means of which I render a general ... evil more visible.... In this way I attacked Wagner, or more exactly the falsity of mongrel instincts of our 'culture' which confounds super-refinement with abundance, and decadence with greatness." (EH, 829)  And later, "in speaking of the recreations of my life, I must express a word or two of gratitude for the one which has afforded me by far the greatest and heartiest refreshment.  This was undoubtedly my intimate relationship with Richard Wagner.... I know not what Wagner may have been for others; but no cloud ever obscured our sky.... What is it that I have never forgiven Wagner?  The fact that he condescended to the Germans -- that he became a German Imperialist." (EH, 843-845) 

          In other words, Nietzsche used Wagner as a symbol of negative characteristics which were in their ascendancy (he also attacked only causes which he felt were triumphant).  He felt no ill-will toward him personally. 

          In Ecce Homo, Nietzsche surveys his life and work, even early childhood, and there is not one hint of this "idealization."  He admits that his childhood was not happy, but attributes it to the bad climate.  But perhaps his most salient point in this context is his remark that "I am one thing, my writings are another."  (EH, 854)



Conclusion

          This brief essay began as a reaction to Miller (1990).  A central difficulty in discussing her essay in the context of Nietzsche's writings lies in the fact that her focus is not on Nietzsche but on child abuse.  She uses Nietzsche in much the same fashion Nietzsche used Wagner -- as a vehicle for her ideas on the subject.  The fact that someone with the liability of being suspect as an "expert" on Nietzsche could react with ire is educational in itself, as it explains to some extent why less educated Christians, Wagnerians, Feminists, etc. -- in short, all those devoted to causes or beliefs that Nietzsche attacked -- react so violently and emotionally to his attacks.  It seems clear that Nietzsche did attack institutions and issues that were a part of his childhood, but then all of us have many of our characteristics formed in that period of life -- perhaps not our particular views of those issues, but our approach to them.  The fact that Nietzsche as a child was exposed to music by his father may have made him more sensitive to issues related to music and better able to appreciate and compose it (there is a relatively new two-CD set of Nietzsche's musical compositions available), but it in no way explains his attacks on Wagner.  The fact that his father died when he was four was of great importance to Nietzsche, but not because he felt abandoned -- it was because Nietzsche felt his energy and vigor at their lowest ebb when he reached the age his father was when he died.  He felt it was hereditary that he should die young. 

          Perhaps the best that can be said is that Miller's approach is somewhat demeaning and indirect if seen as an analysis of Nietzsche, but remarkably sensitive if seen as an attack on child abuse.  The fact that she and Nietzsche (and most "experts") disagree as to the effects of his early childhood on his later writings is quite irrelevant for her purpose. 




Nietzsche's Chronology

          Nietzsche was born in Röcken, Germany on October 15, 1844.  His father was a Lutheran pastor and music teacher who died five years later, probably as a result of a head wound.  From 1850-58, he lived in a household of women, and in 1858 began attending the boarding school Schulpforta.  Immediately afterwards, he began his studies of classical philology at Bonn University, met Richard Wagner in 1868, and became Professor extraordinarius of classical philology at the University of Basel -- full professor at the age of twenty-four without having finished his residency requirements.  In 1870, by then a Swiss subject, he volunteered as a medical orderly and returned to Basel in very ill health.  From this point on, his important work takes place.

          1872:  The Birth of Tragedy

          1873:  Untimely Meditations

          1874:  Schopenhauer as Educator, the third untimely meditation.

          1875:  Richard Wagner in Bayreuth

          1878:  Human, all-too-Human

          1879:  Resigns from the university, with a pension, as a result of ill health.  By this time, he had already published two books not directly related to philology, and the one that was (1872) was far from conventional.

          1880:  The Wanderer and His Shadow

          1881:  The Dawn of Day

          1882:  The Gay Science (GS)

          1883:  The first two books of Thus Spoke Zarathustra, each written in ten days.

          1884:  The third book of Zarathustra

          1885:  The fourth book of Zarathustra -- forty copies are printed and only seven are distributed to close friends.

          1886:  Beyond Good and Evil

          1887:  The Case of Wagner -- an attack on Wagner.  Genealogy of Morals.  (GM)  (GMG -- German version.)  About this time, his fame begins to spread.

          1888:  The Anti-Christ, Nietzsche Contra Wagner, Twilight of the Idols.  (TI)  Ecce Homo (EH) (suppressed by his sister until 1908).

          1889:  Throws his arms around a horse that is being whipped and writes a few letters.  His friend Overbeck takes him to an asylum in Jena, but his mother moves him out to live with her.

          1890 -- Eventually, his sister obtains exclusive rights to all his publications and notes and zealously promotes the image of her brother as an insane genius as his fame grows.  She is responsible for many misconceptions about him.

          I have made no mention of The Will to Power.  The book is a collection of Nietzsche's notes, most of which he had revised, rewritten, and used earlier, often revising them to indicate the opposite of what they say.  His sister patched it together and promoted it as his Magnum Opus.








References



          Kaufmann, Walter.  1966.  Nietzsche:  Philosopher, Psychologist, Antichrist.  New York: Meridian.  This is the eleventh printing of this work.  A reader interested in secondary material and facts should start with this volume. 



          Miller, Alice.  1990.  The Untouched Key: tracing childhood trauma in creativity and destructiveness.  New York: Doubleday.



          Nietzsche, Friedrich.  (n.d., c. 1960).  Nietzsches Werke in Zwei Bänden.  Edited by Gerhard Stenzel.  Salzburg: R. Kiesel zu Salzburg.  A handy collection of Nietzsche's works in the original German.  (Z, TI)



          Nietzsche, Friedrich.  1956.  The Birth of Tragedy and The Genealogy of Morals.  Tr. Francis Golffing.  New York: Anchor.  Kaufmann has subsequently translated this, as he has all of Nietzsche's works.  Kaufmann does not like the use of "ego" to replace "THE I". 



          Nietzsche, Friedrich.  1954.  The Philosophy of Nietzsche.  New York: The Modern Library.  (EH)  I have used Clifton Fadiman's translation of Ecce Homo from this source.  It has since been retranslated by Walter Kaufmann.  The main defect of this translation is its use of the Thomas Common translation of Zarathustra in quotations.  The volume also contains that translation in its entirety. 



          Nietzsche, Friedrich.  1967.  The Birth of Tragedy and The Case of Wagner.  Tr. Walter Kaufmann.  New York: Vintage.  I had intended to use The Case of Wagner to point to the humor and the nature of Nietzsche's attacks, but it proved beyond the scope of this essay.



          Nietzsche, Friedrich.  1976.  The Portable Nietzsche.  New York: Penguin.  First published in 1954, this remarkable work has gone through thirty-eight printings as of 1976.  It remains the definitive translation of Zarathustra, Twilight of the Idols (TI), and excerpts.



          Nietzsche, Friedrich.  1974.  The Gay Science.  New York: Vintage.  This is the book I would recommend as an introduction to Nietzsche, not Zarathustra.  (GS)



          Schopenhauer, Arthur.  n.d.  Essays of Arthur Schopenhauer.  Tr. T. Bailey Saunders.  New York: A.L. Burt.


Saturday, April 7, 2012

Cognitive Science -- Public Domain Info

Cognitive science
From Wikipedia, the free encyclopedia

[Figure: the fields that contributed to the birth of cognitive science -- linguistics, education, neuroscience, artificial intelligence, philosophy, anthropology, and psychology.  Adapted from Miller, George A. (2003), "The cognitive revolution: a historical perspective," Trends in Cognitive Sciences 7.]
Cognitive science is the interdisciplinary scientific study of the mind and its processes. It examines what cognition is, what it does and how it works. It includes research on intelligence and behavior, especially focusing on how information is represented, processed, and transformed (in faculties such as perception, language, memory, reasoning, and emotion) within nervous systems (human or other animal) and machines (e.g. computers). Cognitive science consists of multiple research disciplines, including psychology, artificial intelligence, philosophy, neuroscience, linguistics, anthropology, sociology, and education.[1] It spans many levels of analysis, from low-level learning and decision mechanisms to high-level logic and planning; from neural circuitry to modular brain organization.


History

Cognitive science has a pre-history traceable back to ancient Greek philosophical texts (see Plato's Meno); and certainly must include writers such as Descartes, David Hume, Immanuel Kant, Benedict de Spinoza, Nicolas Malebranche, Pierre Cabanis, Leibniz and John Locke. However, although these early writers contributed greatly to the philosophical discovery of mind and this would ultimately lead to the development of psychology, they were working with an entirely different set of tools and core concepts than those of the cognitive scientist.
The modern culture of cognitive science can be traced back to the early cyberneticists in the 1930s and 1940s, such as Warren McCulloch and Walter Pitts, who sought to understand the organizing principles of the mind. McCulloch and Pitts developed the first variants of what are now known as artificial neural networks, models of computation inspired by the structure of biological neural networks.
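A McCulloch-Pitts unit is simply a threshold gate: binary inputs are summed with fixed weights, and the unit fires when the sum reaches a threshold. As a rough illustration (my own sketch, not part of the article), their 1943 demonstration that such units can compute Boolean functions looks like this in Python:

```python
def mcp_neuron(inputs, weights, threshold):
    """A McCulloch-Pitts unit: binary inputs, fixed weights, step activation.

    Returns 1 (fires) when the weighted sum of inputs reaches the
    threshold, otherwise 0 (silent).
    """
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Logic gates fall out of particular weight/threshold choices, which is
# the sense in which networks of these units can compute any Boolean
# function (the names below are illustrative, not from the article).
AND = lambda a, b: mcp_neuron([a, b], [1, 1], threshold=2)
OR  = lambda a, b: mcp_neuron([a, b], [1, 1], threshold=1)
NOT = lambda a:    mcp_neuron([a],    [-1],   threshold=0)
```

Modern artificial neural networks replace the fixed weights and hard threshold with learned weights and smooth activations, but the basic unit is recognizably the same.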
Another precursor was the early development of the theory of computation and the digital computer in the 1940s and 1950s. Alan Turing and John von Neumann were instrumental in these developments. The modern computer, or Von Neumann machine, would play a central role in cognitive science, both as a metaphor for the mind, and as a tool for investigation.
In 1959, Noam Chomsky published a scathing review of B. F. Skinner's book Verbal Behavior. At the time, Skinner's behaviorist paradigm dominated psychology: Most psychologists focused on functional relations between stimulus and response, without positing internal representations. Chomsky argued that in order to explain language, we needed a theory like generative grammar, which not only attributed internal representations but characterized their underlying order.
The term cognitive science was coined by Christopher Longuet-Higgins in his 1973 commentary on the Lighthill report, which concerned the then-current state of Artificial Intelligence research.[2] In the same decade, the journal Cognitive Science and the Cognitive Science Society were founded.[3] In 1982, Vassar College became the first institution in the world to grant an undergraduate degree in Cognitive Science.[4]
In the 1970s and early 1980s, much cognitive science research focused on the possibility of artificial intelligence. Researchers such as Marvin Minsky would write computer programs in languages such as LISP to attempt to formally characterize the steps that human beings went through, for instance, in making decisions and solving problems, in the hope of better understanding human thought, and also in the hope of creating artificial minds. This approach is known as "symbolic AI".
Eventually the limits of the symbolic AI research program became apparent. For instance, it seemed to be unrealistic to comprehensively list human knowledge in a form usable by a symbolic computer program. The late 80s and 90s saw the rise of neural networks and connectionism as a research paradigm. Under this point of view, often attributed to James McClelland and David Rumelhart, the mind could be characterized as a set of complex associations, represented as a layered network. Critics argue that there are some phenomena which are better captured by symbolic models, and that connectionist models are often so complex as to have little explanatory power. Recently symbolic and connectionist models have been combined, making it possible to take advantage of both forms of explanation.[5]

Principles

Levels of analysis

A central tenet of cognitive science is that a complete understanding of the mind/brain cannot be attained by studying only a single level. An example would be the problem of remembering a phone number and recalling it later. One approach to understanding this process would be to study behavior through direct observation: a person could be presented with a phone number and asked to recall it after some delay, and the accuracy of the response could then be measured. Another approach would be to study the firings of individual neurons while a person is trying to remember the phone number. Neither of these experiments on its own would fully explain how the process of remembering a phone number works. Even if the technology to map out every neuron in the brain in real time were available, and it were known when each neuron was firing, it would still be impossible to know how a particular firing of neurons translates into the observed behavior. Thus an understanding of how these two levels relate to each other is needed. This can be provided by a functional level account of the process. Studying a particular phenomenon from multiple levels creates a better understanding of the processes that occur in the brain to give rise to a particular behavior. Marr[6] gave a famous description of three levels of analysis:
  1. the computational theory, specifying the goals of the computation;
  2. representation and algorithm, giving a representation of the input and output and the algorithm which transforms one into the other; and
  3. the hardware implementation, how algorithm and representation may be physically realized.
(See also the entry on functionalism.)
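Marr's distinction between the computational theory and the algorithm can be illustrated with a deliberately mundane sketch (not drawn from Marr himself): the computation "sum a list of numbers" can be realized by quite different algorithms, each of which could in turn run on quite different hardware, yet all satisfy the same computational-level description.

```python
# Computational level: the goal is to map a list of numbers to their sum.

# Algorithm 1: sequential accumulation, left to right.
def sum_sequential(xs):
    total = 0
    for x in xs:
        total += x
    return total

# Algorithm 2: recursive divide-and-conquer (pairwise summation).
def sum_recursive(xs):
    if len(xs) <= 1:
        return xs[0] if xs else 0
    mid = len(xs) // 2
    return sum_recursive(xs[:mid]) + sum_recursive(xs[mid:])

# Same computational theory, different representations and algorithms.
data = [3, 1, 4, 1, 5, 9]
print(sum_sequential(data), sum_recursive(data))  # 23 23
```

At the computational level both functions compute the same mapping; at the algorithmic level they differ in representation and order of operations; and at the implementation level either could be realized in silicon or, in principle, in neurons.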

Interdisciplinary nature

Cognitive science is an interdisciplinary field with contributors from various fields, including psychology, neuroscience, linguistics, philosophy of mind, computer science, anthropology, sociology, and biology. Cognitive science tends to view the world outside the mind much as other sciences do: as having an objective, observer-independent existence. The field is usually seen as compatible with the physical sciences, and uses the scientific method as well as simulation or modeling, often comparing the output of models with aspects of human behavior. Some doubt whether there is a unified cognitive science and prefer to speak of the cognitive sciences in the plural.[7]
Many, but not all, who consider themselves cognitive scientists have a functionalist view of the mind—the view that mental states are classified functionally, such that any system that performs the proper function for some mental state is considered to be in that mental state. According to some versions of functionalism, even non-human systems, such as other animal species, alien life forms, or advanced computers can, in principle, have mental states.

Cognitive science: the term

The term "cognitive" in "cognitive science" is "used for any kind of mental operation or structure that can be studied in precise terms" (Lakoff and Johnson, 1999). This conceptualization is very broad, and should not be confused with how "cognitive" is used in some traditions of analytic philosophy, where "cognitive" has to do only with formal rules and truth conditional semantics.
The earliest entries for the word "cognitive" in the OED take it to mean roughly pertaining "to the action or process of knowing". The first entry, from 1586, shows the word was at one time used in the context of discussions of Platonic theories of knowledge. Most in cognitive science, however, presumably do not believe their field is the study of anything as certain as the knowledge sought by Plato.

Scope

Cognitive science is a large field, and covers a wide array of topics on cognition. However, it should be recognized that cognitive science is not equally concerned with every topic that might bear on the nature and operation of the mind or intelligence. Social and cultural factors, emotion, consciousness, animal cognition, and comparative and evolutionary approaches are frequently de-emphasized or excluded outright, often on the basis of key philosophical conflicts. Another important mind-related subject that the cognitive sciences tend to avoid is the existence of qualia; discussion of this issue is sometimes limited to mentioning qualia as a philosophically open matter. Some within the cognitive science community, however, consider these to be vital topics, and advocate the importance of investigating them.[8]
Below are some of the main topics that cognitive science is concerned with. This is not an exhaustive list, but is meant to cover the wide range of intelligent behaviors. See List of cognitive science topics for a list of various aspects of the field.

Artificial intelligence

"... One major contribution of AI and cognitive science to psychology has been the information processing model of human thinking in which the metaphor of brain-as-computer is taken quite literally." AAAI Web pages.
Artificial intelligence (AI) involves the study of cognitive phenomena in machines. One of the practical goals of AI is to implement aspects of human intelligence in computers. Computers are also widely used as a tool with which to study cognitive phenomena. Computational modeling uses simulations to study how human intelligence may be structured.[9] (See the section on computational modeling in the Research Methods section.)
There is some debate in the field as to whether the mind is best viewed as a huge array of small but individually feeble elements (i.e. neurons), or as a collection of higher-level structures such as symbols, schemas, plans, and rules. The former view uses connectionism to study the mind, whereas the latter emphasizes symbolic computations. One way to view the issue is whether it is possible to accurately simulate a human brain on a computer without accurately simulating the neurons that make up the human brain.

Attention

Attention is the selection of important information. The human mind is bombarded with millions of stimuli and must have a way of deciding which of these stimuli to process. Attention is sometimes seen as a spotlight, meaning one can only shine the light on a particular set of information. Experiments that support this metaphor include the dichotic listening task (Cherry, 1957) and studies of inattentional blindness (Mack and Rock, 1998). In the dichotic listening task, subjects are presented with two different messages, one in each ear, and told to focus on only one of the messages. At the end of the experiment, when asked about the content of the unattended message, subjects cannot report it.

Knowledge and processing of language


[Figure: a well-known example of a phrase structure tree, one way of representing human language that shows how different components are organized hierarchically.]
The ability to learn and understand language is an extremely complex process. Language is acquired within the first few years of life, and all humans under normal circumstances are able to acquire language proficiently. A major driving force in theoretical linguistics is discovering what nature language must have in the abstract in order to be learned in such a fashion. Some of the driving research questions in studying how the brain itself processes language include: (1) To what extent is linguistic knowledge innate or learned?, (2) Why is it more difficult for adults to acquire a second language than it is for infants to acquire their first language?, and (3) How are humans able to understand novel sentences?
The study of language processing ranges from the investigation of the sound patterns of speech to the meaning of words and whole sentences. Linguistics often divides language processing into orthography, phonology and phonetics, morphology, syntax, semantics, and pragmatics. Many aspects of language can be studied from each of these components and from their interaction.
The study of language processing in cognitive science is closely tied to the field of linguistics. Linguistics was traditionally studied as a part of the humanities, including studies of history, art and literature. In the last fifty years or so, more and more researchers have studied knowledge and use of language as a cognitive phenomenon, the main problems being how knowledge of language can be acquired and used, and what precisely it consists of. Linguists have found that, while humans form sentences in ways apparently governed by very complex systems, they are remarkably unaware of the rules that govern their own speech. Thus linguists must resort to indirect methods to determine what those rules might be, if indeed rules as such exist. In any event, if speech is indeed governed by rules, they appear to be opaque to any conscious consideration.

Learning and development

Learning and development are the processes by which we acquire knowledge and information over time. Infants are born with little or no knowledge (depending on how knowledge is defined), yet they rapidly acquire the ability to use language, walk, and recognize people and objects. Research in learning and development aims to explain the mechanisms by which these processes might take place.
A major question in the study of cognitive development is the extent to which certain abilities are innate or learned. This is often framed in terms of the nature versus nurture debate. The nativist view emphasizes that certain features are innate to an organism and are determined by its genetic endowment. The empiricist view, on the other hand, emphasizes that certain abilities are learned from the environment. Although clearly both genetic and environmental input is needed for a child to develop normally, considerable debate remains about how genetic information might guide cognitive development. In the area of language acquisition, for example, some (such as Steven Pinker)[10] have argued that specific information containing universal grammatical rules must be contained in the genes, whereas others (such as Jeffrey Elman and colleagues in Rethinking Innateness) have argued that Pinker's claims are biologically unrealistic. They argue that genes determine the architecture of a learning system, but that specific "facts" about how grammar works can only be learned as a result of experience.

Memory

Memory allows us to store information for later retrieval. Memory is often thought of as consisting of both a long-term and a short-term store. Long-term memory allows us to store information over prolonged periods (days, weeks, years); we do not yet know the practical limit of long-term memory capacity. Short-term memory allows us to store information over short time scales (seconds or minutes).
Memory is also often grouped into declarative and procedural forms. Declarative memory, itself grouped into semantic and episodic forms, refers to our memory for facts, specific knowledge, specific meanings, and specific experiences (e.g., "Who was the first president of the U.S.A.?" or "What did I eat for breakfast four days ago?"). Procedural memory allows us to remember actions and motor sequences (e.g., how to ride a bicycle) and is often dubbed implicit knowledge or memory.
Cognitive scientists study memory just as psychologists do, but tend to focus more on how memory bears on cognitive processes, and on the interrelationship between cognition and memory. For example: what mental processes does a person go through to retrieve a long-lost memory? What differentiates the cognitive process of recognition (seeing hints of something before remembering it, or memory in context) from recall (retrieving a memory, as in "fill-in-the-blank")?

Perception and action


[Figure: the Necker cube, an example of an optical illusion.]

[Figure: an optical illusion in which square A is exactly the same shade of gray as square B. See checker shadow illusion.]
Perception is the ability to take in information via the senses and process it in some way. Vision and hearing are two dominant senses that allow us to perceive the environment. Some questions in the study of visual perception, for example, include: (1) How are we able to recognize objects?, (2) Why do we perceive a continuous visual environment, even though we only see small bits of it at any one time? One way to study visual perception is to look at how people process optical illusions. The Necker cube is an example of a bistable percept: the cube can be interpreted as being oriented in two different directions.
The study of haptic (tactile), olfactory, and gustatory stimuli also falls into the domain of perception.
Action is taken to refer to the output of a system. In humans, this is accomplished through motor responses. Spatial planning and movement, speech production, and complex motor movements are all aspects of action.

Research methods

Many different methodologies are used to study cognitive science. As the field is highly interdisciplinary, research often cuts across multiple areas of study, drawing on research methods from psychology, neuroscience, computer science and systems theory.

Behavioral experiments

In order to have a description of what constitutes intelligent behavior, one must study behavior itself. This type of research is closely tied to that in cognitive psychology and psychophysics. By measuring behavioral responses to different stimuli, one can understand something about how those stimuli are processed. Lewandowski and Strohmetz (2009) review a collection of innovative uses of behavioral measurement in psychology including behavioral traces, behavioral observations, and behavioral choice.[11] Behavioral traces are pieces of evidence that indicate behavior occurred, but the actor is not present (e.g., litter in a parking lot or readings on an electric meter). Behavioral observations involve the direct witnessing of the actor engaging in the behavior (e.g., watching how close a person sits next to another person). Behavioral choices are when a person selects between two or more options (e.g., voting behavior, choice of a punishment for another participant).
  • Reaction time. The time between the presentation of a stimulus and an appropriate response can indicate differences between two cognitive processes, and can indicate some things about their nature. For example, if in a search task the reaction times vary proportionally with the number of elements, then it is evident that this cognitive process of searching involves serial instead of parallel processing.
  • Psychophysical responses. Psychophysics is an old psychological technique that has been adopted by cognitive psychology. Psychophysical experiments typically involve making judgments of some physical property, e.g. the loudness of a sound. Correlation of subjective scales between individuals can show cognitive or sensory biases as compared to actual physical measurements. Some examples include:
    • sameness judgments for colors, tones, textures, etc.
    • threshold differences for colors, tones, textures, etc.
  • Eye tracking. This methodology is used to study a variety of cognitive processes, most notably visual perception and language processing. The fixation point of the eyes is linked to an individual's focus of attention. Thus, by monitoring eye movements, we can study what information is being processed at a given time. Eye tracking allows us to study cognitive processes on extremely short time scales. Eye movements reflect online decision making during a task, and they provide us with some insight into the ways in which those decisions may be processed.
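The reaction-time logic described above can be simulated: if each item in a search display is checked one at a time, mean reaction time grows roughly linearly with the number of items, whereas a parallel process would not show this slope. The base time and per-item cost below are arbitrary illustrative parameters, not empirical estimates.

```python
import random

def simulate_serial_search(set_size, base_ms=300, per_item_ms=50, trials=1000):
    """Mean RT when the target sits at a random position and items are checked serially."""
    total = 0.0
    for _ in range(trials):
        # Items are inspected one by one until the target is reached.
        target_pos = random.randint(1, set_size)
        total += base_ms + per_item_ms * target_pos
    return total / trials

# Mean RT rises roughly linearly with set size, the signature of serial search.
for n in (2, 4, 8):
    print(n, round(simulate_serial_search(n)))
```

The expected mean here is base_ms + per_item_ms × (n + 1) / 2, so doubling the set size adds a constant increment per extra item, which is the proportional pattern the bullet point describes.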

Brain imaging


Brain imaging involves analyzing activity within the brain while performing various cognitive tasks. This allows us to link behavior and brain function to help understand how information is processed. Different types of imaging techniques vary in their temporal (time-based) and spatial (location-based) resolution. Brain imaging is often used in cognitive neuroscience.
  • Single photon emission computed tomography and Positron emission tomography. SPECT and PET use radioactive isotopes, which are injected into the subject's bloodstream and taken up by the brain. By observing which areas of the brain take up the radioactive isotope, we can see which areas of the brain are more active than other areas. PET has similar spatial resolution to fMRI, but it has extremely poor temporal resolution.
  • Electroencephalography. EEG measures the electrical fields generated by large populations of neurons in the cortex by placing a series of electrodes on the scalp of the subject. This technique has an extremely high temporal resolution, but a relatively poor spatial resolution.
  • Functional magnetic resonance imaging. fMRI measures the relative amount of oxygenated blood flowing to different parts of the brain. More oxygenated blood in a particular region is assumed to correlate with an increase in neural activity in that part of the brain. This allows us to localize particular functions within different brain regions. fMRI has moderate spatial and temporal resolution.
  • Optical imaging. This technique uses infrared transmitters and receivers to measure the amount of light reflectance by blood near different areas of the brain. Since oxygenated and deoxygenated blood reflects light by different amounts, we can study which areas are more active (i.e., those that have more oxygenated blood). Optical imaging has moderate temporal resolution, but poor spatial resolution. It also has the advantage that it is extremely safe and can be used to study infants' brains.
  • Magnetoencephalography. MEG measures magnetic fields resulting from cortical activity. It is similar to EEG, except that it has improved spatial resolution since the magnetic fields it measures are not as blurred or attenuated by the scalp, meninges and so forth as the electrical activity measured in EEG is. MEG uses SQUID sensors to detect tiny magnetic fields.

Computational modeling


[Figure: a neural network with two layers.]
Computational models require a mathematically and logically formal representation of a problem. Computer models are used in the simulation and experimental verification of different specific and general properties of intelligence. Computational modeling can help us to understand the functional organization of a particular cognitive phenomenon. There are two basic approaches to cognitive modeling. The first is focused on abstract mental functions of an intelligent mind and operates using symbols, and the second, which follows the neural and associative properties of the human brain, is called subsymbolic.
  • Symbolic modeling evolved from computer science paradigms based on the technologies of knowledge-based systems, as well as from a philosophical perspective (see, for example, "Good Old-Fashioned Artificial Intelligence" (GOFAI)). Such models were developed by the first cognitive researchers and later used in information engineering for expert systems. Since the early 1990s the approach has been generalized in systemics for the investigation of functional human-like intelligence models, such as personoids, and, in parallel, developed as the SOAR environment. Recently, especially in the context of cognitive decision making, symbolic cognitive modeling has been extended to a socio-cognitive approach that includes social and organizational cognition, interrelated with a sub-symbolic, non-conscious layer.
  • Subsymbolic modeling includes connectionist/neural network models. Connectionism relies on the idea that the mind/brain is composed of simple nodes and that the power of the system comes primarily from the existence and manner of connections between those simple nodes. Neural nets are textbook implementations of this approach. Some critics of this approach feel that while these models approach biological reality in representing how the system works, they lack explanatory power because complicated systems of connections, even with simple rules, are extremely complex and often less interpretable than the system they model.
Other approaches gaining in popularity include the use of Dynamical systems theory and also techniques putting symbolic models and connectionist models into correspondence (Neural-symbolic integration). Bayesian models, often drawn from machine learning, are also gaining popularity.
All of the above approaches tend to be generalized into integrated computational models of a synthetic/abstract intelligence, in order to be applied to the explanation and improvement of individual and social/organizational decision-making and reasoning.
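As a minimal subsymbolic sketch, the perceptron below learns the logical AND function by adjusting connection weights from examples: the core connectionist idea that knowledge lives in weighted connections rather than explicit rules. The learning rate and epoch count are arbitrary illustrative choices.

```python
# A single-unit "network": weights are adjusted by the perceptron learning rule.
def train_perceptron(examples, lr=0.1, epochs=20):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in examples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out
            # Strengthen or weaken connections in proportion to the error.
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

AND_DATA = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(AND_DATA)

def predict(x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

print([predict(x1, x2) for (x1, x2), _ in AND_DATA])  # [0, 0, 0, 1]
```

Nothing in the trained model looks like the rule "output 1 iff both inputs are 1"; the behavior is carried entirely by the learned weights, which is exactly the interpretability trade-off the critics quoted above point to.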

Neurobiological methods

Research methods borrowed directly from neuroscience and neuropsychology can also help us to understand aspects of intelligence. These methods allow us to understand how intelligent behavior is implemented in a physical system.

Key findings

Cognitive science has much to its credit. Among other accomplishments, it has given rise to models of human cognitive bias and risk perception, and has been influential in the development of behavioral finance, part of economics. It has also given rise to a new theory of the philosophy of mathematics, and many theories of artificial intelligence, persuasion and coercion. It has made its presence firmly known in the philosophy of language and epistemology (a modern revival of rationalism), as well as constituting a substantial wing of modern linguistics. Fields of cognitive science have been influential in understanding the brain's particular functional systems (and functional deficits), ranging from speech production to auditory processing and visual perception. It has made progress in understanding how damage to particular areas of the brain affects cognition, and it has helped to uncover the root causes and results of specific dysfunctions, such as dyslexia, anopia, and hemispatial neglect.

Criticism

In a paper written shortly before his death, B.F. Skinner stated that "cognitive science is the creation science of psychology."[12]

Notable researchers

Some of the more recognized names in cognitive science are usually either the most controversial or the most cited. Within philosophy, familiar names include Daniel Dennett, who writes from a computational systems perspective; John Searle, known for his controversial Chinese Room argument; Jerry Fodor, who advocates functionalism; and Douglas Hofstadter, famous for writing Gödel, Escher, Bach, which questions the nature of words and thought. In the realm of linguistics, Noam Chomsky and George Lakoff have been influential (both have also become notable as political commentators). In artificial intelligence, Marvin Minsky, Herbert Simon, Allen Newell, and Kevin Warwick are prominent. Popular names in the discipline of psychology include James McClelland and Steven Pinker. Anthropologists Dan Sperber, Edwin Hutchins, Scott Atran, Pascal Boyer and Joseph Henrich have been involved in collaborative projects with cognitive and social psychologists, political scientists and evolutionary biologists in attempts to develop general theories of culture formation, religion and political association.


References

  1. ^ Thagard, Paul, Cognitive Science, The Stanford Encyclopedia of Philosophy (Fall 2008 Edition), Edward N. Zalta (ed.).
  2. ^ Longuet-Higgins, H. C. (1973). "Comments on the Lighthill Report and the Sutherland Reply", in Artificial Intelligence: a paper symposium, Science Research Council, 35-37
  3. ^ Cognitive Science Society
  4. ^ http://cogsci.vassar.edu/about/index.html
  5. ^ Artur S. d'Avila Garcez, Luis C. Lamb and Dov M. Gabbay. Neural-Symbolic Cognitive Reasoning. Cognitive Technologies. Springer, 2008, ISBN 978-3-540-73245-7, 2008.
  6. ^ Marr, D. (1982). Vision: A Computational Investigation into the Human Representation and Processing of Visual Information. W. H. Freeman.
  7. ^ Miller, G. A. (2003). The cognitive revolution: a historical perspective. Trends in Cognitive Sciences, 7, 141-144.
  8. ^ A number of authors consider the qualia problem to be part of the cognitive science field, e.g. Some philosophical issues in cognitive science: qualia, intentionality, and the mind-body problem, Qualia: The Hard Problem, and indeed the entire discipline of philosophy as being part of the cog sci field, e.g. What is Cognitive Science?, while other reputable sources that cover both qualia and cog sci do not draw any obvious connection between them, e.g. the Stanford encyclopedia of philosophy (Jan 2008 online edition) does have full-size articles on both qualia and cog sci, but qualia are not even mentioned in the cog sci article while cog sci is not mentioned in the qualia article.
  9. ^ Sun, Ron (ed.) (2008). The Cambridge Handbook of Computational Psychology. Cambridge University Press, New York.
  10. ^ Pinker S., Bloom P. (1990). "Natural language and natural selection". Behavioral and Brain Sciences 13 (4): 707–784. doi:10.1017/S0140525X00081061.
  11. ^ Lewandowski, Gary; Strohmetz, David (2009). "Actions can speak as loud as words: Measuring behavior in psychological science". Social and Personality Psychology Compass 3 (6): 992–1002. doi:10.1111/j.1751-9004.2009.00229.
  12. ^ B. F. Skinner, "Can Psychology be a Science of Mind?", American Psychologist, November 1990, page 1209, At the APA Web Site Successfully accessed 29 December 2009

Further reading

Introductory literature
  • Eckardt, Barbara Von (2003): Cognitive Science: Philosophical Issues. In: Lynn Nadel (Ed.): Encyclopedia of Cognitive Science, Vol. 1, London: Nature Publishing Group, pp. 552–559.
  • Thagard, Paul (2005). Mind: Introduction to Cognitive Science (2nd ed.). Cambridge, MA: The MIT Press.
Miscellaneous
  • Baumgartner, P., et al. Eds. (1995). Speaking Minds: Interviews With Twenty Eminent Cognitive Scientists. Princeton, NJ: Princeton University Press.
  • Damasio, A. R. (1994). Descartes' Error: Emotion, Reason and the Human Brain. New York: Grosset/Putnam.
  • Gazzaniga, M. S. Ed. (1996). Conversations in the Cognitive Neurosciences. New York: The MIT Press.
  • Hunt, M. (1982). The Universe Within: A New Science Explores the Human Mind. Brighton: The Harvester Press.
  • Lakoff, G and Johnson, M. (1999). Philosophy In The Flesh. New York: Basic Books.
  • Port, Robert F. and van Gelder, Tim (1995). Mind as Motion: Explorations in the Dynamics of Cognition. Cambridge, MA: The MIT Press. ISBN 0262161508.
  • Sun, Ron & L. Bookman, (eds.), Computational Architectures Integrating Neural and Symbolic Processes. Kluwer Academic Publishers, Needham, MA. 1994.
  • Thelen, Esther and Smith, Linda B. (1996). A Dynamic Systems Approach to the Development of Cognition and Action. Cambridge, MA: The MIT Press. ISBN 026270059X.
  • Tsakiridis, George. Evagrius Ponticus and Cognitive Science: A Look at Moral Evil and the Thoughts. Eugene, OR: Pickwick Publications, 2010.
