Bourgeois Dignity: what doesn't explain the industrial revolution

Deirdre McCloskey is a very unorthodox economist. Even though she has done a great deal of classic work on the history of the industrial revolution in England, she is best known for her critical examination of the 'rhetoric' of economics. A good example of her attacks can be found in her latest book on that issue (The Cult of Statistical Significance, with Stephen Ziliak), in which she criticizes the slippery use of 'significance' in statistics (see this post). But McCloskey has now engaged in an even larger enterprise: explaining the unprecedented economic growth observed over the last two centuries. The ambition of the project is reflected in the sheer volume of the treatment: six books, one published in 2006 (The Bourgeois Virtues), one that has just come out (Bourgeois Dignity, which is briefly reviewed here), one available in draft form (The Bourgeois Revaluation), and three more that should appear over the next few years. McCloskey's main thesis is that the period of growth we have experienced was due to a shift in the rhetoric about bourgeois values.

Read more: Bourgeois Dignity: what doesn't explain the industrial revolution

Anthropology is not a science, says the AAA

(editor's note: click here to learn more about the AAA's decision from a partial but enlightening point of view)

The Board of the American Anthropological Association has recently adopted a new "mission statement" that omits any reference to "science" in its characterization of anthropology. The previous mission statement contained such a reference.

A number of US anthropologists have protested the new mission statement. I paste below a recent post from Professor Eric C. Thompson of the National University of Singapore. I find Professor Thompson's post especially interesting because it summarizes some of the data that he and his associates collected from graduate students in several leading US anthropology programs. The student respondents gave their opinions as to which anthropologists they regard as having been most influential on the development of anthropology during the last two decades. Professor Thompson has given me permission to reproduce his post here, along with relevant contact information. Those of you who may want to read his preliminary survey are invited to contact him directly.

Thompson's e-mail:

I'm writing in response to this valuable discussion of dropping the term "science" from the AAA mission statement. I was trained and worked (dissertation, c.1990s) largely in an "interpretive" tradition... with "postmodern" influences - scaremarks and all, haha. But I've followed and signed on to the SASci because I've never agreed with the anti-science ideologues and am not keen on an anthropology that excludes the modern scientific tradition.

In collaboration with students in a graduate seminar on anthropology and anthropological theory just completed here at the National University of Singapore, we queried graduate students in cultural anthropology at six leading anthropology departments in the United States (Chicago, Columbia, Duke, Harvard, UCLA, U of Washington) as to the most influential anthropologists of the past two decades. This was a very informal 'survey', but it yielded some interesting results, which bear on the discussion here.

Read more: Anthropology is not a science, says the AAA

The Zeus problem revisited - or is it the Jedi problem?

In their recent paper (available here) in Journal of Cognition and Culture, Will M. Gervais and Joseph Henrich call attention to the Zeus problem. If religious belief is solely guided by representational content biases (as many scholars in the cognitive science of religion have argued), why do people generally not come to believe in the gods of their neighbours, or indeed, in gods of the past such as Zeus? Zeus has all the features that are characteristic of successful religious agents, but he is no longer a target for widespread belief and commitment. Of course, what Gervais and Henrich do not mention is that there are in fact modern believers in Zeus and other members of the Greek pantheon, namely adherents of Hellenic Polytheistic Reconstructionism. As the video below shows, Zeus is still an object of worship today; there are about 2,000 adherents of this form of paganism in Greece.

{source}

<object width="640" height="505"><param name="movie" value="http://www.youtube.com/v/KT0az7g1PJ8?fs=1&amp;hl=fr_FR&amp;rel=0&amp;color1=0x006699&amp;color2=0x54abd6"></param><param name="allowFullScreen" value="true"></param><param name="allowscriptaccess" value="always"></param><embed src="http://www.youtube.com/v/KT0az7g1PJ8?fs=1&amp;hl=fr_FR&amp;rel=0&amp;color1=0x006699&amp;color2=0x54abd6" type="application/x-shockwave-flash" allowscriptaccess="always" allowfullscreen="true" width="640" height="505"></embed></object>

{/source}

So is there in fact a Zeus problem? I am not so convinced, since it turns out that even religions that make no secret of their purely fictional origins are quite successful.

Read more: The Zeus problem revisited - or is it the Jedi problem?

Where good ideas come from

Following up on the news of a few days earlier about the role of different network structures in the spread of new ideas, it's worth mentioning the new Steven Johnson book on a related topic: Where good ideas come from. Johnson sets out to dispel the myth of the lone inventor whose main motivation is the financial benefit derived from her creativity. Instead, he suggests that most inventions are the result of a kind of undirected cooperation -- not a group of people purposively pursuing an idea, but several groups working on slightly different things that add up to a significant discovery in the end. Moreover, profit seems to be a motive in only a minority of cases, despite the 'progress' made in copyright and patent law over the course of the 20th century.

Read more: Where good ideas come from

Video games as applied anthropology

Pursuing its ambitious development, the ICCI blog has now decided to open a "video games" section. And today, we are discussing the release of Civilization V, the latest installment in one of the most famous series in the history of video games.

{source}

<object width="640" height="385"><param name="movie" value="http://www.youtube.com/v/uYFQRd1sY5E?fs=1&amp;hl=fr_FR&amp;rel=0"></param><param name="allowFullScreen" value="true"></param><param name="allowscriptaccess" value="always"></param><embed src="http://www.youtube.com/v/uYFQRd1sY5E?fs=1&amp;hl=fr_FR&amp;rel=0" type="application/x-shockwave-flash" allowscriptaccess="always" allowfullscreen="true" width="640" height="385"></embed></object>

 {/source}

OK, I was just kidding. There won't be any video games section. But why not, after all? Video games are often portrayed as violent and simplistic, consisting only of racing cars or shooting people. Some of them, however, are quite different. Some are about breeding a pet, playing guitar or... developing a culture. In Civilization V, the player leads a civilization from prehistoric times into the future, achieving one of a number of different victory conditions through research, diplomacy, expansion, economic development, government and military conquest.

Of course, one could argue that the view developed in Civilization is quite unrealistic, reductionist and deterministic. You have to go through certain political and technical stages to develop and expand your civilization. In a way, it evokes the old evolutionist theories in anthropology. But is Civilization that bad from an anthropological perspective?

Read more: Video games as applied anthropology

Picture of the week: The colors of the Web

In a recent post, Ophelia wondered about the basis of people's colour preferences: Which colour do you prefer? Have you always preferred it, or did your preference change? Can you tell why you prefer pink to, let's say, yellow?

One of the problems here is that, as Ophelia noticed, we lack data, and methods for gathering them. As usual, the web may change the debate. The blog COLOURlovers has just released an interesting study of the colors in the brands of the top 100 sites in the world (see also here).

(Image: the most powerful web colors)

Unsurprisingly, the web is dominated by red and blue.

Read more: Picture of the week: The colors of the Web

Learning suicide in Sri Lanka

Durkheim's sociological study of suicide as a 'social fact' was premised, in part, on the idea that only human beings were known to commit suicide. But ethologists tell us that the ability to commit self-injury or to deliberately self-destruct has been found across separate species including other primates and land mammals, dolphins, insects, and even some bacteria. While the evidence is far from definitive and the use of terms like 'deliberate self-injury' or 'suicide' across vastly different species is hugely problematic (just as it is across different human cultures and epochs), the possibility that we might not be the only animal to display suicide-like behaviours raises fascinating questions about how and why the ability to commit suicide evolved at all, and how and why modern humans learn to use suicide as a response to particular kinds of problems in their lives.

(Image: yellow oleander)

One of the most popular methods of self-harm in Sri Lanka is the consumption of yellow oleander seeds (in Sinhalese, "kaneru atta"). Folklore states that the poison (waha) contained in the seeds is enhanced if they are mixed with water and sugar, and suicidal youth and adults often prepare kaneru atta in this manner. During suicide play, children also sometimes mix kaneru atta with water and sugar.

Read more: Learning suicide in Sri Lanka

Picture of the week: West African Masquerade by Phyllis Galembo

Halloween is the perfect opportunity to discover Phyllis Galembo's work on masquerade traditions in West Africa (and a nice illustration of the way artifacts build on our social cognitive capacities). This is only a small selection of the many amazing pictures Phyllis Galembo took during her trips to Nigeria, Benin and Burkina Faso between 2004 and 2006.

 

Atam Masquerader, Alok Village, Nigeria, 2004, Ilfochrome, 30 x 30 inches

Read more: Picture of the week: West African Masquerade by Phyllis Galembo

Philippa Foot, Famous Philosopher, Unknown Anthropologist (1920-2010)

Philippa Foot died at her home in Oxford, England, on Oct. 3, her 90th birthday (see the NYT here and the Guardian here, and note the difference). In her career, she defended the view that moral judgments have a rational basis and that they can be said to be true or false. Her writing also helped establish virtue ethics as a leading approach to the study of moral problems. She insisted that virtues like courage, wisdom and temperance are indispensable to human life and the foundation stones of morality.

Her work was thus primarily normative. However, she may be remembered more for her description of moral judgements and her indirect contribution to moral psychology. Indeed, along with Judith Thomson, she is the inventor of the trolley dilemma.

In 1967, in the essay "The Problem of Abortion and the Doctrine of the Double Effect," Philippa Foot discussed the moral distinctions between intended and unintended consequences, between doing and allowing, and between positive and negative duties — the duty not to inflict harm weighed against the duty to render aid.

The most arresting of her examples, offered in just a few sentences, was the ethical dilemma faced by the driver of a runaway trolley hurtling toward five track workers.

Read more: Philippa Foot, Famous Philosopher, Unknown Anthropologist (1920-2010)

Does God's omnipotence extend to vision?

In a recent paper, Lorenza Colzato and her colleagues have tried to provide evidence for cultural influences on perception, and more precisely for their hypothesis that following religious rules can 'affect' visual processing.

I am very sceptical about this paper. One crucial question obsessed me while reading it: how can high-level things like beliefs be influential at such a low level as visual perception? To be sure, Colzato reports evidence supporting this hypothesis (as did a previous paper of Colzato's), but what remains unclear is the function such an influence could fulfil, and how it could be cognitively realized. Moreover, the authors report that this influence of beliefs on visual processing is not specific to religion.

Arguing that "cultural contexts are very hard to capture and to define" (since citizens of the same country can belong to different cultures, a purely geographical division is not sufficient to distinguish between cultural areas), Colzato et al. decided to focus on religion, and to define a 'religion' as a set of rules. Thus, every follower of a religion is tied to rules which are, little by little, inducing "particular cognitive-control strategies and establishing default control parameters that generalize to situations that have no bearing for religious beliefs". This argument is supported by other studies, such as McCullough and Willoughby's 2009 study showing that religious people are less likely to break the law because they are more 'used to' following rules in general (thanks to their daily, effortful religious training).

Yet, the authors argue, these induced cognitive-control strategies are not limited to complex decisions (such as "to break or not to break the law"). They can be found in low-level processing mechanisms like vision.

For example, Colzato et al.'s previous study (already linked to) has shown that Calvinists (whose cultural background is supposed to stress the notion that every sphere of society should mind its own business) were more subject to the global precedence effect than were atheists (who, the authors suppose, hold a more "holistic" view of society). In the global precedence effect, as described by D. Navon in 1977, the global structure of a percept is available earlier than its local features: one sees the forest before the trees.

(Photograph: Abraham Kuyper, father of the Dutch Anti-Revolutionary Party, a conservative Calvinist movement.)

Read more: Does God's omnipotence extend to vision?

Picture of the week: How segregated is your city?

One of the tools that may change our view of culture is modelling. It helps us understand big phenomena such as language change or the dynamics of hot topics. One of the first and most convincing uses of models in the social sciences comes from Nobel laureate Thomas Schelling. In a famous paper, he showed that a small preference for having neighbors of one's own color could lead to total segregation. He demonstrated his theory with coins on graph paper, placing pennies and nickels in different patterns on the "board" and then moving them one by one whenever they were in an "unhappy" situation.
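Schelling's checkerboard procedure is easy to reproduce in silico. Here is a minimal sketch in Python; the grid size, the 30% same-type threshold, and the random-relocation rule are illustrative choices of mine, not Schelling's exact setup.

```python
import random

def neighbors(grid, i, j, n):
    """Return the non-empty Moore neighbors of cell (i, j) on an n x n grid."""
    cells = []
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            if (di, dj) == (0, 0):
                continue
            ni, nj = i + di, j + dj
            if 0 <= ni < n and 0 <= nj < n and grid[ni][nj] is not None:
                cells.append(grid[ni][nj])
    return cells

def unhappy(grid, i, j, n, threshold=0.3):
    """An agent is unhappy if fewer than `threshold` of its neighbors share its type."""
    me = grid[i][j]
    nbrs = neighbors(grid, i, j, n)
    if not nbrs:
        return False
    return sum(1 for x in nbrs if x == me) / len(nbrs) < threshold

def step(grid, n):
    """Move every unhappy agent to a random empty cell; return how many moved."""
    empties = [(i, j) for i in range(n) for j in range(n) if grid[i][j] is None]
    moved = 0
    for i in range(n):
        for j in range(n):
            if grid[i][j] is not None and unhappy(grid, i, j, n) and empties:
                di, dj = empties.pop(random.randrange(len(empties)))
                grid[di][dj] = grid[i][j]
                grid[i][j] = None
                empties.append((i, j))
                moved += 1
    return moved

def run(n=20, fill=0.9, steps=50, seed=0):
    """Random mix of two agent types; iterate until no one wants to move.

    Returns the average share of same-type neighbors (a crude segregation index;
    a random mix sits near 0.5)."""
    random.seed(seed)
    cells = [random.choice("AB") if random.random() < fill else None
             for _ in range(n * n)]
    grid = [cells[k * n:(k + 1) * n] for k in range(n)]
    for _ in range(steps):
        if step(grid, n) == 0:
            break
    shares = []
    for i in range(n):
        for j in range(n):
            if grid[i][j] is None:
                continue
            nbrs = neighbors(grid, i, j, n)
            if nbrs:
                shares.append(sum(1 for x in nbrs if x == grid[i][j]) / len(nbrs))
    return sum(shares) / len(shares)
```

Running `run()` typically yields an average same-type share well above the roughly 50% of a random mix, even though no individual agent asks for more than 30%: mild individual preferences aggregate into stark segregation.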

Now, cartographer Bill Rankin has produced an astounding map of Chicago which beautifully illustrates the segregation phenomenon. Eric Fischer saw these maps and took it upon himself to create similar ones for the top 40 cities in the United States. Fischer used a straightforward method borrowed from Rankin: using U.S. Census data from 2000, he created maps where one dot equals 25 people. The dots are then color-coded by race: White is pink; Black is blue; Hispanic is orange; and Asian is green. Here, I borrow Cliff Kuang's simple and efficient presentation.

Washington, D.C., for example (see figure on the right), has a stark east/west divide between Blacks and Whites.

Read more: Picture of the week: How segregated is your city?

Epistemic trust in scientific practice: The case of primate studies

A few days ago, I received a favorable review of a paper of mine. The reviewer suggested some minor improvements, one of which led me to reflect on epistemic trust in scientific practice. In the paper, I cited a recent study of which Marc Hauser was the lead author. The reviewer suggested that I replace this reference with a similar study on primate cognition. Fortunately, in this case, it turns out that other studies have reached similar findings. My paper was a revision of an earlier submission which I had been told to 'revise and resubmit'; at the time of that earlier submission, the Hauser investigation had not yet been made public.

The paper I cited was not compromised in the recent Harvard investigation, but it is nevertheless tainted, since it appeared during the period when the scientific misconduct took place. I would have changed the reference anyway, even if the reviewer had not brought it up. For some researchers, the consequences of this affair may be much more dramatic, if they directly relied on Hauser's findings in their experimental designs or conclusions. I am thinking in particular of his language research, which has led to the retraction of the 2002 paper in Cognition.

Read more: Epistemic trust in scientific practice: The case of primate studies

Creative pairs

Patti Smith and Robert Mapplethorpe (photo by Norman Seeff)

Hugo Mercier and I have of late been developing the idea that reasoning, typically seen as an activity of the individual thinker, is in fact a social activity aimed at exercising some control over the flow of communicated information: we argue in order to convince others, and we examine others' arguments in order to be convinced only when appropriate (see here). With such ideas in mind, I was struck by the opening paragraphs of the first in a series of essays on "Creative pairs", published at Slate by Joshua Wolf Shenk (the author of Lincoln's Melancholy and of essays in The New Yorker, Harper's, The Nation, Mother Jones, and the Atlantic Monthly):

"What makes creative relationships work? How do two people—who may be perfectly capable and talented on their own—explode into innovation, discovery, and brilliance when working together? These may seem to be obvious questions. Collaboration yields so much of what is novel, useful, and beautiful that it's natural to try to understand it. Yet looking at achievement through relationships is a new, and even radical, idea. For hundreds of years, science and culture have focused on the self. We talk of self-expression, self-realization. Popular culture celebrates the hero. Schools test intelligence and learning through solo exams. Biographies shape our view of history.

This pervasive belief in individualism can be traced to the idea most forcefully articulated by René Descartes. "Each self inhabits its own subjective realm," he declared, "and its mental life has an integrity prior to and independent of its interaction with other people." ..."

Read more: Creative pairs

Can Anthropologists and other Cognitive Scientists live together?

How can we go beyond the rhetorical dichotomy between nature and culture and avoid misunderstandings that repeatedly occur when social/cultural anthropologists and natural scientists try to co-operate? It shouldn’t be all that difficult if we think, as I believe we should, of human cognition not as a state but as a single process where history and individual cognitive development interact.

Bronislaw Malinowski among Trobriand Islanders, 1918

One can put the matter over-simply by saying that the theoretical starting point of, for example, a cognitive psychologist is "external", while the starting point of a social anthropologist is "internal". The analytical tools of the psychologist, the questions she asks, the categories of analysis she uses – categories such as "concepts" or "mind" – have all been defined in a discourse that is external to the subjects of the enquiry. An anthropologist, on the other hand, tries to use as the ground for her analysis the cognitive tools of the subjects of her enquiry, as these are available to them in the particular place and time in which they are located. The significance of using this "internal" baseline has been stressed by anthropologists again and again, perhaps most eloquently by Malinowski with his well-known phrase "from the native's point of view".

Read more: Can Anthropologists and other Cognitive Scientists live together?

Why pink? Color matters

Just ask yourself: Which colour do you prefer? Have you always preferred it, or did your preference change? Can you tell why you prefer pink to, let's say, yellow? If you have no answer to these questions, you may wonder what's so interesting about colour preferences. And if you have no answer, or no interest in the questions, it's perhaps because they are not very well posed.

(Image: pink tutu)

Let's first agree that color preference is an important aspect of human behavior. It influences a large number of decisions people make on a daily basis, including the clothes and make-up they wear (I will never wear a khaki jacket!), the way they decorate their homes (I like dark furniture), and the artifacts they buy or create, to name but a few examples. This is all the more remarkable given that color is, in some sense, a superficial quality that seldom influences the practical function of artifacts. What is more interesting for psychologists is that we still know very little about which factors actually determine these preferences. We still don't have a good grasp of what they are, or of how to capture them descriptively: some studies have reported universal preferences (for blue rather than red); others, preferences for highly saturated colors; some, finally, stress cultural and individual differences.

Read more: Why pink? Color matters

Paul Rozin on what psychologists should study

Paul Rozin, one of the founding fathers of cognition-and-culture studies, is a psychologist with a rich set of interests. Even though he is best known for his work on food, and disgust in particular (cockroach in your drink, anyone?), the list of his current projects alone would make many a psychological career look narrowly focused. However, this post will not dwell on the value of having such diverse interests, but on the value for psychology of adopting a richer set of methodologies.

In an insightful series of articles (see below), Rozin highlights some of the shortcomings of modern psychology (while his focus is primarily on social psychology, his remarks apply equally well to most of cognitive psychology). One of these shortcomings is the failure to sufficiently take into account—and study—cultural variability. Even the bulk of cross-cultural psychology only compares undergrads across countries (usually a 'Western' sample and an 'Eastern' sample). But Rozin draws our attention to the even less forgivable paucity of data regarding presumably less stark cultural variations along ethnic, religious, political or social lines. Understandably, for most Western researchers, a trip to Shanghai or Kyoto to carry out an experiment will be more attractive than one to, say, inner-city Detroit (I plead guilty here). But there also seems to be a publication bias: cross-cultural psychology journals are more likely to publish a comparative study of Chinese and American undergraduates than one comparing, say, blue- and white-collar workers in Philadelphia (coda: a publication bias almost automatically translates into a grant bias, which further compounds the problem). But I will not belabor this point, as the lack of cultural variability in the samples of psychologists has already been discussed on this blog.

Read more: Paul Rozin on what psychologists should study

What if there had never been a Cognitive Revolution?

In a previous post, I questioned the relevance of the label "Cognition and Culture" for our institute. Why not 'Cognition and Society' instead? Choosing 'culture' over 'society', I argued, is not arbitrary: it implies that some questions (religion, transmission) are preferred over others (cooperation, institutions). The same remark holds for the term 'cognition'. Why 'cognition' and not simply 'psychology'? Why aren't we part of an International Psychology and Culture Institute? Arguably, we use the term 'cognition' because we reckon that we are the heirs of the Cognitive Revolution. But is that really the case? Would the field of 'Cognition and Culture' be different if the Cognitive Revolution had never happened? My guess is: not so much.

Let’s imagine an uchronia, a version of our world in which history diverged from its actual course. But this time, it is not about the Nazis winning World War II, the Spanish Armada successfully invading England, or the Black Death of the 14th century killing 99% of Europe. It is about the Cognitive Revolution. Yes, that’s not the most funky uchronia ever (although some writers have imagined cognition-related uchronias, the most famous probably being The Difference Engine by Gibson and Sterling, in which Charles Babbage's Analytical Engine takes on the role of modern computers a century in advance). Still, the question remains interesting: what if the Cognitive Revolution had never happened?

Read more: What if there had never been a Cognitive Revolution?

Paul the Octopus, relevance and the joy of superstition

So, as you all know, Spain beat the Netherlands in Johannesburg on July 11, 2010, and won the football World Cup. As most of you may also know, this victory was predicted by a German octopus named Paul. Before the match, Paul was presented with two transparent boxes, each baited with mussel flesh and decorated one with the Spanish flag, the other with the Dutch flag; and, yes, Paul the octopus correctly chose the Spanish flag box. One chance out of two, you might sneer, but Paul had already correctly predicted, by the same method, the results of all seven matches in which the German team played. Counting the final, that makes eight correct predictions in a row, a series with a chance probability of (1/2)^8 = 1/256, or about 0.0039. More impressive, no? Paul the Octopus is now a TV news star: he has more than 200,000 Google entries and more than 170,000 Facebook friends; he has received both death threats and commercial offers; and so on. On July 12, Paul's owners presented him with a replica World Cup trophy and announced that "he won't give any more oracle predictions - either in football, or in politics, lifestyle or economy."
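The arithmetic above, and the selection effect lurking behind it, can be checked in a couple of lines. The figure of 200 would-be animal oracles worldwide is a made-up number for illustration, not an estimate.

```python
# One animal making eight correct fifty-fifty picks in a row.
p_one_oracle = 0.5 ** 8
print(p_one_oracle)  # 0.00390625, i.e. 1/256

# But if, say, 200 independent animal "oracles" were making predictions
# around the world, the chance that at least one of them got a perfect
# streak (and made the news) is far from negligible.
p_at_least_one = 1 - (1 - p_one_oracle) ** 200
print(round(p_at_least_one, 2))  # about 0.54
```

The unimpressive streaks never get reported, so the one perfect oracle we hear about looks far more remarkable than the process that produced him.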

Should you be impressed?

Read more: Paul the Octopus, relevance and the joy of superstition

Opacity tasting with Dan and Maurice

A few weeks ago, Maurice Bloch, Dan Sperber and several others had a debate on the nature of those beliefs (in particular 'religious' beliefs) that cannot easily be made fully explicit, or brought to bear on concrete matters (so-called 'semi-propositional beliefs', here called 'opaque' beliefs). The question, very roughly, was the following: can such beliefs be arrived at only through some form of reasoning or of trust? In other words, must they be 'reflective' rather than simply 'intuitive'? Maurice argued that such beliefs can be intuitive, but Dan disagreed (if you missed the first installments of the feuilleton, see here and here). Dan imagined a friendly discussion with Maurice taking place around a glass of wine, which has now prompted György Gergely to jump into the debate. (O.M.)

Under the mild mental pleasure that the virtual and vicarious consumption of wine in the distinguished company of such authorities on both wine and opaque beliefs (pardon me for the expression) as Dan Sperber and Maurice Bloch induced in me, I feel liberated to raise some – admittedly opaque but for me at least mildly intoxicating – questions concerning Dan’s characterization of the nature of having reflective semi-propositional beliefs.

What is it like to hold a reflective belief with opaque content?

Pierre Soulages (the master of pictorial opacity). Peinture (1956). Musée National d'art Moderne, Paris

Read more: Opacity tasting with Dan and Maurice

Homeopathy as witchcraft

The British Medical Association's annual conference of junior doctors has declared that homeopathy is witchcraft. The doctors voted for a blanket ban and an end to all placements teaching homeopathic principles to trainee doctors (the debate is quite hot in the UK, as you can see here).


Dr Tom Dolphin, deputy chairman of the BMA's junior doctors committee in England, told the conference:

"Homeopathy is witchcraft. It is a disgrace that nestling between the National Hospital for Neurology and Great Ormond Street there is a National Hospital for Homeopathy which is paid for by the NHS".

Although this comparison may be rather harsh on homeopathy, the connection between homeopathy and witchcraft may be of some interest from a Cognition & Culture point of view.

Read more: Homeopathy as witchcraft

The sacredness of God

One of the difficulties I run into in expounding Pascal Boyer's theory of the minimal counterintuitiveness of religious concepts ("MCI theory") is that many people feel that the critical feature of god concepts—the gods’ sacredness or ultimacy—is not explained by the theory.  Here I propose a sort of solution to this problem, or at least a response to the objection.

Sacredness, holiness, awesomeness, ultimacy, greatness—these terms (at least in their religious uses) denote a quality that seems to elude definition.  Let us denote the quality to which these terms refer as A. A may be a simple quality, or some compound of qualities—I do not know and it does not matter for this discussion.

In my own experience, there was a time when I used these terms because they were used by other people in my religious tradition, and early on I discerned that they were used primarily in reference to God, but occasionally and partially in reference to other things as well.  They were part of church language, to be used in religious contexts but not, at least in the same sense, elsewhere.  They were abstract and theoretical, unconnected to any perceptible quality, and emotionally sterile.  Nonetheless, I knew how to use them in socially appropriate ways.  Later, after I had a set of experiences, these terms came to life for me...

Read more: The sacredness of God

“Oy vey, have you got the wrong vampire!” A reply to Frans de Waal

I am used to being attacked by fellow anthropologists for having a naturalistic approach and for arguing that cognitive science, experimental methods, and evolutionary theorizing are highly relevant to anthropology’s pursuit. Some of these attacks have been quite violent (one, in l’Homme 1982 concluded with the suggestion that, in order to show me the irrelevance of what is in the skull, I should be given a blow on the head); few if any have paid much attention to my precise claims, but at least they were quite right in targeting me as a naturalist. I am also used to having to work harder in order to get evolutionary biologists and comparative psychologists to pay attention to what I have to say than I would have to if I were one of them. That is understandable.

However, what happened in the past few days was a novel experience.

Read more: “Oy vey, have you got the wrong vampire!” A reply to Frans de Waal

Three Questions for Michael Tomasello

Michael Tomasello is Co-Director of the Max Planck Institute for Evolutionary Anthropology, and Co-Director of the Wolfgang Köhler Primate Research Center. He has conducted and inspired research on a wide range of questions of critical relevance and foundational importance to the cognition and culture area. His 1999 book, The Cultural Origins of Human Cognition (Harvard University Press), won the 2001 APA William James Book Award and has been translated into a dozen languages. (A Brasilia bookshop sold its very last copy of Origens Culturais Da Aquisição Do Conhecimento a few days ago. I was there - Em). Tomasello's awards and distinctions include a Guggenheim Fellowship (1997), a Hegel Prize (2009), and, most recently, the 2010 Dr A H Heineken Prize for Cognitive Science. For a list of selected recent publications, see here. For his thoughts on past, present, and future, see below. All comments welcome!

Three Questions

1. In 1992, you brought us your first book, First Verbs: A Case Study of Early Grammatical Development (CUP). Your most recent book, Why We Cooperate (MIT, 2009) takes us well beyond the realms of cognitive linguistics, focusing on another old – and very hot – chestnut in the human behavioural sciences. What are some of the key intellectual landmarks that have marked your route between the two?

Read more: Three Questions for Michael Tomasello

Why do academics oppose capitalism?

A few weeks ago, Megan McArdle, the business and economics editor for The Atlantic, wondered why Academia treats its workforce so badly.

Academia has bifurcated into two classes:  tenured professors who are decently paid, have lifetime job security, and get to work on whatever strikes their fancy; and adjuncts who are paid at the poverty level and may labor for years in the desperate and often futile hope of landing a tenure track position.  And, of course, graduate students, the number of whom may paradoxically increase as the number of tenure track jobs decreases--because someone has to teach all those intro classes.

There seems to be a paradox here:

What puzzles me is how this job market persists, and is even worsening, in one of the most left-wing institutions in the country. (...) Almost every academic I know is committed to a pretty strongly left-wing vision of labor market institutions.  Even if it's not their very first concern, one would assume that the collective preference should result in something much more egalitarian.  So what's overriding that preference?

McArdle’s solution to this paradox is that Academia's leftward drift (some of it at least) can be explained by the fact that it has one of the most abusive labor markets in the world. I’d rather say that it’s probably the other way around: it is the academics’ moral judgements that permitted these inequalities. But in order to see why, we first need to understand why it is that so many academics oppose capitalism.


Read more: Why do academics oppose capitalism?

Communication, punishment and common pool resources

Economic games have been discussed several times on this blog. Their extreme simplicity makes them attractive tools for an experimental approach, but it also makes them all too perfect examples of a lack of naturalness and ecological validity. Still, many would argue that, together with formal modelling, these games have permitted important theoretical advances and demonstrated, for instance, that punishment of defectors plays a crucial role in explaining human cooperation. But is it really so? How reliable are the insights gained from simple games such as the ultimatum or the common goods games, when in real life the dilemmas people are faced with are much more complex, both in terms of the range of choices available and the dynamics of interaction over time? Elinor Ostrom, the 2009 Nobel laureate in Economics (whose prize we hailed at the time), is uniquely well placed to understand the complexity of the dilemmas that people face when they have to solve real common goods problems, having studied many such dilemmas in real life herself. She has been developing new ways to test experimentally participants’ reactions when faced with dilemmas that pose more complex problems than most experiments so far, while maintaining an adequate degree of control. The results of one of these experimental studies have recently been published in an article in Science (328: 613-617): "Lab Experiments for the Study of Social-Ecological Systems" by Marco A. Janssen, Robert Holahan, Allen Lee, and Elinor Ostrom. They put the more standard approaches in perspective and strongly suggest rethinking some of their conclusions.

A screen shot of the experimental environment of the study of Janssen et al. The green star-shaped figures are resource tokens; the circles are avatars of the participants (yellow is participant’s own avatar; blue represents other participants).

Read more: Communication, punishment and common pool resources

Believing Maurice Bloch on doubting, doubting him on believing

My friend Maurice Bloch and I have been arguing since even before we first met in the 70s. What makes it worthwhile is that there is much we agree on, and, once in a while, one of us causes the other to change his mind on some issue. There has been one issue however where I have failed to convince Maurice (and reciprocally, of course); it is about an old argument of mine regarding the disunity of beliefs. Since my 1982 paper “Apparently irrational beliefs”, I have argued that we should distinguish two mental attitudes toward a belief content, an ‘intuitive’ and ‘reflective’ belief attitude (see here). Intuitive beliefs are experienced as plain knowledge of fact without attention and generally without awareness of reasons to hold them to be facts.  Reflective beliefs are held for reasons that are mentally entertained. These reasons can be of two kinds: the authority of the source of the belief, or the sense that their content is such that it would be incoherent not to accept them.

Read more: Believing Maurice Bloch on doubting, doubting him on believing

Why do we make our tastes public?

Facebook has recently changed the way it asks its users to endorse brands and celebrities on the site. Rather than ask people to "become a fan" of say, Starbucks or Lady Gaga, Facebook will instead let users click to indicate that they "like" the item.


Facebook already lets people "like" comments or pictures posted on the site, and users click "like" almost twice as much as they click "become a fan." Facebook says that replacing "become a fan" with "like" will make users more comfortable with linking up with a brand and will streamline the site. The Independent quotes Michael Lazerow, CEO of Buddy Media, which helps companies establish their brands and advertise on social networks such as Facebook: "The idea of liking a brand is a much more natural action than [becoming a fan] of a brand. In many ways it's a lower threshold."

But while it might seem to be less of a commitment to declare that you "like" Starbucks than to announce you are a fan of it, the meaning essentially stays the same: your Facebook friends would still see that you clicked "like" on a page. And that is why users do it in the first place: to advertise their good taste or, to use Bourdieu’s famous term, their “distinction”. (Below the break is one of the famous Bourdieusian graphs relating cultural and economic capital to cultural practices. Although the data are quite old now, it is still fun to plot oneself in this kind of space.)

Read more: Why do we make our tastes public?

Pitt-Rivers haunts the Musée du Quai Branly

This post was originally published in 2006, on the Alphapsy blog.

On visiting the brand new Musée du Quai Branly in Paris last Sunday, I was amazed to meet the ghost of one of the most outdated anthropologists of the Victorian Age.

The French Président de la République is probably the most monarchic head of state in any democratic constitution; it is customary that, once in his reign, he treats himself to a grand construction. The said construction is usually located in Paris, much advertised, and preferably ugly (although François Mitterrand's Grand Louvre and Grande Bibliothèque are exceptions to that rule). This year, Jacques Chirac has offered Paris, and an amazed world, a museum of exotic art. I know I am not supposed to call it that; I know that it is all about anthropological science and respectful curiosity. But whatever the brochures might say, the spiritual father of the Musée du Quai Branly is not Claude Lévi-Strauss; it would rather be Guillaume Apollinaire, the poet who launched the "Art Nègre" fad in early twentieth-century Paris.

Read more: Pitt-Rivers haunts the Musée du Quai Branly

In praise of babies

This post was originally published in 2006 on the Alphapsy blog.

No news in this post: its only aim is to remind us of how socially savvy babies are. A review paper in press in Trends in Cognitive Sciences sums up the evidence from developmental psychology and neuroscience.

During a conference a few months ago, some members of the audience were taken aback by Jesse Prinz’ remark that “babies are dumb”. Well, they might not be at ease in the high realms of philosophy, as he is, but they do a lot of pretty smart things, especially in the domain of social cognition (as this paper reminds us; see also this very interesting paper with a different take on the data). Here are some results.

Read more: In praise of babies

The face of the thinker

This post was first published in 2006 on the Alphapsy blog.

The hindsight bias is the tendency to say, after an event has happened, that “we knew it all along”, that it’s not really surprising. This is a bias because when asked before the event, we wouldn’t have predicted it, but after it happened we think we would have (and because it can be quite irritating). A 2006 paper in Current Directions in Psychological Science (August issue) reviews the role of metacognitive thoughts and feelings in this phenomenon, and among the effects mentioned one is quite surprising: people asked to make the face we make when engaged in deep thinking were actually more doubtful about their answers, as if they had had to think hard to come up with them. The experiment is explained below.

Read more: The face of the thinker

God is dead?

This post was originally published in 2006 on the Alphapsy blog.

Atheists (and sometimes religious people too) often think that people believe in God because it shields them from the fear of their own death, or protects them from the idea that their departed loved ones are, well, just dead. Two recent studies confirm this idea: one by showing that being aware of one's own mortality increases belief in supernatural agents, the other by demonstrating that pointing out contradictions in their sacred texts increases the accessibility of death-related thoughts among fundamentalists.

Above, two ways to fight your fear of death: prayer and humour (pick your side).

Read more: God is dead?
