“Learned helplessness” and torture – is a leading US psychologist complicit?

Passing on an exchange, from the NY Review of Books, between Marty Seligman, a prominent psychologist at the University of Pennsylvania, and Tamsin Shaw, an author who suggested in an article that he was possibly complicit when the CIA developed torture techniques under former President Bush. Seligman's letter comes first, then Shaw's reply.


‘Learned Helplessness’ & Torture: An Exchange
Martin Seligman, reply by Tamsin Shaw
April 21, 2016 issue. In response to: “The Psychologists Take Power” from the February 25, 2016 issue.

[Photo: Saddam Saleh, a former prisoner at Abu Ghraib, showing a photograph of himself and other prisoners being abused there in November 2003 by US soldiers, Baghdad, May 2004]

To the Editors:

In her defamatory article [“The Psychologists Take Power,” NYR, February 25], Tamsin Shaw seeks to portray me as aiding and abetting torture. I strongly disapprove of torture and I have never and would never aid, abet, or provide any assistance in its process. I have spent my life trying to cure and prevent learned helplessness, so I am horrified that good science, which has helped so many people overcome depression, may have been used for such a bad purpose as torture.

If you accuse a fellow academic of supporting torture, you’d better have some pretty good evidence. Shaw does not. Here is her evidence and my responses:

“Seligman was one of only three witnesses out of 148 who refused to speak directly with Hoffman’s investigators, demanding instead that they send him questions in writing…”

Shaw seems to imply that I was trying to hide something by being interviewed in writing. On the contrary, I wanted the record to be public if necessary and Hoffman has refused to provide transcripts of others’ spoken interviews. Hoffman and I went back and forth for several weeks and I answered all his queries at great length.

“In December 2001, Seligman convened a meeting at his home to discuss the participation of academics in national security efforts following September 11. Among those present were CIA psychologist James Mitchell and the chief of research and analysis in the CIA’s Operational Division, Kirk Hubbard.”

This meeting occurred as described. The meeting was about how academics could counter jihadi violence. There was no mention of torture, interrogations, detainees, or any remotely related topic.
Mitchell and Hubbard were entirely silent throughout the proceedings.

“Seligman claimed to remember meeting with Hubbard on one subsequent occasion at his home, in April 2002, to discuss his theory of ‘learned helplessness’ with Hubbard and a female lawyer, and that on this occasion he was invited to speak on the theory of learned helplessness at the Survival, Evasion, Resistance, and Escape (SERE) school, sponsored by the US government. Hubbard, however, recalled meeting Seligman at his home several times after the initial meeting, including a meeting in April 2002 at which, according to the Hoffman report, ‘he, Mitchell, and Jessen met with Seligman in his home to invite him to speak about learned helplessness at the SERE school.’”

My discussions with Hubbard and Mitchell were entirely about how captured Americans could resist and evade torture. All of their questions were about captured American soldiers and what our soldiers could do. The Hoffman report verified this, and Hubbard and Mitchell testified that they never discussed interrogations with Seligman and did not provide him information about the interrogation program.

“The extent of Seligman’s further involvement has not been established, but in an e-mail sent by Hubbard in 2004, he expressed gratitude for Seligman’s help ‘over the past four years.’”

The reason that my “further involvement has not been established” is because there was none. I assume Hubbard was thanking me for the meetings above and for my pro bono lecture in May 2002 to the Joint Personnel Recovery Agency on how what is known about learned helplessness could be used to help captured American soldiers resist and escape torture. Shaw might have gone to the trouble of asking Hubbard what he was thanking me for, as did David Hoffman, uncovering no further involvement.

“The Hoffman report twice states that Seligman’s denial of any suspicion that the CIA’s interest in his theories was for use in interrogations is not credible.”
The first I heard that the CIA might be using torture in interrogation was only years later when I read Jane Mayer’s New Yorker article. It never occurred to me before that. If I had known about the methods employed, I would not have discussed learned helplessness with them.

Shaw ends her reply by denouncing “groundless psychological assumption.” But her charge against me is essentially that I might have guessed how my work was being misused, and therefore I support torture. To throw around such serious charges based on such a flimsy psychological assumption is, as Jonathan Haidt and Steven Pinker noted [“Moral Psychology: An Exchange,” NYR, April 7], conduct unbecoming a philosopher.

Martin Seligman
Zellerbach Family Professor of Psychology
University of Pennsylvania
Philadelphia, Pennsylvania
Tamsin Shaw replies:

Martin Seligman has repeatedly insisted that he is an opponent of torture. He tells us in his letter that he “strongly disapproves” of it. If he found himself at the very center of the terrible episode in our recent history in which the United States inflicted brutal torture on detainees in the Abu Ghraib prison, the Guantánamo Bay detention camp, and at CIA black sites, this was, he maintains, entirely unwittingly. And yet, since he was at the center of this episode, being in direct contact with the architects of the CIA’s torture program at the moment of its devising, there are some clear questions that a declared opponent of torture might have asked in his position.

In April 2002, Seligman was invited to give a lecture on his theory of learned helplessness (a theory perhaps better called “induced helplessness,” since it involves being placed under such psychologically devastating stress that the subject becomes helpless) at the SERE school in May of that year. He says he believed that this was solely for the purposes of helping captured American soldiers resist and evade torture. But he was not invited by the military. He was invited by members of the CIA. His principal contacts were James Mitchell and Kirk Hubbard, who attended a meeting at Seligman’s home in December 2001 and whose affiliations are listed, in the document produced by Seligman on that occasion, as CIA (Mitchell had moved to the CIA in 2001 after retiring from his position as a US Air Force instructor in the SERE school). We now know from the 2008 report by the Senate Committee on Armed Services that Mitchell was working with Bruce Jessen, his former fellow SERE instructor, to write a report on the resistance techniques used by al-Qaeda and also to study ways in which the theory of learned helplessness, employed by them both previously in SERE training, could be used in interrogations.
Seligman may have been ignorant of the fact that in April and May of 2002 he was participating in a new initiative in which the CIA and the US military would collaborate via the SERE program to devise “enhanced interrogation techniques.” But it might have occurred to him to ask why the CIA should suddenly have made the resistance and survival of US military personnel a priority for agency psychologists, even if no clear explanation presented itself at the time.

One explanation for the CIA’s interest in SERE techniques was certainly available in 2002, in the form of press reports of CIA interrogations during this period. A story by Philip Shenon, published in The New York Times on April 26, 2002, stated that “non-violent forms of coercion” were being employed, with the assistance of psychologists, in the interrogation of Abu Zubaydah, though the Bush administration claimed that the techniques used, such as sleep deprivation, fell short of torture. In December 2002 The Washington Post published a lengthy article by Dana Priest and Barton Gellman on CIA interrogations, describing the use of techniques that many people would consider torture, such as blindfolding, being bound in painful positions, being subjected to loud noises, and sleep deprivation.

And yet Seligman claims in his letter that he had no suspicions that the CIA might be using torture in interrogations until he read Jane Mayer’s article “The Experiment” in The New Yorker in July 2005. This is an extraordinary claim. Since CBS first broadcast, on April 28, 2004, photographs of prisoners being abused at Abu Ghraib, there had been tremendous public debate about the issue in America. Seymour Hersh, writing in The New Yorker in May 2004, had explicitly linked the abuses to CIA-led interrogations. The use of psychological techniques against prisoners featured heavily in many of the media reports.
And social psychology was discussed very frequently in attempts to explain the abuse, with former American Psychological Association president Philip Zimbardo writing op-eds and giving interviews on the psychological conditions under which the prison guards had acted. He raised significant issues concerning the resilience of prisoners under stressful conditions (resilience being one of Seligman’s core research interests).

In June 2004, a memo issued by the Justice Department’s Office of Legal Counsel in 2002 was leaked (as reported, for example, in a Washington Post story of June 8, by Dana Priest and R. Jeffrey Smith), in which it was made clear that the line between legal interrogation techniques and torture was being blurred. The 2002 memo, moreover, referred directly to an issue central to Seligman’s area of psychological expertise. In the Washington Post article of June 8, 2004, Priest and Smith wrote:

“For purely mental pain or suffering to amount to torture,” the memo said, “it must result in significant psychological harm of significant duration, e.g., lasting for months or even years.” Examples include the development of mental disorders, drug-induced dementia, “post traumatic stress disorder which can last months or even years, or even chronic depression.”

And yet Seligman, one of the most prominent behavioral scientists in the country, with direct links to the CIA and the military, somehow, according to his letter, remained unaware of the reports I have cited in The New York Times and The Washington Post, as well as The New Yorker report by Seymour Hersh, and the grave moral and psychological questions they raised.

On January 1, 2005, The New York Times published an article by Neil Lewis that, according to the Hoffman report, alarmed many senior members of the APA. It described in some detail the way in which psychologists were assisting in “break[ing] down” detainees at Guantánamo Bay.
In early 2005, well before Jane Mayer’s article in The New Yorker, further details emerged in articles by Gregg Bloche (in The New England Journal of Medicine) and Jonathan Marks (in the Los Angeles Times). But Martin Seligman insists that he remained oblivious to this crisis in his profession.

As I reported in my review, David Hoffman, the author of the Independent Review Relating to APA Ethics Guidelines, National Security Interrogations, and Torture, concludes: “We think it would have been difficult not to suspect that one reason for the CIA’s interest in learned helplessness was to consider how it could be used in the interrogation of others.” But he also tells us, “We do not have enough information to know what Seligman knew or thought at the time.” The question of what Seligman was thinking remains a mystery. He has not offered us an account to replace our “groundless psychological assumption.”

That question has significance as a small part of a much broader set of concerns for the psychological profession. Hoffman notes that psychologists possess “a special skill regarding how our mind and emotions work,” one that permits them to heal damaged psyches but also confers on them a special ability to cause harm. At the same time, they are especially vulnerable to conflicts of interest, owing to the enormous sums of money that the Department of Defense pours into their field. They are therefore in a position in which very serious moral failings may possibly be enacted on a very large scale. When this happens they have a special moral responsibility to analyze what went wrong.
In discussing the actions of the leadership of the APA between 2004 and 2008, Hoffman tells us that “by June 2005, it would have been clear to all well-informed observers that abusive interrogation techniques had almost certainly occurred and that there was a substantial risk they were still occurring.” Senior APA officials, the report claims, failed to investigate unsubstantiated assurances from the Department of Defense that the abuse had been halted. Hoffman tells us:

“In this situation in a criminal case, one would ask whether this intentional decision not to seek more information constituted ‘willful blindness’ or ‘deliberate avoidance….’ One common legal definition of ‘deliberate avoidance’ in this context is ‘cutting off one’s curiosity through an effort of the will.’”

But Hoffman and his team were not prosecuting a criminal case. Instead, they have provided us with an important public document and information on the basis of which any concerned person might reasonably ask questions of the leading psychologists involved. Such public questioning will inevitably make deliberate avoidance harder, and we can hope that it might even elicit valuable insights and explanations. It should in any case be welcomed by all those concerned with the moral standing of an exceptionally powerful profession.


We won the war. No we did.

It’s inevitable and natural that we think about our own kith and kin before foreigners, and I think it’s all but inevitable that we overestimate our participation in any large-scale international enterprise – and overestimate how much others esteem, admire and even know about what we’ve done (or not done). This is all the more the case in an enterprise that is seen as a) good and b) successful.

World War II fits these criteria. What follows is not an exhaustive examination of the facts, but some anecdotal impressions with a sprinkling of fact.

Certainly when I was growing up in the UK the phrase “we won the war” was not controversial. And I don’t think it would be controversial today. I remember yelling “we won the war in 1964” at my nursery school (such a satisfying rhyme) – only to be informed that it was 1962. It’s not the first time I’ve been slightly off about the war. I’m also obviously old enough to remember endlessly playing battle games with “Airfix” men – plastic soldiers – always Germans against English – and, via my older brother, ceaselessly reading black and white war comics in which German pilots were always crying out “Teufel – Englander!!” as they were shot down. Such was my moral universe.

The thing is, the phrase “we won the war” – as far as I can guess – is also not controversial today in either the US or Russia. And while of course this doesn’t strictly mean that no one else participated, it does have a tendency to play up the role of one’s own country at the expense of others.

The more I read about the war, the more fascinated I am. And then there are chance meetings. Last year I talked with a well-educated Russian guy about the war. Having read some accounts by Max Hastings, the British historian, I’d become aware – according to Hastings – that the Russian perspective on the Great Patriotic War, as they call it, is that the Western front was a side-show to the main event of the war – the enormous battles between the Germans and Russians. Hastings writes that the Russians have some reason to support this view – their losses were colossal, around 26 million or even more, and some of the battles, involving millions of soldiers, were the largest in history. Sure enough, my Russian interlocutor assured me that Russians, generally speaking, think they were responsible for about 80% of the fighting and the winning of the war. He told me that he was surprised and interested on visiting the UK to hear about something called the “Battle of Britain.”

So to the UK. We of course couldn’t possibly have won the war on our own (supposing that that claim has been made seriously…): the German military were far more powerful than the British – at least at the beginning of hostilities. Hastings also writes that, generally speaking, the British army performed abysmally during the war – it suffered defeat after defeat and only came up to a barely satisfactory level by the end of the conflict. He does say, though, that the RAF and the navy performed with distinction. The other thing that is not always remembered is how close the British came to parleying with Hitler – ie negotiating a truce after the fall of France. It was one of Churchill’s signal achievements that he persuaded the cabinet not to do this – but it was a close-run thing, and could have gone the other way. Churchill himself was a mixture of inspiring brilliance and gung-ho recklessness – a loose cannon who also made some serious military blunders.

Did the Americans win the war? Undoubtedly they were the major force in the Pacific war against the Japanese – but the British and others also fought campaigns against the Japanese. US industrial might was a major factor in winning the war in the West too, but American losses, compared to the Russians’, for example, were puny, and American involvement was much more short-lived. The US didn’t join the war for its first two years, since most of the American public was (understandably) opposed to joining in the European war.

And then there was a remark by a well-educated UNICEF colleague years ago, after I must have irritated her.

“You should be grateful that we saved your neck in the war,” she growled at me. A fair response would be: yes, the US did sell destroyers and some ships to the UK in the first years of the war, in return for large amounts of money and control of several naval bases – and this was undoubtedly useful. However, it’s perfectly clear that if Nazi Germany had succeeded in invading the UK in 1940, the US would not have intervened. (The US didn’t join until after Pearl Harbor, in December 1941.) We were saved by a number of things: the English Channel, our navy and air force, and the fact that Hitler eventually turned his attention to the USSR. So we preserved ourselves through our own efforts – yep, that “Battle of Britain” thing, the air war over the Channel in which we managed to hold off the Germans sufficiently that an invasion wasn’t practical, helped by the largest navy in the world, which also made invasion more difficult – and a fair bit of luck. After the disastrous series of defeats and withdrawals at the beginning of the war, the UK’s military was in a complete shambles – so there really was a fair bit of luck.

But then again – you come to D-Day – and popular representations of it – as in the US movie Saving Private Ryan. From that film you wouldn’t have the slightest inkling that 50% of the troops that landed that day in France were not American – the other 50% being mostly British with Canadians and others.

But another sobering thing for me – I read recently that 2.5 million Indian soldiers served overseas in the war – 2.5 million!! – and was I really aware of that? No. I wasn’t really even aware that Canadians had helped out our air force until some time ago. Oh convenient ignorance!  More than one million Indians served in the First World War as well (more than the US, one Indian writer wrote recently).

But back to my Russian friend – he told me that the annual celebration of victory in the Great Patriotic War is overwhelming and somewhat overblown – and he ruefully acknowledged my prod that – as I expected – the fact that Stalin signed a non-aggression pact with Hitler for the first two years of the war is quietly glossed over. Plus the fact that the Soviets began the war by invading Poland from the East, in coordination with Germany, which invaded Poland from the West.

So who did win the war? Well, I guess a bunch of different countries – many of which I haven’t even mentioned. Some of them were called the United Nations – the group of allies that formed during hostilities, and the forerunner of the current organization.

But I’ve learned I need to be a bit careful about my own country’s myths.

Extract from Max Hastings:

But the principal reality of subsequent military operations would be that Russians did most of the dying necessary to undo Nazism, while the Western powers advanced at their own measured pace towards a long-delayed confrontation with the Wehrmacht. For many years after 1945, the democracies found it gratifying to perceive the Second World War in Europe as a struggle for survival between themselves and Nazi tyranny. Yet the military outcome of the contest was overwhelmingly decided by the forces of Soviet tyranny, rather than by Anglo-American armies. Perversely, this reality was better understood by contemporary Americans and British than it has been by many of their descendants.

Hastings, Max (2010-04-17). Winston’s War (p. 146). Knopf Doubleday Publishing Group. Kindle Edition.

A relevant passage from Tolstoy’s War and Peace:

“He asked him to tell them how and where he got his wound. This pleased Rostov and he began talking about it, and as he went on became more and more animated. He told them of his Schon Grabern affair, just as those who have taken part in a battle generally do describe it, that is, as they would like it to have been, as they have heard it described by others, and as sounds well, but not at all as it really was. Rostov was a truthful young man and would on no account have told a deliberate lie. He began his story meaning to tell everything just as it happened, but imperceptibly, involuntarily, and inevitably he lapsed into falsehood. If he had told the truth to his hearers—who like himself had often heard stories of attacks and had formed a definite idea of what an attack was and were expecting to hear just such a story—they would either not have believed him or, still worse, would have thought that Rostov was himself to blame since what generally happens to the narrators of cavalry attacks had not happened to him. He could not tell them simply that everyone went at a trot and that he fell off his horse and sprained his arm and then ran as hard as he could from a Frenchman into the wood. Besides, to tell everything as it really happened, it would have been necessary to make an effort of will to tell only what happened. It is very difficult to tell the truth, and young people are rarely capable of it. His hearers expected a story of how beside himself and all aflame with excitement, he had flown like a storm at the square, cut his way in, slashed right and left, how his saber had tasted flesh and he had fallen exhausted, and so on. And so he told them all that.”
Tolstoy, Leo. War and Peace (Complete Version, Best Navigation, Active TOC) (p. 268). Flip. Kindle Edition.

Survey of French perceptions of who “most contributed to defeat of Germany in 45”:

[Survey chart]



I defer to expertise

Philosopher Crispin Wright

Well, often and mostly (but not always).

Why am I talking about this? Because I frequently hear, from other people, an expression of global scepticism, along the lines of:

  • you can’t believe anything the media says
  • you can’t believe what scientists say
  • you can’t believe any official statistics, ESPECIALLY government statistics

Well I believe quite a lot of what the media says, a lot of what scientists say, and a lot of official and government statistics. Do I believe uncritically and without exercising judgement? No.

With the media, I tend to believe sources that have a well-established tradition of balanced reporting – I would cite the BBC here. There are no entirely objective media outlets, of course – they all report from a perspective (the BBC from a British/European one) – and no outlet can, for practical reasons, always report on the complete spectrum of opinions; the BBC tends to focus on those opinions which are supported by significant proportions of the population. And yes, it’s imperfect; it sometimes makes mistakes. I read the New York Times and listen to NPR – and I don’t claim these are entirely objective either, though overall my judgement is that there is a commitment to accurate reporting. Sometimes they fail, and I’m also aware that both organizations are slightly left of centre – so to be fair I try to listen to and read more right-wing outlets too – pretty easy with the internet. What’s critical for me is an evident commitment to accurate and balanced reporting – organizations may fail, but if the overall commitment is there, then I am much more trusting.

With scientists, I need to check who is funding the research, whether it’s a minority position, and whether what is being said is close to a consensus position. (Yes, as astrophysicist Neil deGrasse Tyson points out, science DOES in practice proceed by consensus – viz the systematic review in health science, which assesses a large range of well-conducted trials and averages them out.) And if a scientist is making a claim that contradicts the large body of accumulated evidence, the scientist had better have extraordinary evidence for that extraordinary claim. Are scientists infallible? Of course not. But is science the best means we have of establishing the most likely explanation for phenomena? Yes. If science’s predictive powers had proven to be non-existent, it wouldn’t have a) transformed the world over the past few centuries or b) remained in such widespread use today.

One example: after my recent concussion I had consultations with two experts in the field – both with decades of experience in handling concussion cases. Both had an international reputation and lectured around the world. I was urged by an acquaintance with no medical training to check alternative websites and try out, among other things, vitamin supplements. I told her to get lost. Why? Should I put more weight on the opinions of two people with medical expertise and decades of experience (who hadn’t suggested vitamins), or on those of someone with no medical training and no expertise in the area? By the way, it’s not that the two experts in question were closed to “alternative” methods – in fact I tried out one approach, using running, which was being explored by a lab in Buffalo. The difference, though, was that there was some scientific plausibility to this “alternative” approach and it was being carried out by credible scientists. There is no evidence, incidentally, that vitamins can help with a concussion.

Government and official statistics: it depends on the government. I will tend to give more weight to statistics issued by democratically accountable governments, and much less when there isn’t real democracy. I will tend to be accepting if there is no sufficient and specific reason to disbelieve them (and the fact that they come from government is not of itself a reason). I tend to think that figures, say, from the Centers for Disease Control, are issued with the genuine intention of spreading knowledge and improving medicine. Or, for that matter, economic analyses from the Congressional Budget Office, which was set up specifically to do non-partisan research. I might, though, be more sceptical of statements from the Environmental Protection Agency on fracking, since I’ve read specific media reports from a reliable source (Propublica) that they have come under commercial pressure to soften their conclusions about fracking. (The same might apply to the FDA, but again I’d use judgement and it would depend on what subject they were reporting.)

All of this is influenced by a paper by the philosopher Crispin Wright – whose lectures I had the good fortune to attend at NYU; he’s actually a Brit, but was visiting. The paper is “Warrant for Nothing” and it’s basically an attempt at a serious and considered answer to the global sceptic who says we don’t know anything – we might be a brain being manipulated in a vat, or everything might be a dream. In other words, according to the radical, global sceptic, we can take absolutely nothing for granted and don’t know anything. You can maybe see why I think there’s a broad analogy with the global scepticism mentioned above.

Anyhow, Wright sets out to argue that there is a way we can rationally justify saying we know stuff. He argues that we are “entitled” to accept and trust things as they are – even without absolutely conclusive evidence – provided a number of conditions are met. I won’t mention all of them, but one is that we have no sufficient or specific reason to doubt, for example, that my cognitive faculties are working OK at this time, or that what I’m asserting is true. Secondly, I should behave in a way that indicates I accept whatever it is I’m asserting – anything from “I’m hungry” to “there’s a table in front of me.” My actions should confirm that I really do hold those beliefs. We are entitled to assume that the laws of nature are uniform, Wright argues, so long as there is no better way to understand nature (other than observing and assuming regularities), since making this assumption allows us to live happy and stable lives, and since we have no specific or sufficient reason to doubt that the laws of nature are uniform. Sure, we need to examine and investigate our assumptions in a responsible way, and we shouldn’t use methods and assumptions that we have reason to doubt before coming to conclusions.

On these bases, Wright concludes, we are rationally entitled to say we know stuff DESPITE the attack of the sceptic, even though we can’t base everything on absolutely rock-solid evidence – we trust instead in certain assumptions that, Wright argues, are reasonable to make. It’s a dense article – it certainly took me several reads to get a grasp of it – but I think he has some really important insights. (I admit I haven’t done full justice to it – it’s very elegantly and subtly argued, and I’ve reduced it to a few crude extracts, which are hopefully not wholly misleading.)

In short, I think scepticism is often a valuable approach, but when it’s applied globally, it’s not warranted or useful, and ultimately leads to paralysis. If we believe nothing that we see or hear, we can’t function as a society, and hardly even as an individual. The old saying “Question Everything” has some value in its spirit but taken literally it’s an absurd and destructive idea. Quite a lot of the time, there are good reasons for accepting information, and for deferring to expertise.


Things have never been this good in human history, Part 3

End of term report from the UN

Fifteen years ago, world leaders agreed to a set of development goals to be achieved by this year – on broad measures like education, poverty, healthcare etc. (the Millennium Development Goals). There’s been massive progress in many areas.

One or two highlights:

  • MDG 1: Eradicate extreme poverty and hunger
      • Extreme poverty: In 1990, nearly half of the population in the developing regions lived on less than $1.25 a day. This rate dropped to 14 per cent in 2015. Globally, the number of people living in extreme poverty has declined by more than half, falling from 1.9 billion in 1990 to 836 million in 2015, with most progress occurring since 2000.
      • Hunger: The proportion of undernourished people in the developing regions has fallen by almost half since 1990, from 23.3 per cent in 1990–1992 to 12.9 per cent in 2014–2016.
  • MDG 4: Reduce child mortality
      • Child mortality rate: Globally, the under-five mortality rate dropped from 90 to 43 deaths per 1,000 live births between 1990 and 2015. Despite population growth in the developing regions, the number of deaths of children under five declined from 12.7 million in 1990 to almost 6 million in 2015 globally.
      • Infectious diseases: Measles vaccination helped prevent nearly 15.6 million deaths between 2000 and 2013. The number of globally reported measles cases declined by 67 per cent. About 84 per cent of children worldwide received at least one dose of measles-containing vaccine in 2013, up from 73 per cent in 2000.
  • MDG 2: Achieve universal primary education
      • Primary school enrolment: In the developing regions, the primary school net enrolment rate has reached 91 per cent in 2015, up from 83 per cent in 2000. Sub-Saharan Africa recorded the best progress in primary education, with a 20 percentage-point increase in the net enrolment ratio from 2000 to 2015, compared to an 8 percentage-point gain between 1990 and 2000.
      • Out-of-school children: Globally, the number of out-of-school children of primary school age has fallen to an estimated 57 million in 2015, down from 100 million in 2000.
      • Literacy rate: Among youth aged 15 to 24, the literacy rate has improved globally from 83 per cent to 91 per cent between 1990 and 2015, and the gap between women and men has narrowed.

UPDATE 2019 – under five mortality rates globally have more than halved in the last 20 years.

In short, much still needs to be done – but there has also been massive, positive progress.

Source: the UN

Things have never been this good in human history, Part 1

Things have never been this good in human history, Part 2

1 Comment

Filed under Uncategorized

Free will – some thoughts

Michael Sandel on Kant:

Acting morally means acting out of duty—for the sake of the moral law. The moral law consists of a categorical imperative, a principle that requires us to treat persons with respect, as ends in themselves. Only when I act in accordance with the categorical imperative am I acting freely. For whenever I act according to a hypothetical imperative, I act for the sake of some interest or end given outside of me. But in that case, I’m not really free; my will is determined not by me, but by outside forces—by the necessities of my circumstance or by the wants and desires I happen to have. I can escape the dictates of nature and circumstance only by acting autonomously, according to a law I give myself. Such a law must be unconditioned by my particular wants and desires. So Kant’s demanding notions of freedom and morality are connected. Acting freely, that is, autonomously, and acting morally, according to the categorical imperative, are one and the same.

Sandel, Michael J. (2009-09-15). Justice: What’s the Right Thing to Do? (pp. 123-124). Macmillan. Kindle Edition.

I can’t really agree wholly with what Kant says (according to Sandel’s interpretation). The problem lies in the phrase “according to a law I give myself.” For how are we free to give a law to ourselves – and how can we be free from our wants and desires, or the necessities of our circumstances, when we do so? I don’t think we can be free of these things when we “give a law to ourselves”, so it’s not clear how we can be free according to this definition.

However, I quite like the idea of adapting this slightly. I think for free will to make sense we have to move away from “acting without constraint” since I don’t think that’s possible – and I think Kant opens up a pathway of thinking toward this.

But first of all here’s the problem. We always have constraints – namely our environment, genes, upbringing, concomitant wants and desires etc. etc. In that sense I am a strict determinist. I can’t see how we can do anything other than we actually do in any given moment – since every atom in our body is already set in a certain way. So to me the traditional, conventional idea of free will doesn’t make sense – since it can’t escape the above – it’s based on a bogus idea of an individual who is somehow completely autonomous who can will something into existence that is not defined and controlled by precursors.

(Side note: OK quantum physics says events at that level can happen without precursors, however, that doesn’t provide the sense of agency required for us to have free will: according to the traditional idea, we have to generate acts ourselves, not be subject to some unknown process that we have no say or control over. And secondly, while there may be events without causes at the quantum level, this doesn’t at all mean that there are therefore events without causes at the level of human activity and implementation – and I’m not aware that physicists have been able to show anything of the sort – so quantum doesn’t satisfy the requirements of traditional free will and hasn’t been shown to have an effect at the human level of implementation.)

The conventional idea, I think, is that we have the capacity to act spontaneously WITHOUT any constraint – but this involves a weirdly unrealistic idea of the self. Our self is already and inevitably embedded in the environment and the universe – it’s not separate or autonomous – there simply is nowhere that a separate or autonomous choice can be generated from. In order to make a choice there has to be an agent – ie ourselves – and our self is made up out of the universe and not sui generis. Therefore as far as we can tell, only one outcome at a time is possible or conceivable for every human act. Therefore I can’t see how this traditional idea of free will makes sense.

But following on from Kant I propose a different definition of free will: to exercise free will is to follow a rational course of behaviour or behaviours.

Yes, following such a course would still be entirely deterministic, but I say (and I’m not, of course, being hugely original – Daniel Dennett, to name just one, has a similar concept) that this is a workable and more useful definition of free will.

There’s a practical challenge with this definition (though we don’t necessarily have to solve it for my proposal to hold) – how do we know or decide what’s rational? Since this is, at the very least, very difficult to determine in practice, we’d have to decide it by discussion, consensus and comparison, using the best information we have and the input of well-informed people.

The Kant passage did make me think of things that are NOT determined by physical forces, or indeed by the laws of physics – the laws of logic, for example. And logic can make a contribution to our actions – it can be a factor in our decision-making. Thus it’s true that we are not necessarily entirely physically determined.

However, logic – and rationality based on logic – are still wholly deterministic and still, as I think about it, effectively part of our environment. It’s just that I hadn’t thought about the role of logic and rationality – as non-physical entities that nevertheless influence behaviour – before. This made me more open to the Kantian idea that free will is following a law, and following reason. This doesn’t give us free will in the traditional (and as I argue, flawed) concept of free will – but it does provide an opening for a new definition.

Free will therefore, could be “to follow the laws dictated by reason – or rationality.”

IF we are following the laws dictated by reason – even though we can do no other and are completely subject to a deterministic universe – we are exercising free will.

Leave a comment

Filed under Uncategorized

Agriculture – humanity’s worst mistake – or not?

Jared Diamond’s article, “The Worst Mistake in the History of the Human Race,” written in 1987, has reportedly become a standard discussion point in anthropology classes – and it’s definitely a fascinating and provocative statement. His central thesis is as follows:

Now archaeology is demolishing another sacred belief: that human history over the past million years has been a long tale of progress. In particular, recent discoveries suggest that the adoption of agriculture, supposedly our most decisive step toward a better life, was in many ways a catastrophe from which we have never recovered.

Article: http://www.ditext.com/diamond/mistake.html

I’ve read Diamond’s “Guns, Germs and Steel” and found it fascinating and convincing – although I know there are disputes about it – but I have a feeling I should look at it again.

In that book, he at the least raises some very good questions and makes some good points. I remember from “Guns” how certain diseases – especially those that transferred over from animals – only developed after humans adopted agriculture etc. So it’s clearly the case that agriculture produced problems. The question raised by his article is whether on balance agriculture was a good or a bad thing for the human race.

So these are the questions and comments the article provoked in me:

Interestingly – since I would have expected Diamond to go the other way, given his strongly implied critique of modern society in this piece – Diamond agrees with other experts, including Steven Pinker (see link to my other blog post below), that early, hunter-gatherer societies were considerably more violent than modern societies, based on the evidence we have.

See: http://www.jareddiamond.org/Jared_Diamond/Rousseau_Revisited.html

So I would take that to be a significant counter-example to his argument that introducing agriculture (and, by implication, the subsequent development course of the human race) was a mistake – given that presumably one would agree that lower average levels of violence among human beings are a good thing.

A very basic point – in terms of survival, and in terms of becoming the dominant form of human existence: hunter-gatherer societies have clearly not been successful by comparison. Their populations are now minuscule next to those of humans living on the basis of agriculture and its consequences.

I frequently find Diamond’s evidence unsatisfying and rather unconvincing. For example, his argument that the average amount of time spent obtaining food appears less IF we compare it (by implication) to people’s average work hours today. But first of all this is based on TWO examples of hunter-gatherer groups. Which raises the question – what is the AVERAGE time spent by hunter-gatherers obtaining food? If Diamond is also including “working” for food (rather than simply “obtaining” – although this is left unclear *see updated note below), then modern-day people do spend considerably more time than 12, 14 or 19 hours a week working (on average), but they are certainly working for more things than just food, many of which wouldn’t have been available to Bushmen. We can debate the merits of those things – but it’s certainly not just food. And the reference also leaves out how much time hunter-gatherers spend working for things other than food.

Same point with his reference to ONE study of Bushmen’s diets, saying its calorific intake was high – what was the average including the OTHER studies?

There seems to me to be a basic retort that undermines his overall argument: even if we allow that conditions for human beings were INITIALLY worse under agriculture, that of course doesn’t mean it didn’t eventually lead to much better conditions. And remember, Diamond is boldly playing the long game in his argument – he’s talking about the last million years of history – so he’s inviting a critique on these terms.

For example – he says that lifestyles of hunter-gatherers weren’t nasty or brutish – but they were definitely short.

Quote, from a study in Illinois:

“Life expectancy at birth in the pre-agricultural community was about twenty-six years,” says Armelagos, “but in the post-agricultural community it was nineteen years.”

Great – but 26 years is supposed to compare favourably to now? Average life expectancy for ALL humans born in 2013, according to WHO, was 71.

More trivially, he notes the interesting fact that hunter-gatherers in the region of Turkey and Greece averaged 5ft 9 for men – and that average height was considerably lower by 3000 BC under agriculture. I checked, and it has risen back above that today – even if it took a long time!

But given the considerations above, his statement that “with the advent of agriculture an elite became better off, but most people became worse off” is demonstrably false – in the long term.

Despite continuing, serious problems, most people – a large majority of the current 7 billion on the planet – are definitely better off today, by most of the usual measurements, than they were as hunter-gatherers: the classic health outcomes of under-five mortality and longevity, a much lower chance of dying violently, not to mention access to things that hunter-gatherers couldn’t have had – information about faraway places, new and different ideas, faraway peoples.

See for example “Getting Better” by economist Charles Kenny. Though in fact, even the blurb for this is misleadingly pessimistic – the overall trend is for convergence between economies and incomes worldwide (yes, between the developed and developing world) – even if there’s been increasing inequality recently within the West. See economist Thomas Piketty:

A global convergence process in which emerging countries are catching up with developed countries seems well under way today, even though substantial inequalities between rich and poor countries remain.

Piketty, Thomas (2014-03-10). Capital in the Twenty-First Century (p. 72). Harvard University Press. Kindle Edition.

Lurking behind (or not so behind) much of his argument is a critique of population growth: “Some bands chose the former solution, unable to anticipate the evils of farming, and seduced by the transient abundance they enjoyed until population growth caught up with increased food production.”

But while population growth is an important issue, most experts expect it to tail off during this century and to level out:

The rate of global population growth peaked in the period 1950–1970 at nearly 2 percent per year and since then has decreased steadily. Although one can never be sure of anything in this realm, it is likely that this process will continue and that global demographic growth rates will decline to near zero in the second half of the twenty-first century. The shape of the bell curve is quite well defined (see Figure 2.2).

Piketty, Thomas (2014-03-10). Capital in the Twenty-First Century (p. 99). Harvard University Press. Kindle Edition.

And according to the World Bank: “In 1960, women worldwide had an average of 5 children. The rate has since halved, and in 2012, women had an average of 2.5 children across all regions.”

Warnings by population alarmists like Paul Ehrlich – who predicted that “hundreds of millions” would die of famine in the 70s and 80s – have proven to be wildly off target. (A very generous reading of famines in that period might put the toll at 4.5 million deaths – meaning that, at the very least, Ehrlich was off by a factor of more than 40.)
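A quick back-of-the-envelope check of that factor – my own illustration, which assumes “hundreds of millions” is read at a lower bound of 200 million; that figure is my assumption, not one taken from Ehrlich or the famine studies:

```python
# Hypothetical sanity check of the "off by a factor of more than 40" claim.
# Assumed inputs: 200 million as a lower-bound reading of "hundreds of
# millions" predicted famine deaths; 4.5 million as the generous estimate
# of actual famine deaths in the 1970s-80s.
predicted_deaths = 200_000_000
actual_deaths = 4_500_000

factor = predicted_deaths / actual_deaths
print(round(factor, 1))  # roughly 44.4 - comfortably more than 40
```

Even on this most charitable reading of both figures, the prediction overshoots by well over an order of magnitude.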

And the average person lives with an abundance of food available to him or her, not to mention safe water (yes, many don’t, but the vast majority do have these things).

And finally, without agriculture we could not have organized or developed our current societies – with their systems of government, education, technical advances, and massive developments in healthcare, transport, and the dissemination of information and knowledge. We now have the most extraordinary access to an enormous wealth of information and knowledge, literally at our fingertips. For me that is an absolutely stupendous development. We would not have had the transformation provided by the scientific process. We would have no books, no internet, no spread of abstract thinking, none of the intellectual architecture that developed after the Enlightenment – including human rights and international law; and, indeed, no discussions like this. Maybe more trivially, we wouldn’t have had comforts like heating, refrigeration, and far greater protection from the elements. Less trivially (I suppose), I wouldn’t have been alive to write this, and it’s a fair guess that friends of mine reading this wouldn’t be alive either. And while Diamond argues there was art under the hunter-gatherers – beautiful stuff, of course – we have had an unimaginable outpouring of creativity in all the arts for centuries, most of it (not to mention the consumption of said art) made possible by living in settled, organized societies.

So the question remains, on balance, whether agriculture was a good thing or a bad thing for humanity – whether agriculture was a “mistake”. It’s a question that’s hard to answer, because you can’t distance yourself to make a disinterested choice, and I don’t think humanity really “chose” its course in any meaningful sense. I think it’s more of a gut-level, emotional question. Does modern society fill me with such despair that I think it would be better – let’s say if it were practically feasible – to return to a hunter-gatherer lifestyle? I think I definitely come to a different conclusion from Jared Diamond.

And here’s why I’m more of an optimist – another of my blog posts.



Post script – this recent article bears out many of my questions: working time was much longer than the claims above, and leisure time was undercut by the conditions of violence and threat hunter-gatherers usually live under. The claim about working hours had to be radically revised when the original author admitted he hadn’t included time for food processing, tool making, or general housework.

The threat of disease was actually higher in mobile groups than in more sedentary groups that used horticulture: “Much is made of the increased risk of infectious disease in large, concentrated, sedentary populations, but comparatively little attention has been given to the risk of ‘traveler’s diarrhea’ common among hunter-gatherers. For mobile groups, infants, the elderly, and other vulnerable individuals have little opportunity to develop resistance to local pathogens. This may help explain why infant and child mortality among hunter-gatherers tends to be so high. Across hunter-gatherer societies, only about 57% of children born survive to the age of 15. Sedentary populations of forager-horticulturalists, and acculturated hunter-gatherers, have a greater number of children surviving into adulthood, with 64% and 67%, respectively, surviving to the age of 15.”

Leave a comment

Filed under Uncategorized

Three recent films – South Sudan, Detroit and the Central African Republic

After a long period of lesser activity following my head injury (bike accident in Central Park) I can claim to have become more active in recent months.

Three films I’ve created – all for the UN

Defying death in South Sudan

You are an unarmed UN civilian in charge of a compound where 12,000 people have fled for safety. You are confronted by 80 armed soldiers and a government minister demanding to enter the compound. You know they will almost certainly kill people if they get in. What do you do?

Central African Republic: the path out of violence

A small country and an almost forgotten crisis. It’s a nation submerged in violence, hatred and instability. But human rights campaigners are striving to change that dynamic.

Made for UN TV and distributed to broadcasters worldwide. English and French, with English subtitles.

Detroit: Water not flowing

A bankrupt city needs revenues, so it attempts to force people to pay water bills by shutting off supplies. Is that the best way forward?  The UN gets involved after local groups appeal to the world body for help over the shut-offs.

Leave a comment

Filed under Uncategorized

Things have never been this good in human history

Yep. I mean it. Apologies if you were expecting this to be a wry and unoriginal joke.

We – the human race – have never had it so good.

There’s plenty of evidence for this.

The chances of the average global citizen, born today, dying from violent causes are the lowest in human history. Wars are far less frequent today than in all of recorded history. Combat deaths are massively down. Criminal violence is massively down. All this from the long-perspective of the last three millennia. Why? Trade, greater stability and societal control compared to ancient societies, spread of ideas and knowledge leading to changing values and attitudes, international agreements, human rights promotion, diplomatic and peacekeeping interventions, and much more.

References: “The Better Angels of Our Nature: Why Violence Has Declined” – massively researched, and densely supported by statistics – by the Harvard-based psychologist Steven Pinker.

“Why Nations Fight.” Based on his case-study of 100 conflicts since 1648 Richard Ned Lebow concludes, among other things, that war is on the decline since attitudes, values and motives for violent conflict have changed markedly in that time.

What about health, education and general welfare? An extract from Charles Kenny’s “Getting Better:”

“Fifty years ago, more than half the world’s population struggled with getting enough daily calories. By the 1990s, this figure was below 10 percent. Famine affected less than three-tenths of 1 percent of the population in sub-Saharan Africa from 1990 to 2005. …Virtually everywhere, infant mortality is down and life expectancy is up. In Africa, life expectancy has increased by ten years since 1960, despite the continent’s HIV pandemic. Nearly 90 percent of the world’s children are now enrolled in primary schools, compared with less than half in 1950.”

The number of children dying from preventable disease is falling in absolute terms, despite the world’s increasing population – and of course in percentage terms – partly as a result of successful medical interventions, especially vaccination. (Source: UNICEF)

For example, measles vaccine has led to a huge drop in global deaths from the disease. In the 1980s, measles killed 2.6 million a year. In 2016, for the first time since records began, deaths fell below 100,000.

Hundreds of millions have escaped absolute poverty in recent decades – largely due to the economic advances of China and India – but other areas, including elsewhere in South Asia, have also been transformed.

Eighty-nine countries—which represent nearly half the world’s population—are “free,” according to the Freedom House measures, and 116 are electoral democracies (out of 193). Twenty years ago, only 61 and 76 fit those respective categories. (To be sure many of the democracies are very imperfect, but the long-term trend is still encouraging overall.)

And getting a bit more parochial – what about efforts to alleviate poverty? For example, did President Johnson’s War on Poverty succeed? (ie Medicaid, Medicare, housing subsidies, guaranteed income to elderly and disabled, food stamps and other tax benefits – some of which came after but were inspired by Johnson). Largely it did. Adjusting for differing benefits, the poverty level in the US fell by as much as three quarters (roughly 19% to 5%) between 1964 and 2013.

Source: “The War on Poverty: Was it Lost?” by Christopher Jencks. New York Review of Books, April 2015.

Is it all good news? The major bad news I would allow is climate change – an enormous global challenge, though a new climate agreement is likely to be sealed soon in Paris. Not enough, but a beginning. But this threat can’t be discounted. It will have to be dealt with.

I am not saying, of course, that violence, poverty, ill-health and prejudice don’t still exist – nor that things are fairly distributed globally. What I am saying is that, measured against previous human history, almost everything is the best it’s ever been for the human race as a whole.

Why do I say this? Because it’s important to focus on what works. If we keep repeating that things have never been this bad, we will obscure the opportunity to learn from what has helped the human race.

Are these improvements going to continue inevitably? Of course not – which is why it’s all the more important to learn from our successes.

So let’s celebrate the good – and focus on what works.

Things have never been this good part 2 – a very funny but also trenchant TED talk from a Swedish global health expert.

Things have never been this good in history, Part 3

Article from 2017 confirming a great deal of progress – for example:

“On any given day, the number of people worldwide living in extreme poverty:

A.) Rises by 5,000, because of climate change, food shortages and endemic corruption.

B.) Stays about the same.

C.) Drops by 250,000.

Polls show that about 9 out of 10 Americans believe that global poverty has worsened or stayed the same. But in fact, the correct answer is C. Every day, an average of about a quarter-million people worldwide graduate from extreme poverty, according to World Bank figures.

Or if you need more of a blast of good news, consider this: Just since 1990, more than 100 million children’s lives have been saved through vaccinations, breast-feeding promotion, diarrhea treatment and more. If just about the worst thing that can happen is for a parent to lose a child, that’s only half as likely today as in 1990.

When I began writing about global poverty in the early 1980s, more than 40 percent of all humans were living in extreme poverty. Now fewer than 10 percent are. By 2030 it looks as if just 3 or 4 percent will be. (Extreme poverty is defined as less than $1.90 per person per day, adjusted for inflation.)”



Filed under Uncategorized

Science and religion do not get along



I’ve frequently heard and read the opinion that science and religion are compatible – after all there are plenty of religious scientists, and, anyway, isn’t belief in religion ultimately the same as belief in science?

I’ve concluded that they’re not compatible – neither fundamentally nor conceptually. I think when people argue for compatibility they are mostly arguing for social cohesion. After all, most people have religious beliefs, and science and its products are very prevalent in almost every society – so it’s an intolerable situation if the two can’t sit alongside each other. So when people say the two are not opposed, I think it’s mostly a wish for social harmony – a need for social compatibility. While that is (maybe) a laudable aim, I don’t think that the two spheres can be reconciled conceptually.

I’ve often heard the argument – and I’m sure many others have – which goes along the following lines: “Well you believe in science – and I believe in God – so you can’t claim any priority for your belief over mine – we are both in the same boat – you depend on belief just as much as I do.” This is usually accompanied by a smug smile – and an almost audible “Gotcha on that one!”

Leave a comment

Filed under Uncategorized

‘Fire Island’ on its travels

I shot a short drama film last October on Fire Island – unfortunately I didn’t get back to the island this year (probably because I’ve been recovering from concussion for the past seven months). The film is now out being pitched on the (rather expensive) film festival circuit. I suspect – as with pitching for a job – it makes a huge difference if you establish personal contacts with people on that circuit, but so far it’s been a strictly internet, cold-call approach. Still, I’m enjoying having this small snapshot of a relationship out there.

This is the film’s website. 

Extract below.

Fire Island: a new short drama by Francis Mead. Duration: 15 minutes.

Mike and Leah have been together for seven years, but they have never made a decision. As their summer rental on Fire Island draws to a close, things are looking bleak. They quickly discover what lengths they are willing to go, both to avoid, and then to force, a choice. Their future depends on the flip of a coin.

Shot on location at Davis Park on Fire Island.

Writer/Director Francis Mead:

“From my personal perspective, having a child and getting married are absolutely terrifying choices. Not very long ago there was very little freedom around these decisions. Of course I prefer to have those choices – but I think one of the most insidious ideas, that I espoused when I was (slightly) younger, was that “settling down” was a death experience, an inescapable trap. It was a no-brainer that settling down as a married couple, especially with children, was the antithesis of happiness. But now I’ve come to be much more interested in people in long-term relationships. Of course I like romantic stories about people when they first meet and fall in love – but there’s an enormous wealth of material to be explored in, for example, marriages. And maybe, just maybe, a lot of those long-term married people know more than the rest of us about happiness.”

Leave a comment

Filed under Uncategorized