Knowledge never progresses unencumbered by ordinary human politics. Clubbiness, careerism, prejudice, personality clashes, bigotry, corruption, charm, and other human factors affect the advancement and dissemination of all knowledge, even in the hallowed academies of the West. While the scientific disciplines may have the best inbuilt methodologies for self-correction, still their practice isn’t immune to these impairments of judgment and objectivity.
In his recent Guardian article, The Sugar Conspiracy, Ian Leslie reminds us of how individual personalities, or even the fashionability of ideas, can dominate, pervert, or slow the progress of entire fields of science. He writes,
In a 2015 paper titled Does Science Advance One Funeral at a Time?, a team of scholars at the National Bureau of Economic Research sought an empirical basis for a remark made by the physicist Max Planck: “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.”
The researchers identified more than 12,000 “elite” scientists from different fields. The criteria for elite status included funding, number of publications, and whether they were members of the National Academies of Science or the Institute of Medicine. Searching obituaries, the team found 452 who had died before retirement. They then looked to see what happened to the fields from which these celebrated scientists had unexpectedly departed, by analysing publishing patterns.
What they found confirmed the truth of Planck’s maxim. Junior researchers who had worked closely with the elite scientists, authoring papers with them, published less. At the same time, there was a marked increase in papers by newcomers to the field, who were less likely to cite the work of the deceased eminence. The articles by these newcomers were substantive and influential, attracting a high number of citations. They moved the whole field along.
In this context, Leslie goes on to narrate the story of how, for decades, American nutritional science chased doggedly down a rabbit hole of false conclusions about the probable causes of heart disease, under the influence of decidedly non-scientific factors. A prevailing theory became fashionable, and contradictory data was shouted down; those presenting it were professionally attacked. The shaming and silencing of alternative lines of questioning surely contributed to the ongoing public health crisis we now face, in which at least two generations of people are suffering epidemic rates of obesity and diabetes. Leslie lays it out,
Population genetics is an emerging field that’s shedding new light on ancient human migrations. It complements linguistics and archaeology, which have until now been the primary avenues for understanding prehistory. David Reich, a leading geneticist and a Harvard professor, has taken special interest in the much contested issue of the original homeland of Indo-European (IE) languages and the mixing of populations in India. Watch a video conversation with him on the edge.org page below (also transcribed).
Nothing Reich says will comfort the “out-of-India” theorists, largely a Hindutva brigade of “scholars” who claim that there was no Aryan migration into India; that instead a migration happened from India to Europe; that IE languages originated in the Indian Subcontinent from a proto-Sanskrit; that the people of the Indus Valley Civilization spoke this proto-Sanskrit (never mind that their script remains undeciphered; there’s no consensus on whether it is even a linguistic script); that the Vedas are wholly indigenous in inspiration, etc. It’s amazing how many people on the Internet confidently assert that the Aryan migration theory has been “discredited”.
Of course, much of this was, and is, nationalistic windbaggery, based on wishful thinking and gaps in rival theories, not on any solid evidence from linguistics or archaeology. Population genetics is now producing a clearer picture, one that may settle the question once and for all. But we're not there yet, even though Reich's work has bolstered the Kurgan hypothesis, which puts the IE homeland in the Pontic-Caspian steppe. Watch this field for more definitive revelations in the years ahead.
I have a piece in The Wire today: The Road to Fixing Air Pollution in Delhi, Beyond Odd-even. Among other things, it attempts to distill the research and learning from my recent months at the Delhi Dialogue Commission, an advisory body to the Government of NCT of Delhi. There's also an announcement of my talk this weekend, which is open to all.
An unprecedented public health crisis has been unfolding in Delhi: 40% of our kids now fail lung capacity tests. Respiratory emergencies have tripled in the last seven years, with no relief in sight. Just breathing our air, full of toxic gases and particulates, has raised the incidence of strokes, heart disease, cancers, birth defects, pneumonia, and more. In Delhi alone, an estimated 80 people are dying daily from conditions provoked by air pollution. Much like smoking cigarettes, it’s shaving years off our lives.
Though some fare worse than others, none are immune: rich or poor, young or old. A high burden of disease erodes quality of life, family finances, and the economy. What will be the cost of this health crisis, in human lives, in healthcare, in lost productivity?
It’s a good thing the AAP government plans to build a thousand Mohalla clinics, because what’s unfolding is far bigger than last year’s dengue scare in Delhi. Though experts have long known these health effects of air pollution, years of apathy, ignorance, and denial—among both citizens and politicians—have led us here. So how serious are the Aam Aadmi Party (AAP) and Bharatiya Janata Party (BJP) governments about fixing this menace? How well do they understand the gravity of the situation?
A Plea for Culinary Modernism is a thought-provoking essay on modern food and our attitudes towards it by Rachel Laudan, food historian and philosopher of science and technology. "The obsession with eating natural and artisanal," she argues, "is ahistorical. We should demand more high-quality industrial food." She is also the author of "Cuisine and Empire: Cooking in World History", now on my reading list.
As an historian I cannot accept the account of the past implied by Culinary Luddism, a past sharply divided between good and bad, between the sunny rural days of yore and the gray industrial present. My enthusiasm for Luddite kitchen wisdom does not carry over to their history, any more than my response to a stirring political speech inclines me to accept the orator as scholar.
The Luddites’ fable of disaster, of a fall from grace, smacks more of wishful thinking than of digging through archives. It gains credence not from scholarship but from evocative dichotomies: fresh and natural versus processed and preserved; local versus global; slow versus fast; artisanal and traditional versus urban and industrial; healthful versus contaminated and fatty. History shows, I believe, that the Luddites have things back to front. That food should be fresh and natural has become an article of faith. It comes as something of a shock to realize that this is a latter-day creed. For our ancestors, natural was something quite nasty. Natural often tasted bad.
"Under the Dome" is a brilliant documentary on air pollution in China that has been seen by millions. Scary as hell. India is catching up fast and would do well to avoid some of China's mistakes. Not likely though. Things are going to get much worse in India before people wake up.
A film on the life and work of three Indian scientists: Satyendra Nath Bose, Chandrasekhara Venkata Raman, and Meghnad Saha, "the significance of whose contributions are of vital importance even today in quantum physics, fibre optics, nuclear science or astrophysics." The film's biographical sketches are celebratory and tinged with patriotic pride, but it still furnishes an engaging overview of their life and work.
"We live in a world of unseeable beauty, so subtle and delicate that it is imperceptible to the human eye. To bring this invisible world to light, filmmaker Louie Schwartzberg bends the boundaries of time and space with high-speed cameras, time lapses and microscopes. At TED2014, he shares highlights from his latest project, a 3D film titled Mysteries of the Unseen World, which slows down, speeds up, and magnifies the astonishing wonders of nature." Must see.
In this brilliant talk, Dr. Robert Lustig persuasively argues that sugar, based on how our bodies metabolize it in the liver, is no less a poison than alcohol. He explains how our bodies process different carbohydrates like glucose, sucrose (table sugar), and fructose, and why sugar in the latter two forms is the primary cause of obesity, high blood pressure, heart disease, diabetes, and more. He also debunks many common myths of health and nutrition by showing that a calorie is not a calorie (its source matters), that exercising is not about burning calories but improving metabolism, that fat is nowhere near as bad as sugar, etc. Also read this review of the related new documentary, Fed Up.
.... [Head keeper Jerry] Stones finally managed to catch Fu Manchu in the act. First, the young ape climbed down some air-vent louvers into a dry moat. Then, taking hold of the bottom of the furnace door, he used brute force to pull it back just far enough to slide a wire into the gap, slip a latch and pop the door open. The next day, Stones noticed something shiny sticking out of Fu's mouth. It was the wire lock pick, bent to fit between his lip and gum and stowed there between escapes.
Apparently, orangutans are the escape artists of the animal world. This particular incident happened back in 1968, but scientists at the time weren't paying attention; they were busy in their labs, putting apes through language experiments and other assigned tasks.
However, Eugene Linden, author of several books on animal intelligence, found it more than interesting. Linden's 1999 article on animal intelligence is remarkable for its astute use of anecdote. Linden had realized what "now seems obvious: if animals can think, they will probably do their best thinking when it serves their purposes, not when some scientist asks them to", and he then began to speak to a broad range of people who work intimately with animals: zookeepers, veterinarians, trainers, and yes, researchers. He says,
Get a bunch of keepers together and they will start telling stories about how their charges try to outsmart, beguile or otherwise astonish humans. They tell stories about animals that hoodwink or manipulate their keepers, stories about wheeling and dealing, stories of understanding and trust across the vast gulf that separates different species. And, if the keepers have had a few drinks, they will tell stories about escape.
Each of these narratives reveals another facet of what I have become convinced is a new window on animal intelligence: the kind of mental feats they perform when dealing with captivity and the dominant species on the planet--humanity.

Though it's an old article, it has only recently become available online. It remains worth reading for its astonishing and amusing stories of animal wit, and the lack thereof. Among other things, it makes clear how complex, non-linear, and multifaceted anything we might call intelligence is. Certainly animals have it, but with lacunae in areas one might not expect.
The Green Revolution of the 60s and 70s is best associated with higher yields through innovations in agricultural science and technology. To attain its impressive results, however, the new farming practices used synthetic fertilizers and chemical pesticides, which ravaged the soil, damaged ecosystems, polluted groundwater, encouraged crop monocultures, and raised the incidence of certain diseases. The resulting land degradation fueled the search for new land, and with it deforestation. In other words, modern intensive farming practices are not sustainable, and various experiments worldwide have tried to make them sustainable while increasing yields at lower cost — the agricultural holy grail.
Here is a promising Al-Jazeera story about "two million farmers in the Indian state of Andhra Pradesh [who] have ditched chemical pesticides in favour of natural repellants and fertilisers, as part of a growing eco-agriculture movement [that] has improved soil health and biodiversity, reduced costs and upped yields." Could this catch on more widely?
The latest issue of the Humanist magazine (July-Aug '13) has a slightly modified version of my essay from last year.
Clearly, most people don’t even know about the horror and pain we inflict on billions of birds and mammals in our meat factories. But there’s no good excuse for this, is there? It’s more likely that we don’t want to know—can’t afford to know for our own sake—so we turn a blind eye and trust the artifice of bucolic imagery on meat packaging. Some see parallels here with the German people’s willful denial of the concentration camps that once operated around them, or call those who consume factory-farmed meat little Eichmanns. “For the animals, it is an eternal Treblinka,” wrote Isaac Bashevis Singer (who also used to say he turned vegetarian “for health reasons—the health of the chicken”).
Predictably enough, many others are offended by such comparisons. They say that comparing the industrialized abuse of animals with the industrialized abuse of humans trivializes the latter. There are indeed limits to such comparisons, though our current enterprise may be worse in at least one respect: it has no foreseeable end. We seem committed to raising billions of sentient beings year after year only to kill them after a short life of intense suffering. Furthermore, rather than take offense at polemical comparisons—as if others are obliged to be more judicious in their speech than we are in our silent deeds—why not reflect on our apathy instead? Criticizing vegetarians and vegans for being self-righteous—or being moral opportunists in having found a new way of affirming their decency to themselves—certainly doesn’t absolve us from the need to face up to our role in perpetuating this cycle of violence and degradation.
The innovations of the last two decades, led by the Internet and mobile devices, have fundamentally altered the way so many of us live, work, and play. Is modern technology a problem or a solution—and why? How is the disruptive impact of the Internet shaping human societies and cultures, our values, ideas of Self, and relationships? What trends should worry us the most, and who should we hold responsible for them? The viewpoints here are perhaps as numerous as people themselves, even as we cluster them in categories like evangelists, pioneers, enthusiasts, skeptics, laggards, technophobes, curmudgeons, and so on.
Evgeny Morozov, an analyst of technological trends, has a new book, To Save Everything, Click Here: The Folly of Technological Solutionism. In it, he attempts a critique of a culture that worships technology as the great hope and savior of humanity. I just read an interesting exchange between Morozov and Farhad Manjoo, technology columnist for Slate. I might read the book though my impression from this exchange is that while attempts like Morozov's are badly needed and he does raise many good questions, in the end young Morozov seems to me simply out of his depth for the ambitious task he has taken on. Here are links to the exchange:
Entry 1: Manjoo's opening salvo against Morozov
Entry 2: Morozov on what Manjoo gets wrong about his book
Entry 3: Manjoo on why Morozov has Silicon Valley absolutely wrong
Entry 4: Morozov on why and how technology journalism needs to evolve
Primatologist Frans de Waal has a new book, The Bonobo and the Atheist. Below is an excerpt from an early review in the New Republic (click photo for the Amazon listing; the Publisher's Weekly blurb is here).
Those familiar with de Waal’s previous books ... will recognize many of the same arguments resurfacing here, including the idea that human morality has biological origins. “Fairness and justice are … best looked at as ancient capacities. They derive from the need to preserve harmony in the face of resource competition.” De Waal uses the bonobo—a peaceful, sex-loving primate who may be as closely related to us as, or more closely than, the more Machiavellian chimpanzee—to attack the prevailing notion that human nature is selfish and violent, and that we are constantly battling to suppress our terrible “animal nature.” “Everything science has learned in the past few decades argues against this pessimistic view that morality is a thin veneer over a nasty human nature.”
What’s new here is that de Waal wades directly into the atheism-versus-religion debate, which he claims is often mistakenly cast as a science-versus-religion debate. He argues that a biologically evolved “bottom-up” morality obviates the need for the “top-down” morality imposed by religion. And yet, he sees science (and himself) as aligned with secular humanism, which is not necessarily anti-religion. He would like to see the influence of religion fade, but acknowledges that a moral code is not all religion provides: “The question is not so much whether religion is true or false, but how it shapes our lives, and what might possibly take its place.”
Charles C. Mann discusses how Homo sapiens, from very humble beginnings devoid of language or symbol use, went from anatomically modern to behaviorally modern, becoming thereafter a highly successful species — so successful that it now risks wiping itself out, unless ...
Homo sapiens emerged on the planet about 200,000 years ago, researchers believe. From the beginning, our species looked much as it does today. If some of those long-ago people walked by us on the street now, we would think they looked and acted somewhat oddly, but not that they weren’t people. But those anatomically modern humans were not, as anthropologists say, behaviorally modern. Those first people had no language, no clothing, no art, no religion, nothing but the simplest, unspecialized tools. They were little more advanced, technologically speaking, than their predecessors—or, for that matter, modern chimpanzees. (The big exception was fire, but that was first controlled by Homo erectus, one of our ancestors, a million years ago or more.) Our species had so little capacity for innovation that archaeologists have found almost no evidence of cultural or social change during our first 100,000 years of existence. Equally important, for almost all that time these early humans were confined to a single, small area in the hot, dry savanna of East Africa (and possibly a second, still smaller area in southern Africa).
But now jump forward 50,000 years. East Africa looks much the same. So do the humans in it—but suddenly they are drawing and carving images, weaving ropes and baskets, shaping and wielding specialized tools, burying the dead in formal ceremonies, and perhaps worshipping supernatural beings. They are wearing clothes—lice-filled clothes, to be sure, but clothes nonetheless. Momentously, they are using language. And they are dramatically increasing their range. Homo sapiens is exploding across the planet.
What caused this remarkable change?
On this Thanksgiving Day, consider watching this extraordinary and beautifully filmed Nature documentary in which naturalist Joe Hutto raises 16 wild turkeys from incubation to adulthood, an experience that changed his life. As their turkey mother, Hutto spent over a year in a Florida forest with these birds, each developing a complex and unique relationship with him. He shows us their stages of development, their innate knowledge of the environment, their curiosity and survival instincts. He exults at their distinct personalities, social and emotional lives, individuality and playfulness, and their different appetites for physical affection.
Hutto gets deeply immersed in their lives, begins to understand their communication, and learns to "talk turkey". He identifies over 30 distinct turkey vocalizations, including specific calls for other animals like rattlesnakes and hawks, and explains how "within each of those calls are inflections that have very different meanings". His bond with one bird in particular, and the way it ends, is especially remarkable and unexpected. En route, Hutto also reveals his own shifting state of mind and what he has learned from this experience about his own life. It might well become hard to see turkeys as "dumb birds" after this documentary, which, incidentally, won the 2012 Emmy for Outstanding Nature Programming.
An enchanting conversation between Einstein and Tagore, which concludes with Einstein saying, "Then I am more religious than you are!"
EINSTEIN: If there would be no human beings any more, the Apollo of Belvedere would no longer be beautiful.
TAGORE: No.
EINSTEIN: I agree with regard to this conception of Beauty, but not with regard to Truth.
TAGORE: Why not? Truth is realized through man.
EINSTEIN: I cannot prove that my conception is right, but that is my religion.
TAGORE: Beauty is in the ideal of perfect harmony which is in the Universal Being; Truth the perfect comprehension of the Universal Mind. We individuals approach it through our own mistakes and blunders, through our accumulated experiences, through our illumined consciousness — how, otherwise, can we know Truth?
In this thought-provoking essay, Aaron Rothstein discusses neurological conditions like autism, ADHD, depression, dyslexia, and others. In recent decades, they have become quite visible for reasons that are often highly dubious (and owe more to the workings of knowledge and power that Foucault outlined in Madness and Civilization). Rothstein describes how people with such neurological conditions function, often with other heightened capacities. To what extent should their differences be seen as an aspect of neurodiversity worth embracing versus a genuine mental disorder worth fixing?
Today, some psychologists, journalists, and advocates explore and celebrate mental differences under the rubric of neurodiversity. The term encompasses those with Attention Deficit/Hyperactivity Disorder (ADHD), autism, schizophrenia, depression, dyslexia, and other disorders affecting the mind and brain. People living with these conditions have written books, founded websites, and started groups to explain and praise the personal worlds of those with different neurological “wiring.” The proponents of neurodiversity argue that there are positive aspects to having brains that function differently; many, therefore, prefer that we see these differences simply as differences rather than disorders. Why, they ask, should what makes them them need to be classified as a disability?
But other public figures, including many parents of affected children, focus on the difficulties and suffering brought on by these conditions. They warn of the dangers of normalizing mental disorders, potentially creating reluctance among parents to provide treatments to children — treatments that researchers are always seeking to improve. The National Institute of Mental Health, for example, has been doing extensive research on the physical and genetic causes of various mental conditions, with the aim of controlling or eliminating them.
Disagreements, then, abound. What does it mean to see and experience the world in a different way? What does it mean to be a “normal” human being? What does it mean to be abnormal, disordered, or sick? And what exactly would a cure for these disorders look like? The answers to these questions may be as difficult to know as the minds of others. Learning how properly to treat or accommodate neurological differences means seeking answers to questions such as these — challenging our ideas about “normal” human biology, the purpose of medical innovation, and the uniqueness of each human being.
A thought-provoking lecture by Robert Sapolsky, professor of neurobiology and primatology at Stanford, in which he tries to discern, to the best of our knowledge, what it is that separates us from other animals. He narrates lots of fascinating experimental results from recent decades. This lecture, archived on Fora.tv, is one of several in a series called Being Human: Connecting to Our Ancient Ancestors.
In this engaging piece, Joseph Henrich argues that the rise of cumulative culture in our ancestral lineage contributed to our genetic evolution, starting as far back as 1.8 million years ago with the earliest of the Homo line, i.e., Homo habilis and Homo erectus. Henrich's approach differs from how most "people thinking about human evolution have approached this as a two-part puzzle, as if there was a long period of genetic evolution until either 10,000 years ago or 40,000 years ago, depending on who you're reading, and then only after that did culture matter, and often little or no consideration given to a long period of interaction between genes and culture."
The main questions I've been asking myself over the last couple years are broadly about how culture drove human evolution. Think back to when humans first got the capacity for cumulative cultural evolution—and by this I mean the ability for ideas to accumulate over generations, to get an increasingly complex tool starting from something simple. One generation adds a few things to it, the next generation adds a few more things, and the next generation, until it's so complex that no one in the first generation could have invented it. This was a really important line in human evolution, and we've begun to pursue this idea called the cultural brain hypothesis—this is the idea that the real driver in the expansion of human brains was this growing cumulative body of cultural information, so that what our brains increasingly got good at was the ability to acquire information, store, process and retransmit this non-genetic body of information.
A congregation of scientists in Cambridge, UK, recently issued a formal declaration that many non-human animals, including mammals, birds, and likely even octopuses, are conscious beings. What do they mean by consciousness, you ask? It's a state of awareness of one's body and one's environment, ranging from basic perceptual awareness to the reflective self-awareness of humans. This declaration will surely strike many of us as ancient news and a long overdue recognition, even as it may annoy the stubborn skeptics among us.
Below is an abstract about the conference followed by the full declaration. I found the lectures delivered on the occasion quite interesting. I haven't heard them all, but I can recommend Irene Pepperberg's "Human-like Consciousness in Non-Humans: Evidence from Grey Parrots" (watch two short videos on these birds: one, two). I also liked Diana Reiss's "Mirror Self-recognition: A Case for Cognitive Convergence in Humans and Other Animals". And if you want to consider the octopus, as DFW enjoined, watch David B. Edelman's "Through the Eyes of an Octopus".
The First Annual Francis Crick Memorial Conference, focusing on "Consciousness in Humans and Non-Human Animals", aims to provide a purely data-driven perspective on the neural correlates of consciousness. The most advanced quantitative techniques for measuring and monitoring consciousness will be presented, with the topics of focus ranging from exploring the properties of neurons deep in the brainstem, to assessing global cerebral function in comatose patients. Model organisms investigated will span the species spectrum from flies to rodents, humans to birds, elephants to dolphins, and will be approached from the viewpoint of three branches of biology: anatomy, physiology, and behavior. Until animals have their own storytellers, humans will always have the most glorious part of the story, and with this proverbial concept in mind, the symposium will address the notion that humans do not alone possess the neurological faculties that constitute consciousness as it is presently understood.
The Cambridge Declaration on Consciousness*
On this day of July 7, 2012, a prominent international group of cognitive neuroscientists, neuropharmacologists, neurophysiologists, neuroanatomists and computational neuroscientists gathered at The University of Cambridge to reassess the neurobiological substrates of conscious experience and related behaviors in human and non-human animals. While comparative research on this topic is naturally hampered by the inability of non-human animals, and often humans, to clearly and readily communicate about their internal states, the following observations can be stated unequivocally:
Scott Atran on how science should approach religion, especially in an age when religious faith continues to grow around the world (hint: not how the so-called New Atheists do it). The excerpt below will surprise those who think religion is the leading cause of conflict in human history.
Moreover, the chief complaint against religion -- that it is history's prime instigator of intergroup conflict -- does not withstand scrutiny. Religious issues motivate only a small minority of recorded wars. The Encyclopedia of Wars surveyed 1,763 violent conflicts across history; only 123 (7 percent) were religious. A BBC-sponsored "God and War" audit, which evaluated major conflicts over 3,500 years and rated them on a 0-to-5 scale for religious motivation (Punic Wars = 0, Crusades = 5), found that more than 60 percent had no religious motivation. Less than 7 percent earned a rating greater than 3. There was little religious motivation for the internecine Russian and Chinese conflicts or the world wars responsible for history's most lethal century of international bloodshed.
Indeed, inclusive concepts such as "humanity" arguably emerged with the rise of universal religions. Sociologist Rodney Stark reveals that early Christianity became the Roman Empire's majority religion not through conquest, but through a social process grounded in trust. Repeated acts of altruism, such as caring for non-Christians during epidemics, facilitated the expansion of social networks that were invested in the religion. Likewise, studies by behavioral economist Joseph Henrich and colleagues on contemporary foragers, farmers, and herders show that professing a world religion is correlated with greater fairness toward passing strangers. This research helps explain what's going on in sub-Saharan Africa, where Islam is spreading rapidly. In Rwanda, for example, people began converting to Islam in droves after Muslims systematically risked their lives to protect Christians and animists from genocide when few others cared.
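The Encyclopedia of Wars figure quoted above is simple enough to check with a line of arithmetic; here is a quick sketch (the counts 123 and 1,763 are taken from the excerpt, not independently verified):

```python
# Share of recorded wars classified as religious, using the
# Encyclopedia of Wars counts quoted in the excerpt above.
religious_wars = 123
total_wars = 1763

share = religious_wars / total_wars
print(f"{share:.1%}")  # prints "7.0%", matching the quoted 7 percent
```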
In this provocative paper, sure to annoy evolutionary psychologists, Cecilia Heyes argues that the "cognitive processes that comprise cultural learning are themselves culturally inherited; they are cultural adaptations [rather than genetic adaptations]. They are products as well as producers of cultural evolution." Here is the abstract (and my two recent related posts are here and here):
Cumulative cultural evolution is what ‘makes us odd’; our capacity to learn facts and techniques from others, and to refine them over generations, plays a major role in making human minds and lives radically different from those of other animals. In this article I discuss cognitive processes that are known collectively as ‘cultural learning’ because they enable cumulative cultural evolution. These cognitive processes include reading, social learning, imitation, teaching, social motivation, and theory of mind. Taking the first three of these types of cultural learning as examples, I ask whether and to what extent these cognitive processes have been adapted genetically or culturally to enable cumulative cultural evolution. I find that recent empirical work in comparative psychology, developmental psychology and cognitive neuroscience provides surprisingly little evidence of genetic adaptation, and ample evidence of cultural adaptation. This raises the possibility that it is not only ‘grist’ but also ‘mills’ that are culturally inherited; through social interaction in the course of development, we not only acquire facts about the world and how to deal with it (grist), we also build the cognitive processes that make ‘fact inheritance’ possible (mills).
When we think about global warming at all, the arguments tend to be ideological, theological and economic. But to grasp the seriousness of our predicament, you just need to do a little math. For the past year, an easy and powerful bit of arithmetical analysis first published by financial analysts in the U.K. has been making the rounds of environmental conferences and journals, but it hasn't yet broken through to the larger public. This analysis upends most of the conventional political thinking about climate change. And it allows us to understand our precarious – our almost-but-not-quite-finally hopeless – position with three simple numbers.
Also check out this article on the everyday denial of climate change.
The term "denial" is sometimes used to describe the outright rejection of scientifically accepted information, as in the case of climate skeptics. But for most people, who do genuinely care about the planet, denial takes the form of avoidance rather than rejection. People avoid disturbing information in order to sidestep unpleasant emotions and to maintain positive conceptions of individual and national identity. As a result of this kind of denial, people have a sense of knowing and not knowing about climate change, of having information but not thinking about it in their daily lives. Information from climate science is understood in the abstract but disconnected from social or private life.