Category: Science

  • Mental Disorder or Neurodiversity?

In this thought-provoking essay, Aaron Rothstein discusses neurological conditions like autism, ADHD, depression, dyslexia, and others. In recent decades, these conditions have become quite visible, often for dubious reasons that owe more to the workings of knowledge and power that Foucault outlined in Madness and Civilization. Rothstein describes how people with such neurological conditions function, often with other heightened capacities. To what extent should their differences be seen as an aspect of neurodiversity worth embracing versus a genuine mental disorder worth fixing?

Today, some psychologists, journalists, and advocates explore and celebrate mental differences under the rubric of neurodiversity. The term encompasses those with Attention Deficit/Hyperactivity Disorder (ADHD), autism, schizophrenia, depression, dyslexia, and other disorders affecting the mind and brain. People living with these conditions have written books, founded websites, and started groups to explain and praise the personal worlds of those with different neurological “wiring.” The proponents of neurodiversity argue that there are positive aspects to having brains that function differently; many, therefore, prefer that we see these differences simply as differences rather than disorders. Why, they ask, should what makes them them be classified as a disability?

    But other public figures, including many parents of affected children, focus on the difficulties and suffering brought on by these conditions. They warn of the dangers of normalizing mental disorders, potentially creating reluctance among parents to provide treatments to children — treatments that researchers are always seeking to improve. The National Institute of Mental Health, for example, has been doing extensive research on the physical and genetic causes of various mental conditions, with the aim of controlling or eliminating them.

    Continue Reading

  • Robert Sapolsky: Are Humans Just Another Primate?

    A thought-provoking lecture by Robert Sapolsky, professor of neurobiology and primatology at Stanford, in which he tries to discern, to the best of our knowledge, what it is that separates us from other animals. He narrates lots of fascinating experimental results from recent decades. This lecture, archived on Fora.tv, is one of several in a series called Being Human: Connecting to Our Ancient Ancestors.

    Continue Reading

  • How Culture Drove Human Evolution

In this engaging piece, Joseph Henrich argues that the rise of cumulative culture in our ancestral lineage contributed to our genetic evolution, starting as far back as 1.8 million years ago with the earliest of the Homo line, i.e., Homo habilis and Homo erectus. Henrich’s approach differs from that of most researchers: as he puts it, “people thinking about human evolution have approached this as a two-part puzzle, as if there was a long period of genetic evolution until either 10,000 years ago or 40,000 years ago, depending on who you’re reading, and then only after that did culture matter, and often little or no consideration given to a long period of interaction between genes and culture.”


The main questions I’ve been asking myself over the last couple of years are broadly about how culture drove human evolution. Think back to when humans first got the capacity for cumulative cultural evolution—and by this I mean the ability for ideas to accumulate over generations, to get an increasingly complex tool starting from something simple. One generation adds a few things to it, the next generation adds a few more things, and the next generation adds still more, until it’s so complex that no one in the first generation could have invented it. This was a really important line in human evolution, and we’ve begun to pursue this idea called the cultural brain hypothesis—this is the idea that the real driver in the expansion of human brains was this growing cumulative body of cultural information, so that what our brains increasingly got good at was the ability to acquire, store, process, and retransmit this non-genetic body of information.

More here (via 3QD). You can either watch a video of his talk or read its transcript. Also check out a set of related papers.

    Continue Reading

  • The Cambridge Declaration on Consciousness



A congregation of scientists in Cambridge, UK, recently issued a formal declaration that many non-human animals, including mammals, birds, and likely even octopuses, are conscious beings. What do they mean by consciousness, you ask? It’s a state of awareness of one’s body and one’s environment, ranging from basic perceptual awareness to the reflective self-awareness of humans. This declaration will surely strike many of us as ancient news and a long overdue recognition, even as it may annoy the stubborn skeptics among us.


    Continue Reading

  • God and the Ivory Tower

Scott Atran on how science should approach religion, especially in an age when religious faith continues to grow around the world (hint: not how the so-called New Atheists do it). The excerpt below will surprise those who think religion is the leading cause of conflict in human history.

Moreover, the chief complaint against religion — that it is history’s prime instigator of intergroup conflict — does not withstand scrutiny. Religious issues motivate only a small minority of recorded wars. The Encyclopedia of Wars surveyed 1,763 violent conflicts across history; only 123 (7 percent) were religious. A BBC-sponsored “God and War” audit, which evaluated major conflicts over 3,500 years and rated them on a 0-to-5 scale for religious motivation (Punic Wars = 0, Crusades = 5), found that more than 60 percent had no religious motivation. Less than 7 percent earned a rating greater than 3. There was little religious motivation for the internecine Russian and Chinese conflicts or the world wars responsible for history’s most lethal century of international bloodshed.

    Indeed, inclusive concepts such as “humanity” arguably emerged with the rise of universal religions. Sociologist Rodney Stark reveals that early Christianity became the Roman Empire’s majority religion not through conquest, but through a social process grounded in trust. Repeated acts of altruism, such as caring for non-Christians during epidemics, facilitated the expansion of social networks that were invested in the religion. Likewise, studies by behavioral economist Joseph Henrich and colleagues on contemporary foragers, farmers, and herders show that professing a world religion is correlated with greater fairness toward passing strangers. This research helps explain what’s going on in sub-Saharan Africa, where Islam is spreading rapidly. In Rwanda, for example, people began converting to Islam in droves after Muslims systematically risked their lives to protect Christians and animists from genocide when few others cared.
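The key figures in the excerpt reduce to one line of arithmetic; here is a quick sanity check on the Encyclopedia of Wars counts quoted above (the counts are copied from the excerpt, nothing more):

```python
# Counts quoted from the Encyclopedia of Wars survey cited by Atran.
total_conflicts = 1763
religious_conflicts = 123

# The "7 percent" in the excerpt is simply this ratio.
religious_share = religious_conflicts / total_conflicts
print(f"Religious conflicts: {religious_share:.1%} of those surveyed")
```

The division confirms the quoted figure: roughly 7 percent of the surveyed conflicts were religiously motivated.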

    Continue Reading

  • Grist and Mills: On the Cultural Origins of Cultural Learning

In this provocative paper, sure to annoy evolutionary psychologists, Cecilia Heyes argues that the “cognitive processes that comprise cultural learning are themselves culturally inherited; they are cultural adaptations [rather than genetic adaptations]. They are products as well as producers of cultural evolution.” Here is the abstract (and my two recent related posts are here and here):

Cumulative cultural evolution is what ‘makes us odd’; our capacity to learn facts and techniques from others, and to refine them over generations, plays a major role in making human minds and lives radically different from those of other animals. In this article I discuss cognitive processes that are known collectively as ‘cultural learning’ because they enable cumulative cultural evolution. These cognitive processes include reading, social learning, imitation, teaching, social motivation, and theory of mind. Taking the first three of these types of cultural learning as examples, I ask whether and to what extent these cognitive processes have been adapted genetically or culturally to enable cumulative cultural evolution. I find that recent empirical work in comparative psychology, developmental psychology and cognitive neuroscience provides surprisingly little evidence of genetic adaptation, and ample evidence of cultural adaptation. This raises the possibility that it is not only ‘grist’ but also ‘mills’ that are culturally inherited; through social interaction in the course of development, we not only acquire facts about the world and how to deal with it (grist), we also build the cognitive processes that make ‘fact inheritance’ possible (mills).

    Continue Reading

  • Global Warming’s Terrifying New Math

    This article by Bill McKibben is going to scare the pants off you. It pairs quite well with Bartlett’s analysis that I blogged about recently.

When we think about global warming at all, the arguments tend to be ideological, theological and economic. But to grasp the seriousness of our predicament, you just need to do a little math. For the past year, an easy and powerful bit of arithmetical analysis first published by financial analysts in the U.K. has been making the rounds of environmental conferences and journals, but it hasn’t yet broken through to the larger public. This analysis upends most of the conventional political thinking about climate change. And it allows us to understand our precarious – our almost-but-not-quite-finally hopeless – position with three simple numbers.
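For the record, the three numbers in McKibben’s article are 2 degrees Celsius (the warming ceiling world governments have endorsed), 565 gigatons (the CO2 humanity can still emit by mid-century with a reasonable chance of staying under that ceiling), and 2,795 gigatons (the CO2 that would be released by burning the fossil fuel reserves already on companies’ and countries’ books). The terrifying part of the arithmetic is a single division; a sketch:

```python
# The three numbers from McKibben's article.
warming_limit_c = 2.0        # degrees Celsius: the agreed-upon ceiling
carbon_budget_gt = 565.0     # gigatons of CO2 we can still emit by mid-century
proven_reserves_gt = 2795.0  # gigatons of CO2 locked in proven reserves

# Proven reserves exceed the remaining budget roughly fivefold, so staying
# under the limit means leaving about 80 percent of them in the ground.
ratio = proven_reserves_gt / carbon_budget_gt
must_stay_buried = 1.0 - carbon_budget_gt / proven_reserves_gt
print(f"Reserves are {ratio:.1f}x the remaining budget; "
      f"{must_stay_buried:.0%} would have to stay in the ground.")
```

That fivefold overshoot, already priced into energy companies’ valuations, is what McKibben means by "terrifying new math."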

    Also check out this article on the everyday denial of climate change.

    Continue Reading

  • Is the Web Driving Us Mad?

    “Tweets, texts, emails, posts. New research says the Internet can make us lonely and depressed—and may even create more extreme forms of mental illness.” If you habitually spend hours a day on the Web, as I do, this Newsweek article is definitely worth a look (via The Browser).

Questions about the Internet’s deleterious effects on the mind are at least as old as hyperlinks. But even among Web skeptics, the idea that a new technology might influence how we think and feel—let alone contribute to a great American crack-up—was considered silly and naive, like waving a cane at electric light or blaming the television for kids these days. Instead, the Internet was seen as just another medium, a delivery system, not a diabolical machine. It made people happier and more productive. And where was the proof otherwise?

    Now, however, the proof is starting to pile up. The first good, peer-reviewed research is emerging, and the picture is much gloomier than the trumpet blasts of Web utopians have allowed. The current incarnation of the Internet—portable, social, accelerated, and all-pervasive—may be making us not just dumber or lonelier but more depressed and anxious, prone to obsessive-compulsive and attention-deficit disorders, even outright psychotic. Our digitized minds can scan like those of drug addicts, and normal people are breaking down in sad and seemingly new ways.

    Continue Reading

  • Arithmetic, Population and Energy

A brilliant lecture by Dr. Albert A. Bartlett, professor of physics, that looks at population growth, energy use, and sustainability in light of basic arithmetic. In this insightful and alarming talk, Bartlett shows why “sustainable growth” is an oxymoron and why so many “experts” do not get this. According to Bartlett, a person in the U.S. on average consumes ~30 times the resources of a person in an underdeveloped country. One encouraging trend I found elsewhere is that per capita energy use in the U.S. has fallen over the last three decades, though this decline has been more than offset by population growth in the U.S. and by growth in both population and per capita energy use in developing countries. Hardly a better case can be made for zero population growth and massive investments in renewable energy than the one made by Bartlett (from 2002, via 3QD).
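Bartlett’s central piece of arithmetic is the doubling time of steady exponential growth: a quantity growing at r percent per year doubles roughly every 70/r years (more precisely, every ln 2 divided by the growth rate). A minimal sketch of that arithmetic (the sample rates below are illustrative, not taken from the lecture):

```python
import math

def doubling_time(percent_growth_per_year: float) -> float:
    """Years for a quantity growing at a steady rate to double: ln(2) / r."""
    return math.log(2) / (percent_growth_per_year / 100.0)

def rule_of_70(percent_growth_per_year: float) -> float:
    """The back-of-the-envelope shortcut: 70 divided by the percent rate."""
    return 70.0 / percent_growth_per_year

for rate in (1.0, 2.0, 7.0):  # illustrative growth rates, percent per year
    print(f"{rate}%/yr -> doubles in {doubling_time(rate):.1f} years "
          f"(rule of 70: {rule_of_70(rate):.1f})")
```

The punchline is how modest-sounding rates compound: at 7 percent a year, anything doubles in about a decade, which is why Bartlett calls our inability to understand the exponential function humanity’s greatest shortcoming.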

    Continue Reading

  • The Social Conquest of Earth

E. O. Wilson’s new book, The Social Conquest of Earth, has reignited an old debate about natural selection, i.e., the level at which it operates. In the dominant camp are folks like Richard Dawkins and Steven Pinker, who hold that selection occurs at the level of the gene. In the other camp, folks like Wilson and Jonathan Haidt claim that selection occurs at multiple levels, acting on both genes and groups. Notably, Charles Darwin himself supported the latter view in The Descent of Man.

Not surprisingly, Dawkins and Pinker wrote hostile reviews. Dawkins lamented Wilson’s “erroneous and downright perverse misunderstandings of evolutionary theory” and called Darwin’s own support of group selection “anomalous”. Other reviewers I’ve read include David Sloan Wilson, Steven Mithen, Jerry Coyne, and Leonard Finkelman. Wilson responded with a vigorous defense of his thesis in a New York Times Stone column. At least to me, a general reader rather than a specialist in the field, Wilson’s account seems entirely plausible, even likely. What empirical observations might settle this dispute, however, seems far less clear.

Whatever the truth, what concerns me more about both camps is their penchant for what H. Allen Orr calls Darwinian storytelling. Evolutionary psychologists, who abound in both camps, often try to explain too much of human behavior, including our current morality, through evolutionary selection. While it is undeniable that our basic moral instincts come out of millions of years of evolution, it also seems to me that to explain the prolific range of our behavior, we should look more to the cultural edifice that our hominin ancestors have developed relatively recently through symbolic language and the resulting explosion of speech, concept formation, and social learning.

    Continue Reading

  • Stannard On the Reach of Science

    If Stephen Hawking and Lawrence Krauss incline you to the view that physicists today are philosophically naïve, read Russell Stannard.

We said earlier that the job of science is to describe the world. In order to do this, we have to observe it to find out what kind of world it is. But having made the observations (done the experiments) what we write down in our physics textbooks is a description of the world itself, regardless of whether one happens to be observing it. Bohr, and other adherents to his so-called Copenhagen Interpretation of quantum mechanics, claimed that this was not so. What has been written down is not a description of the world at all, but a description of acts of observation made on the world. All our customary scientific terms such as energy, momentum, position, speed, distance, time, etc. — they are terms specifically for the description of observations. It is a misuse of language to try and apply them to a world-in-itself divorced from the action of an observation. It is this misuse of language that leads to problems like that posed by the wave/particle paradox. Which is not to say that the world-in-itself does not exist outside the context of someone making an observation of it. Rather, as Werner Heisenberg asserted, all attempts to talk about the world-in-itself are rendered meaningless.

Not that there is anything new in this. The philosopher Immanuel Kant had long ago asserted that one could know nothing about the thing-in-itself. (So much for the death of philosophy.)

    Continue Reading

  • On Eating Animals

    (Cross-posted on 3 Quarks Daily, where it has received many comments. A slightly modified version of this essay appeared in the July/Aug 2013 issue of the Humanist.)

Some years ago in a Montana slaughterhouse, a Black Angus cow awaiting execution suddenly went berserk, jumped a five-foot fence, and escaped. She ran through the streets for hours, dodging cops, animal control officers, cars, trucks, and a train. Cornered near the Missouri River, the frightened animal jumped into its icy waters and made it across, where a tranquilizer gun brought her down. Her “daring escape” stole the hearts of the locals, some of whom had even cheered her on. The story got international media coverage. Telephone polls were held, and calls demanding her freedom poured into local TV stations. Sensing the public mood, the slaughterhouse manager made a show of “granting clemency” to what he dubbed “the brave cow.” Given a name, Molly, the cow was sent to a nearby farm to live out her days grazing under open skies—which warmed the cockles of many a heart.

Cattle trying to escape slaughterhouses are not uncommon. Few of their stories end happily, though. Some years ago in Omaha, six cows escaped at once. Five were quickly recaptured; one kept running until Omaha police cornered her in an alley and pumped bullets into her. The cow bellowed miserably and hobbled like a drunk for several seconds before collapsing; she died on the street in a pool of blood. This brought howls of protest, some from folks who had witnessed the killing. They called the police’s handling inhumane and needlessly cruel.

    Continue Reading

  • The Emotional World of Farm Animals

Here is a delightful documentary “about the thinking and feeling side of animals that are all too often just viewed as food. Jeffrey Masson … leads viewers through the personal journey he underwent while writing his latest book, The Pig Who Sang to the Moon. This journey into the sentient, emotional lives of farm animals brings Masson to animal sanctuaries around the country where caregivers and the animals themselves tell their harrowing stories of rescue and escape. Masson delves into the rich ancestry of these curious and intelligent animals and interviews top experts in animal behavior who offer scientific perspectives on these amazing creatures.” (52 mins)

    Continue Reading

  • Education, Khan Academy Style

    In this interesting video, Salman Khan “talks about how and why he created the remarkable Khan Academy, a carefully structured series of educational videos offering complete curricula in math and, now, other subjects. He shows the power of interactive exercises, and calls for teachers to consider flipping the traditional classroom script — give students video lectures to watch at home, and do “homework” in the classroom with the teacher available to help.”

    Continue Reading

  • Humankind’s Best Friend

Dogs may have been a better friend to humanity than we ever realized, according to an article by Dr. Pat Shipman in American Scientist. They may have played a crucial role in helping modern humans outcompete our Neanderthal cousins.

Many theories have been proposed for why Neanderthals couldn’t seem to compete with the invaders when modern humans arrived in Europe some 35,000 to 45,000 years ago, including climate change, the newcomers’ better social organization, or their greater facility for language. But new lines of evidence are beginning to suggest another possibility: that it might have been the domesticated dog that gave H. sapiens sapiens the edge over Neanderthals (and, one must presume, Denisovans). There’s now mounting evidence that modern humans were domesticating dogs by 35,000 years ago, during the same period when modern human populations began to increase and Neanderthal populations were in decline. Dogs were used for hunting and as pack animals, as they still are by some groups even in modern times. Studies reveal that dogs can significantly increase the success of a hunt and the amount of meat brought in to a community that uses them.

    If the dogs carried the meat, humans would have saved a lot of energy, so each kill would have provided a greater net gain in food—even after feeding the dogs. Additional food generally has marked effects on the health of a group. Better-fed females can have more babies, can provide them with more milk and can have babies at shorter intervals. Before long, using pack dogs could have caused the human population to increase.

    Continue Reading

  • The Inner Lives of Animals

    (Cross-posted on 3 Quarks Daily, where it has received many comments.)

It is often said that humans are the only animals to use symbols. So many other claims of human uniqueness have fallen away—thoughts, emotions, intelligence, tool use, sense of fairness—what’s so special about symbols, you ask? I share your skepticism, dear reader, and in the next few paragraphs I’ll tell you why.

Let’s begin by clarifying what “symbol” means here. One way to do this is to contrast symbols with signs. A sign, such as a red light, a grimace, a growl, or a thunderstorm, signifies something direct and tangible, making us think or act in response to the thing signified. Issuing and responding to signs is commonplace in Animalia. A symbol, on the other hand, is “something that represents something else by association, resemblance, or convention”. A symbol allows us to think about the thing or idea symbolized outside its immediate context, such as the word “water” for the liquid, “7” for a certain quantity, and “flag” for a community. What is symbolized doesn’t even have to be real, such as God, and herein lies the power of symbols—they are the building blocks of abstract and reflective thought. Evidence of material symbols used by humans dates back at least 60,000 to 100,000 years, when burial objects and decorated beads start to appear in archaeological finds. Linguistic symbols were almost certainly in use long before then.

    Continue Reading

  • Two Hundred Years of Surgery

Atul Gawande reviews the last two hundred years of the surgical profession, from the days before anesthesia and antiseptics (yikes!).

Consider, for instance, amputation of the leg. The procedure had long been recognized as lifesaving, in particular for compound fractures and other wounds prone to sepsis, and at the same time horrific. Before the discovery of anesthesia, orderlies pinned the patient down while an assistant exerted pressure on the femoral artery or applied a tourniquet on the upper thigh. Surgeons using the circular method proceeded through the limb in layers, taking a long curved knife in a circle through the skin first, then, a few inches higher up, through the muscle, and finally, with the assistant retracting the muscle to expose the bone a few inches higher still, taking an amputation saw smoothly through the bone so as not to leave splintered protrusions. Surgeons using the flap method, popularized by the British surgeon Robert Liston, stabbed through the skin and muscle close to the bone and cut swiftly through at an oblique angle on one side so as to leave a flap covering the stump.

    The limits of patients’ tolerance for pain forced surgeons to choose slashing speed over precision. With either the flap method or the circular method, amputation could be accomplished in less than a minute, though the subsequent ligation of the severed blood vessels and suturing of the muscle and skin over the stump sometimes required 20 or 30 minutes when performed by less experienced surgeons. No matter how swiftly the amputation was performed, however, the suffering that patients experienced was terrible. Few were able to put it into words. Among those who did was Professor George Wilson. In 1843, he underwent a Syme amputation — ankle disarticulation — performed by the great surgeon James Syme himself. Four years later, when opponents of anesthetic agents attempted to dismiss them as “needless luxuries,” Wilson felt obliged to pen a description of his experience:

    Continue Reading

  • A 50-Year Plan for Energy

In this TED talk, Amory Lovins, an energy researcher, lays out a plan for a whole new private-sector energy industry that will save trillions while drastically cutting fossil fuel use, creating jobs, reducing oil conflicts, and growing the economy. For more, visit ReinventingFire.com.

    Continue Reading

  • The Slow Explosion of Speech

    In this review of James R. Hurford’s The Origins of Grammar, Nick Enfield presents a significant viewpoint on how we humans, from a stage when our ancestors were without language, came to acquire language in all its modern complexity.

If you could travel back to a time around the dawn of humankind, and if you encountered a people there whose only form of language was a list of one-word interjections like Yuck, Wow, Oops, Hey!, No, and Huh?, would you say that these people were of a different species, not quite human? Would they be like today’s apes that simply don’t have it in them to fully acquire a modern human language? Or would they be the same as us only less well equipped for communication, like the eighteenth-century man who is every bit human but happens not to have been born in a world with telephones? If the latter were true, then language would be more technology than biology, more something we build than something that grows. It’s clear that the earliest humans did not possess language as we know it. The question is whether this was because language as we know it hadn’t yet been invented.

    In James R. Hurford’s towering account of our species’ path from being once without language to now being emphatically with it, he proposes that just such a monophrase language of the Yuck/Wow variety was an important early human achievement. And, Hurford argues, while our earliest forms of language had no grammatical rules by which words were combined to form sentences, they were far from primitive call systems.

    Continue Reading

  • The Evolved Apprentice

Kim Sterelny, a leading philosopher of biology (and author of the bestseller Dawkins vs. Gould), has a new book, The Evolved Apprentice: How Evolution Made Humans Unique. Here is a brief description (and an essay by Sterelny):

Over the last three million years or so, our lineage has diverged sharply from those of our great ape relatives. Change has been rapid (in evolutionary terms) and pervasive. Morphology, life history, social life, sexual behavior, and foraging patterns have all shifted sharply away from other great apes. No other great ape lineage–including those of chimpanzees and gorillas–seems to have undergone such a profound transformation. In The Evolved Apprentice, Kim Sterelny argues that the divergence stems from the fact that humans gradually came to enrich the learning environment of the next generation. Humans came to cooperate in sharing information, and to cooperate ecologically and reproductively as well, and these changes initiated positive feedback loops that drove us further from other great apes.

    Sterelny develops a new theory of the evolution of human cognition and human social life that emphasizes the gradual evolution of information sharing practices across generations and how information sharing transformed human minds and social lives. Sterelny proposes that humans developed a new form of ecological interaction with their environment, cooperative foraging, which led to positive feedback linking ecological cooperation, cultural learning, and environmental change. The ability to cope with the immense variety of human ancestral environments and social forms, he argues, depended not just on adapted minds but also on adapted developmental environments.

    Continue Reading
