TEDMED Talk on Brain Health
The Hawn Foundation site is here.
Of the many climate-change catastrophes facing humankind, the anticipated spread of infectious tropical diseases is one of the most frequently cited — and most alarming. But a paper in this week's Nature adds to the growing voice of dissent from epidemiologists who challenge the prediction that global warming will fuel a worldwide increase in malaria.
On the surface, the connection between malaria and climate change is intuitive: higher temperatures are known to boost mosquito populations and the frequency with which they bite. And more mosquito bites mean more malaria.
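The arithmetic behind that intuition can be made explicit. The article doesn't spell it out, but the classic Ross-Macdonald vectorial-capacity formula (a standard textbook quantity, not something taken from the Nature paper) shows why biting frequency matters so much: the human-biting rate enters squared, since a mosquito must bite once to acquire the parasite and again to transmit it. A minimal Python sketch with purely illustrative parameter values:

    import math

    def vectorial_capacity(m, a, p, n, b=1.0):
        # m: mosquitoes per human; a: human-biting rate (bites/day)
        # p: daily mosquito survival probability
        # n: extrinsic incubation period (days)
        # b: mosquito-to-human transmission efficiency
        # The biting rate a appears squared: one bite to pick up
        # the parasite, a second bite to pass it on.
        return m * a**2 * b * p**n / -math.log(p)

    print(vectorial_capacity(m=10, a=0.25, p=0.9, n=12, b=0.5))  # ~0.84
    print(vectorial_capacity(m=10, a=0.30, p=0.9, n=12, b=0.5))  # ~1.21

Under these made-up numbers, a modest warming-driven rise in biting rate, from 0.25 to 0.30 bites per day, lifts capacity by about 44% through the squared term alone, which is why the naive prediction looks so compelling.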
Yet when epidemiologists Peter Gething and Simon Hay of the Malaria Atlas Project at the University of Oxford, UK, and their colleagues compiled data on the incidence of malaria in 1900 and 2007 (see page 342), they found the opposite: despite rising temperatures during the twentieth century, malaria has lost ground. According to the models the researchers used to tease out the factors affecting the incidence of malaria, the impact of public-health measures such as improved medications, widespread insecticide use and bed nets has overwhelmed the influence of climate change. "Malaria is still a huge problem," says Gething. "But climate change per se is not something that should be central to the discussion. The risks have been overstated."
The current and potential future impact of climate change on malaria is of major public health interest [1, 2]. The proposed effects of rising global temperatures on the future spread and intensification of the disease [3-5], and on existing malaria morbidity and mortality rates [3], substantively influence global health policy [6, 7]. The contemporary spatial limits of Plasmodium falciparum malaria and its endemicity within this range [8], when compared with comparable historical maps, offer unique insights into the changing global epidemiology of malaria over the last century. It has long been known that the range of malaria has contracted through a century of economic development and disease control [9]. Here, for the first time, we quantify this contraction and the global decreases in malaria endemicity since approximately 1900. We compare the magnitude of these changes to the size of effects on malaria endemicity proposed under future climate scenarios and associated with widely used public health interventions. Our findings have two key and often ignored implications with respect to climate change and malaria. First, widespread claims that rising mean temperatures have already led to increases in worldwide malaria morbidity and mortality are largely at odds with observed decreasing global trends in both its endemicity and geographic extent. Second, the proposed future effects of rising temperatures on endemicity are at least one order of magnitude smaller than changes observed since about 1900, and up to two orders of magnitude smaller than those that can be achieved by the effective scale-up of key control measures. Predictions of an intensification of malaria in a warmer world, based on extrapolated empirical relationships or biological mechanisms, must be set against a context of a century of warming that has seen marked global declines in the disease and a substantial weakening of the global correlation between malaria endemicity and climate.
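To make the abstract's orders-of-magnitude comparison concrete, here is a back-of-the-envelope sketch in Python. The magnitudes are placeholders chosen only to mirror the stated ratios; the paper's actual quantities are changes in P. falciparum endemicity:

    # Placeholder magnitudes, normalized to the observed historical change.
    observed_change_since_1900 = 1.0    # normalized decline in endemicity
    projected_climate_effect   = 0.1    # "at least one order of magnitude smaller"
    control_scaleup_effect     = 10.0   # climate effect is "up to two orders
                                        # of magnitude smaller" than this

    print(observed_change_since_1900 / projected_climate_effect)  # 10.0
    print(control_scaleup_effect / projected_climate_effect)      # 100.0

In other words, on the paper's accounting, the projected warming signal is small next to both the past century's decline and what bed nets, insecticides and drugs can deliver.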
"This will change our view of humanity," says John Hardy, a neuroscientist at University College London who was not involved in the research but studies genetic neurodegenerative diseases.
The drive to sequence the complete Neanderthal genome began about five years ago, following the invention of better, faster methods for sequencing DNA. From three Neanderthal bones found in Vindija Cave in Croatia, the team extracted a total of about 300 milligrams of bone. The bones date to between 38,300 and 44,400 years ago, and some had been broken open, possibly to remove their marrow, a sign of cannibalism.
… Using the Neanderthal genome for comparison, Pääbo and his colleagues were also able to identify gene variants that occur at high frequency in modern humans, suggesting that such variants spread under positive selection.
The report notes that genes affecting metabolism, cognition and skeletal development show similar signs of positive selection in modern humans. There was also positive selection on three genes that, when mutated, have been implicated in Down syndrome, autism and schizophrenia [1].
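The logic of such a screen can be sketched in a few lines. What follows is a deliberately simplified toy in Python, not the paper's actual statistical pipeline: it flags sites where nearly all modern humans carry a derived allele while the Neanderthal retains the ancestral (chimpanzee-like) state, since such variants most plausibly rose to high frequency after the two lineages split. All names and data here are hypothetical:

    def candidate_sweep_sites(sites, min_freq=0.95):
        # Each site is a dict with keys 'pos', 'ancestral',
        # 'neanderthal' and 'modern_alleles' (a list of alleles
        # sampled from present-day humans).
        flagged = []
        for s in sites:
            derived = [a for a in s['modern_alleles'] if a != s['ancestral']]
            freq = len(derived) / len(s['modern_alleles'])
            # A Neanderthal matching the ancestral allele suggests the
            # derived variant swept through humans after the split.
            if freq >= min_freq and s['neanderthal'] == s['ancestral']:
                flagged.append(s['pos'])
        return flagged

    example = [
        {'pos': 101, 'ancestral': 'A', 'neanderthal': 'A',
         'modern_alleles': ['G'] * 19 + ['A']},   # derived G near fixation
        {'pos': 202, 'ancestral': 'C', 'neanderthal': 'T',
         'modern_alleles': ['T'] * 20},           # Neanderthal already derived
    ]
    print(candidate_sweep_sites(example))  # [101]

A real scan aggregates such sites over genomic windows rather than judging them one by one; the paper's candidate regions include the metabolism, cognition and skeletal-development genes mentioned above.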
The Neanderthal draft genome provides "a powerful method to shine a light on our evolutionary history", says Green: a technique that will reveal the genomic regions and genes that are key to our human identity.
Prehistoric human remains have never revealed individuals older than about 50 years of age, and humans had a life expectancy at birth of 30 years or less for more than 99.9% of the time that we have inhabited this planet.