
New Research Suggests Ancient "Hobbit" Looked More Like Us Than Apes


Recently, we reported on new research indicating that the remains of the “hobbit” (technically known as Homo floresiensis) belonged not to a Homo sapiens with a pathology but to a distinct species. Now the latest study of this intriguing species reveals that the face of the “hobbit” looked much closer to a human’s than to an ape’s.

The remains of Homo floresiensis were first discovered in 2003 in a cave on the island of Flores in Indonesia and were dated to between 95,000 and 17,000 years ago. The species was nicknamed ‘hobbit’ for its small stature (approximately 3 feet 6 inches tall) and large feet. Its remarkably small height compared with other ancient human species has left scientists perplexed as to how it should be categorized.

In the latest paper, published in the Journal of Archaeological Science, the team of researchers sought to uncover some of the mystery surrounding the hobbit species by trying to determine what their faces looked like.

After careful analysis of the single complete skull recovered on Flores, the researchers established the relationship between bone and soft tissue by comparing the skull with human samples. This allowed them to draw a face, which they then compared, using geometric morphometrics, with nine faces generated in prior research on other hominins of roughly the same era. That comparison led to further refinements of the hobbit’s face, which, the team reports, looks reasonably similar to a modern human’s.

The study suggests that, far from being wild ape-like creatures or the missing link between modern humans and apes, the little hominins were more likely to be descendants of Homo erectus.


    How did humans evolve from apes?

    Many parents dread the moment when a child asks where they came from. Charles Darwin found the subject awkward too: On the Origin of Species makes almost no mention of human evolution.

    Darwin was being tactful. The idea of evolution in any form was controversial enough in the middle of the nineteenth century. Claiming that humanity had been shaped by evolution was explosive, as Darwin found when he published a book all about it in 1871.

    There was also a scientific barrier. Darwin had access to almost no fossil evidence that might indicate how, when or even where humans evolved.

    In the intervening years the human – or hominin, to use the proper term – fossil record has expanded enormously. There is still much to discover, but the broad picture of our evolution is largely in place. We know that our evolutionary tree first sprouted in Africa. We are sure that our closest living relatives are chimpanzees, and that our lineage split from theirs about 7 million years ago.


    The road to humanity was a long one, however. Nearly 4 million years later, our ancestors were still very ape-like. Lucy, a famous 3.2-million-year-old human ancestor discovered in Ethiopia, had a small, chimp-sized brain and long arms that suggest her species still spent a lot of time up trees, perhaps retreating to the branches at night as chimps still do. But she did have one defining human trait: she walked on two legs.

    Australopiths

    Lucy belongs to a group called the australopiths. In the 40 years since her partial skeleton was discovered, fragmentary remains of even older fossils have been found, some dating back 7 million years. These follow the same pattern: they had chimp-like features and tiny brains but probably walked on two legs.

    We also know that australopiths probably made simple stone tools. These advances aside, australopiths weren’t that different from other apes.

    Only with the appearance of true humans – the genus Homo – did hominins begin to look and behave a little more like we do. Few now doubt that our genus evolved from a species of australopith, although exactly which one is a matter of debate. It was probably Lucy’s species Australopithecus afarensis, but a South African species, Australopithecus sediba, is also a candidate. It doesn’t help that this transition probably occurred between 2 and 3 million years ago, a time interval with a very poor hominin fossil record.

    The earliest species of Homo are known from only a few bone fragments, which makes them difficult to study. Some doubt that they belong in our genus, preferring to label them as australopiths. The first well-established Homo, and the first that we would recognise as looking a bit like us, appeared about 1.9 million years ago. It is named Homo erectus.

    Homo erectus: the toolmaker

    Erectus was unlike earlier hominins. It had come down from the trees completely and also shared our wanderlust: all earlier hominins are known only from Africa, but Homo erectus fossils have been discovered in Europe and Asia too.

    Homo erectus was also an innovator. It produced far more sophisticated tools than had any of its predecessors, and was probably the first to control fire. Some researchers think that it invented cooking, improving the quality of its diet and leading to an energy surplus that allowed bigger brains to evolve. It is certainly true that the brain size of Homo erectus grew dramatically during the species's 1.5-million-year existence. Some of the very earliest individuals had a brain volume below 600 cubic centimetres, not much larger than an australopith's, but some later individuals had brains with a volume of 900 cubic centimetres.

    Successful though Homo erectus was, it still lacked some key human traits: for instance, its anatomy suggests it was probably incapable of speech.

    The next hominin to appear was Homo heidelbergensis. It evolved from a Homo erectus population in Africa about 600,000 years ago. This species’ hyoid – a small bone with an important role in our vocal apparatus – is virtually indistinguishable from ours, and its ear anatomy suggests it would have been sensitive to speech.

    According to some interpretations, Homo heidelbergensis gave rise to our species, Homo sapiens, about 200,000 years ago in Africa. Separate populations of Homo heidelbergensis living in Eurasia evolved too, becoming the Neanderthals in the west and a still enigmatic group called the Denisovans in the east.

    Modern humans

    Within the last 100,000 years or so, the most recent chapter in our story unfolded. Modern humans spread throughout the world and Neanderthals and Denisovans disappeared. Exactly why they went extinct is another great mystery, but it seems likely that our species played its part. Interactions weren’t entirely hostile, though: DNA evidence shows that modern humans sometimes interbred with both Neanderthals and Denisovans.

    There is still much we do not know, and new fossils have the potential to change the story. Three new extinct hominins have been discovered in the past decade or so, including Australopithecus sediba and the enigmatic and not yet well-dated Homo naledi, both from South Africa. Strangest of all is the tiny “hobbit” Homo floresiensis, which lived in Indonesia until about 12,000 years ago and appears to have been a separate species.

    For 7 million years our lineage had shared the planet with at least one other species of hominin. With the hobbit gone, Homo sapiens stood alone.


    Study: Last Common Ancestor of Humans and Apes Looked Like Gorilla or Chimpanzee

    Humans split from our closest African ape relatives in the genus Pan around six to seven million years ago. We have features that clearly link us with African apes, but we also have features that appear more primitive. This combination calls into question whether the Homo-Pan last common ancestor looked more like modern day chimpanzees and gorillas or an ancient ape unlike any living group. A new study, published online in the Proceedings of the National Academy of Sciences, suggests that the simplest explanation – that the ancestor looked a lot like a chimpanzee or gorilla – is the right one, at least in the shoulder.

    Scientists carefully study fossils to determine what the last common ancestor of humans and African apes looked like. This image shows Paranthropus boisei, a hominid that lived in sub-Saharan Africa between 2 and 1.4 million years ago. Image credit: © Roman Yevseyev.

    “It appears that shoulder shape tracks changes in early human behavior such as reduced climbing and increased tool use,” said the study’s lead author Dr Nathan Young of the University of California, San Francisco.

    The shoulders of African apes consist of a trowel-shaped blade and a handle-like spine that points the joint with the arm up toward the skull, giving an advantage to the arms when climbing or swinging through the branches.

    In contrast, the scapular spine of monkeys is pointed more downwards.

    In humans this trait is even more pronounced, indicating behaviors such as stone tool making and high-speed throwing.

    The prevailing question was whether humans evolved this configuration from a more primitive ape, or from a modern African ape-like creature but later reverted to the downward angle.

    Dr Young and his colleagues from Harvard University, the American Museum of Natural History, and the California Academy of Sciences tested these competing theories by comparing 3D measurements of fossil shoulder blades of early hominins and modern humans against those of African apes, orangutans, gibbons and large, tree-dwelling monkeys.

    The scientists found that the shoulder shape of anatomically modern Homo sapiens is unique in that it shares the lateral orientation with orangutans and the scapular blade shape with African apes, making it a primate in the middle.

    “Human shoulder blades are odd, separated from all the apes. Primitive in some ways, derived in other ways, and different from all of them,” Dr Young said.

    “How did the human lineage evolve and where did the common ancestor to modern humans evolve a shoulder like ours?”

    To find out, the researchers analyzed two early human ancestors – Australopithecus afarensis and A. sediba – as well as Homo ergaster and Neanderthals, to see where they fit on the shoulder spectrum.

    The results showed that australopiths were intermediate between African apes and humans.

    The shoulder of Australopithecus afarensis was more like an African ape’s than a human’s, while Australopithecus sediba’s was closer to a human’s than to an ape’s.

    This positioning is consistent with evidence for increasingly sophisticated tool use in Australopithecus.

    “The mix of ape and human features observed in Australopithecus afarensis’ shoulder support the notion that, while bipedal, the species engaged in tree climbing and wielded stone tools. This is a primate clearly on its way to becoming human,” explained co-author Dr Zeray Alemseged from the California Academy of Sciences.

    These shifts in the shoulder also enabled the evolution of another critical behavior – humans’ ability to throw objects with speed and accuracy.

    A laterally facing shoulder blade allows humans to store energy in their shoulders, much like a slingshot, facilitating high-speed throwing, an important and uniquely human behavior.

    “These changes in the shoulder, which were probably initially driven by the use of tools well back into human evolution, also made us great throwers,” said co-author Dr Neil Roach of Harvard University.

    “Our unique throwing ability likely helped our ancestors hunt and protect themselves, turning our species into the most dominant predators on Earth.”

    Nathan M. Young et al. Fossil hominin shoulders support an African ape-like last common ancestor of humans and chimpanzees. PNAS, published online September 8, 2015 doi: 10.1073/pnas.1511220112


    The Hobbit gets a little older, and science a little wiser

    A forensic reconstruction of the appearance of Homo floresiensis Credit: Cicero Moraes et alii. Wikimedia Commons, CC BY-SA

    When a skeleton of the so-called 'Hobbit' - scientific name Homo floresiensis - was unearthed in Indonesia in 2003, it went on to cause a major furor in anthropological circles, like few finds before it.

    More than a decade on, the dust has largely settled on the debate about its status as a legitimate pre-human species, although some researchers will probably never agree it is anything other than a diseased modern human. I doubt history will be on their side.

    Still, the Hobbit continues to surprise us, and its discovery has rewritten the human story in some remarkable and unpredictable ways.

    The first incredible thing about it was that in many ways it physically resembled Australopithecus: ape-like pre-humans that lived in Africa between about 4.5 million and 2 million years ago.

    Famous examples of Australopithecus include 'Lucy' from Ethiopia and the 'Taung Child' and Australopithecus sediba from South Africa.

    Homo floresiensis was, as its nickname suggests, a pint-sized prehuman: it stood just over a metre tall (106 cm) and weighed in at only 30-35 kg. The skeleton is thought to be from a female of the species.

    Its lower limbs were very, very short, just like Lucy's, meaning it was an inefficient walker on the ground, but a biped nonetheless. The Hobbit's upper limbs were also short, again very much like Lucy's, as well as a little like our own.

    But what's really revealing is the ratio of upper to lower limb bone length: at 87 percent, Homo floresiensis is very much like Lucy and very unlike our own species.
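
    As a rough illustration of this limb-ratio arithmetic, here is a minimal Python sketch. The bone lengths are illustrative assumptions (approximate values commonly reported for the LB1 skeleton and for an average modern human, not figures taken from this article):

        # Humerofemoral index: humerus (upper arm) length as a percentage
        # of femur (thigh) length. Higher values mean relatively longer arms.
        def humerofemoral_index(humerus_mm: float, femur_mm: float) -> float:
            return 100 * humerus_mm / femur_mm

        # Approximate, illustrative bone lengths (assumed, not from the text):
        print(round(humerofemoral_index(243, 280)))  # LB1 "hobbit": ~87, Lucy-like
        print(round(humerofemoral_index(330, 460)))  # typical modern human: ~72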

    It also had a very stocky build, much more so than modern humans. But its brain was tiny: not much larger than a grapefruit at about 430 cubic centimetres.

    To put this into context, Lucy's kind had a brain volume in the range of 380-550 cubic centimetres, while living humans have on average a brain volume of about 1,350 cubic centimetres. So again, Lucy-like.

    But don't let its small brain fool you. The stone tools found along with the Hobbit are very sophisticated indeed. In fact, some archaeologists believed their level of complexity was only ever seen in tools made by modern humans, until the Hobbit came along.

    This shows us once again that linking behavioural sophistication to large brains is an unnecessary assumption. It has more to do with a deeply ingrained anthropocentric view of the world than with evolutionary reality.

    The shape of its skull is reminiscent of both Homo habilis and Homo erectus, and its teeth are small and more human-like, which is why it was classified in Homo and not Australopithecus.

    Still, I think it sits uneasily in Homo. It could certainly be accommodated in Australopithecus, but it probably deserves to be classified in its own group, its own genus.

    Also, Homo floresiensis parallels what we see in Australopithecus sediba in showing many Homo-like traits combined with Australopithecus ones. Remember, sediba is about 2 million years old and was, I think, incorrectly assigned to Australopithecus.

    My colleagues are yet to acknowledge the parallels here, and my views will not be popular among anthropologists, who are for the most part deeply conservative about such matters.

    But to claim the Hobbit fits comfortably within Homo is absurd, and turns the human genus into an ill-defined hodgepodge of difficult-to-classify fossils. It renders Homo meaningless.

    Had the Hobbit been a new species of monkey, elephant or rodent I doubt anyone would have objected to it being a wholly new kind of creature deserving of its own genus and place in the tree of life.

    The second incredible thing about Homo floresiensis is its geographic location. What on earth was a Lucy-like creature doing on the Indonesian island of Flores, so far away from Africa? And, so damned close to Australia?

    This remains one of the biggest mysteries about the Hobbit. Why was it living on an island that was, over the last million years and more, never connected to the Asian mainland? How did it get there?

    Some of my colleagues think it is simply a dwarfed version of Homo erectus, a species found on the nearby island of Java from perhaps 1.5 million years ago. But I don't buy it. This idea can't explain the resemblances to Lucy.

    Homo floresiensis is the first example of a genuine island-dwelling pre-human; beyond it, only modern humans among all members of the human evolutionary group are known to have colonised and survived on genuinely isolated islands like Flores.

    If the archaeologists are right about the complexity of its tools and cognition, then it must surely have been capable of building boats, even if quite rudimentary ones?

    Where did it come from? Well, the similarities to Lucy and sediba suggest it must have evolved from Australopithecus. In Africa, or perhaps even outside of Africa. We should expect anthropologists to find Australopithecus in Asia one day soon.

    The third incredible thing about it is its remarkably young geological age.

    The cave deposits in which the bones of Homo floresiensis were found were thought until just last week to span the period 95 thousand to 12 thousand years ago. This made it the youngest example of a non-sapiens species anywhere on the planet.

    To put this into context, people were already beginning to develop agriculture in the Fertile Crescent and rich plains of the Yangtze River by around 12 thousand years ago.

    New research published last week in the journal Nature by Thomas Sutikna and co-workers shows the original estimates of the age of the Hobbit were wrong. New ages, including directly on the bones of Homo floresiensis itself, now show it lived at Liang Bua Cave between 100 thousand and 60 thousand years ago.

    And the stone tools associated with the species are found in cave sediments dating between 190 thousand and 50 thousand years old.

    Does the re-dating detract from the significance of the Hobbit? Not at all. It still seems incredible that a Lucy-like creature survived until so late where it did; whether that was 12 thousand or 60 thousand years ago makes little difference really.

    It's about as radical a discovery as we might expect in anthropology, and the full implications of the find are yet to be fully appreciated, as I hope I've explained somewhat here.

    Why did it disappear? Well, the new dates actually suggest a very likely culprit, where a date of 12 thousand years had just left anthropologists scratching their heads on the issue.

    We know that the earliest modern humans got into Southeast Asia and Australia about the time the Hobbits disappeared. And while this is not direct evidence, it's certainly plausible that our own kind was responsible, directly or indirectly, for their demise.

    Homo floresiensis was too small to be considered 'megafauna', but it might still have been part of the wave of extinction that accompanied the settlement of the globe by our species, which saw hundreds of mammal species disappear by the end of the last Ice Age.

    This article was originally published on The Conversation. Read the original article.


    Thumbs Down on Human-Ape Evolution

    Attempts to prove that humans evolved from apes have constantly failed. After the myth that we are genetically 98 percent identical to our supposedly closest ancestors—the chimpanzees—had been thoroughly debunked, evolutionists did not cease looking for other similarities. Before we cover this area, it would be instructive to review the confidence among scientists about that 98 percent claim.[1] It was not the view of only one or two scientists; it was scientific orthodoxy for years. It was practically a consensus among evolutionists, judging by how frequently the claim was made. The alleged similarity has varied between 96% and 99%, but the upshot has been that our genetic similarity, they kept saying, is extremely close. Science Daily said in 2005,

    The first comprehensive comparison of the genetic blueprints of humans and chimpanzees shows our closest living relatives share perfect identity with 96 percent of our DNA sequence, an international research consortium reported today.[2]

    The lead author of the main study that came to the 98% conclusion was Robert Waterston, M.D., Ph.D., chair of the Department of Genome Sciences at the University of Washington School of Medicine in Seattle.[3] An article in the leading popular science journal Scientific American wrote:

    We now have large regions of the chimpanzee genome fully sequenced and can compare them to human sequences. Most studies indicate that when genomic regions are compared between chimpanzees and humans, they share about 98.5 percent sequence identity…. These early findings suggested that chimps and humans might typically have sequences that diverge from one another by only about 1 percent.[4]

    After noting that individual humans differ genetically by about 0.1 percent, which results in significant variation in appearance and other traits between different humans, the report concludes, “chimps differ from humans by about 15-fold more, on the average, than humans do from one another. … Therefore, perhaps we shouldn’t be so surprised that chimps could be 98.5 percent related to humans.”[5] Such was the spirit of the early 2000s.

    Now we know the actual similarity is more like 85 percent, a 15 percent difference. This produces a genetic chasm between humans and chimps: a difference over 150-fold greater than the variation among humans themselves! In spite of the chasm between humans and chimps, evolutionists ignore it. They have never apologized for misleading the public. Some continue to state the 98% figure as fact. Meanwhile, they continue to look for evidence that humans evolved from some chimp-like ancestor. Today’s example is the thumb.
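
    To make the fold-difference arithmetic explicit, here is a minimal sketch using only the percentages quoted above (treating each similarity figure as a simple sequence-difference ratio is an assumption of this illustration):

        # Fold differences relative to the ~0.1% variation between individual humans.
        human_vs_human = 0.1           # percent difference between two humans
        early_claim = 100 - 98.5       # 1.5% difference implied by the 98.5% claim
        revised_claim = 100 - 85.0     # 15% difference implied by the 85% figure

        print(round(early_claim / human_vs_human))    # ~15, the quoted "15-fold"
        print(round(revised_claim / human_vs_human))  # ~150, the quoted "150-fold"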

    The Effort to Prove a Chimp Finger Evolved into the Human Thumb

    The human thumb is a major example of the many anatomical designs that set us apart from apes. Evolutionists, who take it as a given that humans and chimps have a common ancestor, simply ignore the genetic chasm in their efforts to find evidence that a chimp thumb evolved into a human thumb. Gizmodo stated the challenge the evolutionary view poses:

    Many primates have opposable thumbs, but none are quite like ours. The human thumb, set in opposition to the other fingers, allows for precision grasping, which anthropologists consider a necessary physical attribute for crafting tools.[6]

    Their work is certainly cut out for them. Anatomists recognize that “The human thumb is a marvel” of design, “allowing our ancestors to craft stone tools and radically expand their food choices.” The evolutionist proceeds with conviction of evolutionary ancestry into the chasm, looking for bits of fossil evidence. “New research suggests our agile, dexterous thumbs appeared 2 million years ago, in a development that irrevocably changed the course of human history.”[7] In Science magazine in January 2021, Michael Price jumped into the chasm, grasping at a slight piece of indirect fossil evidence.

    [The] human thumb is a nimble wonder, allowing us to make tools, sew clothing, and open pickle jars. But just how and when this unique digit evolved has long been a mystery. Now, a new study modeling muscle in fossilized thumbs suggests about 2 million years ago, our ancient ancestors evolved a uniquely dexterous appendage while our other close relatives remained … all thumbs.[8]

    Since evolutionists “know” by faith that humans evolved from chimps, they seek to learn when this thumb evolved, not whether it did. They would like thumb evolution to coincide with their story of the emergence of stone tool production and other innovations. University of Tübingen paleoanthropologist Katerina Harvati, the lead author of a new study, pointed out that “most studies looking into the history of hominin dexterity rely on direct comparison between the modern human hand and the hands of early hominins.”[9] Because this approach has not been successful, other approaches are now being attempted. Her research explored the possibility that an earlier, less evolved hominin hand existed that had comparatively superior thumb dexterity. The assumption, now falsified,[10] is that

    Homo sapiens emerged around 300,000 years ago, which means we were latecomers to the human show. Other humans (now extinct), such as Homo habilis, Homo erectus, Homo naledi, and Homo neanderthalensis (otherwise known as Neanderthals) were around much earlier, with the very first humans appearing approximately 2.8 million years ago and possibly even earlier.[11]

    The new research[12] was based on the anatomical concept known as “thumb opposition,” the “action of bringing the thumb in contact with the fingers.” This efficiency is “greatly enhanced in humans” compared to other primates like chimpanzees (which also have opposable thumbs) and is a “central component of human-like manual dexterity.” The researchers integrated “virtual muscle modeling with three-dimensional bone shape analysis to investigate biomechanical efficiency for thumb opposition in the fossil human record.”[13]

    Harvati and her colleagues wanted to know if this enhanced thumb opposition efficiency could be detected in early hominin fossils as listed above, and if so, in which ones. This methodology produces major problems related to the quality and status of the fossil fragments she evaluated.[14] Assuming that the oldest stone tools in the archaeological record date back to 3 million Darwin years ago, she speculates under Darwinian hypnosis that another hominin genus, namely Australopithecus, may also have had human-like thumbs.

    Her team’s effort drew on hand bones of modern humans, chimpanzees, and a number of Pleistocene-era hominins, including Homo neanderthalensis, Homo naledi, and three species of Australopithecus. The researchers analyzed bone anatomy. Because soft tissue like muscle is not preserved in the fossils, its presence and location were inferred. They focused on one muscle, opponens pollicis, whose location, function, and muscle attachment sites are known to be similar among all the living great apes. The scientists then created virtual models of the hominin hands and calculated the manual dexterity of each species. The results, they concluded,

    indicate that a fundamental aspect of efficient thumb opposition appeared approximately 2 million years ago, possibly associated with our own genus Homo, and did not characterize Australopithecus, the earliest proposed stone tool maker. This was true also of the late Australopithecus species, Australopithecus sediba, previously found to exhibit human-like thumb proportions. In contrast, later Homo species, including the small-brained Homo naledi, show high levels of thumb opposition dexterity, highlighting the increasing importance of cultural processes and manual dexterity in later human evolution.[15]

    One interesting result was that thumb dexterity in Australopithecus was similar to that of living chimpanzees, supporting the conclusion that Australopithecus was not a link between apes and humans but simply an extinct chimp. The controversial fossil of Australopithecus sediba, “whose hand, and particularly the thumb, has been described as especially human-like,” has prompted “suggestions that it was associated with tool-related behaviors.”

    A number of researchers presented several major valid criticisms of the study. For example, Chatham University professor of biology Erin Marie Williams-Hatala, mentioned the “focus on a single muscle attachment site, known as an enthesis, as a major limitation.”[16] Another concern was that the living great apes have thumbs that are relatively small and cannot directly oppose the fingertips as human thumbs can, a fact largely negating the results of the study.[17]

    The authors used “aspects of the shape and size of a muscle attachment complex to approximate the shape and functional abilities of the associated small muscle in the hand.” This particular muscle is very important for moving the thumb, but the “idea that muscle morphology—and by extension, muscle and organismal function—can be gleaned from the associated attachment site is an old and very tempting one that continues to be heavily debated.”[18]

    Essentially, scientists “simply do not understand the relationship between the morphology of muscle attachment sites and the morphology, and certainly not the functional ability of the associated muscle, to confidently say anything about the latter based on the former.”[19] Another study of muscle attachment sites, which was the main methodology of the paper reviewed here, concluded that the attachment site morphological parameters “do not reflect muscle size or activity. In spite of decades of assumption otherwise, there appears to be no direct causal relationship between muscle size or activity and attachment site morphology, and reconstructions of behavior based on these features should be viewed with caution.”[20]

    Another problem was that the study was only able to focus on a single, “albeit crucial,” muscle of the thumb “due to the fragmentary nature of the fossil record.” Her team “wanted to include as many specimens from as many fossil hominin species as possible,” but this limitation to one muscle restricted the conclusions that could be drawn. Last, Aix-Marseille University biomechanics professor Laurent Vigouroux, whose specialty is the mechanics of the human grip, noted that there are “more than 10 different muscles that contribute to thumb movement, and it’s possible that weaker opponens pollicis in some species may have been compensated for by some other muscle or muscles.”[21]

    Looking at the criticisms by other scientists in the field, who discounted the practice of extrapolating from muscle attachment sites to thumb function, we conclude that little weight can be put on the conclusions reached by the evolutionists. Consequently, little confidence can be placed in a study that attempted to determine when and how the chimp thumb evolved into the human thumb. The study did not even provide any evidence that the modern human thumb evolved from some primate ancestor common to chimps and humans. The chasm between chimps and humans remains, as would be expected given the genetic chasm documented by the DNA research.

    See Christa Charles’s write-up about Harvati’s work in New Scientist as an example of the uncritical repetition of any story that makes evolution appear to have evidence.—Ed.

    [1] Marks, Jonathan. 2003. What It Means to Be 98% Chimpanzee: Apes, People, and Their Genes. Berkeley, CA: University of California Press.

    [2] New Genome Comparison Finds Chimps, Humans Very Similar At DNA Level. https://www.sciencedaily.com/releases/2005/09/050901074102.htm

    [3] 2005. Initial sequence of the chimpanzee genome and comparison with the human genome. Nature 437:69-87 (September 1); p. 73.

    [4] Deininger, Prescott. 2004. What does the fact that we share 95 percent of our genes with the chimpanzee mean? And how was this number derived? https://www.scientificamerican.com/article/what-does-the-fact-that-w/

    [6] Dvorsky, George. 2021. Human Thumbs Got a Major Upgrade 2 Million Years Ago, Sparking a Cultural Revolution, Study Finds. https://gizmodo.com/human-thumbs-got-a-major-upgrade-2-million-years-ago-s-1846150313/ Emphasis added.

    [8] Price, Michael. 2021. Your amazing thumb is about 2 million years old. Science, January 21, 2021. https://www.sciencemag.org/news/2021/01/your-amazing-thumb-about-2-million-years-old

    [10] Bergman, Jerry. 2020. Apes as Ancestors: Examining the Claims About Human Evolution. Tulsa, OK: Bartlett Publishing. Co-authored with Peter Line, PhD, and Jeff Tomkins, PhD.

    [12] Karakostis, Fotios Alexandros, et al. 2021. Biomechanics of the human thumb and the evolution of dexterity. Current Biology 31:1-9. https://doi.org/10.1016/j.cub.2020.12.041

    [15] Karakostis, et al., 2021, p. 1.

    [20] Zumwalt, Ann. 2006. The effect of endurance exercise on the morphology of muscle attachment sites. Journal of Experimental Biology. 209:444-454. https://jeb.biologists.org/content/209/3/444.

    Dr. Jerry Bergman has taught biology, genetics, chemistry, biochemistry, anthropology, geology, and microbiology for over 40 years at several colleges and universities including Bowling Green State University, Medical College of Ohio where he was a research associate in experimental pathology, and The University of Toledo. He is a graduate of the Medical College of Ohio, Wayne State University in Detroit, the University of Toledo, and Bowling Green State University. He has over 1,300 publications in 12 languages and 40 books and monographs. His books and textbooks that include chapters that he authored are in over 1,500 college libraries in 27 countries. So far over 80,000 copies of the 40 books and monographs that he has authored or co-authored are in print. For more articles by Dr Bergman, see his Author Profile.


    Shouldering the Burden of Evolution

    As early humans increasingly left forests and utilized tools, they took an evolutionary step away from apes. But what this last common ancestor with apes looked like has remained unclear. A new study led by researchers at UC San Francisco shows that important clues lie in the shoulder.

    Humans split from our closest African ape relatives in the genus Pan – including chimpanzees and bonobos – 6 to 7 million years ago. Yet certain human traits resemble the more distantly related orangutan or even monkeys. This combination of characteristics calls into question whether the last common ancestor of modern humans and African apes looked more like modern day chimps and gorillas or an ancient ape unlike any living group.

    “Humans are unique in many ways. We have features that clearly link us with African apes, but we also have features that appear more primitive, leading to uncertainty about what our common ancestor looked like,” said Nathan Young, PhD, assistant professor at UC San Francisco School of Medicine and lead author of the study. “Our study suggests that the simplest explanation, that the ancestor looked a lot like a chimp or gorilla, is the right one, at least in the shoulder.”

    It appears, he said, that shoulder shape tracks changes in early human behavior such as reduced climbing and increased tool use. The paper, titled Fossil Hominin Shoulders Support an African Ape-like Last Common Ancestor of Chimpanzees and Humans, was published online Sept. 7 in the journal PNAS.

    The shoulders of African apes consist of a trowel-shaped blade and a handle-like spine that points the joint with the arm up toward the skull, giving an advantage to the arms when climbing or swinging through the branches. In contrast, the scapular spine of monkeys is pointed more downwards. In humans this trait is even more pronounced, indicating behaviors such as stone tool making and high-speed throwing. The prevailing question was whether humans evolved this configuration from a more primitive ape, or from a modern African ape-like creature but later reverted to the downward angle.

    The researchers tested these competing theories by comparing 3-D measurements of fossil shoulder blades of early hominins and modern humans against those of African apes, orangutans, gibbons and large, tree-dwelling monkeys. They found that the modern human’s shoulder shape is unique in that it shares the lateral orientation with orangutans and the scapular blade shape with African apes, making it a primate in the middle.

    “Human shoulder blades are odd, separated from all the apes. Primitive in some ways, derived in other ways, and different from all of them,” Young said. “How did the human lineage evolve and where did the common ancestor to modern humans evolve a shoulder like ours?”

    To find out, Young and his team analyzed two early human Australopithecus species, the primitive A. afarensis and the younger A. sediba, as well as H. ergaster and Neanderthals, to see where they fit on the shoulder spectrum.

    “Finding fossil remains of the common ancestor would be ideal, however, when fossils are absent, employing such multifaceted techniques is the next best solution,” said Zeray Alemseged, PhD, senior curator of Anthropology at the California Academy of Sciences.

    The results showed that australopiths were intermediate between African apes and humans: the A. afarensis shoulder was more like an African ape’s than a human’s, and A. sediba’s was closer to a human’s than to an ape’s. This positioning is consistent with evidence for increasingly sophisticated tool use in Australopithecus.

    “The mix of ape and human features observed in A. afarensis’ shoulder support the notion that, while bipedal, the species engaged in tree climbing and wielded stone tools. This is a primate clearly on its way to becoming human,” Alemseged said.

    These shifts in the shoulder also enabled the evolution of another critical behavior – humans’ ability to throw objects with speed and accuracy, said Neil T. Roach, PhD, a fellow of human evolutionary biology at Harvard University. A laterally facing shoulder blade allows humans to store energy in their shoulders, much like a slingshot, facilitating high-speed throwing, an important and uniquely human behavior.

    “These changes in the shoulder, which were probably initially driven by the use of tools well back into human evolution, also made us great throwers,” Roach said. “Our unique throwing ability likely helped our ancestors hunt and protect themselves, turning our species into the most dominant predators on earth.”

    However, this remarkable ability has trade-offs — partly because of that downward scapular tilt, humans can throw fastballs but are also prone to shoulder injuries. Today, Americans suffer approximately 2 million rotator cuff injuries each year, but not everyone is at equal risk. Because shoulder shape varies widely among modern humans, understanding these variations could help predict which people are more prone to injury.

    “We could potentially use information about the shape of an individual’s shoulder to predict if they have a higher likelihood of injury and then recommend personalized exercise programs that would best help to prevent them,” Young said. “For a baseball pitcher, depending on your shoulder shape, you might want to emphasize some strengthening exercises over others to protect your rotator cuff.”

    The researchers’ next step will be to analyze variability in the shoulder blade of modern humans and the genetic sequences that cause those differences to understand how these factors influence the likelihood to get rotator cuff injuries.

    “Once we understand how the shape of the shoulder blade affects who gets injured, the next step is to find out what genes contribute to those injury prone shapes,” said Terence Capellini, PhD, assistant professor of human evolutionary biology at Harvard University. “With that information, we hope that one day doctors can diagnose and help prevent shoulder injuries years before they happen, simply by rubbing a cotton swab on a patient’s cheek to collect their DNA”.

    This work was supported by funding from the National Science Foundation (Grant BCS-1518596); Margaret and William Hearst; the National Institutes of Health (Grants R01DE019638 and R01DE021708); and the ongoing support of the UCSF Orthopaedic Trauma Institute and the Laboratory for Skeletal Regeneration at San Francisco General Hospital.

    UC San Francisco (UCSF) is a leading university dedicated to promoting health worldwide through advanced biomedical research, graduate-level education in the life sciences and health professions, and excellence in patient care. It includes top-ranked graduate schools of dentistry, medicine, nursing and pharmacy, a graduate division with nationally renowned programs in basic, biomedical, translational and population sciences, as well as a preeminent biomedical research enterprise and two top-ranked hospitals, UCSF Medical Center and UCSF Benioff Children’s Hospital San Francisco.


    Researchers Suggest Big Toe Was Last Part of Foot to Evolve

    The earliest hominins split their days between the trees and the ground, alternately adopting ape-like tree-swinging behaviors and human-like bipedalism, or walking upright on two feet—albeit in a crouched position. By the time Lucy and her Australopithecus afarensis relatives arrived on the scene some four million years ago, bipedalism had largely overtaken tree-dwelling, but according to a study published in the Proceedings of the National Academy of Sciences, these human ancestors likely lacked a key evolutionary adaptation: the rigid big toe.

    BBC News’ Angus Davison reports that the new findings suggest the big toe, which enables humans to push off of the ground while walking and running, was one of the last parts of the foot to evolve.

    “It might have been last because it was the hardest to change,” lead author Peter Fernandez, a biomedicist at Milwaukee’s Marquette University, tells Davison. “We also think there was a compromise. The big toe could still be used for grasping, as our ancestors spent a fair amount of their time in the trees before becoming fully committed to walking on the ground."

    To trace the big toe’s evolution, Fernandez and colleagues created 3D scans of human relatives’ toe bone joints, relying on a combination of living creatures—including apes and monkeys—and fossilized samples. After juxtaposing these scans with ones made of modern humans and mapping the data onto an evolutionary tree, researchers realized that the big toe developed much later than the rest of the foot’s bones. Early hominins’ gait, therefore, had more in common with apes’ than the easy human stride seen today.

    According to Live Science’s Jennifer Welsh, differences between human and non-human primate feet come down to purpose. Whereas most primates use their feet to grasp onto tree branches and other objects, humans rely on theirs to navigate life on two legs. For example, arches, which are located on the inside of the foot close to the big toe, make it harder for humans to nimbly climb trees but offer shock absorption when planting one’s feet on the ground.

    The human big toe specifically carries 40 percent of the five toes’ collective weight, Corey Binns writes for Scientific American, and it is the last part of the foot to leave the ground when one walks or runs. Comparatively, apes’ big toes are opposable, built for grasping and functioning similarly to the versatile opposable thumb, which allows primates to deftly perform a wide range of motions.

    Although early humans such as A. afarensis and the roughly 4.4-million-year-old Ardipithecus ramidus walked upright, BBC News’ Davison notes that the study confirms this bipedalism did not preclude the existence of an opposable, ape-like big toe.

    “It was a bit of shock when hominins were found that have a grasping, or opposable, big toe, as this was thought to be incompatible with effective bipedalism,” anatomist Fred Spoor of London’s Natural History Museum tells Davison. “This work shows that different parts of the foot can have different functions. When a big toe is opposable, you can still function properly as a biped."



    But even presuming that this story of natural selection is right, it doesn’t explain why, 10 million years later, I like wine so much. “It should puzzle us more than it does,” Edward Slingerland writes in his wide-ranging and provocative new book, Drunk: How We Sipped, Danced, and Stumbled Our Way to Civilization, “that one of the greatest foci of human ingenuity and concentrated effort over the past millennia has been the problem of how to get drunk.” The damage done by alcohol is profound: impaired cognition and motor skills, belligerence, injury, and vulnerability to all sorts of predation in the short run; damaged livers and brains, dysfunction, addiction, and early death as years of heavy drinking pile up. As the importance of alcohol as a caloric stopgap diminished, why didn’t evolution eventually lead us away from drinking—say, by favoring genotypes associated with hating alcohol’s taste? That it didn’t suggests that alcohol’s harms were, over the long haul, outweighed by some serious advantages.

    Versions of this idea have recently bubbled up at academic conferences and in scholarly journals and anthologies (largely to the credit of the British anthropologist Robin Dunbar). Drunk helpfully synthesizes the literature, then underlines its most radical implication: Humans aren’t merely built to get buzzed—getting buzzed helped humans build civilization. Slingerland is not unmindful of alcohol’s dark side, and his exploration of when and why its harms outweigh its benefits will unsettle some American drinkers. Still, he describes the book as “a holistic defense of alcohol.” And he announces, early on, that “it might actually be good for us to tie one on now and then.”

    Slingerland is a professor at the University of British Columbia who, for most of his career, has specialized in ancient Chinese religion and philosophy. In a conversation this spring, I remarked that it seemed odd that he had just devoted several years of his life to a subject so far outside his wheelhouse. He replied that alcohol isn’t quite the departure from his specialty that it might seem: as he has recently come to see things, intoxication and religion are parallel puzzles, interesting for very similar reasons. As far back as his graduate work at Stanford in the 1990s, he’d found it bizarre that across all cultures and time periods, humans went to such extraordinary (and frequently painful and expensive) lengths to please invisible beings.

    In 2012, Slingerland and several scholars in other fields won a big grant to study religion from an evolutionary perspective. In the years since, they have argued that religion helped humans cooperate on a much larger scale than they had as hunter-gatherers. Belief in moralistic, punitive gods, for example, might have discouraged behaviors (stealing, say, or murder) that make it hard to peacefully coexist. In turn, groups with such beliefs would have had greater solidarity, allowing them to outcompete or absorb other groups.

    Around the same time, Slingerland published a social-science-heavy self-help book called Trying Not to Try. In it, he argued that the ancient Taoist concept of wu-wei (akin to what we now call “flow”) could help with both the demands of modern life and the more eternal challenge of dealing with other people. Intoxicants, he pointed out in passing, offer a chemical shortcut to wu-wei—by suppressing our conscious mind, they can unleash creativity and also make us more sociable.

    At a talk he later gave on wu-wei at Google, Slingerland made much the same point about intoxication. During the Q&A, someone in the audience told him about the Ballmer Peak—the notion, named after the former Microsoft CEO Steve Ballmer, that alcohol can affect programming ability. Drink a certain amount, and it gets better. Drink too much, and it goes to hell. Some programmers have been rumored to hook themselves up to alcohol-filled IV drips in hopes of hovering at the curve’s apex for an extended time.

    His hosts later took him over to the “whiskey room,” a lounge with a foosball table and what Slingerland described to me as “a blow-your-mind collection of single-malt Scotches.” The lounge was there, they said, to provide liquid inspiration to coders who had hit a creative wall. Engineers could pour themselves a Scotch, sink into a beanbag chair, and chat with whoever else happened to be around. They said doing so helped them to get mentally unstuck, to collaborate, to notice new connections. At that moment, something clicked for Slingerland too: “I started to think, Alcohol is really this very useful cultural tool.” Both its social lubrications and its creativity-enhancing aspects might play real roles in human society, he mused, and might possibly have been involved in its formation.

    He belatedly realized how much the arrival of a pub a few years earlier on the UBC campus had transformed his professional life. “We started meeting there on Fridays, on our way home,” he told me. “Psychologists, economists, archaeologists—we had nothing in common—shooting the shit over some beers.” The drinks provided just enough disinhibition to get conversation flowing. A fascinating set of exchanges about religion unfolded. Without them, Slingerland doubts that he would have begun exploring religion’s evolutionary functions, much less have written Drunk.

    Which came first, the bread or the beer? For a long time, most archaeologists assumed that hunger for bread was the thing that got people to settle down and cooperate and have themselves an agricultural revolution. In this version of events, the discovery of brewing came later—an unexpected bonus. But lately, more scholars have started to take seriously the possibility that beer brought us together. (Though beer may not be quite the word. Prehistoric alcohol would have been more like a fermented soup of whatever was growing nearby.)

    For the past 25 years, archaeologists have been working to uncover the ruins of Göbekli Tepe, a temple in eastern Turkey. It dates to about 10,000 B.C.—making it about twice as old as Stonehenge. It is made of enormous slabs of rock that would have required hundreds of people to haul from a nearby quarry. As far as archaeologists can tell, no one lived there. No one farmed there. What people did there was party. “The remains of what appear to be brewing vats, combined with images of festivals and dancing, suggest that people were gathering in groups, fermenting grain or grapes,” Slingerland writes, “and then getting truly hammered.”

    Over the decades, scientists have proposed many theories as to why we still drink alcohol, despite its harms and despite millions of years having passed since our ancestors’ drunken scavenging. Some suggest that it must have had some interim purpose it’s since outlived. (For example, maybe it was safer to drink than untreated water—fermentation kills pathogens.) Slingerland questions most of these explanations. Boiling water is simpler than making beer, for instance.

    Göbekli Tepe—and other archaeological finds indicating very early alcohol use—gets us closer to a satisfying explanation. The site’s architecture lets us visualize, vividly, the magnetic role that alcohol might have played for prehistoric peoples. As Slingerland imagines it, the promise of food and drink would have lured hunter-gatherers from all directions, in numbers great enough to move gigantic pillars. Once built, both the temple and the revels it was home to would have lent organizers authority, and participants a sense of community. “Periodic alcohol-fueled feasts,” he writes, “served as a kind of ‘glue’ holding together the culture that created Göbekli Tepe.”

    Things were likely more complicated than that. Coercion, not just inebriated cooperation, probably played a part in the construction of early architectural sites, and in the maintenance of order in early societies. Still, cohesion would have been essential, and this is the core of Slingerland’s argument: Bonding is necessary to human society, and alcohol has been an essential means of our bonding. Compare us with our competitive, fractious chimpanzee cousins. Placing hundreds of unrelated chimps in close quarters for several hours would result in “blood and dismembered body parts,” Slingerland notes—not a party with dancing, and definitely not collaborative stone-lugging. Human civilization requires “individual and collective creativity, intensive cooperation, a tolerance for strangers and crowds, and a degree of openness and trust that is entirely unmatched among our closest primate relatives.” It requires us not only to put up with one another, but to become allies and friends.

    As to how alcohol assists with that process, Slingerland focuses mostly on its suppression of prefrontal-cortex activity, and how resulting disinhibition may allow us to reach a more playful, trusting, childlike state. Other important social benefits may derive from endorphins, which have a key role in social bonding. Like many things that bring humans together—laughter, dancing, singing, storytelling, sex, religious rituals—drinking triggers their release. Slingerland observes a virtuous circle here: Alcohol doesn’t merely unleash a flood of endorphins that promote bonding by reducing our inhibitions, it nudges us to do other things that trigger endorphins and bonding.

    Over time, groups that drank together would have cohered and flourished, dominating smaller groups—much like the ones that prayed together. Moments of slightly buzzed creativity and subsequent innovation might have given them further advantage still. In the end, the theory goes, the drunk tribes beat the sober ones.

    But this rosy story about how alcohol made more friendships and advanced civilization comes with two enormous asterisks: All of that was before the advent of liquor, and before humans started regularly drinking alone.


    The early Greeks watered down their wine; swilling it full-strength was, they believed, barbaric—a recipe for chaos and violence. “They would have been absolutely horrified by the potential for chaos contained in a bottle of brandy,” Slingerland writes. Human beings, he notes, “are apes built to drink, but not 100-proof vodka. We are also not well equipped to control our drinking without social help.”

    Distilled alcohol is recent—it became widespread in China in the 13th century and in Europe from the 16th to 18th centuries—and a different beast from what came before it. Fallen grapes that have fermented on the ground are about 3 percent alcohol by volume. Beer and wine run about 5 and 11 percent, respectively. At these levels, unless people are strenuously trying, they rarely manage to drink enough to pass out, let alone die. Modern liquor, however, is 40 to 50 percent alcohol by volume, making it easy to blow right past a pleasant social buzz and into all sorts of tragic outcomes.
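
    To see why these strengths matter, here is a minimal sketch converting each drink into grams of pure ethanol per serving. The ABV figures come from the text; the serving sizes and the density constant are assumptions for illustration:

        # Grams of pure ethanol per serving at the strengths quoted above.
        ETHANOL_DENSITY_G_PER_ML = 0.789

        def ethanol_grams(serving_ml: float, abv_percent: float) -> float:
            return serving_ml * (abv_percent / 100) * ETHANOL_DENSITY_G_PER_ML

        # Serving sizes below are illustrative assumptions, not from the text:
        print(round(ethanol_grams(473, 5), 1))   # pint of 5% beer:    ~18.7 g
        print(round(ethanol_grams(150, 11), 1))  # glass of 11% wine:  ~13.0 g
        print(round(ethanol_grams(44, 40), 1))   # shot of 40% liquor: ~13.9 g

    At these numbers, a single 44 ml shot delivers roughly as much ethanol as a full glass of wine, which is why liquor makes it so easy to overshoot a mild social buzz.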

    Just as people were learning to love their gin and whiskey, more of them (especially in parts of Europe and North America) started drinking outside of family meals and social gatherings. As the Industrial Revolution raged, alcohol use became less leisurely. Drinking establishments suddenly started to feature the long counters that we associate with the word bar today, enabling people to drink on the go, rather than around a table with other drinkers. This short move across the barroom reflects a fairly dramatic break from tradition: According to anthropologists, in nearly every era and society, solitary drinking had been almost unheard‑of among humans.

    The social context of drinking turns out to matter quite a lot to how alcohol affects us psychologically. Although we tend to think of alcohol as reducing anxiety, it doesn’t do so uniformly. As Michael Sayette, a leading alcohol researcher at the University of Pittsburgh, recently told me, if you packaged alcohol as an anti-anxiety serum and submitted it to the FDA, it would never be approved. He and his onetime graduate student Kasey Creswell, a Carnegie Mellon professor who studies solitary drinking, have come to believe that one key to understanding drinking’s uneven effects may be the presence of other people. Having combed through decades’ worth of literature, Creswell reports that in the rare experiments that have compared social and solitary alcohol use, drinking with others tends to spark joy and even euphoria, while drinking alone elicits neither—if anything, solo drinkers get more depressed as they drink.

    Sayette, for his part, has spent much of the past 20 years trying to get to the bottom of a related question: why social drinking can be so rewarding. In a 2012 study, he and Creswell divided 720 strangers into groups, then served some groups vodka cocktails and other groups nonalcoholic cocktails. Compared with people who were served nonalcoholic drinks, the drinkers appeared significantly happier, according to a range of objective measures. Maybe more important, they vibed with one another in distinctive ways. They experienced what Sayette calls “golden moments,” smiling genuinely and simultaneously at one another. Their conversations flowed more easily, and their happiness appeared infectious. Alcohol, in other words, helped them enjoy one another more.

    This research might also shed light on another mystery: why, in a number of large-scale surveys, people who drink lightly or moderately are happier and psychologically healthier than those who abstain. Robin Dunbar, the anthropologist, examined this question directly in a large study of British adults and their drinking habits. He reports that those who regularly visit pubs are happier and more fulfilled than those who don’t—not because they drink, but because they have more friends. And he demonstrates that it’s typically the pub-going that leads to more friends, rather than the other way around. Social drinking, too, can cause problems, of course—and set people on a path to alcohol-use disorder. (Sayette’s research focuses in part on how that happens, and why some extroverts, for example, may find alcohol’s social benefits especially hard to resist.) But solitary drinking—even with one’s family somewhere in the background—is uniquely pernicious because it serves up all the risks of alcohol without any of its social perks. Divorced from life’s shared routines, drinking becomes something akin to an escape from life.

    Southern Europe’s healthy drinking culture is hardly news, but its attributes are striking enough to bear revisiting: Despite widespread consumption of alcohol, Italy has some of the lowest rates of alcoholism in the world. Its residents drink mostly wine and beer, and almost exclusively over meals with other people. When liquor is consumed, it’s usually in small quantities, either right before or after a meal. Alcohol is seen as a food, not a drug. Drinking to get drunk is discouraged, as is drinking alone. The way Italians drink today may not be quite the way premodern people drank, but it likewise accentuates alcohol’s benefits and helps limit its harms. It is also, Slingerland told me, about as far as you can get from the way many people drink in the United States.

    Americans may not have invented binge drinking, but we have a solid claim to bingeing alone, which was almost unheard-of in the Old World. During the early 19th century, solitary binges became common enough to need a name, so Americans started calling them “sprees” or “frolics”—words that sound a lot happier than the lonely one-to-three-day benders they described.

    In his 1979 history, The Alcoholic Republic, the historian W. J. Rorabaugh painstakingly calculated the stunning amount of alcohol early Americans drank on a daily basis. In 1830, when American liquor consumption hit its all-time high, the average adult was going through more than nine gallons of spirits each year. Most of this was in the form of whiskey (which, thanks to grain surpluses, was sometimes cheaper than milk), and most of it was drunk at home. And this came on top of early Americans’ other favorite drink, homemade cider. Many people, including children, drank cider at every meal; a family could easily go through a barrel a week. In short, Americans of the early 1800s were rarely in a state that could be described as sober, and a lot of the time, they were drinking to get drunk.

    Rorabaugh argued that this longing for oblivion resulted from America’s almost unprecedented pace of change between 1790 and 1830. Thanks to rapid westward migration in the years before railroads, canals, and steamboats, he wrote, “more Americans lived in isolation and independence than ever before or since.” In the more densely populated East, meanwhile, the old social hierarchies evaporated, cities mushroomed, and industrialization upended the labor market, leading to profound social dislocation and a mismatch between skills and jobs. The resulting epidemics of loneliness and anxiety, he concluded, led people to numb their pain with alcohol.

    The temperance movement that took off in the decades that followed was a more rational (and multifaceted) response to all of this than it tends to look like in the rearview mirror. Rather than pushing for full prohibition, many advocates supported some combination of personal moderation, bans on liquor, and regulation of those who profited off alcohol. Nor was temperance a peculiarly American obsession. As Mark Lawrence Schrad shows in his new book, Smashing the Liquor Machine: A Global History of Prohibition, concerns about distilled liquor’s impact were international: As many as two dozen countries enacted some form of prohibition.

    Yet the version that went into effect in 1920 in the United States was by far the most sweeping approach adopted by any country, and the most famous example of the all-or-nothing approach to alcohol that has dogged us for the past century. Prohibition did, in fact, result in a dramatic reduction in American drinking. In 1935, two years after repeal, per capita alcohol consumption was less than half what it had been early in the century. Rates of cirrhosis had also plummeted, and would remain well below pre-Prohibition levels for decades.

    The temperance movement had an even more lasting result: It cleaved the country into tipplers and teetotalers. Drinkers were on average more educated and more affluent than nondrinkers, and also more likely to live in cities or on the coasts. Dry America, meanwhile, was more rural, more southern, more midwestern, more churchgoing, and less educated. To this day, it includes about a third of U.S. adults—a higher proportion of abstainers than in many other Western countries.

    What’s more, as Christine Sismondo writes in America Walks Into a Bar, by kicking the party out of saloons, the Eighteenth Amendment had the effect of moving alcohol into the country’s living rooms, where it mostly remained. This is one reason that, even as drinking rates decreased overall, drinking among women became more socially acceptable. Public drinking establishments had long been dominated by men, but home was another matter—as were speakeasies, which tended to be more welcoming.

    After Prohibition’s repeal, the alcohol industry refrained from aggressive marketing, especially of liquor. Nonetheless, drinking steadily ticked back up, hitting pre-Prohibition levels in the early ’70s, then surging past them. Around that time, most states lowered their drinking age from 21 to 18 (to follow the change in voting age)—just as the Baby Boomers, the biggest generation to date, were hitting their prime drinking years. For an illustration of what followed, I direct you to the film Dazed and Confused.

    Drinking peaked in 1981, at which point—true to form—the country took a long look at the empty beer cans littering the lawn, and collectively recoiled. What followed has been described as an age of neo-temperance. Taxes on alcohol increased; warning labels were added to containers. The drinking age went back up to 21, and penalties for drunk driving finally got serious. Awareness of fetal alcohol syndrome rose too—prompting a quintessentially American freak-out: Unlike in Europe, where pregnant women were reassured that light drinking remained safe, those in the U.S. were, and are, essentially warned that a drop of wine could ruin a baby’s life. By the late 1990s, the volume of alcohol consumed annually had declined by a fifth.

    And then began the current lurch upward. Around the turn of the millennium, Americans said To hell with it and poured a second drink, and in almost every year since, we’ve drunk a bit more wine and a bit more liquor than the year before. But why?

    One answer is that we did what the alcohol industry was spending billions of dollars persuading us to do. In the ’90s, makers of distilled liquor ended their self-imposed ban on TV advertising. They also developed new products that might initiate nondrinkers (think sweet premixed drinks like Smirnoff Ice and Mike’s Hard Lemonade). Meanwhile, winemakers benefited from the idea, then in wide circulation and since challenged, that moderate wine consumption might be good for you physically. (As Iain Gately reports in Drink: A Cultural History of Alcohol, in the month after 60 Minutes ran a widely viewed segment on the so-called French paradox—the notion that wine might explain low rates of heart disease in France—U.S. sales of red wine shot up 44 percent.)

    But this doesn’t explain why Americans have been so receptive to the sales pitches. Some people have argued that our increased consumption is a response to various stressors that emerged over this period. (Gately, for example, proposes a 9/11 effect—he notes that in 2002, heavy drinking was up 10 percent over the previous year.) This seems closer to the truth. It also may help explain why women account for such a disproportionate share of the recent increase in drinking.

    Although both men and women commonly use alcohol to cope with stressful situations and negative feelings, research finds that women are substantially more likely to do so. And they’re much more apt to be sad and stressed out to begin with: Women are about twice as likely as men to suffer from depression or anxiety disorders—and their overall happiness has fallen substantially in recent decades.

    In the 2013 book Her Best-Kept Secret, an exploration of the surge in female drinking, the journalist Gabrielle Glaser recalls noticing, early this century, that women around her were drinking more. Alcohol hadn’t been a big part of mom culture in the ’90s, when her first daughter was young—but by the time her younger children entered school, it was everywhere: “Mothers joked about bringing their flasks to Pasta Night. Flasks? I wondered, at the time. Wasn’t that like Gunsmoke?” (Her quip seems quaint today. A growing class of merchandise now helps women carry concealed alcohol: There are purses with secret pockets, and chunky bracelets that double as flasks, and—perhaps least likely of all to invite close investigation—flasks designed to look like tampons.)

    Glaser notes that an earlier rise in women’s drinking, in the 1970s, followed increased female participation in the workforce—and with it the particular stresses of returning home, after work, to attend to the house or the children. She concludes that women are today using alcohol to quell the anxieties associated with “the breathtaking pace of modern economic and social change” as well as with “the loss of the social and family cohesion” enjoyed by previous generations. Almost all of the heavy-drinking women Glaser interviewed drank alone—the bottle of wine while cooking, the Baileys in the morning coffee, the Poland Spring bottle secretly filled with vodka. They did so not to feel good, but to take the edge off feeling bad.

    Men still drink more than women, and of course no demographic group has a monopoly on either problem drinking or the stresses that can cause it. The shift in women’s drinking is particularly stark, but unhealthier forms of alcohol use appear to be proliferating in many groups. Even drinking in bars has become less social in recent years, or at least this was a common perception among about three dozen bartenders I surveyed while reporting this article. “I have a few regulars who play games on their phone,” one in San Francisco said, “and I have a standing order to just refill their beer when it’s empty. No eye contact or talking until they are ready to leave.” Striking up conversations with strangers has become almost taboo, many bartenders observed, especially among younger patrons. So why not just drink at home? Spending money to sit in a bar alone and not talk to anyone was, a bartender in Columbus, Ohio, said, an interesting case of “trying to avoid loneliness without actual togetherness.”

    Last August, the beer manufacturer Busch launched a new product well timed to the problem of pandemic-era solitary drinking. Dog Brew is bone broth packaged as beer for your pet. “You’ll never drink alone again,” said news articles reporting its debut. It promptly sold out. As for human beverages, though beer sales were down in 2020, continuing their long decline, Americans drank more of everything else, especially spirits and (perhaps the loneliest-sounding drinks of all) premixed, single-serve cocktails, sales of which skyrocketed.

    Not everyone consumed more alcohol during the pandemic. Even as some of us (especially women and parents) drank more frequently, others drank less often. But the drinking that increased was, almost definitionally, of the stuck-at-home, sad, too-anxious-to-sleep, can’t-bear-another-day-like-all-the-other-days variety—the kind that has a higher likelihood of setting us up for drinking problems down the line. The drinking that decreased was mostly the good, socially connecting kind. (Zoom drinking—with its not-so-happy hours and first dates doomed to digital purgatory—was neither anesthetizing nor particularly connecting, and deserves its own dreary category.)

    As the pandemic eases, we may be nearing an inflection point. My inner optimist imagines a new world in which, reminded of how much we miss joy and fun and other people, we embrace all kinds of socially connecting activities, including eating and drinking together—while also forswearing unhealthy habits we may have acquired in isolation.

    But my inner pessimist sees alcohol use continuing in its pandemic vein, more about coping than conviviality. Not all social drinking is good, of course; maybe some of it should wane, too (for example, some employers have recently banned alcohol from work events because of concerns about its role in unwanted sexual advances and worse). And yet, if we use alcohol more and more as a private drug, we’ll enjoy fewer of its social benefits, and get a bigger helping of its harms.

    Let’s contemplate those harms for a minute. My doctor’s nagging notwithstanding, there is a big, big difference between the kind of drinking that will give you cirrhosis and the kind that a great majority of Americans do. According to an analysis in The Washington Post some years back, to break into the top 10 percent of American drinkers, you needed to drink more than two bottles of wine every night. People in the next decile consumed, on average, 15 drinks a week, and in the one below that, six drinks a week. The first category of drinking is, stating the obvious, very bad for your health. But for people in the third category or edging toward the second, like me, the calculation is more complicated. Physical and mental health are inextricably linked, as is made vivid by the overwhelming quantity of research showing how devastating isolation is to longevity. Stunningly, the health toll of social disconnection is estimated to be equivalent to the toll of smoking 15 cigarettes a day.

    To be clear, people who don’t want to drink should not drink. There are many wonderful, alcohol-free means of bonding. Drinking, as Edward Slingerland notes, is merely a convenient shortcut to that end. Still, throughout human history, this shortcut has provided a nontrivial social and psychological service. At a moment when friendships seem more attenuated than ever, and loneliness is rampant, maybe it can do so again. For those of us who do want to take the shortcut, Slingerland has some reasonable guidance: Drink only in public, with other people, over a meal—or at least, he says, “under the watchful eye of your local pub’s barkeep.”

    After more than a year in relative isolation, we may be closer than we’d like to the wary, socially clumsy strangers who first gathered at Göbekli Tepe. “We get drunk because we are a weird species, the awkward losers of the animal world,” Slingerland writes, “and need all of the help we can get.” For those of us who have emerged from our caves feeling as if we’ve regressed into weird and awkward ways, a standing drinks night with friends might not be the worst idea to come out of 2021.

    This article appears in the July/August 2021 print edition with the headline “Drinking Alone.”


    Neanderthals, Denisovans, humans genetically closer than polar bears, brown bears

    June 3 (UPI) -- Several genomic studies have previously shown that Neanderthals, Denisovans and anatomically modern humans interbred. Now, new research suggests the trio of populations were so genetically similar that they most certainly produced healthy, fertile hybrids.

    In a new study, published Wednesday in the journal Proceedings of the Royal Society B, scientists quantified the genetic differences between early humans and their closest relatives, Neanderthals and Denisovans.

    The analysis showed the genetic distance values separating the three human species were smaller than the differences between modern animal species -- like brown bears and polar bears -- known to produce healthy hybrid offspring.

    "Our desire to categorize the world into discrete boxes has led us to think of species as completely separate units," Greger Larson, director of the Palaeogenomics and Bio-Archaeology Research Network at the University of Oxford, said in a news release. "Biology does not care about these rigid definitions, and lots of species, even those that are far apart evolutionarily, swap genes all the time."

    "Our predictive metric allows for a quick and easy determination of how likely it is for any two species to produce fertile hybrid offspring," Larson said. "This comparative measure suggests that humans and Neanderthals and Denisovans were able to produce live fertile young with ease."

    For the study, researchers looked at relationships between the fertility of modern animal hybrids and the genetic differences between the two hybridizing species. The analysis showed that species that were genetically more similar were more likely to produce fertile offspring.

    The researchers also determined that there is a threshold of genetic distance below which hybrid offspring are likely to be fertile. When scientists used the results of their analysis to measure the relative genetic differences between Neanderthals, Denisovans and anatomically modern humans, they found that the three human species fell well within that threshold.
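
    To make the shape of that argument concrete, here is a minimal sketch, in Python, of the distance-versus-threshold logic the study describes. It is not the authors’ model, and every number in it is a hypothetical placeholder; the actual study calibrated its threshold against genomic data from many known hybridizing animal pairs.

        # Toy sketch of the study's distance-vs-threshold logic.
        # All numbers below are hypothetical placeholders, not values from the paper.

        pairwise_distance = {
            ("modern human", "Neanderthal"): 0.16,  # placeholder distance
            ("modern human", "Denisovan"): 0.18,    # placeholder distance
            ("polar bear", "brown bear"): 0.42,     # placeholder distance
        }

        # Placeholder cutoff: pairs closer than this are predicted
        # to produce fertile hybrid offspring.
        FERTILITY_THRESHOLD = 0.50

        for (a, b), dist in pairwise_distance.items():
            verdict = ("fertile hybrids predicted" if dist < FERTILITY_THRESHOLD
                       else "fertility unlikely")
            print(f"{a} x {b}: distance = {dist:.2f} -> {verdict}")

    The point of the sketch is only directional: the study’s claim is that the measured human, Neanderthal and Denisovan distances sit on the fertile side of a threshold that known hybridizers such as polar and brown bears also satisfy.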

    Authors of the new study suggest their methodology can be used to determine the likelihood that any two species would produce healthy, fertile offspring. Such information could help zookeepers decide which animals to house together.

    "Many decisions in conservation biology have been made on the basis that related organisms that produce hybrids in captivity should be prevented from doing so," said Richard Benjamin Allen, co-first author of the study.

    "Such an approach has not considered the significant role that hybridization has played in evolution in the wild, especially in populations under the threat of extinction," Allen said. "Our study can be used to inform future conservation efforts of related species where hybridization or surrogacy programs could be viable alternatives."


    Special relationship

    Previous studies have shown that many animals, from songbirds to dolphins, use the subcortex to process emotional cues, and the cortex to analyze more complex learned signals—even though they can’t talk. Zebras, for instance, can eavesdrop on the emotions in other herbivore species’ calls to learn if predators are nearby.

    It’s likely that human language evolved from such cues, recruiting the same neurological systems to develop speech, notes Terrence Deacon, a neuroanthropologist at the University of California, Berkeley.

    And as domesticated animals that have evolved alongside humans for the past 10,000 years, dogs make special use of this ancient ability to process human emotions, Andics adds.

    “It helps explain why dogs are so successful at partnering with us”—and at times manipulating us with those soulful eyes.