Female infanticide disproves sociobiology?

Wikipedia says “Sociobiology is a field of biology that aims to examine and explain social behavior in terms of evolution.”

One of the things that we learned about on our Northwest Passage cruise was the historical practice of female infanticide among Eskimos/Inuit. When food got scarce, female infants were at risk. The explanation given in museums and by guides was that boys would grow up into adult male hunters who could take care of their elderly parents.

From The North West Passage Exploration Anthology (a report by John Franklin from his 1825 expedition):

The difficulty of procuring nourishment frequently induces the women of this tribe to destroy their female children. Two pregnant women of the party then at the fort, made known their intention of acting on this inhuman custom, though Mr. Dease threatened them with our heaviest displeasure if they put it into execution: we learned that, after they left us, one actually did destroy her child; the infant of the other woman proved to be a boy.

If the goal of an animal is propagating his/her/zir genes, this does not seem to make sense. A typical human female reproduces, thus passing on her parents’ genes, while a substantial fraction of human males leave no offspring at all (polygamy is the natural human state, it seems; see “The era of monogamous long-term marriage was a brief interruption” within Real World Divorce).

The period of life in which the son would be useful as a hunter is unlikely to start until the parents are beyond reproductive age (at which point whether they live or die has minimal effect on their own reproductive success).
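The arithmetic behind this objection can be made concrete. Here is a minimal sketch, with invented survival and reproduction numbers (nothing below comes from the post or from Franklin’s report); the parameters are chosen to reflect the standard observation that in a polygamous population the mean number of offspring per male must equal the mean per female, even though the variance is much higher for males:

    # Expected grandchildren from raising a daughter vs. a son, under
    # invented (illustrative) parameters for a polygamous population.
    # Many males leave nothing; a few leave a lot.

    P_DAUGHTER_REPRODUCES = 0.9   # most surviving women have children
    DAUGHTER_BROOD = 4            # average children per reproducing woman

    P_SON_REPRODUCES = 0.45       # many men are shut out entirely
    SON_BROOD = 8                 # winners father several women's children

    def expected_grandchildren(p_reproduces, brood):
        """Expected grandchildren contributed by one surviving child."""
        return p_reproduces * brood

    print("via daughter:", expected_grandchildren(P_DAUGHTER_REPRODUCES, DAUGHTER_BROOD))  # 3.6
    print("via son:     ", expected_grandchildren(P_SON_REPRODUCES, SON_BROOD))            # 3.6

    # The means are equal by construction, so favoring sons only pays off
    # genetically if the son's hunting raises the survival or mating
    # success of his relatives, not merely his parents' post-reproductive
    # comfort.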

Readers: Is the existence of female infanticide across a range of cultures a simple proof that sociobiology is wrong?

Full post, including comments

Nobel-winning physicists discourage young people from physics as a career

From a CNN article on the latest Nobel Prize in Physics:

Peebles, who is Albert Einstein Professor of Science at Princeton University, had a message for budding scientists.

“My advice to young people entering science: you should do it for the love of science,” he said at a press conference following the announcement.

“You should enter science because you are fascinated by it.”

In other words, “Don’t do it for the paycheck or the working conditions, as you might for most other career choices.” (Nobody says “You should train to be a dental hygienist because you are fascinated by teeth”; the stress will be on the $75k/year median wage following a two-year degree and on the flexibility to work anywhere in the U.S. and any number of days per week.)

[Apropos of nothing, CNN goes on to note

In 2018 the Nobel Prize in Physics was awarded to a woman for the first time in 55 years, and for only the third time in its history. Donna Strickland, a Canadian physicist, was awarded last year’s prize jointly with Gérard Mourou, from France, for their work on generating high-intensity, ultra-short optical pulses. They shared the award with an American, Arthur Ashkin, who at 96 becomes the oldest Nobel Laureate, for developing “optical tweezers.”

The preceding year’s Nobel had nothing to do with astrophysics, but it continues to be newsworthy because of the gender ID of one of the winners? (If, indeed, Dr. Strickland identified as a woman at the time of the research or award, is there any evidence that Dr. Strickland continues to identify as a woman?)]

Related:

  • “Losing the Nobel Prize” on careers in science: https://philip.greenspun.com/blog/2018/06/14/losing-the-nobel-prize-on-careers-in-science/ : “There is a fierce competition that begins the day you declare yourself a physics major. First, among your fellow undergraduates, you spar for top ranking in your class. This leads to the next battle: becoming a graduate student at a top school. Then, you toil for six to eight years to earn a postdoc job at another top school. And finally, you hope, comes a coveted faculty job, which can become permanent if you are privileged enough to get tenure. Along the way, the number of peers in your group diminishes by a factor of ten at each stage, from hundreds of undergraduates to just one faculty job becoming available every few years in your field. Then the competition really begins, for you compete against fellow gladiators honed in battle just as you are. You compete for the scarcest resource in science: money.”
  • https://philip.greenspun.com/blog/2018/09/20/75-percent-chance-of-career-failure-considered-in-a-positive-light/
  • “Women in Science” (compare to medicine, for example)
  • “The More Gender Equality, the Fewer Women in STEM” (Atlantic): “In countries that empower women, they are less likely to choose math and science professions.”
  • An academic career need not be entirely bleak: “College professor spends nearly $190K in federal grants on strip clubs, sports bars” (USA Today), regarding an Electrical and Computer Engineering professor who spent tens of thousands of dollars of grant money in strip clubs and then wasted the rest…. A Philadelphia Inquirer story on the same guy says “Once confronted, Nwankpa decided to bare all” and notes that his colleagues had selected him to be department chair

Full post, including comments

Book that explores the biggest issue of our age

The Wizard and the Prophet by Charles Mann, author of the fascinating 1491 (what Elizabeth Warren’s ancestors were up to before Europeans arrived to trash these continents), explores what I think is the biggest issue of our age: can the human population continue to expand without (a) the Earth being transformed into an unpleasant habitat, and (b) humans themselves suffering a Malthusian reduction to a subsistence standard of living?

Mann frames the issue:

The two people were William Vogt and Norman Borlaug. Vogt, born in 1902, laid out the basic ideas for the modern environmental movement. In particular, he founded what the Hampshire College demographer Betsy Hartmann has called “apocalyptic environmentalism”—the belief that unless humankind drastically reduces consumption its growing numbers and appetite will overwhelm the planet’s ecosystems. In best-selling books and powerful speeches, Vogt argued that affluence is not our greatest achievement but our biggest problem. Our prosperity is temporary, he said, because it is based on taking more from Earth than it can give.

Borlaug, born twelve years later, has become the emblem of what has been termed “techno-optimism” or “cornucopianism”—the view that science and technology, properly applied, can help us produce our way out of our predicament. Exemplifying this idea, Borlaug was the primary figure in the research that in the 1960s created the “Green Revolution,” the combination of high-yielding crop varieties and agronomic techniques that raised grain harvests around the world, helping to avert tens of millions of deaths from hunger.

Prophets look at the world as finite, and people as constrained by their environment. Wizards see possibilities as inexhaustible, and humans as wily managers of the planet. One views growth and development as the lot and blessing of our species; others regard stability and preservation as our future and our goal. Wizards regard Earth as a toolbox, its contents freely available for use; Prophets think of the natural world as embodying an overarching order that should not casually be disturbed.

Mann reminds us that the default scientific assumption is that Vogt is correct:

Biologists tell us that all species, if given the chance, overreach, overreproduce, overconsume. Inevitably, they encounter a wall, always to catastrophic effect, and usually sooner rather than later.

Yet, on the other hand, we’ve already apparently cheated what seemed like biological limits. World population has followed the proverbial Silicon Valley hockey stick, and yet people are living better than ever, all around the world (except here in the U.S., according to my Facebook friends, since the Trumpenfuhrer arrived at the Reichstag!). Mann cites estimates that humans currently consume 25-50 percent of the Earth’s “primary production.”

Convinced by politicians that STEM is the path to a glamorous and satisfying career? Here’s a description of Vogt’s 1938 job studying birds:

As a new employee of the Compañía Administradora del Guano, Vogt based his operations on the Chincha Islands, three granitic outposts thirteen miles off the southwest coast of Peru. Named, unexcitingly, North, South, and Central Chincha, they were each less than a mile across, ringed by hundred-foot cliffs, and completely covered in heaps of bird excrement—treeless, gray-white barrens of guano. Atop the guano, shrieking and flapping, were millions of Guanay cormorants, packed together three nests to the square yard, sharp beaks guarding eggs that sat in small guano craters lined by molted feathers. The birds’ wings rustled and thrummed; multiplied by the million, the sound was a vibration in the skull. Fleas, ticks, and biting flies were everywhere. So was the stench of guano. By noon the light was so bright that Vogt’s photographic light meter “often could not measure it.” Vogt’s head and neck were constantly sunburned; later his ears developed precancerous growths. Vogt worked, ate, and slept in the bird guardians’ barracks on North Chincha, remaining offshore for weeks on end (he was also given an apartment in the nearby shore town of Pisco). His quarters on the island were almost without furniture, covered with guano dust, alive with flies and roaches. Birds mated, fought, and raised their offspring on the roof overhead, leaving so much guano that the building had to be shoveled off periodically to avoid collapse.

Vogt’s opinion was that World War II in the Pacific could be explained by “population pressure” in Japan, and that both World Wars in Europe were explained by competition over resources. He was worried about population growth elsewhere:

Vogt, for instance, was loudly scornful of the “unchecked spawning” and “untrammeled copulation” of “backward populations”—people in India, he sneered, breed with “the irresponsibility of codfish.”

The book proves that every American has an idea for a movie (about soil!) and confirms the history in Real World Divorce:

Marjorie instead went home to California, where she apparently met Vogt, fourteen years her senior, who was futilely trying to convince Walt Disney to make an animated movie about soil. It seems evident that they began a relationship. Juana had spent much of the previous two years alone in Latin America, trolling the embassy circuit for Nazi gossip. In June 1945 the couple rendezvoused in California. The marriage collapsed. Two months later Juana went to Reno, Nevada, to obtain one of the city’s famous quick divorces. Early in 1946 Marjorie also went to Reno, and for the same reason. Marjorie filed for divorce from Devereux, appeared before the court, received her decree, and married Bill on the same day: April 4, 1946.

With the help of the new young wife, Vogt publishes Road to Survival in 1948, coinciding with Fairfield Osborn’s Our Plundered Planet. Environmental thinking hasn’t changed significantly in the ensuing 70 years:

Vogt and Osborn were also the first to bring to a wide public a belief that would become a foundation of environmental thought: consumption driven by capitalism and rising human numbers is the ultimate cause of most of the world’s ecological problems, and only dramatic reductions in human fertility and economic activity will prevent a worldwide calamity.

The Earth has a carrying capacity. Humans will breed until this carrying capacity is exceeded. Then wars and famine will break out.

Norman Borlaug also demonstrates what a comfortable career science can be…

Many years later, after he won the Nobel Prize, Norman Borlaug would look back on his first days in Mexico with incredulity. He was supposed to breed disease-resistant wheat in Mexico’s central highlands. Only after he arrived, in September 1944, did he grasp how unsuited he was for the task—almost as unqualified in his own way as Vogt had been when he set sail for Peru. He had never published an article in a peer-reviewed, professional journal. He had never worked with wheat or, for that matter, bred plants of any sort. In recent years he had not even been doing botanical research—since winning his Ph.D., he had spent his time testing chemicals and materials for industry. He had never been outside the United States and couldn’t speak Spanish. The work facilities were equally unprepossessing. Borlaug’s “laboratory” was a windowless tarpaper shack on 160 acres of dry, scrubby land on the campus of the Autonomous University of Chapingo. (“Autonomous” refers to the university’s legal authority to set its curriculum without government interference; Chapingo was the name of the village outside Mexico City where it was located.) And although Borlaug was sponsored by the wealthy Rockefeller Foundation, it could not provide him with scientific tools or machinery; during the Second World War, such equipment was reserved for the military.

Mann points out that being a science writer is a lot more fun than being a scientist: “A prerequisite for a successful scientific career is an enthusiastic willingness to pore through the minutiae of subjects that 99.9 percent of Earth’s population find screamingly dull.”

After decades of poverty and 80-hour work weeks, the Green Revolution ensues. Combine it with the Haber-Bosch process for synthesizing the ammonia used in fertilizer (Mann says that 1 percent of the world’s industrial energy goes to this) and we can have unlimited food, right?
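For reference, the reaction at the heart of Haber-Bosch, fixing atmospheric nitrogen as ammonia (the catalyst and conditions in the annotation are textbook chemistry, not something from Mann):

    \[
    \mathrm{N_2} + 3\,\mathrm{H_2} \;\rightleftharpoons\; 2\,\mathrm{NH_3}
    \qquad \text{(iron catalyst, high temperature and pressure)}
    \]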

Maybe not. “Norman Borlaug: humanitarian hero or menace to society?” (Guardian, 2014):

“Few people at the time considered the profound social and ecological changes that the revolution heralded among peasant farmers. The long-term cost of depending on Borlaug’s new varieties, said eminent critics such as ecologist Vandana Shiva in India, was reduced soil fertility, reduced genetic diversity, soil erosion and increased vulnerability to pests.

Not only did Borlaug’s ‘high-yielding’ seeds demand expensive fertilisers, they also needed more water. Both were in short supply, and the revolution in plant breeding was said to have led to rural impoverishment, increased debt, social inequality and the displacement of vast numbers of peasant farmers,” he wrote.

The political journalist Alexander Cockburn was even less complimentary: “Aside from Kissinger, probably the biggest killer of all to have got the peace prize was Norman Borlaug, whose ‘green revolution’ wheat strains led to the death of peasants by the million.”

Mann does not cover these criticisms of Borlaug’s work. Even with Mann’s 100-percent positive perspective on the frankengrains, he admits that feeding an increased human population with the latest tech comes at the cost of destroying animals in the ocean:

Hard on the heels of the gains were the losses. About 40 percent of the fertilizer applied in the last sixty years wasn’t assimilated by plants; instead, it washed away into rivers or seeped into the air in the form of nitrous oxide. Fertilizer flushed into rivers, lakes, and oceans is still fertilizer: it boosts the growth of algae, weeds, and other aquatic organisms. When these die, they rain to the ocean floor, where they are consumed by microbes. So rapidly do the microbes grow on the increased food supply that their respiration drains the oxygen from the lower depths, killing off most life. Where agricultural runoff flows, dead zones flourish. Nitrogen from Middle Western farms flows down the Mississippi to the Gulf of Mexico every summer, creating an oxygen desert that in 2016 covered almost 7,000 square miles. The next year a still larger dead zone—23,000 square miles—was mapped in the Bay of Bengal.

How about organics? Maybe that is the answer:

[Organic farming promoter Jerome] Rodale died in 1971—bizarrely, on a television talk show, suffering a heart attack minutes after declaring “I never felt better in my life!” and offering the host his special asparagus boiled in urine.

The Gates Foundation will enable the Earth to support 50 billion people by engineering rice that accomplishes C4 photosynthesis:

Barely 3 percent of the flowering plants are C4, but they are responsible for about a quarter of all the photosynthesis on land. The impact of C4 is evident to anyone who has looked at a recently mowed lawn. Within a few days of mowing, the crabgrass in the lawn springs up, towering over the rest of the lawn (typically bluegrass or fescue in cool areas). Fast-growing crabgrass is C4; lawn grass is ordinary photosynthesis. The same is true for wheat and maize. Plant them on the same day in the same place and soon the maize will overshadow the wheat—maize is C4, wheat is not. In addition to growing faster, C4 plants also need less water and fertilizer, because they don’t waste water on reactions that lead to excess oxygen, and because they don’t have to make as much rubisco.

One of these in-between species is maize: its main leaves are C4, whereas the leaves around the cob are a mix of C4 and ordinary photosynthesis. If two forms of photosynthesis can be encoded from the same genome, they cannot be that far apart. Which in turn implies that people equipped with the tools of molecular biology might be able to transform one into another. In the botanical equivalent of a moonshot, an international consortium of almost a hundred agricultural scientists is working to convert rice into a C4 plant—a rice that could grow faster, require less water and fertilizer, withstand higher temperatures, and produce more grain. Funded largely by the Bill & Melinda Gates Foundation…

Full post, including comments

Factory farms may be killing coral reefs, not a warming planet

Interesting article from the nerds at phys.org:

A study published in the international journal Marine Biology reveals what’s really killing coral reefs. With 30 years of unique data from Looe Key Reef in the lower Florida Keys, researchers from Florida Atlantic University’s Harbor Branch Oceanographic Institute and collaborators have discovered that the problem of coral bleaching is not just due to a warming planet, but also a planet that is simultaneously being enriched with reactive nitrogen from multiple sources.

Improperly treated sewage, fertilizers and top soil are elevating nitrogen levels, which are causing phosphorus starvation in the corals, reducing their temperature threshold for “bleaching.” These coral reefs were dying off long before they were impacted by rising water temperatures. This study represents the longest record of reactive nutrients and algae concentrations for coral reefs anywhere in the world.

In other words, the same factory/industrial farming that creates massive dead zones in oceans worldwide (including the Gulf of Mexico) may be at least partially responsible for killing the coral reefs, rather than rising sea temperature alone.

Will Earth support a human population of 10 billion or more? Yes, but maybe without any animals, including coral.

Full post, including comments

Science says that success cannot be inherited genetically

An enduring source of amusement is watching people who have a scientific perspective (and oftentimes actual training in science) throw rocks at the religious for being irrationally dogmatic.

Part of the dogma of the politically righteous today in the U.S. is that success cannot be inherited genetically. The children of the rich tend to be rich, but that is because they got cash from their parents (Exhibit A: Donald Trump!), not because they have personal characteristics that resemble those of their successful parents.

When this has been carefully studied, e.g., in The Son Also Rises, it turns out that success does behave like other genetically inherited characteristics. The child of successful parents has roughly the same chance of becoming successful regardless of the number of siblings who are present to dilute any financial inheritance.
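A rough sketch of the pattern Clark reports: underlying social status behaving like a heritable trait that regresses only slowly toward the mean. The persistence coefficient below is in the range Clark estimates from surname data (roughly 0.7–0.8); everything else is invented for illustration:

    import random

    # Toy model of The Son Also Rises: status_child = b * status_parent + noise.
    # b ~ 0.75 is in the range Clark estimates; the noise scale keeps the
    # population variance of status constant across generations
    # (1 = b**2 + noise_variance for unit-variance status).

    B = 0.75
    NOISE_SD = (1 - B**2) ** 0.5

    def child_status(parent_status):
        return B * parent_status + random.gauss(0, NOISE_SD)

    # Follow a family that starts two standard deviations above the mean.
    random.seed(1)
    status = 2.0
    for generation in range(1, 6):
        status = child_status(status)
        print(f"generation {generation}: status = {status:+.2f}")

    # Expected status decays as 2.0 * 0.75**n: still well above average
    # for several generations, with or without a cash inheritance, and
    # independent of how many siblings split the money.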

She Has Her Mother’s Laugh: The Powers, Perversions, and Potential of Heredity by Carl Zimmer is a great illustration of this dogma. The book goes on at length regarding things that can be inherited genetically. But then we get an economics lesson on inequality:

Raj Chetty, a Stanford economist, has estimated that Americans born in 1940 had a 90 percent chance of making more money than their parents at age thirty. But Chetty and his colleagues have found that those odds then steadily dropped. Americans born in 1984 had only a 50 percent chance of making more than their parents. The shift was not the result of the United States suddenly running out of money. It’s just that wealthy Americans have been taking much of the extra money the economy has generated in recent decades. Chetty’s research suggests that if the recent economic growth in the United States was distributed more broadly, most of the fading he has found would disappear. “The rise in inequality and the decline in absolute mobility are closely linked,” he and his colleagues reported in 2017.

Inheritance has helped push open that gulf. About two-thirds of parental income differences among Americans persist into the next generation. Economists have found that American children who are born to parents in the ninetieth percentile of earners will grow up to make three times more than children of the tenth percentile.

This inheritance is not simply what parents leave in their wills but the things that they can buy for their children as they grow up. In the United States, affluent parents can afford a house in a good public school district, or even private school tuition. They can pay for college test prep classes to increase the odds their children will get into good colleges. And if they do get in, their parents can cover more of their college tuitions. Poor parents have fewer means to prepare their children to get into college. Even if their children do get accepted, they have fewer funds, and they’re more vulnerable to layoffs or medical bankruptcy. Their children may graduate saddled with steep college debts or drop out before getting a degree.

The gifts that children inherit can keep coming well into adulthood. Parents may help cover the cost of law school, or write a check to help out with a septic tank that failed just after their children bought their first house. Protected from catastrophes that can wipe out bank accounts, young adults from affluent families can get started sooner on building their own wealth.

Inheritance also goes a long way to explain the gap in wealth between races in the United States. In 2013, the median white American household had thirteen times the wealth of the median black household, and ten times that of the median Latino household.

Whites are five times as likely to receive major gifts from relatives, and when they do, their value is much greater. These gifts can, among other things, allow white college students to graduate with much less debt than blacks or Latinos. And the effects of these inheritances have compounded through the generations as blacks and Latinos were left outside the wealth feedback loop that benefited white families.

In looking at how the children of those in the top 10 percent do, the author does not consider the possibility that the parents reached the top 10 percent due to genetic fitness for the current economic environment (e.g., a fondness for sitting at a desk looking at numbers on a computer screen!). So it is our cruel economic system alone that dooms children of the least successful parents to mediocre incomes. If a third generation of a family whose first and second generations were on welfare (public housing, Medicaid, food stamps, and Obamaphone) elects to continue the welfare lifestyle, this is because the parents and grandparents couldn’t provide an inheritance.

It is not the beliefs that are interesting so much as the fact that the author can’t see that this dogma conflicts with all of the science that he presented in the previous pages. Perhaps the UC Davis econ professor who wrote The Son Also Rises got it wrong, but what’s interesting is that apparently nobody dares to consider the possibility that he got it right.

Full post, including comments

Science is Settled: one characteristic cannot be inherited genetically

She Has Her Mother’s Laugh: The Powers, Perversions, and Potential of Heredity by Carl Zimmer says that almost everything is heritable and that genetics is the mechanism for heritability. However, there is one big exception… intelligence.

Why does this matter? The book reminds us that the idea that a lack of intelligence will render a person dependent on welfare goes back at least to the 1930s:

The Great Depression was reaching its depths when [Henry Herbert] Goddard came back to Vineland, and he blamed it largely on America’s lack of intelligence: Most of the newly destitute didn’t have the foresight to save enough money. “Half of the world must take care of the other half,” Goddard said.

The idea that intelligence could not be explained by heredity is similarly old:

[British doctor Lionel] Penrose entered the profession as a passionate critic of eugenics, dismissing it as “pretentious and absurd.” In the early 1930s, eugenics still had a powerful hold on both doctors and the public at large—a situation Penrose blamed on lurid tales like The Kallikak Family. While those stories might be seductive, eugenicists made a mess of traits like intelligence. They were obsessed with splitting people into two categories—healthy and feebleminded—and then they would cast the feebleminded as a “class of vast and dangerous dimensions.” Penrose saw intelligence as a far more complex trait. He likened intelligence to height: In every population, most people were close to average height, but some people were taller and shorter than average. Just being short wasn’t equivalent to having some kind of a height disease. Likewise, people developed a range of different mental aptitudes. Height, Penrose observed, was the product of both inherited genes and upbringing. He believed the same was true for intelligence. Just as Mendelian variants could cause dwarfism, others might cause severe intellectual developmental disorders. But that was no reason to leap immediately to heredity as an explanation. “That mental deficiency may be to some extent due to criminal parents’ dwelling ‘habitually’ in slums seems to have been overlooked,” Penrose said. He condemned the fatalism of eugenicists, as they declared “there was nothing to be done but to blame heredity and advocate methods of extinction.”

Even if a country did sterilize every feebleminded citizen, Penrose warned, the next generation would have plenty of new cases from environmental causes. “The first consideration in the prevention of mental deficiency is to consider how environmental influences which are held responsible can be modified,” Penrose declared.

The author finds some cases in which children with severe physical disorders, e.g., PKU, have impaired intelligence. From this he reminds us that it is wrong to believe that “our intelligence is fixed by the genes we inherit.” (Is that truly a comforting idea? I would have been as smart as Albert Einstein, for example, but I watched too much TV as a kid and didn’t work hard enough as an adult?)

We would be as tall as the Dutch if only we were smart enough to build a bigger government (2nd largest welfare state, as a percentage of GDP, is not enough to grow tall!):

The economy of the United States, the biggest in the world, has not protected it from a height stagnation. Height experts have argued that the country’s economic inequality is partly to blame. Medical care is so expensive that millions go without insurance and many people don’t get proper medical care. Many American women go without prenatal care during pregnancy, while expectant mothers in the Netherlands get free house calls from nurses.

How do intelligence distributions change over time, given that environment is supposed to be a huge factor?

Intelligence is also a surprisingly durable trait. On June 1, 1932, the government of Scotland tested almost every eleven-year-old in the country—87,498 all told—with a seventy-one-question exam. The students decoded ciphers, made analogies, did arithmetic. The Scottish Council for Research in Education scored the tests and analyzed the results to get an objective picture of the intelligence of Scottish children. Scotland carried out only one more nationwide exam, in 1947. Over the next couple of decades, the council analyzed the data and published monographs before their work slipped away into oblivion.

Deary, Whalley, and their colleagues moved the 87,498 tests from ledgers onto computers. They then investigated what had become of the test takers. Their ranks included soldiers who died in World War II, along with a bus driver, a tomato grower, a bottle labeler, a manager of a tropical fish shop, a member of an Antarctic expedition, a cardiologist, a restaurant owner, and an assistant in a doll hospital.

The researchers decided to track down all the surviving test takers in a single city, Aberdeen. They were slowed down by the misspelled names and erroneous birth dates. Many of the Aberdeen examinees had died by the late 1990s. Others had moved to other parts of the world. And still others were just unreachable.

But on June 1, 1998, 101 elderly people assembled at the Aberdeen Music Hall, exactly sixty-six years after they had gathered there as eleven-year-olds to take the original test. Deary had just broken both his arms in a bicycling accident, but he would not miss the historic event. He rode a train 120 miles from Edinburgh to Aberdeen, up to his elbows in plaster, to witness them taking their second test.

Back in Edinburgh, Deary and his colleagues scored the tests. Deary pushed a button on his computer to calculate the correlation between their scores as children and as senior citizens. The computer spat back a result of 73 percent. In other words, the people who had gotten relatively low scores in 1932 tended to get relatively low scores in 1998, while the high-scoring children tended to score high in old age.

If you had looked at the score of one of the eleven-year-olds in 1932, you’d have been able to make a pretty good prediction of their score almost seven decades later. Deary’s research prompted other scientists to look for other predictions they could make from childhood intelligence test scores. They do fairly well at predicting how long people stay in school, and how highly they will be rated at work. The US Air Force found that the variation in [general intelligence] among its pilots could predict virtually all the variation in tests of their work performance. While intelligence test scores don’t predict how likely people are to take up smoking, they do predict how likely they are to quit. In a study of one million people in Sweden, scientists found that people with lower intelligence test scores were more likely to get into accidents.

IQ is correlated with longevity:

But Deary’s research raises the possibility that the roots of intelligence dig even deeper. When he and his colleagues started examining Scottish test takers in the late 1990s, many had already died. Studying the records of 2,230 of the students, they found that the ones who had died by 1997 had on average a lower test score than the ones who were still alive. About 70 percent of the women who scored in the top quarter were still alive, while only 45 percent of the women in the bottom quarter were. Men had a similar split. Children who scored higher, in other words, tended to live longer. Each extra fifteen IQ points, researchers have since found, translates into a 24 percent drop in the risk of death.
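Taken literally, the quoted figure implies a multiplicative model of mortality risk. A minimal sketch of the arithmetic, assuming (my assumption, not the book’s) that the 24 percent reduction compounds with each additional fifteen points:

    # If each extra 15 IQ points cuts mortality risk by 24 percent and the
    # effect compounds, relative risk scales as 0.76 ** (delta_iq / 15).

    def relative_mortality_risk(delta_iq):
        return 0.76 ** (delta_iq / 15)

    for delta in (15, 30, 45):
        print(f"+{delta} IQ points: {relative_mortality_risk(delta):.2f}x baseline risk")
    # +15 -> 0.76x, +30 -> 0.58x, +45 -> 0.44x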

The author reports that twins separated at birth have almost identical IQs, despite completely different childhood environments. For most other personal characteristics, this would lead to the conclusion that the trait is mostly heritable. Instead, however, Zimmer points out that if heritability is not 100 percent then it would be a mistake to call something “genetic”:

Intelligence is far from blood types. While test scores are unquestionably heritable, their heritability is not 100 percent. It sits instead somewhere near the middle of the range of possibilities. While identical twins often end up with similar test scores, sometimes they don’t. If you get average scores on intelligence tests, it’s entirely possible your children may turn out to be geniuses. And if you’re a genius, you should be smart enough to recognize your children may not follow suit. Intelligence is not a thing to will to your descendants like a crown.

To bolster the claim that intelligence is not heritable, the book cites examples of children whose mothers were exposed to toxic chemicals during pregnancy, along with examples showing that staying in school for additional years raises IQ (a measure of symbol-processing efficiency).

Here’s an interesting-sounding study:

In 2003, Eric Turkheimer of the University of Virginia and his colleagues gave a twist to the standard studies on twins. To calculate the heritability of intelligence, they decided not to just look at the typical middle-class families who were the subject of earlier studies. They looked for twins from poorer families, too. Turkheimer and his colleagues found that the socioeconomic class determined how heritable intelligence was. Among children who grew up in affluent families, the heritability was about 60 percent. But twins from poorer families showed no greater correlation than other siblings. Their heritability was close to zero.

(Should we believe that heritability is zero because identical twins and ordinary siblings show equally correlated IQs? Earlier in the book, the author describes how hospitals and doctors often misclassify twins.)
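For readers wondering how twin studies turn correlations into a heritability estimate in the first place, the classical (and admittedly crude) approach is Falconer’s formula; the correlations below are invented for illustration:

    # Falconer's formula: identical (MZ) twins share ~100% of segregating
    # DNA, fraternal (DZ) twins ~50% on average, so doubling the gap in
    # their correlations estimates the additive genetic share of variance:
    #   h2 = 2 * (r_MZ - r_DZ)    heritability
    #   c2 = 2 * r_DZ - r_MZ      shared environment
    #   e2 = 1 - r_MZ             unshared environment and measurement error

    def falconer(r_mz, r_dz):
        return {
            "h2": 2 * (r_mz - r_dz),
            "c2": 2 * r_dz - r_mz,
            "e2": 1 - r_mz,
        }

    # Illustrative numbers in the range reported for adult IQ:
    print(falconer(r_mz=0.85, r_dz=0.60))
    # -> h2 0.50, c2 0.35, e2 0.15 (up to float rounding)

    # Turkheimer's result amounts to saying that r_MZ and r_DZ converge in
    # poor families, driving the h2 estimate toward zero, which is why the
    # misclassified-twins caveat above matters.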

Buried in the next section is the fact that this finding has not been straightforward to replicate (“Why Most Published Research Findings Are False”):

If you raise corn in uniformly healthy soil, with the same level of abundant sunlight and water, the variation in their height will largely be the product of the variation in their genes. But if you plant them in a bad soil, where they may or may not get enough of some vital nutrient, the environment will be responsible for more of their differences.

Turkheimer’s study hints that something similar happens to intelligence. By focusing their research on affluent families—or on countries such as Norway, where people get universal health care—intelligence researchers may end up giving too much credit to heredity. Poverty may be powerful enough to swamp the influence of variants in our DNA. In the years since Turkheimer’s study, some researchers have gotten the same results, although others have not. It’s possible that the effect is milder than once thought. A 2016 study pointed to another possibility, however. It showed that poverty reduced the heritability of intelligence in the United States, but not in Europe. Perhaps Europe just doesn’t impoverish the soil of its children enough to see the effect.

Yet there’s another paradox in the relationship between genes and the environment. Over time, genes can mold the environment in which our intelligence develops. In 2010, Robert Plomin led a study on eleven thousand twins from four countries, measuring their heritability at different ages. At age nine, the heritability of intelligence was 42 percent. By age twelve, it had risen to 54 percent. And at age seventeen, it was at 68 percent. In other words, the genetic variants we inherit assert themselves more strongly as we get older. Plomin has argued that this shift happens as people with different variants end up in different environments. A child who has trouble reading due to inherited variants may shy away from books, and not get the benefits that come from reading them.

Poverty in the cruel U.S. crushes children! But, measured in terms of consumption, an American family on welfare actually lives better than a lot of middle-class European families. The author praises the Europeans with their universal health care systems, but 100 percent of poor American children qualify for Medicaid, a system of unlimited health care spending (currently covering roughly 75 million people).

If unlimited taxpayer-funded Medicaid isn’t sufficient to help poor American children reach their genetic potential, maybe early education will? The book quotes a Head Start planner: “The fundamental theoretical basis of Head Start was the concept…”

Full post, including comments

Mothers acquiring cells from babies

She Has Her Mother’s Laugh: The Powers, Perversions, and Potential of Heredity by Carl Zimmer:

In 1996, Lee Nelson proposed that microchimerism might make some mothers sick. With half their genetic material coming from their father, fetal cells might be a confusing mix of the foreign and the familiar. Nelson speculated that being exposed to fetal cells for years on end could lead a woman’s immune system to attack her own tissues. That confusion might be the reason that women are more vulnerable to autoimmune diseases such as arthritis and scleroderma. To test this possibility, Nelson and Bianchi collaborated on an experiment. They picked out thirty-three mothers of sons, sixteen of whom were healthy and seventeen of whom suffered from scleroderma. Nelson and Bianchi found that the women with scleroderma had far more fetal cells from their sons than did the healthy women.

But maybe this can be good?

It’s also possible that being a chimera can be good for your health. Bianchi’s first clue that chimerism might have an upside came in the late 1990s, when she was searching for fetal cells in various organs. She discovered a mother’s thyroid gland packed with fetal cells carrying Y chromosomes. Her gland was badly damaged by goiter, and yet it still managed to secrete normal levels of thyroid hormones. The evidence pointed to a startling conclusion: A fetal cell from her son had wended its way through her body to her diseased thyroid gland. It had sensed the damage there and responded by multiplying into new thyroid cells, regenerating the gland.

What about getting genes from a baby that is not genetically one’s own?

As chimerism rises out of the freak category, it also raises unexpected ethical questions. Somewhere around a thousand children a year are born to surrogate mothers in the United States alone. As Ruth Fischbach and John Loike, two bioethicists at Columbia University, have observed, the rules for surrogacy are based on an old-fashioned notion of pregnancy. They treat people as bundles of genes. As a society, we are comfortable with a woman nourishing another couple’s embryo and then parting ways with it, because she does not share the hereditary bond that a biological mother would. If the pregnancy goes smoothly, the surrogate mother is supposed to leave the experience no different than before the procedure. But Fischbach and Loike observed that a surrogate mother and a baby may end up connected in the most profound way possible. Cells from the fetus may embed themselves throughout her body, perhaps for life. And she may bequeath some of her cells to the child. This is not merely a thought experiment. In 2009, researchers at Harvard did a study on eleven surrogate mothers who carried boys but who never had sons of their own. After the women gave birth, the scientists found Y chromosomes in the bloodstreams of five of them.

More: Read She Has Her Mother’s Laugh: The Powers, Perversions, and Potential of Heredity

Full post, including comments

Human Chimeras

Some more interesting stuff from She Has Her Mother’s Laugh: The Powers, Perversions, and Potential of Heredity by Carl Zimmer… It turns out that Biology 101 contains a lot of simplifications (lies!).

Wikipedia: “A genetic chimerism or chimera … is a single organism composed of cells with distinct genotypes. In animals, this means an individual derived from two or more zygotes, …”

How can this happen to a human and how does that interact with our “science is settled” attitude regarding DNA tests? Zimmer gives some examples:

In 2001, a thirty-year-old woman in Germany discovered she was a chimera while she was trying to get pregnant. For the previous five years, she and her husband had been trying to have a baby. They were fairly certain the problem didn’t lie with her biology, because she had gotten pregnant when she was seventeen and had had regular menstrual cycles ever since. A fertility test revealed that her husband had a low level of viable sperm, and so they made plans for IVF.

As a routine check, the woman’s doctors took blood samples from her and her husband. They looked at the chromosomes in the couple’s cells, to make sure neither would-be parent had an abnormality that would torpedo the IVF procedure. The woman’s chromosomes looked normal—if she were a man. In every white blood cell they inspected, they found a Y chromosome. Given that she had given birth, this was a weird result. And a careful exam revealed that all her reproductive organs were normal.

To get a broader picture of the woman’s cellular makeup, her doctors took samples of her muscle, ovaries, and skin. Unlike her immune cells, none of the cells from these other tissues had a Y chromosome in them. The researchers then carried out a DNA fingerprinting test on the different tissues, looking at the woman’s microsatellites—the repeating sequences that can distinguish people from one another. They found that her immune cells belonged to a different person than her other tissues.

It turned out that the woman had had a twin brother who died only four days after birth. Although he was unable to survive on his own, his cells took over his sister’s blood and lived on within her.

In 2003, a woman in Washington State named Lydia Fairchild had to get a DNA test. Fairchild, who was then twenty-seven, was pregnant with her fourth child, unemployed, and single. To get welfare benefits, state law required that she prove that her children were genetically related both to herself and to their father, Jamie. One day, Fairchild got a call from the Department of Social Services to come in immediately. A DNA test had confirmed that Jamie was the father of the three children. But Fairchild was not their mother.

When Fairchild was rushed to a hospital to deliver her fourth child, a court officer was there to witness the birth. The officer also oversaw a blood draw for a DNA test. The results came back two weeks later. Once again, Fairchild’s DNA didn’t match her child’s. Even though the court officer had witnessed the child’s birth, the court still refused to consider any evidence beyond DNA.

In Boston, a woman named Karen Keegan had developed kidney disease and needed a transplant. To see if her husband or three sons were a match, her doctors drew blood from the whole family in order to examine a set of immune-system genes called HLA. A nurse called Keegan with the results. Not only were her sons not suitable as organ donors, but the HLA genes from two of them didn’t match hers at all. It was impossible for them to be her children. The hospital went so far as to raise the possibility she had stolen her two sons as babies.

Since Keegan’s children were now grown men, she didn’t have to face the terrifying prospect of losing her children as Fairchild did. But Keegan’s doctors were determined to figure out what was going on. Tests on her husband confirmed he was the father of the boys. Her doctors took blood samples from Keegan’s mother and brothers, and collected samples from Keegan’s other tissues, including hair and skin. Years earlier, Keegan had had a nodule removed from her thyroid gland, and it turned out that the hospital had saved it ever since. Her doctors also got hold of a bladder biopsy.

Examining all these tissues, Keegan’s doctors found that she was made up of two distinct groups of cells. They could trace her body’s origins along a pair of pedigrees—not to a single ancestral cell but to a pair. They realized Keegan was a tetragametic chimera, the product of two female fraternal twins. The cells of one twin gave rise to all her blood. They also helped give rise to other tissues, as well as to some of her eggs. One of her sons developed from an egg that belonged to the same cell lineage as her blood. Her other two children developed from eggs belonging to the lineage that arose from the other twin.

When Lydia Fairchild’s lawyer heard about the Keegan case, he immediately demanded that his client get the same test. At first, it looked as if things were going to go against Fairchild yet again. The DNA in her skin, hair, and saliva failed to match her children’s. But then researchers looked at a sample taken from a cervical smear she had gotten years before. It matched, proving she was a chimera after all.
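The tests in these stories compare microsatellite (STR) profiles across tissues. A stripped-down sketch of that comparison, with real locus names but repeat counts invented for illustration:

    # A DNA "fingerprint" is a pair of repeat counts (one allele from each
    # parent) at several microsatellite loci. Two samples from the same
    # genome should match at every locus; a chimera's tissues can carry two
    # different genomes, so profiles from different tissues may disagree.

    BLOOD = {"D3S1358": (15, 17), "vWA": (14, 16), "FGA": (21, 24)}
    SKIN  = {"D3S1358": (15, 18), "vWA": (13, 16), "FGA": (20, 22)}

    def same_genome(profile_a, profile_b):
        return all(sorted(profile_a[locus]) == sorted(profile_b[locus])
                   for locus in profile_a)

    print(same_genome(BLOOD, SKIN))  # False: consistent with chimerism

    # In a maternity dispute like Fairchild's, the same mismatch reads as
    # "not the mother" unless someone thinks to test additional tissues.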

More: Read She Has Her Mother’s Laugh: The Powers, Perversions, and Potential of Heredity

Full post, including comments

Don’t bite tumors off other folks’ faces

She Has Her Mother’s Laugh: The Powers, Perversions, and Potential of Heredity by Carl Zimmer:

Tough as they might be, though, Tasmanian devils were in trouble. A singular epidemic was sweeping across the island, not quite like anything veterinarians had seen before. A devil would develop a fleshy growth in or around its mouth. In a matter of weeks, the growth would balloon, and within a few months, the animal would starve or suffocate. These growths were first observed in 1996 in the northeast corner of Tasmania, and over the next few years they spread over most of the island, killing off tens of thousands of devils. By the early 2000s, the species looked like it might become extinct in a matter of decades, killed by a disease scientists didn’t understand. It would take them years to realize that these devils were chimeras, and that their cancers descended from the cells of a long-dead animal.

The DNA fingerprint from tumor cells didn’t match the healthy cells from the same devil’s body. Instead, they matched cancer cells from devils who died dozens of miles away. It was as if all the sick devils had gotten a cancer transplant from a single tumor.

At some point in the early 1990s, [Elizabeth] Murchison’s research showed, a single Tasmanian devil in the northeast corner of the island got cancer. The mutations may have initially arisen in a Schwann cell. The descendants of that original cancer cell grew into a tumor. During a fight, another devil bit off the tumor. The cancer cells did not end up digested in the attacker’s stomach. Instead, they likely lingered in the devil’s mouth, where they were able to burrow into the cheek lining and work their way metastatically into the other tissues in the devil’s head. The cancer cells continued dividing and mutating, until they broke through the skin of the second devil’s face. At some point, that new victim also got bit, and its own attacker took in the cells from the original cancer. A single carrier could pass the cancer on to several other devils if it was especially aggressive, and thus help accelerate its spread. Passing through host after host, the tumor cells gained about twenty thousand new mutations.


Consider this fair warning!

Full post, including comments

On being mistaken for a lawyer

I was down in Washington, D.C. recently to catch the 88th Joseph Henry lecture run by the Philosophical Society of Washington. There was a dinner beforehand to honor Brian Keating, the speaker, and it was officially black tie. I put on a pinstripe suit and walked through the door of the Cosmos Club, which was apparently hosting some other events that night as well. Here’s the exchange:

  • Cosmos Club hostess: “You’re a lawyer here for the American Bar Association event?”
  • Me: “Now why would you say a thing like that to a person you just met?”

It was the highlight of my evening! (For everyone else, though, it was “The Big Bang and Inflation; Glimpsing the Beginning of Time from the Ends of the Earth” (YouTube).)

Full post, including comments