Nov 18, 2008 - Inside 23andMe

Solution to Mutation Rate Discrepancy

This guest post is by Brenna Henn, a doctoral student in Stanford University's Department of Anthropology and a consultant for 23andMe. Brenna studies human evolution using genetic information. Her interests include the origin of modern humans, migration patterns among African groups, and genetic models of demography.

One of the reasons genetics is such a powerful tool for telling us about the past is that mutations that accumulate over the generations can be used as a clock, allowing scientists to calculate when different events occurred in the prehistoric past.

By counting the number of mutations differentiating one person from another, we can determine when in the past they shared a common ancestor. Each mutation is like a genetic tick of the clock, equivalent to a certain period of time. But figuring out how frequently mutations occur, and thus how long each tick of the genetic clock takes, is a major enterprise.
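
To make the clock idea concrete, here is a minimal sketch in Python. The mutation rate, sequence length and difference count below are purely illustrative, not values from the study: given the number of differences between two sequences, the time back to their common ancestor follows from dividing by twice the rate, since mutations accumulate along both branches leading back to that ancestor.

```python
# Minimal molecular-clock sketch. The rate is hypothetical, for illustration only.
MUTATIONS_PER_SITE_PER_YEAR = 1.0e-8

def years_to_common_ancestor(num_differences, sequence_length,
                             rate=MUTATIONS_PER_SITE_PER_YEAR):
    """Estimate time to the most recent common ancestor of two sequences.

    Differences accumulate along both branches back to the ancestor,
    hence the factor of 2 in the denominator.
    """
    differences_per_site = num_differences / sequence_length
    return differences_per_site / (2 * rate)

# Example: 30 differences over a ~16,500 bp mitochondrial genome
print(round(years_to_common_ancestor(30, 16_500)))  # roughly 91,000 years
```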

In a new paper appearing this month in Molecular Biology and Evolution, scientists from 23andMe propose that the genetic clock may have started running faster after the Ice Age ended about 15,000 years ago. If so, they may have resolved a discrepancy that has perplexed researchers for a number of years.

DNA mutation rates are estimated in two different ways. Geneticists can either count the number of mutations that occur in one generation (the changes between parents and their children), or they can count the number of differences between human and chimpanzee DNA and calculate a mutation rate based on the number of years since the two species diverged — about 6 million years.

It turns out that mitochondrial DNA mutation rates measured by these two methods differ almost 10-fold. This difference is seen not only in humans, but also in species such as birds and fish. The discrepancy hints at the possibility that what we are measuring might be more complicated than we previously thought.
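
As a rough illustration of the two calibrations, here is a short sketch. The figures are invented, chosen only so that the gap between the two estimates comes out near the ten-fold discrepancy just described; they are not data from the study.

```python
# Hypothetical numbers for illustration only; not data from the study.

# Phylogenetic calibration: divide human-chimp sequence divergence by
# twice the split time, since mutations accumulate on both lineages.
human_chimp_divergence_per_site = 0.12      # assumed mtDNA divergence
split_time_years = 6_000_000
phylogenetic_rate = human_chimp_divergence_per_site / (2 * split_time_years)

# Pedigree calibration: new mutations observed between parents and
# children, divided by the sites screened and by the generation time.
new_mutations_observed = 5                  # assumed count
sites_screened = 2_000_000
generation_time_years = 25
pedigree_rate = new_mutations_observed / sites_screened / generation_time_years

print(f"phylogenetic rate: {phylogenetic_rate:.1e} per site per year")  # 1.0e-08
print(f"pedigree rate:     {pedigree_rate:.1e} per site per year")      # 1.0e-07
```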

Working with Marcus Feldman of Stanford University, 23andMe scientists set out to find a new way of estimating the human DNA mutation rate: comparing the genetic diversity of lineages derived from the first human inhabitants of various continents and islands (such as Polynesia) with archaeological finds that give precise information about when those colonists arrived.

We focused our studies on mitochondrial DNA because a large number of mitochondrial sequences are available in databases, and because the mutation rate of mitochondrial DNA is known to be higher than that of the DNA found in the 23 pairs of chromosomes.
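
In the same spirit, a dated founder event can calibrate the clock directly. Here is a hypothetical sketch; the settlement date, diversity figure and the simple two-branch assumption are all invented for illustration, not taken from the paper.

```python
def rate_from_founding_event(mean_pairwise_differences, sequence_length,
                             colonization_years_ago):
    """Mutation rate per site per year implied by a dated founder event.

    Crudely assumes the diversity among the colonists' descendants has
    accumulated since their arrival, along two branches back to a
    founding ancestor.
    """
    diversity_per_site = mean_pairwise_differences / sequence_length
    return diversity_per_site / (2 * colonization_years_ago)

# Invented example: an island settled roughly 3,000 years ago, with a mean
# of 3 pairwise differences across a ~16,500 bp mitochondrial genome.
print(f"{rate_from_founding_event(3.0, 16_500, 3_000):.1e} per site per year")
```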

What we found was surprising. Younger lineages (i.e. groups of people that have more recent common ancestors) had higher rates of mutation than older lineages, meaning that the molecular clock is not constant for human mtDNA: it slows down as lineages get older.

But why?

In our study the estimated mutation rates dropped off quickly around the end of the Ice Age — 15,000-20,000 years ago — suggesting that population history may play an important role in explaining the reduction in genetic diversity, and thus the changing pace of the DNA clock.

Prior to 20,000 years ago, humans lived in small hunter-gatherer groups that likely experienced boom and bust cycles: periodic climatic shifts or food shortages probably caused frequent and sudden population collapses. When populations decrease in size, or “bottleneck,” some genetic lineages are lost, which decreases the genetic diversity of the population. Since mutation rates are estimated from genetic diversity, the reduced diversity in these older lineages makes their estimated mutation rates appear slower.

After the climate warmed about 15,000 years ago and humans subsequently invented agriculture, populations grew dramatically. This resulted in more stable genetic diversity and faster mutation rate estimates.

More research needs to be done before we can determine whether population history or other factors, such as natural selection changing mutation rates, are primarily responsible for the human molecular clock slowdown. For example, we could simulate genetic diversity changes when a population size decreases and see if this matches the empirical mutation rate estimates.
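
In the spirit of the simulation suggested above, here is a toy forward simulation of maternal mtDNA inheritance through a bottleneck. All parameters are arbitrary and the model is deliberately crude; it simply shows that a sharp reduction in population size strips out standing lineage diversity, which is the effect described above.

```python
import random

random.seed(1)

def simulate_diversity(pop_size, bottleneck_size, generations,
                       bottleneck_at, mutation_prob=1e-3):
    # Start with standing diversity: every founder carries one private mutation.
    population = [{i} for i in range(pop_size)]
    next_mutation = pop_size
    for gen in range(generations):
        size = bottleneck_size if gen == bottleneck_at else pop_size
        new_population = []
        for _ in range(size):
            child = set(random.choice(population))  # inherit a random mother's mtDNA
            if random.random() < mutation_prob:
                child.add(next_mutation)            # add a brand-new mutation
                next_mutation += 1
            new_population.append(child)
        population = new_population
    # Summarize diversity as the mean number of pairwise differences.
    n = len(population)
    total = sum(len(population[i] ^ population[j])
                for i in range(n) for j in range(i + 1, n))
    return total / (n * (n - 1) / 2)

print("with bottleneck:   ", simulate_diversity(200, 10, 400, bottleneck_at=200))
print("without bottleneck:", simulate_diversity(200, 200, 400, bottleneck_at=200))
```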

Using our new data, which trace the slowdown in the mutation rate back through time, scientists can choose the rate that is most appropriate for dating a particular population event.

To date, the timing of most population events in human evolutionary genetics has been estimated using a rate close to the slower one we see for older lineages, those predating the end of the Ice Age. So our understanding of the genetic history of early human evolution shouldn’t change very much. But the timing of the splits between mitochondrial lineages associated with relatively recent events, such as agricultural expansions, may need revision. Using our newly calibrated mitochondrial mutation rates, researchers will be better able to correlate genetic, archaeological and linguistic data, leading to a more accurate understanding of human prehistory.
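
As a back-of-the-envelope illustration (with invented numbers) of why recent dates shrink: an estimated date scales inversely with the assumed rate, so re-dating the same diversity with a faster post-Ice Age rate pulls the event closer to the present.

```python
diversity_per_site = 1.2e-4   # hypothetical within-lineage diversity
slow_rate = 1.0e-8            # hypothetical "older lineage" rate
fast_rate = 3.0e-8            # hypothetical recalibrated "younger lineage" rate

print(diversity_per_site / (2 * slow_rate))   # 6000.0 years ago
print(diversity_per_site / (2 * fast_rate))   # 2000.0 years ago
```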
