Questions around personal genomics

The range of topics studied in medical anthropology — or anthropology in general — is so broad that when I choose to write about a wealthy, industrialized-nation issue like personal genomics, I feel a pang of guilt that I’m not posting about malaria and insecticide-resistant mosquitoes or the new UNICEF cholera toolkit. It’s that diversity of topics that keeps me hooked, though, so here we go!

We’re at a point where genetic testing for various disorders is becoming more common and where it’s not prohibitively expensive to get one’s DNA genotyped. This year, I asked for it as a birthday present; in a couple of months I should have my report from 23andme. Since a number of factors went into my decision, I thought I’d use them to frame a post about the individual considerations of genotyping and sequencing.

Why do it?

For me, the answer is pure curiosity, plus the appeal of contributing my data to ongoing research. I have no biological offspring, so I’m not looking for risk factors that might be passed to them. My ethnicity and family tree are known, and while I don’t expect any surprises, neither would they bother me. There’s also the possibility of customized medical treatment in the future, and of knowing right now whether I’m more or less sensitive to some medications.

First world problems

I bounce with nerdy excitement thinking of some things I could learn, even if the knowledge doesn’t have much practical application. For example, the American Red Cross won’t take my blood because of a remote possibility that I was exposed to Creutzfeldt-Jakob disease in the early 1990s. This test won’t tell me whether I was exposed, but it can reveal whether I have an A at both copies of rs17571, a genotype correlated with roughly ten times the risk of contracting the disease if I was. Neato!
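Out of curiosity about the mechanics, here is a minimal Python sketch of how one could look up a single SNP like rs17571 once the raw data arrives. It assumes the commonly used tab-separated raw-data export format (rsid, chromosome, position, genotype) and a hypothetical file name; whether rs17571 actually appears in a given chip’s export, and how the alleles are oriented, is not guaranteed.

```python
# Minimal sketch: look up one SNP in a 23andme-style raw data export.
# Assumes a tab-separated file with columns rsid, chromosome, position, genotype
# and header lines starting with '#'. The file name and the presence of rs17571
# are assumptions for illustration; strand orientation is not checked.

def genotype_at(path, rsid):
    """Return the genotype string (e.g. 'AA', 'AG') for rsid, or None if absent."""
    with open(path) as f:
        for line in f:
            if line.startswith("#"):
                continue  # skip comment/header lines
            fields = line.rstrip("\n").split("\t")
            if len(fields) >= 4 and fields[0] == rsid:
                return fields[3]
    return None

if __name__ == "__main__":
    gt = genotype_at("genome_raw_data.txt", "rs17571")
    if gt == "AA":
        print("Two copies of A at rs17571 (the variant discussed above).")
    elif gt is None:
        print("rs17571 not found in this export.")
    else:
        print(f"Genotype at rs17571: {gt}")
```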

What about privacy?

I’m not prepared to join the ranks of people who have released their genetic information online yet. Micah wrote about the shocked reaction of Henrietta Lacks’ family when the HeLa genome was published, and while my DNA may be uniquely mine, publishing it would also reveal genetic information about my parents and possibly my sibling, which I don’t feel is ethical without their consent. Dan Vorhaus at Genomes Unzipped has a good post about his decision to take his genetic information public and how he approached his family. The potential for the abuse of that information by others is too great, and when I decided to have the test done I weighed the chances of stigma and discrimination, based not only on the genetic knowledge we have now but on what we might learn in the future. I’m too hacker-aware to believe my data will be truly private or anonymous anywhere, so that was part of my calculation. I decided not to consider the more remote possibilities, such as those in the prequalification reading for enrollment in the Personal Genome Project:

More nefarious uses are also possible, if unlikely. DNA is commonly used to identify individuals in criminal investigations. Someone could plant samples of DNA, created from genome data or cell lines, to falsely implicate you in a crime. It’s currently science fiction — but it’s possible that someone could use your DNA or cells for in vitro fertilization to create children without your knowledge or permission, or to create human clones.

What if the test shows something terrible?

A lot of what we know now is limited to correlations and increased chances. I disagree strongly with congressmen who think that someone who learns he has a genetic risk factor will “panic first and ask questions later”; that seems like a reductive and inaccurate view of human behavior, especially among people who are proactively seeking that information. Some very intelligent and knowledgeable people, James Watson and Steven Pinker among them, have chosen not to learn whether they carry a variant associated with a higher risk of Alzheimer’s disease (read Pinker’s great article about his personal genomic explorations). I want to know all I can. Maybe I feel confident because I’ve immersed myself in genealogy recently and have seen a lot of long-lived ancestors, but I think I can handle it. There’s a big difference between increased risk and a certain future, after all.

What’s the anthropological angle?

There are several:

  • This piece from the Yale Journal of Biology and Medicine nicely summarizes Margaret Lock’s talk on this topic at the 2009 Society for Medical Anthropology meeting. From the article, “…Lock believes that it is now time for anthropologists to be more accepting of this reality and even aid the integration of the genomic era by examining the many issues that arise because of its associated activities, such as the social implications of genetic profiling, the ownership and moral dilemmas of engineering hybrid crops and livestock, and the societal perception of the newly formulated concept of man’s own hybridity.”
  • There is a question of ownership and advocacy to explore. Myriad Genetics holds patents on the genes BRCA1 and BRCA2, which have been in the news lately because of Angelina Jolie. These patents prevent other companies from testing for mutations on those genes, keeping the price high. The ACLU took the issue to the Supreme Court last month, and a decision is expected this summer; as our genetic knowledge expands, ensuring access to diagnostic testing seems like an issue we’d care about.
  • There are also the fields of anthropological genetics and, more broadly, molecular anthropology. One of the features 23andme promotes is an estimate of what percentage of a subject’s DNA came from Neanderthals. I’m a bit wary of that one.
  • Finally, there is every aspect of human experience, from avoiding or seeking testing, to how results affect one’s identity and risk perception, to watching for the creation of new social categories based on genetic factors, to whether all of this just reinforces the hegemony of “Western” medicine.

NIMH rejects the DSM-5

In a letter on the National Institute of Mental Health website, Director Thomas Insel announced that NIMH will be “re-orienting its research away from DSM categories.” He comments that the DSM has had reliability but not validity:

In the rest of medicine, this would be equivalent to creating diagnostic systems based on the nature of chest pain or the quality of fever. Indeed, symptom-based diagnosis, once common in other areas of medicine, has been largely replaced in the past half century as we have understood that symptoms alone rarely indicate the best choice of treatment.

I had a moment of hope that perhaps they would be looking beyond reported symptoms to cultural and structural as well as biological factors. Instead, NIMH is launching the Research Domain Criteria (RDoC) project to develop a classification system of its own. Future NIMH support will go to research that cuts across DSM categories and fits the assumptions of RDoC (the emphasis is mine):

  • A diagnostic approach based on the biology as well as the symptoms must not be constrained by the current DSM categories,
  • Mental disorders are biological disorders involving brain circuits that implicate specific domains of cognition, emotion, or behavior,
  • Each level of analysis needs to be understood across a dimension of function,
  • Mapping the cognitive, circuit, and genetic aspects of mental disorders will yield new and better targets for treatment

I’m not a doctor, and I do believe that many illnesses we see as “mental disorders” have a basis in chemical imbalances or biological disease. Still, this sort of institutional bioreductionism worries me. It seems like a quest for magic-bullet solutions rather than an understanding of the complex factors, inside and outside the patient, that contribute to what he or she is experiencing.

The robot in the white coat

The cover story of the March print edition of The Atlantic is “The Robot Will See You Now”, which explores the various ways that technology is poised to disrupt the medical establishment. IBM’s Watson is now working through case histories from Memorial Sloan-Kettering, a step toward a much more sophisticated diagnostic and treatment-recommendation tool than symptom searches in medical databases.

While I don’t undervalue the ability of an experienced doctor to pick up on symptoms that aren’t mentioned as complaints and put them together into a better diagnosis, I found myself nodding when one physician on Watson’s training team mentioned the problem of “anchoring bias”, in which one symptom is given priority and others are ignored or dismissed as unimportant. That can be compounded by other prejudices, such as the implicit and explicit bias against fat people shown in a study released a few months ago (this post from Jezebel describes the situation well), making it more difficult for members of some populations to receive an accurate diagnosis.

The article also discusses the improvements in monitoring technology being pioneered by enthusiasts in the quantified self movement, which I’ve posted about previously. It may soon be possible to wear a monitor that reports regularly and wirelessly to your doctor on whatever measure is being tracked: blood pressure or heart rate, for example. In the same way, a scale or blood glucose meter could share its data every time it’s used.

Potential changes in the career paths of medical workers are considered, and decreased contact between doctors and patients for routine issues seems likely. This could boost the already strong prospects for nurses and physician assistants and, as the article states, allow “everyone to practice at the top of their license.”

This article comes out at the same time as a study of robot-assisted hysterectomies, which found that they are increasing in prevalence despite costing 100-200% more than the standard surgery. There is little evidence of any improvement in outcomes, and the suggestion is that the surgery is spreading because of marketing, not only to the medical establishment but also to patients. We’d be wise to remember that new and high-tech doesn’t always mean better patient care, as Monty Python tried to show us decades ago.


Addition: For a half-hour audio discussion of this topic, take a listen to Talk of the Nation with Ira Flatow from June 1, 2012. Flatow speaks with guests Dr. Eric Topol, author of The Creative Destruction of Medicine; Dr. Reed Tuckson, head of UnitedHealth Group; and Dr. Arnold Relman, former editor-in-chief of the New England Journal of Medicine.

A surprising model for health system improvement

In the debate over how to improve health care in the US, systems in other countries are often held up as models.  You know, countries like Rwanda.

Yes, Rwanda.

A thought-provoking piece yesterday in The Atlantic made that comparison, citing analysis by Dr. Paul Farmer. From the article:

Over the last ten years, Rwanda’s health system development has led to the most dramatic improvements of health in history. Rwanda is the only country in sub-Saharan Africa on track to meet most of the Millennium Development Goals. Deaths from HIV, TB, and malaria have each dropped by roughly 80 percent over the last decade and the maternal mortality ratio dropped by 60 percent over the same period. Even as the population has increased by 35 percent since 2000, the number of annual child deaths has fallen by 63 percent. In turn, these advances bolstered Rwanda’s economic growth: GDP per person tripled to $580, and millions lifted themselves from poverty over the last decade.

One explanation for this dramatic improvement is that the genocide in Rwanda provided a clean slate on which a new program could be built. Farmer and others reject this explanation, however. A recent report focuses more on interdepartmental coordination and central planning with health as a priority. The article is a good summary, and the BMJ research paper with Farmer as lead author has more details.

On ice cream, lead, and murder

On this blog in the past, we have looked at some intriguing ways in which social issues such as violence may be considered as epidemics. We have also looked at some of the problems in public health with confusing correlation and causation; a classic statement of the fallacy goes like this: in summer, ice cream sales go up and murder rates go up; therefore, eating ice cream causes murder.

The example of ice cream and murder is absurd, but it points out just how difficult it can be to ascribe causation definitively in matters of public health. Clearly, both ice cream sales and murder rates are independently affected by the same underlying cause (heat waves), but one could easily imagine compelling data showing that ice cream sales go up just before each wave of violence. And in fact, a fascinating new piece in Mother Jones has been getting a lot of attention in public health circles this week because it shows exactly that kind of compelling relationship between violence and a different factor: leaded gasoline.
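To make the confounding point concrete before turning to the lead data, here is a quick, purely illustrative Python simulation (invented numbers, no real data): temperature drives both ice cream sales and violence, the two never influence each other, and yet they come out strongly correlated.

```python
# Purely illustrative simulation of confounding; all numbers are invented.
# Temperature drives both outcomes, which never interact, yet they correlate.
import numpy as np

rng = np.random.default_rng(0)
days = 365
temperature = 15 + 12 * np.sin(np.linspace(0, 2 * np.pi, days)) + rng.normal(0, 3, days)

# Each outcome depends only on temperature plus its own noise.
ice_cream_sales = 50 + 4.0 * temperature + rng.normal(0, 20, days)
violent_incidents = 10 + 0.8 * temperature + rng.normal(0, 4, days)

r = np.corrcoef(ice_cream_sales, violent_incidents)[0, 1]
print(f"Correlation between ice cream sales and violence: {r:.2f}")
# Prints a strong positive correlation (roughly 0.7 or more) even though
# neither causes the other; controlling for temperature would remove it.
```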

Through a pretty careful analysis of past publications, the article makes an extraordinary claim: “Gasoline lead may explain as much as 90 percent of the rise and fall of violent crime over the past half century”. But it has the data to back it up, and what’s really intriguing is that these correlations hold from the macro level all the way down to the neighborhood level. In neighborhoods where lead is removed, crime rates drop a predictable number of years afterward. If there really is a causal relationship between lead exposure and violent crime, we should be making the removal of lead from the environment a top priority, and maybe we should also be reconsidering the effectiveness of the police campaigns that are claiming credit for the tremendous decline in violent crime America has been experiencing in recent decades.

But is this really ice cream and murder all over again? Scott Firestone has an excellent blog post about the MJ piece that does a nice job discussing why we might temper our enthusiasm about these findings somewhat (although he also finds the data very compelling), and it’s worth reading just to think more about how hard it can be to prove anything with certainty, even when the evidence is extraordinary (think of how successful tobacco lobbyists were for so long in creatively interpreting the data on the health effects of smoking). There’s a brief and well-executed discussion this week in Scientific American about just how hard it can be to establish causation in health on another issue: whether even very moderate amounts of drinking during pregnancy have any negative effects on babies. This should be easy to establish, but it isn’t: in part because of ethical considerations (you can’t set up a control group for potentially harmful behavior), in part because of the unreliability of self-reports, and in part because of confounding variables like “lifestyle” associations (the same arguments tobacco lobbies make).

Cancer survivors forgoing care because of cost… perhaps

Young adult cancer survivors often forgo follow up medical care, reads the headline of this article on amednews.com (via the AMA’s Twitter feed). The article goes on to cite a study in which

Researchers analyzed data from the Centers for Disease Control and Prevention’s 2009 Behavioral Risk Factor Surveillance System on adults 20 to 39. A total of 979 had been diagnosed with cancer between ages 15 and 34 and were at least five years past the date of their diagnosis. They were compared with 67,216 adults with no cancer.

Both groups had similar rates of having health insurance. But those with a history of cancer were 67% more likely to go without care because of cost.
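As an aside on the arithmetic, a figure like “67% more likely” is a relative comparison between the two groups’ rates. The numbers in the sketch below are purely hypothetical, chosen only to show how such a figure is computed; they are not the proportions reported in the BRFSS analysis.

```python
# Hypothetical proportions, invented only to illustrate how a relative
# difference like "67% more likely" is calculated; not the study's numbers.
survivor_rate = 0.25     # share of cancer survivors forgoing care due to cost
comparison_rate = 0.15   # share of the no-cancer comparison group

relative_increase = (survivor_rate - comparison_rate) / comparison_rate
print(f"Survivors are {relative_increase:.0%} more likely to forgo care")  # ~67%
```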

The rest of the article talks about insurance rates, unemployment, cancer costs, and the economic hardship of the young. I’ve scanned the research, and it appears the researchers relied on the subjects’ self-reports to determine why they were forgoing care.

That leads me to ask: is it likely that cost is the only reason for the 67% difference in care-seeking? Perhaps the dominant paradigm that health care is too expensive provides a convenient answer without digging into more complex and uncomfortable reasons. Purely speculating, I’d wonder about the emotional exhaustion of Damocles syndrome, a feeling of invulnerability that comes both from their stage of maturity and from having already survived cancer, resentment at time lost to illness and a desire to just move on until a symptom appears, an avoidance of potential bad news for their own sake and that of loved ones, or any of countless other personal or cultural reasons.

Maybe I’m completely wrong on this, but it seems like the researchers put a lot of thought into economic factors and little into human ones. It’s not that the study is inaccurate — it reports what the subjects told them — but that the conclusions seem superficial.