My mother-in-law’s arms look like she’s been in a fight. The bruises don’t hurt, but they’re embarrassing. They’re likely due to the drug Plavix, a trade-off for preventing clots. But we don’t know if the drug is actually helping, because she started it before the FDA urged physicians to use a pharmacogenetic (PGx) test to distinguish patients likely to respond to the drug from “poor metabolizers,” who won’t. And no one’s thought to test her since.
The original Plavix genetic test identified mutations in the CYP2C19 gene. More recent versions assess seven other genetic variants that affect metabolism of the drug. On June 29, the University of Florida Academic Health Center announced that it would use the expanded genetic test to screen all cardiac catheterization patients for response to Plavix. And in the future, the university will check additional DNA variants in the samples. According to the press release from the university, “researchers … will collect results for the other 249 gene variations to continue investigating which ones might be clinically actionable and become the basis for additional PGx tests for other treatments such as warfarin and statins.”
Is it OK to take DNA today for one purpose, and use it tomorrow for another? Should future use of DNA information be part of informed consent for participation in a clinical trial? Should patients, like someone giving blood for a PGx test, be told that their DNA might be used later, for reasons not currently known? And how can the Florida clinicians even obtain informed consent from patients undergoing emergency cardiac catheterization?
At least the Plavix case will use the DNA to address the same illness for which it was donated. But what if DNA collected today is eventually used to investigate a different condition, perhaps one that the original owner of that DNA didn’t want to know about? A Native American tribe from Arizona offers a compelling (although not legal) precedent for future-use scenarios.
The Havasupai and Future-Use DNA
The Havasupai have lived at the bottom of the Grand Canyon for more than 10 centuries, but in 1882 the US government confined them to a small reservation, restricting their homeland. When the tribe abandoned farming and turned to tourism to survive, junk food and a more sedentary lifestyle followed. Soon, diabetes became common.
In 1990, researchers from Arizona State University visited the Havasupai to take DNA samples to look for diabetes genes. Two years later, with no findings, they analyzed the DNA for other traits, including schizophrenia (a stigma in Havasupai culture), inbreeding (an insult), and worst of all, ancestry (Asian origins countered what the Havasupai told their children). The researchers also shared the DNA with others, without consent.
The Havasupai discovered the future use of their DNA only after a tribe member heard about it in a lecture at ASU. In 2004, they filed a lawsuit. The settlement in April 2010 brought $700,000 to 41 Havasupai members, return of blood samples, scholarships, and help to build a health clinic. But the researchers admitted no liability.
Bioethicists Arthur Caplan and Jonathan Moreno discussed implications of the Havasupai settlement in The Lancet, but, I think, too broadly. They compare the Havasupai DNA situation to that of organ donors, embryo donors, and people like Henrietta Lacks and John Moore, whose cervical cancer cells and spleen, respectively, were taken without consent and eventually yielded huge profits. But DNA is different.
DNA is information. Destroy the source – white blood cells, cheek scrapings, cervical or spleen cells – and the information, the base sequence, can nevertheless live on in a database. Providing DNA to a researcher is a more lasting donation than having your placenta end up in shampoo without consent, for example.
I disagree with another of the bioethicists’ statements: “Investigators must either know in what direction their research might lead, so that the donors can be informed about these prospects, or they must find and reconsent donors years or decades later, when new research opportunities present themselves.”
Knowing where research, particularly genetic research, will lead is not how science works. If we knew where it would lead, why do it? And reconsent would be confusing, disturbing, and likely expensive.
Instead, I think that, especially for DNA tests, informed consent documents should state that the sample might be used in the future to get information unknown today. Participants or patients can agree, or not sign. The possibility that relevant but unexpected DNA results may arise, and what to do with the information, is another story altogether. But the very existence of such "genetic incidentalomas" counters the bioethicists’ view that researchers can know what they’re going to look for, or find.
“Recruitment by Genotype”
A recent report in IRB: Ethics & Human Research, a journal of The Hastings Center, puts a new spin on the future-use DNA question with what its authors call “recruitment by genotype” – admitting only participants with particular mutations to a clinical trial. It makes sense. Assessing a drug to treat breast cancer, for example, would yield more meaningful information if all participants had the same driver mutations – of which there are at least 40. If a new drug disrupts a specific cellular mechanism, shouldn’t trial participants have as uniform genetic backgrounds as possible?
In the IRB: Ethics & Human Research investigation, Laura M. Beskow, MPH, PhD, and colleagues at Duke University presented 201 chairs of Institutional Review Boards with scenarios of using DNA from one study to identify other genetic variants in a subsequent study – and asked how much participants should know.
The idea is that probing already-donated DNA to find trial participants is less costly than screening large populations to identify people of the desired genotype. Logical though the approach may be, it adds a layer to informed consent, and could upset people who might prefer not to dwell on their past diagnosis. Perhaps that’s why the study found that only 37% of IRB chairs agreed with the statement “Researchers should be allowed to contact participants in one genetic research study in order to invite their participation in another genetic research study.”
(In one interesting case, a parent requested a future DNA test. A mother contacted researchers in 2004 after reading about Rett syndrome in a magazine. It described her daughter, who’d died in 1991, at age 7, without a diagnosis. The parents had been blaming themselves ever since, and were concerned about grandkids. A Rett test done on DNA in a saved baby tooth nailed the diagnosis.)
Getting back to 2012, the IRB researchers conclude that there won’t be a “one-size-fits-all solution” to future-use DNA testing, and that genotype-driven recruitment is okay if it meets several conditions. The top two:
· Future use is spelled out in informed consent documents from now on.
· The potential information from future DNA tests has “clinical validity” – that is, a detected genotype predicts a clinical condition or predisposition.
But researchers and ethicists are still discussing what to do with the inevitable genetic incidentalomas.
The day is nearing when having one’s exome or genome sequenced will be routine in health care, and we’ll all face the problem of “too much information.” Until then, perhaps researchers can compare different ways of handling informed consent that anticipate future uses for donated DNA. Clinical trial participants, as well as health care consumers, have a right to decide how much they want to know about their genetic information.