Storygram: Charles Piller’s “Failure to Report”

The Storygram series, in which professional writers annotate award-winning stories to illuminate what makes a great science story great, is a joint project of The Open Notebook and the Council for the Advancement of Science Writing. It is supported by a grant from the Gordon and Betty Moore Foundation. 

Charles Piller is STAT’s West Coast editor and Natalia Bronshtein is STAT’s interactives editor. Together, they won a AAAS Kavli award in 2016 for the following story. This annotation was done by Roxanne Khamsi and is co-published at The Open Notebook.

Stanford University, Memorial Sloan Kettering Cancer Center, and other prestigious medical research institutions have flagrantly violated a federal law requiring public reporting of study results, depriving patients and doctors of complete data to gauge the safety and benefits of treatments, a STAT investigation has found.

The violations have left gaping holes in a federal database used by millions of patients, their relatives, and medical professionals, often to compare the effectiveness and side effects of treatments for deadly diseases such as advanced breast cancer.

The worst offenders included four of the top 10 recipients of federal medical research funding from the National Institutes of Health: Stanford, the University of Pennsylvania, the University of Pittsburgh, and the University of California, San Diego. All disclosed research results late or not at all at least 95 percent of the time since reporting became mandatory in 2008.

Failure to Report: A STAT investigation

What we found: Most research institutions — including leading universities and hospitals in addition to drug companies — routinely break a law that requires them to report the results of human studies of new treatments to the federal government’s database.

Read: Law ignored, patients at risk

Interactive: Explore the performance of top research institutions in academia, industry, and government.

Read more: How we did the project.

Have you participated in a clinical trial? Take our survey.

Drug companies have long been castigated by lawmakers and advocacy groups for a lack of openness on research, and the investigation shows just how far individual firms have gone to skirt the disclosure law. But while the industry generally performed poorly, major medical schools, teaching hospitals, and nonprofit groups did worse overall — many of them far worse.

The federal government has the power to impose fines on institutions that fail to disclose trial results, or suspend their research funding. It could have collected a whopping $25 billion from drug companies alone in the past seven years. But it has not levied a single fine.

The STAT investigation is the first detailed review of how well individual research institutions have complied with the law.

The legislation was intended to ensure that findings from human testing of drugs and medical devices were quickly made public through the NIH website ClinicalTrials.gov. Its passage was driven by concerns that the pharma industry was hiding negative results to make treatments look better — claims that had been dramatically raised in a major lawsuit alleging that the manufacturer of the antidepressant Paxil concealed data that the drug caused suicidal thoughts among teens.

“GlaxoSmithKline was misstating the downside risks. We then asked the question, how can they get away with this?” Eliot Spitzer, who filed the 2004 suit when he was New York attorney general, said in an interview.

The STAT analysis shows that “someone at NIH is not picking up the phone to enforce the obligation to report,” Spitzer said. “Where NIH has the leverage through its funding to require disclosure, it’s a pretty easy pressure point.”

NIH Director Dr. Francis Collins acknowledged in a statement that the findings are “very troubling.”

Clinical trial reporting lapses visualized

STAT examined data reporting for all institutions – federal agencies, universities, hospitals, nonprofits, and corporations – required to report results to ClinicalTrials.gov for at least 20 human experiments since 2008. This visualization shows the first comparative analysis of their performance. Among the groups, just two companies provided trial results within the legal deadline more than half the time. Most – including the National Institutes of Health, which oversees the reporting system – violated the law the vast majority of the time.

Data visualization by Natalia Bronshtein. (Originally published in STAT. Reprinted with permission from STAT.)
Read Roxanne Khamsi’s annotation of Natalia Bronshtein’s data visualization.

Doctors rely on robust reporting of data to see how effectively a drug or device performs and to understand the frequency of side effects such as suicidal thinking — crucial information for making treatment recommendations. Researchers depend on prior results to plan future studies and to try to ensure the safety of volunteers who participate in trials.

Yet academic scientists have failed to disclose most results on time — within a year of a study being completed or terminated.

In interviews, researchers, university administrators, and hospital executives said they are not intentionally undermining the law. They said that they are simply too busy and lack administrative funding, and that they disclose some clinical trial results in medical journals and at conferences. Some experts voiced concerns that the researchers might also face pressure from corporate sponsors to delay reports — whether negative or positive — for commercial reasons.

Not much to boast about

A number of trials languishing on researchers’ “to do” piles involve life-and-death problems.

Memorial Sloan Kettering failed to report data for two trials of the experimental anticancer drug ganetespib, made by Synta Pharmaceuticals. The results of tests involving breast and colorectal cancer patients showed serious adverse effects in 13 of 37 volunteers. They included heart, liver, and blood disorders, bowel and colon obstructions, and one death.

The New York City hospital said it delayed its reporting so that it could complete related medical journal articles. It rushed to submit the results to ClinicalTrials.gov only after being contacted by STAT — more than two years late for one of the studies.

Collette M. Houston, who directs clinical research operations for Memorial Sloan Kettering, said she believes the researchers did immediately report adverse events to Synta and the Food and Drug Administration. But a hospital spokesperson said records needed to verify if that happened were no longer accessible.

Houston said that Synta, which paid for the research, normally would alert investigators on other ongoing trials of the same drug about adverse events, so they would learn of the possible added risks to patients despite Memorial Sloan Kettering’s failure to report to ClinicalTrials.gov.

In another case, delayed reporting denied doctors timely information on treating breast cancer. In 2009, the nonprofit Hoosier Cancer Research Network terminated a study of Avastin in 18 patients with metastatic breast cancer. The drug didn’t help and caused trial volunteers serious harm — including hypertension, gastrointestinal toxicity, sensory problems, and pain. But the Indianapolis-based network, which runs trials under contract for drug companies, did not report the results as required the following year — though doctors nationwide were debating whether it was advisable to treat breast cancer patients with Avastin.

In 2011, the FDA revoked its approval of Avastin for breast cancer after determining that it was ineffective for that use and posed life-threatening risks. The Hoosier Network researchers finally published the data in a medical journal in 2013. They have yet to post results on ClinicalTrials.gov — nearly six years after the legal deadline.

The lead investigator, Dr. Kathy Miller of Indiana University School of Medicine, blamed “short staffing” for the reporting delay. “Getting that uploaded was not a big priority,” she said, adding that she had never heard of anyone using ClinicalTrials.gov to examine trial results. She insisted that her study “didn’t have anything to add to that national debate” about Avastin’s safety and efficacy.

STAT identified about 9,000 trials in ClinicalTrials.gov subject to the law’s disclosure requirements. The analysis focused on the 98 universities, nonprofits, and corporations that served as sponsors or collaborators on at least 20 such trials between 2008 and Sept. 10 of this year.

None of the organizations has much to boast about. Only two entities — drug companies Eli Lilly and AbbVie — complied with reporting requirements more than half the time, a sign of how uniformly the research community flouts the law.

Results from academic institutions arrived late or not at all 90 percent of the time, compared with 74 percent for industry. More than one-fourth of companies did better than all the universities and nonprofits.

The FDA, which regulates prescription drugs, is empowered to levy fines of up to $10,000 a day per trial for late reporting to ClinicalTrials.gov. In theory, it could have collected $25 billion from drug companies since 2008 — enough to underwrite the agency’s annual budget five times over. But neither FDA nor NIH, the biggest single source of medical research funds in the United States, has ever penalized an institution or researcher for failing to post data.

Even the agency’s own staff scientists have violated the reporting law three-quarters of the time.

Collins said NIH research shows reporting rates by both industry and academia have improved over the past several years due to greater awareness, and NIH’s own scientists are now reporting their results about 90 percent of the time. “Of course I would like to see all of these rates reach 100 percent,” Collins said.

Data from ClinicalTrials.gov reviewed by STAT show NIH scientists’ reporting of results within the legal deadline peaked at 38 percent in 2013. Counting results reported late, NIH staff performance reached 90 percent for studies due in 2012 but has dropped since then.

Beginning next spring, after further refinement of rules, NIH and FDA will have “a firmer basis for taking enforcement actions,” Collins said. If an NIH grantee fails to meet reporting deadlines, their funding can be terminated, he said.

The agency’s staff scientists, he added, “will be required to follow the law and NIH policy, and there will be consequences … for failing to do so.”

NIH/STAT. (Originally published in STAT. Reprinted with permission from STAT.)

‘Too busy to follow the law’

University researchers and administrators cited a lack of time as one of the reasons they miss reporting deadlines. Lisa Lapin, a top communications official at Stanford, said via email that many faculty members find the process for entering results “above and beyond” their capacity, “given multiple existing constraints on their time” and no funding for additional staff.

Stanford supplied data to ClinicalTrials.gov for 26 of 82 trials for which results were due — far below the performance by most other top universities or companies. Professors supplied data on time for only four studies.

Dr. Harry Greenberg, senior associate dean for research at Stanford, said his institution was determined to improve.

“We will solve it, and we’ll solve it the best way possible,” through steps that include providing support for faculty members to enter trial data, he said.

NIH estimates that it takes, on average, about 40 hours to submit trial results, and many drug and medical device makers have dedicated staff to post data to ClinicalTrials.gov. Eli Lilly has seven full-time employees for the task.

But Collins and advocates for transparency said time demands are no excuse for noncompliance. “There is no ethical justification for ignoring the law,” said Jennifer Miller, a New York University medical ethicist. “It’s not OK to say, ‘I’m too busy to follow the law.’”

Jun Seita (Creative Commons)/STAT. (Originally published in STAT. Reprinted with permission from STAT.)

Filling information gaps

Six organizations — Memorial Sloan Kettering, the University of Kansas, JDRF (formerly the Juvenile Diabetes Research Foundation), the University of Pittsburgh, the University of Cincinnati, and New York University — broke the law on 100 percent of their studies — reporting results late or not at all.

Dr. Paul Sabbatini, a research executive at Memorial Sloan Kettering, said that as an efficiency measure, his organization previously tried to complete work on scholarly publications in tandem with reporting results. “We thought and hoped that we could keep that process bundled,” he said, but preparing journal articles and getting them published often takes years. The hospital will now post to ClinicalTrials.gov independently.

Several organizations argued that publishing in peer-reviewed medical journals is the highest form of disclosure of study results and should be sufficient, or that there should be a way to simply copy journal data into ClinicalTrials.gov.

Critics say such views reflect the primacy of scholarly articles for academic career advancement more than any inherent superiority over ClinicalTrials.gov.

A 2013 analysis in the journal PLOS Medicine compared 202 published studies against results in ClinicalTrials.gov for the same research. The NIH registry proved far more complete, especially for reporting adverse events — signs that a therapy might be going badly wrong.

Journals often reject papers about small studies or trials stopped early for a range of reasons, such as a drug causing worrisome side effects. They publish relatively few negative results, although failed tests can be as important as positive findings for guiding treatment. ClinicalTrials.gov tries to fill these critical information gaps and serve as a timely and comprehensive registry.

The website is open to the public, unlike many journals that charge a substantial fee. And the site requires data to be entered in a consistent way that allows easier comparisons of benefits or side effects across many studies.

It also lets experts check whether researchers have cherry-picked results, which can mislead doctors and patients.

Preliminary work by The Compare Project, at the Centre for Evidence-Based Medicine at the University of Oxford, suggests that shifts away from outcome measures designated before the start of a clinical trial to results selected at the study’s conclusion are common. The group recently examined reports in leading journals on 44 trials and found that overall, 213 original measures went unreported, while 225 new measures were added.

Withheld data spurred action

Congress passed a law requiring registration of clinical trials in 1997, and the ClinicalTrials.gov database was created in 2000 for this purpose. At first it was little used.

NYU’s Miller traces the origins of stronger reporting requirements to Spitzer’s Paxil suit.

GlaxoSmithKline disputed the charges but settled out of court. This past September, the medical journal BMJ reanalyzed clinical trial data gathered by the company in 2001 and recently shared with independent scientists. Its findings showed Paxil to be no more effective in teens than a placebo, and linked the drug to increased suicide attempts.

Scenarios similar to the Paxil case have been repeated many times since with other drugs.

In the wake of the bird flu scare a decade ago, governments stockpiled billions of dollars’ worth of the antiviral drug Tamiflu, thought to be a lifesaver. The Lancet reported last year, however, that previously undisclosed trial data showed that Tamiflu “did not necessarily reduce hospital admissions and pulmonary complications in patients with pandemic influenza.”

Another notorious case involved an experimental arthritis drug, TeGenero’s TGN1412, tested on six men in London in 2006. All immediately fell gravely ill. One suffered heart failure, blood poisoning, and the loss of his fingers and toes in a reaction that resembled gangrene. A close variant of the drug had been found potentially dangerous years earlier, but the results had never been published.

In response to Spitzer’s suit, GlaxoSmithKline agreed to share much more data with the public. The move prodded competitors and lawmakers to do better.

Then in 2007, a new US law broadened trial registration requirements and added mandates for disclosing summary data, adverse events, and participants’ demographic information on ClinicalTrials.gov.

“Mandatory posting of clinical trial information would help prevent companies from withholding clinically important information about their products,” Senator Charles Grassley, an Iowa Republican and a leading proponent of the law, said at the time. “To do less would deny the American people safer drugs when they reach into their medicine cabinets.”

High hopes not realized

The popularity of ClinicalTrials.gov shows progress on lawmakers’ goal of a better-informed public. The site logs 207 million monthly page views and 65,000 unique daily visitors. An NIH survey suggested that 8 in 10 users are patients, their family members or friends, or medical professionals and researchers. With a $5 million budget, the program’s staff of 38, operating at NIH in Bethesda, Md., reviews trial designs and results, provides services, conducts research, and sets policy.

Yet, the lofty hopes for the website have never been realized. More than 200,000 trials are registered on ClinicalTrials.gov, but many drug or device studies are exempted from reporting results, including most trials that have no US test site, those involving nutritional supplements, and early safety trials.

Study sponsors can also request permission to delay reporting on experimental drugs not yet approved by the FDA, and have done so on more than 4,300 occasions.

The many exceptions mean that virtually all research organizations — even the most conscientious — have posted results for just a small fraction of their registered trials.

AbbVie, one of the most legally compliant pharmaceutical companies, was involved in 459 registered trials. Just 25 of those required the reporting of results by law. Even for that small number, AbbVie achieved its relatively good reporting record not from exceptional efforts on public disclosure, but by diligently applying for filing extensions.

Biogen delivered results late or not at all 97 percent of the time, the worst record among pharma companies. It said that many of its studies were not covered by the reporting law, and that it had extension requests pending or planned for others — accounting for the poor performance. But STAT did not count trials exempted from the law, nor those that NIH listed as having applied for extensions.

The reporting of results on ClinicalTrials.gov, poor by any reasonable standard, is worse than it appears, some researchers say.

In part, that’s because there is no easy way to tell whether companies or universities are registering all of their applicable trials, and there is no comprehensive policing mechanism. A research organization that appears conscientious in reporting results might be hiding some trials.

NYU’s Miller studied drugs from major companies approved by the FDA in 2012. In a recent article in the medical journal BMJ Open, she and her coauthors showed that trials subject to the legal requirements often were not registered, and results not provided — even when the firms gave the data confidentially to the FDA during the approval process.

Miller found that results from only about two-thirds of studies used to justify FDA approvals were posted to ClinicalTrials.gov or published elsewhere.

Reporting lapses betray research volunteers who assume personal risks based on the promise that they are contributing to public health, said Kathy Hudson, an NIH deputy director who helps to guide the agency’s clinical trial policies. “If no one ever knows about the knowledge gained from a study,” she said in an interview, “then we have not been true to our word.”

Draconian consequences — including FDA and NIH penalties — might prove necessary to improve the distorted evidence base of clinical medicine, said Dr. Ben Goldacre, a fellow at the University of Oxford and cofounder of The Compare Project and AllTrials, which advocates for disclosure of clinical research.

As a young doctor, Goldacre broke his leg in a fall while running to treat a cardiac arrest patient. “Nobody has ever broken a leg running down a corridor to try to fix the problem of publication bias,” he said. “That’s because of a failure of empathy.”

Roxanne Khamsi is a journalist whose work has appeared in Scientific American, Slate, Newsweek, and The New York Times Magazine, and is chief news editor for Nature Medicine. She teaches at Stony Brook University’s Alan Alda Center for Communicating Science. Follow her on Twitter @rkhamsi.

You may view this story in its original format at STAT. Roxanne Khamsi’s Q&A with writer Charles Piller is at The Open Notebook.