For decades, investigative reporters have approached stories with a similar mindset: Find the bad guys.
We collect examples of wrongdoing, we put the examples in a pile, and if the pile becomes big enough, we have a story. If the story is compelling enough, maybe someone will call for change.
But I’ve also wondered: What if we viewed investigative reporting more broadly and approached our work in new ways? Instead of focusing on the bad actors, what if we used our skills (and the financial resources of many investigative teams) to bring people together to solve problems?
Over the past two years, the Chicago Tribune engaged in a unique collaboration with data scientists, pharmacologists and cellular researchers at Columbia University Medical Center. The team used novel data-mining techniques to identify four drug combinations associated with a heart condition that can lead to a potentially fatal arrhythmia. The drugs include several widely prescribed medications, none of them linked to the cardiac condition on its own.
In the process, the team created an innovative scientific model with the potential to flag hundreds of additional drug interactions, offering a new way to protect patients and save lives.
To accomplish this, the Tribune took a fresh approach to investigative reporting—one that presents new opportunities to spark change, but perhaps also challenges the boundaries of public service journalism.
The Tribune didn’t just report on what scientists were doing. We came to them with an ambitious idea, connected them to other top researchers and then became an important part of the scientific effort.
There is already “solutions journalism,” in which reporters write detailed explanatory pieces about social problems and how to fix them. Could we take this one step further? Could we collaborate with experts to find answers, make discoveries and invent new things?
We decided to try. We did it because we felt the story was worth it, and we couldn’t do it alone.
Drug interactions harm thousands of people each year in the US, and the risks are escalating. One in five Americans takes three or more drugs. One in 10 takes five or more, double the share in 1994.
Some interactions are well documented, but many are believed to be hidden, causing harm unbeknownst to anyone: doctors, scientists, drug companies, patients.
We had an idea about how to find some of these hidden drug combinations, but we weren’t doctors, and we didn’t have access to crucial data. We needed the help of some of the nation’s leading experts, such as Nicholas Tatonetti, a scientist who runs his own data lab at Columbia.
Unlike Tatonetti, I knew practically nothing about big data, coding, and algorithms. But I was intrigued by a data-mining approach he had pioneered that intentionally looked for evidence where none was visible.
Tatonetti was detecting drug interactions in much the same way that astronomers find black holes. Astronomers can’t see black holes, but they know they are there because of their side effects, such as the gravitational pull on neighboring stars.
Similarly, researchers can’t always detect drug interactions in the data because too few patients file complaints. But by analyzing a constellation of secondary side effects, Tatonetti could infer which combinations of drugs might cause a more serious, yet hidden, problem.
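The black-hole analogy can be sketched in a few lines of code. Everything below is invented for illustration (the drug names, the side-effect panel, the report counts, the flagging threshold), and Tatonetti's actual models are far more sophisticated. But the core idea is the same: score a drug pair by how closely its fingerprint of secondary side effects resembles the fingerprint of drugs already known to cause the serious condition.

```python
# Hypothetical sketch of latent-signal detection. All names and numbers
# are invented for illustration; this is not the research team's code.
from math import sqrt

# A panel of secondary side effects reported to a safety database.
SIDE_EFFECTS = ["dizziness", "fainting", "palpitations", "nausea"]

# Report counts for a drug class KNOWN to cause the dangerous condition.
# This is the "gravitational pull" we look for in other drugs' data.
known_causer_profile = {"dizziness": 40, "fainting": 30, "palpitations": 25, "nausea": 5}

# Report counts observed for patients taking each drug pair together.
drug_pairs = {
    ("drug_a", "drug_b"): {"dizziness": 38, "fainting": 28, "palpitations": 27, "nausea": 7},
    ("drug_c", "drug_d"): {"dizziness": 5, "fainting": 2, "palpitations": 3, "nausea": 90},
}

def cosine(p, q, keys):
    """Cosine similarity between two side-effect count profiles."""
    dot = sum(p[k] * q[k] for k in keys)
    norm_p = sqrt(sum(p[k] ** 2 for k in keys))
    norm_q = sqrt(sum(q[k] ** 2 for k in keys))
    return dot / (norm_p * norm_q)

# Flag pairs whose side-effect fingerprint resembles the known causers',
# even though no one has directly reported the serious condition itself.
for pair, profile in drug_pairs.items():
    score = cosine(known_causer_profile, profile, SIDE_EFFECTS)
    verdict = "FLAG for review" if score > 0.9 else "no signal"
    print(pair, round(score, 3), verdict)
```

In this toy example, the first pair would be flagged because its pattern of dizziness, fainting and palpitations closely tracks the known-causer profile, while the second pair, dominated by nausea, would not. Flagged pairs are suspects, not findings; as the story shows, the real work of validation comes afterward.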
Tatonetti had used this approach to find drug interactions associated with an increase in blood sugar, but he was frustrated that the FDA didn’t act on his work. When I first interviewed him in 2013, he seemed ready to move on to other projects.
The Tribune proposed an idea: Instead of studying drugs linked to blood sugar, what if we tried to find drug combinations that led to sudden cardiac death? The FDA would have a hard time ignoring those findings.
Tatonetti said he was not an expert on sudden cardiac death, but I knew someone who was: Dr. Ray Woosley, a source of mine and former dean of the University of Arizona medical school.
I flew to Arizona and asked Woosley if he would be willing to help. He agreed, supplying information to plug into Tatonetti’s algorithms.
Over the course of the project, I traveled to New York 12 times to meet with Tatonetti. We brainstormed, I documented our progress, and we talked with Woosley via conference calls. Some ideas gained traction. Others did not. Weeks stretched into months.
Newsrooms sometimes hire labs to run tests for a fee. But this story was different. We weren’t paying scientists for a service they routinely provided; we were relying on their goodwill to execute an experiment. They could drop out at any time, change their minds, or switch jobs.
The experiment also could be a bust. Maybe we wouldn’t find anything. Or maybe we would find the opposite of what we were expecting.
Were we willing to walk away—after spending all that time and money—if the experiment failed?
My Tribune colleagues—George Papajohn, the associate managing editor of investigations; Kaarin Tisue, the deputy investigations editor; and Karisa King, my reporting partner—agreed that if the experiment failed, it wouldn’t be the end of the world, or even the end of the story. We could still write an explanatory piece about how data scientists were trying to find hidden drug interactions.
If the experiment succeeded, we could be confident of the results. We were working with scientists who were going to enormous lengths to remove biases in the study and validate the work.
For example, when Tatonetti’s data-mining turned up hundreds of drug combinations associated with the heart condition, he knew many of these signals could be noise, and so he checked the results against millions of lab measurements at Columbia University Medical Center. Many combinations didn’t appear meaningful, but some did.
The project could have ended there. Tatonetti would have had a publishable paper and the Tribune a good story.
But Tatonetti wanted more validation. With Woosley’s help, the team enlisted the aid of Columbia cellular researchers, who tested one of the suspected drug combinations on individual cells.
The tests found that the drugs blocked an electrical channel crucial to the heart, providing a biological explanation for the possible drug interaction.
The cell testing was enormously time-consuming, especially in news-biz time. The tests set the project back a full year. But in the end, the team had found a new model for discovering drug interactions. The researchers wrote one paper, published this week in the peer-reviewed journal Drug Safety; another paper is in the works.
And I wondered: Could this reporting approach be used for other stories?
Tisue cautions that we run the risk of confusing readers about our role.
That’s a concern, of course. But it can be mitigated by choosing partners with a similar commitment to objectivity and truth, by engaging in an open and ongoing dialogue with editors, by setting standards for a successful outcome, and by formulating a “Plan B”—such as writing an explanatory piece or no story at all—if the effort falls short.
When I started in this business in the early 1980s, reporters were increasingly adding context and analysis to stories, a shift many journalists resisted. Some thought reporters should copy down what people said and did and leave it at that.
Narrative elements were rarely found in news stories. Government press releases frequently went unchallenged. Entire sections of newspapers—business, sports, and entertainment—seldom asked tough questions.
Times changed, as did the boundaries of public service journalism.
Maybe it’s time once again to move the needle.
Instead of always nailing people for doing the wrong thing, maybe we could sometimes bring people together to do the right thing.