Before heading to work recently I was watching The Today Show, and Dr. Mehmet Oz (everyone’s favorite health guru) did a segment on ways to stay healthy during the holiday period.
I found it curious that Dr. Oz’s approach to “staying healthy” involved promoting several screening tests (memory, vision, fitness)—but it was when he began discussing various fitness tests that he made a statement that really caught my attention: “…Another big test that we have a lot of data on, it’s the quarter mile test. If you can’t walk a quarter mile in five minutes, the chance of you being dead is 35 percent higher than if you can.”
There’s a lot of thought-provoking content here. What sorts of studies were done to come up with these numbers? How good were the studies? (These are the kinds of questions we epidemiologists love to ask.)
Without getting too deep into the methodological weeds, I mostly wondered: how does the average viewer make sense of this whole “35 percent higher” concept? What does it mean to people? After all, it sounds like a big, scary deal that you should do something about. Now, let’s get a small but annoying point out of the way: Dr. Oz clearly must have meant “the chance of you dying in the future is 35 percent higher. . .” instead of “the chance of you being dead is 35 percent higher.” (If you’re already dead, there is zero chance you can walk a quarter mile.) We’ll excuse that verbo (the verbal equivalent of a typo).
But even that can be a tad confusing. First, what future do we mean? Over the next year, over the next five years, ever? . . . Well, clearly it can’t be “ever”: irrespective of your walking ability or speed, your chance of dying ever in the future is 100 percent, since there’s no preventing death. (When I hear any sort of activity or lifestyle or pill touted as “preventing mortality,” it always strikes me: you can’t prevent death, you can only delay it.)
I looked into the literature, and while I don’t know if this was the source Dr. Oz was using (TV stars usually don’t provide references), I did find a study that seemed relevant in the Journal of the American Medical Association from 2006. At least, subjects walked a quarter mile in this study (actually, 400 meters, but that’s pretty much the same). The study enrolled about 3,000 older adults, ages 70 to 79, and followed them for an average of about five years.
Based on the results of that study, one could quibble over whether there was actually a 35 percent increase in mortality rate (it depends on whether you’re comparing those who were unable to complete the quarter mile with those who could, or whether you’re considering the speed of completion, and on what sort of statistical adjustment you choose), but the study did show higher rates of mortality among those unable to complete a quarter mile. So let’s say it’s a 35 percent increase: what would that actually mean? This percent increase refers to the relative risk, or relative likelihood, of death. But what about the absolute increase in risk? How many actual people would be affected? That’s where the money is. And to figure this out, we have to know the overall mortality rate.
In this study, the overall death rate for all of those who completed the walking test was 24.7 per 1,000 person-years.
The term “person-years” is confusing to most. It essentially means, in this case, that about 2.5 percent of the study sample died each year (or, over the five years of the study, about 12.5 percent died). So, what would a 35 percent increase mean? It would mean that you multiply 12.5 percent by 1.35 and get 16.9 percent. So instead of 12.5 out of 100 individuals in their 70s dying over the next five years, 16.9 out of 100 would die: an extra 4.4 percentage points over five years, or an increase of less than 1 percentage point per year.
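If you like to see that sort of back-of-the-envelope arithmetic spelled out, here is a quick sketch in Python (my own illustration, using the rounded figures above, not anything taken from the study itself):

```python
# Back-of-the-envelope check of the relative vs. absolute risk arithmetic.
# The inputs are the rounded figures quoted in the text, not study output.

baseline_5yr_risk = 0.125   # ~2.5% per year x 5 years (from 24.7 deaths per 1,000 person-years)
relative_increase = 1.35    # the "35 percent higher" relative risk

elevated_5yr_risk = baseline_5yr_risk * relative_increase   # 0.169 -> 16.9%
extra_risk_5yr = elevated_5yr_risk - baseline_5yr_risk      # 0.044 -> 4.4 percentage points
extra_risk_per_year = extra_risk_5yr / 5                    # ~0.009 -> under 1 point per year

print(f"Elevated five-year risk:    {elevated_5yr_risk:.1%}")
print(f"Extra risk over five years: {extra_risk_5yr:.1%}")
print(f"Extra risk per year:        {extra_risk_per_year:.2%}")
```

Running it gives 16.9 percent, an extra 4.4 percentage points over five years, and a bit under 0.9 percentage points per year.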
I’m perfectly willing to accept the notion that not dying (at least, not dying for a while) is better than dying, so I might think about doing something (if possible) if I were concerned that my slow walking at 75 increased my risk of death by less than 1 percentage point each year compared with a faster walker. But I suspect you’ll agree with me that when you really think about what this means, it’s a lot less alarming than it sounded when Dr. Oz declared it “a big deal!”