
Monday, October 05, 2009

Another installment in our continuing series entitled "More is Less." I vaguely recall referring some months back to two studies of prostate cancer screening published in NEJM, which you can read here and here due to a special dispensation extended to you ignorant ruffians for this one occasion.

With a little help from Jennifer Stark and friends in the new BMJ (not, alas, similarly available to the rabble), I want to help you understand the bottom lines on these studies, and in particular, how different it all seems depending on how you frame it. In the U.S. study reported by Andriole, et al, the death rate from prostate cancer after 7 to 10 years of follow-up in a group assigned to routine screening was actually higher than the death rate in a group assigned to "usual care." Some in the intervention group were not screened every year, and some in the control group were screened, but the former group did have a much higher overall rate of screening. The difference was deemed "not statistically significant," which means it could have been due to chance, but it certainly seems highly unlikely that screening yielded any substantial mortality benefit over this period. Of course, the follow-up is short.

The European study by Schröder and the gang did find a reduction of about 20% in death rates, over an average follow-up of nine years, for men who were offered PSA screening. That sounds pretty good, huh? But here's the key point I want to shoot you with "like a diamond bullet right between the eyes" -- you would have to screen 1,410 men to avert one death; 76% of positive tests would have been false positives leading to unnecessary anxiety and biopsies; and you would have to have treated 48 cases of actual prostate cancer to avert one death, because a) lots of men with prostate cancer never die from it or even have symptoms and b) some men who are treated die of the disease anyway. Treatment can result in incontinence, erectile dysfunction, and all kinds of pain and expense.

This is why it is important to report absolute risk, not just relative risk. Telling someone that you can reduce their chance of dying from prostate cancer by 20% sounds pretty good. But telling them there is a 1 in 1,400 chance that they will benefit; a 12% chance that they will have a false positive test and have to go through all the sequelae; and 3.5 chances out of 100 that they will undergo prostate cancer treatment and not in fact benefit from it, does not sound nearly so good. It will, however, make a whole lot of money for oncologists, surgeons and radiologists, which might just be one reason why they are all in favor of it.
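If you want to check the framing yourself, the arithmetic is simple enough to do in a few lines. A minimal sketch (variable names are mine; the "implied baseline risk" is back-calculated from the trial's numbers, not a figure reported in the paper):

```python
# Figures quoted above from the European (Schröder et al.) trial.
number_needed_to_screen = 1410   # men screened to avert one prostate cancer death
relative_risk_reduction = 0.20   # ~20% reduction in prostate cancer mortality
treated_per_death_averted = 48   # cancers treated to avert one death

# Absolute risk reduction: one death averted per 1,410 men screened,
# i.e. about seven hundredths of a percentage point.
absolute_risk_reduction = 1 / number_needed_to_screen

# Implied baseline risk of prostate cancer death over follow-up,
# back-calculated from ARR = baseline_risk * RRR (an assumption, not trial data).
implied_baseline_risk = absolute_risk_reduction / relative_risk_reduction

# Chance a screened man is treated without benefiting:
# 48 treatments per death averted, per 1,410 men screened.
treated_without_benefit = treated_per_death_averted / number_needed_to_screen

print(f"Absolute risk reduction: {absolute_risk_reduction:.4%}")  # ~0.07%
print(f"Implied baseline risk:   {implied_baseline_risk:.2%}")    # ~0.35%
print(f"Treated without benefit: {treated_without_benefit:.1%}")  # ~3.4%
```

The last figure comes out at roughly the 3.5 in 100 quoted above, and the contrast with "20% reduction" is exactly the relative-versus-absolute point.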

As Doctor Ruth says, I'm not a real doctor; I cannot advise. Talk it over with your real doctor. But do show him or her these true facts.
