Allen (et All):
I finally got around to fully reading your link; sorry for the delay. It is indeed interesting, mainly in that it directly contradicts so many other studies, some examples of which I have linked to earlier, relating to both non-PC cancers and PC.
The first thing that jumps out at me is this U-shaped curve. It is definitely there, but not in the way I thought you meant. I thought you meant the risk for aggressive PC went up (in this study anyway) with the higher levels and maybe the lower levels, with the U-shaped curve showing the best results at the middle blood levels. In fact, in a different thread about a similar study (or was it the same study?), I said, somewhat sarcastically I guess, something like “what we learn from this study is the lowest levels of Vit D are best,” and you responded with something like “that would be an unwarranted conclusion or assumption,” then gave more information about the U-shaped curve. I defended my conclusion from that study, even though I didn't necessarily agree with it, because that is what the study seemed to be showing.
But now, when I look closely at this study you have linked, it seems to show evidence for the very thing I argued previously: the best levels are the lowest levels. There is a U-shaped curve, but it puts the highest risk of aggressive PC in the middle, not the lowest, and the lowest Vit D blood levels carry the lowest risk. The highest levels, however, are a good bit closer to the lowest levels (in both blood levels and risk, or odds) than the middle is. Here is the copy and paste from that link, showing the odds ratio (OR) and the rate of aggressive PC per quintile:
jnci.oxfordjournals.org/content/100/11/796.full.pdf

“However, serum 25(OH)D concentrations greater than the lowest quintile (Q1) were associated with increased risk of aggressive (Gleason sum ≥7 or clinical stage III or IV) disease (in a model adjusting for matching factors, study center, and history of diabetes,
ORs for Q2 vs Q1 = 1.20, 95% CI = 0.80 to 1.81,
for Q3 vs Q1 = 1.96, 95% CI = 1.34 to 2.87,
for Q4 vs Q1 = 1.61, 95% CI = 1.09 to 2.38, and
for Q5 vs Q1 = 1.37, 95% CI = 0.92 to 2.05; Ptrend = .05).
The rates of aggressive prostate cancer for increasing quintiles of serum 25(OH)D were 406, 479, 780, 633, and 544 per 100,000 person-years.”
Notice that the rate per 100,000 person-years for Q3 (the middle, at 780) was 1.43 times higher than Q5's and 1.9 times higher than Q1's (the lowest blood levels). Also notice that Q5 (the highest blood levels) had only about 1.3 times the risk of the lowest/best levels, which is probably barely significant.
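Just to double-check my arithmetic on those comparisons, here is a quick sketch (the quintile rates are the ones quoted above; the simple rate ratios are my own back-of-the-envelope calculation, not adjusted ORs from the paper):

```python
# Aggressive-PC rates per 100,000 person-years by 25(OH)D quintile,
# as quoted from the JNCI paper above (Q1 = lowest blood levels).
rates = {"Q1": 406, "Q2": 479, "Q3": 780, "Q4": 633, "Q5": 544}

# Crude rate ratios (my comparison only; the paper's ORs are adjusted).
print(f"Q3 vs Q5: {rates['Q3'] / rates['Q5']:.2f}")  # ~1.43
print(f"Q3 vs Q1: {rates['Q3'] / rates['Q1']:.2f}")  # ~1.92
print(f"Q5 vs Q1: {rates['Q5'] / rates['Q1']:.2f}")  # ~1.34
```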
Wow! What are we to make of such results, if anything, especially when they conflict wildly with several other studies? Clearly, by this study, there is a U-shaped curve indicating the highest risk in the middle, lower risk at both extremes, and the absolute lowest risk at the lowest levels. The folks with the lowest blood levels had 406 cases of aggressive PC per 100,000 person-years, the highest levels had 544, and the middle had a whopping 780.
I assume I do not know enough about how to read such studies. That must be why I don't quite get the authors' conclusion, which is: “The findings of this large prospective study do not support the hypothesis that vitamin D is associated with decreased risk of prostate cancer; indeed, higher circulating 25(OH)D concentrations may be associated with increased risk of aggressive disease.” It seems to me that is true when comparing the very lowest levels to all the others, but by far the highest risk they found was in the middle levels, and the very highest levels had barely higher risk than the very lowest. Where am I going wrong, and why didn't they state it this way?
Here is a possible reason for these wild results, in their own words: “Limitations: Only a single baseline vitamin D measurement was available. Whether vitamin D levels could affect prostate-specific antigen levels in some cancers, causing a diagnosis bias, is not known. As with all epidemiology studies, unmeasured confounders could account for the results.” Didn't you (or someone else?) tell us once that epidemiology studies were the most unreliable?
Another impressive thing in this study is that the mean vitamin D intake was only 416 IU, which is not very high at all. And the mean blood level was 59 nmol/L (about 23.6 ng/mL), not very low, but still below the level many studies label as deficient (<25 ng/mL). Still, since so many folks report not being able to get above 25 even with 1000-2000 IU per day of supplementation, I am surprised the mean level is even this high (59 nmol/L, or about 24 ng/mL) on a mean of a mere ~400 IU per day. Indeed, 200 of the ~749 participants took less than 200 IU per day, over half (440) took <400 IU per day, and only 49 took >1000.
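For reference, here is the nmol/L to ng/mL conversion I am using (the ~2.5 divisor is the standard factor for 25(OH)D; the 140 nmol/L figure is just my back-conversion of the ~56 ng/mL outlier level mentioned below, not a number from the paper):

```python
# Convert serum 25(OH)D from nmol/L to ng/mL.
# Standard factor for 25(OH)D: 1 ng/mL = 2.496 nmol/L (roughly 2.5).
def nmol_l_to_ng_ml(nmol_l: float) -> float:
    return nmol_l / 2.496

print(f"{nmol_l_to_ng_ml(59):.1f} ng/mL")   # study mean: ~23.6 ng/mL
print(f"{nmol_l_to_ng_ml(140):.1f} ng/mL")  # ~56 ng/mL, the outlier level below
```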
Another thing, not so surprising considering the relatively low mean intake, is that the two outliers with the very highest levels in the study were only about 56 ng/mL when converted, which is not really all that high, and a level that shows good results in many studies.
OK, I'm sure an answer to these surprising results lies somewhere in the details of the study, but I am tired out from searching through it for clues. Looking at their raw numbers, I am just not seeing that the very highest levels are any riskier than the lower ones, except when compared only to the very lowest, and then not by much. And indeed, higher = less risky than the middle. I am not seeing that any reasonable amount of supplementation is a significant risk to us, even before considering other studies. Indeed, few people in this study were taking much Vit D at all. Where am I going wrong?
Bill