Still, the numbers are too small for that ever to add up to one in five.
The numbers I cited from the BJS report weren't even limited to completed rapes; they included the bureau's estimates for threats and attempted rapes as well.
They also found (surprise, surprise) that the two demographic groups with the highest rates were the 12-to-17 and 17-to-34 age groups. Once that window closes, the odds drop to near zero, so the 15 years of BJS data, taken together, cover the at-risk window fairly completely. When you sum their counts across those 15 years and divide the number of women by that total, you get one in 31. That's one in 31 too many, but it's nowhere near the one-in-five figure being cited.
The CDC study has another problem: over two thirds of its rape numbers come from the "drunk or drugged" category, and the survey questions are phrased so loosely that this category gets overreported. A respondent will tell the researcher they had sexual intercourse while drunk, and the researcher will record that as a rape, even if the respondent considers it consensual or it doesn't meet the definition of rape in a legal sense. The BJS, on the other hand, categorizes assaults by the legal definition of rape, and its surveys first ask respondents whether they consider themselves the victim of a crime.
This is an important distinction. When you start throwing out ridiculously high or inflated numbers and people don't believe them (and I don't think many outside the extreme feminist community at Oberlin College actually believe that number), they start tuning those numbers out. That's bad: if publishing the high numbers backfires, less attention will be paid to the many very real victims.