Originally posted by Fidd:
Odd. 617 achieved an *average* error of 94 yards dropping 12,000 lb blast bombs from 10,000 ft, which would suggest a drift of 600 ft, or 200 yards! (according to USAF figures)
Maybe the USAF just never expected to hit anything accurately?
Fidd
(tongue in cheek)
figures taken from "The Dam Busters" by Paul Brickhill.
The figure I quoted from the Discovery Channel (up to 6 ft of drift for every 1,000 ft of altitude for dumb bombs) was from a show about modern laser-guided bombs. Not that that matters as much as the fact that they were talking about completely ideal circumstances: bombsight accuracy, among other things, is not a factor in this formula.
Think of it this way: you drop a bomb from 1,000 feet at a given speed. In a perfect setting where all atmospheric factors stay the same (no fluctuation in air pressure, no turbulence, no wind, etc.) and you release the bomb at exactly the same point in space with exactly the same velocity, repeating the experiment always hits the same spot.
Now take this perfect situation and add in the factor of irregularities in the atmosphere affecting the bomb on its way down: you can expect UP TO 6' of deviation from a 1,000 ft drop altitude. Of course, in a hurricane it could be much more exaggerated, but I don't think the modern USAF drops weapons much in those conditions.
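If it helps, here's a quick back-of-the-envelope sketch in Python of what that rule of thumb predicts by itself. Only the "up to 6 ft per 1,000 ft" figure comes from the show; the function name and example altitudes are just my own illustration:

[code]
# Sketch of the rule of thumb: up to 6 ft of lateral drift from
# atmospheric irregularities per 1,000 ft of drop altitude.
# Only the 6 ft / 1,000 ft figure is from the show; the rest is
# my own illustration.

DRIFT_FT_PER_1000_FT = 6.0  # worst case, ideal conditions, dumb bomb

def max_atmospheric_drift_ft(drop_altitude_ft):
    """Worst-case lateral drift (ft) from atmospheric irregularities alone."""
    return DRIFT_FT_PER_1000_FT * drop_altitude_ft / 1000.0

for alt_ft in (1000, 10000, 20000):
    drift_ft = max_atmospheric_drift_ft(alt_ft)
    print("%6d ft drop: up to %3.0f ft (%2.0f yd) of drift"
          % (alt_ft, drift_ft, drift_ft / 3.0))
[/code]

Note that even from 10,000 ft, that works out to only about 60 ft (20 yards) of drift from the atmosphere alone, a small slice of the 94-yard average error you mention.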
Add in the multitude of factors presented by the case you cite, and of course the dispersion will be magnified far beyond that.
Just wanted to make clear that the 6 ft max per 1,000 ft air force figure was only describing one variable in a complicated situation.

Hope that made sense! God knows I'm no physics professor, lol.
Belt