The ten biggest nuclear weapons tests (from 50 megatons down to 10.4 megatons yield), and Project Orion, the nuclear-explosion-powered spacecraft
Above: on 30 October 1961, the 50 Mt RDS-220 or Tsar Bomba, the world's highest yield nuclear weapon test, 8.0 m long, 2.1 m diameter and weighing 26 tons, was dropped in parachute-retarded fall from a Tu-95 Bear bomber flying at an altitude of 10.5 km. (These precise bomb size details were published in: V. N. Mikhailov, et al., USSR Nuclear Tests, vol. 2, Technology of Nuclear Tests, Begell-Atom, 1999, pp. 82–84.) It exploded at an altitude of 3.9 km some 188 seconds after being dropped, at 73.85 degrees north, 54.50 degrees east.
In the film you can see the fireball expanding spherically until the shock wave bounces off the ground and back up into the fireball, flattening the bottom of the fireball and pushing it upwards ahead of the usual slow development of buoyant toroidal rise. Windows were partially broken (cracked) by the high altitude refraction and focussing of the shock wave out to distances of 900 km. When the detonation was triggered at the correct altitude, the white-painted Tu-95 and accompanying Tu-16 photographic aircraft were both 45 kilometres from the bomb.
The film stills above show the laboratory work of putting together TSAR BOMBA. The boosted fission primary sits in the nose: fission of the material in the primary gave a total yield of about 1 megaton, with the other 49 megatons coming from clean nuclear fusion. The lithium deuteride fusion capsules were encased in lead instead of natural uranium, to keep the fission yield low. X-rays were channeled from the boosted fission primary to the fusion capsules inside the thick casing. You can see how thick the casing is in the photos of the nose cone of the bomb being fitted over the boosted fission primary.
The usual claims that retinal burns and third-degree skin burns would occur out to vast distances from a 50 or 100 Mt bomb are a massive exaggeration. They neglect both the attenuation of thermal radiation by the atmosphere and the fact that the very long thermal pulse from a high-yield weapon is very ineffective at causing burns: you have many seconds to take evasive action, and even if you take none, far more thermal radiation is needed to cause a given burn than with a low-yield weapon, because spreading the energy over a longer time reduces the temperature rise in the skin, allowing cooling mechanisms such as reradiation and blood circulation to deeper tissues to mitigate the damage.
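As a rough check on the pulse-duration point, the standard Glasstone and Dolan scaling law for the time to the second thermal-power maximum of an air burst, t_max ≈ 0.032 √W seconds (W in kilotons), can be sketched in a few lines. The "effective pulse ≈ 10 × t_max" rule of thumb is an approximation from the same source, used here only for illustration.

```python
# Approximate time to the second thermal-power maximum of an air burst,
# t_max ~ 0.032 * W^0.5 seconds (W in kilotons), from Glasstone & Dolan,
# The Effects of Nuclear Weapons. The effective thermal pulse lasts
# roughly 10 * t_max, so high yields spread the heat over many seconds,
# allowing skin cooling and evasive action.

def t_max_seconds(yield_kt):
    """Time to the second thermal maximum for an air burst (seconds)."""
    return 0.032 * yield_kt ** 0.5

for w_kt in (1, 1_000, 50_000):  # 1 kt, 1 Mt, 50 Mt (Tsar Bomba)
    print(f"{w_kt:>7} kt: t_max = {t_max_seconds(w_kt):5.2f} s, "
          f"effective pulse ~ {10 * t_max_seconds(w_kt):5.1f} s")
```

For a 50 Mt burst this gives a thermal pulse spread over more than a minute, compared with a fraction of a second at 1 kt, which is why the same total thermal energy is far less effective at burning skin.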
(1) 50 Mt*, 30 Oct 1961 USSR clean 2-3 % fission air burst at 3,900 m altitude over Novaya Zemlya. (100 Mt design with U-238 pusher replaced by lead to reduce fission yield from 50% to 2.5% and total yield from 100 Mt to 50 Mt.)
(2) 24.2 Mt, 24 Dec 1962 USSR air burst at 3,750 m altitude dropped from Tu-95 (carried externally) over Novaya Zemlya. (50 Mt design with U-238 pusher replaced by lead to reduce fission yield from 50% to 2.5% and total yield from 50 Mt to 24 Mt.)
(3) 21.1 Mt, 5 Aug 1962 USSR air burst at 3,600 m altitude over Novaya Zemlya
(4) 20 Mt, 27 Sep 1962 USSR air burst at 3,900 m altitude over Novaya Zemlya
(5) 19.1 Mt, 25 Sep 1962 USSR air burst at 4,090 m altitude over Novaya Zemlya
(6) 14.8 Mt, 28 Feb 1954 US Castle-Bravo 67 % fission reef surface burst on the northern reef at Bikini Atoll
(7) 13.5 Mt, 4 May 1954 US Castle-Yankee 52 % fission water surface burst on barge over Bikini Lagoon
(8) 12.5 Mt, 23 Oct 1961 USSR air burst at 3,500 m altitude over Novaya Zemlya
(9) 11 Mt, 26 Mar 1954 US Castle-Romeo 64 % fission water surface burst on barge in Bikini Lagoon
(10) 10.4 Mt, 31 Oct 1952 US Ivy-Mike Elugelab Island surface burst, Eniwetok Atoll**. Mike was an 82 ton liquid deuterium device, the first full scale test of the Teller-Ulam staged radiation implosion principle, using a boosted fission primary bomb and a physically separate fusion stage (a Dewar vacuum flask filled with deuterium and surrounded by a 5-ton natural uranium pusher). The outer steel casing was 2 metres wide and 6.1 metres high, with walls 30 cm thick. The inside surface of the casing was lined with lead and polyethylene, forming a radiation conduit from the primary to the secondary.
Above: IVY-MIKE fireball to cloud transition photo sequence (30 seconds, 2 minutes, 10 minutes and 20 minutes).
Above: ‘Operation Ivy’, produced by the U.S. Air Force Lookout Mountain Laboratory, Hollywood, California for the U.S. Armed Forces Special Weapons Project, and presented very impressively by Western cowboy film star Reed Hadley: ‘You have a grand stand seat here to one of the most momentous events in the history of science. In less than a minute you will see the most powerful explosion ever witnessed by human eyes. The blast will come up on the horizon just about there, and this is the significance of the moment: this is the first full-scale test of a hydrogen device. If the reaction goes, we’re in the thermonuclear era. For the sake of all of us, and for the sake of our country, I know you’ll join me in wishing this expedition well.’
* Close-in Russian data gave a yield of 50 Mt. Long-range Western micro-barographs suggested 56-58 Mt based on the peak overpressure and duration of the distant blast wave, which by then had become a gravity wave-type disturbance in the atmosphere; but no burst-altitude data were available, and the close-in data are more accurate.
Information from Wikipedia relating to clean weapons and their eventual deployment for low yield tactical 'neutron bombs': In 1956, President Eisenhower announced the testing of a 95% 'clean' (2-stage) fusion weapon, later identified to have been the 11 July Navajo test at Bikini Atoll during Operation Redwing. This weapon had a yield of 4.5 megatons. Previous 'dirty' weapons had fission proportions of 50-77%, due to the use of uranium-238 as a 'pusher' around the lithium deuteride (secondary) stage. (The fusion neutrons have energies of up to 14.1 MeV, well exceeding the 1.1 MeV 'fission threshold' for U-238.) The 1956 'clean' tests used a lead pusher, while in 1958 a tungsten carbide pusher was employed. Hans A. Bethe supported clean nuclear weapons in 1958 as Chairman of a Presidential science advisory group on nuclear testing:
'... certain hard targets require ground bursts, such as airfield runways if it is desired to make a crater, railroad yards if severe destruction of tracks is to be accomplished... The use of clean weapons in strategic situations may be indicated in order to protect the local population.' (Dr Hans Bethe, Chairman of the Working Group, 27 March 1958 Top Secret - Restricted Data Report to the NSC Ad Hoc Working Group on the Technical Feasibility of a Cessation of Nuclear Testing, page 9.)
In consequence of Bethe's recommendations, on 12 July 1958, the Hardtack-Poplar shot on a barge in the lagoon yielded 9.3 megatons, of which only 4.8% was fission: at 95.2% clean, it was the clean Mk-41C warhead. In 1958 Cohen investigated a low-yield 'clean' nuclear weapon and found that the 'clean' bomb case thickness scales as the cube root of yield, so a larger percentage of neutrons escapes from a small detonation, owing to the thinner case required to reflect back X-rays during secondary (fusion) stage ignition. For example, a 1-kiloton bomb needs a case only 1/10th the thickness of that for 1 megaton. This means that although most of the neutrons are absorbed by the outer casing in a 1-megaton bomb, in a 1-kiloton bomb they mostly escape. A neutron bomb is only feasible if the yield is high enough for efficient fusion stage ignition, yet low enough that the case does not absorb too many neutrons. Neutron bombs therefore have a yield range of 1-10 kilotons, with the fission proportion varying from 50% at 1 kiloton to 25% at 10 kilotons (all of it from the primary stage). The neutron output per kiloton is then approximately 10-15 times greater than for a pure fission implosion weapon or a standard (high yield) strategic warhead like the W87 or W88.
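Cohen's cube-root scaling quoted above is easy to verify numerically; the function below just encodes the stated scaling law and reproduces the 1 kt versus 1 Mt thickness ratio from the text.

```python
# Cohen's scaling: 'clean' bomb case thickness scales as the cube root of
# yield, so a 1 kt weapon needs a case ~1/10th as thick as a 1 Mt weapon,
# letting most of the 14.1 MeV fusion neutrons escape at low yields.

def case_thickness_ratio(yield_a_kt, yield_b_kt):
    """Ratio of required case thicknesses for two yields (cube-root scaling)."""
    return (yield_a_kt / yield_b_kt) ** (1.0 / 3.0)

ratio = case_thickness_ratio(1, 1_000)  # 1 kt vs 1 Mt
print(f"1 kt case / 1 Mt case = {ratio:.2f}")  # 0.10, i.e. a tenth of the thickness
```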
In 1981, the Christian Science Monitor reported that there "are 19,500 tanks in the Soviet-controlled forces of the Warsaw Pact aimed at Western Europe. Of these, 12,500 are Soviet tanks in Soviet units. NATO has 7,000 tanks on its side facing the 19,500." (Joseph C. Harsch, 'Neutron Bomb: Why It Worries The Russians,' Christian Science Monitor, August 14, 1981, p. 1.)
Cohen's neutron bomb is not mentioned in the unclassified manual by Glasstone and Dolan, The Effects of Nuclear Weapons 1957-77, but is included as an 'enhanced neutron weapon' in chapter 5 of the declassified (formerly secret) manual edited by Philip J. Dolan, Capabilities of Nuclear Weapons, U.S. Department of Defense, effects manual DNA-EM-1, updated 1981 (U.S. Freedom of Information Act).
According to that manual, no fallout effects would occur from the use of a neutron bomb (provided it was not used in a thunderstorm), because the combination of a 500 m burst altitude and low yield prevents fallout while also limiting thermal and blast effects. The reduction in damage outside the target area is a major advantage of such a weapon in deterring massed tank invasions: an aggressor would be forced to disperse tanks, making them easier to destroy with simple hand-held anti-tank missile launchers.
** http://www.johnstonsarchive.net/nuclear/tests/multimegtests.html states this test had a fission yield of 60 % whereas http://nuclearweaponarchive.org/Usa/Tests/Ivy.html states it was 77 %. I can add some comments to this issue. In the published U.S. Congressional Hearings of June 22-26, 1959, tables of data were presented showing the fission yields from all American and British tests, which the Americans had monitored (for Russian tests, the tables did not present fission yields but merely assumed 50 % of the total yield was from fission). Those tables showed that America detonated 15 Mt of fission in land surface bursts from 1952-54, i.e., the fission yield of Ivy-Mike and Castle-Bravo together was 15 Mt. Plenty of reports show that the fission yield of Castle-Bravo was known by 1956 to be 10 Mt, hence you can deduce a fission yield for Ivy-Mike of 5 Mt, or 48 %. However, other declassified data, for example the measured upwind fallout pattern for Ivy-Mike, suggests that the 77 % fission yield may be correct.
http://nuclearweaponarchive.org/Russia/TsarBomba.html wrongly states that the 57 Mt yield estimate was based on Western fallout analyses, which is false (although fallout analysis did imply a 2-3 % fission yield, i.e. the bomb was 97-98% clean). The total yield 57 Mt estimate came from rough measurements using micro-barographs to see the long range pressure and its duration, which are not as accurate as the Russian close-in blast data which indicated a yield of 50 Mt.
Above: TSAR BOMBA replica, photo credit Wikipedia.
Above: American toroidal fireball nuclear test films.
Above: British toroidal fireball nuclear test films (Christmas and Malden Islands, Pacific).
Above: Professor Freeman Dyson and Sir Arthur C. Clarke supporting the nuclear bomb powered spacecraft, Project Orion, the only economically practical way for human beings to holiday on Mars. (Excerpt from BBC's To Mars by A-Bomb (2003), with footage of the tests and comments by Arthur Clarke and Freeman Dyson.) The Orion spacecraft has a large, thick steel pusher plate connected via hydraulic dampers to the crew accommodation. A series of nuclear explosions is detonated below the pusher plate, which shields the crew from nuclear radiation and recoils upwards when ablated by X-rays. The impulses from the nuclear explosions efficiently accelerate the spacecraft to high speed. It would have been launched to Mars from the Nevada nuclear test site, using relatively clean low fission yield detonations for the first few minutes (to minimise the EMP, air blast and fallout effects on Earth), and then larger detonations when a safe distance away. Project Orion was headed by Los Alamos nuclear weapons designer Dr Theodore Taylor, who developed many nuclear weapons (the Scorpion, Wasp, Bee and Hornet Nevada tests, and the 500 kt pure fission implosion bomb tested as the IVY-KING shot in 1952). (The idea of utilizing explosions for work is not as crazy as it sounds, when you remember that the internal combustion engine doesn't 'burn' gasoline: it explodes it in a controlled way within the cylinder after mixing fuel with air and compressing the resulting mixture, and the engine converts the impulsive force of the explosion into useful work done against the piston to produce motion. Maybe a massive version of such a piston-in-cylinder engine could utilize recoil forces caused by thermonuclear explosions, which are more cost-efficient for releasing energy than operating a nuclear reactor to generate steam to power turbines.)
There were several other nuclear rocket systems as alternatives to Orion, although Orion is by far the best. One alternative was Project Thunderwell, the steam-accelerated Jules Verne capsule. The idea was suggested by the 10 cm thick, 1.2 m diameter welded steel cover that was blown off the top of the 152 m shaft of the 0.3 kt Plumbbob-Pascal B underground Nevada test on 27 August 1957, at an estimated speed of at least six times Earth's escape velocity. In that test, a 1.5 m thick, 2 ton concrete plug immediately over the bomb was pushed up the shaft by the detonation, knocking the steel lid upward. This was a preliminary experiment by Dr Robert Brownlee, which ultimately aimed to launch spacecraft using the steam pressure from deep water-filled shafts with a nuclear explosion at the bottom: an improvement on Jules Verne's cannon-fired projectile in De la Terre à la Lune (1865), since steam pressure would give a more survivable, gentler acceleration than Verne's direct impulse from an explosion. Some 90% of the radioactivity would be trapped underground. Like Project Orion, Project Thunderwell was cancelled for pseudoscientific (political) reasons after the nuclear test ban treaty was signed.
Another nuclear rocket system was simply to use a bare, uncluttered nuclear reactor core to directly heat hydrogen gas to high temperature and then expel it from an exhaust nozzle in lieu of burning it with oxygen. This was NASA's Kiwi rocket, which was extensively tested (producing a lot of radioactivity in the atmosphere) but, you guessed it, never deployed! The advantage of it is that you need to carry less fuel, because you're not burning hydrogen, you're just ejecting it to get a recoil by Newton's 3rd law of motion, and by ejecting it at high speed (fast hydrogen molecules) due to nuclear reactor heating, it can be more efficient than a conventional rocket engine.
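The efficiency advantage of the Kiwi approach can be sketched with the ideal-nozzle formula for exhaust velocity, v_e ≈ √(2γ/(γ−1) · (R/M) · T): hydrogen's tiny molar mass is what makes nuclear heating beat chemical combustion. The chamber temperatures below are illustrative assumptions for the sketch, not measured Kiwi figures.

```python
import math

# Ideal fully-expanded nozzle exhaust velocity: shows why reactor-heated
# pure hydrogen outperforms the heavier steam produced by burning H2 with
# O2, even at a lower chamber temperature. Temperatures are assumptions.

R = 8.314  # universal gas constant, J/(mol K)

def exhaust_velocity(T_kelvin, molar_mass_kg_per_mol, gamma=1.4):
    """Ideal nozzle exhaust velocity in m/s."""
    return math.sqrt(2 * gamma / (gamma - 1) * (R / molar_mass_kg_per_mol) * T_kelvin)

v_h2 = exhaust_velocity(2500, 0.002016)   # pure H2, nuclear heated (assumed 2500 K)
v_h2o = exhaust_velocity(3000, 0.018015)  # steam from H2/O2 combustion (assumed 3000 K)
print(f"H2  at 2500 K: {v_h2:.0f} m/s (Isp ~ {v_h2 / 9.81:.0f} s)")
print(f"H2O at 3000 K: {v_h2o:.0f} m/s (Isp ~ {v_h2o / 9.81:.0f} s)")
```

Under these assumptions the nuclear-heated hydrogen exhaust is more than twice as fast as the chemical steam exhaust, which is the "carry less fuel" advantage described above.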
There should be a note here about how unnatural radioactive pollution is (not) in space: the earth's atmosphere is a radiation shield equivalent to being protected behind a layer of water 10 metres thick. This reduces the cosmic background radiation to about 1/100th of what it would be without the earth's atmosphere. Away from the largely uninhabited poles, the Earth's magnetic field also protects us against charged cosmic radiations, which are deflected and end up spiralling around the magnetic field lines at high altitude, in the Van Allen trapped radiation belts. On the Moon, for example, there is no atmosphere or significant magnetic field, so the natural background radiation exposure rate at solar minimum is 1 milliRoentgen per hour (about 10 microSieverts/hour), some 100 times that on the Earth (0.010 milliRoentgen per hour, or about 0.10 microSieverts/hour). The Apollo astronauts visiting the Moon wore dosimeters and received an average of 275 milliRoentgens (about 2.75 milliSieverts) of radiation, well over a year's exposure to natural background at sea level, over just 19.5 days. It is a lot more than that during a solar flare, which is one of the hazards astronauts must avoid (micrometeorites are another concern in a soft spacesuit).
The higher up you are above sea level, the less of the atmosphere there is between you and space, so the less shielding you have to protect you from the intense cosmic space radiations (emitted by the thermonuclear reactors we call 'stars', as well as distant supernovae explosions). At sea level, the air above you constitutes a radiation shield of 10 tons per square metre, the equivalent of having a 10 metre thick water shield between you and outer space. As you go up a mountain or up in an aircraft, the amount of atmosphere between you and space decreases, so radiation levels increase with altitude. The normal background radiation exposure rate shoots up by a factor of 20, from 0.010 to 0.20 milliRoentgens per hour, when an airplane ascends from sea level to 36,000 feet cruising altitude. (The now obsolete British Concorde supersonic transport carried radiation-monitoring equipment so that it could drop to lower-altitude flight routes if excessive cosmic radiation due to solar storms was detected.) Flight aircrew get more radiation exposure than many nuclear industry workers at nuclear power plants. Residents of the high altitude city of Denver get 100 milliRoentgens (about 1 milliSievert) more annual exposure than a resident of Washington, D.C., but the mainstream anti-radiation cranks don't campaign for the city to be shut to spare kids the radiation exposure, for mountain climbing to be banned, etc.!
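The '10 tons per square metre' figure is easy to check from first principles: the column mass of the atmosphere above sea level is just the surface pressure divided by g, and the equivalent water depth follows from water's density.

```python
# Sanity check of the atmospheric shield figures quoted above: column
# mass of air above sea level, and its water-depth equivalent.

P0 = 101_325.0      # standard sea-level pressure, Pa
g = 9.81            # gravitational acceleration, m/s^2
rho_water = 1000.0  # density of water, kg/m^3

column_mass = P0 / g                   # kg of air per square metre
water_depth = column_mass / rho_water  # metres of water with equal mass
print(f"{column_mass / 1000:.1f} tonnes/m^2, equivalent to {water_depth:.1f} m of water")
```

This gives about 10.3 tonnes per square metre and 10.3 m of water, in line with the figures in the text.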
The point I'm making here, for the Green Warriors, is that a nuclear-powered rocket won't be a horrible unnatural thing polluting nice pristine non-radioactive 'clean' outer space with horrible human produced radioactive waste: the universe is full of nuclear reactors (called stars purely for reasons of political expediency) and unending nuclear explosions (called supernovae purely for reasons of political expediency). Live with it!
Above: Carl Sagan talking about Project Orion, which could be built today with existing technology if there were not insane groupthink about nuclear test effects. Dr Theodore Taylor gives the full technical details in John McPhee's book The Curve of Binding Energy, 1974. Cosmic radiation is 100 times higher in space than on the Earth's surface. The EMP and fallout effects could be suppressed by clean weapon designs with thick casings to absorb the prompt gamma radiation (see earlier posts on this blog).
Summary of Project Orion from Dr Taylor:
Project Orion began in 1958 when nuclear weapons designer Dr Theodore B. Taylor moved to General Atomic to design a nuclear bomb powered spaceship, sponsored by the U.S. Advanced Research Projects Agency. It would travel directly (in a straight line!) and quickly to Mars using 2,000 nuclear bombs, carrying 150 people and attaining a top speed of 45 km/second. The travel time would be 3 months for the minimum Mars distance of 56 million km and 6 months for the maximum Mars-Earth distance of 101 million km. In 1959 the stability of the system was demonstrated in a scaled-down flight test (using conventional explosives), which impressed Dr von Braun so much that he supported Project Orion after seeing the demonstration film.
Above: blueprints for the nuclear rockets from R. S. Cooper, "Nuclear propulsion for space vehicles", Annual Review of Nuclear Science, v18, 1968, pp. 203-228. To resist the high temperatures, metals like tungsten (which has a very high melting point) are preferred to steel for the surface of the pusher. A thin layer of graphite droplets can be sprayed onto the pusher plate by a retractable jet nozzle located within the central hole in the pusher plate. More advanced designs use a concave pusher with the bomb detonated at its focus, to utilize a larger fraction of the case shock and X-ray ablative recoil energy. Project Orion was first proposed by Dr Stanislaw Ulam, of Teller-Ulam fame, and developed by Dr Theodore Taylor at General Atomic.
Dr Taylor says in The Curve of Binding Energy (by McPhee) that the idea stemmed from the 15.2 kt REDWING-INCA nuclear test on June 26, 1956, where 30 cm diameter carbon-coated steel balls were placed 9 metres from the bomb by researcher Lew Allen, and were undamaged with only a loss of 0.1 mm of surface graphite! This gave rise to the design of the 75 ton, 41 metre diameter carbon-coated steel base pusher plate in the 76 metres high Project Orion spacecraft, where the base pusher plate is connected by hydraulic shock absorbers to the crew compartment. The steel plate acts as a radiation shield as well as ablative recoil mechanism to get propulsion: after each bomb was fired, oil would be sprayed on the plate to give it a carbon coating. The dynamics of X-ray ablation are well established in nuclear weapons design because this mechanism is what is used to cause the fusion stage in a bomb to explode: X-rays from the fission stage are channelled to the fusion stage, ablating the surface which causes a compression by recoil (Newton's 3rd law).
The nuclear test fireball experiments of Project 5.4 during Operation TEAPOT in Nevada, 1955, Project 5.9 of Operation REDWING at Bikini Atoll and Eniwetok Atoll in 1956, and then Project 8.3b of Operation PLUMBBOB in Nevada, 1957 proved that objects like steel spheres in the fireball only suffered a tiny amount of surface scarring because the thermal pulse just ablates a microscopic thickness of the surface, causing a recoil force. Actually, this kind of thin layer ablation had first been noted back on the TRINITY test of July 16, 1945:
‘The measured total radiation at [9.1-km] from the centre was 0.29 calories/cm2 ... Examination of the specimen exposed at [975 m] shows ... the charred layer does not appear to be thicker than 1/10 millimetre.... scorching of the fir lumber used to support signal wires extended out to about [1.9 km] ... the risk of fire due to the radiation ... is likely to be much less than the risk of fire from causes existing in the buildings at the time of explosion.’
– W. G. Marley and F. Reines, July 16th Nuclear Explosion: Incendiary Effects of Radiation, Los Alamos report LA-364, October 1945, originally Secret, pp. 5-6.
Dr Taylor explained that the first nuclear bomb to start the ascent would only need to be 0.1 kt, the next a second later would be 0.2 kt, and so on up to bomb number 50, which would be 20 kt, by which time a total of 200 kt would have been detonated, and the spacecraft would then be in space without having caused any significant EMP or fallout effects on the Earth compared to natural background radiation. There would be no radioactive trail left in space behind such a nuclear pulse rocket because the debris expands at a rate faster than the escape velocity of the solar system. The pusher plate would not be severely heated or damaged because of the 10 nanosecond duration of the ablative X-ray impulse from a nuclear explosion 60 metres away, which only ablates the surface layer (such as the layer of carbon-rich grease sprayed on the pusher plate automatically after each detonation). Remember that in an automobile engine, the temperature attained by the exploding gasoline and air mixture is much higher than the melting point of the steel pistons and cylinders, but the latter don't melt because the duration of each explosion is too brief to heat the material to that temperature; the residual heat after expansion doesn't penetrate and destroy the piston and cylinder, but rapidly cools and ends up as warm exhaust gas!
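Taylor's quoted launch schedule (0.1 kt for the first bomb, 20 kt for bomb 50, roughly 200 kt in total) is consistent with a geometric ramp in yield; the common ratio below is inferred from those endpoints, not a figure Taylor gives.

```python
# Reconstruct a geometric yield ramp from the quoted endpoints: first
# bomb 0.1 kt, bomb 50 at 20 kt, one bomb per second. The total comes
# out near the ~200 kt figure quoted from Taylor.

n_bombs = 50
first_kt, last_kt = 0.1, 20.0
ratio = (last_kt / first_kt) ** (1.0 / (n_bombs - 1))  # ~1.11 per bomb

yields = [first_kt * ratio ** i for i in range(n_bombs)]
total = sum(yields)
print(f"ratio per bomb = {ratio:.3f}, bomb 50 = {yields[-1]:.1f} kt, total = {total:.0f} kt")
```

The sum comes to roughly 194 kt, matching the "total of 200 kt" figure to within the rounding of the quoted endpoints.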
“Observations of the remains of towers and shielding material after detonation at several ground zeros indicate that large masses of material are not vaporized. Observations of the residue of the Smoky tower [44 kt bomb atop a 700 foot high steel tower] indicated that a very significant portion of that tower remained, including the upper 200 feet of steel. Another example similar to Shot Smoky was Shot Apple II [29 kt atop a 500 ft steel tower], Teapot Series. Even though the total yield of Shot Apple II was about [29 kt], the floor of the cab [housing the nuclear bomb itself, at the top of the tower] and the main tower support columns remained intact. The results of the Shot Fizeau [11 kt atop a 500 ft steel tower] tower melt studies (W. K. Dolen and A. D. Thornborough, Fizeau Tower Melt Studies, Sandia report SC-4185, 1958, Secret) show that about 85 percent of tower material was accounted for after the detonation and that only the upper 50 feet of tower was vaporized. No melting occurred beyond 175 feet from the top of the tower although the fireball theoretically engulfed more than 400 feet of the tower.”
- Dr Kermit H. Larson, et al., Distribution, Characteristics, and Biotic Availability of Fallout, Operation Plumbbob, weapon test report WT-1488, ADA077509, July 1966, page 59.
J. E. Kester and R. B. Ferguson report in Operation Teapot, Project 5.4, Evaluation of Fireball Lethality Using Basic Missile Structures, WT-1134 (originally Secret – Restricted Data), AD0340137, that in the 23 kt Teapot-Met shot (Nevada, 15 April 1955, 400 ft steel bomb tower) the steel tower, although blown down, was not vaporized, and much of it survived despite having been engulfed by the fireball itself, as stated on page 30:
“... nearly 225 feet of the main support members of the shot tower were still intact and laid out radially from their original position.”
Page 116 of WT-1134 states that after the 2 kt Moth shot atop a 300 foot triangular tower on 22 February 1955: “The three tower legs were laid out approximately radially from their pre-shot positions. The longest tower leg found was about 200 ft long. The other two legs appeared to be about 150 ft long. All three guy cables were still attached ... A few large pieces of the tower, about 20 to 30-ft long, were strewn to ranges of about 200 feet.” It adds that after the 7 kt Tesla shot atop a 300 ft square tower on 1 March 1955: “the four tower legs ... were laid out radially from their original position ... The tower legs remained intact to lengths of about 125 feet. All four guy cables were still attached ...” The 43 kt Turk nuclear test was fired atop a 500 ft square tower, leaving 100 ft lengths of tower on the ground (page 118). The 8 kt Bee shot atop a 500 ft tower failed to even knock down most of the tower (pages 120-1): “A large portion of this tower was still standing after the shot. ... It is estimated that at least 150 feet of the tower was essentially undamaged and standing erect with an additional 50 to 75 feet of the tower slightly melted and drooped over at the top.” The 14 kt Apple 1 shot atop a 500 ft square tower results (page 121): “The main support members of the shot tower still remained to lengths of about 150 feet with the top 25 to 50 feet being crushed and split ... Some of the legs remained attached to the base.” The 23 kt Met shot was atop a 400 ft square tower (pages 123-4): “About 225 feet of the tower legs were still intact with the top 25 to 50 feet being crushed, split and slightly melted ....”
Above: color photo shows the lower 200 feet surviving from the 300 ft steel tower of the 0.2 kt Ruth nuclear test in Nevada on 31 March 1953. The black and white photographs are from the 23 kt Teapot-Met nuclear explosion (Nevada, 15 April 1955) ablation tests by J. E. Kester and R. B. Ferguson, Operation Teapot, Project 5.4, Evaluation of Fireball Lethality Using Basic Missile Structures, WT-1134 (Secret – Restricted Data), AD0340137, which showed that at just 80 feet only the outer 0.4 inch of steel balls was ablated by the fireball.
The error in the popular myth that everything is vaporized in the fireball is that the cooling rate of the fireball is so great that there is literally not enough time for the heat to penetrate more than a thin surface layer before the temperature drops below melting point. Good heat conductors like steel are protected by ablation: a very thin surface layer of the material is vaporized, protecting the underlying material, just as occurs with thermal radiation striking wooden houses (Glasstone and Dolan, The Effects of Nuclear Weapons).
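The depth to which heat can soak into steel during a brief pulse goes roughly as √(diffusivity × time); the diffusivity below is a standard handbook value for steel, and the exposure times are illustrative. This makes quantitative the point that a 10 nanosecond X-ray flash, or even a one-second fireball engulfment, only affects a thin surface layer.

```python
import math

# Characteristic thermal diffusion depth in steel, delta ~ sqrt(alpha * t):
# even a 1 second exposure only soaks heat a few millimetres into steel,
# and a 10 ns X-ray pulse affects under a micron of the surface.

ALPHA_STEEL = 1.2e-5  # thermal diffusivity of steel, m^2/s (handbook value)

def penetration_depth_mm(t_seconds):
    """Approximate thermal diffusion depth in steel, in millimetres."""
    return math.sqrt(ALPHA_STEEL * t_seconds) * 1000

for t in (1e-8, 1e-3, 1.0):  # 10 ns X-ray pulse, 1 ms, 1 s
    print(f"t = {t:8.0e} s: depth ~ {penetration_depth_mm(t):.4f} mm")
```

This is why the shot-tower legs and the Teapot steel balls survived fireball engulfment with only surface scarring: the fireball cools before the heat can reach the bulk of the metal.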
References:
J. C. Nance, 'Nuclear Pulse Propulsion', IEEE Transactions on Nuclear Science, February 1965, p. 177.
T. W. Reynolds, 'Effective Specific Impulse of External Nuclear Pulse Propulsion Systems', Journal of Spacecraft and Rockets, October 1973, p. 629.
----------
‘The President put his name on the plaque Armstrong and Aldrin left on the moon and he telephoned them while they were there, but he cut America’s space budget to the smallest total since John Glenn orbited the Earth. The Vice-President says on to Mars by 1985, but we won’t make it by “stretching out” our effort. Perhaps NASA was too successful with Apollo. It violated the “Catt Concept”, enunciated by Britisher Ivor Catt. According to Catt, the most secure project is the unsuccessful one, because it lasts the longest.’
- Robert P. Crossley, Editorial, Popular Mechanics, Vol. 133, No. 5, May 1970, p. 14.
E.g., compare the Apollo project with the Vietnam war for price, length and success. Both were initially backed by Kennedy and Johnson as challenges to Communist space technology and subversion, respectively. The Vietnam war – the unsuccessful project – sucked in the cash for longer, which closed down the successful space exploration project!
Above: neutron bomb supporter Dr Edward Teller of the Lawrence Livermore National Laboratory stated in the San Francisco KQED-TV television Fallout and Disarmament debate with Nobel Laureate Linus Pauling on 20 February 1958:
“I believe that the second world war was brought on by a race in disarmament. The peace-loving nations disarmed, and when the Hitler tyranny armed inertia was too great ... he got away with his army and he almost conquered the world. ... If there is war, if the terrible catastrophe befalls us, then next we must try to keep that war as small as possible, and at the same time we must try to be sure that no more people will unwillingly be subjected to the Russian yoke. ... If such should happen, then it would be of great importance that these weapons should do as little damage in human life as possible. If a war of this kind has to be fought, then the danger from radioactivity will be very great indeed. ... there should not be unnecessary, uncontrollable radioactive dust – radioactive contamination, which would kill friend and foe alike. ... It is even possible, to my mind, that there is no damage; and there is the possibility, furthermore that very small amounts of radioactivity are helpful. ...
“Here is a recent quotation from Nature - the British publication. This says that due to our wearing tight clothes, and due to the increased temperature of the sperm plasm, to the organs which make our sperm, there will be an increase in mutations. Then it goes on to say that since our modes of dress have been predominant for several centuries, it might explain almost half the present load of spontaneous mutations. So we see how modes of dress, based chiefly on sexual taboos, might present genetic hazards one hundred to one thousand times greater than those estimated from different sources of radiation. ... even in the terrible event of war, I believe that in this war, if it were fought with the highly flexible and highly mobile nuclear weapons, it would not be necessary to take so many young people away from their homes. I do not believe, if we can localize wars, that the casualties need be very great.”
Now should the public be informed about positive research reports on radiation such as the following report? Or do we suppress it? Do we cover-up evidence which doesn't fit the popular media "radiation is bad" ideology?
W. L. Chen, Y. C. Luan, M. C. Shieh, S. T. Chen, H. T. Kung, K. L. Soong, Y. C. Yeh, T. S. Chou, S. H. Mong, J. T. Wu, C. P. Sun, W. P. Deng, M. F. Wu, and M. L. Shen, ‘Is Chronic Radiation an Effective Prophylaxis Against Cancer?’, published in the Journal of American Physicians and Surgeons, Vol. 9, No. 1, Spring 2004, pp. 6-10:
"An extraordinary incident occurred 20 years ago in Taiwan. Recycled steel, accidentally contaminated with cobalt-60 ([low dose rate, gamma radiation emitter] half-life: 5.3 y), was formed into construction steel for more than 180 buildings, which 10,000 persons occupied for 9 to 20 years. They unknowingly received radiation doses that averaged 0.4 Sv, a collective dose of 4,000 person-Sv. Based on the observed seven cancer deaths, the cancer mortality rate for this population was assessed to be 3.5 per 100,000 person-years. Three children were born with congenital heart malformations, indicating a prevalence rate of 1.5 cases per 1,000 children under age 19. The average spontaneous cancer death rate in the general population of Taiwan over these 20 years is 116 persons per 100,000 person-years. Based upon partial official statistics and hospital experience, the prevalence rate of congenital malformation is 23 cases per 1,000 children. Assuming the age and income distributions of these persons are the same as for the general population, it appears that significant beneficial health effects may be associated with this chronic radiation exposure."
Thus, an average dose of roughly 0.4 Sv accumulated over 9-20 years, i.e. a dose rate of 2.3-5.1 microGrays per hour (0.23-0.51 millirads per hour), or 23-51 times normal background, was accompanied by a fall in the normal cancer death rate by a factor of 116/3.5 = 33, and a fall in congenital heart malformations by a factor of 23/1.5 = 15. These are big numbers!
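The unit arithmetic behind these factors is easy to check explicitly. A minimal sketch (the 0.1 microGy/hour figure taken here as "normal background" is my assumed typical value, not a number from the Chen paper):

```python
# Sanity check of the Taiwan cobalt-60 dose-rate arithmetic quoted above.
# Assumption: "normal background" is taken as 0.1 microGy/hour (a typical
# value; not a figure from the Chen et al. paper).

HOURS_PER_YEAR = 24 * 365.25  # about 8,766 hours

dose_uGy = 0.4e6  # average accumulated dose: 0.4 Sv ~ 0.4 Gy = 400,000 microGy

rate_slow = dose_uGy / (20 * HOURS_PER_YEAR)  # microGy/hour if spread over 20 y
rate_fast = dose_uGy / (9 * HOURS_PER_YEAR)   # microGy/hour if spread over 9 y

background = 0.1  # assumed background, microGy/hour

print(f"dose rate: {rate_slow:.1f} to {rate_fast:.1f} microGy/hour")
print(f"times background: {rate_slow/background:.0f} to {rate_fast/background:.0f}")
print(f"cancer death rate ratio: {116/3.5:.0f}")   # general population / cohort
print(f"malformation rate ratio: {23/1.5:.0f}")
```

This reproduces the 2.3-5.1 microGy/hour range, the 23-51 times background multiple, and the factors of 33 and 15 quoted above.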
Let me repeat the facts to clarify this very important point. Chen and thirteen other physicians investigated the apparent benefits of low level radiation in Taiwan: "Is Chronic Radiation an Effective Prophylaxis Against Cancer?", Journal of American Physicians and Surgeons, vol. 9, no. 1, 2004. After a radioactive source was accidentally mixed into industrial steel and used to build apartments in Taiwan, 10,000 persons were unknowingly exposed to low-level radiation for periods of 9-20 years, and in this group cancer rates were lower than those in the general population by a factor of 33 (a reduction from 116 to just 3.5 per 100,000 person-years), while genetic defects fell by a factor of 15 (from 23 cases per 1,000 children to just 1.5). These are such enormous benefits that you would expect all donor and publicly funded "Cancer Research" institutes to be studying these effects of dose rates some tens of times background, which can apparently slash cancer risks and genetic defect rates to such an extent.
The statistics in the paper by Chen and others have been alleged to apply to a younger age group than the general population, affecting the significance of the data, although in other ways the data are more valid than extrapolations to low doses from the Hiroshima and Nagasaki data. For instance, the survivors of high doses in Hiroshima and Nagasaki knew they had been irradiated, so no blind comparison was possible, permitting an “anti-placebo” effect: increased fear, psychological stress and worry about the long term effects of radiation, and the behaviour that goes with them. The 1958 book about the Hiroshima and Nagasaki survivors, “Formula for Death”, makes the point that highly irradiated survivors often smoked more, in the belief that they were doomed to die from radiation induced cancer anyway. Therefore, the fear culture of the irradiated survivors would statistically be expected to produce deviations from normal behaviour, in some cases increasing the cancer risks above those due purely to radiation exposure.
For up-to-date data and literature discussions on the effects of DNA repair enzymes on preventing cancers from low-dose rate radiation, please see
http://en.wikipedia.org/wiki/Radiation_hormesis
There is also evidence for low dose radiation benefits from Hiroshima and Nagasaki's joint American-Japanese Radiation Effects Research Foundation (RERF), which is obscured by the statistical fiddle of "lumping together" the majority of the survivors into one large dose interval group, while taking small dose intervals only at high doses. This presentation falsely omits the benefits from the boosting of the P53 DNA repair enzyme by low radiation doses in those cities; the table below from the RERF Brief Guide is in every sense a classic example of the biased presentation of data. Remember that at high doses the cancer data are least reliable: the average amount of radiation shielding by buildings needed to survive the initial effects (and so live to get cancer years later) was very high, and estimates of the exact shielding factors remain one of the greatest uncertainties in the DS02 dosimetry, as shown for example by the inconsistent curve of percentage temporary epilation versus dose in the same publication. The dosimetry is more accurate at lower doses, because the average radiation shielding of survivors is much smaller at those doses:
Radiation delivered over long periods at a few hundred times the natural background dose rates stimulates the use of body resources to produce more of the natural DNA repair enzyme, protein P53, thus utilizing more of the energy resources of the body for repairing DNA breaks than is usually allocated, and this reduces the natural cancer and genetic risks. This effect is in some sense like working out at the gym regularly: you end up after regular exercise not generally more tired, but generally fitter and more muscular, because the body responds in the long run by using more resources to adapt by strengthening itself and maintaining hormesis (an effect well known in chemotherapy: "what doesn't kill you, makes you stronger").
In the West, despite freedom of speech, politically incorrect facts are censored by the fashionable media. If you want to see why this censorship of the benefits of low level radiation continues, see the relatively vague and unconvincing (apart from a quotation from Dr Robert Rowland) article by James Muckerheide in the year 2000, "It’s Time to Tell the Truth About the Health Benefits of Low-Dose Radiation", and see also the weak graphical correlations shown in Dr T. D. Luckey's 2008 paper, "The Health Effects of Low-Dose Ionizing Radiation", Journal of American Physicians and Surgeons, vol. 13, no. 2, pp. 39-42, which does at least summarize the 2004 Chen paper in the same journal concisely:
"In 1982-1983, several apartments in Taipei City, Taiwan, were built with structural steel contaminated with cobalt-60. Chen et al. noted the total cancer death rates for radiation-exposed adult occupants and controls in the city were comparable when the apartments were first occupied. As both groups aged, the cancer mortality rate in the radiation-exposed group decreased while the cancer mortality rate of controls increased. The cancer mortality rate of those who had lived 9–20 years in these buildings was only 3% that of the general adult population."
Of course, it has always been known, since the work (mentioned above) of French radiologists, that radiation is more effective at killing rapidly dividing cancer cells than normal cells (because cells are more vulnerable during division than at other times, and rapidly dividing cells spend a greater percentage of their time in this vulnerable state than healthy cells do). But this discovery that low dose rates of radiation can produce a health benefit by preventing cancer in the first place is new.
What is happening here is the "what doesn't kill you makes you stronger" effect: dose rates of 20-50 times normal background, sustained over a period of 1-2 decades, stimulate a stronger DNA repair enzyme system. The body simply devotes more energy from food into building more DNA repair enzymes, and it over-compensates, thereby reducing natural cancer rates. This positive benefit from radiation would occur up to the threshold for cancer seen in the radium dial painters, 57 microGrays per hour (5.7 millirads per hour), or 570 times normal background. Only if the dose rate becomes too high does the rate of damage overwhelm natural DNA repair mechanisms and cause cancer:
‘... it is important to note that, given the effects of a few seconds of irradiation at Hiroshima and Nagasaki in 1945, a threshold near 200 mSv may be expected for leukemia and some solid tumors. [Sources: UNSCEAR, Sources and Effects of Ionizing Radiation, New York, 1994; W. F. Heidenreich, et al., Radiat. Environ. Biophys., vol. 36 (1999), p. 205; and B. L. Cohen, Radiat. Res., vol. 149 (1998), p. 525.] For a protracted lifetime natural exposure, a threshold may be set at a level of several thousand millisieverts for malignancies, of 10 grays for radium-226 in bones, and probably about 1.5-2.0 Gy for lung cancer after x-ray and gamma irradiation. [Sources: G. Jaikrishan, et al., Radiation Research, vol. 152 (1999), p. S149 (for natural exposure); R. D. Evans, Health Physics, vol. 27 (1974), p. 497 (for radium-226); H. H. Rossi and M. Zaider, Radiat. Environ. Biophys., vol. 36 (1997), p. 85 (for radiogenic lung cancer).] The hormetic effects, such as a decreased cancer incidence at low doses and increased longevity, may be used as a guide for estimating practical thresholds and for setting standards. ...
‘Though about a hundred of the million daily spontaneous DNA damages per cell remain unrepaired or misrepaired, apoptosis, differentiation, necrosis, cell cycle regulation, intercellular interactions, and the immune system remove about 99% of the altered cells. [Source: R. D. Stewart, Radiation Research, vol. 152 (1999), p. 101.] ...
‘[Due to the Chernobyl nuclear accident in 1986] as of 1998 (according to UNSCEAR), a total of 1,791 thyroid cancers in children had been registered. About 93% of the youngsters have a prospect of full recovery. [Source: C. R. Moir and R. L. Telander, Seminars in Pediatric Surgery, vol. 3 (1994), p. 182.] ... The highest average thyroid doses in children (177 mGy) were accumulated in the Gomel region of Belarus. The highest incidence of thyroid cancer (17.9 cases per 100,000 children) occurred there in 1995, which means that the rate had increased by a factor of about 25 since 1987.
‘This rate increase was probably a result of improved screening [not radiation!]. Even then, the incidence rate for occult thyroid cancers was still a thousand times lower than it was for occult thyroid cancers in nonexposed populations (in the US, for example, the rate is 13,000 per 100,000 persons, and in Finland it is 35,600 per 100,000 persons). Thus, given the prospect of improved diagnostics, there is an enormous potential for detecting yet more [fictitious] "excess" thyroid cancers. In a study in the US that was performed during the period of active screening in 1974-79, it was determined that the incidence rate of malignant and other thyroid nodules was greater by 21-fold than it had been in the pre-1974 period. [Source: Z. Jaworowski, 21st Century Science and Technology, vol. 11 (1998), issue 1, p. 14.]’
- Zbigniew Jaworowski, 'Radiation Risk and Ethics: Health Hazards, Prevention Costs, and Radiophobia', Physics Today, April 2000, pp. 89-90.
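The radium dial painter threshold figures quoted above (57 microGy/hour, 5.7 millirad/hour, 570 times background) can be cross-checked the same way. A sketch, again taking 0.1 microGy/hour as my assumed typical background rate:

```python
# Cross-check of the radium dial painter threshold dose rate quoted above.
# Assumption: normal background taken as 0.1 microGy/hour (typical value).

HOURS_PER_YEAR = 24 * 365.25

threshold_uGy_h = 57.0   # microGy/hour, quoted cancer threshold
background = 0.1         # microGy/hour, assumed

print(f"{threshold_uGy_h * 0.1:.1f} millirad/hour")   # 1 microGy = 0.1 millirad
print(f"{threshold_uGy_h / background:.0f} times background")
print(f"{threshold_uGy_h * HOURS_PER_YEAR / 1e6:.2f} Gy/year at the threshold")
```

In other words, the dial painter threshold corresponds to roughly half a gray per year of continuous exposure.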
Protein P53, discovered only in 1979, is encoded by gene TP53, which occurs on human chromosome 17. P53 also occurs in other mammals including mice, rats and dogs. P53 is one of the proteins which continually repair breaks in DNA, which easily breaks at body temperature: the DNA in each cell of the human body suffers at least two single strand breaks every second, and at least one double strand (i.e. complete double helix) break every 2 hours (5% of radiation-induced DNA breaks are double strand breaks, while only 0.007% of spontaneous DNA breaks at body temperature are double strand breaks)! Cancer occurs when several breaks in DNA happen to occur by chance at nearly the same time, giving several loose strand ends at once, which repair proteins like P53 then rejoin incorrectly, causing a mutation which can be proliferated somatically. This cannot occur when only one break is present, because only two loose ends are produced, and P53 will reattach them correctly. But if low-LET ionising radiation levels are increased to a certain extent, causing more single strand breaks, P53 production is stimulated and breaks are repaired as fast as they occur, so that multiple broken strand ends do not accumulate. This prevents DNA strands being repaired incorrectly, and so prevents cancer - a result of mutation caused by faults in DNA - from arising. Too much radiation, of course, overloads the P53 repair mechanism: it can no longer repair breaks as they occur, so multiple breaks accumulate and loose ends of DNA are wrongly connected by P53, increasing the cancer risk:
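Incidentally, the spontaneous break statistics just quoted are internally consistent, which is easy to verify (a sketch of the arithmetic only, not of the biology):

```python
# Consistency check of the spontaneous DNA break statistics quoted above:
# at least 2 single strand breaks per second, and one double strand break
# per 2 hours, per cell.

ssb_per_second = 2
dsb_interval_seconds = 2 * 3600   # one double strand break per 2 hours

ssb_per_dsb = ssb_per_second * dsb_interval_seconds   # 14,400 SSB per DSB
dsb_percent = 100.0 / (ssb_per_dsb + 1)               # DSB as % of all breaks

print(f"{dsb_percent:.3f}% of spontaneous breaks are double strand")
print(f"radiation (5%) vs spontaneous DSB fraction: {5 / dsb_percent:.0f}-fold")
```

The double strand fraction comes out at about 0.007%, matching the figure in the text, and roughly 700-fold smaller than the 5% quoted for radiation-induced breaks.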
In another post, we examine in detail the May-June 1957 Hearings Before the Special Subcommittee on Radiation of the Joint Committee on Atomic Energy, U.S. Congress, The Nature of Radioactive Fallout and Its Effects on Man, where the false dose-threshold (not dose rate-threshold) theory was publicly killed off (in a political-journalism scrum sense, not a scientific evidence sense) by a consortium of loud-mouthed and physically ignorant fruitfly and maize geneticists (headed by Nobel Laureates Muller and Lewis), with only an incompetent and quiet defense of the scientific data from cancer radiotherapy experts, whose experience was that high dose rates cause more damage than low dose rates. The geneticists' argument was that the genetic effects of radiation on fruitflies and maize showed no signs of dose rate effects or dose threshold effects. They then extrapolated from flies and maize to predict the same for human beings, and they also claimed that this genetic result should apply to all normal cell division (somatic) radiation effects, not just genetic effects! Glasstone summarized this linear no-threshold theory on page 496 of the 1957 edition of The Effects of Nuclear Weapons:
"There is apparently no amount of radiation, however small, that does not cause some increase in the normal mutation frequency. The dose rate of the radiation exposure or its duration have little influence; it is the total accumulated dose to the gonads that is the important quantity."
Flies and seasonal plants don't need DNA repair enzymes, which is why they show no dose rate dependence: they simply don't live long enough to face a serious cancer risk from DNA copying errors during cell divisions. This is not so in humans, or even in mice. Glasstone and Dolan write in the 1977 edition of The Effects of Nuclear Weapons, pages 611-612 (paragraphs 12.209-12.211):
"From the earlier studies of radiation-induced mutations, made with fruitflies, ... The mutation frequency appeared to be independent of the rate at which the radiation dose was received. ... More recent experiments with mice, however, have shown that these conclusions must be revised, at least for mammals.
"... in male mice ... For exposure rates from 90 down to 0.8 roentgen per minute ... the mutation frequency per roentgen decreases as the exposure rate is decreased.
"... in female mice ... The radiation-induced mutation frequency per roentgen decreases continuously with the exposure rate from 90 roentgens per minute downward. At an exposure rate of 0.009 roentgen per minute [0.54 roentgen/hour], the total mutation frequency in female mice is indistinguishable from the spontaneous frequency. There thus seems to be an exposure-rate threshold below which radiation-induced mutations are absent or negligible, no matter how large the total (accumulated) exposure to the female gonads, at least up to 400 roentgens."
The Oak Ridge Megamouse Radiation Exposure Project
Reference: W. L. Russell, “Reminiscences of a Mouse Specific-Locus Test Addict”, Environmental and Molecular Mutagenesis, vol. 14 (1989), Supplement 16, pp. 16–22.
The source of Glasstone and Dolan’s dose-rate genetic effects threshold data (replacing the fruitfly insect and maize plant data of Muller, Lewis and other 1950s geneticists, who falsely extrapolated directly from insects and plants to humans) is the Oak Ridge National Laboratory “megamouse project” of Liane and William Russell. This project exposed seven million mice to a variety of radiation situations to obtain statistically significant mammal data showing the effect of dose rate upon the DNA mutation risk (which in somatic cells can cause cancer). Mutations at seven different loci were scored, and showed a dependence of genetic risk on dose rate which could only be explained by DNA repair processes. This contradicted the insect and plant response, which showed no dose rate effect. With the results of this enormous mammal radiation exposure project, the observed human effects of high doses and high dose rates could be accurately extrapolated to low dose rates, without using the false linear no-threshold model that applies only to insects and plants, which lack advanced mammalian DNA repair enzymes like P53:
“As Hollaender remembers it: ‘Muller and Wright were the only two geneticists who backed the mouse genetics study. The rest of the geneticists thought we were wasting our time and money!’”
- Karen A. Rader, “Alexander Hollaender’s Postwar Vision for Biology: Oak Ridge and Beyond”, Journal of the History of Biology, v39 (2006), pp. 685–706.
For an interesting discussion of the way that the radiation controversy led to a change in thinking about DNA, from a fixed chemical structure (as believed in 1957, after the structure of DNA was discovered in its misleadingly non-cellular solid crystal form, which was required for X-ray diffraction analysis) to today’s far more dynamic picture of DNA in the cell nucleus as a delicate strand that is repeatedly broken (several times a minute) by the Brownian motion bombardment of water molecules at body temperature, and continually repaired by DNA repair enzymes like protein P53, see the article by Doogab Yi, “The coming of reversibility: The discovery of DNA repair between the atomic age and the information age”, Historical Studies in the Physical and Biological Sciences, v37 (2007), Supplement, pp. 35–72:
“This paper examines the contested ‘biological’ meaning of the genetic effects of radiation amid nuclear fear during the 1950s and 1960s. In particular, I explore how the question of irreversibility, a question that eventually led to the discovery of DNA repair, took shape in the context of postwar concerns of atomic energy. Yale biophysicists who opposed nuclear weapons testing later ironically played a central role in the discovery of DNA excision repair, or "error-correcting codes" that suggested the reversibility of the genetic effects of radiation. At Yale and elsewhere, continuing anticipation of medical applications from radiation therapy contributed to the discovery of DNA repair. The story of the discovery of DNA repair illustrates how the gene was studied in the atomic age and illuminates its legacy for the postwar life sciences. I argue that it was through the investigation of the irreversibility of the biological effects of radiation that biologists departed from an inert view of genetic stability and began to appreciate the dynamic stability of the gene. Moreover, the reformulation of DNA repair around notions of information and error-correction helped radiobiologists to expand the relevance of DNA repair research beyond radiobiology, even after the public concerns on nuclear fallout faded in the mid-1960s.”
In fact, the “safe dose rate” concept has always existed (most recently dressed up with health physics sophistry like ALARA, “As Low As Reasonably Achievable”), in the way that radiation safety guides have been formulated as a maximum dose per unit time interval. For example, on page 102 of the 1957 Congressional Hearings The Nature of Radioactive Fallout and Its Effects on Man, nuclear testing scientific director Dr Alvin C. Graves testifies:
“I have forgotten the title, but I think it is the American Commission for Radiation Protection, or something of that sort, originally stated that the workers in radioactivity could take one tenth of a roentgen per day forever without suffering injury. [This is 36.5 R/year or 1095 R over 30 years, roughly the minimum dose needed for bone changes in the radium dial painters.]”
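The bracketed arithmetic can be spelled out explicitly (a sketch, using the 365-day year implied by the quoted 36.5 R/year figure):

```python
# The old occupational "tolerance dose" arithmetic from the bracketed note
# following the Graves quotation.

daily_limit_R = 0.1              # roentgen per day, quoted by Graves
annual_R = daily_limit_R * 365   # roentgens per year
career_R = annual_R * 30         # roentgens over a 30-year working life

print(f"{annual_R:.1f} R/year, {career_R:.0f} R in 30 years")
```

This gives the 36.5 R/year and 1,095 R figures stated in the bracket.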
Dr Jane Orient, 'Homeland Security for Physicians', Journal of American Physicians and Surgeons, vol. 11, number 3, Fall 2006, pp. 75-9:
'In the 1960s, a group of activist physicians called Physicians for Social Responsibility (PSR) undertook to "educate the medical profession and the world about the dangers of nuclear weapons," beginning with a series of articles in the New England Journal of Medicine. [Note that this journal was publishing anti-civil defense propaganda as far back as 1949, e.g. the article in volume 241, pp. 647-53, which falsely suggested that civil defense in nuclear war would be hopeless because a single burned patient in 1947 with 40% body area burns required 42 oxygen tanks, 36 pints of plasma, 40 pints of whole blood, 104 pints of fluids, 4,300 m of gauze, 3 nurses and 2 doctors. First, only unclothed persons in direct line of sight, without shadowing, can get 40% body area burns from thermal radiation; second, duck and cover offers protection given a nuclear attack warning; and third, G. V. LeRoy had already published, two years earlier, in J.A.M.A., volume 134, 1947, pp. 1143-8, that less than 5% of burns in Hiroshima and Nagasaki were caused by building and debris fires. In medicine it is always possible to expend vast resources on patients who are fatally injured. In a mass casualty situation, doctors should not give up just because they don't have unlimited resources; as at Hiroshima and Nagasaki, they would need to do their best with what they have.] On its website, www.psr.org, the group boasts that it "led the campaign to end atmospheric nuclear testing." With this campaign, the linear no-threshold (LNT) theory of radiation carcinogenesis became entrenched. It enabled activists to calculate enormous numbers of potential casualties by taking a tiny risk and multiplying it by the population of the earth. As an enduring consequence, the perceived risks of radiation are far out of proportion to actual risks, causing tremendous damage to the American nuclear industry. ...
Efforts to save lives were not only futile, but unethical: Any suggestion that nuclear war could be survivable increased its likelihood and was thus tantamount to warmongering, PSR spokesmen warned. ...
'For the mindset that engendered and enables this situation, which jeopardizes the existence of the United States as a nation as well as the lives of millions of its citizens, some American physicians and certain prestigious medical organizations bear a heavy responsibility.
'Ethical physicians should stand ready to help patients to the best of their ability, and not advocate sacrificing them in the name of a political agenda. Even very basic knowledge, especially combined with simple, inexpensive advance preparations, could save countless lives.'
‘International Physicians for the Prevention of Nuclear War: Messiahs of the Nuclear Age?’, The Lancet (British medical journal), 18 November 1988, pp.1185-6, by Jane M. Orient, MD:
'... history is apparently not among the areas of expertise claimed by IPPNW [international physicians for the prevention of nuclear war]. Its spokesmen have yet to comment on the Washington Naval Treaty of 1922, the Kellogg-Briand Pact of 1928 (for which Kellogg and Briand received the Nobel Peace Prize), the Oxford Peace Resolution of 1934, the Munich Agreement of 1938, or the Molotov-Ribbentrop Pact of 1939, and on the effectiveness of these measures in preventing World War II. ...
'Sir Norman Angell (also a Nobel Peace Prize winner), in his 1910 best-seller entitled The Great Illusion, showed that war had become so terrible and expensive as to be unthinkable. The concept of ‘destruction before detonation’ was not discovered by Victor Sidel (Sidel, V. W., ‘Destruction before detonation: the impact of the arms race on health and health care’, Lancet 1985; ii: 1287-1289), but was previously enunciated by Neville Chamberlain, who warned his Cabinet about the heavy bills for armaments: ‘even the present Programmes were placing a heavy strain upon our resources’ (Minutes of the British Cabinet meeting, February 3, 1937: quoted in Fuchser, L. W., ‘Neville Chamberlain and Appeasement: a Study in the Politics of History’, Norton, New York, 1982). ...
'Psychic numbing, denial, and ‘missile envy’ (Caldicott, H., Missile envy: the arms race and nuclear war, New York: William Morrow, 1984) are some of the diagnoses applied by IPPNW members to those who differ with them. However, for the threats facing the world, IPPNW does not entertain a differential diagnosis, nor admit the slightest doubt about the efficacy of their prescription, if only the world will follow it. So certain are they of their ability to save us from war that these physicians seem willing to bet the lives of millions who might be saved by defensive measures if a nuclear attack is ever launched.
'Is this an omnipotence fantasy?'
Here are some extracts from Dr Orient's letter to FEMA about the continued use of the lying LNT theory of radiation for long-term effects propaganda:
Jane M. Orient, M.D.
President, Physicians for Civil Defense
1601 N. Tucson Blvd. Suite 9
Tucson, AZ 85716
(520) 325-2680
www.physiciansforcivildefense.org
To Rules Docket Clerk
Office of the General Counsel
Federal Emergency Management Agency
Room 840, 500 C Street S.W.
Washington, D.C. 20472
RE: Docket #: DHS-2004-0029
Docket #: Z-RIN 1660-ZA02
FEMA-RULES@dhs.gov
We agree that flexibility is required in responding to incidents involving radiological dispersal device (RDD) or improvised nuclear device (IND). It is critical that actions taken do more good than harm. The dangers of panic, the shut-down of essential services, and disruption of the economy and social arrangements could vastly outweigh the supposed dangers of an increased exposure to radiation, particularly in the event of the use of an RDD.
We are disappointed that the document does not explicitly recognize that current radiation protection standards are based on the linear no-threshold (LNT) theory of radiation carcinogenesis. This theory calculates casualties based on collective doses. The assumptions are the equivalent of saying that if one person dies from ingesting one thousand aspirin tablets all at once, that one person will die if each one of the thousand persons ingests one aspirin tablet each. In fact, all actual evidence indicates that radiation, like most other potentially adverse exposures, exhibits a biphasic dose-response curve. While high levels are damaging or lethal, within a certain range at the lower end of the scale there is a seemingly paradoxical stimulatory or protective effect. Persons with accidental or occupational exposures within this “hormetic” range have a lower incidence of cancer and birth defects, and have had an increase in longevity as well. Thus, measures to “protect” people against exposures in this range may deprive them of a beneficial health effect, as well as harming them through excessive costs or deprivation of the other potential benefits of technology.
... It should be noted that the average background dose on the Colorado plateau is 600 mrem per year, and in some areas of the world it is much higher than that. For example, in Ramsar, Iran, the average background is about 48 mSv per year - that is, 4,800 mrem per year - without noticeable adverse health effects. Forced resettlement, on the other hand, would cause many billions of dollars in damage to the economy as well as social upheaval. Because of widespread public fear of low-dose radiation, many people might choose to be resettled rather than face such increased exposure, but persons should not be forced to abandon their homes, personal property, and businesses based upon unfounded fears. ...
In appointing technical advisory committees, it would appear important to include persons whose reputation is not strongly invested in the linear no-threshold hypothesis, who would thus find it difficult or impossible to change their position. A full range of views must be heard and not suppressed by a “consensus” process that strongly pressures participants to approve a predetermined position and excludes those who do not.
We think it is critical that the United States government should not enable terrorists to destroy a large area of the country and cripple its economy by exploiting unwarranted fears. Instead, we need to be prepared to mitigate the damage should efforts at interdiction fail.
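The collective-dose arithmetic that the letter criticizes is easy to make concrete using the Taiwan figures quoted earlier. A sketch (the 0.05 fatal cancers per person-sievert risk coefficient is an assumed ICRP-style value used purely for illustration, not a number from the letter):

```python
# Sketch of the linear no-threshold (LNT) collective-dose calculation that
# the FEMA letter criticizes, applied to the Taiwan cobalt-60 cohort.
# Assumption: risk coefficient of 0.05 fatal cancers per person-Sv (an
# ICRP-style value, chosen here for illustration only).

def lnt_predicted_cancer_deaths(collective_dose_person_sv, risk_per_sv=0.05):
    """Under LNT the prediction depends only on the collective dose,
    no matter how thinly it is spread over people or over time."""
    return collective_dose_person_sv * risk_per_sv

collective = 10_000 * 0.4   # 10,000 occupants x 0.4 Sv = 4,000 person-Sv
predicted = lnt_predicted_cancer_deaths(collective)

print(f"LNT prediction: {predicted:.0f} excess cancer deaths")  # 200
# ...compared with the 7 cancer deaths actually observed in that cohort.
```

This is the "one aspirin tablet each" logic in numerical form: the same 200-death prediction follows whether one person gets 4,000 Sv or 10,000 people get 0.4 Sv each over two decades.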
(1) 50 Mt*, 30 Oct 1961 USSR clean 2-3 % fission air burst at 3,900 m altitude over Novaya Zemlya. (100 Mt design with U-238 pusher replaced by lead to reduce fission yield from 50% to 2.5% and total yield from 100 Mt to 50 Mt.)
(2) 24.2 Mt, 24 Dec 1962 USSR air burst at 3,750 m altitude dropped from Tu-95 (carried externally) over Novaya Zemlya. (50 Mt design with U-238 pusher replaced by lead to reduce fission yield from 50% to 2.5% and total yield from 50 Mt to 24 Mt.)
(3) 21.1 Mt, 5 Aug 1962 USSR air burst at 3,600 m altitude over Novaya Zemlya
(4) 20 Mt, 27 Sep 1962 USSR air burst at 3,900 m altitude over Novaya Zemlya
(5) 19.1 Mt, 25 Sep 1962 USSR air burst at 4,090 m altitude over Novaya Zemlya
(6) 14.8 Mt, 28 Feb 1954 US Castle-Bravo 67% fission surface burst on the northern reef at Bikini Atoll
(7) 13.5 Mt, 4 May 1954 US Castle-Yankee 52 % fission water surface burst on barge over Bikini Lagoon
(8) 12.5 Mt, 23 Oct 1961 USSR air burst at 3,500 m altitude over Novaya Zemlya
(9) 11 Mt, 26 Mar 1954 US Castle-Romeo 64 % fission water surface burst on barge in Bikini Lagoon
(10) 10.4 Mt, 31 Oct 1952 US Ivy-Mike Elugelab Island surface burst, Eniwetok Atoll**. Mike was an 82 ton liquid deuterium device, the first full scale test of the Teller-Ulam staged radiation implosion principle, using a boosted fission primary bomb and a physically separate fusion stage (a Dewar vacuum flask filled with deuterium and surrounded by a 5-ton natural uranium pusher). The outer steel casing was 2 metres wide and 6.1 metres high, with walls 30 cm thick. The inside surface of the casing was lined with lead and polyethylene, forming a conduit from the primary to the secondary.
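The bookkeeping behind the clean variants in entries (1) and (2) is simple arithmetic. A sketch, assuming (as stated above) that the U-238 pusher contributed half the full design yield, that the lead substitution removed that contribution entirely, and that about 2.5% residual fission remained from the primary:

```python
def clean_variant_yield(full_design_mt, pusher_fission_fraction=0.5):
    """Yield after replacing the U-238 pusher with lead: the pusher's
    fission contribution (assumed 50% of the design yield) is removed."""
    return full_design_mt * (1.0 - pusher_fission_fraction)

tsar = clean_variant_yield(100.0)   # 100 Mt design -> 50 Mt, as tested 30 Oct 1961
shot2 = clean_variant_yield(50.0)   # 50 Mt design -> ~25 Mt (24.2 Mt measured, 24 Dec 1962)
tsar_fission = 0.025 * tsar         # ~2.5% residual fission = 1.25 Mt, mostly the primary
```

The same halving applies to both entries, which is why the 24 Dec 1962 shot came out at roughly half the 50 Mt design figure.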
Above: IVY-MIKE fireball to cloud transition photo sequence (30 seconds, 2 minutes, 10 minutes and 20 minutes).
Above: ‘Operation Ivy’, produced by the U.S. Air Force Lookout Mountain Laboratory, Hollywood, California for the U.S. Armed Forces Special Weapons Project, and presented very impressively by Western cowboy film star Reed Hadley: ‘You have a grand stand seat here to one of the most momentous events in the history of science. In less than a minute you will see the most powerful explosion ever witnessed by human eyes. The blast will come up on the horizon just about there, and this is the significance of the moment: this is the first full-scale test of a hydrogen device. If the reaction goes, we’re in the thermonuclear era. For the sake of all of us, and for the sake of our country, I know you’ll join me in wishing this expedition well.’
* Close-in Russian data gave a yield of 50 Mt. Long-range Western micro-barographs suggested 56 and 58 Mt based on the peak overpressure and duration of the distant blast wave, when it had become a gravity wave-type disturbance in the atmosphere, but no burst altitude data was available, and close-in data are more accurate.
Information from Wikipedia relating to clean weapons and their eventual deployment as low yield tactical 'neutron bombs': In 1956, President Eisenhower announced the testing of a 95% 'clean' (2-stage) fusion weapon, later identified as the 11 July Navajo test at Bikini Atoll during Operation Redwing. This weapon had a yield of 4.5 megatons. Previous 'dirty' weapons had fission proportions of 50-77%, due to the use of uranium-238 as a 'pusher' around the lithium deuteride (secondary) stage. (The fusion neutrons have energies of up to 14.1 MeV, well exceeding the 1.1 MeV 'fission threshold' for U-238.) The 1956 'clean' tests used a lead pusher, while in 1958 a tungsten carbide pusher was employed. Hans A. Bethe supported clean nuclear weapons in 1958 as Chairman of a Presidential science advisory group on nuclear testing:
'... certain hard targets require ground bursts, such as airfield runways if it is desired to make a crater, railroad yards if severe destruction of tracks is to be accomplished... The use of clean weapons in strategic situations may be indicated in order to protect the local population.' (Dr Hans Bethe, 27 March 1958 Top Secret - Restricted Data Report to the NSC Ad Hoc Working Group on the Technical Feasibility of a Cessation of Nuclear Testing, page 9; Bethe was the Working Group Chairman.)
In consequence of Bethe's recommendations, on 12 July 1958 the Hardtack-Poplar shot on a barge in the lagoon yielded 9.3 megatons, of which only 4.8% was fission: it was 95.2% clean (the clean Mk-41C warhead). Cohen in 1958 investigated a low-yield 'clean' nuclear weapon and discovered that the 'clean' bomb case thickness scales as the cube root of yield, so a larger percentage of neutrons escapes from a small detonation, owing to the thinner case required to reflect back X-rays during ignition of the secondary (fusion) stage. For example, a 1 kiloton bomb needs a case only 1/10th the thickness of that for 1 megaton. This means that although most of the neutrons are absorbed by the casing of a 1 megaton bomb, in a 1 kiloton bomb they would mostly escape. A neutron bomb is only feasible if the yield is high enough for efficient fusion stage ignition, yet low enough that the case does not absorb too many neutrons. Neutron bombs therefore have a yield range of 1-10 kilotons, with the fission proportion varying from 50% at 1 kiloton to 25% at 10 kilotons (all of it coming from the primary stage). The neutron output per kiloton is then approximately 10-15 times greater than for a pure fission implosion weapon or a standard (high yield) strategic warhead like a W87 or W88.
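Cohen's cube-root case scaling is easy to sketch. The function below returns case thickness relative to that of a 1-megaton weapon; only the scaling law itself is taken from the text, and the function name is mine:

```python
def relative_case_thickness(yield_kt, reference_yield_kt=1000.0):
    # Case thickness scales as the cube root of yield, so a 1 kt bomb's
    # case is (1/1000)**(1/3) = 1/10th as thick as a 1 Mt bomb's case.
    return (yield_kt / reference_yield_kt) ** (1.0 / 3.0)

t_1kt = relative_case_thickness(1.0)    # 0.1: the 1/10th ratio quoted above
t_10kt = relative_case_thickness(10.0)  # ~0.22: still thin enough for neutron escape
```

The thin case at 1-10 kt is what lets most of the 14.1 MeV fusion neutrons out, which is the whole design point of the enhanced radiation weapon.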
In 1981, the Christian Science Monitor reported that there "are 19,500 tanks in the Soviet-controlled forces of the Warsaw Pact aimed at Western Europe. Of these, 12,500 are Soviet tanks in Soviet units. NATO has 7,000 tanks on its side facing the 19,500." (Joseph C. Harsch, 'Neutron Bomb: Why It Worries The Russians,' Christian Science Monitor, August 14, 1981, p. 1.)
Cohen's neutron bomb is not mentioned in the unclassified manual by Glasstone and Dolan, The Effects of Nuclear Weapons 1957-77, but is included as an 'enhanced neutron weapon' in chapter 5 of the declassified (formerly secret) manual edited by Philip J. Dolan, Capabilities of Nuclear Weapons, U.S. Department of Defense, effects manual DNA-EM-1, updated 1981 (U.S. Freedom of Information Act).
According to that manual, no fallout effects would occur from the use of a neutron bomb (provided it was not used in a thunderstorm): the combination of a 500 m burst altitude and low yield prevents fallout, while the low yield also limits thermal and blast effects. The reduction in damage outside the target area is a major advantage of such a weapon for deterring massed tank invasions. An aggressor would be forced to disperse tanks, making them easier to destroy with simple hand-held anti-tank missile launchers.
** http://www.johnstonsarchive.net/nuclear/tests/multimegtests.html states this test had a fission yield of 60 % whereas http://nuclearweaponarchive.org/Usa/Tests/Ivy.html states it was 77 %. I can add some comments to this issue. In the published U.S. Congressional Hearings of June 22-26, 1959, tables of data were presented showing the fission yields from all American and British tests, which the Americans had monitored (for Russian tests, the tables did not present fission yields but merely assumed 50 % of the total yield was from fission). Those tables showed that America detonated 15 Mt of fission in land surface bursts from 1952-54, i.e., the fission yield of Ivy-Mike and Castle-Bravo together was 15 Mt. Plenty of reports show that the fission yield of Castle-Bravo was known by 1956 to be 10 Mt, hence you can deduce a fission yield for Ivy-Mike of 5 Mt, or 48 %. However, other declassified data, for example the measured upwind fallout pattern for Ivy-Mike, suggests that the 77 % fission yield may be correct.
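The deduction in the footnote above is simple subtraction, laid out explicitly here using only the figures already quoted:

```python
# Figures from the published June 1959 U.S. Congressional Hearings, as quoted above:
total_fission_surface_1952_54 = 15.0  # Mt: combined fission yield of Ivy-Mike + Castle-Bravo
bravo_fission = 10.0                  # Mt: Castle-Bravo fission yield, known by 1956
mike_total = 10.4                     # Mt: Ivy-Mike total yield

mike_fission = total_fission_surface_1952_54 - bravo_fission  # 5 Mt
mike_fission_fraction = mike_fission / mike_total             # ~48%
```

The resulting 48% sits below both the 60% and 77% estimates from the two linked sources, which is why the discrepancy with the upwind fallout data matters.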
http://nuclearweaponarchive.org/Russia/TsarBomba.html wrongly states that the 57 Mt yield estimate was based on Western fallout analyses, which is false (although fallout analysis did imply a 2-3% fission yield, i.e. the bomb was 97-98% clean). The 57 Mt total yield estimate actually came from rough micro-barograph measurements of the long-range pressure pulse and its duration, which are not as accurate as the Russian close-in blast data, which indicated a yield of 50 Mt.
Above: TSAR BOMBA replica, photo credit Wikipedia.
Above: American toroidal fireball nuclear test films.
Above: British toroidal fireball nuclear test films (Christmas and Malden Islands, Pacific).
Above: Professor Freeman Dyson and Sir Arthur C. Clarke supporting the nuclear bomb powered spacecraft, Project Orion, the only economically practical way for human beings to holiday on Mars. (Excerpt from BBC's To Mars by A-Bomb (2003), with footage of the tests and comments by Arthur Clarke and Freeman Dyson.) The Orion spacecraft has a large, thick steel pusher plate connected via hydraulic dampers to the crew accommodation. A series of nuclear explosions is detonated below the pusher plate, which shields the crew from nuclear radiation and recoils upwards when ablated by X-rays. The impulses from the nuclear explosions efficiently accelerate the spacecraft to high speed. It would have been launched to Mars from the Nevada nuclear test site, using relatively clean low fission yield detonations for the first few minutes (to minimise the EMP, air blast and fallout effects on Earth), and then larger detonations when a safe distance away. Project Orion was headed by Los Alamos nuclear weapons designer Dr Theodore Taylor, who developed many nuclear weapons (the Scorpion, Wasp, Bee and Hornet Nevada tests, and the 500 kt pure fission implosion bomb tested as the IVY-KING shot in 1952). (The idea of utilizing explosions for work is not so crazy as it sounds, when you remember that the internal combustion engine doesn't 'burn' gasoline so much as explode it in a controlled way within the cylinder, after mixing the fuel with air and compressing the resulting mixture; the engine converts the impulsive force of the explosion into useful work done against the piston to produce motion. Maybe a massive version of such a piston-in-cylinder engine could utilize the recoil forces of thermonuclear explosions, which release energy more cost-efficiently than running a nuclear reactor to generate steam for turbines.)
There were several other nuclear rocket systems proposed as alternatives to Orion, although Orion is by far the best. One was Project Thunderwell, the steam-accelerated Jules Verne capsule, suggested by the 0.3 kt Plumbbob-Pascal B underground Nevada test of 27 August 1957, in which the 10 cm thick, 1.2 m diameter welded steel cover was blown off the top of the 152 m shaft at an estimated speed of at least 6 times Earth's escape velocity. In that test, a 1.5 m thick, 2 ton concrete plug immediately over the bomb was pushed up the shaft by the detonation, knocking the steel lid upward. This was a preliminary experiment by Dr Robert Brownlee, whose ultimate aim was to launch spacecraft using the steam pressure from deep water-filled shafts with a nuclear explosion at the bottom: an improvement on Jules Verne's cannon-fired projectile in De la Terre à la Lune (1865), since steam pressure would give a more survivable, gentler acceleration than Verne's direct impulse from an explosion. Some 90% of the radioactivity would be trapped underground. Like Project Orion, Project Thunderwell was cancelled for pseudoscientific (political) reasons after the nuclear test ban treaty was signed.
Another nuclear rocket system was simply to use a bare, uncluttered nuclear reactor core to heat hydrogen gas directly to high temperature and expel it from an exhaust nozzle, in lieu of burning it with oxygen. This was NASA's Kiwi rocket, which was extensively tested (producing a lot of radioactivity in the atmosphere) but, you guessed it, never deployed! Its advantage is that you need to carry less propellant mass: you are not burning the hydrogen, just ejecting it to get a recoil by Newton's 3rd law of motion, and because reactor heating ejects the hydrogen molecules at very high speed, it can be more efficient than a conventional rocket engine.
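The efficiency gain comes from exhaust velocity. A rough upper bound, assuming ideal expansion of reactor-heated hydrogen; the 2,500 K chamber temperature and the constant specific heat used here are illustrative assumptions, not Kiwi test data:

```python
import math

def ideal_exhaust_velocity(T_chamber_k, cp_j_per_kg_k):
    # Upper bound: all the thermal enthalpy cp*T converts to kinetic energy.
    return math.sqrt(2.0 * cp_j_per_kg_k * T_chamber_k)

# Hydrogen's low molecular mass gives it a very high specific heat (~14,300 J/kg/K),
# hence a high exhaust speed for a given temperature -- the whole point of Kiwi.
v_e = ideal_exhaust_velocity(2500.0, 14_300.0)  # ~8,500 m/s
isp = v_e / 9.81                                # ~860 s, vs roughly 450 s for the best chemical engines
```

Because specific heat scales inversely with molecular mass, light hydrogen beats any chemical combustion product, even though the reactor runs cooler than a combustion chamber.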
There should be a note here about how unnatural radioactive pollution is (not) in space: the earth's atmosphere is a radiation shield equivalent to a layer of water 10 metres thick, which reduces the cosmic background radiation to about 1/100th of what it would be without the atmosphere. Away from the largely uninhabited poles, the Earth's magnetic field also protects us against charged cosmic radiations, which are deflected and end up spiralling around the magnetic field lines at high altitude, in the Van Allen trapped radiation belts. On the Moon, for example, there is no atmosphere or significant magnetic field, so the natural background exposure rate at solar minimum is 1 milliRoentgen per hour (about 10 microSieverts/hour), some 100 times that on the Earth (0.010 milliRoentgen per hour, or about 0.10 microSieverts/hour). The Apollo astronauts visiting the Moon wore dosimeters and received an average of 275 milliRoentgens (about 2.75 milliSieverts), well over a year's natural background exposure at sea level, in just 19.5 days. The dose rate is far higher during a solar flare, which is one of the hazards astronauts must avoid (micrometeorites are another concern in a soft spacesuit).
The higher you are above sea level, the less atmosphere there is between you and space, and so the less shielding you have against the intense cosmic radiations (emitted by the thermonuclear reactors we call 'stars', as well as by distant supernova explosions). At sea level, the air above you constitutes a radiation shield of 10 tons per square metre, the equivalent of a 10 metre thick water shield between you and outer space. As you go up a mountain or up in an aircraft, the amount of atmosphere between you and space decreases, so radiation levels increase with altitude. The normal background exposure rate shoots up by a factor of 20, from 0.010 to 0.20 milliRoentgens per hour, when an airliner ascends from sea level to its 36,000 feet cruising altitude. (The now obsolete British Concorde supersonic transport carried radiation-monitoring equipment so that it could drop to lower-altitude flight routes if excessive cosmic radiation from solar storms was detected.) Flight aircrew get more radiation exposure than many nuclear industry workers at nuclear power plants. Residents of the high altitude city of Denver get 100 milliRoentgens (about 1 milliSievert) more annual exposure than residents of Washington, D.C., but the mainstream anti-radiation cranks don't campaign for the city to be shut down to save kids from radiation exposure, or for mountain climbing to be banned!
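The '10 tons per square metre' shielding figure follows directly from sea-level atmospheric pressure; a quick check:

```python
# Standard sea-level atmospheric pressure divided by g gives the mass of air
# overhead per unit area; dividing by water's density converts that to an
# equivalent water depth.
P0 = 101_325.0      # Pa, standard sea-level pressure
g = 9.81            # m/s^2
rho_water = 1000.0  # kg/m^3

areal_mass_kg_m2 = P0 / g                          # ~10,330 kg/m^2, i.e. ~10 tons/m^2
water_equivalent_m = areal_mass_kg_m2 / rho_water  # ~10.3 m of water
```

This is why climbing above part of that air column (Denver, airliner altitude) raises the dose rate: you are literally removing shielding mass from over your head.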
The point I'm making here, for the Green Warriors, is that a nuclear-powered rocket won't be a horrible unnatural thing polluting nice pristine non-radioactive 'clean' outer space with horrible human produced radioactive waste: the universe is full of nuclear reactors (called stars purely for reasons of political expediency) and unending nuclear explosions (called supernovae purely for reasons of political expediency). Live with it!
Above: Carl Sagan talking about Project Orion, which could be built today with existing technology if there was not insane groupthink about nuclear test effects. Dr Theodore Taylor gives the full technical details in John McPhee's book The Curve of Binding Energy, 1974. Cosmic radiation is 100 times higher in space than on the Earth's surface. The EMP and fallout effects could be suppressed by clean weapons designs with thick casings to absorb prompt gamma radiations (see blog posts here, here, here, and here).
Summary of Project Orion from Dr Taylor:
Project Orion began in 1958 when nuclear weapons designer Dr Theodore B. Taylor moved to General Atomic to design a nuclear bomb powered spaceship, sponsored by the U.S. Advanced Research Projects Agency. It would travel directly (in a straight line!) and quickly to Mars using 2,000 nuclear bombs, carrying 150 people and attaining a top speed of 45 km/second. The travel time would be 3 months for the minimum Earth-Mars distance of 56 million km, and 6 months for the maximum distance of 101 million km. In 1959 the stability of the whole system was demonstrated in a scaled-down test flight, which impressed Dr von Braun so much that he supported Project Orion after seeing the demonstration film.
Above: blueprints for the nuclear rockets from R. S. Cooper, "Nuclear propulsion for space vehicles", Annual Review of Nuclear Science, v18, 1968, pp. 203-228. To resist the high temperatures, metals like tungsten (which has a very high melting point) are preferred to steel for the surface of the pusher. Graphite can be sprayed on to the pusher plate as a thin layer of droplets by a retractable jet nozzle located within the central hole in the pusher plate. More advanced designs use a concave pusher which detonates the bomb at its focus, to utilize a larger fraction of the case shock and X-ray ablative recoil energy. Project Orion was first proposed by Dr Stanislaw Ulam, of Teller-Ulam fame. It was developed by Dr Theodore Taylor at General Atomic.
Dr Taylor says in The Curve of Binding Energy (by McPhee) that the idea stemmed from the 15.2 kt REDWING-INCA nuclear test on June 26, 1956, where 30 cm diameter carbon-coated steel balls were placed 9 metres from the bomb by researcher Lew Allen, and were undamaged with only a loss of 0.1 mm of surface graphite! This gave rise to the design of the 75 ton, 41 metre diameter carbon-coated steel base pusher plate in the 76 metres high Project Orion spacecraft, where the base pusher plate is connected by hydraulic shock absorbers to the crew compartment. The steel plate acts as a radiation shield as well as ablative recoil mechanism to get propulsion: after each bomb was fired, oil would be sprayed on the plate to give it a carbon coating. The dynamics of X-ray ablation are well established in nuclear weapons design because this mechanism is what is used to cause the fusion stage in a bomb to explode: X-rays from the fission stage are channelled to the fusion stage, ablating the surface which causes a compression by recoil (Newton's 3rd law).
The nuclear test fireball experiments of Project 5.4 during Operation TEAPOT in Nevada, 1955, Project 5.9 of Operation REDWING at Bikini and Eniwetok Atolls in 1956, and Project 8.3b of Operation PLUMBBOB in Nevada, 1957, proved that objects like steel spheres within the fireball suffered only a tiny amount of surface scarring, because the thermal pulse just ablates a microscopic thickness of the surface, causing a recoil force. In fact, this kind of thin-layer ablation had first been noted at the TRINITY test of July 16, 1945:
‘The measured total radiation at [9.1-km] from the centre was 0.29 calories/cm2 ... Examination of the specimen exposed at [975 m] shows ... the charred layer does not appear to be thicker than 1/10 millimetre.... scorching of the fir lumber used to support signal wires extended out to about [1.9 km] ... the risk of fire due to the radiation ... is likely to be much less than the risk of fire from causes existing in the buildings at the time of explosion.’
– W. G. Marley and F. Reines, July 16th Nuclear Explosion: Incendiary Effects of Radiation, Los Alamos report LA-364, October 1945, originally Secret, pp. 5-6.
Dr Taylor explained that the first nuclear bomb fired to start the ascent would only need to be 0.1 kt, the next, a second later, 0.2 kt, and so on up to bomb number 50 at 20 kt, by which time a total of about 200 kt would have been detonated and the spacecraft would be in space, without having caused any significant EMP or fallout effects on the Earth compared to natural background radiation. There would be no radioactive trail left in space behind such a nuclear pulse rocket, because the debris expands at a rate faster than the escape velocity of the solar system. The pusher plate would not be severely heated or damaged, because the ablative X-ray impulse from a nuclear explosion 60 metres away lasts only about 10 nanoseconds and ablates just the surface layer (such as the layer of carbon-rich grease sprayed on to the pusher plate automatically after each detonation). Remember that in an automobile engine, the temperature attained by the exploding gasoline-air mixture is much higher than the melting point of the steel pistons and cylinders, but they don't melt because each explosion is too brief to heat the metal through to that temperature: the residual heat after expansion doesn't penetrate and destroy the piston and cylinder, but is rapidly carried away as warm exhaust gas!
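Taylor's launch figures (0.1 kt for the first bomb, 20 kt for bomb number 50, about 200 kt in all) are consistent with a geometric yield ramp between those endpoints; a minimal sketch under that assumption:

```python
n = 50                         # number of launch bombs
y_first, y_last = 0.1, 20.0    # kt: first and fiftieth bomb yields as quoted

r = (y_last / y_first) ** (1.0 / (n - 1))        # common ratio between successive bombs, ~1.11
total_kt = y_first * (r ** n - 1.0) / (r - 1.0)  # geometric series sum, ~194 kt
```

That sum lands close to the quoted ~200 kt total for the whole ascent, i.e. a fraction of a single modern strategic warhead's yield spread over fifty small detonations.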
“Observations of the remains of towers and shielding material after detonation at several ground zeros indicate that large masses of material are not vaporized. Observations of the residue of the Smoky tower [44 kt bomb atop a 700 foot high steel tower] indicated that a very significant portion of that tower remained, including the upper 200 feet of steel. Another example similar to Shot Smoky was Shot Apple II [29 kt atop a 500 ft steel tower], Teapot Series. Even though the total yield of Shot Apple II was about [29 kt], the floor of the cab [housing the nuclear bomb itself, at the top of the tower] and the main tower support columns remained intact. The results of the Shot Fizeau [11 kt atop a 500 ft steel tower] tower melt studies (W. K. Dolen and A. D. Thornborough, Fitzeau Tower Melt Studies, Sandia report SC-4185, 1958, Secret) show that about 85 percent of tower material was accounted for after the detonation and that only the upper 50 feet of tower was vaporized. No melting occurred beyond 175 feet from the top of the tower although the fireball theoretically engulfed more than 400 feet of the tower.”
- Dr Kermit H. Larson, et al., Distribution, Characteristics, and Biotic Availability of Fallout, Operation Plumbbob, weapon test report WT-1488, ADA077509, July 1966, page 59.
J. E. Kester and R. B. Ferguson report in Operation Teapot, Project 5.4, Evaluation of Fireball Lethality Using Basic Missile Structures, WT-1134 (originally Secret - Restricted Data), AD0340137, that in the 23 kt Teapot-Met shot (Nevada, 15 April 1955, 400 ft steel bomb tower), although the steel tower was blown down, it was not vaporized and much of it survived despite having been engulfed by the fireball itself, as stated on page 30:
“... nearly 225 feet of the main support members of the shot tower were still intact and laid out radially from their original position.”
Page 116 of WT-1134 states that after the 2 kt Moth shot atop a 300 foot triangular tower on 22 February 1955: “The three tower legs were laid out approximately radially from their pre-shot positions. The longest tower leg found was about 200 ft long. The other two legs appeared to be about 150 ft long. All three guy cables were still attached ... A few large pieces of the tower, about 20 to 30-ft long, were strewn to ranges of about 200 feet.” It adds that after the 7 kt Tesla shot atop a 300 ft square tower on 1 March 1955: “the four tower legs ... were laid out radially from their original position ... The tower legs remained intact to lengths of about 125 feet. All four guy cables were still attached ...” The 43 kt Turk nuclear test was fired atop a 500 ft square tower, leaving 100 ft lengths of the tower legs on the ground (page 118). The 8 kt Bee shot atop a 500 ft tower failed to even knock down most of the tower (pages 120-1): “A large portion of this tower was still standing after the shot. ... It is estimated that at least 150 feet of the tower was essentially undamaged and standing erect with an additional 50 to 75 feet of the tower slightly melted and drooped over at the top.” The 14 kt Apple 1 shot atop a 500 ft square tower results (page 121): “The main support members of the shot tower still remained to lengths of about 150 feet with the top 25 to 50 feet being crushed and split ... Some of the legs remained attached to the base.” The 23 kt Met shot was atop a 400 ft square tower (pages 123-4): “About 225 feet of the tower legs were still intact with the top 25 to 50 feet being crushed, split and slightly melted ....”
Above: color photo shows the lower 200 feet surviving from the 300 ft steel tower of the 0.2 kt Ruth nuclear test in Nevada on 31 March 1953. The black and white photographs are from the 23 kt Teapot-Met nuclear explosion (Nevada, 15 April 1955) ablation tests by J. E. Kester and R. B. Ferguson, Operation Teapot, Project 5.4, Evaluation of Fireball Lethality Using Basic Missile Structures, WT-1134 (Secret – Restricted Data), AD0340137, which showed that at just 80 feet only the outer 0.4 inch of steel balls was ablated by the fireball.
The popular myth that everything inside the fireball is vaporized errs because the cooling rate of the fireball is so great that there is literally not enough time for the heat to penetrate more than a thin surface layer before the temperature drops below melting point. Good heat conductors like steel are protected by ablation: a very thin surface layer of the material is vaporized, protecting the underlying material, just as occurs with thermal radiation striking wooden houses (Glasstone and Dolan, The Effects of Nuclear Weapons).
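The 'no time for the heat to penetrate' point can be quantified with the standard thermal diffusion estimate, depth ≈ √(κt). The steel diffusivity value below is a typical handbook figure, assumed here for illustration:

```python
import math

KAPPA_STEEL = 1.2e-5  # m^2/s, approximate thermal diffusivity of mild steel (assumed value)

def heat_penetration_depth(kappa_m2_s, pulse_duration_s):
    # Characteristic distance heat diffuses into a solid during a pulse of given duration.
    return math.sqrt(kappa_m2_s * pulse_duration_s)

depth_fireball = heat_penetration_depth(KAPPA_STEEL, 1.0)   # ~3.5 mm for a ~1 s thermal pulse
depth_xray = heat_penetration_depth(KAPPA_STEEL, 10e-9)     # ~0.35 micron for a 10 ns X-ray pulse
```

Millimetres for a fireball-duration pulse, and well under a micron for the nanosecond X-ray flash: this is consistent with the sub-millimetre ablation depths reported in the tower and steel-ball observations above.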
References:
J. C. Nance, 'Nuclear Pulse Propulsion', IEEE Transactions on Nuclear Science, February 1965, p. 177.
T. W. Reynolds, 'Effective Specific Impulse of External Nuclear Pulse Propulsion Systems', Journal of Spacecraft and Rockets, October 1973, p. 629.
----------
‘The President put his name on the plaque Armstrong and Aldrin left on the moon and he telephoned them while they were there, but he cut America’s space budget to the smallest total since John Glenn orbited the Earth. The Vice-President says on to Mars by 1985, but we won’t make it by “stretching out” our effort. Perhaps NASA was too successful with Apollo. It violated the “Catt Concept”, enunciated by Britisher Ivor Catt. According to Catt, the most secure project is the unsuccessful one, because it lasts the longest.’
- Robert P. Crossley, Editorial, Popular Mechanics, Vol. 133, No. 5, May 1970, p. 14.
E.g., compare the Apollo project with the Vietnam war for price, length and success. Both were initially backed by Kennedy and Johnson as challenges to Communist space technology and subversion, respectively. The Vietnam war – the unsuccessful project – sucked in the cash for longer, which closed down the successful space exploration project!
Above: neutron bomb supporter Dr Edward Teller of the Lawrence Livermore National Laboratory stated in the San Francisco KQED-TV television Fallout and Disarmament debate with Nobel Laureate Linus Pauling on 20 February 1958:
“I believe that the second world war was brought on by a race in disarmament. The peace-loving nations disarmed, and when the Hitler tyranny armed inertia was too great ... he got away with his army and he almost conquered the world. ... If there is war, if the terrible catastrophe befalls us, then next we must try to keep that war as small as possible, and at the same time we must try to be sure that no more people will unwillingly be subjected to the Russian yoke. ... If such should happen, then it would be of great importance that these weapons should do as little damage in human life as possible. If a war of this kind has to be fought, then the danger from radioactivity will be very great indeed. ... there should not be unnecessary, uncontrollable radioactive dust – radioactive contamination, which would kill friend and foe alike. ... It is even possible, to my mind, that there is no damage; and there is the possibility, furthermore that very small amounts of radioactivity are helpful. ...
“Here is a recent quotation from Nature - the British publication. This says that due to our wearing tight clothes, and due to the increased temperature of the sperm plasm, to the organs which make our sperm, there will be an increase in mutations. Then it goes on to say that since our modes of dress have been predominant for several centuries, it might explain almost half the present load of spontaneous mutations. So we see how modes of dress, based chiefly on sexual taboos, might present genetic hazards one hundred to one thousand times greater than those estimated from different sources of radiation. ... even in the terrible event of war, I believe that in this war, if it were fought with the highly flexible and highly mobile nuclear weapons, it would not be necessary to take so many young people away from their homes. I do not believe, if we can localize wars, that the casualties need be very great.”
Now should the public be informed about positive research reports on radiation such as the following report? Or do we suppress it? Do we cover-up evidence which doesn't fit the popular media "radiation is bad" ideology?
W. L. Chen, Y. C. Luan, M. C. Shieh, S. T. Chen, H. T. Kung, K. L. Soong, Y. C. Yeh, T. S. Chou, S. H. Mong, J. T. Wu, C. P. Sun, W. P. Deng, M. F. Wu, and M. L. Shen, ‘Is Chronic Radiation an Effective Prophylaxis Against Cancer?’, published in the Journal of American Physicians and Surgeons, Vol. 9, No. 1, Spring 2004, pp. 6-10:
"An extraordinary incident occurred 20 years ago in Taiwan. Recycled steel, accidentally contaminated with cobalt-60 ([low dose rate, gamma radiation emitter] half-life: 5.3 y), was formed into construction steel for more than 180 buildings, which 10,000 persons occupied for 9 to 20 years. They unknowingly received radiation doses that averaged 0.4 Sv, a collective dose of 4,000 person-Sv. Based on the observed seven cancer deaths, the cancer mortality rate for this population was assessed to be 3.5 per 100,000 person-years. Three children were born with congenital heart malformations, indicating a prevalence rate of 1.5 cases per 1,000 children under age 19. The average spontaneous cancer death rate in the general population of Taiwan over these 20 years is 116 persons per 100,000 person-years. Based upon partial official statistics and hospital experience, the prevalence rate of congenital malformation is 23 cases per 1,000 children. Assuming the age and income distributions of these persons are the same as for the general population, it appears that significant beneficial health effects may be associated with this chronic radiation exposure."
Thus, a dose of roughly 0.4 Sv accumulated over 9-20 years corresponds to a dose rate of 2.3-5.1 microGrays per hour (0.23-0.51 millirads per hour), i.e. 23-51 times normal background; and this was associated with a fall in the normal cancer rate by a factor of 116/3.5 = 33, and a fall in congenital heart malformations by a factor of 23/1.5 = 15. These are big numbers!
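The dose-rate conversion is easily verified, taking the 0.10 microSieverts/hour normal background figure quoted earlier on this page and treating gamma-ray Grays and Sieverts as interchangeable:

```python
total_dose_sv = 0.4            # average dose received by the exposed population
hours_per_year = 24 * 365.25   # 8,766 hours

rate_slow = total_dose_sv / (20 * hours_per_year) * 1e6  # ~2.3 microSv/h (20-year exposure)
rate_fast = total_dose_sv / (9 * hours_per_year) * 1e6   # ~5.1 microSv/h (9-year exposure)

background = 0.10                    # microSv/h, normal sea-level rate quoted above
ratio_low = rate_slow / background   # ~23x background
ratio_high = rate_fast / background  # ~51x background
```

So the quoted '23-51 times normal background' range follows directly from the 0.4 Sv total and the 9-20 year occupancy spread.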
Let me repeat the facts again just to clarify this very important point. Chen and thirteen other physicians investigated the apparent benefits of low-level radiation in Taiwan ("Is Chronic Radiation an Effective Prophylaxis Against Cancer?", Journal of American Physicians and Surgeons, v9, n1, 2004). After a radioactive source was accidentally mixed into industrial steel used to build apartments in Taiwan, 10,000 persons were unknowingly exposed to low-level radiation for periods of 9-20 years, and in this group cancer rates were lower than in the general population by a factor of 33 (a reduction from 116 to just 3.5 per 100,000 person-years), while genetic defects fell by a factor of 15 (from 23 cases per 1,000 children to just 1.5). These are such enormous benefits that you would expect all the donor- and publicly-funded "Cancer Research" institutes to be studying how dose rates of radiation a few hundred times background can apparently slash cancer risks and genetic defect rates to such an extent.
The statistics in the paper by Chen and others have been alleged to apply to a younger age group than the general population, affecting the significance of the data, although in other ways the data are more valid than extrapolations to low doses from the Hiroshima and Nagasaki data. For instance, the cancer statistics for survivors of high doses in Hiroshima and Nagasaki would have been prejudiced by the absence of any blinding against an "anti-placebo" effect: increased fear, psychological stress and worry about the long-term effects of radiation, and the associated behaviour. The 1958 book about the Hiroshima and Nagasaki survivors, "Formula for Death", makes the point that highly irradiated survivors often smoked more, in the belief that they were doomed to die from radiation-induced cancer anyway. The fear culture of the irradiated survivors would therefore statistically be expected to produce deviations from normal behaviour, in some cases increasing the cancer risks above those due purely to the radiation exposure.
For up-to-date data and literature discussions on the effects of DNA repair enzymes on preventing cancers from low-dose rate radiation, please see
http://en.wikipedia.org/wiki/Radiation_hormesis
There is also evidence for low dose radiation benefits from Hiroshima and Nagasaki's joint American-Japanese Radiation Effects Research Foundation (RERF), which is being covered up by the statistical fiddle of "lumping together" the majority of the survivors into one large dose interval group, while taking small dose intervals only at high doses. This fiddle falsely omits the benefits from the boosting of the P53 DNA repair enzyme by low radiation doses in those cities. The statistical bias in the table below from the RERF Brief Guide is in every sense a classic example of the biased presentation of data. Remember that at high doses the cancer data are least reliable, because the average amount of radiation shielding by buildings needed to survive the initial effects and get cancer years later was very high, and estimates of the exact shielding factors remain one of the greatest uncertainties in the DS02 dosimetry, as shown for example by the inconsistent curve of percentage temporary epilation versus dose in the same publication. The dosimetry is more accurate at lower doses, because the average radiation shielding of survivors is much smaller at those lower doses:
Radiation delivered over long periods at a few hundred times the natural background dose rates stimulates the use of body resources to produce more of the natural DNA repair enzyme, protein P53, thus utilizing more of the energy resources of the body for repairing DNA breaks than is usually allocated, and this reduces the natural cancer and genetic risks. This effect is in some sense like working out at the gym regularly: you end up after regular exercise not generally more tired, but generally fitter and more muscular, because the body responds in the long run by using more resources to adapt by strengthening itself and maintaining hormesis (an effect well known in chemotherapy: "what doesn't kill you, makes you stronger").
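The dose-interval "lumping" fiddle described above can be illustrated numerically. The sketch below uses a purely hypothetical toy dose-response curve (not RERF data) with a hormetic dip at low doses, and shows how averaging over one wide low-dose bin makes the dip vanish:

```python
# Hypothetical illustration (not RERF data): how lumping all low doses into
# one wide dose interval can hide a hormetic dip in the dose-response curve.

def excess_risk(dose):
    """Toy dose-response: a protective dip below 0.2 Sv, harm above it."""
    return -0.05 * dose if dose < 0.2 else 0.5 * (dose - 0.2)

# Doses 0, 0.01, ..., 1.00 Sv, indexed by integer steps to avoid float edge cases.
steps = range(101)

low = [excess_risk(i * 0.01) for i in steps if i < 20]    # fine low-dose bins
wide = [excess_risk(i * 0.01) for i in steps if i < 50]   # one wide 0-0.5 Sv bin

fine_min = min(low)                 # negative: the protective dip is resolved
wide_mean = sum(wide) / len(wide)   # positive: the dip is averaged away

print(f"Minimum excess risk, fine low-dose bins:  {fine_min:+.4f}")
print(f"Mean excess risk, one wide 0-0.5 Sv bin:  {wide_mean:+.4f}")
```

The fine bins show a negative (protective) excess risk at low dose, while the single wide bin reports a net positive risk for the very same underlying curve.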
In the West, the fashionable media censor politically incorrect facts despite freedom of speech. If you want to see why this censorship of the benefits of low level radiation is continuing, see the relatively vague and unconvincing (apart from a quotation from Dr Robert Rowland) article by James Muckerheide in the year 2000, "It’s Time to Tell the Truth About the Health Benefits of Low-Dose Radiation", and see also the weak graphical correlations shown in Dr T. D. Luckey's 2008 paper, "The Health Effects of Low-Dose Ionizing Radiation", Journal of American Physicians and Surgeons, v13, n2, pp. 39-42, which does at least summarize the 2004 Chen paper in the same journal concisely:
"In 1982-1983, several apartments in Taipei City, Taiwan, were built with structural steel contaminated with cobalt-60. Chen et al. noted the total cancer death rates for radiation-exposed adult occupants and controls in the city were comparable when the apartments were first occupied. As both groups aged, the cancer mortality rate in the radiation-exposed group decreased while the cancer mortality rate of controls increased. The cancer mortality rate of those who had lived 9–20 years in these buildings was only 3% that of the general adult population."
Of course, it has always been known, since the work (mentioned above) of French radiologists, that radiation is more effective at killing rapidly dividing cancer cells than normal cells (because cells are more vulnerable during cell division than at other times, and more rapidly dividing cells spend a greater percentage of their time in this vulnerable state than healthy cells do). But this discovery that low dose rates of radiation can produce a health benefit by preventing cancer in the first place is new.
What is happening here is the "what doesn't kill you makes you stronger" effect: dose rates of 20-50 times normal background over a period of 1-2 decades stimulate a stronger DNA repair enzyme system. The body simply devotes more energy from food into building more DNA repair enzymes, and it over-compensates, thereby reducing natural cancer rates. This positive benefit from radiation would occur up to the threshold for cancer seen in the radium dial painters, 57 micrograys per hour (5.7 millirads per hour), or 570 times normal background. Only if the dose rate becomes too high does the rate of damage overwhelm natural DNA repair mechanisms and cause cancer:
‘... it is important to note that, given the effects of a few seconds of irradiation at Hiroshima and Nagasaki in 1945, a threshold near 200 mSv may be expected for leukemia and some solid tumors. [Sources: UNSCEAR, Sources and Effects of Ionizing Radiation, New York, 1994; W. F. Heidenreich, et al., Radiat. Environ. Biophys., vol. 36 (1999), p. 205; and B. L. Cohen, Radiat. Res., vol. 149 (1998), p. 525.] For a protracted lifetime natural exposure, a threshold may be set at a level of several thousand millisieverts for malignancies, of 10 grays for radium-226 in bones, and probably about 1.5-2.0 Gy for lung cancer after x-ray and gamma irradiation. [Sources: G. Jaikrishan, et al., Radiation Research, vol. 152 (1999), p. S149 (for natural exposure); R. D. Evans, Health Physics, vol. 27 (1974), p. 497 (for radium-226); H. H. Rossi and M. Zaider, Radiat. Environ. Biophys., vol. 36 (1997), p. 85 (for radiogenic lung cancer).] The hormetic effects, such as a decreased cancer incidence at low doses and increased longevity, may be used as a guide for estimating practical thresholds and for setting standards. ...
‘Though about a hundred of the million daily spontaneous DNA damages per cell remain unrepaired or misrepaired, apoptosis, differentiation, necrosis, cell cycle regulation, intercellular interactions, and the immune system remove about 99% of the altered cells. [Source: R. D. Stewart, Radiation Research, vol. 152 (1999), p. 101.] ...
‘[Due to the Chernobyl nuclear accident in 1986] as of 1998 (according to UNSCEAR), a total of 1,791 thyroid cancers in children had been registered. About 93% of the youngsters have a prospect of full recovery. [Source: C. R. Moir and R. L. Telander, Seminars in Pediatric Surgery, vol. 3 (1994), p. 182.] ... The highest average thyroid doses in children (177 mGy) were accumulated in the Gomel region of Belarus. The highest incidence of thyroid cancer (17.9 cases per 100,000 children) occurred there in 1995, which means that the rate had increased by a factor of about 25 since 1987.
‘This rate increase was probably a result of improved screening [not radiation!]. Even then, the incidence rate for occult thyroid cancers was still a thousand times lower than it was for occult thyroid cancers in nonexposed populations (in the US, for example, the rate is 13,000 per 100,000 persons, and in Finland it is 35,600 per 100,000 persons). Thus, given the prospect of improved diagnostics, there is an enormous potential for detecting yet more [fictitious] "excess" thyroid cancers. In a study in the US that was performed during the period of active screening in 1974-79, it was determined that the incidence rate of malignant and other thyroid nodules was greater by 21-fold than it had been in the pre-1974 period. [Source: Z. Jaworowski, 21st Century Science and Technology, vol. 11 (1998), issue 1, p. 14.]’
- Zbigniew Jaworowski, 'Radiation Risk and Ethics: Health Hazards, Prevention Costs, and Radiophobia', Physics Today, April 2000, pp. 89-90.
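The unit conversions in the radium dial painter threshold quoted earlier (57 micrograys per hour, 5.7 millirads per hour, 570 times background) can be checked directly, assuming a normal background of about 0.1 microgray per hour, as the 570-fold factor implies:

```python
# Unit check on the radium dial painter threshold dose rate quoted earlier.
# Assumes a normal background of ~0.1 microgray/hour (implied by the 570x factor).

threshold_uGy_per_hr = 57.0    # microgray per hour
background_uGy_per_hr = 0.1    # assumed normal background, microgray per hour

# 1 rad = 0.01 gray, so 1 microgray = 0.1 millirad
threshold_mrad_per_hr = threshold_uGy_per_hr * 0.1   # millirads per hour

times_background = threshold_uGy_per_hr / background_uGy_per_hr

print(f"{threshold_uGy_per_hr} uGy/hr = {threshold_mrad_per_hr:.1f} mrad/hr")
print(f"= {times_background:.0f} times assumed background")
```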
Protein P53, discovered only in 1979, is encoded by gene TP53, which occurs on human chromosome 17. P53 also occurs in other mammals, including mice, rats and dogs. P53 is one of the proteins which continually repair breaks in DNA, which breaks easily at body temperature: the DNA in each cell of the human body suffers at least two single strand breaks every second, and one double strand (i.e. complete double helix) DNA break at least once every 2 hours (5% of radiation-induced DNA breaks are double strand breaks, while only 0.007% of spontaneous DNA breaks at body temperature are double strand breaks)! Cancer occurs when several breaks in DNA happen to occur by chance at nearly the same time, giving several loose strand ends at once, which repair proteins like P53 then repair incorrectly, causing a mutation which can be proliferated somatically. This cannot occur when only one break occurs, because only two loose ends are produced, and P53 will reattach them correctly. But if low-LET ionising radiation levels are increased to a certain extent, causing more single strand breaks, P53 works faster and is able to deal with breaks as they occur, so that multiple broken strand ends do not arise. This prevents DNA strands from being repaired incorrectly, and prevents cancer - a result of mutation caused by faults in DNA - from arising. Too much radiation, of course, overloads the P53 repair mechanism, which then cannot repair breaks as they occur, so multiple breaks begin to appear and loose ends of DNA are wrongly connected by P53, causing an increased cancer risk:
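Incidentally, the spontaneous break rates quoted above are mutually consistent, as a quick check shows (taking, as in the text, two single strand breaks per second and one double strand break per two hours):

```python
# Consistency check on the spontaneous DNA break rates quoted above:
# ~2 single strand breaks per second plus ~1 double strand break per 2 hours
# should imply the quoted ~0.007% spontaneous double-strand fraction.

ssb_per_second = 2.0
seconds_per_dsb = 2 * 3600    # one double strand break every 2 hours

ssb_per_dsb = ssb_per_second * seconds_per_dsb   # 14,400 single strand breaks
dsb_fraction = 1.0 / (ssb_per_dsb + 1)           # double-strand share of all breaks

print(f"Spontaneous double-strand fraction: {dsb_fraction:.3%}")
```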
In another post, we examine in detail the May-June 1957 Hearings Before the Special Subcommittee on Radiation of the Joint Committee on Atomic Energy, U.S. Congress, The Nature of Radioactive Fallout and Its Effects on Man, where the false dose-threshold (not dose rate-threshold) theory was publicly killed off (in a political-journalism scrum sense, not a scientific evidence sense) by a consortium of loud-mouthed and physically ignorant fruitfly and maize geneticists (headed by Nobel Laureates Muller and Lewis), against only an incompetent and quiet defense of the scientific data from cancer radiotherapy experts, whose experience was that high dose rates cause more damage than low dose rates. The geneticists argued that the genetic effects of radiation on fruitflies and maize showed no sign of dose rate effects or dose threshold effects. They then extrapolated from flies and maize to predict the same for human beings, and they also claimed that this genetic result should apply to all normal cell division (somatic) radiation effects, not just genetic effects! Glasstone summarized this linear no-threshold theory on page 496 of the 1957 edition of The Effects of Nuclear Weapons:
"There is apparently no amount of radiation, however small, that does not cause some increase in the normal mutation frequency. The dose rate of the radiation exposure or its duration have little influence; it is the total accumulated dose to the gonads that is the important quantity."
Flies and seasonal plants don't need DNA repair enzymes, which is why they show no dose rate dependence: they simply don't live long enough to get a serious cancer risk caused by DNA copying errors during cell fissions. This is not so in humans, and even mice. Glasstone and Dolan write in the 1977 edition of The Effects of Nuclear Weapons, pages 611-612 (paragraphs 12.209-12.211):
"From the earlier studies of radiation-induced mutations, made with fruitflies, ... The mutation frequency appeared to be independent of the rate at which the radiation dose was received. ... More recent experiments with mice, however, have shown that these conclusions must be revised, at least for mammals.
"... in male mice ... For exposure rates from 90 down to 0.8 roentgen per minute ... the mutation frequency per roentgen decreases as the exposure rate is decreased.
"... in female mice ... The radiation-induced mutation frequency per roentgen decreases continuously with the exposure rate from 90 roentgens per minute downward. At an exposure rate of 0.009 roentgen per minute [0.54 roentgen/hour], the total mutation frequency in female mice is indistinguishable from the spontaneous frequency. There thus seems to be an exposure-rate threshold below which radiation-induced mutations are absent or negligible, no matter how large the total (accumulated) exposure to the female gonads, at least up to 400 roentgens."
The Oak Ridge Megamouse Radiation Exposure Project
Reference: W. L. Russell, “Reminiscences of a Mouse Specific-Locus Test Addict”, Environmental and Molecular Mutagenesis, v14 (1989), Supplement 16, pp. 16–22.
The source of Glasstone and Dolan’s dose-rate genetic effects threshold data (replacing the fruitfly insect and maize plant data of Muller, Lewis and other 1950s geneticists who falsely extrapolated directly from insects and plants to humans) is the Oak Ridge National Laboratory “megamouse project” by Liane and William Russell. This project exposed seven million mice to a variety of radiation situations to obtain statistically significant mammal data showing the effects of dose rate upon the DNA mutation risk (which in somatic cells can cause cancer). Seven different locus mutations were used, which showed a dependence of genetic risk on dose rate that could only be explained by DNA repair processes. This contradicted the insect and plant response, which showed no dose rate effect. With the results of this enormous mammal radiation exposure project, the observed effects of high doses and high dose rates in humans could be accurately extrapolated to low dose rates, without using the false linear no-threshold model that applies to insects and plants, which lack the advanced DNA repair enzymes like P53 found in mammals:
“As Hollaender remembers it: ‘Muller and Wright were the only two geneticists who backed the mouse genetics study. The rest of the geneticists thought we were wasting our time and money!’”
- Karen A. Rader, “Alexander Hollaender’s Postwar Vision for Biology: Oak Ridge and Beyond”, Journal of the History of Biology, v39 (2006), pp. 685–706.
For an interesting discussion of the way that the radiation controversy led to a change in thinking about DNA, from a fixed chemical structure (as believed in 1957, after the structure of DNA was discovered in its misleadingly non-cellular solid crystal form, which was required for X-ray diffraction analysis) to today’s far more dynamic picture of DNA in the cell nucleus as a delicate strand that is repeatedly being broken (many times a minute) by normal water-molecule Brownian motion bombardment at body temperature, and repaired by DNA repair enzymes like protein P53, see the article by Doogab Yi, “The coming of reversibility: The discovery of DNA repair between the atomic age and the information age”, Historical Studies in the Physical and Biological Sciences, v37 (2007), Supplement, pp. 35–72:
“This paper examines the contested ‘biological’ meaning of the genetic effects of radiation amid nuclear fear during the 1950s and 1960s. In particular, I explore how the question of irreversibility, a question that eventually led to the discovery of DNA repair, took shape in the context of postwar concerns of atomic energy. Yale biophysicists who opposed nuclear weapons testing later ironically played a central role in the discovery of DNA excision repair, or "error-correcting codes" that suggested the reversibility of the genetic effects of radiation. At Yale and elsewhere, continuing anticipation of medical applications from radiation therapy contributed to the discovery of DNA repair. The story of the discovery of DNA repair illustrates how the gene was studied in the atomic age and illuminates its legacy for the postwar life sciences. I argue that it was through the investigation of the irreversibility of the biological effects of radiation that biologists departed from an inert view of genetic stability and began to appreciate the dynamic stability of the gene. Moreover, the reformulation of DNA repair around notions of information and error-correction helped radiobiologists to expand the relevance of DNA repair research beyond radiobiology, even after the public concerns on nuclear fallout faded in the mid-1960s.”
In fact, the “safe dose rate” concept has always existed (most recently dressed up with health physics sophistry like ALARA, “As Low As Reasonably Achievable”), in the way that radiation safety guides have been formulated as a maximum dose per unit time interval. For example, on page 102 of the 1957 Congressional Hearings The Nature of Radioactive Fallout and Its Effects on Man, nuclear testing scientific director Dr Alvin C. Graves testifies:
“I have forgotten the title, but I think it is the American Commission for Radiation Protection, or something of that sort, originally stated that the workers in radioactivity could take one tenth of a roentgen per day forever without suffering injury. [This is 36.5 R/year or 1095 R over 30 years, roughly the minimum dose needed for bone changes in the radium dial painters.]”
Dr Jane Orient, 'Homeland Security for Physicians', Journal of American Physicians and Surgeons, vol. 11, number 3, Fall 2006, pp. 75-9:
'In the 1960s, a group of activist physicians called Physicians for Social Responsibility (PSR) undertook to "educate the medical profession and the world about the dangers of nuclear weapons," beginning with a series of articles in the New England Journal of Medicine. [Note that journal was publishing information for anti-civil defense propaganda back in 1949, e.g. the article in volume 241, pp. 647-53 of New England Journal of Medicine which falsely suggests that civil defense in nuclear war would be hopeless because a single burned patient in 1947 with 40% body area burns required 42 oxygen tanks, 36 pints of plasma, 40 pints of whole blood, 104 pints of fluids, 4,300 m of gauze, 3 nurses and 2 doctors. First, only unclothed persons in direct line of sight without shadowing can get 40% body area burns from thermal radiation, second, duck and cover offers protection in a nuclear attack warning, and G. V. LeRoy had already published, two years earlier, in J.A.M.A., volume 134, 1947, pp. 1143-8, that less than 5% of burns in Hiroshima and Nagasaki were caused by building and debris fires. In medicine it is always possible to expend vast resources on patients who are fatally injured. In a mass casualty situation, doctors should not give up just because they don't have unlimited resources; as at Hiroshima and Nagasaki, they would need to do their best with what they have.] On its website, www.psr.org, the group boasts that it "led the campaign to end atmospheric nuclear testing." With this campaign, the linear no-threshold (LNT) theory of radiation carcinogenesis became entrenched. It enabled activists to calculate enormous numbers of potential casualties by taking a tiny risk and multiplying it by the population of the earth. As an enduring consequence, the perceived risks of radiation are far out of proportion to actual risks, causing tremendous damage to the American nuclear industry. ... 
Efforts to save lives were not only futile, but unethical: Any suggestion that nuclear war could be survivable increased its likelihood and was thus tantamount to warmongering, PSR spokesmen warned. ...
'For the mindset that engendered and enables this situation, which jeopardizes the existence of the United States as a nation as well as the lives of millions of its citizens, some American physicians and certain prestigious medical organizations bear a heavy responsibility.
'Ethical physicians should stand ready to help patients to the best of their ability, and not advocate sacrificing them in the name of a political agenda. Even very basic knowledge, especially combined with simple, inexpensive advance preparations, could save countless lives.'
‘International Physicians for the Prevention of Nuclear War: Messiahs of the Nuclear Age?’, The Lancet (British medical journal), 18 November 1988, pp.1185-6, by Jane M. Orient, MD:
'... history is apparently not among the areas of expertise claimed by IPPNW [international physicians for the prevention of nuclear war]. Its spokesmen have yet to comment on the Washington Naval Treaty of 1922, the Kellogg-Briand Pact of 1928 (for which Kellogg and Briand received the Nobel Peace Prize), the Oxford Peace Resolution of 1934, the Munich Agreement of 1938, or the Molotov-Ribbentrop Pact of 1939, and on the effectiveness of these measures in preventing World War II. ...
'Sir Norman Angell (also a Nobel Peace Prize winner), in his 1910 best-seller entitled The Great Illusion, showed that war had become so terrible and expensive as to be unthinkable. The concept of ‘destruction before detonation’ was not discovered by Victor Sidel (Sidel, V. W., ‘Destruction before detonation: the impact of the arms race on health and health care’, Lancet 1985; ii: 1287-1289), but was previously enunciated by Neville Chamberlain, who warned his Cabinet about the heavy bills for armaments: ‘even the present Programmes were placing a heavy strain upon our resources’ (Minutes of the British Cabinet meeting, February 3, 1937: quoted in Fuchser, L. W., ‘Neville Chamberlain and Appeasement: a Study in the Politics of History’, Norton, New York, 1982). ...
'Psychic numbing, denial, and ‘missile envy’ (Caldicott, H., Missile envy: the arms race and nuclear war, New York: William Morrow, 1984) are some of the diagnoses applied by IPPNW members to those who differ with them. However, for the threats facing the world, IPPNW does not entertain a differential diagnosis, nor admit the slightest doubt about the efficacy of their prescription, if only the world will follow it. So certain are they of their ability to save us from war that these physicians seem willing to bet the lives of millions who might be saved by defensive measures if a nuclear attack is ever launched.
'Is this an omnipotence fantasy?'
Here are some extracts from Dr Orient's letter to FEMA about the continued use of the lying LNT theory of radiation for long-term effects propaganda:
Jane M. Orient, M.D.
President, Physicians for Civil Defense
1601 N. Tucson Blvd. Suite 9
Tucson, AZ 85716
(520) 325-2680
www.physiciansforcivildefense.org
To Rules Docket Clerk
Office of the General Counsel
Federal Emergency Management Agency
Room 840, 500 C Street S.W.
Washington, D.C. 20472
RE: Docket #: DHS-2004-0029
Docket #: Z-RIN 1660-ZA02
FEMA-RULES@dhs.gov
We agree that flexibility is required in responding to incidents involving radiological dispersal device (RDD) or improvised nuclear device (IND). It is critical that actions taken do more good than harm. The dangers of panic, the shut-down of essential services, and disruption of the economy and social arrangements could vastly outweigh the supposed dangers of an increased exposure to radiation, particularly in the event of the use of an RDD.
We are disappointed that the document does not explicitly recognize that current radiation protection standards are based on the linear no-threshold (LNT) theory of radiation carcinogenesis. This theory calculates casualties based on collective doses. The assumptions are the equivalent of saying that if one person dies from ingesting one thousand aspirin tablets all at once, that one person will die if each one of the thousand persons ingests one aspirin tablet each. In fact, all actual evidence indicates that radiation, like most other potentially adverse exposures, exhibits a biphasic dose-response curve. While high levels are damaging or lethal, within a certain range at the lower end of the scale there is a seemingly paradoxical stimulatory or protective effect. Persons with accidental or occupational exposures within this “hormetic” range have a lower incidence of cancer and birth defects, and have had an increase in longevity as well. Thus, measures to “protect” people against exposures in this range may deprive them of a beneficial health effect, as well as harming them through excessive costs or deprivation of the other potential benefits of technology.
... It should be noted that the average background dose on the Colorado plateau is 600 mrem per year, and in some areas of the world it is much higher than that. For example, in Ramsar, Iran, the average background is about 48 rems per year (48,000 mrem per year) without noticeable adverse health effects. Forced resettlement, on the other hand, would cause many billions of dollars in damage to the economy as well as social upheaval. Because of widespread public fear of low-dose radiation, many people might choose to be resettled rather than face such increased exposure, but persons should not be forced to abandon their homes, personal property, and businesses based upon unfounded fears. ...
In appointing technical advisory committees, it would appear important to include persons whose reputation is not strongly invested in the linear no-threshold hypothesis, who would thus find it difficult or impossible to change their position. A full range of views must be heard and not suppressed by a “consensus” process that strongly pressures participants to approve a predetermined position and excludes those who do not.
We think it is critical that the United States government should not enable terrorists to destroy a large area of the country and cripple its economy by exploiting unwarranted fears. Instead, we need to be prepared to mitigate the damage should efforts at interdiction fail.
66 Comments:
“The General Advisory Committee, now under Rabi’s leadership, was also raising questions about the future of Livermore. At a GAC meeting shortly after the Oppenheimer verdict was announced, Rabi described the effort there as ‘amateurish,’ adding, ominously, that Teller’s lab did not have responsibility for any ‘necessary’ part of the weapons program. After Koon’s failure, the AEC had canceled its order for Ramrod, to Teller’s chagrin. (Stung by the move, Teller told the GAC that he had plans for a 10,000-megaton bomb—something that Rabi and colleagues dismissed as ‘an advertising stunt.’)” - From Gregg Herken’s Brotherhood of the Bomb.
“However, he knew better than anyone that, following the failure of its first three tests in the spring of 1954, Livermore was in serious danger of closure. At a GAC meeting soon after the hearing verdict, Isidor Rabi described its performance as ‘amateurish’ and commented that Teller’s laboratory did not have responsibility for any ‘necessary’ part of the weapons programme. Then, that September, close on the heels of the Shepley-Blair book, Norris Bradbury wrote to the AEC suggesting that the second laboratory should be made subordinate to his own, Los Alamos. ‘The brilliant new ideas have not appeared,’ he pointed out.
To compound matters, the AEC also cancelled its order for Ramrod, the thermonuclear trigger that Teller had hoped would at last make feasible his old classical Super design. In response, Teller tried to impress Washington with plans for a device with a preposterous yield of no less than 10,000 megatons, a thousand times that of the Mike shot. It was dismissed by Rabi as an advertising stunt.” - From Peter Goodchild’s Edward Teller: The Real Dr Strangelove.
However, both books are written from a weak point of view, since their authors are not physicists.
At the same time, DOD noted that while the AEC had already spent time and money on the development of another weapon for which DOD had never established a requirement, that device was undesirably heavy, and its yield loss when converted from conventional to "clean" configuration was totally unacceptable. In light of these factors, and because the new Class B weapon would be available relatively soon, the DOD recommended that the AEC discontinue its development of the device. - From Hansen.
I believe this is about this device.
It would be interesting if Dr Edward Taylor had executed these plans.
What would the consequences of this detonation have been, with its large fission fraction of ~89%? What would the fallout effects have been on China, Japan and others (assuming a Pacific detonation, of course)? What would the political effects have been? I believe this could have ended the Cold War in the mid-1950s.
Sorry ,I mean of course Dr Edward Teller.
With enough broomsticks, everyone would have made it. See volume 2 of Dr Carl F. Miller's Fallout and Radiological Countermeasures, limited distribution to U.S. contractors and agencies only, January 1963. Miller shows that the heavier the fallout, the greater the relative efficiency of decontamination. The worst case is where you are trying to decontaminate an invisibly small, harmless amount.
Where it looks like snow, you can throw away the geiger counter and sweep it into gutters, then flush it down drains. Out of cities, deep plow it into the soil, then use the soil to grow crops with short roots until it has decayed.
It decays fast. If you salt the bomb with cobalt, the specific activity is relatively low compared with that from a U-238 or thorium casing which undergoes fast neutron fission. You can decontaminate the fallout before receiving a lethal dose, because cobalt-60 has such a long half life.
About 72% of fission products have a half life less than 1 day, whereas cobalt-60 has a half life of about 5.3 years. So for every neutron captured by cobalt-59 to form radioactive cobalt-60, you get a measly 2.5 MeV of gamma rays (two gamma rays, one 1.17 MeV and one 1.33 MeV) spread over a mean life of 5.3/ln2 = 7.6 years (the mean life for normal exponential radioactive decay is always a factor of 1/ln2 ~ 1.44 times the half-life). Compare this to the 200 MeV of energy, including 7 MeV of delayed gamma rays, that you get from using the neutron to cause fission and create fission products!
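These cobalt-60 figures are easy to reproduce (decay energies as quoted above):

```python
import math

# Reproducing the cobalt-60 figures quoted above.
half_life_years = 5.3
mean_life_years = half_life_years / math.log(2)   # mean life = half life / ln 2

gamma_energy_co60_MeV = 1.17 + 1.33   # the two Co-60 gamma lines, 2.5 MeV total
fission_energy_MeV = 200.0            # total per fission, incl. ~7 MeV delayed gammas

energy_ratio = fission_energy_MeV / gamma_energy_co60_MeV   # energy per neutron used

print(f"Co-60 mean life: {mean_life_years:.1f} years")
print(f"Fission yields {energy_ratio:.0f}x more energy per neutron than Co-60 gammas")
```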
This is why you can't do much with fallout! It is so easy to deal with. You can firehose it down drains while it lands to further reduce doses, just like the "washdown" sprinkler system American ships used at nuclear tests in heavy fallout. Alternatively, you can evacuate while the radiation levels are high. Most "rainout" goes straight down the drain or deep into soil, shielding the radiation (only puddles present a danger).
As stated in this post, most of the radiation hysteria in the media is based on lies. There are very serious dangers from short and long term effects if you are exposed to high dose rates, especially over 0.5 R/hr or 0.5 cGy/hr, which is the very upper limit for the body's DNA repair enzymes like P53 to repair damage as it occurs. Beyond that level, the DNA breaks into multiple bits before the ends are rejoined (sometimes in the wrong order, either killing the cell or in a few cases causing cancer or genetic risks).
But at lower dose rates, especially 1 mR/hr (100 times background), the controlled, blind experimental data show benefits.
One more thing about fallout hazards from the 10,000 megaton "doomsday device" you refer to.
The fallout arrival time increases with bomb yield. During that time when the fallout is still at high altitudes, the most intense radioactivity decays without irradiating people.
For the 1 kt Nevada "Sugar" first surface burst in 1951, fallout arrived within a minute near ground zero.
For the biggest surface burst ever tested, the Bikini Atoll "Bravo" test in 1954, fallout arrival times were carefully instrumented with automatic tray elevator collectors throughout the Atoll and lagoon fallout collection rafts (all under the mushroom cloud), giving a mean fallout arrival time of 28 minutes, a time to peak dose rate of 1 hour after detonation, and a cessation at about 2 hours after detonation (weapon test report WT-915).
As you go up to bigger yields, the fireball begins to rise so fast that it goes up ballistically to very high altitudes.
Since nuclear explosions don't affect gravity, and since gravity is one factor determining the time for fallout particles to be deposited (there is some downdraft motion at the periphery of the cloud as well, but that doesn't cancel out the tremendous updraft), the fallout arrival time gets longer as yield increases.
This means that the doses don't scale up with fission yield.
The total amount of radioactivity actually deposited on the ground is a progressively smaller fraction as bomb yield increases, due to the high rate of decay at local times while the activity is still in the cloud or at high altitudes.
This is covered up by the presentation of 1 hour "reference" dose rates in Glasstone and Dolan and other sources. Those 1 hour reference dose rates are totally imaginary and never exist over most of the fallout area, where fallout simply hasn't arrived or been completely deposited within 1 hour of detonation. It's all lies for Cold War propaganda and confusion. If they wanted to present useful data, they would instead show the REAL WORLD dose rates after fallout deposition was complete, and accumulated doses as a function of bomb yield. These fall off more rapidly with increasing downwind distance than the "1 hour reference dose rates" do, because the longer the fallout takes to arrive, the more it has decayed by radioactive decay before anyone on the ground can be exposed to it.
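The size of this effect can be sketched with the approximate t^-1.2 (Way-Wigner) decay rule mentioned later in this thread. The reference dose rate and arrival times below are hypothetical round numbers, purely for illustration:

```python
# Sketch: why a "1 hour reference dose rate" overstates the hazard when
# fallout arrives late. Dose rate decays as roughly t^-1.2 (Way-Wigner),
# so the rate when fallout arrives at t_a hours is R1 * t_a**-1.2, where
# R1 is the imaginary 1-hour reference value.

def dose_rate(r1: float, t_hours: float) -> float:
    """Dose rate (R/hr) at t_hours after burst, given 1-hour reference r1."""
    return r1 * t_hours ** -1.2

r1 = 1000.0  # hypothetical 1-hour reference dose rate, R/hr
for t_arrival in (1, 2, 5, 10, 20):
    r = dose_rate(r1, t_arrival)
    print(f"fallout arriving at {t_arrival:2d} h: {r:6.1f} R/hr "
          f"({100 * r / r1:.0f}% of the reference value)")
```

With a 10 hour arrival time, typical of distant downwind areas from a high-yield burst, the dose rate on arrival is only about 6% of the 1 hour reference figure, before any shielding or decontamination is even considered.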
In the Bulletin of the Atomic Scientists, November 1961, W. H. Clark discussed similar things: "A large mine containing 1000 tons of heavy water would fission 5000 tons of uranium. Fifty such mines would be required to produce any lethal remote fallout." (So this device yields, from fusion, 17.6 gigatons; from fission, 88 gigatons; total, 105.6 gigatons; total fission yield for the fifty mines, 4.4 teratons.) Later, he wrote that cobalt bombs are much more effective because the half-life of cobalt is well matched to the residence time of bomb debris in the stratosphere: "Long ago Leo Szilard pointed out that only 50 tons of neutrons, obtainable from 2500 tons of heavy water, would produce enough cobalt-60 to give a dose of 10,000 roentgens over the whole earth. The lethal dose for man, for radiation distributed over a long period of time, is thought to be 5000 roentgens, but it is not certain that a 10,000 roentgen dose in the open would kill the entire population...." Finally, he concluded that cobalt bombs designed to exterminate the human race would require a few hundred thousand tons of heavy water and an equal amount of cobalt. But how accurate is this estimate? How large would such a bomb have to be (I mean a proper design, a mix of U-238 and cobalt) to kill all humans?
http://www.time.com/time/magazine/article/0,9171,828877,00.html
Science: Recipe for Doomsday
Friday, Nov. 24, 1961
Herman Kahn's ponderous shocker, On Thermonuclear War, frequently mentions a weapon whose purpose is to end all human life: the Doomsday Machine. Kahn discusses its political uses as calmly as if it were a bug killer, but he gives few technical details. In the latest Bulletin of the Atomic Scientists, Physicist W. H. Clark spells out some little-known facts about Doomsday Machines—and some of the more refined horrors that nuclear war could bring. Both the U.S. and Russia already can build near-Doomsday bombs, but far more disturbing is the fact that they are sufficiently inexpensive and simple to be built by smaller nations with an emerging atomic technology.
The detonator of a thermonuclear bomb is a fission bomb containing plutonium or uranium 235, and its explosion sets off the main charge of fusion material, which is essentially deuterium (heavy hydrogen). Fission detonators are expensive, but a single one can explode any amount of comparatively cheap fusion material. Result: the bigger the bomb, the cheaper it is in terms of explosive yield. Clark figures that a ten-megaton bomb costs somewhat more than $1,000,000, mostly for the detonator. But further increases in yield cost only about $5,000 per megaton, so that the price tag on a 100-megaton bomb is roughly $1,500,000. A 1,000-megaton bomb would cost $6,000,000. Once they acquired enough fission material, many middle-sized countries could fashion even bigger bombs and, by using certain techniques, produce near-Doomsday destruction and death by exploding comparatively few of them.
Among the techniques: >A nuclear bomb could be loaded on a submarine or barge and planted on the ocean bottom near the coast of a target country. Exploded under two miles of water (at the aggressor's will and from great distance), a 20,000-megaton bomb would stir up a wave whose crest would still be 100 feet high after it had traveled 200 miles. It would wash most coastlines bare and ride far inland.
>For better Doomsday effect, large bombs could be made as radioactive as possible. One way is to "salt" them with sodium, which becomes intensely radioactive when it absorbs neutrons. Clark figures that a 20,000-megaton bomb of this kind would contaminate 200,000 square miles (four times the area of New York State) so heavily that even people in basement shelters would surely die. But since the half life of radioactive sodium 24 is only 15 hours, the bomb's products would lose much of their punch before the wind could carry them around the earth. Thus, a sodium-salted bomb would not be a true Doomsday Machine.
> More deadly yet would be large fission-fusion-fission bombs whose outer blankets of cheap uranium 238 yield energy as well as deadly fission products. Clark believes that any nuclear power could easily destroy a nation with the close-range fallout effect from this type of bomb, but he thinks that the human race as a whole would be more resistant.
> Probably the most effective bomb in spreading worldwide death would be salted with cobalt, which absorbs neutrons and turns into radioactive cobalt 60. Since cobalt 60's half life is five years, it would be carried all over the earth before losing its activity. About 50,000 megatons of cobalt-salted bombs would theoretically provide enough long-lasting radioactivity to kill everyone on earth.
Despite the massive destructive potential of these bombs, Clark believes that not even such a Doomsday Machine—should any nation ever be suicidal enough to use one—would completely wipe out human life. In deep caves or far-underground shelters, enough people might find refuge to wait out the radioactivity and emerge to begin again. Concludes Physicist Clark: "The indications are that the human race will survive the H-bomb, though it will be a close thing. Until some more efficient process is discovered, extermination will require a major effort by one or both of the great powers. Lesser states will have to be content with destroying most people and making the rest miserable."
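Clark's cost figures in the article amount to a simple linear model: a roughly fixed price for the fission detonator plus about $5,000 per megaton of additional fusion yield. A quick check against the numbers quoted above:

```python
# Clark's 1961 cost model as quoted in the Time article: about
# $1,000,000 for a 10-megaton bomb (dominated by the fission detonator),
# plus roughly $5,000 for each additional megaton of fusion yield.

def bomb_cost_dollars(yield_mt: float) -> float:
    base = 1_000_000       # ~cost of a 10 Mt bomb, mostly the detonator
    per_extra_mt = 5_000   # incremental cost of extra fusion fuel
    return base + per_extra_mt * (yield_mt - 10)

for w in (10, 100, 1000):
    print(f"{w:4d} Mt: ${bomb_cost_dollars(w):,.0f}")
```

This reproduces the article's "roughly $1,500,000" for 100 megatons ($1.45 million on this model) and $6,000,000 for 1,000 megatons ($5.95 million).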
Anonymous,
Like the rest of the crackpots and quacks, the lying rubbish you are quoting is a danger to peace, freedom and security in the way Herman Kahn explained over 50 years ago in On Thermonuclear War:
"At no time did Hitler threaten to initiate war against France and England. He simply threatened to 'retaliate' if they attacked him. The Munich crisis had an incredible sequel in March 1939. ... Hitler occupied the rest of Czechoslovakia. The technique he used is such an obvious prototype for a future aggressor armed with H-bombs that it is of extreme value to all who are concerned with the problem of maintaining a peaceful and secure world ..."
- Herman Kahn, On Thermonuclear War, Princeton University Press, 1960, p. 403.
On Thermonuclear War, Princeton University Press, 1960, pp. 390-1:
“... in spite of the tremendous scale of the violations it still took the Germans five years, from January 1933 when Hitler came in to around January 1938, before they had an army capable of standing up against the French and the British. At any time during that five-year period if the British and the French had had the will, they probably could have stopped the German rearmament program.... it is an important defect of ‘arms control’ agreements that the punishment or correction of even outright violation is not done automatically ... but takes an act of will ... one of the most important aspects of the interwar period [was] the enormous and almost uncontrollable impulse toward disarmament ... As late as 1934, after Hitler had been in power for almost a year and a half, [British Prime Minister] Ramsay MacDonald still continued to urge the French that they should disarm themselves by reducing their army by 50 per cent, and their air force by 75 per cent.
“In effect, MacDonald and his supporters urged one of the least aggressive nations in Europe to disarm itself to a level equal with their potential attackers, the Germans. ... Probably as much as any other single group I think that these men of good will can be charged with causing World War II. [Emphasis by Herman Kahn.] ... Hitler came into power in January 1933 and almost immediately Germany began to rearm ... but it was not until October 14, 1933 [that] Germany withdrew from a disarmament conference and the League of Nations ... Hitler's advisors seem to have been greatly worried that this action might trigger off a violent counteraction - for example, a French occupation of the Ruhr. But the British and the French contented themselves with denouncing the action.”
AS STATED BEFORE, THE LIES ABOUT LONG-LIVED COBALT-60 OR WHATEVER IN FALLOUT STEM FROM THE "SITTING DUCK ASSUMPTION".
IT IS POSSIBLE TO DECONTAMINATE THE FALLOUT BEFORE YOU GET A SIGNIFICANT DOSE.
WHAT PART OF THIS IS UNCLEAR TO YOU?
THE MAN WAS A WARMONGER, EXAGGERATING NUCLEAR EFFECTS TO APPEASE THE TERRORISTS WHO WANTED, LIKE HITLER AND LENIN, TO DO HARM.
DON'T BELIEVE THE LIARS WHO EXAGGERATE THE EFFECTS OF NUCLEAR WEAPONS TO MAKE CIVIL DEFENSE SEEM "UNTHINKABLE" OR INEFFECTIVE. SUCH PEOPLE WERE THE BIG DANGER IN THE 1930S, BUT WERE NEVER OPPOSED BY POLITICIANS, THE MEDIA, THE MILITARY, OR THE PUBLIC.
THIS IS THE RISK!
THE LIARS WERE ACTUALLY GIVEN PEACE PRIZES BY FUNDS ESTABLISHED BY THE WARMONGER ALFRED NOBEL, OR THE "LENIN PEACE PRIZE". THE LIARS WERE SUPPORTED BY THE MILITARY, THE MEDIA, THE PUBLIC AND THE TERRORISTS WHO EXPLOITED THEIR LIES TO GET AHEAD IN ARMS RACES AND GENOCIDE.
THEY ARE EVIL. DON'T BELIEVE THEM!
Thanks.
The last 2 comments were not mine.
I understand this; I only asked you about the threshold. In principle there exists an amount of Co-60 lethal to the whole Earth, so much that it would cause death very quickly. But it is probably a very large amount. I could make a calculation of course, but it would be based on a uniform distribution, so I asked you.
P.S. I made a mistake: it is almost 16.6 gigatons from fusion, not 17.6.
Lenin of course was a terrorist, but not so much as Stalin. But what about Genghis Khan and his heirs?
Global fallout mainly comes down with rain because the particle size is too small for much dry deposition. The rainout runs down drains or deep into the soil, where the gamma radiation is shielded.
Again, look up the research by the U.S. Naval Radiological Defense Laboratory led by Dr Carl F. Miller (who was killed by fallout, dying from leukemia after getting a high dose in various tests).
This research on "washdown systems" (decontamination by continuous firehosing of surfaces, to get rid of fallout while it is actually being deposited, used successfully to protect U.S. ships from fallout at the Castle and Redwing nuclear tests in 1954 and 1956) is being ignored by everyone, as is the research on decontamination and countermeasures for already-deposited fallout.
You can equip roadsweeper trucks with some shielding around the driver's seat, or you can install radio-control and TV cameras in them, like the U.S. Navy did with ships YAG-39 and YAG-40 at Operation Castle!
You can take shelter and decontaminate the area with radio-controlled street cleaners and such like, without anyone getting exposed. ... IF YOU HAVE GOOD CIVIL DEFENSE RESEARCH, PLANNING, AND PREPARATIONS IN PEACETIME, AHEAD OF SUCH A DISASTER!!
This research on decontamination was costly in both money and human terms, and can't be replicated. It should not be ignored, but utilized!
In The End of the World: The Science and Ethics of Human Extinction, John Leslie even stated that Teller was thinking about devices in the million-megaton range (for the destruction of asteroids)!
P.S. I have even heard that Teller proposed an H-bomb "capable of blowing up the entire USSR" (I read this on the anti-LHC sites cerntruth.com and www.lhcdefence.org; the claim was probably made in the atomic movie Trinity and Beyond, which I have not seen). It is stated that Eisenhower cancelled this and called on the Soviet leadership for talks on disarmament, which means 1955 (Open Skies); so this was most likely the 10,000-megaton design (most likely Alarm Clock-style: Teller proposed a 1,000 Mt Alarm Clock in 1950).
Rabi was simply Teller's worst ENEMY, but he had a far greater influence on Ike than Teller did.
Hyperlink to your reference: John A. Leslie, The End of the World: the Science and Ethics of Human Extinction, 1996.
"In January 1992, Teller was among the hundred participants at a meeting at Los Alamos to discuss new nuclear armaments. The event coincided with his eighty-fourth birthday. He called for producing the largest-ever bomb, ten thousand times larger than any that had hitherto been built; it would be so large that it could never be exploded on Earth but would be used to destroy asteroids. According to the report about the meeting, as Teller described this super-super bomb, his protégé at the Livermore Laboratory, Lowell Wood, could not contain his excitement and shouted from the audience, "Nukes forever"."
From Judging Edward Teller: A Closer Look at One of the Most Influential Scientists of the Twentieth Century by Istvan Hargittai.
So why did he say that this bomb (1,000,000 megatons) was too large to be detonated on Earth? What was the main concern? Fallout?
Your reference:
http://www.amazon.com/Judging-Edward-Teller-Influential-Scientists/dp/1616142219
Judging Edward Teller: A Closer Look at One of the Most Influential Scientists of the Twentieth Century [Hardcover]
Istvan Hargittai (Author)
Thermal radiation, initial nuclear radiation, ground shock, cratering, EMP and blast could all be key problems at that kind of yield, depending on where it was detonated. If you fired it above one fireball radius over the middle of the Atlantic or Pacific, you would limit the effects to blast and tsunami inundation of islands. But near land the problems would be severe. Below 100 megatons or so, thermal radiation is easy to deal with by duck and cover and elementary firefighting (the thermal pulse can't ignite thick wood below 100 megatons, so you can stamp out the fires that begin in dry crumpled newspapers outdoors, etc.). But at a yield of 1,000,000 megatons the thermal pulse becomes many minutes long, which is enough to dry out and ignite thick wood over wide areas, so it is a far worse problem, with firestorms in forests. The blast would refract in the atmosphere and break windows for thousands of miles. Fallout would also be a relatively severe problem, although obviously the decay rate is still quick (approximately t^-1.2 regardless of yield), so people would survive if they took the right countermeasures.
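The pulse-length claim can be illustrated with the standard Glasstone and Dolan approximation for the time to the second thermal power maximum, t_max ≈ 0.0417 W^0.44 seconds (W in kilotons, air burst); most of the thermal energy is emitted within roughly 10 t_max. The yields below are round illustrative values:

```python
# Time to the second thermal maximum, t_max ~ 0.0417 * W**0.44 seconds
# with W in kilotons (Glasstone & Dolan approximation for air bursts).
# The effective thermal pulse, over which most of the thermal energy is
# delivered, is on the order of 10 * t_max.

def t_max_seconds(yield_kt: float) -> float:
    return 0.0417 * yield_kt ** 0.44

for mt in (1, 100, 1_000_000):
    t = t_max_seconds(mt * 1000)
    print(f"{mt:>9,} Mt: t_max = {t:6.1f} s, pulse ~ {10 * t / 60:5.1f} min")
```

For 1 megaton the effective pulse is under 10 seconds, so evasive action and duck-and-cover work; at 1,000,000 megatons t_max alone is over 6 minutes, long enough for thick fuels to be dried out and ignited.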
As Edward Teller stated correctly, it is not something to test here on Earth. Its use would be restricted to space, against asteroids that might hit the earth, like the >100,000,000 megaton K-T event impactor of 65 million years ago. You would have to detonate it far enough from the Earth in space to avoid EMP problems and satellite radiation damage problems.
There is one other thing with large bombs: the horizon.
There was one weapon theorized for the SR-181 program: an orbital bomb.
It was a roughly 37,000 pound vehicle, delivered by a Saturn-class booster to a 100 n.m. orbit, which included a 100-megaton bomb (weighing 17,500 pounds), for detonation in that orbit.
LeMay also wanted this warhead as a counter-force weapon on a Titan III ICBM. JFK and McNamara cancelled development tests for high-yield warheads, but the fear was that the USSR would deploy such weapons.
So what would be the effects on the ground of this 100 n.m. detonation?
For this warhead see, for example:
Nuclear Test Ban Treaty. Hearings before the Committee on Foreign Relations, United States Senate, Eighty-eighth Congress, first session, on Executive M, 1,028 pages. Perhaps you have never seen this. Also, the Hansen and NRDC data are completely false.
SR-181 was a huge study covering dozens of weapon systems:
"Cold War American Spacecraft Projects and the “Paperclip” connection.
The Allied Intelligence units…specifically U.S. in nature infuriated, angered, and outfoxed our allies (French, Canadians, British, and of course the Russians), by initiating “Operation Plunder,” within it being “Operation Paperclip” the collection of scientists, engineers, technicians, technology, etc., and to prevent the talented Germans from slipping out of Germany via any “ratline,” were hunted within another project called “Safehaven”: Safehaven was specifically created to prevent wherever possible, talented Germans from slipping out of Germany, Austria and other German occupied nations, to continue their research in other nations-specifically, South America. Source; Secret Agenda, Linda Hunt, St. Martin’s Press, 1991.
Part.1.
Part.2.
During the 1950s G. L. Martin Aircraft of Baltimore, Maryland possessed an Advanced Design Teams department. Martin Design Teams presided over by George S. Trimble, was involved in pure science, future over-the-horizon advances in aeronautics, anti-gravity and its applications, nuclear propulsion for aerial vehicles, and the little publicized lenticular aircraft-spacecraft design studies plus studying and applying natural phenomena or the laws of nature, and advanced missile-spacecraft design.
Source: Aviation Week, October 18, 1953.
Martin Aircraft at their Middle-River, Maryland facility reorganized one facility into a special subsidiary that was born in 1946, and was engaged in the guided missile field. In the 1950s specific Design Teams operated within a separate unit known as RIASI-Research Institute for Advanced Study Incorporated, and performed building-block research into varied projects. This streamlined unit engaged in various missile projects, electronics, electro-mechanical weapons systems, and especially to “make the spaceship respectable,” and solve the problems of space travel. Another RIAS goal was to close the gap between basic scientific discoveries and their applications to engineering problems in their various projects.
Source: Aviation Week June 3, 1957.
Part.3.
The most exotic division other than RIAS was the Martin-Denver facility (where the Martin Titan series ICBMs and follow-on heavy boosters were built), where a research unit known internally as “Force XXIV” became heavily involved in advanced space research. One aspect was to study and apply spacecraft for military applications in an area defined as a “synchronous corridor,” an area 19,340 miles into space directly over the Earth’s equator. Other space-flight avenues proposed were Macro-Life spacecraft and the self-conscious, self-repairing Micro-life craft that protectively housed a human colony of space-farers. Also studied was “planetoid” or asteroid harvesting: hollowing large asteroids out to transform them into vessels for deep space exploration outside of our own Solar System. One of the more extreme thinkers based at Martin-Denver was Dr. Dandridge MacFarlan Cole, a high-level engineer who proposed some of the most extreme and advanced projects concerning human space travel. The Martin Museum and other historians have spoken in very denigrating terms of Dr. Cole that border on the charge of insanity, among other terms.
Aerospace author Lloyd Mallan was one of a very few writers who received the invitation for a grand tour of the Martin-Denver facilities. He wrote up the numerous space research projects at “Force XXIV” in his quite revealing book “Space Science,” based mostly upon the work of “Force XXIV.” In fact, Mr. Mallan is the only author who provided an in-depth report complete with photography, drawings, and projects being studied inside the Martin-Denver facility.
Source; Missiles and Rockets, July 19, 1965.
Time Magazine January 27, 1961 within the Science column.
Many people are familiar with the split-up of the original German Peenemunde Rocket Research Center Team-von Braun and his team surrendering to the American Forces, while the other half surrendered to the Russians. The rest is history.
What is not known is that many German lesser known scientists-engineers-technicians were involved in numerous American space projects, many of which would be applied to what the USAF secretly created immediately after the initial Russian Sputnik series were launched. The USAF SR-181 Strategic Orbital System, encompassed a series of weapon systems and support spacecraft constituting the T.O. (Table of Organization) of a tightly classified USAF organization known as EOMSF…Earth Orbital Military Space Force, with a planned operational timeframe between the late 1960s and early 1970s. This very thorough study is still classified today.
A few known German scientists and engineers absorbed into the U.S. Aerospace community that were directly involved in hypersonic boost-glider spacecraft studies were:
Bell Aircraft;
Dr. Walter R. Dornberger Bell’s chief Guided Missile Specialist cum Chief Scientist and before retirement, Vice President for Research. Responsible for Bell initiating a series of continuous rocket powered and rocket boosted reconnaissance-bombardment glider project studies, operated up to orbital speeds and altitudes, including the final follow-on to other projects…the Dyna Soar I contract.
Ing; Rudolph H Reicher joined Bell 1953, R&D on rocket engines, and in 1959 joined Boeing Airplane Company performing Propulsion Analysis and Interaction work.
Part.5.
Dr. Krafft A. Ehricke engineer-physicist worked for Dr. Dornberger at Bell Aircraft and left to join Convair in San Diego to work on the Atlas ICBM and Centaur Projects.
Heinz Mueller joined the Bell Aircraft Rocket Laboratory in 1950 and created “thrust chambers” and associated rocket motor work.
Dipl. Ing. Daus Chamburg-Harburg specialist in rocket-transportation and;
Dipl. Ing. Wilhelm Emil Schlitt specializing in Guidance Systems may have worked at Bell Aircraft.
Dipl. Haas or Haase of which little is known who was German and worked at Bell Aircraft on unspecified projects…either rocket engines or their Rocket Boosted reconnaissance-bombardment glider projects.
A brief list of advance and cutting-edge Bell Projects are;
Part.6.
MX-2145 cum MX-2276 BoMi
R459L “Hi Fi Recce” a.k.a. ”Brass Bell.”
System 118P two phase aircraft and glide-rocket study.
Study Requirement or SR-126 RoBo.
R464L Dyna Soar I cum RS-620A Dyna Soar.
Hypernias I and Hypernias II EMS…Energy Management System for Dyna Soar, SLOMAR, Lenticular Re-entry Saucer spacecraft, and all Lifting-Body Spacecraft designed by other firms.
Ramora Space Maintenance Vehicle in conjunction with Model 7045 Saucer spacecraft and their Orbital Bombardment Station or Platform.
Nuclear armed Reentry Missile in conjunction with Bell or Martin Orbital Bombardment Platforms.
REACTION CONTROL MOTOR Contracts Boeing Dyna Soar and various spacecraft projects..
Bell’s Lenticular spacecraft studies culminating in the Bell Model 7045 Modified
Lenticular Re-entry Vehicle submitted for the Apollo Project, to service Orbital Bombardment Platforms within the SR-181 Strategic Orbital System study.
Continuous USAF-NASA Contracts to continue to develop and refine Bell Aircraft’s Patented double-wall liquid and solid insulation systems for hypersonic vehicles that included winged orbital weapons systems that must negotiate a glide-reentry into earth’s sensible atmosphere. This author has now discovered that such contracts involved double-wall cooling-insulation panels, test sections, and boost-glider fuselage-wing cross sections that were refined and actually flight-tested and wind tunnel tested. Bell’s future research was agreed upon by Larry Bell and Dr. Dornberger in 1951. The double-wall cooling-insulation system contract series ran as late as 1968, such was the confidence the USAF and Bell exhibited in their lengthy research work.
G.L. Martin Aircraft;
NOTE-the following individuals were nicknamed “The Blue Angels,” despite being assigned to the Dyna Soar I competition with Bell Aircraft against the Boeing-Vought Team. The Middle-River Plant (probably the RIAS Department), where they work was painted throughout the building with two shades of Blue paint, hence the term “Blue Angels.”
Hans Multhopp; A Director and Principal Scientist under Martin Dyna Soar I boss…Mr. Bastian (Buz) Hello, also involved in design of the Martin-Bell Dyna Soar I and lifting-body spacecraft.
Dr. Peter Friedrich Jordan, structural specialist assigned to Martin-Bell Dyna Soar I Project and other Martin spacecraft projects.
Dr. Julius Friedrich Vandrey; Aerospace Physics also assigned to Dyna Soar I.
The above list of course, is by no means complete and it is curious that other Germans constituting a list of approximately 1,500 plus German-Austrian scientists and other talented people are not listed as to their employment either within the U.S. Government, Industry, the military branches and other sources.
Part7.
MARTIN PROJECTS;
ASTROROCKET
START PROGRAM.
PILOT/PRIME
S-5
M-103, SV-5, X-24A X-24B, X-24C-FDL.
X-23A Prime mini-lifting body
SR-89774 Fly-back Titan II
SLOMAR
DYNA SOAR
RAPT
HL-10 STUDY
FDL-8
HASP
Martin projects listed above were applicable to inspect, maintain and support Bell or Martin’s manned and robotic Orbital Bombardment Platform proposals, crew-changes and resupply of Space Stations, etc. They offered both lifting body and lenticular plus Greatly enlarged and very heavy Dyna Soar Bombers and other spacecraft known as “SLOMAR” for a multitude of USAF missions. Martin also designed their own versions of highly modified boost-glider and lifting body spacecraft derived from the START Program. It must be noted that the plethora of U.S. Industry spacecraft designs, proposals, and Dyna Soar were to be incorporated into the still classified SR-181 Strategic Orbital System composing the hardware of the USAF planned SR-181 EOMSF…Earth Orbital Military Space Force studied between 1957-1963.
Source; numerous Martin Dyna Soar Reports and Proposal Booklets.
Boeing Airplane Company
Few German scientists and engineers worked for BAC and are unidentified.
MX-2145 Study.
SR-168 Air-Launch Glide Missile
SR-126 RoBo.
SR-178 GSS.
SR-181 Strategic Orbital System
ABMD Orbital Interceptor or Boeing nomenclature…AICBM interceptor proposal.
SR-79500 Hypersonic Glide Missile
ICGM
BOSS-IOC
BOSS-WEDGE
R464 a.k.a. RS620A Dyna Soar I
There are numerous contracts that either remain classified or have been destroyed by the contractor or have been buried by the Dod/USAF.
End."
I am not Hansen or NRDC staff.
SR-181 was not alone.
SR-181: orbital nukes and related things.
SR-183: lunar-based nukes and related things. Again, huge work was done there. There were even concepts for a lunar missile base..........
Where is it stated that wood cannot be ignited by a 100-megaton explosion? At all altitudes?
I have seen a prediction that 100 Mt detonated at 100 n.m. would set fires over 100,000 square miles.
For my debunking of thermal ignition lies please see:
http://glasstone.blogspot.com/2006/04/ignition-of-fires-by-thermal-radiation.html
One more thing about Teller's 10,000-megaton bomb: in reality this bomb was not alone. And one more thing to debunk in revisionist books like Herken's Brotherhood of the Bomb.
"DECISION NOT TO DEVELOP HIGH-YIELD WEAPONS
The CHAIRMAN. Did you participate in the decision apparently
made in 1954 and, I suppose, in years subsequent to that, not to go-
in for the very high yield weapons ?
Dr. FOSTER. No, sir. I was not in a position to participate in the
decisions.
The CHAIRMAN. You did not. You are familiar with that?
Dr. FOSTER. Yes, sir.
The CHAIRMAN. Who did make that determination not to pursue
the high yields?
Dr. FOSTER. Well, the instructions came from the Atomic Energy
Commission to the laboratory.
The CHAIRMAN. You think this was a mistake?
Dr. FOSTER. At the time, Senator, I think it was the right thing
to do, to decide not to build THEM and, as a matter of fact, at that
time, at the Lawrence Radiation Laboratory we had DESIGNS to do
JUST this job, and it was these designs that raised the question.
Now, the question of whether or not 1 megaton or 10 or 100 is the
right one depends an awful lot on what you think the situation on the
other side is, and this changes as a function of time.
The CHAIRMAN. Then, do I understand you to say at the time, you
mean in 1954 it was the right thing to do?
Dr. FOSTER. That is correct."
From Nuclear Test Ban Treaty. Hearings before the Committee on Foreign Relations, United States Senate, Eighty-eighth Congress, first session, on Executive M. Statement of Dr. John S. Foster, Jr., Director, Lawrence Radiation Laboratory, Livermore, Calif.
In a 10,000-megaton device, the effective primary for the final stage must be on the order of 1,000 megatons. This must be a second device. But it seems that Foster was speaking about more than 2.
It must be that several additional designs existed.
It may even be that the "H-bomb to blow up the entire USSR" was not 10,000 megatons but larger. "Blow up" meaning by fallout.
What yield would be needed for a ship-borne, extremely dirty H-bomb to cover the entire USSR with lethal fallout (based on contemporary understanding)?
Also, Teller showed with calculations on early supercomputers at Livermore that the Classical Super works! But unfortunately only for devices with yields of many gigatons and higher.
See: Lowell Wood and John Nuckolls, "The development of nuclear explosives", in Energy in Physics, War and Peace: A Festschrift Celebrating Edward Teller's 80th Birthday, edited by Hans Mark and Lowell Wood, p. 317.
One more thing to consider: a GUT magnetic monopole bomb.
Teller had lunch with Turok, and they talked about magnetic monopoles. Teller immediately considered that they could be used as the basis for a bomb. However, this is only an estimation.
So what would be the effects of this thing? This is very interesting.
In Chuck Hansen's Swords of Armageddon, the 10,000 Mt bomb is described very well, compared to political books like Brotherhood.
"In other discussions, several members of the GAC expressed
concern over Teller's 10,000 MT device; some wondered what
fraction of the Livermore lab's effort was being expended on
high-yield devices named the GNOMON and the SUNDIAL. Drs. von
Neumann and Rabi stated that "the (Livermore) laboratory
clearly has very capable people on its staff; it is
unfortunate that they are not being effectively utilized up to
their abilities."
It is unclear whether or not the 10,000 Mt bomb was one of them, or whether they were smaller. The main reason not to build them was that there was no military requirement for them; that was the difference between the US and the USSR.
What would be the effects of its explosion off the Leningrad coast?
Would it be capable of wiping out the Soviet population?
"What would be effects of its explosion around Leningrad coast ?"
Teller would have been stopped from sailing his ship through the Baltic sea to get anywhere near Leningrad.
A 10,000 megaton bomb would need a ship to deliver it. Assume 50% fission yield. You get about 17 kilotons per kilogram of uranium-238 fissioned by high-energy fusion neutrons. However, not all of the uranium-238 will be fissioned, because many fusion neutrons will be captured by uranium-238 without fission, to form uranium-239, -240, -241, etc. So non-fission capture competes with fission. If the total neutron capture-to-fission ratio is 1 capture per fission, then you only get 8.5 kilotons per kg of uranium-238, which means you need 118 kg of uranium-238 per megaton of fission yield.
So for 10,000 megatons at 50% fission (5,000 megatons of fission yield), the bomb needs 590 tons of uranium-238. In addition you need a massive quantity of lithium deuteride or deuterium. You'd have to build the bomb inside a container ship, because you would not have a crane capable of lifting such a massive bomb on board. It's just stupid. The effects wouldn't be that impressive anyway: by the cube-root scaling law, 10,000 megatons has a radius of diffraction-type (peak overpressure) destruction only 10 times bigger than the 10 megaton Mike test. The wind effects would last 10 times longer for any given peak pressure, so things would be blown along for 10 times longer, but the maximum wind speed for any peak overpressure is not dependent on the bomb power.
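The arithmetic in the two paragraphs above is easy to check (a sketch using the figures stated there: 17 kt/kg for complete fission of U-238, one non-fission capture per fission, and cube-root blast scaling):

```python
# Check the figures above: effective energy per kg of U-238 when one
# neutron capture competes with each fission, the mass of U-238 needed
# for 5,000 Mt of fission yield, and the cube-root blast radius scaling.

KT_PER_KG = 17.0                     # kt per kg of U-238 completely fissioned

effective_kt_per_kg = KT_PER_KG / 2  # half the neutrons are wasted captures
kg_per_mt = 1000 / effective_kt_per_kg       # ~118 kg of U-238 per Mt fission
tons_u238 = kg_per_mt * 5000 / 1000          # ~590 tons for 5,000 Mt fission

# Blast radii scale as the cube root of yield:
radius_ratio = (10_000 / 10) ** (1 / 3)      # 10,000 Mt vs the 10 Mt Mike test

print(f"{effective_kt_per_kg:.1f} kt/kg effective")
print(f"{kg_per_mt:.0f} kg of U-238 per Mt of fission yield")
print(f"{tons_u238:.0f} tons of U-238 for 5,000 Mt of fission")
print(f"blast radius ratio: {radius_ratio:.1f}x")
```

The round numbers in the text (118 kg/Mt and 590 tons) are these values to two significant figures.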
Because of the weight of a 10,000 megaton device, it would be virtually impossible to air burst it (unless you had a massive Project Orion type nuclear explosion propelled rocket), so it would have to be surface burst. If you want to use fallout, you're dependent on the weather, wind and rain. It's not a good military weapon for many reasons.
Also, as I've blogged elsewhere, the Russians survived about 2,000,000 casualties during the terrible onslaught and near starvation from the Nazis in the 872-day Siege of Leningrad, 8 September 1941 to 27 January 1944, and the Nazis lost.
I don't see why anyone would want to try to attack Leningrad with a bomb.
During the Cold War, it was only a minority of the people in Russia who were members of the Communist Party. You can't blame the people in the street for the crimes of the minority of evil dictators like Stalin.
It's the same with Vietnam, where America should have used high-yield, low-fission-yield (Redwing-Navajo type) air burst nuclear weapons to safely blast a demilitarized zone through the jungle, instead of using Agent Orange. This would have saved lives, stopped jungle-cover insurgency, removed the need for chemicals like Agent Orange and for American ground troops, and also sent a strong message to the USSR and others that America wasn't going to be intimidated by pacifist anti-nuclear propaganda into pro-Communist appeasement. Instead, the Cold War dragged on until Reagan forced it to an end in the 1980s.
The targets for weapons should be military targets. Enemy propaganda and threats to our civilians should be overcome by increasing our civil defence and air defense (ABM/Star Wars) as Reagan did from 1980-88, which brought the Soviets to the negotiating table and forced them to abandon their Cold War mentality and listen to reason.
"At the same time that Eisenhower was being informed that REDWING was to test low-contamination "clean" bombs, the Air Force was expressing its interest in "salted" or "dirty" high-contamination weapons:

The DOD has asked the AEC to study salted weapons. This letter is written in the belief that it will be helpful to AEC scientists working on this project to know a little more about AF requirements contemplated for this sort of weapon. The AF has made some studies of the military effects of salted weapons. A summary of some results is attached for your information.

Initial emphasis should be placed on large weapons. The AF is interested in a "salted version" of the new Class A (25,000 lb.) and of the MK 21, if feasible. The ideal isotope should emit an energetic gamma ray and have a half-life of the order of a week or so. Yield losses should be no more than 25%.

These remarks as to the most desirable ranges of parameters -- size, half-life, tolerable yield loss -- are not intended to preclude your consideration of wider ranges."

From Chuck Hansen's Swords.
Tantalum?
Class A means a 60-megaton weapon in the normal version.
The Mk 21 yield was in reality 15 megatons in the normal version.
So what about this?
The 20 Mt and 19.1 Mt Soviet devices were competing candidates for the heavy warhead of the R-36 missile (SS-9).
The choice was the Arzamas-16 device: 20 megatons.
Both were designed to be 18 Mt.
The Chelyabinsk device was heavier by 5-10 percent.
Design:
Weight around 4 tons.
Dropped in the RDS-6s drop case.
Two-stage, with a boosted primary that had been tested in 1957-1958.
Aluminium alloy casing.
Electrodetonators were used.
210 were deployed on the R-36 ICBM, in the 8F675 heavy RV.
Later, 30 of them were deployed on the R-36M in 1975 and remained in service until 1992; their purpose was to destroy hardened structures.
Interesting facts:
The Soviets did not consider bombers a reliable weapon, at least against the US, so they did not build a bomb version.
This was the highest-yield weapon ever stockpiled by the USSR.
The 21.1 Mt test was in fact a clean test of the 40 Mt device, a scaled-down Tsar, considered as a warhead for the UR-500 configured as an ICBM; but the UR-500 ICBM was cancelled and the device was never weaponized.
A question about Soviet megatonnage.
Of the R-36, 310 were built and 23 were fired.
At the peak, 210 were deployed with the 20 Mt warhead, 46 with 8.3 Mt, 18 orbital with 2.3 Mt, and 12 with 3 MRVs of 2.3 Mt each.
Considering this, I believe 210-220 of the 20 Mt warheads were built, and 46-60 of the 8.3 Mt.
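Taking these peak deployment figures at face value (they are my estimates, not official data), the total R-36 force megatonnage works out to:

```python
# Peak R-36 deployment as estimated above (not official figures).
loadings = [
    (210, 20.0),    # 210 missiles with a 20 Mt warhead
    (46,   8.3),    # 46 with 8.3 Mt
    (18,   2.3),    # 18 orbital (FOBS) with 2.3 Mt
    (12, 3 * 2.3),  # 12 with three 2.3 Mt MRVs each
]
total_mt = sum(n * y for n, y in loadings)
print(round(total_mt))   # 4706 Mt at peak, on these assumptions
```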
For the R-16 ICBM:
The first warhead was 2.3 Mt, the same as on the R-36.
The second was 3.4 megatons.
The combination deployed is not known.
The R-9A carried 2.3 Mt.
The 3 and 6 Mt figures for the R-16 were CIA estimates.
Of the R-16, 202 were produced.
As for the Mk 21, 275 units were produced, and that is only mod 0.
NRDC's claim is absurd.
Mk 21 mod 0 was produced December 1955 to July 1956.
Mk 21 mod 1: October 1956 to April 1957.
Mk 21 mod 2: April 1957 to June 1957.
And of course, the 940 Mk 36s were new constructions.
Of the Mk 41, 500 were produced.
These were Mk 41 mod 0.
See for example: History of the Custody and Deployment of Nuclear Weapons, July 1945 to September 1977.
All had a 25 Mt yield.
But only a small fraction of the Mk 36s were the clean, six-megaton variant, compared to 19 megatons in the full-yield version. The data that exist claim that only the dirty variant was deployed.
The Mk 36 production run was April 1956 to June 1958.
Mk 36 mod 0 production started in April 1956.
Mk 36 mod 1 in October 1956.
The clean Mk 36 mod 0 started production in June 1957.
Mk 36 mod 2 the same month.
So it is possible to determine a maximum number of the clean variant.
McNamara stated in 1967 that the change in bomber loadings (the retirement of the Mk 36s) reduced total megatonnage by 40%.
B-52 loadings at this time were:
2 Mk 21, later 2 Mk 36, up to the end of FY 1960.
Later:
2 Mk 41
4 Mk 28Y1
1 Mk 41 + 2 Hound Dog
4 Mk 28Y1 + 2 Hound Dog
Later, since McNamara destroyed the medium bomber force, and since a FUFO version of the Mk 41 (15,000 pound weight) never existed, the Mk 41 started to be retired in favour of the Mk 53 FUFO.
Other B-52 loadings were:
1 Class A (60 Mt) - cancelled in 1957.
1 Mk 41 + 4 Skybolt
4 Mk 28Y1 + 4 Skybolt
Skybolt was cancelled by McNamara.
Herken mixed up the facts.
The 1000-megaton yield belongs to the device that Teller described in 1947 in report LA-643, where he also described its explosion.
But the yield of the Super 162 feet in diameter and 30 feet in length would be hundreds of gigatons.
In the 1970s it was shown that the Classical Super works, and not only it, but all of Teller's early proposals and designs.
See:
A Festschrift Celebrating Edward Teller's 80th Birthday, edited by Hans Mark and Lowell Wood.
Thanks for the information!
Please note that I have comment moderation turned on to prevent people writing nonsense on this blog, and I cannot always log in to approve comments regularly.
So if your comment is not quickly approved, it's possible I'm away from internet access, dead, etc.
I was not right: Herken did not mix up the facts.
1. The 1000-megaton device that Teller described in report LA-643 was an Alarm Clock, not a Super.
2. The 1000-megaton device that Herken mentioned was another Super design.
From Chuck Hansen's Swords:
At this time, the proposed Super was so huge and unwieldy that delivery by boat or railroad train were the only choices (any probable enemy was unlikely to permit either).165
Citation 165: AEC Thermonuclear Weapons Chronology, p. 21.
It seemed impossible to package the Super into a droppable bomb, and even if it were possible, its yield would be so great that the airplane carrying it would have to be sacrificed.166
Citation 166: AEC Thermonuclear Weapons Chronology, p. 70; Borden-Walker H-bomb chronology, p. 25.
One Super design in early 1949 included a fission trigger that in itself weighed 30,000 lbs.; the overall length of the bomb was estimated at approximately 30 feet, with a diameter in excess of 162 feet. Under these circumstances, the scientists at Los Alamos preferred not even to estimate the gross weight: this configuration represented essentially a huge fission trigger in the middle of a container of liquid deuterium the size of a large oil storage tank.167
Citation 167: AF Atomic Energy Program, Vol. IV, p. 187.
From Herken's Brotherhood of the Bomb:
"The hypothetical Super under consideration was some 30 feet long and a stunning 162 feet in diameter; the fission trigger itself weighed 30,000 pounds."
Citation: "A History of the Air Force Atomic Energy Program, 1943-1953" (USAF history), vol. IV, 188; Rhodes (1995), 379.
"Another problem facing Air Force planners was how any aircraft dropping the bomb could escape the shock wave from the mammoth 1,000-megaton explosion."
Citation: "A subsequent Air Force memo envisioned an H-bomb 20 feet long, nine feet wide, and weighing between 35,000-80,000 pounds. Wood to Commanding General, July 12, 1951, #471.6, series 197, USAF/NARA."
So this bomb was the smallest Classical Super design proposed at that time.
Classical Super feasibility.
From Teller's Memoirs:
"Work on thermonuclear weapons is complicated to conceptualize and describe, and accurate descriptions of it are still highly classified. I have described my role in the development of thermonuclear weapons as fully as is allowed. I can only point out that during the 1970s, when more powerful computers became available, some of my original ideas were reexamined. As it turned out, Ulam's calculations, as well as Johnny von Neumann's, were based on incomplete assumptions. As Hans Mark and Lowell Wood wrote in 1988: 'Remarkably enough, all of Teller's earlier thermonuclear explosive designs and proposals were subsequently shown to be feasible in experimental demonstrations.'1"
1. Energy in Physics, War and Peace: A Festschrift Celebrating Edward Teller's 80th Birthday (Dordrecht, Boston and London: Kluwer Academic Publishers, 1988).
It's possible that the asteroid-buster Teller described was actually a Classical Super, one of the devices studied in the 1970s. That's because the Classical Super is essentially a cryogenic tank of liquid deuterium, i.e. its mass is much smaller for the same yield compared to an ordinary Teller-Ulam bomb.
Unfortunately, the fact that the Classical Super works has been known since the publication of the paper by Thomas Weaver and Lowell Wood, "Necessary conditions for the initiation and propagation of nuclear-detonation waves in plane atmospheres", Physical Review A, vol. 20, no. 1, pp. 316-328.
From section VIII, "Prospects for the nuclear detonation of the oceans":
"It is worth noting, in conclusion, that the susceptibility to thermonuclear detonation of a large body of hydrogenous material is an exceedingly sensitive function of its isotopic composition, and, specifically, of the deuterium atom fraction, as is implicit in the discussion just preceding. If, for instance, the terrestrial oceans contained deuterium at any atom fraction greater than 1:300 (instead of the actual value of 1:6000), the ocean could propagate an equilibrium thermonuclear-detonation wave at temperatures >~ 2 keV (although a fantastic 10^30 ergs - 2x10^7 MT, or the total amount of solar energy incident on the Earth for a two-week period - would be required to initiate such a detonation at a deuterium concentration of 1:300). Now a non-negligible fraction of the matter in our own galaxy exists at temperatures much less than 300 K, e.g. the gas-giant planets of our Solar system, nebulas, etc. Furthermore, it is well known that thermodynamically-governed isotopic fractionation even more strongly favors higher relative concentration of deuterium as temperature decreases; e.g., the D:H ratio in the 100 K Great Nebula of Orion is about 1:200. Finally, orbital velocities of matter about the galactic center of mass are of the order of 3x10^7 cm/sec at our distance from the galactic core.
It is quite conceivable that hydrogenous matter (e.g., CH4, NH3, H2O, or just H2) relatively rich in deuterium (>~ 1 at.%) could accumulate at its normal, zero-pressure density in substantial thickness on planetary surfaces, and such layering might even be a fairly common feature of the colder, gas-giant planets. If thereby highly enriched in deuterium (>~ 10 at.%), thermonuclear detonation of such layers could be initiated artificially with attainable nuclear explosives. Even with deuterium atom fractions approaching 0.3 at.% (less than that observed over multiparsec scales), however, such layers might be initiated into propagating thermonuclear detonation by the impact of large (dia. >~ 10^2 m), ultra-high-velocity (v >~ 10^7 cm/sec) meteors or comets originating from nearer the galactic center. Such events, though extraordinarily rare, would be spectacularly visible on distance scales of many parsecs."
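The 10^30 erg figure quoted there can be cross-checked against both of its stated equivalents (my own arithmetic, using the standard megaton-of-TNT conversion and the standard solar constant):

```python
import math

MT_TNT_ERG = 4.184e22        # 1 megaton of TNT in ergs (4.184e15 J)
initiation_mt = 2e7          # 2 x 10^7 Mt quoted by Weaver and Wood
print(initiation_mt * MT_TNT_ERG)   # ~8.4e29 erg, i.e. of order 1e30

# Solar energy intercepted by the Earth's cross-section in two weeks:
S = 1.361e6                  # solar constant, erg / (cm^2 s)
R = 6.371e8                  # Earth radius, cm
t = 14 * 86400               # two weeks, in seconds
print(S * math.pi * R**2 * t)       # ~2.1e30 erg -- the same order
```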
So, what would be the yield of the initiator needed to propagate a wave with a temperature of 50 keV in liquid deuterium?
And why does the Classical Super work only starting with devices in the gigaton range?
The information that the Classical Super works was revealed in Energy in Physics, War and Peace: A Festschrift Celebrating Edward Teller's 80th Birthday, edited by Hans Mark and Lowell Wood, Kluwer Academic Publishers, 1988, in the article by Lowell Wood and John Nuckolls, "The development of nuclear explosives", on p. 317.
They stated that the Classical Super works for gigaton devices, but they did not give an exact minimal yield; it seems, though, that it is possible to estimate the yield of the minimal device.
The classical superbomb of Teller from 1946 or so was a gun-type fission weapon with a beryllium oxide wall at the end of the cylinder of fissile material, with an adjacent chamber containing some tritium (to start off fusion) then a long cylinder of liquid deuterium.
This "classical super" differs from the 1951 Teller-Ulam breakthrough by NOT compressing the fusion material axially using recoil from X-ray ablation of a heavy metal pusher cylinder around the fusion stage.
So the classical superbomb is purely using the temperature of a gun-type bomb (a gun type bomb being used to allow the hot end of the fissile material to cause heating, avoiding cooling in the TNT implosion debris around an implosion weapon).
The Greenhouse-George test (yield 225 kt) in 1951 was a scaled-down classical super (external tritium chamber separated from fissile material by beryllium oxide wall), but without the main deuterium charge and using cylindrical implosion instead of a gun type device. (Cylindrical implosion, like a gun-type device, has the advantage of exposing the ends of a cylinder of fissile material and thus allowing a maximum of unattenuated radiation to escape from there, for heating tritium.)
What really is interesting is Ulam's idea in December 1950 to create a pure fission bomb using separate stages. I.e., one fission bomb compresses a second fission stage more efficiently than chemical explosives can. So you can, using Ulam's scheme, make a fission bomb arbitrarily powerful without using any fusion materials at all. You just need to use a series of stages, which are ignited by compression. Ulam's idea was to use neutrons; Teller pointed out that X-rays can do the job, and then worked out the recoil from ablation of a pusher around the secondary stage. This will work for either a fusion or a fission secondary stage.
Nige, this information is well known to me; I have even read the sources that Chuck Hansen used on the subject.
Later, for the gigaton Super designs (1949-1951), they planned to use a multi-crit gun; such a weapon could give up to several megatons. It was never actually built.
The failed Morgenstern (in reality designed to be 3 Mt) and the scaled-up Morgenstern, the Ramrod, were in reality thermonuclear triggers for the Classical Super.
The Classical Super, from the physical point of view, is the propagation of a thermonuclear detonation wave, and not necessarily in liquid deuterium, or even in deuterium at all.
In the devices they studied in the 1970s they of course used multi-megaton triggers, and they revealed that 2x10^7 Mt is needed to initiate a 2 keV detonation wave in an ocean with D:H of 1:300.
Based on these data it is possible to estimate the energy of the trigger needed to initiate a 50 keV wave in pure deuterium.
The clean Mk 36 was described as 96% pure.
That means 240 kt from fission (4% of the 6 Mt yield).
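As a trivial check of that figure (assuming "96% pure" means 96% of the 6 Mt total yield came from fusion):

```python
total_kt = 6000              # 6 Mt clean Mk 36 variant
fusion_fraction = 0.96       # "96% pure"
fission_kt = total_kt * (1 - fusion_fraction)
print(round(fission_kt))     # 240 kt from fission
```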
So what about the Pu spark plug?
Did spark plugs actually exist, or are they speculations of Hansen and others?
Yes, they did, from the 1952 Ivy-Mike device onwards, and the spark plug (not just the fission primary) inside the liquid deuterium Dewar flask (the secondary stage of Ivy-Mike) was hollow and contained tritium gas to boost the fission, according to the interview of the Ivy-Mike designers by Richard Rhodes in his 1995 book "Dark Sun". It's a big part of the story, because the tritium they added disappeared, causing a drop in pressure. They thought it was leaking out, but then they realised that it was reacting with the uranium metal surface on the inside of the spark plug (they put in the tritium while it was still warm, before the large liquid deuterium Dewar secondary was filled). Uranium tritide formed on the inner surface of the hollow spark plug, causing the gas pressure to fall.
This tells you the time-sequence of events in the compression of the secondary stage.
If the sparkplug compression and fission occurred after the fusion reaction had got going, then the boosting of the centre of the sparkplug with tritium gas would have been unnecessary.
Clearly, the secondary stage is compressed and the fission of the sparkplug must proceed effectively to burnout, before the main fusion reaction in the secondary stage gets started. Otherwise, boosting the sparkplug is unnecessary, because it would undergo fission from the neutrons produced in the main fusion event.
In other words, the fusion stage takes a longer time to start fusing than the time taken for the fission of the sparkplug, which occurs first. So it is an advantage to boost the sparkplug, to help it heat up the centre of the fusion charge.
Hello Nige,
Have you read the second edition of Hansen's Swords?
I have not read it yet, but it seems that he provided data about the new (1962) Livermore high-yield devices?
I have only read Hansen's 1988 published book, "U.S. Nuclear Weapons" (Aerofax), when it was in the British Library's SRIS. I exchanged emails with Hansen about nuclear weapons effects data, e.g. non-fission neutron capture ratios in U238 which produce U239 and U237 and their effects on fallout decay rates, but he replied he did not have any data on that. Presumably he did not have possession of the fully declassified detailed DASA-1251 volumes on the composition of fallout actinides. I don't think they have been fully declassified, apart from the fallout pattern compilations. The most important compilations of close-in EMP waveforms from nuclear tests are also still secret today, because the presence and relative yields of the primary fission and thermonuclear stages can be inferred from EMP data.
I recently re-read the original 1945 Smyth report "Atomic Energy for Military Purposes", where General Groves writes in the foreword that the purpose of the report is to give out all unclassified data on the subject, and anyone securing or releasing any further data will be punished. The end of the book contains a conclusion by Smyth, stating that the American people need to be informed of the facts in order for democracy to work.
I think Hansen's point is that the secrecy failed to stop Stalin and others getting the H-bomb, so what's the point? Secrecy is a dangerous delusion which only keeps the very people who deserve the facts and pay for the research through their taxes, from being told the facts. It doesn't stop spies giving the data to the enemy. There is no security in keeping the facts secret from millions of taxpayers who fund the research, and it has the very grave danger of increasing terror and fear by allowing ignorance and superstition to go out of control.
The only way to guarantee national security is through efficient military and civil defense, and by safeguarding fissile and thermonuclear materials like deuterium for lithium deuteride, not a "top secret" stamp.
All the same, I don't think it would be wise to publish tested, detailed design data on deliverable nuclear weapons, because there will be some rogue states out there (ahem, Iran, to be specific, maybe also North Korea, India, Pakistan, and Israel to some degree) which have a nuclear weapons program and the necessary materials, so are at the stage that could well utilize ideas in detailed blueprints of 1962 American thermonuclear weapons.
So I don't think it is wise to discuss this kind of data openly. The only design-related materials I've seen are: fusion boosting and implosion force calculations in Bridgman's 2001 book Introduction to the Physics of Nuclear Weapons Effects (which gives some details in order to calculate in detail the neutron and prompt gamma ray emissions for different weapon designs); Rhodes's 1995 book Dark Sun on the 82 ton undeliverable liquid deuterium Ivy-Mike test; Arnold and Pyne's book Britain and the H-Bomb (they are the British government's official historians), which contains the most detailed data on actual tested H-bomb designs I've seen (the book reprints a detailed summary report on all the 1950s British H-bomb tests, written at the time by AWRE physicist Dr John Corner, which shows how early designs failed and were then modified and re-tested to improve performance); and the Cox report, http://www.house.gov/coxreport/cont/gncont.html
Arnold and Pyne's 2001 book is mainly about British H-bomb designs, but contains some interesting discussion of early American devices. The story goes that when, in 1958, Britain and America finally began to exchange H-bomb secrets, the British physicists were shocked to see the very simple and crude-looking long cylindrical fusion stage in the design blueprint for the Mk 28 megaton H-bomb. They had been using a completely different British system, with spherically shaped secondary stages, in all the 1957-8 British tests. According to the Cox Report, the final W87 and W88 American devices use spherical or egg-shaped secondary stages, not cylinders like early American bombs.
The key reason why some of this information should be available is for the design of long range cheap, efficient Orion type nuclear explosion propelled spacecraft for exploring the galaxy.
If spacecraft designers knew more about the safety and efficiency of thermonuclear weapon designs, there would be more progress in this area to overturn political obstacles and to abandon the worthless, expensive and inefficient gimmicks like short-ranged chemical rocket technology in favour of the only technology which can really go places.
Consider the devices of 1954.
The 10,000-megaton device: the SUNDIAL.
1. The document that Hansen cited: Minutes of the Forty-First Meeting of the General Advisory Committee to the USAEC, July 12-15, 1954. Mr. Herken used this version of the document.
The full document contains 74 pages.
This version was greatly reduced:
Document Type: CONFERENCE
Publication Date: 1954 Jul 15
Declassification Status: Sanitized
Document Pages: 0067
Accession Number: NV0073403
Document Number(s): AECGAC41
Originating Research Org.: ATOMIC ENERGY COMMISSION-GENERAL ADVISORY COMMITTE
OpenNet Entry Date: 1994 Aug 26 .
You can download it as PDF at www.osti.gov/opennet/servlets/purl/16091554-OR1Oon/16091554.pdf.
Part of the contents:
Weapons Briefings, pp. 2-35.
Livermore Briefings, pp. 30-42.
Deleted Analysis, 30.
Livermore Thermonuclear Plans, 32.
Sundial and [Deleted], 33.
Possible [Deleted] Tests, 34.
All deleted. Only on page 55 did the GAC discuss the Livermore program.
The actual quotes were:
"Dr. Fisk said that he felt that the Committee could endorse the small weapon program. He was concerned, however, about Dr. Edward Teller's 10,000 MT gadget, and wondered what fraction of the Laboratory's effort was being expended on [deleted]. Mr. Whitman had been shocked by the thought of 10,000 MT; it would contaminate the earth. Dr. Rabi's view was that talk about this device was an advertising stunt, and not to be taken too seriously." (So Rabi's reaction was just emotion; this is entirely clear from Teller's Memoirs on many other occasions after Oppie's case.) Von Neumann's reaction was quite different: "Dr. von Neumann agreed that the Laboratory was being run on very bad organizational principles, but functioned pretty well in spite of this. He said that the presentations had been good."
The new redacted version of the document is not available as a PDF file, but it can be ordered via e-mail:
Publication Date: 1954 Jul 15
Declassification Status: Sanitized
Document Pages: 0073
Accession Number: NV0411974
Document Number(s): AEC-GAC 41
Originating Research Org.: AEC-GENERAL ADVISORY COMMITTEE
OpenNet Entry Date: 2006 Feb 13 .
So it is highly unlikely that Hansen saw this version.
The second device: GNOMON. GNOMON was deleted from the old version.
But on the 1000 MT device, Hansen cited this document (there are several versions of it on OpenNet, but they were declassified at roughly the same time):
Title: STATEMENT OF COMMISSIONER THOMAS E MURRAY AT HEARINGS BEFORE THE JOINT COMMITTEE ON ATOMIC ENERGY ON FEBRUARY 23, 1956 (DELETED)
Author(s): UNK (AEC-ATOMIC ENERGY COMMISSION, THE U.S.)
Document Location: DOE/NV Nuclear Testing Archive, P.O. Box 98521, City: Las Vegas, State: NV, Zip: 89193-8521, Phone:(702)295-0712, Fax:(702)295-1808, Email:cic@nv.doe.gov
Document Type: REPORT
Publication Date: 1955 Nov 23
Declassification Status: Sanitized
Document Pages: 0015
Accession Number: NV0108397
Originating Research Org.: NO DATA AVAILABLE
OpenNet Entry Date: 1994 Aug 27
So Hansen took GNOMON from this document.
So there were 2 devices (remember, Dr. Foster spoke about 1954):
the 10,000 MT Sundial (it is possible that the original name for the device was SUN) and the 1000 MT Gnomon, an effective primary for it.
It may be that the new redacted version of the Minutes gives their dimensions and weights.
So Hansen described it falsely.
But Herken's description is a complete fake.
On some other occasions Hansen also created rubbish.
For example, Rhodes's and Hansen's descriptions may lead you to think that Hansen had a full copy of Teller's LA-643, On the Development of Thermonuclear Bombs.
Hansen claimed that the report was written in September 1947 and updated in February 1950.
But in fact this is wrong.
In reality this report is still SECRET; its actual description is: E. Teller, LA-643, "On the Development of Thermonuclear Bombs", May 7, 1948, LASL, 29.
[This report is Secret-RD.]
See Anne Fitzpatrick's PhD thesis, Igniting the Light Elements: The Los Alamos Thermonuclear Weapon Project, 1942-1952, LA-13577-T, 376 pages.
This is beautiful reading, since she used almost entirely classified documents.
Hansen was wrong when he said that Teller "hypothesized" a 1000 Mt Alarm Clock in LA-643.
In fact: "Teller conceded that delivery of a super by aircraft - at least in 1947 - would not work. He suggested other technological fixes: a boat or submarine might provide suitable alternatives to aircraft delivery. The Alarm Clock at this time did not constitute a lighter alternative to the Super: the version that Teller and Richtmyer had envisioned in 1946 appeared in theory capable of producing a billion-ton TNT equivalent explosion. It too could not be transported by air."
This is from that source. So the device had been investigated.
I believe that Sundial was in some way a redesigned, lithiated 1000 Mt Alarm Clock with separation of the fusion stages.
And most likely its weight was 4,000,000 pounds; York's testimony at these hearings is also illuminating.
So, in fact, the technical reports on thermonuclear weapons are still highly classified.
And the documents that Hansen obtained were not technical, and he obtained only a few thousand of them, not millions, of course.
Part 1.
Also, in 1988 Hansen created a lot of rubbish, when he falsely claimed much smaller, absurd yields for the largest US weapons that were built.
He stated 10 Mt for the Mk 41 instead of the actual 25 Mt; even when he obtained documents about the Mk 41 and the new Class A, he tried to keep the fake 10 Mt figure for the Mk 41.
When he cited the Minutes of AEC Meeting No. 1221, August 2, 1956, he cited only the Class A part: "On June 24, 1955, the AEC had received a DOD request for a feasibility study of a new Class 'A' weapon with a maximum weight of 25,000 lbs. and a 60 MT yield."
But for the new B he did NOT cite the 10,000 lbs and 25 Mt.
The old B was the Mk 21, 15 Mt; he and his NRDC friends faked it and gave it the yield of a never-existing clean version.
The Mk 36 is described as an enriched Mk 21 in Lee Bowen and Stuart Little, History of the AF Atomic Energy Program, five volumes, 9 parts.
The yield of the Mk 36 as 20 Mt has been known since 1956: U.S. Senate, Committee on Armed Services, National Defense Establishment - Unification of the Armed Services, Hearings, 80th Cong., 1st sess., on S. 758, GPO, Washington, 1947; Universal Military Training, Hearings, 80th Cong., 1st sess., GPO, Washington, 1948; Committee on Armed Services and Committee on Foreign Relations, Military Situation in the Far East, Hearings, 82nd Cong., 1st sess., GPO, Washington, 1951; Committee on Foreign Relations, Statements of... John Foster Dulles and Adm. Arthur Radford... on Foreign Policy and Its Relation to Military Programs, Hearings, 83d Cong., 2d sess., GPO, Washington, 1954; Subcommittee on the Air Force of the Committee on Armed Services, Study of Airpower, Hearings, 84th Cong., 2d sess., GPO, Washington, 1956.
Part 2.
The yield of the Mk 41 as 25 Mt has been known since 1962; Dr. Foster stated this several times at various hearings.
But the main question: what was the yield of the Mk 17/24 weapon - 10-15 Mt, 15-25 Mt?
No, the yield was 40 Mt.
The new A (design study began in 1955) was 60 Mt.
The new B: 25 Mt.
The old B: 15 Mt.
Weights: new A, 25,000 lb; new B, 10,000 lb.
Old A, 42,000 lb; old B, 17,000 lb.
Points to support this: Herken's interview with Rabi (1983):
"Rabi's influence was already subtly evident at Ike's briefing on Hardtack, where Strauss had justified high-yield H-bombs as necessary to compensate for aiming errors. Whereas Eisenhower in his first term had innocently suggested that it might be possible to confuse the Russians about U.S. capabilities by simply not announcing thermonuclear tests, Ike reminded Strauss on this occasion that a 40-megaton weapon would only cause about half again the damage of a 10-megaton bomb because 'the scaling laws apply on a cube root basis...'"
This was a question about the new A, but the 40 Mt refers to the old one. Herken did not understand that this refers to the old A.
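Eisenhower's "half again the damage" remark checks out under simple cube-root blast scaling (the damage radius for a given peak overpressure scaling as the cube root of yield):

```python
# Damage radius ratio for 40 Mt vs 10 Mt under cube-root scaling.
ratio = (40 / 10) ** (1 / 3)
print(round(ratio, 2))   # 1.59 -> roughly "half again" the damage radius
```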
From the minutes of the 39th GAC meeting, p. 23:
"The possibility that 30 megatons could be achieved in an Alarm Clock (TX-14)-type device, employing 95% Li-6, was impressive."
The 41st GAC minutes state that the Mk 17/24 used 95% Li-6.
From a letter dated September 23, 1955, to Lewis L. Strauss, Chairman, USAEC, from I. I. Rabi, Chairman, GAC:

"One question addressed to us was whether information on the external characteristics of a thermonuclear weapon (including size, weight, shape, and center of gravity, but specifically excluding any details about the interior) and certain information on yields, effects, and delivery systems would reveal to another country 'important information concerning the design or fabrication of the nuclear components' of the thermonuclear weapon.

We arrived at the following opinion after devoting a great deal of discussion to this difficult question. We believe that the information cited would not directly reveal important information concerning the design and fabrication of the nuclear components of the weapon.

Nevertheless, we feel that the information, taken in combination, may give valuable clues and stimulus and direction to the development of similar weapons. We are particularly concerned in this regard with the thought that revelation of the very high specific energy release attained in our weapons (on the order of two megatons per ton weight) would call attention to the fact that a radical new technique has been developed and is being exploited."
So the yield was 40 Mt, and the AF Atomic Energy Program states that there were plans to increase the weight to 50,000 pounds and obtain a larger yield.
And... SUNDIAL was the LAST device designed by Teller HIMSELF.
"The lab was still studying a device called the SUNDIAL, which was to be a very-high-yield weapon based upon one of the early ALARM CLOCK designs. The SUNDIAL would be so large that it could only be transported by ship. Although design of the SUNDIAL was still very much in a preliminary stage, Livermore thought it might be possible to test the primary of this device - called the GNOMON - during the next Pacific test series.92 (The GNOMON was reduced to a study program with the assignment of the XW-27 warhead and bomb to UCRL in the summer of 1955.)"
This is from the new Swords of Armageddon (IV-54).
As you can see, I was completely right.
GNOMON was a 1000 MT device.
The main problem is that Hansen's research was mainly compilation.
And the reasons for not building them were not Rabi and the GAC. That was an absurd conclusion by Herken, and by me (though I had huge doubts), because we did not have the information; it's the same story with Rhodes's book, killed by Igniting the Light Elements.
P.S. The minimal size of the Classical Super device is of the order of tens of gigatons, and in the 1970s triggers were designed for them with a yield-to-weight ratio of ~40 kt/kg - and this information is open. The trigger is also several gigatons, but with this ratio. Do you understand this?
P.P.S.
It is absurd that you reject my comments on your blog; however, there were emotions in them.
Ha, my attacks on Chuck were pointless; he corrected everything: 25 Mt for the Mk-41 and only 25 Mt (plans for a clean version in 1958 were cancelled in 1959), 19 Mt for the Mk-36Y1, 15 Mt for the Mk-21, and yes, 40 Mt for the Mk-17/24, so the stupid sites you cited are now dead. But he did not correct your radiation-mirror speculation, and he provided only the first data on warheads, like the 35 Mt Titan 2 warhead. The main point: the "clean" bomb was a fraud, so clean bombs were fake. ALL strikes planned against the USSR in the 1950s were ground bursts, and your dismissal of this is fake.
The fallout area mission was a special mission.
In one document there is a chart about this: on the horizontal axis, the number of delivered Mk-36Y1s (all ground bursts); on the vertical axis, radiation deaths by H+60 days.
100 bombs (1,300 Mt fission yield): 95 million deaths (out of a total population of 210 million).
200 bombs (2,600 Mt fission yield): 135 million deaths.
350 bombs (4,550 Mt fission yield): 165 million deaths.
500 bombs: 180 million deaths.
Assume August 1957 (the first Soviet thermonuclear bomb to enter the stockpile was tested in October 1957).
The US had at least 522 Mk-36Y1s, so to cause 165 million deaths with a 0.85 probability of delivery required 412 bombs and 206 B-52s.
The remaining 110 bombs would be used on China, and at the same fallout density (assuming a population of 700 million) would cause 305 million deaths.
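The bomb and sortie arithmetic above can be sketched in a few lines; the two-bombs-per-B-52 loading is inferred from the 412/206 figures in the comment, and the function name is for illustration only:

```python
import math

def bombs_required(delivered_needed, delivery_probability=0.85):
    """Bombs that must be committed so the expected number
    actually delivered reaches the target."""
    return math.ceil(delivered_needed / delivery_probability)

bombs = bombs_required(350)       # 350 delivered ground bursts -> 165 million deaths
sorties = math.ceil(bombs / 2)    # assuming two Mk-36Y1 bombs per B-52
print(bombs, sorties)             # -> 412 206
```

With 350 delivered bombs needed and a 0.85 delivery probability, 350/0.85 ≈ 411.8 rounds up to 412 committed bombs, and 522 − 412 leaves the 110 for China.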
And some last remarks:
Report LA-643 had several iterations, one of them for 1950; Hansen had a greatly sanitized 1950 version.
Prior to the first edition of Swords, Hansen saw a second edited version of the GAC 41 minutes, in which only GNOMON was given in the contents; when Fisk asked about them, they were given.
GNOMON was not named in Murray's 23 February 1956 statement before the AEC; he only gave it as an example, simply stating its yield to show that weapons much larger than 60 Mt were possible.
Some data exists on this Soviet 20-megaton warhead.
Its designation was A604G; the Tsar Bomba's designation was A620N. "A" probably stands for Arzamas, "N" probably for experimental, "G" probably for prime.
No such designation as RDS-220 ever existed; the sites are wrong.
In 1956 Chelyabinsk-70 designed a 30-megaton bomb (a 10 times scaled-up RDS-37) with the designation RDS-202. A special bomber was built, the Tu-95-202 (nicknamed Tu-95V, meaning H-bomb). The RDS-202 weighed 40 tons with its ballistic case; in 1957-1958 the project was abandoned. At that time this bomb was considered a weapon, but from 1960 the Soviets no longer regarded bombers as reliable weapons and did not arm them with such yields; their main effort was directed at ICBMs. The A620N was built as an experimental device for intimidation, but its scaled-down 40 and 50 Mt versions were considered as missile warheads.
The A620N weighed around 20 tonnes.
Its primary was an 860 kt two-stage device named "device 49", tested on 23 February 1958; it was the ancestor of the first mass-produced Soviet thermonuclear weapons:
the R-12/14/16/9A warheads, the R-13 and Kh-20 warheads, plus a 300 kt tactical bomb.
"Device 49" was later tested in a clean version at 440 kt yield on 20 October 1958, so the Soviets also considered clean devices.
So in the Tsar Bomba the primary gave only around 440 kt from fission.
The R-36 missile's heavy RV (8F675) weighed 4,560 kg (the payload was 5,825 kg; the remainder was penetration aids etc., contrary to CIA estimates), so the warhead weighed around 3,600 kg, and its boosted primary was no larger than 68 kt.
Although the CIA overestimated Soviet megatonnage by around 30% (in 1973 the Soviets had around 12,000 Mt), they greatly underestimated the number of battlefield warheads: 33,000 in 1985. But there are two things that the West never considered. I previously did not believe in this, but when I saw actual Soviet documents I was scared to death:
1. The largest bioweapon program in the world, and this after 1972.
...
2. Dead Hand...
If the USSR had not evaporated in 1991, by 1995 they would have armed a fraction of their ICBMs with such things as super-plague...
It seems to me that nukes are not such bad things compared to this.
Please ask Hoover if you do not believe me; Holloway may also help.
So, as you can see, the Communists in fact planned the conquest of Europe.
At the beginning...
This is the first comment to be removed (censored) by historians.
The Mk-41 warhead weighed 9,300 pounds; length 120 inches, diameter 45 inches.
The warhead had 3 applications:
NAVAHO
Hustler's pod
NGB.
So, an anecdote: this weapon was not a 3-stage but a 2-stage design, tested in a clean version in the Poplar shot. This is why Taylor stated a 6 kt/kg ratio.
Actually things are much more complicated, contrary to beliefs; Hansen's Swords is only a very, very fragmentary compilation.
Compare 15,000 and 7,000,000.
Source: Sandia's history of the Mk 41-0.
P.S. The Y2 for the Mk-41 was Hansen's private contribution, a fruit of his imagination, as were all the things in his false 1988 book and the NRDC trash.
Are you insisting that a nuclear explosion will never be able to vaporize human bodies? Then what kind of nuclear explosion can instantly vaporize anything within the ground zero area?
Hiroshima's ground zero was 600 metres below the explosion, because it was an AIR BURST at 600 metres altitude, not a surface burst.
People in the open at ground zero received surface burns on the side facing the burst: the thermal exposure at ground zero (600 metres slant range) from a 16 kt Hiroshima detonation with 1/3rd thermal radiation yield (ignoring atmospheric attenuation), by the inverse square law, is {16 x (1/3) x 10^12 calories}/(4*Pi*600^2) = 1.18 x 10^6 calories/m^2 or 118 cal/cm^2.
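The inverse-square estimate can be checked with a short script (1 kt taken as 10^12 calories; atmospheric attenuation ignored, as in the calculation above):

```python
import math

def thermal_fluence(yield_kt, thermal_fraction, slant_range_m):
    """Thermal fluence in cal/cm^2 at a slant range, by the inverse
    square law, ignoring atmospheric attenuation (1 kt = 1e12 cal)."""
    cal = yield_kt * thermal_fraction * 1e12
    per_m2 = cal / (4 * math.pi * slant_range_m ** 2)
    return per_m2 / 1e4  # convert cal/m^2 to cal/cm^2

# Hiroshima: 16 kt, 1/3 thermal fraction, 600 m burst height (ground zero)
print(round(thermal_fluence(16, 1/3, 600)))  # -> 118
```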
Now, 1 cal is the energy needed to raise the temperature of 1 gram (1 cm^3) of water (humans are 70% water) by 1 degree centigrade or 1 K, from 15 to 16 C. For humans, to reach vaporization you need to go from 37 to 100 C. If 118 cal/cm^2 is distributed through the top 1 cm thickness of flesh (with no loss due to heat reflection, ablation of the top 1 mm layer, and other damage-limiting processes), then that top 1 cm layer could potentially gain about 118 C in temperature (37 C body temperature + 118 C = 155 C), IGNORING the change of state from water to steam when 100 C is passed.
However, you need a massive amount of extra energy (the enthalpy of vaporization, see http://en.wikipedia.org/wiki/Enthalpy_of_vaporization ) to convert water at 100 C into steam at 100 C, because water molecules have strong bonds between them which must be broken when water turns to steam.
"... the molecules in liquid water are held together by relatively strong hydrogen bonds, and its enthalpy of vaporization, 40.65 kJ/mol [note that 1 cal = 4.186 J, while 1 mole consists of 6.022 x 10^23 molecules of water per 18 grams of water], is more than five times the energy required to heat the same quantity of water from 0 °C to 100 °C."
- http://en.wikipedia.org/wiki/Enthalpy_of_vaporization
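Putting the numbers from the quote together: heating 1 g of water from 37 C to 100 C takes 63 cal, but the phase change itself takes 40.65 kJ/mol over 18 g/mol, about 540 cal/g, so fully vaporizing even a 1 cm deep, 1 cm^2 column of water needs roughly five times the 118 cal/cm^2 delivered at ground zero:

```python
CAL_PER_J = 1 / 4.186    # 1 cal = 4.186 J
GRAMS_PER_MOL = 18.0     # molar mass of water

heat_to_boiling = 1.0 * (100 - 37)                # cal/g (specific heat 1 cal/g/K)
vaporization = 40650 * CAL_PER_J / GRAMS_PER_MOL  # cal/g (enthalpy of vaporization)
total = heat_to_boiling + vaporization            # cal per gram = cal/cm^2 for 1 cm depth

print(round(total), round(total / 118, 1))  # ~602 cal/cm^2 needed, ~5x the 118 delivered
```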
So the Hiroshima bomb did NOT deliver enough heat flash energy at GROUND ZERO to even vaporize a layer of water 1 cm thick. If you look at the data on the scorching depths of wood by the thermal flash at the 1955 Operation Teapot tests (Kyle P. Laughlin, Thermal Ignition and Response of Materials, Report to the Test Director, Operation TEAPOT, Nevada Test Site, February-May 1955, Office of Civil and Defense Mobilization, weapon test report WT-1198, December 1957, AD0611227), less than 1 mm of the surface of the wood was removed even by 50 cal/cm^2 or more: additional energy did not increase the depth of charring (the extra energy just went into ablating the top 1 mm more forcefully into a smoke cloud, which then absorbed the remainder of the heat flash and kept the heat absorbed well away from the underlying wood). The same applies at Hiroshima, where the moisture content of skin was 70% (much higher than wood, which was easier to heat due to its lower water content; WATER HAS THE HIGHEST SPECIFIC HEAT CAPACITY OF ANY COMMON MATERIAL ON THE PLANET).
The only reason that the top 0.1 mm of roof tile surfaces bubbled at ground zero in Hiroshima was the failure of the 118 cal/cm^2 to penetrate more deeply than 0.1 mm. The tiny depth of the surface which absorbed the energy ensured that the temperature rise was massive, over 2000 C in that tiny 0.1 mm tile surface. Nobody was instantly killed by thermal radiation outside; the nuclear radiation was lethal.
I see. Then what kind of atomic explosion can instantly vaporize human bodies, including their bones and teeth, until nothing remains?
Okay, one more question: what happens to human bodies directly hit at the centre of ground zero of the Little Boy explosion? Can the bodies be vaporized until not even a single trace is left?
As I said, even ignoring atmospheric attenuation you get just over 100 cal/cm^2, and even Dr Herman Pearse (Professor of Surgery, Rochester University) who wrote the original panic-mongering thermal burns disaster paper on Hiroshima in the New England Journal of Medicine, began to grasp that this was survivable and went to Eniwetok Atoll in 1951 to expose animals to the 47 kt boosted Easy test and the 225 kt thermonuclear George test:
"Finally, we wanted to know how we could protect against these burns. ... I didn't care what happened to the fabrics; I wanted to know what happened to the man under the fabric. So we conceived this idea, that the important factor in studying clothing was what happened under the clothing; how it shielded the animal with cloth of different composition, weight, texture, weave, and color. We have made a great many studies in tyhe laboratory and in the field on this problem of the protective effect of clothing. ... If you have 2 layers, an undershirt and a shirt, you will get much less protection than if you have 4 layers; and if you get up to 6 layers, you have such great protection from thermal effects that you will be killed by some other thing. Under 6 layers we got about 50 percent first degree burns at 107 calories/cm^2."
- Dr Herman Elwyn Pearse, Professor of Surgery at the University of Rochester, "Biomedical Effects of Thermal Radiation", page 143 (published in the U.S. Federal Civil Defense Administration's book, "Cue for Survival, A.E.C. Nevada Test Site, May 3, 1955", pages 140-144).
This is for yields similar to Hiroshima (for modern larger nuclear weapons, even more heat is needed for a burn, because it is spread out longer in time and more heat is lost by diffusion, and of course people have longer to take evasive "duck and cover" countermeasures to get into a shadow). Therefore, if the bomb had been dropped on Hiroshima in cold weather, even at ground zero the skin under clothing would not be burned worse than 1st degree burns, mere sunburn. The outer clothing if dark in colour would be smoked to ignition on surfaces directly facing the fireball, but ignited clothing was easily rolled out or beaten out in Hiroshima. It is NOT the same as peacetime clothing burn accidents where people get their clothes soaked in gasoline before ignition!
Sure, if you ground burst a nuclear bomb on a packed beach where people are not wearing clothes, those VERY near the bomb will be quickly vaporized and pulverized by the great heat of the radiating shock wave near ground zero (a blast effect, not a thermal radiation effect), just as they would be in a CONVENTIONAL explosion. What people forget here is that in WWII conventional bombs of up to 10 tons were dropped, and the cube-root blast scaling law applies to this hot blast wave. Hence a 10 kiloton blast yield in a Hiroshima or Nagasaki burst has just 10 times the radius of destruction of the WWII conventional bomb: (10,000/10)^{1/3} = 10. Big deal. All war is destructive, and nuclear bombs are a disproportionately expensive way of achieving results which could be had more cheaply from conventional weapons; which explains the LYING since Hiroshima in order to use nuclear weapons to deter WWIII instead of conscripting a massive army. On balance, the risk of WWIII is bigger with the massive arms and armies of conventional warfare than with nuclear weapons. This worked well in the Cold War for the West, but exaggerations are dangerous in encouraging terrorists and in making civil defence appear hopeless, when it is not hopeless unless you're near the crater.
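The cube-root scaling step above, as a one-line check (yields in tons of TNT equivalent, so 10 kt = 10,000 tons):

```python
def radius_ratio(yield_a_tons, yield_b_tons):
    """Cube-root blast scaling: ratio of blast damage radii for two yields."""
    return (yield_a_tons / yield_b_tons) ** (1 / 3)

# 10 kt nuclear burst vs a 10 ton WWII conventional bomb:
print(round(radius_ratio(10_000, 10)))  # -> 10
```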