Above: on 3 October 1960, James Robert Brown and Jack Sharpe filed US patent 3,126,482 for EMI Limited (the music publishers who also made and sold electronics, including Nuclear Enterprises Ltd, the Edinburgh-based factory for radiation meters), “Radio-activity contamination monitor with discrimination means for alpha and beta radiation,” which was granted 24 March 1964. The brains of this invention lie in its ability to discriminate between the 5 MeV alpha particles from typical fissile and fissionable fuel (uranium, plutonium, Am-241, etc.) and the roughly 0.5 MeV beta particles typical of fission products. Alpha particles carry more energy and thereby cause larger flashes of light when they hit zinc sulphide than beta particles do (or than the Compton electrons released from atoms by gamma rays). All light flashes are detected in a photomultiplier probe fed with 850 volts (which is broken down into a range of potentials by a dynode chain of resistors, allowing electrons to be accelerated by successive multiplier plates in the probe). The box of tricks with the meter, batteries and loudspeaker contains 30 transistors (no ICs), which include two discriminator circuits for the amplified current pulses from the probe. Large current pulses, corresponding to particles of over 3 MeV, go into an alpha particle circuit and are assigned a "beeping pulse sound", while all smaller current pulses are assigned "clicks" (like an old-fashioned geiger counter). Switched to alpha+beta, this instrument therefore allows the user to discriminate between alpha and beta radiation by the sense of hearing. Basically, the alpha detection circuit is like the "squelch" control in an old analogue radio transceiver: it cuts out (silences) all weak "noise" and is only triggered by strong signals. The scintillation probe is constantly sending out pulses of varying current. The alpha discriminator circuit simply ignores all pulses which are too small to have come from an alpha particle.
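The two-channel discrimination described above can be sketched in a few lines. This is a minimal illustration of the pulse-height logic only, not EMI's actual circuit: the 3 MeV threshold comes from the patent description, but the function name and example pulse energies are made up for illustration.

```python
# Sketch of the PCM's pulse-height discrimination: pulses above the
# alpha threshold trigger the "beep" channel, smaller ones the "click"
# channel, mimicking the two discriminator circuits in the analyser box.

ALPHA_THRESHOLD_MEV = 3.0  # discriminator setting quoted in the text


def classify_pulse(energy_mev):
    """Return which audio channel a scintillation pulse triggers."""
    if energy_mev > ALPHA_THRESHOLD_MEV:
        return "alpha (beep)"
    return "beta (click)"


# Example pulse energies in MeV: alphas near 5 MeV, betas near 0.5 MeV.
for e in [5.0, 0.5, 0.3, 4.8, 1.1]:
    print(f"{e:4.1f} MeV -> {classify_pulse(e)}")
```

The squelch analogy in the text is exactly this: a single comparison against a fixed threshold, with everything below it routed to the ordinary "click" channel.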
This makes the "alpha only" setting very useful in making the probe respond only to nearby contamination on the clothing it is passing over. Older geiger-probe based "contamination meters" proved useless in fallout areas because of the very high background count rate from beta and gamma emitters far away from the object the probe is placed near. These instruments were used in the British nuclear reactor industry to tell apart plutonium (alpha) from fission products (beta). The original 1960s instruments were the bulky box-like cream plastic PCM1 and PCM2, requiring eight D cells for power and demonstrated by Sean Connery in Dr No, but a more compact green-coloured metal PCM3 soon followed, which during the 1970s evolved into the 1980s cream-coloured PCM5 shown in the video, which requires only two D cells (or rather, two rechargeable AA cells placed in plastic AA-to-D cell converters). After using the PCM5 for a few hours, you can clip the meter box by the handle onto your belt and reliably assess contamination levels by hearing alone, with the probe in one hand. Various digital display versions are now available, including the LB 124 SCINT, which combines the probe and analyser into one convenient handheld unit.
Here's good news. Now you can measure alpha too with the camera of a good smartphone, if you put a thin layer of zinc sulphide (ZnS, sold on ebay by chemical suppliers) on the back of a sheet of transparent plastic (such as the clear "windows" in retail packaging) and then cover the zinc sulphide with a layer of thin light-proof mylar foil (such as is used in toy balloons), which lets about half of the incoming 5 MeV alpha particles through but stops light. Since only a small amount of zinc sulphide is needed to detect alpha particles (obviously you need a thickness of zinc sulphide no greater than the short range of an alpha particle in it), anyone buying a jar of it could make hundreds or thousands of these zinc sulphide coated foil camera lens covers as very cheap "stickers", for distribution at low cost on ebay in an alpha radiation emergency. They are easily calibrated by everyone, because suitable alpha radiation sources are available in household Am-241 based ionization smoke detectors (which cost about £5 in supermarkets). Putting that ZnS sticker over the smartphone camera aperture allows alpha radiation to be measured: to calibrate it, you simply compare the reading given by a sample to that given by the 1 microcurie Am-241 alpha source contained in all ionization smoke detectors (not optical or IR smoke detectors). This is quite accurate, because the Am-241 sources in smoke detectors are manufactured to tight activity tolerances in order to conform to regulated and standardised safety requirements.
Flashes occurring on the CCD image chip of the camera in a smartphone, appearing like white dots of interference, are due to ionizing radiation. By calibrating a smartphone with a light-covered (black-taped) camera lens using the Am-241 radioactive source from an ordinary spare household smoke detector, reliable measurements of radiation are possible using everyday household items.
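The calibration arithmetic described above is a simple ratio: count flashes from the unknown sample, count flashes from the known ~1 microcurie Am-241 smoke detector source with the same taped-over camera, and scale. A minimal sketch, in which the counting times and counts are made-up illustration values and identical source-to-camera geometry is assumed for both measurements:

```python
# Estimate the activity of an unknown alpha source by comparing its flash
# count rate against the nominal 1 microcurie Am-241 reference source from
# an ionization smoke detector, counted with the same camera setup.

AM241_ACTIVITY_UCI = 1.0  # nominal smoke-detector source, microcuries


def estimate_activity(sample_counts, sample_seconds,
                      reference_counts, reference_seconds,
                      reference_activity_uci=AM241_ACTIVITY_UCI):
    """Activity estimate by simple ratio of count rates.

    Assumes identical geometry and detection efficiency for both counts.
    """
    sample_rate = sample_counts / sample_seconds
    reference_rate = reference_counts / reference_seconds
    return reference_activity_uci * sample_rate / reference_rate


# Example: 30 flashes in 60 s from the sample versus 120 flashes in 60 s
# from the Am-241 reference gives an estimate of 0.25 microcuries.
print(estimate_activity(30, 60, 120, 60))
```

Because the method is a pure ratio, the unknown detection efficiency of any particular phone camera cancels out, which is why the text calls it "quite accurate" despite the improvised detector.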
Above: scintillation counters use complex electron accelerator and multiplier vacuum tubes, but their role is simply to detect the flashes of light emitted by crystals which are disturbed by radiation. (Note that there are no heaters, and there is a "dynode chain" of resistors to break up an input of 850 volts or whatever into a range of different potentials for the different plates in the probe. Therefore, the scintillation counter can be connected to the detector circuit box by just two conductors, e.g. simple coax cable.) Many materials emit light in this way. Even before the discovery in 1906 that gamma radiation from radium sources damages fast-dividing cancer cells, it had been discovered by Becquerel that ionizing radiation is like Roentgen's x-rays in having light-like effects on photographic plates (hence medical x-rays) and also makes zinc sulphide crystals glow. While zinc sulphide responds to alpha particles, ordinary perspex (a transparent plastic) is a good beta particle phosphor. By coating zinc sulphide on to perspex (or anthracene) you therefore get the phosphor which is the basis for the DP2 alpha-beta "dual probe". The zinc sulphide (alpha) side of the perspex is placed in direct contact with a thin light-proof mylar foil (metal-coated plastic) which lets through over 50% of 5 MeV alpha particles, but stops light from entering. Beta particles can easily penetrate the thin zinc sulphide before causing light scintillations in the perspex. Gamma rays also produce Compton electrons with typically about half the energy of the original gamma ray, so those Compton electrons are similar to beta particles and can be detected in the same way. A large crystal of thallium-doped sodium iodide gives much greater sensitivity for gamma radiation, however.
The first scintillation counter was simply zinc sulphide fixed to the inside of a thin opaque metal foil at one end of a tube, allowing a glow of light to be seen when an alpha radiation emitter is placed beside the thin light-proof metal foil: you can see the glow by looking into the open end of the tube in a darkened room and with a lens you can even see the individual flashes, just like an expensive instrument. Geiger and Marsden used this manual approach to "counting" to prove Rutherford's nuclear atom theory by measuring the distribution of alpha particles scattered by a thin gold foil, until eye strain and boredom motivated Geiger to develop his electronic counter.
Nuclear Enterprises (EMI) Portable Contamination Meter 5, PCM5, with scintillation probe in current use for decontamination of nuclear waste.
EMI/Nuclear Enterprises radiation meters, including two PCM1 alpha-beta discriminating meters, in the decontamination centre of the surprisingly hygienic but wicked Dr No, in the first James Bond movie (1962).
Sean Connery using PCM1 in the 007 film Dr No, 1962.
Connery as 007 being carefully "frisked" with a DP2R dual alpha-beta probe, which is plugged into the hand-and-clothing monitor on the right, which has simultaneous dials for alpha and for beta contamination. This equipment was all bona fide EMI/Nuclear Enterprises stock, and the success of the realistic decontamination sequences of the film proved a marketing help for their sales of advanced radiation monitors. Although the PCM1 was fully transistorised and used only 1.5 volt D cells (twelve of them, providing 18 volts!), it was chunky and heavy. The PCM2 of 1965 differed only in offering a probe clip to allow one-handed operation. In 1968, the PCM3 was a streamlined metal-cased version of half the weight, powered by a 9 volt battery. By 1980, this evolved into the lightweight PCM5, requiring only 3 volts (two D cells).
Different smartphone cameras have differing sensitivity to radiation: the Samsung Galaxy S4 gave 2 flashes per minute for 1 microGray/hour, while the Samsung Galaxy Mini GT-S5570 gave 27 flashes/minute for the same radiation level. In general, the newer and higher quality the smartphone camera, the more sensitive it is to radiation. (If you have a phone with two cameras, front and back, you only need to cover up one of them with tape to use as a radiation meter, leaving the other free to take pictures or video. The full list of data is found at: http://www.rdklein.de/html/radioa_data.html.)
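The per-model sensitivity figures quoted above turn directly into a dose-rate conversion. A minimal sketch, assuming (as the source data implies) that the flash rate is linear in dose rate; only the two sensitivities stated in the text are used, and the observed counts in the example are invented for illustration:

```python
# Convert an observed flash rate into a dose rate, using per-model
# sensitivities in flashes per minute at 1 microGray/hour.

SENSITIVITY = {
    "Samsung Galaxy S4": 2.0,             # flashes/min per uGy/h
    "Samsung Galaxy Mini GT-S5570": 27.0, # flashes/min per uGy/h
}


def dose_rate_ugy_per_h(model, flashes, minutes):
    """Dose rate in microGray/hour, assuming linear detector response."""
    observed_rate = flashes / minutes  # flashes per minute
    return observed_rate / SENSITIVITY[model]


# Example: 54 flashes counted in 2 minutes on the Galaxy Mini
# corresponds to 27 flashes/min, i.e. 1 microGray/hour.
print(dose_rate_ugy_per_h("Samsung Galaxy Mini GT-S5570", 54, 2))
```

Note how the same flash rate means very different dose rates on different phones, which is why the per-model calibration table at the linked site matters.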
Historically, it has been proved that having weapons is not enough to guarantee a reasonable measure of safety from terrorism and rogue states; countermeasures are also needed, both to make any deterrent credible and to negate or at least mitigate the effects of a terrorist attack. Some people who wear seatbelts die in car crashes; some people who are taken to hospital in ambulances, even in peace-time, die. Sometimes, lifebelts and lifeboats cannot save lives at sea. This lack of a 100% success rate in saving lives doesn't disprove the value of everyday precautions or of hospitals and medicine. Hospitals don't lull motorists into a false sense of security, causing them to drive faster and cause more accidents. Like-minded ‘arguments’ against ABM and civil defense are similarly vacuous.
‘As long as the threat from Iran persists, we will go forward with a missile system that is cost-effective and proven. If the Iranian threat is eliminated, we will have a stronger basis for security, and the driving force for missile-defense construction in Europe will be removed.’
‘The [ABM] treaty was in 1972 ... The theory ... supporting the ABM treaty [which prohibits ABM, thus making nations vulnerable to terrorism] ... that it will prevent an arms race ... is perfect nonsense because we have had an arms race all the time we have had the ABM treaty, and we have seen the greatest increase in proliferation of nuclear weapons that we have ever had. ... So the ABM treaty preventing an arms race is total nonsense. ...
‘The Patriot was not a failure in the Gulf War - the Patriot was one of the things which defeated the Scud and in effect helped us win the Gulf War. One or two of the shots went astray but that is true of every weapon system that has ever been invented. ...
‘President Bush said that we were going ahead with the defensive system but we would make sure that nobody felt we had offensive intentions because we would accompany it by a unilateral reduction of our nuclear arsenal. It seems to me to be a rather clear statement that proceeding with the missile defence system would mean fewer arms of this kind.
‘You have had your arms race all the time that the ABM treaty was in effect and now you have an enormous accumulation and increase of nuclear weapons and that was your arms race promoted by the ABM treaty. Now if you abolish the ABM treaty you are not going to get another arms race - you have got the arms already there - and if you accompany the missile defence construction with the unilateral reduction of our own nuclear arsenal then it seems to me you are finally getting some kind of inducement to reduce these weapons.’
Before the ABM system is in place, and afterwards if ABM fails to be 100% effective in an attack, or is bypassed by terrorists using a bomb in a suitcase or in a ship, civil defense is required and can be effective at saving lives:
‘Paradoxically, the more damaging the effect, that is the farther out its lethality stretches, the more can be done about it, because in the last fall of its power it covers vast areas, where small mitigations will save very large numbers of people.’
‘The purpose of a book is to save people [the] time and effort of digging things out for themselves. ... we have tried to leave the reader with something tangible – what a certain number of calories, roentgens, etc., means in terms of an effect on the human being. ... we must think of the people we are writing for.’
“FY 1997 Plans: ... Provide text to update Glasstone's book, The Effects of Nuclear Weapons, the standard reference for nuclear weapons effects. ... Update the unclassified textbook entitled, The Effects of Nuclear Weapons. ... Continue revision of Glasstone's book, The Effects of Nuclear Weapons, the standard reference for nuclear weapons effects. ... FY1999 Plans ... Disseminate updated The Effects of Nuclear Weapons.”
‘The evidence from Hiroshima indicates that blast survivors, both injured and uninjured, in buildings later consumed by fire [caused by the blast overturning charcoal braziers used for breakfast in inflammable wooden houses filled with easily ignitable bamboo furnishings and paper screens] were generally able to move to safe areas following the explosion. Of 130 major buildings studied by the U.S. Strategic Bombing Survey ... 107 were ultimately burned out ... Of those suffering fire, about 20 percent were burning after the first half hour. The remainder were consumed by fire spread, some as late as 15 hours after the blast. This situation is not unlike the one our computer-based fire spread model described for Detroit.’
- Defense Civil Preparedness Agency, U.S. Department of Defense, DCPA Attack Environment Manual, Chapter 3: What the Planner Needs to Know About Fire Ignition and Spread, report CPG 2-1A3, June 1973, Panel 27.
“... the city lacked buildings with fire-protective features such as automatic fire doors and automatic sprinkler systems”, and pages 26-28 state the heat flash in Hiroshima was only:
“... capable of starting primary fires in exposed, easily combustible materials such as dark cloth, thin paper, or dry rotted wood exposed to direct radiation at distances usually within 4,000 feet of the point of detonation (AZ).”
Volume two examines the firestorm and the ignition of clothing by the thermal radiation flash in Hiroshima:
“Scores of persons throughout all sections of the city were questioned concerning the ignition of clothing by the flash from the bomb. ... Ten school boys were located during the study who had been in school yards about 6,200 feet east and 7,000 feet west, respectively, from AZ [air zero]. These boys had flash burns on the portions of their faces which had been directly exposed to rays of the bomb. The boys’ stories were consistent to the effect that their clothing, apparently of cotton materials, ‘smoked,’ but did not burst into flame. ... a boy’s coat ... started to smoulder from heat rays at 3,800 feet from AZ.” [Contrast this to the obfuscation and vagueness in Glasstone, The Effects of Nuclear Weapons!]
“Ignition of the City. ... Only directly exposed surfaces were flash burned. Measured from GZ, flash burns on wood poles were observed at 13,000 feet, granite was roughened or spalled by heat at 1,300 feet, and vitreous tiles on roofs were blistered at 4,000 feet. ... six persons who had been in reinforced-concrete buildings within 3,200 feet of air zero stated that black cotton blackout curtains were ignited by radiant heat ... dark clothing was scorched and, in some cases, reported to have burst into flame from flash heat [although as the 1946 unclassified USSBS report admits, most immediately beat the flames out with their hands without sustaining injury, because the clothing was not drenched in gasoline, unlike peacetime gasoline tanker road accident victims]
“... but a large proportion of over 1,000 persons questioned was in agreement that a great majority of the original fires was started by debris falling on kitchen charcoal fires, by industrial process fires, or by electric short circuits. Hundreds of fires were reported to have started in the centre of the city within 10 minutes after the explosion. Of the total number of buildings investigated [135 buildings are listed] 107 caught fire, and in 69 instances, the probable cause of initial ignition of the buildings or their contents was as follows: (1) 8 by direct radiated heat from the bomb (primary fire), (2) 8 by secondary sources, and (3) 53 by fire spread from exposed [wooden] buildings.”
‘During World War II many large cities in England, Germany, and Japan were subjected to terrific attacks by high-explosive and incendiary bombs. Yet, when proper steps had been taken for the protection of the civilian population and for the restoration of services after the bombing, there was little, if any, evidence of panic. It is the purpose of this book to state the facts concerning the atomic bomb, and to make an objective, scientific analysis of these facts. It is hoped that as a result, although it may not be feasible completely to allay fear, it will at least be possible to avoid panic.’
‘The consequences of a multiweapon nuclear attack would certainly be grave ... Nevertheless, recovery should be possible if plans exist and are carried out to restore social order and to mitigate the economic disruption.’
‘Suppose the bomb dropped on Hiroshima had been 1,000 times as powerful ... It could not have killed 1,000 times as many people, but at most the entire population of Hiroshima ... [regarding the hype about various nuclear "overkill" exaggerations] there is enough water in the oceans to drown everyone ten times.’
In 1996, half a century after the nuclear detonations, data on cancers from the Hiroshima and Nagasaki survivors was published by D. A. Pierce et al. of the Radiation Effects Research Foundation, RERF (Radiation Research, vol. 146, pp. 1-27; Science, vol. 272, pp. 632-3) for 86,572 survivors, of whom 60% had received bomb doses of over 5 mSv (or 500 millirem in old units). They suffered 4,741 cancers, of which only 420 were due to radiation, consisting of 85 leukemias and 335 solid cancers.
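The RERF figures quoted above reduce to a striking fraction with a couple of lines of arithmetic (only the numbers stated in the text are used here):

```python
# The 1996 Pierce et al. RERF figures for the Hiroshima and Nagasaki
# survivor cohort, reduced to the fraction of cancers attributed to
# bomb radiation rather than to ordinary causes.

total_survivors = 86_572
total_cancers = 4_741
radiation_cancers = 85 + 335  # 85 leukemias + 335 solid cancers = 420

fraction_radiogenic = radiation_cancers / total_cancers
print(f"{100 * fraction_radiogenic:.1f}% of the cancers were radiogenic")
```

In other words, under 9% of the cancers in the studied survivors were attributable to the bombs; over 91% were the ordinary cancer incidence of an ageing population.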
‘Today we have a population of 2,383 [radium dial painter] cases for whom we have reliable body content measurements. . . . All 64 bone sarcoma [cancer] cases occurred in the 264 cases with more than 10 Gy [1,000 rads], while no sarcomas appeared in the 2,119 radium cases with less than 10 Gy.’
‘... it is important to note that, given the effects of a few seconds of irradiation at Hiroshima and Nagasaki in 1945, a threshold near 200 mSv may be expected for leukemia and some solid tumors. [Sources: UNSCEAR, Sources and Effects of Ionizing Radiation, New York, 1994; W. F. Heidenreich, et al., Radiat. Environ. Biophys., vol. 36 (1999), p. 205; and B. L. Cohen, Radiat. Res., vol. 149 (1998), p. 525.] For a protracted lifetime natural exposure, a threshold may be set at a level of several thousand millisieverts for malignancies, of 10 grays for radium-226 in bones, and probably about 1.5-2.0 Gy for lung cancer after x-ray and gamma irradiation. [Sources: G. Jaikrishan, et al., Radiation Research, vol. 152 (1999), p. S149 (for natural exposure); R. D. Evans, Health Physics, vol. 27 (1974), p. 497 (for radium-226); H. H. Rossi and M. Zaider, Radiat. Environ. Biophys., vol. 36 (1997), p. 85 (for radiogenic lung cancer).] The hormetic effects, such as a decreased cancer incidence at low doses and increased longevity, may be used as a guide for estimating practical thresholds and for setting standards. ...
‘Though about a hundred of the million daily spontaneous DNA damages per cell remain unrepaired or misrepaired, apoptosis, differentiation, necrosis, cell cycle regulation, intercellular interactions, and the immune system remove about 99% of the altered cells. [Source: R. D. Stewart, Radiation Research, vol. 152 (1999), p. 101.] ...
‘[Due to the Chernobyl nuclear accident in 1986] as of 1998 (according to UNSCEAR), a total of 1,791 thyroid cancers in children had been registered. About 93% of the youngsters have a prospect of full recovery. [Source: C. R. Moir and R. L. Telander, Seminars in Pediatric Surgery, vol. 3 (1994), p. 182.] ... The highest average thyroid doses in children (177 mGy) were accumulated in the Gomel region of Belarus. The highest incidence of thyroid cancer (17.9 cases per 100,000 children) occurred there in 1995, which means that the rate had increased by a factor of about 25 since 1987.
‘This rate increase was probably a result of improved screening [not radiation!]. Even then, the incidence rate for occult thyroid cancers was still a thousand times lower than it was for occult thyroid cancers in nonexposed populations (in the US, for example, the rate is 13,000 per 100,000 persons, and in Finland it is 35,600 per 100,000 persons). Thus, given the prospect of improved diagnostics, there is an enormous potential for detecting yet more [fictitious] "excess" thyroid cancers. In a study in the US that was performed during the period of active screening in 1974-79, it was determined that the incidence rate of malignant and other thyroid nodules was greater by 21-fold than it had been in the pre-1974 period. [Source: Z. Jaworowski, 21st Century Science and Technology, vol. 11 (1998), issue 1, p. 14.]’
‘Professor Edward Lewis used data from four independent populations exposed to radiation to demonstrate that the incidence of leukemia was linearly related to the accumulated dose of radiation. ... Outspoken scientists, including Linus Pauling, used Lewis’s risk estimate to inform the public about the danger of nuclear fallout by estimating the number of leukemia deaths that would be caused by the test detonations. In May of 1957 Lewis’s analysis of the radiation-induced human leukemia data was published as a lead article in Science magazine. In June he presented it before the Joint Committee on Atomic Energy of the US Congress.’ – Abstract of thesis by Jennifer Caron, Edward Lewis and Radioactive Fallout: the Impact of Caltech Biologists Over Nuclear Weapons Testing in the 1950s and 60s, Caltech, January 2003.
Dr John F. Loutit of the Medical Research Council, Harwell, England, in 1962 wrote a book called Irradiation of Mice and Men (University of Chicago Press, Chicago and London), discrediting the pseudo-science from geneticist Edward Lewis on pages 61 and 78-79:
‘... Mole [R. H. Mole, Brit. J. Radiol., v32, p497, 1959] gave different groups of mice an integrated total of 1,000 r of X-rays over a period of 4 weeks. But the dose-rate - and therefore the radiation-free time between fractions - was varied from 81 r/hour intermittently to 1.3 r/hour continuously. The incidence of leukemia varied from 40 per cent (within 15 months of the start of irradiation) in the first group to 5 per cent in the last compared with 2 per cent incidence in irradiated controls. …
‘What Lewis did, and which I have not copied, was to include in his table another group - spontaneous incidence of leukemia (Brooklyn, N.Y.) - who are taken to have received only natural background radiation throughout life at the very low dose-rate of 0.1-0.2 rad per year: the best estimate is listed as 2 x 10-6 like the others in the table. But the value of 2 x 10-6 was not calculated from the data as for the other groups; it was merely adopted. By its adoption and multiplication with the average age in years of Brooklyners - 33.7 years and radiation dose per year of 0.1-0.2 rad - a mortality rate of 7 to 13 cases per million per year due to background radiation was deduced, or some 10-20 per cent of the observed rate of 65 cases per million per year. ...
‘All these points are very much against the basic hypothesis of Lewis of a linear relation of dose to leukemic effect irrespective of time. Unhappily it is not possible to claim for Lewis’s work as others have done, “It is now possible to calculate - within narrow limits - how many deaths from leukemia will result in any population from an increase in fall-out or other source of radiation” [Leading article in Science, vol. 125, p. 963, 1957]. This is just wishful journalese.
‘The burning questions to me are not what are the numbers of leukemia to be expected from atom bombs or radiotherapy, but what is to be expected from natural background .... Furthermore, to obtain estimates of these, I believe it is wrong to go to [1950s inaccurate, dose rate effect ignoring, data from] atom bombs, where the radiations are qualitatively different [i.e., including effects from neutrons] and, more important, the dose-rate outstandingly different.’
‘From the earlier studies of radiation-induced mutations, made with fruitflies [by Nobel Laureate Hermann J. Muller and other geneticists who worked on plants, who falsely hyped their insect and plant data as valid for mammals like humans during the June 1957 U.S. Congressional Hearings on fallout effects], it appeared that the number (or frequency) of mutations in a given population ... is proportional to the total dose ... More recent experiments with mice, however, have shown that these conclusions need to be revised, at least for mammals. [Mammals are biologically closer to humans, in respect to DNA repair mechanisms, than short-lived insects whose life cycles are too small to have forced the evolutionary development of advanced DNA repair mechanisms, unlike mammals that need to survive for decades before reproducing.] When exposed to X-rays or gamma rays, the mutation frequency in these animals has been found to be dependent on the exposure (or dose) rate ...
‘At an exposure rate of 0.009 roentgen per minute [0.54 R/hour], the total mutation frequency in female mice is indistinguishable from the spontaneous frequency. [Emphasis added.] There thus seems to be an exposure-rate threshold below which radiation-induced mutations are absent ... with adult female mice ... a delay of at least seven weeks between exposure to a substantial dose of radiation, either neutrons or gamma rays, and conception causes the mutation frequency in the offspring to drop almost to zero. ... recovery in the female members of the population would bring about a substantial reduction in the 'load' of mutations in subsequent generations.’
George Bernard Shaw cynically explains groupthink brainwashing bias:
‘We cannot help it because we are so constituted that we always believe finally what we wish to believe. The moment we want to believe something, we suddenly see all the arguments for it and become blind to the arguments against it. The moment we want to disbelieve anything we have previously believed, we suddenly discover not only that there is a mass of evidence against, but that this evidence was staring us in the face all the time.’
From the essay titled ‘What is Science?’ by Professor Richard P. Feynman, presented at the fifteenth annual meeting of the National Science Teachers Association, 1966 in New York City, and published in The Physics Teacher, vol. 7, issue 6, 1968, pp. 313-20:
‘... great religions are dissipated by following form without remembering the direct content of the teaching of the great leaders. In the same way, it is possible to follow form and call it science, but that is pseudo-science. In this way, we all suffer from the kind of tyranny we have today in the many institutions that have come under the influence of pseudoscientific advisers.
‘We have many studies in teaching, for example, in which people make observations, make lists, do statistics, and so on, but these do not thereby become established science, established knowledge. They are merely an imitative form of science analogous to the South Sea Islanders’ airfields - radio towers, etc., made out of wood. The islanders expect a great airplane to arrive. They even build wooden airplanes of the same shape as they see in the foreigners' airfields around them, but strangely enough, their wood planes do not fly. The result of this pseudoscientific imitation is to produce experts, which many of you are. ... you teachers, who are really teaching children at the bottom of the heap, can maybe doubt the experts. As a matter of fact, I can also define science another way: Science is the belief in the ignorance of experts.’
Richard P. Feynman, ‘This Unscientific Age’, in The Meaning of It All, Penguin Books, London, 1998, pages 106-9:
‘Now, I say if a man is absolutely honest and wants to protect the populace from the effects of radioactivity, which is what our scientific friends often say they are trying to do, then he should work on the biggest number, not on the smallest number, and he should try to point out that the [natural cosmic] radioactivity which is absorbed by living in the city of Denver is so much more serious [than the smaller doses from nuclear explosions] ... that all the people of Denver ought to move to lower altitudes.'
Feynman is not making a point about low level radiation effects, but about the politics of ignoring the massive natural background radiation dose while provoking hysteria over much smaller measured fallout pollution radiation doses. Why is the anti-nuclear lobby so concerned about banning nuclear energy (which is not possible even in principle, since most of our nuclear radiation is from the sun and from the supernova debris contaminating the Earth from the explosion that created the solar system circa 4,540 million years ago), when they could cause much bigger radiation dose reductions to the population by concentrating on the bigger radiation source, natural background radiation? It is possible to shield natural background radiation with air, e.g. by moving the population of high-altitude cities to lower altitudes where there is more air between the people and outer space, or by banning the use of high-altitude jet aircraft. The anti-nuclear lobby, as Feynman stated back in the 1960s, didn't crusade to reduce the bigger dose from background radiation. Instead they chose to argue against the much smaller doses from fallout pollution. Feynman's argument is still today falsely interpreted as a political statement, when it is actually exposing pseudo-science and countering political propaganda, and it is still ignored by the media. The same point was made by Senator Hickenlooper on page 1060 of the May-June 1957 U.S. Congressional Hearings before the Special Subcommittee on Radiation of the Joint Committee on Atomic Energy, The Nature of Radioactive Fallout and Its Effects on Man:
‘I presume all of us would earnestly hope that we never had to test atomic weapons ... but by the same token I presume that we want to save thousands of lives in this country every year and we could just abolish the manufacture of [road accident causing] automobiles ...’
Dihydrogen monoxide is a potentially very dangerous chemical containing hydrogen and oxygen which has caused numerous severe burns by scalding and deaths by drowning, contributes to the greenhouse effect, accelerates corrosion and rusting of many metals, and contributes to the erosion of our natural landscape: 'Dihydrogen monoxide (DHMO) is colorless, odorless, tasteless, and kills uncounted thousands of people every year. Most of these deaths are caused by accidental inhalation of DHMO, but the dangers of dihydrogen monoxide do not end there. Prolonged exposure to its solid form causes severe tissue damage. Symptoms of DHMO ingestion can include excessive sweating and urination, and possibly a bloated feeling, nausea, vomiting and body electrolyte imbalance. For those who have become dependent, DHMO withdrawal means certain death.'
Protein P53, discovered only in 1979, is encoded by gene TP53, which occurs on human chromosome 17. P53 also occurs in other mammals, including mice, rats and dogs. P53 is one of the proteins which continually repair breaks in DNA, which easily breaks at body temperature: the DNA in each cell of the human body suffers at least two single strand breaks every second, and at least one double strand (i.e. complete double helix) break every 2 hours (5% of radiation-induced DNA breaks are double strand breaks, while only 0.007% of spontaneous DNA breaks at body temperature are double strand breaks)! Cancer occurs when several breaks in DNA happen by chance to occur at nearly the same time, giving several loose strand ends at once, which repair proteins like P53 then reconnect incorrectly, causing a mutation which can be proliferated somatically. This cannot occur when only one break arises, because only two loose ends are produced, and P53 will reattach them correctly. But if low-LET ionising radiation levels are increased to a certain extent, causing more single strand breaks, P53 works faster and is able to deal with the breaks as they occur, so that multiple broken strand ends do not arise. This prevents DNA strands from being repaired incorrectly, and prevents cancer - a result of mutation caused by faults in DNA - from arising. Too much radiation, of course, overloads the P53 repair mechanism: it can then no longer repair breaks as they occur, so multiple breaks accumulate and loose ends of DNA are wrongly connected by P53, causing an increased cancer risk.
1. DNA-damaging free radicals are equivalent to a source of sparks which is always present naturally.
2. Cancer is equivalent to the fire you get if the sparks are allowed to ignite the gasoline, i.e. if the free radicals are allowed to damage DNA without the damage being repaired.
3. Protein P53 is equivalent to a fire suppression system which is constantly damping out the sparks, or repairing the damaged DNA so that cancer doesn’t occur.
In this way of thinking, the ‘cause’ of cancer will be down to a failure of a DNA repairing enzyme like protein P53 to repair the damage.
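The spontaneous break rates quoted above are mutually consistent: at two single strand breaks per second, a cell accumulates about 14,400 breaks in the 2 hours it takes for one double strand break to occur, which reproduces the 0.007% double strand fraction. A quick sketch, using only the figures from the text:

```python
# Consistency check on the DNA break-rate figures quoted above.
# Inputs (from the text): ~2 single-strand breaks per cell per second,
# ~1 spontaneous double-strand break per cell per 2 hours.

single_breaks_per_second = 2
seconds_per_two_hours = 2 * 60 * 60                # 7,200 s

single_breaks = single_breaks_per_second * seconds_per_two_hours  # 14,400
double_breaks = 1

# Fraction of spontaneous breaks that are double-strand:
fraction = double_breaks / (single_breaks + double_breaks)
print(f"Double-strand fraction: {fraction:.3%}")   # 0.007%, as quoted
```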
'For the mindset that engendered and enables this situation, which jeopardizes the existence of the United States as a nation as well as the lives of millions of its citizens, some American physicians and certain prestigious medical organizations bear a heavy responsibility.
Charles J. Hitch and Roland B. McKean of the RAND Corporation in their 1960 book The Economics of Defense in the Nuclear Age, Harvard University Press, Massachusetts, pp. 310-57:
‘With each side possessing only a small striking force, a small amount of cheating would give one side dominance over the other, and the incentive to cheat and prepare a preventative attack would be strong ... With each side possessing, say, several thousand missiles, a vast amount of cheating would be necessary to give one side the ability to wipe out the other’s striking capability. ... the more extensive a disarmament agreement is, the smaller the force that a violator would have to hide in order to achieve complete domination. Most obviously, “the abolition of the weapons necessary in a general or ‘unlimited’ war” would offer the most insuperable obstacles to an inspection plan, since the violator could gain an overwhelming advantage from the concealment of even a few weapons.’
Disarmament after World War I caused the following problem which led to World War II (reported by Winston S. Churchill in the London Daily Express newspaper of November 1, 1934):
‘Germany is arming secretly, illegally and rapidly. A reign of terror exists in Germany to keep secret the feverish and terrible preparations they are making.’
In her address to the United Nations General Assembly on disarmament on 23 June 1982, British Prime Minister Thatcher pointed out that in the years since the nuclear attacks on Hiroshima and Nagasaki, 10 million people had been killed in 140 non-nuclear conflicts:
‘The fundamental risk to peace is not the existence of weapons of particular types. It is the disposition on the part of some states to impose change on others by resorting to force against other nations ... Aggressors do not start wars because an adversary has built up his own strength. They start wars because they believe they can gain more by going to war than by remaining at peace.’
J. D. Culshaw, the then Director of the U.K. Home Office Scientific Advisory Branch, stated in his article in the Scientific Advisory Branch journal Fission Fragments, September 1972 (issue No. 19), classified 'Restricted':
'Apart from those who don't want to know or can't be bothered, there seem to be three major schools of thought about the nature of a possible Third World War ...
* 'The first group think of something like World War II but a little worse ...
* '... the second of World War II but very much worse ...
* 'and the third group think in terms of a catastrophe ...
'When the Armageddon concept is in favour, the suggestion that such problems exist leads to "way out" research on these phenomena, and it is sufficient to mention a new catastrophic threat [e.g., 10 years later this was done by Sagan with "nuclear winter" hype, which turned out to be fake because modern concrete cities can't produce firestorms like 1940s wooden-built areas of Hamburg, Dresden and Hiroshima] to stimulate research into the possibilities of it arising. The underlying appeal of this concept is that if one could show that the execution of all out nuclear, biological or chemical warfare would precipitate the end of the world, no one but a mad man would be prepared to initiate such a war. [However, as history proves, plenty of mad men end up gaining power and leading countries into wars.]'
J. K. S. Clayton, then Director of the U.K. Home Office Scientific Advisory Branch, stated in his introduction, entitled The Challenge - Why Home Defence?, to the 1977 Home Office Scientific Advisory Branch Training Manual for Scientific Advisers:
'Since 1945 we have had nine wars - in Korea, Malaysia and Vietnam, between China and India, China and Russia, India and Pakistan and between the Arabs and Israelis on three occasions. We have had confrontations between East and West over Berlin, Formosa and Cuba. There have been civil wars or rebellions in no less than eleven countries and invasions or threatened invasions of another five. Whilst it is not suggested that all these incidents could have resulted in major wars, they do indicate the aptitude of mankind to resort to a forceful solution of its problems, sometimes with success. ...'
It is estimated that Mongol invaders exterminated 35 million Chinese between 1311 and 1340, without modern weapons. The Communist Chinese killed 26.3 million dissenters between 1949 and May 1965, according to detailed data compiled by the Russians on 7 April 1969. The Soviet communist dictatorship killed 40 million dissenters, mainly owners of small farms, between 1917 and 1959. Conventional (non-nuclear) air raids on Japan killed 600,000 people during World War II. The single incendiary air raid on Tokyo on 10 March 1945 killed 140,000 people (more than the total for the nuclear bombs on Hiroshima and Nagasaki combined) at much less than the $2 billion expense of the Hiroshima and Nagasaki nuclear bombs! Non-nuclear air raids on Germany during World War II killed 593,000 civilians. The argument that the enemy will continue stockpiling megaton fallout weapons if we go to cleaner weapons is irrelevant to deterrence, since we are not planning to start a war, just to credibly deter invasions. You should not lower your standards of warfare to those of your enemy to appease groupthink taboos, or you will end up like Britain's leaders in the 1930s, trying to collaborate with fascists for popular applause.
Lord Hailsham of Saint Marylebone: ‘My Lords, if we are going into the question of lethality of weapons and seek thereby to isolate the nuclear as distinct from the so-called conventional range, is there not a danger that the public may think that Vimy, Passchendaele and Dresden were all right—sort of tea parties—and that nuclear war is something which in itself is unacceptable?’
Lord Trefgarne: ‘My Lords, the policy of making Europe, or the rest of the world, safe for conventional war is not one that I support.’
Mr. Bill Walker (Tayside, North): ‘I remind the House that more people died at Stalingrad than at Hiroshima or Nagasaki. Yet people talk about fighting a conventional war in Europe as if it were acceptable. One rarely sees demonstrations by the so-called peace movement against a conventional war in Europe, but it could be nothing but ghastly and horrendous. The casualties would certainly exceed those at Stalingrad, and that cannot be acceptable to anyone who wants peace’
On 29 October 1982, Thatcher stated of the Berlin Wall: ‘In every decade since the war the Soviet leaders have been reminded that their pitiless ideology only survives because it is maintained by force. But the day comes when the anger and frustration of the people is so great that force cannot contain it. Then the edifice cracks: the mortar crumbles ... one day, liberty will dawn on the other side of the wall.’
On 22 November 1990, she said: ‘Today, we have a Europe ... where the threat to our security from the overwhelming conventional forces of the Warsaw Pact has been removed; where the Berlin Wall has been torn down and the Cold War is at an end. These immense changes did not come about by chance. They have been achieved by strength and resolution in defence, and by a refusal ever to be intimidated.’
‘... peace cannot be guaranteed absolutely. Nobody can be certain, no matter what policies this or any other Government were to adopt, that the United Kingdom would never again be attacked. Also we cannot tell what form such an attack might take. Current strategic thinking suggests that if war were to break out it would start with a period of conventional hostilities of uncertain duration which might or might not escalate to nuclear conflict. ... while nuclear weapons exist there must always be a chance, however small, that they will be used against us [like gas bombs in World War II]. ... as a consequence of war between other nations in which we were not involved fall out from nuclear explosions could fall on a neutral Britain. ... conventional war is not the soft option that is sometimes suggested. It is also too easily forgotten that in World War II some 50 million people died and that conventional weapons have gone on killing people ever since 1945 without respite.’ - - The Minister of State, Scottish Office (Lord Gray of Contin), House of Lords debate on Civil Defence (General Local Authority Functions) Regulations, Hansard, vol. 444, cc. 523-49, 1 November 1983.
‘All of us are living in the light and warmth of a huge hydrogen bomb, 860,000 miles across and 93 million miles away, which is in a state of continuous explosion.’ - Dr Isaac Asimov.
‘Dr Edward Teller remarked recently that the origin of the earth was somewhat like the explosion of the atomic bomb...’ – Dr Harold C. Urey, The Planets: Their Origin and Development, Yale University Press, New Haven, 1952, p. ix.
‘But compared with a supernova a hydrogen bomb is the merest trifle. For a supernova is equal in violence to about a million million million million hydrogen bombs all going off at the same time.’ – Sir Fred Hoyle (1915-2001), The Nature of the Universe, Pelican Books, London, 1963, p. 75.
‘In fact, physicists find plenty of interesting and novel physics in the environment of a nuclear explosion. Some of the physical phenomena are valuable objects of research, and promise to provide further understanding of nature.’ – Dr Harold L. Brode, The RAND Corporation, ‘Review of Nuclear Weapons Effects,’ Annual Review of Nuclear Science, Volume 18, 1968, pp. 153-202.
Dr Paul K. Kuroda (1917-2001) in 1956 correctly predicted the existence of water-moderated natural nuclear reactors in flooded uranium ore seams, which were discovered in 1972 by French physicist Francis Perrin in three ore deposits at Oklo in Gabon, where sixteen sites operated as natural nuclear reactors with self-sustaining nuclear fission 2,000 million years ago, each lasting several hundred thousand years, averaging 100 kW. The radioactive waste they generated remained in situ for a period of 2,000,000,000 years without escaping. They were discovered during investigations into why the U-235 content of the uranium in the ore was only 0.7171% instead of the normal 0.7202%. Some of the ore, in the middle of the natural reactors, had a U-235 isotopic abundance of just 0.440%. Kuroda's brilliant paper is entitled, 'On the Nuclear Physical Stability of the Uranium Minerals', published in the Journal of Chemical Physics, vol. 25 (1956), pp. 781–782 and 1295–1296.
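A back-of-envelope decay calculation shows why self-sustaining fission was possible 2,000 million years ago: natural uranium then had roughly the U-235 content of modern enriched reactor fuel. This sketch assumes the standard half-lives of 704 million years for U-235 and 4,468 million years for U-238, figures not given in the text:

```python
# Estimate the U-235 atom fraction of natural uranium 2,000 Myr ago,
# by decaying both isotopes backwards in time: N(-t) = N_now * 2**(t/half-life).

t = 2000.0                        # time before present, Myr
half_u235, half_u238 = 704.0, 4468.0   # assumed half-lives, Myr
f_now = 0.0072                    # present-day U-235 atom fraction

ratio_now = f_now / (1 - f_now)   # 235/238 atom ratio today
ratio_then = ratio_now * 2**(t / half_u235) / 2**(t / half_u238)
f_then = ratio_then / (1 + ratio_then)
print(f"U-235 abundance 2,000 Myr ago: {f_then:.1%}")   # roughly 3.7%
```

An abundance of a few percent is comparable to the enrichment of fuel in modern water-moderated power reactors, which is why ordinary groundwater could moderate the Oklo ore seams into criticality.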
A type Ia supernova explosion, always yielding about 4 × 10^28 megatons of TNT equivalent, results from the critical mass effect of the collapse of a white dwarf as soon as its mass exceeds 1.4 solar masses due to matter falling in from a companion star. The degenerate electron gas in the white dwarf is then no longer able to support the pressure from the weight of gas, which collapses, thereby releasing enough gravitational potential energy as heat and pressure to cause the fusion of carbon and oxygen into heavy elements, creating massive amounts of radioactive nuclides, particularly intensely radioactive nickel-56; about half of all other heavy nuclides (including uranium and heavier) are also produced by the 'R' (rapid) process of successive neutron captures by fusion products in supernova explosions. Type Ia supernovae occur typically every 400 years in the Milky Way galaxy. On 4 July 1054, Chinese astronomers observed in the sky (without optical instruments) the bright supernova in the constellation Taurus which today is still visible as the Crab Nebula through telescopes. The Crab Nebula debris now has a diameter of 7 light years and is still expanding at 800 miles/second. The supernova debris shock wave triggers star formation when it encounters hydrogen gas in space, compressing it and seeding it with debris; bright stars are observed in the Orion Halo, the 300 light year diameter remains of a supernova. It is estimated that when the solar system was forming 4,540 million years ago, a supernova occurred around 100 light years away, and its heavy radioactive debris shock wave expanded at 1,000 miles/second. Most of the heavy elements, including iron, silicon and calcium, in the Earth and in people are the stable end products of originally radioactive decay chains from the space burst fallout of a 7 × 10^26 megaton thermonuclear explosion, created by fusion and successive neutron captures after the implosion of a white dwarf: a supernova explosion.
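As a rough cross-check, the quoted type Ia yield can be converted to joules (assuming the standard definition of 1 megaton of TNT = 4.184 × 10^15 J, which is not stated in the text), giving about 10^44 J, the order of magnitude usually attributed to a type Ia explosion:

```python
# Convert the quoted type Ia supernova yield to joules.
MT_TO_J = 4.184e15        # assumed: joules per megaton of TNT
yield_mt = 4e28           # quoted type Ia yield, megatons of TNT

energy_j = yield_mt * MT_TO_J
print(f"Type Ia energy: {energy_j:.2e} J")   # ~1.7e44 J
```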
How would a 10^55 megaton hydrogen bomb explosion differ from the big bang? Ignorant answers biased in favour of curved spacetime (ignoring quantum gravity!) abound, such as claims that explosions can’t take place in ‘outer space’ (disagreeing with the facts from nuclear space bursts by Russia and America in 1962, not to mention natural supernova explosions in space!) and that explosions produce sound waves in air by definition! There are indeed major differences in the nuclear reactions between the big bang and a nuclear bomb. But it is helpful to notice the solid physical fact that implosion systems suggest the mechanism of gravitation: in implosion, TNT is well-known to produce an inward force on a bomb core, but Newton's 3rd law says there is an equal and opposite reaction force outward. In fact, you can’t have a radially outward force without an inward reaction force! It’s the rocket principle. The rocket accelerates (with force F = ma) forward by virtue of the recoil from accelerating the exhaust gas (with force F = -ma) in the opposite direction! Nothing massive accelerates without an equal and opposite reaction force. Applying this fact to the measured cosmological acceleration of matter radially outward from observers in the universe, 6 × 10^-10 m/s^2 ~ Hc, which was predicted accurately in 1996 and later observationally discovered in 1999 (by Perlmutter, et al.), we find an outward force F = ma and an inward reaction force by the 3rd law. The inward force allows quantitative predictions, and is mediated by gravitons, predicting gravitation in a checkable way (unlike string theory, which is just a landscape of 10^500 different perturbative theories and so can’t make any falsifiable predictions about gravity).
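The quoted acceleration can be checked against the product Hc. This sketch assumes a Hubble parameter of 70 km/s/Mpc and 1 Mpc = 3.086 × 10^22 m, values not given in the text:

```python
# Consistency check on the quoted cosmological acceleration a ~ Hc.
H0 = 70e3 / 3.086e22      # assumed Hubble parameter, converted to 1/s (~2.27e-18)
c = 2.998e8               # speed of light, m/s

a = H0 * c
print(f"a = Hc = {a:.1e} m/s^2")   # ~6.8e-10 m/s^2, close to the quoted 6e-10
```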
So it seems as if nuclear explosions do indeed provide helpful analogies to natural features of the world, and the mainstream lambda-CDM model of cosmology - with its force-fitted unobserved ad hoc speculative ‘dark energy’ - ignores and sweeps under the rug major quantum gravity effects which increase the physical understanding of particle physics, particularly force unification and the relation of gravitation to the existing electroweak SU(2) x U(1) section of the Standard Model of fundamental forces.
Even Einstein grasped the possibility that general relativity's lambda-CDM model is at best just a classical approximation to quantum field theory, at the end of his life when he wrote to Besso in 1954:
‘I consider it quite possible that physics cannot be based on the [classical differential equation] field principle, i.e., on continuous structures. In that case, nothing remains of my entire castle in the air, [non-quantum] gravitation theory included ...’
‘Science is the organized skepticism in the reliability of expert opinion.’ - Professor Richard P. Feynman (quoted by Professor Lee Smolin, The Trouble with Physics, Houghton-Mifflin, New York, 2006, p. 307).
‘The expression of dissenting views may not seem like much of a threat to a powerful organization, yet sometimes it triggers an amazingly hostile response. The reason is that a single dissenter can puncture an illusion of unanimity. ... Among those suppressed have been the engineers who tried to point out problems with the Challenger space shuttle that caused it to blow up. More fundamentally, suppression is a denial of the open dialogue and debate that are the foundation of a free society. Even worse than the silencing of dissidents is the chilling effect such practices have on others. For every individual who speaks out, numerous others decide to play it safe and keep quiet. More serious than external censorship is the problem of self-censorship.’
— Professor Brian Martin, University of Wollongong, 'Stamping Out Dissent', Newsweek, 26 April 1993, pp. 49-50
In 1896, Sir James Mackenzie-Davidson asked Wilhelm Röntgen, who discovered X-rays in 1895: ‘What did you think?’ Röntgen replied: ‘I did not think, I investigated.’ The reason? Cathode ray expert J. J. Thomson had seen glass fluorescing far from a tube in 1894, but due to prejudice (expert opinion) he avoided investigating that X-ray evidence!
Mathematical symbols in this blog: your computer’s browser needs access to standard character symbol sets to display the Greek symbols used in mathematical physics. If you don’t have the symbol character sets installed, the density symbol ‘ρ’ (rho) will appear as ‘r’ and the ‘π’ (pi) symbol will appear as ‘p’, causing confusion with the use of ‘r’ for radius and ‘p’ for momentum in formulae. This problem exists with Mozilla Firefox 3, but not with Microsoft Internet Explorer, which displays the Greek symbols correctly.
Mean yield of the 5,192 nuclear warheads and bombs in the deployed Russian nuclear stockpile as of January 2009: 0.317 Mt. Total yield: 1,646 Mt.
Mean yield of the 4,552 nuclear warheads and bombs in the deployed U.S. nuclear stockpile as of January 2007: 0.257 Mt. Total yield: 1,172 Mt.
For diffraction damage, where damaged areas scale as the two-thirds power of explosive yield, this stockpile's area damage potential can be compared to that of the 20,000,000 conventional bombs of 100 kg size (2 megatons of TNT equivalent total energy) dropped on Germany during World War II: (total nuclear bomb blast diffraction damaged ground area)/(total conventional blast diffraction damaged ground area to Germany during World War II) = [4,552 × (0.257 Mt)^(2/3)]/[20,000,000 × (0.0000001 Mt)^(2/3)] = 1,840/431 = 4.3. Thus, although the entire U.S. stockpile has a TNT energy equivalent to 586 times that of the 2 megatons of conventional bombs dropped on Germany in World War II, it is only capable of causing 4.3 times as much diffraction type damage area, because any given amount of explosive energy is far more efficient when distributed over many small explosions than in a single large explosion! Large explosions are inefficient because they cause unintended collateral damage, wasting energy off the target area and injuring or damaging unintended targets!
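The area-versus-energy comparison above can be reproduced directly from the stockpile figures:

```python
# Diffraction-damage comparison: damaged area scales as yield**(2/3).
warheads, mean_yield_mt = 4552, 0.257      # U.S. stockpile, January 2007
bombs, bomb_yield_mt = 20_000_000, 1e-7    # WWII on Germany: 100 kg = 1e-7 Mt each

nuclear = warheads * mean_yield_mt ** (2 / 3)       # ~1,840
conventional = bombs * bomb_yield_mt ** (2 / 3)     # ~431
print(f"Area ratio: {nuclear:.0f}/{conventional:.0f} = {nuclear / conventional:.1f}")

# Energy ratio, by contrast (~585; the text rounds the stockpile
# total to 1,172 Mt, giving 586):
print(f"Energy ratio: {warheads * mean_yield_mt / (bombs * bomb_yield_mt):.0f}")
```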
Over a 40-year period, in a monitored sample of 36,500 survivors, there were 176 leukemia deaths, which is 89 more than the unexposed control group got naturally (data: Radiation Research, vol. 146, 1996, pp. 1-27). There were 4,687 other cancer deaths, but that was merely 339 above the number in the control group, statistically a much smaller relative rise than the leukemia result. Natural leukemia rates, which are very low in any case, were increased by 51% in the irradiated survivors, but other cancers were increased by just 7%. Adding all the cancers together, the total was 4,863 cancer deaths (virtually all natural cancer, nothing whatsoever to do with radiation), which is just 428 more than in the unexposed control group. Hence the total increase over the natural cancer rate due to bomb exposure was only 9%, spread over a period of 40 years. There was no increase whatsoever in genetic malformations.
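The quoted percentages follow from the raw counts; note that they express the excess deaths as a fraction of the totals observed in the exposed group:

```python
# Re-deriving the percentage increases from the raw survivor counts above.
leukemia_obs, leukemia_excess = 176, 89       # leukemia deaths; excess over control
other_obs, other_excess = 4687, 339           # other cancer deaths; excess
total_obs, total_excess = 176 + 4687, 89 + 339    # 4,863 and 428

print(f"Leukemia: {leukemia_excess / leukemia_obs:.0%}")   # 51%
print(f"Other cancers: {other_excess / other_obs:.0%}")    # 7%
print(f"All cancers: {total_excess / total_obs:.0%}")      # 9%
```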
‘If defense is neglected these weapons of attack become effective. They become available and desirable in the eyes of an imperialist dictator, even if his means are limited. Weapons of mass destruction could become equalizers between nations big and small, highly developed and primitive, if defense is neglected. If defense is developed and if it is made available for general prevention of war, weapons of aggression will become less desirable. Thus defense makes war itself less probable. ... One psychological defense mechanism against danger is to forget about it. This attitude is as common as it is disastrous. It may turn a limited danger into a fatal difficulty.’
Advice of Robert Watson-Watt (Chief Scientist on the World War II British Radar Project, defending Britain against enemy attacks): ‘Give them the third best to go on with, the second best comes too late, the best never comes.’