Saturday, April 15, 2023

The Girl With the War-Winning Hair

Every day millions of Americans carefully wash, sort, and set out their recycling for collection. But while many might feel proud to be doing their bit to help save the environment, such efforts are minuscule next to the gargantuan recycling effort that accompanied the Second World War. The term “total war” refers to a state in which every facet of a nation’s economy and public life is committed to the prosecution of said war, and no conflict in human history more fully embodied this ethos than WWII. Almost overnight, nearly every commodity imaginable became a strategic material feeding the hungry war machine. Household goods like gasoline, cloth, and staple foods were heavily rationed, scrap metal drives scoured cities and towns for everything from junked cars to toothpaste tubes, housewives collected kitchen drippings to be turned into glycerine for explosives, and in Canada coloured ink became so scarce that comic book publishers were forced to print the now highly-collectible “Canadian Whites.” But few citizens could claim to have possessed a stranger strategic material than Miss Mary Babnik of Pueblo, Colorado.

Born Mitzi Babnik in 1907 to Slovenian immigrant parents, Mary Babnik was famous for her long blonde hair, which by the 1940s had grown to a length of 34 inches or about 86 centimetres, reaching down to her knees. She typically wore it in a long braid wrapped around her head, earning her the nickname “The Lady with the Crown.” In 1943 Mary was already contributing fully to the war effort, working by day at the National Broom Factory and teaching airmen from the local Air Force base to dance every evening as a USO volunteer.

But when Mary’s brothers were barred from military service on medical grounds, she began to feel that even this wasn’t enough:

“Both of my brothers were deferred and couldn’t go. I was thinking of all those other boys and their families, the ones who had to go. I saw so many people crying their eyes out, not wanting their sons to go. I was sad. I wanted to do something for the war effort.”

Thus, when she saw an advertisement in a local paper calling for blonde, undamaged hair at least 22 inches or 56 centimetres in length, she immediately replied. In November she was contacted by the Washington Institute of Technology, who asked her for a sample. Mary’s hair, which had never been cut, curled, straightened, or washed with anything but natural soap, was exactly what the WIT was looking for, and in 1944 she agreed to have it cut. Though the Government offered her compensation in war savings stamps, Mary refused, considering it her patriotic duty to contribute to the war effort. Nevertheless, the loss of her defining characteristic proved traumatic.

“After I did it I cried and cried. I went to my mother and said, ‘Mama, why did you let me cut my hair?’ It was two months before I went anywhere except to work. After two months, I got used to it. But at first I was so ashamed I wore a bandana to work so people wouldn’t ask me about it.”

Over the years a myth has emerged claiming that Mary Babnik’s hair was used to make the crosshairs in the Norden Bombsight carried aboard American B-17, B-24, and B-29 bomber aircraft. However, this is impossible, as the crosshairs in the Norden are not a separate component but rather etched into the glass of one of the sighting lenses.

So what was it actually used for? While the details are not fully clear, it appears Mary’s hair was used in the manufacture of precision hygrometers for measuring atmospheric humidity – measurements vital to the production of certain aircraft components and countless other war materials where accurate humidity control was essential, from the first nuclear weapons to intercontinental ballistic missiles and more.

Despite her initial regret, Mary Babnik soon came to view her curious contribution to the war effort with pride, claiming in a 1990 interview that she would “do it all again.” In 1987 President Ronald Reagan sent her a birthday greeting thanking her for her wartime service, while in 1990 she was presented with a special achievement award from the Colorado Aviation Historical Society. Mary Babnik died in 1991 at the age of 84.


Bonus Fact

The Norden Bombsight – the device that Mary Babnik’s hair was erroneously believed to have been used in – was one of the most closely-guarded secrets of the Second World War. First developed by Dutch-American engineer Carl Norden in the late 1920s, the device was extensively used aboard B-17, B-24, and B-29 bombers throughout the war.

Unlike what is typically depicted in movies, the Norden was not merely a simple telescope and crosshairs for aiming but rather a highly-sophisticated mechanical computer and autopilot that kept the aircraft on a steady course, constantly re-calculated the bombs’ point of impact based on changing flight conditions, and automatically dropped the bombs when the aircraft arrived over the target. In fact, the Norden bombsight is best thought of not as one single device but four.

The first component of the Norden bombsight was the inertial platform, a set of two gyroscopes that kept the sight stable and level relative to the ground regardless of how the aircraft moved around it. The second component was the sighting eyepiece, which looked not straight down but through a motorized prism that gave a view of the target ahead. By adjusting the speed of rotation of the prism so that the target remained fixed in the crosshairs, the bombardier could effectively calculate the groundspeed and the position of the target relative to the aircraft. The sight could then calculate when the aircraft had arrived over the release point and automatically drop the bombs.

However, the fall of the bombs was affected by a number of other factors, including altitude, air temperature, and wind direction and velocity; therefore the bombardier had to use the sight’s third component, a mechanical computer, to compensate for these. Throughout the bomb run, he would constantly adjust these values by trial and error in order to keep the target centred in the crosshairs. While early versions of the Norden included a device that signalled course corrections to the pilot to keep the aircraft on the correct heading, the finalized Mk. XV model used throughout WWII incorporated a fourth component, an autopilot, to fly the aircraft throughout the bomb run. Thus, on the approach to the target the plane would be flown not by the pilot but rather by the bombsight and the bombardier, whose constant wind speed, altitude, and heading corrections would automatically adjust the aircraft’s course.
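
To make the core timing problem concrete, here is a minimal, drag-free sketch in Python of the release-point calculation the sight’s mechanical computer solved continuously. It deliberately ignores air resistance, temperature, and wind – exactly the factors the Norden’s third component existed to correct for – so the numbers are purely illustrative, not taken from the Norden’s actual ballistics tables.

```python
import math

# A deliberately simplified, drag-free sketch of the core problem the sight's
# mechanical computer solved continuously: given altitude and groundspeed,
# how far short of the target must the bombs be released?

G = 9.81  # gravitational acceleration in m/s^2

def release_range(altitude_m: float, groundspeed_ms: float) -> float:
    """Horizontal distance before the target at which to release (vacuum approximation)."""
    fall_time = math.sqrt(2 * altitude_m / G)   # time for a bomb to fall from this altitude
    return groundspeed_ms * fall_time           # forward travel of the bomb during that fall

# Example: roughly 30,000 ft (~9,144 m) altitude at ~225 mph (~100 m/s) over the ground
print(round(release_range(9144, 100)), "metres before the target")  # ~4318 metres
```

In essence, the sight tracked the shrinking ground distance to the target and triggered the release once it fell to this value – with the real mechanism continuously correcting that figure for drag, temperature, and wind.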

In prewar testing the Norden displayed phenomenal accuracy, with a Circular Error Probable or CEP – the radius of the circle in which half the bombs could be expected to fall – of only 75 feet. This performance informed the American doctrine of daylight precision bombing, which held that military targets such as factories or marshalling yards could be hit from high altitude with minimal collateral damage – even if said targets were located within built-up civilian areas. Or, as US aircrew famously put it, that they could drop a bomb into a pickle barrel from 30,000 feet. This accuracy also theoretically allowed Navy aircraft to attack fleets of enemy ships at sea via high-altitude level bombing. The Norden was considered so vital to US air power that its design and production was given top secret status, and bombardiers were made to swear an oath to destroy their bombsight before bailing out of a stricken aircraft – either by heaving it overboard or emptying their service pistols into the mechanism.

Yet despite this vaunted reputation as a top-secret war-winning weapon, under actual combat conditions the Norden’s performance proved decidedly lacklustre, its CEP growing to over 1200 feet – about the same as far simpler British and German bombsights. Aircrew flying daylight raids also ran into the same problem faced by the British earlier in the war, namely that flying straight and level over a target for minutes on end tended to make bombers extremely vulnerable to enemy fighters and antiaircraft fire. Who knew? While high casualties had forced the British to switch to night raids and area bombing whereby entire cities rather than individual facilities were targeted, the USAAF persisted with daylight raids, instead developing new tactics to improve bombing accuracy and aircraft survivability. These included the combat box – a special flight formation in which bomber gunners could better defend each other against fighter attack – and the lead bomber tactic, in which only a single aircraft would use its Norden to find the target, with the other bombers in the formation dropping their bombs on its command. Regardless, true precision bombing proved almost impossible to achieve, and the USAAF increasingly began adopting less discriminate area bombing tactics. Meanwhile, the Navy largely abandoned its Norden bombsights and embraced dive bombing and skip bombing to more accurately attack enemy ships.
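
To put those two CEP figures in perspective, here is a quick back-of-the-envelope sketch, assuming bomb impacts follow the circular normal (Rayleigh) model commonly paired with CEP; the 50-foot target radius is an arbitrary illustration, not a figure from the sources.

```python
# Back-of-the-envelope hit probabilities, assuming bomb impacts follow the
# circular normal (Rayleigh) model and treating CEP as the radius containing
# half of all impacts. The 50 ft target radius is an arbitrary illustration.

def hit_probability(target_radius_ft: float, cep_ft: float) -> float:
    """Chance that a single bomb lands within target_radius_ft of the aim point."""
    return 1 - 0.5 ** ((target_radius_ft / cep_ft) ** 2)

for cep in (75, 1200):  # prewar test figure vs. typical combat figure
    print(f"CEP {cep:>4} ft: {hit_probability(50, cep):.1%} chance per bomb")
# CEP   75 ft: 26.5% chance per bomb
# CEP 1200 ft: 0.1% chance per bomb
```

Under these assumptions, a single bomb had roughly a one-in-four chance of landing within 50 feet of the aim point at the prewar figure, but only about one chance in a thousand at the combat figure – which goes a long way toward explaining why entire formations simply dropped en masse on the lead bombardier’s cue.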

Despite extensive attempts to keep its design a secret, details of the Norden’s operation did fall into German hands through both espionage and crashed aircraft. However, little attempt was made to reverse-engineer it due to what the Germans saw as its unnecessary complexity. And despite its failure to live up to expectations, the Norden was the best the US military had and served through the rest of the war – being used to drop both atomic bombs – and soldiered on through Korea and Vietnam, its last use occurring in 1967 when it was used to drop electronic sensors onto the Ho Chi Minh Trail.

Expand for References

A.F. Lauds Woman Who Gave Hair, Deseret News, November 19, 1990, https://www.deseret.com/1990/11/19/18891930/a-f-lauds-woman-who-gave-hair-br

Mary Babnick Brown, Pueblo County, Colorado, https://www.kmitch.com/Pueblo/bios0094.html

Adams, Doug, The Blonde and the Bomber: The Hair That Whipped Hitler, Life in the Delta, February 2011 https://web.archive.org/web/20140103223130/http://digitalpublication.lifeinthedelta.com/display_article.php?id=635505

Woman’s Locks Key to Sights, Star News, November 22, 1989, https://news.google.com/newspapers?id=nbMsAAAAIBAJ&pg=6557,2626572

The Politics, Pickle Barrels, and Propaganda of the Norden Bombsight, Museum of Aviation Foundation, April 23, 2016, https://ift.tt/zRpcsMD

 

Tillman, Barrett, Norden Bombsight: The Pickle Barrel War, Flight Journal Magazine, Winter 2001




from Today I Found Out
by Gilles Messier - April 14, 2023 at 07:59PM

The Gruesome Tale of the Laughing Death Epidemic

The symptoms were gradual but inexorable. It began with headaches, joint pain and tremors in the hands and feet, mild at first but growing steadily in intensity. The victims’ movements became increasingly uncoordinated and clumsy, their stance and gait unsteady. Soon they were unable to walk at all, racked by severe tremors and muscle spasms. And then came the most alarming symptoms of all, as victims began bursting into tears or uncontrollable laughter without provocation. Confusion, delirium, and paralysis followed, until they were unable to move, speak, eat, or make eye contact. Finally, 12 months after the onset of symptoms, came the inevitable arrival of death. This was Kuru, a terrifying and incurable neurodegenerative disease that for nearly 100 years terrorized the Fore [“four-ay”] people of Papua New Guinea. For more than half a century the origins of Kuru remained a mystery, until in the 1960s groundbreaking research revealed it to be caused not by a bacterium, virus, parasite, or even a fungus, but something far stranger.

In 1914, in the early stages of the First World War, the Australian army invaded and occupied the Imperial German colony of New Guinea. What eventually became known as Papua New Guinea would remain under Australian control for another six decades, and over the following decades missionaries and Colonial Patrol Officers or “Kiaps” penetrated deeper and deeper into the interior in an attempt to bring Christianity and western legal, administrative, and medical practices to the indigenous peoples of the island. In the early 1950s, reports began filtering back from Colonial officials of a strange and incurable disease plaguing the indigenous people of the Eastern Highlands, including the Fore and neighbouring Awa, Yate [“yatt-ay”], and Usurfa. Known to the Fore as kuru, from the word kuria meaning “to shake” or “to tremble,” the disease caused a progressive loss of muscular control and invariably death. Kuru was also known as “nagi-nagi” or “laughing sickness” after the tendency of victims to burst into spontaneous laughter in the latter stages of the disease. Thought to have originated sometime in the early 1900s, by the 1950s Kuru had grown into a full-blown epidemic, killing around 200 Fore – or 1% of the population – every year.

The first westerner to describe Kuru was Colonial patrol officer Arthur Carey, who observed in a 1951 report that the disease disproportionately affected Fore women and children over the men. The disease was also described by anthropologists Ronald and Catherine Berndt in 1952 and patrol officer John McArthur in 1953. However, all dismissed the epidemic as a purely psychosomatic phenomenon, a form of mass hysteria deriving from Fore beliefs in witchcraft and spirit possession. Indeed, the Fore themselves initially believed Kuru to be caused by malicious sorcerers from rival groups, who would acquire parts of the victim’s body such as hair or nail clippings and combine them with clothing, leaves, blood, and other materials to form a “kuru bundle.” The sorcerer would then shake the bundle daily until the telltale tremors of kuru were induced in the intended victim.

However, American virologist Daniel Gajdusek and physician Vincent Zigas strongly doubted that Kuru could be purely psychological, writing to Ronald Berndt that:

“…our current opinion [is] that fatal kuru cannot by any stretch of the imagination be identified with hysteria, psychoses or any known psychologically induced illnesses. The evidence for direct nervous system damage is far too great in the strabismus [drooping or crossed eyes], and pictures of advanced neurological disease shown by the advanced cases.”

Consequently, in 1957 Gajdusek and Zigas launched the first proper scientific study of Kuru. Their analysis,  published in the Medical Journal of Australia, suggested that Kuru might be genetic in origin, passed down along family lines. However, confirmation of this hypothesis would require closer study of kinship among the Fore. So in 1961, medical researcher Michael Alpers and anthropologist Shirley Lindenbaum, using a research grant from the Rockefeller Foundation, travelled to the Eastern Highlands of Papua New Guinea to conduct a thorough anthropological and epidemiological study of the Fore.

Over the next two years, Alpers and Lindenbaum discovered that kinship among the Fore was not strictly based on biological relatedness, but rather a complex system of social association and bonding with neighbouring individuals. Kuru, they found, spread mainly along kinship lines and not strictly between biologically-related individuals, making it unlikely that it was transmitted genetically. Epidemiological data further ruled out transmission by air, insects, or contaminated water. So, just how was it transmitted? The unexpected answer would stun the scientific community and open a strange new chapter in the study of human disease: Kuru, Alpers and Lindenbaum discovered, was spread through cannibalism.

At the time, the Fore People practiced a form of what is known as mortuary cannibalism, ritually consuming the bodies of deceased family members as a means of honouring the dead and returning their spirit or “life force” to the community. Bodies would be buried for several days until they were infested with maggots before being exhumed and dismembered, cooked, and eaten communally, the maggots being served as a side dish. And while the Fore avoided the bodies of those who had died of dysentery, leprosy, and other diseases, those who succumbed to Kuru were still regularly consumed.

The more Alpers and Lindenbaum observed, the more convinced they became that this practice was at the heart of the Kuru epidemic. For example, preparation of the bodies was performed almost exclusively by women, who were also far more likely to engage in cannibalism than men. There were two main reasons for this: first, Male Fore believed that eating human flesh would weaken them in times of war. Second, women’s bodies were thought to be better able to tame the absorbed spirits of the dead. More telling still, the brain – the organ most affected by Kuru – was also largely consumed by women and children, while men, when they did engage in cannibalism, preferred to eat the bodies of other men. Together, these facts seemed to explain why Kuru affected predominantly women and children, with only around 3% of Fore men succumbing to the disease.

Yet despite this compelling evidence for Kuru’s origins and transmission route, scientists still did not know what actually caused the disease. In order to find out, in 1968 Alpers collected brain tissue samples from the body of an 11-year-old Fore girl who had died of Kuru and delivered them to Daniel Gajdusek at the National Institutes of Health. Along with collaborator Joe Gibbs, Gajdusek ground up the samples and injected them into the brains of chimpanzees. Within two years the apes began showing the telltale signs of kuru: severe tremors, gradual loss of muscular control and cognitive function, and finally death. When Gajdusek and Gibbs autopsied the animals, they found that their brains – particularly the cerebellum, which controls muscular coordination – were riddled with millions of microscopic holes, giving them the appearance of a sponge or Swiss cheese under the microscope. This experiment not only confirmed that Kuru was spread via infected tissue and that it directly attacked the brain, but was also an extremely rare instance of a disease jumping from one species to another in a laboratory setting and the first known case of a contagious neurodegenerative disorder in humans. But while this research would earn Gajdusek the Nobel Prize in Physiology or Medicine in 1976, he was ultimately unable to isolate the actual pathogen responsible for Kuru. Experiment after experiment ruled out bacteria, viruses, fungi, and protists, leaving the actual causative agent a baffling mystery.

The mystery of Kuru would not be solved until the late 1960s and 1970s, when Gajdusek, along with British researchers E.J. Field, Tikvah Alper, and John Griffith, noted striking similarities between Kuru and the ailments scrapie and Creutzfeldt-Jakob [“kroyts-felt yah-cob”] disease or CJD. Scrapie, first described in the 18th Century, is a disease affecting mainly sheep and goats, named for the itching sensation that causes afflicted animals to scrape their skin raw on trees and fences; while CJD is a rare human neurodegenerative disorder first described in the 1920s. Like Kuru, both are untreatable and invariably fatal. In a series of groundbreaking experiments, Alper and her colleagues demonstrated that scrapie could be transmitted via infected brain tissue which had been completely sterilized, and that the causative agent, whatever it was, was resistant to destruction by heating, ultraviolet light, and ionizing radiation. This led Griffith to a radical conclusion: that these diseases were caused not by conventional pathogens, but something far stranger: rogue misfolded proteins, later dubbed prions.

Unlike bacteria, protists, fungi or – depending on which side of a particularly heated scientific debate you happen to stand – viruses, prions – a contraction of proteinaceous infectious particles – are not actually alive. Rather they are proteins, of the same types that make up the bodies of healthy humans and animals. However, prions are the Mr. Hyde to the original protein’s Dr. Jekyll, a misshapen, deformed variant incapable of performing its original biological function. When a prion enters the body, it latches on to another, normal version of the same protein, causing it to transform into the same misfolded form as itself. This in turn triggers a deadly chain reaction, gradually transforming and destroying the entirety of the host tissue. Though prion diseases are typically acquired from other infected hosts, they can occur spontaneously when normal proteins misfold into a malignant form, triggering the same transformational chain reaction. Indeed, it is now believed that Kuru in New Guinea originated sometime in the early 1900s when an unknown individual spontaneously developed the disease, which then gradually spread through the Fore population via mortuary cannibalism.
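
For readers who like to see the arithmetic of that chain reaction, here is a deliberately crude toy model in Python – not a biochemical simulation, and every number in it is arbitrary – that simply illustrates why a single misfolded protein converting its neighbours leads to runaway, exponential spread through the whole pool.

```python
# A crude toy model of the chain reaction described above: each misfolded
# protein converts, on average, a fixed number of normal copies per "round".
# Every number here is arbitrary; this is not a biochemical simulation, just
# an illustration of why one bad copy eventually flips the entire pool.

def prion_cascade(normal: int = 1_000_000, misfolded: int = 1,
                  conversions_per_round: float = 0.5) -> int:
    """Return the number of rounds until no normal protein copies remain."""
    rounds = 0
    while normal > 0:
        converted = min(normal, max(1, int(misfolded * conversions_per_round)))
        normal -= converted      # normal copies lost this round
        misfolded += converted   # each convert becomes a new converter
        rounds += 1
    return rounds

print(prion_cascade(), "rounds to convert the whole pool")  # a few dozen rounds
```

Even starting from a single bad copy, the entire pool of a million proteins flips in a few dozen conversion rounds in this toy model – the essential logic behind the slow-burning but relentless progression of prion diseases.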

The sponge-like holes the prions inflict in the victim’s brain tissue are what give Kuru, CJD, scrapie, and similar diseases their collective name: transmissible spongiform encephalopathy. If that name sounds familiar, it might be because prions are also responsible for bovine spongiform encephalopathy or BSE, better known as “Mad Cow Disease.” Like Kuru, BSE is also spread through cannibalism, though of a rather different sort. In the 1980s and 1990s, a massive outbreak of BSE in the UK was linked to the use of Meat-and-Bone Meal or MBM in animal feed. MBM is rendered from the parts of slaughtered animals considered unfit for human consumption, including blood, bone, hooves, horn, skin, and – most relevant to the transmission of BSE – brain and nerve tissue. The transmission of BSE via infected animal feed devastated the UK livestock sector, triggering worldwide bans on British beef, infecting hundreds of thousands of cows, and resulting in four million animals being slaughtered and incinerated in an attempt to contain the outbreak. Even more tragically, 177 people contracted Variant Creutzfeldt-Jakob Disease, the human form of BSE, from eating contaminated beef, with all invariably succumbing to the disease. In the wake of the outbreak, the UK banned the use of MBM in feed for cows, sheep, goats, and other ruminant animals, though it remains a common ingredient in pet food.

Though their existence had been theorized for over a decade, prions were not isolated in the laboratory until 1982, by American researcher Stanley Prusiner of the University of California, earning him the 1997 Nobel Prize in Physiology or Medicine. The discovery of a brand-new infectious agent opened up an entirely new field in the study of communicable diseases, and led geneticists to rethink certain assumptions regarding the transmission of structural information between genes and proteins. Some scientists now also suspect that Alzheimer’s disease may be caused by a type of prion, opening exciting new avenues of research for the treatment of this disease, which afflicts tens of millions of people worldwide.

Yet despite this new understanding, mysteries still surrounded Kuru, the first human prion disease to be positively identified. Even before cannibalism was suspected as the cause of Kuru, the practice was strongly discouraged by missionaries and Colonial patrol officers. By the 1960s, when Alpers and Lindenbaum began their groundbreaking study, the Fore had all but abandoned mortuary cannibalism. However, the epidemic continued for decades, with deaths between 1987 and 1995 averaging 7 per year. It is now known that Kuru has an extremely long latency period, ranging anywhere from 3 to 50 years between infection and the onset of symptoms, allowing cases to appear long after the abandonment of cannibalism. Indeed, the fact that nobody born after 1960 has ever contracted the disease has served to confirm the link between Kuru and the practice of mortuary cannibalism.

Thankfully, however, there is reason to believe that after more than 100 years, the Kuru epidemic may at last be over. The last person died of Kuru in 2009, and no new cases have been reported as of 2010. Even more encouragingly, it appears as though the Fore may actually have developed immunity to Kuru, shielding them from future outbreaks. A 2009 study conducted by a team from University College London found a high prevalence of a gene mutation called G127V among the Fore, which prevents malignant prions like the one responsible for Kuru from infecting the brain. Genetic analysis has further revealed that this mutation appeared as recently as 10 generations ago. As John Collinge, a researcher at University College’s Prion Unit remarked:

“It’s absolutely fascinating to see Darwinian principles at work here. This community of people has developed their own biologically unique response to a truly terrible epidemic. The fact that this genetic evolution has happened in a matter of decades is remarkable.”

So for all you aspiring Hannibal Lecters out there, the tragic case of Kuru provides yet one more reason why – outside of extreme survival situations – cannibalism is rarely a good idea. Better to pair your fava beans and chianti with something slightly less exotic.


Expand for References

Alpers, Michael, The Epidemiology of Kuru in the Period 1987 to 1995, Australian Government Department of Health, December 31, 2005, https://www1.health.gov.au/internet/main/publishing.nsf/content/cda-cdi2904i.htm

Weiler, Nicholas, Alzheimer’s Disease is a ‘Double-Prion Disorder’, Study Shows, University of California San Francisco, May 1, 2019, https://www.ucsf.edu/news/2019/05/414326/alzheimers-disease-double-prion-disorder-study-shows

Stanley B. Prusiner – Facts, The Nobel Prize in Physiology or Medicine 1997, https://www.nobelprize.org/prizes/medicine/1997/prusiner/facts/

Press Release: Baruch S. Blumberg and D. Carleton Gajdusek, Karolinska Institutet, October 14, 1976, https://www.nobelprize.org/prizes/medicine/1976/press-release/

Kelleher, Colm, Brain Trust: the Hidden Connection Between Mad Cow and Misdiagnosed Alzheimer’s Disease, Paraview Pocket Books, New York, 2004, https://books.google.ca/books?id=AGAhAtI3kJEC&q=Gajdusek+drill+holes&pg=PA53&redir_esc=y#v=onepage&q=Gajdusek%20drill%20holes&f=false

Lindenbaum, Shirley, An Annotated History of Kuru, April 14, 2015, http://journals.ed.ac.uk/index.php/mat/article/download/4590/6242?inline=1

Kompoliti, K & Ferguson-Smith, M, Kuru (Disease), ScienceDirect, https://www.sciencedirect.com/topics/medicine-and-dentistry/kuru-disease

Liberski, Pawel et al, Kuru, the First Human Prion Disease, Viruses, March 2019, https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6466359/

Brain Disease ‘Resistance Gene’ Evolves in Papua New Guinea Community; Could Offer Insights into CJD, Science Daily, November 21, 2009, https://www.sciencedaily.com/releases/2009/11/091120091959.htm

Liberski, P & Brown, P, Kuru: Its Ramifications After Fifty Years, Experimental Gerontology, 2008, https://hal.archives-ouvertes.fr/hal-00499057/file/PEER_stage2_10.1016%252Fj.exger.2008.05.010.pdf

Bichell, Rae, When People Ate People, a Strange Disease Emerged, NPR, September 6, 2016, https://ift.tt/V9Kgd7w

Rudan, Igor, The Laughing Death, https://irudan.medium.com/the-laughing-death-edba313c6791

Lagnado, John, From Pabulum to Prions (via DNA): a Tale of Two Griffiths, Past Times, https://ift.tt/AXU2f4Q




from Today I Found Out
by Gilles Messier - April 13, 2023 at 07:28PM

Thursday, April 13, 2023

Why is the Universal Sign for a Hitchhiker the Thumbs Up, Held Out? And is Hitchhiking Actually Dangerous?

Sticking your thumb out to convey some meaning has seemingly been around for about as long as humans have been humaning, or, at least, as far back as written history goes. But how did the gesture come to be a signal that you’d like to hitch a ride from someone? And is hitching a ride actually dangerous?

To begin with, almost as soon as the automobile began to become popular in the United States, “thumbing” seems to have become a thing in lockstep. As to why this gesture, and not some other, it is often put forth that it is attached to the idea that “thumbs up” means “ok”, “yes”, “do it”, etc. So, in essence, the hitchhiker is staring the person down in the car and giving them the thumbs up/positive gesture in hopes they’ll stop. The issue with this hypothesis, however, is that that particular “ok” or positive meaning of the gesture didn’t definitively become a widely popular thing until after WWII, a few decades after thumbing a ride was already well established.

Now, at this point, if you’ve watched basically any movie or show featuring Roman gladiators, you might already be heading to the comments to argue with us, as the thumbs up after a fight supposedly meant live and the thumbs down meant die, with this being the commonly touted origin of the whole thumbs up / positive / yes / ok meaning in the first place.

However, this is unequivocally false. While it is true that in the days of gladiatorial combat in the Colosseum and the earlier (and significantly larger) Circus Maximus, the audience could decide the fate of a fallen gladiator with a simple hand gesture, this isn’t typically depicted accurately and has nothing to do with why thumbs up and thumbs down means what it does today.

To wit, the fate of a gladiator, in terms of whether the audience was voting for a kill, was decided with what is known as “pollice verso”, a Latin term which roughly translates to “turned thumb”. Precisely what this meant isn’t known, and no accounts have survived to this day that describe it in any real detail. As such, we’re unable to say for sure which way the thumb was supposed to be pointed if the audience wanted a given gladiator to be killed, or if they could just wave their thumbs around at random, which it seems may well have been the case. But either way, the turned thumb was to indicate not life, but death.

So what about indicating the person should live? The gesture to spare a given gladiator’s life seems to have been neither a thumbs up nor a thumbs down. Instead, you had to hide your thumb inside your fist, forming a gesture known as pollice compresso, “compressed thumb”.

The reasons for this have been speculated to be twofold: one, it made the decision of the crowd easier to discern, since it’s easier to tell the difference between a turned thumb and a closed fist than between a thumbs up and a thumbs down from a long way away. And two, the gestures themselves are thought to be largely symbolic of what they represented – a pointed thumb represented the audience’s desire for the victorious gladiator to deliver his coup de grâce (stab the fallen foe), while a hidden thumb symbolised that they wished for the gladiator to stay his blade, sheathing it much in the way they’d hidden their thumbs. Hence why it’s thought “turned thumb” may well have been simply waving your thumb around in the air, perhaps in a stabbing motion.

Of course, moving on to hitchhiking, using the thumb as a symbolic sword to stab people probably isn’t going to increase your odds of getting picked up… So definitely no connection there. And, as mentioned, the thumbs up meaning more or less as we think of it today in the general, non-hitchhiking sense, didn’t really become popular until after WWII.

On this one, during WWII, the thumbs up gesture was used extensively by American pilots as a shorthand way of indicating to their ground crews that they were ready to fly.

It has been speculated that they got this from the Chinese, with the Flying Tigers – American pilots based in China – seemingly being the first (or among the first) to popularly use the gesture, at least as far as photographic evidence from the era seems to indicate. To the Chinese at this time, the thumbs up gesture meant “number one” or “nice job” depending on context. (Why this is the case is up for debate.) Whether it truly was adopted from the Chinese or some other source, the American pilot version initially meant “I’m ready” or “good to go”.

From here, things become much clearer. This “ready” meaning soon evolved into a simple, all-encompassing way to indicate that everything was okay in situations where verbal cues weren’t possible or advisable. It was also picked up by the rest of the American military who proceeded to make extensive use of the gesture during their many campaigns across Europe; in the process, it was picked up by the locals and soldiers from other militaries.

That said, there are several places on Earth where a thumbs up is considered a grave insult. For example, in places like Iraq and Greece, sticking up your thumb is akin to saying “shove it up your ass” (which, like the stabbing meaning, probably also wouldn’t be the best way to hail down a car… unless they are into that… We here at TodayIFoundOut don’t judge. You do you.)

Notably, it also meant this in Australia before WWII, but afterward switched to the modern meaning thanks to the aforementioned dissemination of the gesture throughout the Allied military. The exact reasoning behind this more insulting meaning, as with many gestures, isn’t clear, but it’s believed to be representative of the action that would be required for you to act out the insult itself. In fact, in some of these cultures that interpret it this way, an up and down movement often accompanies the gesture to make the meaning perfectly clear. Funnily enough on this one, when American troops first started being stationed in Iraq, some reported being greeted by civilians offering a thumbs up, with the soldiers (and many in the media) interpreting it as most Westerners would, all the while not realizing the obscene connotations it has in that country.

In any event, given the lack of hard evidence of the thumbs up meaning as we think of it today, at least popularly, before WWII, the hypothesis that there is a connection here with regards to the origin of the thumbs up in hitchhiking is, unsurprisingly, highly suspect.

Even more so when we really dig into the hard documented evidence of the evolution of the hitchhiking sign. Whoever was the first to do it, one of the earliest references to hitchhiking, though without mention of the thumbs out gesture, comes from American poet Vachel Lindsay, who wrote in 1916, “He it is that wants the other side of the machine weighed down. He it is that will offer me a ride and spin me along from five to twenty-five miles before supper.”

By 1923, the practice had been given its proper name in the Nation on September 19, with “Hitch-hiking is always done by twos or threes.” Some claim that its popularity arose with the increasing presence of more and more cars combined with the desire of soldiers on furlough during WWI to get home as cheaply as possible.

Whatever the case, the practice quickly became popular with other vehicularly-challenged folks as well, including college students, kids and even (*gasp*) girls by themselves (more on whether this is actually dangerous or not in a bit).

By 1925, however, we do have reference to not only the gesture being associated with hitchhiking, but also, in part at least, why the thumb was used. This comes to us from a 1925 article in American Magazine that described how “The hitch hiker stands at the edge of the road and point with his thumb in the direction he wishes to go.”

Further doubling down on this reasoning, around the same time “thumb-pointer” was also a synonym for “hitchhiker”. In other words, it appears the thumb out originally was not so much a thumbs up sign to hail a ride as a dual signal: indicating you wanted to be picked up, as well as telling the driver which way you wanted to go.

Of course, early hitchhikers could have pointed in other ways, so why the thumb was settled on instead of the index finger isn’t really clear. Perhaps so as not to be confused with giving the finger from a distance (and yes, the middle finger meaning FU is actually something that goes back to ancient times, as we’ll get into in the Bonus Facts in a bit.) But whatever the case, the purpose of the thumb originally was not to point up, but rather in the direction you wanted to go.

Going back to the early days of hitchhiking, along with thumb-pointer, other synonyms for a hitchhiker also appeared, including thumb-jerker – such as in the October 6, 1926 edition of the Milwaukee Journal, which also makes clear why you jerk your thumb:

“Each age brings its own words. Mr. Webster would certainly push up his spectacles and bend low over the present-day dictionary if he could see some of the additions to the child of his wearisome labors. We have “gate-crasher” “cake eater” and “high hatter”. Mr. Webster might well scratch his head over the word “thumb jerker”.

A thumb jerker is a person who stands out on a road and jerks his thumb at passing automobiles, indicating that he wishes a lift in their direction. Boys on summer trips call it hitch-hiking. Tourists stop and take them in and carry them a certain distance. Then they pick up another car, or another car picks them up, and eventually they reach their destination without trouble or expense to themselves.”

On this note, while the practice was widely popular in the early days of the automobile, it wasn’t always looked upon kindly as a thing to do. For example, the Journal piece goes on:

“Not long ago a young man told his father he was going to hitchhike it to school some 200 miles distant, “I’ll use my carfare for spending money,” said the boy. “No, son” said the father, “You’ll not hitch-hike it. When I was a boy I hoofed it to the same school with the feeling that I was pretty lucky to get my tuition money. There was no money for carfare and I walked the whole distance on dirt roads, sleeping in barns at night. I didn’t depend on anything to get me there.

.. Never turn into a thumb jerker. If you jerk your thumb at cars for people to pick you up, you are acquiring a habit. The next time you will jerk your thumb for another kind of a favor perhaps. I’m not going to have you going through life expecting other people to do you favors. Stand on your own feet- and walk on your own feet.””

While perhaps great advice in some sense, it’s also horrible in the sense that accepting a ride – particularly in a car already going your way – isn’t inherently a sign of laziness. Further, it could even be the opposite if the time or money saved on transport was utilized more wisely on the other end. And on top of that, nobody ever got anywhere in life without being willing to accept help from others somewhere along the way. Trying to do everything yourself is a great way to get a lot less done and, much like the walker vs the successful hitchhiker, get where you want to go much slower, or even not at all.

But in any event, by 1928, the Saturday Evening Post had printed a series of popular stories featuring two waitresses thumbing their way from New York to Florida. Across the pond, the September 6, 1927 edition of the Glasgow Herald explicitly mentions the word “hitch-hiking” and the new practice, stating:

“AMERICA, which is the melting-pot not only of races but of colloquial English, has produced not a few startling words and phrases… The hobo has been rivalled (so an American correspondent informs us) by the hitchhiker, which is the latest curiosity born out of the linguistic genius of the Yankee. The hobo, long familiar to readers of fiction and social investigators, stole rides from one end of the continent to the other on freight trains. The hitchhiker, with the same passion for free travel, indulges it at the expense of the motorist. There are apparently hitchhikers in the United States, who boast they can travel 500 miles free of charge without walking more than 10.”

While the Glasgow Herald was speculating the practice would never catch on in the region owing to relatively inexpensive public transportation there, sometime over the course of the next decade or so, hitchhiking, including sticking your thumb out, does seem to have caught on in Europe, if perhaps never quite as popular as in the U.S. with its massive expanses of basically nothing between many towns and cities. For example, British novelist Nicholas Monsarrat, in his 1939 novel This is the Schoolroom, has his character having “thumbed [his] way across England in a day-and-a-bit.”

Going back to the U.S., the Great Depression also seems to have had a hand in boosting the popularity of the practice, with people all over the country thumbing rides from place to place in search of work.

The gesture itself saw even wider adoption as the de facto way to indicate you’d like to hitch a ride thanks to Clark Gable’s exposition on the subject in It Happened One Night (1934). The movie itself was obscenely popular at the time, winning Oscars for Best Picture, Best Writing, Best Actress (Claudette Colbert), Best Director (Frank Capra) and Best Actor (Gable). In his role as Peter Warne, Gable attempted to instruct Colbert (playing Ellie Andrews) on the best method to employ “The Hitchhiker’s Hail”:

“Well, it is simple. It’s all in that old thumb, see? Some people do it like this. Or like this. All wrong. Never get anywhere. . . . . But that old thumb never fails.”

Of course, being a romantic comedy, Gable’s technique completely fails to convince anyone to stop, and the pair only ultimately get a lift after Colbert employs another method to get someone to stop and pick you up – she sticks out her leg and lifts her skirt a bit.

The gas rationing of WWII saw the practice surge even more, with it being downright patriotic to hitch a ride rather than drive solo, as well as soldiers using hitchhiking to get around when in the States. On this one, again, it was considered patriotic to pick up any hitchhiker wearing a uniform.

Of course, in more modern times, the idea that hitchhiking is inherently dangerous, especially to women, in conjunction with public transportation becoming more widespread and relatively inexpensive, has seen the percentage of people who hitchhike plummet. Another contributing factor in places where public transportation isn’t always the greatest is car ownership becoming ubiquitous, such as in the U.S. where around 95% of households currently have direct access to a car.

This all brings us to how dangerous hitchhiking actually is. While it is true there does seem to be a slight risk to it, the hard data to date doesn’t really support the idea of it being any more dangerous than, say, taking a public bus or going clubbing or walking along a sidewalk or countless other things we all do. That’s not to say there is no risk, simply that it’s far less dangerous, as we’ll get into in a bit, than public perception seems to think.

As Julian Portis writes in his thesis, Thumbs Down: America and the Decline of Hitchhiking:

The real danger of hitchhiking has most likely remained relatively constant, but the general perception of this danger has increased. … [O]ur national tolerance for danger has gone down: things that we previously saw as reasonably safe suddenly appeared imminently threatening. This trend is not just isolated to the world of hitchhiking; it has become a pernicious artifact throughout the American cultural conscience.

This is similar to the rise of the whole “stranger danger” warning to kids which occurred around the same time hitchhiking also began to be perceived as dangerous. On this one, it’s noteworthy that while just shy of a million kids are reported missing, at least temporarily, in a given year in the U.S., only a little over 100 of that number on average turn out to be abductions by strangers, with many arguing that instilling that inherent fear of strangers in kids causes more direct harm than good, in the sense that a child lost or in trouble may be hesitant to seek help. For example, in Utah in 2005 an 11-year-old Boy Scout was lost in the wilderness for four days despite rescuers being deployed not long after he went missing. The boy had heard and seen the people searching for him many times, but didn’t know them, so, to quote him after he was found, he hid from them because “I didn’t want someone to steal me.” It wasn’t until he became a bit delirious after four days with no food that a searcher almost literally stumbled upon the boy, who was walking along a trail.

Others further argue that the general instilling of widespread distrust of anyone you don’t know just isn’t a good thing to teach kids.

Whatever your opinion on that, as for public perception concerning hitchhiking, various films and shows featuring the whole psychopath hitchhiker thing seem to have risen around the same time that hitchhiking more and more began being perceived as dangerous, with the two things seemingly feeding on one another.

For example, ABC’s 1979 Diary of a Teenage Hitchhiker has this gem of a teaser, “The Girls by the Side of the Road: You’ve seen her standing there. Thumb out. Smiling. There are thousands like her all over America. And you’ve heard about what happens to some of them when they get into the wrong car. This movie is about one of these kids. And about her family. But it could be about your family. Where is your daughter tonight?” DUN DUN DUN!!!!!

A 1973 Reader’s Digest article even went so far as to say, “In the case of a girl who hitchhikes, the odds against her reaching her destination unmolested are today literally no better than if she played Russian roulette.”

On top of that, right at the start of the major decline in hitchhiking in the 1970s, we have such quotes as this 1971 gem of a reflection of attitudes from a Sacramento County Sheriff’s office stating hitchhiking is, to quote, “like open season for every pervert, homosexual and creep in the area.” (…Dang homosexuals always picking people up and giving them a ride to where they want to go… *shakes fist*)

On the other end of the country, one New York City officer noted, “Any girl on the road, she’s asking for trouble because any guy driving by thinks she’s loose.” (Girls wanting a ride to go somewhere = whoreishness. Got it… She should probably wear a nun’s outfit or something whenever out and about too, else she’s clearly asking for it.)

So just how dangerous is hitchhiking? It is notoriously difficult to nail down hard data on the dangers of hitchhiking for a variety of reasons, including that a body thrown in the ditch on the side of the road could have been a hitchhiking instance, or it could be completely unrelated to hitchhiking.

That said, one of the most comprehensive studies we have on this to date is from the end of the heyday of hitchhiking era, a 1974 California Highway Patrol study looking into all the claims in the 1970s that hitchhiking was dangerous. Their results?

Among many other things they noted that hitchhiking with someone else reduced the already extremely low likelihood of being harmed in some way while hitchhiking roughly sixfold. Further, it was more likely that the person hitchhiking would be the victim, rather than the person picking up the hitchhiker (72.7% compared to 28.3%). Moving on from there, the average male hitchhiker was 22 years old, and the average female hitchhiker 19 years old. And about 90% of hitchhikers were male.

As you might expect, female hitchhikers were 7-10 times more likely to be victims of crimes than the males, and 80% of the crimes committed against female hitchhikers were sexual in nature, more or less accounting for most of the increased crime potential. Basically, the rates of “other” crimes were approximately the same either way, with the sexual ones tacked on for women.

However, they further note that the sex crimes committed related to hitchhiking were insignificant compared to the total numbers for the state and that the women here weren’t statistically more likely to be the victims of a sex crime while hitchhiking compared to the rest of their day to day lives.

Further, they found that in most cases in hitchhiking related crimes the person either hitchhiked or picked someone up specifically with the intent to commit the crime in the first place. Thus, while hitchhiking was more or less the vehicle for the crime, they speculate it’s probable the person would have found another method for the crime if hitchhiking wasn’t a thing.

On this note – and very significant, lending to their ultimate conclusion we’ll get to shortly – hitchhikers who were victims or suspects of a crime while hitchhiking were almost just as likely to be victims or suspects of crimes outside of anything related to hitchhiking.

Thus, they concluded, “No independent information exists about hitchhikers who are not involved in crimes. Without such information, it is not possible to conclude whether or not hitchhikers are exposed to high danger. However, the results of this study do not show that hitchhikers are over represented in crimes or accidents beyond their numbers. When considering statistics for all crimes and accidents in California, it appears that hitchhikers make a minor contribution.”

They also note from all this that in their opinion, eliminating the practice of hitchhiking probably wouldn’t reduce crime in a statistically significant way.

As for some broader numbers, the FBI notes that over the three-decade span of 1979 to 2009, there were only 675 cases of murder or sexual assault along interstates, 500 of which included murder, or about 22 cases per year across the country. Further, these included not only known hitchhiking instances, but also just bodies found along the road and the like which may or may not have been hitchhiking related.

Thus, while the data is too lacking to come to any firm conclusions about the exact danger of hitchhiking or of picking a hitchhiker up, the studies done to date and the limited data we do have seem to indicate that, while hitchhiking does expose you to some level of risk you wouldn’t have if, say, you were just sitting on your couch, the greatest risk you’re likely to be exposed to when hitchhiking is the fact that you are in a rapidly moving vehicle, and those sometimes get in crashes and cause you to be injured or die.

As ever, as has become clear to us over the years doing such analyses on the various ways to get from point A to point B and the inherent risks, the only truly safe way to travel is via elevator. Even walking is shockingly dangerous. And don’t even get us started on leaving your home at all. Best to not do that. Ever. There are literally people out there. Ya, thanks but no thanks my dude.

Bonus Fact:

Some common gestures, such as the high five, have pretty well known and surprisingly modern origins. For example, the high five was invented and popularized thanks to a 1977 Major League Baseball game in which Glenn Burke decided to switch up the low five and give Dodger teammate Dusty Baker a high five after Baker hit a home run. Interestingly, as Glenn Burke was the first openly gay major league baseball player and invented the high five, for a little while after, the high five was used as a gay pride symbol, particularly by Burke himself.

Moving on to the middle finger, it turns out one of the most popular gestures of all, giving the bird, unlike the high five and modern meaning of the thumbs up, isn’t a recent invention at all and has been around for well over two thousand years, including having various similar connotations as it has today throughout.

Unsurprisingly once you stop and think about versions of the expression’s meaning, extending the middle finger simply represents the phallus, with it perhaps natural enough that our forebears chose their longest finger to symbolically represent man’s favorite digit. (Although, there are some cultures that instead chose the thumb, seemingly preferring to have their girth, rather than length, represented here…) It’s also been speculated that perhaps people noticed that the curled fingers (or balled fist in the case of the thumb) made for a good representation of the testicles.

Either way, given the symbolism here, it’s no surprise that the expression has more or less always seemed to have meant something akin to “F&*k You” in some form or other, sometimes literally.

For example, in Ancient Greece, beyond being a general insult, in some cases there seems to be a specific implication from the insult that the person the gesture was directed at liked to take it up the bum. In the case of men, despite male on male lovin’ being widely accepted in the culture at the time, there were still negative connotations with regards to one’s manliness when functioning as the bottom in such a rendezvous, particularly the bottom for someone with lower social standing.

Moving on to an early specific example, we have Aristophanes’ 423 BC The Clouds. In it, a character known as Strepsiades, tired of Socrates’ pontificating, decides to flip off the famed philosopher.

SOCRATES: Well, to begin with,
they’ll make you elegant in company—
and you’ll recognize the different rhythms,
the enoplian and the dactylic,
which is like a digit.

STREPSIADES: Like a digit!
By god, that’s something I do know!

SOCRATES: Then tell me.

STREPSIADES: When I was a lad a digit meant this!

[Strepsiades sticks his middle finger straight up under Socrates’ nose]

 

For whatever it’s worth, in the third century AD Lives of the Eminent Philosophers, we also have this reference to a supposed incident that occurred in the 4th century BC, concerning famed orator Demosthenes and philosopher Diogenes.

[Diogenes] once found Demosthenes the orator lunching at an inn, and, when he retired within, Diogenes said, “All the more you will be inside the tavern.” When some strangers expressed a wish to see Demosthenes, [Diogenes] stretched out his middle finger and said, “There goes the demagogue of Athens.”

(No doubt water was needed to put out the fire created by that wicked burn.)

Moving on to the first century AD, Caligula seems to have enjoyed making powerful people kiss his ring while he extended his middle finger at them. On a no doubt completely unrelated note, the chief organizer of his assassination, and first to stab him, was one Cassius Chaerea who Caligula liked to do this very thing with, as noted by Suetonius:

Gaius used to taunt him, a man already well on in years, with voluptuousness and effeminacy by every form of insult. When he asked for the watchword Gaius would give him “Priapus” or “Venus,” and when Chaerea had occasion to thank him for anything, he would hold out his hand to kiss, forming and moving it in an obscene fashion.

Speaking of the implications of this insulting gesture, it seems to have fallen out of favor during the Middle Ages with the rise of Christianity, or at least records of it diminish. This may mean people actually stopped popularly flipping the bird or may just mean its uncouth nature saw it something not generally written about. That said, we do know thanks to the Etymologiae of Isidore of Seville that at least as late as the 6th century people were still extending the finger as an insult, in this reference particularly directed at someone who had done something considered “shameful”.

Moving on to more modern times and back to baseball, the gesture was popularly resurrected in documented history starting around the early 19th century, with early photographic evidence later popping up in the latter half of the 1800s. Most famously, we have a photograph of the gesture flashed by present day Twitter sensation and former 19th century baseball iron man Charley “Old Hoss” Radbourn. Radbourn was a pitcher for the Boston Beaneaters in 1886 when the team, along with the New York Giants, posed for a group photo. In the photo, Old Hoss can be seen giving the bird to the cameraman.





from Today I Found Out
by Daven Hiskey - April 13, 2023 at 12:05AM

Tuesday, April 11, 2023

That Time the United States Tested Biological Warfare on its Own Citizens

For the residents of San Francisco, October 11, 1950 started out like any other day, with thick banks of autumn fog rolling in from the bay and across the city. By the afternoon, however, it became clear that something was seriously wrong. On that day alone, eleven patients were admitted to Stanford Hospital with pneumonia, fever, and serious urinary tract infections. One of them, a 75-year-old retired pipe fitter named Edward J. Nevin, died three weeks later. Tests revealed the culprit to be Serratia marcescens, a bacterium so uncommon that not a single case of infection had been recorded in the entire history of San Francisco. So baffled were the hospital’s doctors by this unusual cluster of infections that they reported the incident in a medical journal, though when no new cases appeared they dismissed it as a fluke. But unknown to the doctors and the residents of San Francisco, the thick fogs that crept over the city that autumn carried a secret passenger: trillions of bacteria sprayed from a Navy ship sailing just offshore. Code-named Sea Spray, this operation was part of a top-secret Cold War project to test the city’s vulnerability to a potential Soviet biowarfare attack. But San Francisco was far from alone; between 1949 and 1969, the U.S. military deliberately exposed dozens of American cities and millions of ordinary citizens to potentially harmful bacteria and chemicals, all in the name of national security. This is the shocking story of one of the largest programmes of human experimentation in American history.

Biological warfare has long been a part of human conflict, from the medieval practice of catapulting infected corpses and rats into besieged cities to spread disease to the infamous use of smallpox-infected blankets during the 18th Century French and Indian War. But it was not until the late 19th Century, when scientists like Robert Koch and Louis Pasteur discovered the microorganisms that cause disease and how to cultivate them, that the development of dedicated, effective biological weapons began in earnest. By the time of the First World War, Imperial Germany had built an extensive bioweapons programme, perfecting strains of anthrax and glanders with which it planned to infect its enemies’ livestock and military draft animals. However, none of these weapons were ever deployed before the war ended. But the horrifying effects of the chemical weapons that had been used during the war – such as phosgene and mustard gas – left such an impression on world leaders that in 1925, 146 countries came together to draft the Geneva Protocol for the Prohibition of the Use of Asphyxiating, Poisonous, and Other Gases, and of Bacteriological Methods of Warfare. The treaty was signed by 38 nations including France, Great Britain, the Soviet Union, Japan, and the United States, though the latter two would not ratify it until the 1970s.

Over the next two decades, the majority of the Geneva Protocol’s signatories avoided developing biological weapons – with one major exception. In 1936, the Imperial Japanese Army established a biowarfare research centre outside the Chinese city of Harbin, in the Japanese puppet state of Manchukuo. Run by General Shiro Ishii, the facility, known as Unit 731, would go on to commit some of the most horrific atrocities in modern history. In experiments that make the work of Nazi scientists like Dr. Josef Mengele look like the pinnacle of medical ethics, researchers at Unit 731 used local Chinese citizens as human guinea pigs, exposing them to deadly pathogens like anthrax and bubonic plague before dissecting them alive without anaesthetic to study the effects of these diseases. Japanese aircraft also dropped bombs loaded with anthrax, plague, cholera, salmonella, and other agents on 11 Chinese cities, leading to tens of thousands of deaths.

While such horrors might seem like a unique product of Imperial Japan’s particular brand of extreme militarism, it was not long before the western powers, too, succumbed to the dark allure of biological warfare. Following the Nazi invasion of Poland in September 1939, the United Kingdom set up its own biowarfare program based at Porton Down in Wiltshire and Toronto in Canada, with research focusing on the weaponization of tularaemia, psittacosis, brucellosis, Q fever, and anthrax – and for more on this, please check out our previous video Grosse Île – Canada’s Anthrax Island.

The Japanese attack on Pearl Harbor on December 7, 1941, also caused the United States to reverse its stance on biowarfare. In early 1942, U.S. Secretary of War Henry Stimson expressed concern to President Franklin D. Roosevelt regarding America’s vulnerability to biological attack. In response to this and growing pressure from the British, in November 1942 Roosevelt approved the creation of an American bioweapons program, overseen by the U.S. Army Chemical Warfare Service and centred at Fort Detrick, Maryland. By 1945, the US biowarfare program had succeeded in producing several tons of weaponized pathogens including anthrax and smallpox, though none were ever used in combat. American wartime policy dictated that such weapons were only to be used in retaliation or as a deterrent against enemy biological attacks, and in this the program was highly successful; after the war, captured documents revealed that fear of American retaliation had convinced Nazi Germany to abandon its own biowarfare program.

The dawn of the Cold War brought a new sense of urgency to the U.S. bioweapons program, as intelligence obtained by the CIA revealed the existence of a vast Soviet biological warfare research program based in the city of Sverdlovsk in the Ural Mountains. Desperate to gain any possible advantage over the Soviets, the US Government went so far as to grant immunity to the scientists of Japan’s Unit 731 in exchange for their data and expertise. Yet despite Japan’s extensive real-world experience in conducting biological warfare, many questions remained: which pathogens would cause the most damage? What was the most effective means of dispersal? How would pathogens spread in cities compared to the countryside? Which Soviet and American cities were most vulnerable to biological attack, and how could the latter be protected? Three potential methods were evaluated for answering these questions: first, small-scale testing using model cities in wind tunnels; second, full-scale testing using live pathogens in simulated cities; and third, full-scale testing using simulated pathogens in real cities. While wind tunnel tests by the British had yielded some useful results, the first two methods were quickly rejected – the first due to its technical limitations and the second due to the exorbitant cost of simulating an entire city. That left method #3: releasing simulated pathogens on real cities. The search thus began for American cities that could reasonably approximate Soviet population centres.

This proved more challenging than anticipated, for most regions which matched Russian cities in temperature and precipitation did not match them geographically – and vice-versa. In the end, however, eight cities were found to have the desired combination of climate, geography, and architecture: Oklahoma City, Kansas City, Omaha, Cincinnati, St. Louis, Minneapolis, Chicago, and Winnipeg in Canada – with Minneapolis, St. Louis, and Winnipeg being identified as particularly suitable. Cities in California and Florida were also selected for tests involving coastal areas. To simulate biowarfare agents, researchers chose four different microorganisms – three bacteria and one fungus: Serratia marcescens, Bacillus globigii, Bacillus subtilis, and Aspergillus fumigatus. Chosen for their similarity to real biowarfare agents like anthrax and tularaemia, these microbes were also readily found in nature and easy to grow – indeed, Serratia marcescens is responsible for the pink film often found growing in bathtubs and toilets. Chemical simulants were also used, including zinc cadmium sulphide, a powder whose small particle size and fluorescent glow made it ideal for tracking the airborne dispersal of infectious agents. At the time, all these simulants were considered harmless to humans. Despite this, for reasons of security and to obtain the most accurate results possible, the residents of the targeted cities would not be informed that the tests were taking place. Thus began one of the most ethically fraught chapters in the history of American military research.

The first bioweapons test on U.S. soil took place in August 1949, when agents of Camp Detrick’s Special Operations Division released inert bacteria into the ventilation system of the Pentagon. Larger-scale operations soon followed, including Operation Sea Spray. Between September 20 and 27, 1950, a U.S. Navy minesweeper sailed just off San Francisco Bay spraying a mixture of Serratia marcescens and Bacillus globigii from large onboard hoses. Meanwhile, 43 monitoring stations throughout the city recorded the dispersion of the bacteria. According to Leonard J. Cole, author of the book Clouds of Secrecy, the data revealed that:

“Nearly all of San Francisco received 500 particle minutes per liter. In other words, nearly every one of the 800,000 people in San Francisco exposed to the cloud at normal breathing rate (10 liters per minute) inhaled 5000 or more particles per minute during the several hours that they remained airborne.”
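
For a rough sense of how those numbers fit together: reading Cole’s “500 particle minutes per liter” loosely as a sustained concentration of about 500 particles per litre of air, and taking the normal breathing rate he cites of 10 litres per minute, the multiplication behind his figure is simply:

\[
500~\frac{\text{particles}}{\text{litre}} \times 10~\frac{\text{litres}}{\text{minute}} = 5000~\frac{\text{particles}}{\text{minute}}
\]

sustained for the several hours the cloud remained over the city.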

Similar tests were conducted off the coasts of South Carolina, Georgia, and Florida, while between 1963 and 1975 the UK’s Chemical Defence Experimental Establishment at Porton Down carried out the Dorset Biological Warfare Experiments, spraying a combination of zinc cadmium sulphide and Bacillus globigii off the coast of southwestern England.

In 1965, as part of the Pentagon’s Project 112, American researchers released Bacillus globigii at the National Airport and Greyhound Terminal in Washington, DC. More than 130 passengers were exposed, spreading the simulant bacteria to 39 cities in 7 states over the next two weeks. The following year, Bacillus subtilis was released into the New York subway system by dropping lightbulbs filled with bacteria onto the tracks. These bacteria also spread quickly through the subway lines, leading the official Army report on the experiment to conclude:

“Similar covert attacks with a pathogenic disease-causing agent during peak traffic periods could be expected to expose large numbers of people to infection and subsequent illness or death.”

The largest of these experiments, however, was Operation LAC, which took place between 1957 and 1958. Short for “Large Area Coverage”, LAC evaluated the feasibility of covering large areas with biowarfare agents by releasing them from aircraft. Using Fairchild C-119 Flying Boxcar cargo aircraft, LAC released hundreds of tons of zinc cadmium sulphide over 33 rural and urban areas in the midwestern United States and Canada, with ground stations monitoring the dispersion of the fluorescent powder. The tests revealed the aerial dispersion method to be extremely effective, with the simulant travelling up to 1,900 kilometres from where it was dropped.

As covered in our previous videos That Time US Scientists Injected Plutonium Into People Without Their Knowledge and The Appalling Tuskegee Syphilis Experiment, human medical experimentation in the United States has tended to have a strong racial component, often targeting poor Black communities and other vulnerable groups. Operation LAC was no exception. Starting in the mid-1950s, the Army began spraying zinc cadmium sulphide powder from motorized blowers mounted atop Pruitt-Igoe, a massive public housing block in St. Louis inhabited almost entirely by poor Black residents. As part of the Army Chemical Corps’s St. Jo program, simulant was also sprayed from aircraft and trucks in St. Louis, Minneapolis, and Winnipeg – again, mostly in poorer neighbourhoods. As the sprayers could not be easily hidden, residents were told that they produced an invisible smoke screen that would shield the cities from Soviet radar.

Between 1949 and 1969, the U.S. Armed Forces conducted a total of 239 open-air biowarfare experiments on 66 American and Canadian cities, 80 of which used live bacteria. The program was only halted due to a 1969 directive by President Richard Nixon calling for the elimination of the United States’ entire stockpile of biological warfare agents – the destruction of which was completed by 1973. While U.S. Government officials hoped that all records of the human biological warfare experiments would be destroyed along with the weapons themselves, in 1976 Newsweek reporter Drew Fetherston uncovered classified documents revealing many of the secret tests. This in turn led the San Francisco Chronicle to uncover and report on the Operation Sea Spray experiments of September 1950. In light of these revelations, in 1977 the U.S. Senate Subcommittee on Health and Scientific Research convened hearings to investigate allegations of unethical experimentation.

While the U.S. Army believed that the biowarfare simulants used in its live experiments were harmless to humans, it is now known that in large enough doses Serratia marcescens and Bacillus globigii can cause serious infections. Indeed, it is now believed that the release of these bacteria over San Francisco permanently altered the microbiome of the region, leading to an epidemic of heart valve infections in hospitals and other serious infections among intravenous drug users throughout the 1960s and 1970s. And in 2004, a string of infections caused by an influenza vaccine was traced to Serratia marcescens contamination at the Chiron Corporation’s factory in Alameda, California. However, it is now believed that the 11 cases of Serratia marcescens-induced urinary tract infections on October 11, 1950 were unrelated to Operation Sea Spray. As Army officials testified in the 1977 Senate hearings, all 11 patients had recently undergone minor surgeries and the outbreak was confined to a single hospital, indicating that the source of infection lay inside the hospital itself. Nonetheless, in 1977 the surviving family members of Edward J. Nevin, who had allegedly died as a result of the 1950 experiments, sued the federal government for negligence and financial and emotional harm, with Nevin’s grandson, Edward J. Nevin III, stating:

“My grandfather wouldn’t have died except for that, and it left my grandmother to go broke trying to pay his medical bills.”

Unfortunately, the U.S. District Court in San Francisco ruled against the Nevins, finding insufficient evidence that the bacteria used in the test were responsible for Edward J. Nevin’s death. Undeterred, the Nevins took their case all the way to the U.S. Supreme Court, with the trial finally taking place on March 16, 1981. In his opening statement, Edward Nevin III, himself a lawyer, questioned the legal and ethical validity of the biowarfare experiments, stating:

“On what basis of law does the government of the United States justify the dispersion of a large collection of bacteria over the civilian population in an experiment…without informed consent?”

Unfortunately for Nevin, the government had assembled a formidable team of legal representatives and expert witnesses, including attorney John Kern, who proceeded to dispute every one of Nevin’s arguments. The bacteria that had killed Nevin’s grandfather, Kern argued, were of an entirely different strain than the one used in the Operation Sea Spray experiments. Furthermore, in tests conducted at Fort Detrick in 1940, volunteers exposed to Serratia marcescens had suffered nothing more serious than coughing, redness of the eyes, and fever, with symptoms lasting no longer than four days. Kern then dramatically hammered home his point by thrusting his pen into the air and declaring:

“Every atom in this pen could decide right now to rise up about six inches and turn around 180 degrees. That would be about as likely to happen as the bacteria killing someone.”

One of Kern’s witnesses, a doctor for the biological warfare unit at Fort Detrick, concurred, chillingly stating:

“The strain [wasn’t] pathogenic, [and] I would still spray SF again today.”

Kern then proceeded to dismantle Nevin’s arguments regarding the legality of the biowarfare tests, making the extraordinary claim that the government needed no permission to experiment on the public without their consent or knowledge. While the 1946 Federal Tort Claims Act gives the public the right to sue the Federal Government, this right is suspended in cases where the Government is “performing appropriately under policy.” According to Kern, this exception applied to activities carried out in the interest of national security – including spraying civilians with bacteria.

Though Nevin put up a valiant fight, he knew it was all over when Kern called his final witness to the stand: General William Creasy, commander of the U.S. Army biological warfare unit. In his testimony, Creasy maintained that obtaining informed consent from the public was not only unnecessary but undesirable, stating:

“I would find it completely impossible to conduct such a test trying to obtain informed consent. I could not have hoped to prevent panic in the uninformed world in which we live in telling them that we were going to spread non-pathogenic particles over their community; 99 percent of the people wouldn’t know what pathogenic meant.”

The trial only got more belligerent from there, with Creasy berating Nevin for his alleged lack of respect for military officials and even trying to start a fistfight during recess. In the end, however, the Supreme Court sided with the Government and refused to overturn the San Francisco District Court ruling. The Nevins’ four-year battle for justice ended in defeat.

Meanwhile, doubts have been raised regarding the safety of the zinc cadmium sulphide simulant used in the U.S. Army’s LAC experiments. While at the time the compound was considered harmless, it is now known that cadmium is a powerful human carcinogen and in high concentrations can cause damage to the lungs, kidneys, and other organs. After learning of the simulant-spraying experiments in St. Louis, in 2012 sociology professor Lisa Martino-Taylor claimed to have examined medical records and discovered a significant spike in cancer rates in the decades following the tests. However, no further evidence has emerged to confirm this link, with the U.S. National Research Council’s official study on the matter concluding:

“After an exhaustive, independent review requested by Congress, we have found no evidence that exposure to zinc cadmium sulfide at these levels could cause people to become sick.”

Thus, without further independent study, the true impact of the LAC experiments may never be known. But regardless of the morally dubious nature of these experiments, it appears that they did, in fact, yield genuinely useful results. As Leonard Cole, adjunct professor of political science at Rutgers University, explains:

“We learned a lot about how vulnerable we are to biological attack from those tests. I’m sure that’s one reason crop dusters were grounded after Sept. 11: The military knows how easy it is to disperse organisms that can affect people over huge areas.”

Today, the knowledge gained through these tests is used for purely defensive purposes. In 1972, 109 countries including the United States signed the Convention on the Prohibition of the Development, Production, and Stockpiling of Bacteriological (Biological) and Toxin Weapons and on Their Destruction. Since then, the U.S. military has not maintained any offensive biological warfare capability – not officially, anyway. Yet accusations persist that human testing continued in secret. For example, in 2019 Republican representative from New Jersey Chris Smith alleged that from 1950-1975, the U.S. Army released ticks infected with Lyme disease to test its effect on the American public. If true, this would mean that the U.S. Government knew about Lyme disease long before the bacterium responsible was formally identified in 1982. However, no convincing evidence has yet emerged to back up Smith’s claims.

Along with other secret military projects of the era, such as the CIA’s MKULTRA mind-control experiments and the University of California’s plutonium injection studies, the U.S. Army’s biological warfare tests represent one of the great ironies of the Cold War. For while these experiments were ultimately intended to protect the public and preserve American institutions, in the end they succeeded only in harming millions of American citizens, shattering their faith in said institutions, and proving the old adage: “just because you’re paranoid doesn’t mean they’re not out to get you.”

If you liked this article, you might also enjoy our new popular podcast, The BrainFood Show (iTunes, Spotify, Google Play Music, Feed), as well as:

Expand for References

 

Toxicologic Assessment of the Army’s Zinc Cadmium Sulfide Dispersion Tests, National Research Council, 1997, https://www.ncbi.nlm.nih.gov/books/NBK233494/#ddd00091

Salter, Jim, The Army Sprayed St. Louis With Toxic Aerosol During a Just-Revealed 1950s Test, Business Insider, October 4, 2012, https://www.businessinsider.com/army-sprayed-st-louis-with-toxic-dust-2012-10

Carlton, Jim, Of Microbes and Mock Attacks: Years Ago, The Military Sprayed Germs on U.S. Cities, The Wall Street Journal, October 22, 2001, https://www.wsj.com/articles/SB1003703226697496080

Military Once Used SF Fog For Simulated Germ-Warfare Attack, Exposing 800,000 To Harmful Bacteria, CBS Bay Area, July 10, 2015, https://www.cbsnews.com/sanfrancisco/news/military-used-san-francisco-fog-for-simulated-germ-warfare-attack-exposing-800000-people-to-harmful-bacteria/

The US Has a History of Testing Biological Weapons on the Public – Were Infected Ticks Used Too?, The Conversation, July 22, 2019, https://theconversation.com/the-us-has-a-history-of-testing-biological-weapons-on-the-public-were-infected-ticks-used-too-120638

Crockett, Zachary, How the US Government Tested Biological Warfare on America, Priceonomics, October 30, 2014, https://priceonomics.com/how-the-us-government-tested-biological-warfare-on/

Secret Testing in the United States, The American Experience, https://www.pbs.org/wgbh/americanexperience/features/weapon-secret-testing/

Barnett, Antony, Millions Were In Germ War Tests, The Guardian, April 21, 2002, https://www.theguardian.com/politics/2002/apr/21/uk.medicalscience

The Dorset Biological Warfare Experiments 1963-1975, https://ift.tt/2P0qnwG

The post That Time the United States Tested Biological Warfare on its Own Citizens appeared first on Today I Found Out.



from Today I Found Out
by Gilles Messier - April 10, 2023 at 12:59AM
Article provided by the producers of one of our Favorite YouTube Channels!
-

Monday, April 10, 2023

The Largely Forgotten Airship Disaster That Helped Kill These Cruise Ships of the Sky

The age of the giant rigid airship or Zeppelin was tragically brief, lasting barely forty years from the first Zeppelin flights in 1900 to the scrapping of the last surviving such airship, the Graf Zeppelin II, in 1940. When we think of the end of giant airships, we tend to picture the Hindenburg, which met its fiery end outside Lakehurst, New Jersey on May 6, 1937. But while the Hindenburg was one of the final nails in the coffin for the dream of commercial lighter-than-air flight, the death of the giant airship had begun long before, with a long string of increasingly deadly crashes stretching all the way back to the turn of the century. And among the worst was a now largely forgotten 1930 disaster that killed more people than the Hindenburg and ended the dream of airship travel in the British Empire. This is the story of the tragic loss of His Majesty’s Airship R-101.

The early leader in rigid airship technology was Germany, with Count Ferdinand von Zeppelin, a former Army officer, making his first experimental flights in 1900. Zeppelin’s designs were the first to combine innovations such as a rigid aluminium frame, multiple hydrogen-filled gas cells, air-filled ballonets for pressure control, and a water ballast system for altitude control into a single, workable package, and his sleek cigar-shaped creations soon became a common sight in the skies over southern Germany. Despite some early mishaps, within a decade Zeppelin had perfected the rigid airship into a dependable form of transport, and in 1910 he formed the world’s first airline, DELAG. The appeal of the Zeppelin as a form of commercial transport was obvious: compared to the small and primitive aircraft of the time, airships could carry a larger number of passengers longer distances in opulence and comfort approaching that of an ocean liner, with relatively little turbulence or risk of motion sickness. But there was a hitch: being lighter than air, Zeppelins were extremely vulnerable to the vagaries of wind and weather. Thus, unlike a modern airline, DELAG never kept a regular schedule, its Zeppelins only flying when the weather permitted. Despite this, the airline proved extremely popular with Germany’s elite, who gladly paid exorbitant fares for two-hour aerial excursions over the countryside.

To Zeppelin, however, DELAG was little more than a sideshow, for as a former Army officer he saw the airship primarily as a weapon of war. With the outbreak of the Great War in 1914, the German military, who had for years rejected Zeppelin’s proposals, finally expressed interest in airships and commandeered DELAG’s Zeppelins for military use. Particularly enthusiastic was the Imperial Navy, who on May 31, 1915 used Zeppelins to carry out the first aerial bombing raid on London. Such raids would continue throughout the war, and while they were a massive propaganda coup for the Germans and brought the horrors of war home to ordinary Britons, they caused relatively minor damage. And improved defences like antiaircraft guns and fighter aircraft firing incendiary bullets soon made Zeppelin raids increasingly costly, and resulted in most of Imperial Germany’s airships – and their crews – going down in flames.

After the war, with Germany’s economy in tatters and Zeppelin production banned by the Treaty of Versailles, the lead in airship development passed to the British. In July 1919, the British airship R.34 made history with the first east-west crossing of the Atlantic from East Fortune, Scotland to Long Island, New York, a journey of 108 hours. Four days later it completed the return journey, making the first two-way aerial crossing of the Atlantic. While this feat demonstrated the viability of long-distance airship travel, the struggling British economy could not support further development and the airship program was shut down in 1921. The initiative passed once again to the German Zeppelin Company, who after Versailles Treaty restrictions were relaxed in 1923 produced a number of increasingly massive airships for foreign customers, including the USS Los Angeles for the U.S. Navy. Then, in 1928, the company produced what would become the only truly successful passenger airship: the LZ-127 Graf Zeppelin. The Graf Zeppelin was unlike anything the world had ever seen, measuring a whopping 236 metres long and 30 metres tall and featuring accommodations for 24 passengers which rivalled most ocean liners for comfort and opulence. In 1929, in an event bankrolled by American newspaper magnate William Randolph Hearst, the Graf Zeppelin became the first airship to circumnavigate the globe, flying from Lakehurst, New Jersey to Friedrichshafen in Germany, then across the steppes of Siberia to Tokyo, then across the Pacific Ocean to Los Angeles, and finally cross-country back to Lakehurst, a total journey of 33,234 kilometres that took 21 days, 5 hours, and 31 minutes.

Meanwhile in Britain, the accomplishments of the Zeppelin company had reignited interest in airship travel. In 1924, after two years of lobbying, inventor and parliamentarian Sir Charles Burney introduced the Imperial Airship Scheme, an ambitious plan to connect Britain’s far-flung empire with a fleet of long-range airships. Unfortunately, Burney’s campaign coincided with the election of the Labour government of Prime Minister Ramsay MacDonald, who slashed Burney’s proposed fleet of 6 airships to only two and organized a bizarre contest. One airship, the R.100, would be built by private enterprise, while the other, the R.101, would be built by a Government concern. This competition, the Government hoped, would demonstrate the superiority of public enterprise; it also led to the R.100 being dubbed the “capitalist ship” and the R.101 the “socialist ship.” Of the scheme, R.100 assistant engineer Nevil Shute – later to become famous as the author of novels like On the Beach and A Town Like Alice – wrote:

“The controversy of capitalism versus state enterprise has been argued, tested and fought out in many ways in many countries, but surely the airship venture in England stands as the most curious determination of this controversy.”

Air Ministry specifications called for airships of not less than 140,000 cubic metres in volume, with a useful lifting capacity of 60 tons, accommodations for 100 passengers, fuel tankage for 57 hours’ flight, and a maximum speed of 110 km/hr. In case of war, the ships would also be expected to carry up to 100 fully-equipped troops or five fighter aircraft in onboard hangars.

R.100 was designed and built by the private engineering firm Vickers-Armstrong under the direction of engineer Barnes Wallis, later famous for designing the “bouncing bomb” used in the “Dambusters” raid during WWII. While ambitious in scale at 216 metres long – the second-largest airship at the time after Graf Zeppelin – in design the R.100 was relatively conservative, using proven technologies and construction techniques. This resulted in a vehicle that was not only sturdy, structurally efficient, and pleasant to fly, but which even exceeded its maximum design speed by nearly 20 km/hr. But it wasn’t all smooth sailing. The R.100’s construction hangar in Howden was infested with vermin and was so cold and humid that workers often arrived in the morning to find the airship’s aluminium frame covered in ice. A labour shortage even forced Vickers-Armstrong to throw together a workforce composed mainly of hastily-trained local farmers. But after four years of endless delays, R.100 finally took to the skies for the first time on December 16, 1929, flying to the Royal Airship Works hangar in Cardington for preliminary testing. After a lengthy shakedown period in which numerous adjustments were made to the airship’s engines and fabric covering, the R.100 was declared fit for its first international flight. The initial plan had been to fly to India, but as the R.100 was fuelled with gasoline – considered too volatile for use in hotter climates – the destination was changed to Canada. On July 29, 1930, the R.100 left Cardington with 44 passengers and crew aboard and headed out over the Atlantic, reaching St. Hubert outside Montreal after 79 hours. The R.100 was a smash hit, staying in Canada for nearly two weeks and drawing massive crowds wherever it went. Finally, on August 13, the giant airship departed Montreal and returned to Cardington, completing the return journey in only 57 hours. The trip was a triumph for Vickers-Armstrong, who now had a winning entry for Sir Charles Burney’s Imperial Airship Scheme.

The Government-built R.101, however, was a different matter entirely. From the beginning the Air Ministry was determined to push the limits of airship design and produce the safest, most technologically advanced airship ever built. This, however, led to a number of questionable design decisions. For instance, while most airships were built of lightweight aluminium alloy, R.101’s designers opted instead for stainless steel and a structural design that significantly reduced the volume of the lifting gas cells. Heavier diesel engines were also chosen over gasoline for greater safety in warm climates, and electric servomotors over simpler cable-actuated controls. This resulted in an airship so overweight that its useful lifting capacity was a mere 35 tons. In warmer climates like India this would drop to 24 tons due to reduced air density. And the problems didn’t end there. In an attempt to increase lift, the R.101’s designers loosened the nets holding the gas cells in place, causing them to chafe against the airship’s structure and leak gas. The cells also had a tendency to surge backwards and forwards, making the airship extremely unstable, while the new gas valves were so sensitive that they constantly vented precious hydrogen. Finally, instead of covering the entire structure with fabric and shrinking it tight with fabric dope as was common practice, the R.101’s designers decided to use smaller pre-doped panels laced into place, most of which rotted and tore before the airship ever left its hangar. In flight tests R.101 was sluggish and difficult to control, with a tendency to suddenly dive and climb. Indeed, about the only redeeming features of the airship were its palatial passenger quarters, which featured two decks of cabins, a gold-trimmed dining salon complete with potted palms, a promenade deck with picture windows, and even an asbestos-lined smoking room.

By the summer of 1930 it was clear that the R.101 was nowhere near ready to make its planned inaugural flight to Karachi in British India. But the successful Canadian flight of the R.100 forced the Government’s hand, for to delay the flight any further would be to admit defeat. Particularly insistent that the flight proceed on schedule was Lord Thomson of Cardington, the Secretary of State for Air. Thomson, who had designs on becoming Viceroy of India, believed that arriving triumphantly in the subcontinent aboard R.101 would greatly increase his chances of promotion, and scheduled the flight for September 1930 to coincide with the Imperial Conference in London.

But it was clear to even the most impatient Government official that the R.101 would never make it to India without extensive modifications. So on June 29, 1930, R.101 re-entered its shed at Cardington to be cut in half, lengthened, and fitted with an additional gas cell. It emerged on October 1 with an improved lifting capacity of 49 tons and total length of 236 metres – a metre longer than the Graf Zeppelin. In total, the British Government’s flagship airship had absorbed some 717,000 pounds of public funds, compared to only 500,000 for the R.100.

While some in Government like Sir Sefton Brancker, Director of Civil Aviation, objected to Lord Thomson’s haste, the die was already cast. The R.101 was given a temporary certificate of airworthiness on the condition that certain tests be conducted en route to India, and the next three days were spent feverishly preparing the giant airship for its maiden voyage. Apparently ignorant of the R.101’s weight problems, the Government proceeded to load the ship up with cases of champagne, beer, and silverware for state dining and unroll a heavy blue carpet down the main corridor and passenger lounge. And while each crew member’s baggage allowance was capped at 10 pounds, Lord Thomson’s weighed in at nearly a ton. Yet despite this and the R.101’s long history of handling problems, Thomson remained supremely confident, declaring that the mighty airship was “as safe as a house.” Nonetheless, he prudently chose to purchase extra insurance for himself and his valet.

Finally, on the evening of October 4, 1930, R.101 prepared to depart Cardington with 54 people aboard – 42 crew and 12 passengers including Lord Thomson, Sir Sefton Brancker, and Reginald Colmore, the Director of Airship Development. The weather was blustery, with winds gusting up to 60 km/hr, and clouds prematurely darkened the horizon. Under any other circumstances the flight would have been postponed, but Lord Thomson was determined the flight should depart on schedule. At 6:36 PM the R.101 detached from its mooring mast, but instead of rising the giant airship began to rapidly sink to the ground. Nearly two tons of water ballast had to be jettisoned before the ship could be levelled. It was a foreboding taste of things to come.

From Cardington the R.101 set a course for the southeast coast and the English Channel, struggling against a 40 km/hr headwind. The crew found the ship sluggish and difficult to control, often dipping as close as 150 metres to the ground. A resident of the town of Hitchin north of London heard the drone of the approaching airship and ran outside to witness an alarming sight:

“We rushed out – and there was the R101 aiming straight for the house. She was so low it didn’t seem as if she could miss it. We could see the people dining, and the electric bulbs in the ceiling. She seemed to be going very slowly. As the green and red tail lights moved away up the drive, horror descended on us all.”

At 8:21 the R.101 passed over London, and the radio operator signalled back to base:

“Over London. All well. Course now set for Paris.”

Despite encountering headwinds of up to 80 km/hr, the R.101 still did not turn back. And while airships typically cruised at a height three times their length, the R.101’s altitude over the Channel averaged barely 300 metres. Furthermore, heavy turbulence caused the gas valves to continuously pop open, venting hydrogen and forcing the crew to dump ever more ballast to keep the ship airborne.

At 11:36 the R.101 crossed the French coast at Pointe de St. Quentin near the mouth of the Somme River. Shortly thereafter, the radio operator sent another message back home:

“After an excellent supper our distinguished passengers smoked a final cigar, and having sighted the French coast, have now gone to bed to rest after the excitement of their leave taking. All essential services are functioning satisfactorily. The crew have settled down to watch keeping routine.”

The ship was quiet now as she traversed the final 100 km to Paris. At 2 AM the watch changed as usual, with Second Officer Maurice Steff relieving the ship’s Captain, Flight Lieutenant Carmichael Irwin, as Officer of the Watch. Four minutes later, however, the R.101 suddenly pitched forward and entered a steep dive. In the smoking lounge, foreman engineer Harry Leech, who was smoking a cigar before retiring to bed, was thrown against the wall along with all the lounge’s lightweight balsa wood furniture. The R.101 fell nearly 300 metres before Chief Coxswain George Hunt managed to level her. Realizing that a crash was inevitable, Second Officer Steff ordered slow on the engine telegraph, ordered Rigger Samuel Church forward to manually release the water ballast, and sent Coxswain Hunt to wake the Captain. At 2:09 AM, as Hunt ran down the corridor yelling “we’re down lads!”, the R.101 struck the ground at a speed of 8 km/hr near the edge of a forest called the Bois de Coutumes, 50 km northwest of Paris. The first person to see the airship go down was a rabbit trapper named Alfred Rabouille, who later described what happened next:

“There was at once a tremendous explosion that knocked me down. Soon flames rose into the sky to a great height. Everything was enveloped by them. I saw human figures running about like madmen in the wreck. Then I lost my head and ran away into the woods.”

In the starboard aft engine car, engineer Joe Binks had just relieved engineer Arthur Bell when the R.101 struck the ground. The pair looked on in horror as the envelope burst into flames, the skeletal steel frame glowing white-hot in the heat. A moment later, a ballast tank burst above them, soaking them in water, and the pair took the opportunity to escape the wreckage. Scrambling with their backs to the flames, they gasped for air as the flames consumed all the oxygen around them. They soon came across another survivor: engineer Harry Leech, who had clawed his way out of the smoking lounge and fallen into a tree. Once safely on the ground Leech returned to the wreck to try and rescue more passengers, but it was already too late. Of the R.101’s 54 passengers and crew, only five others escaped the flames: engineers Victor Savory and Arthur Cook, riggers Samuel Church and Walker Radcliffe, and wireless operator Arthur Disley. Church and Radcliffe later died of their injuries, bringing the total dead to 48 – more than in the more famous Hindenburg disaster, in which, incidentally, over half of those aboard survived.

By morning little remained of the R.101 but a charred skeleton at the edge of the wood. The bodies of the crew and passengers, most burned beyond recognition, were laid out under bedsheets, while the survivors were placed in the care of nuns at a nearby hospital. The disaster shocked the world, and a day of mourning was observed throughout France and her colonies. When the bodies were carried to the railway station for transport to Britain, 100,000 people and battalions of French infantry and cavalry escorted the coffins. In London, half a million people watched the funeral procession, which stretched for two miles and took an entire hour to pass. The bodies were buried in a common grave at Cardington and a marble memorial erected on the site.

A board of inquiry was convened, and to nobody’s surprise much of the blame for the disaster was placed on the Government’s excessive haste and corner-cutting, the board concluding:

 “It is impossible to avoid the conclusion that the R101 would not have started for India on the evening of October 4th if it had not been that reasons of public policy were considered as making it highly desirable for her to do so if she could.”

No charges were ever laid, as all those responsible – including Lord Thomson – had perished in the disaster. The actual cause of the crash itself, however, has never been determined, the leading theory positing that a gust of wind tore away a large section of the R.101’s fabric covering, causing its forward gas cell to suddenly lose a large volume of hydrogen. Nor is it known what caused the R.101 to catch fire, for many airships had crashed under similar circumstances without exploding. Here again theories abound, including that a severed electrical cable created sparks, that a gas cell rubbing against the structure generated static electricity, or that ballast water spilled onto calcium signal flares stored in the control car. But whatever the cause, the disaster soured British opinion towards giant airships, with newspaper headlines calling for the Government to “Ban the Gas Bags!” The Committee on National Expenditure recommended that the Imperial Airship Scheme be scrapped, and in November 1931 the R.100, despite its excellent flight record, was broken up and sold as scrap for barely 600 pounds.

But while the crash of the R.101 was the worst aviation disaster in British history up until that point, it was neither the first nor the worst airship disaster. On December 21, 1923, the French military airship Dixmude, a former German Zeppelin confiscated as war reparations, exploded in a thunderstorm off the coast of Sicily, killing 52 of its crew. A year earlier on February 21, 1922, the Italian-built U.S. Navy airship Roma crashed into power lines near Norfolk, Virginia and exploded, killing 34. This disaster convinced the Navy to fill all its subsequent airships with safer helium, on whose production the U.S. Government held a virtual monopoly. But even this could not save the airships from their traditional enemy: the weather. On September 3, 1925, the USS Shenandoah was on a goodwill tour of the Midwest when it ran into a squall line near Caldwell, Ohio and was torn apart, killing 14 of its 43 crew.

In 1929, believing it could outdo the venerable Zeppelin company, the U.S. Navy began construction on two Akron-class airships, the largest helium-filled dirigibles ever built. Measuring 239 metres long, the Akron class ships were designed to perform forward aerial reconnaissance for the fleet and serve as flying aircraft carriers, carrying a complement of Curtiss F9C Sparrowhawk fighters which could be launched and recovered using a retractable trapeze. On April 4, 1933, the U.S.S. Akron, the lead ship in her class, was conducting an exercise off the coast of New Jersey when she encountered a storm and was thrown into the sea. The airframe quickly broke up and sank in the stormy waters, carrying 73 of her 76 crew to their deaths. It was the single worst airship disaster in history. Two years later on February 12, 1935, her sister ship, the U.S.S. Macon, ran into a storm off Point Sur, California and crashed. While lessons learned from the Akron crash led to all but 2 of her crew being rescued, the Macon was also swallowed up by the waves, bringing the era of American airships to an end.

The decommissioning of the U.S. Navy’s last remaining airship, the U.S.S. Los Angeles, left only one giant airship operating in the world: the German Hindenburg…and, well, we all know how that ended. No longer able to be used for passenger flights, the Hindenburg’s sister ship, Graf Zeppelin II, was acquired by the German Air Ministry and used to probe British radar defences before finally being broken up in April 1940 so its aluminium could be used to build other aircraft. The age of the rigid airship was over.

During their brief heyday, airships seemed to many like the way of the future, promising a more elegant form of air travel with all the grace and comfort of a luxury ocean liner. But the stately giants of the air never lived up to that promise, proving more troublesome and deadly to operate than their champions dared admit. After barely 40 years they were gone from the skies, for by then it was clear that the future of aviation belonged not to the airship, but to the aeroplane.

If you liked this article, you might also enjoy our new popular podcast, The BrainFood Show (iTunes, Spotify, Google Play Music, Feed), as well as:

Expand for References

Botting, Douglas, The Giant Airships, The Epic of Flight, Time Life Books, Alexandria, Virginia, 1981

The Imperial Airship Scheme, Airship Heritage Trust, https://www.airshipsonline.com/airships/r101/index.html

R.101 Crew and Passenger List, Airship Heritage Trust, https://www.airshipsonline.com/airships/r101/R101%20Passenger%20%20Crew%20List.htm

Crash of the British Airship R-101, https://ift.tt/qZItufh

The post The Largely Forgotten Airship Disaster That Helped Kill These Cruise Ships of the Sky appeared first on Today I Found Out.



from Today I Found Out
by Gilles Messier - April 10, 2023 at 10:02PM
Article provided by the producers of one of our Favorite YouTube Channels!
-