Sunday, July 31, 2022

Is There Such a Thing as a Sound or Smell So Strong it Can Kill You?

There are a great many things in this world that can kill you – in fact, there is almost nothing that cannot – but can your own senses be a danger to you? Is there a noise loud enough to kill you, or can you be so assaulted by a bad smell that you never recover from it?

The answer to the first question is a clear ‘yes’. Loud noises can actually directly harm or even kill you.

Beyond the obvious case of a loud noise distracting you or startling you to the extent that you, say, jump and smack your head on something, causing irreversible brain damage or the like, very loud noise can also directly damage your tissues. For example, exposure to sounds over 150 decibels would typically burst your eardrums. To understand how this works, we should first think about what noise actually is.

The decibel (named for Alexander Graham Bell) is a unit used to express sound pressure level – acoustic energy is, after all, just waves of varying sound pressure, and the higher the energy, the louder the sound. It’s important to understand that each decibel level corresponds to a pressure value in micropascals (the pascal being the S.I. standard unit of pressure), and that the progression is logarithmic rather than linear. The standard reference pressure of 20 micropascals is defined as 0 dB, and doubling the sound pressure adds roughly 6 dB rather than doubling the decibel figure.

Generally, human speech is around 60 decibels, while a chainsaw is around 110 dB. As noted, the eardrums will be damaged at values around 150 dB – the level of noise you would find close up to a jet aircraft during take-off.
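To make the logarithmic nature of the scale concrete, here is a minimal Python sketch of the standard dB SPL conversion (the 20-micropascal reference and the 20·log10 formula are the textbook definitions; the example levels are simply the approximate figures quoted above):

import math

P_REF = 20e-6  # standard reference pressure for dB SPL: 20 micropascals corresponds to 0 dB

def pressure_to_db(pressure_pa):
    # Convert a sound pressure in pascals to a level in dB SPL.
    return 20 * math.log10(pressure_pa / P_REF)

def db_to_pressure(db):
    # Convert a level in dB SPL back to a sound pressure in pascals.
    return P_REF * 10 ** (db / 20)

print(round(pressure_to_db(40e-6), 1))  # doubling the reference pressure gives ~6.0 dB
print(round(pressure_to_db(80e-6), 1))  # doubling again gives ~12.0 dB

for label, level in [("speech", 60), ("chainsaw", 110), ("eardrum rupture", 150)]:
    print(f"{label}: {level} dB SPL is roughly {db_to_pressure(level):.3g} Pa")

Run it and you will see that doubling the pressure adds about 6 dB, and that the roughly 150 dB needed to burst an eardrum corresponds to a sound pressure on the order of 30,000 times higher than ordinary speech, even though the decibel figure is only two and a half times larger.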

There are, of course, even louder noises; near a rocket launch, for example, you might find sounds in excess of 170 dB. As you might have guessed from all of this, even louder sounds – pressure waves capable of crushing the body – will not be perceived, or at least not for long, through the already destroyed eardrums, but they could cause an air embolism in your lungs, which then travels to your heart and kills you. Alternatively, your lungs might simply be severely damaged by the sudden increase in air pressure.

Naturally, experiments to see what humans can take here are rather scarce, but experiments on our little furry friends indicate that, for example, lung and other tissue damage, including damage to internal organs, occurs at just over 180 dB and becomes quite deadly not far beyond that.

So are there devices out there designed in such a way as to be capable of killing people with sound? Given the experiments done on mice, you will be unsurprised to learn that the answer is yes, such as one developed by the European Space Agency. Of course, murdering humans wasn’t exactly the point of making theirs… or so our lizard overlords, by Grabthar’s hammer may they reign forever, would have you believe. For those who refuse to see the truth, the agency’s primary reason for building the machine was to test satellites. You see, as noted, when a rocket takes off it is, well, extremely loud, owing to the vibrations of air molecules set off by the massive release of energy. As you might expect, these vibrations can often be harmful to the delicate electronics and other parts of any satellite cargo. Various methods are used to get around the problem. For example, one method of dampening the noise is to pour immense amounts of water onto the launch platform during lift-off, on the order of 300 thousand gallons or so… Better, of course, is to simply build satellites capable of taking such a sound wave beating. Thus, a dedicated system was built to produce sounds at extremely high decibel levels to test satellites with. A perfectly innocent use for this killer sound machine.

Beyond this, there are of course weapons specifically built to exploit the weaknesses of the human sense of hearing: so-called ‘sound cannons’ and, of course, the famous ‘Mosquito’ sonic device. As for the former, these usually work by producing highly directional ‘sound cones’ of very loud audible sound. Ideally, if you stand just outside the cone you hear nothing, or very little. Such devices are mainly used by police or military forces to target unwanted or dangerous gatherings in order to help disperse them.

But what about the Mosquito device? This one targets the bane of all human existence: the dreaded teenager. The acoustic range of the human ear is generally defined as 20 Hz to 20 kHz, but that’s just a ballpark range. In fact, the older you are, the narrower the range, helping ensure the more wizened among us don’t have to listen to crickets asking for sex all night long. On the other side of things, little human parasites are typically able to hear higher frequencies than the adults they leech off of. Enter the Mosquito sonic device, designed to emit sounds at frequencies teens and younger can hear, helping to disperse teenagers loitering around shops or other such places where they aren’t wanted.
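To illustrate the principle rather than the actual commercial device – whose exact output is not specified here, so the 17.5 kHz figure below is simply an assumed value near the upper edge of youthful hearing – the following Python sketch generates a two-second tone at that frequency and writes it to a WAV file using only the standard library:

import math
import struct
import wave

SAMPLE_RATE = 44100   # CD-quality sampling rate, in samples per second
FREQ_HZ = 17500       # assumed tone frequency: audible to most teens, not to most adults
DURATION_S = 2.0
AMPLITUDE = 0.3       # fraction of full scale, kept modest

frames = bytearray()
for n in range(int(SAMPLE_RATE * DURATION_S)):
    sample = AMPLITUDE * math.sin(2 * math.pi * FREQ_HZ * n / SAMPLE_RATE)
    frames += struct.pack("<h", int(sample * 32767))  # 16-bit signed little-endian sample

with wave.open("mosquito_tone.wav", "wb") as wav:
    wav.setnchannels(1)            # mono
    wav.setsampwidth(2)            # two bytes, i.e. 16 bits, per sample
    wav.setframerate(SAMPLE_RATE)
    wav.writeframes(bytes(frames))

Played back on speakers capable of reproducing it, such a tone tends to be inaudible to most middle-aged adults while being quite piercing to younger ears – which is the whole trick behind the device.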

Speaking of teenagers, this all brings us to deadly smells.

Whether or not a material has a smell depends on how easily it can enter a gaseous state – a property known as volatility. Because heat and, in some cases, humidity can increase the chemical activity of many materials, they are more likely to give off a scent when warmed or dissolved.

This is where the nose comes in. Within it sits the so-called olfactory epithelium, which is normally covered with mucus. This mucus helps dissolve fragrant particles so that the olfactory receptor cells can really begin doing their job. These exist in abundance; each nose has around four million of them. As with most cells, they are further specialized and belong to one of about 400 different types, each of which can only detect particular smells. Not everyone has all types of receptor cells; it therefore depends on genetics whether or not a person can smell a particular odor.

A research team based at the University of Manchester has, for example, been able to determine that Homo sapiens and its distant relatives the Neanderthals and Denisovans share the ability to identify the scent of pigs and their ancestors, highlighting how evolutionarily important these animals must have been to their respective diets.

Smell in particular has the additional curious trait of being closely connected with emotional association. This is the case because the part of the brain whose function it is to sort out smells, the olfactory bulb, belongs to the limbic system, which is responsible for how we behave, our moods and emotional responses, and even memory and learning.

But as nice as it is that the scent of a particular brand of tobacco might involuntarily make one fondly remember a deceased great-grandfather, what is of interest to the topic at hand is the function of olfaction as a warning mechanism.

When an odor is identified as bad, the body reacts without consulting you first. Depending on just how terrible the smell is, how close you are to the source, and how dangerous your brain deems that smell, you may experience the following.

At first, your nostrils may flare. If the scent is faint, this is to draw in more of it in order to determine what it is. Your nose may wrinkle – closing half of your membranes to avoid further exposure – and your facial muscles contract to signify disgust to the humans around you, also warning them. You may automatically hold your breath until you can get away from the source of the odor. If the smell is especially bad, you may begin gagging – the body’s natural reaction for getting rid of a harmful substance. Your eyes and nose will produce more liquid, as will your mouth, to similarly flush it out. In the worst case, you may throw up.

What it doesn’t do, no matter how much you might describe a particularly smelly fart as “deadly”, is kill you, with one caveat we’ll get into shortly. Your sense of smell is supposed to warn you, and therefore protect you. No reaction of chemicals with your olfactory receptors can cause death, at least not directly. However, smell can play a role in detecting dangerous substances, and if you do not listen to it, these substances may kill you in other ways.

For example, even a small amount of smoke will alert us to deadly danger, although the flames themselves may not be the most dangerous facet of fire.

It is, as mentioned, not the odor of smoke that will cause permanent damage or even death, but rather the poisonous particles it may carry. For example, smoke from rubber, plastic or foams that contain polyvinyl chloride (PVC) is particularly harmful. When burnt, these produce, among other dangerous substances, hydrogen chloride – a gas which, on contact with mucus such as that in your nose, mouth and throat, forms hydrochloric acid, and which is also inevitably inhaled into your lungs, where it does the same thing. It cannot be stressed enough how much you do not want this in your lungs if you like them at all.

Hydrogen cyanide may likewise be a product of burning plastic, but it can also occur in other places. Cyanide gas forms an interesting case, as you – depending on your genetic code – may or may not be able to smell it. While not exclusive to bitter almonds, this is the scent you are looking for. And if you do detect it, you had better calmly retreat to an open space with plenty of fresh air around you. Cyanide may also be ingested or even absorbed through your skin while touching contaminated soil, but in gaseous form it is at its most dangerous.

Again, it is not the interaction between cyanide molecules and odor receptors that will kill you when you are exposed for a while or to larger quantities. Rather, cyanide triggers a chemical reaction in which the cells can no longer use oxygen normally, so the blood remains oxygenated as it passes through your body and back to the lungs. If you’d like to cease to continue to be, you have nothing to worry about here.

Another effect of a gas may be that it causes long-term damage. An example of this is chloroform. A staple of Hollywood, the damsels and detectives knocked out with a cloth doused in this substance held over their mouths may actually have greater issues if they survive whatever showdown is about to occur, as exposure to chloroform can not only cause fatal cardiac arrhythmia or respiratory failure on the spot, but also be a factor in developing liver or kidney cancer. This is particularly tragic, as between 1842 and the 1930s chloroform was a very common anaesthetic in both veterinary and human medicine.

Now, all that said, there is one roundabout way in which a smell can be deadly without directly causing damage to your body, and this has to do with another natural response to bad smells – vomiting.

Nausea is one of the strongest of all warning signs that whatever is causing a particular odor should be avoided. But should you find yourself incapable of moving and lying or sitting in a position where vomit cannot simply flow freely from your mouth (for example after a long night at the bar), there is a chance that, when faced with a particularly revolting smell, you may be in genuine danger because of it. With no place to go, the vomit may be held back in your mouth and nose, and even get into your lungs, effectively drowning you. In this case, ironically, it is the warning mechanism itself that induces your body to accidentally kill you.

As with sounds, the military has been quick to utilize the power of bad smells, though this has sometimes backfired. For example, members of the French Resistance in World War II attempted to use a sort of stink bomb called ‘Who Me’ against the occupying Germans. The scent was reminiscent of fecal matter and was meant to be sprayed on German soldiers, mostly to humiliate them. However, the sulfurous smell was nearly impossible to control and escaped easily, settling on anything it touched, therefore also affecting the French, who, being naturally smelly, presumably didn’t notice the change.

Moving on from there, the likes of the U.S. Department of Defense occasionally use stink bombs to disperse crowds.

We’re still waiting on a military agency to combine these two weapons, creating the ultimate in sensory killers, capable of 180 dB+ sounds and smells so pungent they induce instant vomiting – the dreaded Nuclear Whoopee Cushion.






from Today I Found Out
by Nasser Ayash - July 31, 2022 at 11:53PM

Wednesday, July 20, 2022

Why Do We Want to Squeeze/Bite/Pinch Cute Stuff So Badly?

Has this ever happened to you? You are presented with something unbearably cute – a baby, perhaps, or a puppy or a kitten – and are suddenly gripped by an overwhelming desire to pinch, squeeze, crush, or even bite the little thing. You clench your hands, grit your teeth, and maybe even let out an audible growl, so overtaken are you by this sudden rush of aggression. If so, then you are not alone. Known as “cute aggression,” this response is experienced by around 70% of the adult population. But why? At first glance, cute aggression would appear to be evolutionarily maladaptive, increasing our chances of harming the very things – whether human or animal – we are supposed to be taking care of. However, recent behavioural and neurological research suggests that this apparently contradictory response may be more psychologically useful than it might at first appear.

While the English term “cute aggression” was first coined in 2013, the phenomenon has been known about for much longer, and many languages even have a specific word for it: for example, “gigil” in Filipino Tagalog, “geram” in Malay, “gemas” in Indonesian, and “muchlovat” in Czech. In behavioural terms, cute aggression is what psychologists call a dimorphous response, in which a positive experience elicits a response usually associated with negative emotions – and vice-versa. Common examples of dimorphous responses include crying during a happy or romantic scene in a movie or laughing uncontrollably when frightened or stressed. In the case of cute babies or animals, the expected response should be an overwhelming desire to care for or protect said creature. Indeed, the very concept of “cuteness” and the instinct to care for cute things are evolutionarily hard-wired into our brains. In 1943, Austrian ethologist Konrad Lorenz proposed what he called the Kindchenschema or “baby schema,” a set of physical characteristics which our brains interpret as being distinctively “baby-like” and which elicit a strong nurturing instinct. These characteristics include a large, rounded head; a small chin, mouth, and nose; large ears; and large, low-set eyes. The more a face conforms to this archetype, the cuter we consider it to be. Our innate attraction to the Kindchenschema has even bled over into our treatment of domesticated animals. Dogs, for example, have been selectively bred over thousands of years to appear more and more puppy-like, with floppy ears and more rounded features – such that an adult Labrador looks considerably younger than a wolf of the same age. The arts and entertainment industry has also taken note of this principle, with cartoons like Mickey Mouse or characters from Japanese manga and anime gradually evolving to possess increasingly childlike features such as rounded heads and disproportionately large eyes.

But if we are hard-wired to care for and protect cute things, why do so many of us also feel like squeezing them so hard? The first formal study to tackle cute aggression – and give it its name – was conducted in 2013 by a Yale University research team led by psychologist Oriana Aragón. Aragón and her colleagues had 105 online participants fill out a questionnaire featuring such questions as “I can be so happy to see someone that I cry,” “I can be so angry that I laugh,” “If I am holding an extremely cute baby, I have the urge to squeeze his or her little fat legs,” and “I am the type of person that will tell a cute child “I could just eat you up!” through gritted teeth.” The study found that around 64% of respondents confessed to having felt the urge to squeeze a cute baby or animal, while 74% confessed to having acted on that impulse. As a follow-up, the team invited 90 participants to come into the laboratory and watch a slideshow of either cute, funny, or neutral animals. As they watched, the participants were given a sheet of bubble wrap and told to pop as few or as many bubbles as they wanted. Those who watched the cute slideshow popped an average of 120 bubbles, compared to 100 for the neutral slideshow and 80 for the funny one – providing empirical evidence of an aggressive response to cuteness.

As for why this occurs, Aragón and her colleague Rebecca Dyer hypothesize that cute aggression may serve a regulatory function, allowing people to better control and apply their nurturing instincts. Other neurological studies have shown that cute stimuli activate the mesocorticolimbic or reward system of the brain, and that in certain cases this activation can be overwhelming. As Rebecca Dyer explains: “We think it’s about high positive-affect, an approach orientation and almost a sense of lost control. You know, you can’t stand it, you can’t handle it, that kind of thing.”

According to psychologist Katherine Stavropoulos, in a caretaking situation such an emotional overload could prove detrimental to the child being cared for: “A baby can’t survive alone, but if you’re so overwhelmed by how cute it is and how much you love it, then you can’t take care of it, and that baby won’t survive.”

Aragón and Dyer thus posit that cute aggression might serve as a check to such intense positive feelings, preventing the person from becoming overwhelmed:

“It might be that how we deal with high positive-emotion is to sort of give it a negative pitch somehow, that sort of regulates, keeps us level and releases that energy. It could possibly be that somehow these expressions help us to just sort of get it out and come down off that baby high a little faster.”

To determine whether such a regulating effect was indeed present, in 2018 Stavropoulos and her colleagues at the University of California, Riverside conducted an experiment in which 54 participants were exposed to images of both younger and older-looking animals and babies, some of which had been photographically manipulated to make them especially cute. Prior to the experiment, participants were given a questionnaire similar to Aragón and Dyer’s, asking whether they had experienced any common dimorphous reactions in the past, while afterwards they were asked to describe their emotional reaction to the baby and animal images. Participants were then made to carry out a neutral task such as a word search before being exposed to another batch of images. Stavropoulos hypothesized that not only would those who reported experiencing dimorphous reactions in the past experience more intense emotional reactions to the images, but that they would also experience a less intense reaction in the second session. The results of the study appear to confirm this:

“The expressors of cute aggression are coming down off of that cute-high faster. [But] it could be that they’re just moving back down to baseline because they move more than people that don’t and so it’s really hard to detangle.”

To study the neurological underpinnings of the cute-aggression response, Stavropoulos next wired the participants to an EEG or brainwave monitor and monitored their responses as they watched the baby and animal images. She observed a strong event-related potential or ERP signal which appeared to confirm the results of the earlier study, being stronger in those prone to other dimorphous reactions and weaker on average in the second round of image exposure.

This appears to suggest that cute aggression is part of a complex emotional tug-of-war intended to keep the brain functioning smoothly in the face of powerful emotional stimuli:

“It’s not just reward and it’s not just emotion. Both systems in the brain are involved in this experience of cute aggression. The appetitive side of the reward system is about that forward momentum, the antsy feeling, the pursuit, the urge. So it could be that when we see this aggressive expression, it’s an expression of that urge. It’s showing that you want to get to the baby.”

However, Stavropoulos believes that cute aggression might serve another function entirely:

“The very first thoughts were that maybe it is about some sort of emotional homeostasis but we have much stronger and more consistent evidence suggesting it’s something else – a powerful communication signal…[it] reminds you how much bigger and stronger you are physically than this cute little thing.”

Stavropoulos further suggests that the negative facial expressions induced by the onset of cute aggression may communicate to a baby that one is concerned with their welfare and will likely take care of them.

But whatever its ultimate purpose, the fact that cute aggression is not a universal experience is what truly fascinates her:

“When I describe the phenomenon to people, I usually see that about 70 to 75 percent of people nod immediately and know exactly what I’m describing and have experienced it. They think “this is weird; I’m probably the only one who feels this way. I don’t want to hurt it. I just want to eat it.” The other 25 to 30 percent look at me strangely and have no clue what I’m talking about or why anyone would feel that.”

That remaining 25-30% may also offer valuable insights into other, more serious psychological conditions such as sociopathy, psychopathy, or postpartum depression, all of which involve difficulty feeling empathy or nurturing instincts. Cute aggression may also help explain certain aspects of autism, as Stavropoulos explains:

“There’s a lot of literature about people with autism having service dogs with huge success, or having horses they really connect with that help them understand the social world. Maybe they feel the strong caretaking urge but don’t feel overwhelmed, and that’s a strength of theirs.”

The upshot of all of this, however, is that as odd as it might feel, cute aggression is entirely normal. As Rebecca Dyer reassures us: “We don’t have a bunch of budding sociopaths in our studies that you have to worry about.”

So go ahead: squeeze that adorable puppy. You know you want to.


Expand for References

Hamilton, Jon, When Too Cute is Too Much, The Brain Can Get Aggressive, NPR, December 31, 2018, https://www.npr.org/sections/health-shots/2018/12/31/679832549/when-too-cute-is-too-much-the-brain-can-get-aggressive

Stavropoulos, Katherine & Alba, Laura, “It’s So Cute I Could Crush It!”: Understanding Neural Mechanisms of Cute Aggression, Frontiers in Behavioral Neuroscience, December 4, 2018

Aragón, Oriana et al, Dimorphous Expressions of Positive Emotion: Displays of Both Care and Aggression in Response to Cute Stimuli, Psychological Science, 2015, Volume 26(3), https://clarkrelationshiplab.yale.edu/sites/default/files/files/Psychological%20Science-2015-Aragón-259-73.pdf

Explainer: What is Cute Aggression? The Conversation, September 9, 2013, https://theconversation.com/explainer-what-is-cute-aggression-16884

Pappas, Stephanie, ‘I Wanna Eat You Up!’ Why We Go Crazy for Cute, Live Science, January 21, 2013, https://www.livescience.com/26452-why-we-go-crazy-for-cuteness.html

Katz, Brigit, Why We Want to Squeeze Cute, Little Things, Smithsonian Magazine, December 31, 2018, https://www.smithsonianmag.com/smart-news/why-we-want-squeeze-cute-little-things-180971143/

Brandt, Katie, Cute Aggression: Why You Want to Squeeze Adorable Creatures, Brain Facts, September 10, 2019, https://ift.tt/MTvE6fC

Mull, Amanda, This is Your Brain on Puppies, The Atlantic, December 11, 2018, https://www.theatlantic.com/health/archive/2018/12/cute-aggression-its-so-fluffy/577801/




from Today I Found Out
by Gilles Messier - July 20, 2022 at 02:28PM

The Surprisingly Long and Determined Effort to Create a Flying Submarine

In our previous video, The Surprisingly Long and Determined Effort to Create a Literal Flying Tank, we looked at how designers in the 1930s and 40s devoted a considerable amount of time and effort trying to combine two of the 20th Century’s most revolutionary weapons of war: the tank and the aeroplane. But as ill-conceived and ultimately futile as these projects were, they were far from the strangest attempts to create a hybrid military vehicle. That dubious distinction instead belongs to an improbable series of efforts to mash together the two unlikeliest of vehicles: the aeroplane…and the submarine.

It will come as no surprise to regular viewers that the first nation to tinker with such a vehicular abomination was the Soviet Union. In 1937, while studying at the Dzerzhinsky Naval Engineers’ Academy in Leningrad (present-day Saint Petersburg), Soviet engineer Boris Ushakov drafted a technical proposal for a vehicle which could operate both in the air and underwater. Featuring thick, stubby wings resembling a manta ray and a pair of floats for takeoff and landing, Ushakov’s flying submarine would be powered by three 800-horsepower gasoline engines on the surface and an electric motor underwater, giving it a maximum speed of 100 knots in the air and 3 knots submerged. Once the craft landed, the transition from aeroplane to submarine would be accomplished by sealing off the engine compartments with retractable metal plates and flooding empty spaces in the wings and floats, causing the craft to submerge. The cockpit would also be flooded, forcing the crew to retreat into a watertight compartment complete with conning tower and periscope from which the submarine would be controlled. The craft’s armament was to be two 18-inch torpedoes mounted under the hull.

But what possible use could any Navy have for such an outlandish vehicle? As absurd as it might seem, Ushakov’s concept actually filled a number of roles that aircraft and submarines of the time could not. While fast, agile, and able to carry large weapons payloads, aircraft of the 1930s were far from stealthy, a fact which became increasingly relevant with the wide-scale adoption of radar. On the other hand, submarines, while stealthy, were also extremely slow underwater and largely blind, relying on periscopes and hydrophones to track and home in on their targets. Aircraft and submarines were also largely ill-suited to attacking enemy ships in harbour, which were typically defended by extensive antiaircraft batteries and antisubmarine obstacles like booms and nets. Throughout the 1930s and 1940s, there were numerous attempts to solve these tactical shortcomings, such as the development by several nations of midget submarines capable of infiltrating harbours and other protected spaces. This approach was pioneered by the Italian Navy, whose elite Decima Flottiglia MAS unit of combat frogmen used specially-designed human torpedoes nicknamed maiale or “pigs” to carry out a series of daring raids against Allied shipping in Alexandria, Malta, and Gibraltar. The maiale were copied by the British – who dubbed them “Chariots” – and, along with more conventional midget submarines known as X-Craft, used in a number of unsuccessful attempts to sink the German battleship Tirpitz at her anchorage in Norway. Japanese Type A Ko-hyoteki  midget submarines participated in the 1941 attack on Pearl Harbour and two 1942 attacks on Sydney Harbour and Diego Suarez Harbour in Madagascar, while various German midget submarines like the Neger, Seehund, and Biber carried out attacks on Allied shipping in the English Channel in the final years of the war. However, none of these vehicles proved as effective as their designers had hoped. For one thing, they had a limited range, requiring them to be carried close to their target, launched, and retrieved by a larger mother submarine. They were also slow, difficult to control, and despite their small size, easily spotted and engaged by enemy defensive forces. Consequently, the vast majority of midget submarine operations ended in the death or capture of their crews.

Another potential solution to the stealth-vs-speed conundrum was the submarine aircraft carrier. In the 1920s, a number of submarines were built to carry a small reconnaissance floatplane in a special watertight hangar behind the conning tower. Once the submarine had surfaced, the aircraft would be removed from its hangar, assembled, and launched using a steam catapult built into the deck. Upon completing its mission, the aircraft would land alongside the submarine and be hoisted aboard using a crane. While both the French Surcouf and British HMS M2 cruiser submarines possessed this capability, the most famous submarine aircraft carriers ever built were the Japanese I-400 class. The largest submarines ever fielded during WWII and the largest ever built until the 1960s, the I-400s were designed to carry and launch three folding Aichi M6A Seiran floatplanes, each capable of carrying 900 kilograms of bombs. The Japanese Navy planned to use these unusual weapons to attack the Panama Canal, San Diego, and Ulithi Atoll, but Japan surrendered before any of these plans could be carried out. The three completed I-400s were captured by the Americans, examined, and scuttled to prevent the Soviets from learning their technological secrets.

But as impressive as they were, the I-400s suffered from a fatal flaw. Launching and retrieving aircraft took up to 45 minutes and could only be done while the submarines were on the surface, making them highly vulnerable to detection and attack. Boris Ushakov’s flying submarine, on the other hand, neatly solved this problem. The craft could theoretically cover vast distances of ocean at high speeds, allowing it to track down and shadow an enemy fleet. It could then land, submerge, and use the cover of darkness to attack the fleet before stealthily slipping away. The craft was also well-suited to infiltrating harbours, able to fly over minefields, anti-submarine nets, and other defences before landing in the harbour basin, submerging, and attacking enemy shipping using torpedoes. Indeed, the Soviet Navy saw sufficient merit in Ushakov’s idea to submit his proposal to its Scientific Research Committee for evaluation. But while the concept made it through two rounds of official evaluations and revisions, it was ultimately rejected as too impractical, and Ushakov’s vehicular chimera never made it off the drawing board.

But the allure of the flying submarine never truly died, and the following decades would see numerous attempts to resurrect the concept. In 1961, American inventor Donald V. Reid of Ocean Township, New Jersey, cobbled together various discarded aircraft parts to create a working flying submarine, which he rather unimaginatively dubbed the Reid Flying Submarine or RFS-1. Though far smaller than Ushakov’s design at only 10 metres in length, Reid’s vehicle worked on exactly the same principle. Looking like something out of a contemporary James Bond movie, in the air, RFS-1 was powered by a 65-horsepower engine and propeller mounted on a tall pylon behind the cockpit, while underwater it was propelled by a 1-horsepower electric motor, diving being accomplished by flooding the craft’s fuselage and twin pontoons. The transition from flying to diving, however, was a less than elegant process, requiring the pilot to remove the propeller and seal off the engine pod using a rubberized cloth cover. The craft’s open cockpit also required the pilot to use Scuba gear to breathe while submerged. Nonetheless, on June 9, 1964, RFS-1 made the world’s first – and thus far only – full-cycle flying submarine flight over the Shrewsbury River, flying at 10 metres altitude before submerging and achieving a speed of 2 knots at a depth of 2 metres. While the craft’s immense weight limited it to making short, low-altitude hops, Reid proved that a flying submarine was a workable proposition, inspiring dozens of future efforts to perfect the concept.

In the same year as Reid’s historic flight, the Proceedings of the U.S. Naval Institute published a study by Naval hydrodynamics engineer Eugene Handler examining the feasibility of a flying submarine. As in Boris Ushakov’s original 1937 concept, such craft were envisioned for use against enemy shipping in harbour or in closed, heavily-defended waters like those of the Baltic, Black, or Caspian seas. As the Navy article states:

“Handler writes of a possible craft with an operating depth of 25 to 75 feet, a submerged speed of five to 10 knots for four to 10 hours, airspeed of 150 to 225 knots for two or three hours and a payload of 500 to 1,500 pounds. He says it is believed these characteristics can be attained within a vehicle weighing 12,000 to 15,000 pounds. A little flying sub might carry out its mission and take its crew back. It could, Handler says, fly from a favourable location to its destination at minimum altitude to avoid detection by radar. At the completion of its underwater mission it could travel as a submersible to a location best suited for takeoff, become airborne and return to base…. The Bureau of Naval Weapons has recently awarded a contract to the Convair and Electric Boat Divisions of General Dynamics for analytical and design studies of the essential components and operational aspects of such a vehicle.”

In the same article, Handler also acknowledges the various technical and bureaucratic obstacles which had long held back the development of a flying submarine, stating that:

“The development of a practical flying submarine prototype will be both complex and laborious, but the potential returns are substantial and valuable. Consequently the concept of such a vehicle merits careful engineering examination rather than the overly optimistic accolade of a few imaginative enthusiasts and the simultaneous cold shoulder denial of the hard headed realist.”

Inevitably, like every other flying submarine project, Handler’s concept also never made it off the drawing board. However, it may have directly inspired an iconic piece of pop culture. While developing the 1965 television series Voyage to the Bottom of the Sea, producer Irwin Allen hired researcher Elizabeth Emanuel to compile an archive of existing underwater technology on which to base the show’s vehicles and props. Among the material Emanuel uncovered in her research was the U.S. Navy’s flying submarine study, which is thought to have inspired the very similar stingray-shaped vehicle prominently featured in the show and which first introduced the concept of the flying submarine to the general public.

One of the fundamental flaws with the flying submarine concept is the need for a strong watertight compartment for the crew, which significantly increases the weight of the vehicle and makes it difficult to achieve flight. If the crew are eliminated altogether, however, then this particular engineering problem suddenly becomes a whole lot simpler. The first attempt to launch an unmanned aerial vehicle or UAV from a submarine was made in 1946, when a U.S.-built version of the WWII German V-1 cruise missile called the JB-2 Loon was test-launched from the deck of the USS Cusk. These experiments ultimately resulted in the development of the SM-N-8 Regulus, the U.S. Navy’s first submarine-launched nuclear missile. While a significant leap forward, the Regulus was a fundamentally flawed weapon. Carried in special water-tight compartments built into the submarine’s hull, the Regulus could only be extracted and launched once the submarine had surfaced. This made Regulus-equipped submarines extremely vulnerable to detection and attack, just like the Japanese I-400 submarines before them. This problem was eventually solved by the development of the Polaris and Trident ballistic missiles and the Tomahawk cruise missile, which could be launched while the submarine remained safely submerged.

In 1991, the Strategic Arms Reduction Treaty or START signed by the United States and the Soviet Union left the U.S. Navy wondering what to do with half of its ballistic missile submarines, whose nuclear payloads had been outlawed by the treaty. This resulted in a flurry of proposals for alternative non-nuclear weapons to occupy this considerable underwater real estate. Among these was Project Cormorant, first proposed in 2003 by DARPA, the Pentagon’s advanced research projects agency. Officially designated the Multi-Purpose Unmanned Aerial Vehicle or MPUAV, the Cormorant was a 6-metre-long jet-powered drone designed to be launched from a submarine missile tube. Pushed out of the launch tube by compressed air, the Cormorant would rise to the surface before being launched into the air by a pair of rocket boosters. The wings would then unfold, the jet engine inlet and outlet would open, and the vehicle would fly off on its reconnaissance mission, covering a distance of up to 800 kilometres. Upon completing its mission, the Cormorant would return and parachute into the sea, whereupon the launching submarine would deploy a Remotely Operated Vehicle or ROV to attach a cable to the drone, allowing it to be winched back into its launch tube. While hardly the exotic convertible vehicle of Boris Ushakov and Donald Reid’s imaginations, the Cormorant nonetheless solved the decades-long problem of combining the speed and maneuverability of an aircraft with the stealth of a submarine. Unfortunately, in 2008 the MPUAV project fell victim to budget cuts and, like all its predecessors, the Cormorant was never built.

Yet Ushakov’s dream still lives on, and shortly after the cancellation of Project Cormorant, the Naval Surface Warfare Center in Carderock, Maryland, released yet another proposal for a true flying submarine. Six metres long with a 30-metre wingspan, the vehicle was designed to carry two crew and six Special Forces troops up to 1,200 kilometres by air or 20 kilometres underwater. While it is as yet unknown whether the Carderock flying submarine was ever built and tested, the fact that this absurd James Bond-esque vehicle continues to capture the imagination of Naval designers after nearly a century just goes to show that truth is sometimes stranger than fiction.


Expand for References

Handler, Eugene, The Flying Submarine, U.S. Naval Institute Proceedings, September 1964, http://www.waterufo.net/flyingsubs/NavyFlyingSubHtml1.htm

The Dream of Flying Submarines and Aircraft Carriers, National Interest, April 28, 2019, https://nationalinterest.org/blog/buzz/dream-flying-submarines-and-aircraft-carriers-54782

MPUAV, Lockheed Martin, February 17, 2012, https://www.youtube.com/watch?v=8mCTVvh-zPE

Ganjin Please Ushakov LPL, https://www.reddit.com/r/Warthunder/comments/jjo5j6/ganjin_please_ushakov_lpl/

The Flying Submarine Story, FliteTest, October 25, 2018, https://ift.tt/2Lu7jqv

Hand, Jill, Weird NJ: The Flying Submarine of Ocean Township, App.com, July 23, 2017, https://ift.tt/z3ust28




from Today I Found Out
by Gilles Messier - July 20, 2022 at 02:23PM

The Curious Case of the Extreme Sport Mensur

The modern sport of classical fencing has come a long way from its origins in the 15th-Century practice of duelling. Over the past 600 years swords became blunted, protective equipment increased, and rules steadily codified to produce a safe, formalized Olympic sport in which serious injuries are rare. But at the same time, another, more brutal form of fencing also survived, clinging to existence in university basements and clubhouses across Europe. It is a form of ritualized swordplay so steeped in the traditions of duelling and martial honour that not only are injuries common, they are actually encouraged. Welcome to the hardcore world of Mensur, or German Academic Fencing.

Mensur is a practice unique to Studentenverbindungen or student corporations, a kind of fraternity common to universities in German-speaking countries like Germany, Austria, and Switzerland and Baltic states like Latvia and Estonia. These societies, which date back to the Middle Ages, are built around the notion of lifelong brotherhood among their members and carry out a number of elaborate rituals to reinforce this bond, including the wearing of couleur, distinctive caps and ribbons bearing the corporation’s colours; the kneipe, or ceremonial gathering; and Mensur.

Deriving its name from the Latin word for “dimension” – referring to the distance between participants – Mensur is a formal duel between two individuals fought using special sharpened, basket-hilted sabres called mensurschläger. Unlike in traditional fencing, fighters – or Paukanten – stand a fixed arm’s length apart and are forbidden from moving their feet or even dodging their opponents’ blows. There is also no scoring nor any designated winner or loser. This is because the aim of Mensur is not swordsmanship, but rather to demonstrate one’s courage and character by taking an opponent’s blows without fear or flinching. As Hermann Rink, former head of the Association of Old Corps Students explains:

“The object and purpose of [student corporations] was and still is solely the education of students to become a strong, free and cosmopolitan personality who is not held back by religious, racist, national, scientific or philosophical limitations of the mind. The need to overcome one’s own fear, dedicated to the union of his Corps, and the connected strengthening of the sense of community aids the personal growth just as does taking a hit without losing one’s stand and accepting the assessment of the Mensur by the own Corps Brothers.”

These notions of courage, honour, and the ability to endure hardship unflinchingly were considered central to the German character for hundreds of years, and were instrumental in preserving and maintaining the practice of Mensur up to the present day. The role of Mensur in defining a student’s character is perhaps best exemplified by the associated culture of duelling scars. Though Mensur bouts are carried out with sharpened swords, as the aim is not to kill or seriously wound one’s opponent the combatants wear elaborate protective gear including a long padded or chainmail shirt, a throat protector, a gauntlet on the sword hand, and metal goggles with a nose protector. During the bout only hits to the head are permitted, with the bout ending when first blood is drawn. The resulting scar, known as a “smite” or schmiss, has long been considered a badge of honour, with German Chancellor Otto von Bismarck once declaring that a man’s courage and bravery could be judged by the number of scars on his cheeks. In the 19th and early 20th centuries, duelling scars – and the corporation membership they indicated – were seen as a sign of a man’s ability to hold government office; indeed, in 1928, 20% of senior civil service positions in Prussia were held by former members of the Kösener student association. Scars were also thought to enhance a man’s eligibility as a potential husband, leading many who were unable to attend university or join student corporations to cut their faces with razors to achieve the same effect. Those with actual duelling wounds also often picked at their scabs to deepen and enhance the resulting scars. As most student corporation members were also members of the aristocracy and many aristocrats became military officers, the practice of Mensur eventually led to the cliché of the scarred German officer. Indeed, many prominent military leaders up to the Nazi era – including SA leader Ernst Röhm and SS Commando leader Otto Skorzeny – bore prominent duelling scars on their faces.

Despite its longevity, Mensur nearly disappeared at numerous times throughout its history, only to reappear and adapt itself to the changing times. While modern Mensur emerged around the mid-19th Century, the tradition of student duelling stretches back to the 15th Century, when the court sword or Kostümdegen was an essential part of everyday aristocratic dress. While the common people were forbidden from carrying swords, in many German principalities an exception was made for university students so they could defend themselves if attacked while travelling to and from school. This, unsurprisingly, led to an explosion of duelling, with students fighting each other over the slightest perceived insult to their honour. These affairs were very often deadly, as the preferred duelling weapon, a thrusting rapier known as a Pariser or “Parisian” sword, could easily inflict lethal puncture wounds. In the 17th century an attempt was made to curb these so-called strassenrencontre or “street fights” through the introduction of the kartellträger, or regulated duel. Instead of fighting each other on the spot, the aggrieved parties would agree to meet at a prearranged time and place, the duel being overseen by a referee, the duellists’ “seconds”, and a doctor to tend to any injuries. These duels were fought not to the death but typically to the drawing of first blood, the aim being to secure satisfaction for the insulted party. But due to the thrusting nature of the combat many duellists still died, and it was not until the 1760s that the University of Göttingen introduced a new type of slashing sword known as the Göttinger Hieber. This was soon adopted by many German-speaking universities, causing a dramatic drop in the death rate.

In 1763, following the Seven Years’ War, Frederick the Great, King of Prussia, outlawed street duels and the public wearing of swords by civilians, resulting in legalized duelling becoming the exclusive preserve of military officers and university students. At first duels were reserved for resolving disputes and perceived insults, but as more and more students without actual grievances sought to prove their courage and skill on the duelling ground, a formal system of challenges was developed. This involved uttering a standard code-phrase – typically “dummer junge,” or “stupid boy” – which was not just an insult but rather an invitation to duel. But by the mid-19th Century this too was abolished in favour of the Bestimmungsmensur or “determining duel”, wherein rather than challenging each other, combatants were instead chosen by the vice-chair of their student corporation. These duels were no longer about settling disputes but rather proving one’s character, and fighting one became a prerequisite of entry into a corporation. This is considered the birth of modern Mensur.

The latter half of the 19th Century saw an explosion in the popularity of Mensur, especially in Prussia – a phenomenon attributed to two major factors: extended peace and government reforms. For much of Prussian history the officer corps was the exclusive preserve of the aristocracy, but in 1859 sweeping military reforms led to the creation of a new reserve army whose leadership was open to members of the middle class. Aristocratic officers, seeing this as a challenge to their elite status, took to fighting duels in order to demonstrate their superior breeding and character. Between the end of the Franco-Prussian War in 1871 and the outbreak of the First World War in 1914, the newly-unified Germany fought no major conflicts, and an officer corps thirsting for combat turned once again to Mensur to preserve their martial spirit. During this period German chancellor Otto von Bismarck also implemented sweeping government reforms including lowering taxes and introducing social security, one result of which was to allow many middle-class students to attend university for the first time. Like the officer corps, the largely aristocratic student corporations saw these new arrivals as a threat to their elite status, and Mensur as one means of preserving it. By the late 19th Century Mensur was so popular in Germany that Mark Twain devoted several chapters of his 1880 travelogue A Tramp Abroad to it.

Yet despite its popularity, Mensur was about to face one of the first major challenges to its existence. While by this time Mensur had become one of the safest sports in Europe – placing behind even cycling in terms of annual deaths – accidents still happened, and in 1877 the death of a student fencer at the University of Göttingen led the German government to ban Mensur outright. However, the practice still continued, with secret duels taking place in the basements of student corporation clubhouses across Germany. In 1890, Kaiser Wilhelm II carried out an investigation into the prevalence of Mensur in Germany, at the conclusion of which he declared: “[Duelling provides] the best education which a young man can get for his future life.”

This royal endorsement effectively overturned all previous prohibitions on Mensur, and the practice enjoyed yet another surge in popularity. It would remain popular until the mid-1930s, when it was officially banned by the Nazi regime. Hitler’s quarrel lay not with Mensur itself but rather with the student corporations who practiced it, whom he saw as representing the old aristocratic class he despised. Hitler was also suspicious of the strong bond of brotherhood between corporation members, which he believed undermined loyalty to the state. But just like in the 1880s, Mensur did not disappear; it simply went underground. Student corporations, forced by the Nazis to suspend their activities, instead reorganized as “comradeships” like the Hermann Löns group in Freiburg, which continued to host Mensur duels in secret. Indeed, over 100 recorded duels were fought in Freiburg alone throughout the Second World War.

After the war, the occupying Allied forces banned all military sporting organizations in Germany. While Mensur had by then been practiced only by students for decades, its martial nature and heritage were deemed sufficient for it to be included in the ban. This prohibition lasted until 1953, and while Mensur quickly reestablished itself at many German universities, it would never again come close to the popularity it enjoyed in the late 19th Century. Today Mensur is practiced by around 400 student corporations across Germany, Austria, Switzerland, Belgium, Poland, and the Baltic States, though in some countries like Switzerland the practice is frowned upon as excessively violent, with many corporations turning to other sports like extreme hiking as an alternate means of building character and brotherhood. But many students still swear by this medieval trial by combat and the brutal but vital lessons it teaches. As Albrecht Fehlig, spokesman for the Student Corps Associations, explains: “Some people regard honour as an old-fashioned term. We see it in close relationship with human dignity. Corps students are obliged to respect the dignity of other persons and not to tolerate a violation of their own dignity. This has a significant effect on our social life and contributes to the unique atmosphere of a corps house.”


Expand for References

History of European Martial Arts Part X – Academic Fencing – Mensur, Academy of Historical Martial Arts, December 14, 2016, https://ift.tt/ylhkmsb

Scull, J.C, Dueling Scars: the Badge of Honour of Many Nazi Officers, Medium, July 17, 2020, https://medium.com/history-of-yesterday/dueling-scars-the-badge-of-honor-of-many-nazi-officers-8bfc72f1dfa

Morin, Roc, Fighting for Facial Scars in Germany’s Secret Fencing Frats, Vice, February 17, 2015, https://www.vice.com/en/article/av4bp4/frauleins-dig-them-0000573-v22n2

Jackson, Patrick, My Germany: Student Fencer, BBC News Berlin, September 11, 2013, https://www.bbc.com/news/world-europe-23975881

Young, Patrick, Die Waffen Hoch! The Resiliency of Academic Fencing in Germany, University of Florida College of Liberal Arts and Sciences, April 20, 2011, https://ift.tt/t17KGfo




from Today I Found Out
by Gilles Messier - July 20, 2022 at 02:19PM

Tuesday, July 19, 2022

Review: Augustino's Mini Snack Bites Tomato and Oregano



These wheat-based snacks looked like little bread bits, with rectangular but somewhat curved shapes measuring about 3/4 inch by an inch and a half. ...

from Taquitos.net Snack Reviews
July 19, 2022 at 01:32PM

Sunday, July 3, 2022

Is Ring Around the Rosie Really About the Plague?

Most of us are familiar with this classic nursery rhyme, while many of us are also aware of its surprisingly dark origins. While on its face “Ring Around the Rosie” may appear to be just a silly song for children, it is, in fact, a chilling description of the Black Death, the outbreak of Bubonic Plague that wiped out nearly a third of Europe’s population between 1346 and 1353. According to this interpretation, “Ring Around the Rosie” describes the rosy red rash that appeared on plague victims, and “Pocket Full of Posies” the bunches of flowers or fragrant herbs carried by medieval people to ward off the disease. Meanwhile, “Ashes! Ashes!” – sometimes sung as “A-tishoo! A-tishoo!” – refers to either the cremation of dead bodies or the sneezing of the victims, while “We All Fall Down” refers to, well, death.

Only, no, it doesn’t. While countless books, articles, and videos state this dark hidden meaning of “Ring Around the Rosie” as fact, in reality the plague interpretation is only a couple of decades old and has been largely dismissed by folklorists. And while the notion of children innocently dancing and singing about the Black Death might appeal to our macabre imaginations, all evidence points to a far more mundane origin for the song.

The first strike against the plague interpretation of “Ring Around the Rosie” is the age of the song itself.

While most sources trace the song’s origin to the Black Death of 1346, others cite the more recent Great Plague of 1665. However, the song in its modern form did not appear in print until 1881 – some 200 or 500 years after its supposed origin – while the earliest recorded versions go back no earlier than the 1790s. If the plague theory is correct, this would mean that the song survived orally for nearly half a millennium before anyone bothered to write it down – a suspiciously unlikely occurrence.

Another strike against the theory is that the song’s lyrics have varied wildly from place to place and decade to decade. If the lyrics were originally about the plague, then we would expect the earliest versions to contain the same references as the modern-day ones. But this is not the case. The earliest recorded version of the song, printed in Germany in 1796, translates as:

“A ring, a ring, a round dance,

We are the children three,

we sit under the elderbush,

and all go hush, hush, hush!”

 While one of the earliest English-language versions, printed in an 1846 article in the Brooklyn Eagle newspaper, goes:

Ring a ring a Rosie,

A bottle full of posie,

All the girls in our town

Ring for little Josie.

Other versions include this one, recorded by American author Ann S. Stephens in her 1855 novel The Old Homestead:

A ring – a ring of roses,

 Laps full of posies;

 Awake – awake!

 Now come and make

 A ring – a ring of roses.

And this one, published by William Wells Newell in 1883:

Round the ring of roses,

Pots full of posies,

The one stoops the last

Shall tell whom she loves the best.

Even this 1898 variation, collected by Alice Gomme in the Dictionary of British Folklore, differs substantially from the “modern” 1881 version recorded in illustrator Kate Greenaway’s Mother Goose; or, the Old Nursery Rhymes:

“Ring, a ring o’ roses,

A pocket full o’ posies,

Up-stairs and down-stairs,

In my lady’s chamber —

Husher! Husher! Cuckoo!”

In all these versions, the only common elements are the title “Ring of Roses” and the rhyming reference to posies. The following lines differ substantially from the modern version, with “Ashes! Ashes!” being either completely absent or variously replaced with nonsense syllables, imitations of sneezing, or alternate constructions like “Red Bird, Blue Bird” or “Green Grass, Yellow Grass”. Variations continued to appear well into the 20th Century, with this version being recorded among African-American schoolgirls in Wiergate, Texas in 1939:

“Ring around a Rosey

Pocketful o’ posies

Light bread, sweet bread, squat!

Guess who she told me, tralalalala

Mr. Red was her lover, tralalalala

If you love him, hug him!

If you hate him, stomp!”

Given how dramatically these lyrics evolved over only 100 years, it is unlikely they would have survived unchanged from the time of the Great Plagues to the present day. And even if we accept that the modern lyrics do indeed refer to the plague, they do a remarkably bad job of describing the actual pandemic. First, while people in the Middle Ages did indeed carry around flowers, spices, and other fragrant substances to ward off the “bad air” thought to spread the disease, the fact that earlier versions of the song refer to “pots” or “laps” full of posies makes this connection tenuous at best. Second, while the theory claims that the titular “ring of roses” refers to a ring-shaped rash characteristic of the plague, this is in fact a very rare symptom – as is the sneezing supposedly depicted in the line “A-tishoo! A-tishoo!” In fact, coughing and sneezing are more common to the pneumonic form of the disease – in which the Yersinia pestis bacterium infects the lungs – than the flea-borne bubonic variety. Third, contrary to the “Ashes! Ashes!” line, plague victims weren’t ever cremated – they were always buried. Indeed, the Catholic Church considered cremation sacrilegious, as it denied the Christian doctrine of the resurrection of the dead at the Last Judgement. In fact, the first modern cremation in Britain did not take place until 1884, while the Catholic Church did not lift its ban on cremation until 1963. And finally, while the final line, “We all fall down,” appears to be a straightforward reference to dying, many versions of the song, like this one from 1883, don’t reference falling at all but instead a “curchey” or “curtsey” to be performed by the singers:

“A ring, a ring o’roses

A pocket full of posies

One for Jack and one for Jim and one for little Moses

A curchey in and a curchey out

And a curchey all together”

These many inconsistencies make it unlikely that “Ring Around the Rosie” was ever about the plague. Indeed, despite its popularity, the plague theory is a remarkably recent one. The first work to make the discordant connection between the innocent nursery rhyme and disaster was a 1949 article in the newspaper The Observer, which included a nuclear-themed parody version:

“Ring-a-ring-o’-geranium,

A pocket full of uranium,

Hiro, shima

All fall down!”

In 1951, folklorists and nursery rhyme experts Iona and Peter Opie acknowledged the widespread belief that the song was about the plague, but dismissed the theory as dubious, later complaining that:  “We ourselves have had to listen so often to this interpretation we are reluctant to go out of the house.”

However, the first known work to state the theory as fact is James Leasor’s book The Plague and the Fire, published a full decade later in 1961. From there, the theory spread like, well, the plague, until it became accepted as fact. But as critics of the theory point out, if the hidden meaning of the song was known for over 500 years, why did it take until the mid-20th Century for anyone to write it down? Most folklorists have thus dismissed the plague interpretation as a folk etymology, and have concluded that the song likely originated in 18th Century Germany and spread to England, the United States, and elsewhere via German immigrants.

So if not the plague, then what is the song actually about? According to folklorist Philip Hiscock, “Ring Around the Rosie” is part of a long tradition of children’s play songs intended to get around Protestant religious bans on dancing:

“Adolescents found a way around the dancing ban with what was called in the United States the “play-party.” Play-parties consisted of ring games which differed from square dances only in their name and their lack of musical accompaniment. They were hugely popular, and younger children got into the act, too. Some modern nursery games, particularly those which involve rings of children, derive from these play-party games. “Little Sally Saucer” (or “Sally Waters”) is one of them, and “Ring Around the Rosie” seems to be another. The rings referred to in the rhymes are literally the rings formed by the playing children. “Ashes, ashes” probably comes from something like “Husha, husha” (another common variant) which refers to stopping the ring and falling silent. And the falling down refers to the jumble of bodies in that ring when they let go of each other and throw themselves into the circle.”

 As with the lyrics of the song, the rules of these “play parties” varied widely. In nearly every variation, the last person to fall down, bow, or curtsey had to pay some kind of penalty. In some versions this involved hugging or kissing another member of the group, while in others the loser moved to the centre of the circle to become the “Rosie” around which the others danced. The name “Rosie” is believed to derive from the French rosier [“rose-yay”] or “rosebush,” while the game itself likely derives from ancient pagan rituals that involved dancing in circles around sacred trees or bonfires – and for more on this, please check out our video That Time British Witches Tried to Stop a Nazi Invasion Using Magic.

 But what do the other lyrics of the song mean? In a word, nothing. While some classic nursery rhymes like “Old King Cole” are known to be based on historical figures and events, others are simply fun rhyming nonsense for children to dance and play to. As far as folklorists and etymologists have been able to determine, “Ring Around the Rosie” is one of the latter, and likely has no literal meaning at all. But we humans abhor ambiguity and randomness, leading us to seek out logic and meaning where none exists. This rationalizing instinct has resulted in countless dubious theories seeking to explain the origins of nursery rhymes, including this one:

“Mary, Mary, quite contrary,

How does your garden grow?

With silver bells, and cockle shells,

And pretty maids all in a row.”

One interpretation holds that the “Mary, quite contrary” of the rhyme is none other than Queen Mary I, who attempted to convert England back to Catholicism between 1553 and 1558. According to this theory, “How does your garden grow?” and “Silver bells and cockle shells” refer to Mary’s inability to bear a child, while “Pretty maids in a row” is a derogatory reference to Catholic nuns. However, there is no evidence whatsoever for this particular interpretation, and the actual origin of the song remains unknown.

Not even contemporary works are immune from such over-analysis. Gallons of ink have been spilled explaining how L. Frank Baum’s The Wonderful Wizard of Oz is in fact a cleverly-disguised allegory for populism and even U.S. monetary policy, while a common urban legend holds that Phil Collins’s 1981 hit song In the Air Tonight is actually about an incident Collins witnessed in which a man failed to save a drowning victim. Indeed, the Beatles became so frustrated with people trying to find hidden meanings in their lyrics that John Lennon intentionally wrote 1967’s I Am the Walrus to be as nonsensical and meaningless as possible, allegedly remarking: “Let the f***ers work that one out!”

Intriguingly, the plague interpretation of “Ring Around the Rosie” has taken on a life of its own, becoming what folklorists call meta-folklore – folklore about folklore. Despite being thoroughly debunked, the theory has nonetheless persisted, satisfying as it does both our desire for rational explanations and our love of the ironic and the macabre. And like regular folklore, it has evolved over time, growing and changing with each retelling, and has branched off into numerous distinct variations. As mentioned before, some versions of the theory trace the song’s origins to the Black Death of 1346, others to the Great Plague of 1665. Similarly, some versions say the song originated in London, while others say it came from Eyam, a village in the English Midlands hard-hit by the 1665 Plague. Some versions even claim that the children of Eyam sang the song while dancing around the bodies of the victims! But the one thing all these theories have in common is that they are all patently false. After all, as Sigmund Freud might have put it, sometimes a nursery rhyme is just a nursery rhyme.

If you liked this article, you might also enjoy our new popular podcast, The BrainFood Show (iTunes, Spotify, Google Play Music, Feed), as well as:

Expand for References

 Mikkelson, David, Is ‘Ring Around the Rosie’ About the Black Plague? Snopes, November 17, 2000, https://ift.tt/p1QoZlz

 

Winick, Stephen, Ring Around the Rosie: Metafolklore, Rhyme and Reason, Library of Congress, July 24, 2014, https://blogs.loc.gov/folklife/2014/07/ring-around-the-rosie-metafolklore-rhyme-and-reason/

 

Ring Around the Rosie, Professor Buzzkill, https://ift.tt/o6J0amk

 

McDaniel, Spencer, “Ring Around the Rosie” is Not About the Black Death, Nor Has it Ever Been, Tales of Times Forgotten, May 3, 2017, https://ift.tt/MQuG9rH

The post Is Ring Around the Rosie Really About the Plague? appeared first on Today I Found Out.



from Today I Found Out
by Alexis DeStout - July 03, 2022 at 09:54PM
Article provided by the producers of one of our Favorite YouTube Channels!
-

The Final Frontier in a Little Ball

When we think of ‘the final frontier’ and the last unexplored places, we tend to think of outer space. Yet despite covering 70% of the planet’s surface, the world’s oceans remain more than 80% unexplored. In fact, we know the surface of the Moon in more detail than the bottom of the ocean, as for most of history we lacked the technology to penetrate its depths. But this began to change on June 6, 1930, when two men climbed into a tiny metal sphere less than five feet across and plunged into the waters off Bermuda, descending to a depth of 800 feet – three times deeper than any human had ever gone before. This crude vehicle, known as the Bathysphere, helped launch a new era of ocean exploration and gave humanity its first tantalizing glimpses of Earth’s largest and most mysterious habitat.

The two men behind the Bathysphere could not have been more different. At age 52, William Beebe was one of the most famous naturalists in America. The founding ornithologist at the New York Zoological Society, Beebe was a leading expert on tropical birds, a prolific writer and popularizer of science, and an early advocate of ecology and conservation. He was also a figure from another era, a highly disciplined and moralistic gentleman explorer and naturalist in the Victorian tradition.

By contrast, Otis Barton was nearly twenty years Beebe’s junior, the independently wealthy son of New Hampshire textile mill owners. An often difficult man prone to mood swings, Barton had a restless imagination and dreamed of becoming a globetrotting explorer like William Beebe. As a young man he had built his own diving helmet to explore the waters off New York and spent a year on safari in Africa before finally enrolling in Columbia University’s department of engineering.

Yet in one key respect Beebe and Barton were very alike: they were both outsiders. Despite his impressive accomplishments, Beebe had no formal scientific education, having abandoned a degree at Columbia University in 1897 to work for the Zoological Society. This lack of credentials meant that Beebe often struggled to be taken seriously by the mainstream scientific establishment.

In 1928 Beebe established a research station on Nonsuch Island in Bermuda for studying the ocean environment. However, he soon found his scientific ambitions frustrated by the available technology of the time. The standard technique of dredging – dragging a scoop along the seafloor – didn’t allow sea creatures to be observed in their natural habitat, while contemporary diving suits and submarines were unwieldy and unsuited to scientific research. Beebe realized he would need to build his own custom research submersible, and in an article in the New York Times sketched out a concept for a cylindrical diving chamber fitted with lights and portholes, which could be lowered into the ocean on a cable. But Beebe did not possess the technical know-how to construct such a craft, nor did the Zoological Society have the funds to pay for it.

It was then that Beebe’s scheme came to the attention of Otis Barton, who immediately realized that a cylindrical chamber would not be strong enough to withstand the immense pressures in the deep ocean. Drawing on his engineering knowledge, Barton instead designed a spherical diving chamber he dubbed the Bathysphere, from the Greek bathys, meaning “deep”. He then set about the difficult task of actually making contact with Beebe; since publishing his article in the Times, the naturalist had been bombarded with dozens of proposals from crackpot inventors seeking to share the limelight. But as luck would have it, the idea of a spherical diving vessel had already been proposed to Beebe a decade earlier by none other than former President and fellow conservationist Teddy Roosevelt. Beebe liked the simplicity of Barton’s design, and the two men quickly hammered out an agreement: Barton would pay for the Bathysphere’s design and construction in return for accompanying Beebe on his dives, while Beebe would hire the ship and all the equipment needed to carry out the expedition.

While to modern eyes Barton’s Bathysphere appears crude, at the time it was a marvel of engineering. The craft took the form of a hollow cast-steel sphere 4.5 feet in diameter with walls an inch thick – just large enough for two men to crouch inside. Nothing of this size and complexity had ever been cast in one piece before, so Barton turned to the Watson Stillman Hydraulic Machinery Company in Roselle, New Jersey, who specialized in casting cathedral bells. The finished casting weighed an astonishing five tons and could withstand the pressure found at depths of up to a mile. But Barton soon discovered there wasn’t a crane or winch in Bermuda that could lift such a weight, so this first Bathysphere was melted down and recast at a more manageable 2.5 tons, rated to a maximum depth of half a mile.
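To put that rating in rough perspective (a back-of-the-envelope hydrostatic estimate, not a figure from Barton’s engineering notes): seawater pressure increases by roughly one atmosphere for every 10 metres of depth, so at half a mile (about 800 metres) down,

$$P = \rho g h \approx 1025\,\mathrm{kg/m^3} \times 9.81\,\mathrm{m/s^2} \times 800\,\mathrm{m} \approx 8\,\mathrm{MPa} \approx 80\ \text{atmospheres},$$

or roughly 1,200 pounds pressing on every square inch of the hull – which is precisely why Barton insisted on a sphere, the one shape that spreads such a crushing load evenly over its entire surface.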

To allow the divers to see outside, the Bathysphere was fitted with three portholes made of fused quartz – a high-strength glass recently developed by General Electric. Oxygen was supplied from compressed cylinders, while trays of calcium hydroxide absorbed carbon dioxide from the divers’ breath. For illumination a powerful searchlight was mounted behind one of the portholes, while a telephone system allowed communication with the surface. The whole device was lowered into the ocean on special high-strength anti-snag steel cable manufactured by Roebling & Sons, the famous builders of the Brooklyn Bridge.
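For the chemically inclined, the scrubbing done by those calcium hydroxide trays is just the standard carbon-dioxide-capture reaction (given here as an illustration; the expedition’s own accounts don’t spell it out):

$$\mathrm{Ca(OH)_2 + CO_2 \rightarrow CaCO_3 + H_2O}$$

By simple stoichiometry, each kilogram of calcium hydroxide can in principle bind a little over half a kilogram of carbon dioxide before it is exhausted.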

In 1930, Barton and the finished Bathysphere arrived on Nonsuch Island, where Beebe was ready with the expedition ships – an old Royal Navy barge called the Ready fitted with a crane and towed by the tug Gladisfen. Beebe and Barton made their first dive on May 27, 1930, descending to a shallow depth of 45 feet. After several more manned and unmanned tests, they were finally ready to make their first deep dive on June 6. The pair later admitted to being extremely apprehensive, even arranging a system whereby they would speak into the telephone every five seconds; if they failed to speak, they were to be immediately winched up. The dive turned out to be an eventful one, an electrical fire in the searchlight and a leaking porthole forcing Beebe to halt the descent. Nonetheless, they had reached a depth of 800 feet – tripling the previous diving record.

After making minor adjustments to the Bathysphere, Beebe and Barton made dozens of dives throughout the 1930 season, pushing the craft ever deeper and discovering dozens of bizarre new species. The pair also performed physics experiments, such as measuring how quickly sunlight is absorbed at depth, as well as shallow “contour dives” in which the Bathysphere was suspended beneath the moving ship to map the ocean floor.

While the work was productive, it could also be uncomfortable and sometimes dangerous. Beebe and Barton were both over six feet tall and struggled to cram themselves into the tiny bathysphere, often emerging bruised and bloody from being knocked about inside. At depth the sphere would become uncomfortably cold, and in all but the calmest seas it tended to bounce wildly on the end of its cable, rendering its occupants violently seasick. Worse still, if the Ready sank or the cable snapped, there was no chance of rescue – the divers would plunge helplessly into the abyss. And the consequences of even a minor leak were made graphically clear during one unmanned test dive, when the Bathysphere emerged from the ocean full of water. Realizing this water was under immense pressure, Beebe ordered the crew to stand clear while he started to loosen the hatch bolts. He described what happened next in his bestselling book Half Mile Down:

“Suddenly, without the slightest warning, the bolt was torn from our hands, and the mass of heavy metal shot across the deck like the shell from a gun. The trajectory was almost straight, and the brass bolt hurtled into the steel winch thirty feet away across the deck and sheared a half-inch [13 mm] notch gouged out by the harder metal. This was followed by a solid cylinder of water, which slackened after a while into a cataract, pouring out the hole in the door, some air mingled with the water, looking like hot steam, instead of compressed air shooting through ice-cold water.”

Later, Beebe would somewhat grimly remark that if the bathysphere were to spring a leak at depth, the divers would not have time to drown, for under the immense pressure:

“…the first few drops of water would have shot through flesh and bone like steel bullets.”

Despite this promising beginning, no dives were conducted during the 1931 season due to cracks in the Ready’s winch and bad weather in Bermuda. The ongoing Great Depression had also made expedition funds harder to come by, so Beebe and Barton went in search of a new sponsor. Eventually they secured a deal with David Sarnoff of NBC to broadcast one of their dives live over the radio. To further hype up the event, Beebe and Barton promised to descend to a depth of 2,600 feet – a half-mile down. The dive took place on September 22, 1932, but unfortunately the broadcast didn’t go quite as planned. A rough sea caused Barton to become seasick and vomit, and the moment the broadcast ended Beebe aborted the dive and ordered the Bathysphere winched up from 2,200 feet – just 400 feet short of their half-mile goal. But the broadcast had done its job: Beebe and Barton were now international celebrities, their dives seen as every bit as death-defying and heroic as the moon landings forty years later.

The 1933 season saw no dives due to lack of funds, with Beebe once more embarking on the fundraising circuit. But money wasn’t the only problem plaguing the project; by now Beebe and Barton’s relationship had become incredibly strained. Their partnership had always been one of convenience, Beebe having the clout and connections to organize the expeditions and Barton the technical know-how to make the Bathysphere work. But despite Beebe’s attempts to give Barton his fair share of credit, the media roundly ignored him in favour of the already more famous Beebe. For his part, Beebe couldn’t stand Barton’s mercurial temperament, indiscipline, and disregard for the natural environment. But there was to be one last moment of triumph. Beebe struck a deal with the National Geographic Society to fund another season of dives in exchange for articles written by Beebe for the Society’s magazine, and on August 15, 1934, Beebe and Barton climbed once more into the Bathysphere and descended to an astonishing depth of 3,028 feet – more than half a mile down.

The 1934 diving season would be the Bathysphere’s last. Not only had the Depression made the dives too expensive to sustain, but Beebe had seen everything he wanted to see. He and Barton, now on worse terms than ever, parted ways and reportedly never spoke again. William Beebe never returned to the ocean, moving instead to South America and then Trinidad to study insects and rain forest ecology. He would continue to publish papers and books until his death from pneumonia in 1962. Otis Barton, on the other hand, would spend the rest of his life trying to escape Beebe’s shadow. In 1938 he produced and starred in Titans of the Deep, a half-documentary, half-horror exploitation film based on the Bathysphere dives. Not only was the film a notorious flop, but its promotion made it seem as though Beebe was somehow involved in the production, overshadowing Barton in his own film. But Barton did manage to win some of the glory he so desperately sought when in 1949 he rode an improved version of the Bathysphere, called the Benthoscope, to a depth of 4,500 feet – a record for cable-lowered submersibles that still stands to this day.

Beebe was widely criticized by the scientific establishment for his published work on the Bathysphere dives, particularly for describing four new species of fish based only on fleeting observations through the bathysphere portholes rather than physical specimens. But the true legacy of the Bathysphere was in how it inspired future generations of deep-ocean explorers and launched a new era of scientific research. And Beebe’s tireless promotion of science to the general public would set the template for the science popularizers of the television age, including Jacques Cousteau and David Attenborough.

As for the Bathysphere itself, after its final dive Beebe donated it to the New York Zoological Society, who displayed it in their exhibit at the 1939 New York World’s Fair. It was then placed in the society’s storage yard beneath Coney Island’s Cyclone roller coaster, where it sat, forgotten and rusting, for almost 60 years. Then, in 2005, the Bathysphere was restored to its original condition and placed on permanent display outside the New York Aquarium where it can still be seen today – a fascinating reminder of a heroic age of ocean exploration.

*Note: the name Beebe is pronounced “Bee-Bee”

If you liked this article, you might also enjoy our new popular podcast, The BrainFood Show (iTunes, Spotify, Google Play Music, Feed), as well as:

Expand for References

Matsen, Brad, Descent: The Heroic Discovery of the Abyss, Random House, 2005

Beebe, William, Half Mile Down, New York Zoological Society, 1934, https://archive.org/details/halfmiledown00beeb

Bathysphere and Beyond, Wildlife Conservation Society, 2014, https://www.youtube.com/watch?v=VtNHKfm6Wc4

3,000 Feet Under the Sea, British Pathé, 1934, https://www.youtube.com/watch?v=TvLiG8P7uq0

The post The Final Frontier in a Little Ball appeared first on Today I Found Out.



from Today I Found Out
by Alexis DeStout - July 03, 2022 at 09:49PM
Article provided by the producers of one of our Favorite YouTube Channels!
-