You are browsing the archive for History.

by admin

Somali pirates hijack Maersk Alabama ship

April 1, 2020 in History

By History.com Editors

No ship sailing under the American flag had been captured by pirates since the 1820s until April 8, 2009, when the MV Maersk Alabama was hijacked off the coast of Somalia. The high-profile incident drew worldwide attention to the problem of piracy, commonly believed to be a thing of the past, in the waters off the Horn of Africa.

Decades of instability in Somalia and the accompanying lack of policing in its territorial waters led to a resurgence of piracy in the region that peaked in the late 2000s. Just a day before the attack, the Maersk Alabama had received a warning from the United States government to stay at least 600 miles off the coast of Somalia, but Captain Richard Phillips kept the ship about 240 miles from the coast, a decision that was later criticized by members of his crew. On April 8, the crew saw a skiff carrying four armed pirates approaching the ship and initiated the protocol for such an event. Chief Engineer Mike Perry got most of the crew to a safe room and managed to swamp the pirates’ craft by swinging the ship’s rudder, but the pirates were nonetheless able to board and take Phillips hostage. After one of their number was injured fighting with the ship’s crew, the other three pirates fled in a lifeboat, taking Phillips with them in the hopes of using him as a bargaining chip.

Early the next morning, the destroyer USS Bainbridge and another U.S. Navy vessel arrived on the scene. What followed was a three-day standoff, with the pirates holding Phillips in the lifeboat. Attempts to negotiate failed, and at one point the pirates fired (harmlessly) at the destroyer. Finally, on April 12, with authorization from recently inaugurated President Barack Obama, Navy SEAL snipers opened fire on the lifeboat. In a stunning display of accuracy, the SEALs, firing from the destroyer’s deck through the windows of the tiny boat, hit all three pirates in the head, killing them while leaving Phillips unharmed.

The surviving pirate, Abduwali Muse, was taken into custody and later sentenced to over 33 years in U.S. prison—though he was tried as an adult, he and the other hijackers were reportedly all teenagers at the time of the attack. The incident received international attention, bringing the problem of modern-day piracy to many people’s attention for the first time. Phillips’ story was made into a movie starring Tom …read more

Source: HISTORY

by admin

U.S. media release graphic photos of American soldiers abusing Iraqi prisoners at Abu Ghraib

April 1, 2020 in History

By History.com Editors

On April 30, 2004, the CBS program 60 Minutes reports on abuse of prisoners by American military forces at Abu Ghraib, a prison in Iraq. The report, which featured graphic photographs showing U.S. military personnel torturing and abusing prisoners, shocked the American public and greatly tarnished the Bush Administration and its war in Iraq.

Amnesty International had surfaced many of the allegations in June of 2003, not long after the United States invaded Iraq and took over Abu Ghraib, which soon became the largest American prison in Iraq. As the 60 Minutes report and subsequent investigations proved, torture quickly became commonplace at Abu Ghraib. Photographs depicted American soldiers sexually assaulting detainees, threatening them with dogs, putting them on leashes and engaging in a number of other practices that clearly constituted torture and/or violations of the Geneva Convention.

In at least one instance, the Army tortured a prisoner to death. President George W. Bush assured the public that the instances of torture were isolated, but as the scandal unfolded it became clear that, in the words of an International Committee of the Red Cross official, there was a “pattern and a broad system” of abuse throughout the Department of Defense. Torture techniques, which the CIA and military often referred to as “enhanced interrogation,” had in fact been developed at sites like the Guantanamo Bay detention center and were routinely employed in Iraq, at Guantanamo, and at other “black sites” around the world.

In June of 2004, it was revealed that the Bush Administration—specifically Deputy Assistant Attorney General John Yoo—had not only been aware of widespread torture but had secretly developed a legal defense attempting to exempt the United States from the Geneva Convention. A 2006 court decision subsequently ruled that the Geneva Convention did apply to all aspects of the “War on Terror.”

Eleven soldiers were eventually convicted by military courts of crimes committed at Abu Ghraib, while Brigadier General Janis Karpinski, who had been in charge there, was merely demoted. Bush and Defense Secretary Donald Rumsfeld apologized for the abuses, but Bush did not accept Rumsfeld’s offer to resign. Yoo went on to teach at Berkeley Law and is a Visiting Fellow at the American Enterprise Institute. In the years after the revelations, legal scholars have repeatedly suggested that Bush, Rumsfeld and soldiers who carried out the abuses at Abu Ghraib could be prosecuted for war …read more

Source: HISTORY

by admin

Big Ben stops at 12:11 pm for 54 minutes

April 1, 2020 in History

By History.com Editors

On April 30, 1997, at exactly 12:11 pm, London’s iconic Big Ben clock stops ticking. For 54 minutes, the most famous clock in the world failed to keep time.

Completed in 1859, Big Ben has a long history of technical issues. The first bell cast for the tower cracked before it could be installed, and the second bell also developed a crack shortly after installation, resulting in silence from the tower until 1862. The bells stopped ringing again during World War I, and the tower was not illuminated at night for the duration of World War II, when most of London was kept dark to make German bombing raids more difficult. Despite the heavy damage that the Blitz inflicted on London, however, the clock stayed within a second and a half of GMT for the duration of the war.

Since then, both extreme heat and the buildup of snow have caused Big Ben to stop ticking. In 1962, snow delayed the bells, causing the capital of Britain to ring in the new year ten minutes later than the rest of the country. The April 1997 stoppage occurred the day before that year’s general election, but the malfunction was probably not a factor in the voting, which Tony Blair’s “New Labour” won in a landslide over incumbent Prime Minister John Major. Big Ben stopped again in May of 2005, on one of the hottest May days ever recorded in London.

READ MORE: How Did Big Ben Get Its Name?

…read more

Source: HISTORY

by admin

Quarantined for Life: The Tragic History of US Leprosy Colonies

March 31, 2020 in History

By Natasha Frost

Stripped of their most basic human rights, patients nonetheless built lives and communities.

For millennia, a diagnosis of leprosy meant a life sentence of social isolation. People afflicted with the condition now known as Hansen’s disease—a bacterial infection that ravages the skin and nerves and can cause painful deformities—were typically ripped from their families, showered with prejudice and cruelly exiled into life-long quarantine.

In the United States, patients were confined to a handful of remote settlements, where over time, a crude existence evolved into one with small touchstones of normalcy. But patients were consistently deprived of fundamental civil liberties: to work, to move freely and see loved ones, to vote, to raise families of their own. Some who bore children had their babies forcibly removed.

By the 1940s, after a cure emerged for the condition—and science made clear that most of the population had a natural immunity to it—other countries began to abolish compulsory isolation policies. But in the U.S., even as leprosy patients’ health and conditions improved, old stigmas, fear of contagion and outdated laws kept many confined for decades longer.

For some, that “home for life” translated more closely to a prison, however picturesque. “You were brought here to die,” said Sister Alicia Damien Lau, who first came to Molokai in 1965, in a 2016 interview. “You were not able to leave the island.”

While patients’ families could visit, they were housed in separate quarters, and allowed to communicate only through a chicken wire screen. “They catch you like a crook and you don’t have any rights at all,” Olivia Robello Breitha, a longtime patient, wrote in her 1988 autobiography. “They didn’t care about ruining a life… I was just a number.”

Known as Kalaupapa for the name of the peninsula, the settlement was one of a small handful of leper colonies in the United States, where patients were stripped of their rights and sent to live out their days. Among them were tiny Penikese Island in Buzzards Bay, off the coast of Massachusetts, and the Carville National Leprosarium in Louisiana. With almost 8,000 patients over about 150 years, Kalaupapa was by far the largest.

READ MORE: Why the Second Wave of the Spanish Flu Was So Deadly

The ‘separating sickness’

A federally operated institution for some 350 leprosy cases in Carville, Louisiana. Photographed in 1955.

Named for Gerhard Armauer …read more

Source: HISTORY

by admin

Hate Paying Income Tax? Blame William H. Taft

March 31, 2020 in History

By Patrick J. Kiger

Republican president William Taft successfully advocated for a permanent, national income tax.

Every year, millions of Americans have to amass their financial records and fill out forms—or pay professionals to do it for them—in order to file their federal tax returns. It’s an annual ritual that traditionally takes place in the spring, though in 2020, the Internal Revenue Service delayed the April 15 filing deadline by three months, due to the disruption caused by the COVID-19 virus outbreak.

Those who grumble over having to contemplate the numbered boxes on IRS Form 1040 have William Howard Taft to thank. The nation’s 27th President, who served just a single term from 1909 to 1913, is probably best known for being the heaviest president in U.S. history, as well as the first to ride in an official presidential limousine, and for his obsession with golf. But Taft also established the federal income tax as a permanent part of Americans’ lives.

READ MORE: Why We Pay Taxes

Abraham Lincoln First Imposed an Income Tax

Taft didn’t actually invent the idea of a federal income tax. That would be Abraham Lincoln, who in 1861 convinced Congress to pass the Revenue Act and impose a temporary 3 percent tax on incomes over $800, as an emergency measure to help finance the massive military expenditures required by the Civil War. That measure was allowed to expire in 1872.

Investors panicking in the New York Stock Exchange in 1893.

The idea of a federal income tax resurfaced after the Panic of 1893, an economic downturn so severe that it caused a quarter of the nation’s labor force to lose their jobs. As Jeffrey Rosen notes in his 2018 biography of Taft, populist Democrats argued that the tariffs and excise taxes the government depended upon for revenue put a disproportionate burden on struggling farmers and workers, and pushed instead for a tax that would capture more of affluent Americans’ income.

In 1894, they joined forces with progressive Republicans to pass legislation that created a 2 percent tax on incomes over $4,000, along with reduced tariffs. But that tax didn’t last long. In an 1895 case, Pollock v. Farmers’ Loan and Trust Company, the Supreme Court found that directly taxing Americans’ income was unconstitutional.

Even so, progressives’ desire to pass an income tax and cut back on taxing imports …read more

Source: HISTORY

by admin

What Language Did Jesus Speak?

March 30, 2020 in History

By Sarah Pruitt

While historians and scholars debate many aspects of Jesus’ life, most agree on what language he mainly spoke.

While scholars generally agree that Jesus mainly spoke Aramaic, he probably didn’t know more than a few words in Latin. He probably knew more Greek, but it was not a common language among the people he spoke to regularly, and he was likely not too proficient. He definitely did not speak Arabic, another Semitic language that did not arrive in Palestine until after the first century A.D.

So while Jesus’ most common spoken language was Aramaic, he was familiar with—if not fluent, or even proficient in—three or four different tongues. As with many multilingual people, which one he spoke probably depended on the context of his words, as well as the audience he was speaking to at the time.

READ MORE: The Bible Says Jesus Was Real. What Other Proof Exists?

…read more

Source: HISTORY

by admin

Before Vaccines, Doctors ‘Borrowed’ Antibodies from Recovered Patients to Save Lives

March 30, 2020 in History

By Dave Roos

Doctors first tried injecting patients with blood plasma in the early 1900s. The method has been used against diphtheria, the Spanish Flu, the measles and Ebola.

In 1934, a doctor at a private boys’ school in Pennsylvania tried a unique method to stave off a potentially deadly measles outbreak. Dr. J. Roswell Gallagher extracted blood serum from a student who had recently recovered from a serious measles infection and began injecting the plasma into 62 other boys who were at high risk of catching the disease.

Korean War Troops Were Saved by Plasma Treatments

A US Army chaplain prays while wounded soldiers get dressings and plasma at a medical station on the war front, Korea, August 10, 1950.

By the 1940s and 1950s, antibiotics and vaccines began to replace the use of convalescent plasma for treating many infectious disease outbreaks, but the old-fashioned method came in handy yet again during the Korean War when thousands of United Nations troops were stricken with something called Korean hemorrhagic fever, also known as Hantavirus. With no other treatment available, field doctors transfused convalescent plasma to sickened patients and saved untold numbers of lives.

Greene says that convalescent plasma was even deployed against 21st century outbreaks of MERS, SARS and Ebola, all novel viruses that spread through communities with no natural immunity, no vaccine and no effective antiviral treatment. Today, the best treatment for Ebola is still a pair of “monoclonal antibodies,” individual antibodies isolated from convalescent plasma and then cloned artificially in a lab.

READ MORE: The Most Harrowing Battle of the Korean War

Fighting COVID-19 With Convalescent Plasma

Dr. Kong Yuefeng, a recovered COVID-19 patient who has passed his 14-day quarantine, donates plasma at a blood center in Wuhan, China, on February 18, 2020.

One of the best-known modern uses of convalescent plasma is for the production of antivenom to treat deadly snake bites. Antivenom is made by injecting small amounts of snake venom into horses and allowing the horse’s immune system to produce antibodies that neutralize the poison. Those equine antibodies are isolated, purified and distributed to hospitals as antivenom.

In March 2020, doctors at Johns Hopkins University began testing convalescent plasma as a promising stop-gap treatment for COVID-19 while the search continued for a permanent vaccine. The advantage of convalescent plasma is that it can be drawn from recovered patients using the same …read more

Source: HISTORY

by admin

World Wide Web (WWW) launches in the public domain

March 30, 2020 in History

By History.com Editors

On April 30, 1993, four years after publishing a proposal for “an idea of linked information systems,” computer scientist Tim Berners-Lee released the source code for the world’s first web browser and editor. The project, originally called Mesh, produced a browser he dubbed WorldWideWeb, which became the first royalty-free, easy-to-use means of browsing the emerging information network that developed into the internet as we know it today.

READ MORE: The Invention of the Internet

Berners-Lee was a fellow at CERN, the research organization headquartered in Switzerland. Other research institutions like the Massachusetts Institute of Technology and Stanford University had developed complex systems for internally sharing information, and Berners-Lee sought a means of connecting CERN’s system to others. He outlined a plan for such a network in 1989 and developed it over the following years. The computer he used, a NeXT desktop, became the world’s first web server. Berners-Lee wrote and published the first web page, a simplistic outline of the WorldWideWeb project, in 1991.

CERN began sharing access with other institutions, and soon opened it up to the general public. By releasing the project’s source code into the public domain two years later, Berners-Lee essentially opened the project to anyone in the world, making it free and (relatively) easy to explore the nascent internet.

Simple Web browsers like Mosaic appeared a short time later, and before long the Web had become by far the most popular system of its kind. Within a matter of years, Berners-Lee’s invention had revolutionized information-sharing and, in doing so, had dramatically altered the way that human beings communicated. The creation and globalization of the web is widely considered one of the most transformational events in human history. 4.39 billion people, including you, are now estimated to use the internet, accounting for over half the global population. The average American now spends 24 hours a week online. The internet’s rise has been the greatest expansion in information access in human history, has led to the exponential growth in the total amount of data in the world, and has facilitated a spread of knowledge, ideas and social movements that was unthinkable as recently as the 1990s.

READ MORE: The World’s First Web Site

…read more

Source: HISTORY

by admin

18-year-old Ryan White, national symbol of the AIDS crisis, dies

March 30, 2020 in History

By History.com Editors

On April 8, 1990, 18-year-old Ryan White dies of pneumonia brought on by AIDS, which he had contracted from a blood transfusion. He had been given six months to live in December of 1984 but defied expectations and lived for five more years, during which time his story helped educate the public and dispel widespread misconceptions about HIV/AIDS.

White suffered from hemophilia and thus required weekly blood transfusions. On December 17, 1984, just after his 13th birthday, he was diagnosed with AIDS, which he had contracted from one such transfusion. It was later revealed that roughly 90 percent of American hemophiliacs who had received similar treatments between 1979 and 1984 suffered the same fate. White was given six months to live, but recovered from the illness that had brought his disease to light and eventually felt healthy enough to return to school.

Though the scientific community knew that AIDS could only be transmitted through bodily fluids, the community around White’s Russiaville, Indiana school was paranoid that he would contaminate his classmates. White was denied entry to his school, and when the Indiana Department of Education ruled that he must be admitted the local school board unanimously voted to appeal the decision. From August of 1985 until the following June, White’s family and their opponents—who at one point held a fundraiser in the school gymnasium to support the cause of keeping him out—fought a legal battle that garnered national headlines. A diverse array of public figures appeared with White and spoke on his behalf, including Elton John, Michael Jackson, Alyssa Milano, Kareem Abdul-Jabbar and former President Ronald Reagan.

White was eventually allowed to return to school and spent his remaining years living a relatively normal life, although he made regular media appearances in an effort to educate the public about his illness. By the time of his death, just months before he was to graduate high school, White had become one of the leading figures in the movement to destigmatize HIV/AIDS. Several months later, the Ryan White CARE Act became federal law, providing a dramatic boost in funding for the treatment of low-income, underinsured and uninsured people with HIV/AIDS.

READ MORE: The History of AIDS

…read more

Source: HISTORY

by admin

When Polio Triggered Fear and Panic Among Parents in the 1950s

March 27, 2020 in History

By Volker Janssen

Since little was understood about the virus that left some paralyzed and others dead, fear filled the vacuum.

In the 1950s, the polio virus terrified American families. Parents tried “social distancing”—ineffectively and out of fear. Polio was not part of the life they had signed up for. In the otherwise comfortable post-World War II era, the spread of polio showed that middle-class families could not build worlds entirely in their control.

For the Texas town of San Angelo on the Concho River, halfway between Lubbock and San Antonio, the spring of 1949 brought disease, uncertainty and most of all, fear. A series of deaths and a surge of patients unable to breathe prompted the airlifting of medical equipment with C-47 military transporters.

Towns Practice Extreme Social Distancing

Children in San Angelo residential areas watch Texas Health employees spray DDT over vacant lots in the city to combat a recent increase in the number of polio cases. All theaters, swimming pools, churches, schools and public meeting places were closed.

Fearful of the spread of the contagious virus, the city closed pools, swimming holes, movie theaters, schools and churches, forcing priests to reach out to their congregations on local radio. Some motorists who had to stop for gas in San Angelo would not fill up their deflated tires, afraid they’d bring home air containing the infectious virus. And one of the town’s best physicians diagnosed his patients based on his “clinical impression” rather than taking the chance of getting infected during the administration of the proper diagnostic test, writes Gareth Williams in Paralyzed with Fear: The Story of Polio. The scene repeated itself across the nation, especially on the Eastern seaboard and in the Midwest.

The disease was poliomyelitis, a highly contagious infection with flu-like symptoms such as sore throat, fever, tiredness, headache, a stiff neck and stomach ache. For a few, though, polio affected the brain and spinal cord, which could lead to meningitis and, for about one out of 200, paralysis. For two to 10 percent of those suffering paralysis, the end result was death.

Transmitted primarily via feces but also through airborne droplets from person to person, polio took six to 20 days to incubate and remained contagious for up to two weeks after. The disease had first emerged in the United States in 1894, but the first large epidemic happened in 1916, when public health experts recorded 27,000 cases and …read more

Source: HISTORY