

by admin

Getting Into Harvard Was Once All About Social Rank (Not Grades)

March 12, 2019 in History

By Erin Blakemore


It was 1749, and John Winslow was about to go to sea. Any voyage was a risky, time-consuming endeavor in those days, so before he left he sat down to write an important letter. His correspondent wasn’t a loved one or even a friend: It was Edward Holyoke, president of Harvard University.

“Rank in our way is looked upon as a sacred thing,” he wrote, “and it is generally allowed that the sons of New England Cambridge are placed according to the degrees of their ancestors. I have therefore put in my pretensions for my son.” He launched into a long list of his genealogy and his own accomplishments.

Winslow was writing on behalf of his son, Pelham. The hope wasn’t that Pelham would get into Harvard; his son’s ability to pay tuition and his education as the son of a gentleman would have taken care of that. But Winslow knew that college was no meritocracy—and that his family’s rank would seal his son’s fate inside the university.


Harvard University circa 1638.

In the 17th and 18th centuries, Harvard relied on strict class rankings that weren’t based on grades, or even tuition. Rather, the school treated students differently based on the perceived social stature of their parents—rankings that colored every aspect of college life.

At Harvard, the university’s president was personally responsible for the ranking, which was printed each year and posted on the school’s bulletin board. It affected everything from where students sat to the order in which they were called to recite.

“The official notice of this was given by having their names written in a large German text, in a handsome style,” recalled Paine Wingate, who graduated from Harvard in 1759. “This arrangement was never afterward altered either in College or in the Catalogue, however the rank of their parents might be varied.”

Housing decisions were made on the basis of rank, and those of a lower rank were expected to defer to their more highly ranked fellow students. Rank even determined who marched when during commencement.

At the time, notes historian Joseph Kett, “an individual’s likely value to a community was an accepted foundation of social distinctions.” Everything from plots of colonial lands to seats in church were doled out based on social standing, Kett writes, and people were expected to live up to their social status.

Harvard’s system differed from …read more

Source: HISTORY


How St. Patrick’s Day Was Made in America

March 12, 2019 in History

By Christopher Klein

St. Patrick may be the patron saint of Ireland, but many St. Patrick’s Day traditions were born in the United States.

Every March 17, the

After Irish Catholics flooded into the country in the decade following the failure of Ireland’s potato crop in 1845, they clung to their Irish identities and took to the streets in St. Patrick’s Day parades to show strength in numbers as a political retort to nativist “Know-Nothings.”

“Many who were forced to leave Ireland during the Great Hunger brought a lot of memories, but they didn’t have their country, so it was a celebration of being Irish,” says Mike McCormack, national historian for the Ancient Order of Hibernians. “But there was also a bit of defiance because of the bigotry by the Know-Nothings against them.”

McCormack says attitudes toward the Irish began to soften after tens of thousands of them served in the Civil War. “They went out as second-class citizens but came back as heroes,” he says. As the Irish slowly assimilated into American culture, those without Celtic blood began to join in St. Patrick’s Day celebrations.

The meal that became a St. Patrick’s Day staple across the country—corned beef and cabbage—was also an American innovation. While ham and cabbage was eaten in Ireland, corned beef proved a cheaper substitute for impoverished immigrants. McCormack says corned beef became a staple of Irish-Americans living in the slums of lower Manhattan who purchased leftover provisions from ships returning from the tea trade in China.


“When ships came into South Street Seaport, many women would run down to the port hoping there was leftover salted beef they could get from the ship’s cook for a penny a pound,” McCormack says. “It was the cheapest meat they could find.” The Irish would boil the beef three times—the last time with cabbage—to remove some of the brine.

A St. Patrick's Day postcard, circa 1850.

While St. Patrick’s Day evolved in the 20th century into a party day for Americans of all ethnicities, the celebration in Ireland remained solemn. The Connaught Telegraph reported of Ireland’s commemorations on March 17, 1952: “St. Patrick’s Day was very much like any other day, only duller.” For decades, Irish laws prohibited pubs from opening on holy days such as March 17. Until 1961, the …read more

Source: HISTORY


The Massive, Overlooked Role of Female Slave Owners

March 12, 2019 in History

By Becky Little

Most Americans know that George Washington owned enslaved people at his Mount Vernon home. But fewer probably know that it was his wife, Martha, who dramatically increased the enslaved population there. When they wed in 1759, George may have owned around 18 people. Martha, one of the richest women in Virginia, owned 84.

The high number of people Martha Washington owned is unusual, but the fact that she owned them is not. Stephanie E. Jones-Rogers, a history professor at the University of California, Berkeley, is compiling data on just how many white women owned slaves in the U.S. In the parts of the 1850 and 1860 census data she's studied so far, white women make up about 40 percent of all slave owners.

Slaveholding parents “typically gave their daughters more enslaved people than land,” says Jones-Rogers, whose book They Were Her Property: White Women as Slave Owners in the American South came out in February 2019. “What this means is that their very identities as white southern women are tied to the actual or the possible ownership of other people.”

White women were active and violent participants in the slave market. They bought, sold, managed and sought the return of enslaved people, in whom they had a vested economic interest. Owning a large number of enslaved people made a woman a better marriage prospect. Once married, white women fought in courts to preserve their legal ownership over enslaved people (as opposed to their husband’s ownership), and often won. “For them, slavery was their freedom,” Jones-Rogers observes in her book.

They Were Her Property upends a lot of older scholarship. For example, previous scholars have argued that most southern white women didn’t buy, sell or inflict violence on enslaved people because this was considered improper for them. But Jones-Rogers argues that white women were actually trained to participate from a very young age.

An illustration of a slave auction, where both white men and women took part.

“Their exposure to the slave market is not something that begins in adulthood—it begins in their homes when they’re little girls, sometimes infants, when they’re given enslaved people as gifts,” she says. Citing interviews with formerly enslaved people that the Works Progress Administration—a New Deal agency—conducted in the 1930s, Jones-Rogers shows that part of white children’s training in plantation management involved beating enslaved people.

“It didn’t matter whether the child was large or small,” …read more

Source: HISTORY


D-Day: Facts on the Epic 1944 Invasion That Changed the Course of WWII

March 12, 2019 in History

By Dave Roos

The Allied invasion of Normandy was among the largest military operations ever staged. Learn how many fighting forces took part, why it was called D-Day, and key facts about its planning and execution.


On June 6, 1944, more than 156,000 American, British and Canadian troops stormed 50 miles of Normandy’s fiercely defended beaches in northern France in an operation that proved to be a critical turning point in World War II.


Without the brilliant planning and heroic sacrifices of the D-Day invasion, the Allies might never have defeated the Nazi forces in Europe. Below are key facts on the planning and execution of the epic Allied invasion.

1. The ‘D’ in D-Day doesn’t actually stand for anything.
Unlike V-E Day (“Victory in Europe”) or V-J Day (“Victory over Japan”), the “D” in D-Day isn’t short for “departure” or “decision.” As early as World War I, the U.S. military used the term D-Day to designate the launch date of a mission. One reason was to keep the actual date out of the hands of spies; another was to serve as a placeholder until an actual date was chosen. They also used H-Hour for the specific time of the launch.


President Franklin D. Roosevelt and Prime Minister Winston Churchill orchestrated the D-Day plans.

2. The D-Day invasion took years of planning.
Allied leaders Franklin Roosevelt and Winston Churchill knew from the start of the war that a massive invasion of mainland Europe would be critical to relieving pressure on the Soviet army fighting the Nazis in the east. An early plan, "Operation Sledgehammer," called for an Allied invasion of ports in northwest France as early as 1943, but Roosevelt and Churchill decided to invade North Africa first and attack Europe's "soft underbelly" through Italy.

3. D-Day was the largest amphibious invasion in military history.
According to the D-Day Center, the invasion, officially called “Operation Overlord,” combined the forces of 156,115 U.S., British and Canadian troops, 6,939 ships and landing vessels, and 2,395 aircraft and 867 gliders that …read more

Source: HISTORY


The Battle for the Future of U.S. Foreign Policy Has Begun

March 12, 2019 in Economics

By Ted Galen Carpenter

There are signs of growing congressional inconsistency, if not incoherence, regarding the authority of the president in foreign affairs. The legislature seeks to interfere on issues that are the president's responsibility while still failing to fulfill its own constitutionally mandated responsibility regarding the war power.

An incident of misplaced assertiveness took place in January when the House of Representatives passed the NATO Support Act, prohibiting the executive branch from using any funds to facilitate U.S. withdrawal from the Alliance in any way. The legislation appears to bar a drawdown of U.S. troop levels in Europe and any effort to terminate U.S. membership in NATO.

Enacting the legislation seemed weirdly premature. President Donald Trump has not even taken any substantive actions that might diminish U.S. participation in NATO affairs. He has merely criticized the allies for their lack of burden-sharing in the collective defense effort and (correctly) suggested that NATO itself might be "obsolete" given how much the European and global security environments have changed since the Alliance's birth at the dawn of the Cold War against the Soviet Union.

The constitutionality of such restrictions is also highly suspect. The Constitution makes the president the steward of foreign affairs. Presidents historically have enjoyed wide latitude regarding the interpretation, execution, and even termination of U.S. treaties. Chief executives have enjoyed even greater latitude regarding troop deployments, especially in noncombat situations, and the nature, extent, and duration of military commitments to implement treaties or other agreements. Congressional interference in the form of the NATO Support Act would be truly revolutionary, and not in a good way. It is a transparent congressional attempt to usurp the president's rightful constitutional authority and to micromanage U.S. foreign policy.

The NATO issue is not the only case in which congressional efforts are underway to seize control of foreign-policy decisions from the executive. A faction in both houses is now pushing a measure that would prevent the White House from failing to honor U.S. obligations under the Intermediate-Range Nuclear Forces (INF) Treaty until it expires in February 2021. Unlike in the case of NATO, the president has taken tangible steps against the INF Treaty. After accusing Russia of violating its obligations under the treaty, Trump announced that the United States intended to withdraw. It should be stressed that the issue is not whether the president's policies on NATO or the INF are wise or misguided; the relevant issue is whether the Constitution invests the executive or Congress with the authority to make those decisions.

Greater congressional assertiveness on such issues is especially odd given how readily Congress has abandoned its own explicit constitutional authority to control the war power. …read more

Source: OP-EDS


The Hanoi Summit – What Happens Next in U.S.-North Korea Relations

March 12, 2019 in Economics

By Ted Galen Carpenter

Despite the spin that both the Trump administration and the North Korean government adopted, the outcome of the Hanoi summit was a major disappointment. Widespread expectations existed that the meeting would at least produce a joint declaration ending the Korean War, the establishment of liaison offices in the two capitals, and some progress on the thorny issue of Pyongyang's nuclear and missile programs. The abrupt end to the summit without even the publication of a joint communiqué was not according to script.

Both sides apparently wish to continue the bilateral dialogue, and that's a good sign. However, the United States needs to adopt more limited, realistic goals. North Korea's complete, verifiable, and irreversible denuclearization remains a long-shot proposition at best. It would require a degree of mutual trust that does not exist now and is not likely in the foreseeable future. Washington's duplicitous behavior toward Libya and Iran following agreements on their nuclear programs has hardly encouraged such trust.

Instead of continuing to pursue the chimera of an end to Pyongyang's nuclear and missile programs, Washington should propose more modest, limited agreements. Indeed, if North Korea truly did seek only a partial lifting of sanctions in exchange for closing the Yongbyon reactor complex, Trump should have accepted that trade. Likewise, an understanding (written or unwritten) to continue the mutual restraint whereby North Korea refrains from conducting nuclear and missile tests and the United States puts its annual joint military exercises with South Korea on hold would be a limited but constructive step. A peace declaration and the establishment of embassies (not just liaison offices) are both achievable, worthwhile accords. Such measures should be at the top of the agenda for the next summit.

Washington’s overall goal needs to reflect two changes. One is
to establish a normal relationship with Pyongyang. That means
ending the cold war hostility to the North Korean regime, despite
its repression and brutality. A normal relationship likely also
means learning to live with a North Korea that has at least a
limited nuclear capability.

The other policy change should be to reduce America's risk exposure. That means not remaining the point man in dealing with North Korea over the long term. After establishing a more stable bilateral relationship with Pyongyang, U.S. leaders should inform North Korea's neighbors that they must now take primary responsibility for containing that country. It is both bizarre and unnecessary for the United States, located thousands of miles away, to be in charge of policy toward Pyongyang. East Asian countries that have more extensive interests at stake should have that task.

Ted Galen Carpenter, a senior fellow in security studies …read more

Source: OP-EDS


Some Regulations Deter Private Schools from Participating in Voucher Programs

March 12, 2019 in Economics

By Corey A. DeAngelis, Lindsey Burke, Patrick J. Wolf

Regulations of school voucher programs can be well-intended. Policymakers may hope to prevent "bad" schools from operating or may limit schools' ability to be selective in their admissions procedures in the name of establishing equal access to private options. But do top-down regulations of school voucher programs come with any unintended consequences? Our just-released study suggests some do.

We used surveys to randomly assign different regulations commonly found in school choice programs to 4,825 private school leaders in California and New York and asked them whether they would participate in a new private school choice program during the following school year. Here's what we found.

Relative to no additional regulations, open-enrollment mandates — preventing private schools from having specific admissions policies — reduced the likelihood that private school leaders were certain to participate in a hypothetical choice program by 60 percent. State standardized testing requirements reduced the likelihood that private school leaders were certain to participate by 29 percent. However, we found no evidence to suggest that mandating private schools to accept the voucher as full payment or requiring them to administer a nationally norm-referenced test of their own choosing affected the willingness of private school leaders to participate.

These overall results largely mirror what we found in our previous experiment in Florida. Statistically significant overall effects can be found in the figures below.

Our overall results suggest that additional government regulations, beyond those that all private schools face, largely reduce the number of options available to families. But it is possible that regulations were more likely to deter lower-quality private schools from participating in the programs. If so, regulations could have increased the quality level of the private schools participating in the hypothetical voucher programs, on average.

However, using four different measures of school quality — Google review scores, GreatSchools review scores, tuition levels, and enrollment trends — we did not find any statistically significant evidence to suggest that any of the regulations improved the average quality of participating private schools by disproportionately deterring lower-quality schools from participating.

In fact, the only marginally significant result detected indicated that state standardized testing requirements were more likely to deter private schools with higher Google review scores. Specifically, one model found that a one-point increase in Google review scores (on a five-point rating scale) was associated with a 14.5 percentage point larger negative effect of the state testing mandate on anticipated program participation for higher-quality schools compared to lower-quality ones. As we've hypothesized before, regulations could actually reduce the average quality levels of participating private schools. Lower-quality private schools may be more …read more

Source: OP-EDS