Thursday, February 27, 2020

The End of the Democratic German Revolution



For four years, the German Empire took on much of the world in what was originally called the Great War. Berlin had allies, after a fashion — the decrepit Austro-Hungarian and Ottoman empires, along with Bulgaria, a smaller kingdom on the outs with its Balkan neighbors. Against them stood a formidable array: the United Kingdom and its dominions and colonies, France, the Russian Empire, Italy, Japan, Romania, Serbia, and the United States.
The Germans did surprisingly well, defeating Russia and Romania, stalemating and almost knocking out France, and undergirding Vienna against Italy and Serbia. But preserving the Ottomans against the UK and an Arab revolt was beyond Germany’s power. The German military ultimately could not stop the disintegration of the ramshackle, multi-ethnic Austro-Hungarian Empire. Nor could the Imperial German Army, the world’s finest fighting force, withstand years of privation under the crippling British blockade, the spread of destructive Bolshevik propaganda by German troops redeployed from the east, and the arrival of the ever-expanding American Expeditionary Force.
By September 1918, the German army was losing ground as Berlin’s allies collapsed. The Supreme Army Command, represented by Gen. Erich Ludendorff, Germany’s de facto dictator, informed the Kaiser that a ceasefire was imperative. The government was turned over to the elected Reichstag as negotiations with the Allies commenced. The Entente agreed to a ceasefire, but only on condition that the Reich surrender its heavy weapons to prevent a resumption of hostilities. The resulting armistice took effect on November 11.
The Kaiser wanted to remain, but his people had different ideas. The navy planned a last suicidal sortie for the Imperial Fleet, sparking a mutiny by sailors that grew as workers and soldiers joined in. Workers’ and Soldiers’ Councils spread across the country. In Bavaria, socialist journalist Kurt Eisner led a revolution that proclaimed a Volksstaat, or People’s Republic. Widely hated for publishing documents that implicated Germany in the war’s outbreak, he was later murdered as he prepared to resign. The lesser monarchs of the individual German states, who retained some authority, abdicated before mobs could force them out. The Councils ended up in conflict with the Social Democrats, who pushed constitutional, democratic reform.
In early November, the Kaiser still hoped to return with the army from the battlefield to suppress opposition, but his plans were disconnected from reality. Friedrich Ebert, head of the Social Democrats, warned, “If the Kaiser does not abdicate, the social revolution is unavoidable. But I do not want it, indeed I hate it like sin.” On November 9, the civilian authorities announced the Kaiser’s abdication without his approval, followed by Ebert’s appointment as chancellor. Ebert’s short-lived predecessor, Prince Maximilian von Baden, a liberal, said to Ebert, “I entrust you with the German Reich.” The latter responded, “I have lost two sons for this Reich.”
Ebert was born in 1871 and learned the saddlemaker’s trade. He became active politically and settled in Bremen, where he became local party chairman. In 1905, he became SPD secretary general and in 1913 was elected party co-chairman. A moderate, he led the majority into the Burgfrieden, a pact in which the political parties agreed to set aside their differences for the duration of the war. By backing the government, as most socialists in the other combatant nations also did, he ensured a split with radicals, who then left the SPD. In January 1918, he simultaneously supported striking workers at a munitions plant and urged them back to work. Some saw him as a traitor to Germany, others as a traitor to the workers. Increasingly he cooperated with the more moderate bourgeois parties, a practice that continued after the Kaiser’s fall.
As Ebert took over as chancellor, Karl Liebknecht, founder of the radical Marxist Spartacus League, proclaimed a socialist republic, which envisioned revolutionary change. An intense power struggle ensued in the following weeks, with the moderate Social Democrats maintaining control of the government and ultimately relying on army troops and nationalist auxiliaries to rebuff communist rebellion. After the Kaiser’s departure, the pragmatic Ebert made what became known as the Ebert–Groener deal with Gen. Wilhelm Groener, successor to Ludendorff as army quartermaster general. Ebert promised to protect the military’s autonomy and suppress left-wing revolts while Groener pledged the armed services’ loyalty to the government.
The two set up a secret hotline for daily communication and worked together in December to suppress a short-lived revolt by naval personnel. Nationalist Freikorps forces also fought against revolutionaries with great ferocity. Such battles were replicated across the country. Many on the left turned against Ebert and the Social Democrats. In early January, the Spartacists, named after the leader of Rome’s most famous slave revolt, and the newly founded Communist Party of Germany attempted to seize Berlin and oust the Ebert government. The army and Freikorps crushed the putsch.
A week later a constituent assembly was elected, which met in the city of Weimar, away from turbulent Berlin, and chose Ebert as provisional Reichspräsident. A constitution was drafted for what became known as the Weimar Republic. Ebert’s presidential term was later extended to 1925 to avoid an election amid political upheaval.
Perhaps his gravest challenge in 1919 was the Versailles Treaty. Negotiated by the Allies without German input, it was presented as what the Germans called a Diktat. Government officials resigned rather than sign, leaving Gustav Bauer, a leading Social Democrat, at the head of the cabinet. He sought changes, but the Allies issued an ultimatum: if Berlin did not sign, they would resume their march, which the German army admitted it could not prevent. Bauer capitulated. When the treaty came before the National Assembly, Ebert asked Germany’s military command whether the army could defend the country against an Allied invasion. Told no, he urged the legislators to ratify the document, which they did on July 9.
Weimar’s life was more short than sweet. Resistance to Berlin’s rule continued in other parts of Germany, and Ebert turned to the security forces to eliminate radical workers’ councils. In 1920, disaffected army troops and Freikorps members staged a coup, known as the Kapp Putsch after the civil servant Wolfgang Kapp. The government was forced to flee, but a general strike ended the revolt. The SPD-dominated government told the German people, “Only a government based on the constitution can save Germany from sinking into darkness and blood. If Germany is led from one coup to another, then it is lost.”
In the following years, however, officials were assassinated by nationalists, and Adolf Hitler joined the Nazi Party; in 1923 he attempted to seize power in Bavaria with the Beer Hall Putsch, which was put down by the German police and army. Ebert appointed right-leaning officials and employed the president’s emergency powers — which eventually supplanted parliamentary rule — 134 times. The republic slowly gained authority, but not legitimacy.
Opponents concocted the infamous Dolchstoßlegende, or “stab-in-the-back myth.” Ludendorff’s request that the Kaiser seek a truce was conveniently forgotten as nationalists argued the military was not defeated in the field, but instead had been betrayed at home. This claim was reinforced by the fact that civilians signed the hated Versailles Treaty.
The agreement detached German territory, placed ethnic Germans under foreign rule, blamed the war on Berlin, restricted the German military, and obligated the Germans to pay reparations for the entire cost of the conflict. The Allies fell between two stools: they neither imposed a Carthaginian peace, effectively dismantling Germany, nor conciliated Berlin to create a stable international order. And when Berlin failed to comply, the former Entente powers neither enforced nor rewrote the treaty’s terms. The half-measures fueled political discontent and extremism.
The result was disaster, but Ebert did not live to see the future. Sickly, he died of septic shock after an attack of appendicitis on February 28, 1925. In a tragedy that Germans did not then understand, he was replaced by conservative authoritarian Paul von Hindenburg, the talented field marshal whose reputation survived Germany’s defeat. The latter narrowly defeated Wilhelm Marx, the candidate of the moderate Centre Party of Germany, dominated by Catholics. Had Marx been elected, German democracy would have had a better chance to survive, despite the manifold challenges to come.
In the years that followed, the aftershocks of hyperinflation and then the Great Depression ravaged the German middle class. The Nazis and Communists together gained a blocking majority in parliament, forcing the aging Hindenburg, no fan of Hitler, to rule by emergency decree. In January 1933 Hindenburg nevertheless appointed Hitler chancellor, and the new chancellor quickly consolidated power. On the president’s death on August 2, 1934, the Nazi Führer took over his powers, as well. The sadly imperfect Weimar democratic experiment was over. From there the political road led directly to World War II and the Holocaust.
Ebert’s role remains controversial. Today’s Social Democrats embrace him, having named their party think tank after him. But some on the left attack his support for the German state in World War I and later tactical alliance with conservative, even reactionary forces. Some on the right argue that he undermined the German state when it was at its most vulnerable in World War I.
In fact, Ebert balanced conflicting responsibilities unusually well, despite inevitable missteps. He faced challenges beyond the abilities of most statesmen. Whatever his initial plans, he became Germany’s most pivotal defender of democracy and liberal governance. He steadfastly backed elections even when presented with the opportunity to seize power in the name of a socialist workers’ state. He turned to the military, but only against radicals determined to impose their rule on others. And he governed with prudence and restraint, refusing to wreck an already ravaged nation by embarking on hopeless resistance to the Allies and the Versailles Treaty.
Ebert was only 54 when he died. Had he lived, his nation still would have faced abundant economic and political crises. But it is hard to imagine him appointing Hitler as chancellor. Ebert might have found a way to create and preserve a Left–Right alliance against the extremes. Perhaps that, too, would have proved to be a dead end. But we shall never know since, tragically, it was the road not taken.


 

Friday, January 10, 2020


What explains the curious persistence of the Myers–Briggs personality test?


BOOK REVIEW of "What’s Your Type? The Strange History of Myers–Briggs and the Birth of Personality Testing" by Merve Emre

Comments by Australian psychologist Nick Haslam below. Haslam is good at exposing the Myers–Briggs nonsense, but he is not equally good at examining his own assumptions.



Standing at the end of a line, pressed up against the glass wall of a well-appointed meeting room, I asked myself the rueful question that all personality psychologists have posed at least once: why is the Myers–Briggs Type Indicator so damned popular? The smart, charismatic consultant facilitating this leadership course had given the questionnaire to his class and instructed us to line up according to our scores on extraversion–introversion. Far to my right on this spectrum of perkiness stood a colleague with a double-espresso personality; down this end, with no one to my left, I was decidedly decaf.

Let me get off my chest what’s wrong with the Myers–Briggs, or MBTI as it is known in the acronymphomaniac world of personality testing. The MBTI classifies people according to four binary distinctions: whether they are extraverts or introverts, intuitive or sensing types, thinkers or feelers, and judges or perceivers. Three of these distinctions rest on an archaic theory of personality typing proposed by Carl Jung, and the fourth was invented and grafted on by the test’s developers.

The four distinctions bear little relation to what decades of systematic research have taught us about the structure of personality. They are smeared unevenly over four of the five dimensions that most contemporary personality psychologists accept as fundamental, and completely ignore a fifth, which is associated with the tendency to experience negative emotions. The same effort to erase the dark side of personality is evident in the MBTI’s use of sanitising labels to obscure the negative aspects of its four distinctions. In large measure, being a thinking type amounts to being interpersonally disagreeable, and being a perceiving type to being impulsive and lacking in persistence. But in MBTI-world, all personality types are sunnily positive, a catalogue of our “differing gifts.”

The MBTI doesn’t only misrepresent the content of personality. It also gets the nature of personality fundamentally wrong. Despite masses of scientific evidence that human personality is not composed of types, its four distinctions are understood as crisp dichotomies that combine to yield sixteen discrete personality “types,” each with a four-letter acronym such as INTJ or ESFP. In reality, personality varies by degrees along a set of continuous dimensions, just like height, weight or blood pressure. In the face of mountains of research demonstrating that personality is malleable throughout the lifespan, proponents of the MBTI also argue that one’s type is inborn and unchanging. In short, the MBTI presents personality as a fixed essence whereas the science of personality shows it to be a continuous flux.
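The arithmetic behind those sixteen types is nothing more than crossing the four dichotomies, as this small illustration shows (illustrative only, not any official scoring routine):

# Illustrative only: the MBTI's sixteen "types" are just the cross of
# four binary distinctions, 2 x 2 x 2 x 2 = 16.
from itertools import product

PAIRS = [("E", "I"), ("S", "N"), ("T", "F"), ("J", "P")]
types = ["".join(combo) for combo in product(*PAIRS)]

print(len(types))  # 16
print(types[:4])   # ['ESTJ', 'ESTP', 'ESFJ', 'ESFP']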

The MBTI also fails to meet the standard statistical requirements of psychological tests. Its items employ a problematic forced-choice format that requires people to decide which of two statements describes them better. Its scales lack coherence. The typology lacks re-test reliability, which means that people are commonly scored as having different types when they complete the measure on two separate occasions. Evidence that MBTI type correlates with real-world behaviour — known as predictive validity in the trade — is scant.
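To make the re-test point concrete, here is a minimal sketch of the difference between dimensional reliability and type stability. The scores and the cutoff are invented for illustration, not MBTI data: two administrations can correlate well, yet the all-or-nothing type assignment still flips for anyone near the cutoff.

# Hypothetical illustration: dimensional reliability vs. type stability.
# Eight invented people complete an extraversion scale twice.
from statistics import correlation  # Python 3.10+

first  = [12, 25, 18, 30,  9, 22, 16, 27]  # time-1 scores (invented)
second = [14, 23, 20, 24, 11, 19, 21, 25]  # time-2 scores (invented)

# Dimensional view: re-test reliability as the correlation across occasions.
print(f"re-test correlation: {correlation(first, second):.2f}")  # about 0.91

# Typological view: dichotomize at a cutoff and count matching "types".
# People near the cutoff flip sides easily, so type agreement stays fragile
# even when the underlying scores are quite stable.
CUTOFF = 20
same = sum((a > CUTOFF) == (b > CUTOFF) for a, b in zip(first, second))
print(f"same type both times: {same} of {len(first)}")  # 6 of 8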

So why is a test with weak psychometric credentials, based on a musty theory of personality that gets the structure of human personality wrong, so enduringly popular? Arguably its weaknesses from a scientific standpoint are precisely what give it its appeal. Personality may not really form discrete types, but people relish the clarity of noun categories and binary oppositions. Personality may not really come in sixteen flavours, but MBTI types are sweet simplifications. Personality may be mutable, but people find reassurance in the idea that they have an unchanging true self. And the average person could not give two hoots about the statistical considerations that trouble test developers.

What matters to most people, at least those who complete the MBTI as an exercise in self-understanding rather than a compulsory workplace activity, is whether it offers accessible and palatable insight. And the MBTI undoubtedly provides that in spades. Its four-letter codes are readily grasped, its descriptions flatter our strengths, and the fact that its four distinctions bear some relationship to fundamental personality traits ensures that it offers a certain truthiness.

Although the shortcomings of the MBTI have been discussed within academic psychology for decades, a historical analysis has been lacking. Merve Emre’s fascinating new book fills that gap stylishly. Emre, a literature academic at Oxford, documents the genesis of the MBTI in the Jungian enthusiasms of Katharine Briggs and the more worldly ambitions of her daughter, Isabel Briggs Myers. Despite the subtitle’s questionable reference to the “birth” of personality testing — the first test dates back almost another thirty years to the first world war — the book’s recounting of the origins of the instrument is colourful and revealing.

Katharine Briggs emerges as someone single-mindedly devoted to making sense of human individuality and using that sense to guide people in directions to which she believed them suited. As a young mother without training in psychology, she developed a system of personality typing that she used in an informal child guidance, or “baby training,” enterprise, later finding a resonance between her ideas and those expressed in Carl Jung’s Psychological Types, which was published in 1921. Jung became Katharine’s “personal God”: at one point she wrote a hymn to him (“Upward, upward, from primal scum / Individuation / Is our destination / Hoch, Heil, Hail to Dr Jung!”). Encouraged by her correspondence with the great man, and armed with 3ʺ x 5ʺ index cards, Katharine refined her classification system and compulsively typed everyone she encountered, from neighbourhood children to Adolf Hitler.

Katharine’s daughter Isabel Briggs Myers had a more pragmatic cast of mind but inherited her mother’s absorption in types. After writing two mystery novels, she developed an early version of the MBTI while working for America’s first corporate personality consultant in 1943. Soon after, she launched it as a small commercial proposition. In the late 1950s the questionnaire was picked up by the Educational Testing Service, an eminent test developer and publisher in Princeton, New Jersey, giving it a chance at mainstream success and respectability. Seeing the instrument as “little better than a horoscope,” ETS staff insisted on conducting the same validation research as any other test would undergo, but Isabel remained resistant and possessive. After endless wrangling between Isabel and the staff psychometricians, the ETS lost interest and cut its losses. Eventually a new publisher released the MBTI as a self-scored test and it quickly became a staple of the US$2 billion personality assessment industry, especially beloved by personnel consultants.

As history goes, Emre’s book is compelling and well paced. It presents Katharine and Isabel as rounded characters and places them in a richly drawn cultural and historical context. But as an account of personality testing more generally, the book is flawed. Despite having chronicled the many ways in which the MBTI was a cuckoo in the nest of personality psychology — the product of obsessed amateurs, disparaged by the psychometric orthodoxy at the ETS, popularised rather than professionalised — Emre sees it as emblematic. An emblem it is not. Unlike most other major tests, its use is not restricted to trained professionals and its legacy is protected by an almost cultish organisation that forbade Emre access to most of the Briggs–Myers papers, despite their officially being open to the public. Unlike other tests, the MBTI doesn’t promote itself by appeal to a validating body of scientific evidence. To treat the MBTI as representative of contemporary personality testing is like presenting the primal scream as representative of modern psychotherapy.

Emre is on more solid ground when she describes the functions of workforce personality testing, using the MBTI as an example. Its key purpose in that domain — only one of several in which it is used, it must be said — is indeed to select people who are likely to perform better than others in particular lines of work. Ideally that rationale is backed by evidence that the tests are valid predictors of workplace performance. Whether this purpose is benign or sinister is open to debate. It can be viewed positively as the legitimate application of behavioural science to enhance the wellbeing of workers and the success of organisations, or negatively as a dystopian tool for creating human cogs for the corporate machine.

Emre favours the darker interpretation, writing that personality typing “conscripts people into bureaucratic hierarchies.” This charge is hyperbolic: even if one is critical of the use of the MBTI or other testing, it does not force people into any position against their will, it is not employed exclusively in bureaucratic organisations, and it is used at least as much to differentiate people horizontally according to their strengths as it is to stratify them in hierarchies. The very same charge could be made against any other approach to selecting or assigning people to organisational roles, including interviews, hiring quotas or old boy networks.

The key question has to be whether personality testing selects and assigns people to work roles in ways that are better or worse than its alternatives: whether it is fairer, more valid, more efficient or more desirable than some other method. Unless there are grounds for believing that personality tests are worse than these alternatives, to criticise them for conscripting people into bureaucratic hierarchies is merely to express hostility to bureaucratic hierarchies.

Emre also struggles to form a consistent view when she discusses personality testing’s relationship to individuality. At times she presents the MBTI as a tool that promotes individualism by claiming to clarify each person’s specialised strengths and aid in their quest for self-discovery. At others she describes it in over-heated terms as “liquidating” or “annihilating” the self, as if a questionnaire had the capacity to destroy the person’s uniqueness. Here she cites the work of German social theorist Theodor Adorno, fierce critic of commodification (and jazz), who proclaimed that personality tests undermine human individuality.

Emre never quite resolves these antithetical views, but the paradox is only apparent. Receiving a score on a personality test, or even being assigned to an MBTI “type”, does not submerge individuality. It simply provides it with a partial description that other people may share. Being described as brunette, overweight, liberal or a typical Taurus does not undermine a person’s selfhood but merely qualifies it, and the same is true when someone is described as being an ENTP. MBTI types, for all their conceptual failings, don’t reduce personal identity to one of sixteen psychological clones. They simply offer people a language for capturing some aspects of their personal distinctiveness.

In passing, Adorno’s critique of the “reified consciousness” involved in personality testing has a certain irony to it. In one of his books he recalled being asked by an American colleague whether he was an extravert or an introvert, writing contemptuously that “it was as if she, as a living being, already thought according to the model of multiple-choice questionnaires.” A few years later, while conducting his influential studies of authoritarianism, Adorno proceeded to create his own multiple-choice personality questionnaire.

Another confusion arises in Emre’s discussion of personality typology. Remembering the horrors of the Holocaust, Adorno rightly condemned the practice of assigning people to categorical types. This is a legitimate criticism of the MBTI, whose proponents view personality types as discrete and unchanging facts of nature. (Emre writes that Isabel Briggs Myers was astonished to find that scores on the MBTI’s scales were distributed in a bell curve, not in the camel-humped way that type theory supposed.) Emre notes this criticism of typology but then mistakenly applies it to personality testing in general. In contrast to the MBTI, almost all personality tests are explicitly anti-typological. These tests assess differences between people along a continuum without invoking bogus categories, and they do not make ill-founded claims that their scores correspond to unchanging personal essences. By failing to recognise that typological thinking is a specific failing of the MBTI, Emre misses the extent to which major criticisms of that instrument do not tarnish personality testing as a whole.

To serious students of personality, the continuing success of the MBTI within the testing industry is a source of bafflement. Emre’s book does not diminish that dismay, but it helps to clarify why the instrument is the way it is. Despite its unpromising beginnings, she demonstrates that it has a powerful appeal, offering an intuitively attractive way to apprehend ourselves as a pattern of distinctive strengths. In Emre’s preferred Foucauldian terminology, the MBTI is an effective “technology of the self.” The fact that it is a rather Bronze Age technology is almost immaterial.

https://insidestory.org.au/not-my-type/

Wednesday, December 25, 2019


When Massachusetts was the battlefield in the war on Christmas


Theologically, the Puritans were perfectly right

Ebenezer Scrooge and the Grinch had nothing on the 17th-century Puritans, who actually banned the public celebration of Christmas in the Massachusetts Bay Colony for an entire generation.



The pious Puritans who sailed from England in 1630 to found the Massachusetts Bay Colony brought with them something that might seem surprising for a group of devout Christians—contempt for Christmas. In a reversal of modern practices, the Puritans kept their shops and schools open and churches closed on Christmas, a holiday that some disparaged as “Foolstide.”


A Puritan governor disrupting Christmas celebrations.

After the Puritans in England overthrew King Charles I, among their first items of business was to ban Christmas: in 1647, Parliament decreed that December 25 should instead be a day of “fasting and humiliation” for Englishmen to account for their sins. (The deposed monarch was beheaded in 1649.) The Puritans of New England eventually followed the lead of those in old England, and in 1659 the General Court of the Massachusetts Bay Colony made it a criminal offense to publicly celebrate the holiday and declared that “whosoever shall be found observing any such day as Christmas or the like, either by forbearing of labor, feasting, or any other way” was subject to a 5-shilling fine.

Why did the Puritans loathe Christmas? Stephen Nissenbaum, author of “The Battle for Christmas,” says it was partly because of theology and partly because of the rowdy celebrations that marked the holiday in the 1600s.

In their strict interpretation of the Bible, the Puritans noted that there was no scriptural basis for commemorating Christmas. “The Puritans tried to run a society in which legislation would not violate anything that the Bible said, and nowhere in the Bible is there a mention of celebrating the Nativity,” Nissenbaum says. The Puritans noted that scriptures did not mention a season, let alone a single day, that marked the birth of Jesus.


Increase Mather

Even worse for the Puritans were the pagan roots of Christmas. Not until the fourth century A.D. did the church in Rome ordain the celebration of the Nativity on December 25, and that was done by co-opting existing pagan celebrations such as Saturnalia, an ancient Roman holiday of lights marked with drinking and feasting that coincided with the winter solstice. The noted Puritan minister Increase Mather wrote that Christmas occurred on December 25 not because “Christ was born in that month, but because the heathens’ Saturnalia was at that time kept in Rome, and they were willing to have those pagan holidays metamorphosed into Christian [ones].”

According to Nissenbaum, “Puritans believed Christmas was basically just a pagan custom that the Catholics took over without any biblical basis for it. The holiday had everything to do with the time of year, the solstice and Saturnalia and nothing to do with Christianity.”

The pagan-like way in which Christmas was celebrated troubled the Puritans even more than the underlying theology. “Men dishonor Christ more in the 12 days of Christmas than in all the 12 months besides,” wrote 16th-century clergyman Hugh Latimer. Christmas in the 1600s was hardly a silent night, let alone a holy one. More befitting a rowdy spring break than a sacred occasion, Christmas revelers used the holiday as an excuse to feast, drink, gamble on dice and card games and engage in licentious behavior.

In a Yuletide twist on trick-or-treating, men dressed as women, and vice versa, and went door-to-door demanding food or money in return for carols or Christmas wishes. “Bands of mostly young people and apprentices would go house to house and demand that the doors of prosperous people be open to them,” Nissenbaum says. “They felt they had a right to enter the houses of the wealthy and demand their high-quality food and drink—not meager handouts, but the stuff prosperous people would serve to their own families.” Those who failed to comply could be greeted with vandalism or violence.

Even after public commemoration of Christmas was once again legal in England following the restoration of the monarchy in 1660, the Yuletide ban remained firmly on the books in Massachusetts for an entire generation. Although outlawed in public, the celebration of Christmas endured in private homes, particularly in the fishing towns far from the center of Puritan power in Boston, which Nissenbaum writes were “notorious for irreligion, heavy drinking and loose sexual activity.”

In his research, Nissenbaum found no records of any prosecutions under the 1659 law. “This was not the secret police going after everybody,” he says. “It’s clear from the wording of the ban that the Puritans weren’t really concerned with celebrating the holiday in a quiet way privately. It was for preventing disorders.”

The prohibition of public Christmas celebrations was unique to Massachusetts, and under the reign of King Charles II political pressure from the motherland steadily increased for the colony’s Puritan leaders to relax their intolerant laws or risk losing their royal charter. In 1681, the Massachusetts Bay Colony reluctantly repealed its most odious laws, including the ban on Christmas.

Hostility toward the public celebration of Christmas, however, remained in Massachusetts for years to come. When newly appointed royal governor Sir Edmund Andros attended Christmas Day religious services at Boston’s Town House in 1686, he prayed and sang hymns while flanked by Redcoats guarding against possible violent protests. Until well into the 1800s, businesses and schools in Massachusetts remained open on December 25 while many churches stayed closed. Not until 1856 did Christmas—along with Washington’s Birthday and the Fourth of July—finally become a public holiday in Massachusetts.

This war on Christmas, to coin a phrase, lasted a remarkably long time in Massachusetts. More than 100 years after the Legislature repealed its ban on the holiday, the Puritan-infused hostility to Yuletide merriment remained palpable.

"When I was a school-boy I always went to school on Christmas Day, and I think all the other boys in town did," recalled Edward Everett Hale, the popular Boston author and preacher, in the December 1889 issue of New England Magazine. On Christmas Eve, Hale and his schoolmates might walk past King's Chapel — the city's first Anglican church, where Christmas services were held — and "see the men carrying hemlock for the decorations. But that was the only public indication that any holiday was approaching." When he lived in Worcester as a young man in the 1840s, Hale wrote, Christmas for many people was still a non-event. "The courts were in session on that day, the markets were open, and I doubt if there had ever been a religious service."

As late as 1856, Henry Wadsworth Longfellow could still describe New England as being "in a transition state about Christmas." There was enough of the old Puritan animus to keep it from being a "cheerful hearty holiday," he said. But "every year makes it more so."

Eventually, of course, popular culture came down unreservedly in favor of Christmas. Santa Claus and light-strewn trees, midnight masses and Handel's "Messiah," holiday eggnog and gift-wrapped presents — today they're as much a part of Christmas in Boston and New England as in any other part of the country. In countless ways, American life is still influenced by those devout English Christians who sailed to New England four centuries ago. But not when it comes to Christmas.

God rest them merry, but that's one war the Puritans lost.

https://www.history.com/news/when-massachusetts-banned-christmas

http://www.jeffjacoby.com/23614/when-massachusetts-was-the-battlefield-in-the-war

Wednesday, September 25, 2019


Scientists have discovered evidence of an ancient kingdom previously thought to have been a mythical creation in the Bible.


The Old Testament described Edom as a neighbouring enemy state of Judea, located southeast of the Dead Sea in territory that today includes parts of Jordan and Israel.

It was spoken of extremely harshly, with some biblical texts indicating that it was complicit in the destruction of Judea and the holy city of Jerusalem.

Edom has been described as a place "where kings reigned before any Israelite king reigned", but is later said to have been defeated and plundered by King David of Israel.

Such tales have been scoffed at by plenty of historians down the years, but discoveries by a team of scientists and archaeologists in the area where it would have stood have raised new questions about its possible existence.

Researchers from the University of California and Tel Aviv University have been working at the supposed site in what is now known as the Arabah Valley.

There they excavated a copper production site dubbed Slaves' Hill, dating back more than 6,000 years, which yielded layers of smelting waste that have helped reconstruct a time when the region enjoyed a "technological leap".

Using a process called radiocarbon dating, which helps determine how old an organic object is, the researchers were able to put a date on the smelting waste - better known as slag.
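For readers curious about the underlying arithmetic, radiocarbon dating rests on the exponential decay of carbon-14, which has a half-life of about 5,730 years. The sketch below is a simplification with an invented sample fraction; real work dates organic material associated with the slag and calibrates the raw result against tree-ring curves.

# Simplified sketch of the radiocarbon arithmetic (real dating also
# requires calibration). Carbon-14 decays exponentially, so the fraction
# remaining in a sample implies its age.
import math

HALF_LIFE = 5730.0                # carbon-14 half-life, in years
DECAY = math.log(2) / HALF_LIFE   # decay constant, per year

def radiocarbon_age(fraction_remaining: float) -> float:
    """Age in years implied by the surviving fraction of carbon-14."""
    return math.log(1.0 / fraction_remaining) / DECAY

# Hypothetical example: charcoal retaining ~70% of its original C-14.
print(f"{radiocarbon_age(0.70):,.0f} years")  # about 2,950 years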


The modern-day Arabah Valley

Analysis of the minerals and metals within the slag was then used to work out how smelting techniques changed over the centuries, with lower concentrations of copper indicating that more had been extracted.
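The logic of that inference can be put in simple mass-balance terms: copper that does not end up in the slag was recovered as metal. A toy sketch with invented grades follows; the paper's actual analysis is far more sophisticated.

# Toy mass-balance illustration (invented numbers, not the study's data):
# the less copper left behind in the slag, the more was extracted.
def recovery(ore_cu_pct: float, slag_cu_pct: float) -> float:
    """Rough fraction of copper recovered, assuming the slag mass is
    comparable to the ore mass (a deliberate simplification)."""
    return 1.0 - slag_cu_pct / ore_cu_pct

print(f"early slag at 1.5% Cu from 2% ore: {recovery(2.0, 1.5):.0%} recovered")
print(f"later slag at 0.5% Cu from 2% ore: {recovery(2.0, 0.5):.0%} recovered")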

Efficiency improved dramatically in the second half of the 10th century BC and the techniques also became common across various sites in the region - indicating that other workers were picking them up.

Detailing the findings in the journal PLOS ONE, team leader Erez Ben-Yosef said the technological leap played a key role in the move from the Bronze Age to the Iron Age.

"Our study sheds new light on the emergence of the archaeologically elusive biblical kingdom of Edom, indicating that the process started much earlier than previously thought," he said.

"That said, the study's contribution goes beyond the Edomite case, as it provides significant insights on ancient technological evolution and the intricate interconnections between technology and society.

"The results demonstrate that the punctuated equilibrium evolutionary model is applicable to ancient technological developments, and that in turn, these developments are proxies for social processes."

https://www.msn.com/en-au/news/world/ancient-kingdom-presumed-to-be-bible-story-could-be-real/ar-AAHPrBR?OCID=ansmsnnews11


Saturday, September 7, 2019


Modern-day Indians are descendants of one of humanity's most ancient civilizations


Ancient DNA evidence reveals that the people of the mysterious and complex Indus Valley Civilization are genetically linked to modern South Asians today.

The same gene sequences, drawn from a single individual who died nearly 5,000 years ago and was buried in a cemetery near Rakhigarhi, India, also suggest that the Indus Valley developed farming independently, without major migrations from neighboring farming regions.

It's the first time an individual from the ancient Indus Valley Civilization has yielded any DNA information whatsoever, enabling researchers to link this civilization both to its neighbors and to modern humans.

The Indus Valley, or Harappan, Civilization flourished between about 3300 B.C. and 1300 B.C. in the region that is now covered by parts of Afghanistan, Pakistan and northwestern India, contemporaneous with ancient Egypt and Mesopotamia.

The people of the Indus Valley forged an impressively advanced civilization, with large urban centers, standardized systems of weights and measurements and even drainage and irrigation systems. Yet despite that sophistication, archaeologists know far less about the civilization than that of ancient Egypt or Mesopotamia, in part because the Indus Valley writing system hasn't yet been deciphered.

Gathering ancient DNA from the Indus Valley is an enormous challenge, Vagheesh Narasimhan, one of the leading authors of the new research and a postdoctoral fellow in genetics at Harvard Medical School, told Live Science, because the hot, humid climate tends to degrade DNA rapidly. Narasimhan and his colleagues attempted to extract DNA from 61 individuals from the Rakhigarhi cemetery and were successful with only one: a skeleton, likely belonging to a female, found nestled in a grave amid round pots, her head to the north and feet to the south.

The first revelation from the ancient gene sequences was that some of the inhabitants of the Indus Valley are connected by a genetic thread to modern-day South Asians. "About two-thirds to three-fourths of the ancestry of all modern South Asians comes from a population group related to that of this Indus Valley individual," Narasimhan said.

Where the Indus Valley individual came from is a more difficult question, he said. But the genes do suggest that the highly agricultural Indus people were not closely related to their farming neighbors in the western part of what is now Iran.

"We were able to examine different associations between the advent of farming in that part of the world with the movement of people in that part of the world," said Narasimhan.

Farming, Narasimhan said, first began in the Fertile Crescent of the Middle East around 10,000 years ago. No one knows precisely how it spread from there. Did agriculture pop up independently in areas around the globe, perhaps observed by travelers who brought the idea to plant and cultivate seeds back home? Or did farmers move, bringing their new agricultural lifestyle with them?

In Europe, the genetic evidence suggests that the latter is true: Stone Age farmers introduced Southern Europe to agriculture, then moved north, spreading the practice as they went.

But the new Indus Valley genetic evidence hints at a different story in South Asia. The Indus Valley individual's genes diverged from those of other farming cultures in Iran and the Fertile Crescent before 8000 B.C., the researchers found.

“It diverges at a time prior to the advent of farming almost anywhere in the world,” Narasimhan said. In other words, the Indus Valley individual wasn’t the descendant of wandering Fertile Crescent farmers. She came from a civilization that either developed farming on its own, or simply imported the idea from neighbors — without importing the actual neighbors.

Both immigration and ideas are plausible ways to spread farming, Narasimhan said, and the new research suggests that both happened: immigration in Europe, ideas in South Asia. The results appear today (Sept. 5) in the journal Cell.

Complex populations

The researchers also attempted to link the Indus Valley individual to his or her contemporaries. In a companion paper published today in the journal Science, the researchers reported on ancient and modern DNA data from 523 individuals who lived in South and Central Asia over the last 8,000 years.

Intriguingly, 11 of these people — all from outside the Indus Valley — had genetic data that closely matched the Indus Valley individual. These 11 people also had unusual burials for their locations, Narasimhan said. Together, the genetic and archaeological data hint that those 11 people were migrants from the Indus Valley Civilization to other places, he said.

However, these conclusions should be viewed as tentative, warned Jonathan Mark Kenoyer, an archaeologist and expert on the Indus Valley Civilization at the University of Wisconsin, Madison, who was not involved in the new research. Archaeological evidence suggests that Indus Valley cities were cosmopolitan places populated by people from many different regions, so one person's genetic makeup might not match the rest of the population. Furthermore, Kenoyer said, burial was a less common way of dealing with the dead than cremation.

"So whatever we do have from cemeteries is not representative of the ancient populations of the Indus cities, but only of one part of one community living in these cities," Kenoyer said.

And though the Indus individual and the 11 potential migrants found in other areas might have been related, more ancient DNA samples will be needed to show which way people, and their genes, were moving, he said.

Narasimhan echoed this need for more data, comparing the cities of the Indus Valley to modern-day Tokyo or New York City, where people gather from around the world. Ancient DNA is a tool for understanding these complex societies, he said.

"Population mixture and movement at very large scales is just a fundamental fact of human history," he said. "Being able to document this with ancient DNA, I think, is very powerful."


https://www.livescience.com/south-asians-descend-from-harappan-civilization.html

Thursday, September 5, 2019


Cows have the right of way, the phone book features nicknames and homes are sold with the furniture thrown in





Norfolk Island, which is roughly 1500km off Byron Bay, has a host of fun facts under its belt — like how there are no traffic or street lights — but perhaps the most surprising is that the peaceful and picturesque South Pacific island has a median house price of $400,000.

A place most of us have heard of, but know little about, Norfolk Island was for a long time an Australian territory with quirky laws restricting the purchase of real estate to those who were born on the island, married an islander or bought a local business.

In 2015, the remote island underwent historic changes to its ownership rules when the Australian government announced comprehensive reforms. Since then, the dreamy locale has opened up to a world of new househunters.

“It’s now the same as if you’re moving from Sydney to Melbourne, there are no restrictions whatsoever for Australians and New Zealanders,” said David Hall, a local real estate agent who has been selling property on the 35sq km island for more than a decade.

“Up until Australia took over, it was different. The Norfolk Island Government had an immigration policy, so you couldn’t just come here and live. You could buy a house, but only live here for up to three months, and that was restrictive.

“The only way you could buy and live here permanently back then was to buy a business, run that business for five years, and then after that period if you’d behaved yourself, you could apply for residency.”

The Australian external territory, which was home to 1748 residents at the last Census, is located at the centre of a triangle made up of Australia, New Zealand and New Caledonia.

It is just 8km long and 5km wide, has one solitary roundabout, no public transport, a maximum speed limit of 50km/h, and peak hour only exists when the local livestock take to the streets.

As well as speaking English, locals have their own living language. Norfuk is a blend of eighteenth-century English and Tahitian, derived from the original settlers of the island.

While there were no indigenous people recorded on the island at the time of European settlement, the descendants of Tahitians and the HMS Bounty mutineers, including those of Fletcher Christian, were resettled on Norfolk from the Pitcairn Islands.

As a result, the Norfolk Island phone book has a “faasfain” (or fast find) section featuring nicknames, rather than surnames, because so many of the locals share just a few family names.

Mr Hall said life on the island is simple, but that’s just the way locals like it and now more mainlanders than ever are buying their own slice of the property paradise.

“The sales volume has gone up, probably nearly doubled since Australia took over and that’s been fantastic. People who want to retire are happy because they don’t have to work, they can come over here now and just retire. And why not, it’s one of the safest, cleanest and most beautiful islands in the Pacific,” he said.

“It’s so safe here that the only cars you find in the street that will be locked will be those of tourists, because they can’t get out of the habits of mainland Australia.”

While retirees are definitely taking note of Norfolk Island, with the median age almost a decade older than the rest of the country at 49, Mr Hall said there is also plenty on offer for aspiring residents of all ages.

“Kids can ride their bikes to school and don’t have to be taken by their parents. It is so safe you can walk the streets day or night, in complete safety. It’s unique, it’s really lovely. We don’t have street lights, but most people have a torch,” Mr Hall added.

Norfolk Island’s local school caters for more than 300 children from Kindergarten to Year 12 and according to Census 2016, there were officially only 16 unemployed people.

“If you’re over here and you’re not working, it’s because really you don’t want to work,” explained Mr Hall.

“We’re also getting more people who are working via the internet, now we’ve got the NBN,” he said, adding that there is a need for more traditional talent as well.

“We have quite a few really good builders, we have a couple of timber mills, three joiners, a heap of plumbers and electricians. But there is a shortage, basically, of tradespeople, because they’re all so busy.”

The unique island, which is home to just over 1000 houses, has an unusual real estate market unlike that of the mainland.

While the median house price might sit a little higher than some other remote Australian regions, Norfolk Island buyers get a lot more bang for their buck.

“They’re mostly fully furnished, right down to your knives and forks and crockery, to your beds and blankets and in some cases it’ll even include the car,” Mr Hall said.

“We probably need to do more in promotions, so that people know that they can come live here, and at a fraction of the price you’re going to pay in the mainland,” he said.

“The highest sale price last year was $1.2 million. It was a lovely modern home with magic views. For something like that, you would probably pay $4 million to $6 million in Sydney.”

Although property data firms don’t collect sale prices for Norfolk Island, Mr Hall has determined that the median price last year was $399,000, up from $313,000 in 2014 which was a year before the ownership rule changes.

Investors have begun to take note of Norfolk Island, which has a solid rental market according to Mr Hall. However, anyone looking to ride the holiday rental wave should reconsider.

“We’ve had some people buy without even coming to look at the property, but what we really don’t need is any more short term accommodation, there is a lot of that here,” he said.

Property purchases do not attract stamp duty; instead there is a local “transaction levy”, which is calculated on a sliding scale.

“Basically, it’s 2 per cent up to $250,000; between $250,000 and $500,000 it’s 3 per cent; and above $500,000 it is 4 per cent,” he explained.
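Taken at face value, Mr Hall’s brackets are easy to compute. A small sketch follows, assuming the quoted rate applies to the whole purchase price rather than marginally (his wording reads that way, but check the current rules before relying on it):

# Sketch of the quoted sliding scale, assuming a flat rate per bracket
# applied to the full price (an assumption based on Mr Hall's wording).
def transaction_levy(price: float) -> float:
    if price <= 250_000:
        rate = 0.02
    elif price <= 500_000:
        rate = 0.03
    else:
        rate = 0.04
    return price * rate

# Hypothetical purchases around the island's ~$399,000 median:
for price in (240_000, 399_000, 1_200_000):
    print(f"${price:,} home -> ${transaction_levy(price):,.0f} levy")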

Another financial quirk is that the island has no GST, so life’s vices such as alcohol and cigarettes are actually cheaper than on the mainland.

While the idea of a tree or sea change might be a dream for many worn out city slickers, Mr Hall warns that island life runs at a very different pace.

Island life is remote — a world away from Amazon deliveries and Westfield shopping centres — so Norfolk Islanders have had to become a resourceful bunch, living by the rules of self-sufficiency and “reduce, reuse, recycle” long before hipsters took on the second-hand economy.

“Most people grow at least some fruit and vegetables at home because the climate is ideal for it, but not everyone does. If you can’t grow what you need you’ll probably find it at a roadside stall with an honesty box,” Mr Hall said.

For new furniture, household appliances or even personal travel, patience pays off. Airfreight and travel costs can be sky high given there isn’t a flight every day and sea mail is slow.

“You can sometimes wait months for things to be shipped over. For travel, if you plan your visits back to the mainland in advance you can get a good deal with flights. You just have to be prepared to wait, but it’s worth it,” Mr Hall said.

https://www.realestate.com.au/news/norfolk-island-rare-opportunity-arises-to-buy-property-on-australias-mystery-isle/

Thursday, June 6, 2019


Nigella Lawson says restaurants should not play music because it drowns out the taste of the food





I dislike loud music in restaurants too. I always ask them to turn it down and if they refuse I just leave. I sometimes ask first whether they are selling food or music, and that sometimes makes a favourable impact.

The loud thump of music is now something to be expected in many fashionable restaurants - but Nigella Lawson has said it leaves her unable to taste her food.

The cook and television presenter has said she is "allergic to all noise" including "music in shops and restaurants".

She added: "It is utterly draining. And it drowns out the taste of the food.

"I’ve always presumed that these decisions are made by people who feel uncomfortable without noise."

Chef Richard Corrigan, who has won two Michelin stars and cooked for the Queen, said he sticks to quiet jazz piano music in his Mayfair restaurants.

The restaurateur, who owns Bentley's Oyster Bar and Corrigan's restaurant, said: "Loud music, personally I'm not a fan of it in restaurants. I don't mind some music; some live piano like we have in Bentley's and Corrigan's is good, but what you don't want is speakers over the table.

"A good playlist is as equal to a really good menu in the right environment. It shows soul and individuality. You need to stay away from restaurants that play Abba or Eric Clapton loudly."

He added that while those in their twenties may enjoy loud music as they eat their meal, getting older means that the noise is grating.

Mr Corrigan said: "Getting old is a great, great thing, but as we get old, noise to your eardrums of any description has a detrimental effect.

"It is more about age more than anything, as you get older you look for a bit more solitude.

"When Nigella was in her twenties, she probably did love music in a restaurant".

Paul Askew, Chef Patron of The Art School in Liverpool, said that "great food needs to be tasted in a softer, more gentle environment", adding that in his restaurant he tries to "create an oasis of calm and a sanctuary of restoration for the soul."

Oisin Rogers, who runs The Guinea Grill in Mayfair, said bad music can ruin one's appetite. He explained: "Music. Everybody loves it. It provides instant atmosphere. But if there's already a good vibe there's no need for it. Restaurants provide sensory experiences. Good sights, delicious tastes and flavours, beautiful aromas, textures that intrigue and pleasant sounds.

"Some get music really right...many don't. Canned music is often an irritant, an annoyance. It might please some folk, but never all. And if it is irritating, Nigella is perfectly correct, it's impossible to enjoy food while irritated."

However, some restaurateurs said music is a crucial aspect of eating out. Two Michelin-starred chef Sat Bains said: "I think it’s important to have music in restaurants, it creates atmosphere and is necessary at the beginning of a service when the room is slightly quieter. Obviously, it shouldn’t be booming but nice and subtle. The best type of music for me is guests chattering and having a good time."

Jason Atherton of The Social Company added: "Music is really important to me, I actually highlighted this to my restaurant teams this morning; you can be sat in the most amazing restaurant, tasting the most beautiful food but if there’s no music and the atmosphere is flat, the overall experience is ruined.

"I agree music shouldn’t be overbearingly loud and it has to suit the restaurant’s ambience, but it’s such an important factor in the guest’s experience to get right."

Tom Brown of Cornerstone in Hackney Wick agreed, explaining: "I think having good music playing is essential. At Cornerstone, it’s the thing that people comment on the most, after the food. When you go out it should be fun. You obviously want to be able to have a conversation and there’s nothing worse than sitting in a silent dining room."

Action on Hearing Loss recently found that some restaurants play music at up to 90 decibels on busy nights, while recommendations from the charity suggest establishments keep it below 50dB.
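The gap between those two figures is larger than it looks, because decibels are logarithmic: every 10dB step is a tenfold increase in sound intensity. A quick check of the arithmetic (standard acoustics, not from the article):

# Standard decibel arithmetic: each 10 dB adds a tenfold increase in
# sound intensity, so 90 dB vs 50 dB is a 10**4 = 10,000-fold gap.
def intensity_ratio(db_a: float, db_b: float) -> float:
    return 10 ** ((db_a - db_b) / 10)

print(f"{intensity_ratio(90, 50):,.0f}x more intense")  # 10,000x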


https://www.telegraph.co.uk/news/2019/06/05/nigella-lawson-says-restaurants-should-not-play-music-drowns/?WT.mc_id=e_DM1026490&WT.tsrc=email&etype=Edi_FPM_New_ES&utm_source=email&utm_medium=Edi_FPM_New_ES_2019_06_05&utm_campaign=DM1026490