Tuesday, December 29, 2020

Just How Dark Were the Dark Ages?


Whether it’s the idea of barbarian hordes run amok across a continent ruled by the Romans for centuries, or the notion that science and the arts went through a 300-year freeze, the concept of the Dark Ages has always titillated the imagination.

In truth, a big part of what makes the era dark to modern eyes is the relative lack of surviving information. But what we don’t know has always been at least as interesting as what we do know. Did King Arthur really exist, let alone send his knights on a quest to find the Holy Grail? Was there ever a legendary hero named Beowulf, and how long had his story existed before the oldest known surviving manuscript appeared in roughly the 10th century?

Of course, the Dark Ages also refers to a less-than-heroic time in history supposedly marked by a dearth of culture and arts, a bad economy, worse living conditions and the relative absence of new technology and scientific advances. While the period continues to fascinate history buffs, scholars and fantasy fans looking for some tangible link to their favorite mytho-historical heroes, the term “Dark Ages” has largely fallen out of use among serious researchers, due to some of the implications and assumptions made by those who first propagated its use.

“No academic uses it today — because it’s actually one of the most fascinating and vibrant periods about which we are discovering new knowledge every year,” says Julia Smith, a professor of medieval history at the University of Oxford’s All Souls College.

Let’s take a closer look at those aspects of the period that scholars now typically refer to as the Early Middle Ages, to separate the dark from the light.

Shadows of the Empires

The origin of the term “Dark Ages” is itself a little murky in the historical record, but typically it was used to contrast the shining cultural achievements of the Greek and Roman empires with the knowledge and culture that existed after their decline and fall.

This concept carried on into the Age of Enlightenment, when many scholars of the day pointed to the great architectural achievements of the Romans and contrasted them with the simpler wooden structures of the following period, says Alban Gautier, a professor of medieval history at the University of Caen-Normandy in France. The idea of a dark, barbarian period was also invoked as a contrast to 19th-century civilizations in Europe and America.

“This phrase is deeply steeped in the 19th century western European idea that some civilizations are superior to others, which today sounds very difficult to hear,” Gautier says.

Gautier believes the term still has some use in a strictly academic sense — particularly as it applies to historians. While the Romans were excellent record keepers, historical texts and documents are comparatively scarce starting with the 5th century and for several hundred years thereafter.

“It’s dark for historians. It’s difficult for historians to understand what happened,” he says.

Art in Darkness

But Gautier points to notable exceptions. After the Roman apparatus collapsed, taking with it many of its institutions, such as secular schools, the Catholic Church stepped in to provide some form of learning and scholarship in many parts of Europe.

“The Church in western Europe and all the regions north of the Mediterranean becomes the biggest element of stability,” he says. Monks worked to copy much of the literature and scientific texts of the Roman period, and to a lesser degree the Greek period.

“Of course they had a religious agenda, but in order to forward this agenda, they had to know Latin,” Gautier says. “Knowing Latin grammar meant keeping knowledge and learning from the Latin texts.”

Meanwhile in England, the absence of many significant written works dating to this period doesn’t mean society was idle. In fact, some of England’s most enduring legendary characters emerged in this era. In a poem attributed to a 6th-century Welsh poet, the earliest known reference to England’s most famous heroic monarch comes in the form of a comparison: the poet describes a warrior who killed many people but notes that this fighter “was no Arthur,” says Bryan Ward-Perkins, a professor at the University of Oxford and author of The Fall of Rome and the End of Civilization.

And while the oldest written text of the Beowulf poem dates roughly to the 10th century, some scholars believe the legend derives from oral traditions that go back far earlier.

Dark Economy

Another common characteristic associated with the Dark Ages is the relative lack of monumental architecture. Towns and cities no longer built large new stone structures. And the slow deterioration of Roman infrastructure such as aqueducts likely had an effect on quality of life in cities, Gautier says.

Populations of major cities like Rome and Constantinople shrank in this period. But Gautier believes rural life may have actually improved, especially in the largely bucolic British Isles. During the Roman period, farmers would have had to pay regular taxes to support the empire and local cities. But as administration fell apart, the tax burden likely diminished.

“The cities and the towns were smaller. It was less necessary for farmers to produce and work a lot in order to feed the cities,” Gautier says.

But Ward-Perkins says that archaeological evidence does suggest some scarcity of resources and goods for common people. “The other way it might be dark is just the lack of evidence, which is probably a symptom of economic decline,” he says. By 450, the evidence of simple day-to-day items such as new coins, pottery or roof tiles largely disappeared in many parts of Europe, and wasn’t found again until roughly 700.

Barbarian Science

As for the claims that societies took a step back in terms of science and understanding during this period? While it’s true that western Europe didn’t show as many achievements in technology or science in the Dark Ages as it would demonstrate later, those shortcomings were countered by an explosion of culture and learning in the southern Mediterranean under the first Islamic caliphates.

Europe itself did maintain some practical technology, such as watermills. In terms of learning, Isidore of Seville, an archbishop and scholar, created an encyclopedia of classical knowledge, much of which would have been otherwise lost, in his massive Etymologiae.

The relative isolation of the British Isles also allowed people there to develop unique styles of jewelry and ornate masks, Ward-Perkins says. Some of these can be found today in the archaeological excavation of the graves at Sutton Hoo in East Anglia, which included an Anglo-Saxon ship burial.

“The relative dearth of written sources is more than counterbalanced by the huge amount of archaeological evidence,” Smith says.

While the Dark Ages may have started with the fall of the Roman Empire, the medieval period, around the end of the 8th century, begins to see the rise of such leaders as Charlemagne, king of the Franks, whose reign united much of Europe and brought continuity under the auspices of what became the Holy Roman Empire.

Although most scholars would agree that the so-called Dark Ages represent a distinct period throughout most of Europe, many of the assumptions that first made the term popular are no longer valid. Even the most persistent idea, that the period was one of violence, misery and backward thinking, has been largely disproven.

“The idea that’s completely out of fashion these days is that it was dark because it was morally worse,” Ward-Perkins says. But these days, he notes with a touch of dark humor, “everybody pretty much accepts that humans are pretty horrid all the time.”

https://www.discovermagazine.com/the-sciences/just-how-dark-were-the-dark-ages

Thursday, February 27, 2020

The End of the Democratic German Revolution



For four years, the German Empire took on much of the world in what was originally called the Great War. Berlin had allies, after a fashion — the decrepit Austro-Hungarian and Ottoman empires, along with Bulgaria, a smaller kingdom on the outs with its Balkan neighbors. Against them was a formidable array: the United Kingdom and its dominions and colonies, France, the Russian Empire, Italy, Japan, Romania, Serbia, and the United States.
The Germans did surprisingly well, defeating Russia and Romania, stalemating and almost knocking out France, and undergirding Vienna against Italy and Serbia. But preserving the Ottomans against the UK and an Arab revolt was beyond Germany’s power. The German military ultimately could not stop the disintegration of the ramshackle, multi-ethnic Austro-Hungarian Empire. Nor could the Reichsheer, the world’s finest fighting force, withstand years of privation from the crippling British blockade, the increasing spread of destructive Bolshevik propaganda through German troops redeployed from the east, and the arrival of the ever-expanding American Expeditionary Force.
By September 1918, the German army was losing ground as Berlin’s allies collapsed. The Supreme Army Command, represented by Gen. Erich Ludendorff, Germany’s de facto dictator, informed the Kaiser that a ceasefire was imperative. The government was turned over to the elected Reichstag as negotiations with the Allies commenced. The Entente agreed, but only on condition that the Reich surrender its heavy weapons to prevent resumption of hostilities. The result was the armistice that took effect on November 11.
The Kaiser wanted to remain, but his people had different ideas. The navy planned a last suicidal sortie for the Imperial Fleet, sparking a mutiny by sailors that grew as workers and soldiers joined in. Workers’ and Soldiers’ Councils spread across the country. In Bavaria, socialist journalist Kurt Eisner led a revolution that proclaimed a Volksstaat, or People’s Republic. Widely hated for publishing documents on Germany’s responsibility for the war, he was later murdered as he prepared to resign. The secondary monarchs of the individual states, who retained some authority, abdicated before mobs could force them to. The Councils ended up in conflict with the Social Democrats, who pushed constitutional, democratic reform.
In early November, the Kaiser still hoped to return with the army from the battlefield to suppress opposition, but his plans were disconnected from reality. Friedrich Ebert, head of the Social Democrats, warned, “If the Kaiser does not abdicate, the social revolution is unavoidable. But I do not want it, indeed I hate it like sin.” On November 9, the civilian authorities announced the Kaiser’s abdication without his approval, followed by Ebert’s appointment as chancellor. Ebert’s short-lived predecessor, Prince Maximilian von Baden, a liberal, said to Ebert, “I entrust you with the German Reich.” The latter responded, “I have lost two sons for this Reich.”
Ebert was born in 1871 and later learned the saddlemaker trade. He became active politically and settled in Bremen, where he became local party chairman. In 1905, he became SPD secretary general and in 1913 was elected party co-chairman. A moderate, he led the majority into the Burgfrieden, a pact in which the political parties agreed to set aside their differences for the duration of the war. By backing the government, as most socialists in the other combatant nations also did, he ensured a split with radicals, who then left the SPD. In January 1918, he simultaneously supported striking workers at a munitions plant and urged them back to work. Some saw him as a traitor to Germany, others to the workers. Increasingly he cooperated with more moderate bourgeois parties, a practice that continued after the Kaiser’s fall.
As Ebert took over as chancellor, Karl Liebknecht, founder of the radical Marxist Spartacus League, proclaimed a socialist republic, which envisioned revolutionary change. An intense power struggle ensued in the following weeks, with the moderate Social Democrats maintaining control of the government and ultimately relying on army troops and nationalist auxiliaries to rebuff communist rebellion. After the Kaiser’s departure, the pragmatic Ebert made what became known as the Ebert–Groener deal with Gen. Wilhelm Groener, successor to Ludendorff as army quartermaster general. Ebert promised to protect the military’s autonomy and suppress left-wing revolts while Groener pledged the armed services’ loyalty to the government.
The two set up a secret hotline for daily communication and worked together in December to suppress a short-lived revolt by naval personnel. Nationalist Freikorps forces also fought against revolutionaries with great ferocity. Such battles were replicated across the country. Many on the left turned against Ebert and the Social Democrats. In early January, the Spartacists, named after the leader of the famous Roman slave revolt, and the newly founded Communist Party of Germany attempted to seize Berlin and oust the Ebert government. The army and Freikorps prevented the putsch.
A week later a constituent assembly was elected, which met in the city of Weimar, away from turbulent Berlin, and chose Ebert as provisional Reichspräsident. A constitution was drafted for what became known as the Weimar Republic. Ebert’s presidential term was extended to 1925 to avoid an election amid political upheaval.
Perhaps his gravest challenge in 1919 was the Versailles Treaty. Negotiated by the Allies without German input, it was presented as what the Germans called a Diktat. Government officials resigned rather than sign, leaving Gustav Bauer, a leading Social Democrat, head of the cabinet. He sought changes, but the Allies issued an ultimatum: if Berlin did not sign, they would resume their march, which the German army admitted it could not prevent. Bauer capitulated. When the treaty came before the National Assembly, Ebert asked Germany’s military command whether the army could defend the country against an Allied invasion. Told no, he urged the legislators to ratify the document, which they did on July 9.
Weimar’s life was more short than sweet. Regional resistance to Berlin’s rule continued across Germany. Ebert turned to the security forces to eliminate radical workers’ councils. In 1920, disaffected army troops and Freikorps members staged a coup, known as the Kapp Putsch after civil servant Wolfgang Kapp. The government was forced to flee, but a general strike ended the revolt. The SPD-dominated regime told the German people, “Only a government based on the constitution can save Germany from sinking into darkness and blood. If Germany is led from one coup to another, then it is lost.”
In the following years, however, officials were assassinated by nationalists, and Adolf Hitler joined the Nazi Party and in 1923 attempted to seize power in Bavaria with the Beer Hall Putsch, which was put down by the German police and army. Ebert appointed right-leaning officials and employed the president’s emergency powers — which eventually supplanted parliamentary rule — 134 times. The republic slowly gained authority, but not legitimacy.
Opponents concocted the infamous Dolchstoßlegende, or “stab-in-the-back myth.” Ludendorff’s request that the Kaiser seek a truce was conveniently forgotten as nationalists argued the military was not defeated in the field, but instead had been betrayed at home. This claim was reinforced by the fact that civilians signed the hated Versailles Treaty.
The agreement detached German territory, placed ethnic Germans under foreign rule, blamed the war on Berlin, restricted the German military, and obligated the Germans to pay reparations for the entire cost of the conflict. The Allies fell between two stools. They neither imposed a Carthaginian peace, effectively dismantling Germany, nor conciliated Berlin, creating a stable international order. And when Berlin failed to comply, the former Entente powers neither enforced nor rewrote the treaty’s terms. The result fueled political discontent and extremism.
The result was disaster, but Ebert did not live to see the future. Sickly, he died of septic shock after an attack of appendicitis on February 28, 1925. In a tragedy that Germans did not then understand, he was replaced by conservative authoritarian Paul von Hindenburg, the talented field marshal whose reputation survived Germany’s defeat. The latter narrowly defeated Wilhelm Marx, the candidate of the moderate Centre Party of Germany, dominated by Catholics. Had Marx been elected, German democracy would have had a better chance to survive, despite the manifold challenges to come.
Wild inflation in the early 1920s and then the Great Depression ravaged the German middle class. The Nazis and Communists gained a combined parliamentary majority, forcing the aging Hindenburg, no fan of Hitler, to rule by emergency decree. Eventually appointed chancellor, Hitler quickly consolidated power. And on the president’s death on August 2, 1934, the Nazi Führer took over those powers, as well. The sadly imperfect Weimar democratic experiment was over. From there the political road led directly to World War II and the Holocaust.
Ebert’s role remains controversial. Today’s Social Democrats embrace him, having named their party think tank after him. But some on the left attack his support for the German state in World War I and his later tactical alliance with conservative, even reactionary forces. Some on the right argue that he undermined the German state when it was at its most vulnerable in World War I.
In fact, Ebert balanced conflicting responsibilities unusually well, despite inevitable missteps. He faced challenges beyond the abilities of most statesmen. Whatever his initial plans, he became Germany’s most pivotal defender of democracy and liberal governance. He steadfastly backed elections even when presented with the opportunity to seize power in the name of a socialist workers’ state. He turned to the military, but only against radicals determined to impose their rule on others. And he governed with prudence and restraint, refusing to wreck an already ravaged nation by embarking on hopeless resistance to the Allies and the Versailles Treaty.
Ebert was only 54 when he died. Had he lived, his nation still would have faced abundant economic and political crises. But it is hard to imagine him appointing Hitler as chancellor. Ebert might have found a way to create and preserve a Left–Right alliance against the extremes. Perhaps that, too, would have proved to be a dead end. But we shall never know since, tragically, it was the road not taken.


 

Friday, January 10, 2020


What explains the curious persistence of the Myers–Briggs personality test?


BOOK REVIEW of "What’s Your Type? The Strange History of Myers–Briggs and the Birth of Personality Testing" by Merve Emre

Comments by Australian psychologist Nick Haslam below. Haslam is good at exposing the Myers–Briggs nonsense, but he is not equally good at examining his own assumptions.



Standing at the end of a line, pressed up against the glass wall of a well-appointed meeting room, I asked myself the rueful question that all personality psychologists have posed at least once: why is the Myers–Briggs Type Indicator so damned popular? The smart, charismatic consultant facilitating this leadership course had given the questionnaire to his class and instructed us to line up according to our scores on extraversion–introversion. Far to my right on this spectrum of perkiness stood a colleague with a double-espresso personality; down this end, with no one to my left, I was decidedly decaf.

Let me get off my chest what’s wrong with the Myers–Briggs, or MBTI as it is known in the acronymphomaniac world of personality testing. The MBTI classifies people according to four binary distinctions: whether they are extraverts or introverts, intuitive or sensing types, thinkers or feelers, and judges or perceivers. Three of these distinctions rest on an archaic theory of personality typing proposed by Carl Jung, and the fourth was invented and grafted on by the test’s developers.

The four distinctions bear little relation to what decades of systematic research have taught us about the structure of personality. They are smeared unevenly over four of the five dimensions that most contemporary personality psychologists accept as fundamental, and completely ignore a fifth, which is associated with the tendency to experience negative emotions. The same effort to erase the dark side of personality is evident in the MBTI’s use of sanitising labels to obscure the negative aspects of its four distinctions. In large measure, being a thinking type amounts to being interpersonally disagreeable, and being a perceiving type to being impulsive and lacking in persistence. But in MBTI-world, all personality types are sunnily positive, a catalogue of our “differing gifts.”

The MBTI doesn’t only misrepresent the content of personality. It also gets the nature of personality fundamentally wrong. Despite masses of scientific evidence that human personality is not composed of types, its four distinctions are understood as crisp dichotomies that combine to yield sixteen discrete personality “types,” each with a four-letter acronym such as INTJ or ESFP. In reality, personality varies by degrees along a set of continuous dimensions, just like height, weight or blood pressure. In the face of mountains of research demonstrating that personality is malleable throughout the lifespan, proponents of the MBTI also argue that one’s type is inborn and unchanging. In short, the MBTI presents personality as a fixed essence whereas the science of personality shows it to be a continuous flux.

The MBTI also fails to meet the standard statistical requirements of psychological tests. Its items employ a problematic forced-choice format that requires people to decide which of two statements describes them better. Its scales lack coherence. The typology lacks re-test reliability, which means that people are commonly scored as having different types when they complete the measure on two separate occasions. Evidence that MBTI type correlates with real-world behaviour — known as predictive validity in the trade — is scant.
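
The re-test problem is easy to see with a toy simulation (mine, not the review’s, and not based on any MBTI data). Suppose a continuous trait is measured twice with an assumed retest correlation of 0.8, and each measurement is then split at the median into a binary “type,” the way the MBTI’s dichotomies work. A sizeable share of people land on opposite sides of the cut point each time; the figures and the simplifying assumption that the four dichotomies flip independently are purely illustrative.

import numpy as np

# Toy sketch: two noisy measurements of one continuous trait, then a median split.
# The retest correlation of 0.8 is an assumed, illustrative value, not an MBTI figure.
rng = np.random.default_rng(0)
n = 100_000
retest_r = 0.8

cov = [[1.0, retest_r], [retest_r, 1.0]]
scores = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n)

# Dichotomize each measurement at its median, as a type-based scheme would.
type_t1 = scores[:, 0] > np.median(scores[:, 0])
type_t2 = scores[:, 1] > np.median(scores[:, 1])

flip_rate = (type_t1 != type_t2).mean()
print(f"Share whose binary 'type' flips between sessions: {flip_rate:.1%}")

# If, purely for illustration, the four dichotomies flipped independently, the chance
# of keeping the exact same four-letter label on retest would be roughly:
print(f"Rough chance of an identical four-letter type: {(1 - flip_rate) ** 4:.1%}")

With those assumed numbers, roughly one person in five changes side on any single dichotomy, so fewer than half would keep the same four-letter label on retest. That is the kind of instability the reliability criticism points to: the underlying scores barely move, but the crisp labels do.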

So why is a test with weak psychometric credentials, based on a musty theory of personality that gets the structure of human personality wrong, so enduringly popular? Arguably its weaknesses from a scientific standpoint are precisely what give it its appeal. Personality may not really form discrete types, but people relish the clarity of noun categories and binary oppositions. Personality may not really come in sixteen flavours, but MBTI types are sweet simplifications. Personality may be mutable, but people find reassurance in the idea that they have an unchanging true self. And the average person could not give two hoots about the statistical considerations that trouble test developers.

What matters to most people, at least those who complete the MBTI as an exercise in self-understanding rather than a compulsory workplace activity, is whether it offers accessible and palatable insight. And the MBTI undoubtedly provides that in spades. Its four-letter codes are readily grasped, its descriptions flatter our strengths, and the fact that its four distinctions bear some relationship to fundamental personality traits ensures that it offers a certain truthiness.

Although the shortcomings of the MBTI have been discussed within academic psychology for decades, a historical analysis has been lacking. Merve Emre’s fascinating new book fills that gap stylishly. Emre, a literature academic at Oxford, documents the genesis of the MBTI in the Jungian enthusiasms of Katharine Briggs and the more worldly ambitions of her daughter, Isabel Briggs Myers. Despite the subtitle’s questionable reference to the “birth” of personality testing — the first test dates back almost another thirty years to the first world war — the book’s recounting of the origins of the instrument is colourful and revealing.

Katharine Briggs emerges as someone single-mindedly devoted to making sense of human individuality and using that sense to guide people in directions to which she believed them suited. As a young mother without training in psychology, she developed a system of personality typing that she used in an informal child guidance, or “baby training,” enterprise, later finding a resonance between her ideas and those expressed in Carl Jung’s Psychological Types, which was published in 1921. Jung became Katharine’s “personal God”: at one point she wrote a hymn to him (“Upward, upward, from primal scum / Individuation / Is our destination / Hoch, Heil, Hail to Dr Jung!”). Encouraged by her correspondence with the great man, and armed with 3ʺ x 5ʺ index cards, Katharine refined her classification system and compulsively typed everyone she encountered, from neighbourhood children to Adolf Hitler.

Katharine’s daughter Isabel Briggs Myers had a more pragmatic cast of mind but inherited her mother’s absorption in types. After writing two mystery novels, she developed an early version of the MBTI while working for America’s first corporate personality consultant in 1943. Soon after, she launched it as a small commercial proposition. In the late 1950s the questionnaire was picked up by the Educational Testing Service, an eminent test developer and publisher in Princeton, New Jersey, giving it a chance at mainstream success and respectability. After endless wrangling between Isabel and staff psychometricians, though, the ETS lost interest and cut its losses. Seeing the instrument as “little better than a horoscope,” ETS staff insisted on conducting the same validation research as any other test would undergo, but Isabel remained resistant and possessive. Eventually a new publisher released the MBTI as a self-scored test and it quickly became a staple of the US$2 billion personality assessment industry, especially beloved by personnel consultants.

As history goes, Emre’s book is compelling and well paced. It presents Katharine and Isabel as rounded characters and places them in a richly drawn cultural and historical context. But as an account of personality testing more generally, the book is flawed. Despite having chronicled the many ways in which the MBTI was a cuckoo in the nest of personality psychology — the product of obsessed amateurs, disparaged by the psychometric orthodoxy at the ETS, popularised rather than professionalised — Emre sees it as emblematic. An emblem it is not. Unlike most other major tests, its use is not restricted to trained professionals and its legacy is protected by an almost cultish organisation that forbade Emre access to most of the Briggs–Myers papers, despite their officially being open to the public. Unlike other tests, the MBTI doesn’t promote itself by appeal to a validating body of scientific evidence. To treat the MBTI as representative of contemporary personality testing is like presenting the primal scream as representative of modern psychotherapy.

Emre is on more solid ground when she describes the functions of workforce personality testing, using the MBTI as an example. Its key purpose in that domain — only one of several in which it is used, it must be said — is indeed to select people who are likely to perform better than others in particular lines of work. Ideally that rationale is backed by evidence that the tests are valid predictors of workplace performance. Whether this purpose is benign or sinister is open to debate. It can be viewed positively as the legitimate application of behavioural science to enhance the wellbeing of workers and the success of organisations, or negatively as a dystopian tool for creating human cogs for the corporate machine.

Emre favours the darker interpretation, writing that personality typing “conscripts people into bureaucratic hierarchies.” This charge is hyperbolic: even if one is critical of the use of the MBTI or other testing, it does not force people into any position against their will, it is not employed exclusively in bureaucratic organisations, and it is used at least as much to differentiate people horizontally according to their strengths as it is to stratify them in hierarchies. The very same charge could be made against any other approach to selecting or assigning people to organisational roles, including interviews, hiring quotas or old boy networks.

The key question has to be whether personality testing selects and assigns people to work roles in ways that are better or worse than its alternatives: whether it is fairer and more valid, efficient or desirable than some other preferred metric. Unless there are grounds for believing that personality tests are worse than these alternatives, to criticise them for conscripting people into bureaucratic hierarchies is merely to express hostility to bureaucratic hierarchies.

Emre also struggles to form a consistent view when she discusses personality testing’s relationship to individuality. At times she presents the MBTI as a tool that promotes individualism by claiming to clarify each person’s specialised strengths and aid in their quest for self-discovery. At others she describes it in over-heated terms as “liquidating” or “annihilating” the self, as if a questionnaire had the capacity to destroy the person’s uniqueness. Here she cites the work of German social theorist Theodor Adorno, fierce critic of commodification (and jazz), who proclaimed that personality tests undermine human individuality.

Emre never quite resolves these antithetical views, but the paradox is only apparent. Receiving a score on a personality test, or even being assigned to an MBTI “type,” does not submerge individuality. It simply provides it with a partial description that other people may share. Being described as brunette, overweight, liberal or a typical Taurus does not undermine a person’s selfhood but merely qualifies it, and the same is true when someone is described as being an ENTP. MBTI types, for all their conceptual failings, don’t reduce personal identity to one of sixteen psychological clones. They simply offer people a language for capturing some aspects of their personal distinctiveness.

In passing, Adorno’s critique of the “reified consciousness” involved in personality testing has a certain irony to it. In one of his books he recalled being asked by an American colleague whether he was an extravert or an introvert, writing contemptuously that “it was as if she, as a living being, already thought according to the model of multiple-choice questionnaires.” A few years later, while conducting his influential studies of authoritarianism, Adorno proceeded to create his own multiple-choice personality questionnaire.

Another confusion arises in Emre’s discussion of personality typology. Remembering the horrors of the Holocaust, Adorno rightly condemned the practice of assigning people to categorical types. This is a legitimate criticism of the MBTI, whose proponents view personality types as discrete and unchanging facts of nature. (Emre writes that Isabel Briggs Myers was astonished to find that scores on the MBTI’s scales were distributed in a bell curve, not in the camel-humped way that type theory supposed.) Emre notes this criticism of typology but then mistakenly applies it to personality testing in general. In contrast to the MBTI, almost all personality tests are explicitly anti-typological. These tests assess differences between people along a continuum without invoking bogus categories, and they do not make ill-founded claims that their scores correspond to unchanging personal essences. By failing to recognise that typological thinking is a specific failing of the MBTI, Emre misses the extent to which major criticisms of that instrument do not tarnish personality testing as a whole.

To serious students of personality, the continuing success of the MBTI within the testing industry is a source of bafflement. Emre’s book does not diminish that dismay, but it helps to clarify why the instrument is the way it is. Despite its unpromising beginnings, she demonstrates that it has a powerful appeal, offering an intuitively attractive way to apprehend ourselves as a pattern of distinctive strengths. In Emre’s preferred Foucauldian terminology, the MBTI is an effective “technology of the self.” The fact that it is a rather Bronze Age technology is almost immaterial.

https://insidestory.org.au/not-my-type/