Jeff’s Blog #19: Modernity and Antimodernism

Jeff’s Blog, Volume 19

This week I’d like to write about modernity and antimodernism. These are difficult concepts to get a handle on, or even to define. We’re used to hearing “modernism” refer to an artistic movement, but, as one of this week’s authors points out, the earliest modernists were making art about economic and social change. When sociologists want to distinguish the modern from the premodern, they often reach for the terms Gemeinschaft and Gesellschaft, or community and society, which Ferdinand Tönnies used to distinguish a world organized around families and communities from one organized around corporations and states. I am persuaded by a definition of modernity as a condition of contradiction. Modern people are unbound by feudal privilege, thrilled at the opportunity to remake the world, but at the same time rootless and buffeted about by change. Joseph Schumpeter’s famous phrase “creative destruction” may be the watchword of modernity. Or, even better: this sentence from Karl Marx (criminally underrated as a writer): “All that is solid melts into air, all that is holy is profaned, and men are at last forced to face with sober senses the real conditions of their lives and their relations with their fellow men.”

It seems to me that the only intellectually responsible position is to face modernity with ambivalent horror and awe. Modernization is a liberatory process. Modern people are empowered to create inventions, songs, and lifestyles unimaginable to our ancestors. But, back to Marx, “All our inventions and progress seem to result in endowing material forces with intellectual life, and stultifying human life into a material force.” In my ambivalence it turns out I am in good company. The history of modernity is the history of people worrying about modernity, as I learned in some of the books below. But I find this topic especially important to the present. There are obviously intolerable aspects of the nationalist movements that have risen in Europe and the United States, but if we were forced to identify a legitimate intellectual core, I think it would be antimodernism. I recommend this essay from Will Wilkinson on regional inequality and moral polarization. Wilkinson takes the familiar, if tired, “urban elite vs. Real America” dichotomy and reframes it in more sensible (and income-agnostic) terms using a moral typology from the World Values Survey–“secular-rational and self-expression oriented” in cities and college towns vs. “zero-sum survival” and “traditional” values in rural areas and exurbs. J.D. Vance, in his appearance on Ezra Klein’s podcast, made a similar point about the moral disconnect he has observed between Red and Blue America. I have found it helpful to think about this divide as one of modernism and antimodernism.

Of course, as I explore below, modern conservative thinkers rarely live up to the promise of genuine antimodernism. They tend to settle for a cheap facsimile that goes off the rails in blaming women or immigrants or minorities instead of something deeper like individualism, atheism, or the cult of progress. As you can tell, I think there is something worth saving in antimodernism. If you’re even a tiny bit idealistic, the hope is obvious: to hold onto the good parts of modernity while reclaiming some of what has been lost. How hubristic! To embark on a quest that foiled Marx, Goethe, and Dostoevsky! My only counter is that perhaps now, by 2017, enough of the luster has worn off modernity for us to face it unintimidated.

1. All that is holy is profaned
Marshall Berman, whose book takes its title from the Marx line quoted above, gives an account of the discovery of modernism in the 19th century in All That Is Solid Melts into Air. This is one of the most profound books I’ve read recently, and one that immediately skyrockets onto my list of favorites. I would classify it as revisionist history through literary criticism: tracing a common intellectual thread through a hundred years of great books. Its first major contribution is a definition of modernism as unrelenting change and development–of self and of society. Berman points out that change is both a process and a destination, and one may be more rewarding than the other. The image that drives this home is a scene from Notes from Underground where the Underground Man contemplates the Crystal Palace, the ultra-modern site of the first World’s Fair. He speculates that designing the palace must have been heroic and creative, but it would not be natural to stay too long: “How do you know, perhaps [man] only likes that edifice from a distance and not at close range, perhaps he only likes to build it, and does not want to live in it.” And so with the modern world.

Construction and architecture provide most of the best stories and metaphors in Berman’s analysis. I thought the most brilliant bit was his discussion of Goethe’s Faust. He calls Faust “the first and still best tragedy of development.” Development is at first glance a strange word for Faust. Or so it seemed to me, because I knew nothing of Part Two! Part One presents the tragic incompatibility of grand ambition with traditional small-town life, as seen through Faust’s doomed relationship with Gretchen (“there is no dialogue between an open man and a closed world”). But in Part Two, Faust progresses from personal development to world development. He stands on a mountain with Mephistopheles, looks over the sea, and rages that someone should be harnessing it for productive use! Faust Part Two is about modern civil engineering projects, and much more. Up until that point, Faust’s superhuman accomplishments had been fueled by black magic–the Devil’s helping hand. Going forward, though, his main magic trick is the division of labor. Forces of the underworld are replaced by forces of industrial organization. Faust organizes thousands of people and builds canals, harbors, dikes, levees, and a great city. The story even includes “the first embodiments in literature of a category of people that is going to be very large in modern history: people who are in the way–in the way of history, of progress, of development.” It is an old couple occupying the last plot of unclaimed land. Faust has them disposed of. His project complete, Faust has no more reason to exist: “Ironically, once this developer has destroyed the premodern world, he has destroyed his whole reason for being in the world.” Longtime blog readers will recognize that this is precisely the tragedy of finite, as compared to infinite, games.

Berman draws the comparison between Faust and Robert Moses, as well he should. Moses is just one in a long line of developers, modernizers, who fit into the Faustian archetype of summoning forth dark energies of creative destruction to remake the world. But Berman doesn’t go for the predictable bait of anti-growth ideology; he says in a modern society only the most extravagant “thinking big” can possibly open up opportunities for “thinking small.” He sees the post-WW2 atomic scientists, especially Leo Szilard, as those who have best understood the tragic depth of the Faust myth. The scientists have played the role of Mephistopheles, offering a dangerous bargain to mankind. We must decide whether to take it. “In the project of development, we are all experts.”

2. Party like it’s (18)69
Goethe, Marx, and Dostoevsky sensed the tragic contradiction at the heart of modernism from the get-go. But most people were unabashedly enthusiastic about the progress and increasing wealth of the 19th century. In the United States, the most admired public intellectual in the late 1800s might have been Herbert Spencer, best known as a leading social Darwinist, who argued that societies evolve from primitive homogeneity to complex heterogeneity and that material and moral progress go hand in hand. T.J. Jackson Lears’ book, No Place of Grace: Antimodernism and the Transformation of American Culture, 1880-1920, is the story of the first generation to turn against modernity. Think of this as the first coming of the 1960s. The revolt came from those who had seen the most modernism: the upper classes. “Many beneficiaries of modern culture began to feel they were its secret victims.” The gist of the complaint, bluntly, was that well-heeled Americans felt they were getting soft. More Americans worked in offices and factories instead of on farms. Decades after the fire and brimstone of the Second Great Awakening, mainstream Protestantism had grown more liberal and accommodating. The public health crisis of the day was “neurasthenia,” their word for a nervous condition thought to afflict people who were thinking too much and repressing more “natural” emotions (this is the context in which Freud burst onto the scene). One critic, trying to explain the popularity of Kipling, Stevenson, and other “masculine” adventure writers, summed the malaise up well: “We had become so over nice in our feelings, so restrained and formal, so bound by habit and use in our devotion to the effeminate realists, that one side of our nature was starved. We must have a revolt at any cost.”

The revolt came in the form of several cultural fads with one common thread: to reclaim “real,” “authentic” experience. These keywords would have resonated in the 1960s, as they do today. Indeed, Lears’ central argument is that antimodernism ultimately “helped ease accommodation to new and secular cultural modes.” In searching for authenticity, the antimodernists invented a new culture of consumption: “Their criticism has frequently dissolved into therapeutic quest for self-realization, easily accommodated to the dominant culture of our bureaucratic corporate state.” In other words, potentially liberatory pursuits like yoga, mindfulness, and woodworking are transformed into hobbies and consumer products, palliatives for life under corporate capitalism rather than alternatives.

The Arts and Crafts movement was one leading example. Although we now think of “Arts and Crafts” as a period design style, it was also a political ideology. Printers, furniture makers, and other craftsmen, especially in middle-class Boston communities, disdained factory mass production and aspired to be Medieval artisans. But their critique of factory work was reactionary; rather than worry about the laborer’s lack of autonomy, hygiene, and pay, they worried that factory workers, bored and shiftless, would become criminals and radical anarchists. Would-be reformers sponsored workshops to teach laborers to take pride in their work. Meanwhile, the more affluent craftsmen sold their handmade wares to one another. The resulting “handcraft” aesthetic standards were what Thorstein Veblen was referring to in his early critiques of conspicuous consumption.

Medieval times were a common theme in the antimodernists’ escapism. Just as 1960s antimodernists looked to Eastern spirituality, their 1890s predecessors looked to saints, peasants, cathedral builders, and the works of Dante. One aspect of this cultural appropriation I found amusing was the translation of Medieval Catholicism into an overwhelmingly Protestant world. Lears argues that Catholicism–with its relics, rituals, and stereotypically Southern European sense of emotion–was a welcome escape for adherents of a repressed, bloodless Protestantism. Knights, of course, were the most popular Medieval characters. This is the period when Twain wrote A Connecticut Yankee in King Arthur’s Court, the Knights of Labor became the most powerful labor organization, and many prep schools, country clubs, and universities adopted heraldic symbols. Part of the appeal of knighthood was its connotations of physical vitality. Lears suggests that enthusiasm for the Spanish-American War stemmed from a pervasive longing for violence as a shot in the arm to a neurasthenic nation: “War was seen as a moral medicine or purgative for over-civilization.”

At the end of each chapter, Lears dismisses the fad at hand with roughly the same argument: it was only accommodation and escapism; it did nothing to stem the onward flow of modernity. It’s somewhat frustrating to read a book where you can predict each argument in advance. But I take Lears’ point. We would-be critics of modernity are much too bound up in its riches to stage a serious revolt. If we were to do so, Lears stresses that it cannot be based in self-gratification and self-fulfillment–what he calls the “therapeutic” attitude familiar from the Eat, Pray, Love genre. It would have to be a collective act. At the same time, Lears reminds us that collective antimodernism is a dangerous force in its own right. It’s very easy to blame other people for having made us soft rather than acknowledging we are all both subjects and objects of development. Consider one obvious case: “The Nazi myth labeled the Jew as the quintessentially modern man–urban, ruthless, rational, immersed in the inauthentic realm of commercial exchange. In a sense, the war to exterminate the Jews marked the ultimate extension of one form of antimodernism.”

3. The best the far right has to offer
On that note, fast-forward to Eastern Europe of the present day. Communism is mostly in the rearview mirror. But not everyone thinks liberal democracy is any better. At least not Ryszard Legutko, a Polish philosopher and politician, who argues that both communism and liberal democracy push inexorably toward progress and modernity, sweeping all that came before into the dustbin of history. Both are integrated political/cultural systems with no room for dissent. This is the argument of The Demon in Democracy: Totalitarian Temptations in Free Societies, one of the hottest books currently circulating in the antimodernist Christian right (think Ross Douthat and friends). This is a very frustrating book. It offers a formidable, legitimately powerful critique of liberalism, and then squanders the moral weight of that critique on mundane bigotry you could find in five minutes on Twitter. What to do with such a book? As a political matter, it might be reasonable to ignore such people. But as a student of ideas, I feel compelled to follow good ones. Maybe the most important task is to see whether Legutko’s good ideas can stand apart from his bigotry.

Legutko criticizes the individualism that he (like Berman) takes to be at the core of modernity. He says liberalism is “anthropologically minimalist” in its conception of human beings. By “minimalist” he means a lack of any particular aspiration for what a good human life should entail. The only principle actively affirmed by liberalism is equality, but in his view this equality requires a regrettably low standard for language, education, and moral conduct. People are free to become whatever they want, but not encouraged to become anything particularly good. Classical liberals like J.S. Mill insisted that “It is better to be a human being dissatisfied than a pig satisfied; better to be Socrates dissatisfied than a fool satisfied,” but modern liberalism forbids looking down on other people’s priorities. He argues that our reverence for individual satisfaction elevates entertainment to an inappropriate level of cultural relevance. He believes fear of sin is a crucial motivator that we have all but lost. In a clear hint of where his practical politics lie, he focuses closely on what he sees as the worst excess of consumerism: sexual liberation. Modern unrepressed sexuality marks the ascent of pleasure as a yardstick of human existence. While “happiness” or “fulfillment” have long been important moral concepts, they can only be achieved over the course of a full life, whereas “pleasure” can be achieved in short isolated moments. To agree with all of Legutko’s disdain for modern individualism, it probably helps to have grown up Catholic. But in general I find this a valuable corrective to the dominant culture. I recognize, of course, that individualism may be the only option in a diverse society incapable of agreeing on universal standards for virtue.

Legutko follows his discussion of individualism with a critique of how politics works in liberal democratic societies. At its least original, this is just the standard argument against interest group politics and identity politics. The more interesting aspect is his claim that liberalism is totalizing in a similar way to communism. I should note that Legutko has impeccable anti-totalitarian credentials as a former leader of Solidarity in 1980s Poland. Just as communist regimes could not tolerate a shred of anti-communist dissent, he says, liberal societies cannot tolerate illiberalism. In this sense liberalism is not a theory capable of being disagreed with, but a religion: “In a sense it is a super theory of society, logically prior to and higher than any other. It attributes to itself the right to be more general, more spacious, and more universal than any of its rivals.” He worries that liberalism tends to politicize everything by turning social groups into something closer to political parties. This is his objection to multiculturalism: that it is not really about culture, but about weaponizing culture to form a more divisive politics. Legutko especially objects to liberal attitudes toward discrimination and intolerance. In his view, a liberal society–which prizes equality above all–should leave room for dissenters to be intolerant. It’s an interesting question: what does pluralism really mean? Is it pluralistic to say that if you’re not for pluralism, you’re on the wrong side of history? Legutko thinks that this totalizing liberal vision is why liberal democracies are suspicious of families, communities, and religious groups, which are independent power structures of alien illiberal character. For example, some families afford men unequal power over women. Legutko thinks this should be the family’s prerogative to decide and not a matter for the liberal state to settle…by criminalizing domestic abuse.

See what just happened there? We were coasting through some unorthodox but thought-provoking political philosophy, and then suddenly you notice the real-world implication is that it should be legal (if immoral) to beat your spouse. Welcome to the experience of reading Legutko. For the first few chapters, it’s unclear which aspects of modern life are animating his lament. Capitalism merits nary a mention. In the second half of the book, it becomes clear: Legutko is just one more guy who thinks women, gay people, and immigrants have too much power. Yawn. He sees Christianity–the bedrock of European civilization–as under attack by at least some of these forces (it’s less clear what women have done wrong in this context). I kept finding myself surprised, although maybe I shouldn’t have been, by what a large share of his complaints concern sexual politics. The main “moves against Christianity” are “in-vitro fertilization, so-called reproductive rights, and rehabilitation of new sexual disorders.” He contradicts his earlier argument for a more robust pluralism, in which no ideology stands above the others, by claiming a special status for Christianity. It is “the last great force that offers a viable alternative to the tediousness of liberal democratic anthropology.”

In fact, that may be so, at least applied to religion broadly. But only if we are talking about a version of religion that emphasizes humility, charity, and human frailty, not one obsessed with sexual regulation. I am struck by the centrality of religious discourse to all robust statements of antimodernism. Toward the end of No Place of Grace, Lears writes: “Liberal Christianity has forgotten the stratum of hardness in the Christian tradition, evaded the tragic contradictions at the heart of life, and lost much of its ability to impart a sense of gravity and larger meaning to the human condition.” As I continue to explore antimodernism, I plan to read more religious writings. Reinhold Niebuhr, Marilynne Robinson, and the history of American Evangelicalism are on my list. Please share your suggestions with me as well.

Jeff’s Blog #18: The End of an Era

Jeff’s Blog, Volume 18

Today’s entry is a little different. Thanks for bearing with me; I think the historical moment calls for it.

1. Ambition
The final day of Barack Obama’s presidency deserves a moment of consideration. Over the past month, I’ve read a bunch of essays that review the Obama years, define his legacy, and try to imagine how future historians will rank him. Jonathan Chait, in several essays and a new book, goes for apologia: a resolute defense of Obama as one of the greatest presidents of all time. Chait’s elevator pitch is three-pronged: Obama rescued us from the Great Recession, passed the most important safety net program since the New Deal, and did a bunch of little things, culminating in the Paris Agreement, to turn the tide against climate change. Of these, I think the Affordable Care Act still, even now, has the best chance to stand the test of time. And it was a major accomplishment: mostly for the Medicaid expansion, but more subtly for proving that exchanges don’t really work and thus building the long-term case for single payer. Chait also defends Obama’s grand political philosophy: a flexible, technocratic pragmatism best associated with Rockefeller Republicans (see also Emmett Rensin skewering Chait’s excessive enthusiasm).

Chris Hayes’ review, while mostly positive, suggests a slightly darker angle on Obama’s laudable tendency to be the most reasonable adult in the room. Hayes’ 2012 book Twilight of the Elites is one of the most important texts for understanding the roots of the pessimism and nihilism that led to Trump. During a period when faith in institutions was plummeting, Obama was the consummate institutionalist. He continually preached keeping the faith in the inexorable arc of national progress. The Republican Party’s decision to become the Party of No–“insurrectionists,” as Hayes calls them–was perfectly tailored for this historical moment. They caught a populace with such low expectations that gridlock was tolerable. Candidate Obama–the one with the “hopey-changey stuff”–seemed like the right savior for this era of lost faith. I’m not sure that President Obama, the reasonable centrist, actually was.

Now, it’s important to note that in his second term, once the Republican gambit was fully apparent, Obama was more aggressive in using executive action to circumvent Congress on issues like immigration and climate policy. But he found his progressive footing far too late. Matt Stoller reminds us that by far Obama’s biggest opportunity to help poor and middle-class Americans came in 2009 during the response to the financial crisis. Led along by Tim Geithner’s Treasury Department, the administration put most of the burden on borrowers, encouraged nine million foreclosures that might have been prevented by a HOLC-like cost-sharing program, and put creditors (the banks) first in line for recovery.

There were other failings. An under-discussed development is that the Democratic Party fell apart during the Obama years. The curse of Obama’s unique charisma was that, as Keith Ellison has noted, the DNC became a presidential campaigning tool instead of a national party apparatus. In the last eight years, Democrats lost 12 governors, 13 Senators, 69 House seats, and over 900 state legislative seats. Obama’s Middle Eastern policies inspired praise from few, not even Chait. Jeffrey Goldberg’s long Atlantic article was the best exploration of how Obama thought about when and how to intervene abroad (with great subtlety, as you might expect, and perhaps not enough decisiveness). Finally, most ominously for the future, Obama expanded the power of the state. He continued the Bush administration policy whereby NSA wiretapping is not subject to a warrant. And the practice of killing by drone strike, while cumulatively less lethal than traditional acts of war, is uniquely removed from human guilt and accountability.

2. Tragedy
For all this criticism, I am conflicted. Like many of us who cast our first ballot in his name, I will always be sentimental about Barack Obama. He holds a unique spot among public figures I’ve followed: the most analytical, the most self-aware, the most literary, the most inquisitive. Those are some of the adjectives I admire most in a person. On a human level, I want to be like him. But take a second look: those are words with which you praise a writer, not a statesman. I predict that the best Obama biography will be one that uses his identity as a reader and writer as the central frame. And so the best way I’ve found to settle the ambivalence I feel about Obama’s legacy is to decide that there are two Barack Obamas: the president and the sage. The sage was on full display in his final address in Chicago last week. He preached open-mindedness, participation, and the obligations of citizenship. He was, as always, funny, insightful, and humane. The silver lining, I think, is that Obama can continue to play his best role–public philosopher–in the post-presidency.

I expect that Obama the president will go down as a tragic figure in American history. Like all great tragic heroes, he was noble, possessed a fatal flaw, was subject to unfair circumstances that compounded it, and suffered worse than he deserved. The flaw, as Stephen Skowronek writes, was his “rational, sensible approach to problem-solving.” Americans don’t want pragmatism. They want the “old Jacksonian idea of redemptive politics, of reconstruction, the idea that we have to make America great again.” I would add a second fatal flaw, rooted in the deepest, most shameful threads of our history: he was black. The more you study American history, the more you realize that slavery and its aftermath are waiting around every corner. On “Changes,” released posthumously in 1998, Tupac rapped “although it seems heaven-sent, we ain’t ready to see a black president.” In 2008, Nas respectfully rebutted him. Maybe one day the historians will tell us who was right.

3. Renewal
Skowronek’s theory of the cycles of presidential history is the most interesting thing I’ve read in Obama’s last days. It is a theory (adapted from a 1997 book) that has roughly characterized the past 200 years (but don’t take it too literally). Presidents come in sets of six:

 “Reconstructive” presidents like Franklin Roosevelt and Ronald Reagan (to take only the last two cycles) transform American politics in their own image, clearing the field of viable competition and setting the terms of political debate. They are followed by hand-picked successors (Harry S. Truman and George H.W. Bush) who continue their predecessors’ policies and do little more than articulate an updated version of their ideas. They are usually succeeded in turn by presidents whom Skowronek calls “pre-emptive”—Dwight D. Eisenhower, Bill Clinton—who represent the opposite party but adopt the basic framework of the reigning orthodoxy. Next comes another faithful servant of that orthodoxy (John F. Kennedy/Lyndon Johnson; George W. Bush), followed by another preemptive opposition leader (Richard Nixon, Barack Obama) who again fails to overturn it. The final step in the sequence is a “disjunctive” president—usually somebody with little allegiance to the orthodoxy who is unable to hold it together in the face of the escalating crises it created and to which it has no response. The last disjunctive president, in Skowronek’s schema, was Jimmy Carter.

Skowronek’s framework reminds us that Obama was still a hostage of the Reagan era–but he didn’t know it. He was trapped in a world where government is the problem and big business runs the show. The tragic thing about Obama, if you accept Skowronek’s premises, is that he tried to end the cycle of transformation and rebirth. He tried to convince us that we don’t need sweeping ideology, which always divides us; we just need common sense solutions. He may have been right, but we didn’t hear him. He may have been the president we needed, but he wasn’t the president we deserved. The cycle rolls on.

According to Skowronek’s timeline, Trump will preside over a time of crisis. One might expect our society’s towering inequality to buckle under its own weight. Trump won’t look to the Republican orthodoxy to fix the crisis–he doesn’t believe in it:

“The kind of president who reigns over the end of his party’s own orthodoxy is always a guy with no relationship to his party establishment, someone who catches popular mood and says he is going to do it all by himself. Someone like Herbert Hoover, who carefully cultivated his own political brand and image as a ‘wonder boy’—the guy who can fix anything. Disjunctive presidents are always loners.”

After Trump fails, there may be yet another opportunity for transformation. The question is how much will be lost on the way.

4. Tomorrow
The essay for the moment:

Having spent three-quarters of a century fretting about enemies abroad, we have never fully processed a lesson of history: that great civilizations almost invariably collapse from within. We are Athens, we are Rome — we are, more than anything, Paris in the 1930s, another society divided against itself, living in what one historian described as “the age of unreason.”

And so the election of Trump will come to mark the end of the international order that was built to avoid repeating the catastrophes of the first half of the twentieth century, and which did so successfully — horrors that we like to imagine we have outgrown.

We will face a great moment of crisis, after the next major terrorist attack in the U.S. (something no American President could prevent), which will present something like a perfect storm: a thin-skinned, impulsive leader with authoritarian instincts, a frightened public, an environment of permissive racism, and a post-fact information environment. In such a moment basic civil liberties will be at risk: due process will be assailed as “protecting terrorists”; free speech will be challenged as “giving aid and comfort to the enemy.” And that will be the moment when each of us must stand up and be counted, and never forget Tolstoy’s admonition: “There are no conditions to which a man may not become accustomed, particularly if he sees that they are accepted by those about him.” Our portion is to make sure that never comes to pass.

Jeff’s Blog #17

Jeff’s Blog, Volume 17

Thanks for continuing to read along! If you’ve been enjoying this, consider looping in a friend or two who might also enjoy it. It’s been really great to hear from some of your friends already.

1. Work: from the Indo-European root *werg-, as in energy, lethargy, synergy…
I’m taking a short hiatus from thinking about “the future of work” to consider its past. As a starting point for a reading spree on the history of work in modern societies, I really appreciated John Budd’s The Thought of Work. This was a survey of ten major conceptions of work you find in the history of ideas. Because everyone loves a list, I’ll share them: (1) work as a curse, (2) work as freedom, (3) work as a commodity, (4) work as occupational citizenship, (5) work as disutility, (6) work as personal fulfillment, (7) work as a social relationship, (8) work as caring for others, (9) work as identity, and (10) work as service. What I love about this book is that it draws from many disciplines–especially economics, sociology, psychology, philosophy, and theology–and clearly delineates where their respective conceptions of work conflict with one another. It’s useful to be reminded that when different intellectual traditions give different answers, sometimes it traces back to a fundamental disagreement about what the object under study actually is. So, for example, if work is occupational citizenship (#4), labor unions are groups of people who pool their resources because they share interests, analogous to a political party. But if work is a disutility traded in competitive markets (#5), labor unions are monopolists. And if work is about individual fulfillment (#6), a union meant to provide collective rewards seems to miss the point.

The two perspectives that were least familiar to me and that I most enjoyed thinking about were the “pluralist industrial relations” perspective and the feminist perspective. The word “pluralist” refers to the assumption that workers and management share some interests (unlike Marxists), but not all interests (unlike the “unitarist” perspective in Human Resources Management). This is Budd’s own academic specialty, so he explains it especially convincingly. The idea that really stuck with me was the notion that corporations are essentially unionized shareholders–organized collectives of individual investors pooling their resources to hire management–so a level playing field naturally requires unionized workers doing the same thing. The feminist perspective begins with the observation that the “cult of domesticity” only emerged after industrialization, when men and women switched from working side-by-side on the farm to a factory-based specialization (for the man) that relied upon unpaid care work (for the woman) at home. At the risk of generalizing, a major principle of feminist labor theory seems to be that care work (broadly construed as bringing up children, educating people of all ages, meeting health needs, treasuring the environment) should be highly valued both privately and publicly. Virginia Held’s The Ethics of Care is a major work in this vein.

Finally, I enjoyed the contributions of religious thinkers to conceptions of work. “Work as curse” was a common response to original sin (God curses the ground beneath Adam’s feet and says “In toil shalt thou eat of it”), but Luther, Calvin, and Confucius said that work was a calling, a way to honor the family and please God. The modern theologian Miroslav Volf has a book about work. Most of all I liked this from Rumi, imagining the day of judgment: “I gave you hands and feet as tools / for preparing the ground for planting. / Did you, in the health I gave, / do the plowing?”

2. Profit over people
Last week I wrote a bit about the “law and economics” perspective on corporations, which holds that there is really no such thing as a corporation, nor do employees have any special stake in it, because the corporation is just a bundle of contracts that could be dissolved and renegotiated to serve shareholder value. If this is true, or, more to the point, if the law treats corporations as if it were true, Joel Bakan argues that The Corporation is a pathological entity. In clinical terms: irresponsible, manipulative, grandiose, lacking empathy, asocial, unable to feel remorse. The basic argument of the book is that while many businesspeople are well-meaning and can, through the magic of “corporate social responsibility,” convince themselves that they are a force for good, public corporations are inevitably single-minded, and therefore indifferent to whatever human or natural obstacles might lie in the way. I think this is an extremely important argument, and I agree with it in its bare-bones formulation, but I found the book disappointing in its delivery.

The most common pieces of evidence were interviews with public executives and investors who have seen the light. E.g., “The corporation is an externalizing machine, in the same way that a shark is a killing machine…There isn’t any question of malevolence or of will.” There was Milton Friedman saying corporate social responsibility must be a sham because profit is the only bottom line. Bakan goes so far as to say corporate social responsibility might be illegal under Dodge v. Ford (1919), which held that managers must act in the interest of shareholders, but this seems far-fetched under Shlensky v. Wrigley (1968), which held that managers have extremely broad discretion in doing so. The most helpful evidence for me was Bakan’s telling of the lawsuit against GM when Chevy Malibus were exploding in collisions. GM was aware that such fuel-fed explosions were likely, but had calculated that moving the fuel tank away from the bumper would cost several dollars more per car than the anticipated legal expenses of 500 deaths per year (that legal-cost estimate, at least, turned out to be wrong). As illuminating as that episode is, I don’t think you can hope or expect to find a smoking gun on every company that subverts the public interest. Bakan is a (Canadian) law professor, and I would have liked to see a much more legally focused argument attacking corporate personhood. There seems to be a much-ignored, barely perceptible thread in American legal history casting doubt on the idea that the 14th Amendment and other marks of personhood should be applied to corporations (count Justices Douglas and Black as dissenters). I plan to learn more about this.
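To make the cost-benefit arithmetic concrete, here is a rough sketch. Only the 500-deaths-per-year figure comes from the story above; the fleet size, settlement cost, and per-car fix cost are purely illustrative assumptions, not GM’s actual numbers:

```python
# Sketch of the GM-style cost-benefit calculation, with hypothetical figures.
CARS_ON_ROAD = 40_000_000        # assumed fleet size (illustrative)
DEATHS_PER_YEAR = 500            # anticipated fuel-fire deaths (from the text)
SETTLEMENT_PER_DEATH = 200_000   # assumed legal cost per death (illustrative)

# Expected legal exposure, spread across the whole fleet:
legal_cost_per_car = DEATHS_PER_YEAR * SETTLEMENT_PER_DEATH / CARS_ON_ROAD
print(f"Expected legal cost per car: ${legal_cost_per_car:.2f}")  # $2.50

FIX_COST_PER_CAR = 8.59          # assumed cost to relocate the fuel tank
# The fix looks "uneconomical" whenever it costs more per car than the
# expected legal exposure -- the calculation Bakan describes.
print("Fix rejected" if FIX_COST_PER_CAR > legal_cost_per_car else "Fix adopted")
```

The chilling part, of course, is not the arithmetic but that deaths enter it as a line item at all.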

3. The next Pan Am?
Everyone loves airline stories. Bloomberg had this big profile of Emirates, which has dominated the aviation business in recent years but may be “running out of sky.” Read the story if only for the slew of crazy facts about how big Emirates is–they have “an industrial-size wine cellar in Burgundy, with 3.75 million bottles aging at any given time,” the world’s largest kitchens and dishwashing operation, and they alone fly the majority of the world’s Airbus A380s, the largest commercial plane. The interesting conflict is with the U.S. Big 3 carriers, which are lobbying to shut Emirates out of U.S. airports, arguing that Emirates gets unfair advantages (no unions, big subsidies, a custom-made airport) from the government of Dubai. There’s definitely some irony here if you recall when I wrote about American Airlines benefiting from a cozy relationship with regulators during the U.S. Air merger. But fun little hypocrisy aside, the American carriers are broadly right–Emirates has a ton of assets (start with location!) they will never have. The interesting thing, to me, is the possibility that this dynamic might reappear in many industries. We’re familiar with competitive advantage; sometimes the winner ends up the only game in town. But what happens when the town is the entire world, and the market is winner-take-all? Perhaps Emirates will be the only airline, Amazon the only retailer, Exxon the only oil company. More likely, protectionism becomes more acceptable. The airlines are certainly counting on Trump to save them.

4. Alternatively, there’s the Trump Foundation…
I really liked Chris Blattman’s framework for charitable giving. It’s hard not to read this kind of thing as a response (challenge?) to the GiveWell model, which is presumably well-known and well-respected by people who read Chris Blattman. At first my reaction was “yeah, Chris’s approach resonates with me more than GiveWell does,” but on second thought I see how GiveWell is putting forth the most defensible, most universally unobjectionable framework for giving, while Chris’s view is more narrowly tailored to his own values, which also happen to be my own. GiveWell is careful to recommend only charities that have demonstrably positive impact on human life. Based on the stuff we know how to measure well, this ends up skewing heavily toward health and medical interventions in developing countries. That’s really important! But Chris’s view is that “the means and end to human well being is good government and political rights and freedoms,” which he acknowledges are very hard to measure. So he gives mostly to organizations that work on civil society and good governance. To their credit, the GiveWell people, through their Open Philanthropy Project offshoot, are making a massive leap into these more policy-oriented areas of giving as well. Chris gave to way more organizations than I did this year, but we had some overlap: GiveDirectly, ACLU, Amnesty International, and International Rescue Committee.

5. Why read, continued
Julia Galef created an interesting “taxonomy of ways books change your worldview.” She has four big categories: books that offer data, books that offer theory, books that change your values, and books that change your thinking style. I don’t think it really works as a taxonomy since it’s neither mutually exclusive nor collectively exhaustive, but all of the items in her list strike me as nice articulations of what non-fiction books can do. In writing non-fiction, I should strive to do several of these at once. In this blog, I can most easily do things in the theory category: present models of how a phenomenon works, point out a problem, offer a general concept or lens. I also probably make explicit arguments about values more than you would like. The most impressive books, to me, are those that “write from a holistic value structure, letting you experience that value structure from the inside,” and those that “tickle your aesthetic sense.” This relates directly to the justification for reading full books I gave a few weeks ago: that it’s often more about getting inside the head of an interesting thinker than learning any specific facts or arguments.

6. Real conservatism
I’m in the early chapters of a book about antimodernism around the turn of the last century. More on this soon, I hope, but the general story is about members of the bourgeois elite feeling alienated by consumerism and the new industrial society, and looking to the East and the medieval past for fantasies of “more authentic experience.” There seem to be some parallels to today. In any case, the introduction provided my favorite two sentences of the week: “The more thoughtful anti-modernists remind us what Left critics too often forget: in a society dedicated to economic development and ‘personal growth’ at the expense of all larger loyalties, conservative values are too important to be left to pseudo-conservative apologists for capitalism. In our time, the most profound radicalism is often the most profound conservatism.”

Jeff’s Blog #16

Jeff’s Blog, Volume 16

1. Moving on from the Old New Deal
I’m halfway through a few books about the New Deal, and I think Jefferson Cowie’s The Great Exception is the one I would recommend to anyone who wants to think about The New Deal and its legacy at a high level. This isn’t a blow-by-blow of FDR’s legislative efforts and the characters who shaped his administration; for that I’d probably recommend Michael Hiltzik’s The New Deal: A Modern History. Cowie’s project is more unusual: tell the story of American political economy in the 20th century through the lens of the New Deal. Cowie’s argument is simple and bold: the New Deal and the broad economic prosperity in its wake (1935-1970s) were aberrations in American history, the products of unusual circumstances that will probably never coincide again. In some sense the doomed protagonist of the book is really class politics or labor politics, which has thrived exactly once in American history (during this mid-century period). Cowie renames several historical periods to highlight his theme. The Gilded Age was not unique but rather the “Incorporation of America” when law was first settled to support big business against labor. The Reagan Revolution is better seen as a “Reagan Restoration,” undoing the aberrant New Deal and setting the stage for our current return to corporate dominance.

The core of the argument is about why Americans have always been skeptical of class and labor politics, and why those obstacles dissolved for a brief moment in the 1930s. Cowie holds the notion of Jeffersonian individualism in some contempt as the mythos that keeps Americans from appreciating collective action. One especially interesting instantiation of this ideal came during the Free Labor movement in the 19th century. Northerners distrusted collective organization of workers because it resembled slavery. “By contrasting free wage labor in the north with slave labor in the south, the standard for white American working class identity was set low: not enslaved.” See Eric Foner’s Free Soil, Free Labor, Free Men for more on this ideology. Of course, racism is the most important historical obstacle to a class politics. FDR could not have passed his major initiatives without Southern Democrats. The result was the systematic exclusion of black people from New Deal programs; e.g., domestic and agricultural workers were excluded from Social Security. When LBJ signed the Voting Rights Act, Democrats lost the South for well more than a generation. When Reagan won in 1980, segregationist Strom Thurmond marveled that it was the same platform he himself had run on in 1948. “The core, tragic dilemma of American class politics was that to include African Americans meant not to have class politics.”

A second historical obstacle to class politics has been nativism and immigration hysteria. As we know, no wave of immigrants has been immune to violence and abuse. But the middle of the century was a unique moment of relative calm, between the 1924 Johnson-Reed Act (which limited immigration) and the 1965 Immigration Act (which abolished quotas based on national origin). There was thus little controversy about immigrants benefiting from collective bargaining (the Wagner Act) or minimum wage (Fair Labor Standards Act) or Social Security. Another piece of the Great Exception was the absence of religious fundamentalism in the 1930s. Religiously inspired social reform movements reappear throughout American history, but the New Deal landed in between the Social Gospel movement of the early 20th century and the rise of Jerry Falwell, the Moral Majority, and the New Christian Right in the 60s and 70s. Finally, there were the extraordinary circumstances of the Depression and World War II (these aberrations also play a key role in Thomas Piketty’s argument for why wealth inequality shrank in the middle of the century before resuming its upward climb). It’s remarkable how brief a period accounted for all the significant pro-labor policy-making of the 20th century; every key New Deal program passed between 1935 and 1938. As Cowie shows, most labor politics of the subsequent decades was simply a defensive struggle to preserve the Wagner Act.

As you can tell, this book offers a very pessimistic take on American history. Indeed, Cowie’s aim is to disabuse us of the belief that we can recreate mid-century equality by replicating New Deal-style programs, universal benefits, and collective bargaining–at least at the national level. It’s a rebuke that says Bernie Sanders is living in the past–but not a deep enough past to recognize that our current situation is the norm in the scope of our history. If Cowie has any hope, it comes from the Progressive Era, which at least had “militant voluntarism in the shadow of a state hostile to collective interests.” Unable to pursue national legislation, the Progressives pushed for local initiatives, worked outside the state, and shamed the powerful through muckraking. Local Fights for $15 would fit in this tradition. But based on the little I’ve learned about the Progressive Era recently, I am not optimistic that there is a great lesson for populist politics in there either.

2. Corporations are nexuses of contracts, my friend
Since the financial crisis, sociologists have been at work uncovering the “financialization” of American life. This is the framework of thinking about companies as financial products, households as investors, and Wall Street as the center of the economy. It includes the shift from managerialism–the notion that managers run companies at their discretion–to the idea that corporations exist only to satisfy shareholder value. In retrospect, financialization began in the 1970s and seems to have continued apace after the speed bump of 2008. Jerry Davis’s Managed by the Markets: How Finance Reshaped America is a helpful chronicle of this moment. In a sense this story is a sequel to The Great Exception. At least, it describes a world where the workers who benefited from the New Deal are beside the point. Davis: “My basic argument is that twentieth-century American society was organized around large corporations, particularly manufacturers and their way of doing things. It is now increasingly organized around finance.”

I found that this book succeeded more as intellectual history, chronicling the evolution of corporate law, than as a political or business history of the rise of finance. The book proceeds as a reflection on a 1932 classic, The Modern Corporation and Private Property, which introduced the theme of the separation of ownership from control. While that book bemoaned how much power managers have, Davis shows how the tables have turned and now shareholders rule. A key moment was the introduction of the notion of a “market for corporate control” (Manne 1965), namely that the lower the stock price, the more attractive a takeover becomes to those who think they could do better. This sets a limit on how much managers can ignore stock prices. And managers are not the only ones beholden to the markets: governments are too. The idea of a “market for corporate charters” suggests that states will compete to offer laws most favorable to shareholders–a competition which Delaware has won. It was interesting to learn about states attempting to stand up to shareholders, like when Pennsylvania sought to keep Hershey from changing ownership out of fear it would harm workers and the namesake town which relies on the company. But the dominant law and economics perspective hardly believes there is such a thing as a company. In this view, companies are just a collection of contracts between the owners of labor and capital and customers (Jensen & Meckling 1976). I find this approach extremely dangerous as it has no regard for the obvious (to anyone outside academia) reality that companies are social institutions embedded in communities.

Managed by the Markets also includes plenty of history of the fall of commercial banking, the rise of investment banking, the finance-driven merger movement, the shift in household wealth from bank deposits to mutual funds, and the evolution of houses into leveraged financial assets. There was a shockingly blunt quote from the CEO of Wells Fargo in the 90s: “The [commercial] banking industry is dead, and we ought to just bury it.” After a series of big bank mergers, seven of the ten largest U.S. cities lack a major local commercial bank. Another important development was when the major investment banks abandoned the traditional partnership model and went public. Compensation for bankers became more closely tied to short-term results rather than long-term customer service. Hence scandals like that of Henry Blodget at Merrill, who “conceded that his Internet research group had never issued a ‘Sell’ recommendation, even on companies referred to in internal e-mails as ‘dogs’ and ‘crap.’” For all these interesting facts about the rise of finance, I came away from the book lacking a strong intuition for why this happened. At several points Davis says information technology made it easier to trade complicated securities, but this felt insufficient. I think closer attention to lobbying and public policy toward finance would be helpful. Act of Congress, about the passage of Dodd-Frank, is great for insight into finance’s lobbying power. I think Greta Krippner’s Capitalizing on Crisis: The Political Origins of the Rise of Finance is probably the right book for the political history. Krippner argues that as post-war growth slowed in the 1970s, the state deregulated finance to supply the credit necessary to maintain the entitlements and standard of living people had come to expect. A lot of stuff comes back to the Great Exception and how to go on in its absence…

3. Bubbles, and not the molecular gastronomy kind
Thrillist has a three-part series critiquing the American restaurant business. Part one explores how small cities across the country developed identical hip restaurant scenes, all parroting Portland circa 2007. Part two discusses a shortage of cooks driven by too many restaurants and not enough immigration. And part three argues that the restaurant business (at least the full-service segment) is in a bubble. In 2016, the number of independent restaurants declined 3%. The author cites “rising labor costs, rent increases, a pandemic of similar restaurants, demanding customers unwilling to come to terms with higher prices.” From these reasons you notice an economic story and a cultural one. The cultural story is what makes the piece fun to read, but it’s not clear whether the self-parodying excess of foodie culture is really what’s driving restaurants out of business. We hear about hyper-seasonal menus that change five times a season, chefs who refuse to buy someone else’s bread or charcuterie, and customers who expect world-class entrees under $30. But it seems like nothing is more important than rising healthcare costs, especially as the Obamacare employer mandate defines full-time work as 30 hours per week. The restaurant featured throughout the story paid $14,400 for health insurance in 2012, but $86,400 in 2015. The increase alone represents a third of profit in their best year ever. Food delivery apps are also reportedly a problem, since a growing share of orders comes through them, and delivery customers skimp on tips and skip alcohol. The author predicts that we will stop eating in sit-down restaurants as much as we have in the past few years, and that fast casual will continue to gobble up market share. Consumers will adapt. The real losers, he suggests, are the current generation of chefs who have poured their savings into new restaurants as the bubble inflated.
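Those insurance numbers are worth a back-of-envelope check. The two premiums come from the article; the implied best-year profit is my own inference from the “a third of profit” claim, not a figure the author reports:

```python
# Back-of-envelope check on the featured restaurant's insurance costs.
cost_2012 = 14_400   # annual health insurance bill, 2012 (from the article)
cost_2015 = 86_400   # annual health insurance bill, 2015 (from the article)

increase = cost_2015 - cost_2012
print(f"Increase: ${increase:,}")                 # $72,000
print(f"Multiple: {cost_2015 / cost_2012:.0f}x")  # a sixfold jump

# If that increase equals a third of best-year profit, profit was roughly:
implied_profit = increase * 3
print(f"Implied best-year profit: ${implied_profit:,}")  # $216,000
```

A sixfold jump in a fixed cost, against roughly $200K of profit in the best year, makes the economic story easy to believe.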

4. Maybe Carrier should do this…
There was some pretty amazing timing on two stories about China last week. Shot: this NYT report on the massive suite of incentives and perks the government of Zhengzhou gave Foxconn to locate its iPhone assembly factory there. Chaser: Foxconn “plans to replace almost every human worker with robots.” Now, it’s not as if the government worked so hard to lure Foxconn exclusively for the purpose of creating jobs. But that sure seems like a big part of it. The government spent $1 billion to build dormitories for workers; helps recruit and train workers and pays a subsidy for new hires; and eliminated corporate taxes and value-added taxes for several years (so it’s not as if they’re making up their investment in tax receipts!). Perhaps the government was aware of Foxconn’s rough plans, which have included some major automation component for several years now. But this must rankle someone. Especially, as the NYT notes, since China’s national development goals have shifted slightly away from manufacturing for export and toward domestic consumption and services. If you’re interested in domestic consumption, the last thing you want is a company throwing hundreds of thousands of people out of work–right after you paid them a few billion dollars to create the jobs. Something I often talk about in the U.S. context is how corporations should serve other ends beyond profit. It would not surprise me if China is much more successful at limiting antisocial business practices (e.g., automation, if it’s deemed antisocial) than we are. Whether it is ultimately a good thing for a government to coerce prosocial behavior is a question for another day.