Odourless Utopia

In Europe, the war bulletins come not just from Ukraine, but also from the climate front. The French government has cracked down on water use, banning watering lawns and washing cars in 62 of 101 departments, as more than 100 municipalities no longer have potable water. Nuclear power plants on the Rhône and Garonne have had to reduce production due to insufficient water in the rivers. In Italy, the government has declared a state of emergency in 5 of 20 regions, while Second World War bombs are discovered on the beds of its largest river, the dried-up Po. In Germany, the Rhine is so low that the barges plying its 1,000 kilometres from Switzerland to Holland have had to reduce their cargo from 3,000 to 900 tons so as not to run aground, and the river is expected to soon become impassable to freight traffic. In England, for the first time on record, the source of the Thames has dried up and the river now begins more than 5 miles further downstream. In Spain, restrictions on water consumption have been imposed in Catalonia, Galicia and Andalusia.

These are all warning signs. In a few centuries, the idea of water as an abundant resource and universal right may be unimaginable. It is easy to forget that even in the so-called advanced world, domestic running water – for toilets, cooking, personal hygiene, washing clothes and dishes – is a very recent and ephemeral phenomenon, dating back less than a century. In 1940, 45% of households in the US lacked complete plumbing; in 1950, only 44% of homes in Italy had either indoor or outdoor plumbing. In 1954, only 58% of houses in France had running water and only 26% had a toilet. In 1967, 25% of homes in England and Wales still lacked a bath or shower, an indoor toilet, a sink, or hot- and cold-water taps. In Romania, 36% of the population lacked a flushing toilet for the sole use of their household in 2012 (down to 22% in 2021).

The availability of domestic running water varies depending on one’s individual wealth and on the affluence of one’s nation. While in Western Europe and the US, the number of households with toilets equipped with running water currently exceeds 99%, in a number of African countries the figure is between 1% and 4%: Ethiopia 1.76%; Burkina Faso 1.87%; Burundi 2.32%; Uganda 2.37%; Chad 2.50%; Niger 2.76%; Madagascar 2.83%; Mozambique 2.87%; Mali 3.71%; Rwanda 3.99%; Congo 4.17%. In these countries the toilet is a marker of class status; in Ethiopia fewer than one in 56 households has one. The data also contains some surprises: there are more toilets in Bangladesh (35%) than in Moldova (29%), India is in roughly the same situation as South Africa (44% versus 45%) and just ahead of Azerbaijan (40%). While in Baghdad the number of houses with flushing toilets is 94.8%, in central Kabul it is 26%, and in Afghanistan as a whole it is 13.7%.

It is possible to trace the social and geopolitical history of running water. Its widespread accessibility was the result of two primary factors: 1) the industrial revolution that provided the pipelines and purification plants needed for this colossal planetary enterprise; and 2) urbanization, for it is fairly obvious that bringing running water to a series of isolated cottages is far more expensive and complex than to centres of high population density. Urbanization was stimulated by the industrial revolution, and then in turn by the availability of running water for newly arrived citizens. This may well be one of the most significant, and most peculiar, features of contemporary civilization. For what it created was the utopia of an odourless society. This would not have been possible without the spread of running water, but it was accelerated by the growing desire to deodorize the human habitat. In the twenty-first century, we no longer perceive smells as our ancestors did.

In The Foul and the Fragrant (1988), Alain Corbin asks, ‘What is the meaning of this more refined alertness to smell? What produced the mysterious and alarming strategy of deodorization of everything that offends our muted olfactory environment? By what stages has this far-reaching anthropological transformation taken place?’ An incisive answer is offered by Ivan Illich in his brilliant little book, H2O and the Waters of Forgetfulness (1986), which reminds us that it was not until the last years of Louis XIV’s reign that a decree was passed for the weekly removal of faeces from the corridors of Versailles. It was in this era that the project to deodorize began. ‘The sense of smell’, Illich writes,

was the only means for identifying the city’s exhalations. The osmologists (students of odors) collected ‘airs’ and smelly materials in tightly corked bottles and compared notes by opening them at a later time as though they were dealing with vintage wines. A dozen treatises focusing on the odours of Paris were published during the second part of the eighteenth century…By the end of the century, this avant-garde of deodorant ideologues is causing social attitudes toward body wastes to change…Toward the middle of the century shitting, for the first time in history, became a sex specific activity…At the end of the century, Marie Antoinette has a door installed to make her defecation private. The act turns into an intimate function…Not only excrement but the body itself, it was discovered, emanates bad odours. Underwear that up to this time had served to keep one warm or attractive began to be connected with the elimination of sweat. The upper classes began to use and wash it more frequently, and in France the bidet came into fashion. Bed sheets and their regular laundering acquired a new importance, and to sleep in one’s own bed between sheets was charged with moral and medical significance…On November 15, 1793, the revolutionary convention solemnly declared each man’s right to his own bed as part of the rights of man.

Being odourless thus became a symbol of status:

smelling now began to become class-specific. Medical students observed that the poor are those who smell with particular intensity and, in addition, do not notice their own smell. Colonial officers and missionaries brought home reports that savages smelled differently from Europeans. Samojeds, Negroes and Hottentots could each be recognized by their racial smell, which changes neither with diet nor with more careful washing.

Naturally this myth was self-fulfilling, to the extent that colonized peoples were denied running water, soap and flushing toilets. Subaltern classes also began to smell and arouse revulsion. ‘Slowly’, Illich continues,

education has shaped the new sense for cleanly individualism. The new individual feels compelled to live in a space without qualities and expects everyone else to stay within the bounds of his or her own skin. He learns to be ashamed when his aura is noticed. He is embarrassed at the thought that his origin could be smelled out, and he is sickened by others if they smell. Shame at being smelled, embarrassment at coming from a smelly environment, and a new proneness to be offended by smell – all taken together place the citizen in a new kind of space.

Realizing this ideal of olfactory neutrality required increasing amounts of water. Before the Second World War, bathing once a week was considered hygienist paranoia. Only with the mass production of household washing machines did cleaning clothes become more frequent. I remember the London of the 1970s: on the Underground, the City clerks could be recognized by their detachable cuffs and collars; the former were changed regularly but the latter were grayish from having been worn for a week straight. The families that hosted us would ask us to insert coins into a special hot water meter: breakfast was included in the price, showering was not.

Now, though, the utopia of an odourless humanity has conquered much of the planet. Yet, as with many aspects of modernity, the moment we acquired the means to achieve a goal, its enabling condition (namely the abundant, unlimited availability of water) was lost. An ever more populous and rapidly warming planet will likely return to a state in which water is scarce and contested. This future may however be marked by a significant cultural difference. Whereas in the past, water was scarce for a humanity able to live happily with odours, now it will be scarce for one that considers its own odours insufferable, not to mention those of others.

I remember being struck by the extraordinary success of the Canadian TV drama H2O (2004), whose trailer announced:

A dead Prime Minister. A country in turmoil. A battle for Canada’s most precious resource – water. On the eve of testy discussions with the US Secretary of State, Prime Minister Matthew McLaughlin is killed in an accident. His son, Tom McLaughlin, returns to Canada to attend his father’s funeral where he delivers a eulogy that stirs the public, propelling him into politics and ultimately the Prime Minister’s office. The investigation into his father’s death, however, reveals that it was no accident, raising the possibility of assassination. The trail of evidence triggers a series of events that uncovers a shocking plot to sell one of Canada’s most valuable resources – water.

As James Salzman noted in his book Drinking Water (2012), this omitted ‘the most exciting part, where American troops invade Canada to plunder their water supply’. A US–Canadian war over water! Until now, such conflicts seemed to be the preserve of semi-desert areas in the Middle East (think of Eyal Weizman’s writing on the Israelis’ use of water to surveil and punish Palestinians), or torrid Africa (as in the latent conflict between Egypt, Sudan and Ethiopia over the Grand Ethiopian Renaissance Dam built on the Blue Nile). But with the possible desertification of the central European plain, war for water will become a real prospect, even in regions once famous for high rainfall and water infrastructure. We citizens of ‘rich countries’, ‘industrialized nations’, ‘more developed powers’, will fight to smell less.

Translated by Francesco Anselmetti.

Read more: Nancy Fraser, ‘Climates of Capital’, NLR 127.


Iron Musk

It is difficult to hide a certain satisfaction upon witnessing the collapse of bitcoin. Since I last dealt with the topic for Sidecar seven months ago, the total capitalization of cryptocurrencies has decreased from $2.6 trillion – equivalent to the total GDP of France – to only $901 billion (as of 15 June). One feels sorry, but only a little, for those gullible people who invested their modest savings in cryptocurrencies hoping for easy profits and got fleeced by another pyramid scheme – an updated version of the seventeenth-century tulip fever in the Netherlands, history’s first senseless financial bubble.

This schadenfreude is all the greater since the cryptocurrency crash particularly affects Elon Musk – in theory the world’s richest man, with assets valued at $268 billion. In the media, Musk is depicted as contemporary capitalism’s very own Tony Stark, alter ego of the Marvel superhero Iron Man: a business magnate, playboy, philanthropist, inventor and scientist. In 2021, Musk decided to accept bitcoin as payment for the electric vehicles produced by his company Tesla, having invested $1.5 billion in the cryptocurrency. Musk has relied on the fact that the cryptocurrency market is controlled by a small number of people who are able to manipulate its ebbs and flows (barring any sudden waves of panic). For several years these capitalists propped up the value of their investments in bitcoin by continuing to accumulate cryptocurrencies, just as public companies do when they inflate their own shares through ‘buybacks’.

In the space of a year, however, Dogecoin has lost over 80% of its value, dropping from $40 billion to $6.9 billion. Undeterred, Musk has continued to assert his faith in the venture, relaunching it in May as a means to pay for the merchandise of his space corporation, SpaceX. Every announcement made by Musk is followed by a rise in the price of Dogecoin: a fact that illuminates the mechanism through which this new form of capitalism increases the fortunes of its standard-bearers. The capitalist announces on social media that they will buy a given share. Their followers (or, perhaps more aptly, believers) rush to buy the same shares, which experience a vertiginous surge, after which the capitalist cashes in by selling a part of the bloated stock, easily covering the cost of the initial purchase.

What’s producing revenue here is influence. In Musk’s case, influence is accrued through his own comic-book persona: he will continue to amass wealth so long as he is seen as a Stark-like figure. This is how his image as the Iron Capitalist remains credible. For this reason, Twitter is the most efficient financial tool at his disposal: his 91 million followers scattered around the world are his real capital. Hence on 4 April the value of Twitter’s shares increased by 27% after Musk announced he had bought 9% of the company’s stock (Dogecoin also went up 20% as a result). It stands to reason that Iron Man would want to control the source of his revenue by investing in it.

Musk’s adherence to this superhero persona is therefore not only – or not even primarily – a vain ostentation, but quite literally a question of economic interest. Throughout his career as an entrepreneur he has carefully fashioned his image as an inventor or scientist (even if he dropped out of his graduate studies in materials science at Stanford after only two days). As Forbes emphatically proclaims, ‘Elon Musk is working to revolutionize transportation both on Earth, through electric car maker Tesla – and in space, via rocket producer SpaceX’. Musk must constantly renew these superheroic credentials, investing in fanciful, futuristic projects reminiscent of science fiction: electric cars, space exploration, artificial intelligence and neurotechnology. The key is to launch a new project before the previous one has been completed; new investments make earlier ones look profitable, thereby raising the value of their stock.

Exemplary in this regard is the story of Tesla, the electric vehicle company which, without having established a foothold in the industry (how many Teslas do you see driving around?), launched itself into the field of self-driving cars, with predictably disastrous results. As of 20 February, Tesla cars had caused 11 accidents, 17 injuries and one fatality. But, for Musk, the mere promise of automated cars served to obfuscate the broader failure of the electric vehicle. Tesla went public in 2010, after receiving $500 million worth of financing from the US government. From 2010 to 2019 its value increased, but at a fairly typical pace for an innovative tech company in a period of quantitative easing. (At this time, investment funds were able to take out billions in interest-free loans, and, without quite knowing where to channel it all, invested in companies that were seen as promising; it’s this that underpinned the enormous boom in stocks, despite the near-stagnant real economy). Over the following two years, the company truly went into orbit, peaking at $1.2 trillion in November 2021, before sinking to $662 billion as of 15 June.

This valuation does not correspond in any way to Tesla’s ‘real’ size, which remains modest both in terms of vehicles produced (305,000 the whole of last year) and sales ($54 billion). In comparison, the Volkswagen group had a revenue of $250 billion and produced 5.8 million cars, but its capitalization only amounted to $167 billion. The ascent of Tesla was also fuelled by the growth of bitcoin, the promise of space exploration and, in 2021, the long-publicised touristic rocket ‘excursion’, which helped SpaceX surpass the $100 billion valuation threshold. In this way, the SpaceX and bitcoin boom retroactively triggered the rise of Tesla.

As we’ve seen, the valuation of Musk’s enterprises, as well as the aleatory estimates of his wealth, have always been based on the promise of future expansion: achievements that are just out of reach, just over the next hill. His trust in bitcoin therefore indicates more than just a speculative opportunism; it embodies the business model that operates across his various industries. It also demonstrates that the influence exercised by Musk through Twitter doesn’t only affect small investors (those that Italian stock traders call parco buoi, ‘the flock’), but also ‘professionals’: stockbrokers, financial advisors, fund managers and so on.

Every epoch has an entrepreneur who symbolizes its particular style of capitalism. At the end of the nineteenth century, during the robber baron era, it was the evangelist of modern billionaire philanthropy, Andrew Carnegie, with his Gospel of Wealth (1889). Then it was Henry Ford, the fascist-sympathizing industrialist behind the Model T, who shocked the world by paying his workers five dollars per day and was deemed ‘the one great orthodox Marxist of the twentieth century’ by Alexandre Kojève. The post-World War II period, with its social democratic compromise, lacked Promethean entrepreneurs of the kind envisaged by figures such as Werner Sombart and Joseph Schumpeter. Yet in the 1980s the mythos of the entrepreneur was revived with the rise of Reaganism. Richard Branson emerged as the fitting stepson of Thatcher, whose privatizations and deregulations paved the way for Virgin Atlantic and Virgin Healthcare. In 1986 the then Prime Minister appointed him ‘litter tsar’, tasked with ‘keeping Britain tidy’. Later, the Blair government entrusted him with managing part of the newly privatized British rail infrastructure.

Branson inaugurated the era of the performer-entrepreneur, a man of showbiz more than business, foreshadowing the new generation of moguls who operate on social media. Mark Zuckerberg, who deftly exploited Facebook to build his own personal brand, was the first. Then, in truly cinematic fashion, entered Iron Man Elon. Yet these symbolic figures aren’t necessarily the most significant ones. John Rockefeller or John Pierpont Morgan were far more important than Carnegie, even if they never embodied an epochal style. Bill Gates was just as important as Steve Jobs (himself a mythical character, though he died before the new wave of social media). In the same way, Amazon’s Jeff Bezos shapes our lives far more than Elon Musk, even though his presence on social media is close to nil, and he is markedly less representative of what might be called ‘comic book capitalism’.

The truth is that Musk’s significance is more political than economic. I know from personal experience that public figures – however cynical their stated positions may appear – end up identifying with the role they play and believing in the principles they thought they were exploiting. Tony Stark inevitably begins to see himself as Ulysses, ‘that man skilled in all ways’, whose ingenuity allows his people to fulfil their historic mission. Yet, unlike his former Paypal associate and fellow cryptocurrency enthusiast Peter Thiel, Musk has little use for political proclamations. His actions speak for themselves. They reveal an individual convinced of his right to shape the fate of the world – not primarily through his wealth, but through his membership of a ‘cognitive aristocracy’, an elect few more intelligent, more knowledgeable and more perceptive than the rest.

Here we enter the phantasmagorical world of the comic-book capitalists, who often use their vast wealth to realize their teenage fantasies. Relevant to this dreamland is the disproportionate influence, especially in the eighties, of Ayn Rand’s Atlas Shrugged (1957), in which the Russian exilée describes ‘a dystopian United States in which private businesses suffer under increasingly burdensome laws and regulations’, plus the resistance of some heroic capitalists who eventually migrate and establish a free society elsewhere (a notable super-fan of this extremely dull book was Alan Greenspan).

The 2008 crisis dealt a blow to the partisans of Rand’s rational egoism (Greenspan himself ultimately abjured it). But it was soon to be replaced by a new cult work entitled The Sovereign Individual: How to Survive and Thrive During the Collapse of the Welfare State (1997), co-written by James Dale Davidson, a financial consultant whose expertise lay in how to profit from catastrophes, and William Rees-Mogg (1928-2012), long-standing editor of The Times. A 2018 Guardian article summarized the book’s four main theses:

1) The democratic nation-state basically operates like a criminal cartel, forcing honest citizens to surrender large portions of their wealth to pay for stuff like roads and hospitals and schools.

2) The rise of the internet, and the advent of cryptocurrencies, will make it impossible for governments to intervene in private transactions and to tax incomes, thereby liberating individuals from the political protection racket of democracy.

3) The state will consequently become obsolete as a political entity.

4) Out of this wreckage will emerge a new global dispensation, in which a ‘cognitive elite’ will rise to power and influence, as a class of sovereign individuals ‘commanding vastly greater resources’ who will no longer be subject to the power of nation-states and will redesign governments to suit their ends.

Though written in 1997, the book is perfectly synchronized with the world of cryptocurrencies, created a decade later in the immediate aftermath of the financial crash. The Sovereign Individual found an early adherent in Thiel, a member of the so-called Paypal Mafia, the group of young entrepreneurs – including Musk – that launched Paypal in 1998 and subsequently spawned a whole host of companies: Reid Hoffman founded LinkedIn; Russel Simmons and Jeremy Stoppelman founded Yelp; Keith Rabois was an early investor in YouTube; Max Levchin became the CEO of Slide; Roelof Botha a partner at Sequoia Capital. With the exception of Musk, they all appear together in a famous photo published by Fortune in 2007, sitting in a bar, dressed as Italian-American gangsters.

Not all of this clique would become disciples of The Sovereign Individual: some continue to fund liberal causes and Democratic electoral candidates. Yet the real division within the group is between the paladins of crypto and the others. Remember, bitcoin presented itself as a tool that could render the state superfluous as a guarantor of currency – undermining one of its two remaining monopolies (the other being the monopoly on legitimate violence). Bitcoin was a way of realizing Robert Nozick’s ultra-minimalist state in the economic and financial realm, well beyond even the most audacious Friedmannian vision, where the supply of money is entrusted to the market.

Even more radical in his political convictions is Thiel, who, as we learn in a recent article in the London Review of Books,

predicts the demise of the nation-state and the emergence of low or no tax libertarian communities in which the rich can finally emancipate themselves from ‘the exploitation of the capitalists by workers’, has long argued that blockchain and encryption technology – including cryptocurrencies such as bitcoin – has the potential to liberate citizens from the hold of the state by making it impossible for governments to expropriate wealth by means of inflation.

Thiel recently hired as Global Strategist for his investment fund the former Austrian chancellor Sebastian Kurz, a conservative politician increasingly gravitating towards the extreme libertarian right. Thiel has also become a fervid exponent of the ‘Dark Enlightenment’, the new philosophy embraced by the alt-right and by some Trumpians (Thiel was one of Trump’s earliest financiers), which proposes the creation of a neo-feudal system governed by a small set of cognitively superior elites.

These patricians cloak themselves in the noblest of robes: those of meritocracy. After all, who would be against the idea that whoever deserves more should obtain more? The problem is that this reasoning is always performed backwards, moving from consequences to causes; so-called meritocracy, far from arguing that rewards should be commensurate with merit, actually maintains the opposite. Possessing wealth is already incontrovertible proof of the fact that it’s deserved. The rich are rich because they deserve to be, and everyone else is the undeserving poor. Musk is the living apologue of this principle, its celebrity incarnation. Yet, precisely for this reason, he doesn’t need to express radical positions like his ex-partner Thiel. The concept of cognitive feudalism is irrelevant for him, since he can simply exercise such tyranny over his employees. Rather than flaunting his radicalism, he puts it into practice. He doesn’t gloat about cryptocurrencies’ ideological virtues; he merely uses them to inflate the valuation of his companies. As Nobel laureate Wole Soyinka wrote in his stinging critique of négritude: ‘a tiger does not proclaim his tigritude, he pounces’.

Yet the limits of this approach are plain to see. Tesla’s market performance mirrors that of cryptocurrencies with an astonishing similarity (Tesla’s collapse from $1.2 trillion to $662 billion since last November coincides with the recent crypto crash). The end of quantitative easing and the monetary tightening that central banks will implement to check inflation will precipitate the collapse of overvalued firms and Ponzi schemes of all types. At this point, capitalism will have to find itself some other heroes (or some other comics).

P.S. If the collapse of bitcoin was one good story this spring, there was also another. Last May, it was as if the World Economic Forum in Davos didn’t even take place; nobody paid it the slightest attention, and it hardly appeared in any news report. Before the pandemic, Davos seemed like the yearly reunion of the masters of the universe. Its sumptuous choreography suggested that movie stars and heads of state were visiting the Alpine ski resort, rather than capital’s bureaucrats and paper pushers. By contrast, this new sobriety is a breath of fresh air. Meagre consolation in the face of the war, perhaps, but still a small glimmer of hope.

Translated by Francesco Anselmetti.

Read on: George Cataphores, ‘The Imperious Austrian’, NLR I/205.


Gary’s Inferno

There is only one thing you need to know about American democracy: it does not exist. Using data collected on 1,779 policy issues between 1981 and 2002 – well before Citizens United v. FEC made corruption First Amendment-protected speech – a 2014 study by two professors of political science at Princeton concluded that ‘the preferences of the average American appear to have only a minuscule, near-zero, statistically non-significant impact upon public policy’. This finding has significant downstream consequences for all aspects of American life, not least because it is largely unacknowledged, even as its effects, to varying degrees of injuriousness and fatality, are felt by 99% of the population.

The sovereignty of the ‘average American’ is outbid by economic elites as a matter of course, not excluding those supposedly divisive issues (e.g., abortion, gun control, single-payer healthcare) on which there is in fact bipartisan consensus in the electorate. But that does not mean he has become depoliticized. On the contrary, he is more politically engaged than ever before – only his political activities consist in impotently watching cable news, posting about it on social media, arguing with family and friends during the holidays, decking out his lawn and bookshelf with totemic merch and PayPaling donations to whatever politician or cause has most recently shoved its cup into his diminished span of attention. Yes, he sometimes goes to rallies, protests, city council meetings, or even the ballot box, but these activities, whatever he may believe, have become ends in themselves. ‘Politics is downstream from culture’, one of the more astute ghouls recently uncorked in America liked to say, and it is true that the country’s myriad cultural pathologies give its politics their particularly rancid flavour. But the real takeaway from the Princeton study is that, in the daily experience of the average American, politics is culture, culture is politics, and – with one class of exceptions – never the twain shall meet.

Best known for his tenure as the sharp-tongued, hard-to-impress art critic at the Village Voice during the last gasp of the counterculture, Gary Indiana grasps that insofar as the US has become a ‘televised democracy’, it may, like any other aesthetic phenomenon, be reviewed. Now 72, Indiana – born Gary Hoisington in the ‘factory world’ of Derry, New Hampshire in 1950 – has enjoyed an accelerating renaissance since the 2015 publication of I Can Give You Anything But Love. The utterly unsentimental ‘anti-memoir’ touches on his time in Berkeley in the late 60s, the LA punk scene of the 70s, and the downtown art world of the late 80s hollowed out by financialization and decimated by AIDS and the ‘depraved indifference’ of the Reagan and Bush administrations. Semiotext(e) and Seven Stories Press reprints of his collected Voice columns (1985-1988), his early novels Horse Crazy (1989) and Gone Tomorrow (1993), as well as his true crime trilogy Resentment (1997), Three Month Fever (1999) and Depraved Indifference (2002), have followed and been greeted with increasing interest. American readers born in the 80s, in particular, have been drawn not just to his nuclear-grade pithiness, his brazenly queer and bohemian narrative persona and his marriage of the techniques of American New Journalism and French avant-garde fiction, but also to the refreshing absence, rare among members of his generation, of nostalgia and apologetics in his accounts of the political events that have formed the pre-history of their lives.

Fire Season, an eclectic new selection of thirty-nine essays from 1984 to 2021, spans my own lifetime, give or take a few months on either end. It makes a compelling case that the window on American democracy closed sometime before I became a teenager: between Bill Clinton’s surprising second-place finish in the New Hampshire primary on 18 February 1992 and the opening of the assisted suicide trial of Dr. Jack Kevorkian on 20 April 1994. In that period, Indiana filed five pieces for the Voice – ‘Northern Exposures’, ‘Disneyland Burns’, ‘Town of the Living Dead’, ‘LA Plays Itself’ and ‘Tough Love and Carbon Monoxide in Detroit’ – that deserve to be regarded as classics of cultural reportage and travel writing. When paired with the more recent art, film and book reviews collected in Fire Season, they connect, as Christian Lorentzen writes in his introduction, ‘the twentieth and twenty-first centuries in ways readers and critics are only beginning to apprehend’.

In ‘Northern Exposures’, Indiana returns to the towns of the Granite State where he grew up to sit in reconverted porn movie houses and rooms that look like furniture showcases and Anglophile prep school auditoria alongside the ‘blue rinse jobs with ropes of synthetic pearls’, ‘Alan Alda types’ or else the ‘dewlapped, earnest preppies’ who have shown up to hear the mercenary, delusional or deranged pitches of half-a-dozen Presidential hopefuls (five Democratic, one Republican). Throughout Fire Season, Indiana shows himself to be a landscapist worthy of Bosch and a portraitist worthy of Francis Bacon: he paints with a rich palette of displeasures whose pigments range from the scatological to the refined. What makes the candidates and the people who will vote for them symbiotic is not only that they are grotesque, hideous, odious, scabrous, tacky – to use some of Indiana’s favourite epithets – nor simply – with the exception of Pat ‘Caliban’ Buchanan, who is ‘tediously, exactly’ the frightening bigot and sexual hysteric he appears to be – that they are fake. It is that, like pink urinal cakes in a football stadium bathroom, their half-hearted attempts at concealment have only made everything smell worse. Critics are often praised for their visual abilities, and Indiana’s eye for the revealing detail is second to none, but the moral sense is in the nostrils, and New Hampshire reeks of something that has ‘crawled up in you and died’.

New Hampshire’s vices are those of the nation: self-pity (they ‘regard themselves as the only true victims of history’), buck-passing (they ‘admit nothing’ and ‘blame everybody’) and provincialism (they refuse ‘to learn from the larger world’). ‘Of course people are “hurting”’, Indiana writes, giving the campaign cliché the scare quotes it deserves – ‘you usually do hurt after shooting yourself in the foot’. But to shoot oneself in the foot is still a form of agency. If only someone would inject the readership of the Union Leader with a journalistic cocktail of rabies vaccine and truth serum, the logic goes, the residents of this ‘backwards but perhaps not entirely hopeless state’ might start to acknowledge the ‘bad choices’ they have made, replace the ‘bad leaders’ they have put into power and actually fix problems that do not fail to repeat themselves, no matter how bitterly they are complained about.

The degree of civic optimism this presupposes, however minuscule, is no longer in evidence when Indiana flies to Paris a few months later to visit the Euro Disney resort in Marne-la-Vallée. An ‘obvious expression of cultural imperialism’, he writes in ‘Disneyland Burns’, Euro Disney is – no less than New Hampshire – a microcosm of the nation that made it, and not just in the literal sense that you can find Carnegie Deli and Big Bob’s Country Western Saloon there, or even in the sense that Mickey, Minnie et al are ‘genuine American archetypes’. The theme park is a merger of state and corporation, policed by a private security force backed up by gendarmes and staffed by ruthlessly exploited labourers; the ‘superficially varied’ architectural styles of its six ‘lands’ ‘articulate…a mode in which any escape from cliché has become impossible’ and ‘presume a universe in which human beings no longer have any minds at all’. Indiana estimates that it would take two hours to make a complete tour of the park, but the average visit requires an outlay of three vacation days because you will spend most of your hard-earned leisure time waiting in line for rides and restaurants while being bombarded with advertisements telling you and hundreds of others how great the experience you’re not having is. Like the off-brand version in Branson, Missouri he will later describe in his essay ‘Town of the Living Dead’, what Euro Disney offers visitors is the slow suicide of this ‘alienated duration’, mort à crédit. (Later, making mental notes for a piece on the artist Barbara Kruger, he will say, apropos of Walmart: ‘You can get anything you want at Walmart. The fact that you want it means you are already dead.’) Indiana is not the first to compare Disney’s franchises to concentration camps, but what makes them red-white-and-blue is that you have to pay to get in.

On this description, American culture is something more sinister than merely the means by which political power obfuscates its workings; it is the soft adjunct of a killing machine that reaches its telos in the carbon monoxide pumped through a rubber hose and into the mask held over your face by kindly Dr. Kevorkian. Just as Disney promises the ‘time of your life’ only to deliver dead time, Kevorkian’s ‘Kmart kind of suicide for a democratic era’ promises a dignified death, only to ‘surrender the last remaining mystery to faceless consumerism’. These days, the importance of aesthetic concerns is routinely downgraded in favour of more obviously political ones, but in a country where, as he writes in the Kruger essay, ‘democracy = capitalism = demolition of utopia’, matters of taste and tastelessness are far from irrelevant to the question: how should we live?

If it is to be objected that taking Disney and Kevorkian as the alpha and omega of US culture is to cherry-pick from the bottom of the barrel, Indiana does not let its putatively higher precincts off the hook either. The American literary world will later be excoriated for its culture of ‘careerism’ and ‘fatuous self-promotion’ in his introduction to the French transgressive novelist Pierre Guyotat’s memoir Coma. Behind the ‘costume of authenticity’ worn by the first-person narrators of today’s ‘bourgeois literary writing’ – whether memoir, food writing or autofiction à l’américaine – ‘lies the mercantile understanding that a manufactured self is another dead object of consumption…a “self” that constructs and sells itself by selecting promotional items from a grotesque menu of prefabricated parts’. To write such a book is to indulge in the provincialism of personal identity; to read one is to be given yet another anaesthetizing hit of ‘cultural morphine’.

Meanwhile, some thirty miles north of the Disney mother ship in Anaheim, the only genuinely democratic event that occurs in a society where there is no means of peacefully translating popular will into public policy took place. Following the acquittal in the state trial of four LAPD officers charged with the use of excessive force in the beating of Rodney King, thousands of people staged a massive six-day insurrection (or if we must, ‘riot’) in South Central Los Angeles and Koreatown, which led to over 60 deaths, 2,000 injuries, 12,000 arrests, and $1 billion of property damage. Covering the federal trial a year later in ‘LA Plays Itself’, Indiana does not fail to note that the father of Laurence Powell – the officer with the ‘put-upon, porcine expression of a slow-witted high school bully’ who broke King’s leg with his baton – ‘usually sports one of three differently coloured Mickey Mouse ties’. Nor does he fail to observe that although every politician, including Bill Clinton, who could get in front of a TV camera between 29 April and 4 May 1992 described the insurrection as a ‘wake-up call’, the underlying juridical and socio-economic conditions – the analogy Indiana reaches for is apartheid – that led to it hadn’t ‘changed an iota’ since. What had changed? Gun sales. Police budgets. The ‘heavier application of cosmetics to a festering wound’.

‘The model really appears to be the old patronizing thing, corporations coming down, helping out, chipping in a little bit, rather than long-term stimulus’, LA Weekly reporter Ruben Martinez tells Indiana. ‘Given that the economic outlook is still piss-poor, and that that’s what set people so much on edge, how can you think there’s not going to be another riot eventually, whether it’s after the trial or some other occasion?’ Fast forward three decades, pausing the tape in Ferguson, Baltimore: the cover of Fire Season features a painting by Sam McKinniss, one of the artists reviewed in it, of an NYPD squad car that got torched during the uprising that followed the murder of George Floyd by police officers in Minneapolis, while then-Presidential candidate Joe Biden – who helped author the 1994 legislation that would give the US world history’s largest gulag – was in Philadelphia calling the murder a ‘wake-up call’.

‘Do I honestly “believe” in democracy?’ Indiana asks himself in his Obama-era travelogue ‘Romanian Notes’, channel hopping between coverage of Tahrir Square and James Clapper’s testimony to the House Intelligence Committee about PRISM, a surveillance program so extensive it would have made the Securitate, Ceaușescu’s secret police, turn green with envy. The trap for a critic of such unrelenting negations is cynicism; however close he edges to despair, Indiana does not fall into it. Cynicism, after all, is just the flip side of the coin of ingenuousness, a sign that one has lost the ability to make distinctions. Although Indiana knows that democracy is ‘irrelevant’ to the people who run the US and a ‘joke to the people who own it’, he also knows that when you look into the abyss, the abyss looks into you. For critique – of an artwork, of a society – to be meaningful it must be undertaken, at least implicitly, in the name of a preferred alternative. When his wallet is stolen by a Bucharest taxi driver in a companion essay, ‘Weiner’s Dong, And Other Products of the Perfected Civilization’, Indiana is surprised and indignant to learn how many details of his personal life the customer service representative at Chase is able to access based on ‘publicly accessible information’. Surprise and indignation are emotions you are capable of feeling only if you believe things ought to be otherwise.

Fire Season is a vision of hell, but just as every Inferno must have its gradations of offence, it ought to have a place in it for virtuous pagans. The book’s heroes are, first of all, the reporters: Alisa Solomon of the Village Voice, Ed Leibowitz and Ruben Martinez of LA Weekly, Masha Gessen, Anna Politkovskaya. What earns them deserved praise is, quite simply, that they tell the truth. Truth is a much-abused concept in our time; what Indiana means by it is less our gamified sense of fact-checking than an ethos of candour: to ask questions others will not, yes; to point out inconsistencies and outright lies, yes; but also to refrain from omitting ‘complicating facts, mitigating causalities’ from one’s account, or from larding it with ‘exaggerations’. It was for her candour about the Second Chechen War that Politkovskaya was shot four times, once in the head, in a contract-style killing whose timing suggests that it doubled as an obscene birthday present for Vladimir Putin, Ramzan Kadyrov, or both. ‘There is no need for the truth anymore’, a Chechen war widow says in a documentary Indiana watches in ‘I Did Not Know Anna Politkovskaya’, a reflection on the art and function of journalism, the profession for which she gave her life. ‘That is why they killed her’. The widow is right in one sense – despite what Politkovskaya uncovered in Grozny two decades ago, Putin and Kadyrov are, as I write, savagely burnishing their résumés as war criminals with the blood of Ukrainian civilians. But we also need the truth, and always will, because it is essential to human dignity. Without it, we are already dead.

The book’s other heroes are its artists (Louise Bourgeois, Tracey Emin, Kruger, McKinniss, Andy Warhol), its filmmakers (Robert Bresson, Luis Buñuel, Pier Paolo Pasolini, Jean-Pierre Melville, Barbet Schroeder) and its many writers (Renata Adler, Samuel Beckett, Anya von Bremzen, Jean Echenoz, Guyotat, Jean-Patrick Manchette, Mary McCarthy, Paul Scheerbart, Unica Zürn). What by and large unites this somewhat disparate group is a certain sensibility: a fatalism that does not lead to resignation, a stoicism that does not preclude sympathy, an ironism that is tolerant of human folly, a tragicomic sense of life that is born of unperformed familiarity with grief, psychic extremity or violence. It is a sensibility that Indiana, as a reviewer of their work, shares. As with his reportage and his true crime novels, the idiom of his criticism is as much at home in the gutter as it is in the firmament; it is informed by the conviction that these are the zones of lived and aesthetic experience, however painful they may be to occupy, where something of value can be wrested from a cruel and cretinizing late capitalist social order.

It is noteworthy – but should come as no surprise – how few of the names on the above list were born in the US and how little of the work for which they are best known was done during the years 1984-2021. This sensibility may be diametrically opposed to the one laid out in ‘Northern Exposures’, yet in a way it is also a kind of shooting oneself in the foot. Over and over again, Indiana compares the experience of such art to a symbolic mutilation: Guyotat ‘spoils the flavour of bourgeois literary writing’ and ‘exposes [the class system’s] corruption of feeling’; Bresson ‘ruins one’s taste for mediocrity’, like a cigarette put out on the tongue; Kruger’s work is ‘the ruin of certain smug and reassuring representations, the defacement of delusion’, such as the one about living in a democratic society. The function of good art – and by extension criticism practiced as an art – is to render its audience unfit to serve as an extraction site for the cultural killing machine. Fire season, as I’m sure you’ve noticed, is every day now. This is what we’ll need if we’re going to survive.

Read on: Alexander Cockburn, ‘Dispatches’, NLR 76.



In September 2020 Sir Geoffrey Nice announced the creation of the Uyghur Tribunal to ‘investigate China’s alleged Genocide and Crimes against Humanity, against Uyghurs, Kazakh and other Turkic Muslim populations’. On 23 March last year, 17 British MPs signed a parliamentary motion condemning the ‘Atrocities against the Uyghurs in Xinjiang’. On 6 May, the House Foreign Affairs Committee held a hearing entitled ‘The Atrocities Against Uyghurs and Other Minorities in Xinjiang’. Between October and December, ‘atrocities’, ‘genocide’ and ‘crimes against humanity’ filled the pages of the international press from the Guardian to Turkish dailies to Ha’aretz. On 20 January this year, a majority in the French parliament ‘officially recognised the violence perpetrated by the People’s Republic of China against the Uyghurs as constituting crimes against humanity and genocide’.

The words ‘atrocities’, ‘massacres’, ‘genocide’, ‘ethnic cleansing’, ‘torture’ and ‘crimes against humanity’ are used interchangeably in such denunciations. In other cases, reference is often made to ‘war crimes’. These terms have become so embedded in the news cycle that they scarcely induce any reaction at all. Their routine inflation weakens their capacity to appal, to stir, even to prompt reflection.

We rarely pause to consider that up until the end of the nineteenth century, such categories were alien to political discourse. They were exceedingly rare objects of moral indignation (see, for instance, Bartolomé de las Casas on the massacre of the indios), which had not yet solidified into justifications for political or military intervention. Nobody had ever been convicted of ‘war crimes’. Acts committed during war were never considered more culpable than the war itself. Vanquished enemies were enslaved or deported, but they weren’t cast as criminals; defeat – and everything it implied – was punishment enough.  

The substantive difference between ‘war crimes’ and ‘atrocities’ is that the former are tried and convicted once a war is over, as a sanction against the defeated party and legitimation for the victors. ‘Atrocities’, on the other hand, are more often mobilized in the interest of waging war; they are one method by which modernity constructs an enemy. The same act can be defined as an ‘atrocity’ before a war is declared and a ‘war crime’ once a war is over. The Uyghurs are certainly persecuted and oppressed by the Chinese state, but the persistent use of ‘atrocities’ by the Western security establishment is a semantic escalation that signals a political transition: away from peaceful diplomacy, towards New Cold War confrontation.

Before the Enlightenment, jurists used the word ‘atrocity’ when discussing punishment – though never with critical intentions, Foucault tells us, for the atrocity of the supplice (torture, quartering) was seen as proportionate to the atrocity of the crime, a formula which laid bare a specific conception of power: ‘a power exalted and strengthened by its visible manifestations…for which disobedience was an act of hostility…a power that had to demonstrate not why it enforced its laws, but who were its enemies, and what unleashing of force threatened them’. It was to eliminate this form of punishment that the Enlightenment introduced a new conception of atrocity, to substitute forms of retribution ‘not in the least ashamed of being “atrocious”’ with ‘punishment that was to claim the honour of being “humane”’. Atrocity was ‘the exacerbation of punishment in relation to the crime’, a novel surplus, irreducible to the accounting of crime and retribution, an excess in relation to the existing economy of infringement.

Another century was needed, however, for the atrocity to acquire its political definition. The first deployment – to my knowledge – of the term ‘atrocity’ by a Western statesman (who, in fact, used it as evidence of a ‘just’ cause for a possible war) was in an invective Gladstone addressed to the Ottomans in 1896: ‘…this is not the first time we have been discussing horrible outrages perpetrated in Turkey, and perpetrated not by Mohammedan fanaticism, but by the deliberate policy of a Government. The very same thing happened in 1876’, but the Sultan’s government, ‘declared that there were no atrocities, no crimes committed by Turks or by the agents of the Government’. Should the Sultan continue to commit these crimes and massacres, Gladstone concluded, ‘England … should take into consideration means of enforcing, if force alone is available, compliance with her just, legal, and human demand.’

It wasn’t just any politician, then, who inaugurated the discourse of ‘atrocities’. For over forty years (1852-94), Gladstone dominated British politics (he was Prime Minister for 13 years, Chancellor for another 13, and leader of the House of Commons for 9). It was Gladstone, above all, who invented humanitarian imperialism, or ‘liberal imperialism’ as it was then known, whose heyday would come in the American Century.

Why is it that atrocities hardly appear as an issue in the fifty preceding centuries? Because atrocities were taken for granted. It was common knowledge that power kills, tortures, sweeps away. Nobody threatened to wage war on Charles V for the ‘atrocities’ committed by the Landsknechte when sacking Rome (1527). Before the second half of the twentieth century, the United States never even questioned the genocide of Native Americans, whose victims numbered in the tens of millions.

Nowadays, atrocities mark the limit of acceptable or legitimate violence. Indignation towards them has become a key part of political etiquette – a means of demonstrating one’s respect for the rules of war, just as one would display one’s manners in a distinguished salon. Like all etiquette, this involves a great deal of hypocrisy. Outrage at ‘excessive violence’ serves to soften or conceal the ubiquitous violence described by Nietzsche, whereby humans inflict harm simply because they can. Exhibiting concern for atrocities is a means of civilizing the struggle for global power, as if a more ennobled form could somehow change its content. This discourse has the effect of reintroducing an ideological element to war that had been largely absent since the Peace of Westphalia (1648). Peter the Great of Russia fought the Swedish king Charles XII not for ideological reasons, not for civilization, not for good to prevail over evil or to put an end to any genocide, holocaust or massacre, but simply and purely to accrue more power.

The principle according to which only just wars should be fought, or, better still – that a war should first be ‘rendered just’ before it is waged – is a somewhat bizarre and thoroughly modern idea, rooted in the confluence of three long-term tendencies. The first is the Reformation, with its demand for a redemptive motive for every human action (even profit-making). The second is colonialism, and the notion that wars against the colonized served to civilize them (what Kipling famously called ‘the white man’s burden’). The third is the emergence of public opinion. For it is before this audience that atrocities are paraded in order to justify moving against a constructed enemy (the absence of ‘public opinion’ is another reason why the issue had never been raised in preceding millennia). The atrocity must create a scandal, otherwise it’s ineffective. From this perspective, the politics of atrocities is a symptom of mass communication: first newspapers, then radio and TV, now social media.

In the 1890s, Britain witnessed the triumph of popular newspapers: from 1854 to 1899 the number of dailies in London grew from 5 to 155. Millions of readers were suddenly distressed by stories of atrocities taking place in exotic countries; dark skins, naked bodies, violence. It’s no coincidence that the first great revelations of this type came from the Congo, then the Amazon: atrocities against the ‘savages’. In 1885, the Berlin Conference assigned the Congo to the Association Internationale Africaine, an ante litteram NGO – or ‘humanitarian’ association – which had once employed the famous American explorer Henry Morton Stanley (‘Dr Livingstone, I presume?’), and was controlled by the Belgian King Leopold II. This stretch of property measuring 2.6 million km2 was intended to resolve the rivalry between two major colonial powers in Africa, Britain and France (the birth of Belgium in 1830 was a consequence of the defeat of Napoleon, cutting off France’s northeastern provinces, which were integrated into Belgian Wallonia). It’s not surprising, then, that the campaign against the Belgians’ atrocities in the Congo flared up at exactly the moment Gladstone delivered his speech, nor that it was fanned by the Anglophone press.

When the British government commissioned the Irish diplomat Roger Casement to write a report on the matter, completed in 1904, the document served to establish the rhetorical canon for all future reports on atrocities: accounts of genocide, famine, forced labour, imprisonment, torture, rape, mutilation. One particular episode, reinforced by photographs, struck contemporaries’ imagination: the hands of dead enemies were amputated, so that local conscripts to the Force Publique (the Congo’s military police) could prove that they had really used their bullets, rather than pocketing them. The report enjoyed a global reception, thanks also to Mark Twain’s King Leopold’s Soliloquy (1905) and Arthur Conan Doyle’s The Crime of the Congo (1909). As a result, in 1908 the Congo was transferred from Leopold’s private holdings to public ownership by the Belgian state.

Casement also wrote the second great report on atrocities: those committed in the Peruvian Amazon’s Putumayo region, where he was sent to investigate in 1910-11, as the company that held the right to exploit the area’s rubber was registered in London and hired Barbadian labour – that is, British subjects. Here too the report documented mistreatment, malnutrition, forced labour, rape, murder, amputation and torture. In 1911, Casement was knighted for his findings. On the extraordinary life of this figure who went from international human rights superstar avant la lettre to concluding his earthly sojourn on the gallows at Pentonville, two texts are worth reading: Colm Tóibín’s ‘Roger Casement: Sex, Lies and the Black Diaries’ (2004) and Mario Vargas Llosa’s The Dream of the Celt (2012).

A great contributor to the dissemination of Casement’s report on Putumayo was the then British ambassador in Washington, Lord James Bryce, noted in the United States for his book The American Commonwealth (with its lapidary verdict: ‘in Latin America, whoever is not black is white, in Teutonic America, whoever is not white is black’). With the advent of the First World War, it was Bryce whom London would entrust in 1915 with the task of compiling a report on German atrocities committed in Belgium. The Bryce Report collected testimonies of various ‘outrages’ perpetrated by German soldiers, but global public opinion was particularly inflamed by one specific case, so much so that it would be cited by states that had until then remained neutral – Italy, the US – as justification for entering the war. The report stated that ‘a third kind of mutilation, the cutting of one or both hands, is said to have taken place frequently’. A form of historical lex talionis: a decade earlier, photos had circulated showing the Force Publique practising the very same punishment, now inflicted upon the Belgians who had introduced it.

The truth is, even if the Germans committed countless ‘outrages’, these specific accusations eventually proved unfounded, though they were still taught in French elementary schools in the 1930s. This leads us to the problems involved in campaigns against atrocities. For one, do they correspond to reality, or do they bend it to their advantage, in order to make the bad guys look a little worse? What’s more, not all atrocities become an object of scandal. Despite its similarity to the very worst of the Putumayo incursions, the British colonists’ hunting of Aboriginal people in Tasmania never generated comparable clamour.

Finally, efficacy. Sometimes scandals provoked by atrocities prove to be sharp tools. King Leopold’s crimes forced the Belgian state to take the reins of sovereignty in Congo; German atrocities in Belgium facilitated the entry of neutral powers into the war; the Nanjing massacre of December 1937 prepared American public opinion for war against Japan; the My Lai massacre in March 1968 accelerated Americans’ revolt against the Vietnam War; the atrocities at Srebrenica in the summer of 1995 built the foundations of the anti-Serbianism that precipitated intervention in Kosovo in 1999.

Yet there are just as many incidents that produce no such results: after the Putumayo ‘scandal’, Peru wasn’t sanctioned, and indigenous Amazonians continued to be oppressed, if more discreetly. In Rwanda, after the massacres of 1994 everything was forgotten. In such cases, the horrors registered in photographs and documentaries were initially transformed into a sort of monster by which humanity was transfixed, eager yet impotent to fathom the sheer quantity of evil for which it was responsible. The scandal became an occasion for contemplating the heart of darkness inside each of us. But it also induced an inurement to horror. An unintended consequence of the proliferation of campaigns against atrocities has been a kind of mithridatisation, in which we all become peaceful cohabitants with monstrosity.

Translated by Francesco Anselmetti.

Read on: Tor Krever, ‘Dispensing Global Justice’, NLR 85.


Rebel Maids

The Brazilian cartoon series Irmão do Jorel offers a cosy-satirical picture of family life, not unlike The Simpsons. Strikingly, however, there is a character with no homologue in the US series: the family’s maid, represented as a purple octopus – amorphous, voiceless, nameless, with eight arms ready to carry out any task requested of her. The Brazilian entertainment industry has many such images. From blackface turns on TV comedy shows to the maids who form a girl band in the hit telenovela Cheias de Charme, representations of domestic workers pervade the country’s cultural imagination.

Beyond the culture industry, paid domestic work is a daily reality for nearly 6 million Brazilian women, as well as for the millions of households that employ them. It is an occupation strongly marked by the sexual and racial division of labour: 92% of those employed in this category are female, and two-thirds of these women are black. Restricted to the private sphere, the actual experience of this work has been atomized and largely invisible.

Despite the prevailing silence in Brazil’s public sphere about the realities of domestic work, domestics played a leading role in the 100,000-strong March of Black Women against racism and violence in November 2015. In 2016, a rapper and former maid, Preta-Rara, shared some memories of her time in domestic service on Facebook and was flooded with responses from other domestic workers. The page she set up for them soon garnered thousands of personal stories, from multiple viewpoints – giving voice to the experiences of Brazilian domestic workers in a way that cold official statistics could never do.

In 2019 Preta-Rara – real name, Joyce Fernandes; preta rara translates as rare or precious black girl – produced a compilation of these social media accounts in a book, Eu, Empregada Doméstica (I, Domestic Servant) with the subtitle: ‘The Maid’s Room is the Modern Slave Quarters’. It opens with the story of her grandmother, Noêmia, who began work as a maid at the age of fourteen. Preta-Rara’s mother, Maria Helena, followed the same path, and tells her daughter of the lasting trauma left by never having been taught to read or write. (Preta-Rara herself later made it to college and has taught high-school history, in addition to her music – her first album, Audácia, appeared in 2015 – and establishing a major social-media presence.)

Though some of the stories collected in Eu, Empregada Doméstica recall humane employers, the structural situation of the work means that exploitation is standard. Many depict psychological humiliations: accusations of theft, sexual harassment, moral harassment, effective imprisonment, occupational diseases and chronic exhaustion. Women recall being sent off in their early teens to work in strangers’ houses. Younger children, accompanying their mother to work when there is no one to watch them at home, get mistreated by employers or bullied by their children. Domestic workers often make huge sacrifices to help their children, especially their daughters, avoid going through the same experience. Access to education is frequently seen as the key to change – sometimes provoking mockery and disbelief from their employers. Escape from exploitation and subordination requires enormous individual effort and resources.

Preta-Rara herself recalls the indignity of being forced to use the ‘service’ lift in an apartment block, and to climb eight flights of stairs when it was out of order, because maids were not allowed to use the ‘social’ elevator. A common response emerges, when domestic workers are pushed to their limit: ‘Never going back to that place again.’ It is a phrase that occurs over and over in the contributions, a series of one-woman strikes against an intolerable situation, now brought together by Preta-Rara in collective form.

Nancy Fraser has analysed the heightened contradictions of ‘capital and care’ under today’s form of financialized capitalism, as neoliberal pressures put a squeeze on essential forms of material and affective reproductive labour – birthing and raising children, maintaining households, sustaining personal and community relationships. She argues that every form of capitalist society harbours a deep-seated crisis tendency, as capital’s drive to unlimited accumulation – free-riding on the life world, as she puts it – tends to destabilize the reproductive processes that are indispensable to the perpetuation of society itself, without which there can be ‘no culture, no economy, no political organization’.

For Fraser, these contradictions take different forms in the core and on the peripheries of world capitalism, as also across successive eras or ‘regimes of accumulation’: 19th-century liberal imperialism and colonial extraction; mid-20th-century welfare-state Fordism and third-world developmentalism; 21st-century neoliberal globalization. Each, she has argued, produced its own asymmetrical fix for staving off the contradictions of capital and care: the ‘separate spheres’ gendering 19th-century bourgeois life, the expanded welfare provision and male breadwinner of Fordism, the two-earner families of neoliberal emancipation. Each fix in turn entered into crisis. The latest manifestation of this tendency in the US is the ‘crisis of care’ – time poverty, family-work balance – already attracting attention even before the reproductive catastrophe of the global Covid-19 pandemic.

Yet the Brazilian experience – and perhaps, more broadly, that of Latin America – alters this picture. The stories collected by Preta-Rara speak not of epochal ruptures in forms of reproductive labour, but of intergenerational continuities. ‘Almost all women in my family started their lives as domestic servants’, one woman wrote. ‘My grandmother was enslaved – because that’s the right word – from childhood. My mother started to work as a family’s nanny when she was a teenager. My aunt has asthma attacks brought on by excessive work with chemical-cleaning products’.

‘Breaking the cycle of misery to which we were subjected is an arduous task’, wrote another. ‘It means fighting against everything and everyone. My grandmother worked all her lifetime in the fields, my mother was a maid, and I followed in her footsteps. Going against all of this leaves scars, physical and on the soul.’ At stake here are historical continuities traceable back to slavery – the connection Preta-Rara underscores with her subtitle, identifying the maid’s room as the slave quarters. Some of the social media narrators use the colonial term sinhá – ‘madam’ – to refer to their employers. Another makes the same link: ‘I’m always thinking that, if the memory [of paid domestic work] hurts me, I can imagine it must have hurt my mother and my grandmother much more, because, even allegorically, they had to bear the “lash” so that we could eat bread.’

As noted by the Brazilian social scientist and activist Lélia Gonzalez, to understand the place of black women in Brazilian society today, we need to examine their role under slavery. Gonzalez – herself the daughter of a black maid – summarized the historical role of the black mucama: ‘It was her task to keep the master’s house running at all levels: washing, ironing, cooking, spinning, weaving, sewing, and nursing the children born from the “free” wombs of the little senhoras… And after the heavy work at the master’s house, she was also responsible for taking care of her own children, as well as helping her friends who had come from the plantations, etc., who were starving and exhausted.’ The Argentine anthropologist Rita Segato has emphasised the longue durée nature of this ‘transferred motherhood’ in Latin America, dating from the onset of colonialism. It has been naturalized over the centuries by serial cultural forms, predecessors of the purple octopus in Irmão do Jorel.

The developmentalist era in Brazil brought many changes, but – pace Fraser – the underlying role of black women in social reproduction continued throughout. If anything, young girls were dispatched from the interior in greater numbers to work as maids in the booming cities. The social media stories illustrate this process well: ‘My mother comes from a tiny hinterland village and was sent to the capital to work at the age of thirteen’ is a typical beginning. This ‘national care chain’ – the internal migratory flow of girls and women from the Brazilian backlands to the cities, which peaked at the height of the ‘rural exodus’ of the 1960s-80s – has its equivalent in the ‘global care chain’ of which Fraser and others also write: the pull of globalized, financialized capitalism inducing the emigration of racialized women from poor countries to undertake social-reproductive work in rich countries where, with the onset of the long downturn and the collapse of the ‘male breadwinner model’, women were heading into waged white-collar work.

Brazil is certainly part of the migratory flow of the global care chain, in keeping with its middle-ranking position in the world economy. Immigrant domestic workers there include Bolivian, Haitian, Venezuelan and Filipina women, whose migratory condition intersects with hierarchies of race, class and gender. Brazilian women, on the other hand, mainly emigrate to the Global North, especially the United States and Western Europe.

How to explain the continuities in Brazil’s social-reproductive order, compared to the successive regimes that Fraser analyses? Here it may be helpful to draw upon the notion of colonialidad developed by the Peruvian world-systems theorist Aníbal Quijano, who pointed out that ruling classes in early 19th-century Latin America battled to prevent the decolonization of their societies even as they fought for independent states. Through this dynamic, the ‘coloniality of power’ was incorporated into the state-formation process itself. The sexual and racial division of paid domestic labour, and its historical continuity with practices dating from the colonial and slavery periods, underlines the relationship between social-reproductive relations in Latin America and this foundational hierarchy. In this context, the ‘care gap’ is not a recent process. It is a dynamic inscribed in the very ‘coloniality of power’.

Thus, paid domestic work is both an expression and a perpetuation of the structural inequities within Brazilian society. Its availability at low cost to the Brazilian middle and upper classes lessens the potential pressure for welfare-state measures aimed at supporting social-reproduction activities – day-care centres, full-time education, community restaurants, community laundries and care centres for the elderly. As Rita Segato puts it in Crítica da Colonialidade em Oito Ensaios (2021), the continuity of women’s invisible low-paid work allows an ‘evasion of social-sector investment’.

It also dissipates tensions within middle- and upper-class families, alleviating women’s ‘double shift’ of domestic labour along with the demand that their partners and other family members do their share. As the American sociologist Patricia Hill Collins argued in Black Feminist Thought (1990), many white families in the US historically maintained their class position in a similar way, by using black maids as cheap labour. At the same time, the delegation of domestic work tends to intensify racial and class inequities, accentuating the polarization between women, especially between domestic workers and their female employers.

Brazilian domestic workers have been particularly hard hit during the Covid pandemic, given their fragile or non-existent social protection. They were torn between continuing to work at high risk of infection or stopping work and losing their income. Nor were they given priority-worker status for the vaccine.

The aggravation of precarious social conditions suggests to some that we are moving forward towards the past. In Critique of Black Reason (2013), Achille Mbembe argued that the world is becoming nègre, as capitalism accentuates the exclusion, alienation and degradation of workers in general. From another perspective, the question of care provides a route to the future. For the Madrid collective Precarias a la Deriva, care should be a guiding principle in all political-economic considerations. Fraser argues that struggles over social reproduction – encompassing housing, healthcare, food security, migrants’ and workers’ rights, day care, elder care, paid parental leave – are ‘tantamount to the demand for a massive reorganization of the relations between production and reproduction’.

Care and social reproduction are also central to movements such as Quilombismo and Buen Vivir, which focus on social practices based on cooperation, solidarity and equality. Production and reproduction go hand in hand in these radically democratic projects. Everyday resistance takes place in multiple ways, even if it is just to say, ‘Never going back to that place again.’ Certainly, Brazilian society will never be emancipated unless domestic workers are emancipated too.

Read on: Guilherme Boulos, ‘Struggles of the Roofless’, NLR 130.


Belgian Sorrows

‘I shall not forget the evening I spent in the luxurious restaurant in the Grand Place… beyond its windows the Belgian people went about their usual lives: eating chocolates, crashing their fine cars and wondering whether Belgium was really a country at all’. So recounts the protagonist of Doctor Criminale (1992), Malcolm Bradbury’s satirical novel examining the state of Europe after the fall of the Berlin Wall. A decade earlier, the Belgian novelist Hugo Claus had given a more pessimistic prognosis for the future of Western Europe’s most unlikely country in The Sorrow of Belgium (1983), whose central characters are collaborationist Flemish nationalists, despotic nuns and degenerate policemen. For Claus, the country’s fractures were plain to see: communist militants were attacking pro-NATO businesses, while former military personnel went on rampages in Belgian supermarkets. Three years earlier, in fact, Walter Van Den Broeck had gone as far as writing a fully-fledged scenario of the break-up of Belgium in The Siege of Laken (1980), with soldiers occupying the Grand Place and the royal family seeking refuge in a forest chalet.

Bradbury’s Belgium is, of course, still here. In the 1990s the country split into three new regions (Flanders, Wallonia and Brussels-Capital), siphoning policymaking off to the regional level. By 2011 it had broken records for the longest government formation in modern history, resolved only by a grand coalition à la belge, which ended in a resounding separatist victory in the north for the New Flemish Alliance (N-VA). In the first lockdown season its death toll broke all comparative records; hospitals and care homes were overrun. Last month, regional governments announced a collective retightening of restrictions, with new mask mandates in public spaces and a French-style corona pass. Brussels has barely reached a vaccination rate of 60%, while Flanders and Wallonia head towards the OECD average. The pandemic has re-inflamed older regionalist tensions, with even some staunch Belgian unionists now contemplating the dissolution of the federal social security system – the crown jewel of Belgium’s industrial working class.

Despite all this infighting, an endearing picture of Belgium persists abroad. As The Economist noted earlier this year, Belgium is ‘the world’s most successful failed state’, with Belgians ‘almost as rich as Germans and better off than Britons or the French’. Health services are excellent, wages are high, asset prices never stop inflating. The abnormally high suicide rate and the regional inequality between Wallonia and Flanders aside, Belgians remain well educated, wealthy and secure (‘the country is at peace’, Tony Judt noted in a 1999 report for the New York Review of Books, ‘if not with itself then at least with everyone else’). More so than anywhere in Scandinavia, the country appears a sheltered, safe, social-democratic paradise.

Yet Belgians’ prosperity has hardly immunized them against a deep sense of disaffection and unease. Belgium’s traditional party democracy is imploding, state capacity is waning, and an emboldened far right is on the rise. In May, a fugitive military corporal named Jürgen Conings went on the run, hunting the prominent virologist Marc Van Ranst, who had previously advised the government on its lockdown policies. The underfunded Belgian military mounted a frantic search. Muslim parents withdrew their children from school; tanks cruised through the forests.

The soldier’s body was found in a woodland near his house in late June. He was first spotted by a ranger, who promptly sold snapshots of the corpse to international journalists. Later that day, the local liberal mayor noticed the stench on his weekend bike ride, recognising it from an earlier habit of digging up burial sites (few inquired further). Both macabre and surreal, the episode was Belgian to the core. It also proved a major ordeal for the new coalition government led by the liberal prime minister Alexander De Croo, who had replaced an interim predecessor ruling by decree since the start of the pandemic.

Tens of thousands expressed their support on Facebook for Conings’s vigilantism. Most turned out to be relatively wealthy exurban households, occupying a village society without villages. Desperately poor in the nineteenth century, the Flemish were hoisted into the new Fordist middle class by the 1960s and have remained there since. The last became the first in post-war Belgium, but the Flemings never overcame the trauma of a century of linguistic repression, and still suffer from an inferiority complex of Freudian proportions.

For a long time, the Christian Democrats (CD&V) presided over electoral fiefdoms in the Flemish countryside. Party, church and the local history society provided cohesion as the Flemish made their entry into modernity. But this ‘precious fabric’ – a favourite phrase of Bart De Wever, phlegmatic leader of the N-VA, lifted from Tory theorist Theodore Dalrymple – has now been eroded, eaten away by thirty years of consumerism and digitalisation. Flemings no longer look to their local clergyman for voting advice; a new outspoken citizen has leaped into the void without a party card, laptop at the ready. Flanders’ Christian Democrats still had an impressive 130,000 members in 1990; they now count a meagre 43,000 and are polling under 10%. In the same period, the Socialists plummeted from 90,000 to 10,000 members.

No party, however, has gone through a more unwieldy transformation than the Flemish Socialists. In the south, the francophone Socialist Party rules unencumbered, with a membership base larger than that of its French counterpart in a region of barely 4 million inhabitants. Its Flemish outfit is much smaller. In September 2020, its chairman Conner Rousseau announced a new name for the party: the slick-sounding Vooruit (Forward). Elected on a platform of modernisation, Rousseau has followed his namesake in stressing the necessity of direct democracy and of turning his party into a ‘network’. He has also managed to pacify the warring clans within the party. Primarily, though, Rousseau has combined vaguely patriotic appeals with loud lamentations about declining state capacity in the age of Covid.

The end result looks more like a Belgian Five Star Movement than the conservative Danish Social Democrats. At a recent party conference, Rousseau appeared behind a red curtain, his silhouette projected onto a large screen overhead. When the futuristic music stopped, Rousseau was supposed to appear dramatically, but he got stuck in the curtain; a belated smoke bomb went off as he entered the stage. ‘We’re back, bitches’ was his cry. The results of this gamble are unclear: Vooruit is polling only two percentage points higher, just behind the ailing Christian Democrats.

Belgian politics thus evinces a curious combination of turmoil and stasis. In many ways, the country’s political centre has fallen out: for the first time in Belgian history, the mainstream bloc dropped under 50% in the European Parliament elections. The party families that have classically populated the state since 1893 – the three ‘pillars’ of liberals, socialists and Christians – have now lost their joint majority. If anyone was looking for a model of ‘post-classical’ democracy, this is it. Belgium’s twentieth century is well and truly buried. In an interview with a Sunday paper, Christian Democrat leader Joachim Coens also pondered transforming his vehicle into a ‘party-network’ which would consult non-members and organize citizens’ assemblies. He mentioned Samen (Together) as a possible new name, making ‘Forward Together’ a conceivable coalition of the future.

Vooruit has come up with its own variant of what Christopher Bickerton and Carlo Invernizzi Accetti have termed ‘techno-populism’. On one side we have Rousseau, dancing with rappers on his TikTok account. On the other, we have Frank Vandenbroucke, federal minister of public health and luminary of the Socialist Party, who since 2020 has served as the nation’s father figure – the bearer of bad news for a nation eager to return to normal. A Trotskyist in his youth, Vandenbroucke went into government with the Socialists in the 1990s but left after a scandal over money for helicopter purchases, during which he was notoriously asked to burn the offending banknotes in a forest. He relocated to Oxford for a PhD with G. A. Cohen and Anthony Giddens, rebranding himself as a staunch social security reformer and supporter of workfare policies. Now, he has returned as the guardian of a more protective and pastoral state.

As in other European countries, protectionism has steadily grown across the spectrum, not least given mounting energy prices. At the request of EU authorities, Belgium finalized the liberalization of its energy markets shortly before the 2008 crash. This effectively created an oligopoly which forces users to switch providers every two months on the basis of confected deals. Facing a winter of price hikes, even the leader of the Flemish Christian Democrats came out in favour of renationalization, clarifying that he shared the communist line.

Condemned to cohabitation with former sister parties, Belgium’s traditional parties are alarmed above all by the rise of the Vlaams Belang (Flemish Interest), which now polls at a menacing 26%, pushing the separatist bloc over 50%. The party was originally formed in the late 1970s as a response to the Egmont pact, one of the country’s attempted compromises between secessionism and federalism. Its initial ambition was that of a separatist pressure group: it would win a plurality and push for a republican break. By the early 1990s, however, inspired by the social nativism of Jean-Marie Le Pen, it began to mutate into a broader anti-systemic force, turbocharging its rhetoric against immigrants and issuing open calls for their expulsion. In 1991, the party broke through its electoral ceiling on the so-called ‘Black Sunday’. In a country soon reeling from the Dutroux affair – in which a Walloon serial killer repeatedly eluded police, spawning the so-called ‘White Marches’ against a dysfunctional justice system – and slotted into the Maastricht order without a clear popular mandate, its message was bound to resonate.

Formerly known as the Vlaams Blok (‘Flemish Bloc’), the party was forced to rebrand in the early 2000s after it was found to have breached anti-racism laws. The new Flemish Interest caused trouble for the traditional parties for a few years but was subsequently surpassed by the more neoliberal nationalism of De Wever’s N-VA, which was able to capitalise on the communitarian stalemate after 2008. The N-VA, though, was inexperienced when it entered government in 2014. It came out battered and bruised, unable to deliver on its pro-market promises and hamstrung by the veto of southern socialists.

The Vlaams Belang has proven all too adept at exploiting this defeat. Unlike the parties of other hard right impresarios such as Viktor Orbán or Éric Zemmour, the Belang continues to have a solid and wide local base, complete with cafeterias and youth clubs, motor gangs and gymnastic outfits. Its young leader Tom Van Grieken presents himself as the far right’s ideal son-in-law. Since 2015, the party has managed to mobilise its older civil infrastructure for digital outreach: no party spends more on social media. The young, independent MP Dries Van Langenhove – not an official member but elected on the party’s ticket – runs a podcast where he urges followers to abstain from masturbation and remain fit, part of his long-term attempt to preserve the white race (Van Langenhove first achieved notoriety as a member of a far-right chat group at the University of Ghent, posting the usual photos with semi-automatics and frog memes).

Vlaams Belang has also proven effective at exploiting Flemish unease towards a newly assertive, primarily millennial anti-racism and ecologism. Above all, the Belang obsess over the former liberal Antwerp politician Sihame El Kaouakibi, who was accused of embezzling public funds for her private start-up last year. With its distinctly Belgian mode of ethnic brokerage politics, El Kaouakibi represents a new entrepreneurial elite with migrant roots. Another prominent target is the queer student Anuna De Wever (no relation), who has led several school strikes for the climate in the past two years.

Ethnic minorities like El Kaouakibi still face a wall of prejudice, and by any standard the nation is far behind when it comes to colonial self-examination. Visitors at music festivals still occasionally sing chants about ‘cutting hands in the Congo’. Some years ago, the refurbished Africa Museum – completed by Leopold II after he handed over his personal Congolese dominion to the Belgian state – held an Africa-themed gathering in its gardens where partygoers came clad in blackface, pith helmets and leopard skins. The museum later issued an apology; even the British Telegraph picked up on the scandal. Belgium’s ‘Great Awokening’ is, in every way, uneven and combined.

Some parameters have begun to shift, however. In April, a radio presenter at the mainstream MNM station was attacked with a vial of acid in a park in Antwerp. Some weeks before, she had featured in a general-interest television programme in which she showed a photograph of her grandfather. The man was later revealed to be Gerard Soete, one of the mercenaries charged with disposing of the captured independence leader Patrice Lumumba, killed on the joint orders of the Belgian crown and the CIA in 1961. In the late 1990s he had been approached by Belgian journalists to talk about his involvement in the affair. During an infamous interview, he opened one of the bureaus in his Brussels apartment and took out a box filled with ivory-white teeth – ostensible remains of the acid treatment applied to Lumumba’s disfigured body (the author Ludo De Witte reached the same conclusion in his 1999 book The Assassination of Lumumba). In January 2016, a tooth purportedly belonging to Lumumba was confiscated at the residence of Soete’s daughter. His granddaughter said she knew next to nothing of her grandfather’s previous life.

Leopold’s ghosts clearly still haunt the national patrimony. Several of his statues have been defaced in the past year, and the current king recently issued a carefully worded apology to the Congo for the atrocities committed during Leopold’s reign. In a country that never experienced a significant influx of post-colonial immigration, such activism is bound to remain minoritarian – and, more dangerously, unlikely to marshal votes. Looking forward to 2024, De Wever has stated his willingness to move beyond constitutional niceties. ‘I hardly believe’, he declared in an interview with a local newspaper, ‘that anything can be done in a legalistic way anymore. The country is completely jammed’. Instead, he invoked the need for ‘a new coup, a new Loppem moment’ – referring to the moment after the First World War when the Belgian king convoked socialist and liberal party leaders in the village of Loppem, without consulting the Catholics. The meeting ratified full male suffrage, new social security mechanisms and the promise of a Dutch-speaking university.

De Wever’s bombast is more a sign of desperation than fortitude. Above all, he finds himself outflanked by a resurgent far right, capitalizing on his broken separatist and nativist promises. In the centre, Flemish employers’ associations are dissatisfied with his failure to introduce regressive cuts to unemployment benefits and social services. Socially liberal voters uneasy with the party’s intensifying anti-immigrant politics, meanwhile, are deserting the N-VA for the Greens – the party left government in December 2018, after all, in protest against the signing of a UN-backed global migration agreement, the so-called ‘Marrakech pact’. The growing communist PVDA/PTB – the last unionist party in the country – has meanwhile issued its own alternative programme for 2024, clearly distinguishing its unionism from ‘the “Belgique à papa” which sent children to work in the mines – the Belgium of Leopold II and of colonisation, of the Société Générale, of collaboration and discrimination’.

It is unlikely that De Wever can push for a Catalan-style conflagration in 2024. There is no haughty federal authority which would dispatch troops to Antwerp, as Madrid did with Barcelona. And the constitution’s parity requirements between Flemings and Walloons make a unilateral declaration of independence extremely risky.

It has become a commonplace to describe Belgium as an ‘impossible’ nation, a state living on borrowed time. Destined for Kleinstaaterei or upward absorption into federalist Europe, the country appears as little more than a relic of bourgeois Europe – an ‘accident of nineteenth-century history’, as De Wever likes to say. Many Belgian professionals share a quaint enthusiasm for this Belgian non-identity, seeing it as the ideal model for Habermas’s ‘post-national’ age. A motley collage of Brussels mussels, Jean-Claude Van Damme, René Magritte and the Red Devils football team is often the best marketing teams can come up with.

Belgian reality is far more prosaic, however. Like its Red Devils, the country relies on national champions and star players to deal with its endemic crises – but it never musters enough team spirit to truly overcome them. Belgians are wealthy and secure but have little sense of how they might collectively mobilize that wealth and security. Stuck between federalism and regionalism, Belgium’s successful failed state is left to decompose without ever fully imploding. Chroniclers have of course been predicting the death of Western Europe’s most unlikely country for several decades now. Anno 2021, however, it feels more likely that obituaries will be composed for the United Kingdom before its overseas neighbour.

Read on: Anton Jäger, ‘Rebel Regions’, NLR 128.


21st-Century Gossip

If we try to recall the most widely circulated pieces of gossip from the last century – those we read or heard about (or saw adapted for the screen) – we find: Prince Rainier III’s marriage to Grace Kelly (1956); the divorce of Shah Mohammad Reza Pahlavi and Soraya Esfandiary-Bakhtiari (the ‘Sad-Eyed Princess’), who were obliged to separate despite being ‘madly in love’ according to the tabloids (1958); the opera singer Maria Callas’s divorce from Giovanni Battista Meneghini, and her subsequent affair with the Greek shipping magnate Aristotle Onassis (1959); the adventures of Marilyn Monroe with John Fitzgerald Kennedy and his brother Bobby (1954-62); the love story between Onassis and the widowed Jacqueline Kennedy, for whom he left Callas (1968); the diamond-studded infatuations and disavowals of Liz Taylor and Richard Burton (1961-76); and the last great scandal of the 20th century, Prince Charles and Lady Diana’s divorce (1996), plus her death the following year. This is the luxurious world of private islands and jewelry, where one’s only piece of nightwear is Chanel No. 5. It is the world of aristocrats – regents, even – heads of state, actors, opera divas and prime donne.

But if we browse the gossip sections of 21st-century newspapers, we no longer see a sparkling world of nobility and celebrity. Instead, we find ourselves in the boardrooms of major corporations. In the second half of the last century the deaths that caused the most commotion – other than the Kennedys, Monroe and Diana – were those of Eva Perón (wife of the Argentinian general and statesman Juan Perón), Albert Camus, Jimi Hendrix and Janis Joplin. In this century, however, the death that stands out in terms of its global resonance was that of Steve Jobs.

One story that has occupied public attention in recent months, and continues to fill the broadsheet pages, is Bill and Melinda French Gates’s divorce in May 2021. Much like the classic cases of scandal-reporting from the last century, details about the Gates–French affair have gradually been drip-fed by the media, such as Bill’s association with Jeffrey Epstein. But however laden with outrages and infidelities these stories might be, it’s not the sexual content of the hearsay that captures our attention. It’s the money, pure and simple. Luxury has been supplanted by money in the tabloid reader’s fantasies. The distinction between the two is the same as that between eroticism and hard porn. The former is a semiotic unit – its meaning refers to other meanings (like refinement and style) – whereas the latter refers only to itself.

We have read about the dizzying sums involved in Jeff Bezos and MacKenzie Scott’s 2019 divorce: Bezos leaving his ex-wife 4% of Amazon’s shares (worth $38 billion). We know that Steve Jobs’s widow Laurene Powell inherited nearly the entirety of his fortune, estimated at the time to be around $10.8 billion (including 5.5 million shares in Apple). The amount Melinda French Gates will receive is still unclear, though it will certainly be between $3 and $65 billion. In this whirlwind of billions we are struck by another fact. High society ladies of previous centuries completed their education in private colleges (finishing schools), more often than not Swiss, such as the Institut Alpin Videmanette (attended by Lady Diana and, more recently, by fashion magnate Tamara Mellon); the Mon Fertile (Camilla Parker Bowles); the Institut Le Mesnil (Queen Anne-Marie of Greece); the Brillantmont (Maharani Gayatri Devi of Jaipur); but also American, such as Manhattan’s Finch College (Isabella Rossellini). By contrast, their contemporary equivalents have a far more direct path to fortune. Laurene Powell obtained a BA in Political Science, then a BS in Economics from the Wharton School, and finally an MBA at the Stanford Graduate School of Business; in between she spent three years working for Goldman Sachs and Merrill Lynch. Melinda Gates graduated from Duke with a bachelor’s in computer science and economics before pursuing an MBA at its School of Business. MacKenzie Scott’s trajectory seemed less predestined: she studied literature at Princeton, where she was taught by Toni Morrison, before writing a number of middle-brow domestic novels. But she underwent an entrepreneurial transformation at Amazon, brainstorming names for the company in its early days and shipping its very first orders through UPS.

Bourdieu noted that women’s education had been revolutionized in the 19th century by changes in the marriage market, as the emerging bourgeoisie came to expect new qualities from their brides. It was no longer necessary for brides to possess land, wealth and jewels; it was vital instead that they know how to preside over a salon, play the piano and hold a conversation, preferably in French. The education of these ‘madams’ could no longer be confined to sewing and running a kitchen. Now, in the new millennium, we are witnessing another such revolution, in which the ability to play Chopin from memory is replaced by the capacity to identify opportune moments to buy or sell subordinated risk swaps on the derivatives market. If paparazzi photos used to show celebrities naked, nowadays their bodies are covered but their finances are laid bare. Fortunes are revealed, wallets unbuttoned. Diderot’s Indiscreet Jewels have reverted to their literal meaning.

The Gates’ separation is exemplary. The more we learn about the story, the less its jealousies and indiscretions (real or imagined) seem to matter. Bill Gates had a tartuffesque approach to inviting women for dinner: ‘If this makes you uncomfortable, pretend it never happened’, he had the habit of writing in email postscripts. But much more weight has been placed on his rift with Melinda over the management and control of the Gates Foundation, which holds assets valued at around $49.8 billion. The struggle for sovereignty over this kingdom was what made their break-up particularly bitter. This was well understood by Warren Buffett – ‘the Oracle of Omaha’ – one of the richest men in the world with a net worth of $104 billion, who rules over the conglomerate Berkshire Hathaway. Buffett joined the Gates Foundation in 2006 and has since invested $33 billion, serving until this year as one of its trustees. As soon as there was word of divorce, he ran for the hills as quickly as his nonagenarian legs would carry him.

It’s astonishing how seldom the philanthropy business has been the object of sustained criticism. It is barely known that, as an institution, the ‘philanthropic foundation’ in its current legal form is relatively recent, dating back to the First World War. From the moment it was dreamt up, the institution – now seemingly as natural and necessary as the air we breathe – was met with vigorous opposition, not just from American unions, but from establishment politicians such as Theodore Roosevelt and William Taft. When the proposal was first brought to establish a Rockefeller foundation, Roosevelt observed that ‘No amount of charities in spending such fortunes can compensate in any way for the misconduct in acquiring them.’ Taft called on Congress to oppose the plan – describing it as ‘a bill to incorporate Mr. Rockefeller’ – while the American Federation of Labor President Samuel Gompers growled, ‘The one thing that the world would gratefully accept from Mr. Rockefeller now would be the establishment of a great endowment of research and education to help other people see in time how they can keep from being like him.’ (I discussed the controversy over foundations more extensively in my latest book, Dominio. La guerra invisibile dei ricchi contro i poveri – Dominion: The Invisible War of the Rich on the Poor – published in Italy last October.)

Today we still have only a rough idea of the financial benefits these tax-exempt foundations enjoy. Indeed, most people are unaware that their returns on investments also go untaxed. If the Gates Foundation happens to invest in Microsoft shares, the dividends of those shares (or the capital gains if they are sold) will not be touched by the state. This explains one of the great mysteries of the foundations: however much they donate, their assets continue to grow.

It’s clear that MacKenzie Scott, like many of her peers, hates having so many billions; she donated $4.1 billion in 2020 and another $8 billion in June of this year alone. But according to the Bloomberg Billionaire Index, on 9 August 2021 her net worth had grown to $59.2 billion (whereas in 2019 it was ‘only’ $38 billion following her divorce settlement). This increase is partly attributable to the fact that Amazon shares rose from $1,789 to $3,341 in the space of two years. Likewise, Laurene Powell’s wealth never seems to diminish despite her vast donations: she was worth $21 billion in 2012; now she has $22.5 billion to her name. Not to mention the spectacular and constant appreciation of the assets held by the Gates Foundation:

[Table: Gates Foundation net worth by year, in $ billion]

For all their self-aggrandizing PR, such institutions are ultimately financed by tax exemptions – that is, by public funds. In 2011, for example, the total amount of ‘donations’ made by American foundations was $49 billion; but in the same year, fiscal subsidies to philanthropic works cost the US Treasury $53.7 billion. In other words, American philanthropists donated $4.7 billion less than they took from the state. They gave away none of their own money, but rather that of others.

But there’s more: the conduct of a foundation is rarely subject to scrutiny because the donors are under no obligation to answer for their actions. As Joanne Barkan writes, ‘When a foundation project fails…the subjects of the experiment suffer, as does the general public. Yet the do-gooders can simply move on to their next project.’ Alongside this lack of accountability is a tendency to encourage obsequiousness. Before the donors we are all mendicants, coming to beg for support, a loan or logistical assistance. This feature of the philanthropy economy was enough to scandalize even one of the standard-bearers of the Chicago School, Richard Posner, a man who once supported a ‘free baby market’ – i.e. the unregulated buying and selling of children – as an optimal mechanism for adoption. He remarked in 2006 that

A perpetual charitable foundation is a completely irresponsible institution, answerable to nobody. It competes neither in capital markets nor in product markets (in both respects differing from universities), and, unlike a hereditary monarch whom such a foundation otherwise resembles, it is subject to no political controls either. It is not even subject to benchmark competition. The puzzle for economics is why these foundations are not total scandals.

Posner hits the nail on the head. The foundation is a monarchy, absolute rather than constitutional. This is perhaps the key to understanding why the foundation magnates are so often the protagonists of 21st-century gossip. As contemporary royals, wielding all the arcane power associated with such figures, they carry out the same symbolic functions that were once the purview of royal dynasties in the previous century. The Marquess of Boeing, the Archduke of Facebook, the Prince of Google, the Landgrave of Amazon.

Translated by Francesco Anselmetti.

Read on: Robin Blackburn, ‘Reform to Preserve?’, NLR 120.