
Reappraisals

Tony Judt


  Moreover, war in the twentieth century frequently meant civil war: often under the cover of occupation or “liberation.” Civil war played a significant role in the widespread “ethnic cleansing” and forced population transfers of the twentieth century, from India and Turkey to Spain and Yugoslavia. Like foreign occupation, civil war is one of the great “shared” memories of the past hundred years. In many countries “putting the past behind us”—i.e., agreeing to overcome or forget (or deny) a recent memory of internecine conflict and intercommunal violence—has been a primary goal of postwar governments: sometimes achieved, sometimes overachieved.

  The United States avoided all that. Americans experienced the twentieth century in a far more positive light. The U.S. was never occupied. It did not lose vast numbers of citizens, or huge swaths of national territory, as a result of occupation or dismemberment. Although humiliated in neocolonial wars (in Vietnam and now in Iraq), it has never suffered the consequences of defeat. Despite the ambivalence of its most recent undertakings, most Americans still feel that the wars their country has fought were “good wars.” The USA was enriched rather than impoverished by its role in the two world wars and by their outcome, in which respect it has nothing in common with Britain, the only other major country to emerge unambiguously victorious from those struggles but at the cost of near-bankruptcy and the loss of empire. And compared with the other major twentieth-century combatants, the U.S. lost relatively few soldiers in battle and suffered hardly any civilian casualties.

  As a consequence, the United States today is the only advanced country that still glorifies and exalts the military, a sentiment familiar in Europe before 1945 but quite unknown today. America’s politicians and statesmen surround themselves with the symbols and trappings of armed prowess; its commentators mock and scorn countries that hesitate to engage themselves in armed conflict. It is this differential recollection of war and its impact, rather than any structural difference between the U.S. and otherwise comparable countries, which accounts for their contrasting responses to international affairs today.

  It also, perhaps, accounts for the distinctive quality of much American writing—scholarly and popular—on the cold war and its outcome. In European accounts of the fall of Communism and the Iron Curtain, the dominant sentiment is one of relief at the final closing of a long, unhappy chapter. Here in the U.S., however, the same story is typically recorded in a triumphalist key.3 For many American commentators and policymakers the message of the last century is that war works. The implications of this reading of history have already been felt in the decision to attack Iraq in 2003. For Washington, war remains an option—in this case the first option. For the rest of the developed world it has become a last resort.

  After war, the second characteristic of the twentieth century was the rise and subsequent fall of the state. This applies in two distinct but related senses. The first describes the emergence of autonomous nation-states during the early decades of the century, and the recent diminution of their powers at the hands of multinational corporations, transnational institutions, and the accelerated movement of people, money, and goods outside their control. Concerning this process there is little dispute, though it seems likely that those who regard the outcome—a “flat world”—as both desirable and inevitable may be in for a surprise, as populations in search of economic and physical security turn back to the political symbols, legal resources, and physical barriers that only a territorial state can provide.

  But the state in my second sense has a more directly political significance. In part as a result of war—the organization and resources required to fight it, the authority and collective effort involved in making good its consequences—the twentieth-century state acquired unprecedented capacities and resources. In their benevolent form these became what we now call the “welfare state” and what the French, more precisely, term “l’état providence”: the providential state, underwriting needs and minimizing risks. Malevolently, these same centralized resources formed the basis of authoritarian and totalitarian states in Germany, Russia, and beyond—sometimes providential, always repressive.

  For much of the second half of the twentieth century, it was widely accepted that the modern state could—and therefore should—perform the providential role; ideally, without intruding excessively upon the liberties of its subjects, but where intrusion was unavoidable, then in exchange for social benefits that could not otherwise be made universally available. In the course of the last third of the century, however, it became increasingly commonplace to treat the state not as the natural benefactor of first resort but as a source of economic inefficiency and social intrusion best excluded from citizens’ affairs wherever possible. When combined with the fall of Communism, and the accompanying discrediting of the socialist project in all its forms, this discounting of the state has become the default condition of public discourse in much of the developed world.

  As a consequence, when now we speak of economic “reform” or the need to render social services more “efficient,” we mean that the state’s part in the affair should be reduced. The privatization of public services or publicly owned businesses is now regarded as self-evidently a good thing. The state, it is conventionally assumed on all sides, is an impediment to the smooth running of human affairs: In Britain both Tory and Labour governments, under Margaret Thatcher and Tony Blair, have talked down the public sector as dowdy, unexciting, and inefficient. In Western societies taxation—the extraction of resources from subjects and citizens for the pursuit of state business and the provision of public services—had risen steadily for some two hundred years, from the late eighteenth century through the 1970s, accelerating in the course of the years 1910-1960 thanks to the imposition of progressive income tax, inheritance tax, and the taxation of land and capital. Since that time, however, taxes have typically fallen, or else become indirect and regressive (taxing purchases rather than wealth), and the state’s reach has been proportionately reduced.

  Whether this is good or bad—and for whom—is a matter for discussion. What is indisputable is that this public policy reversal has come upon the developed world quite suddenly (and not only the developed world, for it is now enforced by the International Monetary Fund and other agencies upon less developed countries as well). It was not always self-evident that the state is bad for you; until very recently there were many people in Europe, Asia, and Latin America, and not a few in the U.S., who believed the contrary. Were this not the case, neither the New Deal, nor Lyndon Johnson’s Great Society program, nor many of the institutions and practices that now characterize Western Europe would have come about.

  The fact that Fascists and Communists also explicitly sought a dominant role for the state does not in itself disqualify the public sector from a prominent place in free societies; nor did the fall of Communism resolve in favor of the unregulated market the question as to the optimum balance of freedom and efficiency. This is something any visitor to the social-democratic countries of northern Europe can confirm. The state, as the history of the last century copiously illustrates, does some things rather well and other things quite badly. There are some things the private sector, or the market, can do better and many things they cannot do at all. We need to learn once again to “think the state,” free of the prejudices we have acquired against it in the triumphalist wake of the West’s cold war victory. We need to learn how to acknowledge the shortcomings of the state and to present the case for the state without apology. As I conclude in Chapter XIV, we all know, at the end of the twentieth century, that you can have too much state. But . . . you can also have too little.

  The twentieth-century welfare state is conventionally dismissed today as European and “socialist”—usually in formulations like this: “I believe history will record that it was Chinese capitalism that put an end to European socialism.”4 European it may be (if we allow that Canada, New Zealand, and—in respect of social security and national health for the aged—the USA are all for this purpose “European”); but “socialist”? The epithet reveals once again a curious unfamiliarity with the recent past. Outside of Scandinavia—in Austria, Germany, France, Italy, Holland, and elsewhere—it was not socialists but Christian Democrats who played the greatest part in installing and administering the core institutions of the activist welfare state. Even in Britain, where the post-World War II Labour government of Clement Attlee indeed inaugurated the welfare state as we knew it, it was the wartime government of Winston Churchill that commissioned and approved the Report by William Beveridge (himself a Liberal) that established the principles of public welfare provision: principles—and practices—that were reaffirmed and underwritten by every Conservative government that followed until 1979.

  The welfare state, in short, was born of a cross-party twentieth-century consensus. It was implemented, in most cases, by liberals or conservatives who had entered public life well before 1914 and for whom the public provision of universal medical services, old age pensions, unemployment and sickness insurance, free education, subsidized public transport, and the other prerequisites of a stable civil order represented not the first stage of twentieth-century socialism but the culmination of late-nineteenth-century reformist liberalism. A similar perspective informed the thinking of many New Dealers in the United States.

  Moreover, and here the memory of war played once again an important role, the twentieth-century “socialist” welfare states were constructed not as an advance guard of egalitarian revolution but to provide a barrier against the return of the past: against economic depression and its polarizing, violent political outcome in the desperate politics of Fascism and Communism alike. The welfare states were thus prophylactic states. They were designed quite consciously to meet the widespread yearning for security and stability that John Maynard Keynes and others foresaw long before the end of World War II, and they succeeded beyond anyone’s expectations. Thanks to half a century of prosperity and safety, we in the West have forgotten the political and social traumas of mass insecurity. And thus we have forgotten why we have inherited those welfare states and what brought them about.

  The paradox, of course, is that the very success of the mixed-economy welfare states, in providing the social stability and ideological demobilization which made possible the prosperity of the past half century, has led a younger political generation to take that same stability and ideological quiescence for granted and demand the elimination of the “impediment” of the taxing, regulating, and generally interfering state. Whether the economic case for this is as secure as it now appears—whether regulation and social provision were truly an impediment to “growth” and “efficiency” and not perhaps their facilitating condition—is debatable. But what is striking is how far we have lost the capacity even to conceive of public policy beyond a narrowly construed economism. We have forgotten how to think politically.

  This, too, is one of the paradoxical legacies of the twentieth century. The exhaustion of political energies in the orgy of violence and repression from 1914 through 1945 and beyond has deprived us of much of the political inheritance of the past two hundred years. “Left” and “Right”—terminology inherited from the French Revolution—are not quite without meaning today, but they no longer describe (as they still did within recent memory) the political allegiances of most citizens in democratic societies. We are skeptical, if not actively suspicious, of all-embracing political goals: The grand narratives of Nation and History and Progress that characterized the political families of the twentieth century seem discredited beyond recall. And so we describe our collective purposes in exclusively economic terms—prosperity, growth, GDP, efficiency, output, interest rates, and stock market performances—as though these were not just means to some collectively sought social or political ends but were necessary and sufficient ends in themselves.

  In an unpolitical age, there is much to be said for politicians thinking and talking economically: This is, after all, how most people today conceive of their own life chances and interests, and any project of public policy that ignored this truth would not get very far. But that is only how things are now. They have not always looked this way, and we have no good reason for supposing that they will look this way in the future. It is not only nature that abhors a vacuum: Democracies in which there are no significant political choices to be made, where economic policy is all that really matters—and where economic policy is now largely determined by nonpolitical actors (central banks, international agencies, or transnational corporations)—must either cease to be functioning democracies or accommodate once again the politics of frustration, of populist resentment. Post-Communist Central and Eastern Europe offers one illustration of how this can happen; the political trajectory of comparably fragile democracies elsewhere, from South Asia to Latin America, provides another. Outside of North America and Western Europe, it would seem, the twentieth century is with us still.

  OF ALL THE TRANSFORMATIONS of the past three decades, the disappearance of “intellectuals” is perhaps the most symptomatic. The twentieth century was the century of the intellectual: The very term first came into use (pejoratively) at the turn of the century and from the outset it described men and women in the world of learning, literature, and the arts who applied themselves to debating and influencing public opinion and policy. The intellectual was by definition committed—“engaged”: usually to an ideal, a dogma, a project. The first “intellectuals” were the writers who defended Captain Alfred Dreyfus against the accusation of treason, invoking on his behalf the primacy of universal abstractions: “truth,” “justice,” and “rights.” Their counterparts, the “anti-Dreyfusards” (also intellectuals, though they abhorred the term), invoked abstractions of their own, though less universal in nature: “honor,” “nation,” “patrie,” “France.”

  So long as public policy debate was framed in such all-embracing generalities, whether ethical or political, intellectuals shaped—and in some countries dominated—public discourse. In states where public opposition and criticism were (are) repressed, individual intellectuals assumed de facto the role of spokesmen for the public interest and for the people, against authority and the state. But even in open societies the twentieth-century intellectual acquired a certain public status, benefiting not only from the right of free expression but also from the near-universal literacy of the advanced societies, which assured him or her an audience.

  It is easy in retrospect to dismiss the engaged intellectuals of the last century. The propensity for self-aggrandizement, preening contentedly in the admiring mirror of an audience of like-minded fellow thinkers, was easy to indulge. Because intellectuals were in so many cases politically “engaged” at a time when political engagement took one to extremes, and because their engagement typically took the form of the written word, many have left a record of pronouncements and affiliations that have not worn well. Some served as spokesmen for power or for a constituency, trimming their beliefs and pronouncements to circumstance and interest: what Edward Said once called “the fawning elasticity with regard to one’s own side” has indeed “disfigured the history of intellectuals.”

  Moreover, as Raymond Aron once remarked apropos his French contemporaries, intellectuals seemed all too often to make a point of not knowing what they were talking about, especially in technical fields such as economics or military affairs. And for all their talk of “responsibility,” a disconcerting number of prominent intellectuals on Right and Left alike proved strikingly irresponsible in their insouciant propensity for encouraging violence to others at a safe distance from themselves. “Mistaken ideas always end in bloodshed,” Camus wrote, “but in every case it is someone else’s blood. That is why some of our thinkers feel free to say just about anything.”

  All true. And yet: The intellectual—free-thinking or politically committed, detached or engaged—was also a defining glory of the twentieth century. A mere listing of the most interesting political writers, social commentators, or public moralists of the age, from Émile Zola to Václav Havel, from Karl Kraus to Margarete Buber-Neumann, from Alva Myrdal to Sidney Hook, would fill this introduction and more. We have all but forgotten not only who these people were but just how large was their audience and how widespread their influence. And to the extent that we do have a shared recollection of intellectuals, it is all too often reduced to the stereotype of a rather narrow band of left-leaning Western “progressives” who dominated their own stage from the 1950s through the 1980s: Jean-Paul Sartre, Michel Foucault, Günter Grass, Susan Sontag.

  The real intellectual action, however, was elsewhere. In the Soviet Union and Eastern Europe, opposition to Communist repression was for many years confined to a handful of courageous individuals “writing for the desk drawer.” In interwar Europe both Fascism and “anti-Fascism” could draw on a talented pool of literary advocates and spokespersons: We may not be altogether comfortable acknowledging the number and quality of nationalist and Fascist intellectuals in those years, but at least until 1941 the influence of writers like Ernst Jünger in Germany, Pierre Drieu La Rochelle and Louis-Ferdinand Céline in France, Mircea Eliade in Romania, or Henri de Man in Belgium was probably greater than that of their left-leaning contemporaries whom we more readily celebrate today: André Malraux, John Dewey, or even George Orwell.