A few weeks ago I published an essay on Extractive Politics which sparked a good bit of interest. In it, I argued that the extractive model of modern economic activity—financialisation, asset-stripping, derivatives and so forth—had now extended to other areas, notably politics. The idea of actually achieving anything had been replaced by the extraction of the maximum political and financial advantages from a given situation.
In general, I start writing these essays with only a rough idea in my head of what I want to say, and so ideas often strike me that I don’t have the space for, but want to come back to later. As I was finishing the essay, I looked at the word count, and realised that there was a lot more to say, but not really the space to say it in, so I decided to return to it subsequently. Hence this essay.
In fact, the extractive mind-set is the norm everywhere today, and this cannot be a coincidence. It generally features three things. The first is the abandonment of genuine ideas of progress, or even of anything genuinely new. Instead, we see despair, nihilism, and competitive blaming of others. Second, and as a consequence, comes the promotion of trivial and transitory novelties as though they were genuine changes and improvements. Third is the compulsive mining of the past, not as inspiration or for emulation, but simply as raw material to be turned into something that can be sold, or something from which political advantage can be gained. Our society is thus essentially consuming itself.
Now notice that I say “our society” because I don’t think this is a problem for the human race as a whole. It is a problem of modern, western Liberal societies, and there have been signs of it coming (albeit largely ignored) for a century now. By contrast, societies around the world that don’t have this problem, or have it to a lesser extent (often what we call “civilisational states”) are the objects of our hatred and enmity because they do appear to be doing new things and actually moving forward, whereas we compete to blame each other for our inertia and our inability to get things done.
So I propose to take enough examples of extractive behaviour to establish the point, and then refer to two (and fleetingly a third) authors who I think help us better understand what’s going on. The two principal thinkers I want to invoke are the (relatively obscure) Swiss-German writer Jean Gebser, and the British writer Iain McGilchrist. I also want to make a glancing reference to the late Julian Jaynes, and his theories of the breakdown of the bicameral mind. Now, if you’ve never even heard of any of these authors, don’t worry: I’m not competent to discuss them in any detail anyway. I just want to use a couple of their ideas as a springboard for a little bit of theorising of my own.
Let’s start with economics. When I was taught economics, an awfully long time ago, economic activity was considered to be about production. A businessman, I was taught, borrowed money or sold shares to raise capital to build a factory, thus providing employment and meeting the needs of customers, and so earning profits. Now this may have been a caricature, but it did reflect the general assumption at the time that economic activity was about something, and intended to lead somewhere. And it’s certainly true that my youth was full of stories of reconstruction after the War, building new houses, building motorways and high-speed trains, the development of the aerospace industry and the space programmes. Banks were boring, respectable places, and there was a thing called the Stock Market where the low-achieving sons of the upper middle class often went.
But this was all underpinned—expressly so in many cases—by a hope for and interest in the Future, and a belief that with effort it was possible to go on creating a better society. Investment in the sense of my economics lessons was therefore natural and necessary. But when you no longer believe in the future, all you can do is to find ways of monetising the present and the past, which has led to the financialisation of everything and the rise of the heritage industry. And strip-mining the past has its limits: if there are signs of a profitable trend towards nineties nostalgia, I have yet to see them.
In many ways, therefore, the Liberal market economy is eating itself. Liberalism has always been hostile to manufacturing (working-class men in brown coats) and the sighs of relief from the Treasury in the 1980s as British industry sank beneath the waves were clearly audible. But the process of deconstruction (which is what this is) has to stop somewhere, simply because there will be nothing left to deconstruct, nothing left to financialise. We are perilously close to that point already, and the twin shocks of Covid and the war in Ukraine have brought home to people that it is too late to go into reverse now.
But, you may say, that’s just one area, and it’s a series of catastrophic and wicked blunders. It can’t be like that everywhere. Well, take the furthest removed example you can think of. How about philosophy? As I’ve argued elsewhere, Sartrean Existentialism represents the last philosophical school that actually claimed to address serious issues of interest to ordinary people. Philosophers in the traditional sense do still exist today, but they work in very narrow and technical areas and generally write for other philosophers. Indeed, the average philosopher today writes about other philosophers, for other philosophers and for students of philosophy who will go on to be philosophers. Conceptually it’s not that different from derivatives trading by banks.
Moreover, at least since the time of the Vienna Circle, trends in philosophy have been essentially destructive. The idea that the only proper subject matter of philosophy is empirically testable propositions evacuates essentially all of the important questions of existence, which remain of interest to people, even if, as Wittgenstein argued, they are not real problems at all. But his effective advocacy of apophatic mysticism (“if we can’t talk about it we should shut up”) rather leaves out the fact that there are things we do need to talk about if we are to live at all. Language has its limitations, as Wittgenstein demonstrated (and if you think he’s not nuanced and complicated enough, read Saul Kripke), but it’s all we have.
This, perhaps, explains both why some traditional religion persists, and more importantly why Buddhism, Advaita Vedanta, Taoism and other eastern systems of thought are attracting more and more interest, since they do grapple with basic problems of existence, knowledge and ethics in a way that western philosophy no longer really pretends to do. It’s interesting and intellectually exciting to read Foucault and Barthes deconstructing language and discourse and meaning, but I think they would be horrified by what people who have read bad translations of their work have done in the last forty years. As with financialisation, when you have deconstructed everything (even deconstruction) you find there’s nothing left to talk about, because there’s nothing left. It’s much the same with established religion, which, since at least the 1960s, has avoided talking about anything so vulgar as belief. I still remember the slight shock I felt encountering the theologian Don Cupitt’s 1980 book Taking Leave of God, which basically argued that we should abandon the idea of a transcendental, metaphysical God, and look inside ourselves instead. Cupitt eventually ended up something like a postmodernist: everything, including God, was just language. Now there’s a solution to all our moral and ethical problems. It’s no wonder that people have turned to evangelical churches, to the remains of the pre-Vatican II Catholic Church, or even to Islam. For those people, religion is real, not a matter of playing derivative intellectual games.
Is culture any better? I honestly don’t think so. It’s not that there aren’t novelties, new movements, incessant manifestoes and calls for change, but at a deeper level nothing much has changed in the last century. Cinema was the last great cultural invention (and no, I really don’t think video games qualify). In fact, the great cultural developments of the last few generations have essentially involved going backwards, to something like the original staging of Shakespeare’s plays, or Lully’s baroque operas. (The Messiah of my youth, broadcast every Christmas, has nothing in common with the “period” performances that you find in catalogues today.) And I can’t remember how many derivative productions or adaptations of Shakespeare’s plays I’ve seen that have proclaimed themselves to be “relevant” and “contemporary” only to be forgotten immediately and never revived. Indeed, Shakespeare is an inexhaustible asset, which is why there’s even a derivatives industry devoted to arguing that the plays were not written by Will of Stratford but by Queen Elizabeth I, or a committee chaired by Francis Bacon. Much modern culture, in fact, is effectively parasitic on traditional culture: the French nouveau roman, for example, would have been unthinkable except as a reaction against the already existing traditional novel.
You can argue, I think, that the real development of literature largely came to a halt in the decade after the First World War. Certainly, between them Joyce and Proust had taken the traditional realist novel as far as it could go, and in The Waste Land, Eliot anticipates many later poets in producing a poem which is as much an anthology with sampling as it is a new work. The next logical step is therefore Finnegans Wake, in which Joyce retreats into a completely personal, solipsistic form of art, producing a text which perhaps only he ever really fully understood, and which, frankly, doesn’t merit the enormous effort required for the endless decoding. This isn’t to say of course that innovative great literature wasn’t still being produced (Auden’s first poems were, after all, published in 1929), but what we think of as “modern” literature has for a century now increasingly used the literature of the past itself as raw material. It will be objected that literature has always looked to past forms, but I think there’s a difference between working within an established tradition, and simply being parasitic on it. It’s interesting that some of the most innovative recent literature (Salman Rushdie is a good example) has been produced by authors with strong influences from outside the western heartland, just like the magical realism of Marquez and others.
The derivative nature of modern western popular culture has itself become a cliché, to be exploited for intellectual and financial profit. But the point is valid, and it seems to me related to the end of the idea of the Future, about which the late lamented Mark Fisher wrote. There is no new development, only endless exploitation of the past. Why invest when you can exploit existing assets? Why develop a sound of your own when you can simply concoct one from sly, knowing references to the sounds of others in the past? It’s hard to imagine today, perhaps, that popular music could ever have been innovative, but it was. The popular music of 1920 was quite different from the popular music of 1940 or 1960, in part because much music was still live, and therefore continually developing. And between roughly the early 1960s and the mid 1970s, popular music went through an explosive period of innovation, for a number of social and technological reasons that would take too long to go into here. Certainly, everyone who heard the Beatles or Bob Dylan for the first time must have had the same shiver-down-the-spine reaction: what on earth is that? And these innovative sounds were rapidly followed by a host of others: electronica, heavy rock, reggae, the folk revival, jazz rock, progressive this and that, and of course an unconscionable number of singer-songwriters. Now much of this was inevitably rubbish, but there was a hard core of blazing innovation that is still paying dividends today for the sampling industry.
As with finance, it was technology that enabled popular music to cannibalise itself. When the Beatles first used sampling on Sergeant Pepper, it was exciting and innovative, but it rapidly became a cliché, and a substitute for actual creativity. The Beatles were also pioneers in making records that could not be reproduced live—although the songs could still be sung—and opened the way for the now nearly complete chasm between recorded sound and live performance. This had the inevitable effect, of course, of discouraging innovation. And now I see they are threatening us with artists brought back to life and playing songs they never recorded, all done by so-called Artificial Intelligence. We will all disappear into single-person artistic ghettoes, each with a different version of the album that the Doors would have recorded if Morrison had not died, assembled according to our own criteria. And you thought tribute bands were a step backward …
You can make essentially the same criticism of cinema, and you can add that almost all popular films these days seem to be for children, who notoriously dislike originality and want to be told the same stories again and again. Modern cinema cannibalises the films and literature of the past for profit, not simply finding direct inspiration in film adaptations and remakes, but sucking the creativity of the past dry, without adding anything. (Star Wars, where the rot began, was essentially the incoherent presentation of asset-stripped mythology and the kind of science-fiction serials they showed at the cinema every week when I was young. There wasn’t an ounce of originality in it, which is why children loved the stories so much.) Of course, popular cinema can use myths and archetypes to good advantage if the directors are skilful enough, and the stories have enough resonance. So Close Encounters of the Third Kind is an allegory of the birth, death and resurrection of Christ, just as Sam Mendes’s 1917 is an allegory of suffering and redemption, but neither is just a retelling of a familiar story. Perhaps the cinema is indeed condemned to eat itself in an endless spiral of self-reference and revival. Such an approach can produce great art (Once Upon a Time in Hollywood for example), but it requires a great director to do it.
I suspect the battle against the pointless but profitable use of technology has been lost also. The story is a familiar one: the first uses of CGI in films like Gladiator and The Lord of the Rings were startling and original, but as computers became capable of doing everything, the traditional skills of acting, screenwriting and directing were increasingly marginalised in favour of ever greater effects. I’ve long argued that the ideal capitalist company would have no actual employees at all. It would just automatically take in money, extract a percentage for the owners and pass the rest on. I fear we might be quite close to that in the cinema, where so-called AI will, in theory at least, be able to play all the roles.
I said a lot about politics a few weeks ago, but you will see the implications, I hope. Political parties have come to resemble private companies, and they are run for the benefit of their leaders just as companies are run for their shareholders and managers. They produce nothing, but pay political dividends in the form of jobs and money. Political parties no more have a responsibility towards the electorate than private companies do towards society. So I’ll conclude this set of examples with, of all things, restaurants. (On reflection, though, what could be more suitable for a discussion of society eating itself?) As it happens, I was recently looking through a list of restaurants, trying to choose one, and I was reminded of something that had occurred to me before: the derivative and dependent way in which the food was made and described. Now, the vocabulary of French cuisine is often very flowery and translates badly into English. But it struck me once again how many dishes and styles of cooking were “revisited”, “re-thought” or even “re-looked” (yes, I’m afraid there is a French verb relooker). Everything, in other words, is derived from the past, even when it’s presented as new, and even traditional restaurants are presented as a “return” to the past. Actual genuine innovation is out: nouvelle cuisine dates from the 1970s, after all. And what is “fusion cuisine” other than a post-modernist attempt to mix genres, analogous to an ill-fated merger between companies with different cultures?
Well, that’s enough cooking. But if what I have laid out above is broadly convincing, how do we make sense of it all? As I indicated, I will try to do so with reference to a couple of books, and a glance at a third, and I will extend their arguments to suggest that they are relevant to a society which is going mad from the mechanical extrapolation of rationality, engendering processes that have no off-switch.
The first writer is the German, later Swiss thinker Jean Gebser, who until recently was little known in the English-speaking world. His single work, Ursprung und Gegenwart (1949), finally translated as The Ever-Present Origin, is a magisterial discussion of the development of human consciousness through different phases: the archaic, the magical, the mythical, the mental-rational and (in the future) the integral. Gebser argued that we were living in the “mental” phase, one dominated by logic, the written word, a linear concept of time and a clear distinction between the individual ego and the outside world. He believed that this age had started around 1225 BC, and that we were now living in its final, “deficient” stage, when the advances and advantages of this mode of thinking had been exhausted and the problems and disadvantages were starting to dominate. These phases are not absolutely distinct from each other, and they merge one into the other over a period of time, reaching maturity at points like the Renaissance, and then starting to decline. Gebser believed that “we” (he meant the West, and this is important) were probably heading for some kind of catastrophe, unless we could somehow move to a final, “integral” stage when the insights and virtues of the different phases could be combined.
The second is Iain McGilchrist, psychiatrist and philosopher, best known for his weighty tome The Master and His Emissary, and author of a still more weighty sequel, which I have to confess I have not yet read. The subtitle is “The Divided Brain and the Making of the Western World”, and McGilchrist traces in fascinating detail the steady ascent of the left brain to dominance over the right. He is at pains not to over-simplify the argument, using the latest discoveries in psychology about the relationship between the hemispheres. But his basic argument is that, nonetheless, the left hemisphere, which developed as the “emissary” of the right, has come to usurp its role in recent times. The right brain sees the big picture and the left brain does the detailed work. Through much of human history, McGilchrist argues, there has been a fruitful cooperation mixed with tension between the two hemispheres, but since the Industrial Revolution, and more particularly in the last century or so, the left hemisphere has come to dominate. The result is fragmentation: all detail and no vision, all process and procedure but no context, all rules but no purpose. However, McGilchrist remains optimistic that some fruitful combination can be found again.
It’s obvious that the two authors were thinking along similar lines (though McGilchrist doesn’t give any indication of knowing of Gebser’s book). Both identify the problems of over-reliance on rationality devoid of context, and both believe something has gone badly wrong with the way we look at the world. And to complete this list, I’ll just name-check Julian Jaynes, whose equally ambitious solitary work, The Origin of Consciousness in the Breakdown of the Bicameral Mind, argues what its title suggests it does: consciousness, as we understand it, is a recent invention (it is not found in Homer, for example) and the voices from the gods that our ancestors heard actually came from within their own minds, whose hemispheres functioned completely independently of each other. The invention of logic, or indeed of rational thought of any kind, is therefore a recent development.
I’m not going to attempt to summarise further because there isn’t space. I’d just urge you to read the books I’ve cited, which are the best known, but not the only useful ones on the subject. But what do we make of these ideas? Well, it’s trivially true that our ancestors were Not Like Us, and that the further we go back in history the truer this is. To some extent the problem is disguised by what I have called Chronicism, the tendency to look at the past through contemporary eyes, award marks out of ten for appearing to resemble us, and focus on those parts of the past we can understand. But if we are serious, we realise that views of the world, of our place in it, and of relations between individuals have changed incomprehensibly over the millennia. Reading Plato’s Timaeus, for example, not as a work of literature or philosophy, but as a pragmatic account of the creation of the world and human beings, makes us feel dizzy. Planets and stars as living beings? Four elements, all made up of triangles? Water compressed into stone?
It therefore seems entirely possible that, with the increasing complexity of individual and collective life, human consciousness indeed began to change, and that this change was greatly accentuated by the Enlightenment and by the development of capitalism, and led to the triumph of what Gebser called the “mental-rational” phase of consciousness. This may well be associated with increasing left-brain dominance, and the development of individual consciousness, but there isn’t space here to discuss the relationship between the various theses. Still, let’s consider Gebser’s suggestion that western society is headed for a catastrophic breakdown. Since his death in 1973, this kind of thinking has moved from the margins to the mainstream, and many people now fear exactly that, but why? Remember that we are not talking directly about climate change or Ukraine, but about changing habits of thought. Why should they provoke a catastrophe?
Let’s go back to the beginning of this essay, where I set out a thesis that our society in all its aspects now revolves around versions of the concept of a derivative, and has exhausted its capability to do new things. Broadly, we can say that rational, or left-brain, thinking is concerned with the detail, and right-brain, or mythical, thinking with the big picture. One believes that by collecting lots of small pieces we can arrive at a larger truth, the other that we can begin from a larger truth and work our way down. There is no pragmatic doubt that the second alternative is more effective. Left-brain thinking produces detailed rules and adds to them; right-brain thinking produces general guidelines. Both are needed, of course, but I think we can show that rational/left-brain thinking has indeed got out of hand. Consider some examples.
One aspect of left-brain/rational thinking is the mindless application of rules, since the left brain cannot make qualitative judgements, and so does not know when to stop. So if your instructions are to maximise the value of your firm to shareholders and owners, then eventually you will stop investing and start cannibalising your assets and people. You may realise that this is suicidal in the long term, but then the left brain famously has no concept of time: it just keeps going. If you find derivatives profitable, then why not try derivatives of derivatives, and derivatives of derivatives of derivatives, and so on to infinity? If everything can be deconstructed, even deconstructionism, what is the point of deconstruction itself? Why even bother writing? If Actors in Funny Costumes 4 was a success, why not make parts 5, 6, 7, 8 and so on for ever? University teaching, and indeed knowledge and research in general, do not follow left-brain rules. So universities respond by trying to impose left-brain rules upon them—citations, evaluations, research impact and so on—in place of an actual commitment to such touchy-feely things as teaching and research.
This kind of thinking values surface novelty as a release from boredom, but cannot really produce new ideas: it therefore turns to derivatives of existing ones. It reacts to setbacks by doing the same thing again, but more. Its motto when things go wrong is “do it again”: it never asks if it should be doing something else. It adds controls, oversight, layers of supervision and management, because it cannot conceive of a big picture. It’s always the next reorganisation, the next layer of oversight, that will solve the problem. And the next, and the next. It deals with corruption, for example, about which I’ve recently written, by producing more and more regulations and more and more layers of control and oversight. The actual solution, a culture of honesty, can’t be measured and reported on, and only exists as an overall, right-brain concept.
That reminds us, perhaps, that the left-brain/rational approach is inherently suspicious, even paranoid, because it cannot make the leap to see the big picture. Thus, organisations these days spy on their workforces and desperately try to enforce ever more stringent rules. But it’s actually been argued by writers such as Louis Sass that modern culture is in effect close to schizophrenia—not in the popular sense of a split personality, but rather the inability to integrate things and events and understand their relationship, and a suspicious and hostile relationship with life and with others.
I think we can see this in today’s political culture, which is hopelessly derivative: reality does not matter, it’s all about the latest tweet, looking good on the TV news, and enjoying a brief bump in the opinion polls. Our political system is like a schizophrenic, who cannot relate properly to real life and tries to withdraw from it. Unfortunately, political leaders have real-life responsibilities, and reality cannot be kept at bay by drugs. Which brings us, inevitably, I suppose, to Ukraine.
I suggest quite simply that to the question “have our leaders gone mad?” which is asked repeatedly these days, the answer is “yes, they have.” Or more precisely, they operate in a political culture which has itself gone insane. In particular, there is no ability to see the big picture, or even interest in it. If you think about it, politics until recently was primarily about competing stories, what Gebser would call “myths” in the neutral sense of that term. For the Right, it was the myth of preserving the best aspects of the present and looking back to the past. For the Left it was the myth of preserving the best of the present and moving towards the future. The end of ideology is also the end of the mythical, right-brain approach to politics, and the triumph of the left-brain, rational technocratic pursuit of simple power. And it also represents the triumph of “me” over “us,” as we lose all sight of the bigger picture, or even the interests of others.
The behaviour of western, and especially European, leaders during the crisis is exactly what we would expect from this kind of political culture. In particular, there is a total incapacity to relate the various elements and consequences to each other. So at some level, western politicians must understand that the game is over, and that they will have to deal with a strong and angry Russia quite soon. But this co-exists with a belief that somehow the West is going to win if we only keep doing the same things, based largely on derivative thinking: surely all those experts and all the national leaders I meet can’t be wrong? Their lack of any actual knowledge about anything, including such mundanities as energy supplies, global supply chains, military production and procurement, and for that matter war itself, means that the “debate” itself is conducted at a derivative level, about who said what when and whether it was anti-Russian enough. Whether the political system is actually capable of accepting reality, and what will happen if it can’t, are things we need to start thinking about now.
Ironically, critics of the war and similar adventures fall into the same errors, through desperate attempts to impose a rational, left-brain interpretation on something which is an incoherent mess brought about by a political system that has gone mad. Once you give up the attempt to interpret the behaviour of western leaders as if it were rationally-based, it is no longer necessary to construct elaborate and complex master plans pursued over decades, in an attempt to force on events a unity that they do not possess. Nor do we need to mimic the behaviour of schizophrenics, for whom there are hidden meanings and menaces everywhere.
All this sounds pretty depressing, and it’s true that there is no immediately obvious way of returning to a saner approach. But both Gebser and McGilchrist argued that there was at least the possibility of a new synthesis and a fruitful collaboration between different modes of consciousness or different sides of the brain, as was the case in the past. After all, we do need the mental/rational mode and the left brain; they just have to stay under control. Towards the end of his life, Gebser took comfort from the burgeoning alternative spirituality movements, and the increasing interest in eastern mysticism. Much of that, of course, has since degenerated into derivative money-making, but you only have to look through bookshops, YouTube channels and podcasts to see a restless western population, vaguely aware that something is wrong, and looking to go beyond ego-driven attempts at personal optimisation. Beyond the nonsense of expensive online yoga classes and mindfulness lessons for bond traders, I think that over the last fifty years there has been a gradual ground-level movement away from excessive concentration on left-brain rational egoism which, let’s face it, hasn’t actually done many people much good, even as it has benefited a small minority. I’ll close with two possibilities that give me a little hope, although the first may not immediately seem very hopeful.
The left-brain rational mindset cannot cope with fundamental change, almost by definition, because it is concerned with process and not content, so when the content changes it’s lost. This is why governments and militaries tend to go through quite large personnel changes when wars start, for example. I think it’s quite possible that a combination of the after-effects of Ukraine, climate change and the continuation of Covid or a close relative will have a cumulatively traumatic effect on the political and media classes. They won’t learn anything, because they can’t, but if our societies are to survive, our current rulers will have to go. I don’t think there will be pitchforks in the street (though you never know) but I do think we are going to see something like a nervous breakdown or a psychotic episode by our ruling class, who will be whining “I can’t do this! It’s too hard!” I suspect we are going to see a lot of re-nationalisation of economies, re-localisation of effort and more organisations based around communities (remember them?), if only because the alternative is too horrible to contemplate.
And secondly, the increased standing and cultural influence of China and India, and even perhaps Russia, are going to force the West to take seriously societies that are less based on the left-brain rational pursuit of egoistic wants, but have a much greater social cohesion and collective ethic. In the past, the West has been able to take an à la carte attitude to Asian culture (Manga and Zen from Japan, for example) and ignore what it didn’t find attractive or couldn’t market. But at some point people are going to realise that these cultures have strengths that we don’t, and wonder whether there aren’t some ideas to transfer.
The fundamental idea, which we had but have lost, is embeddedness in society, history and culture, and the consequent ability to understand what the other person is saying. If you’ve worked professionally in Asia, you’ll have been struck by the way in which people introduce themselves and what their business cards say. A western businessman might introduce himself by saying “I’m Darth J Vader, Chief Engulfment Officer of Megagreed Corporation.” But a Japanese businessman might say something you could translate as “Honshu Bank’s Tokyo Ginza Branch’s Deputy General Manager Saito is (here.)” The second is not only less egoistic, it’s also much more helpful and informative, since it enables your interlocutor to place themselves immediately in relation to you. Of course our society will never be like Japan or China (which may be a good thing), but if we are to survive, we will have to consider reviving some of the ideas and attitudes of the past which were closer to the way these societies (and many others) work. Otherwise I fear that we will all go mad.
These essays are free and I intend to keep them so, although later in the year I’ll introduce a system where people can make small payments if they wish to. But there are other ways of showing your appreciation as well. Likes are flattering to have, but also help me judge which subjects people are most interested in. Shares are very useful in bringing in new readers, and it’s particularly helpful if you would flag posts that you think deserve it on other sites that you visit or contribute to, as this can also bring new readers. And thanks for all the very interesting comments: keep them coming!
Classical liberalism is a luxury. As a political idea it only has potential when times are good and the population at large believes times will remain good. We can get behind liberalism as a unifying idea only when most everyone believes they are not slipping behind in their material conditions.
So liberalism served well as a political theory to justify (rationalize) the social changes that came with colonial wealth in Europe and it was the perfect political theory for the new nation of the USA.
But productive capitalism has been unable to provide the growth needed in many Western countries in recent decades. Paul Volcker and others helped switch the USA to finance capitalism. The UK tried to do the same but with less success. And now, with the war in Europe, income from production is slowing even more, and it's looking like the end for growth in the real economy.
So what does a nation of liberals (in the classical sense) do when times are bad and they look to be going to continue that way? Liberalism is useless when you can't adequately feed and house your family or provision and secure your household.
The contrast between the West and the East (defined broadly) that I can never forget is the way Orwell and Dostoevsky reacted to 2+2 (or, 2x2 in the latter). Orwell thought the notion that 2+2=4 is so obvious and natural that it took torture to force through 2+2 = 5. Dostoevsky's Grand Inquisitor (telling title of the character), on the other hand, would say "2x2 = 4 is mathematics. Try arguing with that," implying that freedom lies in being able to successfully fight the "rationalist" mindset. This didn't make any sense to me when I was young. Now, I'm struck by how naive even Orwell was in his blithe acceptance of the West as the norm.