Towering columns
For The Times, Tomiwa Owolade believes that we must address the impact of gang violence in order to protect the lives of black people.
The Runnymede Trust, one of the most prominent anti-racist charities in the UK, posted this last Monday: “This lack of police accountability perpetuates cycles of violence and impunity. Our thoughts and solidarity are with Chris’ loved ones and wider community. Chris Kaba, 23.07.1999 — 05.09.2022 #JusticeForChrisKaba.” But how can there be a lack of “police accountability” when a police officer was put on trial? This week a BBC story carried the headline: “Kaba case traumatising, say black community leaders”. Instead of cynically invoking the language of victimhood, these “community leaders” should ask what pushed a young man down a path of criminality that led to him trying to kill other young men, violently confronting the police and putting his own life in danger.
There is another problem with presenting Kaba as a righteous victim of racial injustice. People like him pose a greater threat to the lives of other black people — in particular the young black men seduced into this lifestyle of violence and criminality — than the police do. Far more young black men die through gang violence than at the hands of the police. Many parents of black children already know this. They tell their children to stay away from gangsters for this good reason.
The protests against the acquittal of Blake proclaim “black lives matter”. Anyone committed to that statement needs to acknowledge the role that gang violence plays in destroying the lives of many black boys and young men. Of course, we ought to scrutinise the police and hold them to high standards; we invest them with the awesome responsibility of ensuring our safety. Any police misconduct ought to be punished with great severity. But so much of the conversation after the Kaba trial constitutes not a defence of justice but a dereliction of moral duty. So much of what has been expressed is a racism of lowered expectations — seeing Kaba as a victim incapable of moral agency — that is no less poisonous than the racism protesters are marching against. Kaba is not a martyr of racial injustice but a warning of the dangers of gang violence.
On his Substack, Neil O’Brien considers the growing absence of order and civility in much of modern British society.
Successful places understand the connection between order and success. Applying Broken Windows theory in New York, Bill Bratton drove down levels of crime and disorder which had previously been thought to be completely intractable. But new left-wing political leaders have fettered their police on human rights and racial justice grounds, and now serious crime is shooting up again.
Singapore under Lee Kuan Yew understood the multi-level nature of what is needed for a civilised society. Arriving in London, he had marvelled at the kind of high-trust society that enabled people to pay for newspapers with an honesty box. For him, decisive policy on crime was a no-brainer. He said of the death penalty for drug dealers: “If we could kill them a hundred times we would”. But he also launched a concerted push to green the city, making it the greenest city in Asia. The famous Singaporean ban on chewing gum was something LKY adopted reluctantly: people were sticking gum on the sensors for train doors and making trains late. For him, the needs of the many outweighed the interests of a selfish few.
In the US today there is a shift of talented people out of places where disorder has been allowed to take hold — like San Francisco — and towards safer places like Texas (though a backlash may now be starting in SF). The US experiment with “defunding” or withdrawing the police has demonstrated once again that the people who suffer from a disorderly society are not the strong and the rich but the weak, the poor and the different. Ironically, black people in the US have borne the brunt of an increase in homicides under the “defund the police” movement. Disorder in its strong form is obviously tragic. Britain has got used to crimes that would once have been news for months: kids stabbed on the way to school, killings carried out by children, and more. Violent disorder grows out of lower-level disorder. But although it is less dramatic, there is also a lot of direct suffering from lower-level disorder and the loss of civilised standards.
For The Spectator, David Butterfield reflects on how Cambridge University has allowed falling standards and ideological groupthink to dominate.
For students, the risks have never been lower. Grade inflation is rampant in Cambridge, as elsewhere in the sector. A third-class performance, let alone a failure, is almost impossible in most subjects, as students can either intermit for the year and take the exams again, or avoid them on health grounds and be given an effective pass. When I came to Cambridge, students would be removed from the university for lack of attainment; it is now unheard of for students to be sent down for insufficient academic performance.
These changes reflect a bigger shift: for various reasons declarations of disability have spiked dramatically. Over the past 15 years, disability at Cambridge has increased more than fivefold, and is now declared by some 6,000 students (roughly one in four). The two major areas of growth have been ‘mental health conditions’ and ‘specific learning difficulties’. Many students register anxiety as the cause, yet the university and the NHS have neither the bandwidth nor the incentive to stress-test the claims. In four years, the number of students with ADHD has doubled, and is now approaching a thousand. As a result, the university’s Accessibility and Disability Resource Centre has gone into overdrive, mandating changes in teaching and examination across the board.
Whatever the truth behind the much-discussed ‘mental health crisis’, it has ushered in developments that disrupt university life. Many students are now excused from writing essays and permitted to submit bullet points; deadlines are extended, and regularly missed without penalty; extra time is given for all examinations. The pace of change over the past decade has been astonishing, driven on by three forces: an administrative class that wants to minimise complaints, a subset of academics who actively resent the no-nonsense traditions of the university, and a proportion of students who will take the easiest path proffered. The result is a steady infantilisation of education, whereby challenging workloads are reduced, and robust criticism of bad writing and bad thinking is avoided.
For The Times, Dominic Lawson laments Britain’s failure to pursue civil nuclear power when energy demands are set to increase dramatically.
Given that the tech industry is publicly committed to the net-zero carbon agenda, what does it think could be the solution? Last week both Google and Amazon gave their answer, and it wasn’t wind turbines. They separately announced tie-ups with power plant developers for the construction of small modular nuclear reactors. They were following the example of Microsoft, which a few weeks ago announced its own arrangement, involving the recommissioning of a nuclear reactor in Pennsylvania, to be called the Crane Clean Energy Center…
…Yet, while the new administration recently chose to pledge billions more in subsidies for Drax’s untested scheme to “capture” the carbon it emits, it abjectly failed, at its investment summit last week, to announce the “final investment decision” for the Sizewell C nuclear reactor. It is amazing, and not in a good way, that the last British nuclear plant to have been connected to the grid was that approved by my late father as energy secretary almost four decades ago: Sizewell B. It is all the more deplorable given that we led the world into the age of civil nuclear power: the Calder Hall reactor began supplying this form of energy to the nation’s homes in the year of my birth, 1956; by 1965 we still had more nuclear reactors than the rest of the world put together…
…The new (and former) energy secretary, Ed Miliband, has focused chiefly on wind, authorising turbine construction onshore, after years when the Conservatives allowed only offshore development. But efficient production is all about energy density (each gram of uranium generates three million times as much energy as a gram of coal). The group Greens for Nuclear Energy notes: “Whitelee onshore wind farm would need to occupy 840 square miles to generate the same amount of electricity as Hinkley C nuclear power station.”
At Compact, Marc Fasteau and Ian Fletcher examine how tariffs can be effectively used to reshape the make-up of a country’s trade.
What a country imports plays a huge role in determining what industries will thrive, or even survive, in America. To remain a high-wage society, America needs high productivity, which, in turn, requires strong positions in advanced industries like electric cars and their supply chains. If Americans import electric cars from a low-wage, high-subsidy country like China, the domestic EV industry won’t stand a chance. This is why President Biden was right to impose a tariff against Chinese EVs in June 2024, and why then-President Donald Trump was also right to impose tariffs and quotas on Chinese steel.
To illustrate the benefits of industry-specific tariffs, consider the effects of Trump’s 2018 near-global 25 percent tariff on steel. First, it reduced America’s steel imports by 24 percent. Second, capacity utilization—how much factories were making versus how much they were capable of making—rose above the 80 percent required for the industry’s long-term viability. And third, the American steel industry invested $16 billion, adding significant new US capacity, including 15 brand-new mills and other steel-making facilities, at locations from Florida to Texas to Arizona…
…Where a country imports products from is also important. If the national interest demands that Americans economically decouple from China—an option seriously under debate—country-specific tariffs will be essential to doing this without also decoupling from friends and allies. Washington will need their help to block Chinese goods that are relabeled and shipped through their ports, plus goods made by Chinese-controlled manufacturers that set up plants in those countries. The United States is never going to win a subsidy race with China, but Americans can use tariffs to make sure these subsidies don’t allow the People’s Republic to wrest away US markets from US producers.
At UnHerd, Leighton Woodhouse argues that social breakdown in California is the product of Puritan regulation and Scots-Irish libertarianism.
From time to time, as in the Sixties, this ideological hybrid yields a politics that is vital and new. But more often, it manifests as a distinctively dysfunctional kind of progressive politics that sets the state apart from its Yankee cousins. There is no better example of this than the way that the state manages its mental health crisis. California once ran a vast Department of Mental Hygiene, but in 1967, shocked by conditions in its 14 asylums, state legislators all but banned involuntary commitment of the insane in a law they characterised as a “Magna Carta” for California’s mentally ill. In doing so, the state swung from an excess of Yankee-style social control to the opposite extreme: a hardcore civil libertarian regime that has left the mentally ill languishing on city sidewalks. This right to suffer from drug addiction and psychosis on the street without intervention from the government may constitute “freedom” in the Scots-Irish backcountry, but a traditional Yankee would not recognise it as such.
Drug policy is another case in point. California’s libertarian attitude toward recreational drug use began in the Sixties. Today, its ramifications can be seen in San Francisco’s Tenderloin and South of Market districts and Skid Row in Los Angeles, where not only drug use but open drug dealing is decriminalised, in part as a result of Proposition 47, a progressive ballot initiative that California voters passed in 2014. In San Francisco’s influential activist circles, drug-dealing is hardly considered a crime at all, and drug enforcement is deemed an act of state repression against the poor. The assumption, which would be right at home in the Appalachian backcountry, is that the government is nothing more than a malign apparatus of coercion. Those same activists, however, also believe in the government’s responsibility to provide expansive services and treatment to those who want it — reflecting the political philosophy of colonial New England…
…The result of this inscrutable ideology is a state that fails to fulfil the most elementary obligation of government: the provision of basic social order. You see it everywhere in California, in the tent encampments that line the beaches of Venice and Santa Monica and the dusty sidewalks of Fresno and Bakersfield. California is where two fundamentally incompatible Anglo traditions merged, yielding a unique kind of social dysfunction that’s as indelible a part of the state as the coastal cliffs and the redwood forests.
Wonky thinking
The Information Technology & Innovation Foundation has published Go to the Mattresses: It’s Time to Reset U.S.-EU Tech and Trade Relations by Robert D. Atkinson. The author argues that the European Union is using protective practices to secure tech sovereignty by specifically discriminating against U.S. businesses. But distancing from the transatlantic connection could just make Europe more vulnerable to Chinese economic power.
U.S. policymakers must do more than complain diplomatically to French, German, and other EU policymakers in various dialogues and fora about their efforts to replace goods and services from U.S. tech firms. If the United States tried something similar in a sector in which European firms held most of the market—such as luxury vehicles—the righteous outcry from European defenders of the rules-based global trading system would be immediate and damning. Yet, because European leaders drape their efforts in “European values” and associated privacy, competition, and cybersecurity interests, U.S. policymakers and the media give them a free pass. Even when European leaders repeatedly and consistently say the quiet part out loud—that they want to replace U.S. firms and products—U.S. policymakers still pretend that European partners are acting in good faith, or that they don’t really mean it.
Encouraging Europe to change its approach to U.S. tech has geostrategic implications. Europe is in a different place strategically in relation to China than it was just five years ago. Europe and many other Western nations, such as Australia, Canada, New Zealand, and South Korea, would like nothing more than to see the Chinese mercantilist “cat” belled, but they all lack the courage to do it. So, Europe is happy to leave that task to the United States—describing the conflict wrongly as the U.S.-China trade war—so they can avoid punitive retaliation, while at the same time enabling their companies to take U.S. market share in China. Indeed, that is exactly what has happened. While the United States has been taking the heat in fighting back against Chinese innovation mercantilism—a task that benefits Europe perhaps even more than it does America—the EU has moved in and captured what was formerly U.S. market share there. That is not how an ally behaves.
Europe is still largely anchored to policies of the past in thinking that it can appease China and maintain its market access, and that China won’t eventually target its advanced industries for eradication. European officials often talk in private about concerns they share with the United States about China, but the time for subtlety and inaction has long since passed: the United States no longer has the ability to force change in China on its own. It will take a concerted and coordinated effort for the United States, Europe, Japan, Australia, Canada, South Korea, and other allies to have any chance of enacting collective defenses against predatory Chinese economic practices. A united front can impose costs on China that together would have the potential to limit its gains in advanced industry market share.
So, the EU needs to join with the United States to limit China’s techno-economic aggression, and at the same time cease its own aggression against the United States. But the United States has not established any real policy to make the EU think there could be tangible consequences for its discriminatory policies. U.S. officials’ approach has been to complain to their European counterparts, not “go to the mattresses,” largely because the U.S.-European policy community argues that America needs the EU for broader strategic interests, especially resisting Russian aggression. While President Trump (and other past U.S. presidents, such as Kennedy) have focused on the threat of withdrawing troops, Europe has always known that was a paper tiger. And so it has proceeded apace.
It’s time for the United States to respond more strategically. Just as the EU seeks to add new instruments to its strategic autonomy toolbox to ensure it can defend itself against countries that abuse its openness—from a carbon border adjustment mechanism (CBAM) to mirror clauses to anti-coercion instruments—so too must the United States. Congress and the next administration should review U.S. trade defense tools to reflect the rise of European protectionism and digital sovereignty. If left unchecked, Europe’s approach will provide a model for other countries and regions to emulate, which ultimately will lead to the fragmentation of the global digital economy into national and regional “walled gardens.”
The Council on Geostrategy has published A more lethal Royal Navy: Sharpening Britain’s naval power by William Freer and Dr Emma Salisbury. This is part of a stream of research into how to establish ‘strategic advantage’ in British foreign and defence policies. As an island nation, maritime interests remain of the utmost importance to national security and require a strong Royal Navy to defend it.
The Royal Navy’s force design should be determined by a combination of threats to the nation, the nation’s resources, and the nation’s interests. According to the Integrated Review Refresh (IRR) of March 2023, the present context of a belligerent Russia and an increasingly confrontational PRC means that Britain needs a more sober but determined approach to international relations. As per the IRR’s ‘strategic framework’, HM Government seeks to deter opponents and shape the international order in pursuit of British interests. Equally, the IRR notes that as the Indo-Pacific becomes more connected to the Euro-Atlantic, the UK will not have the luxury of choice between one theatre or the other. It concludes that Britain should embrace being in both theatres of operation – the Euro-Atlantic and the Indo-Pacific – albeit with two different, though complementary, postures.
To extrapolate, given Britain’s location, the Royal Navy’s primary focus should be on the Euro-Atlantic, working with NATO allies to enact sea control (see: Map 1). Sea control is achieved when a navy is able to establish a persistent, or even permanent, maritime presence which deters rivals from confrontation. Depending on the capability of the country in question, the objectives it wants to achieve, and the strength of its adversaries, sea control can be enacted locally, regionally, or even globally. Meanwhile, in the Indo-Pacific, the Royal Navy should contribute to sea denial – which necessitates capabilities to prevent a rival navy from operating with impunity (i.e., from establishing sea control). This can be achieved in multiple ways including by threatening sea-based assets from land, the use of naval mines, and deploying naval forces themselves (usually larger numbers of smaller vessels).
Historically, Britain has been well versed in practising sea control and denial simultaneously. Since the reign of Elizabeth I, the Royal Navy, in conjunction with allies and partners, has been tasked with enacting sea control in the waters surrounding the British home islands, while modulating sea control and denial in more distant theatres, with this modulation being dictated by the geostrategic significance of the theatre and the strength of adversaries. When the waters around the British Isles have been threatened by an adversary, the Royal Navy has been focused in North Atlantic waters. This has occurred many times throughout history, such as before the First World War, during the Second World War (until roughly mid-1944, when the German naval threat had been eliminated), and again in the 1970s and 1980s during the vast Soviet naval build-up.
Although the Royal Navy needs to support two regional postures, it does not necessarily need two separate fleets. Naval platforms are inherently flexible (due to the variety of systems they can host), and most of those operated by the Royal Navy can contribute to both postures to varying degrees. While warships are flexible and can switch from sea control to denial with relative ease, the problem is that Britain’s rivals are regenerating or modernising their own fleets. HM Government sets the tasks it wants the Royal Navy to achieve, and for each standing task requiring a ship it needs roughly three to four vessels (as some will be in refit or preparing for deployment). The current posture was largely designed over a decade ago, when geopolitical competition was less severe. What required only a single ship or two in 2010 or 2015 will potentially require several by the 2030s or 2040s. And the UK does not have enough.
Book of the week
We recommend Creating the Cold War University: The Transformation of Stanford by Rebecca S. Lowen. The author uses the case of Stanford to explore how academia played a vital role in the burgeoning military-industrial complex, particularly how university administrators worked with military leaders and academic scientists.
Before World War II, America's universities were peripheral to the nation's political economy. They were committed to promoting the scientific method, to allowing academic scientists and scholars to discover and study “truths,” and to developing the character of their students who were, for the most part, the sons of the nation's business and professional elites. While some universities included service to society as part of their mission, none of them conceived of this service as direct assistance to the federal government. Autonomy from the federal government was, in fact, central to the definition of the university as well as of the science and scholarship conducted within it. Similarly, the concept of autonomy from private industry and, more broadly, the world of commerce, distinguished the university from the mere technical institute. Universities were, in their own imagining, ivory towers.
During the cold war, the nation's leading universities moved from the periphery to the center of the nation's political economy. To Clark Kerr, chancellor of the University of California at Berkeley in the 1950s and early 1960s and one of the first to herald publicly this shift, the postwar university was a wholly new institution, one that was uniquely responsive to the society of which it was now very much a part. Kerr recognized the significance of the cold war to American universities. Such military technologies as ballistic missiles, guidance systems, hydrogen bombs, and radar required the expertise of highly trained scientists and engineers. By the early 1960s, the federal government was spending approximately $10 billion annually on research and development, with the Department of Defense and the Atomic Energy Commission contributing over one half of the total. Universities and university-affiliated centers received annually about one tenth—or $1 billion—of these federal research and development funds, more than half of which went to just six universities by the early 1960s. These universities, in turn, depended on federal patronage for over fifty percent of their operating budgets.
But Kerr also saw the postwar university as the product of broader forces that were shifting the nation from an industrial to a postindustrial society, from an economy in which labor and raw materials were the essential inputs to one in which expert knowledge was the key to economic growth and prosperity. Nothing better exemplified for Kerr the centrality of the university to postindustrial society than the shifting industrial geography. High-technology companies were clustering around the Berkeley and Stanford campuses in the San Francisco Bay Area, around Harvard and MIT in Massachusetts, and around the campuses and research facilities in the Chicago area. These “cities of intellect,” or “science regions,” as they have since been called, testify to the importance of the American university to the post-World War II economy as do the assiduous, although not always successful, efforts to replicate them.
If the university's relationship to society changed after World War II, the university underwent internal changes as well. Large laboratories staffed with myriad researchers having no teaching duties and working in groups with expensive, government funded scientific equipment became common features of the leading research universities after World War II. As the organization and funding of science changed, so did the kinds of knowledge produced and taught. Universities made room for new fields of study, such as nuclear engineering and Russian studies, which bore obvious relevance to the nation's geopolitical concerns. Traditional social science disciplines also shifted their emphases, stressing quantitative approaches over normative ones and individual behavior and cultural studies over sociological ones.
Most commented upon, then and now, were changes in the education of students and in the role of university professors. As Kerr readily acknowledged, professional advancement in the postwar university depended wholly on research and publication, which in turn depended on the development of patronage. Thus, undergraduate teaching received short shrift as professors, not surprisingly, showed more interest in and loyalty to their patrons outside the university than to their own institutions and students. If the university before World War II was a community of scholars and scientists, the postwar university was, as Kerr described it, a loose collection of academic entrepreneurs—scientists and scholars perpetually promoting themselves and their research to potential patrons.
That the nation's leading universities changed in the ways that Kerr outlined in the 1963 Godkin Lectures at Harvard (and published later that year as The Uses of the University) has never been disputed. What has been a subject of debate, however, is whether these changes represented progress or degradation and decline. As is often the case with valuative questions, the generally accepted answer has tended to change along with shifts in the nation’s political and cultural climate. In the 1950s and early 1960s, for example, the consensus was that the metamorphosis of the university was part of a broader progressive trend; The Uses of the University remains the classic presentation of this view. While Kerr regretted what was being lost—a sense of the university as a community and a devotion to undergraduate education—this regret was overwhelmed by his appreciation for what was being gained. The university was no longer an ivory tower but a flexible institution with multiple constituencies, a “multiversity.”
Kerr's views came under attack in the decade following the publication of The Uses of the University. To Robert Nisbet, a conservative sociologist who idealized the prewar academic community, the university had been hopelessly degraded by its emphasis on service and its relationship to the federal government and to private enterprise. Nisbet leveled his most searing criticisms at fellow colleagues, whom he accused of abandoning traditional, if unglamorous, commitments to teaching and scholarship to chase after money and fame. The multiversity was also attacked by the left at this time for its purported dehumanizing approach to education and its collaboration with the nation's war machinery.
The end of the Vietnam War took much of the heat out of the controversy surrounding the university and its role in society, but expressions of dissatisfaction, often quite strident, resurfaced in the mid-1980s. The university was attacked from the right for neglecting undergraduates and introducing curricula emphasizing relevance and cultural diversity instead of transcendent values and traditions. As before, blame focused on university professors who, it was claimed, had been student radicals in the 1960s. Less attention was paid in these years to critics on the left concerned about the increase in defense dollars pouring into university laboratories and the connections between federally supported academic researchers and privately owned, for-profit businesses. Defenders of the cold war university also reemerged to stress the economic benefits derived from universities' collaboration with the federal government and private industry and to insist that state supported (and, specifically, military-supported) science produced good, not bad, science.
Quick links
Thirty-year gilts now have higher yields than during the mini-budget.
The number of British companies being liquidated by directors in October 2024 is double the number in October 2023.
Britain has fallen to thirtieth place in the International Tax Competitiveness Index.
An Italian court struck down the Meloni government’s plan to process asylum seekers in Albania.
Welsh legislators rejected assisted dying legislation by 26 votes to 19, with Labour MPs split on the issue.
Research on puberty blockers in the US remains unpublished due to doctors fearing the political consequences.
Germany’s birth rate continued to fall, with 80,000 fewer children born in 2023 than in 2022…
…but fertility decline is a global phenomenon, with Mexico’s birth rate dropping below that of the US for the first time.
Young men are leaving the Democratic Party, with support falling from 51% in 2016 to 39% in 2023.