The Year 2100 and All That

Gary North - January 05, 2017

Remnant Review

We look around us and see trends. Some of these trends seem irreversible. But are they? They seem comprehensive. But are they? How much reliance should we place on them? Will they really shape our lives and the world we live in?

Almost 50 years ago, my professor Robert Nisbet wrote a classic article: "The Year 2000 and All That." It was published in the Jewish intellectual magazine, Commentary, although it was in no way Jewish. You can read it here. You probably should. (Save the text on Evernote. The full article will be blocked as soon as you leave the page.)

This article had more effect on me than any other article he wrote, and he wrote a lot of articles. It dealt with the issue of prediction. Nisbet was aware of the fact that certain kinds of computerized prediction techniques were becoming popular. He began:

The approach of the year 2000 is certain to be attended by a greater fanfare of predictions, prophecies, surmises, and forewarnings than any millennial year in history. In the past twelve months, at least four books on this subject have appeared, all of them concerned with the probable shape of American and world society in the year 2000. How many articles have appeared I cannot even guess. But books and articles are in any event only the exposed part of the iceberg. There are today centers, institutes, and bureaus, not to mention specific commissions, whose principal business it is to forecast or predict the future. There is with us, in short, as part of the already huge knowledge industry, the historical-prediction business; and this business is certain to become ever larger, ever more ramified. Through every conceivable means--game theory, linear programming, systems analysis, cybernetics, even old-fashioned intuition or hunch--individuals and organizations are working systematically on what lies ahead during the next thirty-two years, and indeed during the century or two after that. Nor is this an American enterprise only. In France there is the Futuribles project under the distinguished direction of Bertrand de Jouvenel. In England, the Social Science Research Council has established the Committee on the Next Thirty Years. There will be other forays into the future, in this country and abroad, for the lure of the game is spreading fast. An official of the State Department's Bureau of Educational and Cultural Affairs, Frank Snowden Hopkins, has already proposed the organization of an institute in which "rising young government administrators would each year spend some nine or ten months . . . studying the American future in all its aspects," to which one can only say, nice work if you can find it--the future, that is.

Nisbet was intensely skeptical of the entire strategy. He had spent the first stage of his academic career as a sociologist, but he was really a specialist in the history of intellectual movements and ideas. After a decade in administration, 1954-65, he had re-entered the world of letters. He had an amazingly successful second career.

His article is a survey of the history of futurology, beginning with the French Revolution.

By the late 17th century, Western philosophers, noting that the earth's frame had still not been consumed by Augustinian holocaust, took a kind of politician's courage in the fact, and declared bravely that the world was never going to end (Descartes, it seems, had proved this) and that mankind was going to become ever more knowledgeable and, who knows, progressively happy. Now, of a sudden, the year 2000 became the object of philosophical speculation. One of the more charming manifestations of this in the 18th century was a play, L'An 2000, written by the bohemian man of letters Restif de la Bretonne, which has been described as a heroic comedy representing how marriages would be arranged in the year 2000, at which time some twenty nations would be allied to France under the wise supremacy of "our well-beloved monarch Louis Francois XXII." (I almost wrote Charles de Gaulle.) A few years earlier Mercier had written a small volume, much read and discussed, titled L'An 2440, in which a more or less ideal future is limned, one that we have no difficulty seeing as an extension into the future of economic, political, and intellectual "trends" that no doubt seemed as real to Mercier as current "trends" do to us at the present time.

His point was simple: we see the future as a linear extension of the present: "more of the same." Certainly since 1800, this has not worked.

Here is the lure:

The wizardry of contemporary technology notwithstanding, the essential and lasting methodology of future-predicting was set forth in the early 18th century by the great Leibniz. One sentence, taken from his "Principles of Nature and of Grace," will suffice to express the crucial elements of Leibniz's law of continuity: The present is big with the future, the future might be read in the past, the distant is expressed in the near.

Nisbet was highly skeptical.

For at least a couple of centuries, the essential meaning of Leibniz's law of historical continuity has been axiomatic in Western thought. It has been the basis of all that we call philosophy of history and social developmentalism. From it has come the widely accepted notion that there is an entity called civilization or culture, that this entity obeys certain immanent principles of growth in time, that the continuity of time is roughly the same as the continuity of this growth, that past, present, and future have not merely a chronological relation but a genetic relation, and that through sufficient study of the past and the present it is possible to foresee the future simply by extending or extrapolating ongoing processes.

He listed nineteenth-century social theorists as having adopted this view: Comte, Hegel, Marx, and Spencer.

Marx spent decades in the British Museum studying economic history--of England chiefly--not because he was enamored of English economic life but because, under Leibnizian assumptions that he no more questioned than had Comte, he discerned something called "capitalism," something that was universal in type, actual or potential, and that would, he thought, obey laws of development which if correctly identified and clearly understood would make prediction of the future as scientifically unassailable as prediction of the movements of the earth around the sun.

In the year that Nisbet published this article, my book on Marx appeared. Marx was by far the most successful of such future-predictors in terms of his influence, yet he was a total failure in terms of the accuracy of his predictions. Lenin's October Revolution of 1917 took place in Russia, which was semi-feudal, not in the United States or England, which were the vanguards of capitalism. That revolution had been a fluke. Of the 3,000 or so revolutionaries active in the late 1800s, Lenin was the only one who pulled off a successful revolution. He did so because the German military had sent him by train to Finland to get him into Russia, create a revolution, and take Russia out of World War I -- all of which he did. Until then, hardly anyone remembered Marx or his obscure theories.

Nisbet commented on practice as of 1968.

The trouble is, there are so many references of a "hard data" sort, so many allusions throughout the contemporary literature on the future to all the puncture-proof, self-sealing devices of models, programming, and systems, that the unwary reader may be deceived into thinking that projections and forecasts of the future have the same secure relation to these devices that our accounting systems, traffic controls, and market analyses do. It would really be a shabby trick if we somehow left the inference around, to be picked up by the public, that computers and systems-analysts do look into the future in ways that were denied to a Tocqueville or a Marx.

Nisbet took no prisoners.

Reading all of these volumes, we can thrill with repressed horror at the thought of the mantle of too too solid flesh that will one day cover the earth (a mantle that the physicist-population expert, Sir Charles Darwin, once told an audience, much in the manner of the old fashioned temperance lecture, would reach, present rates continuing, one mile in height by the year 3500, or was it 2500?). One can feel his toes trampled on as he reads that by the year 2000 there will be two people for every foot of waterline in the U.S. One can hypnotize himself into a state of driver-fury by merely reading about the 250 million automobiles (we now have about 59 million) on American streets and highways. The thought of 225 billion passenger miles to be flown by the airlines in the year 2000, in contrast to a 1960 figure of 35 billion, is enough to keep everyone home, which would indeed be a change. But change is not, alas, what these books are predicting; they are only extrapolating present rates, many of which remind one of a mad physiologist predicting giants at age twenty on the basis of growth rates at age ten.

He ended the essay with one of the most profound insights I have ever read. Etch this in your memory.

It is very different with studies of change in human society. Here the Random Event, the Maniac, the Prophet, and the Genius have to be reckoned with. We have absolutely no way of escaping them. The future-predicters don't suggest that we can avoid or escape them--or ever be able to predict or forecast them. What the future-predicters, the change-analysts, and trend-tenders say in effect is that with the aid of institute resources, computers, linear programming, etc. they will deal with the kinds of change that are not the consequence of the Random Event, the Genius, the Maniac, and the Prophet.

To which I can only say: there really aren't any; not any worth looking at anyhow.

That really is the famous bottom line.

Think of predicting the most influential thinkers of the nineteenth century. In retrospect, we know: Karl Marx and Charles Darwin. But in 1848, when the Manifesto of the Communist Party was published anonymously, it was nothing except Marx's usual tirades against rival German revolutionary theorists no one had heard of. He had been hired by the obscure League of the Just to write a pamphlet to launch the 1848 revolution that everyone figured was coming. But Marx and Engels got it out too late. The revolution had already begun. As for everything else Marx wrote, it was unreadable if Engels did not edit it -- turgid Germanic scholarship at its worst. In 1883, he died. About a dozen people attended his funeral.

Darwin died in 1882. He was buried at Westminster Abbey. The whole intellectual world recognized his triumph. Yet in 1858, when Darwin's jointly published article along with Alfred Wallace appeared in an obscure academic journal, no one noticed. A year later, Origin of Species sold out in one day, but the print run was only 1,200 copies. Then the book reviewer for the London Times sent it back, saying he was not qualified to review it. The editor then sent it to Thomas Huxley. And the rest was history. The reclusive, timid, hypochondriac Darwin got his public bulldog. On such events are revolutions made.

Even Marxism was a fluke. In 1842, Marx was an unemployed German with a Ph.D. in Greek philosophy from an obscure university: Jena. He was a revolutionary with no followers. He had no interest in economic history or communism. Engels was a revolutionary, but not a communist. At the end of 1842, he was converted to communism by Moses Hess. Marx became a communist in 1843. The first sign of Marxian communism as a system is in the unpublished manuscript, The German Ideology (1845). Hess wrote part of it. Hess later went on to found Zionism. He was the most important nineteenth-century figure hardly anyone has ever heard of: the founder of two major movements.

We don't know what the future will bring, other than more of the same. We know this: 2100 will not be what we imagine.

We can extend trends, such as Moore's law and its successors, but that will not be more of the same.

The establishments of the world think they can give direction to social forces. They are wrong. They are huffing and puffing to keep up with the results of their own creativity.
