Astrology in Encarta Encyclopedia


|| Jaya Jagannath ||

Dear Jyotisha,

 

Here I am posting the information on Astrology as mentioned in the Encarta
Encyclopedia... There is plenty of misinformation in it. I hope that the times
will surely change in our favour.

 

Best Wishes

Sarajit Poddar

SJC- Asia

 

_____________________________

Encarta Historical Essays reflect the knowledge and insight of leading

historians. This collection of essays is assembled to support the National

Standards for World History. In this essay, Peter N. Stearns of Carnegie Mellon

University argues that the ability to accurately predict the future depends on

how well we understand the past.

 

Predicting the Future: How History Counts

By Peter N. Stearns

Humans have long been interested in predicting the future. It is impossible to

know when groups of people became aware that what happens in the future is

likely to differ from what is happening at the present moment, but realize this

they did. Over the years, societies have developed various ways to try to divine

the future. Some groups attempted to acquire insight into events through magic

or contact with the supernatural. To do this, they might have read portents in

the entrails of animals or in tea leaves. In ancient Rome, generals used these

methods to calculate their likely success in upcoming battles. Reliance on

patterns of stars as a means of predicting personal futures also developed

early on. Astrology, the study of how events on earth correspond to the

positions and movements of astronomical bodies, was a key science in classical

China, Greece, and Rome, and in the Islamic Middle East. Although astrology and

astronomy went their separate ways during the 1500s, as late as the 17th century

many Europeans consulted astrologers to calculate the fate of an imminent
wedding or to interpret a sign of illness. Scientists have long rejected the

principles of astrology. Even so, millions of people continue to believe in or

practice it.

Well before the considerable decline of beliefs in magic by the 18th century,

however, human societies had also developed ways to think about the future in

clearer relation to historical time. That is, they became aware that their

societies had pasts, and they tried to relate those pasts to the future. Most

of the forecasts we deal with today, such as those that inform military or

business policy, actively use history because the forecasters assume a

connection among past, present, and future events. As we will see, the types of

connections on which predictions are based, as well as the success rate of those

predictions, vary hugely. However, the need to assess predictions applies

regardless.

Three major predictive modes, or kinds of history-to-future thinking, exist. The

first mode to arise, and one that is still widely used today, is based on

assumptions about the recurrence of historical events and patterns. Analysts

who employ this predictive mode assume that certain types of past developments

will happen again, and that by understanding history, they can better handle

future recurrences. This thinking lies behind the familiar phrase, “Those who

do not know the past are condemned to repeat it.” The second predictive mode to

develop, and by far the most dramatic, involves assumptions about a phenomenon

called historical disruption. In this mode, prediction highlights the belief

that some force is about to radically change the course of history, and

therefore, the future. The third predictive mode, not necessarily the newest

but certainly the one developed most systematically during the past century,

involves looking to recent history for the trends that are likely to continue

in the future. Although this is the most conservative approach to using history

to predict the future, it is often the most accurate. However, each of these

attempts to use history as a basis for predicting the future is inherently

flawed. Therefore, they do not provide entirely accurate descriptions of the

future. Perhaps this is why some people continue to prefer fortune-tellers and

astrological charts to predictions based on historical events.

Predictive Mode I: Cycles and Analogies

Probably the first systematic use of historical knowledge to predict the future

assumed that human history moved in cycles, that is, that what had happened

before would later recur. Many Chinese historians adopted this cyclical view,

which seemed to describe the experience of the imperial dynasties: A new

dynasty would come into power, flower, and then decline, and the cycle would

begin again with the next dynasty. The Confucian thinker Mencius, who lived

from about 371 to 289 BC, argued that every 500 years, a “true king” would

arise in China. Other societies speculated about historical cycles, though some

based their calculations more on the properties of numbers than on real

understanding of the past. This was true of the Maya belief in cosmic cycles.

The Maya believed completely that the gods controlled certain units of time and
all of the people’s activities during those times.

The intellectual tradition that developed in Western Europe during and after the

Renaissance rarely included study of formal cycles. Christian predictions, as we

will see, tended to emphasize a sudden and dramatic shift from the past to a

very different future, rather than a recurrence of past events. However, many

intellectuals did believe that particular patterns of events might recur, and

they believed that one could use historical analogies to get a sense of future

developments. Historical analogy remains a vital tool in predicting the future,

whether in personal life or in wider political or military realms.

Analogy works like this: a person faced with a particular situation wants to

know what will happen next, although he or she recognizes that the future is

hard to predict. So the person recalls a past situation or pattern roughly

similar to what he or she is newly experiencing, hoping that this will give

some approximate idea of how the current situation will play out in the future.

Analogical thinking can be used to make predictions in countless personal

situations. Most obviously, if we have had some striking success or failure in

the past and a new situation seems to be developing in a similar

fashion—representing an analogy, in other words—then we are likely to assume

that we can safely repeat, or must avoid, the past action in order to succeed.

Government leaders often use analogy to set policy. In hosts of conflicts after

World War II, including Korea and Vietnam, American policymakers assumed that

they could not back down in the face of a presumed Communist threat. This
assumption was based on what happened after Britain and France’s failed attempt
in 1938 to appease Germany. (Appeasement is the policy of conceding to the
demands of rival states in order to avoid war.) To stop Hitler’s ambitions and
avoid further conflict after the devastation of World War I, war-weary Britain
and France agreed in the Munich Pact that the Sudetenland, a region of
Czechoslovakia, could be ceded to Germany. Rather than curbing Hitler’s
ambition, the Munich Pact was merely followed by Hitler’s invasion of Poland
and the outbreak of World War II. The

lesson, or analogy, of Munich became a dominant model of what could happen in

the future unless threatened countries adopted different intervention

strategies. During the 1970s, many American policymakers, concerned about

Middle Eastern limits on oil supply, sought an analogy for facing shortages of

a vital material that would guide their thinking about future policy. For a

time, many found a suitable analogy in the World War II development of

synthetic rubber, which replaced sources of natural rubber blocked by the

Japanese. Here, as well as with the Munich Pact example, policymakers assumed

that they could envision the future by analogizing a current situation with a

known past—in this case, developing a synthetic substitute for oil. However,

oil flow resumed and the analogy was dropped. More recently still, analogies

related to the Nazi slaughter of Jews may have helped motivate or justify the
1999 NATO intervention against Yugoslavia’s ethnic cleansing practices.

Analogical thinking about the future strongly guided the historical tradition

that developed during the European Renaissance, which revived patterns that had

first emerged in ancient Greece and Rome. For example, accounts of how earlier

generals and political leaders coped successfully with roughly similar problems

in the past were used to advise leaders about military or political

uncertainties to come. During this time Machiavelli (1469-1527), a leading

Italian political thinker, argued that a prince confronted with unrest in his

state could shape the future by looking at decisive actions taken by earlier

Roman or Italian princes. The situations would be sufficiently similar, the

thinking went, that the future results of policy could be safely predicted.

Well into the 20th century, elite educators in Europe and the United States

assumed that studying historical cases, particularly those from the classical

past, would provide valid guidance for future strategy—not prediction exactly,

but rather a sense of what would work effectively amid uncertainty. Even today,

the study of battle histories forms an important part of the training of

military leaders.

Looser kinds of analogical thinking affect more general forecasts. At various

points during the 20th century, worried intellectuals wondered if Western

society was collapsing, much as the Roman Empire had by the 5th century. They

attributed the Roman collapse to barbarian pressure from outside; moral decay

of the upper class, which turned from the pursuit of the public good to the

pursuit of pleasure; and corrupt urban masses kept in line by government

handouts and entertainment. Recognizing these factors in modern times, the

intellectuals worried that the results would be the same. During the 1920s

Oswald Spengler pointed out these parallels in his book The Decline of the West

(1918-1922), winning wide attention amid the gloom of this period in Europe:

The Roman Imperium collapsed, and thus only two of the three empires [China and

India] continued, and still continue, as desirable spoil for a succession of

different powers. Today it is the ‘red-haired barbarian’ of the West who is

playing before the highly civilized eyes of Brahman and Mandarin, in the role
once played by Mongol and Manchu, playing it neither better nor worse than

they, and certain like them to be superseded in due course by other actors.

 

In sum, a key way that analysts and policymakers think about the future involves

the idea of recurrence—using the past as a predictive guide. Few people today

believe that formal historical cycles exist, but almost everyone uses analogy

in all sorts of forecasting situations, large and small. Knowing that we do

this, and understanding how our analogies are based on assumptions we make

about recurring historical patterns, is a vital step in deciding whether this

predictive mode is particularly fruitful. Many historians argue that it is not.

Predictive Mode II: Historical Disruption

A second use of history to envision the future is quite different from its use

in analogical thought. In the predictive mode that looks for historical

disruptions, analysts assume that human history has gone along a fairly

well-defined path for quite a while but that some force is about to move it in

a dramatically different direction. In this case, the past becomes not a guide

to the future, but rather a measurement of how striking the change is. One can

look at the historical disruption predictive mode from two standpoints, the

religious and the secular.

Religious Interpretations of the Dramatic Disruption Formula

The religions that developed in the Middle East before and after the 1st century

AD set an intellectual basis for thinking about the future in terms of dramatic

disruption. However, prophets were often imprecise in their predictions. The

Hebrew religion was the first to take this approach, developing the assumption

that at some future point a messiah would come. By introducing the kingdom of

God to earth, this messiah would change the whole framework of human existence.

The Jewish faith combined painstaking interest in history with the assumption

that the established pattern would sometime be drastically altered by

intervention from on high.

Islam incorporated the same kind of expectation about future change when it

assumed that a Last Day, or day of judgment, would come. The Koran stipulated

that this event would usher in a period of 50,000 years during which people

would be sorted according to their assignments to heaven or to hell, “till the

judgment among men is finished.”

Christianity also predicted a divinely guided transformation in which human

history would end. The book of Mark, in the New Testament, conveyed this

forecast from Christ: “But in those days…the sun shall be darkened, and the

moon shall not give her light…And then shall they see the Son of man coming in

the clouds with great power and glory” (Mark 13:24-26). The Book of Revelation,

written about AD 95, forecast a 1,000-year reign of Christ, followed by Satan’s

return and a period of terrible trial, which was then followed by the final

resurrection of those who were to be saved (see Revelation 21:4-10).

Christianity thus set up an assumption of human history’s end, and a version of

apocalyptic or millennial future change. Early Christian leaders assumed that

Christ would soon return; however, this expectation gradually faded, leaving

views of the future more amorphous.

During the early Middle Ages (AD 400 to 1000), many Christian intellectuals were

content to write year-by-year records of the history they witnessed, without

much attention to the future one way or the other. However, this

straightforward pattern of recording history began to change during the 12th

and 13th centuries. With this change a much clearer strand of Christian

thinking about dramatic futures took shape. What the change required, along

with the Biblical framework, was a firmer sense of time, in the sense both of

history and of numerology. European interest in mathematics awakened, thanks to

exposure to Arab learning and recovered materials from ancient Greece.

Chroniclers began to list the number of people in armies, while Italian cities

took statistical inventories. Further evidence of this awakening numerical

sense is demonstrated in one astronomer’s recording of the duration of a 1239

eclipse in terms of “the time taken to walk 250 paces.” Joined with this new

numerical sense was a more precise grasp of the calendar. The Catholic Church

hailed the arrival of the 14th century with a papal jubilee, the first time in

history such a calendar-based event was celebrated.

At the same time, a vivid Christian apocalyptic subculture was emerging. Within

this subculture, people assumed divine intervention would interrupt the normal

course of history. This intervention would shape a future fraught with dire

political and natural catastrophes, though ultimately crowned by the definitive

reign of God. A number of 12th-century theologians had already begun to

speculate about the reign of the Antichrist, or enemy of Christ, which the book

of Matthew said would occur before the Judgment Day. Joachim of Fiore, born

about 1130, became the first clearly apocalyptic prophet when he developed a

theory of historical ages based on the steady unfolding of the Trinity. Joachim

saw the implications of calendar divisions and used them as a basis for looking

into the future, setting about forty generations for each age. Reckoning
roughly thirty years to a generation, forty-some generations from the birth of
Christ pointed to the mid-13th century: on this basis, Joachim expected the
arrival of the Antichrist and an end to the normal patterns of human history to
occur probably between 1256 and 1260.

This tradition of apocalyptic forecasting then flourished for several centuries,

with Nostradamus emerging in the 16th century as the most ambitious single

prophet of modern times. A French doctor, Nostradamus began issuing books of

prophecy in the 1550s. He foretold an array of impending disasters stretching

from the immediate future to the year 2000. He warned of wars, of the murder of

kings, of great fires, and of huge naval battles. Later observers have cited

these descriptions as predictions of developments as diverse as the Great Fire

of London in 1666, the American and French revolutions (1775-1783 and

1789-1799), Hitler’s attack on Poland in 1939, and World War II (1939-1945).

Nostradamus’ assertion that “the great man will be struck down in the day by a

thunderbolt, at a young age” has been interpreted as a prediction of John F.

Kennedy’s assassination. Like Joachim, Nostradamus predicted the arrival of the

Antichrist. However, he predicted it would occur much later—around 1999—along

with new plagues and famines. Nostradamus’ prediction for 1999 is quoted in

James Randi’s book, The Mask of Nostradamus (1990).

The year 1999, seven months. From the sky will come a great King of Terror: To

bring back to life the great King of the Mongols, Before and after Mars [god of

war] to reign by good luck.

 

This apocalyptic subculture became less prominent by the 17th century, as

scientific thinking pushed ideas of apocalypse to the background. But beneath

the facade of intellectualism, Christian ideas about historical disruption by

God persisted. A host of small Protestant groups predicted an imminent end to

human history. Apocalyptic prophecies surrounded Martin Luther in Germany, and

then the English Civil Wars of the 17th century. Many millennialists, people

who predicted a 1,000-year period during which holiness would prevail on earth,

migrated to the United States seeking greater religious toleration. The emerging

nation became the world’s premier haven for apocalyptic thinking. It was in the

1790s, for example, that Mother Ann Lee, the founder of the Shakers, came to

the United States from England, where people had looked down on her views
about an imminent second coming of Christ. “I knew that God had a chosen people

in America,” she wrote. During the 18th century, many Americans indulged in

prophecies of the apocalypse, reacting to fears of Native American attacks or

even earthquakes. Jonathan Edwards, a leading American theologian during that

time, indulged in the hallowed pastime of playing with Biblical numbers to show

that the world was in its final stages.

Mainstream American Protestantism became more conservative during the 19th

century, but religious prophesying continued on its fringes. In upstate New

York during the 1830s, William Miller attracted as many as 50,000 followers for

his prediction that the world would end in 1843. Disappointed that Miller’s

prediction was wrong, the sect regrouped as the Seventh-day Adventists. Later,

members predicted that government growth, slavery, or what they saw as

pervasive sin were signs that the Biblical final age had arrived. During the

1850s, John Darby predicted a period called the tribulation, in which the

Antichrist would rule for seven years. According to Darby, the tribulation

would be preceded by a rapture, in which all believers would rise to meet

Christ in the air. During the 1880s, James Brooks gained wide publicity by

making a similar prediction. Brooks preached about a period of unprecedented

wickedness followed by Christ’s return. World War I, the Russian Revolution,

and later the Cold War have since been interpreted as signs of an imminent

historical disruption, inspired and ultimately guided by God.

This kind of apocalyptic prophesying persists today. In 1997, for example, the

sighting of the Hale-Bopp comet was treated by most Americans as an interesting

natural phenomenon. However, it was interpreted by members of a San Diego sect

as a sign of the world’s end and their own imminent salvation. To prepare for

this event, they all committed suicide. By 1998, a new crop of Christian and

New Age apocalyptic predictions had emerged in anticipation of the year 2000.

That year, a newspaper called Millennial News greeted readers with stories of

Biblical experts’ consensus that the final stage of history was at hand.

Meanwhile, certain television prophets contended that the 1990s constituted

“the most important decade in the history of the world.” They predicted a world

dictatorship that would lead directly to the Antichrist, followed by God’s

1,000-year reign and then a return of chaos. Their theories were based on both

numbers analysis—6,000 years had passed since creation, one millennium for each

day of creation—and current events—perceived rampant immorality, plagues, the

European Union, and the rise of China. Many people seemed to agree with this

prediction, including some individuals troubled by the Y2K problem, in which
computers that stored years as two digits would need to be adjusted to handle
the date 2000 or risk wreaking havoc on information systems worldwide.
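
The technical root of that worry is easy to illustrate. What follows is a
minimal sketch, not part of the original essay: hypothetical Python showing how
a two-digit year field breaks date arithmetic at the rollover to 2000, together
with the "windowing" repair commonly applied at the time.

# Minimal illustration of the Y2K problem: legacy systems often stored
# years as two digits, so arithmetic broke when "99" rolled over to "00".

def two_digit_age(birth_yy: int, current_yy: int) -> int:
    """Naive age calculation on two-digit years, as in legacy code."""
    return current_yy - birth_yy

print(two_digit_age(70, 99))   # born 1970, checked 1999: 29, as expected
print(two_digit_age(70, 0))    # same person checked in 2000: -70, not 30

# One common remediation, "windowing": pivot two-digit years around a
# cutoff so small values are read as 20xx and the rest as 19xx.
def windowed_year(yy: int, pivot: int = 50) -> int:
    return 2000 + yy if yy < pivot else 1900 + yy

print(windowed_year(0) - windowed_year(70))   # 2000 - 1970 = 30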

Secular Interpretations of the Dramatic Disruption Formula

Since the 18th century, secular versions of the dramatic disruption formula have

become a much more common form of prophecy among modern people. Elements of the

formula’s basic structure resemble characteristics of the religious version: a

basic force will transform the standard process of historical evolution into a

new age. However, secularists believed that an earthly force, not God, would

direct the change. Although the new secular prophets believed signs of the

change already existed, they rarely played with numerical formulas. Like the

religious prophets, they believed change would be extreme and would emphasize

either horrendous or beneficent transformations.

With scientific achievements blossoming and a general view of earthly progress

gaining ground, Antoine-Nicolas Condorcet, a figure in the late French

Enlightenment, wrote an essay called “Historical Outline of the Progress of the

Human Mind” (1794). In the essay, Condorcet forecast the establishment of a

virtually perfect human society. Condorcet wrote the essay while fleeing from

French revolutionary officials who sought his death, testifying to a new kind

of faith in a dramatic worldly future. Condorcet viewed history as a record of

steady progress, which by this point had reached a state where knowledge and,

through evolution, the human body were ready to make a leap into a final age of

bliss. He states his view in his essay:

We shall find in the experience of the past, in the observation of the progress

that the sciences and civilization have already made, in the analysis of the

progress of the human mind and of the development of its faculties, the

strongest reasons for believing that nature has set no limit to the realization

of our hopes.… The time will therefore come when the sun will shine only on free

men who know no other master but their reason; when tyrants and slaves, priests

and their stupid or hypocritical instruments will exist only in works of

history and on the stage; and when we shall think of them only to pity their

victims and their dupes; to maintain ourselves in a state of vigilance by

thinking on their excesses, and to learn how to recognize and so to destroy, by

force of reason, the first seeds of tyranny and superstition, should they ever

dare to reappear among us.

 

Building on this general vision, a host of 19th-century prophets predicted a

dramatically improved future, in which characteristic problems of earlier and

even present human history would be transcended. Utopian socialists thought

that education and example could end ages of human exploitation and selfishness

and introduce a future of perfect equality and communal harmony. Karl Marx, on

the other hand, saw the laws of history leading to a final proletarian

revolution. This revolution would dramatically end class warfare and injustice,

which had dominated human history since its beginning. It would also usher in a

classless, stateless utopia in which recurrent catastrophe and change would

end. In 1888 the American novelist Edward Bellamy wrote the bestseller Looking

Backward, which he set in the year 2000. In the novel, the United States has

become a socialist utopia in which equality and humane treatment have replaced

the evils of capitalism. In this scenario too, history has been transformed.

While some of these secular prophets, like Condorcet and particularly Marx, had

an explanation for why history would end, in many other secular forecasts the

mechanisms that would transform the future were a bit vague. The hand of God

responsible for drastic change in religious prophecies had not clearly been

replaced in the secular version. But during the 1860s a line of prediction that

identified a powerful causal force began to take shape, and that force was

technology. It was at this point that the French author Jules Verne began to

develop the science fiction genre, with titles like From the Earth to the Moon

(1865) and Journey to the Center of the Earth (1864). Verne’s writings

essentially predicted airplanes, submarines, space satellites and missiles, and

television. As Verne foretold these technological developments, he sketched a

picture of a transformed humanity with capacities and concerns far different

from those that previously had framed human history. Science fiction writers

have been amplifying this picture of the future ever since, some optimistically

as Verne had, others with dire warnings. As technology replaced divine

intervention as the catalyst for drastic change, two assumptions emerged:

first, that massive additional technological change is imminent, and second,

that it will dominate predictions of the human future. Technology, in other

words, reshapes human history and ushers in a dramatic new age. This vision is

depicted in Jules Verne’s From the Earth to the Moon and a Tour of the Moon:

Now when they observed the earth through the lower window, it looked like

nothing more than a dark spot, drowned in the solar rays. No more crescent, no

more cloudy light! The next day, at midnight, the earth would be new, at the

very moment when the moon would be full. Above, the orb of night was nearing

the line followed by the projectile, so as to meet it at the given hour. All

around the black vault was studded with brilliant points, which seemed to move

slowly; but, at the great distance they were from them, their relative size did

not seem to change. The sun and stars appeared exactly as they do to us upon

earth. As to the moon, she was considerably larger; but the travelers’ glasses,

not very powerful, did not allow them as yet to make any useful observations

upon her surface, or conversations all about the moon.… Besides, the excitement

of the three travelers increased as they grew near the end of their journey.

They expected unforeseen incidents, and new phenomena; and nothing would have

astonished them in the frame of mind they then were in. Their over-excited

imagination went faster than the projectile, whose speed was evidently

diminishing, though insensibly to themselves. But the moon grew larger to their

eyes, and they fancied if they stretched out their hands they could seize it.

 

While this generally optimistic strand of dramatic-disruption forecasting

persisted during the 20th century and in fact reached new heights during the

1970s and 1980s, other predictions took another tone as they reflected a

century that seemed to invite pessimism. The rise of modern armies and strong

governments created a new, darker mode of dramatic forecasting, first in World

War I and then through the emergence of unprecedented dictatorships in Stalin’s

Russia and Hitler’s Germany. This mode was based on worries that political force

had achieved overriding importance in society and was backed by technologically

enhanced mind control and numbing, hedonistic indulgence. Aldous Huxley, in

Brave New World (1932), and George Orwell, in Nineteen Eighty-four (1949),

described societies dominated by faceless, authoritarian bureaucracies and

populated by humans who were constantly monitored and manipulated, essentially

reduced to robot status. In each novel, efforts to regain human spontaneity and

joy were ruthlessly repressed. In both of these novels, technology is the

catalyst for dramatic future change. The character of O’Brien in George

Orwell’s Nineteen Eighty-four says:

The old civilizations claimed that they were founded on love or justice. Ours is

founded upon hatred. In our world there will be no emotions except fear, rage,

triumph, and self-abasement. Everything else we will destroy—everything.…There

will be no loyalty, except loyalty to the party. There will be no love, except

the love of Big Brother. There will be no laughter, except the laugh of triumph

over a defeated enemy. There will be no art, no literature, no science. There

will be no distinction between beauty and ugliness. There will be no curiosity,

no enjoyment of the process of life. All competing pleasures will be

destroyed.…If you want a picture of the future, imagine a boot stamping on a

human face—forever.

 

Another possible catalyst for drastic change vied with technology during the

1960s and briefly won wide attention: the population bomb. In this scenario,

the cause of a dramatically altered future was not God, or technology, or

revolution, but rather an uncontrolled birth rate. Demographers, particularly

those in the United States, painted a vision in which world population growth,

if unchecked, would lead to unprecedented disaster. People in crowded countries

would force their way into the less crowded, richer lands. Countries would fight

bitter wars over scarce resources. Food supplies would fail to keep pace with

population, and environmental deterioration would add to the problem. As the

number of people on Earth stripped the planet of its available resources, an

unprecedented change of historical patterns would take place. While the

specific population bomb forecast was short-lived, it did lead many Americans

to reduce the size of their families, ending the heady years of the post-World

War II baby boom. And its influence continues: some of the more dramatic

environmentalist arguments of the 1980s and 1990s have employed elements of the

population bomb prophecy.

In the most recent, widely articulated version of the disruptive forecast, the

catalyst is again technology. These forecasts are buoyed by the recent

developments in robotics, computerization, and genetic engineering. During the

1970s and 1980s, a host of academic and popular prophets predicted that the

expansion of new technology would herald the arrival of a postindustrial

society in which fundamental features of human life and organization would be

radically altered. While some postindustrial prophets warned of problems to

come, the bulk of this forecasting was resolutely optimistic: the future would

be dramatically different from the historical past, and the transforming power

of technology would improve it immeasurably.

Time spent working would decline, and human beings would find new personal

satisfaction in leisure activities, such as electronic games, perhaps. As

people could again work at home, the role of cities would change. Cities would

cease being production centers and would survive only as entertainment

complexes. Social structure would be transformed as well; the key to power

would be control of information rather than ownership of land or capital. New

technologies that empowered individuals would reverse previous trends of

organization: people could work according to their own personal schedules, and

products would be tailored to individual taste. A few critics of this type of

forecast wondered if it applied to the whole world rather than merely the

wealthiest countries. But some forecasters argued that dissemination of

technology could propel even poor societies into postindustrialism.

Alvin Toffler’s interpretation of the postindustrial world is portrayed in The
Third Wave (1980):

So profoundly revolutionary is this new civilization that it challenges all our

old assumptions. Old ways of thinking, old formulas, dogmas, and ideologies, no

matter how cherished or how useful in the past, no longer fit the facts. The

world that is fast emerging from the clash of new values and technologies, new

geopolitical relationships, new life-styles and modes of communication, demands

wholly new ideas and analogies, classifications, and concepts. We cannot cram

the embryonic world of tomorrow into yesterday’s conventional cubbyholes. Nor

are the orthodox attitudes or moods appropriate. This Third Wave of historical

change represents not a straight-line extension of industrial society but a

radical shift of direction, often a negation, of what went before. It adds up

to nothing less than a complete transformation at least as revolutionary in our

day as industrial civilization was three hundred years ago. Furthermore, what is

happening is not just a technological revolution but the coming of a whole new

civilization in the fullest sense of that term.

 

Interestingly, while faith in technology remained high during the 1990s, no new

prophets emerged to propose another compelling futuristic vision. Except for

the religious millennialists, the late 1990s did not produce a new wave of

dramatic-disruption theories.

Predictive Mode III: Trend Extrapolation

The two predictive modes discussed previously used history either as a source

for recurring analogies or as a foil for dramatically altered futures. Another

mode uses history in yet a different way. As its name suggests, trend

extrapolation uses history to identify trends. While trend extrapolation was

practiced to some extent previously, it has gained wide currency in recent

decades. Reasons for this are twofold. The first is the explosion of expert

knowledge—including knowledge about history. The second is growing demand by

government and business leaders—especially stockbrokers—for best-possible

forecasts of what the future will bring. In fact, some agencies, such as the

U.S. Social Security Administration, require that 50-year forecasts be used

as guidelines for setting funding policies. These forecasts are always based on

extrapolation of current relevant trends.

Trend-based forecasting has two parts. First, the analyst must identify

important current trends in a particular society. Second, and more important,

he or she must determine the causes of these trends and probable reasons for

their persistence. Not all trends, after all, are durable, making it crucial

that analysts assess the causes of trends.

The safest short-term forecasts, and some of the predictions most familiar to

us, are based, in fact, on extrapolation or projection. An example of this type

of forecast is the prediction that the average age of populations in the United

States and Japan will increase over the next several decades. This assumption

is based on knowledge of an already low birthrate and increasing life

expectancy. It holds that the average age in both countries, already unusually

high by historical standards, will be even higher by the year 2020. Thus,

whereas today in the U.S. there are three workers for every one person over 65,

in 2020 there will be only two workers for every one person over 65. Some

stabilization may occur thereafter, but current trends suggest that aging will

continue more slowly as the birthrate stagnates and as adult life expectancy

rises gradually. This forecast assumes, of course, that present trends will not

be disrupted by some new surge in the birthrate, by a higher death rate among

adults in late middle age, or by new immigration patterns that alter the

current demographic structure.
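
The mechanics of such an extrapolation are simple enough to sketch. The
following hypothetical Python is not from the essay: only the three-to-one and
two-to-one worker-per-retiree figures come from the paragraph above, while the
anchor year 2000 and the assumption of a steady linear decline are illustrative.

# Toy trend extrapolation of the workers-per-retiree ratio. Anchor
# points: roughly 3:1 "today" (taken here as 2000) and 2:1 in 2020,
# per the essay; everything else is illustrative assumption.

def extrapolate_ratio(year: int, y0: int = 2000, r0: float = 3.0,
                      y1: int = 2020, r1: float = 2.0) -> float:
    """Linearly extend the ratio between two anchor points, assuming
    the current trend simply continues unchanged."""
    slope = (r1 - r0) / (y1 - y0)     # change in ratio per year
    return r0 + slope * (year - y0)

for year in (2000, 2010, 2020, 2040):
    print(year, round(extrapolate_ratio(year), 2))
# Pushed to 2040, the line reaches 1.0 worker per retiree: exactly the
# kind of extension the forecast above warns can be disrupted by a new
# birth surge, mortality shifts, or immigration.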

The same kind of trend analysis could take this prediction a bit further into

the future. Such a prediction might hold that by the mid-21st century, most

societies will begin to encounter the aging of their populations that the West
experienced in the 1920s and at the end of the baby boom and that, more
recently, Japan has experienced. Current trends that support this prediction
are the rapid per capita birthrate decline (despite continuing population
growth resulting from previously high birthrates), the eventual stabilization
of the population, and increased life expectancy. So, by trend extrapolation,
one might argue that

the 21st century could very well shape up as the geriatric century, or century

of old age. Indeed, a fair portion of public policy during the century may well

be shaped by the growing elderly segment as older people adjust to their new

roles and as society reacts to the group’s unprecedented large numbers.

Trend assessment can cast a long shadow, particularly if it is combined with

examples or analogies from relevant past history. During the mid-1990s, many

investment bankers argued that for the next 50 to 100 years, the rapid economic

growth areas of the world would be East Asia, east central Europe, and Latin

America, plus possibly South Africa and perhaps even Russia. This projection

was based on recent trends, such as the fast growth in the Pacific Rim and in

Latin America. The gross national products of key Latin American countries are

expected to grow 7 to 8 percent each year in the near future, and China has

already been generating 10 percent growth. In making their forecast, the

investment bankers combined these trends with the more general historically

grounded assumption that younger industrial areas grow particularly fast,

compared both to mature industrial areas and to nonindustrial areas. The

historical bases of this assumption are Japan from 1920 to 1960 and the United

States from 1870 to 1930.
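
A quick compounding calculation shows why such rates invite long projections,
and why extending them a full century strains credulity. This hypothetical
Python sketch is not from the essay; only the 7 to 8 percent and 10 percent
annual rates come from the paragraph above, and the starting index of 100 is an
arbitrary illustration.

# Compound the growth rates quoted above (7-8 percent for key Latin
# American economies, about 10 percent for China) over longer spans.

def project(index: float, rate: float, years: int) -> float:
    """Compound an index forward at a fixed annual growth rate."""
    return index * (1 + rate) ** years

for rate in (0.07, 0.10):
    print(f"{rate:.0%}: 10y={project(100, rate, 10):,.0f}  "
          f"50y={project(100, rate, 50):,.0f}  "
          f"100y={project(100, rate, 100):,.0f}")
# At 7 percent an economy doubles about every decade (rule of 70), and
# a century at 10 percent multiplies the index nearly 14,000-fold,
# which is why century-long extensions of the trend look doubtful.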

These recent predictions did not consider every possibility, however. They

assumed that there would be no new factors, such as a major war. In addition,

they did not consider changes that could be brought about by the unanticipated

realization of potential in Africa, South Asia, or elsewhere, changes that

could very well echo those that occurred in Latin America after the 1930s.

Finally, the predictions did not anticipate the more recent financial problems

of the Pacific Rim. Even so, whatever the ultimate accuracy of the investment

bankers’ forecasts, they demonstrate how much information can be gleaned from

trend extrapolation when it is supplemented by historical background that

suggests solid causation. The projections seem plausible, at least for the next

several decades; however, the temptation to extend them farther, possibly to

cover a full century, may well demonstrate their limitations. Even so,

discussions of trends constantly lure us in ambitious directions because they

originate in clear, verifiable data.

Trend projection routinely shows up in job forecasts. Occupations for which

workers are needed during the present decade are touted as the source of peak

opportunities in the next. Based on this type of trend forecasting,

occupational analysts predict that film projectionists will lose work to

automation and jobs for actors and amusement park attendants will soar. Medical

practitioners of all sorts will be in new demand. Paralegals and software

innovators will have their pick of jobs, as will security guards. And so the

forecasts roll on, telling us that what is now happening will happen even more

intensely over the next short term.

And while trend analysis yields forecasts that are quite useful and widely studied, as

in the case of job forecasts, it also may result in forecasts that many

policymakers wish to ignore. For example, it seems safe to argue that over the

next 20 years, more and more countries will acquire nuclear weapons. The trend

has already begun—witness Pakistan’s demonstration of nuclear weapons in 1998.

The causes for this trend are solidly in place: technological and scientific

capacity continues to increase steadily, more and more regional powers want to

demonstrate their strength, and great-power influence is declining—and the

great powers sabotage their case for nuclear limitation by refusing to abandon

their own arsenals. By 2020, many countries, not just a few, will have become

nuclear nations. Given this prediction, one would think current policymakers

would want to prepare for this future, rather than ignore it. The big question

that arises out of this situation, of course, depends not on trends but rather

on historical analogy: in the past, once a country acquired weapons, it

normally used them. Will this pattern recur, or can we alter this historical

precedent?

Although trend analysis has been used successfully in specific areas such as

aging or weaponry, it can also be used to create sweeping generalities. Many

authorities believe that societies around the world are modernizing in some

predictable directions, so that, unless some unforeseen catastrophe occurs,

they will become increasingly similar. In the course of this modernization,

industrial production will spread; families will increasingly consist only of

parents and children and reflect lower birthrates; education will become more

widespread, leading to growing familiarity with science; gender inequalities

will diminish; and consumer culture will capture ever more attention. Experts

argue that these trends are already visible in most societies, and they predict

the trends will continue to spread. Some experts would use recent expansions of

democracy on most continents, from Paraguay to South Africa and from Taiwan to

Poland, to support their argument that democratic political systems are also a

wave of the future. As we can see, while trend analysis is not generally used

to make such sweeping predictions as those made with the dramatic disruption

formula, its implications can be wide ranging.

Why We Cannot Know the Future

With the three major prediction forms at our disposal, all of them plausible and

widely used, why does the future continue to elude us? Why are so many

predictions wrong? Many seem plausible at the time they are made—even the 1940s

forecast that by the 1970s everyone would be riding around in helicopters rather

than cars and the predictions during the 1970s that communes would replace

individual families and youth would become a revolutionary force. Why are we

still wrong?

In the first place, the three predictive modes clash with each other. A forecast

based on the assumption that current selective trends will intensify in the

future will, by its nature, be different from a prediction based on the theory

of dramatic disruption. In addition, both of these predictive modes tend to

downplay the use of historical cycles or recurrence. To understand the inherent

differences of the dramatic disruption and trend forecasting modes, and how no

one single mode can accurately predict the future, consider two possible

predictions about the year 2020. Will this be an age dramatically transformed

by further computerization and robotics? Or will the processes of aging

dominate it more, as retired people come to make up a greater percentage of the

population? Both characteristics might apply, perhaps, but few forecasters

manage to put together a picture with this kind of complexity. Technology

proponents tend to ignore the effects of the aging population, and trend

watchers might overlook technology’s potential for creating a dramatic

disruption. Thus, we do not really have models rich enough to capture what is

likely to happen.

Further, each prediction mode has its own characteristic vulnerability, based on

its very use of history. For example, analogies based on the idea of recurrence

assume that historical events or patterns will be sufficiently alike over time

to allow comparable actions with comparable effects. But many historians

believe that real comparability is quite rare, revealing the inherent

limitations of analogy. When a university president makes an analogy such as

“computers will do for education what the steam engine did for manufacturing,”

are the cases close enough to have much real meaning beyond the obvious

allusion to dramatic change? In an extreme case, pursuing analogy can lead to

disaster, as it did when France, during the 1930s, mistakenly assumed that

World War II would be like World War I and built an elaborate fortification

line along its eastern border to prevent German invasion. France was at an
immense disadvantage in World War II, when the Germans, with their new
technology,

simply swept around the line.

Use of the dramatic disruption theory to predict the future obviously depends on

faith—in God, or technology, or some sweeping political cause. Because of this,

predictions based on it cannot be disproved save by the passage of time. Most

of the disruptive forecasts have not come fully true yet. The year 1984, for

example, passed with little resemblance to what Orwell had predicted in his

novel. A key weakness in the dramatic disruption mode is the assumption that

one factor will shape the future. Actual human history has shown that major

societal changes are usually caused by several factors, and they embrace

considerable continuity as well. As we can see, reliance on a single dramatic
event as the impetus of future change is unrealistic because the approach is
too simplistic given society’s complexity.

Trend analysis, the most conservative predictive mode, on the other hand, is

vulnerable to unexpected variables. For example, any number of events could

disrupt the trend of an aging population. New costly insurance policies might

deny medical care to the elderly and therefore curb adult longevity. New

immigration policies could bring young people in from other countries and alter

the age balance of the population. Or birth rates might unexpectedly increase,

as in the surprise baby boom of the 1940s. The theory that societies will

become similar through modernization is compelling but does not consider

unexpected variables in religious developments, such as the rise of Islamic and

Hindu fundamentalism in the Americas.

In conclusion, in spite of our best efforts to make predictions by using

historical analogy, studying historical cycles or trends, or identifying

catalysts of dramatic disruptive change, we cannot know what the future holds.

We can, however, enjoy speculating about it, and study why some predictions are

more plausible than others. History will remain a key basis for this assessment.

We will sort out predictions by how, and how well, they use history. In spite of

our acknowledgment that one cannot know for sure what the future holds, in all

probability, we can predict that forecasts about the future will continue to be

based on the past.

About the author: Peter N. Stearns is the Heinz Professor of History and Dean of

the College of Humanities and Social Sciences at Carnegie Mellon University. He

is the author of Millennium II, Century XXI: A Retrospective on the Future,
among numerous other publications.

© 1993-2003 Microsoft Corporation. All rights reserved.
