Street Car Washington DC
The latter part of the 19th century was a period of rapid technological progress, and that progress was quickly adopted by industry and consumers. Beyond the railroads, which had the greatest impact, other inventions affected American growth. During this period, the electric light bulb was invented and commercialized, and the phonograph was invented as well.
Of course, another major invention during this period was the telephone. When Alexander Graham Bell invented the telephone, Western Union (the telegraph company) was seen as an unassailable monopoly — with 12,000 telegraph offices, 400,000 miles of telegraph wire, and annual profits of $7,000,000. Western Union scoffed at Bell's invention, calling the telephone "a hobby". However, some of the most important inventions were incremental innovations that came together on the factory floor and resulted in a massive increase in factory productivity. Small factories grew into big factories and became ever more productive. New machines allowed one employee to turn out 300 boots a day. With the invention of new, increasingly efficient equipment, one factory in Massachusetts produced more shoes in a year than all 32,000 bootmakers in Paris. There were fewer factories, producing more goods, with fewer employees. It should be noted that factory employment grew overall throughout this period, but production outpaced employment gains.
New technologies began to transform America. Electric street cars were introduced toward the end of the century, allowing cities to expand dramatically. In 1890, 1,260 miles of street car track had been electrified. By 1903, that figure had tripled.
How has technology changed - and changed us - in the past 20 years?
Just over 20 years ago, the dotcom bubble burst, causing the stocks of many tech firms to tumble. Some companies, like Amazon, quickly recovered their value – but many others were left in ruins. In the two decades since this crash, technology has advanced in many ways.
Many more people are online today than at the start of the millennium. In 2000, just half of Americans had broadband access at home; today, that number sits at more than 90%.
This broadband expansion was certainly not just an American phenomenon, and similar growth can be seen on a global scale: while less than 7% of the world was online in 2000, today over half the global population has access to the internet.
Similar trends can be seen in cellphone use. At the start of the 2000s, there were 740 million cellphone subscriptions worldwide. Two decades later, that number has surpassed 8 billion, meaning there are now more cellphone subscriptions in the world than people.
At the same time, technology was also becoming more personal and portable. Apple sold its first iPod in 2001, and six years later it introduced the iPhone, which ushered in a new era of personal technology. These changes led to a world in which technology touches nearly everything we do.
Technology has changed major sectors over the past 20 years, including media, climate action and healthcare. The World Economic Forum's Technology Pioneers, which just celebrated its 20th anniversary, gives us insight into how emerging tech leaders have influenced and responded to these changes.
How Does Technology Affect the Economy?
Technology has affected the economy through direct job creation, contribution to GDP growth, creation of new services and industries, workforce transformation and business innovation. The use of technology has been linked to marketplace transformation, improved living standards and more robust international trade. Technology has revolutionized virtually every industry in the economy.
Technological advances have significantly lowered the cost of doing business. For example, a manufacturing plant can be operated by just a few technicians controlling robotic systems. Innovative inventory systems are capable of supplying needed parts within a short time for assembly. Computers have drastically reduced the cost of carrying large inventories of intermediate parts and finished products via computerized accounting. Advancements in the computer industry, coupled with advancements in telecommunications, have increased job opportunities and strengthened economic growth. The Internet has overcome the physical barriers to communication over distances, and organizations and individuals can easily place orders through an online platform.
Similarly, manufacturing companies have developed tech-based links to their suppliers and customers. Suppliers can keep track of production-line efficiencies through computer hook-ups and can more efficiently ship parts and materials to the required location, reducing inventory and downtime. International manufacturing companies connect design centers in different countries to create international teams. E-commerce and online-banking capabilities have also helped reduce the cost of doing business.
The Long Boom: A History of the Future, 1980-2020
A bad meme—a contagious idea—began spreading through the United States in the 1980s: America is in decline, the world is going to hell, and our children's lives will be worse than our own. The particulars are now familiar: Good jobs are disappearing, working people are falling into poverty, the underclass is swelling, crime is out of control. The post-Cold War world is fragmenting, and conflicts are erupting all over the planet. The environment is imploding—with global warming and ozone depletion, we'll all either die of cancer or live in Waterworld. As for our kids, the collapsing educational system is producing either gun-toting gangsters or burger-flipping dopes who can't read.
By the late 1990s, another meme began to gain ground. Born of the surging stock market and an economy that won't die down, this one is more positive: America is finally getting its economic act together, the world is not such a dangerous place after all, and our kids just might lead tolerable lives. Yet the good times will come only to a privileged few, no more than a fortunate fifth of our society. The vast majority in the United States and the world face a dire future of increasingly desperate poverty. And the environment? It's a lost cause.
But there's a new, very different meme, a radically optimistic meme: We are watching the beginnings of a global economic boom on a scale never experienced before. We have entered a period of sustained growth that could eventually double the world's economy every dozen years and bring increasing prosperity for—quite literally—billions of people on the planet. We are riding the early waves of a 25-year run of a greatly expanding economy that will do much to solve seemingly intractable problems like poverty and to ease tensions throughout the world. And we'll do it without blowing the lid off the environment.
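The arithmetic behind "doubling every dozen years" follows the rule of 72: a quantity growing at r percent per year doubles in roughly 72/r years, so a twelve-year doubling implies annual growth of about 6 percent. A minimal Python sketch checks this (the 6 percent rate is inferred from the doubling claim, not stated in the essay):

```python
import math

def doubling_time(annual_rate):
    """Years for a quantity to double at a given annual growth rate."""
    return math.log(2) / math.log(1 + annual_rate)

# A 6% annual growth rate doubles the economy in about 12 years,
# matching the rule-of-72 shortcut: 72 / 6 = 12.
print(round(doubling_time(0.06), 1))   # ~11.9 years

# Over the essay's 25-year run at 6%, the economy would grow by:
print(round(1.06 ** 25, 1))            # ~4.3x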
If this holds true, historians will look back on our era as an extraordinary moment. They will chronicle the 40-year period from 1980 to 2020 as the key years of a remarkable transformation. In the developed countries of the West, new technology will lead to big productivity increases that will cause high economic growth — actually, waves of technology will continue to roll out through the early part of the 21st century. And then the relentless process of globalization, the opening up of national economies and the integration of markets, will drive the growth through much of the rest of the world. An unprecedented alignment of an ascendant Asia, a revitalized America, and a reintegrated greater Europe — including a recovered Russia — together will create an economic juggernaut that pulls along most other regions of the planet. These two metatrends — fundamental technological change and a new ethos of openness — will transform our world into the beginnings of a global civilization, a new civilization of civilizations, that will blossom through the coming century.
Think back to the era following World War II, the 40-year span from 1940 to 1980 that immediately precedes our own. First, the US economy was flooded with an array of new technologies that had been stopped up by the war effort: mainframe computers, atomic energy, rockets, commercial aircraft, automobiles, and television. Second, a new integrated market was devised for half the world—the so-called free world—in part through the creation of institutions like the World Bank and the International Monetary Fund. With the technology and the enhanced system of international trade in place by the end of the 1940s, the US economy roared through the 1950s, and the world economy joined in through the 1960s, only to flame out in the 1970s with high inflation—partly a sign of growth that came too fast. From 1950 to 1973, the world economy grew at an average 4.9 percent—a rate not matched since, well, right about now. On the backs of that roaring economy and increasing prosperity came social, cultural, and political repercussions. It's no coincidence that the 1960s were called revolutionary. With spreading affluence came great pressure from disenfranchised races and other interest groups for social reform, even overt political revolution.
Strikingly similar—if not still more powerful—forces are in motion today. The end of the military state of readiness in the 1980s, as in the 1940s, unleashed an array of new technologies, not the least of which is the Internet. The end of the Cold War also saw the triumph of a set of ideas long championed by the United States: those of the free-market economy and, to some extent, liberal democracy. This cleared the way for the creation of a truly global economy, one integrated market. Not half the world, the free world. Not one large colonial empire. Everybody on the planet in the same economy. This is historically unprecedented, with unprecedented consequences to follow. In the 1990s, the United States is experiencing a booming economy much like it did in the 1950s. But look ahead to the next decade, our parallel to the 1960s. We may be entering a relentless economic expansion, a truly global economic boom, the long boom.
Sitting here in the late 1990s, it's possible to see how all the pieces could fall into place. It's possible to construct a scenario that could bring us to a truly better world by 2020. It's not a prediction, but a scenario, one that's both positive and plausible. Why plausible? The basic science is now in place for five great waves of technology—personal computers, telecommunications, biotechnology, nanotechnology, and alternative energy—that could rapidly grow the economy without destroying the environment. This scenario doesn't rely on a scientific breakthrough, such as cold fusion, to feed our energy needs. Also, enough unassailable trends—call them predetermined factors—are in motion to plausibly predict their outcome. The rise of Asia, for example, simply can't be stopped. This is not to say that there aren't some huge unknowns, the critical uncertainties, such as how the United States handles its key role as world leader.
Why a positive scenario? During the global standoff of the Cold War, people clung to the original ideological visions of a pure form of communism or capitalism. A positive scenario too often amounted to little more than surviving nuclear war. Today, without the old visions, it's easy enough to see how the world might unravel into chaos. It's much more difficult to see how it could all weave together into something better. But without an expansive vision of the future, people tend to get short-sighted and mean-spirited, looking out only for themselves. A positive scenario can inspire us through what will inevitably be traumatic times ahead.
So suspend your disbelief. Open up to the possibilities. Try to think like one of those future historians, marveling at the changes that took place in the 40-year period that straddled the new millennium. Sit back and read through the future history of the world.
The Boom's Big Bang
From a historical vantage point, two developments start around 1980 that will have profound consequences for the US economy, the Western economy, then the global economy at large. One is the introduction of personal computers. The other is the breakup of the Bell System. These events trigger two of the five great waves of technological change that will eventually help fuel the long boom.
The full impact can be seen in the sweep of decades. In the first 10 years, personal computers are steadily adopted by businesses. By 1990, they begin to enter the home, and the microprocessor is being embedded in many other tools and products, such as cars. By the turn of the century, with the power of computer chips still roughly doubling every 18 months, everything comes with a small, cheap silicon brain. Tasks like handwriting recognition become a breeze. Around 2010, Intel builds a chip with a billion transistors—100 times the complexity of the most advanced integrated circuits being designed in the late 1990s. By 2015, reliable simultaneous language translation has been cracked—with immediate consequences for the multilingual world.
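The paragraph's milestones are consistent with the 18-month doubling rhythm it cites: a 100-fold jump in complexity is about log2(100), roughly 6.6 doublings, which at one doubling every 18 months takes about ten years — placing a billion-transistor chip around 2010 from a late-1990s starting point. A small Python sketch of the arithmetic (the framing as "years to grow by a factor" is an illustration, not from the article):

```python
import math

DOUBLING_PERIOD_YEARS = 1.5  # one doubling every 18 months

def years_to_multiply(factor):
    """Years needed for chip complexity to grow by `factor`
    at one doubling every 18 months."""
    doublings = math.log2(factor)
    return doublings * DOUBLING_PERIOD_YEARS

# A 100x increase takes log2(100) ~= 6.6 doublings, i.e. about
# 10 years -- consistent with late-1990s chips reaching a
# billion transistors around 2010.
print(round(years_to_multiply(100), 1))  # ~10.0 years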
The trajectory for the telecommunications wave follows much the same arc. The breakup of Ma Bell, initiated in 1982, triggers a frenzy of entrepreneurial activity as nascent companies like MCI and Sprint race to build fiber-optic networks across the country. By the early 1990s, these companies shift from moving voice to moving data as a new phenomenon seems to come out of nowhere: the Internet. Computers and communications become inextricably linked, each feeding the phenomenal growth of the other. By the late 1990s, telecom goes wireless. Mobile phone systems and all-purpose personal communications services arrive first with vast antennae networks on the ground. Soon after, the big satellite projects come online. By 1998, the Iridium global phone network is complete. By 2002, Teledesic's global Internet network is operational. These projects, among others, allow seamless connection to the information infrastructure anywhere on the planet by early in the century. By about 2005, high-bandwidth connections that can easily move video have become common in developed countries, and videophones finally catch on.
The symbiotic relationship between these technology sectors leads to a major economic discontinuity right around 1995, generally attributed to the explosive growth of the Internet. It's the long boom's Big Bang—immediately fueling economic growth in the traditional sense of direct job creation but also stimulating growth in less direct ways. On the most obvious level, hardware and infrastructure companies experience exponential growth, as building the new information network becomes one of the great global business opportunities around the turn of the century.
A new media industry also explodes onto the scene to take advantage of the network's unique capabilities, such as interactivity and individual customization. Start-ups plunge into the field, and traditional media companies lumber in this direction. By the late 1990s, the titans of the media industry are in a high-stakes struggle over control of the evolving medium. Relative newcomers like Disney and Microsoft ace out the old-guard television networks in a monumental struggle over digital TV. After a few fits and starts, the Net becomes the main medium of the 21st century.
The development of online commerce quickly follows on new media's heels. First come the entrepreneurs who figure out how to encrypt messages, conduct safe financial transactions in cyberspace, and advertise one to one. Electronic cash, a key milestone, gains acceptance around 1998. Then come businesses selling everyday consumer goods. First it's high tech products such as software, then true information products like securities. Soon everything begins to be sold in cyberspace. By 2000, online sales hit US$10 billion, still small by overall retail standards. Around 2005, 20 percent of Americans teleshop for groceries.
Alongside the migration of the traditional retail world into cyberspace, completely new types of work are created. Many had speculated that computer networks would lead to disintermediation—the growing irrelevance of the middleman in commerce. Certainly the old-style go-betweens are sideswiped, but new types of intermediaries arise to connect buyers to sellers. And with the friction taken out of the distribution system, the savings can be channeled into new ventures, which create new work.
The Birth of the Networked Economy
New technologies have an impact much bigger than what literally takes place online. On a more fundamental level, the networked economy is born. Starting with the recession of 1990-91, American businesses begin going through a wrenching process of reengineering, variously described at the time as downsizing, outsourcing, and creating the virtual corporation. In fact, they are actually taking advantage of new information technologies to create the smaller, more versatile economic units of the coming era.
Businesses, as well as most organizations outside the business world, begin to shift from hierarchical processes to networked ones. People working in all kinds of fields—the professions, education, government, the arts—begin pushing the applications of networked computers. Nearly every facet of human activity is transformed in some way by the emergent fabric of interconnection. This reorganization leads to dramatic improvements in efficiency and productivity.
Productivity, as it happens, becomes one of the great quandaries stumping economists throughout the 1990s. Despite billions invested in new technologies, traditional government economic statistics reflect little impact on productivity or growth. This is not an academic point—it drives to the heart of the new economy. Businesses invest in new technology to boost the productivity of their workers. That increased productivity is what adds value to the economy—it is the key to sustained economic growth.
Research by a few economists, like Stanford University's Paul Romer, suggests that fundamentally new technologies generally don't become productive until a generation after their introduction, the time it takes for people to really learn how to use them in new ways. Sure enough, about a generation after the introduction of personal computers in the workplace, work processes begin mutating enough to take full advantage of the tool. Soon after, economists figure out how to accurately measure the true gains in productivity—and take into account the nebulous concept of improvement in quality rather than just quantity.
By 2000, the US government adopts a new information-age standard of measuring economic growth. Unsurprisingly, actual growth rates are higher than what had registered on the industrial-age meter. The US economy is growing at sustained rates of around 4 percent—rates not seen since the 1960s.
The turn of the century marks another major shift in government policy, as the hidebound analysis of inflation is finally abandoned in light of the behavior of the new economy. While the Vietnam War, oil shocks, and relatively closed national labor markets had caused genuine inflationary pressures that wreaked havoc on the economy through the 1970s, the tight monetary policies of the 1980s soon harness the inflation rate and lead to a solid decade with essentially no wage or price rises. By the 1990s, globalization and international competition add to the downward pressure. By 2000, policymakers finally come around to the idea that you can grow the economy at much higher rates and still avoid the spiral of inflation. The millennium also marks a symbolic changing of the guard at the Federal Reserve Bank: Alan Greenspan retires, the Fed lifts its foot off the brake, and the US economy really begins to take off.
More Tech Waves
Right about the turn of the century, the third of the five waves of technology kicks in. After a couple false starts in the 1980s and 1990s, biotechnology begins to transform the medical field. One benchmark comes in 2001 with the completion of the Human Genome Project, the effort to map out all human genes. That understanding of our genetic makeup triggers a series of breakthroughs in stopping genetic disease. Around 2012, a gene therapy for cancer is perfected. Five years later, almost one-third of the 4,000 known genetic diseases can be avoided through genetic manipulation.
Throughout the early part of the century, the combination of a deeper understanding of genetics, human biology, and organic chemistry leads to a vast array of powerful medications and therapies. The health care system, having faced a crossroads in 1994 with President Clinton's proposed national plan, continues restructuring along the more decentralized, privatized model of HMOs. The industry is already booming when biotech advances begin clicking in the first decade of the century. It receives a further stimulus when the baby boomers begin retiring en masse in 2011. The industry becomes a big jobs provider for years to come.
The biotech revolution profoundly affects another economic sector—agriculture. The same deeper understanding of genetics leads to much more precise breeding of plants. By about 2007, most US produce is being genetically engineered by these new direct techniques. The same process takes place with livestock. In 1997, the cloning of sheep in the United Kingdom startles the world and kicks off a flurry of activity in this field. By the turn of the century, prize livestock is being genetically tweaked as often as traditionally bred. By about 2005, animals are used for developing organs that can be donated to humans. Superproductive animals and ultrahardy, high-yielding plants bring another veritable green revolution to countries sustaining large populations.
By the end of the transitional era, around 2020, real advances begin to be made in the field of biological computation, where billions of relatively slow computations, done at the level of DNA, can be run simultaneously and brought together in the aggregate to create the ultimate in parallel processing. So-called DNA computing looks as though it will bring about big advances in the speed of processing sometime after 2025—certainly by the middle of the century.
Then comes the fourth technology wave—nanotechnology. Once the realm of science fiction, this microscopic method of construction becomes a reality in 2015. Scientists and engineers figure out reliable methods to construct objects one atom at a time. Among the first commercially viable products are tiny sensors that can enter a person's bloodstream and bring back information about its composition. By 2018, these micromachines are able to do basic cell repair. However, nanotechnology promises to have a much more profound impact on traditional manufacturing as the century rolls on. Theoretically, most products could be produced much more efficiently through nanotech techniques. By 2025, the theory is still far from proven, but small desktop factories for producing simple products arrive.
By about 2015, nanotech techniques begin to be applied to the development of computing at the atomic level. Quantum computing, rather than DNA computing, proves to be the heir to microprocessors in the short run. In working up to the billion-transistor microprocessor in 2010, engineers seem to hit insurmountable technical barriers: the scale of integrated circuits has shrunk so small that optical-lithography techniques fail to function. Fortunately, just as the pace of microprocessing power begins to wane, quantum computing clicks in. Frequent increases in computing power once again promise to continue unabated for the foreseeable future.
The Earth Saver
All four waves of technology coursing through this era—computers, telecom, biotech, and nanotech—contribute to a surge of economic activity. In the industrial era, a booming economy would have put a severe strain on the environment: basically everything we made, we cooked, and such high-temperature cooking creates a lot of waste by-products. The logic of the era also tended toward larger and larger factories, which created pollution at even greater scales.
Biotech, on the other hand, uses more moderate temperature realms and emulates the processes of nature, creating much less pollution. Infotech, which moves information electronically rather than physically, also makes much less impact on the natural world. Moving information across the United States through the relatively simple infotechnology of the fax, for example, proves to be seven times more energy efficient than sending it through Federal Express. Furthermore, these technologies are on an escalating track of constant refinement, with each new generation becoming more and more energy efficient, with lower and lower environmental impact. Even so, these increasing efficiencies are not enough to counteract the juggernaut of a booming global economy.
Fortunately, the fifth wave of new technology—alternative energy—arrives right around the turn of the century with the introduction of the hybrid electric car. Stage one begins in the late 1990s when automobile companies such as Toyota roll out vehicles using small diesel- or gasoline-fueled internal-combustion engines to power an onboard generator that then drives small electric motors at each wheel. The car runs on electric power at low RPMs but uses the internal-combustion engine at highway speeds, avoiding the problem of completely battery-powered electric vehicles that run out of juice after 60 miles. The early hybrids are also much more efficient than regular gas-powered cars, often getting 80 miles to a gallon.
Stage two quickly follows, this time spurred by aerospace companies such as Allied Signal, which leverage their knowledge of jet engines to build hybrids powered by gas turbines. By 2005, technology previously confined to aircraft's onboard electric systems successfully migrates to automobiles. These cars use natural gas to power the onboard generators, which then drive the electric motors at the wheels. They also make use of superstrong, ultralight new materials that take the place of steel and allow big savings on mileage.
Then comes the third and final stage: hybrids using hydrogen fuel cells. The simplest and most abundant atom in the universe, hydrogen becomes the source of power for electric generators—with the only waste product being water. No exhaust. No carbon monoxide. Just water. The basic hydrogen-power technology had been developed as far back as the Apollo space program, though then it was still extremely expensive and had a nasty tendency to blow up. By the late 1990s, research labs such as British Columbia-based Ballard Power Systems are steadily developing the technology with little public fanfare. Within 10 years, there are transitional hydrogen car models that extract fuel from ordinary gasoline, using the existing network of pumps. By 2010, hydrogen is being processed in refinery-like plants and loaded onto cars that can go thousands of miles—and many months—before refueling. The technology is vastly cheaper and safer than in the 1960s and well on its way to widespread use.
These technological developments drive nothing less than a wholesale transformation of the automobile industry through the first quarter of the new century. Initially prodded by government decrees such as California's zero-emission mandate—which called for 10 percent of new cars sold to have zero emissions by 2003—the industrial behemoths begin to pick up speed when an actual market for hybrid cars opens up. People buy them not because they are the environmentally correct option but because they're sporty, fast, and fun. And the auto companies build them because executives see green—as in money, not trees.
This 10- to 15-year industrial retooling sends reverberations throughout the global economy. The petrochemical giants begin switching from maintaining vast networks that bring oil from remote Middle Eastern deserts to building similarly vast networks that supply the new elements of electrical power. Fossil fuels will continue to be a primary source of power into the middle of the 21st century—but they will be clean fossil fuels. By 2020, almost all new cars are hybrid vehicles, mostly using hydrogen power. That development alone defuses much of the pressure on the global environment. The world may be able to support quite a few additional automobile drivers—including nearly 2 billion Chinese.
While the end of the Cold War initiates the waves of technology rippling through our 40-year era, that's only half the story. The other half has to do with an equally powerful force: globalization. While it is spurred by new technologies, the emergence of an interconnected planet is propelled more by the power of an idea—the idea of an open society.
From a historical vantage point, globalization also begins right around 1980. One of the souls who best articulates this idea of the open society is Mikhail Gorbachev. It's Gorbachev who helps bring about some of its most dramatic manifestations: the fall of the Wall, the collapse of the Soviet empire, the end of the Cold War. He helps initiate a vast wave of political change that includes the democratization of eastern Europe and Russia itself. To kick it off, Gorbachev introduces two key concepts to his pals in the Politburo in 1985, two ideas that will resonate not just in the Soviet Union but through all the world. One is glasnost. The other is perestroika. Openness and restructuring — the formula for the age, the key ingredients of the long boom.
An equally important character is China's Deng Xiaoping. His actions don't bring about the same dramatic political change, but right around the same time as Gorbachev, Deng initiates a similarly profound shift of policies, applying the concepts of openness and restructuring to the economy. This process of opening up—creating free trade and free markets—ultimately makes just as large a global impact. No place is this more apparent than in Asia.
Japan grasps the gist of this economic formula long before the buzz begins, pulling a group of Asian early adopter countries in its wake. By the 1980s, Japan has nearly perfected the industrial-age manufacturing economy. But by 1990, the rules of the global economy have changed to favor more nimble, innovative processes, rather than meticulous, methodical economies of scale. Many of the attributes that favored Japan in the previous era, such as a commitment to lifelong employment and protected domestic markets, work against the country this time around. Japan enters the long slump of the 1990s. By the end of the decade, Japan has watched the United States crack the formula for success in the networked economy and begins to adopt the model in earnest. In 2000, it radically liberalizes many of its previously protected domestic markets—a big stimulus for the world economy at large.
Japan's rise, however, is but a prelude to the ascendance of China. In 1978, Deng takes the first steps toward liberalizing the communist economy. China slowly gathers force through the 1980s, until the annual growth in the gross national product consistently tops 10 percent. By the 1990s, the economy is growing at a torrid pace, with the entire coast of China convulsed with business activity and boomtowns sprouting all over the place. Nineteen ninety-seven—a year marked by both the death of Deng and the long-awaited return of Hong Kong—symbolizes the end of China's ideological transition and the birth of a real economic world power.
The first decade of the new century poses many problems for China domestically—and for the rest of the world. The overheated economy puts severe strain on the fabric of Chinese society, particularly between the increasingly affluent urban areas on the coast and the 800 million impoverished peasants in the interior. The nation's relatively low tech smokestack economy also threatens to single-handedly push the global environment over the edge. The Chinese initially do little to reduce their level of dependence on coal, which in the late 1990s still supplies three-quarters of the country's energy needs. Only sustained efforts by the rest of the world to ensure that China has access to the very best transportation and industrial technology avert an environmental catastrophe. Occasionally using draconian measures, China manages to avoid severe internal disturbance. By 2010, the sense of crisis has dissipated. China is generally acknowledged to be on a path toward more democratic politics—though not in the image of the West.
With the reemergence of China's economic might, the 3,500-year-old civilization begins to assert itself and play a larger part in shaping the world. Chinese clan-based culture happens to work very well within the fluid demands of the networked global economy. Singapore and Hong Kong prove the point through the 1980s and 1990s, when the two city-states with almost no land mass or natural resources become economic powers through pure human capital, primarily brainpower.
For years, Chinese expatriates have established intricate financial networks throughout Western countries, but especially in Asia. Many Southeast Asian economies—if not governments—are completely dominated by the overseas Chinese. By about 2005, the mainland Chinese decide to capitalize on this by formalizing the Chinese diaspora. Though the entity has no legal status vis-a-vis other governments, it has substantial economic clout. That date also marks the absorption of Taiwan into China proper.
By 2020, the Chinese economy has grown to be the largest in the world. Though the US economy is more technologically sophisticated, and its population more affluent, China and the United States are basically on a par. China has also drawn much of Asia in its economic wake—Hong Kong and Shanghai are the key financial nodes for this intricate Asian world.
Asia is jammed with countries that are economic powerhouses in their own right. India builds on its top-notch technical training and mastery of the lingua franca of the high tech world, English, to challenge many Western countries in software development. Malaysia's audacious attempt to jump-start an indigenous high tech sector through massive investments in a multimedia supercorridor pays off. The former communist countries Vietnam and Cambodia turn out to be among the most adept at capitalism. The entire region—from the reunited Koreas to Indonesia to the subcontinent—is booming. In just 20 years, 2 billion people have made the transition into what can be considered a middle-class lifestyle. In the space of one full 80-year life span, Asia has gone from almost uninterrupted poverty to widespread wealth.
The European Shuffle
Meanwhile, on the other side of the planet, the new principles of openness and restructuring are applied first in politics, then economics. In the aftermath of the spectacular implosion of the Soviet Union, most energy is spent promoting democracy and dismantling the vestiges of the Cold War. With time, an equal amount of energy is applied to restructuring and retooling economies—in some obvious and not so obvious ways.
First, Europe at large has to reintegrate itself, both economically and politically. Much of the 1990s is spent trying to integrate eastern and western Europe. All eyes first focus on the new Germany, which powers through the process on the basis of sheer financial might. Next the more advanced of the eastern European countries—Poland, Hungary, the Czech Republic—get integrated, first into NATO, with formal acceptance in 2000, and then into the European Union in 2002. The more problematic countries of eastern Europe aren't accepted into the union for another couple years. Alongside this East-West integration comes a more subtle integration between the western European countries. With fits and starts, Europe moves toward the establishment of one truly integrated entity. The European currency—the euro—is adopted in 1999, with a few laggards, like Britain, holding out a few more years.
Though the UK may have dragged its feet on the European currency measure, in an overall sense it's far ahead of the pack. The economic imperative of the era is not just to integrate externally but to restructure internally. Right around 1980, Margaret Thatcher and Ronald Reagan begin putting together the formula that eventually leads toward the new economy. At the time it looks brutal: busting unions, selling off state-owned industries, and dismantling the welfare state. In hindsight, the pain pays off. By the mid-1990s, the US unemployment rate is near 5 percent, and the British rate has dropped to almost 6 percent. In contrast, unemployment on the European continent hovers at 11 percent, with some individual countries even higher.
Indeed, through the 1990s, the rest of Europe remains trapped in the legacy of its welfare states, which maintain their political attractiveness long after they outlive their economic worth. By 2000, chronic unemployment and mounting government deficits finally force leaders on the continent to act. Despite widespread popular protests, especially in France, Europe goes through a painful economic restructuring much like the United States did a decade before. As part of this perestroika, it retools its economy using the new information technologies. This restructuring, both of corporations and governments, has much the same effect it had on the US economy. The European economy begins to surge and create many new jobs. By about 2005, Europe—particularly in the northern countries like Germany—even has the beginnings of a serious labor shortage as aging populations begin to retire.
Then the Russian economy kicks in. For 15 years, Russia had been stumbling along in its transition to a capitalist economy, periodically frightening the West with overtures that it might return to its old militaristic ways. But after almost two decades of wide-open Mafia-style capitalism, Russia emerges in about 2005 with the basic underpinnings of a solid economy. Enough people are invested in the new system, and enough of the population has absorbed the new work ethic, that the economy can function quite well—with few reasons to fear a retrenchment. This normalization finally spurs massive foreign investment that helps the Russians exploit their immense natural resources, and the skills of a highly educated populace. These people also provide a huge market for Europe and the rest of the world.
Darrell M. West
Vice President and Director - Governance Studies
Senior Fellow - Center for Technology Innovation
The Need for a Clear Focus on Innovation
In moving forward, it is clear that information technology enables innovation in a variety of policy areas. According to Philip Bond, the president of TechAmerica, “each tech job supports three jobs in other sectors of the economy.” And in information technology, he says, there are five jobs for each IT position.[xv]
Faster broadband and wireless speeds also enable people to take advantage of new digital tools such as GIS mapping, telemedicine, virtual reality, online games, supercomputing, video on demand, and video conferencing. New developments in health information technology and mobile health, such as emailing X-rays and other medical tests, require high-speed broadband. And distance learning, civic engagement, and smart energy grids require sufficient bandwidth.[xvi]
High-speed broadband allows physicians to share digital images with colleagues in other geographic areas. Schools are able to extend distance learning to under-served populations. Smart electric grids produce greater efficiency in monitoring energy consumption and contribute to more environment-friendly policies. Video conferencing facilities save government and businesses large amounts of money on their travel budgets. New digital platforms across a variety of policy domains spur utilization and innovation, and bring additional people, businesses, and services into the digital revolution.
In the education area, better technology infrastructure enables personalized learning and real-time assessment. Imagine schools where students master vital skills and critical thinking in a personalized and collaborative manner, teachers assess pupils in real-time, and social media and digital libraries connect learners to a wide range of informational resources. Teachers take on the role of coaches, students learn at their own pace, technology tracks student progress, and schools are judged based on the outcomes they produce. Rather than being limited to six hours a day for half the year, this kind of education moves toward 24/7 engagement and full-time learning.[xvii]
These represent just a few of the examples where innovation is taking place. Technology fosters innovation, creates jobs, and boosts long-term economic prosperity. By improving communication and creating opportunities for data-sharing and collaboration, information technology represents an infrastructure issue as important as bridges, highways, dams, and buildings.
Getting Serious about Innovation Policy
To stimulate innovation, we need a number of policy actions. Right now, the United States does not have a coherent or comprehensive innovation strategy. Unlike other nations, which think systematically about these matters, we make policy in a piecemeal fashion and focus on short-term rather than long-term objectives. This limits the efficiency and effectiveness of our national efforts. There are a number of areas that we need to address.
Research and Development Tax Credits: An example of our national short-sightedness is the research and development tax credit. Members of Congress have extended this many times in recent years, but they generally do this on an annual basis. Rather than extend this credit over a long period of time, they renew it episodically and never on a predictable schedule.
This makes it difficult for companies to plan investments and pursue consistent strategies over time. Due to political uncertainties and institutional politics, we end up creating inefficiencies linked to the vagaries of federal policymaking.[xviii] While companies in other countries invest and deduct on a more predictable schedule, we shoot ourselves in the foot through a short-sighted perspective. Bond notes that “23 countries now offer a more generous and stable credit” than the United States.[xix]
Commercializing University Knowledge: Universities represent a crucial linchpin in efforts to build an innovation economy. They are extraordinary knowledge generators, but must do a better job of transferring technology and commercializing knowledge. University licensing offices must speed up their review process in order to encourage the formation of businesses. Universities should think more seriously about innovation metrics so they allocate resources efficiently and create the proper incentives.
Right now, many places count the number of patents and licensing agreements without much attention to the businesses created, products that are marketed, or revenue that is generated. They should make sure their resources and incentives are aligned with metrics that encourage technology transfer and commercialization.[xx]
STEM Workforce Training and Development: The United States is facing a crisis in STEM training and workforce development. There are many dimensions of this challenge, but one of the most important concerns is the low number of college students graduating with degrees in science, technology, engineering, and math. Few American students are developing proficiency in these subjects, which is hindering the country’s economic future. Past American prosperity has been propelled by advances in the STEM fields. Skills in these areas helped the country win the space race and the Cold War, and we need them now as we transition to a technology-driven economy.
To deal with this problem, President Barack Obama’s Council of Advisors on Science and Technology (PCAST) has produced an official report that calls for the creation of a Master Teachers Corps. Among other recommendations, the report emphasizes two actions: 1) hiring 100,000 new STEM teachers and 2) paying higher salaries to the top 5 percent of STEM teachers.[xxi] However, in an era of budget cutbacks and attacks on teacher unions, it has been difficult to build support for raising teacher salaries in general and adopting differential pay in particular.
In his 2011 State of the Union, the President restated his commitment to putting education at the forefront of the national agenda, emphasizing the need for quality teachers, investment in STEM education programs, and a “bold restructuring” of federal education funding. He called for identifying effective teachers and creating reward systems to retain top-performing individuals.
It is vital to address these issues because basic facts about STEM teaching and competency are not well known. Failing schools not only harm students, they weaken the overall economy. With the U.S. facing a crisis of massive proportions in terms of its ability to innovate and create jobs, it is imperative that we transform STEM teaching to prepare students for the future economy. Real emphasis should be placed on teacher investment because research has shown that teachers are the primary factor in ensuring student growth and achievement.
An Einstein Strategy for Immigration Reform: We need reasonable immigration reform. One of our most important challenges is a new narrative defining immigration as a brain gain that improves economic competitiveness and national innovation. A focus on brains and competitiveness would help America overcome past deficiencies in immigration policy and enable our country to move forward into the 21st century. It is a way to become more strategic about promoting our long-term economy and achieving important national objectives.[xxii]
We need to think about immigration policy along the lines of an “Einstein Principle.” In this perspective, national leaders would elevate brains, talent, and special skills to a higher plane in order to attract more individuals with the potential to enhance American innovation and competitiveness. The goal is to boost the national economy, and bring individuals to America with the potential to make significant contributions. This would increase the odds for prosperity down the road. It has been estimated that “over 50,000 workers with advanced degrees leave the country for better opportunities elsewhere.”[xxiii]
O-1 Genius Visas: In order to boost American innovation, current policy contains a provision for a visa “brains” program. The so-called “genius” visa known as O-1 allows the government to authorize visas for those having “extraordinary abilities in the arts, science, education, business, and sports.” In 2008, around 9,000 genius visas were granted, up from 6,500 in 2004. The idea behind this program is to focus on talented people and encourage them to come to the United States. It is consistent with what national leaders have done in past eras, where we encouraged those with special talents to migrate to our nation.
However, this program has been small and entry passes have gone to individuals such as professional basketball player Dirk Nowitzki of Germany and various members of the Merce Cunningham and Bill T. Jones/Arnie Zane dance companies.[xxiv] While these people clearly have special talents, it is important to extend this program in new ways and target people who create jobs and further American innovation. This would help the United States compete more effectively.
EB-5 Job Creation Visas: There is a little-known EB-5 visa program that offers temporary visas to foreigners who invest at least half a million dollars in American locales officially designated as “distressed areas.” If their financial investment leads to the creation of 10 or more jobs, the temporary visa automatically becomes a permanent green card. Without much media attention, there were 945 immigrants in 2008 who provided over $400 million through this program.[xxv] On a per capita basis, these benefits make the program one of the most successful economic development initiatives in the federal government.
This is a great way to tie U.S. immigration policy to job creation. If a goal of national policy is to encourage investment and job creation, targeted visas of this sort are very effective. Such programs explicitly link new immigration with concrete economic investment. They also generate needed foreign capital ($500,000) for poor geographic areas. There is public accountability for this policy program because entry visas are granted on a temporary basis and become permanent only AFTER at least 10 jobs have been created. This kind of visa program is the ultimate in targeting and quality control. Unless the money is invested and leads to new jobs, the newcomer is not allowed to stay in the United States.
H-1B Worker Visas: Right now, only 15 percent of annual visas are set aside for employment purposes. Of these, some go to seasonal agricultural workers, while a small number of H-1B visas (65,000) are reserved for “specialty occupations” such as scientists, engineers, and technological experts. Individuals who are admitted with this work permit can stay for up to six years, and are able to apply for a green card if their employer is willing to sponsor their application.
The number reserved for scientists and engineers is drastically below the figure allowed between 1999 and 2004. In that interval, the federal government set aside up to 195,000 visas each year for H-1B entry. The idea was that scientific innovators were so important for long-term economic development that we needed to boost the number set aside for those specialty professions.
Today, the current allocation of 65,000 visas typically runs out within a few months of the start of the government’s fiscal year in October. Even in the recession-plagued period of 2009, visa applications exceeded the supply within the first three months of the fiscal year. American companies were responsible for 49 percent of the H-1B visa requests in 2009, up from 43 percent in 2008. The companies awarded the largest number of these visas included Wipro (1,964), Microsoft (1,318), Intel (723), IBM India (695), Patni Americas (609), Larsen & Toubro Infotech (602), Ernst & Young (481), Infosys Technologies (440), UST Global (344), and Deloitte Consulting (328).[xxvi]
High-skill visas need to be expanded back to 195,000 because at its current level, that program represents only six and a half percent of the million work permits granted each year by the United States. That percentage is woefully inadequate in terms of the supply needed. Entry programs such as the H-1B, O-1, and L-1 visa programs grant temporary visas for a period of a few years to workers with special talents needed by American employers. They enable U.S. companies to attract top people to domestic industries, and represent a great way to encourage innovation and entrepreneurship.
Regional Economic Clusters: We need regional economic clusters that take advantage of innovation-rich geographic niches. There are several examples of successful and geographically-based clusters such as Silicon Valley, Boston’s Route 128, and the Research Triangle in North Carolina. In each of these areas, there is a combination of creative talent associated with terrific universities, access to venture capital, and state laws that promote innovation through tax policy and/or infrastructure development.
Research has demonstrated that these innovation clusters generate positive economic results. According to a Brookings report by Mark Muro and Bruce Katz, “it is now broadly affirmed that strong clusters foster innovation through dense knowledge flows and spillovers; strengthen entrepreneurship by boosting new enterprise formation and start-up survival; enhance productivity, income-levels, and employment growth in industries; and positively influence regional economic performance.”[xxvii]
The question is how to promote such clusters in other geographic areas. There clearly are other places with the underlying conditions that foster technology innovation. But Muro and Katz caution that political leaders can’t force clusters that don’t already exist and that they should let the private sector lead in encouraging cluster formation. It is important to leverage existing resources and take advantage of workforce development programs, banking rules, educational institutions, and tax policies.[xxviii]
[i] Christine Zhen-Wei Qiang, “Telecommunications and Economic Growth,” Washington, D.C.: World Bank, unpublished paper.
[ii] Taylor Reynolds, “The Role of Communication Infrastructure Investment in Economic Recovery,” Working Party on Communication Infrastructures and Services Policy, OECD, March, 2009.
[iii] Erik Brynjolfsson and Adam Saunders, Wired for Innovation, Cambridge, Massachusetts: MIT Press, 2009.
[iv] Daniel McGinn, “The Decline of Western Innovation: Why America is Falling Behind and How to Fix It,” The Daily Beast, November 15, 2009.
[v] Daniel McGinn, “The Decline of Western Innovation: Why America is Falling Behind and How to Fix It,” The Daily Beast, November 15, 2009.
[vi] Daniel McGinn, “The Decline of Western Innovation: Why America is Falling Behind and How to Fix It,” The Daily Beast, November 15, 2009.
[vii] Darrell West, Brain Gain: Rethinking U.S. Immigration Policy, Washington, D.C.: Brookings Institution Press, 2010.
[viii] Darrell M. West, Biotechnology Policy Across National Boundaries, New York: Palgrave/Macmillan, 2007.
[ix] Michael Arndt, “Ben Franklin, Where Are You?” Business Week, January 4, 2010, p. 29.
[x] Organisation for Economic Co-Operation and Development, Science and Technology Statistical Compendium, 2004.
[xi] Organisation for Economic Co-Operation and Development, Science and Technology Statistical Compendium, 2004.
[xii] Darrell West, Brain Gain: Rethinking U.S. Immigration Policy, Washington, D.C.: Brookings Institution Press, 2010.
[xiii] Organisation for Economic Co-Operation and Development, Science and Technology Statistical Compendium, 2004.
[xiv] National Science Board, “Science and Engineering Indicators 2004,” Washington, D.C.: National Science Foundation, 2004, p. 0-4.
[xv] Philip Bond, “Tech Provides Map for Nation’s Future,” Politico, September 18, 2011.
[xvi] Darrell West, “An International Look at High-Speed Broadband,” Washington, D.C.: Brookings Institution, February, 2010.
[xvii] Darrell West, “Using Technology to Personalize Learning and Assess Students in Real-Time,” Washington, D.C.: Brookings Institution, October 6, 2011.
[xviii] Martin Baily, Bruce Katz, and Darrell West, “Building a Long-Term Strategy for Growth through Innovation,” Washington, D.C.: Brookings Institution, May, 2011.
[xix] Philip Bond, “Tech Provides Map for Nation’s Future,” Politico, September 18, 2011.
[xx] Martin Baily, Bruce Katz, and Darrell West, “Building a Long-Term Strategy for Growth through Innovation,” Washington, D.C.: Brookings Institution, May, 2011.
[xxi] President’s Council of Advisors on Science and Technology, “Prepare and Inspire: K-12 Education in Science, Technology, Engineering, and Math for America’s Future,” September, 2010.
[xxii] Richard Herman and Robert Smith, Immigrant, Inc.: Why Immigrant Entrepreneurs Are Driving the New Economy and How They Will Save the American Worker, Hoboken, New Jersey: John Wiley & Sons, 2010.
[xxiii] Center for Public Policy Innovation, “Restoring U.S. Competitiveness: Navigating a Path Forward Through Innovation and Entrepreneurship,” Washington, D.C., September 7, 2011.
[xxiv] Moira Herbst, “Geniuses at the Gate,” Business Week, June 8, 2009, p. 14.
[xxv] Lisa Lerer, “Invest $500,000, Score a U.S. Visa,” CNNMoney.com.
[xxvi] Moira Herbst, “Still Wanted: Foreign Talent—And Visas,” Business Week, December 21, 2009, p. 76.
The growing importance of the technology economy
Technological advances have significantly improved operations and lowered the cost of doing business. Today, for example, a handful of technicians controlling robotic systems can run an entire manufacturing plant, while innovative inventory systems supply the parts needed for assembly on short notice. Advances in computing, coupled with advances in telecommunications, have increased job opportunities and strengthened economic growth.
The internet has largely overcome the physical barriers to communication over distance. In a similar way, manufacturing and consumer goods companies have developed online links to their suppliers and to customer support. Suppliers can track production-line efficiency through automated systems and ship parts and materials to the required locations more efficiently, reducing inventory and downtime. E-commerce and online banking have also helped reduce the cost of doing business.
Within this new context, given the fast-paced emergence of disruptive products and business models and the transformative power of digital technologies on business and society, executives must become masters of the global “technology economy”: capable of detecting the economic impact of rapid technological change and responding with similar speed and foresight.
Research from respected organizations such as BCG, the IMF, and the World Economic Forum shows that when companies cut back on technology investment to shore up profits, the result is the opposite: profits sink significantly, GDP falls, and a chain reaction follows as labor productivity declines a few years later.
In effect, such companies are cutting back on an investment that could create the next growth wave. In many instances that investment generates enormous leverage, helping to lower costs and expenses much faster than technology spending rises—but companies can achieve this only by managing their technology spending properly. To do so, senior executives need new metrics and new ways of thinking. To navigate the technology economy and achieve optimum business performance, executives must create, measure, and track virtual economic measures just as carefully as they follow metrics about the physical world.
The impact of the technology economy
The impact of the technology economy on the market is significant, influencing even how the market economy is measured. Some of the largest market indexes, such as the Dow Jones Industrial Average (DJIA) and the S&P 500, have changed accordingly: tech powerhouses like Apple, Google, and Amazon, whose stocks are valued far higher than those of many long-time industrial members, are replacing the large industrial companies. Apple’s market capitalization accounts for such a large share of the DJIA, for example, that any hiccup in its quarterly earnings can move the entire index—a role once played by industrial giants such as GM and Caterpillar.
Technology also has a remarkable power to permeate companies. An important measure of the technology economy is worldwide IT spending: corporate spending on hardware, software, data centers, networks, and staff, including both internal and outsourced IT services. Currently this volume is close to USD 6 trillion per year. To put that number in perspective: if the global technology economy were a country and its yearly spending its GDP, it would rank as the world’s third-largest economy, between China and Japan and more than twice the size of the UK economy.
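The country-ranking comparison above can be made concrete with a short sketch. The GDP figures below are rough, assumed values in USD trillions, chosen only to illustrate the arithmetic; they are not authoritative data.

```python
# Illustrative sketch: ranking worldwide IT spending as if it were a national GDP.
# All GDP figures are rough, assumed values (USD trillions), not official statistics.
gdp = {
    "United States": 21.0,
    "China": 14.0,
    "Japan": 5.0,
    "Germany": 3.8,
    "United Kingdom": 2.8,
}
it_spending = 6.0  # approximate annual worldwide IT spending, USD trillions

# Treat the "technology economy" as one more country and sort by size.
economies = dict(gdp)
economies["Technology economy"] = it_spending
ranking = sorted(economies.items(), key=lambda kv: kv[1], reverse=True)

rank = [name for name, _ in ranking].index("Technology economy") + 1
print(f"The technology economy would rank #{rank}")            # → #3, between China and Japan
print(f"vs. UK economy: {it_spending / gdp['United Kingdom']:.1f}x")  # more than 2x
```

With these assumed figures the technology economy lands in third place, between China and Japan, matching the comparison in the text.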
Technology spending, gross margins, and economic growth are strongly related when measured through productivity and GDP. Executives can, for example, predict with some accuracy the impact on the overall economy of a decline in technology spending. When companies cut back on discretionary spending to improve profits during a downturn, they slash their investments in technology. Soon afterward GDP falls, and within a few years labor productivity across the economy falls as well, since technological innovation is an important component of productivity.
The drop in technology intensity that results from a decline in technology spending weakens labor productivity, an effect that shows up as much as three years later because productivity is a “stickier” measure. The relationship between technology intensity and GDP is illustrated in the chart below:
[Chart: the relationship between technology intensity and GDP. Source: IMF, Rubin Worldwide]
In fact, when considering a company’s productivity, one observes not merely a connection between technology intensity and gross margins but a strong correlation: the two tend to rise and decline together, one as a consequence of the other. A recent example is the period before and after the world economic crash that began in 2007, when companies were investing ever more heavily in technology relative to revenues and operating expenses, and gross margins were rising. That trend accelerated through 2008 and into 2009, when companies belatedly realized the magnitude of what had happened and began to cut technology investment dramatically. Technology intensity then dropped precipitously, along with gross margins. The chart below illustrates this effect:
[Chart: technology intensity and gross margins moving together over time. Source: IMF, Rubin Worldwide]
Within most companies around the globe, in virtually every industry, technology investment is growing faster than revenues and, in many cases, faster than national GDP. It is clear that technology is vital to the successful operation of companies and to the global economy, but managing technology spending properly in the years ahead will require an increasingly sophisticated way of looking at the world and at a company’s performance.
With that in mind, companies must control, adapt, and optimize investments in real time, according to market conditions and on the basis of new forms of market data. Companies need to consider all inputs and outcomes and look at technology economically to gain competitive advantage before their competitors do. If executives understand this and view technology investments this way, it will not merely matter; it will make all the difference for their companies and for the global economy.
Marco Antonio Cavallo is a digital transformation strategy expert and author. He is founder and chief editor of CIO Global Network Group and an advisory board member for the Technology & Innovation Committees at Florida International Bankers Association (FIBA) and Federación Latinoamericana de Bancos (FELABAN).
The Fourth Industrial Revolution: what it means, how to respond
We stand on the brink of a technological revolution that will fundamentally alter the way we live, work, and relate to one another. In its scale, scope, and complexity, the transformation will be unlike anything humankind has experienced before. We do not yet know just how it will unfold, but one thing is clear: the response to it must be integrated and comprehensive, involving all stakeholders of the global polity, from the public and private sectors to academia and civil society.
The First Industrial Revolution used water and steam power to mechanize production. The Second used electric power to create mass production. The Third used electronics and information technology to automate production. Now a Fourth Industrial Revolution is building on the Third, the digital revolution that has been occurring since the middle of the last century. It is characterized by a fusion of technologies that is blurring the lines between the physical, digital, and biological spheres.
There are three reasons why today’s transformations represent not merely a prolongation of the Third Industrial Revolution but rather the arrival of a Fourth and distinct one: velocity, scope, and systems impact. The speed of current breakthroughs has no historical precedent. When compared with previous industrial revolutions, the Fourth is evolving at an exponential rather than a linear pace. Moreover, it is disrupting almost every industry in every country. And the breadth and depth of these changes herald the transformation of entire systems of production, management, and governance.
The possibilities of billions of people connected by mobile devices, with unprecedented processing power, storage capacity, and access to knowledge, are unlimited. And these possibilities will be multiplied by emerging technology breakthroughs in fields such as artificial intelligence, robotics, the Internet of Things, autonomous vehicles, 3-D printing, nanotechnology, biotechnology, materials science, energy storage, and quantum computing.
Already, artificial intelligence is all around us, from self-driving cars and drones to virtual assistants and software that translates or invests. Impressive progress has been made in AI in recent years, driven by exponential increases in computing power and by the availability of vast amounts of data, from software used to discover new drugs to algorithms used to predict our cultural interests. Digital fabrication technologies, meanwhile, are interacting with the biological world on a daily basis. Engineers, designers, and architects are combining computational design, additive manufacturing, materials engineering, and synthetic biology to pioneer a symbiosis between microorganisms, our bodies, the products we consume, and even the buildings we inhabit.
Challenges and opportunities
Like the revolutions that preceded it, the Fourth Industrial Revolution has the potential to raise global income levels and improve the quality of life for populations around the world. To date, those who have gained the most from it have been consumers able to afford and access the digital world; technology has made possible new products and services that increase the efficiency and pleasure of our personal lives. Ordering a cab, booking a flight, buying a product, making a payment, listening to music, watching a film, or playing a game—any of these can now be done remotely.
In the future, technological innovation will also lead to a supply-side miracle, with long-term gains in efficiency and productivity. Transportation and communication costs will drop, logistics and global supply chains will become more effective, and the cost of trade will diminish, all of which will open new markets and drive economic growth.
At the same time, as the economists Erik Brynjolfsson and Andrew McAfee have pointed out, the revolution could yield greater inequality, particularly in its potential to disrupt labor markets. As automation substitutes for labor across the entire economy, the net displacement of workers by machines might exacerbate the gap between returns to capital and returns to labor. On the other hand, it is also possible that the displacement of workers by technology will, in aggregate, result in a net increase in safe and rewarding jobs.
We cannot foresee at this point which scenario is likely to emerge, and history suggests that the outcome is likely to be some combination of the two. However, I am convinced of one thing—that in the future, talent, more than capital, will represent the critical factor of production. This will give rise to a job market increasingly segregated into “low-skill/low-pay” and “high-skill/high-pay” segments, which in turn will lead to an increase in social tensions.
In addition to being a key economic concern, inequality represents the greatest societal concern associated with the Fourth Industrial Revolution. The largest beneficiaries of innovation tend to be the providers of intellectual and physical capital—the innovators, shareholders, and investors—which explains the rising gap in wealth between those dependent on capital versus labor. Technology is therefore one of the main reasons why incomes have stagnated, or even decreased, for a majority of the population in high-income countries: the demand for highly skilled workers has increased while the demand for workers with less education and lower skills has decreased. The result is a job market with a strong demand at the high and low ends, but a hollowing out of the middle.
This helps explain why so many workers are disillusioned and fearful that their own real incomes and those of their children will continue to stagnate. It also helps explain why middle classes around the world are increasingly experiencing a pervasive sense of dissatisfaction and unfairness. A winner-takes-all economy that offers only limited access to the middle class is a recipe for democratic malaise and dereliction.
Discontent can also be fueled by the pervasiveness of digital technologies and the dynamics of information sharing typified by social media. More than 30 percent of the global population now uses social media platforms to connect, learn, and share information. In an ideal world, these interactions would provide an opportunity for cross-cultural understanding and cohesion. However, they can also create and propagate unrealistic expectations as to what constitutes success for an individual or a group, as well as offer opportunities for extreme ideas and ideologies to spread.
The impact on business
An underlying theme in my conversations with global CEOs and senior business executives is that the acceleration of innovation and the velocity of disruption are hard to comprehend or anticipate and that these drivers constitute a source of constant surprise, even for the best connected and most well informed. Indeed, across all industries, there is clear evidence that the technologies that underpin the Fourth Industrial Revolution are having a major impact on businesses.
On the supply side, many industries are seeing the introduction of new technologies that create entirely new ways of serving existing needs and significantly disrupt existing industry value chains. Disruption is also flowing from agile, innovative competitors who, thanks to access to global digital platforms for research, development, marketing, sales, and distribution, can oust well-established incumbents faster than ever by improving the quality, speed, or price at which value is delivered.
Major shifts on the demand side are also occurring, as growing transparency, consumer engagement, and new patterns of consumer behavior (increasingly built upon access to mobile networks and data) force companies to adapt the way they design, market, and deliver products and services.
A key trend is the development of technology-enabled platforms that combine both demand and supply to disrupt existing industry structures, such as those we see within the “sharing” or “on demand” economy. These technology platforms, rendered easy to use by the smartphone, convene people, assets, and data—thus creating entirely new ways of consuming goods and services in the process. In addition, they lower the barriers for businesses and individuals to create wealth, altering the personal and professional environments of workers. These new platform businesses are rapidly multiplying into many new services, ranging from laundry to shopping, from chores to parking, from massages to travel.
On the whole, there are four main effects that the Fourth Industrial Revolution has on business—on customer expectations, on product enhancement, on collaborative innovation, and on organizational forms. Whether consumers or businesses, customers are increasingly at the epicenter of the economy, which is all about improving how customers are served. Physical products and services, moreover, can now be enhanced with digital capabilities that increase their value. New technologies make assets more durable and resilient, while data and analytics are transforming how they are maintained. A world of customer experiences, data-based services, and asset performance through analytics, meanwhile, requires new forms of collaboration, particularly given the speed at which innovation and disruption are taking place. And the emergence of global platforms and other new business models, finally, means that talent, culture, and organizational forms will have to be rethought.
Overall, the inexorable shift from simple digitization (the Third Industrial Revolution) to innovation based on combinations of technologies (the Fourth Industrial Revolution) is forcing companies to reexamine the way they do business. The bottom line, however, is the same: business leaders and senior executives need to understand their changing environment, challenge the assumptions of their operating teams, and relentlessly and continuously innovate.
The impact on government
As the physical, digital, and biological worlds continue to converge, new technologies and platforms will increasingly enable citizens to engage with governments, voice their opinions, coordinate their efforts, and even circumvent the supervision of public authorities. Simultaneously, governments will gain new technological powers to increase their control over populations, based on pervasive surveillance systems and the ability to control digital infrastructure. On the whole, however, governments will increasingly face pressure to change their current approach to public engagement and policymaking, as their central role of conducting policy diminishes owing to new sources of competition and the redistribution and decentralization of power that new technologies make possible.
Ultimately, the ability of government systems and public authorities to adapt will determine their survival. If they prove capable of embracing a world of disruptive change, subjecting their structures to the levels of transparency and efficiency that will enable them to maintain their competitive edge, they will endure. If they cannot evolve, they will face increasing trouble.
This will be particularly true in the realm of regulation. Current systems of public policy and decision-making evolved alongside the Second Industrial Revolution, when decision-makers had time to study a specific issue and develop the necessary response or appropriate regulatory framework. The whole process was designed to be linear and mechanistic, following a strict “top down” approach.
But such an approach is no longer feasible. Given the Fourth Industrial Revolution’s rapid pace of change and broad impacts, legislators and regulators are being challenged to an unprecedented degree and for the most part are proving unable to cope.
How, then, can they preserve the interest of the consumers and the public at large while continuing to support innovation and technological development? By embracing “agile” governance, just as the private sector has increasingly adopted agile responses to software development and business operations more generally. This means regulators must continuously adapt to a new, fast-changing environment, reinventing themselves so they can truly understand what it is they are regulating. To do so, governments and regulatory agencies will need to collaborate closely with business and civil society.
The Fourth Industrial Revolution will also profoundly impact the nature of national and international security, affecting both the probability and the nature of conflict. The history of warfare and international security is the history of technological innovation, and today is no exception. Modern conflicts involving states are increasingly “hybrid” in nature, combining traditional battlefield techniques with elements previously associated with nonstate actors. The distinction between war and peace, combatant and noncombatant, and even violence and nonviolence (think cyberwarfare) is becoming uncomfortably blurry.
As this process takes place and new technologies such as autonomous or biological weapons become easier to use, individuals and small groups will increasingly join states in being capable of causing mass harm. This new vulnerability will lead to new fears. But at the same time, advances in technology will create the potential to reduce the scale or impact of violence, through the development of new modes of protection, for example, or greater precision in targeting.
The Fourth Industrial Revolution, finally, will change not only what we do but also who we are. It will affect our identity and all the issues associated with it: our sense of privacy, our notions of ownership, our consumption patterns, the time we devote to work and leisure, and how we develop our careers, cultivate our skills, meet people, and nurture relationships. It is already changing our health and leading to a “quantified” self, and sooner than we think it may lead to human augmentation. The list is endless because it is bound only by our imagination.
I am a great enthusiast and early adopter of technology, but sometimes I wonder whether the inexorable integration of technology in our lives could diminish some of our quintessential human capacities, such as compassion and cooperation. Our relationship with our smartphones is a case in point. Constant connection may deprive us of one of life’s most important assets: the time to pause, reflect, and engage in meaningful conversation.
One of the greatest individual challenges posed by new information technologies is privacy. We instinctively understand why it is so essential, yet the tracking and sharing of information about us is a crucial part of the new connectivity. Debates about fundamental issues such as the impact on our inner lives of the loss of control over our data will only intensify in the years ahead. Similarly, the revolutions occurring in biotechnology and AI, which are redefining what it means to be human by pushing back the current thresholds of life span, health, cognition, and capabilities, will compel us to redefine our moral and ethical boundaries.
Neither technology nor the disruption that comes with it is an exogenous force over which humans have no control. All of us are responsible for guiding its evolution, in the decisions we make on a daily basis as citizens, consumers, and investors. We should thus grasp the opportunity and power we have to shape the Fourth Industrial Revolution and direct it toward a future that reflects our common objectives and values.
To do this, however, we must develop a comprehensive and globally shared view of how technology is affecting our lives and reshaping our economic, social, cultural, and human environments. There has never been a time of greater promise, or one of greater potential peril. Today’s decision-makers, however, are too often trapped in traditional, linear thinking, or too absorbed by the multiple crises demanding their attention, to think strategically about the forces of disruption and innovation shaping our future.
In the end, it all comes down to people and values. We need to shape a future that works for all of us by putting people first and empowering them. In its most pessimistic, dehumanized form, the Fourth Industrial Revolution may indeed have the potential to “robotize” humanity and thus to deprive us of our heart and soul. But as a complement to the best parts of human nature—creativity, empathy, stewardship—it can also lift humanity into a new collective and moral consciousness based on a shared sense of destiny. It is incumbent on us all to make sure the latter prevails.
This article was first published in Foreign Affairs
Author: Klaus Schwab is Founder and Executive Chairman of the World Economic Forum
Image: An Aeronavics drone sits in a paddock near the town of Raglan, New Zealand, July 6, 2015. REUTERS/Naomi Tajitsu
The 1990s: When Technology Upended Our World
If you were to pick the one, singular, culture-defining moment from the '90s—a decade that gave us so many—you'd be hard-pressed to beat the Bill Clinton–Monica Lewinsky affair. Even now, in our current climate of oversharing and punch-drunk numbness to the spewing of digital media, the Lewinsky affair still seems incredible in the excruciating level of its detail. That such detail should eventually bring down a president was an unprecedented moment in American politics. There has been endless analysis of how it all happened, but essentially, you can blame it on technology.
The '90s were a decade of enormous disruption, the axis on which the old world ended and a new one began. Often treated as a vehicle for affectionate nostalgia among Generation Xers, the decade is grossly underestimated. The '90s did not just give us Kurt Cobain and "The Simpsons": its political events were deeply transformative, and the thread that ran through them all was technology.
Speaking to those who lived through some of its most consequential moments, "The Untold Story of the 90s" makes a compelling case for a decade that saw the changing of the Western order. As Sen. Marco Rubio of Florida tells it, "That period of the '90s from the fall of the Berlin Wall to 9/11 was one of extraordinary transformation societally, economically and in our politics. A lot of the roots of the things we are facing today came from that period."
The growing power of the Internet, the scrutiny of an ever more powerful press, the rise of entertainment culture in politics and the advance of technology in collecting DNA evidence all came together in 1998. Clinton’s affair struck at just the moment when technology, science, the press and popular culture met. Rumors of the Lewinsky affair first surfaced on the Drudge Report, at that time an insignificant politics blog.
"Bloggers used to be ridiculed as guys working in their pajamas out of their basements, but what really changed that perception was the Drudge Report," says Dana Perino, who served as White House press secretary between 2007 and 2009. "It had an edginess to it, and a little bit of opinion. The Drudge Report absolutely changed things for news coverage and politics in particular."
Traditional media relied on phalanxes of editors and lawyers, but bloggers could just post and be damned. Once the information was out, it was out, and there was—and still is—no comeback. Thinking he could face this one down, Clinton uttered those memorable words that would ultimately bring him down. The Internet hummed with rumor and speculation, the newly born cable channels were competing for ratings, and coverage was 24/7.
By now even “Saturday Night Live” was running an investigation. The presidency was reduced to a conversation around blowjobs and cigar dildos.
And then investigators found DNA evidence on a blue dress. An independent investigator was appointed to ascertain whether the president had lied. Eleven months and acres of media coverage later, both parties were left shamed and broken.
January 1, 1990: A German citizen takes a hammer to the Berlin Wall, one of the most potent symbols of the Cold War. (Photo by Guy DURAND/Gamma-Rapho via Getty Images)
To illustrate the series of events that signaled the power shift, the film begins with the fall of the Berlin Wall.
The manner of its disintegration was an accident of human judgement, as Mary Sarotte, Kravis Distinguished Professor of Historical Studies at Johns Hopkins University, explains.
Events began to unravel when a policy wonk, droning on in a press conference, misspoke. Journalists reported the story on their cable channels within minutes, and by the time the hour was up, East and West Berliners were hammering on the gates. Thanks to new media, the flow of information had crossed borders, and both sides now understood the wall was open, even while the policy wonk was still droning on.
Next up came the world’s first televised war—one that was broadcast in real time, on a 24-hour news cycle. CNN reporters embedded in Baghdad and on the Kuwaiti border were providing the White House with more information than it was getting from its own generals.
Back in the U.S., the beating of Rodney King by white police officers, filmed on a video camera by a bystander, showed the world the reality of the treatment black people endured at the hands of a white police force. “The Rodney King tape was the beginning of what we see today—now that everyone has a cell phone,” says Julián Castro, former secretary of Housing and Urban Development.
Residents of South Central Los Angeles walk through a neighborhood burned down during the 1992 riots that swept the area after police officers accused of beating motorist Rodney King were cleared of all charges. (Photo by Ted Soqui/Corbis via Getty Images)
That tape, replayed on news media, triggered a social crisis where policing and justice no longer had legitimacy. When the fires of Los Angeles stopped burning, a new generation of voters needed change. They wanted a different kind of authority, a different kind of president. One who spoke their language and understood their culture.
Bill Clinton, who had run an unpromising campaign up to this point, changed tack and met the people where the people were: on late-night TV. He appeared on "The Arsenio Hall Show," and instead of speaking policy, he played his saxophone. Everything changed. Yeah, he smoked (but he didn't inhale). MTV became a legitimate media outlet for his messages, and Generation X and the Baby Boomers got it. The World War II generation didn't—but they no longer mattered. The generation whose worldview had been defined by the Cold War, an us-and-them protectionism and a conservative pride had had their day. President George Bush was out, Clinton was in, and the '90s were on their way.
The technological revolution—so far powered by satellite TV and 24-hour news reporting—was about to take a major injection from the Internet. Yes, it was to wreak havoc, but it was also to deliver real beneficial change. Netscape, the consumer-facing browser descended from Mosaic, opened up the web to the entire world. Everyone could access everyone else now; they could share information and collapse time and distance.
Communities and causes had a channel. When a young gay man named Matthew Shepard was brutally beaten, burned, strung up on a fence and left for dead, the Internet surfaced the story. The gay community finally had a way to talk.
As Jon Barrett, former editor-in-chief of The Advocate, says, "Up until the internet we often didn't hear what was going on in the gay community. You had a sense that there were people out there like you, but you may not be able to find them. I didn't come out until I had access to AOL." Gay hate crimes were at peak levels back then—in 1998, 1,000 were reported, and many more went unreported.
“In times of struggle there are often defining moments that help the broader community see how wrong their actions have been,” says Sen. Chris Coons of Delaware. Matthew Shepard’s death was one of those moments. John Aravosis, a journalist, activist and politician, posted news of the murder on his blog at the time.
"It was amazing how much the crime touched people, but also the sense of community this website gave people," he says. "People found other people they could commune with. We came up with these ideas of candlelight vigils; 77 happened simultaneously. Having these vigils in each town created local news too. It raised awareness to a new level that empowered and encouraged people to come out and fight."
The great social liberalization of the '90s is nowhere better expressed than in the change that was wrought around gay rights. As Matthew's mother, Judy Shepard, says: "A whole generation of advocates and activists were born in that moment." The emergence of gay marriage and gay rights as a mainstream idea was one of the '90s' finest moments. "And it happened with lightning speed," says professor of history Gil Troy of McGill University. "It was about culture and much more about technology."
"You felt in the '90s you were in the midst of this tech explosion. There was a lot that was good about that, but we also lost something," says Castro.
Shawn Fanning, co-founder of the online music service Napster, leaving the 9th Circuit Court of Appeals on October 2, 2000 in San Francisco, during a battle that pitted the upstart web company against the mainstream music industry. (Photo by Alan Dejecacion/Newsmakers)
Few felt this more painfully—or still feel it as keenly—as business. Shawn Fanning, the college student who founded Napster, set it in motion. Fanning's breakthrough idea signaled the end of the analog world. By inventing a way for users to download music files for free, Napster was responsible for the greatest transfer of intellectual property in history. It was the beginning of free. The music industry didn't like it one little bit, but once the genie was out, it could not be put back.
“Napster felt like this magical amazing thing—like why doesn’t music work like this? It was like the Internet should enable things like this,” says Jonah Peretti, the digital founder behind HuffPost and BuzzFeed.
Not realizing this was a terminal situation, the industry fought back—namely in the shape of the band Metallica, which filed a lawsuit and triggered a Senate Judiciary Committee hearing. The testimony of a young Gene Kan, a developer at Gnutella (a platform offering a similar service to Napster), proved very prescient that day in June 2001. "The benefits of digital downloadable media are infinite," he told the committee. "20m Napster users can't be wrong. 20m today, more tomorrow. Technology moves forward and leaves the stragglers behind. The adopters always win, and the stalwarts always lose. Mechanized farming is a good example. You don't see anyone out there with a horse and plow these days. The Internet touches everyone and everything. Everyone must adapt; business and intellectual property owners are not excluded."
In the end Napster was ahead of its time, and the courts ruled that it be shut down. But Napster was the canary in the coal mine for all media, and a new paradigm had been set.
“It was incredible how many years it took after Napster was shut down to get back to something that was even half as good as Napster,” says Peretti. “We’ve got closer to it now with paid models like Spotify. Napster pointed to the way the world could work, the Internet could work.”
Politics was also experiencing its own disruption: the Florida recount in the 2000 Bush–Gore presidential deadlock revealed how divided a nation America had become. But it also had an even more pernicious effect. Days of uncertainty revolving around the unlikely "hanging chads" stalled a resolution. The election mechanisms—yet another institution—had failed.
The Supreme Court was called in to decide, divisively overruling the recount. This threw into doubt any idea that the system was one of fairness and justice, forcing both sides to entrench themselves further.
The fallout of that is a matter of deep discussion today, but this was the moment it all began.
"In the 1990s, with all the cynicism in the media, with all the individuation in the Internet, [something happens]," says Troy, the history professor. "When I go to the Internet I go deeper and deeper into my right-wing rabbit hole, I go deeper and deeper into my left-wing rabbit hole. And so the Internet—which becomes the world's greatest organizing tool, and the world's greatest community-building tool—could also be the world's, and America's, most polarizing tool."
Technology had one more killer blow to deliver. The Internet also helped usher in the unseen ascent of a global terror network that was to scorch itself onto the world's consciousness on the morning of September 11, 2001. The '90s were over, and a new decade—with a new set of problems—was beginning.
Hear from the people quoted in this story by watching “The Untold Story of the 90s.”
Tiffanie Darke is the Editor in Chief of History and author of Now we Are 40, Whatever Happened to Generation X? (HarperCollins). Follow her on Twitter @tiffaniedarke.
History Reads features the work of prominent authors and historians.
Role Of Technology In Economic Development (Essay Sample)
The development of technology is rapidly altering every aspect of human life and extending those changes into economic development. Technology refers to the body of knowledge whose application enables the production of goods and services across the various sectors of the economy. Today, technology has penetrated every sector in a country: crop management, food processing, healthcare, marketing, resource management, environmental management, communication, and transportation, among others. According to business experts, technology plays a pivotal role in the development of the economy, with its many changes contributing immensely to economic growth in various ways.
Technology promotes the use of natural resources. Different countries boast a wide array of natural resources found in various places in their environment, especially below the earth's surface. For lack of the know-how and the ability to detect and retrieve them, numerous countries, particularly developing nations, sit on unutilized economic treasures such as minerals, oil, and land. Technology alters this dynamic by providing machinery that enables businesses and governments to access and exploit a wide range of natural resources efficiently, benefiting businesses and boosting the overall economy.
Technology saves time. Its use across industries—the computerization of data, the use of machinery in place of human labor, the use of the internet for communication—readily alters how a business operates. Machines powered by technology cut the time needed to produce goods and shorten delivery periods. Additionally, internet-based communication lets customers reach the production team to place urgent orders or change existing ones within a limited period, promoting efficiency and an overall increase in profits that translates into economic development.
Technology also saves labor. Among the most expensive costs in business is the management of resources, owing to the high cost of labor. Through the use of technology this has changed, as manufacturing and other industries move away from human labor and adopt automation and robotics across different parts of their businesses: in production, in data management, and in the food-service industry, where robots deliver food. Consequently, the burden of work and the required labor input have fallen, resulting in a reduced number of employees.
Technological advancement facilitates research. Technology offers new opportunities and the discovery of new ideas. Conducting research encompasses activities such as writing a proposal, creating theoretical models, designing and running experiments, collecting data, and maintaining communication with research sources and colleagues. Technology facilitates research across all business sectors, giving businesses insight that helps them advance through the discovery of more profitable ways of running and managing a business and of new technical capabilities in the market.
Technology has boosted international trade. Information technology has revolutionized the sale of products to customers throughout the world. The internet has made the world a global village, connecting people from all over the world on a single platform where they can communicate, share ideas, and buy and sell commodities. For businesses, the internet enables reinvention through an online presence, reaching a wide array of customers within a short period and increasing sales by acquiring a larger customer base through online shops, social media, and ICT marketing tools.
The use of technology in business has resulted in the expansion of industries. Numerous institutions, including governments, have adopted measures to make most of their services available to the public through online platforms. These changes have necessitated new infrastructure to support the services. As a result, new sectors such as the app industry have emerged, creating opportunities for many people and expanding industries. Moreover, technology has increased the efficiency of production while reducing workload and production costs. Consequently, businesses reap more profit from their ventures at lower expense, gaining opportunities to expand for greater returns.
In conclusion, adopting advanced technology is a wise decision for any business today, as its benefits reach all sectors. Technology plays a crucial role in economic development by facilitating the utilization of resources, saving time and labor, and boosting research and international trade, while leading to the expansion of industries. Countries that are alert and ready to make substantial investments in this changing environment can benefit from technology continually.