We Need To Accelerate Innovation—Here’s How:

Technology / February 17, 2016 / By Greg Satell
SYNOPSIS

We can no longer assume that we can separate things into neat little categories like basic and applied research and hope that they meet up somewhere down the line. At the same time, with the world moving as fast as it does now, private industry needs help to stay abreast of it all.

Look at any marvel of our technological age, whether it be an iPhone, a self-driving car or a miracle cure, and you’ll find three things: an academic theory, a government program and an entrepreneurial instinct. When it all works, it is a wonder to behold, not only creating prosperity, but solving our most difficult problems in the process.

It is the unlikely partnership between academia, the public sector and private enterprise that allows us to navigate the path from discovery, to innovation, to transformation. The process, however, is often unwieldy, taking decades to go from primary discovery to a measurable impact on society.

Unfortunately, most efforts to accelerate innovation focus on just one facet, such as giving tax breaks for innovation, increasing investment in research or helping start-up companies find funding. These approaches, however, ignore the fact that innovation is a complex process, requiring us to integrate a variety of efforts. That’s where we need to focus now.

How America Became An Innovation Superpower

For most of its history, the United States was a scientific backwater. At the turn of the 20th century, a promising young student would often need to go to Europe to get an advanced degree in science from a world-class institution. Perhaps not surprisingly, the inventions that drove the industrial age, the steam engine, internal combustion and electricity, all came from Europe.

The balance began to tip in the run-up to World War II. As the backlash to “Jewish physics” in Europe grew, top minds like Einstein, von Neumann and Fermi migrated to the US. It was our openness—to new people and new ideas—that made America exceptional. With Europe in turmoil, America attracted the greatest collection of scientific talent the world had ever seen.

In 1940, after Germany invaded France, Vannevar Bush went to President Roosevelt with a vision to combine government funding with private enterprise to mobilize America’s latent scientific resources. Under his plan, public grants for defense research would be issued to private institutions in order to accelerate progress.

Bush’s plan was, of course, a historic success and, as the war was coming to a close, Roosevelt asked him to issue a report recommending how the wartime efforts could be replicated in peacetime. That report, Science, the Endless Frontier, became the blueprint for America’s technological dominance.

The Bush Plan

Bush was an engineer, not a scientist, yet he considered science essential to his work. He also recognized that scientific discovery was a long-term process without clear timelines and objectives. Its aim was to expand horizons, not to create practical applications. “Science, by itself, provides no panacea for individual, social, and economic ills,” he wrote.

But then he continued, “without scientific progress no amount of achievement in other directions can insure our health, prosperity, and security as a nation in the modern world.” He considered investment in research as leading to an increase in “scientific capital” which could then “turn the wheels of public and private enterprise.”

It was a unique vision. An economist might say that Bush was addressing the problem of appropriability. The benefits of basic science, which have no immediate application, can only be appropriated by society as a whole. However, the practical applications that discoveries make possible represent a clear profit motive that is best pursued by private enterprise.

Yet despite the success of the Bush architecture, it’s hard for many to get their heads around it. It lacks strategy and direction. That’s probably why other countries have consistently gone another way.

Different Strategies, Different Results

To understand how different the United States is, it’s helpful to look at how computer technology was developed. For example, the first digital computer was not, as many believe, invented in the US, but in the United Kingdom during World War II. Alan Turing, a British scientist, is still considered the father of modern computing.

However, considering the machine a military secret, Winston Churchill ordered it destroyed. Turing himself was shut away in a physics lab to work in relative isolation. Later, after a conviction for homosexual acts, he had to endure a harsh sentence of chemical castration, which led to his suicide. That’s how the British killed their technology industry.

The French, for their part, recognized the value of computer technology and launched a national effort in the sixties, called Plan Calcul, to develop advanced mainframes. The Japanese, through their legendary Ministry of International Trade and Industry (MITI), invested heavily in semiconductors. Both programs had early successes and then faded away.

The American approach was far different. Government funding led to the IAS machine, but the technology was widely shared and largely driven by industry. Later government grants helped lead to microchips, the Internet and other advances, but the application of those discoveries was largely left up to entrepreneurs, rather than bureaucrats. The results speak for themselves.

Attacking The Problem Of Technology Transfer

One point that Bush did not address was that of technology transfer, which is how a promising technology gets to market. This usually involves industry scientists poring through mountains of academic papers or working through a government agency’s office of technology transfer (like the one at NIH). It can be a cumbersome and ineffective process.

There are also informal barriers. As Lynda Chin, Chief Innovation Officer for Health Affairs at the University of Texas System, pointed out to me, we still have a long way to go to effectively integrate the work of scientists who make new discoveries with that of engineers who transform those insights into new products and miracle cures.

“Basic science is a long-term proposition,” she told me. “You need to be single-minded and stick with it until your hypothesis is disproved. Applied science, however, requires execution by a cross-disciplinary team and you need to constantly make decisions about time and resources, taking into account not only probability of success, but also opportunity cost. Often, projects need to be delayed or killed outright if they are not feasible in an actionable timeframe.

“This means two different cultures and it is a challenge to integrate them effectively. We need to build a culture of understanding between the two disciplines.” At MD Anderson, she helped found the Institute for Applied Cancer Science to do just that. But to effectively accelerate progress, we need a more pervasive, national effort.

Integrating Our Way To The Future

While no one can deny the success of Bush’s plan, the world has changed quite a bit in the 70 years since he wrote Science, the Endless Frontier. Many of the fledgling industries that he sought to support, such as information technology and pharmaceuticals, are now major components of the economy.

So we desperately need to update the initial vision. We can no longer assume that we can separate things into neat little categories like basic and applied research and hope that they meet up somewhere down the line. At the same time, with the world moving as fast as it does now, private industry needs help to stay abreast of it all.

One possible model is President Obama’s National Network for Manufacturing Innovation (NNMI), which sets up innovation hubs where people from both industry and academia can collaborate on specific areas such as 3D printing or advanced materials. These are public-private partnerships, so they have minimal budget impact and everyone has skin in the game.

Another approach is for the federal government to set out ambitious goals that galvanize researchers, industry and the public, such as Vice President Biden’s recent call for a Moonshot for Cancer, as well as smaller efforts already underway in areas such as robotics and nanotechnology. These, much like the original moonshot and the Human Genome Project, can deliver outsize impact.

Perhaps most of all, we need to recognize that people like Steve Jobs and Elon Musk don’t succeed on their own. Today, we live in a world of the visceral abstract, in which our most important products come from the most unlikely ideas. While we lionize our great entrepreneurs—and rightly so—we cannot neglect the science that makes them possible.

This article originally appeared at DigitalTonto.

Follow Greg on Twitter @Digitaltonto  
