The Increasing Complexity of Technology
An Interview with Innovation Scholar Don Kash
Following our Lessons From the Shale Revolution conference in late January, the Breakthrough Institute had a chance to catch up with Don Kash, professor emeritus of George Mason University’s School of Policy and Government. Kash spent his entire career working at the intersection of technology, policy, and society, and has been a major influence on contemporary energy and innovation scholars. Long before the Breakthrough Institute made the case for ‘making clean energy cheap,’ he highlighted the crucial role of public energy R&D to improve environmental outcomes.
Following in the tradition of Joseph Schumpeter and Alfred Chandler, Kash spent much of his later career documenting the transition to postindustrial societies in advanced economies. He argued that the emergence of new production systems and shifting consumer expectations had made technology more complex. And as technology evolved, so too did the requirements of innovation: collaboration was increasingly a necessity. Much of today’s work on open innovation, public-private partnerships, and policy interventions to tackle network failure owes a debt of gratitude to Kash’s pioneering work.
Tell us a little bit about your education and background.
I received all three of my degrees from the University of Iowa. I did my PhD dissertation on international cooperation in space because I attended a NASA-sponsored meeting in the summer of 1972 at which space experts talked about what NASA should do once it got a man on the moon. This was the first step in the nearly 50 years of my life spent trying to get my head around technology and innovation. I started with space, then the military, then energy, and then technology more generally.
When did you start working on energy?
In 1970, I started the Science and Public Policy Program at the University of Oklahoma. I was interested in trying to understand the second order consequences of technology, the indirect effects. The first technology assessment we did at Oklahoma was on offshore oil and gas in 1971. We did it in part because of the Santa Barbara blowout that occurred in 1969. It was the largest oil spill that had ever occurred, and was really the trigger for the creation of EPA and the passage of legislation that required environmental impact statements.
The assessment was released as a book, called Energy Under the Oceans, in 1973, a month before Nixon announced a tripling of leasing of offshore sites. Naturally, it got a lot of attention, and then it got even more attention after the Oil Embargo. Offshore was seen as the future, and the book helped drive revisions to the Outer Continental Shelf Lands Act. As I used to tell my graduate students, timing is always the key!
In 1984, I wrote a book called US Energy Policy: Crisis and Complacency, which attempted to lay out a strategy for energy innovation in a period where the Reagan administration had significantly reduced most federal R&D in energy. The book argued that the country couldn’t rely on market mechanisms to secure its energy future, both because of basic economics but also because of the social and environmental impacts of energy.
So how did you transition from working on energy to innovation more broadly?
In 1989, I published Perpetual Innovation, which argued that we had entered an era of continuous innovation and continuous obsolescence. The striking thing about this new period was that designed obsolescence became an obsolete notion because technology no longer had time to wear out; it was replaced by something that was so much better. In the old days, you would replace your car when it broke down with something roughly comparable. Today you switch from the iPhone 4 to the 5 not because there is anything wrong with the 4 but because the 5 is so much better.
When did this period start?
The origins of this era were in WWII, when the feds got involved for the first time in developing new technology, including radar, the atomic bomb, penicillin, and more. After the war, and up until 1960, industry was essentially making roughly the same stuff it had been making before the war because there was so much pent-up demand. By 1960, however, most of this demand was spent, so industry needed to look elsewhere.
The space program came along at just the right time, and we saw federal expenditure rise substantially during that period. A key benefit of the space program was the support it provided to universities on a massive scale, which created a huge, highly qualified technical cadre for industry to hire. Similarly, the Department of Defense was driving technology. Although supporting industry was rarely the direct goal of DOD’s work, in effect DOD served as our department of industry.
In your later work, you advance the thesis that technology is becoming more complex. What are some of the reasons behind this?
What strikes me is that we don’t have many E=mc² breakthroughs anymore. Almost all of our innovations are incremental improvements to components of systems. When we get a change in trajectory, it’s because the components from one system merge into another. As an example, think of the use of electromechanical tools in biological systems. Or think of manufacturers who have, as a part of their process, biological processes that create synthetic materials.
This pattern is, in part, the fruit of the digital revolution. What you have now is integration of different departments in engineering schools. The future is probably best seen as one characterized by the synthesis of genetics, robotics, information technology, and nanotechnology. I think it is generally true that the old disciplines aren’t satisfactory for doing most things anymore; the world has become interdisciplinary.
Will technology ever get simpler again, or is it fated to grow continuously in complexity?
The scientific era has been characterized by the development of three-variable theories we have used to explain most things. Most of us look around for one-, two-, or three-variable explanations of what’s going on because our mental capabilities cannot handle much more. We may get another unifier, à la Einstein, but I’m skeptical. There is a possibility of simplicity, whereby an individual can understand something in detail and accurately communicate it over time and space, but that is not the way I see things going. Most complex innovations will require the involvement of networks as well. I haven’t been able to find a complex technology that is the product of a single firm, even if it’s a firm like GE.
How does the role of IP differ between complex and simple technologies?
My general sense is that patents have been impediments to speeding up innovation in complex technologies. They can facilitate innovation in simple technologies, which are more easily replicable. Apple and Intel’s use of patents has probably slowed innovation down elsewhere.
Who did you find yourself most often arguing against?
Almost from the beginning, the people with whom I argued the most were the conventional American economists who were never willing to grapple with tech change and where it came from. For the most part that is still a community that sees technology as something that just falls from heaven.
In fact, most changes to technology come from the evolution of technological capabilities, and recognizing this takes you places where there is no demand, where it has to be created! I spent a large part of my life not realizing I needed a computer and an even longer part of my life not realizing I needed a smartphone. Conventional economics hasn’t come to grips with this (with some exceptions). It looks like the addiction to sophisticated mathematics has gotten in the way of progress. If there’s a conflict between theory and facts, facts have to go.