2015

The 53rd Annual R&D 100 Awards Report-Out: November 13, 2015

When a company’s innovation levels are high, or when they are low but not recognized as being low, GGI has suggested that our clients and customers enter an award contest.  In the former case, a win creates price-premium and brand-value opportunities for several years.  In the latter case, competing can be calibrating for both management and the members of the cross-functional product development community.  Benchmarking company products provides learning opportunities up and down the ladder.

There are many award competitions around the globe each year.  Regardless of the purpose or the type of award, there are none as prestigious as the annual R&D 100 Awards – excepting possibly a Nobel Prize.  The R&D 100 Awards are now produced by Advantage Business Media [ABM].  ABM has become one of the publishing powerhouses in the scientific and engineering community with a portfolio of some twenty-five trade publications across the fields of communications, design, manufacturing, and science.  R&D Magazine is among its most prized assets and is the originator and long-time home of the R&D 100 Awards.

As a subscriber to R&D Magazine since the early 1990s, I had followed the results most years and of course when GGI’s clients competed.  But, I had never been to the actual awards ceremony.  The bottom line is that everyone should go to the R&D 100 Awards Dinner at least once in their career.

New Twists For The 2015 R&D 100 Awards

Two things were different this year.  First, ABM decided to produce a two-day conference preceding the Awards banquet on Friday evening.  The 1st Annual R&D Technology Conference had four tracks and attracted over two hundred registrants.   Tim Studt, R&D Magazine’s renowned editor of several decades and the chief architect of the R&D 100 Awards and annual R&D Funding Forecast, asked me to speak on how the complexity of managing R&D will increase in the next decade.

The second difference was that, for the first time in the awards’ 53-year history, the winners were not announced in advance.  You may imagine that a good number of people were bursting with anticipation throughout the conference, right up to the end of the Awards banquet at 10:30 PM Friday evening.

A black tie event, the Awards banquet did not disappoint.  Held at beautiful Caesars Palace in Las Vegas, guests were first surprised by the entrance to the ballroom.  There was immediate curiosity as to what might await as one passed through the gateway.

53rd R&D 100 Awards Entrance

As I mentioned, the R&D 100 Award winners were not announced in advance.  After a nice dinner, the lights went out and on came the spotlights.  ABM had orchestrated the event to the level of the Emmy Awards.  There were spotlight instructions identifying where every winner sat at every table, in sequential order, for all 100 winners and for the approximately twenty-five additional awards presented that evening.  The only thing missing was the TV cameras.

53rd R&D 100 Awards Event

Tim Studt and Bea Riemschneider hosted the evening.  Bea is the Editorial Director of Advantage’s Science Group. The Science Group includes R&D Magazine and ten other science and medical publications.


The 2015 R&D 100 Award Winners – Highlights

The largest group of winners was the United States National Laboratories and NASA.  Collectively, they took home over thirty awards and were listed as contributors on a dozen more winning entries.  A number of these awards, notably from Oak Ridge National Laboratory, were for advances in 3D printing technology.

The next big block of winners came from companies that all shared the word “Dow” in their names.  “Dow companies” took home seven awards.

The third big block, consisting of four awards, came from Taiwan ROC’s Industrial Technology Research Institute [ITRI].  One of these awards will soon be seen in fire departments everywhere.  ITRI created an impeller that sits in line just before a fire hose’s nozzle and spins with the flow of water.  The spin generates electricity, which is captured to power a new spotlight attached to the nozzle.  Whenever the hose is used, the light shines on the target.

From memory, I think Tim said from the podium that the winners came from twenty countries.  Some of the more recognizable winning companies included Waters, Millipore, Ethicon, Thermo Electron, Milliken, Agilent, Qualcomm, IBM, Boston Scientific, Adelphi, MSC, Mitsubishi, Toyota, and Shimadzu.

Several universities also won awards.  Another possible game-changing product came from the University of Central Florida.  They have patented an electrical transmission cable with an outer concentric cable that acts as a battery, so a single unit both transmits and stores power.  There are implications for many industries.

Finally, and please pardon me for this, but as the night proceeded I counted roughly twenty companies that had participated in GGI’s Innovation or Metrics Summits in the past 5-8 years.  My thought was that many of these innovation-hungry companies must have made a methodical assessment of what was available to be learned on the subject as part of their effort to innovate.  GGI’s next Summit, the 13th R&D-Product Development Innovation Summit, is December 8-10, 2015.


The 2015 R&D 100 Award Winners – List

Lindsay Hock, Editor of Product Design & Development Magazine [PD&D], published the list of the 2015 winners on November 18.

2015 R&D 100 Award Winners


The 2015 Inaugural Technology Conference – Summary

Bea Riemschneider, Editorial Director for ABM’s Science Group, published an article on November 19 that summarizes the first annual science and technology conference.

R&D Industry Leaders Explore the Innovation Process at Inaugural Science and Technology Conference

The Journey to Mastering Innovation

Let’s consider for a moment the corporate journey to improve innovation that has taken place, and where we are now.

Productivity improvement was driven into logistics in the ’70s and ’80s, into manufacturing and operations in the ’80s and ’90s, and into the supply chain in the ’90s and ’00s. As a result of those efforts, companies that create and commercialize products achieved the lion’s share of their improvement entitlements in the “downstream half” of the company.

Improvements in R&D and product development, beginning in the ’80s and extending to the ’00s, similarly focused on productivity via improved execution by striving for faster time-to-market and speeding-up product life cycles. By the early ’00s, companies could foresee that the opportunity for big improvements was also nearly exhausted. The only place to look for the next big opportunities was further upstream in the company.

In the early 2000s, companies began focusing on pushing the best possible portfolio of products through their fairly well optimized development and operations pipelines. The quest for higher-value, more-innovative portfolios began.

But alas, one commonality among the past three decades of improvement initiatives was that they primarily focused on improving the bottom line. Profit is no small thing, granted, but 30 years of corporate focus on the bottom line left many companies challenged to attack top-line initiatives with the same vigor and expertise.

The Journey to Mastering Innovation [Machine Design – November 2015] makes the case that the next great source of competitive advantage will accrue to those companies that master the ability to innovate – not innovation-as-execution, but rather innovation-as-creation.

Innovation = Creation

Physical vs. Virtual Colocation, and the Effects of Interruption

Officially, “colocation” is the proper spelling, but “co-location” and “collocation” are also recognized spellings. Regardless of how you spell it, it boils down to the science of communication probabilities and qualities between individuals.

Tom Allen, at MIT in the late 1970s, put the first benchmark on the table. He found that the probability of communication depended on whether any two people had common organizational bonds, such as working in the same department or on the same team. He called this “intra-group” communication; otherwise it was “inter-group.”  If people sat more than 10 meters from one another, there was only a 5% chance of inter-group communication and a 10% chance of intra-group. Unless people sat close to each other, they rarely communicated.
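Allen’s benchmark can be sketched numerically. Only the two 10-meter probabilities come from the findings cited above; the exponential fall-off shape is an illustrative assumption, not Allen’s actual curve fit.

```python
import math

# Cited benchmarks: probability of communication at a 10-meter separation.
P_AT_10M = {"intra_group": 0.10, "inter_group": 0.05}


def communication_probability(distance_m: float, bond: str = "inter_group") -> float:
    """Assumed exponential decay, calibrated so p(10 m) matches the benchmark.

    Assumes p(0) = 1.0 and p(d) = exp(-k * d); the decay shape is hypothetical.
    """
    p10 = P_AT_10M[bond]
    k = -math.log(p10) / 10.0  # solve exp(-k * 10) = p10 for k
    return math.exp(-k * distance_m)


print(round(communication_probability(10, "intra_group"), 2))  # 0.1
print(round(communication_probability(10, "inter_group"), 2))  # 0.05
```

Any calibrated decay function would make the same qualitative point: past a short distance, the probability of spontaneous communication collapses.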

By the early 1990s, while videoconferencing, the Internet, and email were emerging, a study conducted by GGI found 300 companies examining roughly 50 distinct approaches to simulating colocation. A new industry was developing to facilitate effective colocation regardless of physical distance. Since then, a myriad of “solutions” have entered the marketplace.

Alas, the enabling technology has advanced more quickly than its target audience’s behavior. If one examines the relationship of individuals to their work assignments and locations, individuals have not changed significantly since the 1930s studies driven by unionization efforts. Absent systematized and enforced corporate policies, individuals still largely behave as they did 80 years ago with regard to the task in front of them.

Physical vs. Virtual Colocation, and the Effects of Interruption [Machine Design – October 2015] discusses several studies conducted in the past ten years that explore the differing workplace environment for physical versus distributed workers and the effects of interruptions on individual and corporate productivity.

TRIZ: A Best-In-Class Innovation Tool

TRIZ, a systematic innovation technique developed in the Soviet Union, whose acronym loosely translates as “Theory of Inventive Problem Solving,” is based on an analysis of the inventive attributes of several hundred thousand patents.  Evolving since the 1950s, the body of knowledge was codified and documented by the 1980s.  It was first translated into English in 1992.

Like mathematics, where two plus two always equals four, the inventive attributes of patents are timeless.  Several companies, in very recent times, have repeated and expanded the original analysis without any significant changes to Genrich Altshuller’s original findings.  Altshuller is now deceased, but several global organizations continue to nurture and espouse his TRIZ frameworks.

GGI researched some 300 innovation tools that are available to practitioners today.  Not surprisingly, the USPTO web site was the most-cited tool in use by companies in North America.  TRIZ emerged as the second most popular innovation tool in use today.

TRIZ Plus – A Modern Tool for Enhancing Design Innovation, written by Doug Hoon in his blog on Machine Design’s web site, describes some of the key attributes and benefits of the methodology and cites some of GGI’s research findings on TRIZ.

Planning For Intellectual Property Revenues

Not too many years from now, business and program planners will have a new challenge when preparing a product or business plan.  Since the dawn of the Industrial Age, the state of practice has been to estimate an ROI by forecasting product revenues and profits.  Soon, product and business plans will contain two forecasting spreadsheets.  There will be the section on product revenues and profits that we are all familiar with.  And, there will be revenue and profit forecasting for IP.

Work on financial liquidity and monetization of IP began in the 1990s.  Baruch Lev at NYU is often credited with lighting the torch.  Let’s define financial liquidity as the ability to more easily transact IP in the marketplace as a commodity, and monetization as the ability to assign a dollar value to a block of IP, along with the rate at which that initial value depreciates over time.  Liquidity and monetization are different challenges.  One requires marketplaces capable of transacting assets as commodities.  The other requires the ability to assign a recognized value that can be generally agreed upon.  To press the point, some day we might see certain types of IP traded like gold, silver, and soybeans.  The market would have much less volume, but the principles would largely be the same.  So how do we get there?

Financial Liquidity:  Markets are already in development [Figure 1].  There are a number of them, all emerging.  Who knows which will stand the test of time, and new ones are certain to emerge.  Right now “scouting firms,” “innovation intermediaries,” “IP auctioneers,” and “crowdsourcing companies” all stand to be market makers, along with a handful of others.  The leading half of industry is testing the waters at some level.  It typically takes about half of industry to be active before software developers start generating applications to manage new activities.  Software has begun to emerge at an increasing rate over the past couple of years.

Figure 1
Percentage of Companies Currently Utilizing One or More IP Markets


Monetization:  Quietly, over the past twenty years, committees and groups have been studying the merits of actually putting IP assets into financial statements.  A partial list reads like alphabet soup, but you’ll probably recognize them:  SEC, COMEX, FASB, and NAA.  Their goal is to be able to assign a value to a block of IP such that it can be treated like a new furnace, truck, or mainframe and placed on the balance sheet as an asset and depreciated.  To meet generally accepted accounting principles, both a depreciation rate and a liquidation value would have to be determinable.  Not all of this must be worked out before the market can become liquid, but some progress is necessary to determine ranges of salable values.  Accountants are currently paying attention to transacted values agreed between a selling and a buying entity.  After thousands of transactions occur in the years ahead, there will be enough data to assign some standard values.  There is great argument, however: many do not want standard values, because companies may lose their ability to negotiate premiums for their prized jewels.
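As a hedged illustration of the accounting mechanics described above, the sketch below applies ordinary straight-line depreciation to a hypothetical block of IP, exactly as one would to a furnace or a truck. The dollar figures, liquidation value, and useful life are all invented for illustration.

```python
# Straight-line depreciation of a hypothetical IP asset on the balance
# sheet. Both a depreciation rate and a liquidation (salvage) value must
# be determinable, as the text notes. All figures below are invented.

def book_values(initial_value: float, liquidation_value: float,
                useful_life_years: int) -> list[float]:
    """Year-end book values under straight-line depreciation."""
    annual_charge = (initial_value - liquidation_value) / useful_life_years
    return [initial_value - annual_charge * year
            for year in range(useful_life_years + 1)]


# A $1.0M patent block with a $100K liquidation value over five years.
values = book_values(1_000_000, 100_000, 5)
print(values)  # [1000000.0, 820000.0, 640000.0, 460000.0, 280000.0, 100000.0]
```

The hard part, of course, is not the arithmetic but agreeing on the initial value and the liquidation value in the first place, which is precisely what the committees above are debating.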

Planning for Intellectual Property Revenues [Machine Design – July 2015] discusses the inevitable advancement of IP into the everyday business planning and decision making processes associated with new products. The article concludes with a bit of stretch thinking.

Measuring Product Development Vitality

One of the things that satisfy engineers and product developers the most is to see their newly released products sell like hotcakes.  It goes right to the core of why they pursued careers in science and engineering, to create things that better the lives and capabilities of others.

One of the things that satisfy finance and business professionals the most is to see the products they decided to invest in become successful.

One of the things that satisfy investment bankers and brokers the most is to see companies with a continual stream of winning products year after year.

Perhaps these are the reasons why a metric that did not exist before 1988 is now the number one corporate R&D performance metric in North America.

Most of us know the metric as “New Product Sales,” or “Revenue Due To New Products,” along with a myriad of other names.  What they all strive to measure is the “newness” of annual revenues.  The actual value of this key performance indicator [KPI] differs greatly by market, by the length of product lifecycle, and by the age of the company.

Published on Product Design & Development Magazine’s website, Measuring Product Development Vitality examines the evolution and the industry adoption rates of this metric since its inception in 1988.  The article explores differing definitions of the metric and differing definitions of how companies classify a product as being “new” or “old,” offers benchmarks for industry values when a new product is defined as being less than three years old, and discusses consequences when companies try to game the value of the metric.
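The metric discussed above reduces to a simple revenue-weighted ratio. The sketch below uses the three-year “new” threshold mentioned in the article; the portfolio ages and revenue figures are invented for illustration.

```python
# "New Product Sales" (product development vitality) sketch: the share of
# annual revenue earned by products younger than a threshold age.
# The portfolio below is hypothetical.

def new_product_sales_pct(products: list[tuple[int, float]],
                          new_age_threshold_years: int = 3) -> float:
    """Percent of total revenue from products under the 'new' threshold."""
    total = sum(revenue for _, revenue in products)
    new = sum(revenue for age, revenue in products
              if age < new_age_threshold_years)
    return 100.0 * new / total


# (age_in_years, annual_revenue) pairs for a hypothetical portfolio.
portfolio = [(1, 40_000_000), (2, 25_000_000), (4, 60_000_000), (7, 75_000_000)]
print(round(new_product_sales_pct(portfolio), 1))  # 32.5
```

As the article notes, the “right” value differs greatly by market, product lifecycle length, and company age, and the definition of “new” is itself a lever companies can use to game the number.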

Press Release: 12th Summit To Focus On Advances In Corporate R&D Innovation & Intellectual Property Strategy

NEEDHAM, Mass. — (BUSINESS WIRE) — February 23, 2015 — Corporate demand for specific strategies, tactics, techniques, tools, and software to bolster Corporate Innovation has steadily increased for fifteen years.

As corporate demand grew, large think tanks and research firms studied what was working. Solutions providers, makers, and service firms entered the market as money could be made. Like all emergent markets, many initial offerings had a high fallout rate. While that is still the case, certain approaches are producing financial results and initial patterns of success are apparent.

Intellectual Property [IP] strategy and management, both by itself and aided by the pull from increasing levels of corporate innovation, has also matured. Starting a few years earlier than innovation, circa the technology boom of the 1990s, companies sought to trade, barter, license, purchase, and sell IP at greater levels than any prior time in history. As the IP market grew, suppliers emerged to meet the increased demand.

Successful IP enablers are also beginning to sort themselves out. Corporate abilities to identify, value, and transact IP will soon approach corporate abilities to identify, price, and sell winning products.

The R&D-Product Development Innovation Summit, last held in 2012, offers an update on the evolution of the Corporate Innovation and IP bodies of knowledge. Much changed during the Great Recession. Companies really had to focus to generate good business results in an increasingly normalized, globalized, and economically challenged competitive landscape.

This fact-driven and data-intensive event, targeted to CXOs and corporate leaders, draws on findings from over fifty secondary research sources along with primary research sources on both subjects.  GGI’s sixth industry study since 1998 found notable shifts in the strategic management of R&D and in processes for Organic Innovation.  Open Innovation is on the rise.  Patent and Trade Secret practices are changing.  The ability to determine the value of, and then to monetize, IP is maturing.  And the metrics corporations use to measure R&D and IP output and productivity are now primarily focused on business results.

Produced in conjunction with The Management Roundtable of Waltham, MA, this Summit holds the content to empower executives with the state-of-the-state knowledge needed to lead innovation and IP improvements in their companies.

The Difference Between Research and Development

The already unclear lines separating research from development are getting even blurrier as more companies allocate some part of their R&D budget to take on riskier projects, and invest in the necessary infrastructure to manage these riskier activities.

New products are now being launched out of recently formed “Innovation” organizations, and more are coming from existing “Advanced Development” organizations.  “Product Development” is no longer the only organization that launches new products.

Several factors have complicated matters for industry observers trying to stay abreast of what might be coming to market by simply paying attention to product development pipelines. These factors include:

  • The changing corporate approaches described above.
  • The desires of developers to bring solutions to market, not just pieces of a solution.
  • The globalization of R&D that has, in effect, decentralized R&D.
  • Naming conventions for organizations that differ by industry and country.

The jury is still out as to whether today’s approaches to R&D will prove more productive than historical approaches. Historical approaches to “pre-product development” generally restricted the scope of activities to reduce uncertainty and improve the predictability of key enabling features, capabilities and technologies—and then turned those enablers over to product development.

The Continuum™ of Research and Development

The Continuum of Research and Development

Published on R&D Magazine’s Research & Development website, The Difference Between Research and Development discusses historical approaches to The Continuum™ and contrasts them with the changing corporate practices that are occurring today.  The available alternatives for the navigation system of a robotic lawn mower are used to illustrate the key points.

Why The Innovation Revolution?

Just about every company, since the Industrial Revolution began in the late 1800s, has wished to improve its level of innovation.  How to be a better innovator has been a subject of study for decades.

It was not until the early 2000s however, that collective industry demand for “better innovation” reached a level so as to spawn a revolution in the slowly evolving body of knowledge.  Why then?  Many factors contributed.

In the early 1980s, industry began shifting from a focus on manufacturing and operations excellence to a focus on R&D and Product Development excellence.  The first articles on competing through product development excellence appeared in 1983.  In 1986, Robert Cooper introduced the first “Stage-Gate” framework in his book “Winning at New Products.”  Initially designed to improve the “over the wall” handoff from engineering to manufacturing, the framework rapidly evolved into an end-to-end process from concept to customer.  With an end-to-end framework now available to all industries, the time-compression and Time-To-Market quest began.  By the late 1990s, some companies were moving so fast that they were losing potential ROI by replacing their own products too quickly.  As a result of the extreme emphasis on time, design communities had even less time for free thinking.  Having less time ran contrary to many of the fundamental reasons engineers, scientists, and designers went into their professions.  There was no longer as much time or budget to innovate.  A “push back” began to emanate from design communities.

Two other major industry initiatives, Six Sigma and Lean for product development, affected development communities in a similar manner.  Lean resulted in fewer people doing the same work, and added some overhead to measure and monitor capabilities.  Six Sigma added many requirements to the execution of a product development process, while focusing efforts on eliminating all sources of unnecessary variation.  Yet experimentation and variation are necessary for innovation and invention.

Globalization across industries also contributed, for good and not so good reasons.  When products could get knocked off without the ability to enforce the intellectual property, companies had to improve their cycles of innovation and learning to get and to stay ahead.  At the same time, there were now many more competitors addressing what once were largely captive geographic markets.  New entrants wished to be better than the current market players.  Current market players wished to be better to stave off new entrants.

The ability of just about anyone to develop software, combined with the advent of the Internet, was also a giant driver.  Many classic industries were under siege by companies that either delivered their products or services in a different way, or that added value to products through software and/or internet connectivity.

The confluence of Time-To-Market, Six Sigma, Lean, Globalization, Software Emergence, and Internet Emergence during the 1980s and 1990s was the primary driving force that led to an innovation revolution in the 2000s.

The 12th R&D-Product Development Innovation Summit, April 7-9, 2015, offers a current snapshot of the progress in the development and maturation of the Innovation Body of Knowledge during the past fifteen years.  GGI espouses no methodology of our own at the Summit.  Our goals are simply to look at what has taken place, to cull out what works, and to identify what companies are adopting into practice.  The Summit, a defined three-day curriculum targeted to decision makers and thought leaders, will address innovation and intellectual property strategies, tactics, processes, techniques, tools, and software.
