Believers-in-Technology of all Nations, Unite!

Three criteria that make digital information concepts shine.

© EtiAmos / AdobeStock


Don’t worry. We’re not looking to found a party or to devise new political or economic theories in which technology becomes the collective solution to all problems. Instead, we’d like to report on a phenomenon that can be observed at many industrial companies – often enough even at kothes.

A fictional example of what it might look like at any industrial company in Germany: a typical conference room. A long table seating five to ten people, “wood-effect” wallpaper on the walls, Metaplan boards and flipcharts in opposite corners. A big-screen TV displays a slide in large letters: “Workshop: Industry 4.0 at XYZ GmbH”.

The moderator lays out the market’s upcoming challenges for the participants and cites the increasingly fierce competition before the brainstorming for solutions can begin. Then the technology-dropping starts: Augmented Reality, Smart Glasses, Artificial Intelligence, Digital Twin, Deep Learning, and Voice Assistant are the terms called out in the room or written on moderation cards. A technology with majority appeal emerges. A striking application opportunity for the company is formulated. Various providers are invited, the most promising is commissioned with a Proof of Concept (PoC), and so on.

This approach can work, but it often fails – and then the technologies get the blame. Yet while they’re usually part of the solution, they are only very rarely the solution themselves. In hindsight, concepts that lead to a successful implementation typically had to overcome three important challenges:

First, the right use case has to be found. Possible applications are discussed, but they’re often based only on what one has seen or heard about the technologies – usually marketing material from the technology providers themselves. For their part, the providers have made assumptions and suggest that they already have the use cases, and that implementing them will bring a demonstrable increase in efficiency. That’s legitimate, but it doesn’t necessarily help the company. Every company must closely scrutinise – whether internally or with external help – which use cases offer enough efficiency potential to justify introducing the technology.

The Amazon Fire Phone can serve as an example of a bad use case. Back in 2014, Amazon wanted to position this smartphone in the market as an "iPhone killer". Among other things, the "Dynamic Perspective" feature, which provided 3D views from almost any angle, was supposed to bring the breakthrough. Unfortunately, there were hardly any use cases or applications for this feature. In addition, battery life apparently suffered, and production of the Fire Phone was eventually halted.

The second criterion: the target audience must be considered. A simple insight that is unfortunately difficult to get across. If, for example, a meaningful use case is to be developed that makes service calls to a machinery plant more efficient, the service technicians must be involved. It’s of little value if everyone in the conference room agrees but the solutions that take hold don’t help "out in the field".

There are plenty of products and services that failed to take the target audience into consideration – take the "Portable Baby Cage", for example, probably invented in the 1920s. It was intended to give children in balcony-free big-city apartments plenty of fresh air. The target audience – the worried parents – obviously weren’t enthusiastic about the idea in the long term.

The last challenge: use technologies only when they’ve reached the maturity level appropriate for your own use cases. Each year, the market research and advisory firm Gartner Inc. publishes its Hype Cycle for Emerging Technologies. The core message: technologies at some point become the focus of attention and, at the peak of the hype, create unrealistic expectations that they cannot (yet) fulfil. Once the public becomes aware of this, the technology sinks into a trough of disappointment. There are two ways out of this trough: either the technology continues to evolve until it can eventually meet the once-unrealistic expectations, or the expectations become more realistic and the technology is used in a different context – and thus reaches productive use.

The technology’s degree of maturity must therefore match your own use cases. Whether the technology is currently being discussed across all media or has already been written off is irrelevant: if it meets the requirements of the target audience and solves your use cases, it’s at the appropriate level of maturity.

That’s why it didn’t help Apple to present its Newton PDA at such an early stage. Without the digital business models, mobile data connections and importance of the Internet that only arrived with the first real smart devices, the technology couldn’t establish itself effectively.