Just when you have your data collection and analysis systems in place, technology changes mean that your company needs new, updated systems. This is a problem for many companies — but it can also be an opportunity.
Relentless technology change can feel like a rollercoaster — simultaneously exhilarating and exhausting. Just as we reorient from one twist, we’re thrust into the next one.
For managers, one difficulty with new technology is that it typically must integrate with the old. Managers working with data generated by technology rarely have the luxury of a single version of data. Instead, the analysis must incorporate multiple generations that — by definition — differ from each other.
The rise of embedded devices for data collection makes this situation far, far worse. Consider the industrial equipment market: The long life of the hardware, combined with the rapid evolution of the sensors attached to it, makes the resulting data difficult to manage.
Siemens AG provides an excellent example of both this difficulty and the opportunity it presents. Gerhard Kress, director of mobility data services, describes the data challenge the situation creates:
“Industrial equipment has time spans that are so much longer than IT time spans. That is a huge issue because, for example, you cannot just shut down the control center of a power plant to upgrade. Or given that the average life span of a rolling stock vehicle is about 30 to 40 years, you have a large, installed base that simply does not have all the modern functionalities in it yet.”
Instead, his group must be ready to handle the latest technology (for new products) as well as quite old technology (for its installed products). Siemens, for instance, still services a water turbine that was installed 105 years ago, and lifetimes for heavy equipment "easily last 20 to 40 years." While it's a challenge, Siemens is able to create advantage from this difficulty in several ways.
Turning Legacy Into Advantage
Using its data platform, Siemens differentiates through compatibility. Kress notes: "One of Siemens's advantages, and a barrier to entry for the other [nonrail] players … is that [we] understand very old rail data formats. Siemens can read data formats from 30 years ago; we have to read them." While someone else could develop systems to read these formats, they would take significant time to build. Kress notes that documentation is rarely a priority and that "a lot of the documentation was only through people, and [gathering information] required talking to the experts who designed it." Reverse engineering takes considerable time and effort, so organizations that already understand these legacy formats have an advantage.
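To make Kress's point concrete, consider what reading a very old data format typically involves. The sketch below is entirely hypothetical (the record layout, field names, and units are invented for illustration, not taken from any Siemens system), but it shows the kind of fixed-width binary decoding, with units and semantics that live only in institutional memory, that such work requires:

```python
import struct
from datetime import datetime, timezone

# Hypothetical legacy record: a fixed-width, big-endian binary layout of
# the kind an old onboard logger might emit, with no self-description.
#   bytes 0-3   unit id        (uint32)
#   bytes 4-7   unix timestamp (uint32)
#   bytes 8-9   axle temperature, tenths of a degree C (int16)
#   byte  10    status flags   (uint8; bit 0 = door open)
RECORD = struct.Struct(">IIhB")

def parse_legacy_record(raw: bytes) -> dict:
    """Decode one fixed-width record into named, unit-converted fields."""
    unit_id, ts, temp_tenths, flags = RECORD.unpack(raw)
    return {
        "unit_id": unit_id,
        "timestamp": datetime.fromtimestamp(ts, tz=timezone.utc),
        "axle_temp_c": temp_tenths / 10.0,   # stored in tenths of a degree
        "door_open": bool(flags & 0x01),
    }

raw = RECORD.pack(4711, 1_700_000_000, 385, 0x01)
print(parse_legacy_record(raw)["axle_temp_c"])  # 38.5
```

The code itself is trivial; the barrier to entry Kress describes is knowing the layout, scaling factors, and flag meanings in the first place, which often exist only in the heads of the original designers.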
Understanding How Data Collectors Evolve
Beyond just the old data formats, Siemens has also built up considerable expertise in tracking changes in the data-providing sensors themselves. Kress describes that with sensor data, “I cannot assume sensors to be always correct, because they might have a life span of 10 years but are installed on equipment that is still in use. Some begin generating errors.”
As a result, Siemens spends considerable effort validating data from its equipment and has a rich understanding of how data reliability changes over time and of the kinds of errors that begin to creep in. In an industrial setting, Kress describes how misinterpreting this data can be critical: "because imagine you take a train out of operation and evacuate all the passengers because you believe something is broken, but it turns out it's a sensor." An organization with a deep history with its data gains an advantage: it knows how to interpret that data correctly.
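A minimal sketch can illustrate the kind of plausibility checking involved, assuming two common failure modes of aging sensors: readings outside the physically possible range, and a sensor "sticking" at one value. The thresholds, the stuck-value heuristic, and the function itself are assumptions for illustration, not Siemens's actual method:

```python
def validate_readings(readings, lo=-40.0, hi=150.0, stuck_n=5):
    """Tag each reading with a suspect flag.

    A reading is suspect if it falls outside the physically plausible
    range [lo, hi], or if the sensor has reported the exact same value
    stuck_n or more times in a row (a hypothetical stuck-sensor check;
    earlier readings in the run are not retroactively flagged).
    """
    out = []
    run, prev = 0, None
    for v in readings:
        run = run + 1 if v == prev else 1
        prev = v
        suspect = not (lo <= v <= hi) or run >= stuck_n
        out.append((v, suspect))
    return out
```

In Kress's train example, a layer like this sits between raw telemetry and any alarm, so that a failing sensor raises a maintenance ticket rather than an evacuation.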
Creating New Algorithms for Old Data
The interpretation process can take the form of tacit knowledge in the organization. But it can also manifest itself in explicit and differentiating algorithms. Kress describes that “we’ve also spent [an] enormous amount of effort in the last three years in researching new methods, because industrial data behaves very different than, let’s say, clickstream data from the internet.”
Siemens uses standard machine learning but has also developed new methods when needed. Kress notes, "We've filed about 30 patents on new mathematics, because we couldn't find what we needed." These advances provide further opportunity for advantage, since "it took us a while to learn and to find out that these structural differences in the data [are what] cause so many problems." Organizations that work with data from both old and new equipment learn where modern techniques fall short in this context and can gain advantage by developing tools that no one else has.
Ideally, organizations would be able to keep all of their data-providing equipment on the latest and greatest platforms. In many IT contexts, this may be nearly possible. The IT "Gang of Four" (Amazon, Apple, Facebook, and Google), for example, can often deploy changes relatively quickly across their technology infrastructure; their historical data may differ, but their current data is homogeneous. Or companies can ask their eager consumers to embark on yet another buying spree of adapters and peripherals to stay compatible with the latest gizmo that has rendered existing products obsolete.
But realistically, many organizations operate in contexts where such a rapid cutover is nowhere near possible. As more and more companies become technology companies and embed technology in their products, this difficulty will only become more widespread. Yet rather than treating it as a disadvantage, organizations can create advantage by effectively managing the complexities that mismatched physical and virtual time horizons create.