Intelligence at the edge

Software and networking companies are likely to disrupt the industrial and commercial automation and control markets as more intelligence gets pushed to the edges of the network.

The industrial controls and factory automation market was worth over $150 billion in 2013, according to Markets & Markets. The vast majority of this market consists of either stand-alone automation components or systems integrated into a centralized control or automation software system (e.g., SCADA in the utility sector). According to a 2014 PwC survey, only 24% of companies said they had deployed an extensive network of connected sensor devices. Incumbents in this arena are racing just to get their devices connected to the internet and communicating data back to the cloud. If this were a baseball game, we’d be in the second or third inning.

At a time when incumbent industrial automation companies are struggling with connectivity, IT-oriented companies such as Cisco are accelerating their push to bring intelligence to the edge of the network. Fourteen months ago, Cisco introduced the term “fog computing”; today a Google search for it returns over 1.2 million results. “Fog computing” refers to embedding computing power throughout a network instead of concentrating it in a central “cloud”. The idea is to reduce the cost of data transmission and increase the speed of analytics in a world of billions of connected “things”.

Bringing intelligence to the edge of an industrial network via concepts like “fog” opens significant new opportunities for customers. With computing power placed closer to the end devices, customers can make use of the huge amount of real-time data coming off their machines. Instead of doing post-event analytics, machine learning embedded at the edge can improve the performance of machines in real time, allowing them to get better over time. Video is one of the fastest growing types of data, and edge intelligence allows analytics to occur closer to the devices themselves, reducing the cost of transmitting that high-volume traffic. As these edge devices become more connected, the power of the internet of things can really emerge: the machines can communicate with one another in real time, and machine learning algorithms enable those networks to become more effective working together than as stand-alone devices reporting back to a centralized data store.
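
To make the pattern concrete, here is a minimal sketch (in Python) of what edge-side filtering might look like. The class and thresholds are my own illustrative assumptions, not any vendor’s API; the point is simply that raw readings stay local and only summaries or anomalies travel upstream.

```python
# Minimal sketch of edge-side filtering: analyze readings locally and
# transmit only anomalies, rather than the raw stream. All names here
# (EdgeNode, the thresholds) are illustrative assumptions.
from statistics import mean, stdev

class EdgeNode:
    def __init__(self, window_size=100, sigma_threshold=3.0):
        self.window = []                  # recent raw readings stay local
        self.window_size = window_size
        self.sigma_threshold = sigma_threshold

    def ingest(self, reading):
        self.window.append(reading)
        if len(self.window) < self.window_size:
            return None                   # not enough history yet
        mu, sd = mean(self.window), stdev(self.window)
        self.window.pop(0)                # slide the window forward
        # Only an out-of-band reading leaves the edge.
        if sd > 0 and abs(reading - mu) > self.sigma_threshold * sd:
            return {"event": "anomaly", "value": reading, "mean": mu}
        return None

node = EdgeNode()
for r in [20.1, 20.3, 19.9] * 40 + [35.0]:   # simulated sensor stream
    event = node.ingest(r)
    if event:
        print(event)                      # stand-in for sending to the cloud
```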

Who will own the intelligence at the edge?

Requirements:

  • Network computing expertise

Creating an automation system with the right intelligence placed in the right places requires an understanding of networked computing hardware and how best to design such a system given the computing and data communication requirements. How much memory is necessary? What is the most efficient network architecture for communication? What information needs to be centrally analyzed, and what can stay at the edge? These and many more computing-oriented questions are critical to building the right solution.

  • Analytics software expertise

Machine learning/deep learning and other forms of advanced analytics will further enhance the power of intelligence at the edge. Writing self-learning analytical engines that can be embedded out in the network, so that all data doesn’t have to flow to the center, is critical to this vision. Most, if not all, of these algorithms will originally be constructed using historical data. But over time, embedding them into edge devices will improve the speed and actionability of the decisions.
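
As a rough illustration of that lifecycle, here is a minimal Python sketch of the train-centrally, score-at-the-edge pattern. The toy data, model choice (a scikit-learn logistic regression), and file name are assumptions for illustration only.

```python
# Minimal sketch of the train-centrally / score-at-the-edge pattern.
# The features, labels, and file path are illustrative assumptions.
import pickle
from sklearn.linear_model import LogisticRegression

# 1) Central side: fit on historical data (toy vibration/temperature
#    readings labeled 0 = healthy, 1 = fault).
X_hist = [[0.2, 60], [0.3, 62], [0.9, 80], [1.1, 85]]
y_hist = [0, 0, 1, 1]
model = LogisticRegression().fit(X_hist, y_hist)

with open("edge_model.pkl", "wb") as f:
    pickle.dump(model, f)                 # ship this artifact to the device

# 2) Edge side: load once, score new readings locally; only the
#    prediction, not the raw stream, needs to travel upstream.
with open("edge_model.pkl", "rb") as f:
    edge_model = pickle.load(f)

print(edge_model.predict([[1.0, 83]]))    # -> [1], flag for maintenance
```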

  • Device-embedded software

One of the most powerful impacts of connected devices is the ability to do over-the-air (OTA) updates to devices in the field. This requires embedded software that is designed for upgradability in the field. Most of the embedded software in today’s industrial automation is not upgradable at all…and certainly not with simple pushes of OTA releases.
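
For a sense of what “designed for upgradability” means, here is a minimal Python sketch of an OTA check. The update endpoint, manifest format, and A/B firmware slots are all hypothetical, and a real device would verify a cryptographic signature rather than a bare hash.

```python
# Minimal sketch of a field-upgradable OTA flow: check the advertised
# version, verify the image's integrity before applying it, and keep a
# fallback slot. Endpoint, paths, and manifest format are hypothetical.
import hashlib, json, urllib.request

UPDATE_URL = "https://updates.example.com/device/manifest.json"  # hypothetical
CURRENT_VERSION = (1, 4, 2)

def check_and_apply():
    manifest = json.load(urllib.request.urlopen(UPDATE_URL))
    if tuple(manifest["version"]) <= CURRENT_VERSION:
        return "up to date"
    image = urllib.request.urlopen(manifest["image_url"]).read()
    # Refuse a corrupted or tampered image; real devices would check a
    # cryptographic signature, not just a hash.
    if hashlib.sha256(image).hexdigest() != manifest["sha256"]:
        return "integrity check failed; keeping current firmware"
    with open("/firmware/slot_b.bin", "wb") as f:  # write to inactive slot
        f.write(image)
    # Mark slot B bootable; on a failed boot the loader falls back to slot A.
    return "staged %s for next reboot" % ".".join(map(str, manifest["version"]))
```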

  • Domain expertise

Industrial automation equipment is purpose-built with deep knowledge of the industry and the specific solution for which it is intended. This domain expertise is a critical differentiator for industrial technology firms. In many cases the little things make significant differences, and many of those things are not seen until you are deep into user testing.

Contenders:

  • Network computing vendors (e.g., Cisco)

These players bring strengths in software and network computing. They are already trying to integrate deeper domain expertise into their devices through partnerships and by hiring key industry personnel. They will never have the depth that a focused industrial player brings, but they can potentially build more and more of the cross-device intelligence into their devices, enabling them to capture a larger share of the total value at the expense of the device manufacturers.

  • Industrial automation equipment vendors

These players obviously bring deep domain expertise as well as large installed bases of equipment. In some cases, they also have their own automation control software products (e.g., Emerson’s PlantWeb, or the building automation systems from Johnson Controls and Siemens). Industrial customers are loath to replace automation equipment because of the risks of downtime or safety issues. However, these companies typically do not have the level of software or networking expertise of their new rivals in this marketplace.

  • New entrants with an “edge intelligence” focus

We see the emergence of new entrants into this market who will blend domain, software, analytics, and networking expertise to capture value from the industrial automation vendors. They won’t seek to eliminate the Emersons of the world. Instead, they will offer value-added capabilities by bringing greater intelligence to the system. These companies will bring analytical software and connectivity to the edge of the network to allow various OEM devices to work better together. They will use Moore’s Law to their advantage, embedding more intelligence into standard network computing hardware to enhance the productivity and safety of industrial and commercial processes.

Conclusion:

We are in the very early days of this battle. Industrial automation is a very attractive market: large, growing, and highly profitable. To date, incumbents have held the upper hand given their installed bases and track records. However, IoT and, in particular, technologies like “fog computing” or “intelligence at the edge” will open this market to new players with new strategies and technologies to deliver ever-increasing productivity and safety.


Data Democracy

At several panels and conferences lately I have heard about the benefits of “data democracy”: the notion that we will see much greater benefits from big data and advanced analytics as a broader set of managers gains access to the data and the analytical tools. A large number of companies are pursuing solutions to provide just such ease of access.

I do agree that the potential upside from more widespread access is significant, for several reasons. It can solve part of the supply challenge of data analysts and their more highly paid cousins, data scientists. It can reduce translation problems between operations and analytics. It can improve speed and efficiency by reducing the time from question to answer. Perhaps most importantly, it can help improve the fundamental insights because more appropriate questions get asked.

These benefits, however, come with equal if not greater risks to businesses.

  • Analysis-paralysis: Although it is great that we can now ask any question we want, it doesn’t mean that we always should. In my experience serving clients, I found that more data often led to more questions, but not necessarily to more or better decisions. For instance, how does a retailer decide on pricing and marketing? Marketing managers have access to lots of data on what promotion worked in a similar category last week or month or year. They have analyses that show, not surprisingly, that if you spend extra money on advertising you can charge higher prices. The merchants have data that show the impact of price moves relative to competition. The pricing “committee” meets to decide on price structure plans and, with different analytical perspectives, sends the teams off to do more analysis to understand alternative trade-offs. In many cases each group uses its own analytical tools to bring new perspectives to the problem, oftentimes making the decision even murkier.

Yes, I know this is a governance problem, not a data problem. No one has structured the critical question to answer. No one has led the team to work together towards a conclusion. There is no clarity about who “owns” the decision. All true. However, I have seen this same pattern, albeit with different metrics, questions, and industries, countless times. Humans tend to believe their own analysis more than someone else’s. And in many cases they are incented differently, driving a different set of questions and objectives.

  • A little learning is a dangerous thing: As Alexander Pope wrote, “A little learning is a dangerous thing…shallow draughts intoxicate the brain…” This point is even more true today, with our access to tremendous amounts of “shallow draughts”, than it was in 1709 when Pope wrote. When we conduct an analysis and it “proves” our hypothesis, our beliefs become more entrenched. Humans are naturally overconfident, but when supplied with a piece of data that supports our original bias, our confidence is at its zenith. Psychologists call this “confirmation bias”.

As a result, people become ever more stuck in their positions. A biased search for information leads them to do analysis that further supports their hypotheses. In the end this produces a less flexible team and a narrowing of perspectives rather than what we hope for from data access: a widening of perspectives.

Access to data does not keep us from our human psychological limitations.  In fact, it may only reinforce them.

The conclusion, however, is not to put the genie back in the bottle and hide the data behind the wizard’s facade of data scientists. Several key elements can help manage the democracy of data so that it delivers its benefits with a minimum of risk.

  1. Training. One of the biggest selling points of many of the new data analytics tools is the minimal amount of training necessary to use the software. “Just click here to download and in 15 minutes you’re up and running.” How true. But what (and where) are you running? The necessary training is not just in the techniques and statistical fundamentals. It is training people on their own biases and how to adjust for them. It is training on what questions can and should be asked, rather than just on how to come up with answers. It is training on “so what” development: how to draw out the implications of any analysis. As with Excel, people are expected to “figure it out” or take a course online. But in a world where the data tools are ever more powerful…and the reliance on their outputs ever more accepted…it is incumbent on leaders to ensure proper, holistic training in how to use the power of democracy appropriately.
  2. Clarity of governance. In some ways, this data “babble” may force clearer governance in companies, as it makes the lack of decision-making authority more obvious. To take advantage of the possible efficiencies from greater data access, enterprises must establish much clearer guidelines on decision rights. Who has final accountability? Who is authorized to select one model’s outcome over another’s? What are the consequences for that person or team, to ensure they are incented appropriately? Without clarity of governance, data will lead to less rather than greater efficiency.
  3. Robust quality control on modeling. Quality control typically lives in the realm of the operations or manufacturing organization. In a world of much greater access to and reliance on data models, “Six Sigma-like” quality control procedures are as important in your modeling as on your production line. Unfortunately, very few enterprises have these processes in place today; in most, they are not even discussed as necessary. Just as in manufacturing, where quality has become the expected responsibility of everyone on the assembly line or upstream, in modeling everyone must have the same level of responsibility for calling out problems and stopping the process when a mistake is encountered.
  4. Embed a test & learn mindset across the organization. The risks of data democracy increase when decisions are black and white, all or nothing. A test & learn mindset encourages managers to regularly test their data insights and hypotheses in real-world settings against a control group (a minimal sketch of such a test follows this list). Test & learn reduces analysis-paralysis because it creates real-world outcomes. While this approach to management is typical in many tech start-ups, it is not second nature outside that community. Changing the cultural dynamic in a long-standing organization will require leadership to set the example and begin to move towards a more “learning” environment.
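
As a minimal sketch of the “against a control group” step, here is a two-proportion z-test in Python. The store counts and outcome numbers are invented for illustration.

```python
# Minimal sketch of a test-and-learn check: compare a treated group
# against a control with a two-proportion z-test. Numbers are made up.
from math import sqrt, erf

def two_prop_z(success_a, n_a, success_b, n_b):
    p_a, p_b = success_a / n_a, success_b / n_b
    p = (success_a + success_b) / (n_a + n_b)            # pooled rate
    se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))         # standard error
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided
    return z, p_value

# e.g., a new promotion in 120 test stores vs. 120 control stores
z, p = two_prop_z(success_a=66, n_a=120, success_b=48, n_b=120)
print(f"z = {z:.2f}, p = {p:.3f}")   # act only if the lift is significant
```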

The future opportunities from the data and analytics revolution are too large to ignore.  But like many other such opportunities for enterprises, the gain is not without some pain.  If one approaches it as just the next step in making decisions, the risks may outweigh any perceived benefits.  Take a holistic step back and decide what in the organization needs to change to really tap into the upside available.


Intranets of Things

In our work at Sensorsolve (www.connectedsensorsolutions.com), our company that is building a portfolio of connected sensor solutions, we see substantial growth in enterprise adoption of connected devices: from oil wells to agricultural equipment, from HVAC equipment to gensets. Companies are realizing the benefits of better visibility into their assets.

It is still absolutely true that too many vendors are “technology-first” rather than “benefit-back”.  However, businesses large and small can see the productivity improvements that come from better tracking, monitoring, and understanding of their supply chains and assets.

In the vast majority of cases, however, these are “point solutions” that solve a specific operational problem. As I wrote in my last post, this is one of the ways in which companies will create value from connected sensors. These solutions are, in most cases, led by line functions to solve an operational pain point, and in the most successful cases they are driven locally by leaders who benefit tangibly from the improved information available. These point solutions also avoid many of the security concerns of the “internet of things” by riding over a private network rather than over the public internet.

The productivity and value creation of these solutions cannot be ignored. As an example, from past work in commercial equipment repair environments, we know that the cost of maintenance and repair can be reduced by 30-40% by leveraging temperature, vibration, and other performance data on a regular basis.
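
A minimal sketch of the underlying idea, condition-based rather than calendar-based maintenance, might look like the following. The readings and the alarm limit are illustrative stand-ins (the limit echoes an ISO 10816-style vibration zone boundary, not a field-calibrated value).

```python
# Minimal sketch of condition-based maintenance: service an asset when its
# vibration trend crosses a limit instead of on a fixed calendar.
from statistics import mean

VIBRATION_LIMIT_MM_S = 7.1   # illustrative alarm level, ISO 10816-style

def needs_service(readings, window=6):
    """Flag the asset when recent average vibration velocity exceeds the limit."""
    return mean(readings[-window:]) > VIBRATION_LIMIT_MM_S

# Simulated mm/s velocity readings: a long healthy run, then a rising trend.
pump_history = [2.4, 2.6, 2.5] * 20 + [6.8, 7.3, 7.9, 8.4, 8.8, 9.1]
if needs_service(pump_history):
    print("schedule inspection")  # condition-based, not a fixed-interval teardown
```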

What is emerging, therefore, is a fragmented collection of “intranets of things” rather than an “internet of things.” The internet implies that “things” are communicating with one another autonomously across a public communication infrastructure (with security protocols, of course). An intranet implies that they are communicating over an internal, private network. And in most cases the things are not directly communicating with one another; they are just feeding data to a centralized resource that manages them. Individual operational departments build what they need to solve a specific problem, leaving IT departments with a patchwork of individual solutions.

We see this as a natural first step in the evolution for established enterprises for several reasons.

  • Point solution intranets can add substantial operational value quickly with less security risk (both real and perceived)
  • Just delivering a point solution is time consuming and requires significant behavior change in established organizations. A larger change effort could delay the impact or put it at risk altogether.
  • The additional benefits from the “internet” – that is, open data accessed over the web – are not yet clear. If I am an upstream oil producer, I can see the immediate benefits of gathering information on whether my well is operating or not, what the pressures are, whether the tanks are nearing capacity, etc. However, how connecting to data outside my firewall helps my operation is far less clear.

Over the next several (maybe twenty) years, we see an evolution from where enterprises are today:

  1. Increase the number of point solutions
  2. Integrate the “intranets” at least at the data layer to capture the insights available from seeing data across the enterprise
  3. “One way/pull” information from the internet into the enterprise’s analytics stream to improve insights by capturing available public data, e.g., weather data, crop production data, etc. (a minimal sketch of this pull pattern follows this list)
  4. Enable the “things” to communicate directly with each other within the intranet to automate decision making and problem solving with prescriptive analytics
  5. Publish data to the internet because there are new revenue streams available from allowing others to access your data.
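
As referenced in step 3, here is a minimal Python sketch of the one-way pull pattern: public data flows in to enrich internal records, and nothing internal is published back out. The weather endpoint and field names are hypothetical placeholders.

```python
# Minimal sketch of the "one way/pull" pattern: enrich internal sensor
# records with public data pulled from outside the firewall. The weather
# endpoint and field names below are hypothetical placeholders.
import json, urllib.request

WEATHER_URL = "https://weather.example.com/daily?station=KORD"  # hypothetical

def enrich_with_weather(sensor_rows):
    """Join internal readings to public weather observations by date."""
    observations = json.load(urllib.request.urlopen(WEATHER_URL))
    by_date = {obs["date"]: obs for obs in observations}
    for row in sensor_rows:
        weather = by_date.get(row["date"], {})
        # Outside data flows in; nothing internal is published back out.
        row["ambient_temp"] = weather.get("temp_f")
        row["precip_in"] = weather.get("precip_in")
    return sensor_rows
```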

Obviously, not every company will go through each step in order, nor at the same pace. But the “internet of things”, where millions of industrial machines talk with one another autonomously across the public internet, will likely take some time to arrive.

This evolution creates opportunities for multiple players: first, for vendors who can focus on solutions that deliver rapid payback; second, for nimble competitors who can find value in the internet more quickly to deliver greater customer benefits or lower cost; and third, for the “industrial media companies” who develop valuable information and generate new revenue by publishing it for others to access.


Introductory post

The core theme of my writing is the management and leadership changes necessary for businesses, governments, and non-profit organizations to take advantage of growing data connectivity and the advanced analytics that use that data. The terms “internet of things”, industrial internet, Big Data, etc. have, in my opinion, become over-hyped in the amount of impact they will have in the short term. They will go through the “trough of disillusionment” at some point soon, after more organizations try and fail to generate value from them. But as organizations adjust their management practices – incentive structures, roles & responsibilities, skills, processes, and strategies – these technologies will significantly alter the way economic value is created and delivered.

Last week I attended the Internet of Things World Forum in Chicago. The “hype-meter” was in the red, with vendors such as Cisco, Rockwell, and Siemens all highlighting the potential for productivity and customer service improvements when everything is connected. There were, in fact, several great case examples of companies employing connectivity and analytics to drive dramatic improvements in throughput, quality, and/or new services. However, the operators in the audience and on the panels were much more cautious. Many of them have employed connectivity and automation over the past decade and are proud of their accomplishments. But they are careful to highlight how long it takes to implement, how difficult it is to integrate across entire plants or supply chains, and how risky it is to connect to cloud infrastructure.

Capturing the economic value from new technologies requires changes to management and leadership at least as great as the technology itself. ERP systems, sales force automation, and electronic hospital records all required significant behavioral change to implement, and as a result took longer for mass adoption and impact than people expected. For example, retailers had visibility into their inventory levels, but management had to change its view of stocking incentive structures, cost allocations, and store management incentives before stores could be used as shipping warehouses as well as retail outlets. In large organizations, these kinds of changes require breaking current mental models, constantly communicating a new way of doing business, and investing in training on new processes. None of this is as easy as connecting a device, even in a complicated wireless world.

For aggressive companies, this is good news. Leadership teams that recognize the changes necessary and move aggressively stand to gain an advantage over the pack. That is not to say that first movers will always win. But companies that thoughtfully combine significant changes in management practices with the changes in connected technologies can shift the operating model of their industry.

Tesla is an example of one such company. We normally think of Tesla as a manufacturer of electric cars. However, another way of looking at its business is as a company employing a completely new business and management operating model to take advantage of connected devices. The “connected car” has long been discussed, but Tesla realized it needed to reinvent the way car companies are managed to deliver those benefits. Benefits such as over-the-air software updates that avoid a trip to the dealer for simple fixes, predictive maintenance, and eventually automated driving all required a car that was fully electronic. It also required different distribution and incentive models with dealers, different types of relationships with suppliers, and a change in product releases from every five years to a regular rhythm of software releases. All of these changes (and likely many more) are necessary to deliver on the promise of connectivity. The electric vehicle was just one of the many pieces necessary.

Other industries offer similar opportunities to transform the customer experience with connected technologies. We will likely see three successful models in the near term.

  1. Point solutions within larger companies where the management changes necessary are limited in scope and one part of the organization can execute without a cross-organization change effort.
  2. New, innovative companies that re-think the business model and management practices from the top down.  These may not always be start-ups, but they will most likely be smaller companies without the breadth of changes necessary in a large company.
  3. A top-down transformation of a larger company that sees the threat from new players and realizes early the shift of management necessary to take advantage of these technologies.

This is not to say that the projections on connected devices are over-stated, nor that the insights coming from the data are not real. But the business and economic impact of these insights will only be realized when management changes to take advantage of them.

In the coming weeks and months I look forward to continuing this theme and adding greater depth to what organizations can do to really benefit from the connected world.
