Vodafone recently released its annual “IoT Barometer Report,” which provides insight into how enterprises are implementing the Internet of Things across their organizations and where the industry is heading. Based on a survey of more than 1,200 businesses of varying sizes around the globe, it is an intriguing read with some very interesting statistics, including average ROI, average number of devices, and reasons for or against adopting IoT. One particular survey result jumped out at me, though:
Figure 1 – Source: Vodafone 2017/2018 IoT Barometer Report https://syswinsolutions.com/wp-content/uploads/2017/10/Vodafone-IoT-Barometer1718,0.pdf
Only 37% of respondents said they tested during development, and only 27% said they segment their IoT solutions from other systems. That means a significant number of companies implemented some form of IoT solution, commingled it with other systems, and never bothered to test beforehand. Granted, there is ambiguity in the results: were the other systems core IT or siloed scanners? Was the IoT solution a simple sensor or something far more intricate?
The high number of organizations not testing raises the question: why? I would venture to guess that for many the answer is, “Because it is extremely difficult!”
As the Vodafone report details, these organizations are deploying devices in the tens of thousands, some exceeding 50,000. The deployments are most likely spread across wide geographies and traverse a variety of IT functions: compute, network, storage, security, and applications.
For example, consider an enterprise deploying IoT to improve supply chain efficiency, and the number of different departments and supporting systems involved (shipping, forecasting, warehousing, accounting, etc.). The image below, borrowed from an article written by JR Fuller (HPE’s IoT expert) on TechBeacon, demonstrates just how many variables such a deployment includes.
Figure 2 – Source: https://techbeacon.com/4-stages-iot-architecture Author- JR Fuller
Adding yet one more layer of complexity, new technologies and protocols continually enter the market, introducing additional variables into testing and making it all the more necessary. As the report indicates, there are already multiple connectivity choices within a single IoT implementation, and several new options are just becoming available (e.g. 5G, LP-WAN, LoRa, NB-IoT).
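To make the combinatorial problem concrete, here is a minimal sketch of how quickly a test matrix grows once device types, connectivity options, firmware versions, and regions are all in play. The specific values are hypothetical examples for illustration, not figures from the Vodafone report:

```python
import itertools

# Hypothetical test dimensions for an IoT deployment (example values only)
device_types = ["temperature_sensor", "asset_tracker", "smart_meter"]
connectivity = ["5G", "LP-WAN", "LoRa", "NB-IoT", "4G"]
firmware = ["v1.0", "v1.1", "v2.0"]
regions = ["NA", "EMEA", "APAC"]

# Every combination is a distinct configuration that could behave differently
test_matrix = list(itertools.product(device_types, connectivity, firmware, regions))
print(f"{len(test_matrix)} configurations to validate")
```

Even this toy example yields 135 configurations (3 × 5 × 3 × 3), before accounting for device counts in the tens of thousands or interactions with downstream systems.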
The scale and complexity of all this are so daunting that it is not surprising many organizations are forced to shoot from the hip. That likely also explains why, according to another survey conducted by Cisco, 60% of IoT projects fail at the proof-of-concept stage. (https://www.slideshare.net/CiscoBusinessInsights/journey-to-iot-value-76163389)
How can an organization truly test at the scale of IoT?
The answer is actually hiding in both the Cisco and Vodafone surveys: collaboration. Collaboration across departments, between customer and vendor, and with partners like Tokalabs.
From a technology perspective, our solutions can remove much of the complexity surrounding testing at IoT scale. The LaunchStation makes it easy to control large numbers of devices spread across geographies and deployment types (physical, virtual, cloud), so complex testing can be executed in a coordinated fashion that produces meaningful results.
As the Cisco research pointed out, many issues surround IoT projects, the most difficult being the human factor. That is not too surprising, as it holds true in just about every complex IT project organizations undertake. While none of our products can force humans to communicate or uncover the underlying reasons for resistance, the LaunchStation can become a centralized source of information for collaboration across teams that have opposing objectives, may not speak the same language (or simply use different terminology), or have varying skill sets that could lead to emotional pushback. The simple interface and repository in the LaunchStation allow all parties involved in the project to view tests, topologies, results, and more, and to collaborate as necessary.