Harnessing the Internet of Things across any business is no drag-and-drop affair: successful implementations need to be thought out from the ground up.
Although D&D may mean Dungeons & Dragons to many technology practitioners, the acronym is better known as shorthand for drag-and-drop. By saying there is no drag-and-drop (or indeed plug-and-play) in the IoT, we are underlining the need for a strategic, logically architected and, above all, intelligently delegated approach to the task in hand.
But what does this really mean? In practical terms, it necessitates a separating-out of the constituent elements of the full computing stack and the roles pertaining to each level of the infrastructure.
When we look at building the optimal conditions for the IoT-enriched business to flourish, we need to step back and ask ourselves what core competencies our firm can command. We may have a pretty sharp IT function, but just how good is our storage capability?
Storage can be sexy
As we know, IoT objects from sensors to monitoring cameras to web-connected fridges are capable of consuming huge quantities of storage. So question 1 should be: Do we have a specific person within the IT function who we clearly recognise as the ‘storage guru’ today? If nobody gets excited over the mention of ‘drive bay enclosures’, then we may need to outsource this function.
Nobody drags and drops additional storage or brings in Disaster Recovery-as-a-Service without strategic planning. Yes, okay, we can ‘spin up’ new cloud instances faster than we can order a sandwich, but storage is not a question of snapping on extra Lego bricks: the right storage needs to go in the right place at the right time, with the right level of transactional input/output provisioning.
Of course it’s not just storage. This overriding concept of task delegation and differentiation spreads from front to back across any conscientiously executed IoT deployment.
Preparing for data
Ingesting data from the Internet of Things is simply not possible without data preparation — otherwise it’s just drinking from a fire hose.
As I have explained elsewhere, data preparation is a procedure for gathering, cleaning, amalgamating and consolidating our target information sources into one location (a file, database or table). This process also covers error correction, the merging of data sources, deduplication of extraneous data and the handling of areas where nulls and zeros exist (or where the data is incomplete in some other way), so that we can progress to building our data model.
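To make those steps concrete, here is a minimal data-preparation sketch in plain Python. The sensor field names, readings and the 0.0 sentinel for null values are all hypothetical, invented purely for illustration; a real pipeline would use a proper data-prep tool.

```python
# Hypothetical raw sensor readings: field names and values are invented
# for illustration only.
def prepare(readings):
    """Merge, deduplicate and clean a list of raw sensor readings."""
    cleaned = []
    seen = set()
    for row in readings:
        key = (row.get("sensor_id"), row.get("timestamp"))
        if None in key or key in seen:   # drop incomplete or duplicate rows
            continue
        seen.add(key)
        value = row.get("value")         # accommodate nulls with a sentinel
        cleaned.append({"sensor_id": key[0],
                        "timestamp": key[1],
                        "value": float(value) if value is not None else 0.0})
    return sorted(cleaned, key=lambda r: (r["sensor_id"], r["timestamp"]))

raw = [
    {"sensor_id": "t1", "timestamp": 100, "value": "21.5"},
    {"sensor_id": "t1", "timestamp": 100, "value": "21.5"},  # duplicate
    {"sensor_id": "t2", "timestamp": 101, "value": None},    # null reading
    {"sensor_id": None, "timestamp": 102, "value": "19.0"},  # incomplete
]
print(prepare(raw))
```

The point is not the code itself but the shape of the process: every row is corrected, merged, deduplicated and null-handled before anything downstream sees it.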
Johan den Haan, CTO at aPaaS digital transformation company Mendix, spoke to Internet of Business to reiterate the need for painstaking architectural process as we now merge the digital and physical worlds in 2016.
“The challenge (and the opportunity) is to harness the deluge of IoT data to transform companies’ operations, products and services and business models. But it is this notion of the business model that must form the core blueprint for the way we now engineer our IT stacks for the IoT,” said den Haan.
Mendix’s den Haan highlights an uncomfortable truth here: an end-to-end approach is tough to pull off. With still-forming standards, so many componentised, disparate elements and constant change, constructing the new IoT-enabled firm is a big ask.
Not humanly possible
“It will not be humanly (or machine-ly) possible to interpret all of the data being generated by the new sensors and connected devices coming online this year.
“Although we will rely on machine learning to give us the step up that we need, firms will need apps to analyse data from connected devices, sensors and more – apps that do it fast and flexibly. They’ll need these development capabilities internally, and they’ll need to use agile, rapid application development techniques to build iteratively, one step at a time.”
Matt Davies, technical evangelist at operational intelligence platform company Splunk, also spoke to Internet of Business, saying that the machine data generated by the IoT will be fast-moving, time-series and unstructured (or semi-structured), and that consideration needs to be given to how structure is applied to data when storing it.
“Assigning a rigid schema or structure can hinder the exploration of data and restrict the ability for different people to ask different questions of the IoT data. Storing all your data and applying “schema on the fly” means that when asking questions of stored IoT data you get the data and the schema back at search or query time. This means you can rapidly evolve and iterate the questions you want to ask to make the most of IoT data. In a nutshell, store but don’t structure at data ingest time to give your IoT data flexibility,” said Davies.
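A toy sketch of the ‘schema on the fly’ idea Davies describes, in plain Python: events are stored exactly as they arrive, and structure is derived only at query time. The device names and fields here are invented for illustration, not taken from any Splunk API.

```python
import json

# Hypothetical raw machine data, stored verbatim at ingest time: note the
# two events do not even share the same fields, which is fine.
raw_events = [
    '{"device": "pump-7", "temp_c": 41.2, "ts": 1}',
    '{"device": "pump-7", "vibration": 0.9, "ts": 2}',
]

def query(events, field):
    """Apply schema at search time: parse each event and pick out a field."""
    for line in events:
        record = json.loads(line)   # structure is derived per query, not at ingest
        if field in record:
            yield record["device"], record[field]

print(list(query(raw_events, "temp_c")))
```

Because nothing was stripped or reshaped at ingest, a later question (‘show me vibration instead of temperature’) needs no re-ingestion, only a different query.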
The Splunk evangelist insists that to find the value from IoT data we need to be able spot patterns, apply machine learning, democratise it with business analytics and extend it by building new applications on sensor generated information. “This requires thinking holistically and end-to-end from storage, through a data platform to the wide range of users that will want to harness the data from IoT,” he said.
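As a minimal illustration of the pattern-spotting Davies mentions, the sketch below flags sensor readings that drift too far from a rolling mean. The window size, threshold and readings are all assumptions chosen for the example; real deployments would reach for proper machine learning rather than this arithmetic.

```python
def anomalies(values, window=3, threshold=5.0):
    """Flag indices whose reading deviates from the rolling mean by > threshold."""
    flagged = []
    for i in range(window, len(values)):
        mean = sum(values[i - window:i]) / window
        if abs(values[i] - mean) > threshold:
            flagged.append(i)
    return flagged

readings = [20.0, 20.5, 21.0, 20.8, 35.0, 21.1]
print(anomalies(readings))   # the spike at index 4 is flagged
```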
Is automation the answer?
But it’s not all doom, gloom and hard graft. We have already mentioned machine learning as an essential facilitating buttress on our journey to establish the new IoT-driven business — and automation will also play a key role.
In this context, we define automation technology as the elements within a software data system that we ‘could’ build from scratch, but that we can more productively find in tools provided by specialists in this layer of computing. Automation controls could include: process-driven workflows, built in Business Intelligence (BI), integrated checklists and elements relating to design and navigation.
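A toy process-driven workflow with a built-in checklist, of the kind we would normally buy rather than build: the step names and context fields below are invented for illustration only.

```python
# Each workflow step is a named function run in order over a shared context;
# the checklist records what completed, in sequence.
def ingest(ctx):   ctx["ingested"] = True
def validate(ctx): ctx["valid"] = ctx.get("ingested", False)
def publish(ctx):  ctx["published"] = ctx["valid"]

WORKFLOW = [("ingest", ingest), ("validate", validate), ("publish", publish)]

def run(ctx):
    checklist = []
    for name, step in WORKFLOW:
        step(ctx)
        checklist.append((name, "done"))
    return checklist

print(run({}))
```

The value of buying this layer from a specialist is precisely that the workflow engine, the checklist and the reporting come pre-built, tested and integrated.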
Mat Keep, director at cross-platform, document-oriented database company MongoDB, spoke to Internet of Business for this story about what happens when it comes to real-world implementations.
“Your old enterprise architecture will not work. The Internet of Things requires you to re-evaluate technology choices. But it’s not a simple ‘drag and drop to the cloud’ problem, like some other projects may have been. Poor performance or lack of scale will kill your IoT dreams.”
MongoDB’s Keep insists that data has gravity: wherever the data is first created and stored is likely to be the best place to keep and use it. Given that IoT applications are created on the internet, that means commodity infrastructure in the cloud. He explains that this doesn’t mean your strategy should be to give up control of all your data to a one-size-fits-all as-a-service offering. You need depth of understanding and control at every layer of your stack to run a true enterprise IoT project.
Keep concludes by saying: “When planning, organisations should understand that sensor data is raw crude oil. On its own it’s useless, but it becomes extremely valuable when correctly refined. For IoT, that refining process is the analysis and visualisation of multiple streams of data. Rather than just detect and respond, it is transformed to “predict and act” typically through machine learning, and human training and interpretation. Nothing will be perfect the first time and significant tuning will be needed. The key is that, like crude oil, you don’t want to throw anything away – the data has intrinsic value, and its byproducts are endless.”
Ramming home the point
Given all these factors, it’s still not easy to make the IoT work as an out-of-the-box, end-to-end, drag-and-drop solution with plug-and-play technologies. This is a deep-in-the-guts engineering play, but the industry will change that and more complete service solutions will emerge.
If your business is bringing in its first IoT data streams, think about your back end before you think about your front end… that is all we ask.