IoT data management requires mission-critical edge processing
IoT’s reach extends the enterprise outward into the world, and edge computing will expand right alongside it. Organizations must understand how to use edge computing for IoT data management to keep up with the data avalanche and enhance edge data security.
The expansion is orders of magnitude faster and broader than the growth of the cloud. IoT device deployment will reach almost 40 billion by the end of next year, according to Gartner, so organizations diving into IoT must build edge processing resources. Security is perhaps the biggest concern: IoT’s larger attack surface is a field day for hackers, and IoT networks require edge gateways to lock down device output. But the role of edge computing extends well beyond security as expectations for IoT rise rapidly.
Secure, share and sanitize IoT data
A large part of an edge server’s burden is plugging holes in the pipeline between IoT and the clouds that devices feed into. In large-scale scenarios — such as traffic management and supply-chain operations — edge processing can involve dynamic routing of IoT data to multiple clouds, including partner organizations’ clouds that share the data.
IoT data also must be cloud-worthy to begin with. IoT devices conform to no universal standard in any area of their functionality, including security, protocol and fault tolerance, and the age range of deployed hardware can stretch to 20 years. That heterogeneity adds up to a lot of noise, which edge servers must filter before the data moves on.
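Making data cloud-worthy at the edge mostly means mapping inconsistent device payloads onto one canonical schema and dropping readings that are obviously noise. The sketch below illustrates the idea; the field names, units and plausibility limits are illustrative assumptions, not any standard, and a real fleet would need one such mapping per device family.

```python
def normalize_reading(raw):
    """Map a heterogeneous device payload onto one canonical schema.

    Field names and units here are illustrative assumptions; a real
    fleet needs one such mapping per device family.
    """
    # Hypothetically, older devices report Fahrenheit under "temp_f"
    # while newer ones report Celsius under "temperature_c".
    if "temperature_c" in raw:
        celsius = float(raw["temperature_c"])
    elif "temp_f" in raw:
        celsius = (float(raw["temp_f"]) - 32.0) * 5.0 / 9.0
    else:
        return None  # no usable reading: drop it at the edge

    # Reject physically implausible values (sensor faults, line noise).
    if not -50.0 <= celsius <= 150.0:
        return None

    return {"device_id": raw.get("device_id", "unknown"),
            "temperature_c": round(celsius, 2)}
```

Everything that survives this function is in one unit system and one schema, which is what makes the data usable downstream regardless of which device generation produced it.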
Real-time response and decision support cause bigger problems
Security and data routing present major challenges, but IoT data management now presents an even greater challenge: IoT networks require immediate response or real-time decision support, such as during an outage in a factory or a snag in a traffic system.
In scenarios like these, which are becoming more and more routine, there is no time for a round trip to the cloud to crunch the data, analyze the problem and return a result. IoT technology must receive a response in seconds, not minutes or hours.
Both of these scenarios require dynamic response. The technology needs to compensate for some change in a physical environment — such as a sudden temperature change or a fault warning in a piece of equipment — or alter a complex workflow based on an unexpected disruption, such as a traffic accident involving freight transport. The event triggering the response might require action, and the threshold for intervention could itself be dynamic. This is a job for AI.
AI algorithms are the best way to handle scenarios that require a dynamic response when there is no time or opportunity to involve human beings. The IoT network itself has to be a smart system, capable of making decisions instantaneously, and it needs to actually live on the edge.
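A dynamic intervention threshold of the kind described above can be sketched very simply: track a rolling band of recent readings and trigger only when a new value falls well outside it, so the alarm condition adapts as conditions drift. This is a minimal stand-in for edge-side decision logic, with illustrative window and tolerance parameters; production systems would use a trained model rather than a heuristic.

```python
from collections import deque


class DynamicThreshold:
    """Flag readings that deviate sharply from a rolling baseline.

    The trigger band is recomputed from recent history, so the
    intervention threshold itself is dynamic. Parameters are
    illustrative assumptions.
    """

    def __init__(self, window=10, tolerance=0.5):
        self.history = deque(maxlen=window)   # recent "normal" readings
        self.tolerance = tolerance            # margin as a fraction of the band

    def update(self, value):
        """Return True if this reading should trigger intervention."""
        if len(self.history) < self.history.maxlen:
            self.history.append(value)
            return False  # still learning the normal band
        lo, hi = min(self.history), max(self.history)
        margin = self.tolerance * max(hi - lo, 1e-9)  # avoid a zero-width band
        trigger = not (lo - margin <= value <= hi + margin)
        self.history.append(value)
        return trigger
```

Because the class keeps only a small deque of recent values, it runs comfortably on constrained edge hardware with no cloud round trip.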
The edge structure means that IoT data needs to be parsed, not just by what goes to the home cloud and what goes to a B2B partner’s cloud, but by what data real-time processes and more traditional processes need. By definition, the instantaneous data needs to be filtered into those processes right away. Batch data can be consigned to temporary storage and carted off to the cloud at leisure.
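The parsing described above amounts to a routing decision made per message on the edge server. A minimal sketch follows; the topic names, the `partner_visible` flag and the three destinations are hypothetical examples of the real-time, partner-cloud and batch paths the text describes, not part of any standard.

```python
# Hypothetical routing rules: these topic names and the
# "partner_visible" flag are assumptions for illustration.
REALTIME_TOPICS = {"fault_warning", "temperature_spike"}


def route(message):
    """Decide, on the edge, which path a reading takes."""
    topic = message.get("topic", "")
    if topic in REALTIME_TOPICS:
        return "realtime"        # feed the local decision loop immediately
    if message.get("partner_visible"):
        return "partner_cloud"   # share with a B2B partner's cloud
    return "batch"               # stage locally, upload to the home cloud later
```

The point of the sketch is that the split happens at ingest: time-critical data never waits behind batch data, and batch data never clogs the real-time path.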
Getting better at the edge
Best practices include two key innovations. First, IoT data management tasks, including managing data transport, should take place on the edge, not in the cloud. IoT generally consists of new architecture appended to existing, centralized technology, so it’s tempting to take a top-down approach to managing new data collected on the edge. But cloud systems are no longer a centralized endpoint; they are one target destination among many, and IoT technology performs a sizable number of closed-loop processes on the edge. It makes more sense to administer that data from the servers gathering it, especially when both its routing and its application might be dynamic.
Until the enterprise software industry develops turn-key technology, the most economical and effective way to manage data is through custom pipelines and microservices, which are easily maintained and extended in decentralized processes. Creating a dashboard for data traffic analytics is fairly straightforward, and Python is an excellent choice for implementation.
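A traffic-analytics dashboard of the kind mentioned above starts with a small aggregation service. The sketch below is a stdlib-only Python example of the counters such a dashboard would display; a real microservice would expose these numbers over HTTP (for example with Flask or FastAPI) and persist them, both of which are omitted here.

```python
from collections import defaultdict


class TrafficStats:
    """Aggregate per-destination traffic counters for a simple dashboard.

    A stdlib-only sketch: serving and persistence are deliberately
    left out, and the destination names are whatever the pipeline uses.
    """

    def __init__(self):
        self.messages = defaultdict(int)
        self.bytes = defaultdict(int)

    def record(self, destination, payload):
        """Count one outbound message and its payload size."""
        self.messages[destination] += 1
        self.bytes[destination] += len(payload)

    def snapshot(self):
        """Return a plain dict suitable for rendering or JSON export."""
        return {d: {"messages": self.messages[d], "bytes": self.bytes[d]}
                for d in self.messages}
```

Because each pipeline stage only has to call `record`, the analytics stay decoupled from the routing logic and the service remains easy to extend, which is the argument for microservices made above.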
Second, keep the models and machine learning in the cloud. If the goal of a particular IoT implementation is real-time response in a physical environment or real-time decision support, the best approach is to decouple the analytics and AI from the IoT technology doing the work. Let the model and the machine learning processes stay in the cloud. As the model changes, the algorithms that generate the analytics feeding the IoT network update in turn. This requires some extra work, but much less than deploying machine learning on the edge, where it would be much harder to maintain and even harder to secure.
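One way to realize that decoupling is for the edge to apply cloud-trained parameters locally and swap them whenever the cloud publishes a new version. The sketch below assumes a trivial linear model and a versioned JSON parameter blob; both are illustrative stand-ins, and the fetch from the cloud is omitted.

```python
import json


class EdgeModel:
    """Apply cloud-trained parameters on the edge; swap them on update.

    The linear model and the versioned JSON format are illustrative
    assumptions; the cloud-side training and the fetch are omitted.
    """

    def __init__(self, params):
        self.params = params  # e.g. {"version": 1, "weight": 2.0, "bias": 1.0}

    def score(self, reading):
        """Local inference: no cloud round trip on the hot path."""
        return self.params["weight"] * reading + self.params["bias"]

    def maybe_update(self, published):
        """`published` is a JSON blob the cloud made available.

        Returns True if the parameters were replaced.
        """
        new = json.loads(published)
        if new.get("version", 0) > self.params.get("version", 0):
            self.params = new
            return True
        return False
```

Inference stays fast and local, while retraining, validation and versioning all remain cloud-side, which is where they are easiest to maintain and secure.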
Industry standards have yet to emerge, but because security is the immediate issue most organizations face in edge server deployment, ownership often falls to someone in IT infrastructure. They should be involved in any case, but going forward, support for IoT data management and edge processes needs to include both data architects and enterprise solution architects. Neither dynamic routing nor real-time processing is possible without efficient data modeling and a strong workflow optimized to accommodate it.
This news was previously published on: https://internetofthingsagenda.techtarget.com/tip/IoT-data-management-requires-mission-critical-edge-processing
Sudipto writes technical content periodically and backs it up with extensive research and relevant examples. He’s an avid reader and a tech enthusiast, with a little bit of “Arsenal Football Club” thrown in as well. He holds a B.Tech in Electronics and Instrumentation Engineering.