Clouds, Fog and the IoT – Isolation or Collaboration?
IoT = Internet of Connected Things?
The idea of transmitting sensor data over the Internet is becoming quite popular – we call it the “Internet of Things”. But the quality of Internet connectivity is not the same across all locations.
To have maximum impact on business processes, sensor data must be granular enough to ensure accurate monitoring. Data should therefore be gathered from sensors at high frequency, and it should never be lost.
Sensors are often situated in remote, unmanned sites which, for many real-world applications, will be quite numerous (e.g. telecom radio base stations). Cellular connectivity provided by mobile network operators is often used to transmit data from remote locations, but it can quickly become very expensive for high-volume or repetitive data transmission. Transmission frequency should therefore be kept to a minimum, without losing the accuracy obtained by polling sensors at high frequency.
Additionally, connectivity in any location can suffer unplanned, unpredictable interruptions, introducing the risk of losing collected data unless it is stored locally.
Lastly, the variable nature of Internet connectivity makes it especially difficult – and risky – to deploy ‘sense & respond’ scenarios that depend on data transmission over the Internet.
The need to address these issues has led to the emergence of the concept of ‘Fog Computing’.
Cisco uses the term ‘fog nodes’ for tools capable of collecting real-time sensor data from any device and processing it locally, storing it transiently for periodic transmission to the cloud; these tools also provide the data intelligence required to send control commands to local actuators.
Fog nodes typically make use of local mass storage (non-volatile memory) to decouple the frequency at which sensors are polled from the frequency of onward transmission, and to prevent the loss of sensor data caused by unplanned connectivity interruptions.
However, collecting sensor data at high frequency for storage in non-volatile memory prior to Internet transmission can easily become very expensive: it requires high-frequency write and delete operations, which wear the memory out quickly and thus increase the maintenance needs of computers that may be located in remote, difficult-to-reach areas.
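As a rough illustration of this wear problem, the sketch below shows the conventional store-and-forward pattern, with an SQLite file standing in for a fog node's local database (the file path and schema are hypothetical). Every sample costs a write to non-volatile memory, and every successful upload costs a matching delete:

```python
import sqlite3
import time

# Hypothetical on-disk buffer for a fog node. At a 1 Hz poll rate this
# pattern performs roughly 86,400 inserts plus matching deletes per sensor
# per day, which is the flash-wear problem described above.
db = sqlite3.connect("/var/lib/fognode/buffer.db")
db.execute("CREATE TABLE IF NOT EXISTS samples (ts REAL, value REAL)")

def store(value: float) -> None:
    """Called at high frequency: one non-volatile write per sample."""
    db.execute("INSERT INTO samples VALUES (?, ?)", (time.time(), value))
    db.commit()

def drain() -> list:
    """Called at low frequency: read the backlog, then delete it."""
    rows = db.execute("SELECT ts, value FROM samples ORDER BY ts").fetchall()
    db.execute("DELETE FROM samples")
    db.commit()
    return rows
```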
IoT = Interoperable Connected Things
The Internet of Things is more than just connecting devices to the Internet. According to McKinsey, it is device interoperability – i.e. sharing device data across multiple applications – that will unlock up to 60% of IoT value in business environments.
McKinsey defines an individual IoT system as ‘sensors and/or actuators connected by networks to computing capabilities that enable a single IoT application’. An operational definition of multi-stakeholder, collaborative IoT scenarios could then be ‘sensors and/or actuators connected by networks to distributed computing capabilities that enable multiple and diverse IoT applications’.
The requirement for an IoT that is collaborative, yet still capable of addressing Internet connectivity and security issues, suddenly makes relevant a ten-year-old technology paradigm: the In-Memory Data Grid (IMDG).
Gartner defines an IMDG as “a distributed, reliable, scalable and … consistent in-memory NoSQL data store, shareable across multiple and distributed applications.”
Combining the concepts of Fog Computing and In-Memory Data Grid might provide the architectural basis to leverage ‘edge’ intelligence for distributed, collaborative IoT scenarios.
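To make the ‘shareable across multiple and distributed applications’ property concrete, here is a minimal sketch using the open-source Hazelcast client for Python as one example of an IMDG (the article does not name a specific product; the cluster address, map name and key below are hypothetical). Any application connected to the same grid sees the same in-memory map:

```python
import hazelcast

# Join an existing IMDG cluster (address is hypothetical).
client = hazelcast.HazelcastClient(cluster_members=["10.0.0.1:5701"])

# A distributed map lives in the cluster's RAM, not in this process:
# every application that opens "sensor-readings" shares the same data.
readings = client.get_map("sensor-readings").blocking()
readings.put("site42/pressure", 4.2)

print(readings.get("site42/pressure"))  # any other client reads the same value
client.shutdown()
```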
The future of collaborative IoT
The architecture of collaborative IoT will need to be based on grids of remote, intelligent Feeders (fog nodes) that collect real-time sensor data, process and store it in volatile memory (e.g. RAM) as required, and send it to brokers in the cloud using industry-standard protocols (MQTT, AMQP, DDS, etc.). Brokers will in turn relay sensor data to any ‘listening’ application.
As they do not require non-volatile memory to process and store data, Feeders will be installed on small-footprint computers in remote locations, or on virtual networks in urban and business environments.
The use of intelligent Feeders will decouple the frequency of data collection from that of data transmission, adapting the flow to the quality of available connectivity – and accommodating its sudden interruptions. When connectivity is optimal, all applications in a collaborative IoT scenario will simultaneously receive a real-time data stream.
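A minimal sketch of this decoupling is shown below, assuming the paho-mqtt client library (1.x API) and a hypothetical broker address, topic and poll/push intervals. Samples accumulate in a RAM-only buffer at the polling rate; the backlog is drained to the broker at the lower transmission rate, and simply keeps growing through connectivity outages:

```python
import json
import random
import time
from collections import deque

import paho.mqtt.client as mqtt  # assumes the paho-mqtt package, 1.x API

BROKER_HOST = "broker.example.com"  # hypothetical cloud broker
POLL_SECONDS = 1                    # high-frequency local sampling
PUSH_SECONDS = 30                   # lower-frequency cloud transmission
buffer = deque(maxlen=100_000)      # volatile (RAM-only) buffer

def read_sensor() -> dict:
    """Stand-in for a real device driver or fieldbus read."""
    return {"ts": time.time(), "temp_c": 20 + random.random()}

client = mqtt.Client()
client.connect_async(BROKER_HOST, 1883)
client.loop_start()  # a background thread handles (re)connection

last_push = 0.0
while True:
    buffer.append(read_sensor())  # poll fast, keep everything in RAM
    due = time.time() - last_push >= PUSH_SECONDS
    if due and client.is_connected():
        while buffer:  # drain the whole backlog in one burst
            client.publish("site42/sensors", json.dumps(buffer.popleft()), qos=1)
        last_push = time.time()
    time.sleep(POLL_SECONDS)
```

QoS 1 asks the broker to acknowledge each message, so samples are not silently dropped in transit, while the deque's maxlen bounds RAM use during very long outages by discarding the oldest samples first.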
For example, in a food retail environment, energy usage and temperature data will be immediately available to applications for energy management, predictive maintenance, and hazard analysis and critical control points (HACCP). In a smart city traffic scenario, the position data of all moving vehicles could be used at the same time to provide drivers with traffic information and advice on alternative routes, public transport users with updates on bus timetables, traffic wardens with directions to congestion areas, automated traffic lights with commands to adjust their timing, pollution management systems with real-time trend analysis and, of course, car insurers and their customers with real-time risk-related information.
Additionally, Feeders will provide the intelligence required to respond to changes locally with the lowest possible latency – for example by slowing down the pumps on an oil rig in response to a trend showing a decrease in pressure.
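A sketch of such local ‘sense & respond’ logic follows, assuming a hypothetical pump interface and an arbitrary trend window and threshold. Because the decision runs on the Feeder itself, no round trip to the cloud is needed:

```python
from collections import deque

WINDOW = deque(maxlen=10)  # the last ten pressure samples

def on_pressure_sample(pressure_bar: float, pump) -> None:
    """Runs locally on the Feeder for every new pressure reading."""
    WINDOW.append(pressure_bar)
    if len(WINDOW) < WINDOW.maxlen:
        return  # not enough history for a trend yet
    # Crude trend test: compare the means of the older and newer halves.
    half = WINDOW.maxlen // 2
    older = sum(list(WINDOW)[:half]) / half
    newer = sum(list(WINDOW)[half:]) / half
    if newer < older * 0.95:  # a sustained ~5% pressure drop
        pump.slow_down()      # hypothetical actuator command
```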
Present Future
Feeders will collect, process in-memory and push to the cloud millions of data points daily, connecting to any device, using any communication protocol, and running on small-footprint edge gateway devices, computers or virtual platforms. They will provide the intelligence required to control local actuators, while avoiding the security pitfalls inherent in traditional request & reply communication.
Intelligent Feeders will eliminate the scaling and performance issues that stem from the complexity and volume of sensors, devices and data in large-scale IoT projects, while ensuring complete interoperability and resilience of data flow – dramatically cutting the cost of any IoT deployment.
This article was written by Elena Pasquali, the CEO of EcoSteer, an IIoT software company with offices in Italy and the US. EcoSteer’s key software product, the EcoFeeder, converts any kind and number of sensors and industrial devices into real-time data streams instantly accessible to multiple applications.