Thoughts on IoT/Edge Computing and the Cloud
Recently, there has been a lot of talk about how Edge Computing/IoT/processing at the edge will be a major focus going forward. I tend to agree, and I think it follows a pattern that has always existed in computing.
Looking back, we had large mainframes accessed via terminals (essentially the Cloud, with older protocols and less bandwidth). Connectivity was poor, however, and it wasn’t long before people needed better processing capability right in front of them. We moved to Desktops/Workstations/Workgroup Servers. With that model, we worked around the bandwidth and processing limits by putting data closer to the user.
Cloud computing came along and, at least in concept, it works a little like the mainframe, but protocol improvements such as the widespread adoption of HTTP, together with higher bandwidth connectivity, have made it stick this time. Commercially, you also don’t have to own one; you can just make use of a provider’s Cloud.
Add to those technical improvements the obvious commercial advantage of turn-on/turn-off computing, and we’re now in an age of utility computing where you can dynamically scale as you need to. From a raw conceptual view, none of this is new or different; it’s the same concept, just with vastly improved technology, meaning the execution delivers a better result. There’s a ton more in terms of repeatable code, standardisation, the use of APIs, and common methods and approaches.
So, back to the point on how IoT/Edge Computing is starting to come along. Recently, Michael Dell made a $1 billion bet that this will be a major part of the future, as can be seen Here on The Register. One comment stood out to me, about how a car would need to process information immediately to detect whether a collision may occur:
Dell chairman and CEO Michael Dell used the example of a deer running out in front of a self-driving car. If the car has to wait for a public (off-device) cloud to receive sensor data, recognise a deer, discern there is the danger of a collision, and direct the car to avoid the collision, then the collision will have happened before the cloud can respond.
Working in computing, this seems completely obvious: sending realtime camera data to the Cloud to figure out whether the car should hit the brakes just seems like a terrible idea; we’d all be dead in a week if this happened. With a bit of basic maths we can see that even at 5G mobile speeds, it seems like utter madness to constantly stream all of the data from a connected car into the cloud. Intel says the connected car will produce on average 4TB of data per day. Say that car runs for 8 hours per day (it’s self-driving, so let’s say you don’t own it and it’s a service vehicle). That’s 500GB per hour the car would need to send for realtime processing, or 8.3GB a minute, which works out to a sustained transfer rate of roughly 139MB/s (about 1.1Gb/s)… Everywhere the car goes… What if it failed? What would the car do then?
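As a sanity check on those numbers, here’s the back-of-envelope arithmetic in a few lines of Python. The 8-hour duty cycle is the assumption from above, and decimal units (1TB = 1000GB) are used throughout:

```python
# Back-of-envelope: sustained uplink needed to stream a connected car's data
data_per_day_gb = 4 * 1000        # Intel's estimate: 4 TB per day
hours_driven = 8                  # assumed service-vehicle duty cycle

gb_per_hour = data_per_day_gb / hours_driven    # 500 GB/h
gb_per_minute = gb_per_hour / 60                # ~8.3 GB/min
mb_per_second = gb_per_minute * 1000 / 60       # ~139 MB/s sustained
gbit_per_second = mb_per_second * 8 / 1000      # ~1.1 Gb/s sustained

print(f"{gb_per_hour:.0f} GB/h, {gb_per_minute:.1f} GB/min, "
      f"{mb_per_second:.0f} MB/s (~{gbit_per_second:.1f} Gb/s)")
```

That’s a sustained gigabit-class uplink per car, everywhere it drives, before you even consider contention with other vehicles on the same cell.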
It’s fairly obvious this might not be the best solution. The process Michael Dell talks about, shuffling processing and data to the edge, makes a lot of sense. It’s really the only safe way of knowing you can make split-second decisions that’ll ultimately save lives. If you’re a car manufacturer, the data at the edge will be invaluable to your ongoing product development; you might also need it for legal compliance in case a car is involved in an accident. So, while the data at the edge is used for realtime processing, it only becomes truly useful once it’s all available in the cloud, where it can be compared and fed into machine learning operations against loads of other data sets.
In the car world, we’ll see manufacturers make the edge more intelligent and “record” interesting events. Driving along the motorway might not necessarily be interesting, so the system will “listen”, but not necessarily store. Another car swerving across a lane really late might be interesting, however; likewise, driving in a residential area where a child throws a ball into the road might generate an interesting event. How did the car react? What was the weather like? How many milliseconds did it take the car to process the event? Could that process be optimised or sped up? Combined, these events can be used by machine learning to generate more intelligent algorithms that make better decisions in the realtime processing equipment in the cars. It’s like a continual learning/feedback loop.
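A minimal sketch of that “listen, but don’t necessarily store” idea might look like the following. The event labels and thresholds here are purely illustrative assumptions, not anything a real manufacturer uses:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical thresholds -- illustrative only, not real calibration values
SWERVE_G = 0.4       # lateral acceleration (g) treated as a swerve
HARD_BRAKE_G = 0.6   # deceleration (g) treated as an emergency stop

@dataclass
class SensorFrame:
    lateral_g: float
    decel_g: float
    obstacle_detected: bool

def classify(frame: SensorFrame) -> Optional[str]:
    """Return an event label if the frame is 'interesting', else None.

    Uninteresting frames are 'listened' to but never stored."""
    if frame.obstacle_detected:
        return "obstacle"
    if frame.decel_g >= HARD_BRAKE_G:
        return "hard_brake"
    if abs(frame.lateral_g) >= SWERVE_G:
        return "swerve"
    return None  # boring motorway driving: don't record

print(classify(SensorFrame(0.05, 0.1, False)))  # None -> discarded
print(classify(SensorFrame(0.55, 0.1, False)))  # swerve -> recorded
```

The real value is in what gets attached to each recorded event (weather, reaction time in milliseconds, surrounding traffic) so the feedback loop has something to learn from.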
Going back to the computing element, imagine the car records a number of these events over the course of a day, and let’s assume the car is powered by electricity. It’s reasonable to assume the cars have 4G/5G connectivity for sharing events into a Cloud-based data pool, but assuming 4G/5G isn’t yet fast enough or economical enough for this, these electric cars still need charging. Enter the Edge Computing model. Where large charging stations are deployed, why not deploy a chunk of storage and a very fast, secured wireless network? While the car is charging for, say, 20-30 minutes, it can transfer a shedload of data to the charging station via high-speed wireless networking (or cable, as it’s plugged in anyway). The charging station could then slowly trickle that data into the public cloud for analysis over the next 2-24 hours.
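The store-and-forward pattern described above can be sketched in a few lines. The backhaul rate and chunk sizes are made-up numbers for illustration:

```python
from collections import deque

class ChargingStationBuffer:
    """Store-and-forward sketch: ingest fast from the car while it charges,
    then drain slowly to the public cloud over the station's backhaul.

    All rates and names here are illustrative assumptions."""

    def __init__(self, backhaul_gb_per_hour: float):
        self.backhaul = backhaul_gb_per_hour
        self.pending_gb = deque()  # data chunks awaiting trickle upload

    def ingest_from_car(self, chunks_gb):
        # fast local transfer (wireless or cable) during the ~20-30 min charge
        self.pending_gb.extend(chunks_gb)

    def hours_to_drain(self) -> float:
        # how long the trickle upload to the cloud will take
        return sum(self.pending_gb) / self.backhaul

station = ChargingStationBuffer(backhaul_gb_per_hour=50)
station.ingest_from_car([120, 80, 50])  # 250 GB dumped during one charge
print(f"{station.hours_to_drain():.1f} h to trickle to the cloud")  # 5.0 h
```

The design point is the asymmetry: the local link only has to be fast for half an hour, while the station’s cheaper backhaul gets the whole quiet period between charges to catch up.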
This is one scenario where IoT (the connected car) interfaces with the edge (the intelligent charging point, with store-and-forward capability and perhaps processing to trim irrelevant data), which then forwards to the Cloud (for centralised analytics/big data analysis). That data can then be fed right back into the cars over 4G in the form of updated maps and models that improve how the onboard hardware processes its environmental surroundings.
This is just one example, in one industry, that’ll change by making use of IoT/Edge/Cloud computing; many more will, in different ways. All of this introduces a need to standardise what we do at a platform level and make it easier to deploy things in a standardised way – think farm, not pet.