How does edge computing support industry?


In recent years the term ‘edge computing’ has become more and more pervasive. But what is it? A specialised technology? A type of computer? In reality, neither.

What is Edge Computing?

It best describes the location of computing (its geography, if you like) rather than a specific technology. In the same way that ‘cloud’ loosely describes centralised computing, the edge describes computing done locally, at or near the source of the data.

Why compute at the edge?


There are many reasons computing can’t be completely centralised. One of the biggest is latency: the delay between the local computer sending a request and receiving a response from a remote server in the cloud.

Over the internet, this delay is variable at best, and where there is a blip in connectivity the connection may be completely unavailable. Responsiveness may be critical to the user, or to the compute’s output, such as processing images in real time to provide near-immediate analysis.


For applications processing massive amounts of data locally, such as those involving large numbers of sensors, GPU computing, AI or high-resolution imaging, the problem is even worse, as all of that raw data would need to be transmitted over internet connections.

Here’s an example: even a very respectable 1 Gigabit fibre internet connection is dwarfed by the amount of data that can potentially be transferred internally within a computer over a modern interface, a massive 252.8 Gbit/s for a PCIe 4.0 x16 slot (15.8 Gbit/s per lane). Whilst this potential may not always be fully utilised internally, trying to push even a fraction of this data to the cloud would be akin to trying to fit a stadium full of people through a tiny door: the queues and delays would be immense. That queue alone would make some industrial applications unfeasible.
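To put those numbers side by side, here is a quick back-of-the-envelope calculation using the figures quoted above (the Python below is purely illustrative):

```python
# Back-of-the-envelope bandwidth comparison using the article's figures.
PCIE4_LANE_GBITS = 15.8                    # PCIe 4.0 effective throughput per lane
pcie4_x16_gbits = PCIE4_LANE_GBITS * 16    # a full x16 slot: 252.8 Gbit/s
internet_gbits = 1.0                       # a 1 Gigabit fibre connection

ratio = pcie4_x16_gbits / internet_gbits
print(f"Internal PCIe 4.0 x16 bandwidth: {pcie4_x16_gbits:.1f} Gbit/s")
print(f"That is roughly {ratio:.0f}x a 1 Gbit/s internet uplink")

# Put another way: every hour of data captured at the full internal rate
# would take around 253 hours (over ten days) to upload at 1 Gbit/s.
print(f"Upload time per captured hour: ~{ratio:.0f} hours")
```

In other words, the internal interface alone offers over 250 times the bandwidth of the uplink, before any other internal data paths are even considered.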

Instead, the computer ‘at the edge’ processes and, depending on the application, summarises the raw data, sending only what is important to the cloud.
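As a rough sketch of the idea, assuming a simple batch of numeric sensor readings, an edge computer might reduce raw data to a small summary like this (the function and field names are illustrative, not any specific product’s API):

```python
import statistics

def summarise(samples):
    """Collapse a batch of raw sensor readings into a few summary values.

    The raw samples stay on the edge computer; only this small summary
    would be sent onwards to the cloud.
    """
    return {
        "count": len(samples),
        "mean": statistics.fmean(samples),
        "min": min(samples),
        "max": max(samples),
    }

# A thousand raw readings become four numbers worth transmitting.
raw_readings = [20.0 + (i % 7) * 0.1 for i in range(1000)]
print(summarise(raw_readings))
```

Real applications apply far more sophisticated processing, of course, but the principle is the same: compute where the data is, transmit only the result.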

Is the edge new?

Only in terminology. Computing has been done at the edge for a very long time; really, for as long as computing has been pervasive enough within industry to automate a task. How long? IBM released the 5531 Industrial Computer in 1984, arguably a good milestone, which saw computers doing processing within industry right where the data was generated. The internet, and the ‘cloud’, came a little later, giving the relative position of the edge its meaning.

Industrial and embedded computing manufacturers have therefore been producing computers for the edge since before it was even a ‘thing’!

Living on the edge

Whilst the edge isn’t a specific technology, a location close to the data does naturally push computing into some more unusual or challenging environments compared to the cloud, which sits happily centralised in air-conditioned, tightly secured and controlled datacentres!

These unusual environments often bring with them technical challenges not seen away from the edge. The need to perform increasingly complex computing locally, in more and more applications, dictates that edge computing must be more robust and rugged.

Since the edge can be located potentially anywhere on Earth (or perhaps beyond!), the only limitation on where the sensitive electronic computing elements can operate is the level of protection they are equipped with. This protection varies greatly: ingress protection against the dust and liquids a computer may be exposed to, such as in a factory or outside in the elements; tolerance of extremes of temperature; and resistance to shock and vibration when used on a vehicle.

So perhaps the unsung hero of the edge is the engineering that happens behind the scenes to enable computing to operate in all the places it does, supporting industry, infrastructure and much more to do things barely imaginable in 1984.

To learn more about Edge Computing Solutions, contact a Captec technical specialist today. 
