Edge Computing
History?
The main concept of edge computing emerged in the 1990s, when Akamai addressed the challenge of web traffic congestion by introducing Content Delivery Network (CDN) solutions. The technology placed network nodes that cached static media in locations closer to end users.
Today, edge computing takes this concept further, introducing computational capabilities into nodes at the network edge so they can process information and deliver services to the end user.
What Is Edge Computing?
Edge computing builds on cloud computing; the two are complementary technologies rather than replacements for one another. In edge computing, data is processed or analyzed by the device itself, or by a local computer or server, rather than being sent to a data center or central cloud. In other words, processes run at the edge, close to the client: on the user’s computer, on an IoT device, or on an edge server that is connected to the internet near the client’s location. Bringing computation to the network’s edge minimizes long-distance communication between client and server. It also reduces latency, bandwidth usage, and the strain on central computing servers, which accelerates real-time data processing. The important takeaway is that the edge of the network is geographically close to the user’s device, unlike origin or cloud servers. Edge computing does require an additional device or resource: where cloud computing typically involves two parties, the cloud server and the user’s device, edge computing involves three, with an extra node, called the edge device, sitting between the end user’s device and the cloud server.
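The three-party idea above can be sketched in a few lines of code. This is a toy model, not a real routing implementation: the node names, distances, and the propagation cost per kilometre are all assumed values chosen only to show why a nearby edge node wins on round-trip time.

```python
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    distance_km: float  # rough distance from the client

# Hypothetical deployment: one distant cloud server, two nearby edge nodes.
nodes = [
    Node("cloud-us-east", 4000.0),
    Node("edge-city-a", 15.0),
    Node("edge-city-b", 40.0),
]

def round_trip_ms(node: Node) -> float:
    # Crude model: ~0.01 ms of one-way propagation per km, doubled for the return trip.
    return 2 * node.distance_km * 0.01

def pick_node(candidates):
    # Route the request to the node with the lowest estimated round trip.
    return min(candidates, key=round_trip_ms)

best = pick_node(nodes)
print(best.name, round_trip_ms(best))
```

Under these made-up numbers, the nearby edge node answers in a fraction of a millisecond while the distant cloud server needs tens of milliseconds before any processing even begins.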
Applications?
Autonomous vehicles: Because edge computing can provide real-time data without delay, it is especially useful in self-driving cars, whose sensors and IoT devices require real-time data about the surrounding traffic. Using that data, autonomous vehicles can communicate with one another and operate without collisions, which also supports effective city traffic management.
In agriculture: Temperature and humidity sensors in agricultural fields collect valuable data, but that data isn’t analyzed or sorted in real time. Edge devices can collect, sort, and perform preliminary analysis of that data, then send it to centralized applications or some form of long-term storage, either in the cloud or on-premises. This traffic is not time-sensitive, and because the data is presorted at the edge, the volume of traffic that needs to be sent is reduced. The upside of edge computing here is faster response times for the applications that require them. The downside can be security: since data is sorted and analyzed at the edge, it is important to ensure the IoT devices themselves are secure.
Many other applications are possible with edge computing, such as patient monitoring in hospitals, traffic management, remote monitoring in the oil and gas industry, and predictive maintenance.
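The agriculture scenario, collect, filter, and aggregate at the edge so only a small summary travels upstream, can be sketched as follows. The sensor IDs, readings, and plausibility range are invented for illustration; a real deployment would tune these per crop and sensor type.

```python
# Hypothetical sensor readings: (sensor_id, temperature_c, humidity_pct)
readings = [
    ("s1", 21.4, 55.0),
    ("s1", 21.6, 54.0),
    ("s2", 35.9, 20.0),   # out-of-range spike, likely a sensor fault
    ("s2", 22.1, 52.0),
]

def presort_at_edge(readings, temp_range=(0.0, 30.0)):
    """Drop implausible readings and aggregate per sensor, so only a
    compact summary needs to be sent to the cloud or on-prem storage."""
    valid = [r for r in readings if temp_range[0] <= r[1] <= temp_range[1]]
    totals = {}
    for sensor_id, temp, hum in valid:
        bucket = totals.setdefault(sensor_id, {"n": 0, "temp": 0.0, "hum": 0.0})
        bucket["n"] += 1
        bucket["temp"] += temp
        bucket["hum"] += hum
    return {
        sid: {"avg_temp": b["temp"] / b["n"], "avg_hum": b["hum"] / b["n"]}
        for sid, b in totals.items()
    }

payload = presort_at_edge(readings)
print(payload)  # one small record per sensor instead of every raw reading
```

The payload that leaves the edge is one averaged record per sensor rather than the full raw stream, which is exactly the traffic reduction the paragraph above describes.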
Challenges?
The current network infrastructure is not capable enough to support real-time data processing with IoT or edge devices. We need stronger networks, with high-frequency waves and more small base stations, to run these processes smoothly. Building out this full set of network equipment takes us into the next-generation model, 5G, and implementing a true 5G network is a challenge in itself.
The other challenge is the security of the edge devices themselves. To secure personal and real-time data on edge devices, we need advanced cybersecurity systems suited to them.
Still, the growing need for on-demand compute, real-time data processing, and the Internet of Things will increase the use of this technology, and ways to overcome these challenges will surely be found.
How Does Edge Computing Work?
Edge computing works by processing data as close to its source or end user’s device as possible. It keeps data, applications and computing power away from a centralized network or data center.
Traditionally, data produced by sensors is either manually reviewed by humans, left unprocessed, or sent to the cloud or a data center for processing and then returned to the device. Relying solely on manual review results in slow, inefficient processes. And while cloud computing provides computing resources, the data transmission and processing strain bandwidth and add latency.
Bandwidth is the rate at which data is transferred over the internet. When data is sent to the cloud, it travels through a wide area network, which can be costly due to its global coverage and high bandwidth needs. When processing data at the edge, local area networks can be utilized, resulting in higher bandwidth at lower costs.
Latency is the delay in sending information from one point to the next; it affects response times. It’s reduced when processing at the edge because data produced by sensors and IoT devices no longer needs to be sent to a centralized cloud to be processed.
By bringing computing to the edge, or closer to the source of data, latency is reduced and bandwidth is increased, resulting in faster insights and actions.
Edge computing can be run on one or multiple servers to close the distance between where data is collected and processed to reduce bottlenecks and accelerate applications. An ideal edge infrastructure also involves a centralized software platform that can remotely manage all edge systems in one interface.
Why Edge Computing? What Are the Benefits of Edge Computing?
The shift to edge computing offers businesses new opportunities to glean insights from their large datasets. The main benefits of edge computing are:
- Lower latency: Latency is reduced when processing at the edge because data produced by sensors and IoT devices no longer needs to be sent to a centralized cloud to be processed.
- Reduced bandwidth: When data is sent to the cloud, it travels through a wide area network, which can be costly due to its global coverage and high bandwidth needs. When processing data at the edge, local area networks can be utilized, resulting in higher bandwidth at lower costs.
- Data sovereignty: When data is processed at the location where it is collected, edge computing allows organizations to keep their data and compute in a suitable location. This reduces exposure to cybersecurity attacks and helps with adherence to strict, ever-changing data-location laws.
- Intelligence: AI applications are more powerful and flexible than conventional applications that can respond only to inputs that are explicitly anticipated. An AI neural network is not trained on how to answer a specific question, but rather how to answer a particular type of question, even if the question itself is new. This gives the AI algorithm the intelligence to process infinitely diverse inputs, like text, spoken words or video.
- Real-time Insights: Since edge technology analyzes data locally rather than in a faraway cloud delayed by long-distance communications, it responds to users’ needs and can produce inferences in real time.
- Persistent improvement: AI models grow increasingly accurate as they train on more data. When an edge AI application confronts data that it can’t accurately or confidently process, it typically uploads it to the cloud so that the AI algorithm can retrain and learn from it. So the longer a model is in production at the edge, the more accurate the model will become.
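The "persistent improvement" loop above, handle confident predictions locally and send uncertain samples to the cloud for retraining, can be sketched as below. The threshold, queue, and toy model are all invented names and values for illustration.

```python
CONFIDENCE_THRESHOLD = 0.8  # assumed cutoff, tuned per application
retraining_queue = []        # samples to upload for cloud-side retraining

def route_prediction(sample, predict):
    """predict(sample) -> (label, confidence). Returns (label, where_handled)."""
    label, confidence = predict(sample)
    if confidence >= CONFIDENCE_THRESHOLD:
        return label, "edge"           # confident: act on it locally, in real time
    retraining_queue.append(sample)    # uncertain: queue for the cloud to learn from
    return label, "cloud-retrain"

# Toy stand-in for an edge AI model: confident on short strings, unsure on long ones.
toy_predict = lambda s: ("short" if len(s) < 5 else "long",
                         0.95 if len(s) < 5 else 0.5)

print(route_prediction("cat", toy_predict))       # handled entirely at the edge
print(route_prediction("elephant", toy_predict))  # answered, but queued for retraining
```

The edge node still returns an answer in both cases; the difference is that low-confidence samples also flow back to the cloud, so the model deployed at the edge keeps improving over time.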
What Are The Types of Edge Computing?
The definition of edge computing is broad. Often, edge computing is referred to as any computing outside of a cloud or traditional data center.
While there are different types of edge computing, three primary categories include:
- Provider edge: The provider edge is a network of computing resources accessed via the internet. It’s mainly used for delivering services from telecommunication companies, service providers, media companies and other CDN operators.
- Enterprise edge: The enterprise edge is an extension of the enterprise data center, consisting of data centers at remote office sites, micro-data centers, or even racks of servers sitting in a compute closet on a factory floor. As with a traditional, centralized data center, this environment is generally owned and operated by IT. However, there may be space or power limitations at the enterprise edge that change the design of these environments.
- Industrial edge: The industrial edge is also known as the far edge. It generally encompasses smaller compute instances such as one or two small, ruggedized edge servers, or even an embedded system deployed outside a data center environment. Because they run outside of a normal data center, there are a number of unique space, cooling, security, and management challenges.