
Monday, February 14

Edge Computing


Edge computing is a distributed information technology (IT) architecture in which client data is processed at the periphery of the network, as close to the originating source as possible.

Data is the lifeblood of modern business, providing valuable business insight and supporting real-time control over critical business processes and operations. Today's businesses are awash in an ocean of data, and huge amounts of data can be routinely collected from sensors and IoT devices operating in real time from remote locations and inhospitable operating environments almost anywhere in the world.

But this virtual flood of data is also changing the way businesses handle computing. The traditional computing paradigm, built on a centralized data center and the everyday internet, isn't well suited to moving endlessly growing rivers of real-world data. Bandwidth limitations, latency issues and unpredictable network disruptions can all conspire to impair such efforts. Businesses are responding to these data challenges through the use of edge computing architecture.

In simplest terms, edge computing moves some portion of storage and compute resources out of the central data center and closer to the source of the data itself. Rather than transmitting raw data to a central data center for processing and analysis, that work is instead performed where the data is actually generated -- whether that's a retail store, a factory floor, a sprawling utility or across a smart city. Only the result of that computing work at the edge, such as real-time business insights, equipment maintenance predictions or other actionable answers, is sent back to the main data center for review and other human interactions.
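
To make the pattern concrete, here is a minimal Python sketch of that idea: the raw readings are crunched where they are collected, and only a small summary would travel back to the data center. The sensor values, alarm threshold, and ingest URL are hypothetical stand-ins, not any particular vendor's API.

```python
# Minimal edge-node sketch: process raw data locally, ship only the result.
# Everything here (sensor values, threshold, URL) is illustrative.
import json
import statistics
import urllib.request

def read_sensor_batch():
    # Stand-in for a real sensor driver; returns one batch of samples.
    return [20.1, 20.4, 19.8, 35.2, 20.0]  # e.g., machine temperatures

def summarize(samples, alarm_threshold=30.0):
    # All raw-data crunching happens here, at the edge.
    return {
        "mean": statistics.mean(samples),
        "max": max(samples),
        "alarm": max(samples) > alarm_threshold,
    }

def send_upstream(summary, url="https://datacenter.example.com/ingest"):
    # Only this small summary, never the raw stream, crosses the network.
    req = urllib.request.Request(
        url,
        data=json.dumps(summary).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)

summary = summarize(read_sensor_batch())
print(summary)
# In a real deployment, only the summary would be sent back:
# send_upstream(summary)
```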

Thus, edge computing is reshaping IT and business computing. Take a comprehensive look at what edge computing is, how it works, the influence of the cloud, edge use cases, tradeoffs and implementation considerations.  READ MORE...

Tuesday, March 23

Complementing the Cloud

As technologies like autonomous vehicles, factory robots, and remote monitoring systems become more commonplace, a concept called edge computing is receiving increased attention and investment.

Edge computing refers to a model in which processing power is placed closer to where data is created in the physical world: while cloud computing platforms like Amazon Web Services run in the company's massive data centers scattered across the world, edge computing focuses on smartening up the car, robot, or other system by processing data right on the device, or on a processor placed in close proximity to it.

It's a concept that's only become more popular as a surge in connected devices — like Tesla's semi-autonomous cars or camera-laden robots in Amazon's factories — collides with the rise of cloud computing, presenting an opportunity for both.

"Edge computing is actually a counterbalance to the cloud," Gartner analyst Bob Gill told Insider. "It's a perfect complement to the cloud that solves for the weakness of the cloud."

As the flexibility, efficiency, and pricing of cloud computing have led firms to abandon their in-house data centers, it's created a new set of technical challenges. While the cloud offers immense raw computing power, relying on it comes with trade-offs, too.

"People realized that not all the things that they want to do in the cloud worked well in the cloud," IDC analyst Dave McCarthy told Insider.

Specifically, edge computing can help solve issues of latency (where systems need to be able to process data incredibly fast), bandwidth (where machines are generating vast amounts of data that would be inefficient to send to a distant data center), autonomy (where systems need to be able to function without a network connection), or compliance (like when information needs to remain within a specific country to comply with local regulations).
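
The autonomy case in particular lends itself to a simple pattern: keep processing on the device and queue results locally whenever the network is down, then flush the backlog once connectivity returns. The Python sketch below is illustrative only; the connectivity check and upload call are hypothetical placeholders.

```python
# Hedged sketch of edge autonomy: keep working offline, sync when possible.
import collections
import time

pending = collections.deque(maxlen=10_000)  # bounded local buffer

def network_up():
    # Stand-in for a real connectivity check (ping, TCP probe, etc.).
    return False

def process_locally(reading):
    # Latency-critical work happens on-device, never waiting on the cloud.
    return {"value": reading, "ok": reading < 100, "ts": time.time()}

def sync_to_cloud(record):
    # Placeholder for the real upload (HTTP, MQTT, ...).
    print("uploaded:", record)

for reading in [42, 87, 120]:           # stand-in sensor stream
    record = process_locally(reading)   # act immediately at the edge
    if network_up():
        while pending:                  # drain any offline backlog first
            sync_to_cloud(pending.popleft())
        sync_to_cloud(record)
    else:
        pending.append(record)          # autonomy: keep running offline
```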

Gartner expects that by 2022 more than 50% of enterprise-generated data will be created and processed outside the traditional data center or cloud.  READ MORE

Thursday, February 25

Technology Trends for 2021

According to Rohit Sharma of UpGrad.com, there are 8 technology trends for 2021 that we should be, or become, aware of...

These are:
  1. Artificial Intelligence
  2. Data Science
  3. Networking Devices
  4. Blockchain (Electronic Ledger)
  5. Robotic Process Automation
  6. Virtual Reality
  7. Edge Computing
  8. Intelligent Applications
Of course...  UpGrad offers courses in each of these areas, in case you want to move toward a career in any of them...  however, community colleges and technical institutes offer the same courses, probably at substantially reduced enrollment fees...   I'm just saying...

With this said...  most if not all of these 8 areas are predicated upon cloud computing, or on storing data in the cloud closer to the user so that response times are reduced...

Cloud computing is the on-demand availability of computer system resources, especially data storage and computing power, without direct active management by the user. The term is generally used to describe data centers available to many users over the Internet.
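
As a concrete illustration of that on-demand quality, here is a hedged Python sketch using the AWS SDK (boto3), assuming it is installed and credentials are configured; the bucket and key names are hypothetical. The user never provisions or manages a server; the data center behind the API handles all of that.

```python
# Hedged sketch of on-demand cloud storage via the AWS SDK (boto3).
# Assumes boto3 is installed and AWS credentials are configured;
# the bucket and key names below are hypothetical.
import boto3

s3 = boto3.client("s3")
s3.put_object(
    Bucket="example-edge-results",         # storage is there the moment we ask
    Key="factory-7/summary-2021-02.json",  # no server for us to manage
    Body=b'{"mean": 20.9, "alarm": false}',
)
```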