There’s an elephant in the big data room. Many companies are using outdated tactics and technology to collect and analyze data, and are missing opportunities to extract maximum value from that data. With the advent of the Internet of Everything (IoE) and fog computing, organizations can deliver the right information at the right time to people on any device, driving business success.
The traditional model for data analytics involves a centralized data warehouse and manual data manipulation and investigation. With the IoE, enormous amounts of data pouring in can make it difficult to find and act upon the right data if it is housed in centralized data warehouses. In most cases, centralized data processing also requires high bandwidth to get information from point A to point B for analysis. Moving gigantic amounts of data around can result in lower reliability and higher network latency.
On the other hand, fog computing promises to unleash actionable data in real time at the network edge through mobility support and knowledge of aspects such as location awareness and geographical distribution.
The Weather Company is using fog computing to transform big data into a competitive advantage. Data flows in from a variety of sources: sensors, nanosatellites, aircraft, autonomous marine vessels, automobiles, drones, and weather enthusiasts around the globe.
The company’s technologists work with executives to develop new products that deliver more value to customers. The company can deliver more relevant and accurate weather forecasts and content, creating a competitive advantage. And with The Weather Channel television network, website, and mobile application, the company is reaching more customers and expanding the business.
Any scenario like The Weather Company’s requires intelligence at the edge of the network, where colossal amounts of data are beginning to pour in from sensors and “things.” In the hyper-connected world of the IoE, innovation is accelerating in ways and at speeds never before seen, because people can access the right information at their fingertips.
Much of the infrastructure to accomplish this—data centers and cloud-based computing—is already established. There’s just one thing missing: a highly virtualized platform that provides compute, storage, and networking services between end devices and traditional cloud computing data centers, a platform now referred to as fog computing.
Think of fog computing as a complement to cloud computing that is closer to the ground. It connects sensors to cloud computing resources to enable rapid, actionable decisions to be made based on big data pouring in from the IoE. Typically, the fog platform filters big data and pushes relevant data to the cloud. It then supports real-time, actionable analytics and enacts processes wherever people are.
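To make the filtering idea concrete, here is a minimal sketch of the kind of logic a fog node might run, assuming a simple hypothetical setup: a stream of temperature readings, a fixed expected baseline, and a tolerance band. The node summarizes the raw stream locally and forwards only the compact summary plus out-of-band readings to the cloud; the names `fog_filter`, `EXPECTED`, and `TOLERANCE` are illustrative, not from the article.

```python
# Hypothetical fog-node filter: aggregate raw sensor readings at the
# network edge and forward only a compact summary plus anomalous
# readings to the cloud, instead of the full raw stream.

EXPECTED = 20.0   # assumed normal baseline for this sensor (illustrative)
TOLERANCE = 5.0   # readings outside EXPECTED +/- TOLERANCE are forwarded

def fog_filter(readings):
    """Return (summary, alerts).

    summary: small aggregate the cloud stores for long-term analytics.
    alerts:  individual readings worth acting on in real time.
    """
    summary = {
        "count": len(readings),
        "mean": sum(readings) / len(readings),
    }
    alerts = [r for r in readings if abs(r - EXPECTED) > TOLERANCE]
    return summary, alerts

# Simulated edge traffic: mostly normal readings, one anomalous spike.
readings = [20.1, 19.8, 20.3, 20.0, 19.9, 20.2, 45.0]
summary, alerts = fog_filter(readings)
print(summary["count"], alerts)  # 7 readings in, only 1 forwarded
```

The point of the sketch is the bandwidth math: seven raw readings arrive at the fog node, but only one alert and a small summary cross the network to the central data center.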
Emerging data virtualization and fog computing technologies are helping companies overcome the challenges of centralized data processing. Data virtualization software makes it seamless to manipulate and view data, no matter where it resides. Fog computing creates a highly virtualized platform that extends cloud resources to the edge of the network. This reduces latency by providing compute, storage, and networking services throughout the network.
“Fog computing has three main benefits,” says Steve Hilton, co-founder and managing director of MachNation, a leading insight services firm. “First, data is processed and accessed more rapidly than in traditional, centralized computing architectures. Second, data can be accessed more efficiently from a network perspective, not always requiring the high bandwidth that is necessary for communications with a central data center. Third, datasets are processed and accessed reliably in the most logical location, minimizing the risks of latency.”
To secure these benefits, Hilton suggests a systematic review of a company’s business processes in each functional area.
“There are numerous processes that would benefit from the use of fog computing, including logistics management, warehousing services, retail services management, field services management, and others,” he says. “After reviewing business processes, companies can create a common architectural approach to deploy fog computing with its ancillary technology components: networks, hardware, platforms, software, and services.”
Together, data virtualization and fog computing help bring intelligence and analytical capabilities to the data. The result is the ability to extrapolate value and act on large quantities of diverse data, as it flows into the network, from any number of sources.
“IT decision makers, OT decision makers, and line-of-business leaders need to work together to determine their fog-based analytics needs,” says Hilton. “The basic question to answer is whether fog-based analytics increases the speed, efficiency, and reliability of processing and accessing data. If the answer is yes, then these stakeholders should determine how to implement a fog solution.”