IoT and Fog Computing Architecture | @ThingsExpo #IoT #DigitalTransformation

Fog computing represents an evolution from a centralized toward a decentralized cloud system

Cloud computing changed data analytics for good. It allowed companies to drastically reduce the resources and infrastructure previously dedicated to business intelligence departments, and it enabled laymen to run advanced business analytics. The cloud also became the architecture of choice for storing and processing big data.

Data keeps piling up continuously, and the emerging Internet of Things will only accelerate that growth. Developers have found an answer to this problem in a new concept called fog computing. As opposed to the cloud, a fog computing architecture conducts all required computations and analytics directly at the data source. This way, a single network administrator can control the work of thousands (or even millions) of data-generating devices through real-time and predictive analytics, without overloading the network with huge piles of data going back and forth.

This process goes on as long as the devices work normally. The moment a problem occurs and a device requires repair or maintenance, the administrator receives a notice. With this approach, which also includes BI software that continuously analyzes each device's operation, administrators can oversee a huge number of devices while using very little network capacity and bandwidth.
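To make this concrete, here is a minimal Python sketch of the idea: the device keeps a rolling window of its own readings, runs the analytics locally, and only publishes upstream when a reading looks anomalous. The read_sensor() and publish() functions are hypothetical stand-ins for a real sensor driver and an MQTT-style client.

```python
import random
import statistics
import time
from collections import deque

WINDOW = 60          # keep only the last 60 readings on the device
ALERT_THRESHOLD = 3  # alert when a reading is 3+ std devs from the mean

def read_sensor():
    """Hypothetical stand-in for a real sensor driver."""
    return random.gauss(50.0, 2.0)

def publish(topic, payload):
    """Stand-in for an MQTT/HTTP client; only called for alerts."""
    print(f"[upstream] {topic}: {payload}")

readings = deque(maxlen=WINDOW)

while True:
    value = read_sensor()
    readings.append(value)

    # All analytics happen locally; nothing is sent for normal readings.
    if len(readings) >= 10:
        mean = statistics.mean(readings)
        stdev = statistics.stdev(readings) or 1.0
        if abs(value - mean) > ALERT_THRESHOLD * stdev:
            # Only anomalies cross the network, so the administrator
            # hears about the device the moment it misbehaves.
            publish("devices/pump-17/alert",
                    {"value": value, "mean": mean, "stdev": stdev})

    time.sleep(1)
```

In a real deployment the publish step would go through a message broker, but the shape of the loop is the point: normal readings never leave the device.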

Benefits of fog computing
The fog computing concept comes with a long list of benefits. Some of them are:

  • It frees up network capacity - As we said earlier, fog computing uses far less bandwidth, so it doesn't cause bottlenecks and similar congestion. Less data moving across the network frees up capacity that can be used for other things (see the sketch after this list).
  • It is truly real-time - Fog computing responds much faster than any other cloud computing architecture we know today. Since all data analysis is done on the spot, it is a true real-time concept, which makes it a perfect match for the needs of the Internet of Things.
  • It boosts data security - Collected data is more secure when it doesn't travel, because cyber criminals can't intercept it and use it for fraudulent purposes. Keeping data at its source also simplifies compliance: the data stays in its country of origin, whereas sending it abroad might violate certain laws.
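The bandwidth point is easy to demonstrate. The sketch below (the device name and the one-hour window are illustrative) compares what a cloud-centric design would send upstream, every raw reading, with the short local summary a fog node might send instead:

```python
import json
import random
import statistics

# One hour of per-second readings from a single device.
raw = [random.gauss(50.0, 2.0) for _ in range(3600)]

# What a cloud-centric design would ship upstream: every reading.
raw_payload = json.dumps({"device": "pump-17", "readings": raw})

# What a fog node ships after summarizing locally: a handful of numbers.
summary_payload = json.dumps({
    "device": "pump-17",
    "count": len(raw),
    "mean": round(statistics.mean(raw), 3),
    "min": round(min(raw), 3),
    "max": round(max(raw), 3),
    "stdev": round(statistics.stdev(raw), 3),
})

print(f"raw:     {len(raw_payload):>7,} bytes")
print(f"summary: {len(summary_payload):>7,} bytes")
# On a typical run the summary is several hundred times smaller.
```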

Disadvantages of fog computing
Like any new concept, fog computing comes with a few disadvantages, although it is hard to say whether they even compare with those of previous computing architectures. Developers commonly cite three main disadvantages of systems built on the fog computing paradigm:

  • Analytics is done locally - You probably noticed that we listed this as a benefit above; it is the trick that lets fog computing systems use much less bandwidth. However, some developers argue that cloud computing should enable people to access their data from anywhere in the world. Fog computing does let developers access the most important IoT data from other locations, but it still keeps piles of less important information in local storage.
  • Some companies don't like their data leaving their premises - This concern applies to all cloud environments. Some people still don't trust outside servers, and since fog computing stores a lot of data on the devices themselves (which are often located outside company offices), part of the developer community perceives this as a risk.
  • The whole system sounds a little confusing - A concept that involves a huge number of devices scattered around the world, each storing, analyzing, and sending its own data, can sound utterly confusing.

More complex hardware and software
Fog computing is a complicated architecture, and because every device performs its own data analytics, it requires a combination of hardware and software components. The required hardware includes Wi-Fi routers, computer chips, various switches, and IP cameras. Cisco, IBM, Intel, EMC, and several other companies have already developed their own systems in this field.

If we think about it, fog computing represents an evolution from a centralized toward a decentralized cloud system. Mobile communications and the dynamic lives people lead require resources to be provisioned locally. The growth of the Internet of Things has contributed greatly to fog computing's development, and with big data growing ever larger, networks around the world will eventually be forced to adopt this approach.

More Stories By Nate Vickery

Nate M. Vickery is a business consultant from Sydney, Australia. He has a degree in marketing and almost a decade of experience managing companies through the latest technology trends. Nate is also the editor-in-chief at bizzmarkblog.com.
