Internet of Things: M2M and Big Data: Rise of the Machines

Don’t Worry – It’s a Good Thing!

Science fiction films abound that warn of machines taking control and wreaking havoc on the human race. "2001: A Space Odyssey," "War Games" and "I, Robot" are just a few of the titles that propose what might happen if we hand too much power over to intelligent, interconnected machines.

Decades after the first cautionary tale, the world's machines are more intelligent and more interconnected than even science fiction authors could have predicted. Machine to Machine (M2M) communication and the mobile revolution have led to the phenomenon of Big Data, an influx of structured and unstructured data at volumes and velocities never seen before. The insightful analysis of all that data is proving to be a blessing to humanity, not the threat that many feared. M2M and Big Data analytics can help reduce costs and create competitive advantage for a wide variety of businesses.

What Is M2M?
M2M refers to systems and technologies that make it possible for networked devices to exchange information and perform actions on their own, without (or with minimal) human intervention. Gathering sensor data from devices, analyzing it and using it to exercise more intelligent control can drive better outcomes; a minimal sketch of this read-analyze-act loop follows the examples below. Everyday examples include:

Smart meters, coupled with predictive analytics, enable utility companies to predict demand patterns, automatically adjust to meet peak demand and avoid over-production when demand is low.

Remote medical sensors can monitor patients, remind them if they've forgotten their medications and alert doctors when intervention might be needed.

Smart buildings have sensors that can analyze environmental data to save energy and improve safety.

Traffic data from networked sensors can be analyzed to predict shifts in traffic patterns. Using this information to control traffic signals can actually prevent traffic jams, not just ease them.

Automated systems like GM's OnStar can alert emergency services when accidents occur, even when the humans involved aren't able to help themselves.
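
The loop behind most of these examples is the same: read, analyze, act. Below is a minimal sketch of that loop in Python for the smart-meter case. The reading function, the smoothing window and the demand thresholds are illustrative assumptions, not details of any utility's actual system.

```python
# A minimal sketch of the M2M read-analyze-act loop for a smart-meter scenario.
# The sensor stand-in, smoothing window and thresholds are illustrative assumptions.
import random
import time
from collections import deque

WINDOW = 12          # number of recent readings used for the forecast
HIGH_DEMAND = 80.0   # kW level at which extra capacity is brought online
LOW_DEMAND = 30.0    # kW level at which capacity is scaled back


def read_meter() -> float:
    """Stand-in for a real smart-meter reading (kW)."""
    return random.uniform(20.0, 100.0)


def forecast(readings: deque) -> float:
    """Naive forecast: the moving average of recent readings."""
    return sum(readings) / len(readings)


def adjust_output(predicted_kw: float) -> str:
    """Decide on an action without human intervention."""
    if predicted_kw > HIGH_DEMAND:
        return "increase generation"
    if predicted_kw < LOW_DEMAND:
        return "reduce generation"
    return "hold steady"


if __name__ == "__main__":
    recent = deque(maxlen=WINDOW)
    for _ in range(5):                      # a few iterations for illustration
        recent.append(read_meter())
        predicted = forecast(recent)
        print(f"predicted demand {predicted:.1f} kW -> {adjust_output(predicted)}")
        time.sleep(0.1)                     # a real system would poll on a schedule
```

A production system would replace the random reading with real telemetry and a proper forecasting model, but the read-analyze-act structure stays the same.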

How M2M Came to Be
M2M didn't arrive on the scene overnight; as with anything else, it followed an evolutionary process. Back in the 1980s, Supervisory Control And Data Acquisition (SCADA) systems were introduced to enhance controls for electricity generation, transmission and distribution, and to improve monitoring and control for traffic and transportation systems. In the '90s, Wireless Sensor Networks were introduced to improve monitoring and control in many manufacturing and industrial systems. Wireless made it easier to monitor and control a broader range of devices but only supported limited, short-range connections.

A major leap forward occurred in the mid-1990s and early 2000s, when data modules were introduced that could communicate via cellular networks. These systems were used first to connect point-of-sale (POS) terminals, vehicle sensors and other remote monitoring and tracking systems, and were then extended to automatic meter reading, security, elevator control, fleet management, vending and telemedicine.

M2M communication and applications have really exploded in diversity and number since the introduction of the Internet as a backbone for communication. Three major factors have combined to accelerate the recent growth in M2M:

  1. More data from more devices can be combined and analyzed more quickly due to advances in tools and technologies for big data analysis and predictive analytics. This enables machine-driven actions based on anticipated conditions - not just faster reaction times.
  2. The "everywhereness" of broadband networks, wireless and Internet has given rise to the Internet of Things (IoT) and has made it easier and cheaper than ever to connect devices. Assign an IP address to a device with Internet access and you can communicate with it anywhere in the world.
  3. Cheaper and smaller sensors, memory and processing power mean that more devices can be networked, and the devices themselves can be smarter.
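
To make the second point concrete, here is a minimal sketch of what "communicate with it anywhere in the world" can look like in practice. The endpoint URL and the JSON status document are hypothetical; a real device exposes whatever interface its vendor defines (HTTP, MQTT, CoAP and so on).

```python
# A minimal sketch of polling an IP-addressable device over the Internet.
# The URL and the JSON fields are hypothetical assumptions for illustration.
import json
from urllib.request import urlopen

DEVICE_URL = "http://device.example.com/status"  # hypothetical endpoint


def read_device_status(url: str, timeout: float = 5.0) -> dict:
    """Fetch and decode a JSON status document from a networked device."""
    with urlopen(url, timeout=timeout) as response:
        return json.load(response)


if __name__ == "__main__":
    try:
        status = read_device_status(DEVICE_URL)
        print("device reports:", status)
    except OSError as err:   # network errors are routine in M2M deployments
        print("device unreachable:", err)
```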

M2M Now and in the Future
Gartner Inc. estimates that there will be close to 30 billion connected devices by 2020 and projects $309 billion in additional revenue, mostly in services, for IoT product and service suppliers by that year. The firm also predicts $1.9 trillion in total economic impact from improved productivity and cost savings, among other factors.

How M2M Is Being Applied
With virtually every industry affected, the applications of M2M technology are startling in their breadth and diversity. Machina Research points to benefits as varied as reduced energy costs, improved safety and security, increased efficiency, and faster response times for emergency services and national defense.

In terms of how far along companies in key verticals are in implementing M2M initiatives, another recent study, by Techpro Research, offers some insight. Energy, IT and automotive top the list of current implementations or plans to implement within the next 12 months, followed by healthcare, facility management, manufacturing and retail.

M2M Success in the Marketplace
For businesses that plan thoughtfully about how to use M2M to achieve their goals, the opportunities to boost revenue, cut costs and serve customers more effectively are tremendous. A few recent examples include:

Retail - Nestlé Nespresso SA has equipped its coffee machines used in restaurants, hotels, offices and luxury retail boutiques to transmit operational and performance data from each machine to a cloud platform for tracking and analysis. The system tracks descaling and other maintenance procedures and alerts technical staff if servicing is required. The applications can also be used to remotely adjust water temperature and pressure. The system helps ensure that machines are maintained in excellent condition, that they produce the highest-quality coffee, cup after cup, and that customers are well supplied with their coffee of choice.
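
A minimal sketch of how a cloud platform might turn one machine's telemetry report into service actions appears below. The payload fields, the descaling interval and the temperature and pressure limits are illustrative assumptions, not Nespresso's actual data model.

```python
# A minimal sketch of a maintenance rule applied to coffee-machine telemetry.
# Field names, the descaling interval and the limits are illustrative assumptions.
DESCALE_EVERY_N_CYCLES = 600   # assumed maintenance interval


def needs_service(telemetry: dict) -> list:
    """Return a list of service actions suggested by one telemetry report."""
    actions = []
    if telemetry["cycles_since_descale"] >= DESCALE_EVERY_N_CYCLES:
        actions.append("schedule descaling")
    if telemetry["water_temp_c"] < 85.0:
        actions.append("check heating element and temperature setting")
    if telemetry["pump_pressure_bar"] < 15.0:
        actions.append("check pump pressure")
    return actions


if __name__ == "__main__":
    report = {
        "machine_id": "boutique-042",
        "cycles_since_descale": 640,
        "water_temp_c": 83.5,
        "pump_pressure_bar": 17.2,
    }
    for action in needs_service(report) or ["no service needed"]:
        print(f"{report['machine_id']}: {action}")
```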

Transportation - The automotive industry and the U.S. federal government are embracing M2M. The U.S. Department of Transportation recently conducted research suggesting that Vehicle-to-Vehicle (V2V) technology could prevent the majority of crashes involving two or more vehicles. Sensors can monitor the speed and location of nearby vehicles, analyze risks and either warn drivers (near term) or take action on their own (longer term) to avoid accidents. The research could lead to a mandate to use V2V in the future.
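
Below is a minimal sketch of the kind of risk check such a system might run. Real V2V systems exchange standardized basic safety messages, so the simple one-dimensional time-to-collision model and the warning threshold here are illustrative assumptions.

```python
# A minimal sketch of a V2V-style risk check using a one-dimensional
# time-to-collision model. The threshold and inputs are illustrative assumptions.
WARNING_TTC_SECONDS = 3.0   # assumed threshold for warning the driver


def time_to_collision(gap_m: float, own_speed_mps: float, lead_speed_mps: float) -> float:
    """Seconds until impact with the vehicle ahead; infinity if not closing."""
    closing_speed = own_speed_mps - lead_speed_mps
    if closing_speed <= 0:
        return float("inf")
    return gap_m / closing_speed


if __name__ == "__main__":
    ttc = time_to_collision(gap_m=25.0, own_speed_mps=30.0, lead_speed_mps=20.0)
    if ttc < WARNING_TTC_SECONDS:
        print(f"collision risk: {ttc:.1f} s to impact - warn driver or brake")
    else:
        print(f"no immediate risk ({ttc:.1f} s to impact)")
```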

Healthcare - Partnering with the University Teaching Hospitals of Grenoble and Toulouse, France Telecom R&D launched a project called "Gluconet" for managing diabetic patients remotely. A special instrument is used to periodically read patient glycemia data. This information gets transmitted automatically to the management center via mobile devices. The doctors can access the information over the Internet. Based on the analysis, doctors send medical advice to patients via SMS or voice messaging. The key advantage here is that both patients and doctors are alerted of any complications well before they become life-threatening.
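
Here is a minimal sketch of how incoming readings might be screened before anyone is alerted. The thresholds and the message wording are illustrative assumptions, not Gluconet's actual clinical rules or workflow.

```python
# A minimal sketch of screening remote glycemia readings and composing an
# SMS-style advisory. Thresholds (mg/dL) and wording are illustrative assumptions.
from typing import Optional

HYPO_MG_DL = 70
HYPER_MG_DL = 250


def screen_reading(patient_id: str, glycemia_mg_dl: float) -> Optional[str]:
    """Return an SMS-style advisory if a reading needs attention, else None."""
    if glycemia_mg_dl < HYPO_MG_DL:
        return (f"Patient {patient_id}: low glucose ({glycemia_mg_dl} mg/dL). "
                "Take fast-acting carbohydrate; doctor notified.")
    if glycemia_mg_dl > HYPER_MG_DL:
        return (f"Patient {patient_id}: high glucose ({glycemia_mg_dl} mg/dL). "
                "Doctor notified for dosage review.")
    return None


if __name__ == "__main__":
    for value in (62, 110, 290):
        message = screen_reading("P-1138", value)
        print(message or f"reading {value} mg/dL within range, logged only")
```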

Consumer - Lexmark, a provider of printing and imaging products, software, solutions and services, deployed M2M for more effective customer servicing. Lexmark uses M2M to collect data from millions of printers. The company analyzes the data to streamline its products to serve customers better, increase revenues and reduce operational costs.

Facilities Management - Commercial real estate services firm Jones Lang LaSalle (JLL) deployed an M2M system called IntelliCommand to collect data from building systems for security and protection against heating, cooling or fire incidents. Information collected by remote sensors is transmitted to a cloud-hosted system for in-depth analysis. When sensors collect data that strays outside of established parameters, alarms are relayed to a control center to alert managers. JLL's pilot installation with four sites enabled clients to cut costs by 15-20 percent. The real estate giant is now extending its deployment to 76 buildings.
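
A minimal sketch of the "strays outside of established parameters" check is shown below. The sensor names, the parameter bands and the two-consecutive-readings debounce are illustrative assumptions, not details of IntelliCommand.

```python
# A minimal sketch of checking building-sensor readings against established
# parameter bands, with a simple debounce before relaying an alarm.
PARAMETER_BANDS = {
    "chiller_supply_temp_c": (5.0, 9.0),
    "server_room_temp_c": (18.0, 27.0),
    "smoke_ppm": (0.0, 2.0),
}
CONSECUTIVE_VIOLATIONS_TO_ALARM = 2   # debounce against noisy sensors

violation_counts = {}


def check_reading(sensor: str, value: float) -> bool:
    """Return True if this reading should raise an alarm at the control center."""
    low, high = PARAMETER_BANDS[sensor]
    if low <= value <= high:
        violation_counts[sensor] = 0
        return False
    violation_counts[sensor] = violation_counts.get(sensor, 0) + 1
    return violation_counts[sensor] >= CONSECUTIVE_VIOLATIONS_TO_ALARM


if __name__ == "__main__":
    stream = [("server_room_temp_c", 26.5), ("server_room_temp_c", 29.1),
              ("server_room_temp_c", 29.4), ("smoke_ppm", 0.3)]
    for sensor, value in stream:
        if check_reading(sensor, value):
            print(f"ALARM: {sensor} = {value} outside established parameters")
```

Requiring two consecutive out-of-band readings is one simple way to cut false alarms; a real deployment would tune that policy per sensor.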

How to Begin the Process
M2M possibilities for some organizations are self-evident. An equipment manufacturer might see an opportunity to leverage machine data to provide better service and build loyalty. Another might see an opportunity to add value that can be monetized. Some companies might find themselves threatened by competitors who have already started using M2M to gain advantage. But it's not so cut and dried for every business. The "M2M Opportunity Matrix" described below offers a structure that can be used to think about M2M and identify opportunities that can improve business performance.

Listed across the top of the Matrix are possible business objectives. This isn't an exhaustive list, but you could do a lot of good for your business by finding ways to reduce cost, increase revenue or add value.

Options related to data sources are listed down the left side. Your organization might already have a large database of information that's coming in from POS systems or manufacturing control systems or some other source - Data In-Hand. But maybe you haven't figured out what to do with the information yet. There might be additional data that you could be collecting from existing "sensors" - New Data from Existing Sources. Or there might be new data that you could access with new sensors, or by sourcing from outside your company - New Data from New Sources. Probably, the data you already have in hand is going to be the easiest to tap into to achieve business objectives. But some opportunities might be so valuable that it's worth deploying new sensors to gather new data.

There's a potential M2M opportunity at the juncture of each business objective and data source. So, do some brainstorming. Start the process by thinking of how to leverage different data sources to achieve various business objectives. It can go in a lot of directions from there.

Alternatively, an experienced data consultant can help you look objectively at your situation and help you to identify low-hanging fruit or the really game-changing opportunities that could deliver more transformative results. There are a lot of right answers. The best thing is to get started.
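
For teams that want a concrete starting point, here is a minimal sketch of the matrix as a simple data structure. The objectives and data sources come from the discussion above; the two seeded ideas are hypothetical placeholders, and every empty cell is a prompt for the brainstorming session.

```python
# A minimal sketch of the M2M Opportunity Matrix as a brainstorming aid.
# The seeded ideas are hypothetical placeholders.
OBJECTIVES = ["reduce cost", "increase revenue", "add value"]
DATA_SOURCES = ["data in-hand", "new data from existing sources", "new data from new sources"]

ideas = {
    ("reduce cost", "data in-hand"): "mine existing POS data to cut stock-outs",
    ("add value", "new data from new sources"): "offer customers a usage dashboard",
}

if __name__ == "__main__":
    for source in DATA_SOURCES:
        for objective in OBJECTIVES:
            idea = ideas.get((objective, source), "(to brainstorm)")
            print(f"{objective:>18} x {source:<32} -> {idea}")
```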

Making the Most of M2M
It turns out that, so far at least, all those cautionary tales about intelligent machines have proven untrue. In fact, interconnected machines and the data they generate are improving the ways we live and do business. The upshot of M2M is smarter systems that don't need to rely on slower human input and that can adapt more quickly as conditions change. Even now we are seeing incredible innovations like remote glucose monitoring, more efficient printing and safer buildings. And that's only the beginning. Rather than imitate the sci-fi writers who turned out to be a bit off-base, we won't try to predict exactly which other life-enhancing technologies powered by M2M are on the horizon.

You, meanwhile, should not hesitate to take part in the M2M revolution. If you wait for someone else to figure out how to best leverage M2M, you are likely to lose market share or lose the opportunity altogether. It may seem overwhelming to know where to start; if that's the case, work with a data consultant who can help create a plan. Don't let the intelligent machines outsmart you.

More Stories By Sam Ganga

Sam Ganga is Executive Vice President of the Commercial Division of DMI. The commercial division at DMI is tasked with helping customers solve business problems by applying the solution value of emerging technologies. Current practice areas include: Enterprise Mobility, Big Data Insights, Cybersecurity, Cloud Computing, SOA and Java/J2EE.

Under Mr. Ganga’s leadership, DMI’s Commercial Division has developed the world’s most comprehensive set of Mobile Enterprise Solutions, including mobile strategy, mobile managed services, mobile app solutions and integrated vertical solutions for retail, financial services and healthcare.
