@ThingsExpo | Cloud, Internet of Things (#IoT) and Big Operational Data

Software-defined architectures are critical for achieving the right mix of efficiency and scale needed to meet the challenges that will come with the Internet of Things

If you've been living under a rock (or rack in the data center) you might not have noticed the explosive growth of technologies and architectures designed to address emerging challenges with scaling data centers. Whether considering the operational aspects (devops) or technical components (SDN, SDDC, Cloud), software-defined architectures are the future enabler of business, fueled by the increasing demand for applications.

The Internet of Things is only going to make that even more challenging as businesses turn to new business models and services fueled by a converging digital-physical world. Applications, whether focused on licensing, provisioning, managing or storing data for these "things" will increase the already significant burden on IT as a whole. The inability to scale from an operational perspective is really what software-defined architectures are attempting to solve by operationalizing the network to shift the burden of provisioning and management from people to technology.

But it's more than just API-enabling switches, routers, ADCs and other infrastructure components. While this is a necessary capability to ensure the operational scalability of modern data centers, what's really necessary to achieve the next "level" is collaboration.

That means infrastructure integration.

It is one thing to be able to automatically provision the network, compute and storage resources necessary to scale to meet the availability and performance expectations of users and businesses alike. But that's the last step in the process. Actually performing the provisioning happens only after it's been determined not only that it's necessary, but where it's necessary.

Workloads (and I hate that term, but it's at least somewhat universally understood, so I'll acquiesce to using it for now) have varying characteristics with respect to the compute, network and storage they require to perform optimally. That means provisioning a "workload" in a VM whose characteristics do not match those requirements is necessarily going to impact its performance or load capability. If one assumes a given application can support a certain number of users, and it's provisioned with a resource profile that undermines that assumption, the result is degraded performance or availability.

What that means is the systems responsible for provisioning "workloads" must be able to match resource requirements with the workload, as well as understand current (and predicted) demand in terms of users, connections and network consumption rates.
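As a minimal sketch of that matching step, the snippet below pairs a workload's resource profile against the free capacity of candidate hosts. The `Profile` fields and host names are hypothetical simplifications; a real scheduler would also weigh current and predicted demand, as described above.

```python
from dataclasses import dataclass

@dataclass
class Profile:
    """A simplified resource profile: compute, memory and network."""
    vcpus: int
    memory_gb: int
    net_mbps: int

def fits(workload, free):
    """True if a host's free resources cover the workload's profile."""
    return (free.vcpus >= workload.vcpus
            and free.memory_gb >= workload.memory_gb
            and free.net_mbps >= workload.net_mbps)

def pick_host(workload, hosts):
    """Return the first host whose free capacity matches the workload,
    or None if nothing fits (provisioning should not proceed blindly)."""
    for name, free in hosts.items():
        if fits(workload, free):
            return name
    return None
```

The point is that the decision precedes the action: if no candidate matches the profile, provisioning anyway just trades an outage for a slow, oversubscribed instance.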

Data is the key. Measurements of performance, query rates, user counts, and their resulting impact on the workload must be captured. But more than that, this data must be shared with the systems responsible for provisioning and scaling the workloads.
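To make that feedback loop concrete, here is a deliberately simple sketch of a scale-out decision fed by shared measurements. The per-instance capacity figure and headroom threshold are illustrative assumptions, not anyone's published algorithm.

```python
from statistics import mean

def should_scale_out(samples, rps_per_instance=500, headroom=0.8):
    """Decide whether measured demand is outgrowing provisioned capacity.

    samples: recent (requests_per_second, instance_count) observations
             captured by the measurement system and shared with the
             provisioning system.
    Scale out when average demand exceeds the headroom-adjusted
    capacity of the instances currently running.
    """
    avg_rps = mean(rps for rps, _ in samples)
    instances = samples[-1][1]          # most recent instance count
    capacity = instances * rps_per_instance
    return avg_rps > capacity * headroom
```

The interesting part isn't the arithmetic; it's that the measuring system and the provisioning system are consuming the same operational data.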

Location Matters

This is not a new concept - that we should be able to share data across systems and services to ensure the best-fit provisioning and seamless scale demanded of modern architectures. A 2007 SIGMOD paper, "Automated and On-Demand Provisioning of Virtual Machines for Database Applications," as well as a 2010 IEEE paper, "Dynamic Provisioning Modeling for Virtualized Multi-tier Applications in Cloud Data Center," discuss the need for such provisioning models; the resulting architectures rely heavily on collaboration - through integration - among the data center components responsible for measuring, managing and provisioning workloads in cloud computing environments.

The location of a workload, you see, matters. Not location as in "on-premise" or "off-premise", though that certainly has an impact, but the location within the data center matters to the overall performance and scale of the applications composed from those workloads. The location of a specific workload comparative to other components impacts availability and traffic patterns that can result in higher incidents of north-south or east-west congestion in the network. Location of application workloads can cause hairpinning (or tromboning if you prefer) of traffic that may degrade performance or introduce variable latency that degrades the quality of video or audio content.
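One way to reason about this is to score a candidate placement by how much traffic it forces across the network. The sketch below multiplies inter-component traffic by the hop distance between the hosts they land on; the hop counts, component names and traffic volumes are all hypothetical inputs a topology-aware scheduler would supply.

```python
def placement_cost(placement, hops, traffic):
    """Estimate the network cost of a placement.

    placement: component -> host it was provisioned on
    hops:      (host_a, host_b) -> network hop count between them
    traffic:   (comp_a, comp_b) -> relative traffic volume between them

    Chatty components placed far apart (more east-west hops, or traffic
    hairpinning through the core) accumulate a higher cost.
    """
    cost = 0
    for (a, b), volume in traffic.items():
        ha, hb = placement[a], placement[b]
        distance = hops.get((ha, hb), hops.get((hb, ha), 0))
        cost += distance * volume
    return cost
```

A scheduler minimizing this cost would keep chatty tiers topologically close, directly reducing the congestion and variable latency described above.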

Location matters a great deal, and yet the very premise of cloud is to abstract topology (location) from the equation and remove it from consideration as part of the provisioning process.

Early in the life of public cloud there was concern over not knowing who your "neighbor tenant" might be on a given physical server, because there was little transparency into the decision-making process that governs provisioning of instances in public cloud environments. Such decisions appeared to - and still appear to - be based on little more than your preference for the "size" of an instance. Obviously, Amazon or Azure or Google is not going to provision a "large" instance where only a "small" will fit.

But the question of where, topologically, that "large" instance might end up residing is still unanswered. It might be two hops away or one virtual hop away. You can't know whether your entire application - all its components - has been launched on the same physical server or not. And that can have dire consequences even in a model that's "built to fail," because if all your eggs are in one basket and the basket breaks... well, minutes of downtime is still downtime.
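The eggs-in-one-basket risk is exactly what an anti-affinity check guards against. Assuming the provisioning system could expose component-to-physical-host mappings (which, as noted, public clouds generally don't), the check itself is trivial:

```python
from collections import Counter

def colocated_components(placement):
    """Return the hosts carrying more than one component of the
    application - a single physical-server failure would take all of
    those components down together.

    placement: component -> physical host (a hypothetical mapping the
    cloud provider would have to expose for this check to be possible).
    """
    counts = Counter(placement.values())
    return {host for host, n in counts.items() if n > 1}
```

An empty result means the components are spread across distinct physical servers; a non-empty one flags exactly the basket that shouldn't hold all the eggs.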

The next evolutionary step in cloud (besides the emergence of much needed value added services) is more intelligent provisioning driven by better feedback loops regarding the relationship between the combination of compute, network and storage resources and the application. Big (Operational) Data is going to be as important to IT as Big (Customer) Data is to the business as more and more applications and services become critical to the business.

More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.
