Cloud, Internet of Things (IoT) and Big Operational Data

Software-defined architectures are critical for achieving the right mix of efficiency and scale needed to meet the challenges that will come with the Internet of Things

If you've been living under a rock (or a rack in the data center), you might not have noticed the explosive growth of technologies and architectures designed to address the emerging challenges of scaling data centers. Whether you consider the operational aspects (DevOps) or the technical components (SDN, SDDC, cloud), software-defined architectures are the future enabler of business, fueled by the increasing demand for applications.

The Internet of Things is only going to make that even more challenging as businesses turn to new business models and services fueled by a converging digital-physical world. Applications, whether focused on licensing, provisioning, managing or storing data for these "things," will increase the already significant burden on IT as a whole. The inability to scale from an operational perspective is really what software-defined architectures are attempting to solve by operationalizing the network, shifting the burden of provisioning and management from people to technology.

But it's more than just API-enabling switches, routers, ADCs and other infrastructure components. While this is a necessary capability to ensure the operational scalability of modern data centers, what's really necessary to achieve the next "level" is collaboration.

That means infrastructure integration.

It is one thing to be able to automatically provision the network, compute and storage resources necessary to scale to meet the availability and performance expectations of users and businesses alike. But that's the last step in the process. Actually performing the provisioning is the action taken only after it's been determined not just that provisioning is necessary, but where it's necessary.
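
To make that concrete, here is a minimal sketch of what API-driven provisioning might look like once that determination has been made. The controller endpoint and payload shape are hypothetical, not any particular vendor's API:

```python
import json
import urllib.request

# Hypothetical endpoint for a software-defined infrastructure controller;
# real controllers (vendor APIs, OpenStack, etc.) differ in paths and payloads.
CONTROLLER_URL = "https://sdn-controller.example.com/api/v1/provision"

def provision_workload(name, vcpus, memory_gb, bandwidth_mbps):
    """Ask the controller to provision compute and network for a workload.

    By the time this call is made, some other system has already decided
    both that provisioning is necessary and where it should happen.
    """
    payload = {
        "workload": name,
        "compute": {"vcpus": vcpus, "memory_gb": memory_gb},
        "network": {"guaranteed_mbps": bandwidth_mbps},
    }
    request = urllib.request.Request(
        CONTROLLER_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)
```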

Workloads (and I hate that term, but it's at least somewhat universally understood, so I'll acquiesce to using it for now) have varying characteristics with respect to the compute, network and storage they require to perform optimally. That means provisioning a "workload" in a VM whose characteristics do not match those requirements will necessarily impact its performance or load capacity. If one makes assumptions about the number of users a given application can support, and it's provisioned with a resource profile that undermines that assumption, the result can be degraded performance or availability.

What that means is that the systems responsible for provisioning "workloads" must be able to match resource requirements to the workload, as well as understand current (and predicted) demand in terms of users, connections and network consumption rates.
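
As a sketch of that matching step, assume invented resource profiles and a deliberately naive first-fit placement; a real scheduler would also weigh current and predicted demand (users, connections, consumption rates), not just a static snapshot of free capacity:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ResourceProfile:
    vcpus: int
    memory_gb: int
    network_mbps: int

@dataclass
class Host:
    name: str
    free: ResourceProfile

def fits(required: ResourceProfile, available: ResourceProfile) -> bool:
    # A workload "fits" only if every dimension of its profile is satisfied.
    return (required.vcpus <= available.vcpus
            and required.memory_gb <= available.memory_gb
            and required.network_mbps <= available.network_mbps)

def place(required: ResourceProfile, hosts: List[Host]) -> Optional[Host]:
    # First-fit: return the first host whose free resources match the
    # workload's profile, or None if nothing fits.
    for host in hosts:
        if fits(required, host.free):
            return host
    return None
```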

Data is the key. Measurements of performance, query rates, user counts, and the resulting impact on the workload must be captured. But more than that, the data must be shared with the systems responsible for provisioning and scaling the workloads.
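
In code, the sharing is the critical half of the loop: measurements are not merely collected, they are handed to whatever system makes provisioning decisions. The metric names and the ingest interface below are illustrative only:

```python
import time

def collect_metrics(workload_name):
    # Stand-in for real instrumentation: query rates, user counts and
    # response times would come from agents, load balancers or APM tools.
    return {
        "workload": workload_name,
        "queries_per_sec": 1200,
        "active_users": 450,
        "p95_latency_ms": 180,
        "timestamp": time.time(),
    }

def share(metrics, provisioner):
    # The step the article argues for: the data is not just captured,
    # it is pushed to the system that provisions and scales workloads.
    # `provisioner` is any object exposing an ingest() method (hypothetical).
    provisioner.ingest(metrics)
```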

Location Matters

This is not a new concept: sharing data across systems and services to ensure the best fit for provisioning and the seamless scale demanded of modern architectures. A 2007 SIGMOD paper, "Automated and On-Demand Provisioning of Virtual Machines for Database Applications," as well as a 2010 IEEE paper, "Dynamic Provisioning Modeling for Virtualized Multi-tier Applications in Cloud Data Center," discuss the need for such provisioning models, and the resulting architectures rely heavily on collaboration, through integration, among the data center components responsible for measuring, managing and provisioning workloads in cloud computing environments.

The location of a workload, you see, matters. Not location as in "on-premise" or "off-premise," though that certainly has an impact, but location within the data center, which matters to the overall performance and scale of the applications composed from those workloads. The location of a specific workload relative to other components impacts availability and traffic patterns, and can result in a higher incidence of north-south or east-west congestion in the network. Placement of application workloads can cause hairpinning (or tromboning, if you prefer) of traffic that may degrade performance or introduce variable latency that degrades the quality of video or audio content.
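
A toy placement score makes the point: penalize network distance between components that exchange traffic, and hairpinned placements surface as high cost. The topology and hop counts below are invented:

```python
# Toy model: hop count between host pairs stands in for network distance.
# A real data center would use measured latency and link utilization.
HOPS = {
    ("rack1-srv1", "rack1-srv2"): 1,  # east-west, same rack
    ("rack1-srv1", "rack4-srv9"): 4,  # east-west, across the fabric
}

def traffic_cost(flows, placement):
    """flows: (component_a, component_b, gbps) triples that communicate.
    placement: component -> host. Higher cost means more congestion and
    latency risk; unknown or same-host pairs count as zero hops here."""
    cost = 0.0
    for a, b, gbps in flows:
        pair = tuple(sorted((placement[a], placement[b])))
        cost += gbps * HOPS.get(pair, 0)
    return cost

# Placing the app tier far from its database quadruples the weighted cost.
flows = [("app", "db", 2.0)]
near = {"app": "rack1-srv1", "db": "rack1-srv2"}
far = {"app": "rack1-srv1", "db": "rack4-srv9"}
print(traffic_cost(flows, near), traffic_cost(flows, far))  # 2.0 8.0
```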

Location matters a great deal, and yet the very premise of cloud is to abstract topology (location) from the equation and remove it from consideration as part of the provisioning process.

Early in the life of public cloud there was concern over not knowing who your "neighbor tenant" might be on a given physical server, because there was little transparency into the decision-making process that governs provisioning of instances in public cloud environments. Such decisions appeared to - and still appear to - be based on little more than your preference for the "size" of an instance. Obviously, Amazon or Azure or Google is not going to provision a "large" instance where only a "small" will fit.

But the question of where, topologically, that "large" instance might end up residing is still unanswered. It might be two hops away or one virtual hop away. You can't know whether your entire application - all its components - has been launched on the same physical server or not. And that can have dire consequences in a model that's "built to fail," because if all your eggs are in one basket and the basket breaks... well, minutes of downtime is still downtime.
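
One common safeguard is an anti-affinity check, sketched here under the assumption that the platform will tell you which physical host each instance landed on (which, as noted, public clouds generally won't):

```python
from collections import defaultdict

def anti_affinity_violations(placements):
    """placements: instance name -> physical host.

    Returns the hosts carrying more than one component of the application,
    i.e. the baskets holding more than one egg.
    """
    by_host = defaultdict(list)
    for instance, host in placements.items():
        by_host[host].append(instance)
    return {h: parts for h, parts in by_host.items() if len(parts) > 1}

# Example: all three tiers ended up on the same physical server.
placements = {"web-1": "rack2-srv7", "app-1": "rack2-srv7", "db-1": "rack2-srv7"}
print(anti_affinity_violations(placements))
# {'rack2-srv7': ['web-1', 'app-1', 'db-1']}
```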

The next evolutionary step in cloud (besides the emergence of much-needed value-added services) is more intelligent provisioning, driven by better feedback loops regarding the relationship between the application and the combination of compute, network and storage resources it consumes. Big (Operational) Data is going to be as important to IT as Big (Customer) Data is to the business, as more and more applications and services become critical to the business.
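
Pulled together, the feedback loop anticipated here might reduce to something like the sketch below: operational measurements in, a provisioning decision out. The capacity figure and metrics are invented for illustration:

```python
import math

def scaling_decision(metrics, users_per_instance, current_instances):
    """Turn operational data into a provisioning decision.

    metrics: dict carrying 'active_users' from the operational data feed.
    users_per_instance: users one instance serves at target performance.
    """
    needed = math.ceil(metrics["active_users"] / users_per_instance)
    if needed > current_instances:
        return ("scale_out", needed - current_instances)
    if needed < current_instances:
        return ("scale_in", current_instances - needed)
    return ("hold", 0)

print(scaling_decision({"active_users": 450}, 200, 2))  # ('scale_out', 1)
```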

More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.


