Storage Networks

Brian Steffler

IBM Edge2014 - You can’t manage the DATA if you can’t manage the INFRASTRUCTURE… Of Elastic Storage and Fabric Vision

by Brian Steffler on 05-20-2014 04:11 PM, last edited on 05-21-2014 09:33 AM

What a week this is turning out to be in Las Vegas. I am jazzed coming into the IBM Edge2014 event and our Platinum sponsorship.

 


 

I might be dating myself, but what is going on in the industry today is an inflection point much like the one back in 1998, when shared storage for the masses was rolling out en masse in a technology called Fibre Channel. The landscape is different now, however, and we need to get today’s networks ready for tomorrow’s workloads. IBM and Brocade are doing just that. Listening to the “Fast Data Forum Livestream Webcast,” I chose to key off one of Tom Rosamilia’s (STG GM of Storage) summary statements, at time marker 1:20:50 in the video, for the title of my blog: “You can’t manage the DATA if you can’t manage the INFRASTRUCTURE.” There were many salient sound bites in the video, but this statement strikes home for many. It is clear customers are having difficulty keeping up with the volume and velocity of data from business, social, and mobile sources being pumped into the cloud and analyzed.

 


 

 

It became clear that IBM’s announcement of Elastic Storage is an underpinning for a new technology direction -- a direction that will help customers control data in a way that not only manages pools of storage (which at one time were stationary), but advances management to the rivers of data streamed by data centers and cloud infrastructures. Jamie Thomas (IBM Software Defined Systems) has a team of innovators working to solve these problems, with breakthroughs that are available today and technologies on the horizon for tomorrow.

 

The timing of these technologies and directions is mirrored in our own recent announcements. Where IBM has Elastic Storage, Brocade has Fabric Vision. These are complementary technologies directed at the same problems customers are facing: data volumes are increasing, some data needs speed of access while other data needs flexibility, and both call for smarter storage and smarter storage area networks.

 

These technologies are driven by high-density server virtualization, cloud architectures, and flash-based storage. But they come with a price: complexity and cost. That is why Brocade has innovated in the storage area network to create Fabric Vision: management tools in the network designed to optimize availability and performance and simplify management. These tools let you pre-validate infrastructure components such as cables, optics, and connections, or the entire SAN fabric, at full line rate prior to deployment, without the need for hosts, storage, or test equipment. This helps customers reduce risk, accelerate new deployments, and identify and resolve connectivity issues in less time. Other advancements provide a threshold monitoring and alerting service that takes only minutes to deploy and is based on best practices, minimizing the need for high-level SAN expertise or complex switch configurations. These tools reduce administrative time, improve availability, and optimize the data economics much like Elastic Storage does.

 

Many of the discussion points from the IBM Fast Data Forum Webcast centered on the problem customers have managing so much data velocity and variety in a dynamic data world where application awareness is becoming crucial. Software-defined storage provides the flexibility for the system to respond quickly with the elasticity clients need and require. IBM Elastic Storage is built on technology from IBM Research, the same research that built BigInsights, and shares some of the same technology that IBM Watson uses.

 

Brocade Fabric Vision is built on tried-and-true technology as well, in the form of Brocade Gen 5 Fibre Channel, which offers advantages to meet new and evolving requirements. Low latency and high I/O Operations per Second (IOPS) performance maximize the number of virtual hosts per physical server. A data center proven, purpose-built architecture minimizes the risk and fault domains of high-density server virtualization. Non-stop networking and automated management minimize operational cost and complexity. Integrated advanced diagnostics, monitoring, and Reliability, Availability, and Serviceability (RAS) capabilities simplify management and increase resiliency. We also address bandwidth optimization and data protection with integrated ISL data compression and encryption. Low overhead and low latency eliminate I/O bottlenecks and unleash the full performance of flash, SSD, and 16 Gbps-capable storage.

 

Brocade’s Gen 5 technologies allow customers to access their data quickly, and provide the infrastructure for software applications to ingest and process that data on the right tier and at the right time. The network provides for the speed, agility and intelligent utilization of the infrastructure, while clients are increasingly applying analytics to their business processes, driving new requirements in storage speed and flexibility.

 

Not only is the data getting large within the data center. Russell Schneider, Principal Consultant at Jeskell, one of IBM’s largest federal resellers, noted in the IBM Webcast that Big Data is becoming Bigger Data -- not just in the enterprise, but externally as well. Data from several agencies is being pulled together and is now being looked at by many in a global view. Neither the traditional storage pool environment nor HPC can manage the amounts on their own. Data must now go on autopilot, and intelligent tools are needed to know which data to keep, which to discard, which to use later, and which to use today. The term Data Democracy was mentioned, which allows the data itself to cast votes telling the system where it should belong on Monday, Tuesday, or Wednesday. The systems, in some sense, need to go on autopilot. Brocade provides infrastructure solutions to address data in disparate locations, such as Business Continuity solutions and the IBM SAN Volume Controller Stretched Cluster Solutions.

 


Michael Factor, IBM

 

Michael Factor, an IBM Distinguished Engineer in IBM Research, discussed the software-defined object storage capabilities they are looking into and their research on Storlets, an approach in which the computation is moved to the data itself instead of the data to the computation. Importance was placed on the openness of the solutions and the work going on in the OpenStack communities. Brocade shares this emphasis on standards and has shared its research with the OpenStack community, embracing this open source cloud platform to promote multivendor and system-to-system interoperability for cloud environments. Brocade supports OpenStack users in three ways: the Brocade data center portfolio delivers the visibility, extensibility, and adaptability for changing environments offered by OpenStack; Brocade is drawing on its data center networking expertise to contribute major architectural enhancements to the core OpenStack framework; and Brocade partners with leading open source providers to offer a consumable, supported path to orchestrating the On-Demand Data Center.
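To make the Storlet idea concrete, here is a minimal sketch of a storlet-style compute module. This is an illustration only: the `__call__(in_files, out_files, params)` shape and the `AnonymizeStorlet` class are assumptions for the sketch, not the actual OpenStack Storlets contract, and object storage is simulated with in-memory streams. The point it shows is that the transformation runs next to the stored object, so only the (smaller, redacted) result crosses the network.

```python
import io

class AnonymizeStorlet:
    """Hypothetical storlet-style module: computation runs where the object
    lives, so only the transformed result leaves the storage node."""

    def __call__(self, in_files, out_files, params):
        # Read the stored object line by line, redact one field, write the result.
        field = params.get("redact", "ssn")
        for line in in_files[0]:
            record = dict(kv.split("=") for kv in line.decode().strip().split(","))
            if field in record:
                record[field] = "***"
            out = ",".join(f"{k}={v}" for k, v in record.items())
            out_files[0].write((out + "\n").encode())

# Simulate an invocation against an in-memory "object".
stored = io.BytesIO(b"name=alice,ssn=123-45-6789\nname=bob,ssn=987-65-4321\n")
result = io.BytesIO()
AnonymizeStorlet()([stored], [result], {"redact": "ssn"})
print(result.getvalue().decode())
```

In a real deployment the storage system would invoke such a module inside a sandbox on the storage node during a GET or PUT, which is exactly the "move computation to the data" inversion described above.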

 


 

My final thoughts draw on elements from Carl Kraenzel (Distinguished Director, Watson Cloud Technology & Support), where Watson provides the cognitive technology in elements of Elastic Storage. There will be a new normal driven by social, mobile, and cloud data coming together: not just pools of data, but rivers of data flowing by us. We cannot handle data as static records. The data will know about itself and can tell you about it. Business models will change. Content and systems will need to be aware of the context of what we are doing. Cognitive computing with rivers of data flowing by will need a smarter infrastructure. You will no longer be looking for a simple “needle in the haystack,” but a “needle in a river.” Elastic Storage is critical to this new data model.

 

Software Defined Storage is here, now. IBM and Brocade have provided technologies and solutions that customers can use and get started on today. For those who don’t get started now, adopting the new data model will only become more difficult for their business tomorrow.

 

Enjoy the event everyone! There is much to learn for all of us. I know I have learned a lot already, and the event is just starting up.

 

-Brian

Comments
by jlv on 05-21-2014 12:57 AM
SDS, SDN, SDDC: in the future, every field will be Software Defined.
by jlaurenz on 05-21-2014 08:11 AM

Great insight in this article. The volume and velocity of data are growing beyond the current policy-based methods (data age, data type, content, file access dates, etc.) of data management. Data needs to move from being a static object to an active participant.

This will require intelligence in the infrastructure, which requires visibility at all levels, and a big part of that will come from Fabric Vision and OpenFlow.

 
