We all know that today’s world is creating massive amounts of data, as is evident in storage consumption. One way to see how fast information is growing is EMC’s own Exabyte journey. It took EMC until 2005 to ship its first Exabyte in total. In 2010, EMC shipped its first Exabyte during the course of a single year, followed in Q3 2011 by its first Exabyte shipped in a quarter. With data growth continuing at an amazing pace, EMC hit its first Exabyte month in Q2 2013.
As a result of the way we’ve been buying and managing storage, data growth is simply breaking that model. Organizations are creating more unstructured data – videos, audio files, images, and file system data – each year than any other category of data. Traditional file-based storage was built for traditional applications, and it remains critical for those applications, but it is more complex than necessary for new Web and mobile apps. These systems feature multiple data protection schemes depending on each array’s features, and each array has its own APIs and management tools.
The complexity also makes it very difficult to scale. One of the biggest challenges of a growing storage environment is that primary storage reaches capacity, so organizations are continually purchasing more storage. This growth causes other problems as well, such as unplanned server outages due to running out of space, excessive administrative costs, backup failures, and more. But the bottom line is that these systems can’t economically scale – especially when storage spans multiple locations. Data protection, administrative costs, and similar factors add storage overhead, which matters because it’s not just about total capacity – it’s about usable capacity – and how efficient you are drives down your $/GB. And fairly or not, enterprises are being compared to public cloud economics, which feature a very low $/GB.
And speaking of cloud, traditional storage was never architected for new Web, mobile, and cloud applications. It was built for access over a LAN by specific applications. Provisioning and access are driven by IT, and it’s difficult if not impossible to provide self-service access to traditional storage in an IT-as-a-Service model.
These modern Web-scale applications are very different. They create 10 times more data than traditional legacy applications, and the expectation is that they be accessible 24x7, 365 days a year.
Web-scale applications are architected differently. They:
Are massively multi-tenant
Are assembled from well-defined components or “black boxes”
Use standard communication protocols to facilitate universal access and interoperability
Are built using frameworks like JavaEE, Spring, Struts, .NET, WebObjects, Zend, Ruby on Rails, Grails, Django, Catalyst …
Mix both structured and unstructured content
Must store both object- and DB-type content
Are extremely scalable in their design and implementation
They demand a different approach to storage.
All of these trends are forming what is being called the 3rd platform of IT technology evolution – what leading industry experts describe as the next-generation platform. It provides an architecture specifically designed to meet these demands and support billions of users and millions of applications.
The 3rd platform provides the means to achieve cloud scale, speed, agility, mobility, and growth.
As we look back, the first platform was the mainframe, with thousands of applications and millions of users, and the end-user device of choice was the proprietary terminal. The second platform is and was the Internet and client/server computing, with the PC as the end-user device. This platform continues to support tens of thousands of applications and hundreds of millions of users. However, current architectures are being pushed to their limits, and scaling this type of environment can be costly and ineffective.
When we talk about the third platform in an enterprise setting, we’re really talking about the convergence of these forces and their powerful combination to serve as a foundational architecture for IT organizations. Beyond the individual trends, the seamless combination of these trends is becoming critical, since it collectively represents an agile new IT fabric for applications, data centers and, most importantly, the user experience. According to IDC, the third platform will serve as the primary growth driver of the IT industry over the next decade, responsible for 75% of new growth as worldwide IT spending moves from $3.7 trillion in 2013 to more than $5 trillion in 2020.
… Customers need to balance the issues they have today, while preparing for the future. Customers want:
Lower OpEx: It’s all about data growth. Mobility, social media, and application development are driving a surge in data growth. Today’s storage silos and manual processes make controlling OpEx while managing data growth very challenging.
Choice: Their environments are becoming infinitely more complex, and they don’t want to be boxed in. Customers have many concerns over things like cost, security, latency, availability, compliance, and data protection (the list goes on). One size doesn’t fit all, and they need a way to gain the benefits of cloud while reducing the complexity of their existing IT infrastructure. They want choice in their vendors and their storage systems, all based on their own individual needs.
A clear path: While the public cloud raises some concerns, customers like the value and attributes it offers – simplicity, ease of use, self-service, and cost effectiveness. However, they don’t know what path to take, and they don’t know how to transform and leverage their current infrastructure to gain these benefits.
With millions if not billions invested in traditional infrastructure that supports current business operations, enterprises simply cannot afford a rip-and-replace strategy to get to the third platform.
For IT organizations, this creates a real challenge. We are dealing with the perpetual need to cut costs in our existing IT environment – spending less on capital equipment and finding efficiencies in our staffing models.
At the same time, there is an opportunity and real need for IT to understand these new web and mobile platforms and partner with the business to build new capabilities so that your company can be the disruptor, not the disrupted.
But, as with all change, it must come without creating undue risk in any system. Security threats are growing, and it’s not enough to just guard the castle walls anymore. The bad guys are inside, and you have to find them before they get out with your assets.
All of this is leading to a new IT agenda. An IT agenda that will reduce costs, increase speed for new growth and balance risk.
First is an adaptive, data-driven approach to security that watches and analyzes all activity so that you can quickly recognize anomalies and act before it’s too late.
Building a software-defined data center that has automation and self-service will lower the cost of running IT and serve as a foundation for faster, more agile IT. This SDDC will provide your private cloud and can be extended to the public cloud, forming a hybrid cloud – the lowest cost way to run IT.
As you start to store and analyze the massive data sets that are available, you will gain new insights about your customers and your business. This requires a Data Lake to store these vast amounts of information.
Finally, you will need to build new applications in an agile way, with new platforms, and target those applications at mobile devices – all managed by IT to secure your corporate information while driving employee productivity and satisfaction.
This new IT agenda is why we created the Federation.
The Federation comprises EMC, VMware, Pivotal, and RSA, each with a specific mission, but all strategically aligned to deliver five solutions based on the new IT agenda.
These solutions can be deployed in your data center, at a partner such as the vCloud Hybrid Service from EMC and VMware, or at any one of our Service Provider partners.
This Federation is architected horizontally with each of the companies free to execute on their mission, partnering outside of the Federation to ensure customers have choice.
Through this innovative model, we provide 1) best-of-breed solutions for the new IT agenda and 2) customer choice, no lock-in.
In an effort to realize 3rd platform benefits, enterprises are embracing the Software-Defined Data Center. The Software-Defined Data Center’s architectural approach enables enterprises to abstract, pool, and automate all of their data center resources (compute, storage, and network) and services to deliver on the promise of the 3rd platform. All the data center resources are abstracted from the underlying hardware to create shared pools of resources. With this approach we can truly build an adaptive data center.
In order to be a truly Software-Defined Data Center, services must be automated across ALL components – server, storage, and network.
Analysts estimate that enterprises have virtualized between 30 and 75% of their compute and 20% of their network infrastructure, but only 5–10% of their storage infrastructure. This is due in part to the fact that, unlike network and compute, storage lacks a set of clearly defined protocols and standards. To realize the full value of the Software-Defined Data Center, compute, network, and storage must all be virtualized.
This is the exact architecture of Atmos, running for that Exabyte-scale customer.
This brings us to the next announcement…
Location independence: private data center, inter-data center, and hybrid private-public data center
Physical infrastructure independence: use existing hardware and retain flexibility for the best price-performance for added capacity and new deployments
Virtual infrastructure: software-defined data center infrastructure, decoupled from the physical layer, with the operational model of a VM applied to the entire data center environment – complete visibility, security, and scale
This must be a software-only model, hardware independent, serving to ABSTRACT hardware, POOL infrastructure, and AUTOMATE all actions that can be automated.
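The abstract/pool/automate idea can be sketched in a few lines of code. This is a hypothetical illustration of the pattern only – the class names, capacities, and provisioning call are invented for this sketch and are not any vendor’s actual API:

```python
# Hypothetical sketch of the ABSTRACT / POOL / AUTOMATE pattern.
# Not a real product API; names and numbers are illustrative.

class Host:
    """ABSTRACT: a physical host reduced to the capacity it contributes."""
    def __init__(self, name, cpu_ghz, ram_gb):
        self.name, self.cpu_ghz, self.ram_gb = name, cpu_ghz, ram_gb

class ResourcePool:
    """POOL: heterogeneous hosts presented as one shared capacity view."""
    def __init__(self, hosts):
        self.free_cpu = sum(h.cpu_ghz for h in hosts)
        self.free_ram = sum(h.ram_gb for h in hosts)

    def provision(self, cpu_ghz, ram_gb):
        """AUTOMATE: self-service allocation with no manual placement step."""
        if cpu_ghz > self.free_cpu or ram_gb > self.free_ram:
            raise RuntimeError("insufficient pooled capacity")
        self.free_cpu -= cpu_ghz
        self.free_ram -= ram_gb
        return {"cpu_ghz": cpu_ghz, "ram_gb": ram_gb}

# Two hosts pooled into 64 GHz / 512 GB of shared capacity.
pool = ResourcePool([Host("host-01", 32, 256), Host("host-02", 32, 256)])
vm = pool.provision(cpu_ghz=4, ram_gb=16)
print(pool.free_cpu, pool.free_ram)  # 60 GHz and 496 GB remain
```

The point of the sketch is that the consumer asks the pool for capacity, not a specific box – the hardware underneath is interchangeable.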
Applications: ANY workload, ANY application – on-demand provisioning, isolation, mobility, speed, and agility
Tools to provide IT operational insight across the full SDDC stack, and IT business economic transparency
Application consumption: anywhere, anytime, any device – office, mobile, home, whatever
It’s worth noting that our approach to management is enhanced by being purpose-built for the cloud era.
Unlike the past, where IT was managed through silos and thus complicated manageability, the new era is about abstraction, pooling, and dynamic infrastructure.
The VMware management solutions were designed with this in mind – to support the maximum level of automation through a new approach to management.
In 2011 when VMware first embarked on delivering a full set of Cloud Management offerings we described our mission around management as: “simplify and automate IT management”.
Our vision was management that was matched to the benefits that virtualization was delivering – benefits anchored on driving down the cost of IT while simultaneously increasing business agility.
Far too often, legacy management tools inhibit these benefits by being overly complex, taking far too long to deliver value, and ultimately being far too expensive.
Today our mission remains unchanged – but we’ve broadened it.
New capabilities extend management across multiple platforms and providers, and we’re delivering powerful solutions for managing services across hybrid clouds.
All to better help you on your journey to the cloud and transition to becoming a broker of IT services.
Our differentiated approach to delivering on this vision – and to helping customers deal with the great scale and dynamic nature of the cloud – is to turn management into manageability through intelligent, policy-based automation.
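Policy-based automation, in essence, means operators declare intent once and the system acts on deviations automatically. A minimal sketch of that idea, with invented metric names and thresholds chosen purely for illustration (this is the concept, not any VMware product’s policy engine):

```python
# Hypothetical sketch of policy-based automation: declare intent as
# (metric, threshold, remediation) tuples; the engine decides what to do.
# Metric names and thresholds are illustrative, not from a real product.

policies = [
    ("cpu_utilization", 0.90, "scale_out"),
    ("datastore_used",  0.85, "expand_storage"),
]

def evaluate(policies, metrics):
    """Return the remediation actions whose policy thresholds are breached."""
    actions = []
    for metric, threshold, action in policies:
        if metrics.get(metric, 0.0) > threshold:
            actions.append(action)
    return actions

current = {"cpu_utilization": 0.95, "datastore_used": 0.60}
print(evaluate(policies, current))  # ['scale_out']
```

The operator never issues the scale-out command by hand; the policy does. That shift from imperative administration to declared intent is what "management into manageability" means here.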