We are using data at a record pace. This directly impacts data centers and how they manage the increase in demand. Check out the data center trends for 2014.
According to DCD’s census report on global data center investment in 2013, which covers all expenditure related to data center construction, infrastructure, and IT, worldwide spending increased 8% over 2012. Global investment reached US$151.3 billion, with Latin America leading the way at a 12.2% increase. Many factors contribute, the biggest being the rapidly growing number of devices connected to the internet, or ‘The Internet of Things’, driving and enabling social media, entertainment, and instant data any time, anywhere, on any device. More businesses are relying on the internet to let customers shop, research, and perform tasks. This has led to new data center models and to the rise of service and cloud provider giants such as Amazon, Baidu, Alibaba, and Google.
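As a quick sanity check on those figures, the implied 2012 baseline can be backed out of the 2013 spend and the growth rate (a back-of-the-envelope sketch; the 2012 number is derived here, not quoted from the report):

```python
# Implied 2012 spend, given 2013 spend of $151.3B and 8% year-over-year growth.
spend_2013 = 151.3   # billions USD, from the DCD census figure above
growth = 0.08

spend_2012 = spend_2013 / (1 + growth)
print(f"Implied 2012 spend: ~${spend_2012:.1f}B")   # ~$140.1B
```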
The nexus of forces describes the convergence and mutual reinforcement of four interdependent trends: social interaction, mobility, cloud, and information. The forces combine to empower individuals as they interact with each other and their information through well-designed ubiquitous technology.
What Is the Nexus, and Where Does It Come From?
The nexus is the point at which two or more of four major IT forces converge to create new patterns of outcomes in technology use, business reality, market dynamics and changes to the lives of people.
The main drivers behind these industry trends are what Gartner calls the Nexus of Forces. Social media, mobile devices, information (big data), and cloud technologies are reshaping the way businesses rely on their data centers to generate new opportunities and advertising methods, optimize the customer experience, and collect valuable marketing research inexpensively. This nexus of forces has made it easier for the consumer to demand more of the business, raising the bar for instant gratification.
Four independent forces — social, mobile, cloud and information — have converged as a result of human behavior, creating a technology-immersed environment. The forces interact and reinforce one another and are associated through complex dependencies. New business opportunities emerge from this Nexus of Forces, especially scenarios that extend reach and relationship to customers, citizens, patients, employees or any other participant in an ecosystem of humans and machines. The combination of pervasive mobility, near-ubiquitous connectivity, industrial compute services and information access decreases the gap between idea and action. To take advantage of the Nexus of Forces and respond effectively, organizations must face the challenges of modernizing their systems, skills and mindsets. Organizations that ignore the Nexus of Forces will be displaced by those that can move into the opportunity space more quickly — and the pace is accelerating.
Cisco estimates that 1.4 trillion devices could be connected but only 10 billion are connected today, and predicts that by 2020 there will be 500 billion devices attached to the internet.
On-Demand Business: anytime/anywhere/any-device data creates the consumerization of enterprise IT, powered by the internet, which empowers the user more than ever. The enterprise needs to be intelligent and interactive to conduct business on demand. To do so, enterprises need to shift from functional silos to integrated enterprise platforms; making that shift is key to competing. Quick fact: 50% of all internet traffic comes from mobile devices (smartphones, tablets, notebooks, etc.).
Example: UK retailers John Lewis and Waitrose have seen ‘omni-channel’ retail (online, mobile platforms, in-store, and social media) drive sales upward (19.2% and 41.4% year over year). Web channel business bandwidth grew 50%. Retailers are using the data collected to personalize the customer experience, strengthening a personal bond that is sometimes missing from a purely online purchase. They are seeing people come to the store for advice and then purchase online.
Show-rooming is increasing, giving more power to the consumer. Because of this shift in the consumer/retailer relationship, the way businesses use data centers is also shifting, and we are seeing two major trends.
Due to increased consumer demand for services and access to data via the internet, businesses have been relying more and more on the data center to drive the business. The data center is becoming what the factory was in the late 1800s and throughout most of the 1900s: the delivery mechanism for products, services, advertising, news, information, and communication in the 21st century. Whose household today doesn’t have at least 5-10 devices connected to the internet? The future promises more connected devices, a variety of time-saving services, and new delivery methods for products such as entertainment, education, banking, etc. Businesses are using their data centers to generate more revenue through these services and product delivery, and to collect ‘Big Data’ used to develop new products and services for the consumer.
We have been seeing a decrease, or delay, in spending by the enterprise DC due to the adoption of virtualization. Virtualization increases the power density of the cabinet and decreases the need for additional cabinets and floor space. What used to be thousands of servers, operating at 20% or less of capacity, in dozens of cabinets, is now reduced to a tenth as many servers operating at 75-90% or more of capacity. This delays the need for more DC build-out. Network connectivity needs also decrease because server and software efficiencies use bandwidth at a higher capacity level. At the same time, energy savings are amplified by the reduction in servers and by new DC trends such as containment, free cooling, and higher DC operating temperatures.
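The consolidation math above can be sketched with illustrative numbers. Note that raising utilization alone accounts for roughly a 4x reduction; the roughly 10:1 figure cited in the text presumably also reflects newer, more powerful hardware. All inputs here are assumptions for the sake of the example, not measurements:

```python
# Rough sketch of the server-consolidation arithmetic described above.
# All inputs are illustrative assumptions, not vendor data.

physical_servers_before = 1000      # legacy estate
utilization_before = 0.20           # ~20% busy

# Total useful work, in "fully busy server" units
useful_work = physical_servers_before * utilization_before   # 200 server-equivalents

# After virtualization, hosts run much hotter
utilization_after = 0.80            # within the 75-90% range cited
hosts_after = useful_work / utilization_after

print(f"Hosts needed after virtualization: {hosts_after:.0f}")           # 250
print(f"Reduction from utilization alone: "
      f"{physical_servers_before / hosts_after:.1f}x")                   # 4.0x
```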
3) Outsourced Services – I have divided these into two types. MTDC (multi-tenant data center) refers to the trend of not owning and operating the data center as an ASSET, but rather leasing the space and the services that support the physical environment: the power delivery chain, cooling, and all the maintenance required to operate a data center. The business still owns and operates the IT infrastructure. In some MTDC models the operator provides cabling and cabinets; in others, just space, power, and cooling. Hyperscale data center services come in a variety of shapes, sizes, and flavors. A business can purchase ‘cloud services’, an amount of compute capacity in the form of servers, storage, network security, and/or bandwidth that the business LEASES for a period of time to run its applications and services. This eliminates the need for space, equipment, and support. MTDC operators are also looking to supplement their offerings and income with network connectivity or bandwidth between their properties, as well as other cloud or XaaS offerings. Other models include a variety of XaaS (‘something’ as a service) offerings, such as storage, infrastructure, and software, that a business can leverage to reduce its capital investment in IT and the data center. We are seeing the line between cloud and XaaS blur, and we predict these will merge in the future. Today, according to Gartner, there is an 80/20 split between MTDC services and hyperscale services. MTDC is beginning to decline as hyperscale services prove their reliability and cost savings. Gartner envisions that within the next 10-15 years, 80% of US data centers will use hyperscale technology and services in some form or in a hybrid design.
Rarely interactive with the consumer: the website was a place to get information, a marketing tool / display, only a one-way channel for getting information on the net.
New types of endpoints: no longer person to person, but machine to machine. E.g., newer household appliances connected to the web send operational status back to the manufacturer for proactive repair.
Example: DataCenter.com reports Barclays Shuts Six Data Centers (2/13/2014): “…Barclays has revealed that it has shut six datacentres and reduced thousands of servers since mid-2013 as part of its cost-cutting efforts. The bank is implementing a series of initiatives – including restructuring the business, redundancies and investment in technology – to help reduce operating expenditures. As part of this programme, titled ‘Transform’, the bank – as well as closing data centres – has reduced its number of servers by 6,000 since H1 2013.”
Data Centres News (2/13/2014) reports that the US General Services Administration (GSA) will close 24 data centres in 2014: “…The Office of Management and Budget has touted datacentre consolidation as a major cost-cutting initiative, estimated to save US$3 billion by 2015. The initial goal was to close 800 of the government’s estimated 2,094 datacentres. Subsequent revised estimates suggest that this was a substantial under-estimation and that the total number of datacentres is above 6,000.”
This chart shows how enterprise DC operators are delaying their next DC build-out, and the effects of virtualization on the server installed base. Spending on actual server hardware is decreasing or staying relatively flat (purple bars). The logical (virtualized) server installed base is increasing (red line), while the physical server installed base is declining or staying flat (purple line). This decreases the need for more data centers but makes existing data centers more dense. Costs of power and cooling are again trending upward, and managing this growing number of logical servers is becoming more expensive. Outsourcing some of these services is trending upward as a way to decrease the cost of IT.
Gartner forecasts that the industry is about 68% virtualized and will continue to virtualize moving forward. There will always be applications that require a dedicated server, but the trend is to continue down the road of virtualization, which is key for moving to the cloud. Virtualized applications can be moved into a cloud architecture and take advantage of hyperscale technology (private, hybrid, or public).
Data Centre News reports (2/13/2014) that TelecityGroup’s results confirm a positive outlook: “…On an organic basis we opened new capacity across Europe, typically around existing sites, where connectivity is high and we have growing demand from ecosystems of interconnected customers. We have also enhanced our offerings, in particular with our new Cloud-IX platform which ensures we extend our position at the core of the digital economy in Europe.”
451 Research reports $23 billion in revenue for managed hosting.
Industrially designed infrastructure: back to basics. Disaggregated for cost (separated into component parts or modules), designed for efficiency, engineered for serviceability, architected for agility.
Web-Oriented Architecture: design for failure and for scale. Typical failures in the first year of a new Google cluster: one PDU, 20 racks, 1,000 machines, thousands of disks (source: “Designs, Lessons and Advice from Building Large Distributed Systems,” Jeff Dean, Google).
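To put those failure counts in perspective, a quick back-of-the-envelope calculation (using only the first-year figures cited above, evenly spread across the year as a simplifying assumption) shows why the software, not the hardware, must absorb failures:

```python
# Back-of-the-envelope view of the failure numbers cited above
# (first year of a new Google cluster, per Jeff Dean's talk).
# Assumes failures are spread evenly through the year.
machine_failures_per_year = 1000
rack_failures_per_year = 20

machines_per_day = machine_failures_per_year / 365
racks_per_month = rack_failures_per_year / 12

print(f"~{machines_per_day:.1f} machine failures per day")   # ~2.7/day
print(f"~{racks_per_month:.1f} rack failures per month")     # ~1.7/month
```

At nearly three machine losses a day, recovery has to be automatic and routine rather than an operator emergency, which is exactly the "design for failure" posture the bullet describes.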
Velocity-oriented processes: DevOps, i.e., continuous development, monitoring, and delivery.
API-based: development, QA, operations.
Collaboratively aligned organization: shared metrics between Dev and Ops.
Risk-embracing culture: the opposite of the IT comfort zone. Take risks with a ‘good enough’ attitude. Break things, learn from them, and move on. Challenge traditional norms.
Open source: plenty of free information and designs are out there for all of these. You don’t need to reinvent the wheel.
Hand-crafted by guild-level artisans versus standardized and mass-produced.
Large shops must adopt today; SMBs cannot avoid this forever.
Scaling, both in volume and geographically, is a key factor.
Another driver toward outsourced services or private cloud architecture is the value of faster. Businesses state that agility and speed are the biggest reasons for moving to a cloud architecture; ultimately, the primary business case for cloud computing will often be speed, which lets business units justify projects on speed alone. The value of “FASTER” is a new trend in the data center as the business world rapidly evolves: get the product to market as quickly as possible. In a connected world, opportunities come and go fast, and the internet has raised consumers’ expectations of immediate gratification. Cloud also allows the business to experiment with low cost and investment, eliminating the barriers to experimentation. Products that consumers don’t favor can be quickly discarded without major losses in profits. In other words, fail faster to win big.
Education: no more certification theater; vendor and expert programs lose their value (Carl Claunch, Gartner).
Virtualizing the DC: from static to dynamic. From monolithic servers, storage, and networks, to fabric-enabled networks, storage, and servers, to future fabric-based systems: pooled and globally shared resources, boundaryless, unified fabric, disaggregated components combined dynamically.
Fabric: the ability to combine components at will (server, network, storage, I/O, specialty engines). Benefits: increases useful life, improves density and energy efficiency, evolves rapidly, with SiPho accelerating its use; leverages technology disruptions; the trend is supported by major systems vendors and start-ups.
Servers: inexpensive rack-mounted, 2- or 4-socket, internal SAS drives; simplified, no frills, reduced complexity; ultra-efficient physical infrastructure; industrial engineering; open source and almost everything in-house; ODM/self-build.
ELE: uses lower-power processor types (ARM, Atom, Power, MIPS, Xeon); thousands of servers per rack; tens to hundreds of thousands of cores per rack.
ODM: OEM servers built by ODMs; ODMs produce 76% of all servers; ODMs now sell direct to end users; self-build plus ODM-direct is about 7.27% of the market; hyperscale will consume 17% of the x86 market by 2016 (Gartner); ODM-direct share will be 4.39% of the x86 market by 2016; coalitions of buyers are increasing their influence (Open Compute, open networking).
SiPho: direct on-chip fabrication of the optical link; larger data rates, longer-distance connections, low latency; should lead to volume economics. Proof of concept with Intel and Corning.
Consumers are reshaping the hardware industry. The white box effect: hyperscale operators are adopting Open Compute techniques by utilizing white box servers, sometimes known as skinless servers, designed by the hyperscale provider to contain only the bare components required to run their software, creating a very low-power server. All energy is used to operate the server; no unnecessary components are installed, concentrating all the power supplied on producing compute. The losers in this market are the big hardware manufacturers who produce 76% of all servers today: Dell, HP, and Lenovo (IBM sold its x86 server unit to Lenovo in January). They have been reporting lower sales and eroding margins. The winners are the ODMs (Original Design Manufacturers), who are building purpose-built servers for these shops and for larger end users adopting hyperscale principles. Today ODMs have 7.27% of the market. Gartner forecasts that hyperscale will consume 17% of the x86 market by 2016 and continue to grow, and that ODM-direct market share will be 4.39% of the x86 market by 2016.
ELE (Extremely Energy Efficient) servers are extremely dense servers utilizing ARM and Atom processors (mainly used in smartphones and tablets for their low power draw). These servers provide hundreds of processors in a 4-8 RU box and use a fraction of the power the same number of conventional servers would use. HP’s Moonshot has 480 processors and uses a little over 1,500 watts of power. The speed and memory capabilities of these servers have increased tenfold in the past 12 months. These servers use between 8 and 16 high-speed assemblies for 40 Gbps network connections.
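The per-node power of such a box can be sketched from the rough figures quoted above (480 processors, ~1,500 W for the chassis). The conventional-server wattage below is an illustrative assumption, and the comparison is rough since per-node compute differs:

```python
# Per-node power arithmetic for the ELE example above, using the
# approximate figures quoted in the text for HP's Moonshot.
processors = 480
chassis_watts = 1500

watts_per_node = chassis_watts / processors
print(f"~{watts_per_node:.1f} W per processor node")   # ~3.1 W

# Compare to a conventional 1U server at ~300 W (an assumption for
# illustration only; per-node performance is not equivalent).
conventional_watts = 300
print(f"~{conventional_watts / watts_per_node:.0f}x less power per node")   # ~96x
```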
Other manufacturers feeling this effect are the networking giants. Part of the Open Compute Project is designing and building a network switch much like a white box server: generic hardware with the ability to run a variety of network operating systems. This eliminates the hold that network manufacturers currently have on the hardware, software, and support of their products. The reality of this is still several years off, but it is gaining momentum. The networking giants are combating it at every turn, designing products that provide more flexibility and the functionality of simpler, low-latency, two-tier networks, with appliances making connections and directing traffic in place of today’s switches and routers.
Traditional manufacturers have been creating a variety of products that integrate services, hardware, software, and modular designs to provide a ‘cloud in a box’, keeping their market share and showing the consumer that a private cloud option is available at low cost, with service and support provided by the manufacturer. If adopted, this will let traditional manufacturers keep a larger portion of their market share, especially with risk-averse customers.
The next two slides show the growth of the hyperscale data center market. The increased use of ODM (purpose-built) servers over traditional servers shows how strongly hyperscale concepts are driving the market.
At Belden, we understand a data center manager’s goals and pain points…
1. Do you believe that all data centers will utilize cloud technology and be constructed like those of the large cloud providers, such as Amazon?
A. Maybe, someday. I believe there will be private clouds in enterprise data centers, with a hybrid cloud bridging to services offered by the public cloud providers at a much lower cost than an enterprise can support itself. I see disaster recovery systems and storage being big players in this hybrid cloud world. Of course, this being technology, every day someone attempts to build a better mousetrap, or should I say service. The concept of failing fast will let startups and well-established companies test the viability of many more applications at a much lower cost.
2. Are you telling us that we need to adopt Cloud technology even if we’re a small business?
A. You shouldn’t take risks for the sake of risk taking. A business needs to take calculated risks at first and assure the staff that it’s OK to fail, as long as you fail quickly and learn from your mistakes. Risk and fail-fast go hand in hand. Extreme amounts of vetting/QA is a dying concept in the cloud-based enterprise. Implement in a calculated process, like Facebook’s method of rolling new code out slowly across the globe and watching what arises.
3. You said that being a generalist in business and technology will be more valuable to a business than being an expert. If there are no experts, who will get the work accomplished?
A. Eventually you will need to join the modern world and use this technology to stay competitive. We tossed out typewriters (the workhorse of the pre-1980 office) for desktop computers. Today, a company could not remain competitive if it gave its employees a typewriter instead of a PC, laptop, or tablet to perform their jobs. The same goes for the concepts discussed here today. Smaller companies may find it easier to outsource in one of the ways discussed until the business grows to a level where it makes sense to support this themselves.
4. You said that being a generalist in business and technology will be more valuable to a business than being an expert. If there are no experts, who will get the work accomplished?
A. By this, I mean being open-minded about different technologies and new concepts: having a well-rounded skill set while understanding the needs of your company and putting the success of the business first, in order to succeed in this highly competitive business climate.