The Data Center Era

You can track the history of business through real estate.

For the first several centuries of civilization, most economies were centered on agriculture. In the 18th century, crop yields were all-important and one of the most vital institutions was the village marketplace. In the 19th century, the factory became the symbol and driving force of industrialism. Innovations such as the Bessemer process were developed to help factories run more efficiently. The trend hit its apex in Henry Ford’s assembly lines in the 1920s.

And right about that time we began to see the emergence of the service industries like finance, law and insurance. By the 1970s, downtown office buildings fed by commuter rail lines and freeways became the center of the business universe.

Now, in the 21st century, we’re in the first stages of the data center era. Forget downtown corporate towers. Large financial institutions now compete for office space in Fort Lee, New Jersey, because bandwidth connections there will shave milliseconds off transactions. Brick-and-mortar record stores and movie theaters are being replaced by online delivery services like Netflix and iTunes.

Consumer and business records will increasingly leave document warehouses for a new home in the cloud. The economies in the new world order are no longer dependent on physical real estate — instead they exist wherever critical data is managed and resides.

In the data center era, winning or losing increasingly will depend upon performance capabilities and operational efficiencies. What are the factors that will determine success?

1. Energy Consumption. Energy is the second-highest variable expense, right after employees, for many large data centers. It can consume 30 percent of operating budgets. Data centers are also notoriously inefficient. Five years ago, it was fairly common to find data centers spending twice as much on energy for air conditioning as for running their servers. Google, Yahoo, Facebook, Amazon and others have managed to substantially reduce energy consumption with strategic management of equipment airflow (hot aisles), DC power distribution and even locating their operations in abandoned mine shafts. Still, there is a massive amount of work left to be done on energy consumption.
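The cooling-versus-compute imbalance described above is what the industry's power usage effectiveness (PUE) metric captures: total facility energy divided by the energy that actually reaches IT equipment. Here is a minimal sketch of that arithmetic in Python, using made-up numbers for illustration rather than figures from any real facility:

```python
# Power Usage Effectiveness (PUE) = total facility energy / IT equipment energy.
# All figures below are illustrative assumptions.

it_energy_kwh = 1_000_000        # energy delivered to servers, storage and network gear
cooling_energy_kwh = 2_000_000   # the "twice as much on air conditioning" case above
other_overhead_kwh = 150_000     # lighting, power distribution losses, etc.

total_energy_kwh = it_energy_kwh + cooling_energy_kwh + other_overhead_kwh
pue = total_energy_kwh / it_energy_kwh

print(f"PUE: {pue:.2f}")  # ~3.15 here; well-run hyperscale facilities report figures near 1.1
```

A PUE of 1.0 would mean every watt goes to computing; the further above 1.0, the more of the energy budget is going to overhead like cooling.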

2. Latency. Getting data from storage system A to server B efficiently and without lag time (or latency) is arguably the biggest challenge in computing today. Nearly every enterprise technology company is working on innovative ways to reduce the time it takes to transfer the massive amounts of data being generated and/or collected by smartphones, PCs, industrial devices and other Internet-enabled things so that it can be processed and analyzed. Right now, companies try to conquer latency by brute force, i.e., buying more computers and other equipment than needed so that work can be performed in parallel or more rapidly. It’s unsustainable. The cost of building data centers the old way is outpacing budgets. Data center overkill is also a prime cause of energy bloat.
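To see why brute force helps but does not scale, consider a rough transfer-time model: total time is roughly the round-trip latency plus the data size divided by link bandwidth. The sketch below uses hypothetical link speeds and data sizes, not figures from the post:

```python
# Rough model: transfer_time = round-trip latency + data_size / bandwidth.
# All numbers are illustrative assumptions.

def transfer_time_s(data_bytes: float, bandwidth_bps: float, rtt_s: float) -> float:
    return rtt_s + (data_bytes * 8) / bandwidth_bps

one_tb = 1e12        # 1 TB of collected sensor/log data
ten_gbps = 10e9      # a 10 Gb/s link
rtt = 0.002          # 2 ms round trip within a metro area

single_link = transfer_time_s(one_tb, ten_gbps, rtt)
parallel_8 = transfer_time_s(one_tb / 8, ten_gbps, rtt)  # striped across 8 links

print(f"single 10 Gb/s link: {single_link:.1f} s")  # ~800 s
print(f"8 links in parallel: {parallel_8:.1f} s")   # ~100 s, but 8x the hardware and power
```

Throwing eight links (and the servers behind them) at the problem cuts the wait, but the hardware, energy and cost scale right along with it.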

We’ll likely see a combination of solutions to address lag time. Fiber optic links, which move data faster than copper wiring, will connect server banks. New storage will also play a key role, because communications standards and technologies can take an extraordinarily long time to reach the market. Storage effectively masks the lag time inherent in these links.
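One way to read that last point is as caching: keep frequently requested data on fast local storage so most reads never traverse the slow link at all. A minimal sketch of the effective-latency math, with hypothetical numbers:

```python
# Effective read latency when a local storage tier absorbs most requests.
# Hit rate and latencies are illustrative assumptions.

local_read_ms = 0.5     # read from a local flash/storage tier
remote_read_ms = 40.0   # fetch over a congested wide-area link
hit_rate = 0.95         # fraction of reads served locally

effective_ms = hit_rate * local_read_ms + (1 - hit_rate) * remote_read_ms
print(f"effective read latency: {effective_ms:.2f} ms")  # ~2.5 ms versus 40 ms uncached
```

The link is still slow; the storage sitting in front of it simply hides that slowness for the common case.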

3. Virtualization. One of the data center world’s pain points is that servers spend most of their time idle. Only about 15% of processing cycles, on a good day, accomplish work. Virtualization decreases idle time by consolidating different applications onto the same piece of hardware. The technology was first applied to servers, but it’s migrating to storage systems and networking equipment.
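A rough way to see the payoff: if each physical server does useful work only about 15 percent of the time, several workloads can share one host before it saturates. The figures below are illustrative assumptions, not measurements:

```python
# Rough consolidation estimate: how many lightly loaded servers can one host absorb?
# All figures are illustrative assumptions.

avg_utilization = 0.15      # "only about 15% of processing cycles accomplish work"
target_utilization = 0.70   # leave headroom on the consolidated host

workloads_per_host = int(target_utilization / avg_utilization)   # ~4 per host
physical_servers = 1000
hosts_needed = -(-physical_servers // workloads_per_host)         # ceiling division

print(f"{workloads_per_host} workloads per host -> roughly {hosts_needed} hosts instead of {physical_servers}")
```

Even this crude estimate shrinks a thousand-server footprint to a few hundred hosts, which is why the technique is spreading from servers into storage and networking.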

4. Real Estate Strategies. Location, location, location. With the exception of South Korea and Japan, the broadband infrastructure of most nations lags behind demand. There is simply more data than bandwidth, and the problem will become more acute. How do we solve it? By strategically locating data center real estate. At one end of the spectrum we will see the evolution of massive mega-data centers that occupy hundreds of thousands of square feet. Supplementing these will be more moderate-sized facilities that will sit in industrial parks.

And to supplement the moderate facilities will be small, neighborhood-sized data centers. Think of something about the size of a dumpster stocked with processors, drives, cables and other pieces of IT equipment. By distributing physical infrastructure to the perimeter, data can be delivered more rapidly. This will also free up valuable bandwidth.

If it sounds farfetched, this is the same sort of strategy the utility industry devised to ensure that power could get to every location.

5. Training and Personnel. One of the big issues that will emerge in a few years is whether companies will prefer to continue to depend on outside cloud providers or bring their data center operations in-house. At some point, you can’t outsource a strategic asset. The shift will increase demand for data center employees and also continue to disrupt urban commuting patterns and land use planning.

How long will the data center era last? It’s hard to say. If more computing power can be distributed to the edge, these beige buildings will lose their value, but right now the growth of data far exceeds our ability to process it individually.

The era might not generate iconic structures like the Chrysler Building or the Empire State Building, but it will likely last quite a while.

Albert “Rocky” Pimentel is President of Global Markets and Customers at Seagate.


Why Facebook, eBay Measure Power and Water Consumption in Their Data Centers…$200M maybe?

Cloud services companies like Facebook and eBay have a vested interest in making their data centers as energy-efficient and water-efficient as possible, not just because it makes for a good public relations message but because they can’t scale operationally without these metrics in place.

Because measuring this sort of thing in a way that matters to the finance team or business division heads is still a fairly new concept, both companies were forced to develop their own measurement approaches to tease out the right information. Both created management dashboards to express and visualize the data in ways that help them make decisions.

eBay published the methodology behind its metric, called Digital Service Efficiency, in March 2013. Now, Facebook has published the best practices and source code behind the similar frameworks it uses to track both power consumption and water usage. It did this as part of the company’s Open Compute Project, under which it has shared the hardware and software designs for its data centers.
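Two widely used metrics behind dashboards like these are power usage effectiveness (PUE) and water usage effectiveness (WUE, roughly liters of water consumed per kWh of IT energy). The sketch below shows the basic WUE arithmetic with made-up inputs; it is not data from Facebook's or eBay's facilities, nor their actual dashboard code:

```python
# Water usage effectiveness (WUE): liters of water used per kWh of IT equipment energy.
# Inputs are illustrative assumptions, not measurements from any real data center.

annual_water_liters = 1_200_000    # cooling-tower and humidification water for the year
annual_it_energy_kwh = 6_000_000   # energy delivered to IT equipment over the same year

wue = annual_water_liters / annual_it_energy_kwh
print(f"WUE: {wue:.2f} L/kWh")     # 0.20 L/kWh with these inputs
```

Tracked quarter over quarter, a number like this is what lets a finance team or a division head see whether an efficiency project actually moved the needle.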




This Ain’t Your Daddy’s Data Center

http://siliconangle.com/blog/2014/02/12/this-aint-your-fathers-data-center/

RTP Data Center Site


Cap Rates for Income-Producing Properties Declined on Average in 2013

Capitalization rates, used by real-estate investors to measure the annual return of income-producing properties, declined for many property types in 2013, according to data from Real Capital Analytics in New York. Meanwhile, the spread between cap rates and yields on 10-year Treasury notes narrowed. The average cap rate for all property types was 6.74% last year, down from 6.76% in 2012. Cap rates fell fastest for office buildings, which had an average cap rate of 6.93% in 2013 compared with 7.15% in 2012. (Wall Street Journal, 1/29/14.)
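For readers less familiar with the metric, a cap rate is simply annual net operating income divided by the property's purchase price, and the spread is that figure minus the 10-year Treasury yield. A quick sketch with hypothetical property numbers (only the 6.74% average comes from the data above; the rest are assumptions):

```python
# Capitalization rate = annual net operating income (NOI) / purchase price.
# Property figures and the Treasury yield are illustrative assumptions.

noi = 3_370_000                    # hypothetical annual net operating income
purchase_price = 50_000_000        # hypothetical acquisition price

cap_rate = noi / purchase_price    # 0.0674, matching the 2013 all-property average
ten_year_treasury = 0.027          # illustrative early-2014 10-year yield

print(f"cap rate: {cap_rate:.2%}")
print(f"spread over Treasuries: {cap_rate - ten_year_treasury:.2%}")
```

A narrowing spread, as the WSJ notes, means investors are accepting less extra return over a risk-free Treasury for holding the property.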

 


Data Center Real Estate Acquisitions Report – 2013 Year In Review



Data center real estate acquisition activity for the full year 2013 yielded $1.292 billion in total transactions, with 7.387 million SF of space sold. This follows just over $2B worth of transactions in 2012. Geographically, the acquisitions were spread throughout the US.

The data center acquisitions market was very active in 2013 after a somewhat tepid start during the first half of the year. Notably, GI Partners made a big splash with its acquisition of the 663,000 SF One Wilshire Building in Los Angeles for $437M in July 2013.
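A quick sanity check derived purely from the numbers above: dividing dollars by square footage gives a rough blended price per square foot and shows how far the One Wilshire deal sits above the year's average (the math is illustrative; individual deals vary widely in what the price includes):

```python
# Rough price-per-square-foot math from the 2013 figures cited above.

total_dollars = 1_292_000_000        # full-year 2013 transaction volume
total_sf = 7_387_000                 # square footage sold

one_wilshire_dollars = 437_000_000
one_wilshire_sf = 663_000

print(f"2013 blended average: ${total_dollars / total_sf:,.0f}/SF")                 # ~$175/SF
print(f"One Wilshire:         ${one_wilshire_dollars / one_wilshire_sf:,.0f}/SF")   # ~$659/SF
```

Even with this crude math, it is clear the One Wilshire transaction priced at several times the blended market average.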


The Unrequited Search For The Optimized Data Center


One of the most aggravating things about technology is that, too often, there’s no one right answer. We spend years swinging on a pendulum, trying to solve problems whose solutions all have drawbacks.

It’s frustrating – especially for left-brained engineers – because there are areas where there is a right, or at least a widely accepted, answer. Ethernet? Yeah, we’ve all agreed on that. TCP/IP? Yep, we’re good.

But in other ways, we’re still wandering in the woods, trying to figure out what works best. The glass house gave way to client-server computing. Terminals were tossed on the dustbin of history, but we still find value in virtualized desktops. At the same time, the increasing popularity of laptops meant that the boundaries of where employees logged on became amorphous. Now the boom in smaller, smarter mobile devices has pushed us back toward consolidated data centers.

For the rest of the article, please click below:

http://www.forbes.com/sites/howardbaldwin/2014/01/23/the-unrequited-search-for-the-optimized-data-center/


Are You Better Protected Renting Space for your Disaster Recovery or Owning a Private Cloud?

http://twimgs.com/infoweek/green/120313s/InformationWeek_SUP_2013_12.pdf
