
Archive for the ‘data center’ Category


CloudHarmony™ provides objective performance analysis to compare cloud providers. Their intent is to be the go-to source for independent, unbiased, and objective performance metrics for cloud services. CloudHarmony is not affiliated with, owned, or funded by any cloud provider. The benchmarks provided by CloudHarmony fall into three categories: Performance Benchmarking, Network Benchmarking, and Uptime Monitoring.

CloudHarmony states that there are seven questions one might ask when considering benchmark-based claims. Answering these questions will help provide a clearer understanding of the validity and applicability of the claims.

  1. What is the claim? Typically the bold-faced, attention-grabbing headline, such as “Service Y is 10X faster than Service Z.”
  2. What is the claimed measurement? Usually implied by the headline. For example, the claim “Service Y is 10X faster than Service Z” implies a measurement of system performance.
  3. What is the actual measurement? To answer this question, look at the methodology and benchmark(s) used. This may require some digging, but can usually be found somewhere in the article body. Once found, do some research to determine what was actually measured. For example, if Geekbench was used, you would discover the actual measurement is processor and memory performance, but not disk or network IO
  4. Is it an apples-to-apples comparison? The validity of a benchmark-based claim ultimately depends on the fairness of the testing methodology. Claims involving comparisons should compare similar things. For example, Ford could compare a Mustang Shelby GT500 (top speed 190 MPH) to a Chevy Aveo (top speed 100 MPH) and claim their cars are nearly twice as fast, but the Aveo is not a comparable vehicle and therefore the claim would be invalid. A fairer, apples-to-apples comparison would be the Mustang GT500 against a Chevy Camaro ZL1 (top speed 186 MPH).
  5. Is the playing field level? Another important question to ask is whether or not there are any extraneous factors that provided an unfair advantage to one test subject over another. For example, using the top speed analogy, Ford could compare a Mustang with 92 octane fuel and a downhill course to a Camaro with 85 octane fuel and an uphill course. Because there are extraneous factors (fuel and angle of the course) which provided an unfair advantage to the Mustang, the claim would be invalid. To be fair, the top speeds of both vehicles should be measured on the same course, with the same fuel, fuel quantity, driver and weather conditions.
  6. Was the data reported accurately? Benchmarking often produces large datasets, and summarizing the data concisely and accurately can be challenging. Things to watch out for include weak statistical analysis (e.g., reporting only the average), math errors, and sloppy calculations. For example, when large, highly variable data is collected, it is generally best practice to report the median rather than the mean (average) to mitigate the effect of outliers. Standard deviation is also a useful metric for gauging data consistency (see the sketch after this list).
  7. Does it matter to you? The final question to ask is, assuming the results are valid, does it actually mean anything to you? For example, purchasing a vehicle based on a top speed comparison is not advisable if fuel economy is what really matters to you.
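To illustrate point 6, here is a minimal Python sketch (the sample latencies are made up for illustration) showing how a single outlier drags the mean far from the typical value, while the median stays representative and a large standard deviation flags the inconsistency:

```python
import statistics

# Hypothetical response times (ms) from repeated benchmark runs;
# the final run hit an outlier (e.g., a noisy neighbor on shared hardware).
samples = [102, 98, 101, 99, 103, 100, 97, 950]

mean = statistics.mean(samples)      # pulled upward by the single outlier (~206 ms)
median = statistics.median(samples)  # still reflects the typical run (~100.5 ms)
stdev = statistics.stdev(samples)    # large value signals inconsistent data

print(f"mean   = {mean:.1f} ms")
print(f"median = {median:.1f} ms")
print(f"stdev  = {stdev:.1f} ms")
```

Reporting the median together with the standard deviation summarizes this dataset far more honestly than the mean alone.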

Read Full Post »

According to a ZDNet article (by John Hazard, March 9, 2011), “IT manager jobs to staff jobs in move to the Cloud“:

The typical IT organization usually maintains a manager-to-staff ratio of about 11 percent (that number dips to 6 or 7 percent in larger companies), said John Longwell, vice president of research for Computer Economics. The ratio has been volatile for four years, according to Computer Economics’ recently released study, IT Management and Administration Staffing Ratios. As businesses adjusted to the recession, they first eliminated staff positions, raising the ratio to its peak of 12 percent in 2009. In 2010, businesses trimmed management roles as well, lowering the ratio to 11 percent, Longwell said. But the long-term trend is toward a higher manager-to-staff ratio, he told me.

“Over the longer term, though, I think we will see a continued evolution of the IT organizations toward having more chiefs and fewer Indians as functions move into the cloud or become more automated.”

For a complete copy of the article see: http://www.zdnet.com/blog/btl/it-manager-jobs-to-staff-jobs-in-move-to-the-cloud/45808?tag=content;search-results-rivers

Read Full Post »

Powered by PathView Cloud, the Cloud Provider Scorecard rates the performance of leading cloud providers to and from numerous locations throughout North America. The scores (100 being the best) are produced by a proprietary algorithm that weighs network performance characteristics – such as capacity, jitter, latency, and packet loss – measured between the provider and these locations. The cloud provider offering the best performance to each city is indicated by the colored circles on the map. Cloud providers are monitored continuously and the scorecard is updated daily. Providers covered include AWS, GoGrid, Hosting.com, Rackspace, and Salesforce.com.

Source: http://www.apparentnetworks.com/CPC/Scorecard.aspx
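The scoring formula itself is proprietary, but for readers curious how such a composite might be built, here is a rough Python sketch. The weights and normalization thresholds below are purely illustrative assumptions and are not Apparent Networks’ actual algorithm:

```python
def network_score(capacity_mbps: float, jitter_ms: float,
                  latency_ms: float, loss_pct: float) -> float:
    """Illustrative 0-100 composite score; weights/thresholds are assumptions."""
    # Normalize each metric to a 0-1 "goodness" value (1.0 is best).
    capacity = min(capacity_mbps / 100.0, 1.0)     # assume 100 Mbps earns full marks
    jitter = max(0.0, 1.0 - jitter_ms / 20.0)      # assume 20 ms of jitter scores zero
    latency = max(0.0, 1.0 - latency_ms / 200.0)   # assume 200 ms latency scores zero
    loss = max(0.0, 1.0 - loss_pct / 2.0)          # assume 2% packet loss scores zero

    # Weighted blend scaled to 0-100 (weights are arbitrary for illustration).
    return round(100 * (0.25 * capacity + 0.20 * jitter
                        + 0.25 * latency + 0.30 * loss), 1)

# One provider as measured from one city (hypothetical measurements).
print(network_score(capacity_mbps=85, jitter_ms=3, latency_ms=45, loss_pct=0.1))
```

A real scorecard would aggregate many such measurements per provider and city over time before publishing a daily number.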

Read Full Post »

Avishay Traeger from the IBM Haifa Research Lab and Erez Zadok from Stony Brook University are raising awareness of issues relating to proper benchmarking practices of file and storage systems.  They hope that with greater awareness, standards will be raised, and more rigorous and scientific evaluations will be performed and published.

In May 2008 they published a paper in the ACM Transactions on Storage entitled “A Nine Year Study of File System and Storage Benchmarking,” in which they surveyed 415 file system and storage benchmarks from 106 papers that were published in four highly regarded conferences (SOSP, OSDI, USENIX, and FAST) between 1999 and 2007. They found that most popular benchmarks are flawed, and many research papers used poor benchmarking practices and did not provide a clear indication of the system’s true performance. They have provided a set of guidelines that they hope will improve future performance evaluations. An updated version of the guidelines is available.

Traeger and Zadok have also set up a mailing list for information on future events, as well as discussions. More information can be found on their File and Storage System Benchmarking Portal: http://fsbench.filesystems.org/.

Read Full Post »

ServiceXen, an IT firm located in Atlanta, Georgia, has provided six (6) interactive spreadsheets to assist in IT benchmarking activities. Each spreadsheet is a shared Zoho Sheet. See below:

  1. Data Center Security Audit
  2. New Employee Cost Calculator
  3. Server Buy vs. Lease Calculator
  4. Total Cost of Ownership (TCO) Calculator
  5. Virtualization Fit Tool

Read Full Post »

Amazon Virtualization

Virtualization Benchmark

Amazon sold storage to external customers for 15 cents/GB/month (estimated).

Bechtel’s internal storage costs were $3.75/GB/month.

WHAT BECHTEL LEARNED: Amazon could sell storage cheaply, Ramleth believes, because its servers were more highly utilized.

Source: CIO Magazine, Bechtel’s New Benchmarks, October 24, 2008.
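That works out to a 25x difference. The article credits utilization, and a quick back-of-the-envelope sketch shows why utilization matters so much (the raw-cost and utilization figures below are hypothetical, not from the article):

```python
amazon_price = 0.15   # $/GB/month, Amazon's estimated external price
bechtel_cost = 3.75   # $/GB/month, Bechtel's internal cost

print(f"Bechtel's cost is {bechtel_cost / amazon_price:.0f}x Amazon's price")  # 25x

# Hypothetical: the same raw infrastructure cost per *provisioned* GB becomes
# much more expensive per *used* GB as utilization drops.
raw_cost_per_provisioned_gb = 0.10          # $/GB/month, assumed
for utilization in (0.80, 0.40, 0.10):      # assumed utilization levels
    effective = raw_cost_per_provisioned_gb / utilization
    print(f"{utilization:.0%} utilized -> ${effective:.2f} per used GB/month")
```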

Read Full Post »

Microsoft Data Center

Microsoft has set some benchmarks for its data center of the future. “A key driver is our goal to achieve a PUE (power usage effectiveness) at or below 1.125 by 2012 across our data centers,” said Michael Manos, general manager of global foundation services at Microsoft.

Source: http://loosebolts.wordpress.com/2008/12/02/our-vision-for-generation-4-modular-data-centers-one-way-of-getting-it-just-right/
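PUE is total facility power divided by IT equipment power, so a PUE of 1.125 leaves only 12.5% of the IT load for cooling, power distribution, and everything else. A quick sketch of the arithmetic (the 1 MW IT load is a made-up figure for illustration):

```python
it_load_kw = 1000.0    # hypothetical 1 MW of IT equipment load
target_pue = 1.125     # Microsoft's stated target for 2012

# PUE = total facility power / IT equipment power,
# so the total facility budget is the IT load times the target PUE.
max_total_kw = it_load_kw * target_pue
overhead_kw = max_total_kw - it_load_kw   # cooling, power distribution, lighting, ...

print(f"Facility power budget: {max_total_kw:.0f} kW")                      # 1125 kW
print(f"Non-IT overhead allowed: {overhead_kw:.0f} kW (12.5% of IT load)")
```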

Read Full Post »

Microsoft will staff its 500,000-square-foot Chicago data center with only 35 people. Read more about Microsoft’s new data center.

Source: ‘How to Staff Your Next Data Center’, CIO Magazine, June 10, 2008.

Data centers built for today’s equipment range from 150 to 300 watts per square foot.
Source: ‘The 5 Pitfalls of Data Center Consolidation and Relocation’, CIO Magazine, November 19, 2008.
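Combined with a facility footprint, that density range gives a rough sense of total load. Using the 500,000-square-foot figure from the Chicago post above (illustrative only, since the entire floor area would not actually be equipment space):

```python
floor_area_sqft = 500_000            # footprint from the Chicago data center post above
for watts_per_sqft in (150, 300):    # density range quoted by CIO Magazine
    total_mw = floor_area_sqft * watts_per_sqft / 1_000_000
    print(f"{watts_per_sqft} W/sq ft -> {total_mw:.0f} MW")   # 75 MW to 150 MW
```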

Read Full Post »