Data Centers of the Future

IDG Enterprises gathered a number of articles on data center management trends. Let’s take a look at some of the highlights:

The Fuel Behind Data Center Traffic Growth

Cloud computing is driving data center traffic growth. Over the next four years, data center traffic is expected to triple, reports Sharon Gaudin, largely due to growth in cloud computing. In her article, Ms. Gaudin reports that by 2018 the cloud will account for 76% of data center traffic. Other industry findings report a 50% growth rate in public cloud computing, as well as 40% growth in hybrid clouds and 45% growth in private clouds. This growth will require data centers to perform even more efficiently.

Virtualization Is Key

Eric Knorr writes, “the technology foundation of cloud boils down to one seminal advance: virtualization.” He further adds, “virtualization abstracts the resources delivered by hardware infrastructure from the hardware itself. The resources become elastic, ‘defined’ by software rather than by admins crawling around the data center rerouting cables, standing up new boxes, or flipping physical switches.” Taking virtualization one step further is the software-defined data center, which extends this time-saving functionality to the storage and network sides of IT.

Software-Defined Data Centers

Finally, how does one get to a software-defined data center (SDDC)? Brandon Butler presents these essential first steps when starting down the path to SDDC:

  • Make sure you can manage capacity, and that you have enough capacity to meet your organization’s needs
  • Make sure your platform can support multiple virtualization and cloud vendors
  • Make sure your configuration management process is automated, not manual
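The configuration-automation point above can be illustrated with a toy declarative sketch (the settings and values are hypothetical, not from Mr. Butler’s article): desired state is declared as data, and software converges the actual state toward it, instead of an admin applying each change by hand.

```python
# Minimal sketch of automated configuration management: declare the
# desired state as data, then compute what must change to reach it.
# All setting names and values here are invented for illustration.

desired_state = {"ntp_server": "time.example.com", "max_vms_per_host": 40}

def converge(actual_state, desired):
    """Return only the settings that must change to match the desired state."""
    changes = {}
    for key, value in desired.items():
        if actual_state.get(key) != value:
            changes[key] = value
    return changes

actual = {"ntp_server": "old.example.com", "max_vms_per_host": 40}
print(converge(actual, desired_state))  # only ntp_server needs updating
```

Because the same declaration is applied everywhere, the process is repeatable and auditable in a way manual changes are not.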

What trends are driving change in your data center?

Cloud Adoption Success

Cloud computing is growing fast – you may feel it in your own organization. Louis Columbus of Forbes recently rounded up various cloud growth estimates in his article:

  • By 2018 59% of cloud workloads will be SaaS workloads (up from 41% in 2013)
  • 42% of IT decision makers are planning to increase spending on cloud computing this year
  • Spending on cloud infrastructure and platforms is expected to grow by 30% from 2013 through 2018

With this pace of cloud growth, now is a good time to step back and review what it takes to ensure successful cloud adoption and deployment. On the Thoughts on Cloud blog, IBM Chief Architect Martin Wolfe identifies the following factors to address when deploying a cloud solution:

  • Determine if the cloud is a good fit and is justified. Don’t assume the cloud will fix everything
  • Understand the impacts of organization, culture and process. This means not only technical skills but also financial management and other key internal processes
  • Identify key areas of automation and services management integration
  • Define technology and process standards. Mr. Wolfe points out, “Just as important are the policies and, especially, practices in place for utilizing that technology. Often, these practices are not documented in the form of official policies, yet they typically have a huge impact when utilizing a cloud model”
  • Identify targeted workloads and migration considerations. Mr. Wolfe adds the focus should be on workloads not operating models

He recommends starting every cloud project with two assumptions. First, cloud is a pay-per-use model. This model can make costs less predictable than a pre-allocated budget. Is your organization prepared? Second, cloud assets are not wholly owned, or owned at all, by the organization. He adds, “In order to successfully use the cloud model, you need to ask yourself a couple of questions: What will get fixed, improved, enhanced or created if I use cloud? What problem or challenge am I trying to solve by using cloud?”
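A toy calculation, with made-up numbers, shows why pay-per-use spending is harder to predict than a pre-allocated budget: the bill tracks usage month by month.

```python
# Hypothetical pay-per-use billing: cost varies with consumption,
# unlike a fixed pre-allocated budget. Rates and hours are invented.

RATE_PER_HOUR = 0.10            # hypothetical per-hour rate
FIXED_MONTHLY_BUDGET = 500.0    # hypothetical pre-allocated budget

monthly_usage_hours = [3000, 5200, 4100, 7800]  # usage swings month to month

pay_per_use = [round(h * RATE_PER_HOUR, 2) for h in monthly_usage_hours]
print(pay_per_use)                            # bills range widely
print(max(pay_per_use) - min(pay_per_use))    # swing a fixed budget never shows
```

In the heavy months the pay-per-use bill runs well past the fixed budget, which is exactly the unpredictability Mr. Wolfe warns organizations to prepare for.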


Share how your organization is evaluating its use of cloud computing.

Tips for Managing Cloud Costs

You made the switch to cloud, team members are high-fiving each other, and systems are running smoothly. But now comes the question: “How do we best manage our cloud costs?”

Clinton Boulton writes for The Wall Street Journal about the importance of managing cloud costs, especially as cloud use continues to grow: “The ability to manage cloud costs is becoming a priority for businesses as the technology moves deeper into the mainstream. World-wide spending on public cloud services is expected to total $59.5 billion, up from $45.7 billion in 2013, according to market research firm IDC. The cloud market is expected to have a compound annual growth rate of 23% through 2017.”

In Mr. Boulton’s article he interviews a number of seasoned cloud users about their experiences with managing cloud costs. Several users highlight these common mistakes when switching to cloud:

  • Ordering too much power
  • Not accounting for off-hours
  • Failure to monitor computing cycles
  • Thinking computing cycles are free

Mr. Boulton writes, “the economic case for embracing cloud computing is based on the idea that consuming resources as you need them beats expending capital and maintenance budgets to fund a roomful of servers. But the ease with which departments can tap online resources with little more than a company credit card can lead to problems. Ordering too much computing power can be as easy as over ordering at a restaurant or leaving the water running at home.” According to an analyst interviewed in the article, the number of cloud servers could be reduced by around 60% because companies have ordered too many.

Experienced users manage cloud costs by monitoring computing cycles closely to spot underutilized servers; by programming software to detect off-peak times, such as weekends, so cloud systems can be turned off rather than running constantly; and by being wary of ordering “too much” power.
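As a rough sketch of that monitoring idea, with a hypothetical utilization threshold and off-peak window, a script might flag servers for shutdown like this:

```python
# Toy policy for the cost-saving practices above: flag a cloud server
# for power-down when it is underutilized or the time falls in an
# off-peak window (e.g. weekends). Threshold and window are invented.
from datetime import datetime

CPU_IDLE_THRESHOLD = 10.0  # percent; hypothetical cutoff

def should_power_down(avg_cpu_percent, timestamp, off_peak_days=("Sat", "Sun")):
    """Return True if the server looks idle or the time is off-peak."""
    underutilized = avg_cpu_percent < CPU_IDLE_THRESHOLD
    off_peak = timestamp.strftime("%a") in off_peak_days
    return underutilized or off_peak

print(should_power_down(55.0, datetime(2015, 3, 4)))  # busy weekday -> False
print(should_power_down(3.0, datetime(2015, 3, 4)))   # idle weekday -> True
print(should_power_down(55.0, datetime(2015, 3, 7)))  # Saturday -> True
```

A real deployment would feed this from the provider’s metrics service, but the decision logic is the same: measure, compare to a policy, and stop paying for cycles nobody is using.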

Share what steps your organization has taken to manage cloud costs.

Shadow IT?

by James Keating III, Business Technology Architect, Evolving Solutions

Shadow IT (or skunkworks, as it was referred to at a few places I have worked throughout my career) is a fact of IT life. For those who don’t know what it is, the definition from Wikipedia reads:

Shadow IT is a term often used to describe IT systems and IT solutions built and used inside organizations without explicit organizational approval.

In the past this meant people setting up clandestine labs, or putting virtualization software on company laptops to run a different operating system. Today, with the reality of cloud infrastructure and cloud services, a shadow IT initiative can be a full development environment, a complete business analytics stack or any number of things that used to require full IT involvement to have even the slimmest chance of happening. Now, a person with a credit card or expense account and an email address can stand up vast amounts of storage and computing power, software included. What is even more interesting is that real data, which may contain sensitive items, can quickly end up on these systems.

Whenever I have run into shadow IT in my career, it has always been the result of people wanting to do the right thing for the business but feeling the normal process would be too cumbersome to let them innovate or get the results they need. So rather than work within the system, people work around the system. From a business perspective, this is a double-edged sword. As a business I want innovation and better results to drive revenue, but I don’t want risks to my brand, product, data, etc. Often the culture stays relaxed about these fringe technology efforts until something bad happens. That is not a solid plan. Conversely, I have seen the lock-down-everything strategy as well; however, it often creates more shadow IT than it prevents, for the very reasons people choose to go shadow in the first place.

In the world of cloud, I am of the opinion that IT needs to be aware of the shadow IT going on in the cloud, but it must also understand that eliminating it may not be possible, or even the best result for the company. As such, I recommend visibility, and visibility tools, into cloud-delivered services that may be part of shadow IT efforts. For example, Amazon Web Services is likely being consumed in some manner at most medium to large companies, regardless of whether it is fully blessed by IT. This creates the data risk I mentioned above, and it also skews accounting and cost-of-IT metrics. Knowing what AWS usage is occurring, and how much, would benefit both IT and the company as a whole.
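As a toy illustration of that visibility idea (the departments, vendors and amounts are all invented), shadow cloud spend can be rolled up from expense records so IT and finance can see it:

```python
# Hypothetical sketch: surface shadow-IT cloud spend by rolling up
# cloud charges found on expense reports, grouped by department.

expense_lines = [
    {"dept": "Marketing", "vendor": "AWS", "amount": 420.00},
    {"dept": "R&D", "vendor": "AWS", "amount": 1310.50},
    {"dept": "Marketing", "vendor": "Travel", "amount": 980.00},
    {"dept": "R&D", "vendor": "AWS", "amount": 89.50},
]

def cloud_spend_by_dept(lines, vendor="AWS"):
    """Total charges from the named vendor, keyed by department."""
    totals = {}
    for line in lines:
        if line["vendor"] == vendor:
            totals[line["dept"]] = totals.get(line["dept"], 0.0) + line["amount"]
    return totals

print(cloud_spend_by_dept(expense_lines))
# {'Marketing': 420.0, 'R&D': 1400.0}
```

Even a crude roll-up like this makes the skewed cost-of-IT metrics visible, which is the first step toward managing shadow usage rather than pretending it does not exist.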

So how does an IT professional get this kind of visibility into AWS? The answer, actually, is to work with a reseller. I know this may seem counterintuitive: “I can get more visibility into AWS when I purchase it through a reseller?” The answer is yes, and if you want to know more about the mechanics of this, contact me or Evolving Solutions and we can go through it with you; the details are too much for a blog post. The second question I know many are asking is, “Sure, but how much more will this cost me over going direct?” Again the answer may seem contrary to conventional wisdom, but it should be $0.00 more.

If you are scratching your head at this point, I understand, and yes, that last paragraph is correct. This is best explained over the phone or in person. If you would like to know more, don’t hesitate to contact Evolving Solutions.


James Keating III is a Business Technology Architect for Evolving Solutions. James is a technology leader in the IT community and brings a combination of excellent technical skills and business acumen which allows him to guide customers in developing IT solutions to meet business requirements.

Cloud Computing Myths

In an article in Entrepreneur, Eric Dynowski walks us through several cloud computing myths:

  • The cloud is redundant. Not true. Mr. Dynowski points out that “The cloud is not a fail-safe solution out of the box. Any company using a cloud-service provider is responsible for designing and building a computer infrastructure that can be easily and quickly duplicated.” Just as with your own physical data center, a data recovery plan needs to be in place in case an outage occurs, and comprehensive data recovery and backup across your technologies need to be planned.
  • The cloud scales automatically. Not true. Cloud computing power can be scaled, but if applications are not built with scalability in mind you will still have a significant roadblock to growth. Review the applications that would touch the cloud. Mr. Dynowski recommends that applications be built modularly, with functionality broken into independent commands.
  • The cloud is not secure. Not true. Most cloud providers have completed more certifications and security measures to ensure data security than many non-cloud companies with physical data centers. Companies that work in the cloud take an active role with the cloud provider to mitigate risks together.
  • The cloud is cheap. True at first, but costs can add up without a clear vision of company cloud use. Before jumping in, align your cloud strategy among stakeholders, discuss use cases, understand the price points, and be clear about the visibility you will have, or will need to build, to monitor effectively.
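The modularity advice in the scaling myth above can be sketched as independent, stateless command handlers (all names here are hypothetical); because the handlers share no state, any number of worker copies can drain the same queue, which is what lets the application scale out with the cloud.

```python
# Sketch of "build modular": each piece of functionality is an
# independent, stateless command, so any worker instance can handle
# any request and more instances can simply be added to scale out.

def resize_image(task):
    return {"done": "resized", "id": task["id"]}

def send_email(task):
    return {"done": "emailed", "id": task["id"]}

# A dispatcher maps command names to handlers. Nothing is stored
# between calls, so worker copies are interchangeable.
HANDLERS = {"resize": resize_image, "email": send_email}

def worker(queue):
    return [HANDLERS[t["cmd"]](t) for t in queue]

print(worker([{"cmd": "resize", "id": 1}, {"cmd": "email", "id": 2}]))
```

A monolith where the image resizer reaches into the mailer’s internal state cannot be split across machines this way; the independent-command structure is what makes added cloud capacity usable.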

Cloud computing gives small- and medium-sized businesses a way to tap into systems that may be too expensive to build and scale on their own, and it also lets them tap into best-in-class technology, people and processes that may not be available in-house.