The Olympics in the Cloud

Posted on

With the 2016 Olympics in Rio fast approaching, let’s take a look at how cloud computing is impacting athletes around the world and the fans who cheer them on.

First, staying steady is everything when it comes to archery, and this summer fans will be able to see an archer’s heart rate in real time as they take aim, according to the official news outlet for the Rio 2016 Games. Wearables used for payment will also be part of Olympic venues: Visa is working with a Brazilian bank on not only a bracelet for fans but also a “payment ring” that will be given to and used by 45 sponsored athletes. Finally, to support the technology needs of the Games, the official IT partner of the Olympic Games has been working to migrate many of its operations to the cloud to reduce its hardware needs. For example, it expects to run 250 servers for Rio, down from the 719 servers used during the London 2012 Games.

Laura Gargolinski of IBM’s Thoughts on Cloud writes, “even more interesting is the way cloud technology is revolutionizing the way athletes (whether they are Rio-bound, or just regular people like you and me) eat, sleep, and train to improve their overall health, eliminate injury, and achieve optimum performance.” In her article she highlights Team USA Cycling, which has developed an application that provides real-time data analytics to cyclists while they work out or train. IBM’s application allows cyclists to make “on-the-spot” adjustments so they can optimize performance. Even “regular” athletes, as she describes herself, an avid runner, can take advantage of cloud-powered apps that track and help plan their training regimens.

Cloud technology not only supports apps and wearables that help athletes perform better, but also opens up new experiences for fans and makes supporting tech operations more efficient.

Why a Cloud Strategy is Important

Posted on

“Survey finds companies with cohesive cloud strategies have less trouble with cost overruns and staffing shortages,” reports Joe McKendrick on ZDNet. Despite this finding, a recent Softchoice survey of 500 executives also found that many companies lack a cohesive cloud strategy:

  • 54% reported their teams are struggling to form a cloud strategy
  • 52% reported that they lack any formalized strategy

Without a cohesive strategy, and with the cloud “question” entering into so many functions, your organization could have small cloud plans forming right now across many groups, each with different goals, risks and needs, and you may not know until it is too late. Once in the cloud, executives who report not having a strong strategy often find they struggle with:

  • Managing the cloud budget
  • Choosing and managing the cloud model
  • Keeping up with the cloud skills gap

Sure, pulling it all together may seem daunting, but whether you are still just talking about the cloud or are currently testing it, take a step back now and make sure you pull the pieces together. Here are some tips from Mr. McKendrick’s article to get started:

  • Know where you are now. Review your current suite of applications and systems. What are their functions, dependencies, organizational alignments, and service-level and security needs? Talk with business groups to understand their needs and frustrations with current processes as well
  • Analyze the cloud you do have. Already in the cloud? Make sure you know what is going on within it so you can manage service levels and, most importantly, budgets. Also, don’t forget to look into who is using it and how. Cloud analytics tools will help you meet and forecast your budget needs as well as manage scale

Finally, don’t forget that the cloud can bring a learning curve. Invest in your employees and bring in outside expertise to ensure that your organization chooses the right cloud to fit your needs.

The Language of IT

Posted on

by James Keating III, Business Technology Architect, Evolving Solutions

I recently had another eye-opening moment with language and how one group’s vernacular can be, and often is, different from another’s. This can pose a huge challenge in the fast-changing world of IT. I have always been aware of this dilemma, and for much of my career I have endeavored to use this awareness to help me be better at the job I am doing. Primarily, in my past this has manifested in the fact that while I am a technical person by nature and have spent decades in the IT space as an administrator and technical manager, I have always come at the IT problem from a business perspective. I was more likely in college to attend an Economics presentation than one on how to program in Pascal (yes, they did teach Pascal when I was in college). I had a few small businesses that I owned and managed during my early days, and many of them were technical in nature, ranging from using computers to perform color separations for screen printing, to digital music and video production. Granted, they were completely in the dark ages by today’s standards, as most of what I was selling can now be done on an iPhone (we are talking iPhone 3, not the latest and greatest even). But what I learned there I applied to working in IT, and I became a technical-to-business translator of sorts.

This is because the business often thinks in terms of risk, risk mitigation, investments and opportunity costs, while IT is firmly planted in megabits per second, disk and CPU performance and overall application performance. This gap in perspective becomes easier to see the further down the IT stack one goes. As a former data center manager, I found it more difficult to get non-technical business folks to understand risks at the data center level than I did as a storage manager. This was because the data center always looked tidy, was filled with chilled air and rows and rows of uniform-looking racks. It just didn’t look like it had much risk on the surface.

I believe the emergence of cloud computing has amplified this problem of business and IT not speaking the same language. If I were to take 10 people (5 business types and 5 technical types) and ask them what they believe the cloud could do for the company they work for and what it really means as a concept, I would get at least 10 different definitions, and you would not always be able to tell which perspective was from a technical person and which was from a business person. This is because much of the IT landscape has become charged with buzzwords that evoke emotional responses that are not always based upon confirmed fact, but oftentimes hearsay and conjecture. My point is that if we as stewards of IT are going to be successful in leading IT operations into the future using all of the available tools and advances, we are going to need to work from a known good set of facts. In the past, when I found myself in situations like this, I would level-set and look at what the desired results are before I would even allow myself to think technically. Basically: don’t even attempt to solve the problem until the goals, issues and requirements are all out on the table.

In today’s fast-changing IT world, it is more important than ever to define the desired outcomes before working to solve or build anything. Since nobody can be an expert in everything, it is also more critical than ever, in my opinion, to use partners to help define goals and requirements, and then work with the best minds to define, architect and build that next level of IT that all businesses want to see now that they have been hearing about the great abilities and promise cloud computing holds. To that end, I have spent the last three years creating a solid process to enable what I have been talking about: IT design based upon goals and outcomes rather than built around the features and feature sets of various software or cloud services. I call these Cloud Potential Studies, and Evolving Solutions offers them. If you would like to learn more about Cloud Potential Studies and how they could help create a more cohesive IT strategy, feel free to contact Evolving Solutions.

_______________

James Keating III is a Business Technology Architect for Evolving Solutions. James is a technology leader in the IT community and brings a combination of excellent technical skills and business acumen which allows him to guide customers in developing IT solutions to meet business requirements.

Operations vs Project: The Secret Innovation Killer

Posted on

by James Keating III, Business Technology Architect, Evolving Solutions

Over the years I have worked for both large enterprises and small businesses, and all of them had an IT dilemma that was rarely spoken about but was just a fact of life: the operations-versus-projects tension that all IT teams face. Doing operations well is required to maintain the current business, but teams also need to get new project work done, which is where innovation and business growth happen. The issue is that in many IT shops the technical teams split time between both sides (operations and projects), and this setup often leads to either operational corners being cut to meet project demand, or projects not happening, which means a loss of innovation and improvement. A side effect of this setup is that it is difficult to really know how much innovation is being lost, or how much risk is being put onto operations, as true accounting for either side of the equation is difficult at best in this blended model.

To make this concept easier to see, let’s take a small business that has an IT team of 3 people. This team of three is responsible for all of the following operational areas:

  • System administration (Operating System and Hypervisor)
  • Storage administration
  • Backup and archive administration
  • On-call support frontline
  • On-call support backline
  • Data center operations
  • Back office software support
  • Data security (firewall and virus etc)
  • Compliance and change control
  • Mobility and support
  • All project work requiring IT assistance
  • Constant improvement of IT process

When you list it out like this, you can quickly see that a team of 3 will find it difficult, even with a very small and relatively static environment, to do all of those functions at a high level. I would argue it would be difficult for them to do even the majority of those functions at an adequate level. So the outcome is that many things that could drive more business, improve operations, reduce risk or contribute directly to the bottom line don’t get done in favor of the fire of the moment, so to speak. It is like the old saying: “It is hard to remember the goal is to drain the swamp when you are up to your ears in alligators.”

So how can a business help this situation without a lot of process change and tons of additional headcount? I would suggest taking two or three of the items off the list, in a manner of speaking, by using cloud or as-a-service offerings. The easiest of these is likely hosted Exchange. It is tried and true, relatively inexpensive, can free up an employee’s time to work on other areas, and provides some risk mitigation to operations, as hosted Exchange comes with a 99.9% uptime SLA. Integration and implementation of a service like hosted Exchange is something Evolving Solutions can help you with, freeing up those cycles near instantly for your administrators.

For more details on what an engagement would look like and pricing contact Evolving Solutions today.


Be prepared for cloud migration

Posted on

Sharon Gaudin of IBM’s Thoughts on Cloud recently interviewed leaders with enterprise cloud migration experience about the “pitfalls” to watch and prepare for.

Legacy systems were at the top of the list. Ms. Gaudin cites The Weather Company, which has migrated about 80% of its services; the remaining 20% are the tougher legacy systems. Enterprises need to be ready to make decisions about legacy systems: their function and viability over the long term.

Be careful with mission-critical services, data and applications. Those experienced in cloud deployment still recommend taking it one step at a time when migrating. Tackle the systems, data and applications that were made for the cloud. Start small to gain experience, and take incremental steps to see how the cloud fits and reacts with the whole system over time.

Don’t get stuck on the what-ifs. Some companies find themselves stuck in the planning stage, never moving to deployment. This could be due to tackling too big a project, a lack of experience or even choosing the wrong starting point. The key, again, is to start with a small cloud project. Look for the low-hanging fruit, what is best suited for the cloud, and start there to gain experience and test the waters.

Know how you will connect with the cloud. Ms. Gaudin speaks with John Trujillo, VP at Pacific Life, who adds, “connecting to [the cloud] through your VPN is fine, but if you have offices in Chicago and you’re connecting with Virginia, that latency can become frustrating, and that latency can become a little obnoxious. Make sure you pay a lot of attention to that networking component. How you get there is a big part of it.”
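That latency concern is easy to sanity-check before you commit: measure connect time from each office to the endpoint in question. A minimal sketch in Python (the host names in the commented example are placeholders for your own gateways, not real services):

```python
import socket
import time

def tcp_connect_ms(host: str, port: int = 443, samples: int = 5) -> float:
    """Return the median TCP connect time to host:port in milliseconds."""
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        # A plain TCP handshake is a reasonable proxy for network round-trip.
        with socket.create_connection((host, port), timeout=5):
            pass
        times.append((time.perf_counter() - start) * 1000)
    return sorted(times)[len(times) // 2]

# Example: compare latency from this office to two candidate endpoints
# (placeholder host names -- substitute your own):
# for host in ("vpn-gateway.example.com", "cloud-region-east.example.com"):
#     print(host, round(tcp_connect_ms(host), 1), "ms")
```

Running a check like this from each office, at different times of day, gives you hard numbers for the “networking component” before users feel the frustration.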

Finally, before migrating to the cloud, enterprises must think about the people element: What new roles will be created, what skills are needed, and how do you position your people in the best way possible to capitalize on the benefits of the cloud?

Industry News Roundup: The Cost of Cloud

Posted on

To keep you on top of the latest and greatest cloud technology news, we’ve rounded up a list of recent topics.

The Cost of Cloud

Forbes recently reported on 451 Research’s update to its cloud price index. What did they find? For those entering the cloud or testing the waters, short-term prices fell 2% from October 2014, but the firm found that for companies willing to negotiate long-term deals, prices fell by almost 12%. The index encompasses compute, storage, networking and higher-level services; 451 Research likes to focus at the “portfolio” level. Here is a look at the price index:

[Image: 451 Research cloud price index chart. Source: forbes.com]

Cloud Native Computing Foundation

The Linux Foundation recently announced the formation of the Cloud Native Computing Foundation, reports Ben Kepes for Network World. The goals of the foundation include:

  • Advance the methods used to build cloud-native applications
  • Align the different open source initiatives related to cloud apps
  • Approach open source at the orchestration level

Members include some of the biggest names out there such as IBM, Goldman Sachs, Google, and VMware just to name a few. Mr. Kepes reports, “In terms of its mandate, The Cloud Native Computing Foundation will be responsible for stewardship of the projects, fostering growth and evolution of the ecosystem, promoting the technologies and serving the community by making the technology accessible and widely adopted. On the all-important governance aspects, The Foundation will include a Technical Oversight Committee and an End User Advisory board to ensure alignment of needs between the technical and end-user communities.”

Cloud Security of the Future

Andre Froehlich of InformationWeek walks us through several new cloud security developments that will become more and more mainstream:

  • Cloud for security threat detection
  • Cloud for self-healing and retroactive malware protection
  • SDN to improve cloud network visibility
  • Cloud as a first line of defense for DDoS

Mr. Froehlich writes, “In many ways, cloud security is gaining in strength based on a seemingly inherent weakness. Cloud service providers are in a unique position to absorb vast amounts of data. Because large clouds are geographically dispersed in data centers around the globe, they can pull in all kinds of security intelligence as data flows in and out of the cloud. This intelligence can then be used to track security threats and stop them far more quickly.”

The Cloud to Innovate

Posted on

Sorting through which cloud benefits might mean the most to your company, as well as talking through how best to use the cloud, can sometimes be overwhelming. Today let’s look at thoughts from Rashik Parmar of IBM’s Thoughts on Cloud. He studied more than 100 cloud projects of all different sizes and found five ways in which most companies use the cloud to innovate and benefit:

  • Use cloud to offer your services through an application program interface (API). Mr. Parmar writes, “The first and easiest way cloud can help you innovate is to use cloud as a route to market. APIs are the digital services you provide. Partners and customers can pay for these services as they use the APIs. PayPal is an ideal example, as a wide range of mobile apps and web-based businesses use its API.”
  • Use the cloud to capture new data
  • Use the cloud to create a “data-as-a-service” revenue opportunity. Mr. Parmar explains, “Every large enterprise builds vast lakes of data to support their business and allow its leaders to make smart decisions. However, not all of this data is confidential to the business and could be valuable to others. In many cases, organizations would be willing to pay for certain data. Cloud allows you to provide data as a service in a manageable way, which helps organizations capture value. There are risks to consider when sharing data, foremost, privacy and ethical use.”
  • Use the cloud to integrate data from different sources and locations
  • Use the cloud to go more “digital.” Examples from Mr. Parmar’s article include digital blueprints and learning that can be moved “online” by using the cloud versus more traditional approaches
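The first point above, offering your services through an API, can be smaller than it sounds: at its core it is just an HTTP endpoint returning data a partner can pay to call. Here is a hedged sketch using only the Python standard library; the `/v1/quote` path and the payload are invented for illustration, not taken from any real service:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class QuoteAPI(BaseHTTPRequestHandler):
    """Toy API: GET /v1/quote returns a JSON price quote."""

    def do_GET(self):
        if self.path == "/v1/quote":
            body = json.dumps({"product": "widget", "price_usd": 9.99}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        # Silence per-request logging in this sketch.
        pass

def make_server(port: int = 0) -> HTTPServer:
    # port=0 lets the OS pick a free port; a real deployment would bind 443
    # behind TLS and add authentication and metering.
    return HTTPServer(("127.0.0.1", port), QuoteAPI)
```

Wrap an endpoint like this with authentication and per-call metering, and you have the “route to market” Mr. Parmar describes: partners pay as they use the API.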

What on this list captures your attention as a way to innovate with cloud at your company?

How are you using the cloud?

Posted on

Depending on the size of your company, cloud use can vary. John Mason of IBM’s Thoughts on Cloud focuses on small and medium-sized businesses (SMBs) and breaks down their top five uses of cloud computing this year.

  • Hybrid. SMBs are using this cloud technology to bring portability to both back- and front-office applications
  • Infrastructure. Small and mid-sized businesses can gain flexibility to scale up and down with demand without the additional hardware costs typically incurred
  • Test & Development. Mr. Mason writes, “Open, cloud-based environments are empowering SMBs to quickly innovate, test and launch new applications and solutions, cutting deployment times from months to hours or even minutes in many cases. Even the smallest developer teams are creating business applications with ease and speed, helping them to better serve their market and compete on a global scale.”
  • Big Data and Analytics. The cloud has opened up the power of big data for SMBs, who can tap into analytics services and data that allow them to better serve their customers
  • Mobility. Another technology the cloud has helped make more affordable for SMBs. Using the cloud, employees can access data and apps on the go, making them more productive

Small Business Computing also notes cloud disaster recovery as an important cloud use, “moving your business data to the cloud can make disaster recovery (DR)—i.e., retrieving data in the event of a hardware compromise—easier and less expensive. You can even set up your system to back up data automatically to ensure you’ll be able to recover the most up-to-date information in case of emergency.”
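The automatic-backup idea above can be sketched in a few lines: build a timestamped archive of a directory, then hand it to whatever upload mechanism your cloud provider offers. The sketch below stops at the archive step, since the upload call is provider-specific (and any function name for it would be an invention):

```python
import tarfile
import time
from pathlib import Path

def make_backup_archive(source: Path, dest_dir: Path) -> Path:
    """Create a timestamped .tar.gz of `source` inside `dest_dir` and
    return the archive path."""
    dest_dir.mkdir(parents=True, exist_ok=True)
    stamp = time.strftime("%Y%m%d-%H%M%S")
    archive = dest_dir / f"{source.name}-{stamp}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        # arcname keeps paths inside the archive relative to the source dir.
        tar.add(source, arcname=source.name)
    return archive

# In a real setup you would schedule this (cron, Task Scheduler, etc.) and
# then upload the returned archive with your provider's SDK or CLI.
```

Run on a schedule, this gives you the “most up-to-date information in case of emergency” that Small Business Computing describes; the timestamped names also make point-in-time recovery straightforward.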

Not using the cloud yet? What use ideas from this post would be a good starting place for your company?

Cloud Security Myths

Posted on

Jonha Revesencio writes for The Huffington Post, “In everyday language, ‘cloud’ suggests something porous, and the word ‘cloudy’ means murky and nebulous. The very term, in addition to the complexity of subjects like virtualization, makes cloud computing a tough concept for non-IT people.” She also recognizes that because the technical pieces of the cloud can be hard for business to understand, it can lead to increased fear that the cloud is unsafe, especially in light of the numerous data breaches in the news in recent years. Ms. Revesencio suggests that IT help “debunk” these common cloud security myths.

Start with the common misconception that “cloud environments are easier to attack.” Not true: security depends largely on the processes and procedures you have in place, whether on-premises or in the cloud. Ms. Revesencio points out that a cloud service provider needs top-of-the-line security measures in place in order to gain and keep its clients’ business. If you are a small company with limited IT resources, chances are that many cloud providers will have much more advanced security practices and more expertise to draw on to stay up to date. To sum it up, the cloud is not more or less vulnerable than your internal systems; what matters are the protections and processes you have in place to prevent data security threats.

Next myth: you can’t control where data lives in the cloud. Ms. Revesencio writes, “knowing where your data lives requires transparency from your provider. You should know where your data travels and how it’s protected both at rest and in transit.” If you are a global company with data traveling around the world, you are still responsible for knowing where it is, how it is handled and what regulations need to be followed. Ms. Revesencio notes that many times a global cloud service provider can offer better transparency than piecing together a network of local providers.

Finally, some people may think it is easy for “cloud tenants” to spy on each other. Even though the public cloud banks on shared resources, Ms. Revesencio points out that “virtualization provides strong partitions between tenants.” Companies should instead define which data provides a competitive edge and which is less sensitive. The most essential information could be stored on a private cloud, whereas other, less sensitive data could be stored in the public cloud.

What cloud security myths do you commonly run into?

Data Centers of the Future

Posted on

IDG Enterprises gathered a number of articles on data center management trends. Let’s take a look at some of the highlights:

The Fuel Behind Data Center Traffic Growth

Cloud computing is driving data center traffic growth. Over the next four years, data center traffic is expected to triple, reports Sharon Gaudin, largely due to growth in cloud computing. In her article, Ms. Gaudin reports that by 2018 the cloud will account for 76% of data center traffic. Other industry findings report a 50% growth rate in public cloud computing, as well as 40% growth in hybrid and 45% growth in private clouds. This growth will require data centers to perform even more efficiently.

Virtualization Is Key

Eric Knorr writes, “the technology foundation of cloud boils down to one seminal advance: virtualization.” He further adds, “virtualization abstracts the resources delivered by hardware infrastructure from the hardware itself. The resources become elastic: ‘defined’ by software rather than by admins crawling around the data center rerouting cables, standing up new boxes, or flipping physical switches.” Taking virtualization one step further is the software-defined data center, which opens up the time-saving functionality to the storage and network side of IT.

Software-Defined Data Centers

Finally, how does one get to a software-defined data center (SDDC)? Brandon Butler presents these essential first steps when starting down the path to SDDC:

  • Make sure you have the ability to manage capacity and make sure you have enough capacity to meet your organization’s needs
  • Make sure your platform can support multi-virtualization and multi-cloud vendors
  • Make sure your configuration management process is automated not manual
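The last step above, automated rather than manual configuration management, can be illustrated with a tiny drift check: compare each host’s actual settings against a desired baseline and report any differences. A minimal sketch (the setting names are invented for illustration):

```python
def config_drift(desired: dict, actual: dict) -> dict:
    """Return {key: (desired_value, actual_value)} for every setting that
    differs from, or is missing in, the actual configuration."""
    drift = {}
    for key, want in desired.items():
        have = actual.get(key)
        if have != want:
            drift[key] = (want, have)
    return drift

# Hypothetical baseline and one host's reported settings:
baseline = {"ntp_server": "time.internal", "selinux": "enforcing", "swap_gb": 4}
host = {"ntp_server": "time.internal", "selinux": "permissive"}
# config_drift(baseline, host) flags the selinux mismatch and the
# missing swap_gb setting.
```

Full-blown configuration management tools do far more (remediation, ordering, templating), but the core idea an SDDC depends on is exactly this: a declared desired state, checked and enforced by software rather than by hand.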

What trends are driving change in your data center?