The 2019 Cost of a Data Breach Report is out!

Posted on

Do you have $3.9 Million to cover a data breach?

The 2019 report is out and it’s not pretty! Ponemon Institute estimates that the average cost of a data breach will run your organization a cool $3.9 million! The bad news doesn’t stop there, either. Add to that lost time, strained resources and, probably the most glaring consequence, damage to your company’s reputation.

You can check out the full report here:

2019-cost-of-a-data-breach-report

And if you need sound advice on securing your data, or help preparing for a breach, give us a shout! Let’s Get to Work!


Business Partner Advisory Council Sessions Did Not Disappoint

Posted on

I was fortunate enough to represent Evolving Solutions at three IBM Business Partner Advisory Council sessions from June 11-13 in Charleston, SC.  Evolving Solutions is a member of the IBM Z, IBM Power Systems and IBM Storage advisory councils.

IBM’s messaging is clear – Hybrid Multi-cloud is the future of IT. As IT organizations move to Cloud 2.0, they are evaluating the merits of moving mission critical applications to the public cloud against leveraging existing assets on prem to support digital transformation. To quote an IBM Executive: “Cloud is a means to an end; not a strategy”.  Gartner predicts that organizations are likely to have 1/3 of workloads on prem, 1/3 of workloads in the public cloud and 1/3 of workloads in a hosted environment.  Organizations will need a solid Hybrid Multi-cloud strategy to support their business transformation.

IBM Z continues to be a growth platform for IBM and many clients continue to put their trust in the platform.  This is due to the high level of integrated security, automation and the platform’s design for data serving.  IBM’s investments are enabling the platform to become an integral part of any Hybrid Cloud environment through the development of RESTful APIs that allow seamless integration. The platform value of the IBM Z is being extended through open standards and tooling across all cloud consumption models.  In addition, IBM’s Tailored Fit Pricing delivers simplicity, transparency and predictability of pricing. Several options exist but the Enterprise Consumption Solution offers a cloud-like usage-based licensing model.

A highlight of the week was hearing Paul Zikopoulos, Vice President of Big Data and Cognitive Systems, speak on AI and Deep Learning. Data is the next natural resource for companies, yet only 20% of the world’s data is accessible today. The real value lies in uncovering insight from the 80% that is not. AI and Deep Learning will be the key to surfacing insights from today’s dark data. As Paul put it, “There is no AI without IA (Information Architecture).” Complex workflows can lead to data isolation, which is a key challenge in uncovering the value in that hidden data. A solid Information/Data Architecture enables the true benefit of AI. IBM’s Systems reference architecture for AI spans IBM PowerAI, IBM Spectrum Computing and IBM Storage.

There was big news in the world of SAP.  SAP HANA Enterprise Cloud (HEC) will soon offer IBM POWER9 as an option.  Why?  SAP HEC executives were looking for a more reliable and flexible architecture that allowed SAP to meet strict SLAs while adapting to highly dynamic environments. The POWER9 platform simplifies and accelerates SAP HANA deployments.  We are also excited to see POWER9 come to both the IBM Cloud and Google Cloud Platform this summer.  More to come in the next few months…

The three days were packed with great information and learning. Of course, we also enjoyed the social time with a great IBM team.  The IBM partnership is one of our longest and most-strategic, and we value the opportunity to provide insight into the company’s future systems strategies.


Jaime Gmach, President and CEO

Jaime Gmach co-founded Evolving Solutions in 1996 and continues to lead the company today as its President and CEO. Together with the extended Evolving Solutions team, Jaime has built the company into a business focused on creating enduring, open and trusted client relationships as a leading technology solution provider to businesses throughout North America.

Jaime has spent the past 30 years serving in various leadership roles within the technology industry. Jaime’s career began as a Systems Engineer with a Minneapolis-based professional services firm where he traveled throughout the world focusing on the implementation and support of mid-range compute and storage solutions. Daily face-to-face interaction with clients early in his professional career served as the inspiration for Jaime’s entrepreneurial passion and for his continued desire to work closely with clients.

Like what you read?  Follow Jaime on LinkedIn.

MinneAnalytics: Data Tech 2019

Posted on

We are so excited to be part of the popular practitioner’s field guide to analytics and emerging tech, returning to Normandale Partnership Center for its fifth straight year on May 30. We can’t wait to explore AI, Machine Learning, Deep Learning, NLP, Robotic Process Automation, Graph Technologies, and much more. Plus, we have an opportunity to speak about hybrid cloud and share some awesome data case studies pertaining to The Weather Company. Although the event is sold out, we will be posting updates on the sessions on our blog, so stay tuned for some great information!

Weather (data) for all seasons

Posted on

By Doug Polen, Software Sales Specialist

Many people don’t know this, but Weather Underground was a part of The Weather Company, which was acquired by IBM a few years ago. They had been offering an API that many customers and weather junkies had been using for quite some time to gather weather data for a vast array of applications.

To accommodate Weather Underground’s rapidly growing customer base, The Weather Company decided to move the Weather Underground platform to IBM’s enterprise API infrastructure, with the legacy API set to shut down this month.

There are several different flavors of the API that are being published. The scalability of IBM’s infrastructure will allow existing Weather Underground customers to continue to receive the consistent customer experience they are used to, as well as better serve the developers working on the next generation of weather data.
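For a sense of what consuming a weather-data API looks like in practice, here is a minimal sketch that parses a JSON observation payload and answers a simple "bring an umbrella?" question. The payload shape and field names (`precip_chance`, `temp_f`) are illustrative assumptions, not the actual Weather Company schema:

```python
import json

def needs_umbrella(payload: str, threshold: int = 40) -> bool:
    """Return True when the forecast's precipitation chance
    meets or exceeds the given percentage threshold."""
    observation = json.loads(payload)
    return observation["precip_chance"] >= threshold

# Hypothetical payload, shaped like a typical weather-API response.
sample = '{"temp_f": 28, "wind_mph": 15, "precip_chance": 70}'
print(needs_umbrella(sample))  # True -- a 70% chance: bring the umbrella
```

In a real integration the payload would come back from an authenticated HTTP request to the provider’s endpoint rather than a hard-coded string.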

As a result of this change, I was called upon to help with the surplus of inbound inquiries this decision created. Little did I know when I agreed to help work with these folks, the wealth of information I would take in about weather data and its seemingly endless use cases.

So why is this something that’s worth writing about? The Weather Company remains the world’s most accurate forecaster, and IBM is committed to ensuring its customers receive precise, accurate weather data at rapid speed.

Weather is something that impacts everyone daily in their personal lives. It’s a lot like the 82,000 memes you’ve seen on social media about the January 2019 Polar Vortex here in Minnesota … this guy survived his first -30° day ever, and here I am writing a weather blog in the aftermath.

Is it going to be sunny today?
What’s the wind chill today?
Will I need an umbrella?
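The wind chill question actually has a precise answer: the U.S. National Weather Service publishes a standard wind chill formula, valid for temperatures at or below 50°F and wind speeds above 3 mph. A quick sketch:

```python
def wind_chill(temp_f: float, wind_mph: float) -> float:
    """NWS wind chill index; valid for temp_f <= 50 and wind_mph > 3."""
    v = wind_mph ** 0.16
    return 35.74 + 0.6215 * temp_f - 35.75 * v + 0.4275 * temp_f * v

# A 0°F day with a 15 mph wind feels like about -19°F.
print(round(wind_chill(0, 15)))    # -19
print(round(wind_chill(-30, 10)))  # -53: Polar Vortex territory
```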

This information is invaluable from a business perspective.

Will a snow storm impact a shipment?
Does an electric utility require more power today because more people will be running their air conditioners?
What is the historical sun/wind/rain pattern that could impact agricultural output?

You get the point: the bottom line is that access to better weather information can have a real impact on how business decisions are made. Helping clients make their businesses better and more competitive through data is what I do. These new IBM offerings are both cost effective and robust.

Truth be told, in my sales career, I’ve never enjoyed the customer conversations more than I am right now. These are really fun conversations to have. I’ve learned about vineyards and how weather affects wine production, how off-shore oil rigs rely on accurate weather information to make decisions on worker safety and asset protection, and the ways public safety uses weather to predict what’s coming so they can best plan and schedule the resources necessary to keep public utilities going during weather events.

Speaking of fun, I’m positioning the Weather Company for Enterprise use, but I’ve got a new app on my phone that I simply love. WTForecast is a great app that I recently discovered (not affiliated with IBM). If you want a little humor (and let’s face it, when it’s -30° you need to laugh, albeit carefully, so as not to crack your face), be sure to give this one a try.

The point is, take a look at how decisions are made in your organization. Could better weather data help your company to make better choices to enhance profitability, make a more enjoyable workplace, or maybe even save an employee’s life? Let’s get to work today and uncover what we can do for your future.

#ISurvivedThePolarVortex

Doug Polen is a Software Sales Specialist at Evolving Solutions.  He has been with Evolving Solutions since 2015, after spending 16 years at IBM as a Software Client Leader and Client Executive.

He specializes in IBM Passport Advantage, Software as a Service, Analytics, Cloud, Cognitive, IoT, Security, Social & Weather solutions and holds numerous IBM software certifications.

Like what you read? Follow Doug on LinkedIn.

Tips to Optimize Data Center Storage for Big Data

Posted on

Mary Shacklett of TechRepublic recently remarked, “Storage is an often-overlooked area, but with the increase in big data, it’s worth paying attention to.” She further explains that in her experience, instead of looking for underutilized data center storage, many managers simply purchase more. Big data injects more volume and complexity into your data center storage solution, and simply “buying more” will eventually stop being cost effective. Ms. Shacklett instead recommends that data center managers adopt practices that optimize storage to prepare for big data. Here are some tips to get started:

  • Make sure you have a solid tiered data strategy. Not all data is equal or needed at all times, so your data center storage solution should not provide anytime access to everything. Tier your data based on importance and need, and assign the best storage tier to each.
  • Know your cloud storage costs. Make sure you have a clear understanding of what it costs to scale your cloud storage when demand is high, so you can manage spend.
  • Manage your storage assets well. Inventory all of your storage assets. Chances are you will find not just underutilized storage, but storage that is not being used at all. An IT asset management system can be a great way to stay proactive.
  • Review your data retention policies. Once you start to use big data it can grow fast, and a strong strategy for what needs to be kept, and for how long, will help you manage the volume.
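To make the tiering tip concrete, here is a minimal sketch of an age-based tiering policy. The tier names and the 30/180-day cutoffs are illustrative assumptions; a real policy should be driven by your actual access patterns and retention requirements:

```python
from datetime import date, timedelta

def assign_tier(last_accessed: date, today: date) -> str:
    """Illustrative policy: data untouched for 180+ days goes to
    archive, 30+ days to capacity storage; otherwise it stays hot."""
    age_days = (today - last_accessed).days
    if age_days >= 180:
        return "archive"      # cheapest tier, e.g. tape or cold object storage
    if age_days >= 30:
        return "capacity"     # mid-tier disk or standard object storage
    return "performance"      # flash/SSD for active data

today = date(2019, 7, 1)
print(assign_tier(today - timedelta(days=5), today))    # performance
print(assign_tier(today - timedelta(days=90), today))   # capacity
print(assign_tier(today - timedelta(days=400), today))  # archive
```

A script like this, run against last-access metadata from your asset inventory, is also a quick way to spot the underutilized and unused storage the third tip describes.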

Before you jump into big data, make sure your data center storage solution and policies are ready, your costs are transparent, and your space management and processes are optimized.