Mary Shacklett of TechRepublic recently remarked, “Storage is an often-overlooked area, but with the increase in big data, it’s worth paying attention to.” She explains that in her experience, instead of looking for underutilized data center storage, many managers simply purchase more. Big data injects more volume and complexity into your data center storage solution, and simply “buying more” will eventually stop being cost effective. Ms. Shacklett instead recommends that data center managers adopt practices that optimize storage in preparation for big data. Here are some tips to get started:
- Make sure you have a solid tiered data strategy. Not all data is equally important or needed at all times, so your data center storage solution should not provide the same level of access to everything. Tier your data based on importance and need, and assign the most appropriate storage solution to each tier
- Know your cloud storage costs. Make sure you have a clear understanding of what it costs to scale your cloud storage when demand is high, so you can manage spending proactively
- Manage your storage assets well. Inventory all of your storage assets. Chances are you will find not just underutilized storage, but storage that is not being used at all. An IT asset management system can be a great way to stay proactive
- Review your data retention policies. Once you start using big data, it can grow fast, and having a clear strategy for what needs to be kept and for how long will help you manage the volume
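To make the tiering idea concrete, here is a minimal sketch of how data sets might be mapped to storage tiers based on age and access frequency. The thresholds, tier names, and data sets below are illustrative assumptions, not values from the article; a real policy would reflect your own costs and access patterns.

```python
from dataclasses import dataclass

@dataclass
class DataSet:
    name: str
    age_days: int            # days since last modification
    accesses_per_month: int  # how often the data is read

def assign_tier(ds: DataSet) -> str:
    """Map a data set to a storage tier using simple, illustrative thresholds."""
    if ds.accesses_per_month >= 100 or ds.age_days <= 30:
        return "hot"      # fast, expensive storage (e.g. SSD)
    if ds.accesses_per_month >= 10:
        return "warm"     # standard disk storage
    if ds.age_days <= 365:
        return "cold"     # cheaper, slower storage
    return "archive"      # tape or archival cloud tier

# Hypothetical inventory to show how the rules apply
datasets = [
    DataSet("web-logs-current", age_days=5, accesses_per_month=500),
    DataSet("q2-sales-report", age_days=120, accesses_per_month=20),
    DataSet("2019-sensor-dump", age_days=1500, accesses_per_month=0),
]

for ds in datasets:
    print(f"{ds.name}: {assign_tier(ds)}")
# web-logs-current: hot
# q2-sales-report: warm
# 2019-sensor-dump: archive
```

A script like this, run against your storage inventory, can also feed your retention review: anything landing in the archive tier is a natural candidate for a keep-or-delete decision.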
Before you jump into big data, make sure your data center storage solution and policies are ready, your costs are transparent, and your space management processes are optimized.