Subscribe to the podcast on iTunes. Read a full transcript or download a copy.

If, as the adage goes, you should fight fire with fire, then perhaps it's equally justified to fight big-data optimization requirements with big data.

It turns out that high-performing, cost-effective big-data processing helps make the best use of dynamic storage resources: by taking in all the relevant storage activity data, analyzing it, and then making the best real-time choices for dynamic hybrid storage optimization.

In other words, big data can be exploited to better manage complex data and storage. The concept, while tricky at first, is powerful and, I believe, a harbinger of what we're going to see more of: bringing high intelligence to bear on many more services, products, and machines.

To explore how such big-data analysis delivers data-storage efficiency, BriefingsDirect recently sat down with optimized hybrid storage provider Nimble Storage to hear its story on the use of HP Vertica as its data-analysis platform of choice. Yes, it's the same Nimble that last month had a highly successful IPO. The expert is Larry Lancaster, Chief Data Scientist at Nimble Storage Inc. in San Jose, California. The discussion is moderated by Dana Gardner, Principal Analyst at Interarbor Solutions.

Subscribe to the podcast on iTunes. Read a full transcript or download a copy. Sponsor: HP.