Big data seems to be the hot topic of conversation lately as it pertains to business intelligence. We are often asked what organizations should be aware of as they explore this avenue.
The obvious perils of big data are performance concerns: can you process and analyze data quickly and efficiently enough to provide value? Slow data analysis and processing will render any big data initiative DOA.
Ideally, a BI solution sitting on fast, powerful hardware is robust enough to allow for this expeditious analysis. Too often in implementations, instead of doing what should be the norm (writing a report or developing a dashboard and extending business intelligence to the masses seamlessly), we spend our time optimizing the SQL and the database: tuning, tuning and more tuning, simply to secure adequate performance. This takes away from the core concept of putting BI in the hands of the end users. But the brutal truth is that if it doesn't perform quickly, end users will not adopt it, no matter what information you're promising to give them.
Oracle has spent a lot of time and effort attacking the big data problem with its Exalytics solution. Simply defined, Exalytics optimizes hardware and BI software to work together to process enormous amounts of data quickly and efficiently. I have not had the opportunity to work with this solution personally, but if it achieves the results being touted, it represents a real opportunity to deploy a combined BI and hardware solution that puts the power of BI back into the hands of the end user: less time tuning, and more time letting end users analyze, which is what decision enablement is all about.
There is a lot of information about the Exalytics solution and Oracle's broader big data offerings, both from Oracle itself and from analysts, which can be found here. Give it a look.
Of course, if you have questions, you know the drill: ask away.
MIPRO Consulting main website.