It's now or never: the use case for in-memory platforms for Business Analytics

In another article I read that "information is the new global currency". With today's digital transformation and everything it brings in its wake, such as IoT and Big Data, insight into information is more of a necessity than it has ever been. Structured and unstructured data pours in massively to all of us who operate in new markets in a networked economy. Insights are today the heartbeat of a company's success; the new currency, full stop. At any place and any moment, we need to interact with business analytics in search of new and better insights.

To deliver on our requirements, we cannot do without in-memory platforms with dazzling performance and capabilities. It is now or never. In-memory computing is a technology that analyzes massive quantities of data in local memory so that the results of complex analysis and transactions are available at your fingertips, and business decisions can be executed without delay.

In-memory technology:

The primary capabilities of in-memory technology are to store information in a primarily columnar format that compresses massive amounts of information into main memory, to utilize parallel processing across multiple cores, and to move data-intensive calculations from the application layer into the database layer for even faster processing. Since all the detailed data is available in main memory and processed on the fly, there is almost no need for aggregates or materialized views, fundamentally simplifying the architecture and hence reducing latency, complexity and cost.
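As a minimal sketch of why a columnar, in-memory layout pays off for analytics, the following Python snippet (using NumPy; the sales columns and sizes are invented for illustration) aggregates a million detail rows per region on the fly, with no pre-built aggregate or materialized view:

```python
import numpy as np

# Hypothetical sales table held fully in memory, one array per column
# (columnar layout) instead of one record per row.
n = 1_000_000
rng = np.random.default_rng(42)
region = rng.integers(0, 4, size=n)   # region-id column
revenue = rng.random(n) * 100.0       # revenue column

# Scanning one contiguous column vectorizes well, so the total per
# region is computed on the fly rather than read from an aggregate.
totals = np.bincount(region, weights=revenue)
print(totals.shape)  # one total per region
```

Because the scan over the detail data is this cheap, the answer can be recomputed every time the question is asked, which is what removes the need for pre-aggregation layers.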

Let us dive a little deeper into the use cases for in-memory platforms, and let's do so from a business analytics perspective. Why is it that business analytics in particular benefits from in-memory computing? And how is it that business analytics changes its behavior so significantly when connected to in-memory platforms?

Insight into the use cases for in-memory platforms:

Operational Intelligence: real-time insight

Simply knowing when it happens is a use case in itself. Being able to respond immediately to operational activity and external information allows organizations to differentiate themselves from their competitors, and also to respond to market fluctuations. The impact is huge and game-changing. Real-time analytics quickly become a commodity as decision makers act now, not tomorrow. They can only do so with in-memory computing delivering the calculation and processing power required. Real-time insight opens the door to operational business intelligence: observing, interpreting and acting upon time-driven behavior, immediately predicting the impact of a lost supplier, and projecting rolling planning models on the fly when new product lines are introduced. These are just basic examples of what operational business intelligence, driven by in-memory computing, will do.

Dive deeper

Ask any question on any data; meaning that we can now offer the full scope of operational data for analytical insights. In the past we had to choose between speed and performance on the one hand and a more granular picture of the data on the other. With in-memory platforms we can have both: the lowest level of detail available to end users with lightning performance. Governance models in in-memory systems also respect the principle of "Trusted Data Discovery": any type of user with any type of BI component, whether it is reporting, dashboarding or self-service, has fully governed access to the same grain of data, allowing for far better integration between the components.

Act broadly: enlarge the scope of insights

Manage large volumes of data: now that Big Data archives and newly connected networks (e.g. IoT, social, …) become available through technologies like Hadoop, MongoDB, Spark or Cloudera, we need to access them and be able to gain insights quickly. The accompanying data structures are unfamiliar or even unknown to us. Yet enlarging the scope to these new data providers is crucial if we want to respond immediately to market or customer behavior. Only business analytics provides the capabilities to unravel valuable insights from these new data providers, provided it is processed by in-memory platforms; the calculations and correlations are so large that "classical" platforms will fail. Note also that this use case of acting broadly does not stand on its own, but typically runs in parallel with the others mentioned in this article.


Ask a question, get an answer. Interacting with data and getting insights on the fly is the new era of business intelligence. Data comes in on the fly; it is often not our core data but may come from outside. Still, we immediately need insights from it, so we slice, dice, correlate, compare, project or predict on the fly… on the fly… yes, on the fly! I have said before that "your meeting will never be the same": we will use business analytics in our meetings, with storytelling techniques, to make decisions. We have access to all the detailed information needed and generate insights at our fingertips. Don't believe me? Have a look at the boardroom of the future, or should I say, the boardroom of today.
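As a toy illustration of slicing and dicing at the moment a question is asked, the following Python sketch (using pandas; the table and column names are invented for illustration) pivots raw detail rows into an ad-hoc summary instead of reading from a pre-built cube:

```python
import pandas as pd

# Hypothetical detail-level data arriving on the fly; the columns
# (region, product, revenue) are illustrative, not from a real system.
df = pd.DataFrame({
    "region":  ["EU", "EU", "US", "US", "US"],
    "product": ["A",  "B",  "A",  "A",  "B"],
    "revenue": [10.0, 20.0, 5.0, 7.0, 3.0],
})

# Slice and dice without a pre-built cube: pivot the detail rows into
# an ad-hoc region-by-product summary the moment the question is asked.
cube = df.pivot_table(index="region", columns="product",
                      values="revenue", aggfunc="sum", fill_value=0.0)
print(cube.loc["US", "A"])  # → 12.0
```

The same detail rows can be re-pivoted along any other dimension in the next breath, which is what makes this style of questioning viable in a live meeting.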

Earn your users' recognition

Everyone reading this article who has been part of a BICC or similar knows: you can have the best-looking and most adequate reports and dashboards, but if your users have to wait for them to respond, they will not use them. They simply won't. You will only earn your users' recognition for your business intelligence environment when that same environment excels not only in accuracy but also in response times.

Time to deployment for your business intelligence decreases dramatically with in-memory platforms. Modeling and report and dashboard building are easier than ever, since we can drop many aggregation models, loads and unloads. Latency and complexity are reduced dramatically. New BI requests are now deployed far faster. Your users will adopt the business intelligence environment better with in-memory platforms.
