Real-Time Analytics

As more real-world implementations of in-memory analytics happen, a clearer picture of its future will emerge

That organizations are staring into the face of huge piles of structured and unstructured data is now a given. Barring a few laggards, no one is debating the reality of big data or the need to address the issues surrounding it.

The big questions now bandied about in enterprise IT are: How do we store and manage all that data? What are the best ways to process it? How can it be analyzed to extract maximum business benefits and useful insights?

One answer to the puzzle is in-memory technology. Put simply, it means storing and accessing humongous amounts of data in random access memory itself rather than fetching it from disk. In-memory computing is nothing new and has been around for decades, but it is only now that the multiple forces of technology advances, economic feasibility and business needs are coming together to make it hot, relevant and available.
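
To make the idea concrete, here is a minimal sketch in Python, purely illustrative and not modelled on any particular vendor's product, of the difference between re-reading data from disk on every query and keeping it resident in RAM:

    import csv
    import time

    # Hypothetical data file: rows of (customer_id, total_sales) in a CSV on disk.

    def query_from_disk(path, key):
        # Disk-based approach: re-scan the file for every query,
        # paying the I/O cost each time.
        with open(path, newline="") as f:
            for row in csv.reader(f):
                if row[0] == key:
                    return row[1]
        return None

    def load_into_memory(path):
        # In-memory approach: pay the I/O cost once, then keep the
        # whole data set in a RAM-resident dictionary.
        with open(path, newline="") as f:
            return {row[0]: row[1] for row in csv.reader(f)}

    table = load_into_memory("sales.csv")   # one-time load from disk

    start = time.perf_counter()
    result = table.get("customer-42")       # subsequent lookups hit RAM only
    print(result, "looked up in", time.perf_counter() - start, "seconds")

Real in-memory platforms layer columnar storage, compression and parallelism on top of this basic idea, but the trade-off is the same: load once, then query at memory speed.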

In-memory analytics got a big boost when SAP started painting the town red with HANA (High-Performance Analytic Appliance) in late 2010. About a year later, Oracle launched its response, Exalytics. Besides these two, IBM, SAS Institute and many other vendors are in the fray.

In-memory analytics solutions are still considered relatively expensive and the market is young, but analysts are predicting quick uptake. According to a Gartner report, by 2015 as many as 35% of mid-size and large organizations will have adopted some form of in-memory computing, up from less than 10% in 2012.

Among the drivers of this technology are near real-time analysis speeds (batch jobs that once took hours can now finish in seconds), the growing requirement for 24×7 business operations, and the availability of in-memory-enabled business applications (according to Gartner, over 50 software vendors provide in-memory computing application infrastructure).

At the same time, several issues could dampen the technology's mainstream adoption: a lack of standards, a paucity of skills, security concerns, the architectural complexity brought on by new appliances and software, and monitoring and management challenges.

Nevertheless, real-world implementations of in-memory analytics are happening across industries. So, in addition to what the vendors have to offer, how CIOs and IT managers deploy and tune their systems, and what lessons they learn and share with their peers, will also have a significant bearing on the future of this technology.

Despite the quick start and the hype, it is still early days for in-memory.

– Sanjay Gupta
Editor, Express Computer


