June 25, 2015

Going In-Memory? Consider These Three Things First

Paul Birney

Businesses large and small know that if they aren’t using the data they collect in some intelligent way, they will be out of business within a few short years. Quickly turning data into business insights is downright essential to winning and retaining customers as well as delivering new products and services.

You know how online retailers have those flash sales and replace items that have sold out with similar ones as shoppers are viewing the screen? They are able to track what is being sold in real time while providing the shopper with a better customer experience that lets them see what is actually available for purchase.

So why aren’t more businesses able to turn information into business advantage in real time? Big data environments are extremely complex: businesses wrestle with multiple instances, multiple tools, and legacy databases, all of which add cost and make it hard to get consistent access to the data they need.

But in-memory computing is changing the way businesses process information. It stores data in main memory (RAM) rather than on disk, which eliminates slow disk seeks and cuts the CPU work needed to fetch data during queries, even against complex and often massive data sets. The result: much faster performance. What once took hours or days to process now takes minutes or seconds. Business leads can make faster decisions based on real-time information, IT can quickly deliver analytics and insights without impacting day-to-day operations, and database managers spend less time moving and managing data and more time analyzing it.
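To see that gap in miniature, here is a toy Python sketch, purely illustrative and not SAP HANA code (the file name and record count are invented for the example), that times a lookup against a file on disk versus the same data already resident in RAM:

import json
import os
import tempfile
import time

# Build a toy "table" of 200,000 key/value records (sizes are illustrative).
records = {f"key{i}": {"id": i, "value": i * 2} for i in range(200_000)}

# Persist the table to disk, one JSON record per line.
path = os.path.join(tempfile.gettempdir(), "toy_table.jsonl")
with open(path, "w") as f:
    for key, row in records.items():
        f.write(json.dumps({"key": key, **row}) + "\n")

target = "key199999"  # the last record, forcing a full scan of the file

# Disk-based lookup: read and parse lines until the key is found.
start = time.perf_counter()
with open(path) as f:
    for line in f:
        if json.loads(line)["key"] == target:
            break
disk_secs = time.perf_counter() - start

# In-memory lookup: the same record, already held in RAM.
start = time.perf_counter()
row = records[target]
mem_secs = time.perf_counter() - start

print(f"disk scan: {disk_secs:.4f}s   in-memory: {mem_secs:.7f}s")
os.remove(path)

On typical hardware the in-memory lookup is several orders of magnitude faster; in-memory platforms deliver the same effect at terabyte scale, with columnar storage and parallelism layered on top.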

SAP HANA, for example, takes advantage of in-memory computing technology to enable real-time data access. It also provides a common platform across the entire SAP landscape within an enterprise to simplify how data is stored, managed and analyzed.  Your business can glean faster real-time insights from your SAP environment and make better business decisions.

While you would love to transform your IT this way, getting to faster data analytics, a more scalable data warehouse and the ability to migrate all business applications to one underlying database is more of a journey. And, as I mentioned, more than likely your environments are crammed with legacy databases that are complicated and disconnected. And it takes more than in-memory data management software to achieve your business objectives.

It’s critical that your infrastructure correctly match the in-memory data management software you select. That’s why more IT executives and managers are making long-term, strategic architectural bets for their data centers and data management platforms. You need to develop the right strategy, delivery model and partnerships to minimize risk and disruption to datacenter operations during deployment as well as to scale as your data grows while maintaining high performance.

Let’s look at three requirements that should be part of your strategic migration plan:

#1: Performance and Scale

Can the solution handle my needs? To get the most from your in-memory data management environment, you need application-optimized systems purpose-built for high performance and high availability. They should have a scalable architecture that protects your current investment and lets you expand as needed into the future. Many systems top out at around 6 TB of memory, but if you plan to run SAP HANA, for instance, under your largest business application environments, a ceiling of 4 or 6 TB of in-memory capacity won’t be enough: it’s not uncommon for large systems to run on databases of 20 or 30 TB. Make sure you have adequate scalability to handle data warehouse environments as you move along your big data journey.
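A quick back-of-the-envelope calculation, sketched below in Python, shows why that ceiling matters. All figures are illustrative assumptions rather than vendor sizing guidance; the 50 percent working-space reservation mirrors a common in-memory sizing rule of thumb:

import math

database_tb = 24          # assumed in-memory footprint of a large database
node_capacity_tb = 6      # assumed RAM ceiling of a single system
workspace_fraction = 0.5  # assume half of RAM is kept free for query workspace

usable_tb = node_capacity_tb * (1 - workspace_fraction)
nodes = math.ceil(database_tb / usable_tb)
print(f"{database_tb} TB database at {usable_tb} TB usable per node "
      f"needs {nodes} nodes")

Under those assumptions, even a 24 TB database needs eight 6 TB systems once working space is reserved, which is why single-system memory ceilings and scale-out support deserve scrutiny up front.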

And remember, not all memory is created equal. Some vendors cram memory into their hardware in ways that reduce how much of it is actually workable. Make sure systems use higher-end processors that can address large amounts of RAM, so the added memory translates into usable capacity rather than lost performance.

#2: Levels of Service

Can the solution provide mission-critical capabilities and end-to-end services and support?

As you move to more advanced workloads, you take on more demanding mission-critical needs that require the capabilities found in high-end UNIX systems: fault-tolerant architecture, very high I/O bandwidth, self-healing analytics, and automation. Make sure these capabilities are built into your system to keep downtime to a minimum and disaster recovery as fast as possible. For example, hard partitioning technology significantly increases system reliability and agility. Each hard partition has its own independent CPUs, memory, and I/O resources on the blades that make up the partition, and the partitions are tied together through a fault-tolerant crossbar, so a fault in one part of the system has no impact on the rest. In terms of agility, these partitions enable you to run CRM, ERP and BW solutions on a single system, a significant step toward realizing the ‘real-time’ enterprise.
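As a rough software analogy for that fault isolation (the real isolation happens in hardware, at the CPU, memory, and I/O level), the hypothetical Python sketch below runs each workload in its own operating-system process, so a failure in one leaves the others untouched:

import multiprocessing as mp

def workload(name: str, crash: bool) -> None:
    rows = list(range(1000))                  # state local to this "partition"
    if crash:
        raise RuntimeError(f"{name} failed")  # simulated fault in one partition
    print(f"{name}: processed {len(rows)} rows")

if __name__ == "__main__":
    partitions = [
        mp.Process(target=workload, args=("CRM", False), name="CRM"),
        mp.Process(target=workload, args=("ERP", True), name="ERP"),
        mp.Process(target=workload, args=("BW", False), name="BW"),
    ]
    for p in partitions:
        p.start()
    for p in partitions:
        p.join()
        # Only the faulty partition exits nonzero; the others finish cleanly.
        print(f"{p.name}: exit code {p.exitcode}")

The analogy is loose, since processes share one machine while hard partitions do not share CPUs, memory, or I/O at all, but the principle of containing a fault to the unit where it occurs is the same.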

Even with mission-critical capabilities built in, it’s wise to augment your system with end-to-end services that enable speedy deployment, smooth migration, and business continuity. Data management requires 24/7 support so you can proactively prevent issues, maximize system performance, and accelerate problem resolution.

#3: Secure Investment

Will the solution allow my in-memory data management environment to grow and consolidate across the entire company?

Let’s face it. Migrating your business applications to an in-memory data management environment will not be easy or inexpensive, but it will be more cost efficient than continuing to operate sluggish and disconnected databases. And it will definitely deliver faster time to value for your business.

But here’s the thing: your rate of success will be higher with a trusted and experienced partner. You need a vendor with deep experience deploying these types of environments, one that has a strong partnership with SAP, Microsoft, or another big data software vendor, offers certified platforms with proven, benchmarked performance, and builds systems from parts pre-approved by your selected software vendor. You will also want to weigh several delivery models, such as on-premises, cloud service, or a hosted solution, for deploying and implementing your environment to see which one best fits your business needs.

So when you’re contemplating the move to an in-memory data management platform, don’t postpone it, but take your time with upfront planning. Think strategically about performance, scale, levels of service, and mission-critical capabilities, and about how to get the most out of your in-memory computing investment.

About the author: Paul Birney is the Director of Product Management at Hewlett-Packard Converged Data Center Infrastructure. In that role, Birney heads product management for data management and big data systems and solutions, with a focus on driving a portfolio of workload-optimized systems, solutions and reference architectures across high-growth opportunities like SAP HANA, Microsoft SQL and APS, Hadoop and more.

Related Items:

Accelerating Hadoop MapReduce Using an In-Memory Data Grid

How In-Memory Data Grids Can Analyze Fast-Changing Data in Real-Time
