Why should large corporations require a 24TB in-memory application? Implications for SAP systems

15-03-2023 | Data Management for S/4HANA Migration, SAP Data Archiving, SAP Data Management, SAP Information Lifecycle Management

Author: Thierry Julien, CEO of TJC Group 

Why should a large corporation require a 24TB in-memory application? Find out why SAP systems are outsized and how to tackle the problem. If TJC Group can’t cut your data volumes to 24TB or less, lunch is on us!

It is a well-known fact across the IT industry that SAP is a very memory-hungry application. Extremely memory-hungry. Users require high levels of memory and CPU to run their most critical business operations and, as technological capabilities expand, their need for ever more computing power is only growing. This has obvious cost implications and now, with companies migrating to S/4HANA, many are finding they are outgrowing their on-premises infrastructure.

This further adds to the cost, although we should acknowledge that SAP S/4HANA is a fantastic IT solution for enterprises. Its unique in-memory capabilities enable it to deliver exceptional data processing performance. However, this power comes at a cost: an increased need for memory and computing power to handle ever larger volumes of data. It is the IT equivalent of portion supersizing and, left unchecked, over-consumption of memory is as unhealthy for enterprises as eating too much junk food is for their employees.

Tackling in-memory requirements in SAP

SAP acknowledges the problem, but more needs to be done to stop this bloating. For instance, to help users address the extra memory requirements, SAP has introduced the Column Store feature on most of its tables. This has a better compression ratio than the traditional Row Store used previously and can effectively reduce the amount of memory needed to store the same volume of data. Unfortunately, the Column Store only solves half the problem, because compressing data does not change the fact that most of the data must still be held in working memory.
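
To see why columnar storage compresses repetitive business data so well, here is a minimal Python sketch of dictionary encoding, the basic idea behind columnar compression. It is purely illustrative: all figures are assumptions, and the real SAP HANA engine is far more sophisticated than this.

```python
# Purely illustrative sketch of dictionary encoding; all figures are
# assumptions, not SAP HANA internals.

# One million order lines; business columns are highly repetitive.
rows = [("EUR", "completed", "DE") for _ in range(1_000_000)]

# Row store: every value is stored once per row.
row_store_bytes = sum(sum(len(v) for v in r) for r in rows)

def dict_encoded_size(column, code_bytes=1):
    """Column store with dictionary encoding: each distinct value is
    stored once, and each cell becomes a small integer code."""
    dictionary = set(column)
    return sum(len(v) for v in dictionary) + len(column) * code_bytes

col_store_bytes = sum(
    dict_encoded_size([row[i] for row in rows]) for i in range(3)
)

print(f"row store   : ~{row_store_bytes / 1e6:.0f} MB")
print(f"column store: ~{col_store_bytes / 1e6:.0f} MB")
# -> roughly 14 MB vs 3 MB: repetitive columns compress dramatically,
#    but all of it still has to sit in working memory.
```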

New 24 TB virtual machine released by Microsoft

Microsoft has released a new virtual machine, the Standard_M832ixs_v2, that supports nearly 24TB of memory with 832 vCPUs, helping users meet the demands of high-end S/4HANA workloads. “What we experience with a lot of customers that are operating on-premise is that their S/4HANA systems or their bigger SAP systems are outgrowing their on-premise infrastructure”, said Microsoft partner architect Juergen Thomas.

Microsoft’s new VM goes some way to addressing this problem, and we salute their ability to provide solutions with outstanding computing power.

But aside from this, we need to address one fundamental issue that continues to bug us here at TJC Group. Twenty-five years ago, when enterprises were still running their mainframes, most users were able to run their SAP applications with less than 1TB of database memory. So why are we in a situation where modern applications are ever more memory-hungry?

Data growth vs. business growth

Do we really need to capture so much more information today in order to deliver the same products that were needed 30 years ago? I don’t think so. Sure, we still need the same basic details: customer address, maybe an email and some delivery instructions. And yet here we are, facing this over-consumption, this bloated-memory situation. Some industries, notably fast fashion, are selling twice as many products as they were 20 years ago, which would account for some of the extra data growth. But to put things into perspective, that is still only growth by a factor of two.

What about all the other businesses that have experienced rapid growth? Some companies, especially in the tech or e-commerce industries, have seen exponential growth. Amazon is an obvious example, but few companies have experienced hypergrowth on a scale that would justify their additional data volumes.

Deep dive: why should a company require a 24 TB in-memory application?

So, I return to my original point. Why should a large and successful enterprise today need a 24 TB in-memory application? It seems excessive, and actually it is!

The technical teams at TJC Group have investigated the reasons behind this issue, analysing hundreds of SAP systems to understand why it happens.

Based on our investigations, here are our conclusions:

1) Architectural design flaws

There are so many to choose from. Here are just some of the volume-generating architectural design mistakes we’ve found over time within customer applications:

  • Setting up a full SD flow (sales, delivery, invoice) when you have a digital delivery.
  • Using a fully featured MRP to replenish stores in the retail industry.
  • Ingesting too much detailed information, from ePOS (point of sale) to accounting.
  • Installing standard SAP when you’re running a utility business (SAP IS-U generates lower volumes than SAP SD).

2) Technical issues

One of my personal favourites: a custom development that has been optimised by an expert who, to do so, activates change logs on the code. This simple adjustment to batch programming creates logs that generate a 400% increase in total database volume (see the back-of-the-envelope sketch after this list).

3) Not performing basic administrative functions and common housekeeping jobs.

4) Not initiating an archiving or ILM process. For any stable business, two years of data requires twice the volume, and no technology will change this (see the growth sketch after this list).

5) Not measuring the results of archiving or ILM processes. If you have run your archiving program but only achieve a 50% ratio, you are still left with half the volume and half the memory.

6) Not ensuring that results are consistently achieved. TJC Group delivers expert archiving and ILM automation, with archiving ratios in excess of 95% guaranteed and the ability to provide ongoing archiving services.
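
Here is the back-of-the-envelope sketch promised in point 2. Every size below is an illustrative assumption, not a measured SAP figure; the point is simply that a log entry written on every update can dwarf the record it describes.

```python
# Back-of-the-envelope sketch of how change logs inflate a database.
# Every size below is an illustrative assumption, not a measured SAP figure.
RECORDS = 1_000_000
RECORD_BYTES = 200       # average size of one business record (assumed)
LOG_ENTRY_BYTES = 200    # old value + new value + user/date metadata (assumed)
UPDATES_PER_RECORD = 4   # a batch job touching each record four times (assumed)

base = RECORDS * RECORD_BYTES
logs = RECORDS * UPDATES_PER_RECORD * LOG_ENTRY_BYTES

print(f"business data: {base / 1e6:.0f} MB")
print(f"change logs  : {logs / 1e6:.0f} MB ({logs / base:.0%} of the base data)")
# -> 200 MB of business data drags along 800 MB of logs: the 400% increase
#    we regularly see in the field.
```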
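
And here is the growth sketch promised in point 4: a simple Python projection of online database size for a stable business, under the archiving ratios mentioned in points 5 and 6. The yearly intake figure is an assumption; only the shape of the result matters.

```python
# Sketch: projected online database size for a stable business that
# creates the same volume of data every year. The intake is assumed.
YEARLY_INTAKE_TB = 2.0   # new data created per year (assumed)
YEARS = 10

def online_size(archiving_ratio: float) -> float:
    """Keep the current year fully online; reduce every older year
    by the archiving ratio (0.0 = no archiving, 0.95 = 95% archived)."""
    current = YEARLY_INTAKE_TB
    older = (YEARS - 1) * YEARLY_INTAKE_TB * (1 - archiving_ratio)
    return current + older

for ratio in (0.0, 0.5, 0.95):
    print(f"archiving ratio {ratio:4.0%}: {online_size(ratio):5.1f} TB online")
# -> no archiving: 20.0 TB; a 50% ratio still leaves 11.0 TB online;
#    only a ~95% ratio keeps the system lean at ~2.9 TB.
```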

Looking back over 20 years in the industry, we have encountered a few SAP systems with over 24TB of data in our time. These even included Oracle databases that predate S/4HANA, but such examples are rare enough to be counted on our fingers. Putting aside issues with BW systems, most of these oversized systems suffered from a common design flaw. So how did things become so much worse with HANA and the hyperscalers?

Learn how to reduce migration times and lower the TCO of your SAP systems when migrating to S/4HANA in this on-demand webinar. Watch it now! https://info.tjc-group.com/webinar-data-archiving


Primary drivers behind our tolerance for ‘outsized’ systems

Here is what TJC Group believes are the main factors behind these problems:

  • C-level executives have believed the marketing hype that their SAP data is compressed, that it’s cheap, and that there are no issues. It’s simply not true.
  • Migrating to a cloud-based infrastructure often means letting go of the legacy systems administrators. A mistake, because they were the ones who alerted you when data volume growth was getting out of hand.
  • Many companies don’t bother with regular data housekeeping. I ask them this question: what would your home look like if you stopped doing any cleaning and tidying for a few weeks?

You might be wondering why these things matter. We believe they matter a lot, for a number of reasons. Firstly, storing unnecessary volumes of data is expensive, and when we are talking about SAP S/4HANA it is a lot of money, with costs that keep escalating. Secondly, it wastes energy: data centres account for a significant and growing share of the world’s electricity consumption, and we need to control emissions. And lastly, ignoring these problems won’t make them go away. They will simply get bigger, just as your data volumes will continue to grow and grow, and so will the associated problems.

The low-hanging fruit – archive or delete useless data!

Our technical experts at TJC Group feel heartbroken when we see companies wasting their hard-earned cash. We feel distraught when we see up to 15% of the memory that cash pays for being wasted on managing completely useless technical data. This happens because, in many systems, technical data consumes more memory than the business data itself. But it is a problem that’s relatively easy to fix with the right resources in place.
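
As a quick illustration, here is a hedged Python sketch of how you might estimate that share. The table names are well-known SAP technical tables, but the sizes are invented placeholders; you would replace them with real figures from your own system (for instance from DB02 or DBACOCKPIT).

```python
# Sketch: estimating how much of a database is purely technical data.
# The table names are real SAP technical tables; the sizes below are
# invented placeholders to be replaced with your own measurements.
technical_tables_gb = {
    "BALDAT": 300,      # application log data
    "DBTABLOG": 450,    # table change logs
    "EDI40": 250,       # IDoc data records
    "SWWLOGHIST": 150,  # workflow log history
}
TOTAL_DB_GB = 8_000     # total database size (assumed)

technical_gb = sum(technical_tables_gb.values())
print(f"technical data: {technical_gb} GB "
      f"({technical_gb / TOTAL_DB_GB:.0%} of the database)")
# -> roughly 14% of the memory you pay for holds data with no business value.
```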

So before switching to an even bigger, more costly and much less environmentally friendly virtual machine, please give us a call!

Give us the chance to look at your SAP systems and make an assessment. We are pretty sure we can help you avoid the costs and wastage I’ve just described. And if we can’t get you below 24TB, we’ll at least offer you an outstanding lunch! What more could you want?

Thierry Julien, CEO of TJC Group