Every organization faces the challenge of reducing its total cost of ownership (TCO). One of the major contributors to TCO is the cost of the hardware on which ERP applications run.
With the evolution of new technology and the availability of cloud services, procuring this hardware has become very easy. However, when it comes to compute capacity, each organization still has to plan for the quantum of data it needs to store, in line with its business needs, so that the data remains readily available for future planning and analysis.
To maintain the data footprint and reduce cost, SAP has provided guidelines through which the business can achieve this objective. In this blog I will cover those guidelines in detail and explain how to plan for them.
For ERP applications running on SAP HANA releases prior to 2.0 SPS04, SAP introduced the concept of Data Aging, which enables the business to decide how to segregate older data and keep it in storage that also reduces cost.
In SAP HANA 2.0 SPS04 and above, SAP provides the NSE (Native Storage Extension) Advisor, which gives recommendations on how to place data into different partitions, thereby reducing the total licensed memory requirement of the SAP HANA system.
Hot Data: Frequently used and modified data that is critical for day-to-day business transactions and planning.
Warm Data: Less frequently used data (only during month-end/quarterly/yearly reporting and planning); modification of such data is rare but cannot be ruled out.
Cold Data: Data rarely used by the business (only during government or internal audits); modification of such data is remote.
The following figure shows the relationship between data growth and total cost for an SAP HANA system:
SAP provides a list of business objects in the ERP system for which Data Aging can be activated. Activation adds a separate column, _DATAAGING (if it does not already exist), to the tables associated with the business object; this column holds the timestamp of the origin or last modification of the data in that table. This in turn helps identify which data can be moved to another storage area to minimize cost.
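As a quick illustration (table BKPF is used here purely as an assumed example), the HANA catalog view TABLE_COLUMNS can be queried to confirm that the _DATAAGING column exists after activating a Data Aging object:

```sql
-- Check whether the _DATAAGING column was added to an aged table
-- (BKPF is an assumed example; substitute your own table name)
SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE_NAME
  FROM TABLE_COLUMNS
 WHERE TABLE_NAME  = 'BKPF'
   AND COLUMN_NAME = '_DATAAGING';
```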
The detail of these Data Aging objects is shown below:
Reference: SAP Data Aging Objects Provided by SAP
The Steps Involved are as follows:
- Identifying objects for Data Aging (DAGOBJ)
- Verifying the tables associated with the Data Aging objects (DAGADM)
- Verifying the partition object association (DAGPTC)
- Identifying the partitioning range (DAGPTC)
- Identifying business-critical transactions and analysis reports
- Identifying the Data Aging horizon needed (in years or months), for example:
Hot Data: 15 months of data
Warm Data: 15 to 36 months of data
Cold Data: Data older than 36 months
- Time range too high? If yes, re-evaluate the identified reports and discuss with the business to reduce the time span
- Defining the Partition Range Accordingly (DAGPTM)
- Activating Data Aging Object and Running Data Aging (DAGRUN)
- Verifying the Data Aging Log (DAGLOG) and Verifying the Partitioning (DAGADM)
- Identify the tables and partitions that need to be converted to page loadable (using NSE)
- Convert the DDL to NSE at partition level (making the partitions page loadable)
The command that can be executed to achieve this is as follows:
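A sketch of such a conversion, where the schema SAPHANADB, table BSEG, and partition number 2 are assumptions chosen purely for illustration:

```sql
-- Make a single (warm/cold) partition page loadable:
ALTER TABLE "SAPHANADB"."BSEG" ALTER PARTITION 2 PAGE LOADABLE;

-- Or make a whole table, including all its partitions, page loadable:
ALTER TABLE "SAPHANADB"."BSEG" PAGE LOADABLE CASCADE;
```

Converting back is the symmetric statement with COLUMN LOADABLE in place of PAGE LOADABLE.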
- Enable NSE advisor by Enabling Statistics Collection
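On SAP HANA 2.0 SPS04 and above, this is done by switching on access statistics collection in indexserver.ini, for example:

```sql
-- Enable statistics collection for the NSE Advisor
ALTER SYSTEM ALTER CONFIGURATION ('indexserver.ini', 'SYSTEM')
  SET ('cs_access_statistics', 'collection_enabled') = 'true'
  WITH RECONFIGURE;
```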
- Adjust NSE Advisor Tuning Parameter
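The advisor thresholds live in the cs_nse_advisor section of indexserver.ini; the values below are placeholders chosen for demonstration, not recommendations:

```sql
-- Tune NSE Advisor thresholds (demonstration values only)
ALTER SYSTEM ALTER CONFIGURATION ('indexserver.ini', 'SYSTEM')
  SET ('cs_nse_advisor', 'min_object_size') = '1048576',         -- bytes
      ('cs_nse_advisor', 'hot_object_threshold_rel') = '10',     -- percent
      ('cs_nse_advisor', 'cold_object_threshold_rel') = '2'      -- percent
  WITH RECONFIGURE;
```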
Note: The above values are for demonstration purposes only.
- Run the workload for a minimum of 6 hours
- Get the NSE recommendation result
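After the workload window, the recommendations can be read from the M_CS_NSE_ADVISOR monitoring view:

```sql
-- Review the NSE Advisor recommendations collected so far
SELECT * FROM M_CS_NSE_ADVISOR;
```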
The following diagram provides insight into how tables can be changed between page loadable and column loadable depending on their access frequency and size:
- Verify the current distribution of tables across partitions and take the necessary steps if it is not in line with the NSE Advisor recommendations
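One way to check the current load unit of each partition (BSEG again being an assumed example table) is the M_TABLE_PARTITIONS monitoring view:

```sql
-- LOAD_UNIT shows COLUMN (fully in-memory) or PAGE (NSE) per partition
SELECT SCHEMA_NAME, TABLE_NAME, PART_ID, LOAD_UNIT
  FROM M_TABLE_PARTITIONS
 WHERE TABLE_NAME = 'BSEG';
```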
- Disable NSE advisor by Disabling Statistics Collection
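This is the mirror image of the enablement step:

```sql
-- Disable statistics collection once the analysis is complete
ALTER SYSTEM ALTER CONFIGURATION ('indexserver.ini', 'SYSTEM')
  SET ('cs_access_statistics', 'collection_enabled') = 'false'
  WITH RECONFIGURE;
```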
Reference: NSE Advisory Detail
All of the above planning can be summarized in the following pictorial representation:
Data Tiering Options by Application:
By following the above steps, an organization can achieve an optimal data footprint with lower storage as well as compute costs, thereby reducing the TCO of its IT infrastructure.