
SAP Announces SAP HANA 2 – Next Generation Platform for Digital Transformation

Meet SAP HANA 2, the next generation platform for Digital Transformation. You can leverage state-of-the-art database and data management technology, analytical intelligence capabilities and intuitive application development tools on a single, unified in-memory platform – freeing valuable resources and human capital for new and innovative application development to meet your business’s strategic objectives.

SAP HANA 2 is an evolution of the existing platform that gives you access to a stable, highly available and secure data environment for all your applications. At the same time, SAP HANA 2 transforms database and data management with enhanced high availability, security and workload management, enterprise modeling, data integration, and data quality. And, if you are considering developing or deploying modern applications that access any data from anywhere and combine transactions with advanced analytical processing, SAP HANA 2 also transforms application development and analytical intelligence: it comes with enhanced analytical processing engines for text, spatial, graph, and streaming data, as well as improved application server capabilities and development tools.

Database Enhancements

High availability and disaster recovery/ backup and recovery

  • Load balance read-intensive operations between a primary and secondary instance of SAP HANA with the active / active-read enabled mode.
  • Automate orchestration of HA/DR processes with enhanced SAP Landscape Management integration.
  • Optimize workload for 3rd party backup tools by consolidating SAP HANA log backups.

Security

  • Protect data at rest by using SAP HANA’s comprehensive encryption for data and redo log files.
  • Reduce TCO by using existing LDAP groups for automatic role assignment.
  • Simplify monitoring of security alerts and configuration of security and users in the new SAP HANA cockpit.

Administration

  • Manage one instance, multiple tenants or thousands of SAP HANA instances within the SAP HANA cockpit administration and monitoring tool.
  • Prevent run-away queries and manage system thresholds with enhanced workload management.
  • Reduce time and cost when implementing change by capturing, comparing and analyzing multiple workload replays.

Advanced Analytical Processing Enhancements

Search

  • Find information faster with improved searching and filtering on dates.
  • Dynamically configure search rules for duplicate detection.
  • Run search rules in “batch mode” to check a large number of records in a single call.

Text analytics

  • Extend the smarts of your applications that use text analysis to all languages that use spaces between words.
  • Easily embed natural-language processing into your products with a new native SQL interface.
  • Manage your own domain-specific custom dictionaries and rules within the Web IDE for SAP HANA.

Graph data processing

  • Analyze graph data more efficiently and achieve results faster with new visualizations.
  • Leverage existing Cypher query language skills with SAP HANA support for Cypher for pattern matching.

Predictive analytics and machine learning

  • Create richer predictive applications using more pre-packaged algorithms.
  • Run scoring functions faster with parallel processing across large-scale partitioned data.

Application Development and Tools Enhancements

Application server

  • Reduce TCO by bringing your own languages and runtimes and running those applications within the XS Advanced framework.
  • Build native applications in multi-tenant (MDC) landscapes without restriction.
  • Reduce the effort of supporting Java-based applications with enhanced Java / JavaScript runtime supportability and trace features.

Tools, languages, and APIs

  • Accelerate application development with enhanced modeling for calculation views, core data services, and data types (e.g. spatial, text, graph).
  • Directly create and query hierarchies without dependency on the graphical modeler.
  • Test and develop MDX based queries and statements against SAP HANA views and leverage additional MDX capabilities.
  • Increase productivity with improved syntax error troubleshooting and expanded impact and lineage analysis.

Data Management Enhancements

SAP Enterprise Architecture Designer, Edition for SAP HANA

  • Perform enterprise architecture and business process modeling, as well as architect and plan physical data models that target SAP HANA, with this new web-based tool that complements existing SAP HANA modeling tools.

Multi-tier storage*

  • Simplify management of large volumes of data with multi-store tables.
  • Manage dynamic tiering in an integrated way, with support for encrypted extended store, system replication and delta backups.
  • *Targeted for release in the near future.

Data integration

  • Simplify and accelerate data movements with parallel data loads, automatic recovery, and data integration with SAP ABAP-based systems (via BAPIs that support virtual procedures).
  • Gain access to more data sources with added support for Microsoft Access and SharePoint.

Data quality and data federation

  • Improve the accuracy of location-based data by taking advantage of enhanced address cleansing capabilities.
  • Leverage remote metadata synchronization to simplify alignment with remote data schema copies.


SAP futurists name five future technology trends to plan for now

Making predictions about future technology trends is not particularly hard. It's not exactly going out on a limb to say that IoT, machine learning and blockchain are going to be very important in the not too distant future. The important thing is trying to determine how these trends may affect enterprises, as well as what people need to think about as they plot their course for the digital future.

SearchSAP recently spoke with three SAP executives about five future technology trends -- augmented reality, blockchain, artificial intelligence (AI), robotics and contingent labor -- and some of their implications for enterprises and SAP users. The executives are Dan Wellers, SAP global lead for digital futures; Kai Goerlich, chief futurist, SAP Innovation Center Network; and Michael Rander, global research director, Future of Work at SAP.

1. Immersive technology and augmented reality

Dan Wellers: [The] biggest example of augmented reality is Pokémon Go, on the consumer side. Until recently, this technology couldn't deliver a believable experience, and, as recently as 2014, the market barely existed. There was certainly nothing on the enterprise side, and those applications are just starting to [be] built now.

On the retail side, this could be the ability to try on outfits in a virtual mirror that takes your sizes as you do it, or sitting in a new car model simulation to sense whether that's the right model. Lowe's has its Holoroom in a couple of stores, where customers can design a simulated room [using] a tablet. Another big area is medicine, and using augmented reality to overlay diagnostic and treatment information over people's bodies, letting medical students practice complex procedures safely. Cleveland Clinic is actually turning MRIs into conventional 2D and 3D images that can be projected over the site of the procedure for training or diagnostic purposes.

Also, a company might use augmented reality to walk field service reps through repair processes or, in an IoT [internet of things] example, on a construction site [by] scanning an area and having it trigger data about real-time costs, supply inventories or planned versus actual spending.

All of these have data intensity as their core, and the message to our customers is that you should be thinking about this stuff now, and work with us to coinnovate, either by adding virtual or augmented reality into existing apps or thinking about new apps for new needs that are being discovered as we talk.

2. Blockchain

Wellers: It's important that we look at blockchain from an enterprise standpoint because it has huge potential, but there's a lot that's been written that's not really concrete. One [example] has to do with the need for scale -- can blockchain scale to the levels that enterprises really need?

A great example is shipping containers. A large container ship can hold around 18,000 containers, 50 pallets per container, 50 boxes per pallet. So, on one ship, you've got 45 million items to track, and if there are 20 million containers out there, the numbers get really big. So can blockchain scale to that level? We don't see that happening soon, but we know that somebody needs to look at it.

We're starting to look at how close we can move blockchain toward enterprise requirements -- what are the hurdles, what unknown things will we come up against? We believe that we can help based on our experience in the enterprise and [with] business process understanding and mission-critical systems. We're looking at approaching it from a full, end-to-end business process perspective; things like [increasing] process speed and transparency, improving supply chain efficiency and facilitating business networks and exchanges.

3. Artificial intelligence -- machine learning and deep learning

Kai Goerlich: The concept of AI is from the 1950s, and usually the connotation is more on smart, human-like robots. Now, basically there are two versions -- machine learning and deep learning -- and 2017 will probably be the year of the beginning of true AI. We use machine learning as the umbrella for both, but there is a subtle difference. Machine learning is learning by statistics and coming up with rules, while deep learning is self-learning with qualified data. At SAP, [we're betting] on the deep learning algorithms to hit business, and we think that those algorithms will change how we do business. If you look into it, deep learning is very much like human training; you need good data and time so that the algorithm [can] come up with a solution. It could be that, if you feed the algorithm the right data about demand and supply and logistics, the algorithm might find a better routing, for example, or an improved supply chain. For the moment, deep learning has been used mostly, as you know, with AlphaGo, the algorithm that beat the Go world champion, but that's a very narrow frame; it's a board game.

We are now using the same algorithms for cars to drive by themselves or for image and speech recognition, but these are still limited data sets. If you look at a complete supply chain, that's significantly bigger, and with every additional parameter, the information base gets exponentially bigger.

We are at the early stage, and we will probably start to find [opportunities in] the highly repetitive stuff in the supply chains, like seeing if certain items have been shipped or not or if they have been properly taxed; something where such an algorithm would help.

4. The emergence of cyborgs or robotics

Goerlich: There's an underground cyborg scene in many big cities, and humans are testing cyborgs. There are some medical cases that make it into the news, and artists are using that stuff already where they try to bridge or hook the brain to the computer or have implants. It's underground, but with the medical use cases, it will be more prominent, and we will see that in some niche areas to start. It's a big testing area for some extreme or fringe use cases in robotics and algorithms; it's ideal for machine learning and algorithms because they can learn on very good data sets.

The future is difficult to predict, but my bet is on human-machine cooperation, or maybe even coevolution, so that we merge with what we now call machines. The other scenario is that we have very smart machines, like what I call the "Terminator Scenario," where they will be much smarter than we are and turn out to be very clever, but not merged with humans. So humans stay humans and machines stay machines. It's pretty appealing to some people, so I'm pretty sure that we will test it beyond medical applications.

5. Automation and the gig economy

Michael Rander: Automation will affect the future of work and how the workplace is going to look, but it's part of a bigger picture. It comes with all of the different advances in technology, so we look at it as the whole digitization of the economy and how that's going to affect the workforce.

Automation will move a number of jobs in a certain direction, and it's going to take the easy-to-replace jobs first: the manual ones or the ones that can be replicated most easily. Over time, we'll see that it will affect certain groups; for example, taxi drivers, where automated cars could essentially take over that whole workforce. The problem we have then is what to do when we have a whole group of people who find themselves without the training to go and take other jobs. So it becomes a larger societal problem. It's something that we feel we need to dig into from an SAP perspective, but also from [a] collaboration across companies perspective. We want to engage other companies, we want to work across boundaries to look at this, because it becomes a workforce problem and becomes a societal problem. So it's something we need to address or we're going to have much larger problems in the workforce.

This means we need to look at training, at the hiring processes, at what it means to have a contingent workforce. How do you manage a contingent workforce and make them a part of what you do as a company? It's estimated that, basically, every company will see the contingent workforce as an essential part of its business.

So you can't just look at this and say you'll find a solution when you get there; it's an issue right now that you need to deal with. Looking just a few years ahead, you'll see that trend increase, and people that aren't getting ready now will fall behind.

We did a study where we were looking at the digital readiness of companies, and even though a lot of these companies say that digital transformation is an important part of what they do, very few are actually ready for it. People are realizing that it's happening, but they're not taking the steps to address it, which is a major problem.

Tying the future technology trends together

Wellers: The real value in these doesn't come from them standalone, it comes from how they combine. How do things like 3D printing, IoT, blockchain, connected cars, smart cities and the sharing economy combine to create a logistics internet? How do nanotechnology, robotics, augmented reality and connected healthcare combine to revolutionize the way healthcare is delivered and how research is done? How will blockchain, artificial intelligence and IoT contribute to the smart logistics supply chain?


HANA 2 – What is it?

 

Quoting from the SAP Press Release dated 8 November 2016 –

BARCELONA — SAP SE (NYSE: SAP) today announced the SAP HANA 2 platform, the next generation of SAP HANA optimized for innovation.

SAP HANA 2 includes and extends the proven technology from SAP’s breakthrough in-memory computing platform to provide a new foundation for digital transformation.

… said Bernd Leukert, member of the executive board, … “The release of SAP HANA 2 marks a milestone in the industry, as it represents the next generation of SAP HANA that will propel customers toward a successful and prosperous digital future.”

Lots of people were pretty excited about this but also a bit confused about what this actually means. Let me try and break it down for you.

To understand what SAP HANA 2 is we need to look at the history of SAP HANA – and particularly at the underlying release schedules.

As you would expect, SAP are continually improving and enhancing the capabilities of HANA. These innovations are delivered in so-called Support Package Stacks (SPS). SPS packs are numbered incrementally and generally released twice a year. So SPS6 will come about 6 months after SPS5, which will be followed by SPS7, etc. The latest SAP HANA release is SPS12.

Not unreasonably, SAP want customers to apply the latest HANA SPS pack as soon as possible so that innovation and bug fixes are implemented quickly. It is also pretty important that the SAP support organisation does not get bogged down in the weeds supporting old releases containing bugs that have been identified and fixed in later SPS packs. To that end, SAP require HANA customers to keep their system up to date to ensure continued support. Basically the rule is that SAP will only support the current SPS level and the two previous levels. This means current SAP HANA customers must be on SPS12, SPS11 or SPS10 to get support from SAP.

But this constant need to update/upgrade your HANA systems is a burden for SAP's customers – especially those who have Business Suite on HANA or S/4HANA. It flies in the face of the 2-Speed IT paradigm where we are supposed to have a "stable core" and perform "innovation at the edge". How can my core be stable if I have to upgrade it twice a year?

SAP have responded to this customer issue in a way that developers are very used to. They have determined that HANA SPS12 will become a "stable release". This means that this release will be supported for a longer period of time – in this case three years from the release date, which means support runs until May 2019.

This is not a new concept. SAP do the same thing with their SAPUI5 libraries for example.


Meantime, HANA innovation will continue to be delivered as SPS packs every 6 months – just as it always has. So can we expect SPS13 to come out soon, followed 6 months later by SPS14, etc? Well no – actually you can’t.

SAP have decided to draw a line in the sand at HANA SPS12 and rename the product to HANA 2. The first release – due November 30 2016 – will now be called HANA 2 SPS0 rather than HANA SPS13. It will be followed by HANA 2 SPS1, etc.

Now you can choose to stay on the stable HANA SPS12 release until you want to take advantage of forthcoming innovations. These will be delivered via HANA 2 SPS packs. SAP’s intention is for you to be able to upgrade from HANA SPS12 directly to any HANA 2 SPS level.

If you want to take immediate advantage of the latest HANA innovations you can continue to apply SPS packs when they become available.

The announcement of SAP HANA 2 means innovation in the HANA platform will continue as before, whilst support for the SPS12 release has been extended through to May 2019.


Funny SAP acronyms

 

SEAL ALL PRODUCTS

"Structured Advanced Programming."

Suffer After Purchase!

Suffer And Pain

Sit And Practice

SLOW and PAINFUL

SLOW and PROBLEMATIC

Stop all Production

Sweet Apple Pie

Serious annoying program

Shoot a Planner

Ständig andere Probleme (German for "constantly different problems")

Superior Arrogance Provided

Sold A Pig-in-a-Poke

Start Adding People

Start Applying Patches

Submit and Pray

South African Police


FAQ on SAP HANA Workload Management

1. What is SAP HANA workload management?

SAP HANA can be used for various types of workloads, such as:

  • OLAP workload, for example reporting in BW systems
  • OLTP workload, for example transactions in an ERP system
  • Mixed workload, meaning both OLAP and OLTP, for example modern ERP systems with transactions and analytical reporting
  • Internal workload in SAP HANA, for example merges, backups and savepoints

Workload management in SAP HANA allows you to balance and manage all workload types (OLAP, OLTP, mixed, and internal) for optimal throughput and response times.

2. What kind of resources can become scarce in SAP HANA environments?

The following resources can become a bottleneck in SAP HANA environments:

  • Memory (SAP doc 1999997): limited by physical memory and the SAP HANA global allocation limit
  • CPU and threads (SAP doc 2100040): limited by the number of physical / logical CPUs and by SAP HANA parameters like max_concurrency or max_sql_executors
  • Network (SAP doc 2222200): limited by network bandwidth and latency
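As a quick way to see whether memory is getting scarce on a host, a monitoring query along the following lines can be used (a minimal sketch only; the column selection is an example and meaningful thresholds depend on your sizing):

-- rough per-host memory picture (values converted to GB)
SELECT
  HOST,
  ROUND(USED_PHYSICAL_MEMORY / 1024 / 1024 / 1024, 2) AS USED_PHYS_MEM_GB,
  ROUND(FREE_PHYSICAL_MEMORY / 1024 / 1024 / 1024, 2) AS FREE_PHYS_MEM_GB,
  ROUND(ALLOCATION_LIMIT / 1024 / 1024 / 1024, 2) AS GLOBAL_ALLOCATION_LIMIT_GB
FROM M_HOST_RESOURCE_UTILIZATION;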

3. How can workload management be configured for memory?

The following options exist to configure workload management for memory:

Memory limit for SQL statements (available as of SPS 08):

Starting with SPS 08 you can limit the memory consumption of single SQL statements. As a prerequisite you need to have the statement memory tracking feature enabled (see SAP doc 1999997 -> "Is it possible to monitor the memory consumption of SQL statements?"). Additionally you have to set the following parameter in order to define the maximum permitted memory allocation per SQL statement and host:

global.ini -> [memorymanager] -> statement_memory_limit = <maximum_memory_allocation_in_gb>

Changes to the parameter take effect immediately, no restart is required.
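For reference, the parameter can be set with a configuration command along the following lines (a sketch; the 150 GB value is only an example and must be adapted to your system):

-- example: limit each SQL statement to 150 GB per host, effective immediately
ALTER SYSTEM ALTER CONFIGURATION ('global.ini', 'SYSTEM')
  SET ('memorymanager', 'statement_memory_limit') = '150' WITH RECONFIGURE;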

Be aware that the limit is host specific. If, for example, you set the limit to 150 GB and you use a scale-out scenario, one SQL statement can allocate up to 150 GB per host. So the overall memory size (as displayed, for example, in M_EXPENSIVE_STATEMENTS) can significantly exceed the configured limit.

You should test the effects of these settings carefully in order to avoid unexpected results (e.g. termination of backups or critical business queries). In general it is useful to start with a rather high memory limit. As a starting point you can, for example, take 30 % of the first 500 GB of the global allocation limit plus 15 % of the remaining memory:

  • Global allocation limit 250 GB -> statement_memory_limit 75 GB
  • Global allocation limit 500 GB -> statement_memory_limit 150 GB
  • Global allocation limit 1000 GB -> statement_memory_limit 225 GB
  • Global allocation limit 3000 GB -> statement_memory_limit 525 GB

A statement that exceeds the configured statement memory limit will terminate with an OOM dump ("compositelimit_oom"). See SAP doc 2122650 for more information.

As of SPS 09 the following parameter is available that can be used to make sure that the statement_memory_limit only takes effect if the overall SAP HANA memory allocation exceeds a defined percentage of the global allocation limit:

global.ini -> [memorymanager] -> statement_memory_limit_threshold = <percentage_of_global_allocation_limit>
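A corresponding configuration command could look like this (a sketch; 80 % is only an example value):

-- example: enforce statement_memory_limit only once 80 % of the global allocation limit is in use
ALTER SYSTEM ALTER CONFIGURATION ('global.ini', 'SYSTEM')
  SET ('memorymanager', 'statement_memory_limit_threshold') = '80' WITH RECONFIGURE;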
Memory limit for SQL statements of a specific database user (available as of SPS 09):

As of SPS 09 you can also specify a database user specific statement memory limit:

ALTER USER <db_user_name> SET PARAMETER STATEMENT MEMORY LIMIT = '<maximum_memory_allocation_in_gb>'
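For example, to cap statements of a hypothetical reporting user at 50 GB (user name and value are illustrative only):

ALTER USER REPORTING_USER SET PARAMETER STATEMENT MEMORY LIMIT = '50';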
Memory limit for SQL statements of a specific workload class (available as of SPS 10):

Starting with SPS 10 you can define a workload class with a specific statement memory limit setting:

CREATE WORKLOAD CLASS "<workload_class>" SET 'STATEMENT MEMORY LIMIT' = '<maximum_memory_allocation_in_mb>'
ALTER WORKLOAD CLASS "<workload_class>" SET 'STATEMENT MEMORY LIMIT' = '<maximum_memory_allocation_in_mb>'

Attention: The value unit is MB while for the options further above it is GB.
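For instance, a 20 GB per-statement limit for a hypothetical workload class would be specified as 20480 MB (class name and value are illustrative only):

CREATE WORKLOAD CLASS "WLC_MEM_20GB" SET 'STATEMENT MEMORY LIMIT' = '20480';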

See "How can workload classes be mapped to users and applications?" below for further information about workload classes.

4. How can workload management be configured for CPU and threads?

Controlling CPU consumption is closely linked to controlling the number of active SAP HANA threads, because in most cases an active thread consumes CPU (exception: passive lock waits).

If you face a very high SAP HANA CPU consumption, you should always check in the first place whether it can be reduced by performance tuning approaches. There have already been cases where the creation of a single SAP HANA index reduced the CPU consumption from 100 % to less than 10 %. Only if you have made sure that the high CPU consumption is required and can't be reduced via technical tuning should you control and limit it in the following ways. The <service>.ini file can be used to control the scope of the parameter change, e.g.:

  • global.ini: Activates the limitation for all services
  • indexserver.ini: Activates the limitation only for the indexserver process
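As an illustration of this scoping, the following commands set the same parameter once for all services and once only for the indexserver process (a sketch; max_concurrency = 32 is just an example value, see the option details below):

-- example: limit job worker concurrency for all services
ALTER SYSTEM ALTER CONFIGURATION ('global.ini', 'SYSTEM')
  SET ('execution', 'max_concurrency') = '32' WITH RECONFIGURE;

-- example: limit job worker concurrency only for the indexserver process
ALTER SYSTEM ALTER CONFIGURATION ('indexserver.ini', 'SYSTEM')
  SET ('execution', 'max_concurrency') = '32' WITH RECONFIGURE;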
For each of the following options the availability, the parameter, the default value and whether a restart is required are listed, followed by details.

Limit number of SQL executors
Availability: general | Parameter: <service>.ini -> [sql] -> sql_executors | Default: 0 (-> number of logical CPU cores) | Restart required: no

SQL executors are threads, which are responsible for normal SQL request processing. The number of SQL executors can be configured with this parameter.

This parameter is less strict than the max_sql_executors parameter described below, because it is possible that the actual number of used SQL executors exceeds the configured value.

Limit maximum number of SQL executors
Availability: general | Parameter: <service>.ini -> [sql] -> max_sql_executors | Default: 0 (-> no limit) | Restart required: no

SQL executors are threads, which are responsible for normal SQL request processing. The maximum number of SQL executors can be configured with this parameter.

Setting max_sql_executors introduces a hard limit. If for some reason the defined value would need to be exceeded, the transaction is terminated with an error like:

exception  1: no.71000132  (ptime/session/tcp_receiver.cc:849) max number of SqlExecutor threads are exceeded: current=111, max=110

Therefore this parameter should only be used in very rare cases.

Limit CPU consumption of job workers
Availability: general | Parameter: <service>.ini -> [execution] -> max_concurrency | Default: 0 (-> number of logical CPU cores) | Restart required: no

Job workers are threads that are responsible for processing parallelized OLAP load and internal activities like savepoints or garbage collection. The maximum number of logical CPUs consumed by job workers can be configured with this parameter.

If each active job worker currently consumes less than 100 % CPU, more job workers than the max_concurrency value can be activated. The defined number of logical CPUs may be exceeded if the active job workers increase their CPU consumption over time.

Limit parallelism of CHECK_TABLE_CONSISTENCY (not started by ESS)
Availability: >= Rev. 97.01 | Parameter: indexserver.ini -> [table_consistency_check] -> check_max_concurrency | Default: 0 (-> number of logical CPU cores) | Restart required: no

This parameter controls the number of tables analyzed in parallel by the consistency check CHECK_TABLE_CONSISTENCY (unless triggered by the statistics server). See SAP doc 2116157 for more information.

On top of this client side parallelism the analysis of a single table may be parallelized on SAP HANA server side (limited by the max_concurrency parameter).

Limit parallelism of CHECK_TABLE_CONSISTENCY (started by ESS)
Availability: >= Rev. 97.01 | Parameter: indexserver.ini -> [table_consistency_check] -> internal_check_max_concurrency | Default: 0 (-> max(1, min(8, number of logical CPU cores / 10))) | Restart required: no

This parameter controls the number of tables analyzed in parallel by the consistency check CHECK_TABLE_CONSISTENCY when triggered by the statistics server. See SAP doc 2116157 for more information.

Per default a maximum of 8 CPUs is used - and even fewer if there are fewer than 80 CPU cores on the node.

On top of this client side parallelism the analysis of a single table may be parallelized on SAP HANA server side (limited by the max_concurrency parameter).

Limit parallelism
Availability: general | Parameters: <service>.ini -> [execution] -> max_concurrency_hint and <service>.ini -> [parallel] -> num_cores | Default: number of logical CPU cores (SPS <= 08), max_concurrency (SPS >= 09) | Restart required: no

Similar to max_concurrency, these parameters help to limit the number of used job worker threads. They define the number of jobs to create for an individual parallelized operation. For one statement there can be several such operations, so the total number of jobs created for a statement will typically exceed the value of these parameters, as will the number of worker threads used for these jobs.

These two parameters exist for historic reasons but serve the same purpose. You should generally use identical values for both parameters.

Limit garbage collection parallelism
Availability: general | Parameter: <service>.ini -> [persistence] -> max_gc_parallelity | Default: 0 (-> number of logical CPU cores) | Restart required: yes

Garbage collectors clean up no longer needed information on a regular basis. Their parallelism is controlled by the max_gc_parallelity parameter. Per default a rather high parallelism is used, which can result in a high system CPU consumption. By reducing the parameter you can reduce this overhead. Be aware that a small value can result in garbage collection delays, a growing persistence and "database full" situations. See SAP doc 2169283 for more information.

Limit parallelism of table redistributions
Availability: >= Rev. 73 | Parameter: indexserver.ini -> [table_redist] -> num_exec_threads | Default: 10 (SPS <= 10), 20 (SPS >= 11) | Restart required: no

This parameter controls the number of threads that are started for a table redistribution (SAP doc 2081591). Larger values can reduce the overall table redistribution time, but at the same time they increase the resource consumption in terms of memory, CPU and I/O.

Assign CPUs to services
Availability: >= Rev. 90 | Parameter: daemon.ini -> [<service_name>] -> affinity | Default: no assignment | Restart required: yes

It is possible to limit the CPU consumption per service (e.g. statisticsserver, indexserver) by adjusting the affinity parameter in the related daemon.ini section. See the SAP HANA Troubleshooting and Performance Analysis Guide for more information related to CPU affinity adjustments.
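A hypothetical configuration sketch (the service section name is the placeholder used above and the CPU range is only an example value; as listed, the change requires a restart of the affected service):

-- example: bind the given service to logical CPUs 0-7; takes effect after a service restart
ALTER SYSTEM ALTER CONFIGURATION ('daemon.ini', 'SYSTEM')
  SET ('<service_name>', 'affinity') = '0-7';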

Additionally the following options exist:

CPU and / or thread limit for SQL statements of a specific workload class (available as of SPS 10):

Starting with SPS 10 you can define a workload class with a specific thread limit setting:

CREATE WORKLOAD CLASS "<workload_class>" SET 'STATEMENT THREAD LIMIT' = '<max_threads>'
ALTER WORKLOAD CLASS "<workload_class>" SET 'STATEMENT THREAD LIMIT' = '<max_threads>'

The default <max_threads> value is 0 which means no limitation.

See "How can workload classes be mapped to users and applications?" below for further information about workload classes.

5. How can workload management be configured for network?

No SAP HANA network workload management exists. If the network becomes a bottleneck due to bandwidth or latency limitations, it can usually be resolved by optimizing the network infrastructure and the application design. See SAP doc 2222200 for more information regarding SAP HANA network topics.

6. How can database requests be prioritized?

Usually all database requests have the same priority. In order to prioritize requests, you can use the following approach:

Prioritization of SQL statements of a specific database user (available as of SPS 09):

As of SPS 09 you can specify a database user specific statement priority:

ALTER USER <db_user_name> SET PARAMETER PRIORITY = '<priority>'

The priority can vary between 0 (lowest priority) and 9 (highest priority). The default value is 5.
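For example, to lower the priority of statements issued by a hypothetical batch user (user name and value are illustrative only):

ALTER USER BATCH_USER SET PARAMETER PRIORITY = '3';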

Prioritization of SQL statements of a specific workload class (available as of SPS 10):

Starting with SPS 10 you can define a workload class with a specific priority:

CREATE WORKLOAD CLASS "<workload_class>" SET 'PRIORITY' = '<priority>'
ALTER WORKLOAD CLASS "<workload_class>" SET 'PRIORITY' = '<priority>'

The priority can vary between 0 (lowest priority) and 9 (highest priority). The default value is 5.

See "How can workload classes be mapped to users and applications?" below for further information about workload classes.

7. How can workload classes be mapped to users and applications?

As described above, as of SAP HANA SPS 10 you can define workload classes with a defined priority and with limitations in terms of memory and threads. It is possible to define multiple properties at once, separated by commas:

CREATE WORKLOAD CLASS "<workload_class>" SET 'PRIORITY' = '<priority>', 'STATEMENT THREAD LIMIT' = '<max_threads>', 'STATEMENT MEMORY LIMIT' = '<maximum_memory_allocation_in_mb>'

In the second step the defined workload class can be mapped to specific contexts. The following workload mapping options exist:

  • CLIENT: Client number (typically the SAP ABAP client)
  • APPLICATION NAME: Application name (e.g. 'ABAP:<sid>' for SAP ABAP or 'HDBStudio' for SAP HANA Studio)
  • APPLICATION USER NAME: Application user name (e.g. SAP ABAP end user)
  • USER NAME: Database user name (e.g. SAP<sid> for SAP ABAP)

The following command can be used for the workload mapping. You can omit properties in the SET clause if you only want to specify a subset:

CREATE WORKLOAD MAPPING "<workload_mapping>" WORKLOAD CLASS "<workload_class>"
SET 'CLIENT' = '<client>', 'APPLICATION NAME' = '<app_name>', 'APPLICATION USER NAME' = '<app_user_name>', 'USER NAME' = '<db_user_name>'

Example: Limiting the maximum thread parallelism for SAP application users SAPBATCH1 and SAPBATCH2 to 30

CREATE WORKLOAD CLASS "WLC_30THREAD" SET 'STATEMENT THREAD LIMIT' = '30';
CREATE WORKLOAD MAPPING "WLM_SAPBATCH1" WORKLOAD CLASS "WLC_30THREAD" SET 'APPLICATION USER NAME' = 'SAPBATCH1';
CREATE WORKLOAD MAPPING "WLM_SAPBATCH2" WORKLOAD CLASS "WLC_30THREAD" SET 'APPLICATION USER NAME' = 'SAPBATCH2';

Changes to workload classes don't have an immediate effect on existing workload mappings. You have to recreate the workload mappings to activate the new settings.
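A minimal sketch of recreating a mapping after a workload class change (reusing the hypothetical names from the example above):

DROP WORKLOAD MAPPING "WLM_SAPBATCH1";
CREATE WORKLOAD MAPPING "WLM_SAPBATCH1" WORKLOAD CLASS "WLC_30THREAD" SET 'APPLICATION USER NAME' = 'SAPBATCH1';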

8. Where do I find information about configured workload classes and mappings?

Information about configured workload classes and mappings is available in the views WORKLOAD_CLASSES and WORKLOAD_MAPPINGS. Alternatively you can use SQL: "HANA_Workload_WorkloadClasses" (SAP doc 1969700).
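You can also query the views directly (a simple sketch; SELECT * is used only for brevity):

SELECT * FROM WORKLOAD_CLASSES;
SELECT * FROM WORKLOAD_MAPPINGS;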


9. What happens if multiple workload classes match to a specific database context?

If multiple workload classes match a database context, the following rules apply:

  • If one workload class matches more properties than another, that workload class is used.
  • If multiple workload classes match the same number of properties, the following prioritization is used:
    • APPLICATION USER NAME
    • CLIENT
    • APPLICATION NAME
    • USER NAME
