Channel: SCN : Blog List - SAP Solution Manager

Configuration overview for Business Process Monitoring


As part of my blog series on SAP Solution Manager 7.1 features and functions, I decided to write this blog post to give a high-level overview of the steps needed to set up Business Process Monitoring in SAP Solution Manager 7.1 (SP12 was used here). It's not my intention to detail everything in this blog post, but rather to give a view of which high-level steps are needed in terms of configuration.

 

For a detailed setup overview, check out the great presentation (BPM setup roadmap 81 pages) that you can find in the media library on https://service.sap.com/bpm or through the presentation material on business process operation - business process monitoring in https://service.sap.com/rkt-solman.

 

In a previous blog post on getting started with SAP Solution Manager I briefly explained the initial steps to get started with SAP Solution Manager. As a prerequisite, you must have performed the system preparation, the basic configuration of SAP Solution Manager, and the managed system setup for the SAP systems involved in the monitoring scenario.

 

Quick configuration steps cheat chart


steps.png

First step: preparation for the Business Process Monitoring scenario through SAP Solution Manager: Configuration workcenter


solmansetup.png

You start with the guided procedure for Business Process Monitoring, which you can access via the menu in the Solution Manager Configuration workcenter (or via transaction SOLMAN_SETUP, which amounts to the same thing).

 

bpm.png

This guided procedure will help you to get the prerequisites in place to set up a Business Process Monitoring scenario.

details.png

During this guided procedure, prerequisite notes are checked (via RTCCTOOL, for example), and the detail log (link "Show") will show you which SAP Notes are missing, so you can ensure the latest corrections are implemented.

 

Prerequisite before setting up a business process monitoring scenario


You need business process documentation in a solution within SAP Solution Manager (a structure that represents the business processes & business process steps) in order to do business process monitoring.

 

There are multiple ways you can achieve this, but I won't cover them all here because it would take this blog post too far off topic. If you already utilize business process documentation, you can leverage what you have already built. Otherwise, if you want to set up a simple Business Process Monitoring scenario (let's call it a test), it doesn't have to take a lot of time to get started.


You can create a Solution Manager implementation project using transaction SOLAR_PROJECT_ADMIN. The minimal configuration is to give the project a title, choose the language, and insert, on the System Landscape tab, the logical components that represent the SAP systems involved in the business process steps.

 

bpst.png

Once that is done, you can create a business process structure using transaction SOLAR01. Once your structure is ready (just keep it simple to start), you need to insert the structure into a solution; that's done using transaction SOLMAN_DIRECTORY. If you don't yet have a solution, you need to create one first, which can be done via the SAP Solution Manager Administration workcenter.

 

Setting up the business process monitoring scenario

bpmset.png

In the business process monitoring setup, you continue the setup (follow the steps, hit create and follow through the configuration) and you can configure business process monitoring against business process steps.

 

After you run through the configuration steps, you can see the monitoring overview in the Business Process Operations (new) workcenter under Business Process Monitoring. Recently introduced is the integration into the Monitoring and Alerting Infrastructure (MAI), which makes the technical architecture and the look & feel of the scenario the same as for Technical Monitoring.

 

resulting.png

 




Solution Manager 7.2 Roadmap Webcast Summary Part 1


The official roadmap is at https://websmp106.sap-ag.de/~sapidb/011000358700001435482012E.pdf (SMP logon is required)

 

The usual disclaimer applies that things in the future are subject to change.

 

Matthias Melich, SAP, provided this webcast

 

Current release is 7.1 with maintenance commitment to 2017

 

Solution Manager 7.2 will go into ramp-up in the middle of next year; by Q4 2015 SAP expects it to be GA

 

Then if you are on Solution Manager 7.1 you have 2 years to transition

1fig.png

Figure 1: Source: SAP

 

In the past SAP has made only a few investments in implementation.  The next release will see a big investment – “pragmatic business process management” – most of this presentation is on this topic

 

Most customers have large on-premise landscapes

 

SAP is actively driving cloud

 

The lifecycle shown in Figure 1 means most customers will be in a hybrid situation, supporting on-premise and cloud solutions in an integrated way

 

SAP wants customers to use SolMan for hybrid environment

 

There is a difference: Solution Manager for HANA and on HANA

 

Solman 7.2 will be available on SAP HANA

 

SolMan 7.1 is IT and less for business

 

SolMan 7.2 is a more business balanced SolMan

2fig.png

Figure 2: Source: SAP

 

Figure 2, on the left, shows the development system – as planned, Solution Manager 7.2 will provide “state of the art” process modeling

 

Picture shows what is typical in modeling environments

 

PowerDesigner, which SAP acquired with the Sybase acquisition, is being used

 

Today, in 7.1, you need a landscape to enter business process steps; business process experts can’t use this

 

SAP wants to decouple this

 

Use 7.2 early in a project and hand it over to business process experts; SAP wants to make it easier to use for documenting business processes

 

SAP wants to extend diagnostic and analytics framework in Solman for managing business case

 

SAP wants to extend framework to innovations area of SolMan  - relate business case to KPI’s

 

SAP is investing in pre-configured solutions – have RDS’s (rapid deployment solutions)

3fig.png

Figure 3: Source: SAP

 

Figure 3 shows processes will have more than 3 levels

 

Figure 3 shows a screen shot,  non-graphical view of Solman

 

SAP wants openness to other modeling tools

 

SAP will have a marketplace on SCN; will allow vendors to certify interface similar to Service Desk, with a bi-directional interface

 

This will be for business processes – not full-blown UML; for full-blown modeling, look at PowerDesigner

 

SAP hopes to have interface added by then; but will not be part of the ramp-up scope

4fig.png

Figure 4: Source: SAP

 

SAP will do away with some of the restrictions today

 

Figure 4 shows a technical object library – transactions, reports, all objects in system only once – e.g. VA01 only once

 

SAP will structure this library according to application component hierarchy

 

Library is based on usage – object only goes to Technical Objects Library (TOL) if used

 

It will generate this library automatically

 

Process Step Library or PSL is based on usage – using application component hierarchy as a reference.  The PSL is the home for documents, test cases – can have multiple occurrences of technical objects – PSL is available per system. It is generated automatically; built on top of TOL

 

E2E documents business processes across systems – business process library – can’t build automatically – this is optional – pull steps from individual systems

5fig.png

Figure 5: Source: SAP

 

Figure 5 shows several paths

 

If a customer has no solution documentation today, the libraries are generated and built up end to end

 

If you have solution documentation today, all documentation is read-only after the upgrade; customers migrate projects to the new environment – not in an automated fashion

7fig.png

Figure 6: Source: SAP

 

Figure 6 shows the link to the business case; once the technical implementation is done, look at how the implementation is adopted via the usage of the systems – business view in pink

IT view – requirements, test, change, application usage verification

 

Part 2 of my notes is coming; focusing on the Cloud, HANA, future direction and question & answer

 

Related

Hopefully we'll learn more details at SAP Insider's Basis & SAP Administration 2015

Solution Manager 7.2 Roadmap Webcast Part 2 - Cloud HANA Question Answer


Part 1 is Solution Manager 7.2 Roadmap Webcast Summary Part 1

 

The usual legal disclaimer applies that things in the future are subject to change.

 

Cloud Adoption

1fig.png

Figure 1: Source: SAP

 

When systems are on a private cloud such as SAP HEC, there is no real difference for SolMan

 

For a public cloud – such as SuccessFactors or Ariba – the plan is to allow SolMan to provide services for public cloud solutions

 

LMDB will have a new interface

2fig.png

Figure 2: Source: SAP

 

Public cloud performance monitoring  will be in Solman and will be in 7.2

 

It is also included on SolMan 7.1 SP12

3fig.png

Figure 3: Source: SAP

 

Figure 3 shows how to register cloud service with a guided procedure

4fig.png

Figure 4: Source: SAP

 

Figure 4 shows interface and connection monitoring that is “In the pipeline”

 

Solution Manager with In Memory Technology

5fig.png

Figure 5: Source: SAP

 

All customers who have a valid support contract can use Solman on HANA without additional licenses

 

This will come with transition support, standards, and more

6fig.png

Figure 6: Source: SAP

 

Figure 6 is not part of 7.2 but a strategic message for the future

 

SolMan will have a “Fiori like experience”

 

Notice too that the reports will come from HANA Live, not BI/BW

7FIG.png

Figure 7: Source: SAP

 

Installing, registering, and doing an upgrade will be improved with the maintenance planner shown in Figure 7

8fig.png

Figure 8: Source: SAP

 

Figure 8 shows there is a ramp-up for maintenance planner today

 

SAP is looking for ramp-up customers

 

Maintenance optimizer will be gone in 7.2

9fig.png

Figure 9: Source: SAP

 

Figure 9 shows there is no new CRM release in 7.2; only an enhancement package

10fig.png

Figure 10: Source: SAP

 

SOLAR01 and SOLAR02 are going away

12fig.png

Figure 11: Source: SAP

 

Figure 11 was already announced previously

 

Question & Answer

Q: Java stack for SolMan – 7.1 – some functionality is removed from Java stack – going forward could we have Solman without the Java stack

A: In 7.2 this will not be possible; the Java stack is still required

 

Q: Business process design – 7.1 – Advanced Business Process Blueprinting – how will it work in 7.2?

A: The first customers came back with lots of feedback; SAP checked the architecture and decided to rethink process modeling, which is why it is coming out of a new environment

 

Advanced blueprint will not work on 7.2; only a few customers are using it

 

Q: What is the planned support for current 3rd Party Tools: HPQC, Redwood CPS, Wily, Productivity Pak?

A: SAP will continue to support the interfaces in 7.1, but will not have interface support during ramp-up – there are changes coming; the architecture is changing, and SAP wants it stable before offering interfaces

 

Q: What happens if upgrade from 7.1 to 7.2 and have open projects with transports?

A: Current feedback – will have to close project; they are revisiting with development organization – customers are unhappy – customers want to continue to work after the upgrade

 

Q: Will a planner Excel sheet be available to allow customers to find risks and timelines?

A: Not at this point but plan to offer the system – need to upgrade and see what the situation is

 


MOPZ Framework 3.0


Dear followers

 

My name is Mateus Pedroso from MOPZ/LMDB/Solman Configuration team and I'll start to write some posts about these topics. I would like to start writing about MOPZ framework 3.0.

 

MOPZ Framework 3.0 is the standard for Solution Manager 7.1 SP12, but you can apply Note 1940845 to enable MOPZ 3.0 in Solution Manager 7.1 SP05-SP11. Note 1940845 must always be implemented in its latest version, as it fixes some bugs in MOPZ 3.0; it's therefore very important to ensure that the latest version of Note 1940845 is implemented, even on Solution Manager 7.1 SP12. The following points changed in MOPZ 3.0:

 

- UI and performance.

- Integration of the Maintenance Optimizer with the Landscape Planner.

- Add-on installation procedure.

 

You can check more details about MOPZ 3.0 in the PDF attached to Note 1940845.

 

The most important improvements you'll notice are the performance and the add-on installation. The Choose Add-ons phase usually takes a long time to finish in MOPZ 2.0; in MOPZ 3.0 this phase received some performance improvements.

Here's a screenshot showing that now you can apply add-ons in step 2.

mopzaddon.png

 

Now it's easier to apply add-ons.

mopzaddon1.png

In the next posts, I'll explain some LMDB/SLD topics related to MOPZ and how to fix some well-known issues.

How-to use the new CBTA Loop capabilities


PREREQUISITES

The looping capability is planned to be shipped with SAP Solution Manager 7.1 SP13.

Alternatively, you can implement the following notes in advance:

  • 2088536 - Downport CBTA Default Components
  • 2088525 - IF and LOOP Default Components for CBTA
  • 2029868 - CBTA - Runtime Library - Fixes & improvements

 

USE-CASE FOR LOOP FUNCTIONALITY:

A test script may need to perform actions against an unknown number of entries in a table. The script may therefore need to:

  • Start at first row and check if there is an entry
  • If entry exists perform one or more actions on the current row
  • Continue with next row

 

REQUIRED DEFAULT COMPONENTS: DO, EXIT_DO, LOOP

 

Keyword: DO

It can be used to iterate over several steps. It defines where the loop starts.

  • It must be used together with the LOOP keyword which defines where the loop ends.
  • The EXIT_DO keyword must be used as well to determine when to stop the loop.

 

The CounterName parameter provides the name of the iteration counter. This counter is incremented automatically at runtime while iterating over the included steps. The actual value of the counter can be retrieved using the regular token syntax.

For instance, when CounterName is set to "COUNTER" its value can be reused in the subsequent steps using %COUNTER% (or $COUNTER$ for specific situations where the percent character is ambiguous).

 

If you plan to use nested loops, please make sure to declare different counter names.

 

Component Parameters

 

CounterName: Specifies the name of the iteration counter.

 

Keyword: EXIT_DO

It must be used within a loop that has been defined using the DO and the LOOP keywords. The EXIT_DO keyword interrupts the loop as soon as the condition is met.

A typical use case is to check the value of iteration counter that has been declared via the CounterName parameter of the DO keyword.

For instance, when CounterName is set to "COUNTER" its value can be checked using the %COUNTER% token.

 

Component Parameters

LeftOperand

  • Specifies the value of the left operand that is to be checked.

Operator

  • Specifies the boolean operator to use.

The operators supported are the ones below:

    • = for "Equal to"
    • < for "Less than"
    • > for "Greater than"
    • <= for "Less than or equal to"
    • >= for "Greater than or equal to"
    • <> for "Not equal to"
    • {contains} for "Contains"
    • {startsWith} for "Starts with"
    • {endsWith} for "Ends with"

An additional operator is supported when testing WEB applications (i.e.: applications running in the browser):

    • {matches} for checking whether the value matches a regular expression. The regular expressions are expressed using the .NET syntax.

RightOperand

  • Specifies the value of the right operand that is to be compared with the left operand.

 

Options

The options parameter lets you perform some adaptations or conversions of both the left and right operand before comparing them.

The supported options are:

  • /u (for uppercase) - Both values are converted to upper-case before being compared
  • /t (for trimmed) - Both values are trimmed before being compared
  • /i (integer) - Both values are converted to an integer before being compared
  • /f (float) - Both values are converted to a float (or double) before being compared
  • /b (bool) - Both values are converted to a Boolean before being compared

 

Keyword: LOOP

It defines the end of the loop and must be used together with the DO keyword which defines where the loop starts.

 

  

EXAMPLE – PROCESS LINE ITEMS IN SALES ORDER

The following script was created for transaction VA02 (Change Sales Order) to add shipping information for each line item of an existing sales order.

script.png

 

With the DO keyword the loop starts and the counter is set to ‘1’.

DO.png

To be able to address the row number starting at ‘0’, we take the counter value minus ‘1’ using the CBTA_A_SETINEXECUTIONCTXT component.

SETINEXECONTEXT.png

Then the script reads the value in the first column of the row to check whether an entry exists.

GETCELLVALUE.png

 

If the value is empty, we exit the loop with the EXIT_DO keyword.

EXIT_DO.png

Otherwise, the script performs the required actions for the current row:

  • Select row

SELECT_ROW.png

  • Menu Goto --> Item --> Shipping
  • Enter the required shipping information using the related screen component
  • Go back to main screen

With the LOOP keyword the script goes back to the DO keyword while increasing the counter and processing further line items of that sales order.
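Put together, the control flow of this example can be sketched in Python (a hypothetical illustration; `process_row` stands in for the CBTA screen components used above, not for any real API):

```python
# Illustrative sketch of the DO / EXIT_DO / LOOP flow from the VA02 example.
# process_row stands in for: select row, menu Goto -> Item -> Shipping,
# enter the shipping information, go back to the main screen.

def process_row(table, row):
    table[row].append("SHIPPED")      # placeholder for the real screen actions

def process_line_items(table):
    counter = 1                       # DO: iteration counter starts at 1
    while True:
        row = counter - 1             # row index starts at 0 (counter minus 1)
        # GETCELLVALUE: read the first column of the current row
        value = table[row][0] if row < len(table) else ""
        if value == "":               # EXIT_DO: empty cell -> leave the loop
            break
        process_row(table, row)       # actions for the current row
        counter += 1                  # LOOP: back to DO with incremented counter
    return counter - 1                # number of line items processed
```

Running this against a table with two filled rows followed by an empty row processes exactly the two filled rows and then stops.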

How to setup Cross System Object Lock (CSOL) when using enhanced retrofit


The cross-system object lock functionality ensures that when an object is changed in a managed system, a lock entry is created for this object in the central SAP Solution Manager system. Depending on the selected conflict analysis scenario, this lock entry prevents changes being made to this object by any other change (transport request). This applies to all managed systems and clients for which the cross-system lock has been activated.

Once the cross-system object lock has been activated, the system can detect conflicts between objects in transport requests that have the same production system or the same production client as their transport target.

The purpose of the cross-system object lock function is to protect your production system from “passing developments”.

 

Inside a Change Request Management maintenance project all changes (Normal, Preliminary, Urgent and Defect) will consolidate with the project. As the import method is IMPORT_PROJECT_ALL “passing developments” inside a project can never happen.

 

An exception to this is that Preliminary Changes & Urgent Changes can pass each other within a project. Therefore the use of CSOL is necessary to protect the PROD system from downgrades.

 

Also if more than one project is available for the same system landscape, CSOL can protect the PROD system from downgrades.

 

Automatic categorization of objects to retrofit (Auto Import, Retrofit and Manual) is based on the cross-system object lock entries in Solution Manager.
If the enhanced retrofit function does not detect a cross-system object lock entry for an object of a transport request that should be retrofitted, the object will be flagged as an Auto Import object.

error.jpg

A change to object A is performed in the DEV system. This change is recorded in the CSOL table of Solution Manager. Now it happens that in the PRD system a fix is needed. The fix will be performed in the MAINT system and has to change object A as well. As the CSOL entry blocks the second change (fix) of object A the only solution to go on is to delete the CSOL entry as the fix is necessary to solve the issue in PRD.

If now the transport request in MAINT is released and the retrofit categorization is calculated the retrofit will not detect an entry for object A and therefore calculate a green case.

If now retrofit is performed the version of object A in the DEV system is overwritten!

 

How can we avoid this behavior?

 

You can customize how CSOL shall behave.

csol.jpg

csol2.jpg

You will find default mode and expert customizing.

We will need to use the "expert" customizing as the default mode does not protect you 100% from the issue described above.

csol cust.jpg

The "Project Relation" customizing is key for the enhanced retrofit scenario. By default it's set to "cross", which means that conflicts from different projects as well as conflicts within the same project will stop the process.

What we want to avoid is exactly that conflicts from different projects end in a termination of the process. Therefore the project relation has to be set to "Specific". This means that only conflicts within the same project will result in a termination; conflicts across different projects will only appear as warnings.

The other settings do not influence the enhanced retrofit behavior, so change type relation and object type can be set however you need. Note, though, that the project relation should only be set to "specific" if you have the enhanced retrofit scenario active in your landscape.

One exception: if you can rule out maintenance projects in the DEV landscape, urgent changes cannot be created there (they are only allowed in maintenance projects), which means the default mode comes back into play.

Also possible is the "warning only" setting, which means that all conflicts are only ever detected as warnings and the process is never terminated.

In this case it's necessary to also activate the downgrade protection (DGP). This ensures that even if you only get a warning in CSOL, you still cannot get passing developments, as DGP checks again at release and at every import.
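The conflict-handling behavior of these settings can be summarized in a small sketch (hypothetical Python; the setting and outcome names follow this post, not any actual Solution Manager API):

```python
# Hypothetical summary of the CSOL conflict outcome per "Project Relation"
# setting, as described in this post. Not actual Solution Manager code.

def csol_outcome(project_relation, same_project):
    """Return 'terminate' (process stops) or 'warning' for a detected conflict."""
    if project_relation == "cross":
        return "terminate"                      # any conflict stops the process
    if project_relation == "specific":
        return "terminate" if same_project else "warning"
    if project_relation == "warning_only":
        return "warning"                        # DGP must be active on top!
    raise ValueError(f"unknown setting: {project_relation}")
```

With "specific", a cross-project conflict only warns, so the CSOL entry can stay in place and the retrofit categorization stays correct.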

 

So with these allowed settings you will never need to delete an entry from the CSOL list because of Urgent Changes needing to be implemented to PRD as fast as possible. Also in any other conflict situation you will never need to delete entries from the CSOL list to go on with your process.

This way you will never get a wrong "green" retrofit categorization that ends up in an overwrite in DEV.

 

Conclusion:

When using the enhanced retrofit in Solution Manager, the use of the cross-system object lock is mandatory for the correct behavior of the tool.
You cannot use the enhanced retrofit without having CSOL set up and activated for the retrofit-relevant projects.
With some of the available conflict analysis customizing settings in the cross-system object lock, there is a danger of downgrading your implementation work.

When using the enhanced retrofit, you should only use project relation "specific". Any “cross-project” setting is not allowed, because a terminating cross-system object conflict would require the deletion of the corresponding lock entry. But that lock entry is required for the correct analysis of the enhanced retrofit.

 

Summary:

When using the enhanced retrofit scenario make sure your CSOL customizing is set to "specific" from the project relation point of view.

Also, "warning only" is a valid setup if DGP is activated on top. The default mode can also be valid for the enhanced retrofit scenario when it's ensured that no urgent changes can ever be created in the implementation landscape (DEV).

DBA COCKPIT CONFIGURATION IN MANAGED SYSTEM SETUP SOLUTION MANAGER 7.1


How to configure and troubleshoot the DBA Cockpit in Managed System Setup, Solution Manager 7.1


In Managed System Setup - Step 4 (Enter System Parameters) highlighted below

Page1.png

 

Please provide all the required details for the DB parameters:

 

DB Host

Service Name

Port Number

TNS Name

 

The user name will be your ABAP schema user (for Java: SAPSR3DB).

Page0.png

 

Once you have provided all the required information, save. You will see a log message saying:

 

The DBA cockpit connection %_******** is OK. DB Extractors can be activated

 

image12.jpg

 

Once this step is completed, we can activate the DB extractors in Step 8 (Configure Automatically). We can check the successful connection entry in the DBACOCKPIT transaction. Below is a screenshot for your reference.

 

 

Page77.png

Troubleshooting a connection error

 

 

If the DBA Cockpit connection cannot be established, you will get the message below.

 

 

Page33.png

 

We can see the error that the DBA Cockpit connection cannot be established.

Page44.png

 

page55.png

We can see the same connection entry in the DBCO transaction. Delete these existing entries.

Page66.png

 

Delete all the entries in DBACOCKPIT and MSS, and check the tnsnames.ora at operating system level: at both locations it should contain the same entries as per the managed system. If not, change the entries and check with tnsping.

 

 

Then configure the DBA Cockpit in MSS in the same way as shown above. Once the DBA extractors are activated, check the connection in transaction DBACOCKPIT in SolMan.
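The tnsnames.ora comparison described above can be partly automated. The sketch below (hypothetical Python with deliberately naive parsing) only compares the top-level alias names of two files, which is often enough to spot a missing or misnamed entry:

```python
import re

# Naive sketch: extract the alias names defined in a tnsnames.ora file and
# report aliases missing on either side. Real tnsnames.ora parsing is more
# involved; this only compares top-level alias names, not their contents.

def tns_aliases(text):
    # an alias definition starts at column 0 and is followed by '='
    return {m.group(1).upper() for m in re.finditer(r"(?m)^(\w[\w.]*)\s*=", text)}

def compare_tnsnames(solman_text, managed_text):
    a, b = tns_aliases(solman_text), tns_aliases(managed_text)
    return {"missing_on_solman": sorted(b - a),
            "missing_on_managed": sorted(a - b)}
```

If a mismatch shows up, correct the entries and verify name resolution with tnsping, as described above.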

 

DBACOCKPIT connection in SOLMAN → the connection should be successfully established.

 

Thank You,

Nahid

Partner determination via Rule policy ITSM / CHARM (NO ABAP REQUIRED).


Let's continue exploring partner determination with default or dependent values under certain conditions, to make ITSM / ChaRM procedures more flexible. There is a topic on how to set up partner determination via BRF+, and a very good blog post from Vivek. But why not think about other possibilities? Here comes a CRM framework called Rule Policies: it is mainly used in SolMan as a dispatch tool, but if you dig deeper, its true possibilities open up to you.

 

Example Scenario: Rule policy ITSM / CHARM partner determination (NO ABAP REQUIRED).

 

Steps:

1.    Assign Rule modeler to Categorization schema of CR

2.    Create Rule policy type SRQ ZMCR_DEFAULT_BP

       Mapping If category = CAT_1 then route developer = 11.

3.    Copy SAP_SRQMROUTING to Z/Y

4.    Assign policy

5.    Assign Service Manager Profile to Change Request transaction

       Change Request = ZCR_SRQMROUTING

6.    Test

 

 

1. Assign Rule modeler to Cat schema of CR

 

First, let's go to SolMan's CRM Web UI and set up the things we need

Tcode SM_CRM – Service Operations – Categorization Schemas

Choose the schema that is assigned to the Change Request

Add new version, go to Application Areas and press New and add

Application ID – Rule Modeler

Parameter – Context

Value – Service Request Management

We need a row like the last row in the picture below:

Снимок1.PNG

 

2. Create Rule policy type Service Request

 

Tcode SM_CRM – Service Operations – Rule Policy, now here we need to create a new Rule policy

Снимок2.PNG

 

Context – Service Request Management.

Give some name to Rule Policy

 

Снимок3.PNG

This technique will work for any type of SolMan transaction, both ITSM and ChaRM.

Now its most interesting part – the design part!

Choose Draft Rules row, press Subnode

Снимок4.PNG

Name it as you like, e.g. Category = Partners, and hit Subnode again

 

 

Again, give it a proper name to avoid confusion when reading policies, and press Add Entry in the Conditions block

 

Снимок6.PNG

Снимок7.PNG

 

Choose

Attribute – Order Category

Operator – Contains

Value – choose the category you wish to map the partner functions to, e.g. for our popular Change Manager; in the example it is ATH

Снимок8.PNG

Now in Action block press Add Entry

Choose Action – Route to a Partner, Partner Function – SDCR0002 Change Manager, Partner – who we need to assign as Change Manager in this case

Снимок9.PNG

For example, I have set up all needed partner functions to be filled for the category ATH; see below:

Developer, Tester, Custom partner function and etc.

Снимок10.PNG

Do not hurry to go to the next topics, take some time and sit here, because here you can make any scenario you need.

For example you can combine multiple checks for any situation: User status, Priority or Change category with other condition like check the category. You may Match conditions with AND / OR operators.

 

3. Copy SAP_SRQMROUTING Service Manager Profile to Z/Y

 

Tcode SPRO - Customer Relationship Management - E-Mail Response Management System - Service Manager - Define Service Manager Profiles. Choose SAP_SRQMROUTING and press Copy button on the top, name it like ZCR_SRQMROUTING

Снимок11.PNG

4. Assign Rule policy to Service Manager Profile

 

Stay on your ZCR_SRQMROUTING – double click Directly Called Services – double click Properties

Policy = your created policy in our case YALM_ZMCR_2

Снимок12.PNG

5. Assign Service Manager Profile to Change Request transaction

 

Transactions - Additional Settings - Assign Dispatching Rule Profile to Transaction Types.

 

 

 

6. Test

 

Now go to SM_CRM, create or pick any Change Request, and press More – Dispatch; all partners will be filled as mapped. Like here: chosen category = ATH, pressing Dispatch

Снимок14.PNG

All partners filled

Снимок15.PNG

This will work even if Partner Function is not empty

 

Have fun!

D.K.


How to set up central monitoring of BI (systems in landscape & chains, jobs) via Solution Manager 7.1?

With 7.1, the central monitoring capabilities of Solution Manager have improved immensely. For setting up monitoring of the systems in the BI landscape and of the objects in these systems, we now have several options available:
1. Via Technical Monitoring - BI Monitoring
2. Via Business Process Monitoring (BP MON) - BW process chain monitoring.
3. Via Unified Job Monitoring

I would like to highlight which option is a good choice and what the pros and cons of each approach are.

 

What is BI Monitoring in Technical Monitoring?

It provides central monitoring and alerting capabilities, integrated with guided procedures for the alert resolution path. This caters to the need of an administrator to get an overview of the health of the systems participating in the BI landscape, in addition to the objects specific to the data flows within the landscape.
Target audience:
BI administrator
BOBJ administrator
Application Support
BI operations Team
centralmontiroing_BIMON.PNG

Runtime:

1. Overview Monitor

A single-screen overview of:
The health of all systems participating in the BI landscape
  • Availability
  • Performance
  • Exceptions
A view of the health of the data flow entities in the landscape
  • BW process chains (ETL)
  • BOBJ jobs (Reporting)
  • DS jobs (Replication)
  • BEx queries & templates (ad-hoc reporting)

2. Detail Monitors

Provides specific information on the health of these monitored objects by monitoring certain metrics per instance of these recurring jobs, which are representative of the health of the data flows across the systems.
Overall health of the job
  • Status
  • Error logs (managed system login may be required)
Scheduling metrics
  • Start delay
  • Not started on time
Runtime metrics
  • Duration
  • End delay
  • Out of time window
Data integrity metrics
  • Records processed
  • Data packages processed
  • Rows_read
  • Rows_written (Data Services)

DETAIL_BIMON.PNG
Supported system types in BI Monitoring
System monitoring metrics are integrated in the overview monitor of BI for the following system types:

  • BW JAVA
  • SAP HANA Database
  • SAP SLT
  • BWA(TREX system)
  • ABAP Source system
  • BOE WAS (TOMCAT, WebSPhere, SAP_J2EE)
In addition to the system monitoring metrics, we can configure jobs/reports on top of these systems to be monitored.
  1. SBOP  (SAP Business objects platform jobs in CMC)
    1. 3.x
    2. 4.x
  2. SAP DATA SERVICES (Jobs)
    1. 4.1
    2. 4.2
  3. BW ABAP server (Process chains, BeX Queries, Templates)

MONITORS_BI_MON.PNG

As you can see, these screens have a designated flow for navigation. A single overview screen to show the health of all participating systems in the landscape and subsequently drill down capability by system-type and then to the monitored objects per system and then to its instance details!

Some key features in the BI Monitoring configuration (design time in SOLMAN_SETUP) cater to handling mass objects in the configuration UI:
  1. Mass maintenance of thresholds
  2. 'Take from schedule', which reads the schedule from the managed system to assist in configuring thresholds
  3. Excel upload & download
  4. Managed object details shown in the configuration to assist in providing threshold values for metrics like duration, records processed, etc.
Nevertheless, this application has certain limitations.
1. The runtime (monitoring UI) is not always coloured!

The monitoring UI can report a grey rating for monitored objects that are not frequently executed in the managed system, for example a chain that executes only once a week. With the collection frequency set to 5 minutes in Solution Manager, the status of such a chain in the monitoring UI turns grey 10 minutes after the chain ends and only gets a rating again at the next execution, i.e. in the next week. However, any alert would remain in the alert inbox with the history of measurements.
{Workaround: increase the collection interval. Instead of once every 5 minutes, collect once every hour, especially for longer-running chains. The monitoring then stays colourful for twice the collection interval, so instead of 10 minutes the rating is available for 2 hours. The flip side is a delay in alerting the bad news of a failure.}
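The grey-out trade-off can be sketched numerically. The sole assumption, taken from the behaviour described above, is that the UI keeps a rating for roughly twice the collection interval after the last measurement:

```python
def rated_window_minutes(collection_interval_min):
    # Assumption from the observed behaviour: the last rating survives for
    # twice the collection interval, then the monitored object turns grey.
    return 2 * collection_interval_min

print(rated_window_minutes(5))   # 10  -> grey 10 minutes after a weekly chain ends
print(rated_window_minutes(60))  # 120 -> the rating stays visible for 2 hours
```

The longer interval keeps the UI coloured longer, at the price of learning about a failure up to an hour later.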

2. The alert inbox has multiple alert groups open, for instance for the same BW process chain LOG_ID, due to:
    1. Grey metrics (collector issue, MAI extractor issue, engine design)
    2. If an open alert for a chain failure is confirmed in the alert inbox, the next collection will report this error again and a new alert is opened (owing to a fixed look-back time of 36 hours for the ST-A/PI collector)

 

 

=> SERIOUS CONSEQUENCE: multiple (duplicated) automatic email notifications

Workarounds for reducing the number of duplicate emails (this does NOT eliminate the duplicate alerts; it only reduces the duplicate emails):

1. Restrict the collection to a small and relevant time window (to reduce the occurrence of grey metrics from the collector).

    1. However, in the advanced tab of the data collection scheduling, the managed system time zone is not handled in ST releases below SP10 (UTC is considered)
    2. If the chain executes over midnight, it is not possible to configure this restriction at design time

2. Increase the collection interval. Instead of once every 5 minutes, collect once every hour, especially for longer-running chains. This reduces the probability of a grey alert! The flip side is a delay in alerting the bad news of a failure.


3. No analytics capabilities for the collected metrics, i.e. no interactive reporting and no BW reporting!


What about monitoring BW process chains via BPMon (BW process chain monitoring)?


    In order to support end-to-end monitoring of business processes, which can span several systems and internally comprise different entities like interfaces, jobs and process chains, Solution Manager offers the possibility to orchestrate a business process and set up monitoring for the participating entities. In such a context, process chains can be set up for monitoring.
    Advantages:

    => A clear business-process-driven approach to monitoring
    => Support of extended schedules, multiple 'not started on time' checks, etc.
    Limitations:
    - No overview of the underlying technical systems' health
    - No contextual navigation to the underlying system monitoring
    - Always requires a business process solution to be orchestrated in Solution Manager to set up monitoring


    What is Unified Job Monitoring?


    Over the last year, we have been working on ways to fix these known issues and develop an application that serves customer requirements by closing the existing gaps. Starting with SP10, we unveiled the new work center: Unified Job Monitoring.
    • A consistent approach to monitor all types of jobs (BW process chains, ABAP jobs, SBOP jobs, SAP Data Services jobs), including jobs scheduled by SAP CPS (Redwood)
    • Reporting on background jobs without requiring direct access to production systems, using the collected metrics (BW analytics)
    • Powerful monitoring capabilities with factory calendar awareness, job log content, business process context and so on

    MOTIVATION_JOB.PNG

    We have developed a brand-new monitoring UI in the interest of transitioning to the new HTML5 technology (SAPUI5). Below is a glimpse of this monitoring UI.
    jobMON-MOnUI_Sp12.PNG
    Key features
    In order to remove redundant collection in the managed system and to provide a persona-specific runtime view, we have unified the configuration, data persistency, collection and monitoring UI.


    Design Time:

    1. Reuse of monitoring objects from three entry points: BP Monitoring solution, Technical Monitoring scenarios, Job Documentation.
    2. Pattern-based monitored objects are supported for ABAP, BO and DS jobs. A BW process chain, however, has to be specified with its fully qualified name.
    entry_points.PNG
    Runtime:
    1. Intermittent grey alerts are avoided
    2. Multiple email notifications are overcome
    3. Support of BW reporting.
    reporting_JOB_MON_SP12.PNG

    When to use what?
    1. If you need the overview monitor: Technical Monitoring - BI Monitoring, with certain workarounds for the known limitations.
    2. For a pure business process context: BPMon - BW process chain monitoring.
    3. For a harmonized approach to monitoring: starting with SP12, please migrate to the BPMon- and MAI-integrated Unified Job Monitoring.
    SAP will continue to invest only in this option. Overall, Unified Job Monitoring addresses the known limitations of BI Monitoring and integrates the BPMon-based process chain monitoring.
    However, there are still gaps, owing to the development time required. We intend to bring the best of both worlds together in Unified Job Monitoring.
    Starting with SP12, you can migrate an existing BPMon solution to an MAI-based solution. This ensures automatic usage of Unified Job Monitoring if there are jobs or process chains available in the classical solutions.
    Details: execute the migration report R_AGS_BPM_MIGRATE_SOLU_TO_MAI via SE38. Use the F4 help to identify your solution.

    Similarly, there is also a means to migrate relevant objects of existing Business Intelligence Monitoring scenarios to a Job Monitoring scenario.
    Details: in transaction SE38, run AC_JOBMON_MIGRATION. It migrates or copies job-type objects (BW process chains, SBOP jobs, DS jobs) from existing BI Monitoring scenarios to a new Job Monitoring scenario to utilize the new collection framework and monitoring UI.
    Comparison
    In 7.1 SP12, all three monitoring work centers co-exist. A comparison chart of features between Technical Monitoring - BI Monitoring and Unified Job Monitoring is below.

    Feature | BI Monitoring | Unified Job Monitoring
    ABAP jobs and steps | NA | X
    BW process chains and steps | X | X
    SAP BusinessObjects jobs (SBOP) | X | X
    SAP Data Services jobs | X | X
    External scheduler (SAP CPS Redwood) | NA | X
    BW BEx reports & templates | X | NA
    MAI features (notifications, incidents, alert inbox, third party) | X | X
    Integration to MAI System Monitoring and contextual navigation to System Monitoring | X | Planned
    Contextual navigation to managed system analysis tools from the monitoring UI | X | Planned
    Overview monitor for viewing the overall scenario health of all landscape entities & jobs | X | Planned
    (MAI) work mode awareness | NA | Planned
    Mass handling of monitored objects & thresholds | X | Planned
    Integration to Job Documentation | NA | X
    Guided Procedure for alert resolution | X | X
    Please write to me regarding:
    1. How is job monitoring done today?
    2. Which job scheduling tools are used in the landscape (embedded schedulers from managed systems, CPS, UC4, Solution Manager JSM)?
    3. What are the relevant/important job types?
    4. Is SAP Solution Manager-based Job/BI Monitoring used? What is the feedback? Which functionality is missing?
    Regards, Raghav, S
    Development Manager, SAP Solution Manager

    How to migrate BI Monitoring objects (BW PC, SBOP, DS jobs) to Job Monitoring (post SP12)


    It is essential to read this blog before you proceed further.

     

    Since 7.1 SP10, there exist two work centers in technical monitoring namely BI monitoring and Job Monitoring.

     

    There are some inherent design shortcomings in the BI Monitoring application in 7.1, so SAP decided to invest in a renewed work center to overcome these deficiencies. As part of this approach, we decided to unify the collectors of Business Process Monitoring and Technical Monitoring on the same infrastructure, namely MAI.

     

    In terms of runtime and reliability, Job Monitoring offers a more robust alerting mechanism.

     

    What does the migration report do, and when should you execute it?

     

    If you have been actively using BI Monitoring (prior to SP12) to monitor BW process chains (PC), SBOP and SAP Data Services jobs and have several managed objects configured, we provide a report program to transfer these configurations to managed objects of type Job Monitoring, in order to utilize the new unified job monitoring collection mechanism and thereby overcome the known limitations of BI Monitoring.

     

    What objects are migrated?

     

    If the technical scenario in BI Monitoring has managed objects of type BW PC, SBOP or SAP Data Services jobs, this migration report acts on them.

     

    Which objects are not migrated?

     

    Bex queries and templates are not migrated

     

    FAQ:

     

    1. What happens to the old BI monitoring objects?

     

    The old BI Monitoring scenario remains as is. There are options to decide what should happen to the objects in this scenario.

    A dropdown exists to execute this migration one scenario at a time.

     

    There are two options

     

    a) Migrate BI Monitoring objects:

     

    We create a new Job Monitoring technical scenario; '_BIMONIT' is added as a suffix to the existing BI Monitoring scenario name. Then, in the chosen scenario, all the objects of type process chains, SBOP jobs and SAP Data Services jobs are referenced, and the existing configuration (metrics, thresholds, notification and incident settings) is migrated by creating new job monitoring objects of the respective subtype in the new job monitoring scenario.

    In this case, BI Monitoring is still functioning and Job Monitoring is now also functional.

     

    b) Migrate and deactivate BI Monitoring objects:

     

    We create a new Job Monitoring technical scenario; '_BIMONIT' is added as a suffix to the existing BI Monitoring scenario name. Then, in the chosen scenario, all the objects of type process chains, SBOP jobs and SAP Data Services jobs are referenced, and the existing configuration (metrics, thresholds, notification and incident settings) is migrated by creating new job monitoring objects of the respective subtype in the new job monitoring scenario.

    In this case, these objects are deactivated from monitoring via BI Monitoring and are now available via Job Monitoring.

     

    2. What happens to the old BI Monitoring scenario?

    The old BI Monitoring scenario remains as is, active and functional.

     

    3. What happens to the parts of the BI Monitoring scenario that are not migrated?

    The old BI Monitoring scenario remains as is. The other parts of the BI Monitoring scenario, like the BW BEx queries & templates and all the systems included in the scope selection (step 4, Define Scope), remain in the BI Monitoring scenario. Depending on the option chosen for migration, the objects of type BW PC, BO jobs and DS jobs are deactivated.

     

    4. Where can I check the result of the report program that performed the migration?

    Check in transaction SLG1 with:

    Object type: E2E_ALERTING

    Sub type: JOB_CONFIG

     

    5. Where to check for the logs of this background job?

     

    Owing to the time-intensive operation, this program executes in the background. In transaction SM37, check for the job name MIGRATE_BI_JOB_* with the user who triggered the migration report to see the logs and the status of the job.


    6. What are the advantages of this migration?

     

    The Job Monitoring work center has a more sophisticated collection and hence avoids grey alerts. Also, starting with SP12, BW reporting is available for the metrics collected by job monitoring.

     

    7. Which features of the BI Monitoring setup are not available in Job Monitoring?


    There is also a compromise in this migration. The BI Monitoring work center evolved over the last 10 SPs into a feature-rich SOLMAN_SETUP. Certain functionalities developed in the BI Monitoring configuration typically cater to mass handling requirements: 'threshold mass maintenance', 'job details', 'Excel import and export' and 'take from schedule'. These features are not yet available in the Job Monitoring configuration. But the trade-off exists: the collection is robust, so alerting and the monitoring UI are more dependable when using Job Monitoring.

     

    Solution

     

    Execute the report program AC_JOBMON_MIGRATION to migrate BI Monitoring objects (BW PC, SBOP, DS jobs) to Job Monitoring.

    Project Completion and complete Closure of all related Change Documents



    Closing the current Change Cycle and open a new One

    SAP highly recommends that customers close their Maintenance Cycle on a regular basis.

    • This allows meaningful reporting on change activities per Change Cycle.
    • On the other hand, closing the Maintenance Cycle regularly helps to avoid a potential performance impact in the long run.

    A Change Cycle is closed by processing the Change Cycle Document to the final CRM User Status:

    • SMMN for a Maintenance Cycle with Task List Variant SAP0,
    • SMMM for a Maintenance Cycle with Task List Variant SAP1,
    • SMDV for a Project Cycle.

    Take over open Change Documents to the next Change Cycle

    When you close the existing Change Cycle, for instance a Maintenance Cycle, and open a new one, you are not forced to close all Change Documents that belong to this Change Cycle.

    SAP Change Request Management allows you to take over open Change Documents to the next Change Cycle.

    Project Completion and complete Closure of all related Change Documents

    SAP Change Request Management provides no automation for closing Change Documents that belong to a ChaRM project.

    However, ChaRM offers the program CRM_SOCM_SERVICE_REPORT as a standard solution for closing Change Requests and Change Documents that belong to the Change Cycle.

    Before utilizing program CRM_SOCM_SERVICE_REPORT, you should check the following:

    • the Status Profile customizing (for the referring Change Cycle Documents: SMMN, SMDV),
    • that the ChaRM condition SUB_ITEMS is defined for the Change Cycle Documents SMMN and SMDV.

    With the help of program CRM_SOCM_SERVICE_REPORT you can search for any kind of open Change Document with various search criteria, such as:

    • Open Change Documents per Business Partner, Team, etc.,
    • Open Change Documents per CRM User Status,
    • Different service-process-related search criteria, such as 'Transaction Type', 'Posting Date', etc.

     

    After having made your selection, you can let the program further process the open Change Documents up to their final CRM User Status.
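Conceptually, the report's selection-and-close flow, including the test-run mode mentioned below, can be modelled as in this sketch. This is not the real report's API; the document IDs, transaction types and status codes are made up for illustration.

```python
def close_open_documents(documents, matches, final_status, test_run=True):
    """Select open documents by a search predicate and, unless running in
    test mode, process them to their final status (conceptual model only)."""
    selected = [d for d in documents if d["status"] != final_status and matches(d)]
    if not test_run:
        for d in selected:
            d["status"] = final_status  # in reality: CRM status processing
    return selected

docs = [
    {"id": "8000000042", "type": "SMMJ", "status": "E0002"},
    {"id": "8000000043", "type": "SMHF", "status": "E0010"},
]
# Test run: list the normal changes (SMMJ) that would be closed, change nothing
would_close = close_open_documents(docs, lambda d: d["type"] == "SMMJ", "E0010")
print([d["id"] for d in would_close])  # ['8000000042']
print(docs[0]["status"])               # 'E0002' (unchanged in the test run)
```

Running the same call with test_run=False would actually move the selected documents to their final status, mirroring the productive run of the report.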

     

    Snagit1.png

    In addition, the program offers a test-run mode.

     

    Of course, the program CRM_SOCM_SERVICE_REPORT can be also utilized in order to close IT Service Management related documents such as Incidents or Problems.

    CHARM - A Developer's Point of View - in other words - is CHARM really Charming?


    Here's what I thought before using CHARM:

     

    Charm will:

    • Remove conflicts between developers
    • Eliminate missing objects when transporting to production
    • Remove the need to keep track of transport dependencies
    • Allow bundling transports outside of SAP
    • Keep defects with original requests
    • Result in fewer transports

     

    The above is living in Michelle's world of what CHARM will do, NOT what SAP or CHARM claims to do.

     

    So here's a scenario:

    I would make changes to an object.  There would be changes to an outside system.  Developer 2 makes changes to a different object that is a part of my project.  All of the previous transports/objects will be bundled in one CHARM request.  Emergency and non-emergency transports will be taken into consideration.

     

    Dum, Dum, Dum, Da, Dum - Drum roll please.  Charm to the rescue.

     

    See below:

     

    charm4.JPG

     

    So was my vision correct?

     

    In practice:

    • A regular transport is created. Table 1 is not changed.
    • The transport and CHARM ticket are released for an emergency change. It is immediately moved to production (after testing in quality).
    • The regular transport has fields removed from table 1, and the emergency transport object is changed so it no longer requires those fields.
    • The emergency change is moved to production again.
    • The regular change is moved. Now when the programs are regenerated, the table is generated first, and then the program. The emergency program is generated with errors, so it ends with return code 8 and the regeneration stops.

     

    If the above confuses you, you are not alone. It confuses me and my BASIS people. The only solution I found was to create a new CHARM ticket with just the table, transport it first, then re-transport the two CHARM tickets. They will go into the system clean.

    In theory:

    All transports are moved to production with the release.

     

    In Practice:

    • Not all transports move to production.
    • The changes are backed out of the object, and the object is changed by the developer.  The developer ignores the conflict and can create the new transport request.
    • At this point the changes can't be moved without BASIS help.  Why?  Because there is a conflict.

     

    In theory:

    Only one developer works on an object at a time.  Or if more than one developer is working on it, then it's for the same project.

     

    In Practice:

    • There can be more than one developer working on an object.  And yes, it is for two different projects.
    • So there are two options: add the object to the two different CHARM tickets, or leave the object in the CHARM ticket that already contains it. Either one will cause one CHARM ticket to be dependent on the other, and it will be a manual task to keep track of that.

     

    In theory:

    When a new table is created, all your developers will know it is new and won't use it in their objects.

     

    In Practice:

    • Developers miss that the table was created in a different CHARM ticket. They have no idea about the dependencies.
    • CHARM doesn't notify them of the dependencies.
    • The move to production has errors.

     

    OK - I'm done with the things CHARM doesn't do well.    There are some things that it does very well.

     

    CHARM is amazing at:

     

    • Limiting the number of transports. For a regular CHARM ticket that goes with a release, only the transport task needs to be released. When the task is released, it moves in the background to the test system. If there are problems, then I just create another task. The transport request is never really moved until the move to production.
    • It is easy to create a configuration transport request and a development transport request.   Since they are both on the same CHARM ticket, they will move to production together.
    • If your CHARM ticket has been released and an error is found, it is easy to create a defect request and attach it to your CHARM ticket. This will keep the transports together in one CHARM ticket.
    • The test environment is easily locked down when the system is moved to testing.  This will stop everything except for emergency transports from moving to the test client/system.
    • The approval process is at the front end.  A CHARM ticket is not created until the CHARM request is approved.  That means a transport request can't be created.
    • Outside objects - I'm not sure as we haven't used CHARM for that yet.

     

    So there you have it, my personal thoughts on CHARM.  Keep in mind, like all SAP products, different companies will have CHARM configured differently.  So some (or none) of what I've written may apply to you.

     

    Does CHARM do what it claims to do?  Yes.  Does it do what you think it should?  You be the judge of that.  Personally, I think it does make my job easier.  It's not a silver bullet.  It doesn't fix all transport issues.

     

    Please comment with some pros and cons.  And do let me know if I'm losing my mind with some of my comments. 

     



    How-to enable Transport of Copies on Urgent Changes Flow (PART I)


    Last month, I had the pleasure of collaborating with a Brazilian food company (the world's tenth-largest food company), speaking about some of our experiences and best practices and providing on-demand consulting for their ChaRM (Change Request Management) solution.

     

    During our conversation the customer told me about their wish to deploy Transport of Copies as part of the Urgent Change flow. As we know, Change Request Management covers a standard workflow in which the Transport of Copies procedure is available only for Normal Changes.

     

    In this blog I provide some hints on how you could set this up; however, there is no guarantee and no standard SAP support for this configuration.

     

    Background

     

    Urgent Changes have their own tasklist (type "H") to coordinate all transport requests. The original tasklist type H does not contain the action "Create Transport of Copies". Therefore we need to enhance tasklist type H using a custom tasklist variant. After this configuration, some adjustments must be applied to control the TMS of the managed system.


    Just to clarify when the ToC is generated and when the original transport request is released, I made the following pictures showing the "AS-IS" and "TO-BE" solution. You can adapt this to your needs (e.g. by creating additional statuses).

     

    Standard Process Flow: Urgent Change (SMHF)


    pic01.png


    Enhanced Process Flow: Urgent Change (Y/ZMHF)


    pic02.png

    Configuration Procedure

     

    Create a Tasklist Variant

     

              Access the IMG activity using the following navigation options:


                   spro_img1.jpg

                   pic03.jpg

             

     

              Push button "New Entries":

          pic04.jpg

            

     

              Create the tasklist variant "Y/ZSAP0 ":

            

               pic05.jpg   

     

     

     

    Define Tasks for Tasklist Variant

     

              Access the IMG activity using the following navigation options:

     

              spro_img2.jpg

              pic06.jpg

     

         Select all entries from Tasklist Variant "SAP0" and copy to Tasklist Variant  "Y/ZSAP0":

        

         pic07.jpg


         Create a new record adding the task "Create Transport of Copies"  for the Project Type "H Urgent Change":

     

         pic08.jpg

     

     

     

    Define Header / Footer Tasks for Tasklist Variant

     

         Access the IMG activity using the following navigation options:

       

         spro_img3.jpg

         pic09.jpg

     

     

    Repeat step 2 from the 'Define Tasks for Tasklist Variant' configuration:


         pic10.jpg

     

     

    Register Tasklist Variant into Project Cycle

     

         Apply SAP Note 927124.

       

     

    Adjusting Conditions and Actions (TSCOM Tables)

     

         Some activities regarding the transport management system and consistency checks are triggered when  the change document is assigned to a specific status value.       To enable the urgent change flow to generate transport of copies, we need to change some actions and their conditions based on status value.

     

     

         Access the IMG activity using the following navigation options:

     

         spro_img4.jpg

     

         On folder "Create Procedure Type" choose "Y/ZMHF" transaction type and their status profile:

     

         pic11.png

            

         On folder "Assign Actions", make the following adjustments for the User Status "E0004 - To be Tested":


          pic12.png

     

         On folder "Assign Actions", make the following adjustments for the User Status "E0005 - Successfully Tested":

     

         pic13.png

     

         On folder "Define Execution Times of Actions", make the following adjustments for the User Status "E0004 - To be Tested":


         pic14.png

     

         On folder "Assign Consistency Checks", make the following adjustments for the User Status "E0004 - To be Tested":

     

         pic15.png

     

         On folder "Assign Consistency Checks", make the following adjustments for the User Status "E0005 - Successfully Tested":


         pic16.png

     

     

     

    Result

     

    Project cycles powered by the custom tasklist variant will be able to generate Transport of Copies. In my next blog I will describe how to use this feature.

    How-to enable Transport of Copies on Urgent Changes Flow (PART II)


    Part I of this blog gave some tips on how to enhance the original urgent change flow to generate transports of copies. Today I will explain how to activate this customizing in your ChaRM project. You can check it out here: How-to enable Transport of Copies on Urgent Changes Flow (PART I)

     

    Create a new Project under transaction SOLAR_PROJECT_ADMIN or close the current Project Cycle.

     

    Before you create the new tasklist, you should perform the following steps:

     

    pic01.png

     

    Push the button "Show Available Variants for Tasklist" and select the Y/ZSAP0 tasklist variant. Without this step, transport of copies for urgent changes will not work! If you do not make this change before generating the tasklist, you will receive errors in the urgent change flow when you reach status E0004.

    New SCN Wiki Pages for Business Process Operations


    The Service Marketplace pages for Business Process Operations will stop being available in the very near future. This means that accessing information via these pages will no longer be possible.

     

    Therefore, all documentation for Business Process Operations (including overview presentations and setup guides) is now accessible via the SCN Wiki Page
    http://wiki.scn.sap.com/wiki/display/SM/SAP+Solution+Manager+WIKI+-+Business+Process+Operations.

     

    SCN_Wiki.jpg

     

    This page gives a general overview of Business Process Operations. Each area of BPOps is briefly explained, and links to sub-pages with more details are provided. These sub-pages per area are directly accessible via the following URLs:

     

    In these sub-pages you have access to the existing setup guides and further documents currently available in the Service Marketplace. All future documentation will also be made available here.

     

    In the coming weeks we will further extend our documentation in these wiki pages and we will keep you informed in case of major updates.


    SAP Solution Manager: Technical Monitoring architecture part one


    There are a lot of questions and discussions around the technical architecture that should be used for Technical Monitoring, so I decided to start a blog post series, as requested by community members. I want to keep each blog reasonable in length, so I'll write up Technical Monitoring in parts.

    A first reasonable question is, how many SAP Solution Manager systems do I need?


    This blog represents my opinion. If you have a different opinion, feel free to share and discuss it with the community at large as it can be of interest to all of us so please feel free to comment.

     

    How many SAP Solution Manager systems do I need?

    howmanylicksdoesittake.png

    One of the first questions in terms of architecture for Technical Monitoring is “How many SAP Solution Manager systems do I need?”. The answer can differ greatly depending on what your plans are,  what you are trying to achieve and how large your landscape is.

     

    Small size

    smallsized.png

    A small customer with a small SAP landscape (one ERP system landscape) will often run a single SAP Solution Manager instance. When it comes to Technical Monitoring, it would mean that all systems get connected to this one SAP Solution Manager instance.

     

    Having only one SAP Solution Manager system comes with the typical advantages and disadvantages you would have in a traditional ERP landscape with only a productive ERP system.

     

    When it is time to update the SAP Solution Manager system, to avoid direct impact you could clone or copy the system and run the update on the clone or copy to test out the procedure before doing the actual update, for example over a weekend (to avoid downtime and impact as much as possible).

     

    Medium size

     

    mediumsized.png

    Many customers have only two SAP Solution Manager systems, so I often see DEV – PRD landscapes, as there is a good number of medium-sized customers. This is a very common configuration.


    With two SAP Solution Manager instances, the discussion starts: which SAP systems (talking ERP now) do I connect where? Do I connect all DEV and perhaps ACC systems to the DEV SAP Solution Manager and only PRD systems to the PRD SAP Solution Manager system?


    Well, I've said this before and I'm about to say it again: SAP Solution Manager wasn't really designed for this kind of split, so I only connect SAP Solution Manager DEV to itself, as well as one or more sandbox ERP systems.


    All other systems (DEV, ACC, PREPROD, PRD, ...) get connected to SAP Solution Manager PRD in order to benefit from having all that data in one place: one alert inbox for the support team, one single source of truth for reporting purposes, and one single source of truth for specific scenarios that require data and connectivity of all the systems that belong to a specific landscape.


    The advantage of having a DEV SAP Solution Manager is that you have a place where you can try scenarios out and perform support package stack updates, which makes it easier to minimize the impact on the PRD SAP Solution Manager.


    Large size

     

    largesizepartone.png

    largesizeparttwo.png

    You probably guessed I was going to say that large customers have three SAP Solution Manager systems, but that's not the general rule of thumb. It's most likely an SAP recommendation, but that doesn't mean it's really a necessity. It can make sense if you are going heavy on custom development and want to invest in ITSM scenarios, where you really want to have an ACC system in place. But what I see is that many SAP customers max out at two SAP Solution Manager instances (DEV, PRD) as their SAP Solution Manager landscape.


    Larger customers can have larger landscapes (potentially scaled out). Some have a split landscape where they decide to go DEV1 – PRD1 for Technical Monitoring and DEV2 – PRD2 for IT Service Management, thereby separating those scenarios from running on the same SAP Solution Manager instance.


    Why? Because they want to avoid the impact of one scenario on the other, so they can patch DEV1 – PRD1 faster than DEV2 – PRD2, for example.


    The drawing above shows an example of such a split landscape. You use one SAP Solution Manager landscape for Technical Monitoring purposes, while you use a second SAP Solution Manager landscape for IT Service Management purposes. In the IT Service Management landscape, you don't use Diagnostics Agents; you only connect the managed SAP systems through RFC connections, and thus you ignore some red traffic lights in the managed SAP system setup. The second SAP Solution Manager landscape can be monitored by the first one, where Technical Monitoring is implemented.


    SAP Solution Manager – Diagnostics Agent architecture guide

     

    I haven't gone into any detail yet on the agents or other elements that make up the architecture; that might be content for a future blog post. SAP Note 1365123 - Installation of Diagnostics Agents has a guide attached that provides insight into possible architectural options for Technical Monitoring. The PDF document goes through numerous possibilities and options that venture outside of what I covered in this blog post.


    I prefer to keep things simple by not cross-using elements, as you can see in the “simplified” architecture schemas above. Why? Because complexity adds additional effort on multiple fronts: configuration, maintenance, support and troubleshooting, to give some examples. The split landscape option translates into more maintenance effort but lower risk, and it makes it easier to keep the SAP Solution Manager landscape that is mostly used for technical scenarios up to date, since you don’t impact ITSM processes that way.


    Diagnostics agent on the fly as a default

     

    At the moment, I advise installing Diagnostics Agents on the fly (as opposed to regular, standalone Diagnostics Agents) as a default, even if you only have a single SAP system per server, because in SAP Solution Manager 7.1 SP12 this is a prerequisite for using automated reconfiguration.


    Automated reconfiguration allows the system to redo certain managed system setup steps and some other steps, like automatically assigning new default SAP templates in Technical Monitoring after the SAP product version has been updated.


    If you haven’t seen or read about it yet, you can find a nice presentation on the SCN wiki: http://wiki.scn.sap.com/wiki/display/SMSETUP/Home

    where you can also find the Sizing Toolkit which can help you calculate the need for scale-outs for example.


    Under 7.1 SP12 (NEW), check out the presentation on Automatic Managed System Reconfiguration (PDF).

    Signavio Integration with Solution Manager 7.1 (SP12) ( Solman Connector ) - How To


    Hello

     

    You are interested in integrating your SAP Solution Manager with the Signavio editor, but during the integration you run into difficulties setting up your solmanconnector properties file.

     

    A User Guide for the integration can be found in the Signavio online help documentation, where you have a detailed description of how to implement the link between your SAP Solution Manager and the Signavio SAP Solution Manager Connector 7.1.

     

    My blog will just add explanations for the User Guide sections which I found unclear during my integration effort.

     

     

    Starting with section 5 of the online User Guide documentation, the SP12 screens are different. They differ from what the online User Guide documentation shows!

     

    Section 5 excerpt:

    "

    .........

     

    5.The Signavio SAP Solution Manager 7.1 Connector uses the BSI Enterprise Services (SOAP web service) to communicate with the SAP Solution Manager. This web service has to be enabled and configured before the connector can work properly:

     



    a.Logon to your SAP GUI and start transaction se80.


    b.Search for the package BSI_SERVICE_API:


    ........

     

     

    In SP12 the screens for the BSI Enterprise Services are different, so please follow my blog in order to set up your integration properly.

     

     

    If you follow the User Guide at section f. ("f. Open the tab Transportation Settings and find the URL of the service binding. Please store the URL for later usage when configuring the connector.") you will find that the URL from section f. is missing:

     

    2015-01-15_14-02-51.png

     

     

    In SP12 the BSI Enterprise Services (SOAP web service) screen looks a bit different. To get the requested URL, go to SOAMANAGER and follow these steps:

     

    2015-01-19_16-39-30.png

    On the next screen you will get the URL you are looking for:

    2015-01-15_14-47-45.png

     

    The above URL is later used in your solmanconnector properties file in the Signavio HOME directory on your Solution Manager host. My URL example is the following:

     

     

    http://<mysolmanhostname>:8000/sap/bc/srt/wsdl/flv_10002A111AD1/bndg_url/sap/bc/srt/rfc/sap/bsiprojectdirectroyinterface/100/signavioconnect/binding_1?sap-client=100

     

     

    In order to fill out your solmanconnector properties file, you must derive two parameters from your WSDL URL:

     

     

    solman.bsiservice.binding            =  /100/signavioconnect/binding_1

    solman.bsiservice.endpoint          =   /sap/bc/srt/rfc/sap/bsiprojectdirectroyinterface 
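The derivation of these two values from the WSDL URL can be sketched in a few lines of Python. This is a minimal helper written purely for illustration; it is not part of the Signavio connector, and it assumes the binding path after `bndg_url` always has the shape `/sap/bc/srt/rfc/sap/<service>/<client>/<name>/binding_1`:

```python
from urllib.parse import urlparse

def derive_connector_params(wsdl_url: str) -> dict:
    """Split a BSI WSDL URL into the endpoint and binding values for the
    solmanconnector properties file (hypothetical helper, illustration only)."""
    path = urlparse(wsdl_url).path
    # Everything after the 'bndg_url' marker is the service binding path.
    binding_path = path.split("/bndg_url", 1)[1]
    parts = binding_path.rstrip("/").split("/")
    # The last three segments form the binding: /<client>/<name>/binding_1
    binding = "/" + "/".join(parts[-3:])
    # The segments before those form the service endpoint.
    endpoint = "/" + "/".join(parts[1:-3])
    return {"solman.bsiservice.binding": binding,
            "solman.bsiservice.endpoint": endpoint}

url = ("http://mysolmanhostname:8000/sap/bc/srt/wsdl/flv_10002A111AD1/bndg_url"
       "/sap/bc/srt/rfc/sap/bsiprojectdirectroyinterface"
       "/100/signavioconnect/binding_1?sap-client=100")
print(derive_connector_params(url))
```

Running this against my example URL yields exactly the two property values shown above.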

     

     

     

     

    2015-01-19_12-05-18.png

     

    To find out what is wrong with your Solution Manager connector, the best place to look is the log file in the Signavio HOME installation directory. The log file is located in the log subdirectory!

     

    If you have questions, do not hesitate to contact me.

     

    Boris Milosevic

    New key figures for BPMon & Analytics - new content for 3rd party processes and MRP list related key figures


    A belated Happy New Year 2015! As a belated Christmas present, I want to write this blog about the new key figure content that was shipped in December 2014. On Monday, December 15, 2014, the new ST-A/PI 01R support package 1 plug-in was shipped to customers, which means that many new key figures are now available for Business Process Monitoring and Business Process Analytics in SAP Solution Manager. This blog gives a short overview of what is new. The plug-in contains (among others):

    • New key figures related to a 3rd party sales process (often also called drop ship process)
    • New key figures where business documents (e.g. purchase requisitions, purchase orders, MM scheduling agreements) are brought together with MRP list information
    • New Automation rate key figures for WM and PM
    • New item related CRM key figures
    • New transportation lane related key figures for SCM APO

     

    The new ST-A/PI 01R support package 1 is available for download under (SMP login required) http://service.sap.com/supporttools. A complete list/catalog of all available out-of-the-box key figures is available as an MS PowerPoint presentation at (SMP login required in both cases)

     

    Remarks:

    1. On slides 2, 3 and 4 (Table of Contents) you can find hyperlinks to directly access the respective chapter of interest.
    2. For application-related key figures, some of the listed Selection Options appear in bold letters. Those Selection Options are available as "Group by" fields in Business Process Analytics.
    3. Key figures with '€' as a bullet point also support value benchmarking as part of the "Advanced Benchmarking" functionality.

    Key figure news summary for selected areas


    New key figures have been developed for 3rd party (drop ship) processes:
    • 3rd party sales document items without purchase requisition items
    • Overdue 3rd party purchase requisition items with sales information
    • 3rd party purchase order items overdue for goods receipt (only relevant for customer that post the statistical GR)
    • 3rd party purchase order items overdue for invoice receipt

     

    New key figures where business documents are brought together with MRP list information to allow better insights into supply chain planning:

    • Overdue purchase requisition items with MRP list
    • Overdue purchase order schedule lines with MRP list
    • Overdue MM scheduling agreements with MRP list

     

    New outbound delivery key figures bringing delivery and shipment information together:

    • Overdue outbound deliveries without shipment assignment
    • Lead time from outbound delivery creation --> shipment assignment

     

    New Automation rate key figures for WM and PM:

    • Automation rate: Inbound transfer order items  (how many inbound transfer order items are created automatically vs manually)
    • Automation rate: Outbound transfer order items (how many outbound transfer order items are created automatically vs manually)
    • Automation rate: PM/CS notifications (how many PM/CS notifications are cleared automatically vs manually)
    • Automation rate: PM/CS orders (how many PM/CS orders are cleared automatically vs manually)
    New CRM related key figures:
    • Sales document items in status 'open' or 'in process'
    • Service document items in status 'open' or 'in process'
    • Lead time from sales document creation --> Taking document 'in process'
    • Lead time from sales document creation --> Completing the document
    • Lead time from service document creation --> Taking document 'in process'
    • Lead time from service document creation --> Completing the document
    • Lead time from business activity/task creation --> Taking activity/task 'in process'
    • Lead time from business activity/task creation --> Completing the activity/task

     

     

    New transportation lane related key figures for SCM APO:
    • Transportation lanes per product
    • Transportation lanes per location

     

    Further reading

    You can find all necessary information about Business Process Analytics in this document. Frequently Asked Questions about Business Process Monitoring and Business Process Analytics are answered under http://wiki.sdn.sap.com/wiki/display/SM/FAQ+Business+Process+Monitoring and http://wiki.sdn.sap.com/wiki/display/SM/FAQ+Business+Process+Analytics respectively. The following blogs (in chronological order) provide further details about Business Process Analytics and Business Process Monitoring functionalities within SAP Solution Manager.

    How to Migrate from Classical BPMon to BPMon on MAI

    How to create a composite test for an End-to-End business scenario using CBTA test scripts


    Applies to:

    Solution Manager 7.1 SP 7 or higher, ST-TST 300 SP0 or higher, CBTA 3.0 SP2 or higher (frontend), SAP GUI 7.30 or higher, ST-PI 2008 SP6 or higher, ST-A/PI 01P or higher.


    Overview:

    Component Based Test Automation (CBTA) is shipped with SAP Solution Manager 7.1 SP07 and higher. CBTA allows the creation of automated tests, like eCATT, via a dedicated recorder. It supports scenarios on systems under test based on SAP GUI, SAP CRM web client, SAP ABAP WebDynpro, SAP Portal, SAP WebGUI, BSP and SAP Java WebDynpro technology.


    Unlike eCATT (a record-and-playback approach), CBTA test scripts follow a modular approach: test components can be reused, and damaged test scripts can be repaired quickly. Apart from that, CBTA scripts can be created easily by business analysts, while eCATT requires developer expertise to create and maintain test cases. CBTA consists of two software components: an add-on to be installed on SAP Solution Manager 7.1 from SP7 onward, and a front-end component to be installed on the user's desktop.


    Prerequisites:

    The following prerequisites have to be met in order to use the CBTA script recording functionality for composite test scripts.

    • Check the compatibility of the CBTA front end, ST-TST 300, SAP GUI, ST-PI and ST-A/PI (available on the SAP Service Marketplace). The CBTA compatibility matrix can be downloaded from http://service.sap.com/testing > Additional Information. Also refer to SAP Note 1763697 for more details on installation and configuration.
    • Install the required SAP Notes for both SAP Solution Manager and the managed system under category CBTA.
    • Complete the configuration of CBTA on SAP Solution Manager along with the SUT (System Under Test, i.e. the managed system) on which the test is to be executed.
    • Maintain the connection and credentials for the SUT. Manage the SUT using the CBTA settings under the Test Management work center.
    • Enable scripting for SAP GUI on both SAP Solution Manager and the SUT.
    • Run the self-checks for CBTA and the SUT, and create an SDC (System Data Container) for the project under which the business process is defined and to which you want to assign the composite test script.

    Additional information related to the CBTA configuration in SAP Solution Manager can be found in the CBTA release document and the CBTA How-to Guide:

    http://wiki.scn.sap.com/wiki/display/SM/SAP+Solution+Manager+WIKI+-+Test+Management

    Composite Test Configuration:

    A composite test configuration comprises test scripts (CBTA, eCATT or third-party test scripts), a system data container and a test data container (for data variants). You create a composite test configuration to build an end-to-end automated test case. Parameters can be passed between the scripts by changing their usage type. A data variant can be assigned using the test data container; the parameter values to which a variant is to be assigned must have usage type 'Exposed'.

     

    Composite test.png

    In the figure above, the composite test script is made of three individual CBTA test scripts. Each CBTA test script is made up of two types of components.


    Default components are delivered by SAP with the CBTA software. They are stored in the SAP Solution Manager repository for eCATT test scripts (you can also create custom components in a custom library and call them for specific requirements). They are used for standard actions like entering a value in a field, pushing a button, or selecting a tab page.

     

    Screen components are generated after the recording. For each screen, a screen component is generated during the inspection process. This process retrieves information on the screen from the System Under Test (SUT) and generates a screen component containing one parameter for each field of the screen. When a value is assigned to such a parameter, the system enters it in the corresponding field during execution.

    A screen component reflects a single version of a screen. To differentiate between the versions of a screen, the system uses the system data container (SDC) and the target components. The version of a screen is the same on all systems of the same SDC and target component; for other systems, other screen components are created. You can reuse default and screen components in several test scripts. When a screen is updated, its screen components can be repaired, which makes repairing test scripts easier.

     

    You have to define the sequence of the test scripts (steps) to be executed and the values of the parameters. A parameter value can be passed on by setting the usage type to 'Local' and providing the appropriate reference in the succeeding script.
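Conceptually, this chaining of steps can be sketched in a few lines of Python. This is purely illustrative: CBTA wires steps together via usage types in the UI, not through code, and the step functions and document numbers below are hypothetical stand-ins:

```python
# Conceptual sketch of composite-test parameter chaining (not CBTA's API).

def create_quotation(customer, material, quantity):
    """Step 1 (a quotation-creation script): exports the quotation number.
    In CBTA, this export parameter would be exposed as MESSAGEPARAMETER1."""
    return {"MESSAGEPARAMETER1": "20000123"}  # hypothetical quotation number

def create_sales_order(sales_document):
    """Step 2 (a sales-order script): imports the quotation number via a
    'Local' reference and exports the sales order number."""
    return {"SALES_ORDER_NUMBER": "5000456"}  # hypothetical order number

# Composite execution: step 2's input parameter references step 1's export.
step1_out = create_quotation("C-100", "M-42", 5)
step2_out = create_sales_order(sales_document=step1_out["MESSAGEPARAMETER1"])
print(step2_out["SALES_ORDER_NUMBER"])
```

The 'Local' usage type plays the role of the explicit argument passing shown here: it tells CBTA to feed one step's exported value into the next step's import parameter at execution time.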


    Creation and execution of Composite Test Configuration:

    Here, for our example, we will consider only the first two steps of an end-to-end composite test. We want to create a composite test script comprising two steps (scripts) with the following parameters:

     

    Step -1 Create quotation (CBTA script 1):

    Import parameters: Customer, Material, Quantity etc.

    Export parameter: Quotation Number.

     

    Step-2 Create sales order from quotation (CBTA script 2):

    Import parameter: Quotation Number

    Export parameter: Sales Order Number

     

    First, you need a test configuration to which you can add these two steps. It can be created directly from the Business Blueprint (where the business processes are defined) as an ad-hoc execution approach. You can also create it from the 'Test Composite Environment' under the 'Test Management' work center. For our example we will go with the second option.

     

    Go to the 'Test Management' work center. Select the 'Test Configuration' sub-view of the Test Repository. Click 'Create'.


    snap1.PNG



    In the pop-up window, provide the name of the configuration. Select 'Composite Test' as the test tool from the drop-down. Also provide the name of the composite test script and click OK.

    snap2.PNG



    Select the appropriate application component using the value help. For the system under test, select the appropriate system data container (SDC) and the target system on which the composite test is to be executed. Save the configuration as shown in the figure below.

    snap3.PNG


     

    Now, we have to put the two steps (scripts) in sequence. First we have to create the quotation; the corresponding CBTA script 'Z_CREATE_QUOTATION_VA21' has already been created. Under the 'Test Script' tab, add the CBTA test script 'Z_CREATE_QUOTATION_VA21' to the 'Test Script Steps'. The next step is to create the sales order with reference to the quotation number, so the corresponding CBTA script 'Z_CREATE_SO_VA01' is added as the second step.

    snap4.PNG


    To pass the value of the parameter Quotation Number created by the 'Z_CREATE_QUOTATION_VA21' test script step to the 'Z_CREATE_SO_VA01' test script step, we have to change the usage type of the parameters.

    First, open the test script 'Z_CREATE_QUOTATION_VA21' by double-clicking its name. The test script opens in a new tab; switch to edit mode. Expose the export parameter 'MESSAGEPARAMETER1', which holds the created quotation number, as shown below. Save the script and change back to display mode.


    snap5.PNG



    Now, go back to the composite test and select the second step.

    snap6.PNG



    'MESSAGEPARAMETER1' is the input to the script 'Z_CREATE_SO_VA01', which receives its value from the output of the script 'Z_CREATE_QUOTATION_VA21'. Now we will map the output parameter quotation number of the first step (script) to the input parameter of the second step (script).

    snap7.PNG

     

    To do so, go to the 'Ref. Parameter' column of the parameter 'SALES_DOCUMENT' under the details of the test script 'Z_CREATE_SO_VA01'. Press F4 help and you can see the exposed 'MESSAGEPARAMETER1'. Select the parameter and click OK. Set the usage type to 'Local' and save. You can now see that the parameter is mapped with the reference to the previous step (script), as shown in the figure below.

    snap8.PNG


    Now, for all parameters of both steps (scripts) for which you want to provide input as a variant, set the usage type to 'Exposed' using the drop-down. You can assign the data variants using a test data container (TDC). The test data container with different data variants can be assigned under the 'Test Data' tab.

    The test configuration can be executed directly from the test repository by clicking the 'Execute' button. At the end of the test execution you can find the report log as shown in the figure below, including parameter details for each step (script).


    snap9.png


    Similarly, we can add a third step, 'Create Delivery', referencing the sales order number created at the end of the second step (script), to complete an end-to-end scenario for sales documents. You can create similar composite test scripts and pass parameters between them.



    About the author:

    The author is part of the RunSAP CoE team, with expertise in SAP Solution Manager functionality from ALM & RSlaF, such as Solution Documentation, SoDocA, Test Management, BPCA, BPMon, BP Analytics, JSM etc.



