Channel: SCN : Blog List - SAP Solution Manager

How to Edit the Categorization Schema for SAP ITSM/ChaRM



How to Edit the Categorization Schema to Include Newly Created Custom Transaction Types

 

SAP strongly suggests copying the default transaction types for ITSM and ChaRM to the customer namespace (Z* or Y*) before customization begins. These steps are done via the Guided Procedure for ChaRM, as is the creation of the categorization schema for the S* transaction types.

 

To enable categorization for customer-namespace Incidents, Requests for Change, and Change Documents, you must copy the SAP_SOLUTION_MANAGER_TEMPLATE categorization schema to the customer Z namespace and include the Z* transaction types.

 

Pre Steps:

  1. From Solution Manager, launch transaction SM_CRM


 

 

   2.  To Enable Table Keys in Drop Down:

 

  • In the Web UI, click Personalize

 


 

  • Check the box to enable 'Show Keys in Drop Down', then Save

 


 

 


Configuration:

From WebUI:

 

  1. Choose Service Operations -> Categorization Schemas

 


 

   2.  Search for SAP_SOLUTION_MANAGER_TEMPLATE

 

  • Copy this schema

 

 

 

 

3.  Change the schema ID and name to the Z namespace and save


 

 

4.  Put the schema in edit mode.

(If editing a currently active version of a schema, first create a new version of the schema.)

 

 

5.  From the General Data assignment block, choose the Status dropdown on the right and select Draft.

 


 

 

 

6.  From the Application Area table on the right, choose New.

 


 

  • Create an entry for Service Order -> Transaction Type/Catalog Category.

  • Make entries for ZMCR, ZMAD, ZMGC, ZMHF, ZMMJ (choose the (D) entries).

  • When editing or creating a copy of the standard schema, remove all standard S* transaction type entries.
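The rule above — keep the customer-namespace entries, drop the standard S* ones — can be pictured in plain Python. This is only an illustrative sketch with made-up entry records, not an SAP API:

```python
# Hypothetical application-area entries of a copied categorization schema.
# Standard S* transaction types must be removed; Z*/Y* customer types stay.
CUSTOM_PREFIXES = ("Z", "Y")

def prune_application_area(entries):
    """Keep only entries whose transaction type is in the customer namespace."""
    return [e for e in entries if e["transaction_type"].startswith(CUSTOM_PREFIXES)]

entries = [
    {"application": "Service Order", "transaction_type": t, "catalog": "D"}
    for t in ("SMCR", "ZMCR", "ZMAD", "ZMGC", "ZMHF", "ZMMJ")
]

pruned = prune_application_area(entries)
print([e["transaction_type"] for e in pruned])
# ['ZMCR', 'ZMAD', 'ZMGC', 'ZMHF', 'ZMMJ'] — the standard SMCR entry is gone
```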

 

 


 

 

7.  From the General Data assignment block:

  • Choose the Valid-From date and time (the default is 24 hours ahead)
  • Set the schema status to Released

 


 

 

8.  Save the Schema

 

 

Finished:

 

Go Back to Search for Categorization Schemas:

 

Once the Valid-From date is reached, the status of the schema will change to Active.


 

Exit UI

 

To test, create a new change document and verify that the categories are enabled.


Stay Informed at ASUG ANNUAL CONFERENCE: Expanding Your Borders With Solution Manager



 

SAP Solution Manager for both the LoB and IT

 

SAP's John Krakowski and his colleagues provide more information below about what to expect in the ASUG Pre-Conference session "Expanding Your Borders with Solution Manager," being held June 2nd in Orlando, Florida. Please read on:

 

NOW is the perfect time for every organization to register a representative to attend the ASUG Preconference Full-Day Seminar ‘Expanding your Borders with SAP Solution Manager’.

On the SAP SAPPHIRE NOW | ASUG ANNUAL CONFERENCE home page, you will see the statement ‘Get the latest technological vision, actionable insights you need to drive profitability and growth, and influence future SAP offerings’, and on the ASUG PROGRAM home page is the statement ‘Every Experience Adds up to Your Success’.


These two statements are in perfect alignment with the tremendous SAP Solution Manager benefits and features that can be realized by all organizations and both their Lines of Business and IT.


More than ever, the market never stands still and neither can SAP and SAP Solution Manager. Year over year, SAP Solution Manager pre-conference seminars have proven to be among the most successful, valuable, and important topics for the ASUG Community. This year’s focus is to demystify, simplify, and discover new ways to approach your SAP Solution Manager adoption for lasting benefits, to drive maximum value from your SAP solution environment. SAP Solution Manager combines tools, content, and direct access to SAP to manage everything from deployment to solution monitoring and improvement. Based on an SAP support contract, the solution can help you optimize core business processes and your IT infrastructure, and innovate with two value releases per year.


This preconference seminar will help you refresh your SAP Solution Manager knowledge to address the painstaking, time-consuming efforts and potentially high risks of adopting innovations in your existing SAP solution, such as test phase duration, projected costs, and business user impact. The session is designed to provide maximum opportunity for you to hear from, interact with, and see demos by several key and diverse ASUG members (customer, partner, and SAP) knowledgeable about SAP Solution Manager, and to obtain insight into their playbooks. You will be presented with multi-faceted topics, such as best practices for deploying innovations reliably and quickly while minimizing business risk, to increase your understanding and make SAP Solution Manager even more relevant to your organization. You will walk away with specific ways to expand your use of SAP Solution Manager, maximizing the value of your support engagement with SAP and your partners. You'll be able to streamline internal processes, minimize manual effort, reduce operational costs, and introduce new business functionality with greater ease.

 

Leveraging SAP Solution Manager more fully has been shown to increase reliability and help lower total cost of ownership:

 

  • Discover how integrated SAP IT Portfolio and Project Management provides visibility and collaboration between project and change management teams.
  • Learn the best landscape for Two Value Releases per Year.
  • Hear about the recommended major and minor release approach.
  • Understand how to adjust and test only what matters when managing innovation projects with SAP Solution Manager’s Scope and Effort Analyzer.
  • Gain steps to improve Custom Code Management.
  • Review Solution Documentation best practices and whitepapers.

 

Also, walk away with information about:

 

  • SAP’s Control Center Approach for proactive support and to accelerate innovation.
  • Deployment and Run SAP like a Factory best practices for SAP HANA.
  • IT Service Management best practice processes with SAP Solution Manager.
  • What the new central CTS means for Change Control Management.
  • SAP Solution Manager’s strategic direction and release plans, with details of the new release’s exciting innovations, such as a new approach for collaborative process design.

 

Don’t miss out on learning about the new features and breaking announcements!


Register Today



Photo by SAP



Could this be you, asking John questions during this ASUG pre-con?


I also encourage you to read what former SCN Member of the Month Thomas Dulaney wrote in his blog last year about the ASUG Solution Manager pre-conference session: Review: ASUG Preconference ==> SAP Solution Manager 7.1 – Additional Value Creation Through Technological Innovations

 

He says:

"I have absolutely *nothing* negative to say about the ASUG preconference sessions. The three presenters were incredibly well informed. The presentations were incredibly well prepared. It was clear that this preconference session was designed and developed as a smoothly flowing whole session and not a collection of one hour sessions designed for a different audience. "

 

Register Today

My experience with cCTS ChaRM


A few months back we decided to implement ChaRM with cCTS, and I looked eagerly for blogs discussing its pros and cons, but I could not find much. So I thought I should write one once I had a good understanding of it.

 

To start with, we followed the how-to guide for cCTS below.

http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/20c1ba0c-8f32-3110-bfa0-9d1e57522951?QuickLink=index&overridelayout=true&59017145670468

This is enough to understand a simple flow setup, but if your landscape is complex, it is difficult to manage with this little piece of information.

 

The landscape where we started is a dual-track ECC (maintenance and development) and a single-track BI serving both maintenance and development.

As it is a dual track, the customer wanted us to set up enhanced retrofit as well. The systems in the landscape are as below:


Points to be noted during clustering:

As per the how-to guide, all systems with a similar role should be put into one cluster, e.g. all DEV systems in one cluster and all test systems in another. Taking the above landscape as a use case, the three DEV systems should be in one cluster.

 


Clustering is closely associated with the logical components in a project: the systems must be even across all the logical components of a project, otherwise clustering is not possible.

 

Not all systems have to be in a cluster, e.g. retrofit systems.
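The "even" requirement can be pictured as a simple validation: every logical component that participates in clustering must supply one system per cluster role. A rough sketch in Python (component and system names are invented for illustration; the actual check happens inside cCTS):

```python
# Cluster roles used in clustering (illustrative names, not SAP values)
ROLES = ("DEV", "QAS", "PPD", "PRD")

def is_even(logical_components, roles=ROLES):
    """A set of logical components can be clustered only if every
    component supplies exactly one system for each cluster role.
    Returns (ok, name_of_first_uneven_component)."""
    for name, systems in logical_components.items():
        if set(systems) != set(roles):
            return False, name
    return True, None

# The main track supplies all four roles; a retrofit component with only a
# post-processing system is uneven, so it must stay outside clustering.
components = {
    "Z_ECC_MAINT": {"DEV": "MD1", "QAS": "MQ1", "PPD": "MP0", "PRD": "MP1"},
    "Z_RETROFIT":  {"DEV": "RD1"},
}

ok, offender = is_even(components)
print(ok, offender)  # False Z_RETROFIT
```

This mirrors the workaround described below: moving the post-processing system into its own (uneven) logical component keeps it out of the clustering check.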

 

Problems in clustering:

1. If one system in your cluster is down, we cannot move transports to any other system in the same cluster. E.g. in the above landscape, the quality cluster has three systems, two ECC and one BW; if the project QAS system is down for maintenance, then we cannot move transports to the maintenance QAS or the BW QAS either.

 

So plan your clusters properly so that there will not be any such dependencies.

 

2. As retrofit is involved, we added the post-processing system to the logical component, so the logical component has five systems; but in clustering you still have only four clusters (CD1, CQ1, CPR1, CP1). In this case the systems will not be able to identify their clusters.

 

To make this work, define another logical component with only the post-processing system; as this logical component is not even, it will not be considered during clustering.

 

3. In a project we defined DEV, QAS, PPD, and PRD in the logical component, but in clustering we made only three clusters: DEV as cluster one, QAS and PPD as cluster two, and PRD as cluster three. As pre-production is also a kind of test system, we can club both systems into one cluster.


Real scenario: a new development is started in project A and tested well in QAS, but is yet to be moved to the PPD system. Later there is a change of plan and these changes are decided to be moved as part of project B. When you try to change the project assignment of this change document, the system gives a message that the TRs have not been moved to all the systems in the cluster; hence you cannot change the project assignment or decouple the TRs from this change document until they are moved to all the systems in the cluster.

 

Hence plan your clusters properly considering multiple use cases.

 

4. In ChaRM without cCTS, if you have to stop a TR from moving to the next system, you can delete the TR from the import queue before the Import All job runs. But in cCTS this is not possible: even after deleting the TR and the transport collection, the changes will still move to the next systems.

 

5. In cCTS ChaRM projects, sometimes a TR will not move to the next system even after the Import All job runs. In these cases we have to check the log of the transport collection, which will give you a clue about what went wrong.

 

6. In some cases the transports will move to the next system but the TR will have the message “return code unknown”. Because of the unknown return code it gets into the import queue again and again; currently I haven't found any way out of this.

 

I hope the above information will help you design your clusters well.

Change Manager Determination depending on the Category using BRF+ in Change Request Management


Taking a cue from Daniyar Kulmanov on the possibility of using BRF+ with ChaRM, I tried automatic Change Manager determination from the BRF+ framework. It worked! Here is the procedure to achieve it.

 

The standard SUPPORT_TEAM function is copied to the Z namespace to adopt its elements. You can also create your own Z function from scratch if you are familiar with BRF+; I chose to copy the standard.

 


Right Click > Copy


The name of the target application should be 'SOLMAN'.


Write down the Function ID of this newly created function from the Details tab.


 

You can see the signatures inherited from SUPPORT_TEAM in the copied function (Z_CHANGEMAN_FUNCTION). It contains all the elements relevant for the Service Desk.

 


 

We will create a 'Change Manager' element, which is relevant in our case. To do this, follow the procedure shown in the screenshots.

 

Click on Add New Data Object and provide the details shown below.


 

Now add the Change Manager as the Result Data Object, replacing the Support Team. You may get the following error when you try to save:

 


To overcome this error, remove the Change Manager from the Signature screen using the Remove Data Object button.

 


 

Now you need to create an application, ruleset, and decision table, as you would for support team determination. Instead of Component and Support Team in the decision table, use the Category and Change Manager data objects. To configure the application, follow the steps in section 5, "Configuration in the Business Rule Framework plus", at the following link: http://wiki.scn.sap.com/wiki/display/SAPITSM/Support+Team+Determination+via+Business+Rule+Framework+plus
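Conceptually, the decision table built here maps a multilevel category path to a change manager, just as the standard one maps a component to a support team. A tiny Python sketch of that lookup (category values and partner IDs are invented; the real evaluation is done by BRF+ at runtime):

```python
# Illustrative decision table: multilevel category -> change manager.
# All values are placeholders, not real business partners.
DECISION_TABLE = {
    ("SAP", "ERP", "FINANCE"):    "MANAGER_FI",
    ("SAP", "ERP", "LOGISTICS"):  "MANAGER_LOG",
    ("NON-SAP", "HARDWARE", "*"): "MANAGER_IT",
}

def determine_change_manager(cat1, cat2, cat3):
    """Return the change manager for a category path; '*' acts as a wildcard
    on the last level, mirroring how a BRF+ decision table row can match ranges."""
    for (c1, c2, c3), manager in DECISION_TABLE.items():
        if (c1, c2) == (cat1, cat2) and c3 in (cat3, "*"):
            return manager
    return None  # no row matched: no partner is determined

print(determine_change_manager("SAP", "ERP", "FINANCE"))     # MANAGER_FI
print(determine_change_manager("NON-SAP", "HARDWARE", "X"))  # MANAGER_IT
```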

 


 

Now we create a Z action in the Z(Y)MCR_ACTION action profile to call this function; this new action is similar to the SMIN_STD_FIND_PARTNER_FDT action delivered in the SMIN_STD action profile of the Service Desk.

 


In the Processing Parameters, provide the information below:

 


 

PARTNER_FCT: the partner function of the Change Manager

FUNCTION_ID: the ID of the function we created in BRF+

APPLICATION_ID: 0050568E6E9A02DE82A4EC0838CD95E5 (always constant; refers to the application SOLMAN in BRF+)

 

The last step is to activate this action in the condition definitions. There is no need to provide any start or schedule condition for this action; just make it 'green' in the Conditions screen.

 


 

That's all! Now, depending on your decision table entries, whenever you select a category on the Change Request screen, the Change Manager is determined automatically upon saving.

 


How To - Configure Multiple SLA based upon MLC in IT Service Management - Part1


I will be publishing this as a series of blogs with all the steps required to understand (basic concepts), configure, and set up multiple SLAs in IT Service Management based upon Multilevel Categorization in Solution Manager 7.1, which we recently implemented for a customer.


Over the years SAP has made a huge investment in the Service Operations area of Solution Manager, and the current version, 7.1, provides the flexibility for all SAP customers to implement IT Service Management for their entire SAP and non-SAP landscape.

 

In my experience, this is a journey, and we should not jump in doing everything at the same time but rather go step by step in implementing the various capabilities of IT Service Management in Solution Manager 7.1.

 

During several IT Service Management implementations in SAP Solution Manager 7.1, we have come across a common requirement to have multiple SLAs, or Service Level Agreements, for support.

 

This blog series will focus on detailed configuration and implementation steps for setting up multiple SLAs based upon Multilevel Categorization (MLC), i.e. the different types of incident.


Basic concept of SLA

Service Level Agreements (SLAs) are typically agreed between an SAP customer and the vendors who provide support or services adhering to the customer's requirements.

There can be various criteria for having different SLA requirements. For every customer, SLAs can differ based upon business conditions or support requirements, i.e. on a per-service or per-product basis, etc.


We can implement or configure multiple Service Level Agreements to satisfy different support requirements, for multiple customers or for a single customer. This means we now have the flexibility to easily support different deadlines per customer scenario, based upon the configuration variants possible in the new version of Solution Manager.

 

Service Level Agreements (SLA) in Solution Manager work upon two main concepts:

  1. Service Profile and
  2. Response Profile

 

The basic concepts are already explained in detail in the wiki at the link below:

SLA Management - SAP IT Service Management on SAP Solution Manager - SCN Wiki

 

To add a bit more about the Service and Response Profiles:

  1. Service Profile – defines the time period within which the support partner should provide the service, e.g. 24/7 support, meaning 24 hours every day of the week.
  2. Response Profile – defines, within that time period, the expected start and end date/time, e.g. based upon a criterion such as priority, the expected deadlines for first response and resolution.

 

The two major parameters for SLA measurement offered by SAP Solution Manager are mentioned below which are part of Response Profile:

IRT - Initial Response Time

MPT - Maximum Processing Time

 

Thus, based upon the actual situation, if an incident does not meet its IRT or MPT, it is called an SLA breach or violation.

This can have severe impacts based upon the contract between customer and vendor. We can build reports in Solution Manager, in the SM_CRM transaction, to see the entire list of incidents that are breaching IRT or MPT.
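Stripped of the CRM machinery, an IRT/MPT check is just a comparison of timestamps against deadlines derived from the incident's creation time. A simplified sketch (field names are invented; it also ignores service windows, which a real response profile would apply):

```python
from datetime import datetime, timedelta

def sla_status(created, irt_hours, mpt_hours,
               first_response_at=None, resolved_at=None, now=None):
    """Classify an incident against its IRT/MPT deadlines.
    Missing timestamps are measured against 'now' (open incident)."""
    now = now or datetime.now()
    irt_deadline = created + timedelta(hours=irt_hours)
    mpt_deadline = created + timedelta(hours=mpt_hours)
    return {
        "irt_breached": (first_response_at or now) > irt_deadline,
        "mpt_breached": (resolved_at or now) > mpt_deadline,
    }

# Priority-1 ticket: IRT 1h, MPT 2h. First response came after 30 minutes,
# but three hours later the incident is still unresolved.
created = datetime(2014, 6, 1, 9, 0)
status = sla_status(created, irt_hours=1, mpt_hours=2,
                    first_response_at=datetime(2014, 6, 1, 9, 30),
                    now=datetime(2014, 6, 1, 12, 0))
print(status)  # {'irt_breached': False, 'mpt_breached': True}
```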


Typically, in my organization, we have seen several SAP customers going for a 24/7 (i.e. 24 hours, 7 days a week) support model or a 12/5 (i.e. 12 hours, 5 days a week) support model, based upon their specific support requirements with respect to region, SAP vs. non-SAP incidents (i.e. the category of incident), or SAP module (SD, MM, FI, etc.). It is therefore quite possible to have one customer who would like multiple SLAs: a separate SLA for SAP tickets and another for non-SAP tickets.


Please note: before proceeding further, it is assumed that the system is already configured with Incident Management and that a custom transaction type like ZMIN or YMIN is present in the system as per requirement. Please check the links below for setting up the same.

http://www.service.sap.com/~sapidb/011000358700000608542012E

http://www.service.sap.com/~sapidb/011000358700000608872012E


Basically, the minimum is to have a properly copied transaction type (ZMIN/YMIN) that has the relevant status profile and other configuration, with master data in place.


Further, in our example, let us assume the following 2 different service & response criteria for our 2 different categories of incidents:

  • SAP Tickets – assuming 24/7 support
  • Non SAP Tickets – assuming 12/5 support


I will soon publish the next part continuing from here, so keep following.

Continuous Evolution of LMDB! Let’s Evaluate Again!


I was just remembering how I struggled to understand LMDB in SM 7.1 SP1 in my own sandbox. In those days I didn't have any source for reference except the guide from SAP and the blog #sapadmin:: How to assign Product System in SOLMAN 7.1 & How LMDB, SLD, SMSY and Landscape Verification  work in SOLMAN7…

 

 

That blog was the asset for my entire team to know where to start in LMDB. During that time there was a big debate with Tom Cenens in the blog Solution Manager 7.1 Landscape Management Database - A motorcycle or a bike? about the ease of use of LMDB. Later SAP came forward and clarified the road map, need, and necessity of LMDB in Evolution of Landscape Data Management – What’s better with LMDB?

 

 

We might have thought it was just a vision, accepted it, and moved on, but that was not true: tremendous effort was put into these concepts. As promised by Bjoern Goerke at TechEd 2012, LMDB gradually evolved and is much more mature now.

 

 

I was recently asked to give training to some of my functional colleagues on Solution Manager basic concepts; it was amazing. LMDB has now become more self-guided, and they did the entire setup without much hurdle. It automatically guided them to assign the missing technical systems, product systems, and logical components.

 

 


 

 

It made my job easy! I was impressed and my entire team was satisfied. Two weeks ago I had a discussion with Tom, and we finally decided to reopen the old poll and collect feedback again.

 

 

Before that, thanks to Wolf Hengevoss for his continuous effort to make the product simpler and more powerful.

 

 

You can check out the recent changes to LMDB in SP10: Evolution of Landscape Data Management – Part II: What’s better with LMDB in SAP Solution Manager 7.1, SPS10?

 

 

Now it is time for you all to evaluate LMDB in SM 7.1 SP10. How do you feel about it? Take a few minutes and finish the poll by June 30th, 2014: SAP Solution Manager 7.1 SP10 LMDB

Getting feedback on documentation


Hello all,

 

It's been some time since we adopted the KBA methodology at SAP. I like the idea of being able to release documents for you as quickly as you need them, but we don't get a lot of feedback about them.

 

I would like to know if you have any issues following a Maintenance Optimizer KBA, because I want to make them as good as I can.

 

So, don't be shy, and let me know if you have comments about any SV-SMG-MAI KBAs.

 

I am looking forward to hearing from you!

Decommissioning a Managed System from Solution Manager 7.1 SP11


As you all know, removing a managed system from Solution Manager can be a tedious, time-consuming process. With SP11, SAP has provided a guided procedure for removing (decommissioning) a system from Solution Manager 7.1. The steps below show you how to access the guided procedure and give a high-level preview of the steps provided.

 

SAP also added steps for decommissioning in the latest Solution Manager 7.1 SP11 operations guide. You will need access to the SAP Support Portal to access the document. Here is the link to the guide and a screenshot of its location: https://websmp109.sap-ag.de/~sapidb/011000358700000631992012E.PDF

 


 

1. You need the following composite role to run guided procedures

     a. IT Operator  composite role = SAP_TASK_INBOX_ALL_COMP

2. Enter Transaction SOLMAN_WORKCENTER in SAPGUI

3. Select Technical administration Workcenter

  • Select Guided Procedure Management
  • Select the Managed System
  • Select the Guided Procedure browser drop down
  • Select either Start new window or Start Embedded

 


4. Select the Decommissioning Procedure and select Execute


5. Select Edit

6. Before continuing to the next step, you must understand that by decommissioning the system you will lose the managed system's data. Keep in mind that not all of the steps may be required.

7. Set the execution status to performed and select next to move on to step 2.


8. Step 2 – Application Clean-up has 2 automatic steps and a number of manual steps.

     a. Automatic activities are completed by selecting execute all

    • Remove Technical Monitoring Templates via SOLMAN_SETUP
    • Delete session data, reports, EarlyWatch reports, DVM, and Service Level Reporting with report “RDSMOPREDUCEDATA”

     b. Complete the manual activities by reading the documentation for each step and then selecting Start Webdynpro to navigate to the location where the steps need to be completed.

     c. The manual activities are all about removing the scheduled jobs and monitoring configured in Solution Manager.

     d. When all manual activities are complete, set the Execution Status to Performed and select Next to move on to the next step.


    9. Step 3 – Cross Application Clean-Up

     a. This step has you delete the extractors, RFCs, and transport routes, and uninstall the diagnostics agent from the managed system.

     b. This step and the following steps consist entirely of manual activities. Read the documentation and complete them as needed.


    10.  Step 4 - Planning Projects and Solutions Clean-up

     a. These manual activities remove solutions and delete logical components from the LMDB and SMSY.


    11.  Step 5 – Software Life Cycle Management Clean-up

     a. This step has you remove more product systems and system data from Solution Manager.


    12.  Step 6 – Landscape Management Clean-up

         a. This step has you remove the remaining system data from LMDB and the SLD.


    13.  The final step is just an overview displaying the status of the other steps.


    14.  That is it; once complete, all of the managed system's data is removed from Solution Manager.


    SAP Solution Manager Key on 7.1


    Hello forum,

     

    Today I saw a mail from an "SAP Consultant" asking why the SAP Solution Manager key doesn't work in SMSY...

     

    Well, as indicated in the SAP Note below, the Solution Manager key is no longer provided through the SMSY transaction; you have to generate it using LMDB, as indicated in the note:

     

    1888840 - How to generate installation keys in Solution Manager 7.1


    You can see in the next pictures how to access that LMDB functionality:


     

    LMDB start screen


     



    To do that you don't need to finish the managed system setup: you can select any system in the LMDB technical systems selection, click Generate Installation Key, and then, on the next screen, simply fill in the fields with the correct information to generate the key.

     

    I checked SAP Notes to see if there are any new updates about this, and found the following one interesting:

     

    811923 - SAP Solution Manager Key


    "...The SAP Solution Manager Key is no longer required..."


    I don't install SAP software, but it's curious to see people asking for the Solution Manager key when there is a note where SAP indicates that the key is no longer required :-O

     

    Does anyone have more information about that last SAP Note?

     

    Best Regards,

    Lluis



    How To - Configure Multiple SLA based upon MLC in IT Service Management - Part2


    In continuation of the earlier blog (Part 1), we are going to explore the steps to configure and implement multiple SLAs for the same customer in Solution Manager 7.1 IT Service Management, where a single SLA is not sufficient.


    A practical example would be a company/customer that wants 24/7 support for SAP incidents and 12/5 support for non-SAP incidents, such as hardware or IT asset incidents.


    Please note: before proceeding further, it is assumed that the Solution Manager system is already configured with Incident Management and that a custom transaction type like ZMIN or YMIN is present in the system as per requirement. Please check the links below for setting up the same.

    http://www.service.sap.com/~sapidb/011000358700000608542012E

    http://www.service.sap.com/~sapidb/011000358700000608872012E


    Basically, the above signifies the minimum configuration: a properly copied transaction type (ZMIN/YMIN) that has the relevant status profile and other configuration, with master data in place.


    Further, in our example, let us assume the following 2 different service & response criteria for our 2 different categories of incidents:

    • SAP Tickets – assuming 24/7 support
    • Non SAP Tickets – assuming 12/5 support
    • Customer name -  assuming XYZ Corp India


    (PS: the above are just dummy/imaginary values; you can replace them with real values as per your requirement.)


    For SAP tickets, the availability of support should be 24/7, i.e. 24 hours and 7 days a week. This means users can post a ticket at any time on any day, and the response should be provided as per the agreed criteria in the table below:


    Customer  | Priority | IRT/MPT           | Deadline
    ----------|----------|-------------------|---------
    XYZ India | 1        | First Response By | 1 hour
              |          | To Do By          | 2 hours
              | 2        | First Response By | 2 hours
              |          | To Do By          | 4 hours
              | 3        | First Response By | 8 hours
              |          | To Do By          | 12 hours
              | 4        | First Response By | 12 hours
              |          | To Do By          | 16 hours

    Table 1: SLA for SAP Tickets with 24/7 support


    Please note: the timestamps First Response By and To Do By measure the IRT and MPT respectively, and are defined in the system for setting up the deadlines.

    For non-SAP tickets, the availability of support should be 12/5, i.e. 12 hours a day, 5 days a week. This means users can post a ticket at any time, but support is provided for 12 hours a day on 5 days of the week, excluding Saturday and Sunday.


    The response or resolution for incidents should be provided as per the agreed criteria in the table below:


    Customer  | Priority | IRT/MPT           | Deadline
    ----------|----------|-------------------|---------
    XYZ India | 1        | First Response By | 3 hours
              |          | To Do By          | 5 hours
              | 2        | First Response By | 5 hours
              |          | To Do By          | 9 hours
              | 3        | First Response By | 7 hours
              |          | To Do By          | 12 hours
              | 4        | First Response By | 11 hours
              |          | To Do By          | 16 hours

    Table 2: SLA for non SAP Tickets with 12/5 Support
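The two tables above differ only in their durations and service windows. How a deadline follows from a service window can be sketched as follows (plain Python; the window definitions are illustrative stand-ins for what the service profile in crmd_serv_sla maintains, and the sketch counts whole hours only):

```python
from datetime import datetime, timedelta

# Service windows (illustrative): 24/7 for SAP tickets, 12/5 for non-SAP
# tickets, here assumed Mon-Fri 08:00-20:00.
WINDOWS = {
    "24x7": {"days": range(7), "start": 0, "end": 24},
    "12x5": {"days": range(5), "start": 8, "end": 20},  # Mon=0 .. Fri=4
}

def deadline(created, hours, window):
    """Walk forward hour by hour, counting only hours inside the
    service window, until the SLA duration is consumed."""
    w = WINDOWS[window]
    t, remaining = created, hours
    while remaining > 0:
        if t.weekday() in w["days"] and w["start"] <= t.hour < w["end"]:
            remaining -= 1
        t += timedelta(hours=1)
    return t

# Priority-1 non-SAP ticket (Table 2): IRT of 3 hours. Posted Friday 18:00,
# only two window hours remain before the weekend, so the deadline
# carries over into Monday morning.
created = datetime(2014, 6, 6, 18, 0)  # Friday 18:00
print(deadline(created, 3, "12x5"))    # 2014-06-09 09:00:00 (Monday)
```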

     

    Let us start configuring the SLAs as per our requirements in Tables 1 and 2, with the 5 main steps below, in the following sequence:

    1. Creation or Configuration for Service and Response Profile
    2. Creation of Product and assignment of Service & Response Profile
    3. Assignment of Service Products to Multi Level Categorization(MLC) & Sold to Party
    4. Configuration of SLA Determination procedure in SPRO
    5. Assignment of SLA Determination Procedure to transaction type

     

    Step 1: Creation or Configuration for Service and Response Profile


    First we need to create two different service and response profiles in the system for the customer XYZ Corp India. Log on to the Solution Manager 7.1 system and enter the transaction code “crmd_serv_sla”, or follow the SPRO path below.


    SAP Solution Manager Implementation Guide -> SAP Solution Manager -> Capabilities (Optional) -> IT Service Management -> SLA Escalation -> Edit Availability and Response Times.


    This will help us to configure Service and Response Profiles.

    Click on Service Profile and then click the New Entries button to create a new service profile.


     

    The next step is to enter a code (say ZSP), provide a description, and then click the highlighted icon to maintain the service availability timelines as required for SAP tickets/incidents. We create this one for 24/7 support.


     

    Clicking the highlighted icon lets you maintain the timelines or period. To maintain the service availability times as 24/7, proceed as shown below.

     

    Note: I am assuming all days are working days here, but it is possible to define your own calendar (via the SCAL transaction), which is very basic and easy for any Solution Manager functional consultant.

     

    Further, click the Copy button as highlighted above, and then save.


    On returning to the earlier screen, click New Entries again and repeat the steps to create our second service profile, with a 12/5 timeline, for non-SAP incidents/tickets.

    Once the service profile creation is complete, we can proceed with the response profile setup.

    The next step is to configure the response profile in the same transaction (crmd_serv_sla), so click Response Profile and then the New Entries button.

     


    Create a response profile ZRP and maintain a description. Choose the Priority checkbox as shown.

     


    Now click on Indicators for Response Times on the left side of the screen, which changes the right part of the screen; there we can maintain the priority values available in the system by pressing F4 as shown. Map all the priority values as required.

     


    Once added as in figure 8, all the above priorities are attached to the response profile, as shown below.

     


    Next, for each priority, we need to assign the relevant First Response By and To Do By parameters as per our Table 1.

    Therefore choose Priority “1” as shown below and then click on Response Times on the left side of the screen.

     


    Double-clicking the Response Times node on the left side of the figure opens a subscreen on the right part of the same figure. Maintain the values for the two durations as shown below:

    • SRV_RF_DURA – for IRT (Initial Response Time)
    • SRV_RR_DURA – for MPT (Maximum Processing Time)

     

     

    Similarly, proceed for priorities 2, 3, and 4 by repeating the above steps and maintaining the durations.
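    The per-priority maintenance above boils down to a mapping from priority to the two durations. A minimal sketch of how the deadlines are derived (the duration values are illustrative placeholders, not the actual figures from Table 1, and the sketch ignores the service profile's availability windows, which the real SLA calculation takes into account):

```python
from datetime import datetime, timedelta

# Hypothetical response profile: priority -> (SRV_RF_DURA / IRT, SRV_RR_DURA / MPT).
# The numbers are invented for illustration.
ZRP = {
    1: (timedelta(hours=1), timedelta(hours=4)),
    2: (timedelta(hours=2), timedelta(hours=8)),
    3: (timedelta(hours=4), timedelta(hours=24)),
    4: (timedelta(hours=8), timedelta(hours=48)),
}

def sla_deadlines(priority: int, created_at: datetime):
    """Compute the First Response By and To Do By timestamps for an incident."""
    irt, mpt = ZRP[priority]
    return created_at + irt, created_at + mpt

created = datetime(2014, 6, 2, 9, 0)
first_response_by, to_do_by = sla_deadlines(1, created)
print(first_response_by)  # 2014-06-02 10:00:00
print(to_do_by)           # 2014-06-02 13:00:00
```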

     

    Now we create another Response Profile, ZRP2, and maintain the First Response By and To Do By parameters as per Table 2 by repeating all the above steps.


    Step 2: Creation of Product and assignment of Service & Response Profile


    Based on the example requirements of our customer XYZ Corp India, we now create two Service Products, for SAP and non-SAP incidents respectively. To create a service product, access transaction commpr01, which opens the screen below.

    First, hit the search button in the Find Area.


    The default SAP Service Product “Investigation”, created via the solman_setup ITSM wizard, is then shown. Double-click to choose it and enter change mode by clicking the pencil (edit) icon. Then assign the Service and Response Profile, which are empty initially.


    Save it.


    Similarly, we copy the above product, using the copy button, to a new product named Hotline, and assign the 12/5 Service and Response Profiles as shown below.


     


    We are now ready to assign them in multilevel categorization (step 3, as mentioned earlier in the blog).


    We will continue with the remaining steps in another blog.


    How to send an email with a custom job report from Job Management Work Center


    Do you need to know by the time you enter the office if the jobs that were scheduled during the night ended without errors?

    Do you want to receive customized job reports on your mobile device?


    Would you like to be able to receive the result of a Job Management POWL query, for example the list of jobs in a 24h time window as shown below, by email?

    00 jsm_wc_smse_query.png

     

    Prerequisites:

     

    1. You have SAP Solution Manager Support Package 10 or higher installed

     

    2. You have created the required POWL queries in the Job Management Work Center, which either

    a) collect the job data from the SAP system directly, or

    b) collect the job data from an external job scheduler like SAP CPS by Redwood

    The queries have to use dynamic time windows in order to cover, for example, the job runs of the past 24 hours.

     


    Configuration:

     

    Once the queries are available, navigate in the Job Management Work Center to the Administration view and start the configuration of the POWL Query Result notification. As the name implies, this works for all kinds of POWL queries, not only the queries defined in the Job Management Work Center.

    10 jsm_wc_admin - config.png

    In the configuration application you just select Add, then select the POWL query whose results you want to receive (make sure that you have selected the correct type of query) and enter your business partner number.

    11 jsm_wc_admin_new_query_result.png


    After pressing OK you should see your configuration in the list of available configurations. Select your configuration, and the assigned business partners will be displayed:

    12  jsm_wc_admin_new_query_result_2.png


    Of course it is possible to assign more than one business partner to a configuration, as you can see in this example:

    13 jsm_wc_admin_new_query_many_recipients.png


    Automation

     

    Now that the configuration is done, return to the Administration view in the Job Management Work Center and select Schedule Job in the POWL Query Result Configuration section.

    20 jsm_wc_admin -schedule.png


    The job scheduling dialog of Job Scheduling Management starts, already prefilled with the report to be scheduled. Just enter your preferred start time and recurrence, and you're done.

    21 jsm_wc_schedule_email_job.png

     

    Result:

     

    What's next? Just wait for the job to run, and you will receive an email containing the results of your POWL query; see the example below:

    30 POWL_email.png

    SLA in ITSM: Using BRFplus and Decision Tables for SLA Assignment


    DRAFT VERSION (Final Version in 1 week)


    Concept: This blog shows you how to configure the SLA assignment based on the information in the incident. With standard customizing it is not possible to use different conditions; this configuration lets you combine several inputs to assign the SLA that you need.

     

    This functionality will be covered in two blogs: this one, to create the decision table, and the next one, to create the PPF action.

     

     

    In Solution Manager, execute transaction BRFPLUS.

     

    In the menu, select “Create Application”.
    image001.png

    Define the Application name:

    image003.png

    Remember to SAVE and Activate the application.
    image006.png


    Create the New Elements:

    Right-click the application and select the option to create a new element.

    image007.png

    Create the element using “Bind to DDIC Element (Data Dictionary)”.

    image009.png

    Use the DDIC element CRMT_SRV_SERWI to bind the Service Profile.

    image011.png

    Remember to SAVE and Activate

     

    Execute the same steps for CRMT_SRV_ESCAL.

    image015.png

    Now you will see the data objects created and activated.

    image017.png


    Create the Decision Table:

    Now create the decision table: right-click the application and select Expression -> Decision Table.

    image019.png

     

    Create the table and navigate to it.

    image021.png

    Now you need to define the columns.

    image023.png

    Insert a new column from the Context Data Objects.

    image026.png

    Select the application SOLMAN and add the fields that you want to use for the assignment; in this example we use PRIORITY and CATEGORY.

    image027.png

    Now add the result columns.

    image029.png

    In this case we use the objects created previously (CRMT_SRV_SERWI and CRMT_SRV_ESCAL).

    image032.png

     

     

    Now add data to the table: select the Insert icon.

    image033.png

    In this example we add two entries with different combinations of priority and service profile.

    image036.png
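    Conceptually, the decision table behaves like a first-match lookup over the condition columns, returning the two result columns. A rough Python analogue (the priority/category values and profile IDs below are invented for illustration; the real evaluation is done by BRFplus at runtime):

```python
# Hypothetical rows mirroring the decision table:
# condition columns (priority, category) -> result columns (service, response profile).
DECISION_TABLE = [
    (("1", "SAP"),     ("ZSP",  "ZRP")),
    (("3", "NON-SAP"), ("ZSP2", "ZRP2")),
]

def assign_sla(priority: str, category: str):
    """Return (service_profile, response_profile) for the first matching row."""
    for (p, c), result in DECISION_TABLE:
        if p == priority and c == category:
            return result
    return None  # no row matched: no SLA assigned

print(assign_sla("1", "SAP"))      # ('ZSP', 'ZRP')
print(assign_sla("3", "NON-SAP"))  # ('ZSP2', 'ZRP2')
```

Adding more condition columns (e.g. CATEGORY levels or the sold-to party) just widens the key tuple; the lookup idea stays the same.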


    Create the Function:

    Now add a function: right-click the application and select Function.

    image038.png

    Create the function and navigate to the object

    image039.png

    Now we add the data objects that will be used by the function: in the Signature tab, select Add Existing Data Objects.

    image042.png

     

    Add all the objects found in the SOLMAN application (this allows you to use different columns in your decision table).

    image044.png

     

    Now we need to define the result data object: select Create in the action button.

    image046.png

    Create a new structure object to allow you to pass the two values.

    image048.png

    In the new structure, click Add Existing Data Objects.

    image050.png

     

    Select the application ZSLA and add the two new objects created in the previous steps.

    image052.png

    Remember to SAVE and ACTIVATE


    Create the Ruleset:

    Now it is time to create a new ruleset: select your function (the blue pyramid icon), open the Assigned Rulesets tab, and click “Create RuleSet”.

    image054.png

     

    Define the new name

    image056.png

    After creating the ruleset, create a new rule: click “Insert Rule” and then Create.

    image057.png

    Now write a description, and then, under “Then -> Add”, select the option “Process Expression -> Select”.

    image060.png

     

    Now select the decision table created before

    image062.png

    Now you need to SAVE and Activate.


    BR



    Secure your EhP/SPS implementation with Scope & Effort Analyzer in SAP Solution Manager


    The everyday duty of IS/IT organizations, especially SAP Competency Centers, is to regularly deliver innovation and new features to the business, according to requirements, budget, and priorities. But it is also to keep the existing solution in operational condition.

    SAP recommends delivering two major releases per year and minor releases on a monthly or weekly basis, depending on the maturity and stability of your solution. This is directly inspired by SAP's own release strategy of Enhancement Packages and Support Package Stacks.

    SEA - Intro.jpg

     

    Like any technical or functional project, an EhP or SPS implementation project has to be budgeted, prepared, and planned accordingly to mitigate risks and delays, and above all to avoid potentially negative effects on live processes and solutions.

    SEA - Update Project challenges.jpg

     

    Scope & Effort Analyzer (SEA) is an innovative tool designed for those people who have to manage maintenance events on their SAP systems and need to get a clear understanding of the change impact and the test scope and related effort. It has been recently shipped with SAP Solution Manager 7.1 SP11 (March 2014) and helps to predict the major cost and effort drivers of maintenance projects (aka: software change events) without the need to physically deploy any software packages. We highly recommend it to be used in an early planning phase of each and every maintenance project.

     

    The analysis results cover two parts:

    • Adaptation and development management: identification of affected custom code and modifications, and of required adjustments in the SAP system, since software updates come with updates or deletions of SAP standard objects; detailed effort estimation for custom code and modification adjustments.
    • Test management: identification of the required test scope, test planning, recommendations for the creation of missing test cases and the execution of manual tests; detailed effort estimation for regression tests and recommendations based on test scope optimization.

     

    It relies on Usage and Procedure Logging (UPL), a new functionality available in any ABAP based system as of SAP NetWeaver 7.01 SP10 or equivalents. UPL is used to log all called and executed ABAP units (procedures) like programs, function modules down to classes, methods and subroutines or smart forms without any performance impact. UPL will give you 100% coverage of usage without estimations or evaluation of ABAP call stacks. This also includes the detection of dynamically called ABAP elements. UPL is the technology to close existing gaps in the SAP workload statistics which only captures static calls as opposed to static and dynamic calls.

    With help of UPL technique it is now possible to calculate the impact to custom code, modifications and business processes in consideration of the real system usage.


    The Maintenance Optimizer scenario automatically calculates the update vector (a detailed list of all technical support and/or enhancement packages needed to reach the target definition). Scope and Effort Analyzer then calculates all SAP ABAP objects which are either deleted, changed (updated version), or newly delivered with this software update. This ABAP object list, or Bill of Material (BOM), is the central element for calculating the impact on your SAP system, even without a physical installation of those packages.
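    The idea behind the BOM can be sketched as a version comparison between the current and the target package level. This is only a conceptual illustration with invented object names and versions; SEA derives the real list from SAP's package metadata:

```python
# Invented object inventories for the current and target package levels.
# Convention for this sketch: None in the target means the object is deleted.
current = {"SAPLMGMM": 3, "MARA": 7, "RV_INVOICE_CREATE": 2}
target  = {"SAPLMGMM": 4, "MARA": 7, "RV_INVOICE_CREATE": None, "CL_NEW_API": 1}

def build_bom(current: dict, target: dict) -> dict:
    """Classify every target object as new, changed, or deleted vs. current."""
    bom = {"new": [], "changed": [], "deleted": []}
    for obj, version in target.items():
        if version is None:
            bom["deleted"].append(obj)
        elif obj not in current:
            bom["new"].append(obj)
        elif version != current[obj]:
            bom["changed"].append(obj)
    return bom

print(build_bom(current, target))
# {'new': ['CL_NEW_API'], 'changed': ['SAPLMGMM'], 'deleted': ['RV_INVOICE_CREATE']}
```

Objects that are unchanged (like MARA here) drop out of the BOM, which is what keeps the impact calculation focused on the actual delta.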


    In addition to this, semi-dynamic TBOM generation and the automated generation of SAP-module-oriented blueprints are two additional sources of value. These features help identify the impact on your business processes and transaction codes, with the objective of outlining the test scope and giving recommendations on how to reduce the test effort with the help of the Test Scope Optimization (TSO) functionality. This is achieved through a program-based optimization of the number of changed objects per business process and of the test effort of the associated test cases.

    SEA - Additional benefits.jpg

    Testimonial and success story: the French customer Coliposte is one of the very first worldwide references for the usage of SEA. With the help of our consultants, they implemented this new tool in a remarkably short time in the context of their EhP7 for SAP ERP adoption project.

    David Bizien, the CIO for the Financial Department of Coliposte, explains in the following video how SEA helped him:

    • Forecasting the overall effort and budget for the EhP7 upgrade project
    • Focusing on the most critical impacts
    • Identifying the required skills and competencies for adaptations and developments
    • Booking and mobilizing the appropriate resources for testing
    • Taking into account the team planning constraints

     

     

    This interview was recorded at SAPPHIRE NOW 2014 in Orlando.

    If you wish to learn more about this secure (no code or sensitive information is exposed outside your company), very comprehensive (it covers both SAP standard and customer-specific objects), and nevertheless free-of-charge tool, feel free to visit our website or SAP Marketplace.

    Scope & Effort Analyzer is the ultimate tool to secure your SAP maintenance or evolution projects.

    Interesting Information About Interactive Reporting In SM 7.1 SP10


    I have often seen people talking about interactive reporting, one of the interesting features in Solution Manager 7.1. Today I got some time to explore it in detail and found answers to questions like what it is all about and how it differs from other reporting approaches. Through this blog I would like to share my learning with you all, and I look forward to hearing about your experiences.

     

     

    What is interactive reporting in solution manager context?

     

     

    As you all know, Solution Manager has various reporting options; one of the most prominent and recently promoted is interactive reporting. The common use case for interactive reporting is to perform ad hoc analyses in real time. These reports are useful for scenarios with limited data volume, where a quick result matters more than the amount of information analyzed. Since Solution Manager has its own BI content, interactive reporting is a good fit.

     

     

    How does IR differ from other reporting?

     

     

    Technically, I did not see any difference: both use BI cubes, queries, and templates. But the scenarios are different. Moreover, the delivered reports always use queries generated in BW, and data is retrieved from BW into CRM for reporting purposes. The difference we identified lies in the scenarios where each is used.

     

    When we set up the current or a dedicated client in the Solution Manager system as the BI client, the best suggestion is interactive reporting, because not much processing is required; most of the time these are very basic reports.

     

    Whereas if you have a dedicated BI system, BI reporting is more efficient for huge data volumes. This is more complex, with a heavier level of processing, so BI reporting is the better option there.

     

    Another big difference I found is that you can create, edit, and display interactive reports in the ITSM web UI (CRM_UI). These reports retrieve data in real time, on demand. A self-guided wizard assists us during the creation of these reports, and you can then release them for certain users.

    For more information, see ITSM Analytics. This is not the case with BI reporting, where most reports are predefined. On the other hand, BI reporting has more navigation options and conditions for drilling down to the base level.

     

    The good thing is that Solution Manager has use cases for both scenarios: for detailed analysis such as RCA we can use BI reporting, whereas for technical monitoring scenarios we can use interactive reporting.

     

    Below is a Technical Monitoring interactive report, which shows basic details with restricted navigation; this is helpful for everyone, including end users.

     

    t1.JPG

     

    Here is RCA BI reporting, whose detailed analysis is very helpful for administrators.

     

    t1.JPG

     

    Below is the guided procedure for interactive report creation in ITSM.

     

    t1.JPG

     

     

    Is it new to Solution Manager?

     

     

    No. From our cross-check, we identified that IT Performance Reporting, used since SM 7.0 EhP1, is the same concept. IT Performance Reporting has now been enhanced with new interactive reporting templates. IT Performance Reporting in SM 7.1 is the same as Technical Monitoring -> Interactive Reporting; both use the same web templates.

     

    In Solution Manager 7.0 IT Performance Reporting, the web template 0SMD_RS_NAVIGATION has only a few reports.

     

    t1.JPG

     

     

    The same web template is enhanced in SM 7.1 SP10 with more than 50 reports, as shown below; it is now called “Interactive Reporting”.

     

    t1.JPG

     

     

    Unique features of IR reporting

     

     

    If I compare IR reporting with BW reporting, BW reporting might end up with the higher rating. But looking at the unique features of IR reporting: it is efficient and accurate, and it is targeted at all kinds of users. You can display reports as tables and charts. The following chart types are available: column chart, line chart, pie chart, bar chart, and stacked column chart.

     

     

    t1.JPG

     

     

    You can use these reports to analyze data in many different ways, including a breakdown of individual documents. The report data is retrieved in real-time. You can export report data to Microsoft Excel and print reports.

     

     

    Technical settings

     

    This again varies from scenario to scenario; we need to activate the corresponding BI content. For interactive reporting in Technical Monitoring, you can do this via the Technical Monitoring guided procedure setup.

     

     

    t1.JPG

     

     

    For interactive reporting in ITSM, you need to manually add some of the configuration services; refer to https://websmp206.sap-ag.de/~sapidb/011000358700001194612012E

     

    And also SAP ITSM Analytics on SAP Solution Manager 7.1 - SAP IT Service Management on SAP Solution Manager - SCN Wiki


    Other technical demos for IR reporting


    There are lots of demos available on the Solution Manager RKT website; a few are listed below. You can try them out and share your use cases in the comments.


    https://websmp106.sap-ag.de/~sapidb/011000358700000665452012E/index.htm

     

    https://websmp106.sap-ag.de/~sapidb/011000358700000479372011E/index.htm

    BPCA - Powerful Risk Eliminator


    Is there a smart and fast impact analysis tool for SAP applications? The answer is BPCA, i.e. the Business Process Change Analyzer.

     

    How many of us are aware of this smart tool? We have implemented it in our project, and it works wonderfully.

     

    In SAP, change is constant: it may come from support package upgrades, custom releases, etc. So it is important to assess the impact of each change.

    During the production movement of a new change, we often have common questions that BPCA answers automatically:

    • What’s changing?
    • What’s the impact of the change?
    • How do we identify the impacted business processes to be tested, to confirm there is no negative impact?
    • Can we get recommendations for regression tests?

     

    SAP-centric solutions are changed on a regular basis by SAP or customers, which requires customers to test their business processes thoroughly. Sometimes it is difficult to identify the critical business processes, and this is where BPCA comes into the picture: it helps to find the list of all impacted business processes for a Solution Manager project.

     

    In simple terms, the process is: 1. Capture the impacted steps. 2. Validate the steps. 3. Mitigate the risk and impact by performing regression testing. 4. Confirm the results.

     

     

    pic1.jpg

     

    This blog will help in understanding the concept of Business Process Change Analyzer (BPCA).

     

    Pre-requisite: Basic knowledge of SAP Solution Manager Concepts.

     

    To answer the questions on BPCA (what is it, how does it perform change impact analysis, and what are the prerequisites to implement it?), let us go through the detailed steps.

     

    What will we cover in this blog?

     

    • Introduction: What is BPCA?
    • Technical Prerequisites for BPCA
    • BPCA Preparation: Identify and mark critical business processes
    • What are TBOMs and TBOM Generation Ways?
    • Results Interpretation with Change Impact Analysis.
    • Test Scope Optimization

     

    Let’s begin!!!

     

    1. Introduction: What is BPCA?

     

    In day-to-day scenarios, SAP solutions are changed either by SAP or by customers when there is a need for enhancement. Customers then need to test their business processes thoroughly to ensure the change does not have a negative impact on other, existing business processes. Test scope identification is an important activity that helps determine the time and effort required for testing.

    Before we start the actual testing it is important to differentiate between the types of SAP solution change.

     

    Types of SAP solution change could be:

    • Maintenance or Enhancement to SAP support packages
    • Configuration changes
    • Custom developments, etc.

     

    For these types of changes, the standard test management approach is depicted in the picture below:

             pic2.jpg

    Standard test management approach:

    • Perform an initial risk assessment of the effects of the change on critical business processes.
    • Based on the impact analysis results, plan the testing.
    • Once test planning is done, execute the test cases either manually or via automated test scripts.

     

    Throughout this process there is a major pain point: which business processes are affected by this planned change? Let me explain the situation with a project scenario:

     

    In any project, whenever a planned change needs to be moved to the production system, specific approvals need to be obtained for the production movement. So if there is a change to existing functionality, complete testing needs to be performed and business approval needs to be obtained. To get the change approved, it has to be presented to the approver board, which decides whether to approve or reject it. In such meetings, questions are asked about integration testing, business acceptance tests, and non-negative-impact testing. Based on the answers from the change owner the change may be approved, but there is no hard evidence of the impact assessment and the testing performed for it.

     

    So the major challenges faced in this process are:

    • How to identify which business processes are impacted by this change?
    • How to get test recommendations?

     

    In order to address these pain points SAP introduced a new type of analysis application called “Business Process Change Analyzer” which is capable of performing change impact analysis.

    We can perform this analysis based on a transport request number; the result is the list of business process steps impacted by that particular change.
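    At its core, the matching works like a set intersection: a business process step is flagged when its TBOM (the set of technical objects recorded for that step) shares at least one object with the transport request. A simplified Python sketch, with invented TBOM contents and object names:

```python
# Hypothetical TBOMs: process step -> set of technical objects it uses.
tboms = {
    "Create Material (MM01)": {"SAPLMGMM", "MARA", "CL_MATERIAL_API"},
    "Change Material (MM02)": {"SAPLMGMM", "MARA"},
    "Create Sales Order (VA01)": {"SAPMV45A", "VBAK"},
}
# Objects contained in the transport request under analysis (invented).
transport_objects = {"SAPLMGMM", "ZREPORT_X"}

def impacted_steps(tboms: dict, transport: set) -> list:
    """Return the business process steps whose TBOM intersects the transport."""
    return [step for step, objects in tboms.items() if objects & transport]

print(impacted_steps(tboms, transport_objects))
# ['Create Material (MM01)', 'Change Material (MM02)']
```

This also shows why TBOM quality matters: an incomplete TBOM (e.g. a shallow static one) misses intersections and therefore misses impacted steps.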

     

    2. Technical Prerequisites for BPCA:

     

    Let us see prerequisites for BPCA implementation:

     

    • Adequate business process repository in Solution Manager: the business process repository should be complete, with all business scenarios, business processes, and business process steps.
    • Business process steps with transactions or programs: each business process step should be assigned the relevant ABAP object, e.g. a transaction code or report program.
    • Test cases defined per business process.
    • Test cases uploaded to Solution Manager.
    • TBOM creation: TBOMs should be created for each business process step.

     

    3. BPCA Preparation: Identify and mark critical business processes:

     

    BPCA uses an SAP Solution Manager project as the basis for the analysis and for structuring the results.

    1. First, a project needs to be created in Solution Manager using transaction code SOLAR_PROJECT_ADMIN. Once the project creation is complete, the business blueprint needs to be configured via t-code SOLAR01. The business blueprint function enables you to design, document, and hierarchically catalogue all the business processes into business scenarios, business processes, and business process steps.

     

    As shown in the picture below, the business blueprint comprises the following elements in a hierarchy:

     

    pic3.jpg

     

    2. Business processes must be defined in SAP Solution Manager within the project. Below is a sample project where material creation and material change are business processes.

      pic4.jpg

                                    Fig: Sample blueprint structure

     

    3. Managed systems, i.e. the system landscape where that particular transaction/program will be executed, must be assigned via a logical component, which connects SAP Solution Manager to that system landscape. A logical component is an administrative entity that assigns logical systems across the entire system landscape and across projects. For example, if we want to test the ‘MM01’ t-code in a specific system landscape, the logical component is used to connect to that system via SAP Solution Manager; in short, it is the intermediate connection between SAP Solution Manager and the managed system.

    4. After the transactions are assigned, make sure test cases are assigned to each business process or business process step.

          

              pic5.jpg

     

    • Identify and mark critical business processes: The criticality of a business process can be set in the business blueprint of the project. The criticality setting helps in prioritizing business processes during Test Scope Optimization and also during BPCA analysis.

    We can set the Business Process Priority in the Attributes tab of the business process; for this we need to configure a customer attribute.

    Let us see how to define a customer attribute. Below is the path where we can define the new customer attribute ‘Business Process Priority’.

     

            pic6.jpg

       

    To mark a business process as critical, we set its ‘Business Process Priority’ to 1. We can define our own business priorities; e.g., if it is a critical business process, the priority is ‘1’.

       

            Let us see how we can mark the critical business process.In below example Material Creation is critical business process so that process is selected in

            blueprint.

       pic7.jpg

    Then go to the attributes of the business process, and in the customer attributes set Business Process Priority to ‘1’. ‘Material Creation’ is now marked as a critical business process.

       pic8.jpg

    4. What are TBOMs and TBOM Generation Ways?

     

    Up to this point, we have seen the first BPCA preparation step: business process setup, including marking critical business processes.

     

    Let us see the next step, i.e. TBOM generation.

     

    Users run the business process transaction so that BPCA can collect the technical objects used during its execution. This collection of technical objects is called the technical bill of material, or TBOM. For example, suppose we want to create a TBOM for the ‘Material Creation’ business process step, which is executed using transaction “MM01” in the R/3 system. Execute the transaction from SAP Solution Manager, which connects to the ECC system; provide the required input parameters and complete the creation of the material. At the same time, BPCA enables a trace in the ERP system and collects all the objects used, such as data dictionary objects, includes, function modules, etc. This list of objects becomes the TBOM for transaction “MM01”.

     

    Let us see what TBOM generation ways are:

     

    BPCA TBOM Generation Ways:

     

    • Static TBOM
    • Semi-Dynamic TBOM
    • Dynamic TBOM

     

    TBOMs can be created using three methods; let us see them in detail.

     

    1. Static TBOMs: Static TBOMs are created by scanning through the source code of a transaction or report program. The scan statically notes down all the technical objects used in that particular report or transaction. It is restricted in scan depth, i.e. how deep it is able to scan the objects. Static TBOMs are therefore likely to be inaccurate, as they may miss objects buried in deep levels of the code, and they are not recommended for productive use in impact assessment.

     

    To create a static TBOM, select the business process step and go to its attributes. In the attributes, go to the TBOM tab and click Create TBOM; in the popup that appears, select Static and create the TBOM.

     

    Step-by-step TBOM creation:

    1. Go to the business blueprint and select the process step for which the TBOM needs to be created. Once the step is selected, go to the Transactions tab and click the Attribute button.

                            pic9.jpg

    2. Then go to the TBOM tab and click ‘Create TBOM’. It offers three options: static, dynamic, and semi-dynamic. Select static, with branching levels up to 5.


                 pic10.jpg

                 pic11.jpg

    3. Click ‘OK’; static TBOM generation starts, and a popup appears: “TBOM Created”.

                             pic12.jpg  

    4. After creation, we can see the contents of the TBOM:

                           pic13.jpg

    5. Below are screenshots of the TBOM contents and the list of technical objects. The chart shows the percentage of each SAP component type, e.g. the percentage of ABAP objects. Below is the TBOM example for transaction MM01, which contains 373 objects in total. Note that different types of objects have been collected: program/code objects, table contents, data dictionary objects, business transactions, etc.

     

                           pic14.jpg

                           pic15.jpg

    2. Dynamic TBOMs: A dynamic TBOM is similar to taking an ST05 trace; it builds the list of objects by enabling a trace. That means that to create a dynamic TBOM, the user has to execute that particular business process step or transaction, either manually or automatically, and the technical objects used during the execution are recorded. Dynamic TBOMs are more accurate than static TBOMs.

     

    Briefly, to create a dynamic TBOM: once you select the ‘Dynamic’ option, you are offered a start recording option. Click ‘Start Recording’ and complete the execution of the transaction on the managed system. Once done, stop the recording, and the dynamic TBOM is created.

    After creation of TBOM contents can be viewed.

                            pic16.jpg

                            pic17.jpg                       
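The trace-based idea behind dynamic TBOMs can be illustrated with a tiny sketch: while the step runs, a trace collects every technical object it touches, and the de-duplicated list becomes the TBOM. All names below are made up for illustration; this is not an SAP API.

```python
# Hypothetical sketch of the dynamic-TBOM idea: while a transaction runs,
# a trace records every technical object it touches; the de-duplicated
# list becomes the TBOM.

def record_tbom(execute_step, trace_log):
    """Run a business process step and collect the distinct objects
    reported by the trace into a TBOM-like sorted list."""
    execute_step()                     # user executes the transaction
    return sorted(set(trace_log))      # de-duplicate the traced objects

trace = []

def run_mm01():
    # objects the trace would capture during execution (illustrative)
    trace.extend(["SAPLMGMM", "MARA", "MAKT", "MARA"])

tbom = record_tbom(run_mm01, trace)
print(tbom)   # duplicates removed
```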

    3. Semi-Dynamic TBOM: Semi-dynamic TBOMs are created using UPL (Usage Procedure Logging) data from the production system. UPL is kernel-level logging. To create semi-dynamic TBOMs, Solution Manager must be on 7.1 SP11 or above and the managed system on SP9 or above.

    Semi-dynamic TBOMs are created en masse using a background job in BPCA. They are the most accurate, as they are based on usage data from the production system.

     

    UPL data is collected at the OS/kernel level in the managed system. Two main jobs are executed in the production system to collect UPL data:

     

    •          Collector Job – This job runs every 45 minutes to collect the UPL logs.
    •          Daily Job – This job runs daily to extract the usage statistics. You can execute the report /SDF/SHOW_UPL to see a sample of the UPL data.

    Let us see how to create semi-dynamic TBOMs:

     

    1. To create a semi-dynamic TBOM, execute transaction SOLMAN_WORKCENTER and go to “Administration”. In Administration, click 'Go to TBOM Utilities'.

                           pic19.jpg

    2. Click on ‘Generation of Static and Semi-Dynamic TBOMs’ option.

                          pic20.jpg

    3. Select option ‘create semi-dynamic TBOM’ and enter the period for which UPL data needs to be fetched.

                          pic21.jpg

                            pic22.jpg                  

    4. Once this is done, schedule a job and change the start to ‘Immediate’.

    5. Once the job has finished, check the job log for the semi-dynamic TBOM; it shows that TBOM creation is complete.

                           semi-dyn.png

    These are the three ways to create TBOMs: static, dynamic, and semi-dynamic.

     

     

    5. Results Interpretation with Change Impact Analysis:

    Let us see how to run a change impact analysis and how to interpret its results.

    1. Go to the Test Management work center by executing SOLMAN_WORKCENTER.

    2. Go to the BP Change Analyzer view. This view shows the list of previously run analyses and also allows us to create a new analysis. Scroll down the screen to see the analyses already completed.

                               pic25.jpg

    3.   Now let us see how to create a new analysis:

          1. In the initial screen, enter all the details required for the analysis, as shown in the screenshot below:

                                   pic26.jpg

          2. After all the details are entered, click the RUN button; this runs the impact analysis for that transport request.

          3. After execution, check the analysis; it lists all the affected business process steps:

                            pic27.jpg

          4. Above is the output of the impact assessment: it lists all the business process steps impacted by the particular transport request.
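Conceptually, the impact assessment can be sketched as a set intersection: a business process step is impacted when its TBOM shares at least one object with the transport request. The names and data below are purely illustrative, not the BPCA implementation.

```python
# Illustrative sketch of what the impact analysis computes: a step is
# "impacted" when its TBOM intersects the transport's object list.

tboms = {
    "Create Material (MM01)": {"SAPLMGMM", "MARA", "MAKT"},
    "Create Sales Order (VA01)": {"SAPMV45A", "VBAK", "VBAP"},
    "Post Goods Issue (VL02N)": {"SAPMV50A", "LIKP", "MARA"},
}
transport_objects = {"MARA", "ZREPORT"}   # objects changed by the transport

impacted = sorted(step for step, objs in tboms.items()
                  if objs & transport_objects)   # non-empty intersection
print(impacted)
```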

     

    6. Test Scope Optimization:

    When running BPCA analysis for large changes such as support package upgrades, the analysis returns a great many business process steps. The analysis is technically correct, since many objects are modified during such upgrades.

    In such scenarios, we can look at the impact analysis in a different way and reduce the test scope using various parameters.

     

    BPCA helps users optimize and reduce the test scope using the following criteria:


    • Test Object Coverage: Here BPCA uses the number of objects impacting each business process. For example, the ‘Material Creation’ process step may account for almost 40% of the impacted objects. This technique gives users a clear idea of which business process steps are most impacted, and accordingly how to optimize the test scope to test only those processes.


    • Test Efforts: Another way to optimize the test scope is by test effort. In some cases, automated test cases are assigned to business process steps, which makes testing more efficient and easier. The test scope can therefore be optimized based on the manual effort required for testing.


    • Business Process Attributes: We have already seen how to mark critical business processes by setting the ‘Business Process Priority’ in the attributes of the business process. Marking critical processes this way helps BPCA optimize the test scope.
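Scope optimization by test-object coverage can be illustrated with a simple greedy sketch: keep picking the step that covers the most not-yet-covered impacted objects until a target coverage is reached. This is purely illustrative under assumed data; it is not the actual BPCA algorithm.

```python
# Rough sketch of scope optimization by test-object coverage: greedily
# keep the steps that add the most uncovered impacted objects, stopping
# at a target coverage percentage.

def optimize_scope(step_objects, target=0.9):
    all_objs = set().union(*step_objects.values())
    covered, scope = set(), []
    while len(covered) / len(all_objs) < target:
        # pick the step adding the most new objects
        best = max(step_objects, key=lambda s: len(step_objects[s] - covered))
        gain = step_objects[best] - covered
        if not gain:
            break                      # target unreachable
        covered |= gain
        scope.append(best)
    return scope

steps = {
    "Material Creation": {"O1", "O2", "O3", "O4"},
    "Goods Receipt":     {"O4", "O5"},
    "Invoice Check":     {"O6", "O7", "O8", "O9", "O10"},
}
print(optimize_scope(steps))
```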


    Below are samples of Test Scope Optimization:


                             pic28.jpg

                               pic29.jpg

    BPCA can also be integrated with HP Quality Center; there are a few prerequisites for this integration. I can share more details on the integration once we have implemented it in our project.


    I hope this blog gave helpful insight into BPCA implementation and its benefits.


    Discrete manufacturing: Mapping BP Analytics key figures to system status


    More than five years ago we wrote a blog on how best to monitor a discrete manufacturing process. Back then, key figures like "Production/Process Orders overdue for Release" and "Production/Process Orders overdue for Technical Closure" were described. Some other key figures that describe the process steps in between were left out at that time, such as:
        

    • Production/Process Orders Released without first Confirmation
    • Production/Process Orders overdue for Final Confirmation
    • Production/Process Orders overdue for Delivery Completed

       

  All those key figures were optimized from a technical runtime perspective, i.e. they only read from tables AUFK, AFKO and AFPO and do not select data from the status table JEST, which is typically huge.

  Customers are usually so used to looking at the system status (e.g. CRTD, REL, PCNF, CNF, PDLV, DLV) of a production/process order that they have difficulty understanding what the above-mentioned key figures are measuring. This blog shall help bridge this gap and explain how the key figures in Business Process Analytics map to the system status in a production/process order.

      Mapping Business Process Analytics key figures to system status in order

       

       

      I will explain the key figures in the logical order of a discrete manufacturing process

        1. Production/Process Orders overdue for Release
          • measures those orders which have been created (status CRTD) and where the scheduled release date already lies x days in the past, but where no actual release took place (i.e. status REL not yet reached)
        2. Production/Process Orders Released without first Confirmation
          • measures those orders which have been released (status REL) and where the actual release date lies x days in the past, but where no initial confirmation took place (i.e. status PCNF not yet reached)
        3. Production/Process Orders overdue for Final Confirmation
          • measures those orders which have been released (status REL) and where the scheduled end date already lies x days in the past, but where no final confirmation took place (i.e. status CNF not yet reached)
        4. Production/Process Orders overdue for Delivery Completed
          • measures those orders which have been released and at least initially confirmed (status REL & PCNF) and where the scheduled end date already lies x days in the past, but where no delivery completed flag was set (i.e. status DLV not yet reached)
        5. Production/Process Orders overdue for Technical Closure
          • measures those orders which have been released, initially confirmed and are complete regarding delivery (status REL & PCNF & DLV) and where the scheduled end date already lies x days in the past but where no technical completion took place (i.e. status TECO not yet reached)

         

         

        Let's summarize it in a short table:

         

        Key figure                                                    | Active status reached | Possible further active status | Waiting for status
        Production/Process Orders overdue for Release                 | CRTD                  |                                | REL
        Production/Process Orders Released without first Confirmation | REL                   |                                | PCNF
        Production/Process Orders overdue for Final Confirmation      | REL                   | PCNF                           | CNF
        Production/Process Orders overdue for Delivery Completed      | REL & PCNF            | CNF or PDLV                    | DLV
        Production/Process Orders overdue for Technical Closure       | REL & PCNF & DLV      | CNF                            | TECO
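The table above can be sketched as a small function: given the set of active system statuses on an order, it returns which "waiting for" key figure the order would fall into, assuming the respective date threshold is exceeded. This is an illustrative simplification; the real key figures also evaluate scheduled and actual dates from AUFK/AFKO/AFPO.

```python
# Compact sketch of the status-to-key-figure mapping described above.

def overdue_key_figure(status):
    """status: set of active system statuses on the order."""
    if "TECO" in status:
        return None                                   # order is closed
    if {"REL", "PCNF", "DLV"} <= status:
        return "overdue for Technical Closure"        # waiting for TECO
    if {"REL", "PCNF"} <= status:
        return "overdue for Delivery Completed"       # waiting for DLV
    if "REL" in status and "CNF" not in status:
        # covers both "without first Confirmation" and "Final Confirmation",
        # which differ in the reference date, not in the status check
        return "overdue for first/final Confirmation" # waiting for PCNF/CNF
    if "CRTD" in status:
        return "overdue for Release"                  # waiting for REL
    return None

print(overdue_key_figure({"CRTD"}))
print(overdue_key_figure({"REL", "PCNF"}))
```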

         

         

        Further reading

         

        Frequently Asked Questions about Business Process Monitoring and Business Process Analytics are answered under http://wiki.sdn.sap.com/wiki/display/SM/FAQ+Business+Process+Monitoring and

        http://wiki.sdn.sap.com/wiki/display/SM/FAQ+Business+Process+Analytics respectively.

         

        The following blogs (in chronological order) provide further details about Business Process Analytics and Business Process Monitoring functionalities within the SAP Solution Manager.

        Meet ASUG SAP TechEd d-code Speaker Heiko Zuerker - Solution Manager


        http://scn.sap.com/servlet/JiveServlet/showImage/38-110543-496382/asug.png

         

        Continuing our meet-the-ASUG-speaker series at SAP TechEd && d-code, I am pleased to introduce Heiko Zuerker, who is speaking at ASUG SAP TechEd d-code session ITM 120, "How to be Successful with Run SAP Like a Factory".

        Zuerker_Heiko_SAPTeched12_Head_Small.png


        About Heiko:

        (pictured to the right - photo supplied by Heiko)

         

        Heiko Zuerker is an IT Manager at Rockwell Automation. Born and raised in Germany, he moved to the United States in 2001. Heiko has over 20 years of IT experience, including various roles in desktop and server support, security, and SAP Basis.

        More recently, he has been focusing on SAP continuous improvement and has been a pioneer in implementing “Run SAP Like a Factory.”

         

        If he’s not working late at the office, you'll find him either presenting at SAP TechEd, diving with sharks, or crawling through Lake Michigan shipwrecks.

         

         

        This year's presentation is very special to him, since it marks his fifth anniversary of presenting at SAP TechEd/d-code through ASUG.

         

        About his session:

         

        Here is the abstract from the session listing:

         

        Come and learn from Rockwell Automation's 2 1/2 years of "Run SAP Like a Factory" experience. Learn how they planned and implemented its phases, how they have set up and run their Operations Control Center (OCC), the challenges they have faced and are still facing, and how they continuously improve. Hear also how Run SAP Like a Factory has transformed their SAP support.

         

         

        _______________________________________________________________________________________________________________

         

        Join ASUG at SAP TechEd && d-code

        OCTOBER 20-24
        Venetian/Palazzo Congress Center


        ASUG SAP d-code Las Vegas sessions are now published - for a complete listing please see here


        Save the date Monday, October 20th for ASUG SAP TechEd d-code Pre-conference Day


        Related:

        Meet ASUG SAP d-code Speaker Charles Reeves - Implementing Enterprise Master Data Management

        ASUG SAP d-code SAP BW 7.4 powered by SAP HANA Speaker - Introducing Pawel Mierski

        ASUG SAP d-code Sessions Are Published - Featuring SAP Mentors

        Journey to Mobile BI - Meet ASUG SAP d-code Speaker Peter Chen

        Did you know?

        Meet ASUG SAP TechEd d-code Speaker Kumar Chidambaram - Holistic BI BW on HANA Approach


        Big Five Challenges on Setting up Technical Monitoring in SM 7.1 SP10


        We successfully completed a technical monitoring setup project with one of our clients last week; it went live as planned. This was also my last project in Singapore. We celebrated a success party with the client, which doubled as my farewell party with my peers and friends. We were very pleased to have one of my managers from a previous company at the party; he is one of my well-wishers too. We had a wonderful conversation about various things. One catchy question from him was about technical monitoring: what are the challenges in small and easy projects like a technical monitoring setup? Any project, whether small or big, always has challenges; only the criticality may differ. Like my manager, many of you might have the same question, so I thought of sharing my experience of setting up technical monitoring.

         

        I. A Clear Draft of Requirements

         

        The major work in technical monitoring is defining the template strategy. Although SAP provides standard templates for setting up monitoring, we need to customize them anyway. It is always good practice to create custom templates based on the standard ones and to work only on the custom templates. We therefore have to define a clear plan for how many custom templates are created and how they are assigned to systems. Moreover, customization does not only mean custom alerts or metrics: even changing the notifications, priority, severity, or thresholds counts as customization. We need to be very clear about which metrics are to be monitored, which metrics need to trigger an alert, what the mode of sending notifications is (e-mail or a third-party tool), and who the recipients are. All of this needs to be settled before starting the setup.

         

        You can get the available metrics and alerts, with detailed descriptions, from the SAP standard template overview tab under solman_setup -> Technical Monitoring -> Step 4 Template Maintenance.

         

        tab1.png

         

        You can use this Excel file to finalise your entire requirement beforehand.

         

         

        II. Custom Alerts

         

         

        There are many special features available in technical monitoring, such as alert grouping, metric groups, and variant settings. Analyse the entire standard setup and change individual settings according to your needs. If you are going to create any custom metrics based on CCMS MTEs, make sure you have also created the data collector in the Z namespace; this is very helpful when tracking issues later. The document we followed is 'How to Create Custom CCMS Metrics': http://wiki.scn.sap.com/wiki/download/attachments/269157890/How%20to%20create%20custom%20CCMS%20metrics%20in%20MAI.pdf?v…

         

        IMG_20140720_230431.JPG

         

        Please see the appendix of the document, on page 27, for the creation of custom data collectors.

         

        Also note that every time you change the metrics, you need to reactivate the template; only then do the changes take effect. If you have created many templates and assigned them to lots of systems, this deactivation and activation takes a lot of time, so make sure all custom alerts are finalized and created beforehand. If you have test systems, it is very useful to test there before activating in production.

         

        III. Fulfilling the Prerequisites

         

         

        Almost all Solution Manager scenario setups can only be started after the prerequisites are met. The major issue in technical monitoring is the data collectors. As you all know, technical monitoring is completely different from the earlier, CCMS-based system monitoring. Technical monitoring uses many kinds of collectors, such as RFCs, diagnostics agents, SAP Host Agents, etc.

         

        img20140720_195018.jpg

         

        Most of the prerequisites check these collectors' connectivity and status. The major prerequisite is completing the entire solman_setup: System Preparation, Basic Configuration, and Managed System Configuration. Hence we need to make sure that all steps are marked green.

         

        img20140720_193623.jpg

         

        The other things I would consider prerequisites are EWA reporting, the diagnostics agent connection, the Technical Monitoring content update, and upgrading ST-PI and the SAP Host Agent to the latest level.

         

        Some metrics need additional parameters to be set, like NFS share monitoring. In such cases, make sure you set the parameter and restart the system.

         

         

        IV. Troubleshooting

         

         

        We did have major issues in this area: metric data not being collected, and metrics rated grey, wrongly rated, or wrongly defined.

         

        The tools that helped to overcome most of the major issues are the following. The content check and compare tool:

         

        tab2.png

         

        The Metric Monitor, which helps to identify how metric values vary over a period of time:

         

        tabq4.png

         

        The data collection check, which is the first place to look for the cause of grey (not rated) metrics:

         

        image3.png

         

        And also the MAI Tools, a very powerful transaction that helped to fix very difficult collection issues, such as authorisation problems. Check out more here: Monitoring and Alerting Infrastructure Analysis Tools - SAP Solution Manager - SAP Libr

         

         

        V. Housekeeping


        The standard housekeeping from SAP might not be sufficient if you have many systems or many administrators. Please make sure that you define housekeeping for the alerts as well as for the BI store.

        Housekeeping for alerts can be defined directly in the technical monitoring setup.

         

        img20140720_190748.jpg

         

        For BI housekeeping, please look at my prior blog for more clarity: How Is The Health Of Your SAP Solution Manager BI Content?

         

        Other minor challenges, such as authorization, maintaining global notifications, integrating a third-party ticketing tool, and work mode management, are significant too.

         

        I hope this blog helps all those who are planning to set up technical monitoring.

        A Different 'Reject' Behaviour in the Change Request Management Approval Procedure


        After some time at home (parental leave, etc.), today I want to share with you how and where to easily change the behaviour of the approval procedure, if you are able to do some coding (or, in this special case, insert the code).

        I will not go into the details of the approval procedure; more information can be found here, for example:

        http://scn.sap.com/community/it-management/alm/solution-manager/blog/2013/10/31/new-charm-feature-with-sp-10-enhanced-approval-procedure-functions

         

        Standard behaviour:
        When using the approval procedure in Change Request Management with more than one approver, the Request for Change is only rejected after all approvers have voted, meaning executed their approval step.

        This means that if you have a Request for Change with five approvers and the first one rejects it, you have to wait until all approvers have voted before the Request for Change is rejected. It will be rejected eventually, as one reject is enough to reject the whole Request for Change.

        This is the standard behaviour taken over from CRM standard.
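The two evaluation modes can be sketched in plain Python for clarity. The names and structure here are illustrative, not the CRM implementation: "standard" waits until every approver has voted and then rejects if any vote is a reject, while "first reject" rejects as soon as one reject vote is present.

```python
# Sketch of the two approval-evaluation strategies described above.

def evaluate(votes, first_reject=False):
    """votes: list of 'approved' / 'rejected' / None (not yet voted).
    Returns 'approved', 'rejected', or None (still pending)."""
    if first_reject and "rejected" in votes:
        return "rejected"              # don't wait for the remaining voters
    if None in votes:
        return None                    # standard: wait until everyone voted
    return "rejected" if "rejected" in votes else "approved"

votes = ["rejected", None, None, None, None]   # first of five rejects
print(evaluate(votes))                         # standard behaviour: pending
print(evaluate(votes, first_reject=True))      # modified behaviour: rejected
```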


        Advantages and disadvantages:


        The advantages are that all approvers know about the Request for Change, and that if the rejection has to be clarified with the rejecters, you know all of them in one step.

         

        The disadvantage is that you have to wait until all approvers have executed their decision before you get feedback, even though the Request for Change will be rejected regardless.

         

        I was recently contacted by a customer asking if this could be changed.

        The customer wanted the first reject to lead to a general rejection of the Request for Change, because it doesn't make sense for him to wait until others have rejected, too. I assume they have a process which allows going back into approval from 'Rejected', which might then not be a 'FINI' status at all. But that's a guess.

         

        Here is now how the behaviour can be implemented:

        1. Copy a standard function module and change the code
        2. Register the new function module instead of the old one in the CRM Event Framework


        The standard behavior evaluating the approval steps and setting the approval procedure is implemented via the CRM Event Framework which can be reached via transaction 'CRMV_EVENT'.

        event framework.jpg

        There, function modules are registered to be called on specified events, like the save of the CRM document, etc.

        Enter the function name 'AIC_SRQM_RFC_APPROVAL_STAT_EC' and press 'Callback for Cat./Obj./Event'.

        You will see that the function is registered to run on the event 'AFTER_SAVE' if for the object 'APPROVAL' the attribute 'STEP' has changed.

        function regiesterd.jpg

         

        This function 'AIC_SRQM_RFC_APPROVAL_STAT_EC' has to be replaced by our own copied function (e.g. 'Z_SRQM_RFC_APPROVAL_STAT_EC') with changed code.


        To do so, call transaction 'SE37', copy the function under the name stated above into an existing function group of your choice, and save everything in your transport. Be aware that the top include of your function group needs the includes

         

        INCLUDE crm_approval_con.

        INCLUDE crm_object_kinds_con.

        INCLUDE crm_object_names_con.

        INCLUDE crm_events_con.

        INCLUDE crm_mode_con.

        INCLUDE crm_log_states_con.

        INCLUDE crm_status_con.

         

        so that the copied function module is free of syntax errors.

         

        Then replace the code listed in the screenshot with the code further down and activate everything.

        Replace.jpg

        * Replacement code ****

          READ TABLE ls_approval_wrk-approval_steps_wrk WITH KEY approval_result = gc_status-request_for_change_rejected TRANSPORTING NO FIELDS.

          IF sy-subrc NE 0.

        * --4. whether all step are processed

            CHECK cl_crm_approval_utility=>all_steps_done(

                  it_approval_s_wrk = ls_approval_wrk-approval_steps_wrk ) EQ true.



            lv_approved = true.

            LOOP AT ls_approval_wrk-approval_steps_wrk ASSIGNING <fs_approval_s_wrk>

                    WHERE approval_result EQ gc_status-request_for_change_rejected.

              lv_approved = false.

              EXIT.

            ENDLOOP.

          ELSE.

            lv_approved = false.

          ENDIF.

         

        Afterwards register this function in the CRM Event Framework by going into 'Edit' mode and entering the new function name.

        replace 2.jpg

         

        That's all. Now the new function with the changed code is called and the Request for Change is set to 'Rejected' when the first approver rejects his approval step.


        By the way, if you are familiar with the Enhancement Framework, it is also possible to replace the code directly in function 'AIC_SRQM_RFC_APPROVAL_STAT_EC'; then you do not have to change the registration of the function in the CRM Event Framework.

        enhance.jpg


        So, I hope that helped a bit. If you find errors, get in contact with me,

        Michael

        PPMS reference for Netweaver 7.3, Netweaver 7.31 and Netweaver 7.4


        I just released KBA 2045839.

         

        It has the product instance definitions of Netweaver 7.3, EHP1 for Netweaver 7.3 and Netweaver 7.4.

         

        This is by no means everything that there is for these product versions in PPMS, but it is a start.

         

        Let me know if you like the document.
