
On queue processing, what (just) happened? Part #3

So it seems that finding out what happened to an entry/assignment, and why, is also an issue. Here are some hints on how to do that.

 

Please note that these are not official table definitions. Views that start with idmv and mxpv are public and stable, but the tables themselves can change and occasionally have.

 

Audits

 

All provisioning tasks are audited; the combination of mskey, actionid and auditid is unique in the system. Any task that is started using the IdM UI, REST interface, attribute events, privilege events etc. gets a unique auditid. If the workflow triggers additional nested events, the resulting audits point back to their parent audit (refaudit) and to the original root audit (auditroot), as described below.

 

mxp_audit

The main audit table is mxp_audit. This contains a single row entry for each task/event that has been triggered for your entries. Some notable columns:

 

auditid: Unique identifier.
taskid: Task ID of the root task of this audit (such as Change Identity, Provision, Deprovision or Modify in the provisioning framework).
mskey: Entry for which this audit was started.
auditroot: The root audit, or the audit itself. The root audit points back beyond the parent audit (refaudit) to the very first audit in case of deeply nested events.
posteddate: Datetime when the task was initiated and put into the provisioning queue.
statusdate: Datetime when the task or any subtask of this audit was last updated or evaluated.
provstatus: See mxp_provstatus. Key values: 0=Initiated OK, 1000=Task OK, 1001=Task Failed, 1100=OK, 1101=FAILED.
LastAction: ID of the last subtask that was executed or attempted. In case of provstatus 1001 or 1101, this is the task that failed.
refaudit: Null, or the parent audit.
MSG: Error message or just an informational message, depending on provstatus.
userid: A somewhat fuzzy data blob, but there is a structure to it:

 

If it is simply a number, it should be the mskey of an entry and the task was started by a user in the IdM UI.

 

When the value starts with #, it indicates a privilege attribute event that caused the task to start. The format is #<attrid>:<operation>:<checksum>:<oldvalueid>

Example: #395:INSERT;3285570;0

395 is the attribute ID (MXREF_MX_PRIVILEGE), 3285570 is the checksum of the value that triggered the task (mxi_values.bCheckSum=<checksum>), and oldvalueid = 0 means this is a new value, not a replacement/modify; otherwise the old value can be found via mxi_old_values.old_id=<oldvalueid>.
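
If you need to follow such an event back to the data itself, the references above can be queried directly. A minimal sketch, using only the table and column names quoted above and the checksum from the example:

-- The value that triggered the task (new/current value)
select * from mxi_values where bCheckSum = 3285570

-- If oldvalueid is not 0, the replaced value can be found with:
-- select * from mxi_old_values where old_id = <oldvalueid>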

 

When the value starts with +, it indicates an On event such as On (chain) OK or On (chain) Fail. The format is +<taskid>:<eventname>

Example: +1002083:On Chain OK task

 

When the value starts with *, it indicates that the task was started by an entry event (defined on the entry type). The format is *<entryid>:<operation>, where operation is Insert, Modify or Delete.

startedBy: Only valid when the task is started from the Workflow UI/REST; contains the mskey of the logged-in user that ran the task.

 

Views: mxpv_audit

 

A fun/useful thing to do with the audit is checking the average execution time of a task from start to end over time. Change taskid=X to taskid in (x,y,z) to include more tasks, or extend this SQL with a join to mxp_tasks to use task names, and be creative. I suggest keeping it limited to a few tasks or the results might become difficult to interpret.

 

SQL Server:

select taskid,convert(varchar(10),posteddate,20) Date,count(auditid) as numExecs,avg(datediff(ss,A.posteddate,A.statusdate)) AvgTimeToComplete
from mxp_audit A with(nolock)
where taskid = 1 and posteddate > '2014-02-01' and ProvStatus >999
group by taskid,convert(varchar(10),posteddate,20)
order by taskid,convert(varchar(10),posteddate,20)

Oracle:

select taskid,to_char(posteddate,'YYYY-MM-DD') "date",count(auditid) "numExecs",AVG(round(statusdate-posteddate,2)*24*60*60) "avgTimeToComplete"
from mxp_audit
where taskid = 20 and postedDate > to_date('2014-02-01','YYYY-MM-DD') and provstatus > 999
group by taskid,to_char(posteddate,'YYYY-MM-DD')
order by taskid,to_char(posteddate,'YYYY-MM-DD')

This calculates the average time between start time and end time of the task with id=1 (SQL Server) and 20 (Oracle); I suggest using the taskid for Provision or Modify to test this. Audits with ProvStatus >= 1000 are completed; those still running have no statusdate worth using in this case.

On SQL Server, changing the length in the convert statement to 7 characters groups the results by month, and 4 gives per-year grouping. On Oracle you can change the to_char conversions to YYYY-MM, or just YYYY.
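
For example, the same statistic broken down per month could look like this (a sketch based on the queries above; adjust taskid and dates to your system):

SQL Server:

select taskid,convert(varchar(7),posteddate,20) YearMonth,count(auditid) as numExecs,avg(datediff(ss,A.posteddate,A.statusdate)) AvgTimeToComplete
from mxp_audit A with(nolock)
where taskid = 1 and posteddate > '2014-02-01' and ProvStatus > 999
group by taskid,convert(varchar(7),posteddate,20)
order by taskid,convert(varchar(7),posteddate,20)

Oracle:

select taskid,to_char(posteddate,'YYYY-MM') "yearMonth",count(auditid) "numExecs",avg(round(statusdate-posteddate,2)*24*60*60) "avgTimeToComplete"
from mxp_audit
where taskid = 20 and posteddate > to_date('2014-02-01','YYYY-MM-DD') and provstatus > 999
group by taskid,to_char(posteddate,'YYYY-MM')
order by taskid,to_char(posteddate,'YYYY-MM')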

You can also query for posteddate between two dates and much more. This is useful to spot negative trends over time, though you must consider the overall load of the system. It is also useful during testing and tuning to verify whether improvements you make have any impact on a full workflow execution.

[Image: part3_auditLogExecOverTime.png]

List all tasks that have been executed on a user (SQL Server and Oracle):

select A.auditid, A.AuditRoot, A.RefAudit auditParent, A.userid, A.StartedBy, A.taskid, T.taskname, A.mskey, A.PostedDate, A.StatusDate, A.provstatus, A.LastAction, A.msg
from MXP_AUDIT A, MXP_Tasks T where A.TaskId = T.TaskID
and A.msKey = (select mcmskey from idmv_entry_simple where mcMskeyValue = 'ADMINISTRATOR')
order by auditroot,RefAudit

mxp_audit.taskid can be joined to mxp_tasks.taskid to get the task name when accessing the tables directly instead of the view (which has an unfortunate top-1000 limit).

 

mxp_ext_audit

The extended audit is stored in mxp_ext_audit. This contains a single row entry for each task/action executed within an audit and is enabled by checking the "Enable trace" checkbox.

 

Aud_ref: Audit ID.
Aud_Task: Task ID.
Aud_OnEntry: Mskey of the entry.
Aud_datetime: Datetime when the ext_audit record was created.
Aud_Approver: Mskey of the approver. You should use mxi_link_audit for this when getting link approval records.
Aud_Info: Generic information. If the audited task is a switch or conditional, this column contains the result of the evaluation: TRUE or FALSE for conditionals, and for switches the value returned by the SQL statement.
Aud_Startedby: Reason for the task starting. Another fuzzy data blob. Some of the common value formats:

 

USER:<mskey>

ATTR:<attributeid>:<operation>

ENTRY:<entryid>:<operation>

TASK:<taskid>:<task operation: 0=inittask, 1=OnError, 2=OnOk, 3=OnChainError, 4=OnChainOk>

PRIV:<priv mskey>:<entryoperation>

ROLE:<role mskey>:<entryoperation>

OTHER:<other info>: typical for tasks started using uProvision.

 

Operation values: 1=Modify, 2=Delete, 3=Insert

Entryoperation values: 0=Provision, 1=Deprovision, 2=Modify
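
Since these prefixes are fixed, they also make handy filters. A small sketch using the columns above, listing extended audit rows for tasks triggered by privilege assignment events:

select Aud_ref, Aud_Task, Aud_OnEntry, Aud_datetime, Aud_Info, Aud_StartedBy
from mxp_ext_audit
where Aud_StartedBy like 'PRIV:%'
order by Aud_datetime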

 

Views: mxpv_ext_audit

 

The extended audit is useful when you need to see what happened in subtasks, what conditional or switch statements returned, or where a workflow stopped for a user. This query lists all tasks started by (and including) audit 1307812, but it can easily be modified to filter on aud_onEntry (mskey) and dates.


SQL Server and Oracle:

select t.taskname,A.aud_ref,aud_Datetime,aud_info,Aud_StartedBy
from mxp_ext_audit A, mxp_tasks T
where T.taskid = A.aud_task and aud_ref
in (select auditid from mxp_audit where AuditRoot=1307812)
order by aud_datetime

The red arrows show a child audit being started, in this case by a uProvision call in a script, and the green arrows show where the child audit completes, allowing the parent audit to continue from its wait-for-events state.

[Image: part3_extAudit.png]

 

Link audits, mxi_link_audit

 

Audits related to the evaluation and processing of reference values (role/privilege assignments, manager and other references) have information stored in mxi_link_audit (also see mxi_link_audit_operations). This table has a lot of columns, and I suggest you look at the views to see what is there. Some of the key columns are:

 

mcLinkid/linkId: Reference to mxi_link.mcUniqueId.
mcAuditId/auditid: Reference to mxp_audit.auditid.
mcDate/date: Date of the entry.
mcOperation/operation: Reference to mxi_link_audit_operations.
mcReason/reason: Request/approve/decline reasons.
mcMSKEYUser: Mskey of the user.
mcMSKEYAssignment: Mskey of the assigned entry (privilege, role, manager etc.).

 

Views: idmv_linkaudit_<basic/ext/ext2/simple>

 

Example data from the idmv_linkaudit_ext2 view for an audit in which a role was added to a person, which caused two inherited privileges to be assigned. Later the role was removed.

 

SQL Server and Oracle:

select linkid,auditid,auditDate,userMSKEYVALUE,AssignmentMSKEYVALUE,OperationText,AdditionalInfo
from idmv_linkaudit_ext2
where userMskey = 23
order by auditdate

[Image: part3_linkAudit.png]

Note that a new audit is created only when there is an event task execution. The privilege in my example only had a del-member event, and this event got a new audit (944072); the rest shared the add audit of the role they were inherited from.

 

Useful variation with tasknames (SQL Server and Oracle):

select LA.userMSKEYVALUE, LA.auditDate, LA.AssignmentMSKEYVALUE, LA.operationtext, LA.auditid, A.taskid, T.taskname
from idmv_linkaudit_ext LA  left outer join mxp_audit A on A.AuditID = LA.auditid  left outer join mxp_tasks T on T.taskid = A.TaskId
where LA.userMSKEYVALUE = 'USER.TEST.3'
order by LA.auditDate

 

Logs

 

There are additional logs for jobs and actions that are stored as base64 blobs in the database. As of SP8 there is also a new log, the execution log, which stores messages from the runtime logged with uInfo/uWarning/uError.

 

Job and action logs, mc_logs

 

This contains the logs of all jobs and actions as well as other useful values. Some columns I find useful are:

 

JobId: ID of the job. An action is linked to a job configuration via mxp_tasks.jobguid = mc_jobs.guid.
TimeUsed: Number of seconds the action/job used to complete for this log entry.
TotalEntries: Total number of entries processed in this log entry.
Num_Adds: Number of add operations performed.
Num_Mods: Number of modify operations performed.
Num_Del: Number of delete operations performed.
Num_Warnings: Number of warnings reported.
Num_Errors: Number of errors reported.

 

Views: mcv_logall,  mcv_logwarn, mcv_logerr

 

One of the things this can be used for is to calculate how many entries per second an action/job processes.

SQL Server:

select jobname,JobId,sum(TotalEntries) totalEntries,sum(TimeUsed) totalTime,
round(cast(sum(TotalEntries) as float)/cast(sum(TimeUsed) as float),2) entriesPerSecond
from mcv_logall group by jobname,jobid
order by round(cast(sum(TotalEntries) as float)/cast(sum(TimeUsed) as float),2)  asc

Oracle:

select jobname,JobId,sum(TotalEntries) totalEntries,sum(TimeUsed) totalTime ,Round(sum(TotalEntries) /sum(TimeUsed),2) entriesPerSecond
from mcv_logall
group by jobname,jobid
order by entriesPerSecond

[Image: part3_jobLogCalc.png]

This can give valuable hints about actions or jobs that are slow and will cause problems at some point. In this case my "Test something true: False" task is slow and needs a look. You can also reverse this by calculating totalTime/totalEntries to get the time used per entry, as sketched below. This can be used in combination with the threshold log when running mass-update/performance tests in dev/QA cycles to detect potential issues before they cause downtime in production.
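
A minimal sketch of that reversed calculation, based on the queries above (nullif guards against division by zero; on SQL Server the casts keep the division from truncating):

SQL Server:

select jobname,JobId,sum(TotalEntries) totalEntries,sum(TimeUsed) totalTime,
round(cast(sum(TimeUsed) as float)/nullif(cast(sum(TotalEntries) as float),0),2) secondsPerEntry
from mcv_logall
group by jobname,jobid
order by secondsPerEntry desc

Oracle:

select jobname,JobId,sum(TotalEntries) totalEntries,sum(TimeUsed) totalTime,
round(sum(TimeUsed)/nullif(sum(TotalEntries),0),2) secondsPerEntry
from mcv_logall
group by jobname,jobid
order by secondsPerEntry desc nulls last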

 

Execution log

 

View: mcv_executionlog_list

 

This is a new log that has been in hiding for a while as it needed a UI. It still doesn't have one outside the MMC, but it is very useful. This log contains all the messages from the runtimes that would usually be locked inside the big blob of mc_logs or the dse.log file on the file system. So in short, this means that messages like this one:

[Image: part3_execLogXML.png]

are now also logged individually and are linkable to the user and audit IDs. My root audit was 1316231, so this query will find all related audits and list runtime messages reported during the processing of these audits:

 

select mcAuditId,mcTaskId,mcTaskName,mcMskey,mcMsg,mcLogLevel,mcLogTime From mcv_executionlog_list where mcAuditId in
(select auditid from mxp_audit where AuditRoot = 1316231) order by mcUniqueId

This output would usually be "hidden" inside the log files associated with each individual job:

[Image: part3_execLog.png]

There is a lot more to the execution-log though, so have a look at it when you get your hands on a version supporting it.
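
For example, to pull only the warnings and errors reported by the runtimes (a small sketch; it assumes the mcLogLevel values 0=Info, 1=Warning, 2=Error that the combined query in the next section decodes):

select mcAuditId, mcTaskId, mcTaskName, mcMskey, mcLogTime, mcLogLevel, mcMsg
from mcv_executionlog_list
where mcLogLevel > 0
order by mcLogTime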

 

Pulling it all together

 

To summarize:

  • One audit per task executed on an entry
    • One extended audit entry per sub-task
      • 0 to many execution log entries per action


And combining all this information can be done using:


select AT.Name type, T.taskname taskname, EA.aud_ref auditid, EA.aud_datetime logtime,
       '' loglevel, EA.Aud_Info info, EA.Aud_StartedBy startedby
from mxp_tasks T, mxp_actiontype AT, MXP_Ext_Audit EA
where T.taskid = EA.Aud_task and T.actiontype = AT.actType
  and EA.Aud_ref in (select auditid from mxp_audit where AuditRoot = 1316231)
union
select 'Action' as type, mcTaskName taskname, mcAuditId auditid, mcLogTime logtime,
       case when mcLogLevel = 0 then 'Info'
            when mcLogLevel = 1 then 'Warning'
            when mcLogLevel = 2 then 'Error'
            else cast(mcLogLevel as varchar)
       end loglevel, mcMsg info, '' startedby
from mcv_executionlog_list
where mcAuditId in (select auditid from mxp_audit where AuditRoot = 1316231)
order by logtime


This can give something like this:

 

[Image: part3_AllLogsTogether.png]

 

And that, I believe, is all the detail one could want about the processing of a specific task ("dispatcher test #1.0.0") through two child tasks and back for a user, all in a single view. I'm sure there will be an admin UI for this later, but for now I expect it to be most useful in the development and QA cycle.


Assignment Notification task, customization and custom messages

Assignment Notification Customization and Standalone Notifications

 

When the new approval mechanism was introduced in SP4, we also added a new notification script. It was designed to be fairly flexible and usable for other notifications as well, so you can use it in your workflows to send notifications about anything. I will do two things in this blog:

 

1) Add additional/custom replacement strings to existing template

2) Use the Assignment Notification task to send a message as part of a regular (non-assignment) workflow

 

 

 

Common steps

 

Importing the Assignment Notification task and Notification Templates job folder

 

Start by importing the job folder and task and configuring the notification repository as outlined in the documentation. This is the quick version:

The templates are located in "\usr\sap\IdM\Identity Center\Templates\Identity Center\Provisioning\Notifications"

 

Right click on a provisioning group of your choice (or create a new one like I did with "Notification blog thingy") and import Assignment Notification_Task.mcc

Right click on a job folder of your choice or the top node (I used the default Job Folder) and import Notification Templates_jobFolder.mcc

[Image: notificationTask.png]
[Image: notificationJobs.png]

 

The first thing to do is check whether you need to fix a mistake we made: the Assignment Notification task was not made public in the template provided with some versions. Open the task and verify that Public is checked; if not, check it and save:

[Image: notificationTaskMakePublic.png]

If this is not done, any attempt to start this task using uProvision will be rejected.

 

Configure the notification repository values

 

You also need a notification repository with these values set:

[Image: notificationRepository.png]

Import the standard notification templates

 

Next we need to update the templates in the database, so run the job named Import notification templates and verify there are no errors.

[Image: newNotificationImportTemplates.png]

 

Create a basic approval workflow

 

Ordered task
  Action with a To Identity Store pass

  Approval Task

[Image: approvalWorkflowBasic.png]

To Identity Store pass in Add approvers to PVO:
[Image: approvalWorkflowAddApprovers.png]

Configure the approval task

- use the PVO to get approvers (MX_APPROVERS attribute).

- use the Assignment notification task as the ... notification task.

- use the Approval Initial Notification template as the initial message.

[Image: approvalWorkflowApproval.png]

Create a test privilege

 

The privilege for this does not need anything more than a name, with its Validate Add task pointed to the Assignment Approval Workflow that we created:

[Image: approvalPRivilegeConfig.png]

 

Creating users and repeatedly testing

 

It's very useful to create a small job that just creates a new user and performs a few test operations on it, and this sample can be used in a multitude of scenarios. In this job I have two very simple passes. The first just sets an email address on the user I've decided to use as an approver:

[Image: testJobSetApproverAddress.png]

The next pass creates a new user whose name includes the %ddm.job% constant, a counter that increases every time the job is run.

[Image: testJobCreateNewAndAssign.png]

This means that every time I run this job, a user with the name USER.TEST.BLOGNOTIFICATION.<number> is created and assigned the privilege that has the approval task, and I can rerun this to test the approval process with new users as many times as I need until I get all the bugs sorted out of my configuration.

 

At this point you should be able to run the job and get a notification in your mailbox when the assignment hits the approval task, more or less like this one if you're using the same release as me. Not perfect, but enough for this purpose:

[Image: approvalNotificationWorking.png]

End of Common

 

Customizing existing message with new strings

 

This is a fairly simple operation, and I will use the Initial Approval Notification sent to approvers to demonstrate it. The notification script looks at the context variables for most of its data. All the template files use a syntax of PAR_<VARIABLENAME> for text replacement strings. These are passed to the notification script as context variables named MSG_PAR_<VARIABLENAME>. This means that to add a new value all you have to do is:

 

1) Add the new string replacement variable to the template file

2) Add the context variable using the uSetContextVar function

 

Editing the template

 

Locate the file AssignmentRequestApprovalinitialnotification_EN.html in \usr\sap\IdM\Identity Center\Templates\Identity Center\Provisioning\Notifications. Open it in a UTF-8-compatible editor and start editing. For this example I've modified the end of the file, added PAR_QOTD where the copyright notice used to be, and inserted a shopping-list reminder above the URL:

[Image: approvalWorkflowTemplateEdit.png]

Next we need to update the templates in the database, so run the job named Import notification templates as outlined in the Common section.

 

Setting the additional variables in the workflow

 

First we need an Assignment Approval workflow. Again, I keep most of this short and simple since it's well documented in tutorials etc., and focus on the new part, Add custom message variables. The basic layout of the task is described in the Common section.

 

Insert a new action with a To Generic pass after the Add approvers to PVO action:
[Image: approvalWorkflow.png]
Add custom message variables is a To Generic pass with a fairly simple entry script, listed below. It takes the incoming value, splits it first on !!, and then writes each %1=%2 combination as an audit variable named #MSG_%1 with value %2. See the Additional Data section for how it actually looks in the table.
[Image: approvalWorkflowAddCTX.png]

 

// Main function: setCTXVARS
function setCTXVARS(Par){
    // MSGVARS contains NAME=VALUE pairs separated by !!
    tmp = Par.get("MSGVARS");
    ctxVars = uSplitString(tmp, "!!");
    for (ctxvar = ctxVars.iterator(); ctxvar.hasNext(); dummy = 1) {
        tmp = ctxvar.next();
        vals = tmp.split("=");
        // Each pair becomes a context variable named #MSG_<NAME>
        ctxVarToSet = "#MSG_" + vals[0];
        ctxValToSet = vals[1];
        OutString = uSetContextVar(ctxVarToSet, ctxValToSet);
    }
}

The QOTD script returns a random string. I'll attach it to the end of the blog for the curious.

 

With the templates up to date in the system, it's time to test; to do that we just run the test job from the Common section again. My result (the red outlines around the changes are added by me):

[Image: approvalNotificationModified.png]

 

End of Customizing existing message with new strings!

 

 

Sending custom messages

 

This is a bit more complex. First we need to create a new template. The default location for the template files is

\usr\sap\IdM\Identity Center\Templates\Identity Center\Provisioning\Notifications

 

Creating a new template


Make sure you use a text editor that can save the file as UTF-8 without BOM (Byte Order Mark, which adds two garbage characters at the beginning of the message if included) when performing the next steps.

 

We're creating a new message template text file named MyCustomMessage_EN.html with the following content:

<html>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
<head>
</head>
<body>
<p>Something PAR_DESCRIPTION has happened!</p>
</body>
</html>

 

Next we need to add a row describing this template to the index file, AssignmentNotificationsList.txt:

[Image: newNotificationList.png]

Here's the text for those who like things easy.

Custom Workflow Message;EN;999;Its custom;Message for you sir;CHARENC=UTF8;MyCustomMessage_EN.html;CUSTOM

 

Importing the template

 

Next we need to get this template into the mc_templates table. This is what the Import Notification Templates job we imported earlier does. Run it and check the log. There should be no warnings or errors. You can also verify the database contents using this query

 

select * from mc_templates where mcClass = 'CUSTOM'

The result should be something like this:

[Image: newNotificationDB.png]

Starting the notification from a workflow

 

Now we're ready to trigger the notification. It requires some fixed values to be set to work, and these are:

#MSG_TEMPLATE: Must match the template name we used (Custom Workflow Message)

#MSG_TYPE: Must match the message class used (CUSTOM)

#MSG_RECIPIENTS: The MSKEY of the user receiving the message

#MSG_PAR_<variables>: Whatever else we feel like saying

 

I already made a script that sets a range of context variables earlier, so I reuse it with a small modification. This can be part of pretty much any workflow; you decide where it makes sense to you. My test task works as a UI task or when started using test provisioning, and should give you some ideas on how to use it, nothing more:

[Image: customMessageSending.png]

 

// Main function: setCTXVARS
function setCTXVARS(Par){
    // MSGVARS contains NAME=VALUE pairs separated by !!
    tmp = Par.get("MSGVARS");
    ctxVars = uSplitString(tmp, "!!");
    for (ctxvar = ctxVars.iterator(); ctxvar.hasNext(); dummy = 1) {
        tmp = ctxvar.next();
        uInfo(tmp);
        vals = tmp.split("=");
        ctxVarToSet = "#MSG_" + vals[0];
        ctxValToSet = vals[1];
        OutString = uSetContextVar(ctxVarToSet, ctxValToSet);
    }
    // In addition to setting the #MSG_ context variables, start the notification task
    mskey = Par.get("MSKEY");
    taskid = Par.get("NOTIFICATIONTASKID");
    AuditID = uGetAuditID();
    OutString = uProvision(mskey, taskid, AuditID, 0, "does this work?", 0);
}

The key change is that in addition to setting the #MSG_ context variables, it also calls the notification task. This is brute force: a hard-coded task ID in the action, no error checking, and not pretty code. So go forth and improve it.

 

Anyway, the result in my case was a very simple but satisfying email in my inbox:

[Image: customMessageReceived.png]

And that's it.

 

Additional data

 

What do all these context variables look like, anyway?

 

At the time the Assignment Notification action runs, the mc_audit_variables table can look like this:

[Image: additionalContextVars.png]
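
If you want to look at the raw rows yourself rather than the screenshot, a simple query against the table mentioned above works (left unfiltered here, since the exact column names are not listed in this post; narrow it down to your audit as needed):

select * from mc_audit_variables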

And for the custom notification:

[Image: additionalContextVars2.png]

 

Quote of the Day script

Not well designed or thought out, but for this sample it does what it needs to:

 

// Main function: qotd
function qotd(Par){
    qn = Math.floor((Math.random()*5)+1);
    if (qn == 1) {
        return "A day without sunshine is like, night";
    } else if (qn == 2) {
        return "A penny saved is a government oversight";
    } else if (qn == 3) {
        return "When everything comes your way you're in the wrong lane";
    } else if (qn == 4) {
        return "Silence is golden, duct tape is silver";
    } else if (qn == 5) {
        return "The road to success is always under construction";
    }
    return "You should not be here?! No quote for you!";
}

Compliant Identity Management - Don't Leave Your Virtual Doors and Windows Open

Network Security: Don’t Leave Your Virtual Doors and Windows Open

 

Imagine designing a new home. It's likely you'd focus on the overall layout first and then move on to the layout of each room. From there, you'd incorporate important features, like your heating and air conditioning systems, plumbing, and maybe a surround sound system. Maybe you'd start selecting appliances. And of course, you'd want input on the design and décor of your floors, walls, and ceilings.

 

 

But what if your contractor forgot to include locks on your doors? Or used easily shattered glass for your windows? What about installing a security system or screens to keep out pests? No matter how functional or beautiful your home is, your investment isn't worth much if it's vulnerable to outside threats.

 

 

But that’s often the case for many organizations that build
out their network organization. They design an efficient, state-of-the-art
solution with an attractive interface, but they forget a key component: network
security. In effect, they’re leaving their doors and windows open to the
internet equivalents of home burglars and pests – the hackers, cyber
terrorists, worms, and moles.

 

 

Network Security Shouldn’t Be an Afterthought

Often, security is added retroactively, when the damage is already done. Many companies don't recognize that they have a problem until after their digital walls have been breached. And what's even more dangerous is that some may not even realize that an attack has occurred at all. Often, the attacks are designed to be surreptitious. The longer an attack goes undetected, the more information can be stolen.

 

A single cyber-attack can tear down what a company has spent years building, resulting in:


  • The loss of intellectual property and proprietary data
  • Disruption to services for days, weeks, or months
  • Permanent damage to your brand loyalty and reputation
  • Legal costs associated with compensating customers for loss or identity theft
  • Compensation related to delays in meeting contractual obligations
  • Loss of customers to competitors
  • An increase in insurance premiums

 


So just how common is cybercrime? Both small businesses and corporations are at risk. In my next post, I'll talk numbers.

SP9 for SAP NetWeaver Identity Management Now Available

The latest support package, SP9, for SAP NetWeaver Identity Management, contains important integration enhancements for the SAP and GRC provisioning frameworks, and an updated SAP HANA connector. It also offers a new feature: You can now benefit from attestation capabilities. Attestation, also known as re-certification, means that managers or administrators periodically check and "attest" that a person only has those access rights he or she should have.

 

Need more information? Take a look at our overview presentation, and read the detailed release notes on the SAP Help Portal.
Ready to download? Get the support package on the SAP Service Marketplace.

How to optimize identities’ lifecycle management in your information system using SAP HR events?

This blog presents a method for designing SAP HR integration with SAP IDM. It also gives you a broad understanding of HR use cases and some tips for succeeding with this implementation. Here are some thoughts summarized from several customer experiences with SAP HR implementation alongside IDM.

 

 

 

HR possible use cases related to IDM scenarios

 

Combining Personnel Administration (PA) "tasks" (example: leaving) and "reasons" (example: firing) can quickly take on different meanings and generate many changes in an employee's PA file. Below is a description of the most common PA tasks that define an employee's lifecycle:

 

  • Hiring: hiring a new employee can be under different contract types or different employment categories, for instance as a permanent employee or as a trainee.

 

  • Rehiring: re-entering an employee into the company after a long period of time, for example after maternity protection leave (same as Suspension of contract).

 

  • Organizational reassignment: when the employee changes position or cost center, or is moved to another subsidiary.

"Promotion" is an essential case to consider for IDM design, as it can be directly related to automatic roles calculation such as ESS/MSS roles.

 

  • Country Reassignment: refers to an employee being assigned to an organizational unit in another country, in other words, the employee is being expatriated to a different country.

 

  • Basic employee information modification: changing an employee's last name in case of marriage, for example. This case is particularly important when login IDs are based on the user's last name.

 

  • Early retirement: like any other "early event", this means updating information for future events in SAP HR (same as Extension of contract).

In those use cases, validity dates must be managed carefully during the IDM design phase.

 

  • Leaving case: when an employee leaves the company.

 

[Image: HR flow.png]

 

Identity Lifecycle regarding HR Business processes


 

Key steps for a successful design of SAP IDM scenarios derived from HR use cases

   

     1. Dig into how the customer deals with every HR process

Essential SAP HR personnel administration tasks are defined and performed differently from one customer to another.

Prepare a set of questions to ask about every process during the design phase; for example:

    • How do HR operators deal with expatriations?
    • Is it a leaving task followed by a rehiring?
    • Is it only an organizational change?

 

    2. Think big … start small

When implementing HR with IDM, we tend to automate account management following predefined rules.

Automatic rules can’t fit to 100% of company employees, that’s why it’s important to demarcate HR scope on a small “population” for a start then enhance      it to the most.

 

    3. Make it simple

SAP IDM provides a set of good utilities to manage rules on roles, such as RBAC, dynamic groups for automatic calculation, and inheritance between business role layers.

When designing the role model, keep it as simple as possible and avoid combining many IDM utilities; the structure quickly gets messy, and it's always a pain to explain to IDM end users.

 

    4. Spot relevant information

Pick out the relevant information you need to build SAP IDM workflows, and translate what you understood from the HR processes into IDM workflows in a basic way.

From the IDM side, everything is about creation, modification and deletion.

 

     5. Summarize and focus on SAP IDM fundamentals

Sort the collected information and focus on what you really need to know to build IDM workflows. Below is an example of an easy way to recap:

 

HR process                  | IDM workflow                                  | Relevant information for IDM
Hiring a trainee            | Provisioning                                  | New PA*
Hiring a permanent employee | Provisioning                                  | New PA*
Re-hiring                   | Provisioning                                  | New PA*
Marriage / divorce          | Specific modification*                        | Last name modification
Expatriation                | Standard modification, Specific modification* | Personnel area / Country / Organization modification
Company transfer            | Specific modification*                        | Organization modification
Expatriation                | Provisioning, De-provisioning                 | Contract type / country modification

 

*Specific modification: implementing a triggered modification workflow based on event tasks in IDM to respond to a customer's specific business requirements.

*PA: Personnel Administration

 

 

 

What you need to know about customization

 

Here are some points you will probably have to anticipate:

  • Query result: if you get many records for the same employee, you will probably have to ask your developer to merge them into one.
  • HCM write-back: if you choose to write information back to HR, think about deselecting the corresponding "communication" data from the SAP Query, since you are setting SAP IDM as master of the "communication" infoset.
  • Future events in HR, such as future departures, usually require a modification of the standard query selection.

 

Driving SAP IDM processes with SAP HR events proves to be a good way to cut support costs.

 

Feel free to try these tips, and leave us a comment to let us know if they turn out to be efficient for your projects too :-)

Single Sign-On versus Password Synchronization solutions. How do you know which one is right for you ?

Single Sign-On versus Password Synchronization solutions.

How do you know which one is right for you?

 

This blog, co-authored with Benjamin GOURDON, is based on several customers' experiences.

 

The purpose of this blog is to make a quick comparison and provide an overview of the pros and cons of Single Sign-On and Password Synchronization solutions. Both are designed to greatly reduce the number of support calls and improve user comfort, and both provide an ROI in less than 3 months, as proven by many customer implementations.


Single Sign-On: SAP NetWeaver Single Sign-On

 

SAP NetWeaver Single Sign-On enables users to access all their applications through a single authentication event. From an end-user perspective, there is no longer a need to provide credentials for connecting to each application.

 

The overall solution is subdivided into 3 sub-solutions:

 

  • Secure Login, which enables SSO to SAP systems using SAP GUI and other web applications in the same domain. Based on Kerberos tickets or X.509 certificates.
  • Identity Provider, which enables SSO to any web application or web service with identity federation. Based on SAML 2.0.
  • Password Manager, which enables SSO to applications that do not support any standard protocol and require login/password information (previously recorded locally).

 

Depending on the system landscape, 3 different implementation scenarios are suitable and will determine the identification protocol: 

  • Homogeneous landscape: only SAP applications in the same domain
  • Heterogeneous landscape: SAP and non-SAP applications in the same domain
  • Heterogeneous landscape and inter-domain ("on cloud" applications)

 

Password synchronization: SAP NetWeaver Identity Management

 

SAP NetWeaver Identity Management allows you to synchronize passwords throughout your IT landscape so the user can access any application with the same password. Each password change in SAP IDM or in Microsoft Active Directory is automatically replicated to all other integrated or supported systems as a productive password (optional). To secure this solution, the provisioned password must be encrypted via secure channels (using SNC for SAP ABAP systems, or SSL for web applications, including SAP Java systems, and directories).


From an end-user perspective, this means using the same password for every application where you want to log on.


For additional information about this solution, I strongly recommend reading this blog written by Jérémy Baars:


http://scn.sap.com/community/netweaver-idm/blog/2013/12/12/a-little-synchronization-can-pay-big-dividends-end-to-end-password-synchronization

 

Determine the solution that best balances cost, security, user comfort and adaptability according to your criteria.

 

The table below compares Password Synchronization and Single Sign-On by analyzing their respective strengths and weaknesses:

 

[Image: TableBlogSCN.png]

 

So let's consider several criteria to choose the most appropriate solution:

 

User Friendliness

As you can see above, SAP NetWeaver Single Sign-On offers a better end-user experience, as it reduces the number of times a user must type an ID and password to access an application. This also contributes to raising user productivity.

 

Evolution perspectives

SAP Identity Management allows you to optimize the user lifecycle and simplify user management. It is replacing SAP Central User Administration (CUA), which will not be developed further by SAP. As such, it could be interesting to choose the password synchronization approach if you plan to implement an Identity & Access Management solution in the near future.

 

Security

If security is an important criterion for your choice, implementing SAP NetWeaver Single Sign-On will guarantee strong authentication by blocking traditional access to each application concerned.

 

Cost

From a financial point of view, there is not much difference in implementation costs. The choice should be guided more by the policy and strategy of the enterprise.

Why you cannot use notifications with basic approvals

Introduction

The enhanced approval mechanism was introduced with SAP NetWeaver Identity Management 7.2 SP4. The purpose was to add more functionality as well as improve performance.

 

This post will attempt to clarify how basic approvals are handled when the 7.2 approvals are enabled. It will explain why you won't always see the approvers for basic approvals in the "Approval Management" section of the Identity Management Administration user interface.

 

Defining the basic approvers

For basic approvals, the approvers are defined on the task, and this uses the same mechanism as access control. This may include using an SQL filter to determine who is allowed to approve. It gives you a really powerful way of defining the approvers, but it also has some drawbacks.

 

In the following example, I've defined a role called ROLE:APPROVER. A user with this role is allowed to approve, but only for users within the same cost center, i.e. with the same value of the MX_COSTCENTER attribute.

 

The approver definition looks like this:

[Image: BasicApprovalAccessControl.png]


The filter to select users within the same cost center may look like this (on Microsoft SQL Server):

 

SELECT DISTINCT mskey
FROM idmv_value_basic with (NOLOCK)
WHERE IS_ID=1 AND
  ((mskey IN (SELECT mskey
              FROM idmv_value_basic
              WHERE AttrName='MX_COSTCENTER' AND
                    SearchValue = (SELECT aValue
                                   FROM idmv_value_basic
                                   WHERE AttrName='MX_COSTCENTER' AND
                                         MSKEY=%ADMINMSKEY%))))

 

During execution, the %ADMINMSKEY% will be replaced by the MSKEY of the approver.
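
To test such a filter outside IdM, you can run it directly against the database with the placeholder replaced by a concrete approver mskey (12345 below is just an illustrative value):

SELECT DISTINCT mskey
FROM idmv_value_basic with (NOLOCK)
WHERE IS_ID=1 AND
  mskey IN (SELECT mskey
            FROM idmv_value_basic
            WHERE AttrName='MX_COSTCENTER' AND
                  SearchValue = (SELECT aValue
                                 FROM idmv_value_basic
                                 WHERE AttrName='MX_COSTCENTER' AND
                                       MSKEY=12345))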

 

Determining the approvers

To determine the approvals for a given user, each and every pending approval must be checked. This evaluation is done when the To Do tab is opened. So for everyone who is a member of ROLE:APPROVER, the system has to check all the pending approvals to see if the target of the pending approval is in the same cost center as the logged-in user.

 

It is not possible to "reverse" the statement to get all the approvals for a given user (%ADMINMSKEY%).

 

As a side note: determining approvers for assignment approvals is simpler, as this will always be a list of users, privileges or roles, which can be expanded immediately.

 

Performance improvement for basic approvals

A major performance improvement was made in the handling of basic approvals: the approver information is saved, which means that each approver only needs to run the above check once for each new approval.

 

Whenever an approver is calculated, this approver is added to the internal approval data structure, which means that subsequent listing of approvals is very fast, compared to having to calculate this every time the user lists the approvals.

 

The MX_APPROVALS attribute

The MX_APPROVALS attribute is (as before) written to entries where an approval is pending, but is not used during the approval process. Therefore, if you have code which has manually changed this attribute, this will not have any effect on the pending approval.

 

Approval management

With the 7.2 approvals we also added approval administration, both for managers and for administrators. This works fine for assignment approvals (which are always expanded), but for basic approvals you will only see approvers who have actually listed their approvals in the "To do" tab and, as a result, have been added to the mxi_approver table.

 

Summary

Because filters can be used to define approvers for basic approvals, it is not possible to expand the approvers up front, and thus it is not possible to send notification messages. In addition, the approvers will not be shown in the approval management for the manager until they have been expanded.

How to use Powershell to create a user in AD

Hi All,

 

I want to share a simple example with you to demonstrate how you can utilize SAP IdM to invoke a local PowerShell script.

In my scenario I am using Quest ActiveRoles Server Management Shell for Active Directory but this should work with Windows AD cmdlets as well.

 

In my Plugins folder I have replaced the standard To LDAP directory pass with a new Shell execute pass.

[Image: Screen Shot 2014-04-03 at 22.53.01.png]

In the Destination tab you should disable the option "Wait for execution" and insert the following command with your arguments.

 

cmd /c powershell.exe -Command "c://scripts//ProcessQADUser.ps1" %$rep.QARS_HOST% %$rep.QARS_PASSWORD% %MSKEYVALUE% $FUNCTION.cce_core_descryptPassword(%MX_ENCRYPTED_PASSWORD%)$$ "'%Z_ADS_PARENT_CONTAINER%'" %MX_FIRSTNAME% "'%MX_LASTNAME%'"

[Image: Screen Shot 2014-04-03 at 22.57.50.png]

Please remember to separate the attributes using whitespace, as PowerShell will remove commas and convert the arguments into an array.

 


Hope this helps.

 

Regards,

Ridouan


Eliminating Whitespace

Maybe some of you have experienced this problem and maybe not. Maybe you just knew the answer but I couldn't find it on here anywhere so when I figured it out, I figured I'd share.

 

In the current environment I'm working in, when a new account is entered into IDM, be it through IDM directly or via the HR system, the first 6 characters of the last name and a couple of characters from the first name or nickname are used to build the MSKEYVALUE, which in turn becomes the user's Windows and SAP login IDs. We call this the 6+2 unique ID. The problem was that if the person had spaces in their last name, each space counted as a character. It would get squeezed out when the actual MSKEYVALUE was created, but it would then leave the ID in a 5+2 state.

 

For example, a name of "Jodi Van Camp", "Van Camp" being the MX_LASTNAME, would turn out an MSKEYVALUE of "VanCaJo" when it should be "VanCamJo".

 

The bottom line was, we needed to eliminate those spaces in the last name for the purpose of creating the MSKEYVALUE.

 

I thought it would be a simple replace using a script. Maybe something like this:

 

function z_eliminateWhitespace(Par){  var result = Par.replace(/\s+/g, "");  return result;
}

Or maybe this:

 

function z_eliminateWhitespace(Par){  var result = Par.replace(/\s/g, "");  return result;
}

Or this:

 

function z_eliminateWhitespace(Par){  var result = Par.replace(/ /g, "");  return result;
}

Or lastly, this:

 

function z_eliminateWhitespace(Par){  var result = Par.replace(" ", "");  return result;
}

None of these seemed to work. I've had it happen way too many times that a SQL query or JavaScript won't work in IDM exactly the way it does in other environments, so this wasn't a total surprise, but now what? Finally, I happened on the idea of splitting the string on the spaces and rejoining it without them. This is the script I eventually came up with, and it seems to work:

 

function z_eliminateWhitespace(Par){  var result = Par.split(" ").join("");  return result;
}

The final script had an IF check before the split/join to make sure Par wasn't empty or NULL, but you get the general idea. Hope this helps someone out there someday.

A simple example consuming SAP Identity Management REST API

Dear community,

 

as I am often getting the question how to build a simple example using the SAP Identity Management REST API, I am writing a small blog post about it.

 

  • Attached, you will find a text file. Download it to a directory of your choice and rename it to index.html.
  • Download the jQuery library from the following location and save it to the same directory as jquery-1.9.1.min.js:

http://code.jquery.com/jquery-1.9.1.min.js

  • Edit the index.html file with a plain text editor (e.g. Sublime Text) and change the host definition so that it fits your environment:

var baseUrl = 'http://localhost:50000/idmrest/v1';

  • After saving the file, open it with a browser (e.g. Google Chrome) and execute the "Search for Users" functionality. You will be prompted for a username/password. As a result, you should see a list of MX_PERSON entries.
  • Afterwards, execute the "Get Data from Display Task" functionality for a valid MSKEY, and you will see the attributes of this entry.

 

With the Google Chrome Developer Tools, you are easily able to look behind the scenes and see which data is moved over the wire.

 

I recommend the following pages and tutorials for further information:

http://en.wikipedia.org/wiki/Ajax_(programming)

http://en.wikipedia.org/wiki/Representational_state_transfer

http://en.wikipedia.org/wiki/JSON

https://developer.mozilla.org/en-US/docs/AJAX/Getting_Started

https://developers.google.com/chrome-developer-tools/

 

In addition, see the following blog posts:

Write your own UIs using the ID Mgmt REST API

SAPUI5 and ID Mgmt - A Perfect Combination

 

Jannis

SAP IDM CEI Program

I had the opportunity to attend SAP’s first CEI program session for IDM. These sessions are designed to give SAP the chance to get some information on not only what customers are looking for from SAP IDM but also what they need from it. I think this is a fantastic initiative on SAP’s part for a few reasons.

 

  1. For the first time since SAP acquired MaXware almost seven years ago, they are going out to seek information from a variety of customers about how they would use the product. Previous CEI initiatives were aimed at a very small group of SAP IDM customers. From what I understand, this new group, mostly volunteers, is about 10 times larger! During this first session, I heard a few ideas come out that I don't think SAP had thought of, but it saw the value almost immediately.
  2. SAP is actively engaging customers, partners, and consultants for feedback. The first session centered on the integration of Success Factors HR with IDM and what customers would be able to expect via an enhanced “ramp up” program.
  3. There are other sessions planned to cover a variety of topics in the coming months showing a considerable amount of initiative from SAP. I’m looking forward to exploring it in more detail.

 

Based on what I saw in this presentation and from things I’ve seen at past TechEd sessions, the forums, and general research, I’m seeing the following trends at SAP regarding Identity Management. These opinions are entirely my own, although comments from SAP are, as always, welcome.

 

  1. IDM is being positioned as the key workflow engine for managing identities throughout the SAP Landscape. If information is coming from any resource involving people, IDM is where it should be processed. HCM, Success Factors, PI, XI, AD, LDAP, no matter what the source, IDM should be handling the flow of information to other systems in the landscape and applications in the Enterprise. Unlike some other “people centric” systems like GRC and CUA, IDM is uniquely designed to work in a heterogeneous environment.
  2. It appears that SAP is poised to start a new phase of growth for IDM. This isn’t just the basic, release the latest iteration of the connector list and publish a new service pack. It’s time to add some new functionality and breathe some new life into SAP IDM!

 

One of the other benefits of this call-in session was getting to hear the voices of several people I have met in my travels and others I have only met through SCN. Hope we all get to meet up soon! (TechEd, I mean d-code, anyone?)

 

I’m looking forward to seeing what SAP has planned for IDM. Here’s to the new IDM frontier in the Landscape and the Enterprise beyond!

SP9 for SAP NetWeaver Identity Management Now Available

The latest support package, SP9, for SAP NetWeaver Identity Management, contains important integration enhancements for the SAP and GRC provisioning frameworks, and an updated SAP HANA connector. You can learn how to set up the connector in Penka Tatarova's new video.

 

SP9 also offers a new feature: You can now benefit from attestation capabilities. Attestation, also known as re-certification, means that managers or administrators periodically check and "attest" that a person only has those access rights he or she should have.

 

Need more information? Take a look at our overview presentation, and read the detailed release notes on the SAP Help Portal.
Ready to download? Get the support package on the SAP Service Marketplace.

How to optimize identities’ lifecycle management in your information system using SAP HR events?

This blog exposes a method for designing SAP HR integration with SAP IDM, it also gives you a large understanding of HR use cases and some tips to succeed this implementation. Here are some thoughts summarized from several customer experiences on SAP HR implementation with IDM.

 

 

 

HR possible use cases related to IDM scenarios

 

Combining Personnel Administration (PA) “tasks” (example: leaving) and “reasons” (example: firing) can quickly have different meanings and generate many changes on an employee PA file. Below is a description of the most common PA tasks which defines one employee lifecycle:

 

  • Hiring: hiring a new employee can be under different contract types or different employment categories for instance, as a permanent employees or as a trainee.

 

  • Rehiring: reentering an employee into a company after a long period of time, for example, for maternity protection leave (same as Suspension of contract)

 

  • Organizational reassignment: changes when the employee changes positions, cost center, or is moved to another subsidiary.

"Promotion" is an essential case to consider for IDM design, as it can be directly related to automatic roles calculation such as ESS/MSS roles.

 

  • Country Reassignment: refers to an employee being assigned to an organizational unit in another country, in other words, the employee is being expatriated to a different country.

 

  • Basic employee information modification: changing an employee’s last name in case of Marriage. This case can also represent a highlight when Login IDs are based on user’s last name.

 

  • Early retirement: as any other "early event" it’s when updating information for future events in SAP HR. (Same as Extensionof contract)

In those use cases we should manage validity dates with attention during IDM design phase.

 

  • Leaving case: When an employee leave the company.

 

Image may be NSFW.
Clik here to view.
HR flow.png

 

Identity Lifecycle regarding HR Business processes


 

Key steps for a successful design of SAP IDM scenarios derived from HR use cases

   

     1. Dig into how customer deal with every HR process

Essential SAP HR personnel administration tasks are defined and performed differently from one customer to another.

Prepare a set of questions to ask about every process during design phase, an example of questions would be:

    • How do HR operators deal with expatriations?
    • Is it a leaving task followed by a rehiring?
    • Is it only an organizational change?

 

    2. Think big … start small

When implementing HR with IDM, we tend to automate accounts management following a predefined rules.

Automatic rules can’t fit to 100% of company employees, that’s why it’s important to demarcate HR scope on a small “population” for a start then enhance      it to the most.

 

    3. Make it simple

SAP IDM provides a set of good utilities to manage rules on roles such as RBAC, Dynamic group for automatic calculation, inheritance between Business Roles Layers ….

When designing role model, let it be as simple as possibleand avoid combining many IDM utilities, the structure gets quickly messy and it’s always a pain  to explain to IDM end users.

 

    4. Spot relevant information

Pick up relevant information of what you need to know to build SAP IDM workflows and translate what you understood from HR process to IDM workflows      in a basic way.

From IDM side, everything is about creation, modifications and deletion.

 

     5. Summarize and focus on SAP IDM fundamentals

Sort out the collected information and focus on what you really need to know to build IDM workflows, below an example of an easy way to recap:

 

HR process

IDM Workflow

Relevant information for IDM?

Provisioning

Standard Modification

Specific * modification

De-provisioning

Hiring a trainee

X

 

 

 

New PA*

Hiring a permanent employee

X

 

 

 

New PA*

Re-hiring

X

 

 

 

New PA*

Marriage / divorce

 

 

X

 

Last name modification

Expatriation

 

X

X

 

Personnel area / Country / Organization modification

Company transfer

 

 

X

 

Organization modification

Expatriation

X

 

 

X

Contract type / country modification

 

*Specific modification: implementing a triggered modification workflow based on event tasks in IDM to respond to one customer specific business requirements.

*PA: personal administration

 

 

 

What you need to know about customization

 

Here are some tips that you will probably have to anticipate:

  • Query result: If you realize that you have many records for the same employee, you will probably have to ask your developer to make it all in one.
  • HCM write back: if you choose to write back information to HR, think about unselecting the corresponding "communication" data from SAP Query as you set SAP IDM as master on “communication” infoset.
  • Future events in HR such as future departures, usually require a modification on the standard query selection.

 

Driving SAP IDM processes by SAP HR events proves to be a good way to cut off support costs.

 

Feel free to try those tips and leave us a comment to let us know if it turns efficient for your projects too :-)

Single Sign-On versus Password Synchronization solutions. How do you know which one is right for you ?

Single Sign-On versus Password Synchronization solutions.

How do you know which one is right for you?

 

This blog co-authored with Benjamin GOURDON is based on several customers’ experiences.

 

The purpose of this blog is to perform a quick comparison and to provide an overview of pros/cons between Single Sign-On and Password Synchronization solutions.  Both are designed to greatly reduce the number of calls to the support and improve the user’s comfort, and provides a ROI lower than 3 months, as proven by many customer implementations.


Single Sign-On: SAP NetWeaver Single Sign-On

 

SAPNetWeaver Single Sign Onenablesusers to access all their applications through a single authentication event. From an end-user perspective, there is
no longer a need to provide credentials for connecting to each application.

 

The overall solution is subdivided into 3 sub solutions:

 

  • Secure Login, which enables SSO to SAP systems using SAP GUI and to other web applications in the same domain. Based on Kerberos tickets or X.509 certificates.
  • Identity Provider, which enables SSO to any web application or web service through identity federation. Based on SAML 2.0.
  • Password Manager, which enables SSO to applications that do not support any standard protocol and require login/password credentials (recorded locally beforehand).

 

Depending on the system landscape, three different implementation scenarios are suitable, and the landscape will determine the authentication protocol:

  • Homogeneous landscape: only SAP applications, all in the same domain
  • Heterogeneous landscape: SAP and non-SAP applications in the same domain
  • Heterogeneous landscape and inter-domain ("on cloud" applications)

 

Password synchronization: SAP NetWeaver Identity Management

 

SAP NetWeaver Identity Management can synchronize passwords throughout your IT landscape so that users can access every application with the same password. Each password change in SAP IDM or in Microsoft Active Directory is automatically replicated to all other integrated systems, optionally as a productive password. To secure this solution, the provisioned password must be transmitted over encrypted channels (SNC for SAP ABAP systems, SSL for web applications, including SAP Java systems and directories).


From an end-user perspective, this means using the same password for every application where you want to log on.


For additional information about this solution, I strongly recommend reading this blog written by Jérémy Baars:


http://scn.sap.com/community/netweaver-idm/blog/2013/12/12/a-little-synchronization-can-pay-big-dividends-end-to-end-password-synchronization

 

Determine which solution best balances cost, security, user comfort and adaptability according to your own criteria.

 

The table below compares Password Synchronization and Single Sign-On by analyzing their respective strengths and weaknesses:

 

[Image: TableBlogSCN.png (Password Synchronization vs. Single Sign-On comparison table)]

 

So let's consider several criteria to choose the most appropriate solution:

 

User Friendliness

As you can see above, SAP NetWeaver Single Sign-On offers a better end-user experience, since it reduces the number of times a user must type an ID and password to access an application. This also helps raise user productivity.

 

Evolution perspectives

SAP Identity Management optimizes the user lifecycle and simplifies user management. It is replacing SAP Central User Administration (CUA), which will no longer be developed by SAP. As such, it can make sense to choose the password synchronization approach if you plan to implement an Identity & Access Management solution in the near future.

 

Security

If security is an important criterion for your choice, implementing SAP NetWeaver Single Sign-On will guarantee strong authentication by blocking traditional access on each application concerned.

 

Cost

From a financial point of view, there is not much difference in implementation costs. The choice should instead be guided by the policy and strategy of the enterprise.

Why you cannot use notifications with basic approvals

Introduction

The enhanced approval mechanism was introduced with SAP NetWeaver Identity Management 7.2 SP4. The purpose was to add more functionality as well as improve performance.

 

This post will attempt to clarify how basic approvals are handled when the 7.2 approvals are enabled. It will explain why you won't always see the approvers for basic approvals in the "Approval Management" section of the Identity Management Administration user interface.

 

Defining the basic approvers

For basic approvals, the approvers are defined on the task, using the same mechanism as access control. This may include an SQL filter that determines who is allowed to approve. This gives you a very powerful way of defining approvers, but it also has some drawbacks.

 

In the following example, I've defined a role called ROLE:APPROVER. A user with this role is allowed to approve, but only for users within the same cost center, i.e. users whose MX_COSTCENTER attribute has the same value.

 

The approver definition looks like this:

[Image: BasicApprovalAccessControl.png (approver definition on the task)]


The filter to select users within the same cost center may look like this (on Microsoft SQL Server):

 

SELECT DISTINCT mskey
  FROM idmv_value_basic WITH (NOLOCK)
 WHERE IS_ID = 1
   AND mskey IN (SELECT mskey
                   FROM idmv_value_basic
                  WHERE AttrName = 'MX_COSTCENTER'
                    AND SearchValue = (SELECT aValue
                                         FROM idmv_value_basic
                                        WHERE AttrName = 'MX_COSTCENTER'
                                          AND MSKEY = %ADMINMSKEY%))

 

During execution, the %ADMINMSKEY% will be replaced by the MSKEY of the approver.

 

Determining the approvers

To determine the approvals for a given user, each and every pending approval must be checked. This evaluation is done when the To Do tab is opened. So for everyone who is a member of ROLE:APPROVER, the system has to check all pending approvals to see whether the target of each pending approval is in the same cost center as the logged-in user.

 

It is not possible to "reverse" the statement to get all the approvals for a given user (%ADMINMSKEY%).

 

As a side note: determining approvers for assignment approvals is simpler, as this will always be a list of users, privileges or roles, which can be expanded immediately.

 

Performance improvement for basic approvals

A major performance improvement was made in the handling of basic approvals: the approver information is saved, which means that each approver only needs to run the above check once for each new approval.

 

Whenever an approver is calculated, this approver is added to the internal approval data structure, which means that subsequent listing of approvals is very fast, compared to having to calculate this every time the user lists the approvals.

 

The MX_APPROVALS attribute

The MX_APPROVALS attribute is (as before) written to entries where an approval is pending, but it is not used during the approval process. Therefore, if you have code that manually changes this attribute, it will have no effect on the pending approval.

 

Approval management

With the 7.2 approvals, we also added approval administration, both for managers and for administrators. This works fine for assignment approvals (which are always expanded), but for basic approvals you will only see approvers who have actually listed their approvals in the "To do" tab and, as a result, have been added to the mxi_approver table.

 

Summary

Because filters can be used to define approvers for basic approvals, the approvers cannot be expanded up front, and therefore notification messages cannot be sent. In addition, these approvers will not be shown in the approval management view for the manager until they have been expanded.


How to use Powershell to create a user in AD

Hi All,

 

I want to share a simple example with you to demonstrate how you can utilize SAP IdM to invoke a local PowerShell script.

In my scenario I am using the Quest ActiveRoles Server Management Shell for Active Directory, but this should work with the Windows AD cmdlets as well.

 

In my Plugins folder I have replaced the standard To LDAP directory pass with a new Shell execute pass.

[Image: Screen Shot 2014-04-03 at 22.53.01.png (Shell execute pass in the Plugins folder)]

In the Destination tab you should disable the option "Wait for execution" and insert the following command with your arguments.

 

cmd /c powershell.exe -Command "c://scripts//ProcessQADUser.ps1" %$rep.QARS_HOST% %$rep.QARS_PASSWORD% %MSKEYVALUE% $FUNCTION.cce_core_descryptPassword(%MX_ENCRYPTED_PASSWORD%)$$ "'%Z_ADS_PARENT_CONTAINER%'" %MX_FIRSTNAME% "'%MX_LASTNAME%'"

[Image: Screen Shot 2014-04-03 at 22.57.50.png (Destination tab with the shell command)]

Please remember to separate the attributes with spaces: PowerShell strips commas and turns comma-separated values into a single array argument.

 


Hope this helps.

 

Regards,

Ridouan

Eliminating Whitespace

Maybe some of you have experienced this problem, maybe not. Maybe you already knew the answer, but I couldn't find it anywhere on here, so when I figured it out I thought I'd share.

 

In the current environment I'm working in, when a new account is entered into IDM, whether directly through IDM or via the HR system, the first 6 characters of the last name plus a couple of characters from the first name or nickname are used to build the MSKEYVALUE, which in turn becomes the user's Windows and SAP login ID. We call this the 6+2 unique ID. The problem was that if a person had a space in their last name, that space counted as a character. It would get squeezed out when the actual MSKEYVALUE was created, but that left the ID in a 5+2 state.

 

For example, the name "Jodi Van Camp", with "Van Camp" as the MX_LASTNAME, would produce an MSKEYVALUE of "VanCaJo" when it should be "VanCamJo".
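In code terms, the behaviour looked roughly like this (a simplified sketch of the 6+2 rule, not the actual provisioning logic):

// Simplified sketch of the 6+2 rule described above (illustrative only)
var lastName  = "Van Camp";   // MX_LASTNAME
var firstName = "Jodi";
// first 6 characters of the last name + 2 characters of the first name
var id = lastName.substring(0, 6) + firstName.substring(0, 2);
// id is "Van CaJo"; once the space is squeezed out, only "VanCaJo" (5+2) remains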

 

The bottom line was, we needed to eliminate those spaces in the last name for the purpose of creating the MSKEYVALUE.

 

I thought it would be a simple replace using a script. Maybe something like this:

 

function z_eliminateWhitespace(Par){
    var result = Par.replace(/\s+/g, "");
    return result;
}

Or maybe this:

 

function z_eliminateWhitespace(Par){
    var result = Par.replace(/\s/g, "");
    return result;
}

Or this:

 

function z_eliminateWhitespace(Par){
    var result = Par.replace(/ /g, "");
    return result;
}

Or lastly, this:

 

function z_eliminateWhitespace(Par){
    var result = Par.replace(" ", "");
    return result;
}

None of these seemed to work. I've seen it happen far too many times that a SQL query or JavaScript doesn't behave in IDM exactly the way it does in other environments, so this wasn't a total surprise, but now what? Finally, I hit on the idea of splitting the string on the spaces and rejoining it without them. This is the script I eventually came up with, and it works:

 

function z_eliminateWhitespace(Par){
    var result = Par.split(" ").join("");
    return result;
}

The final script had an IF check before the split/join to make sure Par wasn't empty or null, but you get the general idea. Hope this helps someone out there someday.
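For completeness, here is a sketch of what that final version might look like; the exact null/empty guard is my reconstruction, not the production code:

function z_eliminateWhitespace(Par){
    // reconstruction of the IF check mentioned above: skip empty or null values
    if (Par == null || Par == "") {
        return Par;
    }
    // split on spaces and rejoin without them
    var result = Par.split(" ").join("");
    return result;
}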

A simple example consuming SAP Identity Management REST API

Dear community,

 

since I often get asked how to build a simple example using the SAP Identity Management REST API, I am writing a small blog post about it.

 

  • Attached, you will find a text file. Download it to a directory of your choice and rename it to index.html.
  • Download the jQuery library from the following location and save it to the same directory as jquery-1.9.1.min.js:

http://code.jquery.com/jquery-1.9.1.min.js

  • Edit the index.html file with a plain text editor (e.g. Sublime Text) and change the host definition so that it fits your environment:

var baseUrl = 'http://localhost:50000/idmrest/v1';

  • After saving the file, open it in a browser (e.g. Google Chrome) and execute the "Search for Users" functionality. You will be prompted for a username/password. As a result, you should see a list of MX_PERSON entries.
  • Afterwards, execute the "Get Data from Display Task" functionality for a valid MSKEY, and you will see the attributes of this entry.

 

With the Google Chrome Developer Tools, you can easily look behind the scenes and see which data is moved over the wire.
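If you prefer to build the call yourself rather than read it out of the attachment, the core of index.html boils down to a jQuery AJAX request like the sketch below. The '/entries' resource path is only an assumption for illustration; take the exact path from the attached index.html:

var baseUrl = 'http://localhost:50000/idmrest/v1';

// Minimal sketch of a "Search for Users" style call (resource path is a placeholder)
$.ajax({
    url: baseUrl + '/entries',   // assumption: replace with the path used in index.html
    type: 'GET',
    dataType: 'json'
}).done(function (data) {
    // the raw JSON result; index.html renders this as a list of MX_PERSON entries
    console.log('Search result:', data);
}).fail(function (xhr) {
    console.log('Request failed: ' + xhr.status + ' ' + xhr.statusText);
});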

 

I recommend the following pages and tutorials for further information:

http://en.wikipedia.org/wiki/Ajax_(programming)

http://en.wikipedia.org/wiki/Representational_state_transfer

http://en.wikipedia.org/wiki/JSON

https://developer.mozilla.org/en-US/docs/AJAX/Getting_Started

https://developers.google.com/chrome-developer-tools/

 

In addition, see the following blog posts:

Write your own UIs using the ID Mgmt REST API

SAPUI5 and ID Mgmt - A Perfect Combination

 

Jannis

SAP IDM CEI Program

I had the opportunity to attend SAP's first CEI program session for IDM. These sessions are designed to give SAP the chance to learn not only what customers are looking for from SAP IDM but also what they need from it. I think this is a fantastic initiative on SAP's part, for a few reasons.

 

  1. For the first time since SAP acquired MaXware almost seven years ago, they are going out to seek information from a variety of customers about how they would use the product. Previous CEI initiatives were aimed at a very small group of SAP IDM customers. From what I understand, this new group, mostly volunteers, is about 10 times larger! During this first session, I heard a few ideas come out that I don't think SAP had thought of, but they saw the value almost immediately.
  2. SAP is actively engaging customers, partners, and consultants for feedback. The first session centered on the integration of Success Factors HR with IDM and what customers would be able to expect via an enhanced “ramp up” program.
  3. There are other sessions planned to cover a variety of topics in the coming months showing a considerable amount of initiative from SAP. I’m looking forward to exploring it in more detail.

 

Based on what I saw in this presentation and from things I’ve seen at past TechEd sessions, the forums, and general research, I’m seeing the following trends at SAP regarding Identity Management. These opinions are entirely my own, although comments from SAP are, as always, welcome.

 

  1. IDM is being positioned as the key workflow engine for managing identities throughout the SAP Landscape. If information is coming from any resource involving people, IDM is where it should be processed. HCM, Success Factors, PI, XI, AD, LDAP, no matter what the source, IDM should be handling the flow of information to other systems in the landscape and applications in the Enterprise. Unlike some other “people centric” systems like GRC and CUA, IDM is uniquely designed to work in a heterogeneous environment.
  2. It appears that SAP is poised to start a new phase of growth for IDM. This isn't just the usual release of the latest iteration of the connector list and a new service pack. It's time to add some new functionality and breathe some new life into SAP IDM!

 

One of the other benefits of this call-in session was getting to hear the voices of several people I have met in my travels, and others I have only met through SCN. Hope we all get to meet up soon! (TechEd, I mean d-code, anyone?)

 

I’m looking forward to seeing what SAP has planned for IDM. Here’s to the new IDM frontier in the Landscape and the Enterprise beyond!

DB2 = 5 and other interesting facts

As always, we start with a quote:

 

Captain Spock: All things being equal, Mr. Scott, I would agree with you. However, all things are not equal.

 

My latest project has me working with DB2 as the IDM backend. We've faced several challenges along the way, many in the area of performance and some in general development. I'll be addressing some of the performance issues in a future post (I need more input from the DBAs and some further research first).

One of the first things I learned about DB2 is how IDM recognizes it. I'm not talking about the Oracle emulation layer, but rather what code IDM uses to determine the database type. Consider the following code from the SAP IDM RDS solution, which I have since modified:


// Main function: sapc_prepareSQLStatement
// 12MAR2014 - MGP - Added DB2 Support, some cleanup
function sapc_prepareSQLStatement(Par){

    var dbType = "%$ddm.databasetype%";
    var script = "sapc_prepareSQLStatement";
    var returnValue = "";
    //uWarning("dbType: " + dbType);

    // Processing
    if (dbType == 1) { // MS-SQL
        // uWarning("Database Type is MS-SQL.");
        // uWarning("Par: " + Par);
        return Par;
    }
    else if (dbType == 2) { // ORACLE
        returnValue = uReplaceString(Par, "AS", "");
        // uWarning("Database Type is Oracle.");
        // uWarning("returnValue: " + returnValue);
        return returnValue;
    }
    else if (dbType == 5) { // DB2
        returnValue = uReplaceString(Par, "AS", "");
        // uWarning("Database Type is DB2.");
        // uWarning("returnValue: " + returnValue);
        return returnValue;
    }
    else {
        uErrMsg(2, script + " SQL Task: invalid database type: " + dbType);
        // return error message and empty result
        return "";
    }
}

 

Note that SQL Server has a value of 1 and Oracle a value of 2, while DB2 has a value of 5. Makes you wonder what happened to 3 and 4… (Sybase and MySQL, maybe?)
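To illustrate what the function does with a statement, here is a purely illustrative call; the SELECT statement is an arbitrary example, not taken from the RDS content:

// Purely illustrative call - the statement is an arbitrary example
var stmt = "SELECT mcMskeyValue AS userid FROM idmv_entry_simple";
var prepared = sapc_prepareSQLStatement(stmt);
// dbType 1 (MS SQL): the statement is returned unchanged
// dbType 2 or 5 (Oracle/DB2): the "AS" keyword is removed by uReplaceString, as shown above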

 

The other thing I discovered is that certain character sequences are not welcome when writing values back to a table in DB2. I needed to write a multi-value entry back to the database and kept receiving error messages. I finally got a useful error message by changing the properties of the To Database pass I was using so that it would do SQL updating.

[Image: To Database.png (To Database pass properties)]

When I did that, I received the following message (data has been changed to protect the innocent):

 

SQL Update failed. SQL:INSERT into recon_roleassign_EPD values (AAA__00000,AAA__00000,BBB__00000) com.ibm.db2.jcc.am.SqlSyntaxErrorException: An unexpected token "!" was found following "PD values (NL__HR005". Expected tokens may include: "!!".. SQLCODE=-104, SQLSTATE=42601, DRIVER=3.66.46

 

I was somewhat confused, since this is the "standard" delimiter as far as IDM is concerned. However, after checking with a knowledgeable DB2 DBA, he confirmed that not only is '!!' illegal to write, but so is '||'. He also mentioned that there is a list out on the web somewhere, but I was not able to find it. If anyone can find the list of illegal character sequences, please comment on this entry and I will update it. I wound up using ';;' as a delimiter.

 

As an aside, I have been asked over the years why I use '!!', '||', or even ';;' as a delimiter. The answer, as I understand it, is this:

 

Any single character used as a delimiter could also appear in the data itself. That is fairly obvious for common delimiters such as a comma, colon, or dash, and still possible for characters such as a pipe, slash, or pound sign. However, when you use two identical characters as a delimiter, the chance that the sequence is actually part of the data string is greatly reduced.
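If you have existing values that use IDM's default '!!' delimiter and need to write them to DB2, a small script can swap the delimiter first. This is only a sketch and the function name is hypothetical:

// Hypothetical helper: swap IdM's default '!!' delimiter for ';;' before writing to DB2
function z_convertDelimiterForDB2(Par){
    if (Par == null || Par == "") {
        return Par;   // nothing to convert
    }
    // split on '!!' and rejoin with ';;'
    return Par.split("!!").join(";;");
}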

 

So there you have it, DB2 = 5, and you can’t write ‘!!’ or ‘||’ to a table. What other DB2 tips do you have to share?
