Saturday 26 March 2016

FDMEE Web Services Part 2

Back again to continue looking at what FDMEE has to offer in terms of Web Services. In the last part I looked at what is currently available in the on-premise world of FDMEE, covering the Web Services that use the SOAP protocol and a possible alternative of accessing the Java web servlets directly.

Today I am going to look at what is available using the FDMEE REST web service, focusing on PBCS as unfortunately this option is not currently available for on-premise. Maybe this will change in the next PSU or further down the line; we will have to wait and see.

Update December 2016
REST functionality is now available for on-premise FDMEE from PSU 11.1.2.4.210

I have covered the Planning REST API with PBCS in previous posts, so it may be worth having a read of them if you have not already, because this post is going to be in a similar vein and I don't want to go into the same detail again.

As you may well be aware if you have worked with PBCS, it is bundled with a cut-down version of FDMEE known simply as data management.

If you have used the epm automate utility you will have seen that it has two options when integrating with data management; the documentation states:

Run Data Rule

Executes a Data Management data load rule based on the start period and end period, and import or export options that you specify.

Usage: epmautomate rundatarule RULE_NAME START_PERIOD END_PERIOD IMPORT_MODE EXPORT_MODE [FILE_NAME]

Run Batch

Executes a Data Management batch.

Usage: epmautomate runbatch BATCH_NAME

I am not going to cover using the epm automate utility as it is pretty straightforward and I want to concentrate on the REST API.

If you look at the PBCS REST API documentation there is presently nothing in there about the run rule and run batch functionality, but it must be available as that is how the epm automate utility operates.

Well, it is available, which is good or that would have been the end of this blog; maybe it has just been missed in the documentation.

Before I look at the data management REST API I want to quickly cover uploading a file so that it can be processed; it would be no use if you did not have a file to process.

I have covered uploading files to PBCS in detail before, but there is a slight difference when it comes to data management as the file needs to be uploaded to a directory it can be imported from.

To upload a file using the REST API there is a post resource available using the URL format:

https://server/interop/rest/{api_version}/applicationsnapshots/{applicationSnapshotName}/contents?q={"isLast":"boolean","extDirPath":"upload_dir","chunkSize":"sizeinbytes","isFirst":"boolean"}

{applicationSnapshotName} is the name of the file to be uploaded.

The query parameters are in JSON format:

chunkSize = Size of chunk being sent in bytes

isFirst = If this is the first chunk being sent set to true

isLast = If this is the last chunk being sent set to true.

extDirPath = Directory to upload the file to; if you are not uploading to data management this parameter is not required, which is why I didn't cover it in my previous post.

When you create a data load rule in data management and select the file name, a pop-up window will open and you can drill down the directory structure to find the required file.

 

It is slightly confusing that it starts with inbox and then has another directory called inbox; anyway, I am going to load to the inbox highlighted, which means the extDirPath parameter will need to be set with a value of "inbox".

Just like in my previous posts I am going to provide examples with PowerShell scripts to demo how simply it can be achieved; if you are going to attempt this then you can pretty much use any scripting language you feel most comfortable with.
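Here is a minimal sketch of what such an upload script could look like; the server URL, API version, file path and credentials are all placeholders, and the file is sent as a single chunk to keep things simple.

# Minimal sketch - upload a file to the data management inbox in a single chunk
# The server URL, API version, file path and credentials are placeholders
$server = "https://planning-test.pbcs.us2.oraclecloud.com"
$apiVersion = "11.1.2.3.600"
$file = "C:\Temp\datafile.txt"
$fileName = [System.IO.Path]::GetFileName($file)

# Encode the username and password and pass them into the request header
$user = "identitydomain.username"
$pass = "password"
$encoded = [System.Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes($user + ":" + $pass))
$headers = @{"Authorization" = "Basic " + $encoded}

# Build the upload URL - extDirPath is set to inbox so data management can import the file
$bytes = [System.IO.File]::ReadAllBytes($file)
$query = '{"isFirst":"true","isLast":"true","chunkSize":"' + $bytes.Length + '","extDirPath":"inbox"}'
$url = $server + "/interop/rest/" + $apiVersion + "/applicationsnapshots/" + $fileName + "/contents?q=" + [uri]::EscapeDataString($query)

# Upload the file and output the response status (0 = success)
$response = Invoke-RestMethod -Uri $url -Method Post -Headers $headers -Body $bytes -ContentType "application/octet-stream"
$response.status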


In the above example the username and password are encoded and passed into the request header.

The file is uploaded from the local machine to the data management inbox using the "Invoke-RestMethod" cmdlet.

The response status codes are:

0 = success
-1 = in progress
Any other code is a failure.

The response was 0 which means the upload was a success and the file can be viewed and selected in the data load rule setup.


Now that the file has been uploaded and there is a data load rule in data management, the REST API can come into play to execute the rule.

The API can be reached from the following URL format:

https://server/aif/rest/{api_version}/jobs

The current API version is V1.

I am going to use the Boomerang REST client for Chrome to demonstrate.


Using a GET request against the REST resource returns the jobs that have been executed in data management.


The response includes the total number of jobs and a breakdown of each job including the status, id, who executed the job and output log.
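If you would rather script it than use a REST client, an equivalent GET request in PowerShell might look something like the following sketch; the server and credentials are placeholders.

# Sketch - return the jobs that have been executed in data management
$server = "https://planning-test.pbcs.us2.oraclecloud.com"
$user = "identitydomain.username"
$pass = "password"
$encoded = [System.Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes($user + ":" + $pass))
$headers = @{"Authorization" = "Basic " + $encoded}

# GET against the jobs resource and output the job details
$jobs = Invoke-RestMethod -Uri ($server + "/aif/rest/V1/jobs") -Method Get -Headers $headers
$jobs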

If I run the same request against the 11.1.2.4.100 on-premise FDMEE the following is returned.


I deliberately executed the request with no authentication header just to show that the REST resource is available; adding the header returns the following response.


It looks like the required Shared Services Java methods don't yet exist in on-premise, so that is as far as you can get at the moment.

Back to PBCS: to run a rule the same URL format is required, but this time using a POST request, and the payload or body of the request has to include the job information in JSON format.

The parameters to supply are:
  • jobType which will be DATARULE
  • jobName is the name of the data load rule defined in data management.
  • startPeriod is the first period which is to be loaded.
  • endPeriod for a single period load will be the same as start period and for a multi-period load it will be the last period which data is to be loaded.
  • importMode determines how the data is imported into data management, possible values are:
    • APPEND to add to the existing POV data in Data Management
    • REPLACE to delete the POV data and replace it with the data from the file
    • NONE to skip data import into the data management staging table
  • exportMode determines how the data is exported and loaded to the planning application, possible values are:
    • STORE_DATA merges the data in the data management staging table with the existing planning application data
    • ADD_DATA adds the data in the data management staging table to the existing planning application data
    • SUBTRACT_DATA subtracts the data in the data management staging table from the existing planning application data
    • REPLACE_DATA clears the POV data in the planning application and then loads the data from the data management staging table. The data is cleared for Scenario, Version, Year, Period, and Entity
    • NONE does not perform the data export and load into the planning application.
  • fileName (optional) if not supplied the file which is defined in the data load rule will be used.
Below is an example of the request payload in JSON format for executing a data load rule.
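For example, with a rule name and periods that are purely illustrative (the optional fileName is left out so the file defined in the rule is used):

{
  "jobType": "DATARULE",
  "jobName": "DelimitedFileDL",
  "startPeriod": "Mar-16",
  "endPeriod": "Mar-16",
  "importMode": "REPLACE",
  "exportMode": "STORE_DATA"
}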


Once the request is posted the response will contain information about the rule that has been executed.


The rule status is shown as running; to keep checking the status of the rule, the URL returned in the href parameter can be accessed using a GET request.


This time the status is returned as success so the rule has completed; the id can be matched to the process id in data management.


Converting this into a script is a pretty simple task.
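As a rough guide, a PowerShell sketch to post the rule execution could look like this; the server, credentials, rule name and periods are placeholders.

# Sketch - execute a data load rule through the data management REST API
$server = "https://planning-test.pbcs.us2.oraclecloud.com"
$user = "identitydomain.username"
$pass = "password"
$encoded = [System.Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes($user + ":" + $pass))
$headers = @{"Authorization" = "Basic " + $encoded}

# Build the job payload in JSON format - rule name and periods are illustrative
$payload = @{
  jobType = "DATARULE"
  jobName = "DelimitedFileDL"
  startPeriod = "Mar-16"
  endPeriod = "Mar-16"
  importMode = "REPLACE"
  exportMode = "STORE_DATA"
} | ConvertTo-Json

# Post the request and output the response, which includes the job status and id
$response = Invoke-RestMethod -Uri ($server + "/aif/rest/V1/jobs") -Method Post -Headers $headers -Body $payload -ContentType "application/json"
$response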


The URL containing the job id can be stored from the response and then another request executed to check the status.
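Continuing the sketch above; the property holding the job URL may be nested slightly differently in the actual response.

# Store the job URL returned in the response (it may sit under a links collection)
$jobUrl = $response.href

# Check the status of the rule with a GET request against the job URL
$status = Invoke-RestMethod -Uri $jobUrl -Method Get -Headers $headers
$status.status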


Once again this can be confirmed in the process details area in data management.


Moving on to running a batch, which is the same as running a rule except with fewer parameters in the payload.

The required parameters are “jobType” and “jobName” in JSON format.

The "jobType" value will be "BATCH" and the "jobName" will be the name of the batch, nice and simple.


The response is exactly the same as when running a rule.


In scripting terms it is the same logic as with running a rule; this time in my example I have updated the script so that it keeps checking the status of the batch until a value other than running (-1) is returned.
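A rough version of that polling loop is below; the batch name is illustrative and the connection details are the same placeholders as in the earlier sketches.

# Sketch - run a data management batch and poll until it is no longer running
$server = "https://planning-test.pbcs.us2.oraclecloud.com"
$user = "identitydomain.username"
$pass = "password"
$encoded = [System.Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes($user + ":" + $pass))
$headers = @{"Authorization" = "Basic " + $encoded}

# Only jobType and jobName are required for a batch
$payload = @{ jobType = "BATCH"; jobName = "LOAD_ACTUALS_BATCH" } | ConvertTo-Json
$response = Invoke-RestMethod -Uri ($server + "/aif/rest/V1/jobs") -Method Post -Headers $headers -Body $payload -ContentType "application/json"

# Keep checking the job status until a value other than running (-1) is returned
$jobUrl = $response.href
do {
    Start-Sleep -Seconds 5
    $response = Invoke-RestMethod -Uri $jobUrl -Method Get -Headers $headers
} while ($response.status -eq -1)

"Batch completed with status: " + $response.status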



Well, that covers what is currently available with the data management REST API; hopefully this will make its way down to on-premise FDMEE in the near future. It may appear with the imminent release of hybrid functionality, but we will just have to wait and see.

Monday 7 March 2016

FDMEE Web Services Part 1

After covering Essbase and Planning Web Services in detail I thought I would move on to have a look at what FDMEE has to currently offer for both on-premise and PBCS.

I am going to start off looking at Web Services with the full offering of FDMEE, which is on-premise; there has been a SOAP-based Web Service available since the early days of ERPi, which I doubt has been widely used.

To be able to use the Web Service there is a requirement to configure Oracle Web Services Manager (OWSM) first, and unfortunately there is no option to bypass it like there is with the Essbase Web Service.

It is also worth pointing out that the 11.1.2.4 documentation states the following:

“It is recommended that you use batch scripts rather than the web services”

Interesting that the recommendation is to use batch scripts which have to be run from the FDMEE server; not really an ideal situation if you are running processes that could take advantage of Web Services. Anyway, I will get on to the batch scripts later as there is an alternative to having to run the scripts.

I still think it is worth covering the Web Services because they are available and can be used, though personally I am hoping that the SOAP Web Services will be replaced with a REST version like the one available with PBCS which I will get on to later.

I am not going to go into detail on configuring OWSM as it is covered in the documentation and there are various guides out there which can be followed.

A summary of the steps are:
  • Configure an external directory in WebLogic Admin console to match the external directory configuration in Shared Services.

  • Run Oracle Repository Creation Utility to create the Metadata Services schema.
  • Extend the EPM system WebLogic domain to configure OWSM.
  • Enable OWSM in WebLogic Admin console.
  • If required set up message protection to encrypt and sign the SOAP messages.
  • Configure OWSM security policy in Enterprise Manager Fusion Middleware Control, if you are extremely bored and want to understand policies in great detail have a look here and here.
It is possible to configure the Web Services with a set of default policies using batch scripts which are explained in detail here.

Right, OWSM is configured wasn’t that fun ;)

Depending on the FDMEE version and how OWSM was configured you may end up with the following security policy attached to the FDMEE web application.

oracle/wss11_saml_or_username_token_with_message_protection_service_policy

If you are interested the policy definition is:

“This policy enforces message-level protection (that is, message integrity and message confidentiality) and authentication for inbound SOAP requests in accordance with the WS-Security 1.1 standard.”


I am not going to use message protection as I think it is overkill for demonstration purposes, so I removed the policy and attached:

oracle/wss_username_token_service_policy

The policy definition is:

“This policy uses the credentials in the UsernameToken WS-Security SOAP header to authenticate users. The plain text mechanism is supported.”


Finally we can access the Web Service WSDL through the URL:

http(s)://<webserver>:<port>/oracle-epm-erpi-webservices/RuleService?wsdl

The current FDMEE documentation informs us that the following operations are available:
  • executeDataRule
  • executeMetaDataRule
  • getDataRuleNames
  • getLocationNames
  • getPeriodNames
  • lockPOV
  • unlockPOV
It is worth pointing out that the documentation has not been updated in a long time and the definitions of some of the operations are not totally correct.

There are a number of different ways to test or access the Web Service; you can use the standard functionality within Enterprise Manager.


There are also quite a few plugins and applications available for browsers; like in my previous posts I am going to use the Chrome application Boomerang, which is a simple way to access SOAP and REST Web Services.


As you can see from above there are differences between the operations that are returned and what the documentation states are available, though it is worth pointing out that some of the operations are part of the Account Reconciliation Manager product.

I am going to cover the getDataRuleNames and executeDataRule operations.

An empty SOAP request for the getDataRuleNames operation looks like:


The documentation says a location name is required as an input but in reality it is the partition key of a location.

The partition key can be easily located in FDMEE.


If you try and execute the SOAP request with no header an error will be generated as no OWSM policy information was included.


Replicating the header for the OWSM policy I am using is pretty simple once you understand the format.
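For reference, a plain UsernameToken header for this policy generally takes the following shape; the prefix names are arbitrary and the credentials are placeholders.

<soapenv:Header>
  <wsse:Security xmlns:wsse="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd">
    <wsse:UsernameToken>
      <wsse:Username>admin</wsse:Username>
      <wsse:Password Type="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-username-token-profile-1.0#PasswordText">password</wsse:Password>
    </wsse:UsernameToken>
  </wsse:Security>
</soapenv:Header>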


Executing the request will now return all the data rules and data rule properties for the location with the partition key 21.


There are 44 properties returned for each rule so you can pretty much find anything you want to know about them.

Executing the operation using a scripting language does not take too much effort; in the following example I am using a simple PowerShell script, but it can be achieved in pretty much any language.
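The outline below is a hedged sketch of that kind of script; the template path, service URL, credentials and the token names used in the template are all illustrative.

# Sketch - read in a SOAP XML template and set the operation, credentials and partition key
# The template path, service URL and token names are illustrative
$soap = Get-Content "C:\Scripts\soapTemplate.xml" -Raw
$soap = $soap.Replace("%OPERATION%", "getDataRuleNames")
$soap = $soap.Replace("%USERNAME%", "admin")
$soap = $soap.Replace("%PASSWORD%", "password")
$soap = $soap.Replace("%PARTITIONKEY%", "21")

# Post the request to the FDMEE rule service
$url = "http://fdmeeserver:6550/oracle-epm-erpi-webservices/RuleService"
$response = Invoke-WebRequest -Uri $url -Method Post -Body $soap -ContentType "text/xml;charset=UTF-8" -Headers @{"SOAPAction" = '""'}

# The required properties can then be pulled out of the XML response and stored or displayed
[xml]$responseXml = $response.Content
$responseXml.OuterXml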


The script reads in a SOAP XML template and then sets the operation, username, password and partition key before posting the request.

The required properties from the response can be stored and processed or just displayed like above.

The status property shows that the rule failed the last time it was run so let’s move on to running a rule through the Web Service.

It doesn't take a mastermind to know that the operation to use will be "executeDataRule".

The inputs for the SOAP request are:
  • Rule Name
  • Import from Source 
  • Export to Target
  • Export Mode
  • Execution Mode – (equivalent of Import Mode in FDMEE) 
  • Load Exchange Rate Flag
  • Start Period Name
  • End Period Name
  • Sync Mode - (equivalent of offline/online mode in FDMEE)
These are self-explanatory and similar to what is available when you execute a rule from within FDMEE.

The SOAP request once again requires the OWSM policy header and the above inputs.
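Following the same template approach as before, it is just a case of swapping the operation and filling in the inputs listed above; the token names are my own placeholders, and the exact element names and accepted values need to come from the WSDL rather than this sketch.

# Sketch - reuse the template approach with a template built for executeDataRule
# Token names and input values are illustrative; check the WSDL for the real element names
$soap = Get-Content "C:\Scripts\executeDataRule.xml" -Raw
$soap = $soap.Replace("%USERNAME%", "admin").Replace("%PASSWORD%", "password")
$soap = $soap.Replace("%RULENAME%", "DelimitedFileDL")
$soap = $soap.Replace("%IMPORTFROMSOURCE%", "true").Replace("%EXPORTTOTARGET%", "true")
$soap = $soap.Replace("%EXPORTMODE%", "STORE_DATA").Replace("%EXECUTIONMODE%", "REPLACE")
$soap = $soap.Replace("%LOADFXRATE%", "false")
$soap = $soap.Replace("%STARTPERIOD%", "Mar-16").Replace("%ENDPERIOD%", "Mar-16")
$soap = $soap.Replace("%SYNCMODE%", "true")

# Post the request - the response should contain the Process ID and ODI session ID
$url = "http://fdmeeserver:6550/oracle-epm-erpi-webservices/RuleService"
$response = Invoke-WebRequest -Uri $url -Method Post -Body $soap -ContentType "text/xml;charset=UTF-8" -Headers @{"SOAPAction" = '""'}
([xml]$response.Content).OuterXml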


The response contains the Process ID and ODI session ID for the rule execution.


If I run the PowerShell script from earlier it confirms the rule has been run successfully.


This can also be verified in Process Details in FDMEE.


I am going to leave it there for the on-premise FDMEE Web Services and move on to what the documentation recommends: running batch scripts.

Now, as this post is about Web Services I am not going to be running the batch scripts, and will instead look at an alternative solution.

There are various batch scripts available that run rules for loading metadata/data, write back and import mappings, as well as running batches.

These scripts can be found in
<EPM_ORACLE_INSTANCE>\FinancialDataQuality

Basically the scripts call another batch script in
<EPM_ORACLE_HOME>\EPMSystem11R1\products\FinancialDataQuality\bin

From there the passed-in parameters are posted to a Java servlet in the FDMEE web application, so even though the process originates with a batch script, in the end a request is made to the FDMEE web application. This means in theory we should be able to bypass the scripts and go direct to the web application.

The URL for the servlet is:

http(s)://<webserver>:<port>/aif/BatchExecutionServlet

Let us take the batch script for running a data load rule as an example.

The format for the load data rule (loaddata.bat/sh) batch script is as follows:

loaddata USER PASSWORD RULE_NAME IMPORT_FROM_SOURCE EXPORT_TO_TARGET EXPORT_MODE IMPORT_MODE LOAD_FX_RATE START_PERIOD_NAME END_PERIOD_NAME SYNC_MODE

So to run a rule it could be like:

loaddata admin password DelimitedFileDL Y N Store Replace N Mar-16 Mar-16 False

As the parameters are being posted to a servlet this can be replicated using the same Web Services client application as before.

First add a content-type of application/x-www-form-urlencoded to the header.


Next it is just a matter of adding each of the parameters.


The password can either be clear text or encrypted, there is a script available to generate an encrypted password.

Alternatively the parameters could be entered directly into the payload.


Once the request has been posted a response should be returned to confirm whether the rule was successfully executed.


The response is pretty much the same as the output from the batch scripts.

In terms of scripting this can be easily reproduced as the example below shows:
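The sketch below shows the general idea; the server, credentials and rule details are placeholders, and the form field names are purely illustrative - the actual names can be seen by looking at how the batch script under <EPM_ORACLE_HOME>\EPMSystem11R1\products\FinancialDataQuality\bin builds the request.

# Sketch - post the load rule parameters straight to the batch execution servlet
# Server, credentials and rule details are placeholders, and the form field names
# are illustrative - take the real names from the batch script that builds this request
$url = "http://fdmeeserver:6550/aif/BatchExecutionServlet"

$body = @{
    username         = "admin"
    password         = "password"
    rulename         = "DelimitedFileDL"
    importfromsource = "Y"
    exporttotarget   = "N"
    exportmode       = "Store"
    importmode       = "Replace"
    loadfxrate       = "N"
    startperiodname  = "Mar-16"
    endperiodname    = "Mar-16"
    syncmode         = "False"
}

# A hashtable body is sent as application/x-www-form-urlencoded, matching the content-type above
$response = Invoke-RestMethod -Uri $url -Method Post -Body $body -ContentType "application/x-www-form-urlencoded"

# Output the formatted response returned by the servlet
$response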


The parameters are stored and posted to the servlet and then a formatted response is output.

This concept can be applied to all of the batch script functionality, here is an example to run an FDMEE batch.


Well, I think I am going to leave it there for today; in the next part I will look at the options available using the REST API, which is currently only available with FDMEE (data management) in PBCS. Maybe it will be made available as an on-premise option in the next PSU, we will see.