Sunday, 9 July 2017

EPM Cloud - Clearing data with EPM Automate and REST

There have been a few new additions to the EPM Cloud 17.07 release that I thought might be worth quickly running through.

There is a new version of the EPM Automate utility which includes a new command for clearing data; the new features and announcements document provides the following insight:

“A new version of the EPM Automate Utility is available with this update. This version includes the clearcube command, which uses the settings specified in a Planning job of type clear cube to delete specific data from input and reporting cubes.”

To coincide with this there is also the option to clear data using the REST API which I will get on to later.

There have also been some changes for clearing data in the UI, the documentation states:

“The Clear Cube job options have been updated to enable you to clear both Essbase and relational data. Previously you could only clear Essbase data. To view the updated Clear Cube job options, click Application, then Overview. From the Actions menu, select Clear Cube. On the Clear Cube page, click Create, complete the job name, select the cube, then select from the clear options displayed. Oracle has also provided two modes for clearing partial data for aggregate storage cubes. Selecting Partial Data (Basic Mode) enables you to use the member selector to specify the regions to be cleared. Selecting Partial Data (Advanced Mode) enables you to clear Essbase data by using an MDX expression in the text box provided.”

So basically relational data can now be cleared as part of the clear cube job and the options for clearing ASO data have been split into a basic and advanced mode.

Before the July release the options in clear cube for BSO were:

The options for an ASO cube were:

If Partial Data was selected then you could choose whether it was going to be a logical clear:

“In which the input cells in the specified region are written to a new data slice with negative, compensating values that result in a value of zero for the cells you want to clear. The process for logically clearing data completes in a length of time that is proportional to the size of the data being cleared. Because compensating cells are created, this option increases the size of the database.”

or a physical clear:

“In which the input cells in the specified region are physically removed from the aggregate storage database. The process for physically clearing data completes in a length of time that is proportional to the size of the input data, not the size of the data being cleared. Therefore, you might typically use this method only when you need to remove large slices of data.”

By using the member selector you could define the area of the cube you wanted to clear.

This would automatically generate MDX, which was not always the most efficient, but at least it could be manually updated.

One of the problems with the clear cube data jobs was that if you had any relational type data, such as comments, supporting detail or attachments, there was no option to clear it.

For example, take the following data in a form with a comment attached to a cell.

If the clear cube job was run the data would be removed but the comments would still remain.

Ok, so you could go to Clear Cell Details from the navigator, but that would mean defining the POV twice.

Moving swiftly on to the July release, let us see what changes have happened.

If you create a clear cube job you will see there are now clear options where you can select from Supporting Details, Comments, Attachments and Essbase Data.

For ASO cubes, if Partial Data is selected then you have the new basic or advanced mode for defining the area of the database you would like to clear.

Using the Basic mode you can use the member selector to define where you want to clear data.

In Advanced mode you have to paste the MDX directly into the input box. If you select Basic mode first, define the POV and then switch to Advanced, you will see the MDX that is going to be executed.

If you do switch to Advanced mode the relational type clear options will be deselected and unavailable, so watch out for that.

If you start with Advanced mode and enter an MDX statement:

Then switch to Basic mode and the dimension and members section will be blank. So it works going from Basic to Advanced but not the other way around, which is understandable because it could be a complex MDX statement and converting it into the basic format would be asking a little too much.

It is possible to select any combination of the clear options, so, for example, you could clear only comments and leave the Essbase data intact.

Let us take a quick example with a form that has data and a comment.

After selecting only Comments in the clear options and running the job, only the comments are deleted.

Before the July release, once you had created a clear cube job the only way to run it was by submitting it from the UI or setting up a schedule; what was missing was the ability to run the clear from outside the UI or a predefined schedule.

At last this is now possible and starting with the 17.07 EPM Automate utility there is a new command called “clearcube” available.

The new command is extremely basic and will only run a clear job already defined in the UI. When I heard there was going to be a new command I was hoping it would be possible to set up the various clear options and even pass in the MDX using command parameters, but unfortunately that is not the case; maybe that will happen in the future, or maybe I am just asking for too much.

The command usage is:

epmautomate clearcube JOB_NAME

where JOB_NAME is the name of a job defined in the application.

Time for another example and back to the same form with data and a comment.

The clear data job has been named “Clear Demo” and created with the following options:

Using the EPM Automate utility is as simple as issuing the command:

epmautomate clearcube “Clear Demo”

The command completed successfully and checking the jobs section in the UI we can see the job has been run.

Clicking on the job provides details on the options that have been executed as part of the clear.

Going back to the form, the Essbase data and comments have been cleared.

As the command to clear data is available in the EPM Automate utility, it is also available through the REST API.

If you have ever used the REST API to execute a job then it will be very familiar, because it is the same resource with just a slightly different payload.

The REST resource requires a POST to the URL format:


The payload will be in JSON and should include:

jobType = CLEAR_CUBE

jobName = Clear cube job name

In my example this would equate to:

{"jobType":"CLEAR_CUBE","jobName":"Clear Demo"}
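To illustrate, here is a minimal Python sketch that builds the payload and posts it to the jobs resource using only the standard library; the instance URL and application name are placeholder assumptions, and the URL path follows the usual planning REST format rather than anything confirmed above.

```python
import base64
import json
import urllib.request

# Hypothetical instance details - replace with your own
INSTANCE = "https://example-pbcs.oraclecloud.com"
APP = "Demo"

def build_clear_cube_payload(job_name):
    """Build the JSON payload for a clear cube job."""
    return {"jobType": "CLEAR_CUBE", "jobName": job_name}

def run_clear_cube(job_name, user, password):
    """POST the payload to the jobs REST resource (assumed URL format)."""
    url = f"{INSTANCE}/HyperionPlanning/rest/v3/applications/{APP}/jobs"
    creds = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = urllib.request.Request(
        url,
        data=json.dumps(build_clear_cube_payload(job_name)).encode(),
        headers={"Authorization": f"Basic {creds}",
                 "Content-Type": "application/json"},
        method="POST")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

payload = build_clear_cube_payload("Clear Demo")
```

The payload is deliberately tiny; the job itself carries all the clear options, which is why the command and REST resource only need the job name.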

Before testing out the REST resource I updated the form to include supporting detail.

The clear cube job was updated to include supporting details.

Using a REST client I can make a POST to the resource with a payload to run the “Clear Demo” clear cube job.

The response includes much more information than when working with the EPM Automate utility; it includes details of the clear options that have been run as part of the job.

The response also includes a URL, so a GET request can be made to return information about the job; this is useful if the job is in a running state, as it can be repeatedly checked until it has completed.
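The polling loop can be sketched like this; `get_status` stands for any callable that performs the GET against the returned URL and extracts the status code, and I am assuming the usual EPM Cloud convention where -1 means the job is still running.

```python
import time

def wait_for_job(get_status, interval=5, timeout=300):
    """Poll a job status callable until it leaves the running state.

    get_status: callable returning the job status code, e.g. from a GET
    against the URL returned in the job submission response.
    Assumes -1 means in progress; any other value is terminal.
    """
    elapsed = 0
    while elapsed <= timeout:
        status = get_status()
        if status != -1:  # -1 = still running
            return status
        time.sleep(interval)
        elapsed += interval
    raise TimeoutError("job did not complete in time")
```

A stubbed status callable makes it easy to test the loop without hitting the cloud instance.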

The response contains status information and confirms the job completed successfully; a check of the form shows the data, comments and supporting detail have been removed.

Finally, we can turn to the scripting world. Before creating a script I updated the form to include an attachment.

The clear options for the job were updated to include attachments.

I created a simple script to make a POST to the jobs REST resource including in the payload the job type and job name.

The response confirms the job was successfully completed and a check of the form shows the data, comment, supporting details and attachment have been cleared.

Well that completes a look at the new clear cube functionality, hopefully you have found it useful, until next time.

Saturday, 1 July 2017

Change to Essbase forces auto shutdown of applications after being idle

I noticed a new configuration setting that appeared in the Patch Set Update for Essbase; the readme has the following under the defect fixes and documentation update sections:

New Configuration Setting SVRIdleTime

A new setting has been added that will allow an application to auto shutdown after staying idle for n minutes. The syntax of the new configuration setting is

SVRIdleTime n

where n is an integer value of 3 to 20160 (in minutes). The feature will be turned off if n is set to 0 (zero); the default is on and set to 15 minutes. (26050466)

So basically, once you install this patch, any Essbase application that has been idle for 15 minutes will automatically shut down. I can't say I am that impressed with the way this has been forced on as the default; there could be many reasons why you don't want your applications to shut down after being idle for 15 minutes, so surely it would have been better to have the default as off and enable it if needed.

Ok, you can change the idle time or disable the functionality by adding the setting to the cfg with a value of zero, but it is something you might not be aware of when your system is patched, until you suddenly notice applications shutting down.

Anyway, I have patched Essbase without adding the new cfg setting, so let us see if it is definitely being enforced.

[22:04:36 2017]Local/ESSBASE0///6264/Info(1056815)
Essbase 64-bit  - Release 11.1.2 (ESB11.

Start up everybody’s favourite Sample application and leave it idle.

[22:05:03 2017]Local/Sample///6876/Info(1002035)
Starting Essbase Server - Application [Sample]

Wait 15 minutes and check the application log again.

[22:20:14 2017]Local/Sample///8376/Info(1013437)
Unloading application [Sample] after staying idle for 15 minutes

[22:20:14 2017]Local/Sample///8376/Info(1013399)
Canceling current requests.

[22:20:14 2017]Local/Sample///8376/Info(1013207)

That confirms the new feature is being enforced by default.

Let us see if the default can be changed by adding the setting with a different value than the default.

SVRIdleTime 10

The Sample application was restarted and left idle.

[22:37:06 2017]Local/Sample///11352/Info(1013437)
Unloading application [Sample] after staying idle for 10 minutes

[22:37:06 2017]Local/Sample///11352/Info(1013399)
Canceling current requests.

[22:37:06 2017]Local/Sample///11352/Info(1013207)

The setting is being picked up and applied correctly; it also confirms that only the application, and not the Essbase agent, needs restarting after adding or changing the setting.

I did check to see if it is possible to define at application level.

SVRIdleTime Sample 5

After restarting the application, the following error was generated in the application log.

[23:02:52 2017]Local/Sample///9256/Warning(1002134)
Invalid or Obsolete Configuration Setting - essbase.cfg(73): "SVRIdleTime Sample 5"

As there is no mention of it in the readme, and because an error is being generated, it looks like it is not possible to define the idle time at application level.

I also tested the setting with a value of zero and it does appear to disable the auto shutdown on idle feature.

SVRIdleTime 0

So, something to be aware of if you patch Essbase with this PSU.

Sunday, 25 June 2017

EPM Cloud – Managing applications with Smart View continued

In the last post I went through in detail the ability to manage a PBCS application using Excel and Smart View. I wanted to cover what happens behind the scenes with Smart View and try to replicate the functionality, but I ran out of time, so here is a follow-up post.

I understand going into detail might not be for everybody, but in many ways it helps me to write down what I have discovered as I can refer back to it. I know it is a bit sad, but I do find myself looking back on posts I have written; what is worse is when I can’t even remember writing them :)

There are currently three main tasks that are available in Smart View for managing applications:
  • Create Application
  • Update Application
  • Delete Application
In order to achieve the create and update functionality the Excel workbook containing the template information is posted to the planning web application.

If you don’t already know, Smart View mainly operates by generating XML; the XML is compressed using GZIP and then posted to the web server, and a compressed response is received which is once again in XML format.

So let us take a look at what happens when the create new application option is selected in Smart View, a post is made to:


The important part is the XML that is included in the body of the post, an example being:

The root node of the XML defines that it is a request to create an application. The “sID” node is an identifier that is always posted with planning Smart View requests; it is basically a way of validating that the request is being made by an authenticated user. The SID is first generated when connecting to the planning Smart View provider, which I will go into in more detail later.

I think you will have already guessed what the “xlsFile” node is being used for, but the value probably doesn’t make much sense. I have not included the full value as it is large; it is a Base64 encoding of the original Excel template file.

If I decode the value then the binary version of the excel file is displayed.

The decoded text can be copied and then saved as a .xlsx file.

If the file is opened it is the exact same template file that is open when creating the application.

On a side note, an Excel file is just a set of XML documents saved as an archive, so it can be opened like any zip file; if I open the Excel file using 7zip it is possible to view the structure and XML documents in the file.
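This is easy to see programmatically; the sketch below builds a tiny zip archive in memory to stand in for a workbook and lists its contents, which is what a tool like 7zip shows for a real .xlsx file.

```python
import io
import zipfile

# Build a tiny stand-in archive; a real .xlsx contains entries such as
# [Content_Types].xml, xl/workbook.xml and xl/worksheets/sheet1.xml
buffer = io.BytesIO()
with zipfile.ZipFile(buffer, "w") as archive:
    archive.writestr("xl/workbook.xml", "<workbook/>")
    archive.writestr("xl/worksheets/sheet1.xml", "<worksheet/>")

# Reopen the archive and list the XML documents inside
with zipfile.ZipFile(buffer) as archive:
    names = archive.namelist()
```

The same `zipfile` approach works directly on a real workbook if you pass the .xlsx path instead of the in-memory buffer.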

Anyway, once the post has been made from Smart View to the planning web application server to create the application, after a while a response will be returned in XML format.

Smart View will then convert this XML and display a message to the user.

The update application functionality operates in a similar fashion; a post is made to the same URL.

This time the XML that is posted contains a root node that defines it is a request to edit the application.

The XML includes the application name and once again the SID and Excel template file.

After the post has been successfully made and the application updated an XML response will be returned.

The XML contains the message that is then displayed to the user.

You will not be shocked to find out that the delete application operates in the same way, a post is made to the same URL with XML in the body of the post.

The XML defines that it is a request to delete an application, this time there is no need to include the excel file so only the application name and SID are included in the XML.

Once the application has been deleted an XML response will be returned which includes the message that is displayed to the end user.

Now we know what Smart View is doing behind the scenes then it should be possible to replicate this functionality outside of Smart View.

The first problem is that when you log into Smart View, authentication is done through Oracle Access Manager and an authorisation cookie is created and included in Smart View requests; this would make replicating the functionality overcomplicated, but luckily there is a simpler method.

If you ever read my posts on load testing with the EPM Automate utility then you will know that Smart View requests can be made through REST calls, which allows the use of a basic authentication header in the requests, just like when using any of the cloud REST APIs.

In all the previous Smart View requests I have shown there has been a SID value; in order to obtain this we need to log into the cloud instance and generate an SSO token.

This can be achieved by making a GET request using the following URL format:


Like all the EPM Cloud REST API resources, the request requires a basic authorisation header, which is a Base64 encoding of the user’s cloud credentials.
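Building that header is a one-liner; the sketch below uses hypothetical credential values just to show the encoding.

```python
import base64

def basic_auth_header(user, password):
    """Base64 encode user:password for a Basic authorization header."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return {"Authorization": f"Basic {token}"}

# Hypothetical identity-domain style credentials
header = basic_auth_header("mydomain.john@example.com", "password")
```

The same header is reused on every subsequent request in place of the authorisation cookie Smart View would normally carry.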

The response contains JSON, which we are not actually interested in as there is no reference to the SSO token in it.

It is actually the header of the response which contains the SSO token we are looking for.

Now that we have the token a new request can be made to the planning Smart View URL to return the SID.

A new URL format can be used which allows you to create XML requests equivalent to those made from Smart View:


The following POST connects to the planning Smart View provider passing in the SSO token in the XML body of the request.

This is the same XML request that is sent from Smart View when connecting to the planning provider and the XML response contains the all-important SID.

This means the next request could be to replicate the Smart View functionality to either create, update or delete an application.

To demonstrate this I am going to put this all together in a PowerShell script which will update the security in a PBCS application using the security worksheet template.

Any of the applications access permissions can be updated using the security worksheet but to keep it simple I am just going to update the security for one member and form folder.

Before I get questioned: the script is not exactly how the final version looks; it has just been written this way to make it easier to go through.

The first section of the script encodes the user credentials to be used in the authorisation header of the REST request; a request is made to the same URL format shown earlier, and the SSO token is stored from the response header.

The next section replicates the Smart View request to return the SID by connecting to the planning provider and passing in the stored SSO token.

The final section reads in the security Excel template file, encodes it to Base64, creates the XML to update the application (inserting the application name and stored SID), and then makes the request to update the application.
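As a rough sketch of that final step in Python rather than PowerShell: the “sID” and “xlsFile” node names come from the captured requests shown earlier, but the root and application node names here are assumptions for illustration only.

```python
import base64
import xml.etree.ElementTree as ET

def build_update_request(app_name, sid, xlsx_bytes):
    """Assemble a Smart View style XML body to update an application.

    "sID" and "xlsFile" match the captured Smart View requests; the
    root node and app node names are hypothetical placeholders.
    """
    root = ET.Element("req_EditApplication")  # assumed root node name
    ET.SubElement(root, "app").text = app_name  # assumed node name
    ET.SubElement(root, "sID").text = sid
    ET.SubElement(root, "xlsFile").text = base64.b64encode(xlsx_bytes).decode()
    return ET.tostring(root, encoding="unicode")

body = build_update_request("Demo", "dummy-sid", b"workbook bytes")
```

The real XML would be posted to the Smart View URL with the basic authorisation header, exactly as the PowerShell script does.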

The access permissions were then verified for the product dimension member.

The security on the administration form folder was also successfully updated.

I could have taken the script a step further and started with a text file, converted it to Excel and then updated the application, but I think you get the idea.

Until next time…..