Posts

Optimizing Report Scripts

Yes, who uses report scripts now? But sometimes that is still the best option. I am not going to get into the debate over which is better: Report Scripts, Calc Exports or MDX. This post concentrates only on the best you can do with report scripts. I myself faced a challenge in which a report script was taking a couple of hours to export the data, and we had to bring that down as far as possible. Here are a few observations which I would like to share, and which I hope will save our fellows' time. Following them I was able to cut the time down to roughly 1/100th of the original. Here are a few stages where we can optimize reports: 1. Pre Report Creation Optimization, 2. Formatting Optimization, 3. Member Selection and Grouping Optimization, 4. Report Specific Optimization. Before the report is extracted by the report extractor and displayed by the report viewer, we can make some updates to the application; this is pre report creation optimization and it is common for all the…
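
To make the formatting and member-selection ideas concrete, here is a minimal export-style report script sketch. The dimensions and members ("Scenario", "Version", "Period", "Entity", "Account", "Actual", "Working", the months) are placeholders, and the exact set of suppression commands you need depends on your report:

    // Export-oriented sketch: strip formatting, tab-delimit, skip #Missing rows
    {TABDELIMIT}
    {ROWREPEAT}
    {NOINDENTGEN}
    {SUPMISSINGROWS}
    {SUPFEED}
    {SUPBRACKETS}
    {SUPCOMMAS}
    <SPARSE
    <PAGE ("Scenario", "Version")
    <COLUMN ("Period")
    <ROW ("Entity", "Account")
    "Actual" "Working"
    <DIMBOTTOM "Entity"
    <DIMBOTTOM "Account"
    "Jan" "Feb" "Mar"
    !

Keeping the dense dimension on the columns, the sparse dimensions on the rows, and suppressing every bit of formatting you do not need is usually where the quick wins come from.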

Unable to Start OPMN and Eventually Essbase

I have been the victim of this a couple of times: everything was installed and configured properly without issues, yet when you try to start OPMN it fails with the following error message: Starting opmn and all managed processes... opmnctl startall: opmn failed to start. Error --> Process (index=X,uid=X,pid=XXXX) failed to start a managed process after the maximum retry limit. If you have something in the logs, you can find the root cause and rectify the issue. But in some cases the worst part is that you won't find anything in the logs that points you in the right direction. If you look at the Essbase logs, they just loop, repeating the same information over and over, with no error in them. One of the reasons: whenever you try to start OPMN it tries to bind using the loopback address, so when the loopback address is not mapped to the hostname, OPMN fails to start. Here is a simple solution: you need to make sure there is an entry mapping 127.0.0.1 to the hostname…
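
For reference, a hosts file along these lines is what OPMN expects; the hostname epm-server and the IP address are placeholders for your machine's actual values:

    # C:\Windows\System32\drivers\etc\hosts  (or /etc/hosts on Unix)
    # "epm-server" / "epm-server.example.com" stand in for the real hostname
    127.0.0.1       localhost
    127.0.0.1       epm-server.example.com    epm-server
    192.168.1.10    epm-server.example.com    epm-server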

How to Delete all UDAs, Formulas, Aliases, Attribute Association etc. at one go!

There are times when we need to delete all the Aliases, Descriptions, Formulas, UDAs, Smart Lists, Attribute Bindings, etc. from an existing Planning application. All of these requirements are achievable, and if you want to do it in one shot, one of the options is our good old Outline Load utility. In such scenarios "<NONE>" is your best companion! While loading members, all we need to do is add <NONE> in the required column; this will remove all the existing associations and values. Let's take a scenario: as a last-minute requirement your client needs cross-tab reporting, and your solution architect has decided to use Attributes instead of UDAs; as a repercussion you need to delete all the UDAs, change all the calcs, etc. Now, in order to delete all the UDAs you can: 1. Go to each member and delete the UDA, 2. Go to the relational database and make some unsupported background changes, 3. Use LCM, 4. Use the Outline Load utility. Here, we will be discussing…
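
As a rough sketch (the application name PLANAPP, the Entity members and the exact property columns are made up for illustration, and the command-line switches can vary between releases), the load file and the OutlineLoad call could look like this:

    Entity, Parent, Alias: Default, UDA, Formula
    E1, Total Entity, <NONE>, <NONE>, <NONE>
    E2, Total Entity, <NONE>, <NONE>, <NONE>

    OutlineLoad.cmd /A:PLANAPP /U:admin /D:Entity /I:entity_clear.csv /L:entity_clear.log /X:entity_clear.exc

Any property column that contains <NONE> gets its existing value or association removed for that member, which is exactly the behaviour this post relies on.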

FIX, REMOVE and CLEARBLOCK, You may end up with No data ;)

The FIX…ENDFIX command block restricts database calculations to a subset of the database. All commands nested between the FIX and ENDFIX statements are restricted to the specified database subset. What I know about FIX is that it takes the union of the members which appear in different FIXes or (hypothetically) are repeated in the same FIX, so even if you write FIX("E1","E1","E1") it will work on "E1" just once, not thrice. Say you have a fully populated cube and you need to delete the data for one Entity (say E1); the fastest thing you will probably write is: FIX(@REMOVE(@RELATIVE("Entity",0), "E1")) CLEARBLOCK ALL; ENDFIX. Yes, it's done! This will remove the data for all the level 0 entities but not for E1. This will happen assuming there is no alternate hierarchy. ------------------------------------------------------------------------------------------------------------ Let's go into detail: the…
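
Written out as a complete calc script (same members as above; treat it as a sketch and sanity-check the member set that @REMOVE returns before running it against a populated cube, especially when shared members are involved):

    /* Clear data for every level-0 Entity except "E1".
       As noted above, this behaves as expected only when there is
       no alternate hierarchy in the Entity dimension. */
    FIX (@REMOVE(@RELATIVE("Entity", 0), "E1"))
        CLEARBLOCK ALL;
    ENDFIX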

Deep Dive into @XREF: From Basics to Pro! What IFs & Can I....

Most of the time we have to refer to data from one cube in another. There are many ways in which you can transfer or reference data from a cube; a few of the most popular are: 1. Create a partition, 2. Create an XREF, 3. Create an export and import script, etc. Each has its own pros and cons. Today our discussion will focus on XREF. Trust me, if you are working with Planning applications you are dealing with XREFs all the time: if a Planning application has multiple plan types, Planning automatically creates the XREFs between cubes for the members. Say we have a Planning application with two plan types, REV and INCSTMT, and a stored member "OtherRevs". Let's say this member has REV as its source plan type and it is enabled for both plan types. In this case Planning will create the member in both cubes; however, in the REV cube it will be a stored member, while in INCSTMT it will be a dynamic calc member with an auto-generated XREF. The…
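
For illustration, the dynamic calc copy of "OtherRevs" in the INCSTMT cube ends up with a member formula along these lines; the location alias name used here is a placeholder, since Planning generates its own:

    /* Member formula on "OtherRevs" in the INCSTMT plan type (sketch).
       "RevCube_Alias" is an assumed location alias pointing at the REV cube. */
    @XREF("RevCube_Alias", "OtherRevs");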

O2B: Creating a Trigger: How many times is my DB refreshed from the Planning interface?

How many times is my DB refreshed from the Planning interface? In order to answer this question you have to deploy O2B, create one additional table and a trigger on the Planning database. Here I am taking an example of creating the trigger on MS SQL Server; the same can be done on an Oracle DB too. Create an additional table with the name HSP_LOCK_O2B; this table should have the same set of columns as HSP_LOCK plus an additional one for the timestamp, named DateAndTime. You can create the following trigger on the HSP_LOCK table: CREATE TRIGGER For_DB_Refresh_delete ON HSP_LOCK FOR INSERT AS declare @Object_Id int; declare @Session_Id int; declare @User_Id int; declare @Server_Id int; select @Object_Id=i.Object_Id from inserted i; select @Session_Id=i.Session_ID from inserted i; select @User_Id=i.User_ID from inserted i; select @Server_Id=i.server_id from inserted i; -- Insert rec…
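
The excerpt cuts off at the INSERT, but the idea is simply to copy the incoming row into the audit table with a timestamp. A sketch of the whole trigger, assuming HSP_LOCK_O2B mirrors the columns picked up above plus DateAndTime, would be:

    -- T-SQL sketch; column names of HSP_LOCK_O2B are assumed, and the
    -- variable-based capture (as in the original) assumes single-row inserts.
    CREATE TRIGGER For_DB_Refresh_delete ON HSP_LOCK
    FOR INSERT
    AS
    BEGIN
        DECLARE @Object_Id  INT;
        DECLARE @Session_Id INT;
        DECLARE @User_Id    INT;
        DECLARE @Server_Id  INT;

        SELECT @Object_Id  = i.Object_Id  FROM inserted i;
        SELECT @Session_Id = i.Session_ID FROM inserted i;
        SELECT @User_Id    = i.User_ID    FROM inserted i;
        SELECT @Server_Id  = i.server_id  FROM inserted i;

        -- Insert the captured row into the audit table with the current time
        INSERT INTO HSP_LOCK_O2B (Object_Id, Session_ID, User_ID, server_id, DateAndTime)
        VALUES (@Object_Id, @Session_Id, @User_Id, @Server_Id, GETDATE());
    END;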

O2B Deployment on Apache Tomcat

No, I don't want to touch my existing up-and-running EPM environment, but I want to explore O2B; what should I do? You can deploy it on any machine, so why not use Apache Tomcat for that? 1. Download Apache Tomcat for Windows, 32 or 64 bit. Apache Tomcat can be downloaded from the URL below for both 32-bit and 64-bit Windows: http://tomcat.apache.org/download-70.cgi. You can choose: 32-bit/64-bit Windows Service Installer. 2. Install Apache Tomcat: after the download finishes, run the Apache setup. Select the Service Startup and Native options under Tomcat, and then click the Next button. By default the server shutdown port is 8005 and the HTTP/1.1 Connector port is 8080; both can be changed. We can also give a user name and password while installing Apache Tomcat, but that is optional. Select the path for the Java Virtual Machine; the path should be that of a JRE, and if the JRE or JDK…
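
If those default ports clash with anything else on the box, they live in conf/server.xml under the Tomcat install. The snippet below shows the two settings mentioned above, with the Tomcat 7 default values (other elements omitted):

    <!-- conf/server.xml: shutdown port and HTTP/1.1 connector port -->
    <Server port="8005" shutdown="SHUTDOWN">
      ...
      <Connector port="8080" protocol="HTTP/1.1"
                 connectionTimeout="20000"
                 redirectPort="8443" />
      ...
    </Server>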