Author Archives: Greeny Dangerous

TM1: List all subsets per dimension and their assigned aliases


I was a bit amazed to find out that TM1 doesn’t seem to have any built-in functions for fetching all the subsets inside a dimension, and it certainly has no way to tell which alias is currently applied to a subset. Subsets are saved as separate files inside the datafiles directory, so I needed a little PowerShell magic to fetch the subsets and their corresponding alias names myself.

Inspired by the script “Listing all Subsets per Dimension in an IBM Cognos TM1 Server” on IBM developerWorks, I wanted to push it a little further. I also switched from TI scripting to a PowerShell script because I needed to look inside the files’ contents instead of just iterating through the filenames.

The thing is, inside the .sub files there is a special line starting with “274,”, and after that comes the name of the alias that is applied to the subset. The real reason why I wanted a list of the aliases applied to each subset is described in the next post, but for now here comes the PowerShell script (it must be executed inside the datafiles folder of your TM1 server):

# Run inside the TM1 server's datafiles folder. For each subset (.sub) file
# under a "}subs" dimension subfolder, pick the "274," line that carries the
# applied alias, filter out system folders, and write
# "dimension;subset;alias;path" rows to a CSV file.
Get-ChildItem -Recurse -Include *.sub |
  Where-Object { $_.Directory -match "\w*}subs\b" } |
  Select-String -Pattern "274," |
  Select-Object Path, Filename, Line |
  Where-Object { $_.Line -notmatch ",$" -and $_.Path -notmatch "datafiles\\}" -and
    $_.Path -notmatch "datafiles\\tunit" } |
  ForEach-Object { $_.Path.Split('\')[-2].Split('}')[0] + ";" +
    $_.Filename.Split('.')[0] + ";" + $_.Line.Split(',')[1] + ";" + $_.Path } |
  Set-Content "D:\temp\TM1_Dimension_Subset_Alias.csv"

What it does is basically the following:

  • recursively go through all the files in the datafiles folder that are located in subfolders containing “}subs” in their name
  • from those files, fetch all the lines that match the pattern “274,” (the alias name; see the sample line after this list)
  • filter out unnecessary system folders
  • finally, form nice CSV-formatted content containing the dimension name, subset name, alias name and the full path to the subset file
  • write the results to the output CSV file
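
For reference, the line being matched inside a .sub file looks roughly like this (an illustrative fragment, not a complete subset file; here the applied alias would be “Description”):

274,Description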

Example of the contents of the output file:

dimName1;subsetNameA;aliasname;c:\tm1server\datafiles\dimName1}subs\subsetNameA.sub

There you go! Have fun powershelling your TM1 subsets!

-Greeny


SQL Server 2014 CTP1 In-memory OLTP issues with VirtualBox

I just started getting acquainted with the brand new SQL Server 2014 on my VirtualBox virtual machine. The first thing to do with SQL Server 2014 is, of course, to test out the new in-memory OLTP functionality. So I did, and immediately faced challenges. I followed the SQL Server team blog instructions and noticed that the “MEMORY OPTIMIZED DATA” section in the Filegroup properties was missing completely. I tried to run the scripts provided and got this error message:

Msg 41342, Level 15, State 1, Line 5

The model of the processor on the system does not support creating filegroups with MEMORY_OPTIMIZED_DATA. This error typically occurs with older processors. See SQL Server Books Online for information on supported models.

It looks like I wasn’t the only one having this issue, because it turned out to have something to do with VirtualBox’s default settings for a new virtual machine. So, if you want to get the new in-memory OLTP functionality working correctly with VirtualBox, you need to run the following command with VBoxManage and then restart the virtual machine:

VBoxManage setextradata [vmname] VBoxInternal/CPUM/CMPXCHG16B 1
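
After that, the filegroup scripts should run. As a reference point, here is a minimal T-SQL sketch of creating a MEMORY_OPTIMIZED_DATA filegroup (database and path names are made up, not taken from the team blog post):

-- Illustrative names; run against the SQL Server 2014 CTP1 instance.
CREATE DATABASE ImOltpDemo;
GO

-- The memory-optimized filegroup that previously failed with Msg 41342.
ALTER DATABASE ImOltpDemo
    ADD FILEGROUP ImOltpDemo_mod CONTAINS MEMORY_OPTIMIZED_DATA;
GO

-- A container (a directory, not a file) for the memory-optimized data.
ALTER DATABASE ImOltpDemo
    ADD FILE (NAME = 'ImOltpDemo_mod', FILENAME = 'C:\Data\ImOltpDemo_mod')
    TO FILEGROUP ImOltpDemo_mod;
GO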


TM1 Performance Modeler – Export hits OutOfMemoryError

I was planning to do the very first deployment from development to production in a new customer environment. Using the Transfer Out feature of TM1 Performance Modeler it should have been straightforward, but right after the transfer started I encountered a nasty error message:

Transfer has failed due to the following error(s):

ERROR

The transfer failed.
java.lang.OutOfMemoryError

OutOfMemoryError

What a nice way to start a production deployment. The server had gigabytes of free memory, but somehow an insufficient amount of it was available to Performance Modeler. Google to the rescue, and I found the following IBM Support page: http://www-01.ibm.com/support/docview.wss?uid=swg21600719

It turns out there’s yet another configuration file, called ModelingMDT.ini, that should be adjusted. It’s located in the following directory:

C:\Users\<USERNAME>\AppData\Roaming\IBM\Cognos Performance Modeler\bins\bin_10.1.XXXX.X
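
Inside it, the maximum Java heap available to Performance Modeler is set with a standard JVM argument; the fix is a one-line change (shown in isolation below, the rest of the file stays untouched):

-Xmx1024M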

I increased the default -Xmx512M to -Xmx1024M, and voilà: the transfer out succeeded this time. Our model is pretty small at the moment, but it still doesn’t fit in the default 512 MB while transferring the objects. I think that’s a pretty small value for a default. What do you think?


MDS 2012: Collections not supported in the new staging process

There were lots of good improvements and new features in the MDS 2012 version, but during our current project I found out some interesting things about collections. According to Microsoft they have made improvements to the collections GUI (which was indeed crap in MDS 2008 R2), but for some reason they didn’t improve collections importing via staging tables along with the release of the completely new staging process.

I was searching for collection-specific staging tables and, confusingly, found none. Then I came across the following article: Discontinued Master Data Services Features in SQL Server 2012. There it’s clearly stated that:

You cannot use the new staging process to:

  • Create or delete collections.
  • Add members to or remove members from collections.
  • Reactivate members and collections.

You can use the SQL Server 2008 R2 staging process to work with collections.

So it seems they ran out of time to create a new staging process for collections and decided to keep supporting collections import the “2008 R2 way”. I bet this will be deprecated in the next version, and hopefully they’ll introduce a new staging process for collections in the next SQL Server release. Until then I’ll stick to using the old staging tables and the good old mdm.udpStagingSweep stored procedure 🙂
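
For reference, a minimal sketch of that 2008 R2-style path, with made-up model, entity and member names (MemberType_ID = 3 marks a collection in the old mdm.tblStgMember table, and the sweep parameters are illustrative, so check them against your environment):

-- Stage a new collection the pre-2012 way (illustrative names).
INSERT INTO mdm.tblStgMember (ModelName, EntityName, MemberType_ID, MemberName, MemberCode)
VALUES (N'Customer', N'Customer', 3, N'Key Accounts', N'COLL_KEY_ACCOUNTS');

-- Kick off the old staging sweep; user and version ids depend on your environment.
EXEC mdm.udpStagingSweep @UserId = 1, @Version_ID = 1, @Process = 1;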


SSIS: Adding timestamp into filename

Every once in a while you need to post-process your source files, for example by moving them into an archive directory. In such a directory you might end up with dozens of files, so you need to rename them as well. The most common way to achieve a robust and clear file renaming scheme is to add a timestamp to the filename.

In SSIS (SQL Server Integration Services) there is no sophisticated way (read: a nice string/date function) to do this, but by leveraging expressions and string manipulation functions you can build one, like the one I’ve used:

(DT_WSTR, 4) (DATEPART( "yyyy", @[System::StartTime]  )) +
RIGHT("0" + (DT_WSTR, 2) (DATEPART( "mm", @[System::StartTime]  )), 2) +
RIGHT("0" + (DT_WSTR, 2) (DATEPART( "dd", @[System::StartTime]  )), 2) +
"_" +
RIGHT("0" + (DT_WSTR, 2) (DATEPART( "Hh", @[System::StartTime]  )), 2) +
RIGHT("0" + (DT_WSTR, 2) (DATEPART( "mi", @[System::StartTime]  )), 2) +
RIGHT("0" + (DT_WSTR, 2) (DATEPART( "ss", @[System::StartTime]  )), 2)

Now just add this expression as a prefix or suffix to your processed file name and you end up with a nice and clean result like: sourcefile_20130412_082024.csv
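
For example, assuming the timestamp expression above is stored in a variable called @[User::Timestamp] and the archive folder in @[User::ArchiveDir] (both hypothetical names), the destination path for a File System Task could be built with another expression:

@[User::ArchiveDir] + "\\sourcefile_" + @[User::Timestamp] + ".csv"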


Achilles’ heel of TM1: Daylight savings time (DST)

Once again it’s the time of year (March 31st) when we shift to summer time, or daylight saving time (DST). That’s great: the days are getting longer and you start feeling that summer really is near. The bad thing is if you happen to have a TM1 production environment somewhere with chores scheduled to execute at a specific point in time. The thing is: TM1 is unable to handle daylight saving time correctly. It still amazes me how this is possible. In other words, how come they haven’t made any fixes to correct it? Maybe it’s just buried deep, deep inside the legacy business logic or something. Anyhow, the point is that you’re on your own and have to deal with this issue yourself.

So, after Easter Monday, remember to go through the chore schedules in your TM1 environment and reschedule them to start at the correct time.



Speaking at Microsoft TechDays 2013 in Helsinki


I will be speaking at the Microsoft TechDays 2013 conference on March 6th. At Solita we have leveraged MDS in many DW/BI projects to tackle the analytical master data management challenges our customers have encountered. In my session I will show how to use MDS for analytical master data management by going through examples of the solutions we have created, including

  • using MDS as a centralized repository for analytical master data (dimension data), its maintenance and change management
  • enriching master data entities into complete dimension data that fully satisfies the reporting and analysis needs of the business units
  • harmonizing and combining master data from multiple sources to achieve robust, accurate and consistent dimension data

The full agenda for the two-day event can be found here: http://www.techdays.fi/fi/ohjelmaopas

See you in March!


MDS auto-generated Code value – how to import new members via staging tables?

Recently I had to set up an entity that needed a unique code across the enterprise. In the SQL Server 2012 version of MDS this is achieved quite easily, just by checking the “Create Code values automatically” checkbox when creating the entity. In the 2008 R2 version we have to leverage the business rules feature. Creating the business rule is straightforward, but the problems start when it comes to importing data via the staging tables.

Action "defaults to a generated value" comes handy.

Action “defaults to a generated value” comes handy.

First, create a new business rule, choose the “defaults to a generated value” action and drag the Code attribute into the “Edit action” section. Starting the incremental generation from number 1 is fine. Now we are ready. At least, if we only use the MDS web user interface to add new members to our entity. In my case I need to import the data from the source system(s), and therefore I need to use the staging tables.

Importing new members with auto-generated codes via the staging tables

First try

One cannot insert NULL values into the MemberCode column.

First of all, how the hell am I supposed to insert any new members into the staging table when the MemberCode column is not nullable? Luckily I found an MSDN forum thread where I was told to assign the legendary empty string (“”) as the MemberCode value. OK, that works out well, but the problem is that I also want to import some additional attributes (other than the mandatory Name and Code attributes) via the tblStgMemberAttribute table, and there I need the unique MemberCode that is assigned only after the model is validated or the business rules are executed. How am I supposed to assign the correct attributes to the correct members if I don’t know the unique MemberCode? Now what?
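
For clarity, this is what that first attempt looks like in T-SQL, with illustrative model and entity names (MemberType_ID = 1 denotes a leaf member in the 2008 R2 staging tables):

-- Stage a new member with an empty string as its code; MDS generates the real one later.
INSERT INTO mdm.tblStgMember (ModelName, EntityName, MemberType_ID, MemberName, MemberCode)
VALUES (N'Product', N'Product', 1, N'New product', N'');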

How to import additional attribute values if we don’t know the Code value when importing?

Second try

We’re now facing a bit of a chicken-and-egg problem here. Luckily I came across another blog post that handled this very same issue. The trick is quite simple, now that I think about it afterwards. The point is that when new member records have been imported into MDS with an empty string as their MemberCode value, MDS assigns a special system GUID to the newly imported members:

Special "#SYS-"-prefixed GUID Codes are assigned to members that were imported with empty string as membercode.

Special “#SYS-“-prefixed GUID Codes are assigned to members that were imported with empty string as membercode.

These “#SYS-”-prefixed GUID values are only temporary: when the business rules are executed, MDS replaces them with the value we defined in the business rule. So how does this help us, since the #SYS values are still assigned AFTER the import, and we need them BEFORE we import the rows via the tblStgMemberAttribute table? We simply emulate MDS’s internal process by assigning the system GUIDs ourselves already in the loading phase, generating them for example with the newid() function in SQL Server.

Generating the #SYS GUID on our own with a newid() function call.

Now, by assigning the system GUID ourselves in both the tblStgMember and tblStgMemberAttribute tables, we can successfully import new members with additional attributes and get the correct auto-generated code values by validating the model.
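
A minimal T-SQL sketch of the whole trick, again with illustrative names (the “#SYS-” prefix mimics what MDS generates, and the column lists follow the 2008 R2 staging tables):

-- Generate the temporary system code ourselves so it can be reused across both tables.
DECLARE @SysCode NVARCHAR(250) = N'#SYS-' + CAST(NEWID() AS NVARCHAR(36));

-- Stage the member itself with the pre-generated system code.
INSERT INTO mdm.tblStgMember (ModelName, EntityName, MemberType_ID, MemberName, MemberCode)
VALUES (N'Product', N'Product', 1, N'New product', @SysCode);

-- Stage an additional attribute value using the same code.
INSERT INTO mdm.tblStgMemberAttribute (ModelName, EntityName, MemberType_ID, MemberCode, AttributeName, AttributeValue)
VALUES (N'Product', N'Product', 1, @SysCode, N'Color', N'Red');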


TM1 9.5.2 crash course: Beware of the LONG-function

During the last few weeks we faced some serious problems with one of our TM1 production servers. The production server instance kept crashing on a daily basis, without giving any clear reason why. We set up surveillance using Windows Performance Monitor and collected some system data, but found nothing suspicious there either. The only clue was the last few lines in tm1server.log just before the server crashed:

tm1server.log just before the crash

So, instead of trying to find a needle in a haystack, we went through the rules of the }ElementAttributes cubes mentioned in those log lines and found that three of them used the built-in, safe-sounding LONG function. At the time of development it may have been a good idea, but nowadays, at least with version 9.5.2, it is definitely a no-go: PM38586: LONG FUNCTION IN TI CAUSES TM1-SERVER HANG/INSTABILITY

IBM’s recommendation is to upgrade to version 10.1.0, but that was not an option for us at the moment, so we needed a workaround. Rewriting the rules to avoid the LONG function seems to have done the trick: the production server is stable again and no longer crashes. So, my strong recommendation is not to use the LONG function in rules if you are running TM1 version 9.5.2 or older.