Scheduled backup solution for TM1 with 7zbackup


Working with in-memory applications like TM1, it’s essential to have a working backup strategy. TM1 doesn’t provide any built-in functionality for this, but with a little help from 3rd party tools you can easily build a lightweight, working backup solution. What you need is 7zbackup, 7-Zip 9.20 and Windows PowerShell 2.0.

Getting started, step 1: Modify the “selection file”

(Notice: I personally needed to go back to version 7zbackup 1.8.5 stable to get the script running successfully.)

After installing the required applications, start by modifying the 7zbackup “selection” config file. That’s the file that contains the configuration needed to run through the backup process. There are many different parameters you can modify, but let’s just go through the most important ones.

includesource=<path_to_your_tm1_datafiles_folder>\datafiles|alias=PROD

Here you define what to back up. The datafiles folder is the heart of TM1, so you need to back it up entirely. If you want, you can also include the logfiles folder, but it’s not necessary.

rotate=60

As the comment line says, this variable holds the number of historical archive backups you want to keep on the target media. I tend to play it safe, and there’s plenty of disk space available nowadays, so two months of rotation should be enough. Just remember to copy the actual backup archives to a remote location periodically so you also have a real disaster recovery option.

There are other parameters you may want to configure, but this will do for now.
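
Putting the two directives together, a minimal selection file would contain just these lines (a sketch; everything else is left at its defaults):

includesource=<path_to_your_tm1_datafiles_folder>\datafiles|alias=PROD
rotate=60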

Step 2: Set up a batch file to run the PowerShell script

Next we set up a classic bat file so the PowerShell script can be executed from the Windows Task Scheduler. Here are the contents of the bat file:


@ECHO off
REM Grab a date stamp from PowerShell for the log file name.
REM (The original line called powershell.exe with an empty command; Get-Date is assumed here.)
for /f "tokens=1" %%a in ('powershell.exe -Command "Get-Date -Format yyyyMMdd"') do set WD=%%a
powershell.exe -Noninteractive -Noprofile -Command "&{<path_to_your_7zbackup_folder>\7zBackup.ps1 --type full --selection <full_path_to_your_selection_file> --destpath <path_to_your_backup_destination_folder> --logfile <path_to_your_7zbackup_logging_folder>\daily_backup_%WD%.txt}"

Basically, it runs the 7zBackup.ps1 script, telling it to take a full backup, use the settings from the selection file, put the zipped backup in the given destination folder, and write 7zBackup’s own log to the given directory with a timestamped file name.

If you haven’t configured the script execution policy you may encounter this error message:

File 7zBackup.ps1 cannot be loaded because the execution of scripts is disabled on this system. Please see "get-help about_signing" for more details.

That means you need to set the execution policy to allow executing custom scripts. More information here.
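
One way to allow locally created scripts is to run the following from an elevated PowerShell prompt (RemoteSigned is just one reasonable choice; use whatever policy your environment allows):

Set-ExecutionPolicy RemoteSigned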

Step 3: Schedule the backup process with Windows Scheduler

The last thing to do is to schedule the backup process on a daily basis (or more often if you like). There are 5 tasks to set up (a sketch of the scheduled sequence follows the list):

  1. Run SaveDataAll() on the TM1 server to make sure that all the data is written to disk before the backup starts
  2. Shutdown the TM1 server: NET STOP <your_tm1_server_service_name>
  3. Execute the backup bat file (created in step 2)
  4. Start the TM1 server again: NET START <your_tm1_server_service_name>
  5. (OPTIONAL): Copy the newly created backup zip file from destination folder to your remote backup storage location.
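
Here is a minimal batch sketch of what the scheduled job could run for steps 2–5 (service names and paths are placeholders; step 1, SaveDataAll(), is assumed to run as a TM1 chore scheduled just before this job):

@ECHO off
NET STOP <your_tm1_server_service_name>
CALL <path_to_your_scripts_folder>\tm1_backup.bat
NET START <your_tm1_server_service_name>
REM Optional step 5: copy archives created today to a remote share
ROBOCOPY <path_to_your_backup_destination_folder> \\remoteserver\backupshare /MAXAGE:1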

Here we are! Happy backing up!

-GD


Configuring SharePoint 2013 with Business Intelligence features


Installing SharePoint 2013 and SQL Server 2012 is quite straightforward. Configuring them and enabling all the nice Business Intelligence features from scratch is a different thing. I recently took a ride on that Via Dolorosa and completed the whole process once again. Here is what you should remember when going through it yourself.

Chapter I: Prerequisites

Before the whole installation process can begin, one must install a bunch of SharePoint 2013 installation prerequisites. There is a complete list of prerequisites in a TechNet Blog article, so I won’t go through them in detail here. What you should do is download all of them, place them in one directory, e.g. E:\sp2013_pre, and run the following command:


D:\prerequisiteinstaller.exe /SQLNCLi:E:\sp2013_pre\sqlncli.msi /PowerShell:E:\sp2013_pre\Windows6.1-KB2506143-x64.msu /NETFX:E:\sp2013_pre\dotNetFx45_Full_x86_x64.exe /IDFX:E:\sp2013_pre\Windows6.1-KB974405-x64.msu /Sync:E:\sp2013_pre\Synchronization.msi /AppFabric:E:\sp2013_pre\WindowsServerAppFabricSetup_x64.exe /IDFX11:E:\sp2013_pre\MicrosoftIdentityExtensions-64.msi /MSIPCClient:E:\sp2013_pre\setup_msipc_x64.msi /WCFDataServices:E:\sp2013_pre\WcfDataServices.exe /KB2671763:E:\sp2013_pre\AppFabric1.1-RTM-KB2671763-x64-ENU.exe

Pretty much all of the prerequisites can be installed manually, but the AppFabric installation and configuration is error-prone, so it’s better to leave it (and all of the prerequisites) to the built-in prerequisite installer.

NOTE! The built-in prerequisite installer wants the SQL Server 2008 R2 Native Client by default. I was using the latest SQL Server 2012, so I tricked the installer and replaced sqlncli.msi with the SQL Server 2012 Native Client installer (same filename, different driver).

Chapter II: Installation and SharePoint Products Configuration Wizard

Before installing, make sure that you have some domain accounts available for the various service accounts used by SharePoint. Then it’s pretty much “next-next-next”: just run through the installer, and at the end of the installation follow the instructions and start the SharePoint Products Configuration Wizard.

First we create a new server farm, then specify the database settings for the configuration database and a domain username to access the database, and finally give a secure farm passphrase.

The last thing to do is choose the port for the SharePoint Central Administration Web Application; if you need a specific one, now is the time to set it. For security we go with NTLM.

Chapter III: Configuring the farm in SharePoint Central Administration Web Application

After the Configuration Wizard finishes, it’s time to use the SharePoint Central Administration Web Application.

First we start the Farm Configuration Wizard and create a dedicated domain user to be used as the Service Account.

Then it’s time to choose which service applications you want to run in your SharePoint Farm. Make sure that you check at least the following services:

  • Excel Services Application
  • PerformancePoint Service Application
  • Secure Store Service


As the wizard states, the actual configuration phase “shouldn’t take long”, but when installing to a virtual machine without adequate resources it can take quite a while.


Chapter IV: Create new (Business Intelligence) site collection

After the Farm Configuration Wizard finishes, it’s time to create our first site collection. In the create site collection wizard you must provide the name for the site, the URL and the template you’d like to use with your site. As we are creating Business Intelligence solutions, we choose “Business Intelligence Center” (under the Enterprise tab).


Now we have our site in place, and the next thing to do is install the SQL Server PowerPivot for SharePoint instance.

Chapter V: Install SQL Server PowerPivot for SharePoint instance

For storing user-created Excel PowerPivot documents we need a separate SQL Server Analysis Services instance installed in PowerPivot for SharePoint mode. Previously this needed to be installed inside the SharePoint farm, but with SharePoint 2013 it doesn’t matter where it is installed.


Just run the SQL Server installer, choose “Perform a new installation of SQL Server 2012” and on the “Setup Role” page choose “SQL Server PowerPivot for SharePoint”.


The rest of the installation is straightforward. Just remember to create and use a dedicated user account for the Analysis Services service.


Chapter VI: Manage Excel Services data model settings

The next thing to do is configure the Excel Services settings. Just add the name of the previously installed SQL Server Analysis Services (PowerPivot for SharePoint) instance in the Manage Service Application section. Follow the detailed instructions found at

http://technet.microsoft.com/en-us/library/jj219780.aspx

Chapter VII: Enable Claims to Windows Token Service

If you now try creating an Excel spreadsheet and uploading it to a SharePoint document library, you may encounter this error message: External Data Refresh Failed.


That is because we haven’t configured the Claims to Windows Token service yet. To achieve that follow the instructions found here:

http://technet.microsoft.com/en-us/library/ff487975.aspx
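
If you prefer PowerShell over clicking through Central Administration, the service instance can also be started from the SharePoint Management Shell. This is only a sketch; verify the exact TypeName string on your farm:

Get-SPServiceInstance | Where-Object { $_.TypeName -like "*Claims to Windows Token*" } | Start-SPServiceInstance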


Chapter VIII: Configure EffectiveUserName with Excel Services

We are almost there. The next thing to do is enable the usage of EffectiveUserName with Excel Services. Briefly, the steps are:

  • Enable the EffectiveUserName property check box in Manage Service Applications
  • Add the Excel Services application user account to the administrators of the SQL Server Analysis Services instance (via SQL Server Management Studio and the Analysis Server properties)

After that, the data refresh should work properly!

For detailed instructions see the following TechNet article: http://technet.microsoft.com/en-us/library/jj938164.aspx

Chapter IX: Deploy the PowerPivot for SharePoint 2013 add-in

To enable server-side data refresh processing, collaboration, and management support for PowerPivot workbooks, download and install Microsoft SQL Server 2012 SP1 PowerPivot for Microsoft SharePoint from here.

Chapter X: Configure PowerPivot and Deploy Solutions

To get the PowerPivot Gallery, Schedule data refresh, Management Dashboard, and data provider features working properly, we next run the PowerPivot for SharePoint 2013 Configuration tool.

Follow the detailed instructions here: Configure PowerPivot and Deploy Solutions (SQL Server 2012 SP1)


There may be warnings about the Secure Store not being configured, but we don’t need to worry about them at this phase.


All set! Finally make sure you add the PowerPivot Service Application user to the administrators of the Analysis Services instances.


Chapter XI: Add a BI Semantic Model Connection Content Type to a Library (PowerPivot for SharePoint)

Adding the BI Semantic Model Connection content type to a library allows quick usage of e.g. Power View straight from the SharePoint site. Detailed instructions on how to add it can be found here.

To avoid future connection problems, a wise thing to do at this phase is to make sure that all the corresponding SharePoint service application user accounts are added as administrators to all the Analysis Services instances that will be used with SharePoint.


Chapter XII: Install and configure Reporting Services SharePoint Mode for SharePoint 2013

In today’s world there is a lot of buzz around Self-Service BI and all the nice-looking analysis tools. Microsoft has them all, but there may still be a need for “reporting for the masses”: if you need standardized reports delivered to plenty of people, Reporting Services comes in handy. Next we go through the steps of installing and configuring Reporting Services SharePoint Mode for SharePoint 2013. Detailed instructions can be found here:

Install Reporting Services SharePoint Mode for SharePoint 2013


First we run the SQL Server 2012 installer once again and choose “Add features to an existing instance of SQL Server 2012”. On the Feature Selection page you must choose the following:

  • Reporting Services – SharePoint
  • Reporting Services Add-in for SharePoint products

On the Reporting Services Configuration page, choose “Reporting Services SharePoint Integrated Mode” with the “Install only” option.


All set: hit Install, and after a moment the installation is ready.


Then comes the configuration part. First we create the Reporting Services service application in the Manage Service Applications section of Central Administration. You must provide a unique name for the service application, then create a new application pool and a dedicated user account for running the pool. Also choose the database server on which to create the SSRS datastore. For authentication we go with Windows Authentication.


That’s pretty much it, and again we wait for SharePoint to finish the configuration.


Next we activate the Power View site collection feature in Site Collection Administration. Since we want to leverage the BISM data models with Power View, make sure that the Power View Integration feature is also activated.

The last thing to do with Reporting Services is to Add Report Server Content Types to a Library (Reporting Services in SharePoint Integrated Mode).

Chapter XIII: Configure PerformancePoint Services

To enable visually appealing dashboarding capabilities in SharePoint, we must enable and configure PerformancePoint Services.

Configure the unattended service account for PerformancePoint Services

If the PerformancePoint Services unattended service account is not set, you might hit the error message “The data source cannot be used because PerformancePoint Services is not configured correctly. Additional details have been logged for your administrator.” The setting is located under “Manage service applications” in SharePoint Central Administration, on the PerformancePoint Services management page.


Here is how to do it: Configure the unattended service account for PerformancePoint Services

Check also: http://ericphan.net/blog/2010/4/19/performance-point-server-error-the-unattended-service-accoun.html

Why can’t SharePoint Dashboard Designer 2013 connect to SQL Analysis Services 2012?

By default, when running Dashboard Designer and trying to connect to a SQL Server 2012 Analysis Services instance, you only get empty values. If you go to the Windows Event Viewer you can find the reason behind this: Dashboard Designer 2013 needs ADOMD.net version 10 support!

Here is some more information on the matter:

Why can’t SharePoint Dashboard Designer 2013 connect to SQL Analysis Services 2012?
Can’t get SSAS databases to appear in Performance Point Dashboard Designer? Check you ADOMD.net version!

This is definitely something Microsoft should have taken care of in advance. It turns out that Dashboard Designer 2013 was written for the SQL Server 2008 R2 Analysis Services clients, so you need to download and add client support for ADOMD.net version 10 from http://www.microsoft.com/en-us/download/details.aspx?id=16978


FINAL CHAPTER: Tips n’ Tricks and Troubleshooting

Enable DAX queries with Report Builder

http://sqlblog.com/blogs/marco_russo/archive/2012/01/05/use-parameters-in-your-dax-queries.aspx

Remember to use SQL Server Data Tools instead of Report Builder, since the required dialog is not available in Report Builder.

Publish a Report to a SharePoint Library

http://technet.microsoft.com/en-us/library/bb283155.aspx

SharePoint 2013: The server was unable to save the form at this time

http://www.peneveyre.com/blog/PermaLink,guid,eeacb91f-8cdb-49d1-a59a-59de40ffe18f.aspx

http://sharepointerthilosh.blogspot.fi/2013/03/the-server-was-unable-to-save-form-at.html

 


PowerShell meets TM1 again: List all the TM1 objects that exist on one server but not on the other

Now that I’ve fallen in love with PowerShell, I made the effort to write yet another useful (?) script to use along with TM1 model deployments. When you are deep in the development phase of your project you end up adding and deleting objects all the time. If you also use file-based deployment (copying datafiles from one server to another), you might end up in a situation where your target server has old objects that no longer exist in the source (they’ve been deleted or renamed).

There is no fast built-in feature to check which objects in the target environment (production?) no longer exist in the source (development?). Here is a piece of PowerShell script that gives you a list of all the TM1 objects that exist on one server but not on the other:

$datafilesfolderA = ""
$datafilesfolderB = ""

# Names of all TM1 object files (.pro/.cho/.cub/.dim/.rux), excluding control objects ("}")
$filter = { $_.Name -match '\.(pro|cho|cub|dim|rux)$' -and $_.Name -notmatch '}' }
$objectsA = Get-ChildItem $datafilesfolderA | Where-Object $filter | Select-Object Name
$objectsB = Get-ChildItem $datafilesfolderB | Where-Object $filter | Select-Object Name

Compare-Object $objectsA $objectsB -Property Name
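
In the output, Compare-Object marks each name with a SideIndicator column: “<=” means the object exists only under $datafilesfolderA, “=>” only under $datafilesfolderB, so you can see immediately on which side each orphaned object lives.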


TM1: List all subsets per dimension and their assigned aliases


I was a bit amazed to find out that TM1 doesn’t seem to have any built-in functions for fetching all the subsets inside a dimension, and it certainly has no way to tell which alias is currently applied to a subset. Subsets are saved in separate files inside the datafiles directory, so I needed a little PowerShell magic to fetch the subsets and their corresponding alias names myself.

Inspired by the script “Listing all Subsets per Dimension in an IBM Cognos TM1 Server” on IBM developerWorks, I wanted to push it a little further. I also switched from TI scripting to a PowerShell script because I needed to look inside the files’ contents instead of just iterating through the filenames.

The thing is, inside the .sub files there is a special line starting with “274,”; after that comes the name of the alias that is applied to the subset. The real reason why I wanted to have a list of the aliases applied to each subset is described in the next post, but for now here comes the PowerShell script (it must be executed inside the datafiles folder of your TM1 server):

Get-ChildItem -Recurse -Include *.sub |
  Where-Object { $_.Directory -match "\w*}subs\b" } |
  Select-String -Pattern "274," |
  Select-Object Path, Filename, Line |
  Where-Object { $_.Line -notmatch ",$" -AND $_.Path -notmatch "datafiles\\}" -AND
    $_.Path -notmatch "datafiles\\tunit" } |
  ForEach-Object { $_.Path.Split('\')[-2].Split('}')[0] + ";" +
    $_.Filename.Split('.')[0] + ";" + $_.Line.Split(',')[1] + ";" + $_.Path } |
  Set-Content "D:\temp\TM1_Dimension_Subset_Alias.csv"

What it does is basically the following:

  • recursively go through all the files in the datafiles folder that are located in subfolders with “}subs” in their name
  • from those files, fetch all the lines that match the pattern “274,” (the alias name)
  • filter out unnecessary system folders
  • finally, form nicely CSV-formatted content containing the dimension name, subset name, alias name and the full path to the subset file
  • write the results to the output CSV file
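
To run it, change to the datafiles folder first. If you save the pipeline as, say, Get-TM1SubsetAliases.ps1 (a name made up here), the invocation is simply:

Set-Location "c:\tm1server\datafiles"
.\Get-TM1SubsetAliases.ps1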

Example of the contents of the output file:

dimName1;subsetNameA;aliasName;c:\tm1server\datafiles\dimName1}subs\subsetNameA.sub

There you go! Have fun powershelling your TM1 subsets!

-Greeny


SQL Server 2014 CTP1 In-memory OLTP issues with VirtualBox

I just started getting acquainted with the brand new SQL Server 2014 on my VirtualBox virtual machine. The first thing to do with SQL Server 2014 is to test out the new in-memory OLTP functionality (of course!). So I did, and immediately faced challenges. I followed the SQL Server team blog instructions and noticed that the “MEMORY OPTIMIZED DATA” section in the filegroup properties was missing completely. I tried to run the scripts provided and got this error message:

Msg 41342, Level 15, State 1, Line 5

The model of the processor on the system does not support creating filegroups with MEMORY_OPTIMIZED_DATA. This error typically occurs with older processors. See SQL Server Books Online for information on supported models.

It looks like I wasn’t the only one having this issue, because it has to do with the VirtualBox default settings for a new virtual machine. So, if you want the new in-memory OLTP functionality to work correctly with VirtualBox, you need to run the following command with VBoxManage:

VBoxManage setextradata [vmname] VBoxInternal/CPUM/CMPXCHG16B 1
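
Replace [vmname] with the exact name reported by VBoxManage list vms. The setting exposes the CMPXCHG16B CPU instruction to the guest and is picked up the next time the virtual machine starts, after which the MEMORY_OPTIMIZED_DATA scripts should run fine.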


TM1 Performance Modeler – Export hits OutOfMemoryError

I was planning to do the very first deployment from development to production in a new customer environment. Using the transfer out feature of TM1 Performance Modeler, it should have been straightforward, but right after the transfer started I encountered a nasty error message:

Transfer has failed due to the following error(s):

ERROR

The transfer failed.
java.lang.OutOfMemoryError


What a nice way to start a production deployment. The server had gigabytes of free memory, but somehow an insufficient amount of it was available to Performance Modeler. Google to the rescue, and I found the following IBM Support page: http://www-01.ibm.com/support/docview.wss?uid=swg21600719

It turns out there’s yet another configuration file, called ModelingMDT.ini, that should be adjusted. It’s located in the following directory:

C:\Users\<USERNAME>\AppData\Roaming\IBM\Cognos Performance Modeler\bins\bin_10.1.XXXX.X

I just increased the value -Xmx512M to -Xmx1024M and voilà: the transfer out succeeded this time. Our model is pretty small at the moment, but still it doesn’t fit in the default 512 MB while transferring the objects. I think that’s a pretty small value for a default. What do you think?
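
If you need to make the same change on several workstations, here is a minimal PowerShell sketch of the edit (the bin_10.1.XXXX.X folder name is a placeholder; adjust it to the version you actually have installed):

# Bump the Performance Modeler Java heap limit from 512 MB to 1 GB
$ini = "$env:APPDATA\IBM\Cognos Performance Modeler\bins\bin_10.1.XXXX.X\ModelingMDT.ini"
(Get-Content $ini) -replace '-Xmx512M', '-Xmx1024M' | Set-Content $ini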


MDS 2012: Collections not supported in the new staging process

There were lots of good improvements and new features in the MDS 2012 version, but during our current project I found out some interesting things about collections. According to Microsoft they have made improvements to the collections GUI (which was indeed crap in MDS 2008 R2), but for some reason they didn’t improve collections importing via staging tables along with the release of the completely new staging process.

I was searching for collection-specific staging tables and, confusingly, found none. Then I came across the following article: Discontinued Master Data Services Features in SQL Server 2012. There it is clearly stated that:

You cannot use the new staging process to:

  • Create or delete collections.
  • Add members to or remove members from collections.
  • Reactivate members and collections.

You can use the SQL Server 2008 R2 staging process to work with collections.

So it seems like they ran out of time creating new staging process for collections in product development and decided to continue supporting collections import in “2008R2 way”. I bet this will be deprecated in the next version and hopefully they’ll invent a new staging process for collections in the next SQL Server version. Until then I’ll stick up using the old staging tables and good old mdm.udpStagingSweep stored procedure 🙂