
Thursday, December 24, 2015

Specifying Paths to External Items (Report Builder and SSRS) in SharePoint with PowerShell

[2015-Dec-24] My story: two SSRS items (a main report and its sub-report) need to be deployed to SharePoint. A trivial task: even if I need to place the sub-report in a different SharePoint document library or folder, I can easily do that during the development phase, or with Report Builder after deployment, by specifying a relative or absolute path for the sub-report within the definition of the main report. Here is a good MSDN resource about this - Specifying Paths to External Items (Report Builder and SSRS).

But with multiple environments to deploy the very same set of SSRS items to, adjusting these paths by hand becomes a time-consuming task. However, we can prepare and adjust the SSRS XML files at deployment time, and PowerShell is one of the many ways to achieve that.

Here is a top section of the main SSRS RDL file:
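The relevant fragment of the RDL looks roughly like this (the report names here are placeholders, and the XML is abbreviated down to the sub-report reference):

```xml
<Report xmlns="http://schemas.microsoft.com/sqlserver/reporting/2010/01/reportdefinition">
  <ReportSections>
    <ReportSection>
      <Body>
        <ReportItems>
          <Subreport Name="MySubReport">
            <!-- Relative reference: resolved against the main report's folder -->
            <ReportName>SubReport_Sales</ReportName>
          </Subreport>
        </ReportItems>
      </Body>
    </ReportSection>
  </ReportSections>
</Report>
```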

And here is how this very same section may look with an absolute path for its sub-report in a target SharePoint deployment environment:
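With an absolute path, the same Subreport element would look roughly like this (the SharePoint site URL and report names are placeholders):

```xml
<Subreport Name="MySubReport">
  <!-- Absolute reference: full SharePoint URL to the deployed sub-report -->
  <ReportName>http://sharepoint/sites/reports/SubReports/SubReport_Sales.rdl</ReportName>
</Subreport>
```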

And the portion of a PowerShell script that does it all is very simple:

So, at the end of the day, it's a win-win for both the Development and Deployment teams: less time spent on deployment and more time to develop other things.

[2015-Dec-30] A brief update to the PowerShell script. I needed to make it more flexible, to support updating sub-report paths wherever they are located within the main SSRS report. So I switched from treating the RDL report file as XML to treating it as plain text (an XML file is still a text file :-).

And I changed this part of the code:
# Load the main report RDL file as an XML document
[xml] $xdoc = Get-Content $rptReference

# Find all Subreport elements within the main report body
$SubReports = $xdoc.Report.ReportSections.ReportSection.Body.ReportItems.Subreport
foreach ($SubReport in $SubReports)
{
    $SubReport_New = $SubReport.Clone()
    $SubReport_New.ReportName = $targetDataSources + "/" + $SubReportFolder + "/" + $SubReport_New.ReportName + ".rdl"
    $xdoc.Report.ReportSections.ReportSection.Body.ReportItems.ReplaceChild($SubReport_New, $SubReport)
}


To this:
(Get-Content $rptReference).replace('<ReportName>', '<ReportName>'+$targetDataSources+"/"+$SubReportFolder+"/") | Set-Content $rptReference

(Get-Content $rptReference).replace('</ReportName>', '.rdl</ReportName>') | Set-Content $rptReference 

And it worked like magic, regardless of how many sub-reports a main SSRS report has and where they are located in it!
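For what it's worth, the two text substitutions above could also be combined into a single regex-based replace. This is a sketch on my part rather than the original script, and it assumes every ReportName in the file should be rewritten:

```powershell
# One-pass variant: rewrite <ReportName>X</ReportName> as an absolute path in one go
(Get-Content $rptReference) -replace '<ReportName>(.+?)</ReportName>',
    ("<ReportName>$targetDataSources/$SubReportFolder/" + '$1' + '.rdl</ReportName>') |
    Set-Content $rptReference
```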

Wednesday, November 18, 2015

PowerMap with Custom Region shapefiles - Demystified

[2015-Nov-18] Today is global GIS Day, when people across the globe celebrate and promote the use of various geo-locating technologies. I used to work for a company about 5 years ago as a data analyst, and I saw how hard their GIS specialists worked with different geo data sets, so I respect this type of work.

Recently I had a chance to participate in the Toronto Public Library Open Data Hackathon, to brainstorm and come up with different ideas that could improve library services. Our team designed a prototype of a web application that would enhance library users' experience with book clubs. And I proposed building a reporting layer for library staff, who could analyse the potential impact of an improved book club experience on the number of new visits and new registrations at the library.

I also wanted to explore Microsoft PowerMap technology and its way of portraying geo layers based on custom region KML and SHP files, which contain custom polygons (Power Map for Excel–May update for Office 365). Before I share how easy (or not so easy) the use of shape files in PowerMap was, let me show you the final version of our team's PowerMap video that was created for this hackathon:

So, after pulling some of the historical library datasets that were provided to us into my model, I created my initial PowerMap tour graphics, which I then wanted to enhance with custom region geo files from the City of Toronto Open Data Catalogue:
- Regional Municipal Boundary
- Neighbourhoods

Currently, PowerMap supports mapping only County/Region, State/Province and Postal Code fields to custom regions based on geo shape files:

However, I was able to overcome this limitation by simply renaming the City field (column) to something else (e.g. CityName or CustomRegion, it's your choice); once it was no longer named City, I was able to map it to the Toronto Regional Municipal Boundary shape file attribute.

The other, more difficult, thing was to portray a set of polygons from another shapefile, which contained information about 140 Toronto neighbourhood regions. I had a list of neighbourhood regions in my data model, and no matter how hard I tried, I couldn't make them visible in my PowerMap visualization.

So I started to research this a little bit more and found a hint in one of the PowerMap examples that Microsoft provided. For one of their models they noted that the shape file had been slightly adjusted with GIS software (the QGIS tool). I installed this software on my machine because I didn't know any other way to see what was inside the downloaded shape files.

Using the QGIS tool, I was able to see how that shape file was constructed, and I also saw the reason I wasn't able to map my data model's data sets to attributes from the shape file.

I didn't dare to change the geo file and thought it would be easier to adjust my model's data set, and that change made all the difference. Now all 140 regions became visible in my PowerMap visualization! Good for Microsoft, and thanks to QGIS and the Toronto Open Data Catalogue for providing the software and the necessary data sets.

My key learnings from this PowerMap with custom regions (shape files) exercise are:
- Don't trust the region shape file metadata (use a GIS tool to check the shape file content, especially the attribute table);
- The existing PowerMap limitation of mapping only County/Region, State/Province and Postal Code fields to custom regions can easily be overcome: it's your choice how to name the fields;
- This leads to my third finding: model data sets can be mapped not only by names but by IDs or other fields as well. When you import a region shape file, PowerMap asks you to pick a region name from a list, and I was able to map my polygon and model data sets by their reference IDs too. So it's like a table-to-table relationship by reference fields that you choose.

I was very pleased with how Microsoft improved PowerMap with the new Custom Region feature based on external geo shape files. And I am even happier to learn that they plan to enable this PowerMap visualization in Power BI! How soon, I don't know.

Please leave your comments or if you have some further questions, I'd be glad to share all that I have learned.

Happy data adventures!

[2015-Nov-24] Just a brief update: I've recently contacted both the PowerMap and Power BI teams about enabling the Excel PowerMap visualization in Power BI. The PowerMap team informed me that there was no plan to move their client application to Power BI; however, they also mentioned that 3D Map would be available in the form of a custom visualization. And today the Power BI team announced the availability of the Globe Map visualization.

So synergy is in the air! :-)

Tuesday, October 20, 2015

Too Big To Be Visible - SQL Server vs. Oracle data in SSRS reporting

[2015-Oct-20] A sourcing system failed to return correct results (the returned value was a very large figure in scientific notation, with a two-digit number after the "E"). The existing ETL process was able to save this value into an Oracle staging database; however, a previously working SSRS report now showed me a blank column with no data instead.

I knew where the problem was, so I decided to explore this SSRS limitation in an attempt to show very big numbers.

First, I took a sample query from the AdventureWorksDW database and added an extra row  to query a very big number that SQL Server can handle:

Then I "migrated" this data set into Oracle environment with a small adjustment to support very big numbers that Oracle can handle too:

and then I created an SSRS report with both the SQL Server and Oracle data sets. What happened next was a surprise to me: the Oracle "Fairy Land" value didn't show up in my report:

So, after playing with the allowed data limits for my Oracle data set, I came to a range that SSRS actually could show in the report: it only allowed me to show Oracle data values when a big number was within (-7E+28 .. +7E+28). I may be wrong in my assumptions, but somehow the Oracle client was not returning data outside of this range for my SSRS report:
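A possible explanation (my assumption, not something I verified against the provider's internals): this cut-off is suspiciously close to the maximum of the .NET System.Decimal type (about 7.92E+28), so the Oracle client may be mapping Oracle NUMBER values to .NET decimal, which would fail for values beyond that range. A quick PowerShell check of that boundary:

```powershell
# Decimal.MaxValue is 79228162514264337593543950335, i.e. about 7.92E+28,
# which lines up with the observed (-7E+28 .. +7E+28) working range
[decimal]::MaxValue
[decimal]::MinValue
```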

Only after changing my Oracle "Fairy Land" data did SSRS finally show me the expected results. So this was my finding: SQL Server data will be shown as-is in an SSRS report, while an Oracle data connection has some limitations.

Happy data adventures!

Wednesday, August 12, 2015

Why I still like Excel PowerPivot models in SharePoint, even though Tabular solutions in SSAS are better

[2015-Aug-12] It's quite obvious that at present there is a growing audience for all the latest changes and updates within the Microsoft Power BI sphere. I think someone has already built a data model showing that the number of tweets about Power BI prevailed over other, more heavily discussed software development topics on one day. However, I would still like to share some thoughts that came as a result of building PowerPivot data models in Excel/SharePoint vs. Tabular solutions.

Also this post was inspired by (Tabular Models vs PowerPivot Models) and (When to choose tabular models over PowerPivot models).

So here are a few risks or disadvantages that we may face with PowerPivot Excel data models in SharePoint:

Performance/Availability to users

  • An Excel PowerPivot workbook in SharePoint is just a physical file that internally holds a backup of a tabular model; that backup automatically gets restored into memory and is utilized whenever users interact either with a linked PowerView report or with the Excel file itself. Once this "in-memory" tabular database is not used for a while, it gets deleted by an internal garbage collector. That means that after a short while, it will take another waiting period to bring that "hibernated" data model back into memory before it is available to users.
  • A server Tabular model is a different type of in-memory model that may still be "cleansed" by an internal garbage collector; and, for sure, performance tests comparing an Excel PowerPivot model vs. a Tabular model need to be built. However, preliminary tests showed that the Tabular model was more responsive during user interaction.


  • SharePoint Excel PowerPivot workbooks live under the control of Excel Services, PowerPivot Services and Analysis Services within the SharePoint environment and may be affected by user activity with other SharePoint objects; SharePoint disk and memory space may add further dependencies to the life of Excel PowerPivot models.
  • So having a dedicated SQL Server Tabular environment, set apart from SharePoint, may bring peace of mind to all users who rely on their reporting data availability.


  • Even before the standalone Excel file 2 GB limit, we are faced with the file size limitations that SharePoint will handle (2 GB is still an option in SharePoint, but it has to be specifically configured).
  • With a tabular database we only have to deal with how its hardware infrastructure is organized (Hardware Sizing a Tabular Solution (SQL Server Analysis Services))


  • I like both Excel development environment as well as Visual Studio for tabular solution. In both places I can define all the metrics, build relationships between tables and define necessary calculations.  
  • Deployment can be tough with Excel PowerPivot workbooks: I've built a PowerShell script that prepares and updates metadata within each Excel PowerPivot data model prior to uploading to the different environments in SharePoint. However, it still depends on how the Excel XML document is structured in Excel 2013, and this may change in later versions.


  • You're on your own with managing an Excel PowerPivot model: it's you and Excel. Yes, SharePoint will let you schedule a data model refresh once a day; however, these schedules may be jeopardized by frequent updates and interactions with the Excel files themselves.
  • Tabular solutions can be backed up, restored, detached, and attached; PowerShell cmdlets can be applied. And most importantly we can trigger a partial or complete data refresh of our server data models multiple times a day (XMLA scripting could only be a great benefit for this).


  • When an Excel PowerPivot model file is deployed to SharePoint, users will or will not have complete access to the data model (depending on their access permission scope defined in SharePoint).
  • With a tabular solution, we can utilize all the benefits of a role-based security model and manage access to our server data models at a more granular level.

With all this being said, I still see that an initial PowerPivot data model can be designed and developed in Excel; however, once it's defined and tested, it would be wiser to migrate this model into a tabular solution for better manageability and data availability for end users.

Monday, July 27, 2015

Single value filters and custom error messages in Power BI

"Let me be the one for you, otherwise we need to work on communication". Basically, this was a very brief description of a recent use case for a PowerView report, which was later replicated in a Power BI slide as well.

Users were aware of the PowerView / Power BI limitation (as of 2015-July-27) of not having filters with a single-choice value selection. PowerView slicers could be a good solution for this, allowing that single-value selection functionality (Power BI slicers still need to be improved); however, the use of regular filters has been the preferred choice, in order to leave more room for useful visualizations within a report.

However, users often got confused by selecting multiple values for a particular filter, which might lead to improper data discoveries. And then they simply asked: "Yes, we understand the PowerView behaviour and we understand that we make mistakes with multiple filter value selections. Show us an error message in the report, then, so that we would be aware of this mistake".

It would have been easier to tell them that they should operate their filters properly; however, before taking the easy path, I decided to explore the PowerView / Power BI functionality a bit more, to see whether it would allow some custom error messages.

First, I pulled some sample data from a Wikipedia page with countries' population statistics into my data model,
and also exported some lookup data for parameters:
Then I created a simple visualization of this data along with 2 slicers: Population Group & Region.

Then for 'Country by Population' table I added 3 DAX measures:
Region Selected = COUNTROWS(FILTERS(Regions[Region]))
PopulationGroup Selected = COUNTROWS(FILTERS(PopulationGroup[Population Group]))
MultipleSelected = if(OR([Region Selected]>1,[PopulationGroup Selected]>1),1,0)

And one measure for the CurrentDate table:
ShowError = if([MultipleSelected], [MultipleSelected]*MIN([MessageType])+1, MAX([MessageType]))
Then I applied ShowError to the top-left table region with the CurrentDate table data, and MultipleSelected to the main chart, in order to control the visual behaviour of the top table region and the main chart correspondingly.
And it worked! If I selected more than one value in any of the slicers, then the top table region would show a defined error message instead of the report date value (please see the slide with this visualization at the beginning of the post) and the main chart would become blank:
Also, I realized that with this chart's bars hidden, I had found a bug in Power BI charts in general (which I believed would be fixed soon): the data labels were still visible while the data bars were not, which didn't make any sense.
This very same report was originally developed in PowerView:

And both Power BI and PowerView files can be found here: 

So at least some communication back to users was possible in Power BI, and I think there will be more and more improvements in this Power BI area :-)

Happy data adventures !

Tuesday, July 21, 2015

Setting specific colors for PowerView charts

I was listening to Phil Collins' "True Colors" in my car this morning, and it reminded me of the very issue that PowerView currently has: it doesn't allow you to set specific colors for different elements of its visuals. There are more and more announcements about coming changes and improvements in this area for Power BI; however, if you're still using SharePoint or Excel based PowerView charts, this might be a challenge for you.

There is a way to change colors in PowerView by setting different themes (Format Power View reports); however, I don't really like this approach, because one theme change affects all of your PowerView slides and there is not much control left over how to set different colors for different sections of your report.

So I have come up with a quick workaround for how certain colors can still be chosen for my PowerView visuals.

For this demo test I have pulled 2 data sets from Wikipedia using PowerQuery and pushed them further into a PowerPivot model.
1) Top 10 Countries by Population
2) Top 10 Countries by Territory

Then I created simple PowerView charts to show this data:
And here is the big question: what if I want to color the top chart with Color A and the second one with Color B? There are still not many options in the existing PowerView interface. But I've noticed that each color theme has an order of particular colors, so even if I'm not allowed to set a specific color, I still have control to choose colors from the existing theme and set them based on my customers' needs.

The default color theme starts with Blue, then Red, Yellow and other colors. So I added blank columns to the PowerPivot model's tables (Color 1 & Color 2):

And then I could place the empty columns into the set of Values of my PowerView chart this way:
so that by default the [Color 1] column would be colored Blue, the [Color 2] column Red, and the real Value would become Yellow. Here is the trick: since both [Color 1] & [Color 2] are empty columns, no colors will be shown for them, and only the Yellow color will prevail in the chart :-)

You can play a little bit more with this example; I've saved all PowerQueries, PowerPivot model and PowerView charts in this Excel file (Top10Countries.xlsx).

Eventually your PowerView charts can get close to your "true colors" and that's why I like it :-)

Thursday, June 18, 2015

Updating PowerPivot model and PowerView report with PowerShell

Before I start describing the problem that my PowerShell script helps to resolve, here are a few blog posts by other people that inspired me to look at the whole issue of automating changes to Excel PowerPivot models (especially when you have many such Excel files and changing them manually would take a significant amount of time):

So here is my problem: we have built several Excel PowerPivot models, and we want to use them for different testing environments (DEV, UAT, etc.) after we save those models in SharePoint. The only problem is that for each environment where those models are deployed, the data connections for the different data sources have to be changed, and this can only be done manually: by opening each of the Excel files and making all the changes through PowerPivot management. If you have too many Excel files, this manual change takes a significant amount of time.

Let me show my current approach to developing SharePoint PowerView reports based on PowerPivot models:

(A) Excel PowerPivot model deployed to SharePoint (.xlsx file) 
(B) BISM data connection to (A) Excel PowerPivot model (.rsds file) 
(C) SharePoint PowerView report based on (B) data connection (.rdlx file) 

Deployment prerequisites: 
- PowerPivot data model sourcing from different databases
- PowerPivot/PowerView files for different environments are to be saved in different SharePoint folders (or SharePoint servers).

Deployment issues: 
- Reference to different databases has to be changed for each of the (A) Excel PowerPivot models. 
- Once an updated PowerPivot model is deployed to a different SharePoint folder (or server), then a reference to this Excel file has to be updated in the (B) BISM data connection as well 
- The (C) SharePoint PowerView report contains an internal reference to its data source that has to be changed via the SharePoint user interface.
This can be an issue when updating many files at every deployment, because it's a manual process and it will take time if we have to deploy many reporting items.

We can always do things manually 

Required changes for deployment: 
- Change of database references in PowerPivot models 
- Change a reference to an Excel PowerPivot file at another folder (or server) for BISM data connections 
- Change a reference to a BISM data connection file at another folder (or server) for PowerView report 

Manual process to perform required changes: 
  • Each Excel file with a PowerPivot model has to be opened in desktop Excel, and then each data source connection has to be changed through PowerPivot management. 
  • Change connection string via a SharePoint menu “Edit Data Source Definition”. 
  • Change data source reference via a SharePoint menu “Manage Data Source”. 

Let’s try to automate something:  
  • Currently there is no automation to update PowerPivot data connections; it still has to be done manually for every single Excel file. 
  • A BISM connection file is an XML file and can be updated programmatically prior to its deployment to SharePoint.  
  • A PowerView .rdlx file is an archive file that contains an XML \Reports\report.rdl file and can be updated programmatically prior to its deployment to SharePoint. 

Possible way to change PowerPivot models programmatically:
1) Each Excel (.xlsx) file can be seen as an archive file (.zip) that contains an \xl\model\item.data file, and this file holds the PowerPivot model of that Excel file. 
2) This item.data file is, in turn, a backup of an SSAS tabular database, and we can restore this (item.abf) file to a database on a tabular SSAS server. 
3) Then we can change the settings of the newly restored database (along with its connection strings).  
4) And save this file back into our Excel file. 

PowerShell script steps: 

PowerPivot change PowerShell automation: 

  1. Scan through a folder for Excel files 
  2. Rename .xlsx file to .zip file 
  3. Unzip .zip file to a model folder 
  4. Take the \xl\model\item.data file and rename it to item.abf 
  5. Connect to an SSAS Tabular server 
  6. Run an XMLA command to restore the item.abf archive file to a tabular database on the server 
  7. Run an XMLA command to update the required database connection strings 
  8. Run an XMLA command to back up the tabular database to item.abf 
  9. Rename the item.abf file back to item.data 
  10. Copy the modified item.data file to the model folder 
  11. Archive model folder to .zip file 
  12. Rename .zip file to .xlsx file 
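The file-handling part of these steps can be sketched as follows (folder, server and file names are placeholders, and the archive cmdlets assume PowerShell 5; the complete script linked further below has the full logic):

```powershell
# Sketch of steps 1-4 and 9-12; the SSAS restore/update/backup steps are elided
$workDir = "C:\Deploy\Models"
Get-ChildItem $workDir -Filter *.xlsx | ForEach-Object {
    $zipPath  = $_.FullName -replace '\.xlsx$', '.zip'
    $modelDir = Join-Path $workDir $_.BaseName

    Rename-Item $_.FullName $zipPath                      # .xlsx -> .zip
    Expand-Archive $zipPath -DestinationPath $modelDir    # unzip to a model folder

    # the PowerPivot model is a tabular database backup in disguise
    Rename-Item (Join-Path $modelDir 'xl\model\item.data') 'item.abf'

    # ... restore item.abf to SSAS Tabular, update connection strings,
    #     back it up again, rename it back to item.data ...

    Compress-Archive "$modelDir\*" -DestinationPath $zipPath -Force
    Rename-Item $zipPath $_.FullName                      # .zip -> .xlsx
}
```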

PowerView change PowerShell automation:

  1. Scan through a folder for .rdlx files 
  2. Rename .rdlx file to .zip file 
  3. Unzip .zip file to a model folder 
  4. Update <DataSourceReference> tag in the XML \Reports\report.rdl file 
  5. Replace modified report.rdl file in the model archive 
  6. Rename .zip file to .rdlx file 
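Because SharePoint rejects an .rdlx that is rebuilt from a folder (see the note at the end of this post), the report.rdl entry is better rewritten inside the existing archive. A sketch using .NET's System.IO.Compression (the paths and the data source URL are placeholders):

```powershell
# Update <DataSourceReference> inside an existing .rdlx without re-creating it
Add-Type -AssemblyName System.IO.Compression.FileSystem

$rdlxPath = "C:\Deploy\Reports\MyReport.rdlx"
$newRef   = "http://sharepoint/sites/bi/Connections/MyModel.rsds"

$zip = [System.IO.Compression.ZipFile]::Open($rdlxPath, 'Update')
try {
    # read the current report definition out of the archive
    $entry  = $zip.GetEntry('Reports/report.rdl')
    $reader = New-Object System.IO.StreamReader($entry.Open())
    $rdl    = $reader.ReadToEnd(); $reader.Close()

    $rdl = $rdl -replace '<DataSourceReference>.*?</DataSourceReference>',
                         "<DataSourceReference>$newRef</DataSourceReference>"

    # rewrite the entry in place
    $entry.Delete()
    $entry  = $zip.CreateEntry('Reports/report.rdl')
    $writer = New-Object System.IO.StreamWriter($entry.Open())
    $writer.Write($rdl); $writer.Close()
}
finally { $zip.Dispose() }
```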

BISM data connection change PowerShell automation:

  1. Scan through a folder for .rsds files 
  2. Update <ConnectionString> tag in the XML file 
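These two steps can be sketched as follows (the folder, the model URL and the connection string format are placeholders; adjust the element name to match your .rsds files):

```powershell
# Point every BISM connection (.rsds) at the Excel model in the target environment
$targetModelUrl = "http://sharepoint/sites/bi/Models/MyModel.xlsx"

Get-ChildItem "C:\Deploy\Connections" -Filter *.rsds | ForEach-Object {
    [xml] $rsds = Get-Content $_.FullName
    # replace the connection string element wherever it occurs in the document
    foreach ($node in $rsds.SelectNodes('//*[local-name()="ConnectionString"]')) {
        $node.InnerText = "Data Source=$targetModelUrl"
    }
    $rsds.Save($_.FullName)
}
```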

Here is a link to a complete PowerShell script that updates all PowerPivot/PowerView/BISM data connection files.

Some thoughts about the PowerShell script: 

1) Main area that shows all key steps:

2) Easy way of updating XML files:

3) Sending XMLA command was an easy step too:
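Invoke-ASCmd from the SQL Server PowerShell module can send XMLA to the tabular server. A minimal sketch (the server and database names are placeholders, and the XMLA here is a generic ProcessFull command rather than the original restore/backup script):

```powershell
# Send an XMLA command to an SSAS Tabular instance
$xmla = @"
<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>MyRestoredModel</DatabaseID>
  </Object>
  <Type>ProcessFull</Type>
</Process>
"@

Invoke-ASCmd -Server "MyTabularServer\TABULAR" -Query $xmla
```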

4) There was one issue: a difference in dealing with the zipped RDLX file compared with the zipped XLSX file. Both desktop Excel and SharePoint happily accepted an .xlsx file that was created by zipping a whole folder; however, the PowerView .rdlx file didn't like this approach, and SharePoint didn't want to accept it (an incompatibility error). So in the case of .rdlx files I had to update an existing .rdlx (.zip) file rather than creating a new one from a folder; this was the only approach that worked, and afterwards I was able to see my modified PowerView report in SharePoint without any issues.

In the end, all PowerPivot (.xlsx), PowerView (.rdlx) and BISM data connection (.rsds) files were automatically prepared to be saved to SharePoint, and it all worked well.

Happy data adventures!