Power BI Dynamic Measures & Dimensions


Power BI Dynamic Measures & Dimensions – Part 1

A great way to take your Power BI reports to the next level is to make them fully interactive by incorporating dynamic measures and dimensions. With this clever use of DAX, it's possible to slice your data the way you want without leaving the dashboard screen.

In this 2-part series we'll look at creating dynamic measures and dimensions within your Power BI reports in part 1, then extend the concept to dynamic visual titles in part 2.

Using this concept greatly enhances the end-user experience: users can monitor and analyse multiple measures and dimensions on a single report page at the click of a button, making this a highly desirable solution.

In the Power BI report example below, we have a financial dataset that includes revenue, expenses and taxes, from which we can calculate profit. We have created one set of measures for profit, revenue, expenses and taxes, and one dimension for the time periods Fiscal Year to Date (FYTD), Quarter to Date (QTD) and Month to Date (MTD).

This allows end users to view, as seen in the examples below, Profit by FYTD, QTD or MTD, as well as the variances to the previous FYTD, QTD or MTD; all the visuals and visual titles update when a different measure or time period is selected.

Profit FYTD

Profit QTD

Profit MTD

How to

There are quite a few steps involved in getting this concept up and running, but the benefits to your report consumers will be well worth the effort.

  1. Create the base measure for Profit, then create the time-intelligence measures for the current periods (Profit FYTD, Profit QTD and Profit MTD) and for the previous periods (Profit PY FYTD, Profit PQ QTD and Profit PM MTD). Repeat this process for Revenue, Expenses and Taxes.
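As a sketch of this step, the measures might look like the following. The table and column names ('Financials', 'Date') and the 30 June fiscal year end are assumptions for illustration, not the actual model:

```dax
Profit = SUM ( Financials[Revenue] ) - SUM ( Financials[Expenses] ) - SUM ( Financials[Taxes] )

-- Current periods (fiscal year assumed to end 30 June)
Profit FYTD = TOTALYTD ( [Profit], 'Date'[Date], "30/6" )
Profit QTD = TOTALQTD ( [Profit], 'Date'[Date] )
Profit MTD = TOTALMTD ( [Profit], 'Date'[Date] )

-- Previous periods
Profit PY FYTD = CALCULATE ( [Profit FYTD], SAMEPERIODLASTYEAR ( 'Date'[Date] ) )
Profit PQ QTD = CALCULATE ( [Profit QTD], DATEADD ( 'Date'[Date], -1, QUARTER ) )
Profit PM MTD = CALCULATE ( [Profit MTD], DATEADD ( 'Date'[Date], -1, MONTH ) )
```

The Revenue, Expenses and Taxes measures follow the same pattern with their own base measure.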

  2. Create 2 new tables: one for measure selection, which lets end users select from the Profit, Revenue, Expenses and Taxes measures, and one for time selection, which lets end users select from the FYTD, QTD and MTD time periods.

Dynamic Measures table


Dynamic Dimensions table

  3. Create the base measures for Measure Selection and Time Selection. These 2 measures utilise the SELECTEDVALUE DAX function and become the foundation for the dynamic measures we create for the visuals within the report page.

Below is the Time Selected measure.
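A minimal sketch, assuming the selection tables from the previous step are named 'Time Selection' and 'Measure Selection' with columns [Time Period] and [Measure] (your names may differ):

```dax
Time Selected = SELECTEDVALUE ( 'Time Selection'[Time Period], "FYTD" )

-- The measure-selection equivalent follows the same pattern
Measure Selected = SELECTEDVALUE ( 'Measure Selection'[Measure], "Profit" )
```

The second argument is the fallback returned when nothing (or more than one value) is selected in the slicer.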


  4. Next we create the Selection 1 and Selection 2 measures. These 2 measures utilise the SWITCH DAX function: Selection 1 is for the current-period measures (FYTD, QTD and MTD) and Selection 2 is for the previous-period measures (PY FYTD, PQ QTD and PM MTD). Below is the Selection 1 measure.

PowerBI Dynamic Measures DAX
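A hedged sketch of how Selection 1 could be written, assuming the selection measures are named [Measure Selected] and [Time Selected]; only the first few branches are shown, and Selection 2 follows the same pattern with the previous-period measures:

```dax
Selection 1 =
VAR SelMeasure = [Measure Selected]
VAR SelTime = [Time Selected]
RETURN
    SWITCH (
        TRUE (),
        SelMeasure = "Profit" && SelTime = "FYTD", [Profit FYTD],
        SelMeasure = "Profit" && SelTime = "QTD", [Profit QTD],
        SelMeasure = "Profit" && SelTime = "MTD", [Profit MTD],
        SelMeasure = "Revenue" && SelTime = "FYTD", [Revenue FYTD],
        -- ...remaining Revenue, Expenses and Taxes branches follow the same pattern
        BLANK ()
    )
```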


  5. Finally, we create the variance measures between Selection 1 and Selection 2, allowing us to show the variances by $ value and % value between the selected time periods, i.e. the variance between FYTD and PY FYTD, as seen below.
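The variance measures can be as simple as the following sketch (measure names are illustrative):

```dax
Variance $ = [Selection 1] - [Selection 2]

-- DIVIDE avoids divide-by-zero errors when the previous period is blank
Variance % = DIVIDE ( [Selection 1] - [Selection 2], [Selection 2] )
```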


Bringing it all together

Using your preferred slicer (in this example the Chiclet Slicer, available via the Power BI marketplace), we create 2 slicers: one for the Selected Measure and one for the Selected Time. These slicers let your report consumers dynamically change measures and time periods within the report page.

Below is the Selected Measure Chiclet Slicer.

PowerBI Dynamic Measures Slicers

With all the setup now complete, we create the visuals for the report page using the Selection 1 and/or Selection 2 measures, and you're done!


If you’d like to take your PowerBI Reports and Dashboards to the next level and need help, please contact us to discuss how we can assist your organisation.

Power BI: Dashboard In A Day



Over the past few weeks, FTS Data & AI have had the privilege of hosting multiple Microsoft-sponsored Power BI: Dashboard In A Day (DIAD) events across Sydney. As a Microsoft Gold Partner for Data Analytics, the team presented over 3 days’ worth of content to a combined audience of over 100 business analysts, report developers and data professionals. Handpicked to present at Microsoft’s headquarters in North Ryde, we were able to deliver events that were both informative and beneficial for all those in attendance. For those who are curious and could not attend, this blog post covers what a DIAD event is, and why you should register to attend the next one.

L-R: Alex Gorbunov, Matthew Oen, Swetha Pakki, Sahan Vaz Goonawardhane, Ajit Ananthram

What Is It?

DIAD is a free one-day course designed by Microsoft to help analysts explore the full capabilities of Power BI. Attendees learn about Power BI in detail, follow a step-by-step lab manual, attempt 2 real-world practical examples and receive expert guidance from a team of experienced instructors. Through self-paced learning, attendees can properly develop their skills in Power BI and create a business-ready dashboard in a matter of hours.

Most importantly though, a large section of the day is devoted to providing attendees with the opportunity to develop Power BI reporting from their own datasets. This is where attendees get the most value, as they can ask questions and get assistance from experienced consultants regarding their own business’s reporting projects. The FTS Data & AI team were able to lend a hand and provide tailored advice to a vast number of businesses at various stages of their Power BI reporting journey.

Presenting to a large audience at Microsoft HQ

When Is It?

DIAD events occur throughout the year. Depending on your location, you can find dates for upcoming events by contacting us.

Attendees work on real-world practical examples

Where Is It?

DIAD events are held across the country. The Sydney events this year were held at the Microsoft head office in North Ryde. Here at the Microsoft HQ, attendees were able to fully engage with the technology, and see first-hand what Power BI and other Microsoft products are capable of, and how they could be used in their business.

See the full and latest capabilities of Power BI

How Much Will It Cost?

DIAD events run for 8 hours and are FREE. Make sure you register your interest early to ensure that you reserve a seat, as spots are limited at each event. Lunch and refreshments are also complimentary and provided throughout the day.

Learn how Power BI can be implemented at your organisation

Each attendee receives expert advice

Why Should I Go?

DIAD is an event like no other. Designed by Microsoft and delivered by professionals, DIAD empowers attendees with the skills and best practices needed to develop successful Power BI reporting solutions. Based on the overwhelming positive feedback from attendees, these events have been instrumental in getting Power BI quickly adopted in several organisations.

The opportunity to receive tailored advice from professional Power BI consultants means that you can accelerate the implementation of Power BI at your business, and be confident that you have the capability to develop powerful reporting solutions at your organisation well into the future.

See how other organisations have successfully adopted Power BI


If you want to know more about Dashboard In A Day events, please contact us for more information.

SSAS Tabular Optimisation In 5 Easy Steps


SSAS Tabular Optimisation

A well-designed SSAS tabular model is often the key ingredient in a successful analytics solution. It aids the business in delivering the type of ad-hoc analysis and on-the-fly insights that drive performance improvement and economic benefit. However, not all models are well designed. Indeed, a poorly performing cube can quickly become a burden for any organisation, negatively impacting quality of analysis and becoming a drain on valuable business resources. That's why SSAS tabular optimisation is so crucial for businesses wanting to get the most value out of their analytics solution.

Recently, I consulted for a large electrical merchandising business who were having some trouble with their SSAS tabular models. With national operations, it was imperative that their cubes could rapidly and reliably deliver the analysis needed for the business to confidently make strategic decisions around sales, purchasing and inventory. Memory issues and ambiguous design principles were proving to be a challenge in getting the tabular model to behave, and it was clear that I needed to tune the existing cubes with some simple optimisation techniques.

When attempting SSAS tabular optimisation, I employ a straightforward 5-step strategy:

  1. Examine the model and remove unnecessary tables
  2. Examine the tables, remove unnecessary columns and edit table structure/content
  3. Examine the columns and change data types
  4. Examine the DAX measures and edit expressions
  5. Examine server properties and edit memory settings

This 5-step performance-tuning approach ensures that tabular model issues can be precisely identified and appropriately addressed.

1. Examine the Model

A concise tabular model is one that performs best. Therefore, the first step is to review the model itself. Very often a poor-performing cube contains unnecessary tables or relationships that provide no real value. A thorough review of what tables are present in the model and what value they bring will uncover what is necessary and what is redundant. Talking to stakeholders about what they need will also help determine what tables should go and what needs to stay. In my example, I was able to reduce the cube size by removing unnecessary dimension tables that I discovered the business was no longer interested in. This redesign process typically yields ‘quick-and-easy’ wins in terms of cube performance, as it is the easiest to implement.

Figure 1. Removing unnecessary tables reduces SSAS tabular model complexity


2. Examine the Tables

What data actually goes into the tables will ultimately determine the quality of the tabular model. Similar to the first step, a review of the tables will often uncover unnecessary columns that do not need to be loaded into the model, for example columns that are never filtered on or that contain mostly null values. Table structure is also important to tabular model performance, as it affects how much data needs to be loaded. For example, you could reduce the row count of the sales fact table by aggregating it to the invoice level instead of the invoice line level. Such a reduction in size means that less memory is required by the cube.

Figure 2. Tidy up tables by removing columns, and reducing rows


3. Examine the Columns

A crucial aspect of cube performance is compression. Columns with certain data types, or columns whose values are unique on almost every row, compress badly and require more memory. An effective optimisation technique is to correct the data type or values in a column so that it compresses better. Casting values as integers instead of strings, or defining decimal precision, are fundamental practices that are often overlooked in tabular model design, and that ultimately come at the expense of performance. In my example, I was able to create a new unique invoice ID that could be used by the business and compressed as an integer. Previously, the varchar invoice key was unique on almost every row of the sales table and was compressing very poorly. The storage engine (VertiPaq) wants to compress columns, and having similar values in the same column greatly aids this. A great tool for this kind of analysis is the VertiPaq Analyzer. It can highlight potential areas of interest for compression work and help track the results of your optimisation techniques.

Figure 3. The VertiPaq Analyzer reveals compression pain points


4. Examine the DAX

For cube users, it is critical that the OLAP queries they run return accurate results rapidly. If a user cannot get the information they need from a model in a reliable or timely manner, the cube is failing to provide the benefits expected of it. Therefore, an important part of tabular model optimisation revolves around the measures, and ensuring that the DAX expressions used are optimised for the formula engine. Keeping measures simple by using basic expressions and removing complicated filtering clauses means the measures should perform better. In my example, I was able to change some of the expressions for sales measures at different period intervals (such as month-to-date and year-to-date) so that they could run across different filter contexts, reducing calculation time.
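As an illustration of the kind of simplification involved (the measure below is hypothetical, not the client's actual DAX), variables let the formula engine evaluate a repeated sub-expression once instead of twice:

```dax
-- Before: SUM ( Sales[Amount] ) is written (and may be evaluated) twice
Profit Margin =
DIVIDE ( SUM ( Sales[Amount] ) - SUM ( Sales[Cost] ), SUM ( Sales[Amount] ) )

-- After: each aggregation is named once and reused
Profit Margin =
VAR SalesAmount = SUM ( Sales[Amount] )
VAR SalesCost = SUM ( Sales[Cost] )
RETURN
    DIVIDE ( SalesAmount - SalesCost, SalesAmount )
```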

Figure 4. Simple DAX equals better performance


5.      Examine the Server

Finally, the biggest factor in tabular model processing performance is the actual memory configuration. Depending on the edition of Analysis Services, there are various memory limits. For Standard Edition, the 16 GB limit imposed on a single instance of Analysis Services can often be the 'killer of cubes'. If a reasonable business case exists, then moving to Enterprise Edition or a cloud-based solution can be the right answer to memory woes. However, there are steps that can be taken to get the best out of an SSAS tabular model without abandoning Standard Edition altogether. Increasing the amount of RAM on the server and modifying the server instance memory properties allows you to fine-tune processing and reduce the likelihood of memory exception errors. In my example, the cube was failing to process as it would run out of memory during a Full Process. I increased the RAM from 32 GB to 40 GB and reduced the total memory limits in the server instance properties. With more memory, and lower thresholds at which the memory cleaner processes were initiated, the cube was able to process in full each time without error.
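For reference, these are the main Analysis Services instance memory properties involved (values between 1 and 100 are interpreted as a percentage of physical RAM). The defaults shown are typical for recent SSAS versions and worth verifying against your own instance:

```
Memory\LowMemoryLimit     (default 65)  -- point at which the memory cleaner starts reclaiming
Memory\TotalMemoryLimit   (default 80)  -- cleaner works aggressively above this level
Memory\HardMemoryLimit    (default 0)   -- 0 = midway between TotalMemoryLimit and physical RAM
VertiPaqMemoryLimit       (default 60)  -- memory available to the in-memory (VertiPaq) engine
```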

Figure 5. Fine tune the memory limits to find the optimal level of performance



Like any business asset, an SSAS tabular model loses value when it is not properly configured or utilised. However, with the proper methodology, any model can be transformed from an underperforming asset into a valuable resource for a business.


If you’re having trouble with SSAS tabular optimisation, we want to hear about it! Please contact us to find out about how we can help you optimise your cubes.


Power BI Tooltips


Power BI Tooltips enhance your reports

A great way to take your Power BI reports to the next level is by using report page tooltips (RPTs). RPTs allow you to enhance your reports by giving your end users more information without taking up any real estate or cluttering the report canvas.

In the example below, we have a donut chart showing Revenue FYTD by Region. One great way to enhance this visual is to add an RPT showing the Top 5 Stores for each Region as your end users hover over the donut chart.


Target visual for PowerBI Tooltips


In the example below we’ve hovered over the ‘North America’ Region and the RPT shows us the Top 5 Stores by Revenue FYTD for that Region.


Tooltips in Action


When we hover over a different Region, the RPT changes to show the Top 5 Stores for that Region.


PowerBI Tooltips Changing context


As well as the RPT, we have set up a drillthrough on this donut chart, and we can still perform this action by right-clicking and selecting the drillthrough option as shown below.


Drilling through for more detail


In our second example, we demonstrate that RPTs can also be used on bar charts. This bar chart shows Revenue FYTD by Country and when you hover over a Country the RPT shows the Revenue FYTD by month for that Country. These also change as you hover over the different Countries.


PowerBI Tooltips on a different visual


NB: This report also uses dynamic measures and dynamic visual titles, which we will cover in upcoming blogs.

How to create Power BI Tooltips

In only a few steps you can create report page tooltips (RPTs) in your reports, so let's go through those steps now.

  1. Add a page to your report and, in the Visualizations pane, set the Page information Tooltip slider to On and give the page a name. In this example we've named it TT – Country FYTD.

Enabling PowerBI Tooltips


  2. Create your visual or visuals within the report canvas as normal (below we have a line chart and 2 card visuals for our RPT) and size the page appropriately. In the Visualizations pane you can set the page size to the default Tooltip size, or customise it to get the best fit for your chosen RPT visuals.

Defining PowerBI Tooltips size


  3. Once you've finished creating the tooltip visuals, hide the page by right-clicking the page name.

Hiding the PowerBI Tooltips page


  4. On the report page, select the visual the RPT will appear on and go to the Tooltip settings in the Visualizations pane. Set the Tooltip slider to On, set the Type to Report page, and select the tooltip page you created; in our example it's TT – Country FYTD.

Linking the PowerBI Tooltips to a visual


And you’re done! It’s as simple as that.

In upcoming blogs we'll go through some more advanced concepts, including dynamic measures and attributes as well as dynamic visual titles, so stay tuned.

If you’d like to take your Power BI Reports and Dashboards to the next level and need help, please contact us to discuss how we can assist your organisation.

EMu Power BI Collections Reporting



Although they may have more interesting stories to tell, museums are no different from any other organisation when it comes to the need for management system reporting. Where most businesses require comprehensive reporting from their ERP, a museum needs to extract the data from its collection management system and craft a report that effectively informs management about the status of the objects in its collections catalogue. This museum needed EMu Power BI reporting to do exactly that.

FTS Data & AI were recently tasked by a museum with the delicate job of developing collections reporting from the museum's EMu Collections Management System, a system specifically designed for museums and historical institutions. The museum was undertaking an assessment program, reviewing and recording object information across its entire collection of over 100,000 historical artefacts. Monitoring the progress of this program was vital for management, as it informed timelines, capabilities and project resource planning.

The Problem

As previously mentioned, EMu is a collections management system designed specifically for museums. The rich data maintained within this system must first be extracted in order to reap the reporting rewards. However, this is easier said than done: EMu's Texpress database engine, ODBC connectivity and ADO RecordSet connectors are not exactly conducive to Power BI or a modern reporting outcome. EMu Power BI reporting is therefore not as simple as it is for some systems.

Instead, a more sophisticated approach was required to extract the data. As part of the museum's commitment to public transparency, it had developed a modern GraphQL web API that could serve collection information. In this in-house design, data was first extracted from EMu by a Harvester program, which then wrote it into a MongoDB cluster serving the API. After careful examination, we identified that this API could meet the reporting requirements of the assessment program. A custom query was then written, and Power BI was able to connect and pull the relevant data needed for reporting. The extracted data, in JSON format, was then cleaned and transformed into a working, healthy dataset.

The Report

The report design was driven by our understanding of the user workflow: an overall dashboard page, followed by a heat-formatted column chart, which then drilled through to an object details report page, creating a natural reporting rhythm that report consumers could easily interpret:

EMu Power BI report pages: dashboard, heat-formatted column chart, and object details

Custom functionality including filtering, formatting and forecasting meant that additional insights were gleaned from the dataset. We were able not only to report on the progress of the assessment program, but also to provide guidance on which objects the collections team should review and assess in the next month, which proved incredibly useful in managing the resources needed for the project.

The Outcome

Starting with careful data excavation via the GraphQL API, and continuing with the injection of focused report design principles, we ultimately ended with a sophisticated collections reporting experience: the museum could leverage its existing EMu collections management system with a modern reporting tool to locate efficiencies in a large internal project.

If you have a reporting challenge and need help, please contact us to discuss how we can assist.