SSAS Tabular Optimisation In 5 Easy Steps



A well-designed SSAS tabular model is often the key ingredient in a successful analytics solution. It helps the business deliver the kind of ad-hoc analysis and on-the-fly insights that drive performance improvement and economic benefit. However, not all models are well-designed. Indeed, a poorly performing cube can quickly become a burden for any organisation, degrading the quality of analysis and draining valuable business resources. That’s why SSAS Tabular optimisation is so crucial for businesses wanting to get the most value out of their analytics solution.

Recently, I consulted for a large electrical merchandising business that was having some trouble with its SSAS tabular models. With national operations, it was imperative that their cubes could rapidly and reliably deliver the analysis the business needed to confidently make strategic decisions around sales, purchasing and inventory. Memory issues and ambiguous design principles were making it a challenge to get the tabular models to behave, and it was clear that I needed to tune the existing cubes with some simple optimisation techniques.

When attempting SSAS Tabular optimisation, I employ a straightforward 5-step strategy:

  1. Examine the model and remove unnecessary tables
  2. Examine the tables, remove unnecessary columns and edit table structure/content
  3. Examine the columns and change data types
  4. Examine the DAX measures and edit expressions
  5. Examine server properties and edit memory settings

This 5-step performance tuning approach means that tabular model issues can be precisely identified and appropriately addressed.

1. Examine the Model

A concise tabular model performs best, so the first step is to review the model itself. Very often a poor-performing cube contains unnecessary tables or relationships that provide no real value. A thorough review of which tables are present in the model, and what value they bring, will uncover what is necessary and what is redundant. Talking to stakeholders about what they need will also help determine which tables should go and which need to stay. In my example, I was able to reduce the cube size by removing unnecessary dimension tables that, as I discovered, the business was no longer interested in. This redesign step typically yields ‘quick-and-easy’ wins in cube performance, as it is the easiest to implement.

Figure 1. Removing unnecessary tables reduces SSAS tabular model complexity

2. Examine the Tables

What data actually goes into the tables will ultimately determine the quality of the tabular model. Similar to the first step, a review of the tables will often uncover unnecessary columns that do not need to be loaded into the model, for example columns that are never filtered on or that contain mostly null values. Table structure also matters to tabular model performance, as it affects how much data needs to be loaded. For example, you could reduce the row count of the sales fact table by aggregating it to the invoice level instead of the invoice line level. Such a reduction in size means the cube requires less memory.
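As a quick sanity check before re-pointing the cube at an aggregated source, a DAX query run against the existing model (from DAX Studio or SSMS) can show how much an invoice-level grain would shrink the fact table. This is only a sketch: the ‘Sales’ table and the Invoice Number and Line Amount columns are hypothetical stand-ins for your own schema.

    // Compare the current line-level row count with the row count
    // an invoice-level grain would produce.
    EVALUATE
    ROW (
        "Invoice line rows", COUNTROWS ( 'Sales' ),
        "Invoice rows", DISTINCTCOUNT ( 'Sales'[Invoice Number] )
    )

    // The aggregated shape itself: one row per invoice.
    EVALUATE
    SUMMARIZECOLUMNS (
        'Sales'[Invoice Number],
        "Invoice Amount", SUM ( 'Sales'[Line Amount] )
    )

If the distinct invoice count comes back as a small fraction of the line count, materialising the aggregate in the source view, rather than in the model, is usually the better option.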

Figure 2. Tidy up tables by removing columns, and reducing rows

3. Examine the Columns

A crucial aspect of cube performance is compression. Columns with certain data types, or with unique values in every row, compress badly and require more memory. An effective optimisation technique is to correct the data type or values in a column so that it compresses better. Casting values as integers instead of strings, or limiting decimal precision, are fundamental practices that are often overlooked in tabular model design and ultimately come at the expense of performance. In my example, I was able to create a new unique invoice ID that could be used by the business and compressed as an integer. Previously, the varchar invoice key was unique in almost every row of the sales table and was compressing very poorly. The storage engine (VertiPaq) wants to compress columns, and having similar values in the same column greatly aids this. A great tool for this kind of analysis is the VertiPaq Analyzer, which highlights potential compression problem areas and helps track the results of your optimisation work.
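To illustrate the idea, and only as a sketch (the ‘INV-’ prefix and the column names are assumptions, not the client’s actual format), a varchar key can be converted to an integer with a calculated column like the one below. In practice you would push this cast down into the source query so the column arrives in the model already typed as a whole number, since imported columns compress better than calculated ones.

    // Hypothetical example: derive an integer invoice ID from a varchar key
    // such as "INV-0012345", so VertiPaq can store it as a compact integer.
    Invoice ID =
    INT ( VALUE ( SUBSTITUTE ( 'Sales'[Invoice Key], "INV-", "" ) ) )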

Figure 3. The VertiPaq Analyzer reveals compression pain points

4. Examine the DAX

For cube users, it is critical that the OLAP queries they run return accurate results rapidly. If a user cannot get the information they need from a model in a reliable or timely manner, the cube is failing to provide the benefits expected of it. An important part of tabular model optimisation therefore revolves around the measures, and ensuring that the DAX expressions used are optimised for the formula engine. Keeping measures simple by using basic expressions and removing complicated filter clauses means they should perform better. In my example, I was able to rewrite some of the sales measures at different period intervals (such as month-to-date and year-to-date) so that they could run across different filter contexts, reducing calculation time.
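As a small illustration of the principle (not the client’s actual measures), period measures built on one reusable base measure keep the expressions trivial for the formula engine. The sketch assumes a hypothetical Sales fact table and a marked date table named ‘Date’:

    // One base measure, reused by every period-interval measure.
    Sales Amount := SUM ( 'Sales'[Line Amount] )

    // Simple time-intelligence wrappers that work in any filter context.
    Sales MTD := TOTALMTD ( [Sales Amount], 'Date'[Date] )
    Sales YTD := TOTALYTD ( [Sales Amount], 'Date'[Date] )

Because each wrapper delegates to the same base measure, any fix to [Sales Amount] flows through every period measure, and the expressions avoid the complicated filter clauses that slow the formula engine down.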

Figure 4. Simple DAX equals better performance

5. Examine the Server

Finally, the biggest factor in tabular model processing performance is memory. Depending on the edition of Analysis Services, different memory limits apply. For Standard Edition, the 16 GB limit imposed on a single instance of Analysis Services can often be the ‘killer of cubes’. If a reasonable business case exists, moving to Enterprise Edition or a cloud-based solution can be the right answer to memory woes. However, there are steps that can be taken to get the best out of an SSAS tabular model without abandoning Standard Edition altogether. Increasing the amount of RAM on the server and modifying the instance memory properties (such as LowMemoryLimit and TotalMemoryLimit) lets you fine-tune processing and reduce the likelihood of memory exception errors. In my example, the cube was failing to process as it would run out of memory during a Full Process. I increased the RAM from 32 GB to 40 GB and reduced the memory limits in the server instance properties. With more memory, and lower thresholds at which the memory cleaner processes kicked in, the cube was able to process in full each time without error.

Figure 5. Fine tune the memory limits to find the optimal level of performance

Summary

Like any business asset, an SSAS tabular model loses value when it is not properly configured or utilised. However, with the proper methodology, any model can be transformed from an underperforming asset into a valuable resource for a business.


If you’re having trouble with SSAS tabular optimisation, we want to hear about it! Please contact us to find out about how we can help you optimise your cubes.


PowerBI Tooltips


PowerBI Tooltips enhance your reports

A great way to take your Power BI reports to the next level is by using PowerBI Tooltips, also known as Report Page Tooltips (RPTs). RPTs allow you to enhance your reports by giving your end users more information without taking up any real estate or cluttering the report canvas.

In the example below, we have a donut chart showing Revenue FYTD by Region. One great way to enhance this visual is to add an RPT showing the Top 5 Stores for each Region as your end users hover over the donut chart.
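The Top 5 logic on a tooltip page can be driven by the built-in Top N visual-level filter, or by a ranking measure. The following is a hedged sketch with hypothetical table and column names, assuming a financial year ending 30 June (the year-end date string may need adjusting for your locale):

    // FYTD revenue with a 30 June financial year end.
    Revenue FYTD = TOTALYTD ( SUM ( Sales[Revenue] ), 'Date'[Date], "30/6" )

    // Rank stores within the hovered Region's filter context.
    Store Rank = RANKX ( ALLSELECTED ( Stores[Store] ), [Revenue FYTD] )

Setting a visual-level filter of Store Rank less than or equal to 5 on the tooltip’s visual keeps it showing the top five stores for whichever Region is hovered.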

Target visual for PowerBI Tooltips

In the example below we’ve hovered over the ‘North America’ Region and the RPT shows us the Top 5 Stores by Revenue FYTD for that Region.

Tooltips in Action

When we hover over the different Regions the RPT changes to show the Top 5 Stores for that Region.

PowerBI Tooltips Changing context

As well as the RPT, we have set up a drillthrough on this donut chart, and can still perform that action by right-clicking and selecting the drillthrough option, as shown below.

Drilling through for more detail

In our second example, we demonstrate that RPTs can also be used on bar charts. This bar chart shows Revenue FYTD by Country and when you hover over a Country the RPT shows the Revenue FYTD by month for that Country. These also change as you hover over the different Countries.

PowerBI Tooltips on a different visual

NB: This report also has dynamic measures and visual titles which we will cover in upcoming blogs.

How to create PowerBI Tooltips

In only a few steps you can create PowerBI Tooltips (RPTs) in your reports, so let’s go through those steps now.

  1. Add a page to your report and, in the Visualizations pane, set the Page information Tooltip slider to On and give the page a name. In this example we’ve named it TT – Country FYTD
Enabling PowerBI Tooltips

  2. Create your visual or visuals within the report canvas as normal and size the page appropriately; below we have a line chart and 2 card visuals for our RPT. In the Visualizations pane you can set the page size to the default Tooltip size, or customise it to get the best fit for your chosen RPT visual.
Defining PowerBI Tooltips size

  3. Once you’ve finished creating the tooltip visual, hide the page by right clicking on the page name.
Hiding the PowerBI Tooltips page

  4. On the report page, select the visual the RPT will appear on and go to the Tooltip settings in the Visualizations pane. Set the Tooltip slider to On, set the Type to Report page, and select the tooltip page you created; in our example it’s TT – Country FYTD
Linking the PowerBI Tooltips to a visual

And you’re done! It’s as simple as that.

In upcoming blogs we’ll go through some more advanced concepts, such as dynamic measures/attributes and dynamic visual titles, so stay tuned.

If you’d like to take your Power BI Reports and Dashboards to the next level and need help, please contact us to discuss how we can assist your organisation.

EMu PowerBI Collections Reporting



Although they may have more interesting stories to tell, museums are no different to any other organisation when it comes to the need for management system reporting. Where most businesses require comprehensive reporting from their ERP, a museum needs to be able to extract the data from its collection management system and craft a report that effectively informs management about the status of the objects in its collections catalogue. That is where EMu PowerBI reporting comes in.

FTS Data & AI were recently tasked by a museum with the delicate job of developing collections reporting from the museum’s EMu collections management system, a system specifically designed for museums and historical institutions. This museum was undertaking an assessment program, reviewing and recording object information across its entire collection of over 100,000 historical artefacts. Monitoring the progress of this program was vital for management, as it informed them about timelines, capabilities and project resource planning.

The Problem

As previously mentioned, EMu is a collections management system specifically designed for museums. The rich data maintained within this system must first be extracted in order to reap the reporting rewards. However, this is easier said than done. EMu’s Texpress database engine, ODBC connectivity and ADO RecordSet connectors are not exactly conducive to PowerBI or a modern reporting outcome, so EMu PowerBI reporting is not as simple as it is for some systems.

Instead, a more sophisticated approach was required to extract the data. As part of the museum’s commitment to public transparency, it had developed a modern GraphQL web API that could serve collection information. In this in-house pipeline, data was first extracted from EMu using a Harvester program, which then wrote it into a MongoDB cluster that serves the API. After careful examination, we identified that this API could be used to meet the reporting requirements of the assessment program. A custom query was then written, and Power BI was able to successfully connect and pull the relevant data needed for reporting. The extracted data, in JSON format, was then cleaned and transformed into a working, healthy dataset.

The Report

The report design was driven by our understanding of user workflow. An overall dashboard page, followed by a heat-formatted column chart that drilled through to the object details report page, created a natural reporting rhythm that report consumers could easily interpret:

Emu PowerBI Report

Custom functionality including filtering, formatting and forecasting meant that additional insights were gleaned from the dataset. We were able to not only report on the progress of the assessment program, but also provide guidance as to which objects the collections team should review and assess in the next month, which proved incredibly useful in managing the resources needed for the project.

The Outcome

Starting with careful data excavation via the GraphQL API, and continuing with focused reporting design principles, we ultimately ended up with a sophisticated collections reporting experience: the museum could leverage its existing EMu collections management system with a modern reporting tool to find efficiencies in a large internal project.

If you have a reporting challenge and need help, please contact us to discuss how we can assist.

Azure ML PowerBI


Leveraging Azure ML Service Models with Microsoft PowerBI

Machine Learning (ML) is shaping and simplifying the way we live, work, travel and communicate. With the Azure Machine Learning (Azure ML) service, data scientists can easily build and train highly accurate machine learning and deep-learning models. Now PowerBI makes it simple to incorporate the insights and predictions from models built by data scientists on the Azure ML service into PowerBI reports, using simple point-and-click gestures. This gives business users better insights and predictions about their business.

This capability can be leveraged by any PowerBI user (with access granted through the Azure portal). Power Query automatically detects all ML models the user has access to and exposes them as dynamic Power Query functions.

This functionality is supported for PowerBI dataflows, and for Power Query online in the PowerBI service.

Schema discovery for Machine Learning Service models

Unlike Machine Learning Studio (which helps automate the task of creating a schema file for the model), in the Azure Machine Learning service data scientists primarily use Python to build and train machine learning models. This means the schema for the model’s inputs and outputs must be generated explicitly in the scoring script; without it, PowerBI cannot discover the model’s parameters and expose it as a Power Query function.

Invoking the Azure ML model in PowerBI

  1. Grant access to the Azure ML model to a Power BI user: To access an Azure ML model from PowerBI, the user must have Read access to the Azure subscription. In addition:
  • For Machine Learning Studio models, Read access to Machine Learning Studio web service
  • For Machine Learning Service models, Read access to the Machine Learning service workspace
  2. From the PowerQuery Editor in your dataflow, select the Edit button for the dataset that you want to get insights about, as shown in the following image:
Azure ML PowerBI Edit Dataset

  3. Selecting the Edit button opens the PowerQuery Editor for the entities in your dataflow:
Azure ML PowerBI PowerQuery

  4. Click the AI Insights button on the top ribbon, then select the “Azure Machine Learning Models” folder from the left navigation menu. All the Azure ML models you have access to appear as PowerQuery functions, and the input parameters for each Azure ML model are automatically mapped to the parameters of the corresponding PowerQuery function.
Azure ML PowerBI AI Insights

  5. To invoke an Azure ML model, we can specify the column of our choice as an input.


  6. To examine or preview the model’s output, select Invoke. This shows the model’s output column, and the model invocation also appears as an applied step for the query.
Azure ML PowerBI Invoke

Summary

With this approach we can integrate ML models (built using either the Azure ML service or Machine Learning Studio) with PowerBI reporting. It enables any user (typically a BI analyst) to effectively apply the models built by data scientists to relevant datasets, whether the goal is classification, regression or simply getting predictions. Utilising these new PowerBI enhancements gives business users better insights, which in turn aids better decision making.

Let our Data Visualisation and Machine Learning experts help you explore the potential – contact us today!

PowerBI ML – How to build Killer ML with PowerBI


PowerBI ML: Unleashing Machine Learning in Microsoft PowerBI in 5 easy steps

Artificial Intelligence (AI) and Machine Learning (ML) are key tools enabling modern businesses to unlock value, drive growth, deliver insights and outcompete the market. Their unmatched ability to handle massive sets of data and identify patterns is transforming decision making at every level of organisations. Data and AI strategy is therefore rapidly evolving to explore the ways in which AI can best be utilised to enhance business operations. However, pragmatically harnessing AI for business needs has remained challenging, because the solutions offered typically incur significant resource overhead, are hard to understand and may fail to deliver actionable business outcomes. A gap has therefore emerged between BI and AI: a failure to bridge the insights we learn with the intelligence to improve. The most recent release of Microsoft PowerBI ML features aims to eliminate that gap by bringing AI and ML capabilities into the practical setting of self-service analytics.

PowerBI has established itself as a vital tool in modern data analytics. Its easy-to-use interface coupled with powerful reporting capabilities has made it the reporting platform of choice for delivering reliable business insights. The recent inclusion of ML and AI capabilities has significantly strengthened the tool, combining easy interactivity with cutting-edge data analysis.

Overview

PowerBI ML (Machine Learning) is now possible using Dataflows, the simple ETL tool that empowers analysts to prepare data with low-or-no code. Automated Machine Learning (AutoML) is then built off the back of Dataflows, again leveraging the interactive approach of Power BI without compromising on quality of analysis.

5 Easy Steps

  1. In a Workspace hosted by Premium capacity, select ‘+Create’ in the top right corner, and select ‘Dataflows’
  2. Choose the data source you wish to run the model on:
PowerBI ML Choosing Data Source

  3. After loading the data, the familiar Power Query screen will appear. Perform any data transformations as required, and select Save & close:
PowerBI ML Power Query

  4. The dataflow should now appear underneath Dataflows in the workspace. Select the dataflow, then select the brain icon, and select ‘Add a machine learning model’:
PowerBI ML Add Model

  5. Create the model by inputting the relevant information. You will get the option to select the model type and inputs for the model:
PowerBI ML Select Model

After creating the model, you will need to train it. The training process samples your data, and splits it into Training and Testing data:

PowerBI ML Train Model

Once the model has finished training, it will appear under the Machine learning models tab in the Dataflow area of the workspace, with a timestamp showing when it was last trained. You can then review the Model Validation report, which describes how well the model is likely to perform, by selecting ‘View performance report and apply model’.

Lastly, you can apply the model to the Dataflow by selecting ‘Apply model’ at the top of the validation report. This prompts a refresh of the Dataflow to preview the results of your model. Applying the model creates new entities (columns) in the Dataflow you created. Once the Dataflow refresh has completed, you can select the Preview option to view your results. Finally, to build reporting from the model, simply connect Power BI Desktop to the Dataflow using the Dataflows connector and begin developing reports on the results of your machine learning model.
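As a small, hedged illustration (the table and column names below are hypothetical; the actual output columns depend on your model), once the scored entity lands in the Dataflow you can report on it with ordinary DAX measures in Power BI Desktop:

    // Share of customers the model scores above a chosen threshold.
    // 'Customers' and [Prediction Score] are assumed names for the
    // dataflow entity and the column added by the applied model.
    Predicted Churn % =
    DIVIDE (
        CALCULATE ( COUNTROWS ( 'Customers' ), 'Customers'[Prediction Score] >= 0.5 ),
        COUNTROWS ( ALL ( 'Customers' ) )
    )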

Outcomes

With machine learning now integrated with PowerBI, users can upgrade from reporting on business performance to predicting it. From a business perspective, the addition of ML means that PowerBI reporting has gained an extra dimension. It can easily be incorporated into existing reporting and is capable of dramatically changing decision making. For the PowerBI ML user, no new skills are required, as ML leans heavily on the existing interface and user experience.

Common use cases where machine learning in PowerBI can be readily implemented include:

  • Improving your existing PowerBI CRM reporting by creating a general classification model to identify high and low value customers.
  • Boosting the value of your financial reporting by developing a forecasting model to help predict sales trends and downturns.
  • Enhancing your asset reporting by building a regression model to calculate the probability of asset failure or breakdown.
  • Refining your CRM reporting by constructing a binary prediction model to determine the likelihood of a customer leaving or staying.

If you want to know how machine learning can be implemented in your organisation, please contact us, and ask us about our AI services.