OLAP Acronym returns

After reviewing the Azure Data Architecture Guide, I noticed the label Online Analytical Processing (OLAP) referenced as the description for Azure Analysis Services, which I will call AAS from here on. AAS uses tabular models, not multidimensional cubes. Until lately, OLAP had not been used as an overall description of the tabular model; instead there were references to the semantic model replacing the cube’s data source view.


It is good to see this in official Microsoft documentation. While recording or speaking, we no longer have to skirt around tabular versus cube and can now just call both OLAP. I had been calling it an Analytical Database or an Analysis Services database.

Drilling into the documentation, you can see its place in the overall Data Architecture schema for Azure.


Most of the work I do is still on-premise, but I see more and more people talking about Azure. So, I believe it is time to consider this option with most work we do. Melissa Coates (@sqlchick) seems to be doing a lot of presentations, workshops and articles around this topic. She also has some really good material about on-premise SSIS on her site.

There are still some challenges with Azure DW, but Microsoft seems to be listening to clients and working to improve. I find it refreshing that they list the challenges in the documentation.


Power BI – 5 Nice On-Premise Features That Can Be Used, Part 1

Thank you Microsoft for porting Power BI reports to SQL Server Reporting Services (SSRS). This goes a long way with IT shops that are still on-premise. There is no denying that more development is going to Azure. Some IT departments I have networked with are using a hybrid approach – new work not under regulation can go to Azure, while currently running application reports now have this Power BI with SSRS option.

Using Power BI reports with Reporting Services is a great addition, and we are thankful Microsoft has provided this option. Here are some top features available on-premise (these could change next week 🙂 ).


Pie Charts

I like to use Pie Charts as slicers (filters) for other visualizations. This is nice when there are hierarchies like Product Category to Subcategory. You can have 2 Pie Charts that display a measure as well as act as filters.




Edit Interactions

This is a necessity for controlling what selecting a bar on a Bar Chart does to the other visualizations on the report. There are really 3 options: no effect, full effect and partial effect. To access it, go to the Format tab, then click Edit Interactions on the top left.




Then, once one visualization is selected, like a bar graph, a related visualization, like a clustered bar chart, can have its interaction filter changed. The default is the partial, or highlight, filter.




See Data

This is so cool, but I have seen a new custom visualization trump it (hint: Tree View). The See Data option lists the data behind a visualization. Right-clicking a slice in a Donut Chart will give a menu with See Data.




Then, a table appears alongside the visualization, with a Back to Report button on the top left. So cool!!!



Matrix (and Table)

The Matrix (and Table) provide details behind the visualizations. This enables standard line-level reporting to be included in Power BI reports. The Matrix gives a cross-tab (pivot) report while the Table is a list.



Connect Live to Analysis Services (Model or Perspective)

I am not sure why so many Power BI demos show selecting a spreadsheet or a table/view of flattened data to import, when users can connect to an existing Analysis Services (SSAS) database that centralizes all the business rules. The measures in an SSAS database help companies isolate the rules for calculations and force users to retrieve that data from one source. Time and time again I see reports with various T-SQL statements that have slightly different logic to pull data for the same calculations.




SSAS databases can get big, but developers of OLAP databases should remember Perspectives (like table views for relational database developers), which can limit the attributes an end user sees.



These are just a few of the nice features for on-premise Power BI reports in SSRS. There are many more, and as we start to deploy them, I will post additional blogs about the ones we really like.

Power BI Data

There are 3 methods I see frequently used when sourcing data for visualizations in Power BI. I would stick to a file or spreadsheet for prototyping, but definitely move to Analysis Services for the database in an enterprise. Even if someone says a report is a one-off, try to move to a centralized database for your company.

The first is a text file or Excel worksheet. The data is already in the rows and columns needed for a report someone else has created. Now the reporting needs to be ported to Power BI and made ‘pretty’. There might be some ‘cleaning’ of the data, but the basics are set up for the report.


The data here is flattened: the Sales per Year are pivoted into the table, and the attributes Country/Region, Brand and Month are included. Very simple to start doing visualizations. The figure above shows the columns Sales 2013, Sales 2014, Sales 2015 and Budget.

The next is connecting directly to tables and joining them in a data model. This saves the data model into the Power BI report. The next Power BI report will have to repeat the same modeling, usually with more or fewer tables/columns, along with calculated measures.


The report writers have to know how relationships are managed between tables. After this skill is mastered, they then have to be able to pivot the data or create measures with DAX. DAX is the measure and calculated column syntax used in Power BI. It is also used in Analysis Services Tabular Models.
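As a minimal sketch of what that DAX looks like (the Sales table, Date table and column names here are hypothetical, not from a specific model):

```dax
-- Hypothetical calculated column on a Sales table:
-- a sortable year-month key derived from the order date.
YearMonth = FORMAT ( Sales[OrderDate], "YYYY-MM" )

-- Hypothetical measure: total sales, reusable across all visuals.
Total Sales = SUM ( Sales[SalesAmount] )

-- A measure built on another measure: same total shifted to the prior year.
Sales PY = CALCULATE ( [Total Sales], SAMEPERIODLASTYEAR ( 'Date'[Date] ) )
```

The same syntax works in a Power BI report or an Analysis Services Tabular Model, which is part of why centralizing these definitions in SSAS is so attractive.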


With all of these methods, you also have to decide between importing the data or using a Live Connection. That decision depends on whether the data has to be refreshed and how frequently.


The optimal connection, in my opinion, is an Analysis Services database. Here we are using the Tabular Model, but Multidimensional Cubes can be used.


Here, the business rules, measures and relationships are managed in the database, and the report writer does not need the skills to manage this development. As you can see below, the tables are not listed when selecting the database, just the model or perspective.


The report writer might also be the architect of the Analysis Services database, but it could be someone else. The relationships are built and the measures created in the database. This eliminates the problem of measures living in each Power BI report.


If you are using Power BI for data modeling, I suggest you try to move to an Analysis Services Tabular Model for an enterprise database solution. Try not to tie up the model and measures in each Power BI report created.

Analysis Services Series on SQLShack

Please visit SQLShack.com for new articles on many facets of SQL Server. I have been trying my best to document how to create SQL Server Analysis Services databases with Multidimensional Cubes as well as Tabular Models.


The start was a Best Practices article for Dimensional Modeling and other items that help with a smooth transition to creating an SSAS database. Here is a link to that article.

The flow of the articles starts with the multidimensional cube; the following article then shows tabular.


After the best practices, I move to connecting to a data source, then creating attributes and measures.

Even though people are using Power BI more than Analysis Services, eventually, when their systems get too large, they are going to have to find an enterprise solution for modeling the data. I do not believe that modeling in Power BI is very sustainable.


With the Microsoft push for Cloud First development, Analysis Services in Azure will be the next big item. If you have not already seen it, you can import a Power BI model into Azure Analysis Services. Power BI uses Tabular, not Multidimensional Cubes. So, you had better start learning Tabular and maybe start new projects in Tabular Model Analysis Services.

Query Store–SQL Server 2016

SQL Server’s new Query Store is a brilliant option for databases if you are a DBA or have to performance tune T-SQL queries. This new feature of SQL Server 2016 is contained within a database, with configuration options for the storage of history and query plans. The option has many benefits you can take advantage of today, but it also helps with future upgrades to new SQL Server versions, because the history is contained in the database rather than the instance.



To enable the Query Store, you have to be db_owner of the database or a sysadmin on the instance. Once you right-click the database in SQL Server Management Studio (SSMS) and open its properties, go to the Query Store page on the left and change Operation Mode (Requested) to Read Write. There are other options to handle the retention period of the data as well as how often it is collected.
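The same setting can be applied with T-SQL. A sketch, with the database name and option values as examples you would adjust:

```sql
-- Turn on Query Store in read-write mode (database name is an example).
ALTER DATABASE [AdventureWorks]
SET QUERY_STORE = ON
    (OPERATION_MODE = READ_WRITE,
     CLEANUP_POLICY = (STALE_QUERY_THRESHOLD_DAYS = 30),  -- how long history is kept
     INTERVAL_LENGTH_MINUTES = 60);                       -- stats aggregation interval
```

This is handy when you want to enable it consistently across many databases without clicking through SSMS.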



This enables the database to start storing query execution plans, as well as statistics like IO, reads/writes and execution times, in the database itself. There are new Dynamic Management Objects (DMVs/DMFs) you can use to manually retrieve this data.
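A sketch of querying those objects directly; the catalog views are real Query Store views, and ranking by total duration is just one choice of ordering:

```sql
-- Top 10 queries by total duration, straight from the Query Store views.
SELECT TOP (10)
       q.query_id,
       qt.query_sql_text,
       SUM(rs.count_executions) AS executions,
       SUM(rs.avg_duration * rs.count_executions) AS total_duration_us
FROM sys.query_store_query AS q
JOIN sys.query_store_query_text AS qt
  ON qt.query_text_id = q.query_text_id
JOIN sys.query_store_plan AS p
  ON p.query_id = q.query_id
JOIN sys.query_store_runtime_stats AS rs
  ON rs.plan_id = p.plan_id
GROUP BY q.query_id, qt.query_sql_text
ORDER BY total_duration_us DESC;
```

The durations are reported in microseconds, so divide by 1,000,000 if you want seconds.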



There are also reports available at the database level in SSMS to view information about query statistics. If you drill into the database in SSMS, you will see an option for Query Store.



Here, you will see the four default reports that come with this option.

  1. Regressed Queries – shows query history and changes in statistics
  2. Overall Resource Consumption – history of resources used in the database
  3. Top Resource Consuming Queries – the top x queries using the most resources
  4. Tracked Queries – lets you see multiple query plans for a T-SQL statement and compare the plans or force a plan
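Forcing (or later unforcing) a plan can also be done with stored procedures; the query and plan IDs below are placeholders you would read from the Tracked Queries report or the catalog views:

```sql
-- Pin a known-good plan to a query (IDs are placeholders).
EXEC sp_query_store_force_plan @query_id = 42, @plan_id = 7;

-- Later, remove the forced plan so the optimizer can choose again.
EXEC sp_query_store_unforce_plan @query_id = 42, @plan_id = 7;
```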

The above is just a summary of what you are able to do; just start using this feature and you will be amazed at what it enables DBAs and others to drill into. The drilling covers the history of queries for one database, rather than you having to sort through all the plans for an instance.

SQL Server 2016–Live Query Statistics

WOW!!! The next great update to Execution Plans is a way to watch rows flow through the iterators while the query runs. Never saw that coming. And as a bonus, you can use SQL Server 2016 Management Studio (SSMS) to watch queries run on versions 2014, 2012 and 2008 R2.

SSMS 2016 can be installed by itself without the server installation. Go here for the installation. Once you install SSMS, you can start to view this addition in SSMS, like Figure 1.



Figure 1 – Live Query Statistics

After opening a New Query, go to the Query menu and select “Include Live Query Statistics.” You will need a long-running query in order to watch the processing happen at a slow pace.
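If you do not have a convenient slow query handy, one way to manufacture one for a demo is a large cross join; a sketch (add or remove joins to tune how long it runs):

```sql
-- Deliberately slow demo query: the cross joins multiply row counts
-- quickly, giving the Live Query Statistics iterators time to animate.
-- With three copies of sys.all_objects this can run for several minutes.
SELECT COUNT_BIG(*)
FROM sys.all_objects AS a
CROSS JOIN sys.all_objects AS b
CROSS JOIN sys.all_objects AS c;
```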


Figure 2 – Include Live Query Statistics

Once you start a query, a new tab will show the Live Query Statistics, like Figure 3. A dotted line means the iterator has not finished or has not started.



Figure 3 – New Tab

The labels on the iterators have the usual properties of an Execution Plan, plus a running clock of the seconds the iterator has been working. In addition, you can see the “Actual Number of Rows” over the “Estimated Number of Rows,” with a calculated percent done (Actual/Estimated). Figure 4 shows 144% for some iterators that have not even finished yet.



Figure 4 – Percent Greater Than 100

This is useful for seeing whether the estimated number of rows is the same as, or close to, the actual number of rows. The percentage will rise above 100 if there are more actual rows than estimated. The query in this case could probably use some performance tuning to get the estimate closer to the actual.


PASS Summit 2016-Thursday Part II

The second half of the day was spent talking with various speakers and attendees. I got into a great conversation about trying to mentor developers at their workplaces on how to write good T-SQL. It was interesting to hear how some developers (or DBAs) are not open to new ways of writing T-SQL, like using CTEs or proper join syntax.


I met Lance from the program committee and discussed the difficulty of having 800+ submissions and only about 140 slots for those submissions. Making sure you get a wide range of topics, without covering too few or too many of the same topic, is not always easy. They and the volunteers do a great job of making sure Summit is a great learning opportunity for all.

I also saw a new posting on SQLServerCentral.Com about learning SQL Server 2016. Try this Link – http://www.sqlservercentral.com/articles/SQL+Server+2016/127518/.


The only session I attended was on learning how to price and scale an Azure SQL Data Warehouse. So, now I understand DWUs, and that pricing is threefold: DWUs, Storage and Geo-DR. It can be expensive, but you get Massively Parallel Processing in the cloud and easier scaling of usage up or down.

The last 2 session time slots were spent co-presenting with Bill Anton on Performance Tuning Analysis Services. It was well attended, though in a large room, and about 2/3 of the people returned after the break for the 3-hour session. Bill did a great job getting people started with Profiler, PerfMon and some free tools and scripts. There were lots of questions, and even a Microsoft developer helping us answer them. Thanks Microsoft.

The last stop of the day was dinner with some great guys from Atlanta at Rock Bottom. They went to a BIML party; I went to bed. 🙂

PASS Summit 2016-Thursday Part 1

Today’s keynote was by David DeWitt. He did an excellent job explaining the architecture behind various cloud data warehouse offerings. It was nice to get an explanation of the behind-the-scenes node and disk usage.


One announcement that caught me off guard was that next year’s Summit is Oct 31 thru Nov 3. That is Halloween, and my kids come first on that day. Not sure that was a good idea. Better news shared was an increase in revenue of 10%, plus a 100%+ increase in global membership in some areas. There is a new logo, and the updated website will be released early next year. Denise provided some slides on what the website will look like, including being mobile friendly.


Next, I am going to meet up with Bill and attend a session on Azure DW.

PASS Summit 2016-Wednesday


As with all Summits, the first regular session day in Seattle started with the keynote. Well, really it started with a great breakfast at the Seattle Convention Center, but I digress. Microsoft released information about new features for its Microsoft Data Platform. The keynote provided customer-specific examples of some of the newer features in use, plus Program Managers demoing great examples of what is to come. For a summary, watch the PASStv recap. A feature I am looking forward to seeing in action is Analysis Services in the cloud.

Congratulations to Malathi Mahadevan as PASS Outstanding Volunteer of the Year. She is a great representative of the PASS community and people love her service and passion.


The Summit is an amazing trip. Wednesday, I spent time with Bill Anton rehearsing the 3-hour Analysis Services Performance Tuning session he has put together for Thursday. His knowledge of this subject is top tier, so I learned a lot. I also spent time in the Speaker Ready room talking and practicing for a session on Excel Tips & Tricks With SSAS Cubes. This session takes all the technical work we IT people put into a cube and starts to have some fun with it. What amazed me was that half the people in the session (there were at least 100 attendees) were cube builders. They were in the session to find out how people use cubes in Excel, so they could develop better cubes. That is a lot of people still building cubes.

Before the Vendor Reception, I spent time after the Excel session talking to an attendee. He was unique to me because he was not an IT person, but was hired as a Financial/Accounting analyst. He was using data feeds from their hosted applications in Power BI. It was difficult to explain to him the differences between a Cube, the Tabular Model, Power Pivot AND the modeling in Power BI. Dimensional Modeling might be too advanced for him; he just wanted basic table design hints.

He was also concerned that there were not enough beginner T-SQL and database design sessions available for him at the Summit, so I spent time with him going through all the remaining sessions that were a fit for him, plus vendors he should visit. He had never heard of Pluralsight or other online training like that from Pragmatic Works. We exchanged business cards and wished each other a pleasant week as we walked past the Community Zone and into the Vendor Reception.

PASS Summit 2016–Tuesday

For the first day at the Summit this year, I spent half my time in a Pre-Conference session with Idera as an outgoing Idera ACE. The session included a 45-minute presentation by me on a customer use case. The second half of the day was spent in #SQLPASS community sessions.

The customer use case included screen captures of real-world diagnosis using Idera products. Diagnostic Manager provided most of the screens. I was able to show the forecasting reports for data file and table usage/growth. We showed a large table whose indexes were occupying lots of space and investigated the index usage historically through Glenn Berry’s scripts. Removing some large INCLUDE indexes helped alleviate the table’s growth in storage size.


The other slides were about Alerts/Emails, User T-SQL History, BI Manager, and free tools like SQL Check and SQL Fragmentation Analyzer.


The community sessions were for #SQLSaturday and User Groups/Virtual Chapters. The SQLSaturday session was a full house with many comments and questions. Money and sponsorship were the hot topics. Growth is still being encouraged, but overall the session had a lot of good points, with many thanks for such a great community event system. At the end, an emotional Karla talked about the end of her five-year reign as community leader for PASS. What a wonderful ending to the session!!! I won’t show any pictures of her crying (or Grant crying).

The User Group and Virtual Chapter session was more about speakers than money or sponsorship.

I was truly amazed at the turnout and the range of areas represented. I wonder how many User Groups use remote speakers, and how often? There was an interesting discussion about getting a Microsoft TAM or related MS employee to help sponsor or host monthly meetings. You do not pay them, but it helps them in their reviews/bonuses.

Ryan Adams announced new features in PASS-hosted user group pages. There will be an option to send out event notifications to email, Twitter, Facebook, LinkedIn, Meetup and other social media from one place, instead of manually posting to each. Cheers from the crowd!!!

The evening ended with a bite to eat at the Welcome Reception and visiting with new and old friends, too many to list.