Fix: Power BI – Couldn't publish to Power BI

I saw this message when working with one of my clients today. The error says "Only users with Power BI Pro licenses can publish to this workspace". And that would make sense if they hadn't already purchased Power BI Pro licenses for the user.

I checked online, and there were a number of comments about people seeing this error. There were the usual suggestions of logging out of and back into Power BI. One person even quoted a Microsoft support person as saying that Pro licenses can take up to 24 hours to start working after you purchase them. (That sounds dubious to me.)

I was concerned that perhaps the licenses had been purchased but not actually assigned to the users, so we checked the license assignment for the user. All was good there.

But then we noticed something odd:

The user still had their free Power BI license assigned to them as well, and for some unknown (and not very sensible) reason, that was the one the service was using when they logged in.

We removed the free license assignment and all was then good.
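We just removed the assignment by hand, but if you ever needed to do the same cleanup for a group of users, it can be scripted. Here's a rough Python sketch using the Microsoft Graph assignLicense action. The user principal name is made up, you'd need an access token with appropriate license-admin permissions (acquiring one isn't shown), and the free license normally appears with the SKU part number POWER_BI_STANDARD.

import requests

GRAPH = "https://graph.microsoft.com/v1.0"
token = "<access token with license admin permissions>"  # token acquisition not shown
headers = {"Authorization": f"Bearer {token}"}

# Find the SKU id of the free Power BI license (usually skuPartNumber POWER_BI_STANDARD)
skus = requests.get(f"{GRAPH}/subscribedSkus", headers=headers).json()["value"]
free_sku_id = next(s["skuId"] for s in skus if s["skuPartNumber"] == "POWER_BI_STANDARD")

# Remove the free license from the user (the UPN here is hypothetical)
user = "someone@contoso.com"
body = {"addLicenses": [], "removeLicenses": [free_sku_id]}
requests.post(f"{GRAPH}/users/{user}/assignLicense", headers=headers, json=body).raise_for_status()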

I hope that helps someone else.


Power BI: 5 Minutes to "Wow" and for enterprises, what's next?

Power BI is amazing. And it's starting to appear all over the place. Many enterprises don't know what to make of it though. Some are scared that it'll be the "next Access" where stores of uncontrolled data end up all over the organization. Power BI's mantra of "5 minutes to Wow" is spot on. It's easy to be impressed. But enterprises are often struggling with "what comes next after that 5 minutes?"

We've been implementing many enterprise-level projects that use Power BI, and I'd love to spend time showing you what we do.

Amit Bansal and the DPS team in India are hosting a virtual summit later this year. There are three days of data-related conference presentations delivered 24 x 7. If you can't find something of real interest to you, and in your time zone, you aren't looking. And the price? At present it's $89 USD. Yes you read that right. And it includes a year of access to the recordings of all the normal conference sessions.

As part of their Data Platform Virtual Summit event, there are also pre-cons and post-cons. Again, they're amazing value.

I'm presenting a pre-con over two half days. At the current discount, it's $129 USD and if you use the discount code GREG@DPS, it's even cheaper. I'd love to see many of you attend. Let's talk about Power BI in the enterprise.

Now, this isn't a session about how to make great visuals, etc. I'll leave that to someone else. But if you want to really get into what makes a good underlying data model, how to integrate properly with security, how to structure projects, how we use Azure Analysis Services and/or Power BI Premium and more, this should be the right session for you.

You can book here.

Fix: Unexpected error in Analysis Services Power Query designer in Visual Studio 2019

I was using the Power Query editor in an Analysis Services project, hosted in Visual Studio 2019. When I tried to use "Add column by example", I received the following error:

Unexpected Error

Could not load file or assembly 'Microsoft.DataIntegration.TransformDataByExample, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. The system cannot find the file specified.

The problem was that the assembly hadn't been deployed with the Analysis Services designer.

Fortunately, the same assembly is used by Power BI Desktop. Because I had that installed on the same machine, I was able to copy the assembly:

Microsoft.DataIntegration.TransformDataByExample.dll

from the folder:

C:\Program Files\Microsoft Power BI Desktop\bin

to the same folder as the devenv.exe program from VS 2019:

C:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\Common7\IDE\

If that path doesn't exist on your machine, it might be here instead if you're using the Community edition:

C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\Common7\IDE\
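If you'd rather script that copy (say, to push it out to a few developers' machines), here's a small Python sketch. The paths are the ones above; adjust the Visual Studio edition in the destination to match your install, and run it from an elevated prompt because both folders live under Program Files.

import shutil
from pathlib import Path

# The assembly that ships with Power BI Desktop
source = Path(r"C:\Program Files\Microsoft Power BI Desktop\bin") \
         / "Microsoft.DataIntegration.TransformDataByExample.dll"

# The folder containing devenv.exe (use Community instead of Enterprise if that's what you have)
destination = Path(r"C:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\Common7\IDE")

shutil.copy2(source, destination)  # needs an elevated (administrator) prompt
print(f"Copied {source.name} to {destination}")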

After that, it was back working again. I hope that helps someone else.

Power BI: (Workaround) Power BI Service only provides UTC-based relative date and time filters

If you've done much work with Power BI, you'll know that relative dates can be really useful. But if you've tried to use them, you'd also know that they aren't as useful as they might at first seem. You configure them in Power BI Desktop and they look great. You deploy them to the Power BI Service, and you have a problem.

Why?

Because Power BI Desktop uses your local timezone and the Power BI Service doesn't. So if the dates and/or times you have in the data you're analysing don't happen to be UTC based (or you live in England), you're fresh out of luck. If you go looking for today's data, you might well be seeing yesterday's data.

I can understand why the service might be UTC based, but if a company's data is created in Melbourne, and the report is built in Melbourne, and it's for Melbourne people to view, it makes no sense to introduce UTC into that situation.
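To make that concrete, here's a small Python sketch (zoneinfo needs Python 3.9 or later) showing how "today" in UTC and "today" in Melbourne are different dates for a good chunk of every day:

from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

now_utc = datetime.now(timezone.utc)
now_melbourne = now_utc.astimezone(ZoneInfo("Australia/Melbourne"))

print("UTC date:      ", now_utc.date())
print("Melbourne date:", now_melbourne.date())

# Melbourne is UTC+10 (or UTC+11 in daylight saving time), so for the first 10 or 11
# hours of every Melbourne day the two dates above differ. During that window, a
# UTC-based "today" filter is still showing yesterday's (Melbourne) data.
print("Dates differ right now:", now_utc.date() != now_melbourne.date())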

I really hope that the Power BI Service lets us set a local timezone on a per-user basis soon.

Workaround

But until then, I need a workaround. Fortunately, for many relative dates, it's not that hard. I have a Date table that gets processed daily. If I'm using Analysis Services (and that's what I mostly do), I just schedule a process of that table to happen first thing in the morning.

And by having a rich Date table, I can make all sorts of filtering much easier. For example, I use our SDU_Tools.DateDimensionColumns function to return the columns that I need. It has the usual types of columns:

Date, Day Number, Day Name, Short Day Name, Month Name, Short Month Name, Month Number, Month Label, Year, Year Label, Day of Year, Fiscal Month Number, Fiscal Month Label, Fiscal Year, Fiscal Year Label, Day of Fiscal Year, ISO Week Number

But I also use our SDU_Tools.DateDimensionPeriodColumns function to enrich the table with:

Is Today, Is Yesterday, Is Tomorrow, Is Future, Is Working Day, Is Last Working Day, Is Next Working Day, Is Weekend, Is Same Month, Is Month to Date, Is Same Month Last Year, Is Month to Date Last Year, Is Same Calendar Year, Is Calendar Year to Date, Is Last Calendar Year, Is Last Calendar Year to Date, Is Same Fiscal Year, Is Same Fiscal Year to Date, Is Last Fiscal Year, Is Last Fiscal Year to Date
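SDU_Tools builds those columns in T-SQL, but just to show the idea (this isn't the SDU_Tools code), here's a rough pandas sketch of a few of the flag columns. The date range is a placeholder; the point is that because the table is reprocessed daily, the flags are always relative to the processing date.

import pandas as pd

# Placeholder range; the real Date table covers whatever span the model needs
dates = pd.DataFrame({"Date": pd.date_range("2019-01-01", "2020-12-31", freq="D")})

today = pd.Timestamp.today().normalize()  # recalculated every time the table is processed

dates["Is Today"] = dates["Date"].eq(today)
dates["Is Yesterday"] = dates["Date"].eq(today - pd.Timedelta(days=1))
dates["Is Future"] = dates["Date"].gt(today)
dates["Is Weekend"] = dates["Date"].dt.dayofweek >= 5  # Saturday = 5, Sunday = 6
dates["Is Same Month"] = dates["Date"].dt.to_period("M").eq(today.to_period("M"))
dates["Is Month to Date"] = dates["Is Same Month"] & dates["Date"].le(today)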

Then rather than having to do a whole lot of tricky DAX, it's easy to just use those columns for the relative filtering. You can see it in action here:

For more info on our free SDU_Tools, look here.


BI: (Workaround) Changing partitioned tables in SSDT tabular designer

I was working with a client the other day and we had what we thought was a simple situation:

  • A table in an SSAS tabular model project
  • The table had two partitions
  • We needed to remove 6 columns from the table

So, we'd already removed the 6 columns from the SQL Server view that the table was being loaded from. We just needed to update the tabular model.

Now for an unpartitioned table, that's easy. You open the table properties, click Design, wait for a moment, click below the query, then on the Query menu, click the option to Refresh. When you then click Import, you can save the changes. We do that all the time.

However, the partitioned story was different.

  • If we tried to change the partitions, it would not let us save the changes because the partition differed from the table.
  • If we tried to change the table, it would not let us save the changes because the table differed from the partitions.

<SIGH>

There really doesn't seem to be any great way to do it using the standard GUI.

Official Method

The "official" way is to:

  • Remove all but one partition
  • Make the changes
  • Put the partitions back

That doesn't sound like much fun, particularly if you had a bunch of partitions.

Workaround

The workaround is to right-click the .bim file, choose View Code, find the columns, and carefully remove them from the JSON. When you save it, all is then happy.
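If hand-editing the JSON makes you nervous, the same surgery can be scripted. This is only a sketch: it assumes a 1200+ compatibility level model (where the .bim is plain JSON with a tables/columns structure), the table and column names are just examples, and you should work on a copy of the file. Anything else that references those columns (relationships, hierarchies, and so on) needs removing too.

import json
from pathlib import Path

bim_path = Path("Model.bim")                # work on a copy first
table_name = "Sales"                        # example name only
columns_to_remove = {"Column1", "Column2"}  # example names only

# .bim files are often saved with a byte-order mark, hence utf-8-sig
model = json.loads(bim_path.read_text(encoding="utf-8-sig"))

for table in model["model"]["tables"]:
    if table["name"] == table_name:
        table["columns"] = [c for c in table["columns"] if c["name"] not in columns_to_remove]

bim_path.write_text(json.dumps(model, indent=2), encoding="utf-8")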

Wish it wasn't so, but I hope that helps someone.


Power BI: (Workaround) Times disappear when datetime values are displayed in PBI tables

I'll start this post by mentioning that there's a general consensus that you should avoid columns in tabular data models that have both dates and times. That's largely because they don't compress well. However, sometimes you really do need to have both a date and a time in a single column.

For example, if you want to sort one column by both the date and the time, there's no option to sort a column by two other columns. And if you create a composite column to get around that, you're really back to the same problem as storing a datetime anyway.

Displaying Values in a Table

But what really surprised me recently was that when I added a datetime column from my Live Query data model (the source was Azure Analysis Services) into a table in Power BI, the time values disappeared.

Only the date was shown.

I was quite puzzled about what was going on.

Why This Happens

Power BI has a limited range of data types. If you have a date, it's stored in a date data type. But if you have a datetime, it's also stored in a date data type.

So when you're designing a report, Power BI has to decide a default format for showing you a date data type element. And no major surprise, it just shows the date.

Fixing the Issue

To make the time values appear, you need to fix the issue back in the tabular data model, by adding a custom display format for the datetime column.
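In the model, that display format ends up as the column's format string. For a 1200+ compatibility level model it's the formatString property on the column in the .bim JSON, so it's the same kind of edit as the sketch in the previous post. In this sketch the table and column names are examples, and "General Date" is one of the predefined formats that shows both the date and the time.

import json
from pathlib import Path

bim_path = Path("Model.bim")  # work on a copy first
model = json.loads(bim_path.read_text(encoding="utf-8-sig"))

for table in model["model"]["tables"]:
    for column in table["columns"]:
        if column["name"] == "Transaction Time":     # example column name
            column["formatString"] = "General Date"  # predefined format with date and time

bim_path.write_text(json.dumps(model, indent=2), encoding="utf-8")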

I can't say that I love this, as a display format decision should really be made at the client application. I'm OK with the server supplying a default display format, but the client should be able to change it.

Unfortunately in Power BI today, you have no option to change the format of elements from Live Query data sources.

I hope that helps someone.


Power BI (Bug): Power BI Desktop auto-hides visible tables with all columns hidden

I have a client who's publishing his tabular data models to Azure Analysis Services (AAS). He wants to publish a table that's visible, but whose only column is hidden.

You might wonder why he wanted to do that.

He's trying to have a table with no visible columns that acts as an anchor point for report designers to attach their report-specific measures. There are already measures and computed columns in the tabular data model in AAS, but he wants a known location for measures that relate only to a specific report.

All the other BI tools that I've tried show the table. As an example, the table appears as expected in a browse window in SQL Server Management Studio (SSMS).

But Power BI Desktop automatically hides a table that has no visible columns.

Workaround

You can see the table if you use the "View Hidden" option in the fields list, but you shouldn't need to do that for a table that's supposed to be visible.

Bug?

I've seen discussion saying that it relates to a backwards-compatibility issue with Excel, but to me it just seems like a bug.

You can add your opinion on this here: https://community.powerbi.com/t5/Desktop/Power-BI-auto-hides-visible-tables-with-all-columns-hidden-bug/m-p/994358#M472966

(And I'd encourage you to do so)

Power BI: Creating a TopoJSON file of Australian postcodes for use with Shape Map

In a recent consulting project, I needed to plot analytic values over postcodes. I couldn't just do it with the built-in maps control, mostly because the system that we're developing on doesn't have Internet connectivity. I was keen to upload the mapping data directly into the report, and the Shape Map control (while still in preview) seemed to fit what I needed exactly.

However, when you load custom maps into the control, they need to be in TopoJSON format. Now, TopoJSON is a specific extension of GeoJSON, which has been pretty common for a while. I tried just using a GeoJSON file with the control, with no luck.

What was successful was to download the postcode shapes from here:

https://www.abs.gov.au/AUSSTATS/abs@.nsf/DetailsPage/1270.0.55.003July%202016?OpenDocument

I downloaded the 2011 shape file for Australian postcode areas.

Next I used the website at https://mapshaper.org/ to do the following:

  • Loaded the shape file
  • Simplified the shape file (I reduced it to about 2% of the original size, while opting to keep any small areas)
  • Exported the data as a TopoJSON file
  • (Optional) Edited the TopoJSON file to rename my postcode columns from POA_CODE to Postcode and from POA_NAME to PostcodeName (there's a small sketch of that below)
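That last rename step is easy to script if you'd rather not do it in a text editor. Here's a rough Python sketch; it assumes the usual TopoJSON layout, where the attributes sit in a properties dictionary on each geometry inside the objects collection, and the file names are just examples.

import json

renames = {"POA_CODE": "Postcode", "POA_NAME": "PostcodeName"}

with open("postcodes.topojson", encoding="utf-8") as f:  # the file exported from mapshaper
    topo = json.load(f)

# Attributes live in each geometry's "properties" dictionary
for layer in topo["objects"].values():
    for geometry in layer.get("geometries", []):
        props = geometry.get("properties", {})
        for old_name, new_name in renames.items():
            if old_name in props:
                props[new_name] = props.pop(old_name)

with open("postcodes-renamed.topojson", "w", encoding="utf-8") as f:
    json.dump(topo, f)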

What an awesome site mapshaper.org is!

Notes on Shape Map

To use the shape map control, you need to do this:

  • Make sure you have the Shape Map control (it's currently a preview feature that needs to be enabled)
  • Drag the column containing the postcode to the Location field.
  • Once you've done that, in the Format pane, you can choose to add a custom map. I imported my TopoJSON postcodes file.
  • Drag the analytic value to the Color saturation field.
  • Set the background color and border color.

When I first did this, nothing worked for me. Eventually I worked out that my source data had integer postcodes, but the ones in the file were text fields. I added a computed column to my data to make sure I had text postcodes available, used that column instead, and then all was good.

I hope this helps someone.


Book Review: Power BI MVP Book

Over the last few months, one of my Kiwi buddies (and fellow member of both the MVP and Microsoft Regional Director programs), Reza Rad, has been organizing a bunch of us to write a book that's a collection of ideas from a number of MVPs. It's the Power BI MVP Book.

There are a whole lot of authors from a whole lot of different countries: Reza Rad, Anil Maharjan, Indira Bandari, Liam Bastick, Ken Puls, Jesus Gil, Thomas LeBlanc, Ike Ellis, Matt Allington, Leila Etaati, Markus Ehrenmüller, Ashraf Ghonaim, Eduardo Castro, Manohar Punna, Treb Gatte, Gilbert Quevauvilliers, Michael Johnson, Shree Khanal, Asgeir Gunnarsson, Greg Low, Gogula Aryalingam.

I've done these types of books before with the SQL Server MVP Deep Dives pair of books. They're a different type of book, in that you're not getting a single story throughout. Instead, you're getting a whole set of independent chapters on a variety of topics related to Power BI.

The general idea of these books is to support a charity, and that's where anything that I would have earned from them is going.

Bottom line?

I hope you find this book useful. There is both a paperback and a Kindle edition. The Kindle eBook is far cheaper.

Greg's rating: you decide

Opinion: RIP Microsoft Professional Program

Three years back, with much fanfare at a partner conference, Microsoft announced the Microsoft Professional Degree program. It was going to be a set of courses that you could take that would lead to one of their professional degrees.

Now, here in Australia you can't just call something a degree, and I'm guessing that's the same in the USA. So I wasn't surprised to notice, soon after I started with it, that the name had changed to the Microsoft Professional Program (MPP) and the word "degree" had been dropped.

The first program available was for Data Science. It required you to complete 11 courses. Each course restarted every three months, and had an exam and certificate at the end. Importantly, to complete a program, you also had to complete a practical course, called the capstone project.

I loved these programs, and I completed four of them: Data Science, AI, Big Data, and DevOps.

It's not all roses

The program retirement was announced the other day. You can't enrol after next month, and you must complete everything before December.

Many people are part way through the program, have paid to certify previous exams, and are now unable to complete before the deadline. That's really not fair to them. A nice touch would have been to at least refund their exam certification costs if they're part way through a program.

And more importantly, what does it really mean for those that have invested time, money, and effort in the programs? I don't know but I'd almost bet that within a year, it'll be hard to even find any trace of the entire program ever existing.

What I don't love is the way that Microsoft has constant churn in these types of things. For things like certification that require substantial commitments to pursue, this type of churn is just not appropriate.

I wish this was the first time that many of us had been down this same path, but sadly, it's not. (#MCM4Life)

Microsoft's offerings around learning have been really messy and jumbled for a very long time. The move to refocus learning around Microsoft Learn is a good move. I just wish they'd learn how to make these types of changes and consolidations without leaving their biggest supporters feeling abandoned (again).

Why I liked the MPP

I really liked the format of the MPP for a number of reasons:

  • You could take any of the courses for free (this meets the goal of the product groups who just want to get the information out there widely and without the friction of cost). Initially, that also included the exams.
  • You could pay for a certified exam. The courses were done in conjunction with edX, and they would verify who you were (e.g. via government-issued photo ID). If you wanted the certification, you needed to pay to certify all the relevant exams.
  • The content was not just Microsoft content. For example, the initial statistics course was actually a course from Columbia University. Some of the content was taught by DataCamp (who I'm not happy with after their data breach), and by a prof from Norway. This gave the material a wider context.
  • There was often a choice in the content. For Data Science, you could use either R or Python in each required course. For AI, there was a choice of areas of AI to work in: Speech, Vision, etc.
  • The work could be done in your own time, and fitted in amongst other activities as you had free time.

Tracks were expanding

Eventually, there were many tracks:

  • Data Science
  • AI
  • Big Data
  • DevOps
  • IoT
  • Data Analysis
  • Cybersecurity
  • Entry Level Software Development
  • IT Support

Thanks are due

Naturally, as with most things, the quality varied across the courses. But overall, I liked the effort that had been put into the classes.

A hearty thank you to anyone who was involved in creating these courses and their associated materials!


For Posterity

Like a friend of mine and fellow MVP, Thomas LaRock, said in a recent post, I have no idea what really happens to the certifications that were achieved in the program. As I mentioned, I suspect they have suddenly been massively devalued. And as Thomas did, I include my course certificates for posterity.