I'll start this post by mentioning that there's a general consensus that you should avoid columns in tabular data models that have both dates and times. That's largely because they don't compress well. However, sometimes you really do need to have both a date and a time in a single column.
For example, if you want to sort one column by both a date and a time, there's no option in Power BI to sort one column by two other columns. And if you create a composite column to get around that, you're really back to the same problem as storing a datetime anyway.
Displaying Values in a Table
But what really surprised me recently was that when I added a datetime column from my Live Query data model (the source was Azure Analysis Services) into a table in Power BI, the time values disappeared.
Only the date was shown.
I was quite puzzled about what was going on.
Why This Happens
Power BI has a limited range of data types. If you have a date, it's stored in a date data type. But if you have a datetime, it's also stored in a date data type.
So when you're designing a report, Power BI has to decide on a default format for showing a date data type element. And, no major surprise, it just shows the date.
Fixing the Issue
To make the time values appear, you need to fix the issue back in the tabular data model, by adding a custom display format for the datetime column.
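In practice, that means setting the column's Format String back in the model (in SSDT, or via TMSL/TOM). As a rough sketch of what that property looks like in a TMSL column definition, with a hypothetical column name and an example format:

```
{
  "name": "OrderDateTime",
  "dataType": "dateTime",
  "sourceColumn": "OrderDateTime",
  "formatString": "yyyy-MM-dd HH:mm:ss"
}
```

Once the model is redeployed with a format string that includes time components, the table visual shows both the date and the time.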
I can't say that I love this as a display format decision should really be made at the client application. I'm OK with the server supplying a default display format, but the client should be able to change it.
Unfortunately in Power BI today, you have no option to change the format of elements from Live Query data sources.
I have a client who's publishing their tabular data models to Azure Analysis Services (AAS). They want to publish a table that's visible, but that has only a single column, and that column is hidden.
You might wonder why they'd want to do that.
They're trying to have a table with no visible columns that acts as an anchor point for report designers to attach their report-specific measures. There are already measures and computed columns in the tabular data model in AAS, but they want a known location for measures that relate only to a specific report.
All other BI tools that I've tried show the table. As an example, here's the table appearing in a browse window in SQL Server Management Studio (SSMS):
But Power BI Desktop automatically hides a table that has no visible columns.
You can see the table if you use the "View Hidden" option in the fields list, but you shouldn't need to do that for a table that's supposed to be visible.
I've seen discussion saying that it relates to a backwards-compatibility issue with Excel, but to me it just seems like a bug.
In a recent consulting project, I needed to plot analytic values over postcodes. I couldn't just do it with the built-in maps control, mostly because the system that we're developing on doesn't have Internet connectivity. I was keen to upload the mapping data directly into the report, and the Shape Map control (while still in preview) seemed to fit what I needed exactly.
However, when you load custom maps into the control, they need to be in TopoJSON format. TopoJSON is a specific extension of GeoJSON, which has been pretty common for a while. I tried just using a GeoJSON file with the control, with no luck.
What was successful was to download the postcode shapes from here:
Simplify the shape file (I reduced it to about 2% of the original size, while opting to keep any small areas).
Export the data as a TopoJSON file.
(Optional) Edit the TopoJSON file to rename the postcode columns: I changed POA_CODE to Postcode and POA_NAME to PostcodeName.
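For reference, here's the shape such a rename ends up with inside the TopoJSON file. This fragment is purely illustrative (the object name, arcs, and values are made up) and heavily trimmed:

```
{
  "type": "Topology",
  "objects": {
    "postcodes": {
      "type": "GeometryCollection",
      "geometries": [
        {
          "type": "Polygon",
          "arcs": [[0]],
          "properties": { "Postcode": "2000", "PostcodeName": "Sydney" }
        }
      ]
    }
  },
  "arcs": []
}
```

The properties on each geometry are what the Shape Map control matches against your data, which is why the rename is worth doing.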
What an awesome site mapshaper.org is!
Notes on Shape Map
To use the shape map control, you need to do this:
Make sure you have the Shape Map control (it's currently a preview feature that needs to be enabled)
Drag the column containing the postcode to the Location field.
Once you've done that, in the Format pane, you can choose to add a custom map. I imported my TopoJSON postcodes file.
Drag the analytic value to the Color saturation field.
Set the background color and border color.
When I first did this, nothing worked for me. Eventually I worked out that my source data had integer postcodes, but the ones in the file were text fields. I added a computed column to my data to make sure I had text postcodes available, used that column instead, and then all was good.
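If you do the conversion in Power Query rather than as a model computed column, an added column along these lines does the trick. The query and column names here are my assumptions, and I've padded to four digits to suit Australian postcodes:

```
// add a text Postcode column derived from an integer PostcodeNumber column
TextPostcodes = Table.AddColumn(
    Source,
    "Postcode",
    each Text.PadStart(Number.ToText([PostcodeNumber]), 4, "0"),
    type text
)
```

Either way, the point is that the data type of the matching column must agree with the text properties in the TopoJSON file.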
Over the last few months, one of my Kiwi buddies (and fellow member of both the MVP and Microsoft Regional Director programs) Reza Rad has been organizing a bunch of us to write a book that's a collection of ideas from a number of MVPs. It's the Power BI MVP Book.
There are a whole lot of authors from a whole lot of different countries: Reza Rad, Anil Maharjan, Indira Bandari, Liam Bastick, Ken Puls, Jesus Gil, Thomas LeBlanc, Ike Ellis, Matt Allington, Leila Etaati, Markus Ehrenmüller, Ashraf Ghonaim, Eduardo Castro, Manohar Punna, Treb Gatte, Gilbert Quevauvilliers, Michael Johnson, Shree Khanal, Asgeir Gunnarsson, Greg Low, Gogula Aryalingam.
I've done these types of books before, with the pair of SQL Server MVP Deep Dives books. They're a different type of book, in that you're not getting a single story throughout. Instead, you get a whole set of independent chapters on a variety of topics related to Power BI.
The general idea of these books is to support a charity, and that's where anything that I would have earned from them is going.
I hope you find this book useful. There is both a paperback and a Kindle edition. The Kindle eBook is far cheaper.
Three years back, with much fanfare at a partner conference, Microsoft announced the Microsoft Professional Degree program. It was going to be a set of courses that you could take that would lead to one of their professional degrees.
Now here in Australia, you can't just call something a degree (and I'm guessing that's the same in the USA), so I wasn't surprised to notice, soon after I started, that the name had changed to the Microsoft Professional Program (MPP), and the word "degree" had been dropped.
The first program available was for Data Science. It required you to complete 11 courses. Each course restarted every three months, and had an exam and certificate at the end. Importantly, to complete a program, you also had to complete a practical course, called the capstone project.
I loved these programs, and I completed four of them: Data Science, AI, Big Data, and DevOps.
It's not all roses
The program retirement was announced the other day. You can't enrol after next month, and you must complete everything before December.
Many people are part way through the program, have paid to certify previous exams, and are now unable to complete before the deadline. That's really not fair to them. A nice touch would have been to at least refund their exam certification costs if they're part way through a program.
And more importantly, what does it really mean for those that have invested time, money, and effort in the programs? I don't know but I'd almost bet that within a year, it'll be hard to even find any trace of the entire program ever existing.
What I don't love is the way that Microsoft has constant churn in these types of things. For things like certification that require substantial commitments to pursue, this type of churn is just not appropriate.
I wish this was the first time that many of us had been down this same path, but sadly, it's not. (#MCM4Life)
Microsoft's offerings around learning have been really messy and jumbled for a very long time. The move to refocus learning around Microsoft Learn is a good move. I just wish they'd learn how to make these types of changes and consolidations without leaving their biggest supporters feeling abandoned (again).
Why I liked the MPP
I really liked the format of the MPP for a number of reasons:
You could take any of the courses for free (this meets the goal of the product groups who just want to get the information out there widely and without the friction of cost). Initially, that also included the exams.
You could pay for a certified exam. The courses were done in conjunction with edX, who would verify who you were (e.g. via government-issued photo ID). If you wanted the certification, you needed to pay to certify all the relevant exams.
The content was not just Microsoft content. For example, the initial statistics course was actually a course from Columbia University. Some of the content was taught by DataCamp (who I'm not happy with after their data breach), and some by a professor from Norway. This gave the material a wider context.
There was often a choice in the content. For Data Science, you could use either R or Python in each required course. For AI, there was a choice of areas of AI to work in: Speech, Vision, etc.
The work could be done in your own time, and fitted in amongst other activities as you had free time.
Tracks were expanding
Eventually, there were many tracks, including Entry Level Software Development.
Thanks is due
Naturally, like with most things, the quality varied across the courses. But overall, I liked the effort that had been put into the classes.
A hearty thank you to anyone who was involved in creating these courses and their associated materials!
As a friend of mine and fellow MVP, Thomas LaRock, said in a recent post, I have no idea what really happens to the certifications that were achieved in the program. As I mentioned, I suspect they've suddenly been massively devalued. And as Thomas did, I'll include my course certificates for posterity.
I spend a lot of time doing consulting that involves Power BI. One of the core skills for working with Power BI is getting yourself comfortable with the M language that's used in Power Query.
I was asked the other day, how to create a function that determines if a date is a weekday or a weekend. That's actually quite easy. The Date.DayOfWeek function can be adapted to do that.
In Power Query, go to the Edit Queries option and choose to add a new query. (You could also start by duplicating an existing query). Then on the View menu, choose the Advanced Editor and replace any code that's there with this:
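The original code listing hasn't survived in this copy of the post, but based on the description that follows, a minimal version of the function looks something like this (the name IsWeekday is my choice, not necessarily the original):

```
// returns true for Monday..Friday, false for Saturday and Sunday
let
    IsWeekday = (DateToTest as date) as logical =>
        // with the week starting on Monday, Monday is day 0,
        // so Saturday is day 5 and Sunday is day 6
        Date.DayOfWeek(DateToTest, Day.Monday) < 5
in
    IsWeekday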
Date.DayOfWeek returns the day number in the week, but you need to tell it which day your week starts on. In this case, because I wanted to have Saturday and Sunday as weekend days, I started the week on Monday. That means that Saturday is then day 5 and Sunday is day 6 because the first day is day 0. So then all I need to do is check for a value less than 5.
If you need to change the weekend to say Friday and Saturday, you'd just need to replace the Day.Monday constant in the script with Day.Sunday.
Testing the function
Testing the function is also easy. Once you save the code, you'll see a template for calling the function:
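Assuming the query was saved under a name like IsWeekday, invoking it is just a function call with a date literal:

```
// a Saturday, so this returns false
IsWeekday(#date(2024, 3, 16))
```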
One of my Kiwi buddies who specializes in Power BI is Reza Rad. I was pleased to see he now has a set of eBooks on Power BI, but I was especially happy to see one called Pro Power BI Architecture.
There are lots of books around to discuss how to use Power BI but there's been a real lack of books on architecting solutions using Power BI. So if you want to learn to develop dashboards or reports, this isn't the book for you. Reza has other books for that.
I enjoyed reading the book, and I liked the degree of coverage it gives to its topics.
If you are looking for ways to integrate Power BI into your solutions architecture, this book is a pretty good starting point.
What I was really hoping for though, was more info on administration. Mind you, the book doesn't claim to provide that. I keep getting asked about materials around administration issues. Perhaps that's another edition for Reza to consider. But the architects who need high level overviews of all the key topics should be pretty happy.
I enjoyed reading this book, and it's a pretty easy read. Great for architects considering Power BI.
Our free SDU Tools for Developers and DBAs include an AddWeekdays function, but that one was for T-SQL. Recently, I needed to do the same thing in Power Query. While the M language has a wonderful set of date-related functions, it doesn't have this one.
That made it time to write one. Here's the code that's required:
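The listing itself hasn't survived in this copy, but from the walkthrough below it can be reconstructed roughly as follows. Treat this as a sketch: I've padded the generated list slightly beyond the doubled range the post describes, so that very short spans still clear a weekend, and the behaviour when StartDate itself falls on a weekend is a design choice of mine:

```
// AddWeekdays: move NumberOfDays weekdays forward (or backward) from StartDate
(StartDate as date, NumberOfDays as number) as date =>
let
    AbsDays = Number.Abs(NumberOfDays),
    // generate enough candidate days: roughly twice the required
    // number of days, plus a little padding for short spans
    DayCount = 2 * AbsDays + 8,
    // for a positive move, start listing at StartDate; for a negative move,
    // start far enough back that the list ends at StartDate
    FirstListDate = if NumberOfDays >= 0
        then StartDate
        else Date.AddDays(StartDate, -(DayCount - 1)),
    GeneratedDates = List.Dates(FirstListDate, DayCount, #duration(1, 0, 0, 0)),
    // keep weekdays only: with Monday as day 0, Saturday is 5 and Sunday is 6
    WeekDays = List.Select(GeneratedDates, each Date.DayOfWeek(_, Day.Monday) < 5),
    // trim to the target number of weekdays from the front or back of the list
    DaysInRange = if NumberOfDays >= 0
        then List.FirstN(WeekDays, NumberOfDays + 1)
        else List.LastN(WeekDays, AbsDays + 1),
    // the answer sits at the far end of the trimmed list
    ReturnDate = if NumberOfDays >= 0
        then List.Last(DaysInRange)
        else List.First(DaysInRange)
in
    ReturnDate
```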
So how does it work?
Forgive the formatting to fit this window, but let's take a quick look through it:
I started by creating a list of dates that could possibly be in range. The start of that list is FirstListDate. If the number of days is zero or positive, I used the StartDate. Otherwise, if NumberOfDays is negative, I went back to a date that's twice the required number of days earlier.
The reason that I've used twice the number of days is that I need to later exclude weekends. I know that twice the number of days will include all the days I need, but won't include a crazy large number of days.
I then create GeneratedDates as a list of dates from the FirstListDate. That gives me a list that either starts at the StartDate (if the NumberOfDays is zero or positive), or ends with the StartDate (if the NumberOfDays is negative).
WeekDays is then created as a list that contains only weekdays, by doing a List.Select that excludes Saturday and Sunday. I did that by setting the start day of the week to Monday (so Monday is day zero) and then keeping days less than 5 (day 5 being Saturday).
DaysInRange is then created by removing either the front or end of the list based upon the target NumberOfDays.
Finally, ReturnDate is calculated from the first or last value in the list, depending upon whether NumberOfDays was positive, zero, or negative.
A current trend that I can't say that I love is constant churn within software applications. I have no interest to go back to the days where we got a new version of SQL Server or Power BI, etc. every few years.
In fact, I thrive on change. However, I've now become really concerned about how anyone:
Learns to use a complex application
Remembers how to use a complex application when they don't use it daily
I first really struck this issue with Azure. If I was teaching a class that used Azure, I could check every single lab on Sunday night, then Monday morning, the students would find it had all changed. That's OK for an experienced person, but not OK for a learner.
I love the rate of change for Power BI. We're endlessly getting wonderful new things. But I have to say that every class that I teach on this is like a new experience. I've got another one this coming Tuesday. I used to look forward to them but now I have a major hesitation every time, as I wonder what parts of the labs will have broken.
This is now an ongoing challenge for all this type of software though. I helped create some labs for VSTS very recently, and when I look at the product now, it barely resembles the one that I built the labs on.
Is it better? Probably yes.
But even though it might have been a few months ago, it feels like just the other week, and yet, not only has the UI changed, entire concepts have been added or removed, and the order that things are done in has changed substantially.
I don't know the answer to this but the current rate of churn is a substantial issue.
I gather the plan with the DevOps guys is to put a set of labs on GitHub, and let the people doing the labs point out issues day by day as they strike them. Again, for experienced users, that might work, but I really wonder how newcomers will see it.
Will they realize the app must have changed and that's why it's all different, or will they just think the product is too hard to use? Either way, they'll be very frustrated.
And while initially learning the product is one thing, I'm worried about the longer term. A product like VSTS lets you set up automation, and you hope you won't need to change it constantly. But if, every time you go to make a change, you're struggling to use it like you're a newbie again, that's a problem.
Finally, I'm really concerned about ongoing support.
The vast majority of support of software applications today happens from community resources like blogs, webcasts, etc.
Will they continue to be created at the same pace if the authors know they'll be irrelevant or wrong within a very short time? How will end-users learn to do things when none of the online examples they find still work?
Digital Transformation and Cloud Transformation are phrases that I hear bandied around at nearly every large organization that I'm currently doing consulting work for.
Yet, in so many cases, I can't see the organization achieving the changes required. This is for two core reasons:
The first is that the culture within the organizations is a major hurdle. There just isn't enough flexibility to think outside the box about alternative ways to work.
Worse (and probably more concerning), I see these companies taking advice on how to make these transformations from companies who don't themselves "get it".
An organization that is cloud-antagonistic internally, and stuck in an endless IT management quagmire, isn't likely to make a good cloud transformation, and they're certainly not going to be a successful partner to be able to help you to make a successful cloud migration or to implement a cloud transformation within your company.
An organization that doesn't use business intelligence (BI) or analytics internally isn't going to be able to help you make that transition either.
If the organization is claiming to be proficient in an area of technology, ask them about the use that they are making themselves of those same technologies. As a simple example, ask them about their internal analytics that they can see on their own phones.
To be any good at any of these areas of technology, companies need to live and breathe them daily. If they don't, find someone to help you who does.