Note: At the time of this writing, this also applies to Power BI Service.
Ah, you’ve set up a deployment pipeline and let your people know it’s ready for them to do the thing. Everything looks fine on your end, so you shoot off a message to the group and go about your busy day. (Never mind that your Test environment was set up 4 months ago, Production 3 days ago, and Development was replaced 2 months ago with a new Development environment because your region changed.) You’ve added all the permission groups to each environment and added your “contributors” as Admin to the deployment pipeline (no comment), so everything should be grand.
Except… your consultant just pinged you that it’s not. You hop on a call and confirm that even though she sees all of her work in the development workspace, and she is actively developing there, nothing shows up in the deployment pipeline. She checks access to the Test & Production environments. Yep, she can enter the workspaces even though nothing is there. (Those workspaces are expected to be empty because artifacts haven’t been promoted yet.) What gives?
You check the deployment pipeline permissions again.
Yep. The user is in a group that is an Admin under Manage Access in the deployment pipeline. (Pro-tip: if using groups, verify the person is actually in the group.) What else can you check?
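If you’d rather verify that membership programmatically than click through the portal, here’s a minimal sketch using the Microsoft Graph API. The token, group ID, and user ID are placeholders, and for large groups you’d also need to follow the `@odata.nextLink` paging that this sketch skips:

```python
import requests

# Assumptions: you already have a Graph access token (e.g., via MSAL) with
# permission to read group membership, plus the object IDs of the group
# used in Manage Access and of the user who can't see the pipeline.
GRAPH = "https://graph.microsoft.com/v1.0"
token = "<access-token>"        # placeholder
group_id = "<group-object-id>"  # placeholder
user_id = "<user-object-id>"    # placeholder

resp = requests.get(
    f"{GRAPH}/groups/{group_id}/members",
    headers={"Authorization": f"Bearer {token}"},
)
resp.raise_for_status()
member_ids = {m["id"] for m in resp.json().get("value", [])}

if user_id in member_ids:
    print("User is in the group - on to the workspace permissions.")
else:
    print("User is NOT in the group - there's your problem.")
```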
In this instance, the problem was in the workspace permissions.
The user was in a group in the workspace that only had Viewer permissions. That made sense when I created the workspace, because the user wasn’t going to be creating or updating things directly in the workspace (only pipelines would be doing that), but I forgot that she would need additional permissions once she was tasked with adding parameters and such to the deployment pipeline. As soon as her workspace access was updated to Contributor, she could see the artifacts in the pipeline.
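If you want to sanity-check (or fix) the workspace role without digging through the UI, a rough sketch against the Power BI REST API might look like the following. The token and IDs are placeholders, and the endpoints are the “Get Group Users” / “Update Group User” calls as I understand them, so verify against the current docs before trusting it:

```python
import requests

# Assumptions: a Power BI access token with workspace admin rights, plus the
# workspace (group) ID and the security group's object ID - all placeholders.
PBI = "https://api.powerbi.com/v1.0/myorg"
token = "<access-token>"
workspace_id = "<dev-workspace-id>"
group_object_id = "<security-group-object-id>"
headers = {"Authorization": f"Bearer {token}"}

# List who has what role in the workspace.
users = requests.get(f"{PBI}/groups/{workspace_id}/users", headers=headers)
users.raise_for_status()
for u in users.json()["value"]:
    print(u["identifier"], u.get("principalType"), u["groupUserAccessRight"])

# Bump the group from Viewer to Contributor so pipeline artifacts show up.
resp = requests.put(
    f"{PBI}/groups/{workspace_id}/users",
    headers=headers,
    json={
        "identifier": group_object_id,
        "principalType": "Group",
        "groupUserAccessRight": "Contributor",
    },
)
resp.raise_for_status()
```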
Feel free to add other areas you would have checked in the comment section.
If you’ve bought a P1 reserved capacity, you may have been told “No worries – it’s the same as an F64!” (Really, this probably applies to any P-to-F SKU comparison.) Just as you suspected – that’s not entirely accurate. And if you are trying to create Fabric shortcuts on a storage account that uses a virtual network or IP filtering – it’s not going to work.
The problem seems to lie in the fact that a P1 is not really an Azure resource in the same way an F SKU is. So when you go to create your shortcut following all the recommended settings (more on that in a minute), you’ll wind up with some random authentication message like the one below: “Unable to load. Error 403 – This request is not authorized to perform this operation”:
You may not even get that far and just have some highly specific error message like “Invalid Credentials”:
Giving the benefit of the doubt – you may be thinking there was user error. There are a gazillion settings; maybe we missed one. Maybe something has been updated in the last month, week, minute… Fair enough – let’s go and check all of those.
Building Fabric shortcuts means you are building OneLake shortcuts. So naturally the first thing I found was the Microsoft Fabric Update Blog announcement that pertained to this problem: Introducing Trusted Workspace Access for OneLake Shortcuts. That walks through this EXACT functionality, so I recreated everything from scratch and voila! Except no “voila” and still no shortcuts.
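(For what it’s worth, this isn’t a UI quirk – you hit the same wall if you script it. Below is a rough sketch of creating the same kind of shortcut through the OneLake shortcuts REST API as I read the docs; the workspace ID, lakehouse item ID, connection ID, and storage URL are all placeholders, so treat the payload shape as an assumption rather than gospel.)

```python
import requests

# Assumptions: a Fabric access token, plus placeholder IDs for the target
# workspace, the lakehouse (item) the shortcut lives in, and the cloud
# connection that points at the storage account.
FABRIC = "https://api.fabric.microsoft.com/v1"
token = "<access-token>"
workspace_id = "<workspace-id>"
lakehouse_id = "<lakehouse-item-id>"
connection_id = "<storage-connection-id>"

payload = {
    "path": "Files",          # where the shortcut appears in OneLake
    "name": "erp_staging",    # hypothetical shortcut name
    "target": {
        "adlsGen2": {
            "location": "https://<storage-account>.dfs.core.windows.net",
            "subpath": "/<container>/<folder>",
            "connectionId": connection_id,
        }
    },
}

resp = requests.post(
    f"{FABRIC}/workspaces/{workspace_id}/items/{lakehouse_id}/shortcuts",
    headers={"Authorization": f"Bearer {token}"},
    json=payload,
)
# On a P1 with a locked-down storage account, expect the same sort of
# 403 / "not authorized to perform this operation" response described above.
print(resp.status_code, resp.text)
```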
Okay, well – no worries, there’s another link at the bottom of the update blog: Trusted workspace access. Surely with this official and up-to-date documentation, we can get the shortcuts up and running.
Immediately we hit a pause moment with the wording “can only be used in F SKU capacities”. It mentions it’s not supported in trial capacities (and I can confirm this is true), but we were told that a P1 was functionally the same as an F64, so we should be good, right?
Further down the article, there is a mention of creating a resource instance rule. If this is your first time setting all of this up, you don’t even need this option, but it may be useful if you don’t want to add the exception “Allow Azure services on the trusted services list to access this storage account” in the Networking section of your storage account. But this certainly won’t fix your current problem. Still, it’s good to go through all of this documentation and make sure you have everything set up properly.
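If you do want to go the resource instance rule route instead of the trusted-services exception, here is a very rough sketch using the Azure SDK for Python. Treat it strictly as a sketch: the resource ID format for a Fabric workspace and the SDK model names are my reading of the docs, the update as written replaces the existing network rule set rather than merging into it, and none of it helps on a P1 anyway.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import (
    NetworkRuleSet,
    ResourceAccessRule,
    StorageAccountUpdateParameters,
)

# Placeholders - substitute your own subscription, resource group, etc.
subscription_id = "<subscription-id>"
resource_group = "<storage-resource-group>"
account_name = "<storage-account-name>"
tenant_id = "<tenant-id>"
fabric_workspace_id = "<fabric-workspace-id>"

client = StorageManagementClient(DefaultAzureCredential(), subscription_id)

# Resource instance rule pointing at the Fabric workspace (the dummy
# subscription/resource group format is from the trusted workspace access
# docs - verify before running).
rule = ResourceAccessRule(
    tenant_id=tenant_id,
    resource_id=(
        "/subscriptions/00000000-0000-0000-0000-000000000000"
        "/resourcegroups/Fabric"
        f"/providers/Microsoft.Fabric/workspaces/{fabric_workspace_id}"
    ),
)

client.storage_accounts.update(
    resource_group,
    account_name,
    StorageAccountUpdateParameters(
        network_rule_set=NetworkRuleSet(
            default_action="Deny",
            bypass="AzureServices",
            resource_access_rules=[rule],
        )
    ),
)
```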
One additional callout I’d like to make is the Restrictions and Considerations part of the documentation. It says: “Only organizational account or service principal must be used for authentication to storage accounts for trusted workspace access.” Lots of Microsoft support people pointed to this as our problem, and I had to show them that not only was it not our problem, it wasn’t even correct. It’s actually a fairly confusing statement, because a big part of the article is about setting up the workspace identity, and then that line reads like you can’t use the workspace identity to authenticate. I’m happy to report that the workspace identity worked fine for us once we got our “fix” in (I use that term loosely), and without the fix we still had a problem if we tried any of the other authentication options (including organizational account).
After some more digging, on the Microsoft Fabric features page, we see that P SKUs are actually not the same as F SKUs in some really important ways. And shortcuts to an Azure storage account set to anything but “Public network access: Enabled from all networks” (which, BTW, is against Microsoft’s own best practice recommendations) are not going to work on a P1.
The Solution
You are not going to like this. You have 2 options. The first one is the easiest, but in my experience very few enterprise companies will want to do it, since it goes against Microsoft’s own best practice recommendation: change your storage account network setting to “Public network access: Enabled from all networks”.
Don’t like that option? You’re probably not going to like #2 either, particularly if you have a long time left on your P SKU capacity. The solution is to spin up an F SKU. In addition to your P SKU. And as of the writing of this article, you cannot convert a P SKU to an F SKU, meaning if you got that reserved capacity earlier this year – you are out of luck.
In our case, we have a deadline for moving our on-prem ERP solution to D365 F&O (F&SCM) and that deadline includes moving our data warehouse in parallel. Very small window for moving everything and making sure the business can still run on a new ERP system with a completely new data warehouse infrastructure.
We’d have to spend a minimum of double what we are paying now – 10K a month instead of 5K a month – and that’s only if we bought a reserved F64 capacity. If we wanted to do pay-as-you-go, that’s 8K+ more a month, which we’d probably need to do until we figure out if we should run 1 capacity or multiple (potentially smaller) capacities to separate prod/non-prod/reporting environments. We are now talking in the range of over 40K additional at a minimum just to use the shortcut feature, not to mention we currently only use a tiny fraction of our P1 capacity. I can’t even imagine the companies that purchased a 3-year P capacity recently. (According to MS, you could have bought one up until June 30 of this year.)
Ultimately, many companies and data engineers in the same position will need to decide if they do their development in Fabric, Synapse, or something else altogether. Or maybe, just maybe, Microsoft can figure out how to convert that P1 to an F64. Like STAT.
You’ve assigned your Fabric Administrators and you’ve sent them off to the races to go see and do all the things. Except they can’t see and do all the things. OR CAN THEY? <cue ominous music>
Mango, crazy-eyed with anticipation about a new adventure.
At first glance, Fabric Administrator #2 can’t see any of the workspaces PBI Administrator #1 created, some of them years ago. Let’s go ahead and fix that first over here.* Once you’ve gotten that all straightened out and they can see all the workspaces, you think you are in the clear for deployment pipelines? Nope, same issue: PBI Administrator #1 can see all of the deployment pipelines and newly minted Fabric Administrator #2 can see none. Waaa-waaaa (sad trombone).
*(If you only need the user / user group to see the workspaces relative to the pipeline, then read on for a helpful hint that performs the double duty of adding the security to workspaces and deployment pipelines at the same time).
To be fair, I’m fairly certain this would be the same case for 2 PBI Administrators, but since the Fabric genie has been let out of the bottle, I can’t say for sure.
What’s an admin to do??? I mean seriously, what does Admin even mean anymore?!?
Well, if we are perfectly honest, there is a reason we’ve been telling you to set up user groups. Because if the admin who set up the pipeline had given an admin user group access to the deployment pipeline to begin with, then we wouldn’t be here.
(Oh yeah, well if you want to be that way, then I say security should really be part of the create-a-pipeline option.) Look, do you want to play the blame game or do you want to find a solution? That’s what I thought.
To fix, go into the deployment pipeline and click on the Manage Access link.
Then add your USER GROUP to the Access list with Admin rights.
If you haven’t already added the group to the workspace – then here is your chance to do it all at once. Just switch the “Add or update workspace permissions” toggle to ON.
You can then set more granular access to each workspace for the user group (or user, sigh) in question. Access options include Admin, Contributor, Member, and Viewer (though we may see more down the road).
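If you have a pile of pipelines to fix (or just prefer scripting), the Power BI REST API has a pipeline users endpoint that appears to do the same thing as the Manage Access dialog, minus the handy workspace toggle – you’d still grant workspace access separately. A minimal sketch with placeholder IDs:

```python
import requests

# Assumptions: an access token with admin rights on the pipeline, plus
# placeholder IDs for the deployment pipeline and the admin security group.
PBI = "https://api.powerbi.com/v1.0/myorg"
token = "<access-token>"
pipeline_id = "<deployment-pipeline-id>"
admin_group_id = "<admin-group-object-id>"

resp = requests.post(
    f"{PBI}/pipelines/{pipeline_id}/users",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "identifier": admin_group_id,
        "principalType": "Group",
        "accessRight": "Admin",
    },
)
resp.raise_for_status()
print("Group added as pipeline Admin.")
```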
That’s it. Throw a message in the comments if you’ve encountered any similar hiccups.
PBI data source error: Sometimes the easiest thing to check is actually the cause of the problem.
You’ve opened a report in Power BI Service and you get the dreaded “This report couldn’t access the data source. Contact <author name>, the author, to have it fixed.”
As we are expanding our report offerings and learning more with PBI Service, we get this message a lot. Often it’s a security issue, and this post isn’t about that rabbit hole, but rather a short and sweet reminder about a preliminary check that is often overlooked: is your data source up and running?
A lot of times in our Dev environment we’ve had something go wonky with our tabular model and needed to re-process the database. (That’s what dev environments are for – right?) This is what happened this morning when one of our developers reached out with this error message. As I first started heading down the security route, I paused and decided to check the database first. Ding-dong! Nothing weird in PBI – it was our tabular model. A quick check of a few tables in the cube confirmed that it wasn’t giving anything to anyone. Reprocessing the cube and then refreshing both the data AND the page(s) that give the error message in PBI cleared everything up.
Moral of the story and to put a twist on Occam’s Razor: check the easiest thing first.
First off, I want to say I never intended for so many of my posts to be about Power BI. There are plenty of experts out there and I am merely an accidental PBI Admin and advocate for our Power BI platform. So why should I be writing about a topic I’m constantly learning new things about? Well, here’s the thing: when you are in the middle of learning something and you don’t find a quick and easy answer, that may be a good time to say “hey, maybe I should blog about it”.
And I think it was Kimberly Tripp’s keynote at PASS Data Community Summit 2022 that reminded me that it’s 100% ok to write about things other people have written about. In fact, several people there mentioned this same thing. Share what you have learned – it’s quite possible you bring a different perspective that will help someone along the way. And if all else fails, you may forget the solution and years down the road google it only to find your own post. (#LifeGoals)
Now that we have that out of the way, let’s talk about WHEN YOU CAN’T CHANGE FROM A LIVE CONNECTION in Power BI.
Recently, I’ve been advocating that we consolidate our reporting systems. We have a ton, and with an extremely small team, it’s a huge headache to manage them all. Admin-ing reports is only supposed to be a small portion of my week-to-week tasks. (Hint: it’s not.) Plus, some of our reporting systems are just not that good. I won’t list all the different systems, but we are lucky enough to have Power BI as part of our stack, and as such, I want to move as much as we can to our underutilized Power BI service. This includes our Power BI Report Server reports.
Since we needed to make some changes to some reports we had on Power BI RS, and wanted to reap some of the benefits of Power BI service with them, we decided these reports would be a good test group to move over. The changes were made in the newer version of PBI Desktop (instead of the older version of desktop we have to use with PBI RS) and we were ready to load them up to our Power BI service. This is where it got a little sticky.
sticky buns… mmmmmmm
FOCUS.
When I uploaded a new report, I remembered it was going to create a “dataset” along with the report, even if the lineage showed the dataset was just a live connection to a database. (In the case of our test reports, they were connected to an SSAS database.) Note, these datasets don’t seem to actually take any space when connected via a live connection, hence my quotes.
A dataset for every report? Given the number of reports we needed to move over, all with the same live connection, this didn’t make sense to me. Even if the dataset was just a passthrough. (Did I mention how I really hate unnecessary objects in my view? It eats away at me in a way I really can’t describe.)
So I thought – “why not just create 1 live connection dataset and have all the reports in the workspace connect to that?” (We also use shared dataset workspaces, and if you are using that method, this still applies. In this case I wanted to use deployment pipelines, and as of the writing of this post, that wasn’t doable with multiple workspaces per environment.) I built my new dataset, uploaded it, and prepared to connect my report to the new one.
SCREECH. Nope. After all that, when I opened my shiny, newly updated RS report in PBI Desktop, I didn’t even get the option to change the connection.
Darn it. I couldn’t switch it to the new dataset. My only option was the original live connection. I couldn’t even attempt to add another data source.
I grudgingly loaded the report back up to PBI Service, and now I had to look at 2 datasets while I noodled. Blerg. (I’ll mention again how much I hate a cluttered view.) Technically I could delete my test case dataset, but I wasn’t ready to give up yet. An idea occurred to me: let me download the newly uploaded file from the PBI Service, because logically it had created a new dataset to use when I uploaded it, and the lineage showed it in the path.
I opened the report in the service, chose File –> Download this file, and then selected the “A copy of your report with a live connection to data online (.pbix)” option. (Actually I tried both, but the other way was a fail.)
Then I opened it in PBI Desktop… Meh. It looked the same.
Wait a minute! What is that at the bottom??? “Connected live to the Power BI dataset”!
I checked the data source settings again under Transform data – BINGO! Now I had the option to switch to the dataset from my Power BI service. Which I happily did.
After this was done, I saved and reloaded the report to PBI Service and checked the data lineage of the dataset – it was connected to the new dataset! YAY!!!!!! Since all the reports in this workspace used the same SSAS database, I could connect them all to the same singular dataset. Bonus: when it came time to set up the deployment pipeline, I only needed to change the data source rules for one dataset in each subsequent environment.
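Side note: if you end up with a whole workspace of reports to repoint at that one shared dataset, the Power BI REST API has a rebind endpoint that may save you from downloading and re-uploading each report. A minimal sketch, with placeholder IDs, assuming the “Rebind Report In Group” call works for your scenario (check the docs for current limitations):

```python
import requests

# Assumptions: an access token with rights to edit the report, plus placeholder
# IDs for the workspace, the report, and the shared dataset to bind it to.
PBI = "https://api.powerbi.com/v1.0/myorg"
token = "<access-token>"
workspace_id = "<workspace-id>"
report_id = "<report-id>"
shared_dataset_id = "<shared-dataset-id>"

resp = requests.post(
    f"{PBI}/groups/{workspace_id}/reports/{report_id}/Rebind",
    headers={"Authorization": f"Bearer {token}"},
    json={"datasetId": shared_dataset_id},
)
resp.raise_for_status()
print("Report rebound to the shared dataset.")
```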
Some may say this is overly obsessive. Maybe. But when you think along the lines of maintenance or verifying data lineage, I now only needed to check 1 dataset instead of going through a whole list. That can be a timesaver when troubleshooting problems.
AND it’s prettier. AND there may be other reasons you want to change that connection and this should help along the way. AND there was another big reason I was going to list, but now I’m thinking of sticky buns again so we will just leave it right there. It’s time for lunch.