I got 99 problems and Fabric Shortcuts on a P1 is one of them

If you’ve bought a P1 reserved capacity, you may have been told “No worries – it’s the same as an F64!” (Or really, this is probably the case for any P-to-F SKU conversion.) Just as you suspected – that’s not entirely accurate. And if you are trying to create Fabric shortcuts on a storage account that uses a virtual network or IP filtering – it’s not going to work.

The problem seems to lie in the fact that a P1 is not really an Azure resource in the same way an F SKU is. So when you go to create your shortcut following all the recommended settings (more on that in a minute), you’ll wind up with some random authentication message like the one below: “Unable to load. Error 403 – This request is not authorized to perform this operation”:

Screen shot with error message: "Unable to load. Error 403 - This request is not authorized to perform this operation"

You may not even get that far and just have some highly specific error message like “Invalid Credentials”:

Screen shot with "Invalid Credentials" error message.

Giving the benefit of the doubt – you may be thinking there was user error. There are a gazillion settings – maybe we missed one. Maybe something has been updated in the last month, week, minute… Fair enough – let’s go check all of those.

Building Fabric shortcuts means you are building OneLake shortcuts. So naturally I first found the Microsoft Fabric Update Blog announcement that pertained to this problem: Introducing Trusted Workspace Access for OneLake Shortcuts. That walks through this EXACT functionality, so I recreated everything from scratch and voila! Except no “voila” and still no shortcuts.

Okay, well – no worries, there’s another link at the bottom of the update blog: Trusted workspace access. Surely with this official and up-to-date documentation, we can get the shortcuts up and running.

Immediately we have a pause moment with the wording “can only be used in F SKU capacities”. It mentions it’s not supported in trial capacities (and I can confirm this is true), but we were told that a P1 was functionally the same as an F64, so we should be good, right?

Further down the article, there is a mention of creating a resource instance rule. If this is your first time setting all of this up, you don’t even need this option, but it may be useful if you don’t want to add the exception “Allow Azure services on the trusted services list to access this storage account” in the Networking section of your storage account. This certainly won’t fix your current problem, but it’s still good to go through all the documentation and make sure you have everything set up properly.
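For reference, if you do go the resource instance rule route, the rule lives in the storage account’s networkAcls. Here is a rough sketch of that ARM fragment as a Python dict – all GUIDs are placeholders, and as I read the trusted workspace access docs, the GUID in the subscriptions segment is your Fabric workspace ID (not a real Azure subscription), so double-check the docs for the exact resourceId format before using this:

```python
# Sketch of the ARM "networkAcls" fragment for a resource instance rule.
# All values are placeholders -- verify against the Trusted workspace access docs.
workspace_id = "00000000-0000-0000-0000-000000000000"  # your Fabric workspace ID
tenant_id = "11111111-1111-1111-1111-111111111111"     # your Entra tenant ID

network_acls = {
    "bypass": "AzureServices",   # the "trusted services" exception
    "defaultAction": "Deny",     # keep the firewall locked down
    "resourceAccessRules": [
        {
            "tenantId": tenant_id,
            # Per my reading of the docs, the GUID in the subscriptions
            # segment is the Fabric *workspace* ID, not a subscription.
            "resourceId": f"/subscriptions/{workspace_id}"
                          "/resourcegroups/Fabric/providers/Microsoft.Fabric",
        }
    ],
}
print(network_acls["resourceAccessRules"][0]["resourceId"])
```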

One additional callout I’d like to make is the Restrictions and Considerations part of the documentation. It mentions: “Only organizational account or service principal must be used for authentication to storage accounts for trusted workspace access.” Lots of Microsoft support people pointed to this as our problem, and I had to show them that not only was it not our problem, it wasn’t even correct. It’s actually a fairly confusing statement, because a big part of the article is setting up the workspace identity, and then that line reads like you can’t use workspace identity to authenticate. I’m happy to report that the workspace identity worked fine for us once we got our “fix” in (I use that term loosely), and without the fix we still had a problem with every other authentication option available (including organizational account).

After some more digging, on the Microsoft Fabric features page, we see that P SKUs are actually not the same as F SKUs in some really important ways. And shortcuts to an Azure Storage Account set to anything other than Public network access: Enabled from all networks (which, BTW, is against Microsoft’s best practice recommendations) are not going to work on a P1.

Fabric F SKU versus PBI P SKU functionality image.

The Solution

You are not going to like this. You have 2 options. The first one is the easiest, but in my experience very few enterprise companies will want to do this since it goes against Microsoft’s own best practice recommendation: change your storage account networking setting to Public network access: Enabled from all networks.

Don’t like that option? You’re probably not going to like #2 either, particularly if you have a long time left on your P SKU capacity. The solution is to spin up an F SKU. In addition to your P SKU. And as of the writing of this article, you cannot convert a P SKU to an F SKU, meaning if you got that reserved capacity earlier this year – you are out of luck.

In our case, we have a deadline for moving our on-prem ERP solution to D365 F&O (F&SCM) and that deadline includes moving our data warehouse in parallel. Very small window for moving everything and making sure the business can still run on a new ERP system with a completely new data warehouse infrastructure.

We’d have to spend a minimum of double what we are paying now – 10K a month instead of 5K a month – and that’s only if we bought a reserved F64 capacity. If we wanted to do pay-as-you-go, that’s 8K+ more a month, which we’d probably need to do until we figure out if we should do 1 capacity or multiple (potentially smaller) capacities to separate prod/non-prod/reporting environments. We are now talking in the range of over 40K additional at a minimum just to use the shortcut feature, not to mention we currently only use a tiny fraction of our P1 capacity. I can’t even imagine for companies that purchased a 3-year P capacity recently. (According to MS, you could have bought that up until June 30 of this year.)
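For anyone who wants to check my math, here’s the back-of-the-napkin version. The numbers are what we were quoted, not official pricing, and the 8-month figure is just an example of a remaining reserved term:

```python
# Rough monthly cost math (illustrative numbers, not official Azure pricing).
p1_reserved = 5_000       # what we pay now for the P1, per month
f64_reserved = 10_000     # a reserved F64 is roughly double
f64_payg_extra = 8_000    # pay-as-you-go F64 runs ~8K+ MORE per month than the P1

# Even on the cheaper reserved option, running both capacities in parallel
# adds 5K/month on top of what we already pay.
extra_per_month = f64_reserved - p1_reserved
print(f"Extra per month (reserved): {extra_per_month:,}")

months_left = 8  # example: remaining months on the P1 reservation
print(f"Extra over {months_left} months: {extra_per_month * months_left:,}")
```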

Ultimately many companies and Data Engineers in the same position will need to decide if they do their development in Fabric, Synapse, or something else altogether. Or maybe, just maybe, Microsoft can figure out how to convert that P1 to an F64. Like STAT.

Synapse Workspace: The Dreaded 403 error

This error has become the bane of my existence – with multiple configuration setups. Basically, anytime I dared to change the Synapse workspace to not allow all access in the Network settings, I was bound to run into it. Today’s tidbit only addresses ONE way that might solve this error, but it is so maddeningly simple that I decided to throw it up on my blog. I’ll save the rant about all the different rabbit holes I’ve been down in the last month for the “BEST PRACTICE SAYS YOU HAVE TO DO THIS <oh, except it’s not allowed in these cases, and I’m going to bury that information on some small page>” post. Hopefully I’ll post all the different ways to address this problem at some point.

First let’s look at the error message that you see when you open up your Synapse workspace. “Failed to load. Failed to load one or more resources due to forbidden issue, error code 403.”

Failed to load 403 error.

Hmmmmm, what is this “View details” you speak of?

Failed to load 403 error - more details.

Oh, ok. It’s just really telling me all the things that aren’t working. Awesome. And don’t let that “download diagnostics” button fool you. It’s not going to tell you anything helpful. (I know, shocking.)

So what’s the 2-second fix you can check before getting lost down your own rabbit hole? A little thing in the Azure portal, under your Synapse workspace in the Network settings. Drum roll……..make sure you have your own IP address (ahem, “Client IP address”) added as a rule along with whatever other network rules you’ve got going on. ARE YOU FREAKING KIDDING ME?

Synapse Networking firewall rules image with highlighted areas showing to add your IP address.

If you are thinking “yea, duh, there is a little button that says ‘Add client IP'” then ShutIT. They put that image smack in the middle of a Microsoft Learn page, without context, in an article about something else completely, so by the time you (ok, me) finish the other thing the article was about, I’ve completely forgotten about this rando item. And apparently I’m not the only one, because I’ve screen shared with a ton of super smart people (including MS peeps) and no one even noticed my IP wasn’t added.
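If you’d rather script the fix than trust yourself to remember the portal button, the firewall rule is just a small ARM resource. Here’s a sketch of the management-plane PUT for the Microsoft.Synapse firewallRules API – the names and GUIDs are placeholders, and you should verify the api-version against the current REST docs:

```python
# Sketch: build the management-plane PUT that adds a single-client-IP
# firewall rule to a Synapse workspace. All names/GUIDs are placeholders.
subscription = "<sub-id>"
resource_group = "<rg>"
workspace = "<synapse-workspace>"
client_ip = "203.0.113.42"  # your public IP (what the portal's "Add client IP" grabs)

url = (
    "https://management.azure.com"
    f"/subscriptions/{subscription}/resourceGroups/{resource_group}"
    f"/providers/Microsoft.Synapse/workspaces/{workspace}"
    "/firewallRules/AllowMyClientIP?api-version=2021-06-01"
)
# A single-IP rule just uses the same address for start and end.
body = {"properties": {"startIpAddress": client_ip, "endIpAddress": client_ip}}
print(url)
```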

For the record, my new mantra is “they created Fabric because they realized they had to simplify Synapse configurations. Microsoft realized too many people were getting pissed.”

UPDATE: If a developer is using PBI Desktop, you may have to add that person’s IP address to the firewall rules as well. Otherwise they may have issues refreshing tables.

#TidBitTrenches: SSMS Errors on Load

New install of SQL Server 2019 produced an interesting error upon opening SSMS.

Ran into an interesting issue today. Upon opening a new install of SQL Server 2019, I received the below error:

SSMS Error message. Text of error message below image.

The ‘Microsoft.SqlServer.Management.ThreatDetection.ThreatDetectionPackage, Microsoft.SqlServer.Management.ThreatDetection, Version=16.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91’ package did not load correctly.
The problem may have been caused by a configuration change or by the installation of another extension. You can get more information by examining the file ‘C:\Users\<username>\AppData\Roaming\Microsoft\AppEnv\15.0\ActivityLog.xml’.
Restarting Visual Studio could help resolve this issue. Continue to show this error message?

UGH. Not helpful. I do what any reasonable IT person does – I google the error. Not super helpful either. I look back at the error message. Wait a minute. That version number looks odd. Since this is a new box, I look at what has been installed.

Installed apps image highlighting 2 versions of SSMS

2 versions of SSMS? That in itself is not normally an issue, but it was the Preview version that gave me pause. Went to the authoritative voice – Microsoft – which said:

SSMS 19.0.1 is the latest general availability (GA) version. If you have a preview version of SSMS 19 installed, you should uninstall it before installing SSMS 19.0.1.

Ok. Fine. I’ll follow the rules. This once. I uninstalled both versions and then installed the most current version.

Gene Wilder gif from Young Frankenstein with text: "Its Alive" From https://media.tenor.com/SuADVxKkQ-AAAAAC/frankenstein-its-alive.gif

And just like that, I was back in business. Not even a reboot needed.

SSMS window without errors this time.

PBI: Report couldn’t access the data source

PBI data source error: Sometimes the easiest thing to check is actually the cause of the problem.

You’ve opened a report in PowerBI Service and you get the dreaded “This report couldn’t access the data source. Contact <author name> the author, to have it fixed.”

Image of the error message.

As we are expanding our report offerings and learning more with PBI Service, we get this message a lot. Often it’s a security issue, and this post isn’t about that rabbit hole, but rather a short and sweet reminder about a preliminary check that is often overlooked: is your data source up and running?

A lot of times in our Dev environment we’ve had something go wonky with our tabular model and need to re-Process the database. (That’s what dev environments are for – right?) This is what happened this morning when one of our developers reached out with this error message. As I first started heading down the security route, I paused and decided to check the database first. Ding-dong! Nothing weird in PBI – it was our tabular model. A quick check of a few tables in the cube confirmed that it wasn’t giving anything to anyone. Reprocessing the cube and then refreshing both the data AND the page(s) that gave the error message in PBI cleared everything up.
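For the curious, that “re-Process” is just a TMSL refresh command fired at the tabular instance (via SSMS’s XMLA query window or Invoke-ASCmd). A sketch of the command, with a made-up database name:

```python
import json

# Sketch of a TMSL "refresh" command for a full reprocess of a tabular
# database. "OurTabularDB" is a placeholder name.
refresh_command = {
    "refresh": {
        "type": "full",  # reprocess data and recalculate
        "objects": [{"database": "OurTabularDB"}],
    }
}
print(json.dumps(refresh_command, indent=2))
```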

Moral of the story and to put a twist on Occam’s Razor: check the easiest thing first.

SSRS/PBI RS Error: Cannot read the next data row for the dataset

I’m an accidental SSRS / Power BI Report Server / PBI.com Administrator, along with my other do-all-things-data duties. It’s been a hot minute (cough years) since I’ve messed around in SSRS and I’m relatively new to PBI – both from the reporting and the administering side. Years ago I did some work with Tableau, but my most recent work in that realm was more R and Python based. Suffice it to say – I run into issues a lot. Particularly with PBI Report Server (PBI RS).

Pssst – hint – PBI Report Server is basically the same as SSRS, so if you find some obscure problem and everything you google results in regular PBI answers, try googling the same problem with SSRS. Different database name, same almost everything else. YOU’RE WELCOME. You have no idea how much time I’ve just saved you.

That said, this specific error I came across was in SSRS, but really you could get this error from any of the reporting products.

You may receive an error that vaguely resembles this:

An error has occurred during report processing. (rsProcessingAborted)
Cannot read the next data row for the dataset <objectname> (rsErrorReadingNextDataRow).
For more information about this error navigate to the report server on the local server machine, or enable remote errors

While there are probably MANY reasons you may receive this error, I’m going to talk about one here: XML. That’s right, your old frienemy XML.

In our case, we were getting XML documents from AX/various sources and storing them in another database in an ntext column. <insert obligatory “I didn’t build it” disclaimer> So when another process was pulling it … say for a report… it was barfing all over the place. Running the snippet of SQL that the report was using resulted in a more helpful message:

Msg 9402, Level 16, State 1, Line 15
XML parsing: line 1, character 38, unable to switch the encoding

What’s in line 1 of all these xml documents? The XML Prolog.

You wanna know something else? It’s optional. And that pesky little UTF-8 encoding is causing all of your woes. My suggestion: find what is adding the prolog to your XML docs and get rid of it. (Unless it’s a person, then just tell them to cut it out.)
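If you can’t stop the prolog at the source and do need to strip it in flight, it’s basically a one-liner. A sketch in Python (the same idea works in T-SQL with CHARINDEX/STUFF against the `?>` terminator):

```python
import re

# Matches an optional XML prolog like <?xml version="1.0" encoding="utf-8"?>
# at the start of the document, plus any surrounding whitespace.
PROLOG = re.compile(r'^\s*<\?xml[^?]*\?>\s*')

def strip_prolog(doc: str) -> str:
    """Remove the optional XML prolog so the UTF-8 encoding declaration
    can't fight with the UTF-16 string type it's stored in."""
    return PROLOG.sub('', doc, count=1)

xml = '<?xml version="1.0" encoding="utf-8"?><order id="1"><item>widget</item></order>'
print(strip_prolog(xml))  # -> <order id="1"><item>widget</item></order>
```

Documents without a prolog pass through unchanged, so it’s safe to run over a mixed batch.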

There are alternatives you can do, but I wouldn’t recommend them. I’ll list some here in case you want to experiment (I didn’t). You could clean the data and remove the prolog (aka declaration or header information), but that’s a messy solution that will probably have to be changed over time. (More than you would think at first glance.) If you are on SQL Server 2019 or above, you could also consider using the UTF-8 encoding support they introduced – even at the column level – but that could have unintended consequences, and again, unplanned maintenance down the road. Personally, I have enough to do – thank you very much. (If you want to investigate this route, here’s a handy dandy little article: Introducing UTF-8 support for SQL Server.)

If you aren’t dealing with an ntext column, then you may have some additional alternatives here: XML Parsing – Unable to switch encoding using utf-8 when casting from NVARCHAR. But again, you run into the issue of changing column types/lengths and from personal experience, that can cause other big headaches. Don’t say you weren’t warned.

That’s it. I didn’t really plan on making my first kinda techie article to be about SSRS/PBI/XML – but it happened to me today and now it is yours. Use it as you want or stuff it in a turkey.

Speaking of turkey: my daughter drew a great pic last night of a turkey that didn’t look exactly like a turkey because she “wanted to draw a turkey that people wouldn’t want to eat”. Unfortunately it got messed up and I can’t share it with you. As a poor substitute, I give you a turkey from Summer of 2021 that was contemplating his attack on ME.

Turkey about to attack me.

Hope you all have a wonderful Thanksgiving holiday!

–Data Work Mom