-
Migrating data solutions to the cloud: a checklist. Part 2: Discovery
This post is a continuation of Part 1: Pre-Planning and Evaluation in a 9-part series. If you want to download the full checklist or slides without all the wordy-word stuff, you can find them in this GitHub repository. (The checklist has wordy-word stuff. No getting around that.)
Topics covered today:
- Digital Estate
- Data Management
- Data Lineage
- Pilot Project
Digital Estate
Understanding your digital estate at the beginning of your project will help you determine what to assess and migrate down the road. Even if you already think you know all the things you need to migrate, it’s helpful to check how all of the things may be connected. You need to identify your infrastructure, applications, and all the dependencies. You don’t want any surprises! Don’t just rely on old documentation.
Azure Migrate has a Discovery and Assessment tool that can assist in this task, but there are certainly many other ways to acquire this information. You may have other third-party tools or internal processes that already gather this information for you. Just make sure that it is up to date. Personally, I really like the free pre-Azure Migrate solution, the Microsoft Assessment and Planning (MAP) Toolkit, as it dumps everything into Excel sheets that admins and management tend to like to see. But the visual display of Migrate (and ALL the additional tools) is pretty fantastic.
Some options available in the MAP Toolkit.
Whatever tool you use, from a database perspective you want to know things like what database systems are in your environment, what version and edition they are on, how many databases may be on an instance, and what the database names, file sizes, statuses, users, and configurations are, along with other assorted database metadata. You are going to want to know some performance metric results and additional server details. You are going to want to know the various components that are installed on your servers, details about those components, and how they are used. Are you REALLY using those SSRS and SSAS components and if so, how?
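If you're more of a script person (guilty), a rough T-SQL sketch of that kind of inventory pull might look something like the below. It's a minimal starting point, not a replacement for a proper assessment tool, and you'd run it per instance:

```sql
-- Rough inventory sketch: instance-level properties plus per-database basics.
-- Extend with whatever your checklist needs (users, configurations, etc.).
SELECT
    SERVERPROPERTY('MachineName')    AS server_name,
    SERVERPROPERTY('ProductVersion') AS product_version,
    SERVERPROPERTY('Edition')        AS edition,
    d.name                           AS database_name,
    d.state_desc                     AS database_status,
    d.compatibility_level,
    SUM(mf.size) * 8 / 1024          AS size_mb   -- file size is stored in 8 KB pages
FROM sys.databases AS d
JOIN sys.master_files AS mf
    ON mf.database_id = d.database_id
GROUP BY d.name, d.state_desc, d.compatibility_level;
```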
Lastly, you want to make sure you know all of your relationships between applications, instances, database objects, and processes. It’s no fun to find out later that you had a database with hard-coded servers in some stored procedures or unknown linked server requirements. Or a SQL job that PBI Report Server created for each data refresh.
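For those linked server and hard-coded server gotchas specifically, a couple of quick checks like the ones below can surface surprises early. ('OldServerName' is just a placeholder – swap in whatever server names you're hunting for, and run the module search in each database.)

```sql
-- Any linked servers defined on the instance?
SELECT name, product, provider, data_source
FROM sys.servers
WHERE is_linked = 1;

-- Any hard-coded server names buried in procs, views, functions, or triggers?
SELECT OBJECT_SCHEMA_NAME(object_id) AS schema_name,
       OBJECT_NAME(object_id)        AS object_name
FROM sys.sql_modules
WHERE definition LIKE '%OldServerName%';  -- placeholder name
```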
The Key Take-Aways here:
1.) Identify the infrastructure: things like servers.
2.) Identify what apps they use – this includes all your SQL Server apps!
3.) And identify dependencies they may have, internally and across servers. Don’t forget to include things like ports/networking.
Data Management
Now is the time to find out what documentation you have about your data (and what you need to get). Having this information is essential if you determine you need to move things in parts, or if you have overlap in data that might potentially be consolidated. This will help you down the road when we get into some architecture designs with the 5 Rs of rationalization. Our focus here is on having a data dictionary, a business glossary, and a data catalog, and on classifying your data.
A quick summary of these terms: a data dictionary helps you to understand and trust data in databases better, a business glossary provides a common language for the organization when it comes to business concepts and metrics, a data catalog helps you to find, understand, trust and collaborate on data, and data classification groups your data elements to make it easier to sort, retrieve, and store.
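If you're starting from zero on the data dictionary front, you can script a bare-bones one straight out of SQL Server's own metadata. A minimal sketch (it assumes someone has been populating MS_Description extended properties, which, let's be honest, is a big assumption):

```sql
-- Bare-bones data dictionary: tables, columns, types, and any descriptions
-- someone (hopefully) stored as MS_Description extended properties.
SELECT SCHEMA_NAME(t.schema_id) AS schema_name,
       t.name                   AS table_name,
       c.name                   AS column_name,
       ty.name                  AS data_type,
       ep.value                 AS description
FROM sys.tables AS t
JOIN sys.columns AS c  ON c.object_id = t.object_id
JOIN sys.types   AS ty ON ty.user_type_id = c.user_type_id
LEFT JOIN sys.extended_properties AS ep
       ON ep.class    = 1                -- object/column properties
      AND ep.major_id = c.object_id
      AND ep.minor_id = c.column_id
      AND ep.name     = 'MS_Description';
```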
Why are these things important for migration? First off, they are important just from a data governance standpoint. But more than that, knowing this information up front can save you a lot of headaches down the road. You may have business requirements for some of your data to be labeled in a security context. Maybe you are dealing with highly classified government data, health care data, or HR data. Or you may find you have data type mismatches. And data catalogs often reveal hidden dependencies that you may not have otherwise known about.
All is not lost if you don’t have all of this. Azure has native tools like Purview to assist with this, and there are plenty of third-party tools. If you are like me, you already carry a script toolbox from the lifetime of your career (some of those scripts from 20 years ago still work!) that you can easily use. Apart from the Business Glossary, there are so many free options and scripts out there that this should not be a showstopper for you. For the Business Glossary – you are going to have to go to the source – your subject matter experts (SMEs).
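As one example of the scripted route: if you're on SQL Server 2019+ or Azure SQL, the built-in sensitivity classification feature covers the classification piece at the column level. (The table and column names below are made up for illustration.)

```sql
-- Tag a column with a sensitivity label (hypothetical table/column).
ADD SENSITIVITY CLASSIFICATION TO dbo.Employee.Salary
WITH (LABEL = 'Highly Confidential', INFORMATION_TYPE = 'Financial');

-- Review what has been classified so far in this database.
SELECT SCHEMA_NAME(o.schema_id) AS schema_name,
       o.name                   AS table_name,
       c.name                   AS column_name,
       sc.label,
       sc.information_type
FROM sys.sensitivity_classifications AS sc
JOIN sys.objects AS o ON o.object_id = sc.major_id
JOIN sys.columns AS c ON c.object_id = sc.major_id
                     AND c.column_id = sc.minor_id;
```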
Data Lineage
In addition to the previous items we mentioned for data management, I want to call out data lineage specifically.
Data lineage gives you insight into how your data flows. It helps you understand how your data is connected, and the impact that changes to your data, processes, and structure have on the flow and quality of your data. KNOW YOUR DATA FLOW. Find out where your data comes from, how it travels, the place(s) it lands, and ultimately, where else it goes.
There are a lot of tools that will help you with data lineage, with various levels of sophistication. Long gone are the days when you must sift through Excel sheets to figure it all out. That’s why graphical tools like Purview are really exciting for me. [Note: from initial insights into Purview costs once it’s past the preview stage – it gets pretty pricey, fast.] This is an image of Azure Purview, and I wanted to show how granular it can get at the column level and how data travels through various processes and databases.
The column-level feature is really, really nice. It’s not necessary at this stage, but it certainly is helpful to you at the testing and troubleshooting phases. What you really need from your data lineage at this stage – and you can still see it in this graph – is how your data flows between resources. Because this is a great way to discover things you may not be aware of in your data flow process that you need to pull into your migration plan.
What else can data lineage help with? Reporting considerations. Knowing what can break in a report if you change something at the source is invaluable. And getting a big picture of what reports, models, and applications may be impacted after a migration helps circumvent some nasty surprises.
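If you don't have a lineage tool yet and just need a quick “what references this object?” answer inside a database, the dependency catalog views are a poor man's starting point. Fair warning: this only catches references SQL Server can parse, not anything hard-coded in an app or report. (The object name below is hypothetical.)

```sql
-- Who references dbo.SalesOrders? (hypothetical object name)
SELECT OBJECT_SCHEMA_NAME(referencing_id) AS referencing_schema,
       OBJECT_NAME(referencing_id)        AS referencing_object,
       referenced_database_name,   -- populated for cross-database references
       referenced_server_name      -- populated for linked-server references
FROM sys.sql_expression_dependencies
WHERE referenced_entity_name = 'SalesOrders';
```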
Pilot Project
If you haven’t previously moved anything related to your infrastructure to the cloud, then consider having a much smaller pilot project first. One that will give you a feel for all of these steps but has a lower risk than your overall project.
What items do you look for in a pilot project?
Maybe you have a database that is only used for a small app that is low risk if the migration doesn’t go as expected. Try to keep your pilot project to applications with just a few dependencies. The goal of this is to a.) help you understand the process and b.) get you a quick win that you can show to stakeholders.
You want one that is low-risk, small enough to manage easily, but still large enough, with a long enough duration, to give you a good understanding of the processes involved. Besides size and duration, the criticality of your project is important. You want to incorporate a visible win that is important to your company and supports making bigger moves.
Finally, if you’ve already done this previously, then this is when you review what you’ve learned from your previous pilot project. What were the gotchas? What went really well? What is easily repeatable, and what do you need to get down on paper?
Welp, we’ve come to the end of Part 2. Feel free to hit me up with items you think I’ve missed or that you want more clarification on. Next week is a much-needed break for me, so [probably] no updates from me. Hope everyone has a wonderful Mother’s Day!
From,
DataWorkMom
-
New Stars Of Data #5 – May 12th!
In case you've forgotten, New Stars of Data is back! Bounce bounce bounce.
Super excited, as this is the program that helped get me started with my speaking (along with nudges from several #sqlfamily in the community). The event is the 5th in the series and is tomorrow (May 12, 2023). The event is free, but make sure to register here: https://www.meetup.com/datagrillen/events/291222930/.
In case you are unfamiliar with New Stars of Data, let me give you the quick summary: Ben Weissman and William Durkin from DataGrillen saw a huge gap in the community for getting new MS Data platform speakers up and running and helping to ensure their success. (Ok, that’s actually an assumption on my part, but that’s my take on it.) Rather than just offer speakers a space to get noticed, they designed a program that partners the speaker with a mentor to help every step of the way. Sometimes multiple people (both Ben and Gabi Münster helped me with mine). They help with topic selection, abstract writing, presentation development, and even practice run-throughs. There is also a free library for anyone to use to help improve your speaking skills. They remove the barrier of “I know this subject really well, but I don’t want to suck in a public forum”. Afterwards, the videos are available on YouTube (after the volunteers do all their wizardry to get them uploaded). So really, what you end up with is a free event on current topics with great speakers.
I’ve mentioned this before, but I will also mention it continuously: attending these events (and/or watching the videos on YouTube) helps diversity in our community. By default, underrepresented people have a harder time finding mentors for stuff like this, which makes these programs super important. If that’s a topic that interests you – then make sure you support events like these.
While I’ve been in the SQL community for many, many years, I have to admit it was super exciting to meet in person, for the first time, a lot of the people at PASS DCS last year who directly and indirectly helped me during my New Stars of Data event. As well as the many friendships that have developed since. It’s not always possible to meet those that have made such an impact in your life, but if you get the chance, it’s pretty awesome.
Me, Ben, and Johan at Karaoke night.
The event is tomorrow and it is on CET time (or maybe it’s GMT, I forget), but don’t let that stop you if you are in a different time zone! I plan to catch a few sessions as I am getting ready in the morning and during the remainder of the event. I’ll circle back around and catch the ones that are too early for me on YouTube. The session lineup looks really great, and I’m chuffed to see all the new, up-and-coming speakers. Really relevant topics on things like Power BI, Azure, Machine Learning, and other SQL/Data topics. Get your TRAIN on and join us if you can!!!!
-
Weekly Wrap Up – April 28, 2023
As usual, it’s been a busy few weeks at my job and in my personal life, so I’ll try and make this short and sweet.
Weekly Wrap up (technically from the last 2 weeks):
- Certification update
- WITspiration
- Women don’t owe you an explanation <rant>
- DPWIT-DEI Mental Health and Wellness Day
- Part 1 of blog series: Migrating data solutions to the cloud – a checklist
- Favorite Items of the Week in the Wild
Certification Update
Back at the end of January I did a Microsoft Virtual Training Day for Azure Fundamentals because I remembered that some of the VTD sessions offer a free certification test if you take it within 90 days. So I signed up, thinking it would force me to get it done with a clock ticking. There was no reason for me not to take the test at that point: the class is free, the test is free, and I could take both during work and online (a big deal because my kids could interrupt during the test otherwise, which is not allowed). I even told myself “I’ll put it just under 90 days out so I have a chance to study”. (HAHAAHHA. The lies we tell ourselves.)
The test is pretty broad and there are a few sections that I didn’t have experience in. Truth be told, I was at work so I frequently got interrupted during the VTD, but still got a lot of information from it. Even for the things I knew, but don’t necessarily use in my area, it was good reinforcement. I highly recommend using VTDs and other free resources such as Microsoft Learn, as part of your cert training strategy.
Long story short, I took the VTD, forgot about it, and then realized test day was upon me. Freaked out, studied additionally for a few days, and lo-and-behold passed. Yes, there were a few questions that had stuff that was new to me, but it was a lot easier than I had built it up to be in my mind.
Long story short – sometimes it’s a good idea to take the plunge. Even if you don’t think you are ready. (That’s kinda my thing). I had a bunch of things going on that day that blew up on me, so I really didn’t mention it at all publicly. I’m trying to get better at announcing my accomplishments – so there you go.
WITspiration
Meanwhile, we’ve begun our work with WITspiration! I met with all the members a while back, and today I had my first meeting with my Circle. SUPER excited to be in a circle with such amazing women! I forgot to ask permission to post their names, so I will wait until I get a thumbs up for that. But I think we are going to create amazing things while having a sounding board for each other for all things. Stay tuned for more information, and I will try and remember to tag it for easy filtering.
Women don’t owe you an explanation
<begin rant>
We interrupt this regular broadcasting to explain YET AGAIN that women don’t owe you an explanation. Recently a male reached out to me in a DM with a sentence that started along the lines of “EXPLAIN YOURSELF”. It was not about anything technical, and quite frankly, with a little bit of common sense or googling the person could have figured it out. It was not someone I know, but it was someone many in our #SQL community know. I’m not going to call the person out, but if you are reading this: YOU HAVE NO RIGHT TO ASK ME THAT AND/OR DEMAND I TELL YOU ANYTHING PERSONAL ABOUT MYSELF.
I didn’t reply because quite frankly I was in shock, then I checked with a few friends to verify that it was way out of line (it was, of course), and then I had other bombs drop on me that day that made the situation pale in comparison. Now in hindsight I wanted to make sure to call it out here, because whether you are THAT guy or just a GUY LIKE THAT – I want you to know: never, never ever, never ever ever, reach out to some woman you don’t know asking her to explain anything that is none of your damn business. And if you are starting a sentence off with “EXPLAIN” and it has nothing to do with a technical thing – I can assure you – it’s none of your damn business.
<end rant>
DPWIT-DEI Mental Health and Wellness Day
If I haven’t mentioned it, and I don’t think I have, I’m presenting the session How to NOT DO all the Things – My challenges with Neurodiversity at Data Platform WIT-DEI Mental Health and Wellness Day – May 5th, 2023.
Honored to be speaking again for the Data Platform Women In Technology group, and even more honored when I look at the great speakers lined up. I don’t speak on non-technical things too often, and this is my first session ever talking about Neurodiversity, so it will probably be way different than other sessions you have seen me give. If you are used to operating with 1000x things at a time – block off 11:30 AM CDT on your calendar and check it out (or many of the other GREAT sessions). Sign up here.
Part 1 of blog series: Migrating data solutions to the cloud – a checklist
Last, but not least – Part 1 of 9 (yes, 9) is out in case you missed it. This covers the non-technical, but very necessary, items you need to make sure you do when planning a migration. It reviews key items you need to do for Pre-Planning and Evaluation. Next week I will push out Part 2: Discovery.
Favorite Items of the Week in the Wild
None.
Well, it’s not that there aren’t any, I just need to get this post out. I know, I know, I told you I would add it with each wrap-up, but I’m not this week and we are all going to just have to deal with it.
How bout this: I turned 51 on Wednesday, and as I’ve posted elsewhere: Being the same age as old people is weird, but I’m hanging in there. Happy 51 to me!
-
Migrating data solutions to the cloud: a checklist Part 1 of 9
If you are a follower of me in the various places or a reader of this blog, you know that recently I presented the session Migrating data solutions to the cloud: a checklist at SQLBits 2023. As promised in the weekly wrap-up from April 7th, here is the first installment of a more in-depth dive into that session. The session had 9 parts to it, and so I thought that would make for nice chunks to consume in blog posts. I’ll add links to each one so it will be easy to navigate once all of the posts are up.
The Parts are broken down as follows:
- Pre-Planning and Evaluation
- Discovery
- Assess
- Architecture
- Costs
- Migrate
- Testing
- Post-Migration
- Resources and Closing
This post will focus on the first item in the list: Pre-Planning and Evaluation.
Let’s get right to it! Ok ok. I know you are probably chomping at the bit wanting to immediately get into the technical parts – I know how it goes – but that is not where you start.
“Begin at the beginning,” the King said, very gravely, “and go on till you come to the end: then stop.”
― Lewis Carroll, Alice’s Adventures in Wonderland
Goals
What is your company’s goal of moving to the cloud? Some examples are:
- Reduction in CapEx (capital expenditures)
- Lower reliance on complex and/or end-of-support technologies
- Optimization of internal operations
- Reduction in vendor or technical complexity
- Preparation for new technical capabilities
Modernization? Cost-savings? Multiple things? Maybe your current infrastructure is at end-of-life, and it just really makes sense to modernize it and move it over to the cloud before you are dead in the water. Or maybe your goal is to reduce downtime for managing infrastructure and upgrades.
What is the problem you are trying to solve? Is there a business requirement driving this, or maybe something else? Make sure you understand what problems you are trying to solve and whether your goals actually address those problems. That is a business decision that has to be determined WITH your technical team and not necessarily BY your technical team. I’m a technical person, and I’m going to be the first person drawn to the shiny new object – which often is not the best thing for the company strategy.
Ultimately your problems can drive a portion of your goals, but you may have additional objectives. These items need to be determined upfront so the choices you make during the process align. They create value streams for your organization. Higher-ups love to hear about value streams!
Once you have your objectives – you can create a plan and KPIs towards those objectives and then show how you met those at the end. This is our objective, this is the state at the beginning, this is the state at the end. Look how far we’ve come!!!
It’s hard to have project success if you don’t define what your objectives are. This is a project, and if you’ve been working on projects for a while, you know how easy it is to go sideways and lose sight of your goals. Sometimes those goals change a bit, and that’s ok, but ultimately your success will be measured by whether you achieve clearly defined goals that you can apply metrics towards.
Support
Executive Support
Many times, you will go into a migration project that is being driven by the top level. Maybe your CEO went to a conference or read some important articles – and if so, you are ahead of the game (well, until you have to set expectations properly). But if you don’t already have executive support, then you need to identify an executive sponsor. Someone who is high up and can lead a cultural shift within your organization. A Champion. <insert bugle sounds here>
Stakeholder Buy-In
You need to identify and involve key stakeholders early in the process, on both the business and IT side. Involvement and communication are key.
People don’t like change. Try telling a bunch of financial analysts that have been doing their job for over 20 years and have lived through multiple technology shifts – try telling them they won’t have Excel anymore. I promise you, it won’t be pretty. Rather than focus on what you are taking away – focus on why the change is needed, and what it gives them. How can you get their buy-in for your project, while still being honest?
If you don’t have that executive support or stakeholder support, then you may need to do a little digging. What are your company’s core values? What is the 5- and 10-year plan for business initiatives? How does moving to the cloud address these? At the department level – what are the primary goals and concerns from a technical perspective? How can moving to the cloud help with those? Everyone’s answer will not be the same, but look at how you can align key stakeholder needs with the goals of the C-Suite. Building a business case that addresses top concerns, and how the cloud will help, is ideal for this situation.
Teams
The next thing we need to consider is what we will do in-house versus what we may pull in a partner for.
For your internal team – first determine the different areas you will need an SME (subject matter expert) for, and get a good idea of what they need to know. Second, identify potential key people you may have in-house to fulfill these roles. Ideally you will have people in your organization from across many teams that you can assign to roles. Once you have assigned roles to people, you need to assess their skills. This will allow you to see what gaps you need to fill, and then plan for how your org will both fill and monitor those domain gaps.
When possible, identify potential team members that either have the skills or can upskill. Build a skills readiness plan. Investing in your people is far cheaper than hiring outside help. Consultants are great – I know, I’ve been one, but there is no replacement for having in-house knowledge, particularly once those consultants have finished and walked out the door – potentially without much knowledge transfer. The people that will be maintaining and growing your systems need to have the skills.
That said, in some cases, you may want to hire outside help. Maybe many of your team members are really spread thin, or the upskill process doesn’t match the timeline – then definitely bring in consultants.
Common things for partners to be involved in:
- Strategy: Help with defining business strategy, business cases, and technology strategies.
- Plan: Help with discovery tasks, such as assessment of the digital estate and development of a cloud adoption plan.
- Ready: Support with landing zone tasks.
- Migrate: Support or execute all direct migration tasks.
- Innovate: Support changes in architecture.
- Govern: Assist with tasks related to governance.
- Manage: Help with ongoing management of post-migration tasks.
Along the lines of consultants – help them help you. Does your company have standards or best practices that are specific to your industry or workflow? Share that with them. Share any and all documentation that is relevant to your migration and environment.
If your company is able – consider creating a board for key team members to interact and meet regularly: like a Cloud Center of Excellence (CCoE). Key members are chosen from each IT area to meet at a specified cadence to discuss all things Azure, including identifying domain gaps. Those key members not only gain knowledge about technical aspects, but also get insight into projects that are going on through the different areas of the company. And they can take that information back to their team with a level of consistency. Which is super important when you want to establish best practices and standardize policies across teams.
Documentation
Now we need to find out what we have regarding documentation.
From the technical side you need to assess what your current infrastructure looks like. What type of documentation do you have for your data estate and infrastructure? Where is your data? What does your data require for things like storage and memory? What are all the components that connect to what you want to move? If you don’t know what your current infrastructure looks like, you need to determine how and who will document that for you.
On the business side: What are the business requirements for your systems? Do you have regulatory requirements? SLAs mandating certain uptimes? What about other business requirement docs for security, compliance, backup, Recovery Point Objectives (RPO), and Recovery Time Objectives (RTO)?
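As one concrete example of checking the docs against reality: compare your documented RPOs to what your backup history actually shows. A rough sketch:

```sql
-- Last full backup per database: a quick sanity check against documented RPOs.
-- Type 'D' = full database backup in msdb's backup history.
SELECT d.name                    AS database_name,
       MAX(b.backup_finish_date) AS last_full_backup
FROM sys.databases AS d
LEFT JOIN msdb.dbo.backupset AS b
       ON b.database_name = d.name
      AND b.type = 'D'
GROUP BY d.name
ORDER BY last_full_backup;
```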
And finally – have you previously tried to do a cloud migration? What documentation is left over from that? Is it current? What were some of the successes and some of the failures? Any lessons learned?
You don’t have to know everything at this point, but you do need to find out what you have and what you don’t have. And if we are perfectly honest – what may be out-of-date. This is where you need an Enterprise Architect and some in-depth meetings with the different departments. Communication is key to getting the documentation and finding what you have, what you don’t have, and what you may need down the line.
Welp, that concludes Part 1: Pre-Planning and Evaluation. What would you add to this list? Have I missed any additional parts that you’d like to see in the overall series? I love to get feedback to consider, so feel free to reach out.
Stay tuned for Part 2: Discovery.
-
Weekly Wrap Up – April 7, 23
Bit behind on the blog, but I’ve made a couple of tiny improvements recently. My initial idea with the blog was “just get it up as quickly as possible” with little regard to formatting and functionality (besides just reading a post). If I’m honest, these are the things that have slowed me down in prior attempts to get content out there: gold plating. Now that it’s been up for a while, I did some quick edits on the back end so you can at least jump to specific categories, tags, or do an easy follow. It’s still bare-bones, but right now getting more content out is my goal for the spring.
Weekly Wrap Up:
New Stars Of Data shoutout
SQLBits 2023 Reflections
Planned upcoming content and activities
Favorite Items of the Week in the Wild
New Stars of Data shoutout
This is super long overdue, but my ADHD makes me procrastinate often and things fall by the wayside. Not a great excuse, but I’ll own it.
First off, if you don’t know about New Stars of Data (run by Ben Weissman and William Durkin) and you love learning all things data virtually – then please take a gander. New Stars of Data is a fantastic program that gives new speakers a chance to get out there. Speakers get one-on-one mentoring every step of the way, and the program helps new speakers deliver great content and get on the road to speaking. How do I know? That’s where I started! (So yes, I have a lot of love and bias towards the program.)
The next event is May 12, 2023, and I’m really excited about the line-up. Please consider joining the event not only to get great content, but to support and encourage new speakers. If you are unable to attend the event (or a particular session), NSOD posts the videos later to the DataGrillen YouTube page. Plus – if you are really serious about wanting to get more women into speaking – this is a part of it. After I spoke at NSOD, several user groups reached out to me to keep the ball rolling. All of which resulted in my next topic: SQLBits 2023.
SQLBits 2023 Reflections
Confession time: when I submitted to SQLBits last year, I didn’t really expect my session to be accepted. I had started a new job in Dec 2021 after a long hiatus, and I had stopped speaking so I could focus on getting where I wanted to be for my new company. Seeing the call for speakers for SQLBits, I promised myself I’d at least submit, and I chose a topic that I thought would also be great for my company. Not knowing if I’d be able to travel, I submitted to do a virtual session, which I knew had limited spots available.
And then it was accepted.
Here is where I will give a tiny bit of advice: As soon as you are accepted – start working on your presentation. You have no idea what will come down the pipeline to derail you. Everything that could throw my schedule out the window – did. You have been warned: what you do with that knowledge is up to you.
Now, a few weeks after the event, I can comfortably say it was incredible. Not just as a speaker (WHICH WAS AMAZING), but also as an attendee. Even as a virtual speaker/attendee. While I did miss the interaction and fun at the actual location – the level of detail that was provided every step of the way was beyond my expectations. I still felt a part of the event (and yes, even dressed up for my session!).
I cannot express how much I appreciate the opportunity not only to speak at such a great event, but to do so virtually. As a woman in tech with children – traveling has been a great barrier for me and is a large part of what prevented me from speaking in earlier years. So thank you, organizers of SQLBits, for allowing this option – I know it comes at great cost and work for organizers and volunteers. Having this option helps me represent women in the community and still be a part of something really special. The experience was one I will never forget, and while traveling overseas to present was never on my list prior – it is a new goal for me for future years.
Planned Upcoming Content and Activities
While I’m high off of speaking (and getting good reviews!) at SQLBits, I’ve gone ahead and submitted sessions for a few other events, and I’m considering another large one. That one is in person, so I’m still waffling a bit on it. More about that in a future post. Hopefully you’ll see lots more of me speaking this year. My main issue (besides traveling) is that I have so many ideas and I always want to do THE NEXT THING, so I haven’t really refined the art of reusing a session. And for me, creating new content is a really laborious process. Hopefully as I do it more, I’ll figure out how to get better at that.
That said, my slides for the SQLBits session “Migrating data solutions to the cloud – a checklist” should be available soon through the SQLBits website. (The holdup is me.) An interesting thing I learned when creating my session was that I had way too much content. Probably about 4 hours’ worth. And I had to go back and chop a lot out. So my goal over the next few months is to transfer that chopping into a series of blog posts. If you made the session – first off, THANK YOU – and secondly, the blog posts will be a more in-depth coverage of each section I went over. Expect a lot more technical aspects in certain sections.
Finally, I’ve got 2 other new initiatives I’m working on that are specific to women in tech. They are both in the beginning phase, so I can’t go into details on one of them, but the other one is Tracy Boggiano and Deborah Melkin's women's mentoring program: WITspiration! For all the ups and downs of being a woman in tech, I have to say that it has made such an incredible difference in my life and I want to share that with others. For a kid that had so much going against her, being a woman in tech has given me opportunities that I would never have had otherwise. Expect to hear more about that in the future.
Favorite Items of the Week in the Wild
Yesterday I was thinking about this as a routine feature on this blog that includes both fun things and technical stuff, but for this week it’s only going to include 2 items – FUN STUFF. Mostly because I just started saving things late yesterday and also because I’ve far surpassed the time block I gave myself for today’s post.
With that I give you:
1.) Ian Coldwater's adventures into Kubernetes erotica by ChatGPT. If you don’t already follow Ian, you should. They are full of all kinds of awesomeness.
2.) Generative AI by Auralnauts: Your favorite 90’s song “Ice Ice Baby”, performed by the cast of The Matrix. And Wilford Brimley for some reason.
Have a great weekend!
-
#TidBitTrenches: SSMS Errors on Load
Ran into an interesting issue today. Upon opening SSMS on a new install of SQL Server 2019, I received the below error:
The 'Microsoft.SqlServer.Management.ThreatDetection.ThreatDetectionPackage, Microsoft.SqlServer.Management.ThreatDetection, Version=16.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' package did not load correctly.
The problem may have been caused by a configuration change or by the installation of another extension. You can get more information by examining the file 'C:\Users\<username>\AppData\Roaming\Microsoft\AppEnv\15.0\ActivityLog.xml'.
Restarting Visual Studio could help resolve this issue. Continue to show this error message?

UGH. Not helpful. I do what any reasonable IT person does – I google the error. Not super helpful either. I look back at the error message. Wait a minute. That version number looks odd. Since this is a new box, I look at what has been installed.
2 versions of SSMS? That in itself is not normally an issue, but it was the Preview version that gave me pause. I went to the authoritative voice – Microsoft – which said:
SSMS 19.0.1 is the latest general availability (GA) version. If you have a preview version of SSMS 19 installed, you should uninstall it before installing SSMS 19.0.1.
Ok. Fine. I’ll follow the rules. This once. I uninstalled both versions and then installed the most current version.
And just like that, I was back in business. Not even a reboot needed.
-
#TidbitTrenches: Production fixes
Recently we ran into an issue with one of our Production SQL Server boxes. We knew the fix: we’d have to upgrade to SQL Server Enterprise. And quick. My server guy asked me if he needed to block out time THAT DAY so we could get things rolling. It’s a reasonable question; he needs to make time in his schedule to do such tasks. I said no.
WHAT? WHY? Because years of experience (and impulsivity) have taught me to first stop and think.
Some examples of things that I had to think about in this situation:
- Did I want to upgrade our version along with our edition? (Yes, yes, I did)
- If yes, what version – the latest? (Probably not)
- Do I want a new server or is this an in-place upgrade? (New Server)
- What else could this affect? (Lots)
Tons of other things to think about, but since this post is supposed to be a tidbit, we will stop there. Here’s the thing – we could have affected a lot.
Let’s address each of these examples to give us an idea of what can go wrong. First off, our current SQL Server version is in extended support and we are missing out on features I want to implement. These features match with our company’s goals over the next few years, and our next big lift should be more Azure focused than SQL Server version focused. That means I don’t want to install another version 1 year from now. I’m a busy gal and 1 year goes by quickly!
Does that mean I want the latest version? More bang for the buck – right? Brent Ozar’s post Which Version of SQL Server Should You Use? gave me pause. And when I thought about it more, I realized we may have compatibility issues with other apps that interact with our databases on that server. (BTW: I checked with one of the vendors and they confirmed they’ve done ZERO testing on the latest SQL Server version and do not know of any current clients that are using it.) So I needed to really weigh benefits versus risk on what version we should go to.
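Side note: before the target-version debate even starts, pin down exactly where you are. Something as simple as this gives you the starting line:

```sql
-- Current edition, version, and servicing level of the instance...
SELECT SERVERPROPERTY('Edition')        AS edition,
       SERVERPROPERTY('ProductVersion') AS product_version,
       SERVERPROPERTY('ProductLevel')   AS product_level;

-- ...and the database compatibility levels you'd be dragging along.
SELECT name, compatibility_level
FROM sys.databases
ORDER BY compatibility_level;
```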
What about in-place upgrades? While I’ve done more in-place upgrades than I care to count, occasionally they can cause unexpected issues; I didn’t want to add more variables when I was already tackling an issue fix. Full stop.
And finally – what else could this affect? This is such a fundamentally important question that I really could have just had that as a singular bullet point. How does your data flow? Are apps that are part of the data flow process going to play nice with your changes? What is connecting to it? Do you even know? What is your plan when you get a ton of support calls because things aren’t working that you didn’t even know about? If you’ve never encountered this type of scenario – go take a look at the book The Phoenix Project. I listened to the audio version years ago and seriously LOL’d at some parts because I felt like I had lived it. I’m not alone in this.
Final thought: Before being the superhero and trying to fix something super fast in Production, stop and think. You’ll probably find a temporary solution that will hold you over until you’ve tested out any major changes. And that’s your tidbit from the trenches for today.
-
PBI: Report couldn’t access the data source
You’ve opened a report in Power BI Service and you get the dreaded “This report couldn’t access the data source. Contact <author name>, the author, to have it fixed.”
As we are expanding our report offerings and learning more with PBI Service, we get this message a lot. Often it’s a security issue, and this post isn’t about that rabbit hole, but rather a short and sweet reminder about a preliminary check that is often overlooked: is your data source up and running?
A lot of times in our Dev environment we’ve had something go wonky with our tabular model and needed to re-process the database. (That’s what dev environments are for – right?) This is what happened this morning when one of our developers reached out with this error message. As I first started heading down the security route, I paused and decided to check the database first. Ding-dong! Nothing weird in PBI; it was our tabular model. A quick check of a few tables in the cube confirmed that it wasn’t giving anything to anyone. Reprocessing the cube and then refreshing both the data AND the page(s) that give the error message in PBI cleared everything up.
Moral of the story and to put a twist on Occam’s Razor: check the easiest thing first.
-
Presenting… Me! At SQLBits 2023
At the end of last year I vowed to get back into speaking. I’m in a great place in my new job and have had some time to do some training I wanted to catch up on. And so, on a whim, I submitted a session to SQLBits 2023 (yes on the last day, don’t judge). For those not in the know, SQLBits is one of the largest data conferences in the WORLD (don’t believe me? Look at the stats on the website!). It’s also in Newport, Wales. And I’m not…
Traveling is not always an option with small kids and my husband’s schedule, but I noticed that the organizers were accepting a small number of remote sessions. Even better, you had the option to change to in-person if your situation changed. Thus the “I don’t know right now if I can travel to Wales at that time” excuse was eliminated.
Then came the theme. Having organized a fair number of SQL events in Atlanta years ago, we always went big on the theme. My inner geek LOVES a good themed event! Can you guess what the theme is for SQLBits this year?
DUNGEONS AND DRAGONS.
IN WALES. (My kids actually have tiny red dragons from our friends that were living in Wales for a bit.) That settled it, I was going to submit: if only for the small chance that I would be able to do something – I have no idea what – along the lines of a D&D theme. So on the last day, probably the last hour, I went ahead and submitted a session.
AND IT WAS ACCEPTED. Hoolllly Guacamollllllle.
The official title: Migrating data solutions to the cloud – a checklist.
Session Summary: So you’re the data person in your company and you need to look to the sky. Maybe you’ve been wanting to do it for a while, maybe it’s a mandate coming from the top. What are the steps you need to think about? Where do you even start? What are the risks? How can you begin to wrap your head around all the different things you have to coordinate? Companies don’t just need the IDEA that they should move to cloud, but a guide on how-to-do it. This session will not only give you that guide to get you going, but a foundation to take to your bosses to show just how awesome you really are.
Putting aside that I was actually shocked, I’ll admit to getting a little teary-eyed. I’m incredibly honored to be speaking among many of my idols and droves of speaker friends that I admire. Like blown-away honored. TBH, I was a little disappointed that I’d have to miss other sessions that are playing at the same time. Fortunately the sessions are recorded, so I’ll still be able to catch them at other times.
If you are interested in attending my session, it’s on Saturday, March 18th (the free day!!!) at 8:40 AM CDT (that’s 1:40 PM Newport time). I’m super happy (for many reasons) to be presenting on Saturday – but the conference itself is from March 14th – March 18th: 5 whole days of data goodness with a side of D&D theme. (I went ahead and bought a basic D&D kit for Christmas to indoctrinate my littles.) The last day is free, but I will be attending virtually for multiple days because this conference really is filled with a lot of incredible training opportunities.
On that note, GO REGISTER. Early bird pricing ends on the 13th. For the cynics among you, I do not get anything for you registering except the warm and fuzzy feeling that I’m helping the community. Here – I’ll even put a convenient copy/paste link for you: https://events.sqlbits.com/2023/pricing. For those of us that can’t attend in person, there is a virtual option at a 30% discounted price.
Well, that’s all I wanted to announce. I’m officially presenting for a major international conference. Little ole me. (Special thanks to New Stars of Data and others for getting me started – more on that later.) Hope to see you there on the screen or in some of the community portals. Or maybe even in-person with a little magic.
Image credit: https://www.wargamer.com/dnd/wizard-5e-class-guide
-
PBI: When you can’t change from a live connection
First off, I want to say I never intended to start a lot of my posts about Power BI. There are plenty of experts out there and I am merely an accidental PBI Admin and advocate for our Power BI platform. So why should I be writing about a topic I’m constantly learning new things about? Well, here’s the thing: when you are in the middle of learning things and you don’t find a quick and easy answer, that may be a good time to say “hey, maybe I should blog about it”.
And I think it was Kimberly Tripp’s keynote at PASS Data Community Summit 2022 that reminded me that it’s 100% ok to write about things other people have written about. In fact, several people there mentioned this same thing. Share what you have learned – it’s quite possible you bring a different perspective that will help someone along the way. And if all else fails, you may forget the solution and years down the road google it only to find your own post. (#LifeGoals)
Now that we have that out of the way, let’s talk about WHEN YOU CAN’T CHANGE FROM A LIVE CONNECTION in Power BI.
Recently, I’ve been advocating that we consolidate our reporting systems. We have a ton, and with an extremely small team, it’s a huge headache to manage them all. Admin-ing reports is only supposed to be a small portion of my week-to-week tasks. (Hint: it’s not.) Plus, some of our reporting systems are just not that good. I won’t list all the different systems, but we are lucky enough to have Power BI as part of our stack, and as such, I’m wanting to move as much as we can to our underutilized Power BI service. This includes our Power BI Report Server reports.
Since we needed to make some changes to some reports we had on Power BI RS, and wanted to reap some of the benefits of Power BI service with them, we decided these reports would be a good test group to move over. The changes were made in the newer version of PBI Desktop (instead of the older version of desktop we have to use with PBI RS) and we were ready to load them up to our Power BI service. This is where it got a little sticky.
sticky buns… mmmmmmm
FOCUS.
When I uploaded a new report, I remembered it was going to create a “dataset” with the report. Even if the lineage showed the dataset went to a live connection to a database. (In the case of our test reports, they were connected to an SSAS database.) Note, these datasets don’t seem to actually take any space when connected to a live connection, hence my quotes.
A dataset for every report? Given the number of reports we needed to move over, all with the same live connection, this didn’t make sense to me. Even if the dataset was just a passthrough. (Did I mention how I really hate unnecessary objects in my view? It eats away at me in a way I really can’t describe.)
So I thought – “why not just create 1 live connection dataset and have all the reports in the workspace connect to that?” (We also use shared dataset workspaces, and if you are using that method, this still applies. In this case I wanted to use deployment pipelines, and as of the writing of this post, that wasn’t doable with multiple workspaces per environment.) I built my new dataset, uploaded it, and prepared to connect my report to the new one.
SCREECH. Nope. After all that, when I opened my updated RS report in PBI Desktop, I didn’t even get the option to change the connection. Darn it. I couldn’t switch it to the new dataset. My only option was the original live connection. I couldn’t even attempt to add another data source.
I grudgingly loaded the report back up to PBI Service, and now I had to look at 2 datasets while I noodled. Blerg. (I’ll mention again how much I hate a cluttered view.) Technically I could delete my test dataset, but I wasn’t ready to give up yet. An idea occurred to me: let me download the newly uploaded file from the PBI Service, because logically it had created a new dataset to use when I uploaded it, and the lineage showed it in the path.
I opened the report in the service, chose File -> Download this file, and then selected the “A copy of your report with a live connection to data online (.pbix)” option. (Actually I tried both, but the other way was a fail.)
Then I opened it in PBI Desktop… Meh. It looked the same.
Wait a minute! What is that at the bottom??? “Connected live to the Power BI dataset”!
I checked the data source settings again under Transform data – BINGO! Now I had the option to switch to the dataset from my Power BI service. Which I happily did.
After this was done, I saved and reloaded the report to PBI Service and checked the data lineage of the dataset – it was connected to the new dataset! YAY!!!!!! Since all the reports in this workspace used the same SSAS database, I could connect them all to the same singular dataset. Bonus: when it came time to set up the deployment pipeline, I only needed to change the data source rules for one dataset in each subsequent environment.
Some may say this is overly obsessive. Maybe. But when you think along the lines of maintenance or verifying data lineage, I now only needed to check 1 dataset instead of going through a whole list. That can be a timesaver when troubleshooting problems.
AND it’s prettier. AND there may be other reasons you want to change that connection and this should help along the way. AND there was another big reason I was going to list, but now I’m thinking of sticky buns again so we will just leave it right there. It’s time for lunch.