Where to store your business logic in Microsoft Fabric Lakehouse

When creating any kind of data warehouse or lakehouse you need at least three steps: you get the data from the source systems (often called ingestion or extraction), you clean the data and apply business logic, and you model the data for consumption. For data warehouses this is called ETL (Extract, Transform and Load) or ELT (Extract, Load, Transform). Not much is different with lakehouses; you might do more steps, but the general concepts are the same.

One of the most important steps is transforming the data, or applying business logic. Data rarely comes from the source system ready for analysis; some logic needs to be applied to transform and shape it first. When creating traditional data warehouses in SQL Server, most developers used either stored procedures or views to store the transformation logic.

Options for storing business logic in Microsoft Fabric lakehouse

Things are slightly different with lakehouses in Fabric. You still have multiple ways to store your transformation logic. Below I will go through the most common ones and discuss the pros and cons of each.

Notebooks

You can store the logic in notebooks. This is similar to creating a stored procedure in SQL Server: you create one notebook per table and write the logic in there. One of the pros of this method is that it's very explicit. Each notebook has one purpose and it's easy to understand where the logic is kept. Another pro is that you can write the logic in a variety of languages. The main con is that you might end up with a lot of notebooks if you have a lot of tables.
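To make this concrete, here is a minimal sketch of what a per-table transformation notebook could look like in PySpark. The table and column names (sales_raw, sales, order_status and so on) are made-up assumptions for the example, and spark is the session object Fabric notebooks provide.

```python
# Minimal per-table transformation notebook (one notebook per target table).
# Table and column names are illustrative assumptions, not from a real model.
from pyspark.sql import functions as F

# Read the raw data landed by the ingestion step
df = spark.read.table("sales_raw")

# Apply the business logic for this one table
df_clean = (
    df.filter(F.col("order_status") == "Completed")       # keep finished orders only
      .withColumn("order_date", F.to_date("order_date"))  # normalize the date type
      .withColumn("net_amount", F.col("amount") - F.col("discount"))  # derived column
)

# Write the result to the curated table in the next stage
df_clean.write.mode("overwrite").format("delta").saveAsTable("sales")
```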

Views in the SQL endpoint

You can store your logic in views in the SQL endpoint. This allows you to write your views in familiar SQL, exactly like using views in SQL Server. The pros of this method are that it's very explicit: you have one view per table and it's easy to see where the logic is. Another pro is that a lot of developers know SQL and find it comfortable to work with. One of the main pros is that you can have a generic notebook which loops through the views and applies the logic to the tables. Another pro is that you can use either notebooks or Data Factory when moving the data to the next stage. One of the cons is that you will potentially have a lot of views if you have a lot of tables. Another major con at this point in time (September 2024) is that the only way you can query SQL views from a notebook is with Microsoft's connector for the SQL endpoint and warehouse (see more information here: https://learn.microsoft.com/en-us/fabric/data-engineering/spark-data-warehouse-connector). There are several limitations to the connector at present. One major limitation is that you can only use Scala to connect to and read from the SQL endpoints. Another is that you can only query the whole table or view; no custom query is allowed. This last one shouldn't be much of a problem for storing business logic, as the view should contain only what is necessary. At the time of this writing (September 2024) the connector is quite unstable, but that will hopefully be fixed very soon.

Spark views

Similar to SQL views, Spark has a concept of permanent views as well. You can write them in Spark SQL and they work the same as SQL views. If you prefer other languages, you can build a dataframe in another language and then create the view using Spark SQL, giving you the flexibility of Spark and the convenience of SQL. One of the pros of this method is that it's very explicit: you have one view per table, making it easy to see where the logic is. Another pro is that you get the full flexibility of Spark. The cons are that you might end up with a lot of views if you have a lot of tables. Another con is that these Spark views are not visible anywhere in the UI; the only way you can discover them is via code.
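As a sketch, creating and discovering a permanent Spark view could look like this in a Fabric notebook (the object names are assumptions for the example):

```python
# Store the business logic for one target table as a permanent Spark view.
# The view is persisted in the metastore; names here are illustrative.
spark.sql("""
    CREATE OR REPLACE VIEW sales_transformed AS
    SELECT
        order_id,
        to_date(order_date)  AS order_date,
        amount - discount    AS net_amount
    FROM sales_raw
    WHERE order_status = 'Completed'
""")

# The views don't show up in the Fabric UI, so discover them via code
spark.sql("SHOW VIEWS").show(truncate=False)

# A generic loader notebook can then materialize each view into its table
spark.read.table("sales_transformed") \
    .write.mode("overwrite").format("delta").saveAsTable("sales")
```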

Files or database

You can store your business logic as code in files or databases. You could, for example, store the logic in JSON files and query it from there at runtime. Similarly, you could store the logic in a column in a database and query that at runtime. One of the pros of this method is that you separate your workload from your logic: the logic can be edited outside of your Fabric environment. Done well, a subject matter expert without any Fabric knowledge can maintain the logic. The cons are that it can be hard to maintain complex logic in files or database columns; it needs to be planned and executed well to work properly.
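As an illustration only, the JSON variant could be as simple as a file with one SQL expression per target table, applied by a generic notebook at runtime. The file path (which assumes a default lakehouse is attached), layout and names below are assumptions, not a prescribed format:

```python
# Hedged sketch: business logic kept in a JSON file in the lakehouse Files
# area and applied at runtime. File layout and names are assumptions.
import json

# Example content of the file (one rule per target table):
# [ {"target_table": "sales",
#    "query": "SELECT order_id, amount - discount AS net_amount FROM sales_raw"} ]
with open("/lakehouse/default/Files/logic/transformations.json") as f:
    rules = json.load(f)

# One generic loop replaces a notebook or view per table
for rule in rules:
    spark.sql(rule["query"]) \
         .write.mode("overwrite").format("delta").saveAsTable(rule["target_table"])
```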

What to choose

So, what to choose? There is no one correct answer; it depends a lot on your requirements. At present (September 2024) I would recommend using either notebooks or Spark views to store the logic. I would use either option if I want Data Factory to orchestrate the data load, and notebooks if I want to use MSSparkUtils to orchestrate it. When it becomes easier to work with SQL views in notebooks I might consider using them, but the only reason I would use them at present is if I use pipelines to read and write the data.
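For the notebook-orchestration route, a driver notebook could call the per-table notebooks with MSSparkUtils along these lines. This is a hedged sketch: the notebook names and the parameter passed to the children are made up, and it assumes the notebookutils module Fabric notebooks expose.

```python
# Hedged sketch of orchestrating per-table notebooks from one driver
# notebook with MSSparkUtils. Notebook names and parameters are made up.
from notebookutils import mssparkutils

table_notebooks = ["Load Sales", "Load Customers", "Load Products"]

for nb in table_notebooks:
    # Run each transformation notebook with a 300-second timeout,
    # passing an illustrative parameter to the child notebook
    exit_value = mssparkutils.notebook.run(nb, 300, {"stage": "dw"})
    print(f"{nb} finished, exit value: {exit_value}")
```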

I hope this article helps you understand the possibilities of where to store your business logic in Microsoft Fabric. Please leave comments if you have anything to add or get in touch if you want to hear how we can help you work with Microsoft Fabric.

Fabric workspace strategy

I'm delivering a Microsoft Fabric project for a customer and we had some great discussions about what strategy we should follow for workspaces. This blog post is those discussions boiled down to a couple of pages 😊

When deciding on a workspace strategy for Microsoft Fabric there are several, often conflicting, things to consider.

One is ease of maintenance. The fewer workspaces, the easier it is to maintain and manage.

Other considerations are segregation of duties, isolation of workloads and security isolation. These call for many workspaces.

Where on the scale an organization ends up depends on what is important to the organization and what kind of balance they want to reach.

Things that impact the number of workspaces:

  • Number of environments (dev, test, pre-prod, prod etc.)
  • Number of stages (extract, staging, dw, mart etc.)
  • Isolation of workload resources from each other (Data Factory doesn't use resources for Spark and vice versa)
  • Security considerations (developers are not allowed to see production, those doing ingestion should not be able to modify Spark code etc.)
  • Ways of working with DevOps (one branch per workspace limitation in Fabric)

What are the outer bounds on the number of workspaces?

It's possible to have everything in one workspace, but realistically you will always end up with at least one workspace per environment (dev, test, prod) or at a minimum one for prod and one for the rest. So, the minimum number of workspaces is 2-3.

At the other end of the scale, each workload will have one workspace per environment and each feature will have one workspace. Imagine that we use Data Factory to ingest data into an extract lakehouse, then use Spark to clean and transform the data into a staging lakehouse, then use Spark to load the data into a DW lakehouse, then build a Power BI semantic model and finally reports on top of that. We then have 3 workspaces for each workload (assuming dev, test and prod): 3 for Data Factory, 3 for the extract lakehouse, 3 for the staging lakehouse, 3 for the DW lakehouse, 3 for the semantic models and 3 for the reports. This gives us 18 workspaces. Add to that one workspace per developer for each of the workspaces if you decide to branch the workspace during development. These branch workspaces are temporary while development happens and are most likely only accessed by the individual developer.

Other things to consider

How your team is composed will be a deciding factor in what strategy you decide on. If you have a small team of developers who develop the whole pipeline from ingestion to transformation to semantic models, you can use fewer workspaces than if you have dedicated developers for each workload. It's also about trust. If you trust your developers not to mess with each other's code, you can have fewer workspaces. If you feel you need to isolate workloads from each other, you will need more workspaces.

Therefore, there is no one rule for how many workspaces you should have. In my opinion you should be pragmatic about it and try to weigh the need for workload isolation and strict CI/CD protocols against ease of maintenance and development.

Impact of CI/CD on workspace strategy

If you decide to use CI/CD for your Fabric development, you need to decide how your developers are going to work. You can only have one branch per workspace. This means that if you have more than one feature to work on, you need to decide whether to work on all of them in one branch (workspace) or have one branch (workspace) per feature.

My recommendation is to have one branch (workspace) per feature and, when the feature is complete, merge it into the main branch, which is connected to your development workspace. It's important that you clean up the feature workspaces so you don't end up with hundreds of dormant workspaces with code in different stages.

At the moment the only way to deploy is via Fabric deployment pipelines, so that is the recommended approach. If and when there are APIs for deploying Fabric items, you can consider building your own deployment pipeline.

Where did we end up with this particular customer?

We decided to go with one workspace per stage per environment: 3 environments (dev, test and prod) and 3 stages (extract, staging, dw) in the lakehouse. Semantic models will have one workspace per environment. Reports were kept out of scope as they are not developed frequently by the central team. We will therefore end up with 12 workspaces, plus a workspace per feature while it's being developed.

Microsoft Fabric!

The news is out, the cat is out of the bag, the secret has been revealed!

If you haven't noticed, Microsoft has revealed their new lake-centric software-as-a-service solution for data analytics. It's a one-stop shop for data integration, data engineering, data warehousing, data science, real-time analytics, applied observability, and business intelligence. It's built on a data lake using the open Delta file format.

You might be thinking: why reinvent the wheel when they have a bunch of good solutions already? The answer is that they are not. They are taking the good things from Synapse Analytics, Azure Data Factory and Power BI and adding a new monitoring solution. But it's not just taking those solutions as-is. The foundation of the platform is OneLake, a data lake based on the Delta file format. It's called OneLake because you should only have one copy of your data, and all workloads (Spark, SQL, real-time analytics or data science) as well as Power BI will work on top of the same copy of the data. Even for Power BI you don't have to import the data anymore (if your data is in OneLake).

The Synapse tools will now all work directly on the lake, even the Data Warehouse part.

So, what does it mean for you and how do you get started?

I have been fortunate enough to test out these solutions for the past few months, and it's very promising. I've not tested production workloads, but what I have seen I like. It's only in preview, and some of the stuff is pretty revolutionary, so there have been some issues, but overall I've been impressed. I'm not saying you should drop your current solution and move everything to Microsoft Fabric, but you should start looking into what it is and where it might benefit you. I see huge potential in the platform for organizations of all sizes. I think companies of any size, even SMEs, are going to see some cost savings, and I think it's going to save a lot of time because you don't need to move as much data around. You are still going to be doing the same architecture (that never really changes), which is great.

What I really like about it is that you can have mixed workspaces with e.g. Spark notebooks, Data Factory pipelines and Power BI reports. So if you work across workloads, you can stay in one workspace. It might also make it easier to work as a team in one workspace. I still think that many IT teams will have different workspaces for different workloads/personas/tools, but there will be times when it makes sense to have some or all of it in one workspace.

My advice to you is to take it easy. Find a good scenario where you can test out the capabilities over the next few months and be ready to take advantage of the platform when it becomes generally available. I'm hoping for an easy "migration" path from Synapse Analytics and Azure Data Factory. I'm assuming that if you already have a data lake it should be fairly straightforward to move it into OneLake, but I'm not sure at this point in time as I have only tested copy/pasting.

How do you get started?

Before you can get started you need to turn Microsoft Fabric on in the Power BI tenant settings. You can turn it on for the whole organization (I don't recommend that at this point in time) or for a subset of the organization through specific security group(s). Note that it's off by default, but if you don't touch it, Microsoft is going to turn it on for everyone sometime in June.

When the setting is turned on, there are two ways to get started with Microsoft Fabric.

If you have existing Power BI capacity it will be converted to Microsoft Fabric capacity, and you can use it for all the other workloads (during the free preview you won't actually use your capacity, but you can monitor how the workloads would affect it). If you don't have Power BI capacity you can start a free 60-day Microsoft Fabric trial and get free capacity during that period. Go to https://app.fabric.microsoft.com/ and click on your user image in the top right corner. There will be a start trial button you can press to start your trial.

After you "turn Fabric on" you can work with the different components at https://app.fabric.microsoft.com/ or at https://app.powerbi.com/ via the icon at the very bottom on the left. Clicking that will allow you to choose a tool to work with.

I will be posting more as I test out more production like scenarios in the coming months, so stay tuned.

Power BI Governance – Training

Part 5. The Training and support pillar

This is part 5 of my Power BI Governance series. You can read previous parts here:

Part 1, Introduction to Power BI governance

Part 2, Power BI Governance Strategy

Part 3, The people pillar

Part 4, The Process and framework pillar

Training and support pillar

The third of the five pillars of Power BI Governance is the Training and support pillar. The order of the pillars is not important, as all the pillars are equally important.

The best way to ensure compliance is to make sure your users know what they are doing. If they do, the likelihood of an accidental breach is far lower. This is what the Training and support pillar revolves around.

As said above, a well-trained user base (IT and business) is the best way to maximize compliance. In the ideal situation you would train every user with the best possible classroom training and test them afterwards to gauge their knowledge. But in reality there is not the budget or time to do that. So how do you plan for more pragmatic training, and how do you support your users?

I suggest you start by grouping people by how they use Power BI. This could be something like (not an exhaustive list):

  • Report consumer
  • Report developer
  • Dataset developer
  • Administrator
  • Supporter

You then decide what value each type of training would give, what the impact of the training is and what cost is associated with each type. For example, you might argue that report consumers would get value out of short videos, online self-paced training, and instructor-led training. Maybe the cost is lowest for videos, then online self-paced training, with instructor-led training being the most expensive.

The next phase is to rank the groups by how much impact a breach would have. The bigger the impact, the more you should be willing to spend on training to minimize the chance of it happening.

Of course there are constraints, such as people's locations, a finite training budget and other things that will impact the decision. In the end, I hope, you will find a pragmatic solution.

When I help customers design training plans, I normally start with a rule of thumb like this:

  • Report consumer – short videos
  • Report developer – online self-paced training
  • Dataset developer – instructor-led training
  • Administrator – mentoring
  • Supporter – internal training

Then we start with the exercise above to figure out what is best for each group.

The bottom line is that if you want a successful Power BI implementation, training is very important. You want to train everyone who touches Power BI, but in a different way depending on their role. You want to make sure you reach everyone and deliver the right training based on their needs. It's not only governance training that is important: training users in using Power BI properly and following best practices will deliver value faster and will make report and dataset developers more compliant.

One of the things we have been doing is to automate the training offer to users by using Microsoft Power Automate in combination with Office 365 (who has a license) and the Power BI activity log (what are they doing). When a user gets a license, or when they publish their first report or dataset, they receive an email with the training offered in the organization as well as the relevant documents and processes needed for their role. There are many variations on how you can go about this, but the goal is to minimize the effort needed from the governance responsible to figure out who needs to be trained.
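If you prefer code over a low-code flow, the "what are they doing" half of that signal can also be pulled from the activity log through the admin REST API. Below is a minimal Python sketch, assuming you already have an Azure AD access token with permission to call the admin APIs; token acquisition and the email step are left out, and the date is just an example.

```python
# Hedged sketch: find users who published a report on a given day via the
# Power BI activity log (Admin - Get Activity Events). Token not shown.
import requests

ACCESS_TOKEN = "<azure-ad-token-with-admin-api-permission>"  # assumption
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# The API takes one UTC day at a time and pages via a continuation URI
url = (
    "https://api.powerbi.com/v1.0/myorg/admin/activityevents"
    "?startDateTime='2020-09-01T00:00:00Z'"
    "&endDateTime='2020-09-01T23:59:59Z'"
)

publishers = set()
while url:
    page = requests.get(url, headers=headers).json()
    for event in page.get("activityEventEntities", []):
        if event.get("Activity") == "CreateReport":  # publish-like activity
            publishers.add(event.get("UserId"))
    url = page.get("continuationUri")

# These users are candidates for the "first report" training email
print(publishers)
```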

When a user is trained, it's a good idea to continue to give support. Power BI changes every month, new features get added frequently and governance requirements might change over time. Good training will have taught the user how to be compliant as Power BI was at the time the training was given, and while it's rare that new features introduce compliance issues, it does happen. When it does, you should have a support plan in place to either update the user's training or at least make them aware of the risk. Do sessions on "what's new in Power BI" or send out a video explaining a new feature if you decide it introduces potential compliance risk. However you do it, make sure you reach everyone who needs the information. The Power BI audit log can help you identify active users and whether they are just consuming or also developing. The main thing is to understand that you are not done when the training is done.

This concludes part 5, the Training and support pillar. Part 6 will cover pillar 4, Monitoring.

Power BI Governance – Processes

Part 4. The Process and framework pillar

This is part 4 of my Power BI Governance series. You can read previous parts here:

Part 1, Introduction to Power BI governance

Part 2, Power BI Governance Strategy

Part 3, The people pillar

Process and framework pillar

The second of the five pillars of Power BI Governance is the Process and framework pillar. The order of the pillars is not important, as all the pillars are equally important.

Processes and frameworks are the backbone of a good governance strategy. They are used to guide administrators and users in using Power BI in a compliant manner and following best practices.

In my opinion, when it comes to governance, a framework is just a collection of processes. Therefore, I will only talk about processes in this blog.

It's important to say at this point that we are just talking about documents that help people use Power BI correctly. If you prefer to call them something other than processes, there is no reason not to. Many of my customers prefer to call these documents best practice documents, while others don't mind the name process. The important thing is to make clear to the users which parts of the documents are required and which are optional best practices. In my experience people tend to take processes more seriously than best practices, but also dread them more. You need to find the best way in your organization to keep people interested and make sure they take the documents seriously.

There are many different processes you could create in an organization. It all depends on your governance strategy and how much you want to split up topics. For instance, you might create a development process that covers everything from Power BI Desktop development to publishing and sharing, or you might have one development process and another process for publishing etc.

When I work with customers on Power BI governance, I will suggest that we cover at a minimum:

  • Power BI Development guide
  • Power BI publish / deployment guide
  • Power BI sharing guide
  • Power BI Administration guide
  • Power BI Tenant settings documentation

There are others that could be done separately, such as a security process or a naming standard, but that depends on the customer and the users (their skill, tolerance for documentation and usage scenarios).

Normally I will create the documents with my customer by first running workshops to better understand what they want and need. I will usually do one workshop for each document or audience, depending on how the customer wants to proceed. The important thing here is to listen to the stakeholders.

For the admin docs, you need to talk to the current admins about how they are administering the system, and balance that against best practices and the governance strategy (what settings do we allow, separation of duties etc.). I will often use the tenant settings documentation to make sure everyone understands each setting and sets it appropriately. In my opinion it's very important to understand the settings well and challenge each decision to turn off settings that might make life easier for the user. Preferably you will write in the tenant settings documentation why each setting is set the way it is.

For the development docs you will need to talk to a representative group of developers, both IT and business, to understand how they are developing and want to develop, and balance that against the governance strategy. You might end up with one process for IT and another for business users. For example, you might decide that requiring IT to use source control shouldn't be a problem, but it might be too much for business users; they might be comfortable with OneDrive version control instead.

For the publish and sharing docs, the audience for the workshop depends on how it works in your organization. Do you have a deployment process in place, or does everyone publish from Power BI Desktop? Do you allow users access to all workspaces, or do you have separation of duties on some (production) workspaces? Does the same person do both publishing and sharing? How you proceed with the publish and sharing docs will always depend on your environment.

If you figure out that the governance requirements would fundamentally change how the users work with Power BI, or turn off settings that are crucial to the way they work, you have a huge communication task ahead of you and need strong management backing.

This concludes part 4, the Process and framework pillar. Part 5 will cover pillar 3, Training and support.

Power BI Governance – People

Part 3. The People pillar

This is part 3 of my Power BI Governance series. You can read previous parts here:

Part 1, Introduction to Power BI governance

Part 2, Power BI Governance Strategy

People pillar

The first of the five pillars of Power BI Governance is the People pillar. The order of the pillars is not important, as all the pillars are equally important.

The people pillar is about having the right roles in place and actually recognizing that the people who have those roles need time to perform them. There are no fixed roles in Power BI, but there are some roles that I see many customers have in common. The roles are not always organizational/technical roles but sometimes a set of tasks, performed by the same person, that are important to the governance of the platform. The Power BI related roles I typically see in organizations are:

  • Power BI Administrator
  • Power BI Gateway Administrator
  • Power BI Auditor
  • Power BI Supporter(s)

In many cases multiple roles will be covered by one person, but there might also be multiple people for a single role. As there is no single way this is done from organization to organization, I'm not going to dig deep into the roles.

The main point I always try to make when it comes to roles in the Power BI governance effort is that organizations acknowledge that these roles exist, even though they are not described in the person's job description. I want them to also acknowledge that the roles require time from the person performing them. So it's about placing the hat (role) and either allocating time or understanding that the role is being performed at the cost of other roles the person has. All too often I see that these roles are unofficial/invisible, and people are expected to perform them besides their "normal" day job, which is most often developing in Power BI. If you work in Power BI or are responsible for Power BI in your organization, I encourage you to figure out which tasks your Power BI people are doing besides developing, and whether there is a need for a role description and time allocation for the role to be performed to the standard your organization wants. You should also consider whether they need training to perform these tasks effectively and to a high standard.

This concludes part 3, the People pillar. Part 4 will cover pillar 2, Processes and framework.

Power BI Governance series – strategy

Part 2. Governance strategy

This is part 2 of my Power BI Governance series. You can read part 1, Introduction to Power BI Governance here

Governance strategy

In Power BI, as with so many other things, the main governance issue is people. You are trying to influence people's behavior with either guidance or technical restrictions. Although technology does help in many cases, more often than not governance is about influencing people's behavior with training and best practice documents. Keep that in mind when you design your governance strategy, and don't focus too heavily on technology. Having well-trained users who know how to use Power BI in the right way is the best way to stay compliant.

Having a good governance strategy and implementing it properly is a huge step in securing compliance. In this part 2 of my Power BI governance series, we will explore what you should keep in mind when creating your governance strategy and what the key things to implementing it successfully are. A governance strategy is most often a separate document describing the purpose and goals of your governance effort. It most likely won't go into the details of the controls themselves.

When you start creating your governance strategy there are a few things I think you should keep in mind:

  • Consider current IT Governance strategy
  • In your organization is Power BI:
    • Enterprise BI tool
    • Self-service BI tool
    • Managed self-service
    • All the above
  • Other considerations
    • How sensitive is your data?
    • How do developers and users work with Power BI?
    • How experienced are your developers?
    • What kind of security requirements and/or industry standards do you have to adhere to?
    • How much audit trail do you need?

The answers to the questions above should get you one step closer to figuring out what your governance strategy should contain. You can then use the five pillars of Power BI Governance described in part 1 of this blog series to help you understand what topics to cover in your strategy.

When it comes to implementing a Power BI governance strategy, there are a few things that can help you succeed.

The keys to success are, in my mind:

That you secure management buy-in. Without management buy-in you will have a hard time implementing your strategy. Governance is often about restricting people and making them use tools in a certain way, which might be different from what people want to do. Convincing people to follow your strategy without management backing will be an uphill battle in most cases.

Find a way to document your control measures. It might sound very simple, but deciding before you start how you are going to document the strategy and the controls you will implement can be extremely beneficial. You need to make sure that the documents are easy for users to find, read and understand. The right level of documentation, language and storage for your organization will depend a lot on what your users are used to. If you are in a highly regulated business, your users will be used to reading and understanding heavy texts and will know where governance documents are stored. If, on the other hand, you are operating in a business where users are not used to that, you might need to keep the documents lighter so that you don't risk users dismissing them or not reading them properly. There are several techniques you can use to help your less experienced users understand governance documents, such as short summaries at the top with key takeaways, or breaking them into smaller documents that don't require as much reading.

Figure out how you want to enforce the controls you put in place. If you put in place controls that you expect people to follow, you need to be able to enforce them. When you set up your controls you need to ask yourself two questions: how do I know whether the control is being followed, and how do I react if it is not? If you don't know whether your controls are being followed, they are not very useful. Yes, they might help people use Power BI correctly, but it's very important to know whether they are working. Likewise, you need to know how you will enforce a control when people are not following it; if you don't do anything, or if you react in an unpredictable way, it's hard for people to take the control seriously. This is where management backing is very important, as they usually have a bigger say in how people behave.

This concludes part 2, Power BI Governance Strategy. Part 3 will cover the first pillar, People.

Power BI Governance series – introduction

This is the first part in a 7-part series on Power BI governance. I will add links to the next parts as I publish them.

Part 1. Introduction to Power BI governance

Governance can mean many things, and often different things to different people. In this article series I want to talk about my view on Power BI governance and what I think you should be doing when it comes to governing your Power BI environment.

Before I go any further, I just want to mention that Microsoft has some material on Power BI governance that you might be interested in. You can find it here: https://docs.microsoft.com/en-us/power-bi/service-admin-governance

Why governance

Governance is about making sure the right people do the right thing within the defined boundaries of the organization. We need to make sure the BI system (Power BI) does not expose data to the wrong people and that artifacts are stored, shared, and maintained in the right way. Furthermore, we need to make sure that users, creators, and administrators know how to use, manage, and secure the artifacts.

As Power BI is partly self-service, it is vital that governance is implemented early and in such a way that it does not impede creators and users unless necessary. Being restrictive in the wrong place can lead to implementation failure and ungoverned solutions, frequently known as Shadow IT. It's important to tread carefully to avoid that situation, but at the same time make sure your organization is compliant and secure.

Governance strategy

In my opinion a Power BI governance strategy has five pillars: People, Processes and framework, Training and support, Monitoring, and Settings and external tools.

Most of these pillars are non-technical; only Monitoring and Settings and external tools are technical. This often distracts organizations, as many like to think that problems should be solvable with technology. The reality is that technology can only partly help. As with so many other things, the main governance issue is people. Having well-trained users who know how to use Power BI in the right way is the best way to stay compliant.

Having a good governance strategy and implementing it properly is therefore a huge step in securing compliance.

The five pillars cover everything your governance strategy implementation should cover (in my mind).

The people pillar is about having the right roles in place and actually recognizing that the people who have those roles need time to perform them. All too often I see that people have unofficial Power BI roles with no time allocation. For example, I see with a few of my clients that the Power BI administrator is the best Power BI person in the company, who is expected to do the administration besides their Power BI development. It might work, and often does, but it should still be recognized that it takes time and comes with responsibility, which requires it to be done properly.

The processes and framework pillar is about having the proper documents in place so users can use Power BI correctly and be compliant. Processes or best practices are documents that describe how to use or administer Power BI. Frameworks often describe the method on which you base the process/best practice documents.

The training and support pillar is about making sure everyone who uses Power BI has gotten the required training. Here you will describe your training plan, decide what type of training each user type should get and how to make sure you reach everyone with your training. It's also here you might describe how you support your users going forward, with things such as internal user groups or a subscription to an external training library.

The monitoring pillar is about setting up monitoring of Power BI. Usually it involves extracting data from the Power BI activity log as well as the Power BI REST APIs for information about existing artifacts in your Power BI tenant. Sometimes you might extract data from other parts of Microsoft 365, such as employee data, to supplement the activity and inventory data. This part of the governance effort is both about describing your monitoring (documentation) and implementing it.
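To make the inventory side of this a bit more tangible, here is a hedged Python sketch that lists workspaces and their reports with the admin REST API. It assumes an Azure AD token with admin API permission (acquisition not shown) and fewer than 100 workspaces, to keep the paging simple.

```python
# Hedged sketch: build a simple artifact inventory with the Power BI admin
# REST API (get groups / get reports as admin). Token acquisition not shown.
import requests

ACCESS_TOKEN = "<azure-ad-token-with-admin-api-permission>"  # assumption
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# List workspaces in the tenant ($top is required by this endpoint)
workspaces = requests.get(
    "https://api.powerbi.com/v1.0/myorg/admin/groups?$top=100",
    headers=headers,
).json()["value"]

# For each workspace, list its reports to build the inventory
for ws in workspaces:
    reports = requests.get(
        f"https://api.powerbi.com/v1.0/myorg/admin/groups/{ws['id']}/reports",
        headers=headers,
    ).json()["value"]
    print(ws["name"], "-", len(reports), "reports")
```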

The settings and external tools pillar is about making sure Power BI settings are set correctly, as well as how to use other approved tools to support Power BI. Here you will describe all the settings and their correct values in a document. You will also describe how other tools, such as Microsoft 365 sensitivity labels or Tabular Editor, should be used with Power BI.

This concludes part 1, Introduction to Power BI Governance. Part 2 will cover the Power BI governance strategy.

Busy summer

This summer has been very eventful for me and my family, and not just because of the world pandemic. Some of you might have noticed that I haven't been very active in blogging or on social media for a couple of months. This is due to some happy personal developments.

To recount what has happened we need to go back to June. On June 24th we had our third child. She's fantastic and perfect like her older sisters. It's been 11 years since our middle one was born, so it took some time getting back into the baby-parenting role again, but I'm thoroughly enjoying it.

On August 1st we moved from Iceland to Denmark. We had been planning the move for a while, as we wanted to be closer to my wife's family since I travel quite a bit (or I used to at least). The plan was to move at the end of May, when our older daughters were done with their school exams but before our youngest was born. Unfortunately, the COVID-19 pandemic prevented us from traveling to Denmark to look at houses. I managed to do that on June 15th, less than a week before the expected birth date. My wife and I made a pact: I would find us the perfect house and she would keep the baby in until I came back. Both things worked out as planned. It's a big thing moving between countries separated by an ocean. You need to pack your stuff into a shipping container two weeks before you want to receive it in the new house. This meant we risked being homeless for 14 days with a newborn. Luckily my sister could lend us an apartment, so it worked out well.

These two big events took all my energy over the last 2-3 months, as you might understand.

Now I'm back to full work and ready to start contributing more to the community and participating more.

While I was on my online hiatus some great professional things happened as well.

  • My MVP award got renewed for the third time
  • I got selected to speak at PASS Summit
  • I got selected to speak at SQL Saturday Gothenburg on September 5th (really looking forward to that one)
  • I'm still speaking at SQL Bits, although now virtually. I have a training day that I'm adapting to online delivery. That is going to be awesome.
  • #DataWeekender #TheSQL has opened its call for speakers.
  • Michael Johnson and I finished the first draft of the book we are writing

Expect to see some posts about the coming events from me and also some posts where I continue my series on Power BI monitoring and governance.

If you'd like to connect or get in touch, you can subscribe to my newsletter in the box to the right or find me on social media. If you'd like to see what we have to offer, you can navigate to https://northinsights.com

What's on in your Power BI environment? – Tenant Settings

Power BI is essentially a self-service BI tool where users traditionally have a lot of freedom to create the reports and dashboards that they need and organize them in a way that suits them.

If you are a Power BI admin or if you're concerned with governance or security, you often want to know what's going on in your Power BI environment. Since Power BI is first and foremost a self-service BI tool, Microsoft has not (yet) developed good, out-of-the-box monitoring tools. This means that you need to develop your own way of monitoring Power BI.

This series of blogs describes what you should be monitoring in Power BI and which method works best for each area.

The blogs are:

  1. Power BI Admin Portal Settings
  2. Power BI Artifact Inventory
  3. Power BI Activities
  4. Power BI Capacities

We will start this blog series by looking at how you should monitor your Power BI Admin Portal settings.

Part 1. Power BI Admin Portal

The Power BI Admin Portal is the place where the Power BI Admin can change settings and monitor certain things.

One of the main points of interest is the Tenant settings. Some of the settings you can change in the Tenant settings part of the portal are who can publish to web, who can share externally, who can create workspaces and where the internal help portal is. There are, at the time of this writing, 31 settings in all. Some of them are fine at their default values, while others, like Publish to web, should be changed as soon as possible.

Besides the Tenant settings, some of the other things you can do are change the Capacity settings and Dataflow settings, look at all workspaces in the tenant, turn on audit logs, brand the Power BI portal, manage protection metrics and add featured content. Whatever you decide to change, the purpose of this blog is to encourage you to document and monitor the settings.

Figure 2: Power BI Admin Portal

Record and monitor Tenant settings

It's very important that the Tenant settings are documented and monitored regularly. Unfortunately, you cannot monitor these settings automatically, so someone needs to log in to the portal and manually check the settings. We recommend that you write down all the settings and have the admin check them once a month. This is especially important if you have more than one administrator. The main reason is that changes made in the portal are not logged anywhere you can access. If you have not written down how you want the settings to be, it's very difficult for an admin to know if the settings are correct, as they cannot see whether anything has been changed unless they remember the previous values.

Figure 3: Example of Power BI Admin Portal Settings documentation

Besides the Tenant settings, we recommend that you turn on audit logs, which are needed for activity monitoring, and review embed codes to make sure no sensitive data is being embedded outside of an approved system. If you have Power BI Premium you can also use the Capacity settings to manage your capacities.

Conclusion

Go through all the settings in the Power BI Admin Portal. Change the settings as needed and then document every setting. Manually verify that the settings have not been changed at least once a month as part of your governance process. Turn on audit logs and make sure there are no reports being embedded outside of approved systems.

Come back for the next blog, on Power BI artifact collection and monitoring.

If you want to discuss Power BI monitoring or governance or get help with implementing it in your organization please contact Ásgeir Gunnarsson on asgeir@northinsights.com or go to https://northinsights.com and find out what we offer and how to get in touch. We offer consulting and advisory as well as training on the whole Business Intelligence lifecycle including Power BI.

Written by:

Ásgeir Gunnarsson

Microsoft Data Platform MVP