Common Power BI Admin Scripts Part II

Today’s post builds on Power BI Admin Scripts Part I by describing five additional PowerShell scripts that Power BI service administrators can use to address relatively common scenarios. As in Part I, the five new scripts have been added to my GitHub repository, and this post only provides the context for each one.

Please be sure to read through the prerequisites section of Part I and confirm you have the necessary permissions (e.g. the Power BI Admin role) and software installed, such as the latest Power BI Management PowerShell modules.
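
For reference, installing the modules and connecting as an administrator looks roughly like the following minimal sketch (module and cmdlet names are from the MicrosoftPowerBIMgmt package):

```powershell
# Install the Power BI Management modules (one-time) and sign in with a Power BI admin account
Install-Module -Name MicrosoftPowerBIMgmt -Scope CurrentUser
Connect-PowerBIServiceAccount
```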

Get Group Owners and Members

As a Power BI administrator, knowing who has access to what, and optionally adjusting this access, is a fundamental responsibility. Since groups are used throughout Power BI (and elsewhere) to provision access to various resources and features, it’s important to be able to quickly and easily find out who is in a group (members, including any service principals) and who owns the group (if anyone). Combined with knowledge of the permissions the group is mapped to, this data answers the question “Who has this access/permission?”.

This script uses the AzureAD module from the context of PowerShell 7.1.3 to retrieve the owners and members. Several of the primary properties are retrieved, including UserPrincipalName, DisplayName, and Mail, and the members and owners are exported to separate CSV files.
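
A minimal sketch of the approach is below. The group object ID and output paths are placeholders, and the -UseWindowsPowerShell switch is one way to load the AzureAD module (which targets Windows PowerShell) from a PowerShell 7 session via the compatibility layer:

```powershell
# Load AzureAD through the Windows PowerShell compatibility layer and sign in
Import-Module AzureAD -UseWindowsPowerShell
Connect-AzureAD

# Object ID of the target Azure AD group (placeholder)
$groupId = 'aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee'

# Export the group owners
Get-AzureADGroupOwner -ObjectId $groupId |
    Select-Object UserPrincipalName, DisplayName, Mail |
    Export-Csv -Path 'C:\Temp\GroupOwners.csv' -NoTypeInformation

# Export all group members
Get-AzureADGroupMember -ObjectId $groupId -All $true |
    Select-Object UserPrincipalName, DisplayName, Mail |
    Export-Csv -Path 'C:\Temp\GroupMembers.csv' -NoTypeInformation
```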

Get Data Sources from Datasets in a Workspace

The lineage view in Power BI exposes the data sources of the datasets in a workspace in a clean, visual UI. However, there’s currently no option to export this metadata, and data sources are not supported in the GetGroupsAsAdmin API. Even if these limitations didn’t exist, it might still be helpful to have a simple script that can loop over the datasets in a workspace and retrieve the data sources associated with each.

You might imagine a scenario in which a team that owns a workspace has asked for additional resources from the IT/BI team to support their solution. One of the first steps you might take is determining which data sources they’re using. Are they building on top of the company data warehouse (e.g. Azure Synapse SQL), are they using a mix of flat files, legacy sources, SharePoint folders, etc., or something in between?
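
The core loop looks roughly like the following sketch, with a placeholder workspace ID (the exact properties worth selecting under ConnectionDetails, such as the server, database, or URL, will vary by source type):

```powershell
# Sign in and identify the target workspace (placeholder ID)
Connect-PowerBIServiceAccount
$workspaceId = 'aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee'

# Loop over the datasets in the workspace and retrieve each dataset's data sources
$dataSources = foreach ($dataset in Get-PowerBIDataset -WorkspaceId $workspaceId) {
    Get-PowerBIDatasource -WorkspaceId $workspaceId -DatasetId $dataset.Id |
        Select-Object @{ n = 'Dataset'; e = { $dataset.Name } },
                      DatasourceType,
                      @{ n = 'Server';   e = { $_.ConnectionDetails.Server } },
                      @{ n = 'Database'; e = { $_.ConnectionDetails.Database } },
                      @{ n = 'Url';      e = { $_.ConnectionDetails.Url } }
}

$dataSources | Export-Csv -Path 'C:\Temp\WorkspaceDataSources.csv' -NoTypeInformation
```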

Move a Workspace to Premium or Shared Capacity

Managing the allocation of Power BI Premium capacity is another top responsibility of Power BI administrators. Common scenarios for this script include A) moving a workspace from Premium capacity to shared capacity (perhaps because it isn’t being used or it’s wasting resources), B) moving a workspace from shared capacity to a Premium capacity (perhaps because it requires Premium capacity features such as large models, AI, or dataflows), and C) moving a workspace from one Premium capacity to another, perhaps because of relative resource availability on the new capacity.

The Set-PowerBIWorkspace cmdlet provides a simple and programmable alternative to the capacity settings graphical interface in the Power BI Admin portal. Particularly in large Power BI deployments with several Premium capacities and hundreds of workspaces at various stages of their lifecycle, the ability to quickly move workspaces to the right capacity can represent quite a productivity boost.
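
In outline, the assignment is a single cmdlet call. The IDs below are placeholders, and (per the cmdlet’s documented behavior) passing the all-zeros GUID as the capacity ID moves a workspace back to shared capacity:

```powershell
Connect-PowerBIServiceAccount

$workspaceId = 'aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee'   # workspace to move (placeholder)
$capacityId  = 'ffffffff-1111-2222-3333-444444444444'   # target Premium capacity (placeholder)

# Assign the workspace to the Premium capacity
Set-PowerBIWorkspace -Scope Organization -Id $workspaceId -CapacityId $capacityId

# Move the workspace back to shared capacity
Set-PowerBIWorkspace -Scope Organization -Id $workspaceId -CapacityId '00000000-0000-0000-0000-000000000000'
```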

Get Workspace Artifacts

Clear visibility into which artifacts (reports, datasets, dashboards, etc.) roll up to which workspaces, along with user access to each workspace, is nearly as important as the events/usage information provided by the Power BI activity log. This script filters for active, modern workspaces (not v1/O365 groups), includes all supported artifacts (reports, datasets, dashboards, dataflows, workbooks, and users), and exports this data as a .json file.
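
The retrieval and export boil down to something like the following sketch (the output path is a placeholder):

```powershell
Connect-PowerBIServiceAccount

# Retrieve active, modern (v2) workspaces with all artifacts and users, then export to JSON
Get-PowerBIWorkspace -Scope Organization -Include All -All |
    Where-Object { $_.State -eq 'Active' -and $_.Type -eq 'Workspace' } |
    ConvertTo-Json -Depth 4 |
    Out-File -FilePath 'C:\Temp\WorkspaceArtifacts.json'
```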

There are multiple options for parsing the JSON-structured data to answer common questions against the inventory of artifacts. Just to get started, you might use the Json.Document() M function (the JSON file connector) in Power BI Desktop or in a dataflow and then expand the columns you need. As a longer-term solution, you might consider A) inserting the data into a SQL database via T-SQL and B) configuring an incremental retrieval process that only captures changed workspaces via the new admin APIs.
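
If you just want a quick answer without leaving PowerShell, the exported file can also be read back with ConvertFrom-Json, for example to rank workspaces by report count (the file path is a placeholder):

```powershell
# Read the exported inventory back into objects
$workspaces = Get-Content -Path 'C:\Temp\WorkspaceArtifacts.json' -Raw | ConvertFrom-Json

# Rank workspaces by the number of reports they contain
$workspaces |
    Select-Object Name, @{ n = 'ReportCount'; e = { @($_.Reports).Count } } |
    Sort-Object ReportCount -Descending
```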

Refresh a Power BI Dataset

If you need more fine-grained control over dataset refresh, such as only refreshing certain tables or partitions, or if you need to use a service principal for refresh, you might read through the recent XMLA-based refresh post. However, if you just need a simple full dataset refresh you can run on-demand (or slightly modify to run unattended on a schedule), this sample script should do the job.

This script uses the Refresh Dataset in Group API by building a URL string from the workspace ID and dataset ID and submitting a POST request via the Invoke-PowerBIRestMethod cmdlet.
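
The essence of the call is below, with placeholder IDs; the notifyOption property in the request body is optional and controls failure notification emails:

```powershell
Connect-PowerBIServiceAccount

$workspaceId = 'aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee'   # placeholder
$datasetId   = 'ffffffff-1111-2222-3333-444444444444'   # placeholder

# Refresh Dataset in Group endpoint (relative URL)
$refreshUrl = "groups/$workspaceId/datasets/$datasetId/refreshes"

# Trigger a full, on-demand refresh of the dataset
Invoke-PowerBIRestMethod -Url $refreshUrl -Method Post -Body '{"notifyOption": "MailOnFailure"}'
```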

Common Power BI Admin Scripts Part III?

Yes, there will be one more blog post and set of PowerShell examples in this series before we move on to new topics. This final post will go a bit further (and perhaps be a bit less ‘common’), covering scenarios such as cloning/copying content across workspaces and rebinding reports to datasets, updating parameter values, changing ownership of datasets, working with gateways and gateway data sources, etc. If this material and other content on this blog is interesting to you, you can subscribe to email notifications of future posts.
