Given the nearly universal application of date and time dimension tables in business intelligence projects, it’s generally understood that these tables should be maintained as part of a data warehouse. However, in many cases one or both tables are either not yet fully built or not accessible. In these scenarios, custom Power Query (M) expressions that define these tables ‘on the fly’ within Power BI datasets or Analysis Services (Tabular) models can serve as an effective, if temporary, alternative.
A Power Query file (.pq) containing an expression which generates a Time dimension table (one row per second) is available for download from my GitHub repository.
As with most dimensions in a data model, users generally begin their analysis of time at a high level (e.g. hour) and then filter and drill their way into greater levels of detail, often leveraging hierarchies built into the model and/or drilling functionality in tools like Power BI. To support analysis across multiple grains and hierarchies, the query produces columns that group the seconds into hourly quartiles, hours, and minutes.
In the following example, a hierarchy composed of the Hourly Quartile, Hour, Hour Minute, and Time columns enables the user to drill all the way down to the individual seconds of the 3:11 PM minute to find that 3:11:24 had the highest sales (for this minute):
How This Works
The Power Query (M) expression starts with a list value containing the numbers 0 to 86,399. This list is then converted to a table value with a single column (Second), and a new column (“TimeDuration”) of the duration type is added based on the Second column.
The TimeDuration column is then used as the source for a new “Time” column of the time type via the Duration.Hours(), Duration.Minutes(), and Duration.Seconds() M functions. At this point, Power Query (M) time functions are applied against the “Time” column to produce additional columns such as hour number, minute number, and second number.
Finally, grouping/hierarchy columns are added to support analysis at multiple grains, to provide more user-friendly names (e.g. 1 PM instead of Hour 13), and to support chronological sorting of the text columns. If you’re not familiar with list values or Power Query (M) generally, you might read through the first few sections of the language specification document.
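The steps described above can be sketched in M roughly as follows. This is a simplified, hypothetical version rather than the downloadable query itself, and column names such as Hourly Quartile are illustrative and may differ from the actual .pq file:

```m
let
    // 86,400 seconds in a day, numbered 0 to 86,399
    SecondsList = {0..86399},
    SecondsTable = Table.FromList(SecondsList, Splitter.SplitByNothing(), {"Second"}),
    // Convert each second number to a duration, then to a time value
    AddDuration = Table.AddColumn(SecondsTable, "TimeDuration",
        each #duration(0, 0, 0, [Second]), type duration),
    AddTime = Table.AddColumn(AddDuration, "Time",
        each #time(Duration.Hours([TimeDuration]), Duration.Minutes([TimeDuration]),
            Duration.Seconds([TimeDuration])), type time),
    // Numeric grain columns via the M time functions
    AddHour = Table.AddColumn(AddTime, "Hour Number", each Time.Hour([Time]), Int64.Type),
    AddMinute = Table.AddColumn(AddHour, "Minute Number", each Time.Minute([Time]), Int64.Type),
    AddSecond = Table.AddColumn(AddMinute, "Second Number", each Time.Second([Time]), Int64.Type),
    // Example grouping column: the day split into four six-hour "quartiles"
    AddQuartile = Table.AddColumn(AddSecond, "Hourly Quartile", each
        if [Hour Number] < 6 then "12 AM to 6 AM"
        else if [Hour Number] < 12 then "6 AM to 12 PM"
        else if [Hour Number] < 18 then "12 PM to 6 PM"
        else "6 PM to 12 AM", type text)
in
    AddQuartile
```

The actual query in the repository adds further columns (friendly hour names, sort-by columns, Hour Minute, etc.), but the pattern of list → table → duration → time → derived columns is the same.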
- You could define your own Time dimension table in a file such as an Excel workbook, but then you’re creating a dependency on this data source and will need access to a storage location, preferably a cloud-based location like an Office 365 group team site.
- You could create your time dimension via DAX but, with rare exceptions, DAX should be used for analysis (measures, queries) and row-level security, not for data transformation.
- Though the table, at 86,400 rows (one row per second in a day), is a bit larger than most date dimension tables, you can still comfortably add several columns (particularly low-cardinality columns) without significantly impacting the size of the dataset.
Steps to Implement
If you want to try out this custom time dimension in a Power BI dataset, you can follow these steps:
- Download the Create a Time Table.pq file and open Power BI Desktop
- Launch the Power Query Editor by clicking the Edit Queries icon (on the Home tab)
- Click the New Source dropdown and select ‘Blank Query’
- Open the Advanced Editor pane and clear/delete any default syntax (let … in)
- Paste the time expression from the .pq file into the Advanced Editor window and click Done
- Give the query a name like ‘Time’ and click Close & Apply to load the table to the data model.
You’ll now need to create model relationships between your fact table(s) and the time dimension table. Depending on the design of your fact table(s), these relationships might use the Second column (0 to 86,399) or the Time column, similar to how you might use either a date column or a whole number column (such as YYYYMMDD) on your date table.
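If your fact table only exposes a datetime column, one way to relate it to the time dimension on the Second column is to derive a whole-number second-of-day key in Power Query (M). The following is a hedged sketch, where FactSource and [Event Time] are illustrative names for your existing fact query step and its datetime column:

```m
// Hypothetical: FactSource is your existing fact table query step and
// [Event Time] is an illustrative datetime column name.
// Number.RoundDown truncates fractional seconds so the key lands on the
// 0-to-86,399 whole-second grain of the time dimension.
AddSecondOfDay = Table.AddColumn(FactSource, "Second of Day", each
    (Time.Hour([Event Time]) * 3600)
    + (Time.Minute([Event Time]) * 60)
    + Number.RoundDown(Time.Second([Event Time])), Int64.Type)
```

Depending on your data, you may prefer Number.Round over Number.RoundDown so that a value like 3:11:24.6 maps to second 25 rather than 24; either way, the key must resolve to a whole second for the relationship to match.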
Additionally, you should use the Sort By Column feature to sort the Hour column by the Hour Number column and the Hourly Quartile column by the Hourly Quartile Sort column. Finally, you should probably hide the columns used for sorting or not needed for reporting/analysis and create a hierarchy to make it easier to navigate the different levels of this dimension.
If you find that this time dimension query or some modified version is useful, you might then share this table with the data warehouse or ETL team so that the table can simply be loaded directly from the data warehouse like other tables. Alternatively, if for whatever reason engaging the data warehouse/ETL team is not an option, you could establish a single time dimension expression as a standard to be used by other dataset developers, such as via templates (.PBIT).
Remember that .pq files (and M expressions generally) can be reused across Microsoft data products ranging from Excel to Azure Data Factory and, hopefully soon, paginated reports.
Thanks for visiting Insight Quest and feel welcome to share any feedback in the comments. I expect to have a new blog post out soon related to XMLA endpoints for Power BI Premium datasets and several other posts are in the pipeline.
Brett, I have been struggling with this for some time. I created the new table from your query and then I linked that table to my other table (SQL). They are “talking” to each other, but I cannot figure out how to configure the visual to look like your bar chart number 2, above. Can you help me by telling me what item you put in which Power BI field to get it to look like that please? Thank you.
Sorry to hear of your struggles on this issue. I created a hierarchy in the Time dimension table with the following columns and in the following order: Hourly Quartile, Hour, Hour Minute, and Time. Typically there’s not a need to go deeper than the Hour Minute grain for general monitoring purposes, but I just added the hierarchy to the Axis field well of a clustered column chart.
Note: You mention the SQL table and the time dimension table are talking to each other so I assume you’ve already addressed any potential rounding issues. In the source for this blog post I was working with log file data from an on-premises data gateway and thus used the following Power Query (M) expression to account for rounding of the second:
EndTimeSecondOfDay = Table.AddColumn(TDataReplace3, "LocalEndTimeSecondOfDay", each
(Time.Hour([Query Execution End Time Local]) * 3600) + (Time.Minute([Query Execution End Time Local]) * 60) + Number.Round(Time.Second([Query Execution End Time Local])), Int64.Type)
Thanks for visiting this blog and I hope this helps.
Thanks for the help Brett. I’ll try out a few things.