Convert UTC to Local Time with Daylight Savings Support in Power BI

Update: The Power BI service was recently updated so that DateTimeZone.ToLocal() now uses the client time zone, which means the solution below is no longer needed to resolve the problem discussed.


Power BI provides exceptional business analytics services, and its Power Query component provides a very powerful and dynamic tool for loading and transforming data into Power BI’s data model. However, one thing Power Query lacks is a good way to convert a Coordinated Universal Time (UTC) value to a local time of your choosing while honoring Daylight Savings Time (DST). Time zone functions do exist; however, one has to be careful how and where they are used. Below I explain the problem with Power Query’s ToLocal() function in the Power BI Service in more detail, as well as my solution for converting UTC to the Pacific Time Zone.

The Problem

We built an online event management and ticketing system for one of our clients. Events can be scheduled and published. Customers purchase tickets online or from agents at the event sites. The date and time of each event and the date and time that purchases are made for each event are recorded in UTC.

The client’s accounting team needs to report on and analyze ticket purchases. The accountants are in US Pacific Time (PT), and all events are held in Pacific Time, so it makes sense to convert UTC to PT. This would be an easy modification except for the time change that takes place twice a year in most of the US, as well as various parts of the world, called Daylight Savings Time (DST). Under Daylight Savings Time, clocks are adjusted ahead one hour in the spring and back one hour in the fall, hence “Spring forward” and “Fall back”. The reasons why, and whether there is really any benefit with regard to the safety of school kids or energy savings, are beyond the scope of this article. Clearly, though, there is a cost to handling this properly in transactional systems.

A simple Power Query expression can transform Coordinated Universal Time to local time:

    DateTimeZone.ToLocal(DateTime.AddZone([EventStartDateTime], 0))

This expression works great as long as you are working in Power BI Desktop on a machine set to Pacific Time. However, as soon as the model is published to the Power BI Service and the data is refreshed, the date times will no longer be in Pacific Time: ToLocal() converts to the local time of the Power BI servers, which are set to Coordinated Universal Time, as any properly configured server should be.

The Solution

To resolve the Daylight Savings Time issue, we need to determine whether a date falls within DST before adding an hour to the offset. My solution does this with a table of DST periods whose start and end instants are expressed in UTC. I created a Power Query function that looks up a UTC-based value and returns 1 if the value occurs during DST and 0 if it does not. I then add that value to the time zone offset.

My table of Daylight Savings Times:

Table of US Daylight Savings Times
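The table itself appears as an image in the original post, but its shape can be sketched. Each row records the UTC instants at which a US DST period begins and ends; for Pacific Time, DST begins at 2:00 AM PST (10:00 UTC) on the second Sunday of March and ends at 2:00 AM PDT (09:00 UTC) on the first Sunday of November. A minimal Python sketch, with example rows I constructed rather than the author’s actual data:

```python
from datetime import datetime

# Illustrative stand-in for the DaylightSavings table. The column
# names (DSTStartDateInUTC, DstEndDateInUTC) match those used in the
# Power Query function; the rows are examples, not the author's data.
daylight_savings = [
    # (DSTStartDateInUTC,            DstEndDateInUTC)
    (datetime(2017, 3, 12, 10, 0), datetime(2017, 11, 5, 9, 0)),
    (datetime(2018, 3, 11, 10, 0), datetime(2018, 11, 4, 9, 0)),
    (datetime(2019, 3, 10, 10, 0), datetime(2019, 11, 3, 9, 0)),
]
```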

The Power Query function:

  Source = (DateToCheck) => let
      CountOfRows =
          if DateToCheck is null then 0
          else
              let
                  Source = DaylightSavings,
                  DateToCheckDate = DateTime.From(DateToCheck),
                  StartDates = Table.SelectRows(Source, each [DSTStartDateInUTC] <= DateToCheckDate),
                  EndDates = Table.SelectRows(StartDates, each [DstEndDateInUTC] > DateToCheckDate),
                  CountOfRows = Table.RowCount(EndDates)
              in
                  CountOfRows
  in
      CountOfRows

The function selects rows from the table where the UTC DateToCheck parameter provided is between a DST Start Date and DST End Date. If the table is configured correctly, at most one row will be returned when a date is in a DST period, and no rows will be returned when the DateToCheck is not within a DST start and end date range.

Note that the date time function parameter, DateToCheck, does not have a type declaration, making it of type any. In Power Query, the any type can be null, while datetime is not nullable. Since some of the source datetime values are null, the any type is required for the parameter.
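The lookup logic can be restated as a short, language-neutral Python sketch (the function and variable names here are mine, not the Power Query code): return 1 when the UTC value falls inside a listed DST period and 0 otherwise, with null (None) mapping to 0:

```python
from datetime import datetime
from typing import Optional

def check_daylight_savings(
    date_to_check: Optional[datetime],
    dst_periods: list[tuple[datetime, datetime]],
) -> int:
    """Return 1 if date_to_check (UTC) falls inside a DST period in
    the table, else 0. None maps to 0, mirroring the null check on
    the untyped (any) parameter in the Power Query function."""
    if date_to_check is None:
        return 0
    # Start bound is inclusive, end bound exclusive, matching
    # [DSTStartDateInUTC] <= d and [DstEndDateInUTC] > d.
    return sum(
        1
        for start, end in dst_periods
        if start <= date_to_check < end
    )
```

With a correctly built table, at most one period matches, so the result is always 0 or 1.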

The value of the function is then used when offsetting UTC to local time. If the original date time is in a DST range, an additional hour will be added to the offset.

This function makes it easy to adjust our Pacific Time conversion to account for Daylight Savings Time. Our custom column formula now looks like this:

      DateTimeZone.SwitchZone(
          DateTime.AddZone([CreationDate], 0),
          -8 + CheckDaylightSavings([CreationDate])
      )

DateTimeZone.SwitchZone() replaces ToLocal(). We pass in -8 hours which is the standard offset of US Pacific Time from UTC. We then call the custom function CheckDaylightSavings() which will move the time one hour forward if the passed in date time is in DST.
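Putting the pieces together, the arithmetic can be sketched in Python (illustrative names and a single example DST row, not the author’s table): a fixed -8 hour offset gets one hour added back whenever the UTC value falls inside a DST window:

```python
from datetime import datetime, timedelta

# Example DST period for 2018 (2:00 AM PST = 10:00 UTC start,
# 2:00 AM PDT = 09:00 UTC end); illustrative, not the author's data.
DST_PERIODS = [
    (datetime(2018, 3, 11, 10, 0), datetime(2018, 11, 4, 9, 0)),
]

def utc_to_pacific(utc_value: datetime) -> datetime:
    """Shift a naive UTC datetime to US Pacific Time: the standard
    -8 hour offset plus a +1 hour correction during DST."""
    in_dst = any(start <= utc_value < end for start, end in DST_PERIODS)
    return utc_value + timedelta(hours=-8 + (1 if in_dst else 0))
```

For dates covered by the table, this matches what the America/Los_Angeles tzdata rules would produce.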


This seems like a lot of work to adjust for Daylight Savings Time, but it is the simplest and most straightforward way I could come up with. This isn’t a perfect solution by any means. Here are some drawbacks:

  • Only Daylight Savings Time periods that have an entry in the Daylight Savings Time table are considered, so the table must be kept up to date
  • This approach only converts to Pacific Time. Other time zones could easily be supported, but additional work is required to handle time zones more dynamically if some users are in a different time zone and would like their reports localized.

There are other approaches I’ve seen, ranging from implementing an Azure Function, to calling a date/time web service, to a fairly in-depth function with extensive parameterization.

If you’d like my code, have other approaches, or questions or feedback on this approach, please post a comment.

Join the Conversation




  1. I just did a test using DateTimeZone.ToLocal with a report published to Power BI Service and it is converting the time to the local time on my local computer, not the server time. I updated the time zone on my computer and retested and DateTimeZone.ToLocal converted the time to the updated time zone on my local computer. I think there have been updates with Power BI so this solution isn’t needed anymore?

    1. Abram, many thanks for your comment!

      I’ve tested this myself and can confirm that DateTimeZone.ToLocal() is now using the client time zone and not that of the servers running the service.

      I’ll update the post as well.

      1. I just found DateTimeZone.ToLocal() is NOT working on Power BI Service.

        Desktop is in AEST (Australian Eastern Standard Time) which is my local time.

        However, as soon as I refresh the PBI service all dates revert back to UTC :(

  2. Noob question:

    How do I implement Steven’s solution?

    I have a table with one column being UTC. I need a new column with PST / PDT.

    Do I put his code in a calculated column? Or create a function that gets called by the column?

    1. Hi Hendrik,

      With my approach, you create a local time column in Power Query and execute a function against the column.

      With Steven’s approach, you are using a calculated column.

      There are pros and cons with each. My approach calculates the values at data load/refresh time. Steven’s approach calculates as needed/when used.


  3. I also wanted a way to address this without referencing a table. Here’s what I’ve done:

    Last Refresh Pacific =

    // First let's find the date of the second Sunday of March.
    // For the formula below:
    // In DAX, we call DATE() and WEEKDAY().
    // "1+7*2", generically "1+7*n", where n represents
    // the nth occurrence of that "weekday name" in the month.
    // "8-1", generically "8-y", where
    // y is a number from 1 to 7, where
    // 1 = Sunday through to 7 = Saturday.
    // Thus, the date of the second Sunday of the month
    // is returned to the variable ZZQQ_DST_START.
    // ZZQQ_DATE is the date/time in UTC.
    VAR ZZQQ_DST_START =
        DATE ( YEAR ( ZZQQ_DATE ), 3, 1 + 7 * 2 )
            - WEEKDAY ( DATE ( YEAR ( ZZQQ_DATE ), 3, 8 - 1 ) )

    // Next let's find the date of the first Sunday in November
    VAR ZZQQ_DST_END =
        DATE ( YEAR ( ZZQQ_DATE ), 11, 1 + 7 * 1 )
            - WEEKDAY ( DATE ( YEAR ( ZZQQ_DATE ), 11, 8 - 1 ) )

    RETURN
        IF (
            // Evaluate if ZZQQ_DATE is between DST start and DST end
            // If yes, change time to Pacific Daylight Savings Time
            // else, change time to Pacific Standard Time
            ZZQQ_DATE >= ZZQQ_DST_START && ZZQQ_DATE < ZZQQ_DST_END,
            ZZQQ_DATE + ( -7 / 24 ), // If condition matches, convert UTC to PDT
            ZZQQ_DATE + ( -8 / 24 ) // Else convert UTC to PST
        )
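The “1+7*n” / “8-y” arithmetic described in the DAX comments above can be sanity-checked outside DAX. A small Python sketch (names are mine; dax_weekday mimics DAX’s WEEKDAY() with its default numbering, 1 = Sunday through 7 = Saturday):

```python
from datetime import date, timedelta

def dax_weekday(d: date) -> int:
    """DAX WEEKDAY() with the default return type: 1 = Sunday ... 7 = Saturday."""
    return d.isoweekday() % 7 + 1

def nth_weekday(year: int, month: int, n: int, y: int) -> date:
    """Date of the nth occurrence of weekday y (1 = Sunday) in a month,
    using the same arithmetic as the DAX comment:
    DATE(year, month, 1 + 7*n) - WEEKDAY(DATE(year, month, 8 - y))."""
    return date(year, month, 1 + 7 * n) - timedelta(
        days=dax_weekday(date(year, month, 8 - y))
    )
```

For example, nth_weekday(year, 3, 2, 1) gives the second Sunday of March and nth_weekday(year, 11, 1, 1) the first Sunday of November, the two US DST transition dates.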

  4. Hello Phil,

    Great and insightful post. However, I am wondering if you have been able to reach a solution now in 2018 where you do not have to add a custom column. Our team has several date columns and to create a duplicate of each column to invoke the custom function will add a lot of heaviness to our tables since they are already very large. Have you been able to find a leaner solution since your post? Thanks!

    1. Hi Kaden,
      Glad you found the post helpful. Regarding the tables, consider implementing a single Dates table that is referenced via a date key, and only add the columns to the Dates table.