This question has already been answered in the Power BI forums __here__, as well as in a good article by radacad, __here__, but I'm someone who loves to optimize things. If you're working with a large fact table (holding the first set of GPS coordinates) and a large lookup table (holding the second set of GPS coordinates), the calculation will take a long time. Even on my small data set of 80K records and a lookup table of 1.4K rows, the calculation already takes up to 24 seconds for a calculated column (not a measure). Minimum distance calculations are not trivial.

**Note**: Even though I'm working with a calculated column, the same optimization techniques can be applied for a measure.

So what are we working with? I have a fact table, UFO Sightings, and a lookup table, Airports, each containing latitude and longitude coordinates. The goal is to determine whether UFO sightings mostly occur closer to airports (*hint hint, they do*).

Notice that there is no relationship between UFO Sightings and Airports. This is intentional for the purposes of this example.

Here is what we start out with (Calculated Column):

```
Closest Airport Distance (km) =
IF ( ISBLANK ( 'UFO Sightings'[city_latitude] ), BLANK(),
    VAR _RowLat = 'UFO Sightings'[city_latitude]
    VAR _RowLong = 'UFO Sightings'[city_longitude]
    // Find the airport with the closest distance
    VAR _ClosestAirportDistance =
        MINX(
            ALL ( Airports ),
            VAR _SampleLat = Airports[Latitude]
            VAR _SampleLong = Airports[Longitude]
            VAR _P = DIVIDE( PI(), 180 )
            VAR _A =
                0.5 -
                COS( ( _SampleLat - _RowLat ) * _P ) / 2 +
                COS( _RowLat * _P ) * COS( _SampleLat * _P ) *
                    ( 1 - COS( ( _SampleLong - _RowLong ) * _P ) ) / 2
            VAR _Result = 12742 * ASIN( SQRT( _A ) )
            RETURN
                _Result
        )
    RETURN
        _ClosestAirportDistance
)
```

Although this creates a simple query plan with just two queries, all the work is done by the formula engine (100% of it!). This is because the storage engine cannot compute complex calculations (like **ACOS**, **SIN**, **COS**, etc.).
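For reference, here is a minimal Python sketch of the same haversine-style formula the DAX iterates with (the airport list is hypothetical; 12742 km is the Earth's diameter):

```python
from math import asin, cos, pi, sqrt

def closest_airport_distance_km(row_lat, row_long, airports):
    """Distance to the nearest airport, mirroring the DAX MINX iteration.

    `airports` is a hypothetical list of (latitude, longitude) tuples.
    """
    p = pi / 180  # degrees -> radians
    best = None
    for sample_lat, sample_long in airports:
        a = (0.5
             - cos((sample_lat - row_lat) * p) / 2
             + cos(row_lat * p) * cos(sample_lat * p)
             * (1 - cos((sample_long - row_long) * p)) / 2)
        d = 12742 * asin(sqrt(a))  # 12742 km = Earth's diameter
        best = d if best is None else min(best, d)
    return best

# One degree of longitude at the equator is about 111.2 km.
print(closest_airport_distance_km(0, 0, [(0, 1), (0, 10)]))
```

The nested loop makes the cost explicit: every fact row pays for a full scan of the lookup table, which is exactly what the optimizations below try to reduce.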

**1. Remove columns from the lookup table**

In my case, the Airports lookup table contains columns that are not needed for the distance calculation (e.g. airport name, airport city). The only columns needed are latitude and longitude. Therefore, instead of passing in the entire table, we can use SELECTCOLUMNS to remove the unneeded columns. The optimizer may already do this for us, but it's good practice to do it ourselves anyway.

```
VAR _AirportLongLat =
    SELECTCOLUMNS(
        'Airports',
        'Airports'[Latitude],
        'Airports'[Longitude]
    )
```

In this instance, it appears the optimizer already knows that our calculation relies solely on the latitude and longitude columns, so our SELECTCOLUMNS is not really needed and does not improve performance. We can confirm this in the second xmSQL query.

**2. Reduce cardinality of coordinates**

The speed of any query depends greatly on the cardinality of the column(s) you are filtering over. Our case is slightly different because we are iterating over the values (not filtering), but it may be worthwhile to explore this option.

For my purposes, I'm interested in large distances (greater than 1 km), so I do not need the precision of 6 or more decimal places (which gives you sub-meter precision). Although reducing the precision to 3 decimal places is a good idea, it only gives me a minimal efficiency boost (precision still remains great, though!).

37.24315904° N, 115.79308726° W

to

37.243° N, 115.793° W

Even if this technique doesn't improve the speed of the calculation much, it does reduce the size of our pbix file due to better compression. We can see this size reduction by viewing the metrics with DAX Studio, under the Advanced tab.

596KB -> 226KB = 62% size reduction!
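To see why 3 decimal places are safe at km scale, here is a small Python sketch (the coordinate is the example above; the 111.19 km/degree figure is the approximate length of one degree of latitude):

```python
# Rounding to 3 decimal places moves a coordinate by at most 0.0005 degrees,
# i.e. about 55 m of latitude -- negligible when you care about km-scale distances.
lat, long = 37.24315904, -115.79308726

lat_r, long_r = round(lat, 3), round(long, 3)

# Worst-case latitude error from rounding: half of 0.001 degrees.
max_error_km = 0.0005 * 111.19  # ~0.056 km

print(lat_r, long_r)  # 37.243 -115.793
print(round(max_error_km, 3))
```

Fewer distinct values also means better run-length/dictionary compression in the VertiPaq engine, which is where the pbix size reduction comes from.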

**3. Avoid checking for nulls with IF()**

Some of my latitude and longitude values are missing in my fact table. It would be silly to run expensive trigonometry expressions over these missing values. This is why I'm checking for null values before doing the distance calculation. However, it turns out that IF and ISBLANK are expensive operations as well.

Removing this logical check improves speed by about 8-10%!

**4. Pre-Filter the lookup table**

We know that the storage engine cannot perform the trigonometry math (ACOS, COS & SIN) required to calculate the distance. So it looks like our only option is to limit the number of iterations the formula engine needs to perform. My first thought is to filter the dimension table (airports in my case) as much as possible before performing the trig math.

One way to do this is filter the table for close matches. For example, given the GPS coordinate (0, 0) and the comparison coordinates of (1, 1) and (10, 10), we can easily (even visually) see that the first coordinate of (1, 1) is closer than (10, 10).

Therefore, we have the option to narrow down the dimension table to latitude and longitude values within a specified range (within X degrees). In my case, I set this to 5 degrees (represented by the _Buffer variable):

```
VAR _Buffer = 5
VAR _FilteredAirports =
FILTER(
Airports,
Airports[Latitude] >= _RowLat - _Buffer &&
Airports[Latitude] <= _RowLat + _Buffer &&
Airports[Longitude] >= _RowLong - _Buffer &&
Airports[Longitude] <= _RowLong + _Buffer
)
```

This is by far the best improvement so far. The execution time is now just under 5 seconds: a 4x speed boost!

Combining this technique with the reduced cardinality saves us an additional second as well, so we end up with around 3.6 seconds.
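The same pre-filter pipeline can be sketched in Python: cheap range comparisons cut the candidate list down before the expensive trigonometry runs. The airports and the 5-degree buffer here are illustrative, not from the article's data set:

```python
from math import asin, cos, pi, sqrt

def haversine_km(lat1, long1, lat2, long2):
    # Same haversine-style formula as the DAX; 12742 km is the Earth's diameter.
    p = pi / 180
    a = (0.5 - cos((lat2 - lat1) * p) / 2
         + cos(lat1 * p) * cos(lat2 * p)
         * (1 - cos((long2 - long1) * p)) / 2)
    return 12742 * asin(sqrt(a))

def closest_distance_km(row_lat, row_long, airports, buffer=5):
    # Cheap range comparisons first: only airports within +/- `buffer`
    # degrees survive to the expensive trigonometry step.
    candidates = [(la, lo) for la, lo in airports
                  if row_lat - buffer <= la <= row_lat + buffer
                  and row_long - buffer <= lo <= row_long + buffer]
    return min((haversine_km(row_lat, row_long, la, lo)
                for la, lo in candidates), default=None)

# Hypothetical airports: only the first one survives the 5-degree buffer.
airports = [(36.08, -115.15), (40.64, -73.78), (51.47, -0.45)]
print(closest_distance_km(37.243, -115.793, airports))
```

Note that if no airport falls inside the buffer, the result is blank (`None` here), so the buffer must be chosen large enough to always contain at least one candidate.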

**5. Using the Pythagorean theorem**

Using the Pythagorean theorem (a^2 + b^2 = c^2) will not give accurate distance results. However, if your data is focused on a single part of the globe, say a single country, it may be possible to use the Pythagorean theorem to pre-calculate an estimated minimum distance first. That gives you the closest coordinates (the closest airport in my case), and you can then calculate the accurate distance using the more complex distance formula on just that row. The DAX code becomes a bit more verbose:

```
// Add an estimated distance column to the Airports table using the Pythagorean theorem
VAR _ClosestAirportDistances =
    ADDCOLUMNS(
        'Airports',
        "@pth_distance",
        VAR _SampleLat = 'Airports'[Latitude]
        VAR _SampleLong = 'Airports'[Longitude]
        VAR _Result =
            SQRT(
                ( _SampleLat - _RowLat ) * ( _SampleLat - _RowLat ) +
                ( _SampleLong - _RowLong ) * ( _SampleLong - _RowLong )
            )
        RETURN
            _Result
    )
// Find the minimum Pythagorean distance
VAR _MinAirportDistanceBasedOnPyth =
    MINX(
        _ClosestAirportDistances,
        [@pth_distance]
    )
// Calculate the accurate distance on only the closest airport
VAR _TrueClosestDistance =
    MINX(
        FILTER(
            _ClosestAirportDistances,
            [@pth_distance] = _MinAirportDistanceBasedOnPyth
        ),
        VAR _SampleLat = 'Airports'[Latitude]
        VAR _SampleLong = 'Airports'[Longitude]
        VAR _P = DIVIDE( PI(), 180 )
        VAR _A =
            0.5 -
            COS( ( _SampleLat - _RowLat ) * _P ) / 2 +
            COS( _RowLat * _P ) * COS( _SampleLat * _P ) *
                ( 1 - COS( ( _SampleLong - _RowLong ) * _P ) ) / 2
        VAR _Result = 12742 * ASIN( SQRT( _A ) )
        RETURN
            _Result
    )
```

This technique is about twice as fast as our original DAX code. However, it is still not faster than pre-filtering the lookup table using a buffer (technique 4 above), so it may be best used in combination with that pre-filter. **I would urge caution**, though, because in my case I ended up with some different distances: the Pythagorean theorem returned different airports. Out of 80K rows, 4.7K returned different distances, off by 3.7 km on average, with a largest difference of 56 km.
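The mismatch is easy to reproduce with a small Python sketch (the coordinates are made up): at high latitudes a degree of longitude is much shorter than a degree of latitude, so the flat Pythagorean estimate can pick a different nearest point than the true spherical distance:

```python
from math import asin, cos, pi, sqrt

def haversine_km(lat1, long1, lat2, long2):
    # Haversine-style spherical distance; 12742 km is the Earth's diameter.
    p = pi / 180
    a = (0.5 - cos((lat2 - lat1) * p) / 2
         + cos(lat1 * p) * cos(lat2 * p)
         * (1 - cos((long2 - long1) * p)) / 2)
    return 12742 * asin(sqrt(a))

def flat_estimate(lat1, long1, lat2, long2):
    # Pythagorean theorem on raw degrees: cheap, but treats a degree of
    # longitude as if it were as long as a degree of latitude.
    return sqrt((lat2 - lat1) ** 2 + (long2 - long1) ** 2)

row = (60.0, 0.0)                      # hypothetical sighting at 60° N
airports = [(60.0, 3.0), (62.5, 0.0)]  # hypothetical airports

nearest_flat = min(airports, key=lambda a: flat_estimate(*row, *a))
nearest_true = min(airports, key=lambda a: haversine_km(*row, *a))

print(nearest_flat)  # (62.5, 0.0) -- the flat estimate picks the wrong airport
print(nearest_true)  # (60.0, 3.0) -- actually closer, ~167 km at this latitude
```

The flat estimate scores the eastern airport at 3 "degrees" versus 2.5 for the northern one, but on the sphere the eastern airport is roughly 167 km away versus 278 km, so the subsequent exact calculation runs on the wrong airport.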

**6. Cross-Join on the lookup table**

Another option is to do a cross-join between the fact and lookup tables, performing the distance calculation on one big table instead of iterating between two tables. That said, the query plan only has two queries, and my guess is that performance would be equivalent to, if not worse than, our original code.

And indeed, using a cross-join seems to be the wrong choice.

My code ran for more than 30 seconds, so it's not even worth exploring further. It is interesting, though, that even with the same total number of iterations (80K x 1.4K), iterating over two tables instead of one big table results in better-performing DAX.

**Conclusions**

Whenever you need to optimize your DAX code, reducing the number of iterations is always the best thing to do. In this article, we achieved a 4x speed optimization by pre-filtering the lookup table before applying any type of calculation. We refined this further with other techniques: using fewer decimal places for the GPS coordinates and removing logical checks.

Optimizing DAX is a fun and exciting topic. If you're interested in more optimizing techniques, check out my other article __here__ where I talk about SQLBI's book, __Optimizing DAX__.

This blog was in part a continuation of my other blog post about __UFO Sightings__.

Cheers!
