Excel What-If Data Tables: Faster calculation with VBA

For some reason this year I keep on meeting Excel’s What-If Data Tables. These are used (often in real estate and financial valuation models) to simplify doing sensitivity analysis against a range of values for certain inputs such as interest rates. What-If Data Tables are a great tool but they tend to calculate very slowly, in fact more slowly than I think they should.

Using What-If Data Tables

What-If Data Tables can be either 1-dimensional or 2-dimensional. You create them using the What-If pull-down on the Data tab.

First you need to create the outline for the Data Table:

This is a 2-dimensional data table. The idea is that you have a single formula (in cell E7) that shows the result of a set of calculations that are dependent on 2 constants (in F3 and G3). We want to recalculate the workbook for every combination of the values in the row F7:H7 and the column E8:E12, and place the corresponding result in F8:H12.

Start by selecting the range E7:H12 and click the What-If Data Table button:

You are asked to pick the Row Input Cell and Column Input Cell, and when you click OK the range F8:H12 gets filled with a TABLE array function formula {=TABLE(F3,G3)}.

Note that for some strange reason Excel requires the row input and column input cells to be on the same worksheet as the data table.

What-If data tables calculate slowly because Excel has to recalculate many times (in this case 3 x 5 = 15 times). So you need to be in Manual calculation mode, or possibly in Automatic Except Data Tables mode (this makes calculation automatic but does not do the iterative calculation of the data tables).

Speeding Up What-If Data Table Calculation

I did some experiments to see if the data table calculation could be sped up and found:

  • Each iteration is calculated using single-threaded calculation.
  • The process starts with a multi-threaded recalculation, then iterates using single-threaded recalculation, then finishes with an additional recalculation.
  • There seems to be quite a large overhead to this overall process.
  • If one combination of values in the table is the same as the initial set of values, it gets recalculated anyway even though this is not needed.

The VBA Solution

So I wrote some VBA to see if by eliminating these problems I could make it run faster.
Note that the VBA is written as a proof-of-concept demonstration rather than production level code.

Sub IterateTables()
' Proof-of-Concept code
' for faster calculation of a 2-D What-If data table
' Copyright Charles Williams, Decision Models 16-December-2015
Dim rngTable As Range
Dim rngRowCell As Range
Dim rngColCell As Range
Dim varRowSet As Variant
Dim varColSet As Variant
Dim varResults() As Variant
Dim rngFormula As Range
Dim nRows As Long
Dim nCols As Long
Dim lCalcMode As Long
Dim j As Long
Dim k As Long
Dim varStartRowVal As Variant
Dim varStartColVal As Variant
Dim varFirstVal As Variant
Dim blCalculated As Boolean
Dim dTime As Double
' get the what-if table and its formula
Set rngTable = ActiveCell.CurrentRegion    ' expand active cell to the current region
Set rngFormula = rngTable.Cells(1, 1)   ' Top-Left corner
nRows = rngTable.Rows.Count - 1         ' number of rows in the Column of what-if values
nCols = rngTable.Columns.Count - 1      ' number of columns in the row of what-if values
' get the row and column input cells using RefEdit
With ufIterTable
.RefEditRow.Value = ""
.RefEditCol.Value = ""
' show the userform so the user can select the row and column input cells
.Show
If .RefEditRow.Value <> "" Then Set rngRowCell = Range(.RefEditRow.Value)
If .RefEditCol.Value <> "" Then Set rngColCell = Range(.RefEditCol.Value)
End With
' if 2-D and we have got the row and column cells then proceed
If nRows > 0 And nCols > 0 And Not rngRowCell Is Nothing And Not rngColCell Is Nothing Then
dTime = Microtimer
' create output results array
ReDim varResults(1 To nRows, 1 To nCols)
' get row and column arrays of what-if values
varRowSet = rngFormula.Offset(0, 1).Resize(1, nCols).Value2
varColSet = rngFormula.Offset(1, 0).Resize(nRows, 1).Value2
' set environment
Application.ScreenUpdating = False
lCalcMode = Application.Calculation
If Application.Calculation <> xlCalculationManual Then Application.Calculation = xlCalculationManual
' can only skip initial values if workbook is calculated at start
If Application.CalculationState = xlDone Then
blCalculated = True
Else
blCalculated = False
End If
' initial start values
varStartRowVal = rngRowCell.Value2
varStartColVal = rngColCell.Value2
varFirstVal = rngFormula.Value2
' calculate result for each what-if values pair
For j = 1 To nRows
For k = 1 To nCols
If blCalculated And varRowSet(1, k) = varStartRowVal And varColSet(j, 1) = varStartColVal Then
' if the what-if value pair is the same as the start values then skip the recalc
varResults(j, k) = varFirstVal
Else
' show calc status in the status bar
Application.StatusBar = "What-If Table Row " & j & " Column " & k
' set values for this iteration, recalc, store result
rngRowCell.Value2 = varRowSet(1, k)
rngColCell.Value2 = varColSet(j, 1)
Application.Calculate
varResults(j, k) = rngFormula.Value2
End If
Next k
Next j
' reset status bar
Application.StatusBar = False
' put results back
rngFormula.Offset(1, 1).Resize(nRows, nCols) = varResults
' reset back to initial values & recalc
rngRowCell.Value2 = varStartRowVal
rngColCell.Value2 = varStartColVal
Application.Calculation = lCalcMode
' timer message
dTime = Int((Microtimer - dTime) * 1000) / 1000
MsgBox "Time for " & nRows * nCols & " Iterations: " & dTime & " Seconds"
End If
End Sub

Timing Results

I ran some timing tests using a 4-core i7 with hyper-threading, so that Excel tries to use 8 logical processors for multi-threaded calculation.

The first row gives the time for a single calculation in multi-threaded and single-threaded mode.
The second row gives the time to calculate using Excel’s What-If data table method. You can see that the difference between the single-threaded and multi-threaded times is equal to 2*(single-thread time – multi-thread time).
The third row gives the time to calculate using my VBA routine: a lot faster in multi-threading mode and slightly faster in single-threading mode.

A timing breakdown for XL Data Tables could be:
2 * multi-threaded Calc + 15 * single-threaded Calc + overhead
2*0.65 + 15 * 2.8 + 0.3 = 43.6

And for the VBA:
15 * multi-threaded calc + overhead
15 * 0.65 + 0.15 = 9.9
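These breakdowns are simple linear models; a few lines of Python using the measured figures above (0.65 s per multi-threaded calculation, 2.8 s per single-threaded calculation) confirm the arithmetic:

```python
multi_calc = 0.65      # seconds for one multi-threaded full calculation
single_calc = 2.8      # seconds for one single-threaded full calculation
iterations = 15        # 3 columns x 5 rows of what-if values

# Excel data table: 2 multi-threaded calcs, 15 single-threaded calcs, plus overhead
xl_table = 2 * multi_calc + iterations * single_calc + 0.3

# VBA routine: 15 multi-threaded calcs plus a much smaller overhead
vba_table = iterations * multi_calc + 0.15
```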


You can download the test workbook from


I have made a suggestion in Excel User Voice to use these findings to speed up Excel’s What-If Data Tables. Please vote for it if you think it has merit:


Also it would be great if existing data table users could try out my VBA code to see how it works for them.

I plan to build a more generalised solution into FastExcel Calc.




Posted in arrays, Calculation, VBA | 3 Comments

Excel Full Column References and Used Range: Good Idea or Bad Idea?

Some people love using full column references such as A:A, other people say they slow down your workbook to a crawl and lead to workbook and memory bloat.

The attraction of using whole column references is that you don’t have to adjust your formulas when additional rows of data are added or deleted.

So who is right?

Let’s run some simple tests to find out.

We need to test a variety of functions and formulas. I picked

  • SUM(A:A)
  • MATCH(99,A:A,0)
  • COUNTIF(A:A,99)
  • {IF(A:A=1,1,0)} (array entered into each cell)
  • {IF(A:A=1,1,0)} (one formula array entered into D1:D1000)
  • IF(A:A=1,1,0) (not array entered)

All the test cases have cell A1 set to a value of 1, and the formula is entered into D1:D1000.

To explore the effect of changing the used range and the number of cells containing formatting or data I used a variety of test cases:

  • Test Case 1 – Base case: no other data or formatting in the worksheet.

All the other test cases are derived from the base case, making only the changes mentioned for each case.

Formatting test cases:

  • Test Case 2 – entire column A is colored yellow
  • Test Case 3 – cells A2000:A1048576 are colored yellow
  • Test Case 4 – cell XFD1048576 empty but colored yellow
  • Test Case 5 – every other row in column A colored yellow

Cells with data test cases:

  • Test Case 6 – cell A1048576 contains 98
  • Test Case 7 – cells A2000:A1048576 contain 1
  • Test Case 8 – every other row in column A contains 1

The calculation times are given in milliseconds for the median of 3 single-threaded Range.Calculate timings on the 1000 formulas in D1:D1000. Screen updating time is excluded.

Test Results

Interpretation of Test Results

1. Base Case: No Additional Formatting or Data

The most efficient formula is =IF(A:A=1,1,0)
This formula uses Excel’s implicit intersection method so that each cell containing the formula in D1:D1000 only looks at a single cell in the same row in column A.

The SUM, MATCH and COUNTIF formulas are efficient because each formula only looks at the 1000 cells in column A bounded by the used range (row 1000).

The SUMPRODUCT and single-cell array formula versions of =IF(A:A=1,1,0) force Excel to look at every single one of the million cells in column A, so 1000 × 1,048,576 cells are examined. The IF formula is slower than the SUMPRODUCT because it has to create additional virtual columns.

The multi-cell array version of the IF is a single formula array-entered into the 1000 cells in column D. This single formula looks at 1000000 cells and then returns 1000 results. Because it looks at 1000 times fewer cells it is a lot faster.
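A rough cell-count model (my own back-of-envelope figures, ignoring Excel's internal optimisations) makes the differences concrete:

```python
FULL_COLUMN = 1_048_576   # rows in a whole-column reference (Excel 2007 and later)
N_FORMULAS = 1_000        # formulas entered in D1:D1000

# single-cell array formula in every cell: each formula scans the whole column
single_cell_array = N_FORMULAS * FULL_COLUMN

# one multi-cell array formula over D1:D1000: the column is scanned only once
multi_cell_array = FULL_COLUMN

# implicit intersection: each formula reads only the one cell in its own row
implicit_intersection = N_FORMULAS
```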

2. Column A:A formatted Yellow

Formatting the entire column does not increase the used range and so has virtually no effect on the timings.

3. Format over a Million cells from A2000:A1048576 Yellow

Formatting a large but contiguous block of cells does increase the used range if the start cell address is beyond the previous last used cell. But (surprisingly) it does not extend the used range to the last formatted row!
The timing results show a small increase for those formulas that are dependent on the used range, and the file size increases slightly.

4. Format the Last Cell in the worksheet (XFD1048576) Yellow.

This does dramatically extend the used range: the last used row is now the last row in the worksheet.
The timing results that depend on the size of the used range increase significantly, but the file size does not change.

The results are the same if you color the entire row 1048576 or just the single cell A1048576.

5. Format every other row in column A Yellow.

This creates a much more complex used range, even though the last used row is actually 1 less than test case 4. Excel has to store formatting information for each of half a million rows!
All the timings increase significantly (not just the formulas that are dependent on the used range) because of the additional work needed to traverse the much larger cell table, and the file size shoots up.

6. Set the last cell in column A to 98.

This extends the used range in the same way as formatting the last cell, and the timings and file size are virtually identical to test case 4.

7. Set cells A2000:A1048576 to 1.

You might expect this to behave the same way as test case 3 which just formats this block of cells rather than inserting a constant value.

But the time taken by the formulas dependent on the used range increases by a factor of around 1000, and even the timings for SUMPRODUCT and the array formulas about double. The file size also increases significantly.

Clearly adding data to a contiguous block of cells has a much bigger impact than just formatting a contiguous block of otherwise empty cells.

8. Set every other cell in column A to 1.

This results in about half as many cells containing data as test case 7, but the last used row is only 1 less.
The timings for formulas dependent on the used range increase by about a factor of 2 compared to test case 5, which only formatted the same cells, but decrease by about a factor of 2 compared to test case 7 (which had about double the number of cells containing data). The file size is not much larger than in test case 5.

This seems to show that formula calculation time depends on both the complexity of the used range that the formula has to traverse and the number of cells containing data.


Excel’s sparse storage methods are efficient and allow for the use of whole column references and contiguous blocks of identical formats. But you need to be careful to minimize both the extent and complexity of the used range.

  • Formulas other than array formulas and SUMPRODUCT handle whole column references efficiently.
  • But avoid array formula and SUMPRODUCT usage of whole column references.
  • Non-array formulas using whole column references are much slower with large used ranges, and even slower with complex used ranges.
  • Formulas handle empty cells more efficiently than cells containing data.
  • Excel is optimized to handle contiguous blocks of identical formatting efficiently.
  • File size is not dependent on the size of the used range.
  • File size is largely dependent on the number of cells containing data/formulas and also on the number of non-contiguous formatted cells.




Posted in arrays, Calculation, Formatting, Memory | 5 Comments

Timing Calculation Consistently: Windows Update/Background Process Woes

Last Friday, 4 December 2015, I was presenting a session at the Modeloff Global Training Camp in London. The session was “How to make your Excel workbooks calculate faster”, so it involved demonstrations of timing formula calculation. No problem, I thought: I would use my Surface Pro 3, recently loaded with Windows 10, Excel 2016 and FastExcel V3.

Why do my calculations take twice as long as they did yesterday?

But during my final practice run-through on Thursday the formulas were calculating more than twice as slowly as earlier in the week. And my Surface Pro 3 was getting very hot. And the battery was running down very fast.

Of course when the Surface Pro 3 gets too hot it starts throttling down the I7 processor, but that would not explain a factor of 2, or why it was getting hot in the first place.

So I started up Task Manager and clicked More Details:

In this example you can see that the CPU is running at 79%, with Windows Search and Carbonite Backup chewing up most of the processor time.

But on Thursday it was one of the Service Host processes. Expanding it showed that one of the services was Windows Update, and stopping that service brought the CPU % down to a small percentage.

Of course I had to remember to switch it back on next day.


If you want to get more consistent calculation timings:

  • Don’t let your Surface Pro 3 get too hot!
  • Use Task Manager to see what processes and services are running and chewing up the processors.
  • Temporarily stop any processes and services that are making heavy use of the processors.



Posted in Calculation | Leave a comment

Excel Summit South: Auckland Sydney Melbourne March 2016

I have spent the last 2 years putting together this stellar line-up of speakers (Excel MVPs – the Excel Dev Team – external speakers) for a series of 2-day conferences in Australia and New Zealand in March 2016.

This is a unique opportunity to:

  • Learn from six of the world’s leading Excel MVPs as they discuss the Excel topics most useful to you.
  • Hear industry leading speakers from around the world give you the latest views on Financial Modelling best practices, standards and spreadsheet risk.
  • Shape the future of Excel: Interact with members of the Microsoft Excel Dev Team as you explore with them the future of Excel.
  • Choose the sessions that best suit your needs from 23 masterclass sessions over two days of twin tracks for modellers and analysts.

Find out more at the Excel Summit South website

Posted in Uncategorized | Tagged , , , , | Leave a comment

Excel User Voice and Workbook Calculate

Excel User Voice

At the recent Excel Global Summit the Excel team were keen to explain how they have started using Excel User Voice to ask for and prioritise product improvement suggestions.


There are 2 important things to note about this:

  1. The Excel Dev Team actually read your suggestions on User Voice!
  2. Starting with Excel 2016 the development and ship cycle is much much faster than the traditional 3 years.

So it really is worthwhile making your suggestions to improve the product.
You get 10 votes on suggestions, and the suggestions are categorised by end-point and product area. So if you want to make a performance suggestion for Excel on Android phones you can focus down, see what other suggestions have been made in this area, and either cast a vote for an existing suggestion or make a new suggestion.

Workbook Calculate

To try this out I made a suggestion (about calculation of course).

At the moment from the UI you can either calculate all open workbooks (F9 or Automatic) or a worksheet (Shift F9).
From VBA you can also calculate a range (Range.Calculate and Range.CalculateRowMajorOrder).

But you cannot calculate a single workbook.

This is really annoying when you have 2 versions of a large slow workbook open, or you have a small rapidly changing workbook open that links to a large mostly static workbook.

My suggestion is to add an additional calculation setting: Calculate Active Workbook Only and from VBA Workbook.Calculate.


So please vote for my suggestion if you think it’s a good idea!

Posted in Calculation | Tagged , , | 4 Comments

Writing Efficient VBA UDFs (Part13): MaxMinFair Resource Allocation – an array UDF example

Last year we got to spend a great day at Microsoft Research in Cambridge (UK). One presentation was about storage strategies and featured (amongst other stuff) an interesting algorithm called MaxMinFair.

The MaxMinFair algorithm

You can read about it on Wikipedia, but the basic idea is to share a supply resource fairly between a number of demands without allowing the greedy demands to hog too much of the resource.

The algorithm starts by sharing the supply equally between the demands.
Then any excess supply (supply > demand) is re-shared equally between the demands that have not yet been met.
Then the algorithm continues to reallocate the excess supply until either all demands are met or there is no excess supply to reallocate.
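Before looking at the VBA implementation below, here is a minimal Python sketch of the same iterative scheme (my own illustration of the algorithm, not code from the presentation):

```python
def max_min_fair(supply, demands):
    """Share `supply` across `demands` without letting large demands
    starve small ones (max-min fairness)."""
    allocated = [0.0] * len(demands)
    available = float(supply)
    unsatisfied = sum(1 for d in demands if d > 0)   # demands not yet met
    while unsatisfied > 0 and available > 0:
        share = available / unsatisfied              # equal share of what is left
        available = 0.0
        unsatisfied = 0
        for i, d in enumerate(demands):
            if allocated[i] < d:
                allocated[i] += share
                if allocated[i] >= d:
                    # cap at the demand and collect the excess for the next round
                    available += allocated[i] - d
                    allocated[i] = d
                else:
                    unsatisfied += 1
    return allocated
```

For example, sharing a supply of 10 across demands of 2, 8 and 6 satisfies the smallest demand in full and splits the remaining 8 equally between the two larger demands, giving them 4 each.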

Implementing MaxMinFair as a VBA array UDF

MaxMinFair makes a great example of writing an array formula UDF.
It has 2 parameters – Supply (a single number) and Demands (a set of numbers, usually a Range).

To keep things simple Supply has to be a single number >= 0.0, and Demands has to be a single column vertical range or array of numbers.

The parameters for the function are defined as variants so that the user can give a range or an array of constants or a calculation expression that returns an array of numbers.
The function is defined to return a variant. This allows the function to return either an error value, or a single number or an array of numbers.
The function starts by setting up the error handling and coercing the Ranges to values.
The results of the function are put in an array that is dynamically sized to match the number of demands.

The heart of the UDF is a DO loop that

  • Calculates allocation by dividing the available supply by the number of unsatisfied demands
  • Adds the allocation to each of the unsatisfied demands
  • Collects any excess allocation to be the available supply on the next iteration
  • Counts unsatisfied demands

The DO loop terminates when either there are no unsatisfied demands or there is no available supply to be allocated.

The function finishes by assigning the result array (dAllocated()) to the variant function.

The VBA Code

Here is the VBA code for the function:

Option Explicit
Option Base 1
Function MaxMinFair(Supply As Variant, Demands As Variant) As Variant
' Array function for Max-Min-Fair allocation of supply to demand
' Supply must be a scalar number >=0.0
' Demands must be a scalar number or a single column range or array of data
Dim nUnsat As Long          ''' number of unsatisfied demands
Dim dAlloc As Double        ''' amount to allocate to each unsatisfied demand
Dim dAllocated() As Double  ''' array of amounts allocated to each demand
Dim nRows As Long           ''' number of rows in Demands
Dim nCols As Long           ''' number of columns in Demands
Dim dAvailable As Double    ''' available supply this iteration
Dim j As Long
' set up error handling
On Error GoTo FuncFail
' return #Value if error
MaxMinFair = CVErr(xlErrValue)
' both parameters must contain data
If IsEmpty(Supply) Or IsEmpty(Demands) Then GoTo FuncFail
' convert ranges to values
If IsObject(Demands) Then Demands = Demands.Value2
If IsObject(Supply) Then Supply = Supply.Value2
' Supply must be a scalar number >=0
If IsArray(Supply) Then GoTo FuncFail
If Supply < 0# Then GoTo FuncFail
dAvailable = CDbl(Supply)
If Not IsArray(Demands) Then
' scalar demand: minimum of supply and demand
If Demands < Supply Then
MaxMinFair = Demands
Else
MaxMinFair = Supply
End If
Else
' Demands must be a single column array
nRows = UBound(Demands, 1)
nCols = UBound(Demands, 2)
If nCols > 1 Then GoTo FuncFail
' setup output array
ReDim dAllocated(1 To nRows, 1 To nCols)
' count unsatisfied demands
For j = 1 To nRows
' count demands not yet satisfied (CDbl raises an error if a demand is not a number)
If dAllocated(j, 1) <> CDbl(Demands(j, 1)) Then nUnsat = nUnsat + 1
Next j
If nUnsat = 0 Then GoTo Finish
' iterate allocating available supply to unsatisfied demands
Do
' amount to allocate to each unsatisfied demand
dAlloc = dAvailable / nUnsat
nUnsat = 0
dAvailable = 0#
' share available supply equally across unsatisfied demands
For j = 1 To nRows
If dAllocated(j, 1) < Demands(j, 1) Then
dAllocated(j, 1) = dAllocated(j, 1) + dAlloc
End If
Next j
' collect excess supply for next iteration
For j = 1 To nRows
If dAllocated(j, 1) >= Demands(j, 1) Then
' remove and accumulate excess supply
dAvailable = dAvailable + dAllocated(j, 1) - Demands(j, 1)
dAllocated(j, 1) = Demands(j, 1)
Else
' count unsatisfied demands
nUnsat = nUnsat + 1
End If
Next j
' if all supply allocated or all demands met then finish
If nUnsat = 0 Or dAvailable = 0# Then Exit Do
Loop
Finish:
' return array of results
MaxMinFair = dAllocated
End If
Exit Function
FuncFail:
MaxMinFair = CVErr(xlErrValue)
End Function


Here is a small example. The UDF is entered into C2:C8 as a multi-cell array formula (select C2:C8, type the formula into the formula bar without the { }, then press Control-Shift-Enter)


You can see that the total demand is 25.9 but the supply is only 18.3. MaxMinFair has satisfied all the demands except for the 2 largest ones, which have both been allocated the same 4.9.


MaxMinFair is a good choice when you want to allocate resources without allowing large resource demands to starve small resource demands.

Implementing the algorithm as an array UDF is a good opportunity to demonstrate a variety of VBA UDF techniques.

Posted in arrays, Calculation, UDF, VBA | Tagged , , | Leave a comment

Excel Circular References: Calculation, Detection, Control and Removal

Circular references in Excel are generally bad news:

  • They are slow to calculate
  • They can be hard to detect
  • An intentional circular reference can mask an unintended circular reference
  • They do not always converge
  • The Status Bar always shows Calculate, even in Automatic mode

What are Excel Circular References?

Excel circular references occur either when a formula refers to itself or when a chain of formulas links back to its starting point.
Most of the time circular references are created by mistake (unintended circular references) and should be corrected. They can also be used to solve iterative or goal-seeking problems in Finance and Engineering.

Why circular references are slow to calculate.

Excel uses a special iterative calculation method to solve circular references. First a standard recalculation is done which identifies the circular formulas that cannot be resolved.
Then Excel repeatedly calculates all the unresolved circular formulas using a sheet-by-sheet, left-to-right, top-to-bottom calculation process.
The calculation only stops when either

  • The maximum number of iterations has been reached
  • or the largest change in values caused by an iteration is less than the maximum change

You can control these values using Options–>Formulas.
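In effect Excel is running a bounded fixed-point iteration; here is a Python sketch of the stopping rule, using Excel's default settings of 100 iterations and a maximum change of 0.001:

```python
def iterate_circular(formula, start=0.0, max_iterations=100, max_change=0.001):
    """Re-evaluate a circular formula until the change per iteration
    drops below max_change, or the iteration limit is reached."""
    x = start
    for _ in range(max_iterations):
        new_x = formula(x)
        if abs(new_x - x) < max_change:
            return new_x      # converged
        x = new_x
    return x                  # iteration limit hit: may not have converged

# a convergent circular formula x = 0.5*x + 100: fixed point at 200
converged = iterate_circular(lambda x: 0.5 * x + 100)

# a divergent circular formula x = x + 1: stops only at the iteration cap
diverged = iterate_circular(lambda x: x + 1.0)
```

The divergent case illustrates why the iteration limit matters: Excel stops and reports a result even when the values never settle.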


This calculation process is slow because:

  • The calculation is restricted to using a single core/cpu (single-threaded calculation)
  • The formulas are calculated repeatedly.

Detecting Unintended Circular References.

If ‘Enable Iterative Calculation’ is checked and you create a circular reference then Excel does not tell you that you have created a circular reference.

If ‘Enable Iterative Calculation’ is NOT checked and you enter a formula that creates a circular reference:

In Automatic Calculation Mode Excel shows this message:


In Manual Calculation Mode no message is shown, but the next time you press F9 to recalculate the status bar shows a circular reference message:

If the circular reference is on the active sheet the message shows one of the cells in the circular reference loop:


If the circular reference is not on the active sheet then Excel does not tell you where the circular reference is (but if you select any sheet with a circular reference the status bar shows the cell reference).

You can also use Excel’s circular reference tool on the Formulas Tab:


But none of Excel’s built-in tools will always show you where the circular references are or detect all of the circular references.

I recommend Jan Karel Pieterse’s RefTree Analyzer tool for detecting circular references (warning: the detection process is slow on large workbooks).

Controlling Circular References

A useful technique recommended by Stephen Bullen is to add an IF switch to your circular reference loops so that you can disable them.


If A1 is zero and iteration is disabled then Excel will not regard this calculation as a circular reference, so any circular reference detected is probably intentional. Set A1 to 1 and enable iteration to request Excel to solve using iteration.
Note that not all circular calculations converge to a stable solution. Another useful piece of advice from Stephen Bullen is to test the calculation in manual calculation with the number of iterations set to 1. Pressing F9 will single-step the calculation so you can watch the behaviour and see if you have genuinely converged to the correct solution.

Removing Intended Circular References

Because circular references are slow and difficult to check it is a good idea to remove them wherever possible. Most circular references in financial models can be replaced by simple non-circular formulas.

Unrolling cash flow Interest calculations

You can calculate interest compounded a number of times on a cash balance amount using circular references:
Total Cash = Total Cash * (1+Interest%)

Or you can unroll the calculation into steps:
Interest accumulated once per period:
Total Cash=Cash before Interest *(1+Interest%)
Interest compounded N times per period:
Total Cash=Cash before Interest *((1+Interest%)^N)
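A quick Python check (with illustrative numbers of my own) that the unrolled formula matches iterating the circular definition N times:

```python
def interest_iterated(cash, rate, n):
    # the circular definition Total Cash = Total Cash * (1 + Interest%), applied n times
    for _ in range(n):
        cash = cash * (1 + rate)
    return cash

def interest_unrolled(cash, rate, n):
    # the non-circular equivalent: compound n times in a single formula
    return cash * (1 + rate) ** n
```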

Fee calculated as % of final cost including Fee

Using circular references:
Fee=Fee% * Total Cost
Total Cost=Costs Excluding Fee + Fee

A little algebra rearranges this to avoid the circular reference:
Fee=Fee%*Costs excluding fee / (1-Fee%)
Total Cost=Costs excluding fee + Fee
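Checking the rearrangement numerically (the 10% fee and 90,000 cost are illustrative figures of my own):

```python
fee_pct = 0.10
costs_excluding_fee = 90_000.0

# the non-circular rearrangement
fee = fee_pct * costs_excluding_fee / (1 - fee_pct)
total_cost = costs_excluding_fee + fee
```

The original circular definition, Fee = Fee% * Total Cost, now holds without any iteration.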

Tax deductible as % of Profit after Tax

This is an example of circular references from John Walkenbach’s Excel Bible. A company decides to give a % of their after tax profits to charity. But the charity donation is tax deductible, so the charity donation and the tax and the after tax profit are all inter-related.

Using circular references:
Gross Profit=Revenue-Expenses
Profit Before Tax=Gross Profit – Charity Donation
Tax= Profit Before Tax * Tax%
Charity= Profit after Tax * Charity%
Profit after Tax=Profit Before Tax – Tax

Some algebra rearranges this to avoid the circular references:
Gross Profit=Revenue-Expenses
Profit Before Tax=Gross Profit – Charity Donation
Tax= Profit Before Tax * Tax%
Charity= Profit after Tax * Charity%
Profit after Tax=Gross Profit / ((1/(1-Tax%)) + Charity%)
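The rearrangement is easy to check numerically: compute Profit after Tax directly, then confirm the original circular relationships all hold (the revenue, expenses and percentages are illustrative figures of my own):

```python
revenue, expenses = 250_000.0, 150_000.0
tax_pct, charity_pct = 0.30, 0.10

gross_profit = revenue - expenses
# non-circular: profit after tax computed directly from gross profit
profit_after_tax = gross_profit / ((1 / (1 - tax_pct)) + charity_pct)
# the remaining figures now resolve without any circularity
charity = profit_after_tax * charity_pct
profit_before_tax = gross_profit - charity
tax = profit_before_tax * tax_pct
```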


  • Circular References should be avoided wherever possible
  • Most Financial circular references can be eliminated
Posted in Calculation | Tagged , , , | Leave a comment

Excel ForceFullCalculation: Trading off Editing speed vs Calculation speed

A recent post at Stack Overflow reminded me that I have not posted about the pros and cons of Excel’s Workbook.ForceFullCalculation property.

So what does Workbook.ForceFullCalculation do?

The simple answer is that it switches off Excel’s smart recalculation algorithm, so that every calculation recalculates every formula in all open workbooks.

Why is ForceFullCalculation Bad?

ForceFullCalculation slows down calculation because every formula is treated as though it is volatile and gets calculated, rather than only calculating the formulas that depend either on something that has changed since the last calculation or on a volatile function.

When is ForceFullCalculation Good?

When ForceFullCalculation is switched on Excel does not need to build and maintain the dependency trees. The dependency trees are created each time you open a workbook, and are maintained every time you make a change to a workbook.

So if

  • Your workbook takes a long time to open
  • Deleting or changing formulas takes a long time

then switching on ForceFullCalculation could speed things up a lot.

Also the dependency trees can take up a lot of memory so eliminating them can be good.

So it’s a trade-off: faster workbook opening and editing and a smaller memory footprint, versus slower calculation.
(And you almost certainly need to be using Manual Calculation mode.)

So how do you switch it on?

There are several ways of setting ForceFullCalculation:

  • Press Alt-F11 to go to the Visual Basic Editor, select the workbook in the Project Explorer, press F4 to see the workbook properties and change ForceFullCalculation to True.
  • Execute the VBA statement ActiveWorkbook.ForceFullCalculation = True from a Sub or the Immediate window.
  • If you have FastExcel V3 installed then you can set it from the FastExcel Workbook Calculation Options page.

The Quirks of ForceFullCalculation

There are a few more things you should know about this property:

  • Although it is a Workbook Property it works at Application level and so affects all open workbooks.
  • It does get saved with the workbook and, if True, re-establishes itself when the workbook is reopened.
  • It makes “Calculate” appear in the StatusBar, and you cannot make it go away with any of the F9 calculate commands.
  • Once you have turned it on in an Excel session turning it off again does not restore Smart Recalculation: you have to close and restart Excel.

Trying ForceFullCalculation out

Here are Jimwinz’s instructions from Stack Overflow for creating a demonstration of the problem:

  1. Create a new file (save and exit other Excel files!)
  2. To create the data to count, in A2 enter =RANDBETWEEN(0,10000) if you have a slow CPU or =RANDBETWEEN(0,20000) if you have a faster CPU
  3. Copy A2 down to row 10000 or 20000.
  4. Copy and paste values for column A
  5. Switch to Manual Calculation
  6. In B2, enter =COUNTIF($A$1:A2,A2), copy down and press F9 to calculate
  7. Once sheet is fully calculated, select all the CountIf cells in column B and use the Clear All function. This is the stage where Excel seems to freeze for a long time.
  8. Now click Undo to get the formulas back.
  9. Switch on ForceFullCalculation using one of the methods shown above. The statusbar should show Calculate and pressing F9 does not make it go away. Now Clear All is virtually instantaneous.

So what is your experience with this nifty workbook property?



Posted in Calculation, Memory, Uncategorized, VBA | Tagged | 7 Comments

Timing Excel Formula Calculations

I have written a couple of posts previously about the methodology of timing Excel calculations:

A Quantum of Time – measuring time accurately

The XIPS Challenge – how fast does Excel calculate?

But rather than focus exclusively on how to time Excel calculations, this post also looks at why you might want to and what you should measure.

Why time Excel calculations anyway?

There are only a few reasons I can think of for why you might want to time calculations:

  • a) To find out if it is the calculation that is slowing things down or something else.
    • Could be VBA, Screen Refresh, Data Refresh, not enough RAM, Addins …
  • b) To prioritize the formulas that are responsible for slow calculation.
  • c) To compare 2 different ways of calculating the same result to see which is faster.
  • d) Because it’s interesting if you are a geek like me.

Most people focus on c) followed by b).

Exactly what should you be measuring?

Elapsed time vs CPU Time

Usually you measure elapsed time: how much time passes from starting something (a macro, a calculation, …) to finishing it.
Sometimes it would be nice to measure CPU time (the amount of time your CPU cores are actually working), so that you can see whether time is being lost waiting on paging or whether you are using all your available cores efficiently.
Of course measuring elapsed time is a lot easier, so that is what we do most of the time.

And anyway as Excel users it’s elapsed time that we want to minimize.
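To measure elapsed time accurately you need better resolution than VBA’s Timer function. A version of the widely used MicroTimer routine, which wraps the Windows high-resolution performance counter, looks like this (64-bit declares shown; drop PtrSafe for old 32-bit Excel):

```vba
' High-resolution elapsed-time measurement using the Windows
' performance counter.
Private Declare PtrSafe Function QueryPerformanceFrequency Lib "kernel32" _
    (cyFrequency As Currency) As Long
Private Declare PtrSafe Function QueryPerformanceCounter Lib "kernel32" _
    (cyTickCount As Currency) As Long

Function MicroTimer() As Double
    ' Returns seconds since an arbitrary fixed point,
    ' accurate to microseconds.
    Static cyFrequency As Currency
    Dim cyTicks As Currency
    If cyFrequency = 0 Then QueryPerformanceFrequency cyFrequency
    QueryPerformanceCounter cyTicks
    If cyFrequency <> 0 Then MicroTimer = cyTicks / cyFrequency
End Function
```

Take the difference of two MicroTimer calls, before and after the calculation, to get elapsed seconds.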

Overhead time vs Calculation time vs Screen Repaint time

If you think about what Excel has to do to calculate formulas you can break it down into a series of steps:

Overhead Time

  • Work out which formulas need to be calculated, depending on what has changed since the last calculation and what is volatile.
  • Separate the formulas into chains of interdependent formulas and allocate them to the available calculation threads.
  • Process each formula in the calculation chains in turn

Calculation Time

  • Parse the formula into a sequence of executable sub-expressions and functions.
  • Get the data needed by the sub-expressions and functions and pass the data to them.
  • Evaluate the sequence of sub-expressions and functions and return the results to the next sub-expression.
  • Return the result to the Excel value layer.

Screen Repaint Time

  • If the cell containing the formula is in the visible part of the screen then Excel will format the result using the formatting and conditional formatting rules for that cell. This can be slow!

Multi-threaded or Single-threaded calculation

Given that most modern PCs have multiple cores, Excel’s calculation time is heavily dependent on making good use of the available cores.
Some functions (for example INDIRECT and GETPIVOTDATA) and most UDFs are single-threaded.

Don’t use a single-threaded calculation method such as Range.Calculate to compare a single-threaded function to a multi-threaded function.

Comparing Formulas

If you want to compare formula efficiency it does not matter much if the measured time includes overhead and screen repaint time as well as calculation time.

What matters is to compare like with like: each measurement should include the same amount of overhead and screen repaint etc.

Recalculation vs Full Calculation

Most of the time when Excel calculates, its smart recalculation engine only recalculates the stuff that has changed since the last recalculation (and any volatile formulas).
So usually you want to measure and optimize recalculation time.
Of course if you change ALL the data in a workbook then ALL the formulas need to be recalculated.

Repeatable Timings

There are many reasons why you can get quite significant differences between timings:

  • Windows multi-tasking means other processes can be stealing time
  • Modern CPUs have Turbo modes that are workload-dependent
  • Modern CPUs have high-speed caches that may or may not contain the required data or program code
  • Windows may be paging stuff in or out of RAM
  • Excel progressively optimizes both multi-threading and the calculation chain in successive calculations
  • Excel’s smart recalculation engine tries to minimize the amount that needs to be calculated but this depends on the previous calculation
  • Excel’s memory management algorithms may need to reorganize memory

To minimize these problems you should repeat the timing of the calculation several times and pick the most consistent timing.
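A simple way to do that is to loop the measurement and print each result so you can eyeball the run-to-run spread (a sketch using VBA’s Timer on the selected range; substitute a high-resolution timer for short calculations):

```vba
Sub RepeatRangeCalcTimings()
    ' Time Range.Calculate of the selected cells several times;
    ' later runs are usually more consistent as Excel optimizes
    ' the calculation chain and CPU caches warm up.
    Dim j As Long, dStart As Single
    Application.Calculation = xlCalculationManual
    Application.ScreenUpdating = False
    For j = 1 To 5
        dStart = Timer
        Selection.Calculate
        Debug.Print "Run " & j & ": " & Format(Timer - dStart, "0.000") & " secs"
    Next j
    Application.ScreenUpdating = True
End Sub
```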

Calculation Methods

Excel VBA gives you a variety of callable calculation methods. Which one you use for timing purposes can make a large difference to the comparison.


Most people writing a VBA calculation timing routine use Range.Calculate in Manual calculation mode.

Range.Calculate is usually the best method to use when comparing formulas.

  • If you switch off screen updating whilst timing the range calculate then you eliminate screen repaint time.
  • If you turn on Automatic calculation after the Range.Calculate then the formula(s) you calculated and all their dependents and all volatile formulas will be recalculated.
  • Range.Calculate is always single-threaded so does not include the multi-threaded calculation overhead.
  • Range.Calculate always calculates all the formulas in the selected range.
  • With a high-resolution timer you can compare even single-cell formulas reasonably accurately.

Don’t use Range.Calculate to compare a single-threaded formula with a multi-threaded formula.
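Putting those points together, a minimal Range.Calculate timing routine might look like this (a sketch: it times whatever range is selected and uses VBA’s Timer, so it is only reliable for calculations lasting well over the timer’s tick):

```vba
Sub TimeSelectionCalculate()
    ' Time a single-threaded Range.Calculate of the selected cells,
    ' with screen repainting excluded from the measurement.
    Dim dStart As Single, dElapsed As Single
    Dim lCalcMode As Long

    lCalcMode = Application.Calculation
    Application.Calculation = xlCalculationManual
    Application.ScreenUpdating = False

    dStart = Timer
    Selection.Calculate
    dElapsed = Timer - dStart

    Application.ScreenUpdating = True
    Application.Calculation = lCalcMode
    MsgBox "Range.Calculate took " & Format(dElapsed, "0.000") & " seconds"
End Sub
```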


Range.CalculateRowMajorOrder ignores dependencies within the selected range and so may be faster but less realistic than Range.Calculate.


Worksheet.Calculate does a multi-threaded smart recalculate of a worksheet. To do a Full calculation of the worksheet (all the formulas on the worksheet) you need to toggle Worksheet.EnableCalculation to False and then back to True before Worksheet.Calculate.
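The EnableCalculation toggle looks like this in VBA (a sketch for the active sheet):

```vba
Sub FullCalculateActiveSheet()
    ' Flag every formula on the sheet as uncalculated, then calculate,
    ' so that Worksheet.Calculate does a full rather than smart calculation.
    With ActiveSheet
        .EnableCalculation = False
        .EnableCalculation = True
        .Calculate
    End With
End Sub
```

Note: run this in Manual calculation mode; in Automatic mode, re-enabling calculation triggers the recalculation itself.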


Application.Calculate recalculates all the open workbooks using the multi-threaded smart recalc engine.


Application.CalculateFull calculates ALL the formulas in all the open workbooks using multi-threading.


Application.CalculateFullRebuild rebuilds the dependency trees and calculation chain (this is slow) and then does a multi-threaded full calculation.

FastExcel Calculation Profiling

My FastExcel V3 Profiler product has a variety of optimized calculation timing and profiling tools that are designed to simplify drilling down to the calculation bottlenecks and timing formulas. FastExcel Profiler can save you a lot of time and effort.


You can:

  • Time workbooks, worksheets and formulas using each of the available calculation methods
  • Profile a Workbook to prioritize the slow worksheets and measure volatility and multi-threading efficiency
  • Profile Worksheets to find out which are the slow areas of formulas
  • Profile Formulas: Time and prioritize each of the unique formulas on a worksheet and discover which formulas are volatile or single-threaded.

Hints and Tips

  • Work out what you are trying to achieve.
  • Design your timing test to be as close as possible to the real-life problem.
  • Repeat the timing several times until you get some timing consistency.
  • Choose your calculation method according to what you want to optimize.
  • Don’t compare single-threaded calculation times to multi-threaded calculation times.
  • Be careful that screen rendering/refresh times do not distort your comparisons.
  • Use the Windows high-resolution timer API to measure elapsed time.



2014 in review

The WordPress.com stats helper monkeys prepared a 2014 annual report for this blog.

Here’s an excerpt:

The Louvre Museum has 8.5 million visitors per year. This blog was viewed about 120,000 times in 2014. If it were an exhibit at the Louvre Museum, it would take about 5 days for that many people to see it.

Click here to see the complete report.
