Introduction

ExcelWriter is a powerful tool for generating and manipulating Excel files, which can be saved to disk or streamed to the user over HTTP. To provide maximum flexibility for manipulating every element of a spreadsheet, ExcelWriter (in particular ExcelApplication) has a rich object model that must be populated at runtime, and populating it requires sufficient memory.

It is important to understand that the memory required to process a large report is much greater than the size of the eventual output file.

Any cell loaded by the ExcelApplication object model may require up to 400 bytes of memory for the objects that hold the cell's value, formula, formatting, and so on. The number of cells equals the number of rows times the number of columns, so a large report with 50,000 rows by 20 columns contains 1,000,000 cells and can consume 300-400 MB of memory to process a single request for that report.

Here is a table showing approximately how much memory is required for large reports with different column/row combinations:

rows      columns   cells       memory (bytes)   memory (MB)
1,000     10        10,000      4,000,000        4
20,000    20        400,000     160,000,000      153
50,000    20        1,000,000   400,000,000      381
50,000    30        1,500,000   600,000,000      572

If you have a complex workbook with many sheets and you aren't sure how large it is, here's a simple macro you can use for calculating the total number of cells and approximate memory required:

Sub CalculateCells()
    ' Sums the used cells on every worksheet and estimates the
    ' memory ExcelApplication would need, assuming roughly
    ' 400 bytes per cell as described above.
    Dim ws As Worksheet
    Dim totalCells As Double
    For Each ws In ActiveWorkbook.Worksheets
        totalCells = totalCells + ws.UsedRange.Cells.Count
    Next ws
    MsgBox "Total cells: " & Format(totalCells, "#,##0") & vbCrLf & _
           "Approximate memory: " & Format(totalCells * 400 / 1048576, "#,##0") & " MB"
End Sub

Best Practices

Avoid Referencing Empty Cells

Merely referencing a cell in the object model, even a cell that holds no value, creates a Cell object. As noted previously, each such cell can occupy up to 400 bytes of memory, because several other objects have to be instantiated to hold the cell's attributes. Consequently, you should avoid looping through cells that might be empty, since doing so creates Cell objects that you may not need.
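As a rough sketch, assuming a Worksheet variable named ws and a known data extent (lastDataRow is a hypothetical bound here, not part of the ExcelWriter API):

// Wasteful: visiting every row up to a fixed maximum instantiates
// a Cell object (plus its supporting objects) for every empty
// cell touched along the way.
for (int row = 0; row < 50000; row++)
{
    object value = ws.Cells[row, 0].Value;
    // ...
}

// Better: visit only the rows you know contain data.
for (int row = 0; row < lastDataRow; row++)
{
    object value = ws.Cells[row, 0].Value;
    // ...
}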

Use ExcelApplication before ExcelTemplate

To conserve memory, we recommend pre-processing a partial template with ExcelApplication before passing the file to ExcelTemplate for importing data. See Preprocessing vs. Postprocessing.
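A minimal sketch, assuming the ExcelTemplate.Open(ExcelApplication, Workbook) overload described in Preprocessing vs. Postprocessing; the file names are placeholders and reader stands in for your data source:

using SoftArtisans.OfficeWriter.ExcelWriter;

ExcelApplication xla = new ExcelApplication();
Workbook wb = xla.Open("template.xlsx");
// ... manipulate the workbook with the full object model here ...

// Pass the preprocessed workbook straight to ExcelTemplate,
// with no intermediate file on disk.
ExcelTemplate xlt = new ExcelTemplate();
xlt.Open(xla, wb);
xlt.BindData(reader, "Data", xlt.CreateDataBindingProperties());
xlt.Process();
xlt.Save("output.xlsx");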

Cache Frequently

If you have large reports that are requested often but whose data changes infrequently, you may want to consider using ExcelWriter to generate the document once and then streaming it to multiple users. This approach is appropriate if the data changes on a predictable schedule, or if it is easy to check whether the data has changed. If you want users to see the most recent data, you will need to know when new data is available so you can regenerate the report.

If the report contains sensitive data, you will want to take security precautions when using this approach. Since the report is already generated and saved to disk, it could be easier for an intruder to gain access to it. You may want to use a second, non-public-facing server as a file server, granting access to the report only when it is requested by your web application.

When a user requests an Excel document, first check whether there is new data. If there is, generate a new report, save it to disk, and then stream that file to the user. We have a KB article that describes how to stream a file to the user that was previously saved to disk with ExcelWriter.
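A minimal ASP.NET sketch of this pattern, where DataIsNewerThan is a hypothetical helper that compares your data source against the cached file's timestamp:

string cachePath = Server.MapPath("~/App_Data/report.xlsx");

// Regenerate the report only when the cached copy is missing or stale.
if (!File.Exists(cachePath) || DataIsNewerThan(File.GetLastWriteTimeUtc(cachePath)))
{
    ExcelApplication xla = new ExcelApplication();
    Workbook wb = xla.Open(Server.MapPath("~/templates/report.xlsx"));
    // ... populate the report ...
    xla.Save(wb, cachePath);
}

// Stream the previously saved file to the user.
Response.ContentType = "application/vnd.ms-excel";
Response.TransmitFile(cachePath);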

Apply Styles to areas and not cells

Every time Cell.Style is accessed, ExcelWriter instantiates a separate Style object. To reduce file-size bloat, we recommend creating global styles and setting them on groups of cells. For details, see Effective Use of Styles.
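A minimal sketch, assuming a Workbook named wb, a Worksheet named ws, the Workbook.CreateStyle and Area.SetStyle calls, and a hypothetical rowCount of data rows:

// Create one global Style object on the workbook.
Style currency = wb.CreateStyle();
currency.NumberFormat = "$#,##0.00";

// Apply it to an entire area at once instead of touching
// Cell.Style for each cell individually.
Area dataArea = ws.CreateArea(1, 3, rowCount, 1);
dataArea.SetStyle(currency);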

Use InsertRows instead of InsertRow

The Worksheet class has both an InsertRow and an InsertRows method. If you are inserting only one row, use InsertRow; however, if you are inserting multiple rows, make a single call to InsertRows and pass the number of rows you want to insert. For example:
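// A sketch assuming a Worksheet named ws and the
// InsertRows(rowNumber, numberOfRows) signature.

// Slow: 100 separate insertions, each of which shifts every
// row below the insertion point.
for (int i = 0; i < 100; i++)
{
    ws.InsertRow(5);
}

// Fast: a single call that shifts the existing rows once.
ws.InsertRows(5, 100);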

Avoid calling AutoFitWidth on a lot of data

The ColumnProperties.AutoFitWidth and Area.AutoFitWidth methods are useful for making a column exactly wide enough to fit its contents. However, when there is a lot of data they can take a long time to execute, because they must go through each row of data and calculate the width of each cell's contents. Consequently, it is best to use them only on columns with small amounts of data whose spacing you want to guarantee.

If you have a large area that you want to autofit and do not know beforehand how wide the cell contents might be, it will be much faster to find the longest string and then set the width of the column in characters. For example, if you have several thousand rows, you could say:
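// A sketch assuming a Worksheet named ws and a string array
// named values holding the column's data; the unit used by
// ColumnProperties.Width may vary by version, so check the
// reference for your release.
int maxLength = 0;
foreach (string s in values)
{
    if (s.Length > maxLength)
    {
        maxLength = s.Length;
    }
}

// Size column A from the longest string instead of calling AutoFitWidth.
ws.GetColumnProperties(0).Width = maxLength + 1;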

While this will not be as accurate as AutoFitWidth, it will be significantly faster.

Use DataReaders instead of DataTables

The Worksheet.ImportData method accepts several different data types. A DataReader reads data directly from your data source into ExcelApplication's object model, and consequently uses less memory and time than the other data types. DataTables, DataViews, and arrays must all be built as objects in memory before you can pass them to ExcelApplication, which then imports the data into its object model. As a result, the data is stored in memory twice for these data types, as opposed to only once for a DataReader.
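A minimal sketch, assuming a Worksheet named ws, the ImportData(IDataReader, Cell) overload, and a SQL Server source (the connection string and query are placeholders):

using System.Data.SqlClient;

using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand("SELECT * FROM Sales", conn))
{
    conn.Open();
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        // The data flows straight from the reader into the object
        // model, so it is held in memory only once.
        ws.ImportData(reader, ws.Cells[0, 0]);
    }
}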

DataTables, DataViews, and arrays all use approximately the same amount of memory; however, there is a slight difference in speed. Two-dimensional and jagged object arrays import slightly faster, while DataTables and DataViews take approximately 10% longer to import. DataReaders remain the fastest, since they do not need to read the data into an intermediate object before importing it into ExcelApplication.

Use the Newest Version of OfficeWriter

We are always working to improve the efficiency of ExcelWriter. Make sure to use the latest version of the product to take advantage of these improvements. See the OfficeWriter Change Log for more details.

Example Code

When streamlining a report, it is important to know which performance issues tend to be the most problematic and the best practices for dealing with them.

Below is a collection of common errors that can negatively affect performance, along with an example of the relevant best practice for each situation. Each example notes which aspect of performance it most relates to:

  • Time-based issues, where a report takes longer to process than would seem normal, yet otherwise runs fine.

  • Memory-based issues, where the report seems to use more of the system's resources than necessary.

Overlap between these two kinds of issues is very common, so when you are experiencing performance problems it is worth reviewing all of the samples.

Best Practices: Memory Related Performance Issues

Solutions for reports that use more memory than it seems they should.

Memory Performance

Best Practices: Time Related Performance Issues

Solutions for reports that take longer to process than seems normal.

Time Performance
