Excessive Memory Grant in SQL Server due to varchar max length profligacy (2024)

In databases designed without close oversight and tight discipline, it is common to see prolific, if not abusive, use of varchar and nvarchar columns with excessive maximum lengths: varchar(8000), nvarchar(4000), and either ASCII or Unicode with the max option. The business analyst might be non-committal, or might want to allow for unspecified very long fields. The developer who does not know to demand prudence might think that a column with a fat maximum length is magically handled by SQL Server without any apparent cost or penalty. (Note that not many developers are even aware of the maximum index key size.) But we should know that "Magic always comes with a price, dearie."

Let's start with the test table:

CREATE TABLE dbo.Products (
    GroupID     int IDENTITY(1,1) NOT NULL,
    ProductID   int NOT NULL,
    ProductName nvarchar(2000) NOT NULL,
    ProductAlt  nvarchar(4000) NOT NULL,
    ProductDesc nvarchar(max) NOT NULL,
    IntroDate   date NOT NULL,
    ID          int NOT NULL,
    INDEX UCX UNIQUE CLUSTERED (GroupID, ProductID)
)

The table's string columns are populated with values of short to intermediate length, averaging 5 to 30 characters. Executing queries that group on each of the columns auto-generates statistics; the statistics are then explicitly updated with FULLSCAN for test consistency and determinism.
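The setup described above can be sketched as follows. The row count, group distribution, and filler expressions here are hypothetical illustrations, not the author's actual data load:

```sql
-- Populate with strings of short-to-intermediate length (illustrative only).
INSERT INTO dbo.Products (ProductID, ProductName, ProductAlt, ProductDesc, IntroDate, ID)
SELECT n,
       LEFT(CONVERT(nvarchar(36), NEWID()), 5 + n % 26),  -- 5-30 characters
       LEFT(CONVERT(nvarchar(36), NEWID()), 5 + n % 26),
       LEFT(CONVERT(nvarchar(36), NEWID()), 5 + n % 26),
       DATEADD(DAY, -(n % 3650), GETDATE()),
       n
FROM (SELECT TOP (1000000) ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS n
      FROM sys.all_columns a CROSS JOIN sys.all_columns b) AS t;

-- A grouping query auto-creates column statistics; then refresh with a full scan.
SELECT ProductName FROM dbo.Products GROUP BY ProductName;
UPDATE STATISTICS dbo.Products WITH FULLSCAN;
```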

Consider queries to each of the string columns of the following form:

SELECT ID, ProductName FROM dbo.Products WHERE GroupID = 1

The execution plan is a simple clustered index seek:


The index seek details are below:

Note the Estimated Row Size of 2015 B, third row from the bottom. This comprises 11 bytes of overhead, 4 bytes for the NOT NULL integer column, and 2000 bytes for the nvarchar(2000) column. In Unicode, each character is 2 bytes, so the assumption appears to be that the average length is one-half the maximum length.

For the nvarchar(4000) column, the Estimated Row Size is 4015 B: 11 bytes overhead, 4 bytes for the integer, and 4000 bytes based on one-half the variable-length column's maximum. For the nvarchar(max) column, the Estimated Row Size is 4039 B, an extra 24 bytes.
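The one-half-of-maximum assumption can be checked against the actual stored lengths; a quick probe of the test table:

```sql
-- Actual average bytes per column vs. the optimizer's rule-based assumption.
SELECT AVG(DATALENGTH(ProductName)) AS AvgNameBytes,  -- vs. 2000 B assumed
       AVG(DATALENGTH(ProductAlt))  AS AvgAltBytes,   -- vs. 4000 B assumed
       AVG(DATALENGTH(ProductDesc)) AS AvgDescBytes   -- vs. 4000+ B assumed
FROM dbo.Products;
```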

From the data distribution statistics, SQL Server knows the average length of the ProductName column: 21 bytes, exact in the FULLSCAN case, an estimate in the case of a partial sample.
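That average length can be read directly from the statistics object; the statistics name below is the auto-created one reported later in the plan XML:

```sql
-- The density vector result set includes an Average Length column (in bytes).
DBCC SHOW_STATISTICS ('dbo.Products', '_WA_Sys_00000003_25869641')
    WITH DENSITY_VECTOR;
```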


I was advised that the SQL Server query optimizer can in fact use the average length from data distribution statistics, but only if it has some other reason to access those statistics; it will not load them solely to estimate row size. When statistics access is not otherwise needed, the rule method is employed: one-half the maximum length, and 4000+ bytes for the max option.

The approach suggested for evaluating this was to add a WHERE clause argument of ISDATE(Column) IN (0,1), which is always true and affects neither estimated nor actual rows.

SELECT ID, ProductName FROM dbo.Products WHERE GroupID = 1 AND ISDATE(ProductName) IN (0, 1)

This approach did not change the Estimated Row Size in my test case, and the behavior might be version specific (SQL Server 2019 on the test system).

Edit: I was actually supposed to use WHERE 'real SARG' AND (ProductName <> 'NonExistentValue' OR ISDATE(ProductName) IN (0,1)). Since ISDATE always returns 0 or 1, the string inequality does not affect the result rows.
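Applied to the test query, the corrected predicate would look like this, where 'NonExistentValue' stands in for any value known not to occur in the column:

```sql
SELECT ID, ProductName
FROM dbo.Products
WHERE GroupID = 1
  AND (ProductName <> 'NonExistentValue' OR ISDATE(ProductName) IN (0, 1))
```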

The approach that did affect Estimated Row Size is as below:

SELECT ID, ProductName FROM dbo.Products WHERE GroupID = 1 AND ProductName <> 'Rumpelstiltskin'

For this approach to produce correct query results, the comparison value must not exist in the data, and the technique is only valid for NOT NULL columns. The Estimated Row Size is now 36 B: 11 bytes overhead, 4 bytes for the integer, and 21 bytes for the string.


The queries above have zero SerialRequired, SerialDesired, Granted, and MaxUsedMemory because each row is streamed to the client without needing to be retained.

<MemoryGrantInfo SerialRequiredMemory="0" SerialDesiredMemory="0" GrantedMemory="0" MaxUsedMemory="0" />

The following queries need internal memory for intermediate results:

SELECT ID, MAX(ProductName) FROM dbo.Products WHERE GroupID = 1 GROUP BY ID

SELECT ID, MAX(ProductName) FROM dbo.Products WHERE GroupID = 1 AND ISDATE(ProductName) IN (0, 1) GROUP BY ID

SELECT ID, MAX(ProductName) FROM dbo.Products WHERE GroupID = 1 AND ProductName <> 'Rumpelstiltskin' GROUP BY ID

For the first and second queries, the Memory Grant Info is mostly the same. The first is:

<MemoryGrantInfo SerialRequiredMemory="1024" SerialDesiredMemory="252632" RequiredMemory="9416" DesiredMemory="261064" RequestedMemory="261064" GrantWaitTime="0" GrantedMemory="261064" MaxUsedMemory="260296" MaxQueryMemory="3004400" LastRequestedMemory="0" IsMemoryGrantFeedbackAdjusted="No: First Execution" />


The second differs only insignificantly, with MaxUsedMemory of 260040, slightly lower than the 260296 of the first. The desired memory of 261064 KB is about 1.3X the size of the expected data (101694 rows of 2015 bytes, roughly 200,000 KB), allowing room to store the data plus working space. It would appear that SQL Server is very good at reusing workspace and not allocating more memory than necessary. The IsMemoryGrantFeedbackAdjusted value is "No: First Execution".

The third query's memory grant is below: desired memory is much lower at 15944 KB, and MaxUsedMemory lower still at 8504 KB, about 2.37X the data size. This execution has a Sort operation, whereas the first two employed a Hash Match.

<MemoryGrantInfo SerialRequiredMemory="512" SerialDesiredMemory="11104" RequiredMemory="5320" DesiredMemory="15944" RequestedMemory="15944" GrantWaitTime="0" GrantedMemory="15944" MaxUsedMemory="8504" MaxQueryMemory="3004400" LastRequestedMemory="0" IsMemoryGrantFeedbackAdjusted="No: First Execution" />

The plan XML does in fact confirm the use of statistics for the string column:

<StatisticsInfo Database="[TestDB]" Schema="[dbo]" Table="[Products]" Statistics="[_WA_Sys_00000003_25869641]" ModificationCount="0" SamplingPercent="100" LastUpdate="2021-12-02T09:57:41.05" />

On the second execution of the above set, memory grant feedback increased the memory allocation for the first two queries; the third query had made a good request in the first place.
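Grants can also be observed while the queries run, via the memory grants DMV; a minimal sketch:

```sql
-- Requested vs. granted vs. actually used workspace memory, per running query.
SELECT session_id,
       requested_memory_kb,
       granted_memory_kb,
       max_used_memory_kb,
       ideal_memory_kb
FROM sys.dm_exec_query_memory_grants;
```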

Summary

SQL Server does, or could, have reasonably good information with which to formulate a good execution plan, including memory grant assessments. There are data correlation scenarios in which a combination of individual column averages can diverge significantly from specific actual queries, but such is life.

The decision not to read data distribution statistics solely for row size estimation is a good one for transactional queries (highly selective indexes, low number of rows touched). The rule method of one-half the maximum length is reasonable for an expertly designed database, but perhaps not for a nitwit designer.

Beyond some row-count threshold, statistics should be employed for the row size estimate, as this could have great out-of-box impact in DW and reporting environments.

There are memory grant minimum and maximum hints, but I have not had success in using them. Perhaps the undocumented subtleties elude me for now.
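For reference, the documented MIN_GRANT_PERCENT and MAX_GRANT_PERCENT query hints express the grant as a percentage of the configured memory grant limit; a sketch against the test query:

```sql
-- Cap this query's memory grant at 10% of the maximum workspace memory.
SELECT ID, MAX(ProductName)
FROM dbo.Products
WHERE GroupID = 1
GROUP BY ID
OPTION (MAX_GRANT_PERCENT = 10);
```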

Another possibility is yet another query hint to direct the optimizer to access data distribution statistics for column size information. Microsoft tries very hard to make SQL Server work well out of the box, without requiring deep knowledge of its internals. When they accomplish this, and people actually upgrade to that version, it will be time to put me out to pasture.

Would it be a good idea to incorporate memory grant size into the plan cost model? This would favor plans having lower memory grants. Note the idea is to favor, not mandate.

Per the kindly advice from Darth Vader to Director Krennic: "Be careful not to choke on your aspirations." Every unnecessarily fat column adds to the risk of choking, for the misguided ambition of meeting stupid requirements.


