Michael J. Swart

July 17, 2020

Monitoring Identity Columns for Room To Grow

Filed under: SQLServerPedia Syndication — Michael J. Swart @ 12:01 pm

My friend pointed out an interesting RCA by GitHub where a "database table's auto-incrementing ID column exceeded [maxint]. When we attempted to insert larger integers into the column, the database rejected the value."

This led to a discussion about setting up monitoring for this kind of problem in our software. We have a place for monitoring and health-checks for all our databases. We just need to know how to define them.

So how do I create a script that reports any tables whose current identity values are above a certain threshold? This is what I came up with. Maybe you’ll find it useful too.

Find Tables in SQL Server Running Out Of Identity Room

declare @percentThreshold int = 70;
 
-- Report any int identity column whose current identity value is past
-- @percentThreshold percent of int's maximum (2,147,483,647).
-- Note: IDENT_CURRENT takes a table name; schema-qualify it if the
-- same table name exists in more than one schema.
select t.name as [table],
       c.name as [column],
       ty.name as [type],
       IDENT_CURRENT(t.name) as [identity],
       100 * IDENT_CURRENT(t.name) / 2147483647 as [percent full]
from   sys.tables t
join   sys.columns c
       on c.object_id = t.object_id
join   sys.types ty
       on ty.system_type_id = c.system_type_id
where  c.is_identity = 1
and    ty.name = 'int'
and    100 * IDENT_CURRENT(t.name) / 2147483647 > @percentThreshold
order  by t.name

Other Notes

  • I really prefer sequences for this kind of thing. Monitoring goes along similar lines (see the sketch after this list).
  • I only worry about ints. Bigints are just too big.
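
For the sequences case, here's a minimal sketch of what the equivalent check might look like, based on sys.sequences (my addition, not from the original post; it assumes ascending sequences with a non-negative start and, matching the note above, only worries about int):

declare @percentThreshold int = 70;
 
-- sys.sequences stores current_value and maximum_value as sql_variant,
-- so convert them before doing the percentage math
select OBJECT_SCHEMA_NAME(sq.object_id) as [schema],
       sq.name as [sequence],
       TYPE_NAME(sq.system_type_id) as [type],
       convert(bigint, sq.current_value) as [current value],
       100.0 * convert(bigint, sq.current_value)
             / convert(bigint, sq.maximum_value) as [percent full]
from   sys.sequences sq
where  TYPE_NAME(sq.system_type_id) = 'int'
and    100.0 * convert(bigint, sq.current_value)
             / convert(bigint, sq.maximum_value) > @percentThreshold
order  by sq.name;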

11 Comments

  1. Each db also has a sys.identity_columns system table.

    Using dbatools and lists of servers, I’ve been able to do a sweep of an environment to look for identity columns approaching a cliff. I put all four int types in there simply because I was curious, but agreed, it’s really only an int problem.

    $servers = get-content -Path C:\temp\prod-servers.txt

    $servers | get-dbadatabase -OnlyAccessible | invoke-dbaquery -query "select * from (
    select @@servername as server, db_name() as db, s.name as [schema], o.name as [object], ic.name as [column], t.name as [type], ic.seed_value, ic.increment_value, ic.last_value,
    case t.name
    when 'bigint' then convert(decimal(4,3),convert(bigint,ic.last_value) / 9223372036854775807.0)
    when 'int' then convert(decimal(4,3),convert(int,ic.last_value) / 2147483647.0)
    when 'smallint' then convert(decimal(4,3),convert(smallint,ic.last_value) / 32767.0)
    when 'tinyint' then convert(decimal(4,3),convert(tinyint,ic.last_value) / 255.0)
    end
    as pct
    from sys.identity_columns ic
    inner join sys.objects o
    on ic.object_id = o.object_id
    inner join sys.schemas s
    on s.schema_id = o.schema_id
    inner join sys.types t
    on ic.system_type_id = t.system_type_id
    where o.type_desc = 'USER_TABLE'
    ) bob
    where pct > .9" | export-csv -Path c:\temp\big-ints.csv

    Comment by Brian K — July 17, 2020 @ 3:11 pm

  2. This is awesome. I’m embarrassed I didn’t use this table.

    Comment by Michael J. Swart — July 17, 2020 @ 3:15 pm

  3. Here’s a simplified version based on sys.identity_columns where I tried a different syntax that uses only functions (no joins):

    declare @percent_threshold int = 20;
    select OBJECT_SCHEMA_NAME(object_id) as [schema],
           OBJECT_NAME(object_id) as [table],
           [name] as [column],
           TYPE_NAME(system_type_id) [type],
           last_value,
           cast(last_value as int) / 21474836.470 as pct -- 21,474,836.470 = 2,147,483,647 / 100
    from sys.identity_columns ic
    where system_type_id = 56 -- int
    and cast(last_value as int) / 21474836.470 > @percent_threshold;

    Comment by Michael J. Swart — July 17, 2020 @ 3:27 pm

  4. Yup, looks very nice. Love the functions; I didn’t know about OBJECT_SCHEMA_NAME and TYPE_NAME. There are a bunch of schemas at my work, so I’ve developed a schema-sensitive attitude now; that OBJECT_SCHEMA_NAME function is gonna get some lovin’.

    Gotta yank some joins out of my scripts!

    Comment by Brian K — July 17, 2020 @ 3:57 pm

  5. If you’re willing to use the dbatools powershell module, there’s a cmdlet in there called Test-DbaIdentityUsage that has among its returned columns PercentUsed.

    Comment by Ben Thul — July 17, 2020 @ 4:07 pm

  6. @Brian Nice!

    @Ben Test-DbaIdentityUsage: those dbatools people think of everything. (Ben! How have you been?)

    Comment by Michael J. Swart — July 17, 2020 @ 4:14 pm

  7. […] Michael J. Swart has a script to monitor identity columns: […]

    Pingback by Monitoring Identity Columns – Curated SQL — July 20, 2020 @ 8:10 am

  8. Have you seen this script? https://github.com/SQLRockstar/BlogScripts/blob/master/SQL_Server_identity_values_check.sql

    It tries to handle INT, BIGINT, and large NUMERIC identity columns.

    Comment by Ricky Lively — July 20, 2020 @ 10:00 am

  9. Yeah, after I published my script, I learned that I reinvented the wheel. There are so many previously invented wheels!
    * dbatools.io
    * sp_BlitzIndex by Brent Ozar Unlimited (first written by Kendra, I believe)
    * sys.identity_columns, which is SQL Server’s view of identity columns.
    * And now Tom’s

    Comment by Michael J. Swart — July 20, 2020 @ 10:12 am

  10. One thing I noticed is that sys.identity_columns uses this syntax: IDENTITYPROPERTY(object_id, 'LastValue').
    But I was using this syntax: IDENT_CURRENT(table_name).

    There is a small difference!
    If a table has a column defined as an identity column, but no rows have been inserted yet, then the LastValue property will be null, but IDENT_CURRENT will reflect the next number (the seed).
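
    A quick repro of that difference (a minimal sketch; the table name is made up):

    create table dbo.IdentityDemo (id int identity(1,1), val int);

    -- before any insert:
    select IDENT_CURRENT('dbo.IdentityDemo');  -- 1 (the seed)
    select last_value                          -- NULL
    from sys.identity_columns
    where object_id = OBJECT_ID('dbo.IdentityDemo');

    insert dbo.IdentityDemo (val) values (10);

    -- after one insert, they agree:
    select IDENT_CURRENT('dbo.IdentityDemo');  -- 1
    select last_value                          -- 1
    from sys.identity_columns
    where object_id = OBJECT_ID('dbo.IdentityDemo');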

    Comment by Michael J. Swart — July 20, 2020 @ 10:14 am

  11. […] In the past I’ve written about monitoring identity columns to ensure there’s room to grow. […]

    Pingback by Can your application handle all BIGINT values? | Michael J. Swart — January 3, 2023 @ 12:24 pm
