Michael J. Swart

April 28, 2009

Which identity column is running out of room?

Filed under: SQL Scripts, Technical Articles — Michael J. Swart @ 10:16 am

I wrote the following query to give me a sense of which identity columns are running out of room. The query finds the identity columns in the current database and reports the ones that are “closest” to having values that are too big for their datatype. It does this by reporting the percentage of the available range used so far.


;WITH maxValues AS
(
   SELECT user_type_id, system_type_id, name, 
   CASE name 
       WHEN 'bigint' THEN CAST(0x7FFFFFFFFFFFFFFF AS bigint)
       WHEN 'int' THEN CAST(0x7FFFFFFF AS bigint)
       WHEN 'smallint' THEN CAST(0x7FFF AS bigint)
       WHEN 'tinyint' THEN CAST(0xFF AS bigint)
   END AS maxValue
   FROM sys.types t
)
SELECT TOP 20 
   t.name AS tableName, 
   c.name AS identityColumnName, 
   maxValues.name AS typeName, 
   ISNULL(IDENT_CURRENT(t.name),0) AS currentIdentity,
   100.0 * ISNULL(IDENT_CURRENT(t.name),0) / maxValues.maxValue AS percentConsumed
FROM sys.columns c
JOIN maxValues 
   ON maxValues.system_type_id = c.system_type_id
   AND maxValues.user_type_id = c.user_type_id
JOIN sys.tables t 
   ON t.OBJECT_ID = c.OBJECT_ID
WHERE c.is_identity = 1
ORDER BY percentConsumed DESC

So say you run the query and you’re looking at a few numbers. How do you know whether you’re in danger of running out of room? Well, there’s no way to tell how fast each table is growing from the schema alone. A static data table that has used 15% of the room might be acceptable, but an active table that has used 5% of the room in a month’s time is not. An understanding of the business domain is the only way to make a good decision here.
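One way to get that growth information (not something from the original script, just a rough sketch) is to snapshot the query’s results on a schedule and compare snapshots over time. The table name dbo.IdentityUsageHistory below is a hypothetical choice.

-- Rough sketch (not part of the original script): capture identity usage
-- periodically so that growth between snapshots can be measured.
-- dbo.IdentityUsageHistory is a hypothetical table name.
CREATE TABLE dbo.IdentityUsageHistory
(
   capturedAt      datetime     NOT NULL DEFAULT GETDATE(),
   tableName       sysname      NOT NULL,
   currentIdentity bigint       NOT NULL,
   percentConsumed decimal(9,6) NOT NULL
);

-- Run this from a scheduled job (weekly, say) and compare rows over time.
INSERT INTO dbo.IdentityUsageHistory (tableName, currentIdentity, percentConsumed)
SELECT t.name,
   ISNULL(IDENT_CURRENT(t.name), 0),
   100.0 * ISNULL(IDENT_CURRENT(t.name), 0) /
      CASE ty.name
         WHEN 'bigint' THEN CAST(0x7FFFFFFFFFFFFFFF AS bigint)
         WHEN 'int' THEN CAST(0x7FFFFFFF AS bigint)
         WHEN 'smallint' THEN CAST(0x7FFF AS bigint)
         WHEN 'tinyint' THEN CAST(0xFF AS bigint)
      END
FROM sys.columns c
JOIN sys.tables t ON t.OBJECT_ID = c.OBJECT_ID
JOIN sys.types ty ON ty.user_type_id = c.user_type_id
WHERE c.is_identity = 1
   AND ty.name IN ('bigint', 'int', 'smallint', 'tinyint');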

So what happens once you’ve decided that an identity column is in danger of running out of room? The only remedy that comes to mind is to widen the data type (e.g. change an int to a bigint). That usually means a ton of regression testing and a large amount of booked maintenance downtime.
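For completeness, here’s what that change can look like in the simplest case (a rough sketch against a hypothetical dbo.Orders table; the post doesn’t prescribe this). If the identity column participates in a primary key, foreign keys, or indexes, those have to be dropped and recreated around this step, and the ALTER rewrites every row, which is where the downtime comes from.

-- Rough sketch on a hypothetical table: widen the identity column in place.
-- Assumes OrderId is not part of any index, constraint, or foreign key;
-- otherwise those must be dropped first and recreated afterwards.
-- This is a size-of-data operation: every row in the table gets rewritten.
ALTER TABLE dbo.Orders
   ALTER COLUMN OrderId bigint NOT NULL;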

I’m sure there are other approaches; I may save those for another post. If you have any ideas, leave them in the comments.

4 Comments

  1. Great script! That’s not something I would normally think to monitor. It’s a good check and runs quickly… thanks for sharing! 🙂

    Comment by Michelle Ufford — May 1, 2009 @ 5:23 am

  2. Hey – great idea, and great script! I’ve had a couple of instances over the years where an identity column ran out of bits – this would’ve been handy to have around.

    Comment by Dave Carlile — May 1, 2009 @ 5:38 am

  3. Don’t forget to take table schemas into account…

    ;WITH maxValues AS
    (
    SELECT user_type_id, system_type_id, name,
    CASE name
    WHEN 'bigint' THEN CAST(0x7FFFFFFFFFFFFFFF AS bigint)
    WHEN 'int' THEN CAST(0x7FFFFFFF AS bigint)
    WHEN 'smallint' THEN CAST(0x7FFF AS bigint)
    WHEN 'tinyint' THEN CAST(0xFF AS bigint)
    END AS maxValue
    FROM sys.types t
    )
    SELECT
    t.name AS tableName,
    c.name AS identityColumnName,
    maxValues.name AS typeName,
    ISNULL(IDENT_CURRENT(s.name + '.' + t.name),0) AS currentIdentity,
    100.0 * ISNULL(IDENT_CURRENT(s.name + '.' + t.name),0) / maxValues.maxValue AS percentConsumed
    FROM sys.columns c
    JOIN maxValues
    ON maxValues.system_type_id = c.system_type_id
    AND maxValues.user_type_id = c.user_type_id
    JOIN sys.tables t
    ON t.OBJECT_ID = c.OBJECT_ID
    JOIN sys.schemas s
    on s.schema_id = t.schema_id
    WHERE c.is_identity = 1
    ORDER BY percentConsumed DESC

    Comment by Gavin — May 14, 2009 @ 3:07 am

  4. Hey! Thanks for commenting. I didn’t see these comments before today. (Stupid spam filter. There’s nothing about these comments that is remotely like spam.)

    @Michelle, @Dave Thanks!

    @Gavin, taking schemas into account is a nice touch (don’t forget to include s.name in the selected columns)

    Comment by Michael J. Swart — May 29, 2009 @ 4:46 am
