I wrote the following query to give me a sense of which identity columns are running out of room. The query finds the identity columns in the current database and reports the ones that are “closest” to having values that are too big for their datatype. It does this by reporting the percentage of room used so far.
;WITH maxValues AS
(
    SELECT user_type_id, system_type_id, name,
        CASE name
            WHEN 'bigint' THEN CAST(0x7FFFFFFFFFFFFFFF AS bigint)
            WHEN 'int' THEN CAST(0x7FFFFFFF AS bigint)
            WHEN 'smallint' THEN CAST(0x7FFF AS bigint)
            WHEN 'tinyint' THEN CAST(0xFF AS bigint)
        END AS maxValue
    FROM sys.types t
)
SELECT TOP 20
    t.name AS tableName,
    c.name AS identityColumnName,
    maxValues.name AS typeName,
    ISNULL(IDENT_CURRENT(t.name), 0) AS currentIdentity,
    100.0 * ISNULL(IDENT_CURRENT(t.name), 0) / maxValues.maxValue AS percentConsumed
FROM sys.columns c
JOIN maxValues
    ON maxValues.system_type_id = c.system_type_id
    AND maxValues.user_type_id = c.user_type_id
JOIN sys.tables t
    ON t.object_id = c.object_id
WHERE c.is_identity = 1
ORDER BY percentConsumed DESC
So say that you ran the query, and there are a few numbers you're looking at. How do you know if you're in danger of running out of room? Well, there's no way to know how fast each table is growing based on the schema alone. A static data table that uses 15% of the room might be acceptable, but an active table that has used 5% of the room in a month's time is not. An understanding of the business domain is the only way to make a good decision here.
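If you'd rather measure the growth rate than guess at it, one option is to snapshot the identity values on a schedule and compare snapshots. Here's a minimal sketch of the idea; the log table and its name are my own invention, not something from the query above:

```sql
-- Hypothetical log table; populate it on a schedule (a daily SQL Agent job, say)
CREATE TABLE dbo.IdentityLog
(
    logDate datetime NOT NULL DEFAULT (GETDATE()),
    tableName sysname NOT NULL,
    currentIdentity bigint NOT NULL
);

-- Snapshot the current identity value of every identity column
INSERT dbo.IdentityLog (tableName, currentIdentity)
SELECT t.name, CAST(ISNULL(IDENT_CURRENT(t.name), 0) AS bigint)
FROM sys.tables t
JOIN sys.columns c
    ON c.object_id = t.object_id
WHERE c.is_identity = 1;

-- After a few snapshots, estimate values consumed per day for each table
SELECT tableName,
    MAX(currentIdentity) - MIN(currentIdentity) AS valuesConsumed,
    DATEDIFF(DAY, MIN(logDate), MAX(logDate)) AS daysElapsed
FROM dbo.IdentityLog
GROUP BY tableName
HAVING DATEDIFF(DAY, MIN(logDate), MAX(logDate)) > 0;
```

Dividing the values consumed by the room remaining gives a rough "days until full" figure, which is a much easier number to act on than a raw percentage.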
I’m sure there are other suggestions, I may save those for another post. If you have any ideas, leave them in the comments.
Great script! That’s not something I would normally think to monitor. It’s a good check and runs quickly… thanks for sharing! 🙂
Comment by Michelle Ufford — May 1, 2009 @ 5:23 am
Hey – great idea, and great script! I’ve had a couple of instances over the years where an identity column ran out of bits – this would’ve been handy to have around.
Comment by Dave Carlile — May 1, 2009 @ 5:38 am
Don’t forget to take table schemas into account…
;WITH maxValues AS
(
    SELECT user_type_id, system_type_id, name,
        CASE name
            WHEN 'bigint' THEN CAST(0x7FFFFFFFFFFFFFFF AS bigint)
            WHEN 'int' THEN CAST(0x7FFFFFFF AS bigint)
            WHEN 'smallint' THEN CAST(0x7FFF AS bigint)
            WHEN 'tinyint' THEN CAST(0xFF AS bigint)
        END AS maxValue
    FROM sys.types t
)
SELECT
    t.name AS tableName,
    c.name AS identityColumnName,
    maxValues.name AS typeName,
    ISNULL(IDENT_CURRENT(s.name + '.' + t.name), 0) AS currentIdentity,
    100.0 * ISNULL(IDENT_CURRENT(s.name + '.' + t.name), 0) / maxValues.maxValue AS percentConsumed
FROM sys.columns c
JOIN maxValues
    ON maxValues.system_type_id = c.system_type_id
    AND maxValues.user_type_id = c.user_type_id
JOIN sys.tables t
    ON t.object_id = c.object_id
JOIN sys.schemas s
    ON s.schema_id = t.schema_id
WHERE c.is_identity = 1
ORDER BY percentConsumed DESC
Comment by Gavin — May 14, 2009 @ 3:07 am
Hey! Thanks for commenting. I didn't see these comments before today. (Stupid spam filter. There's nothing about these comments that is remotely like spam.)
@Michelle, @Dave Thanks!
@Gavin, taking schemas into account is a nice touch. (Don't forget to include s.name in the selected columns.)
Comment by Michael J. Swart — May 29, 2009 @ 4:46 am