I wrote the following query to give me a sense of which identity columns are running out of room. The query finds the identity columns in the current database and reports the ones that are closest to exceeding the maximum value of their datatype. It does this by reporting the percentage of room consumed so far.
;WITH maxValues AS (
    SELECT user_type_id, system_type_id, name,
        CASE name
            WHEN 'bigint'   THEN CAST(0x7FFFFFFFFFFFFFFF AS bigint)
            WHEN 'int'      THEN CAST(0x7FFFFFFF AS bigint)
            WHEN 'smallint' THEN CAST(0x7FFF AS bigint)
            WHEN 'tinyint'  THEN CAST(0xFF AS bigint)
        END AS maxValue
    FROM sys.types t
)
SELECT TOP 20
    t.name AS tableName,
    c.name AS identityColumnName,
    maxValues.name AS typeName,
    -- schema-qualify the name so IDENT_CURRENT resolves the right table
    ISNULL(IDENT_CURRENT(SCHEMA_NAME(t.schema_id) + '.' + t.name), 0) AS currentIdentity,
    100.0 * ISNULL(IDENT_CURRENT(SCHEMA_NAME(t.schema_id) + '.' + t.name), 0)
        / maxValues.maxValue AS percentConsumed
FROM sys.columns c
JOIN maxValues
    ON maxValues.system_type_id = c.system_type_id
    AND maxValues.user_type_id = c.user_type_id
JOIN sys.tables t
    ON t.object_id = c.object_id
WHERE c.is_identity = 1
ORDER BY percentConsumed DESC;
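To sanity-check the query, you can create a throwaway table with a small identity type and confirm it shows up near the top of the results. A minimal sketch (the table name dbo.IdentityDemo and the GO batch-repeat count are my own choices, not part of the original query):

-- Throwaway demo: a tinyint identity column fills up fast (max value 255).
CREATE TABLE dbo.IdentityDemo (id tinyint IDENTITY(1,1), filler char(1));
INSERT INTO dbo.IdentityDemo (filler) VALUES ('x');
GO 200  -- SSMS-only syntax: repeats the preceding batch 200 times
-- Re-run the monitoring query: IdentityDemo should report roughly 78% consumed (200/255).
DROP TABLE dbo.IdentityDemo;

Note that GO 200 only works in SSMS and sqlcmd; elsewhere, loop the INSERT in a WHILE block instead.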
So say you ran the query, and there are a few numbers you're looking at. How do you know if you're in danger of running out of room? Well, there's no way to tell how fast each table is growing from the schema alone. A static data table that uses 15% of the room might be acceptable, but an active table that has consumed 5% of the room in a month's time is not. An understanding of the business domain is the only way to make a good decision here.
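To turn that judgment call into a rough number, one option is to record currentIdentity from the query on two dates and extrapolate linearly. A minimal sketch (the variables and the one-month sampling interval are assumptions; the sampled values below are placeholders):

-- Rough runway estimate: sample the identity value twice and extrapolate.
DECLARE @maxValue bigint = CAST(0x7FFFFFFF AS bigint); -- an int column
DECLARE @identityLastMonth bigint = 100000000;         -- sampled ~30 days ago
DECLARE @identityNow bigint = 107000000;               -- sampled today

SELECT CASE
        WHEN @identityNow <= @identityLastMonth THEN NULL -- no growth observed
        ELSE (@maxValue - @identityNow) * 1.0 / (@identityNow - @identityLastMonth)
    END AS monthsRemaining;

This assumes steady growth; a table whose insert rate is accelerating will run out sooner than the estimate suggests.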
I’m sure there are other approaches; I may save those for another post. If you have any ideas, leave them in the comments.