Does this approach sound reasonable?
So I need to prune several tables of old data. Part of this will be setting up an ongoing maintenance job that removes old records, but first I'm going to remove a larger-than-normal amount of records based on date ranges etc. Before I do that, I want a rough estimate of a batch size that limits how much data gets deleted per transaction, so the tlog doesn't swell to the point it has to grow. What I DO NOT want to happen is to freeze up the tlog by running an operation big enough to cause that. I've done that before and it ain't fun.
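As a sketch of the kind of batched delete I have in mind (the table name dbo.AuditLog, the date filter, and the batch size are all placeholders, not my real schema):

-- Delete in small batches so each transaction stays small
-- and the tlog space can be reused between batches.
DECLARE @BatchSize int = 5000;   -- placeholder; the estimate below would set the real value
DECLARE @RowsDeleted int = 1;

WHILE @RowsDeleted > 0
BEGIN
    DELETE TOP (@BatchSize)
    FROM dbo.AuditLog
    WHERE CreatedDate < DATEADD(year, -2, GETDATE());

    SET @RowsDeleted = @@ROWCOUNT;

    -- In FULL recovery, a log backup between batches lets log space
    -- be reused; in SIMPLE, a CHECKPOINT does the same.
END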
So first, I get the free space left in the current log file:
--Get the size of the data actually used and the free space in the data/log files for the current database
SELECT
    name,
    filename,
    CONVERT(decimal(12,2), ROUND(a.size / 128.000, 2)) AS FileSizeMB,
    CONVERT(decimal(12,2), ROUND(FILEPROPERTY(a.name, 'SpaceUsed') / 128.000, 2)) AS SpaceUsedMB,
    -- size and SpaceUsed are counted in 8 KB pages, so dividing by 128 gives MB
    CONVERT(decimal(12,2), ROUND((a.size - FILEPROPERTY(a.name, 'SpaceUsed')) / 128.000, 2)) AS FreeSpaceMB
FROM dbo.sysfiles a
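(Side note: dbo.sysfiles is a deprecated compatibility view, so the same numbers can come from sys.database_files instead:)

SELECT
    name,
    physical_name,
    CONVERT(decimal(12,2), size / 128.000) AS FileSizeMB,
    CONVERT(decimal(12,2), FILEPROPERTY(name, 'SpaceUsed') / 128.000) AS SpaceUsedMB,
    CONVERT(decimal(12,2), (size - FILEPROPERTY(name, 'SpaceUsed')) / 128.000) AS FreeSpaceMB
FROM sys.database_files;

DBCC SQLPERF(LOGSPACE) is another quick way to see log size and percent used per database.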
Then, one table at a time, I'll SELECT TOP (1) * from the table(s) I'm going to delete from, with "Include Client Statistics" turned on in SSMS. That shows the amount of data coming back from the server for one row. Would that also be roughly the amount written to the tlog when deleting that same row?
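If the client-stats number turns out to be too rough, a more direct way I could measure it (again a sketch; dbo.AuditLog and the filter are placeholders) is to delete a sample batch inside a transaction, read how many bytes it actually logged, and roll it back:

BEGIN TRANSACTION;

DELETE TOP (1000)
FROM dbo.AuditLog
WHERE CreatedDate < DATEADD(year, -2, GETDATE());

-- Bytes this transaction has written to the log so far
SELECT database_transaction_log_bytes_used
FROM sys.dm_tran_database_transactions
WHERE database_id = DB_ID();

ROLLBACK TRANSACTION;

-- Note: the rollback itself also generates log records, so I'd run
-- this on a test copy or keep the sample batch small.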
So, with the above info, I should be able to come up with a relatively "safe" estimate of the number of records I can delete per batch from one or more tables.
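To illustrate with made-up numbers: if FreeSpaceMB on the log comes back as 2,000 MB and a sampled row is roughly 1 KB, then about 2,000,000 rows would fill the free log space, so with a 50% safety margin I'd cap a batch at around 1,000,000 rows (and I'd pad the estimate further, since a delete also logs index entries and row overhead, so the real per-row log cost runs higher than the row's data size).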
Sound reasonable?