Monday, February 20, 2012

Large Insert Causing Problem With TempDB

Hello,

I have an SSIS package that loads a large amount of data into a SQL Server table. The table has sixty-five columns, and a single load can contain two million records.

The loads are split up into several daily flat files. The package uses a Foreach File loop to process the files one at a time, and as each file is processed, its data is loaded into the destination SQL Server table.

As the package runs, tempdb begins to consume a lot of disk space. The tempdb data file on this particular server is configured to grow in 50 MB increments with unrestricted file growth. During the last run of the package, the data file grew to 17 GB. I ran the following and got the data file size back down to 50 MB:

USE tempdb;
GO
DBCC SHRINKFILE (tempdev, 1);
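
(For reference, the current tempdb data file size can be checked with a query like the one below; sys.database_files reports size in 8 KB pages, so it is converted to MB here.)

USE tempdb;
GO
-- size is stored in 8 KB pages; convert to MB for readability
SELECT name, size * 8 / 1024 AS size_mb
FROM sys.database_files;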

Should I incorporate this code into the package, or is there something else I can configure in the SSIS package so that I don't run into space problems with tempdb?

Thank you for your help!

cdun2

Be sure you are loading using "fast load" and set the Maximum Insert Commit Size to a reasonable value (100,000 perhaps).
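
Both settings are on the OLE DB Destination when the data access mode is set to fast load. If it helps, here is the same batching idea expressed in plain T-SQL; the staging table, column list, and file path below are made up purely for illustration:

-- Hypothetical staging table; the real table has sixty-five columns.
CREATE TABLE dbo.DailyLoadStaging
(
    RecordId  INT          NOT NULL,
    LoadDate  DATE         NOT NULL,
    SomeValue VARCHAR(100) NULL
);

-- BATCHSIZE commits every N rows, much like the Maximum Insert Commit Size
-- on the OLE DB Destination, so no single giant transaction builds up.
BULK INSERT dbo.DailyLoadStaging
FROM 'D:\Loads\daily_20120220.txt'   -- assumed path, for illustration only
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    BATCHSIZE       = 100000,
    TABLOCK
);

With BATCHSIZE set, the engine commits every 100,000 rows instead of holding one transaction open for the whole two-million-row file.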

Thanks. I'll check into this.

cdun2
