Hi, I have an SP that will basically copy one table's data to another with
some transformation involved. The source table has over 1 billion rows. The
machine has limited disk space and the transaction log will grow too large
and cause trouble.
Is there a way to run this SP and not have the transactions logged for it?
Thanks,
ADAM

What kind of transformation?
If the transformation can be expressed in a SELECT statement, you can bcp the
data out to a file, change the recovery model to bulk-logged (make a full
database backup first), and then use the bcp utility or a BULK INSERT
statement to load the data into the second table. Change the recovery model
back to full when you finish.
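A rough outline of that approach; the database, table, file, and server names
below are only placeholders for your own:

BACKUP DATABASE MyDB TO DISK = 'C:\backup\MyDB_full.bak'
ALTER DATABASE MyDB SET RECOVERY BULK_LOGGED

-- Export the transformed rows from a command prompt, e.g.:
-- bcp "SELECT Id, UPPER(Col1) FROM MyDB.dbo.SourceTable" queryout C:\export\data.dat -n -T -S MyServer

-- Load the file; TABLOCK is one of the requirements for minimal logging
BULK INSERT dbo.TargetTable
FROM 'C:\export\data.dat'
WITH (DATAFILETYPE = 'native', TABLOCK, BATCHSIZE = 100000)

ALTER DATABASE MyDB SET RECOVERY FULL
BACKUP DATABASE MyDB TO DISK = 'C:\backup\MyDB_afterload.bak'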
AMB
"adami" wrote:
> Hi, I have an SP that will basically copy one table's data to another with
> some transformation involved. The source table has over 1 billion rows. The
> machine has limited disk space and the transaction log will grow too large
> and cause trouble.
> Is there a way to run this SP and not have the transactions logged for it?
> Thanks,
> ADAM

Hi,
Set the database to the SIMPLE recovery model. After that, execute the INSERT
statement in batches of 1000 or 2000 rows, committing the transaction after
each batch. Committing each batch lets the log space be reused, so the LDF
will not grow.
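A rough sketch of such a loop, assuming the target is filled only by this copy
and the source has an ever-increasing key column Id (the names and batch size
are placeholders):

ALTER DATABASE MyDB SET RECOVERY SIMPLE

DECLARE @LastId bigint, @Rows int
SELECT @LastId = 0, @Rows = 1

WHILE @Rows > 0
BEGIN
    BEGIN TRANSACTION

    INSERT INTO dbo.TargetTable (Id, Col1)
    SELECT TOP 2000 s.Id, UPPER(s.Col1)   -- your transformation goes here
    FROM dbo.SourceTable s
    WHERE s.Id > @LastId
    ORDER BY s.Id

    SET @Rows = @@ROWCOUNT

    COMMIT TRANSACTION                    -- commit each batch so the log space can be reused

    IF @Rows > 0
        SELECT @LastId = MAX(Id) FROM dbo.TargetTable
END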
Thanks
Hari
SQL Server MVP
"adami" <adami@.discussions.microsoft.com> wrote in message
news:44698021-D132-412F-8BC2-0C47D08AA976@.microsoft.com...
> Hi, I have an SP that will basically copy one table's data to another with
> some transformation involved. The source table has over 1 billion rows. The
> machine has limited disk space and the transaction log will grow too large
> and cause trouble.
> Is there a way to run this SP and not have the transactions logged for it?
> Thanks,
> ADAM

To start, read up on setting the database's recovery model to SIMPLE, even if
only temporarily for this process. Also, see whether setting the database to
SINGLE_USER mode during the process is possible. Dropping the indexes on the
destination table prior to the inserts and re-creating them afterward will
also reduce the amount of transaction log activity, speed things up, and
avoid index fragmentation.
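For example (the database and index names here are only placeholders):

ALTER DATABASE MyDB SET RECOVERY SIMPLE
ALTER DATABASE MyDB SET SINGLE_USER WITH ROLLBACK IMMEDIATE

-- Drop the nonclustered indexes on the destination before the load
DROP INDEX TargetTable.IX_TargetTable_Col1

-- ... run the load here ...

-- Re-create the indexes and restore the original settings afterward
CREATE INDEX IX_TargetTable_Col1 ON dbo.TargetTable (Col1)
ALTER DATABASE MyDB SET MULTI_USER
ALTER DATABASE MyDB SET RECOVERY FULL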
I have a data warehousing project with a similar situation, except my row
volume is only around 10,000,000 instead of 1,000,000,000, but still a huge
amount of data. What I did was create a view over the source table that
performs the needed transformations, and a DTS package exports that view to a
tab-delimited text file. Once the export is complete, I bulk copy (BCP.EXE)
the text file into the destination table. Read up on the bulk copy feature
and how it can be used to move data with minimal logging. Of everything I
tried, this was by far the fastest.
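The pieces look roughly like this; I'm showing a plain bcp export of the view
in place of the DTS package, and the view, server, and file names are
placeholders:

CREATE VIEW dbo.vSourceTransformed
AS
SELECT s.Id, UPPER(s.Col1) AS Col1    -- whatever transformation you need
FROM dbo.SourceTable s

Then, from a command prompt:

bcp MyDB.dbo.vSourceTransformed out C:\export\data.txt -c -T -S MyServer
bcp MyDB.dbo.TargetTable in C:\export\data.txt -c -T -S MyServer -b 100000 -h "TABLOCK"

The -c switch gives a tab-delimited character file, -b commits the load in
batches, and the TABLOCK hint helps keep the load minimally logged under the
bulk-logged or simple recovery model.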
"adami" <adami@.discussions.microsoft.com> wrote in message
news:44698021-D132-412F-8BC2-0C47D08AA976@.microsoft.com...
> Hi, I have an SP that will basically copy one table's data to another with
> some transformation involved. The source table has over 1 billion rows. The
> machine has limited disk space and the transaction log will grow too large
> and cause trouble.
> Is there a way to run this SP and not have the transactions logged for it?
> Thanks,
> ADAM