Showing posts with label log.

Friday, March 23, 2012

Runaway log files

Hello,

I am administering a DB that is about 2 GB in size and sees a few hundred, maybe up to a few thousand, inserts a day, plus very few deletions and updates (fewer than a few hundred). The DB has about 200,000 records in total. For some time now I have noticed that the transaction logs grow a lot -- about 1 GB per day. I don't know why they are growing so much and would like to find out; to the best of my knowledge, the DB usage I am aware of cannot cause such high growth.

Is there a way to find out what is causing such growth? Can I examine these
logs somehow and see what is in them? I am worried that somehow someone
managed to run mass updates on the DB without my knowledge.

Thanks for the help,

Tom Kitta

Do you mean that your log is growing continuously? That may be normal,
depending on your backup strategy; for example, if you're using the full
recovery model but never back up the log. If that's the case, you can look at
backing up the log regularly or switching to the simple recovery model.
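Either option can be sketched in T-SQL; the database name and backup path below are placeholders, not from the original post:

```sql
-- Option 1: stay in the full recovery model and back up the log regularly
-- (e.g. every 15-30 minutes via a SQL Server Agent job). Each log backup
-- lets the inactive portion of the log be reused.
BACKUP LOG MyDB TO DISK = 'D:\Backups\MyDB_log.trn';

-- Option 2: switch to the simple recovery model, so the log is truncated
-- automatically at checkpoints. Note that this gives up point-in-time
-- restore, so it is only appropriate if full backups alone are acceptable.
ALTER DATABASE MyDB SET RECOVERY SIMPLE;
```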

If you're doing that already, and you mean that the log grows more over a
day than you expect, then one option is to use Profiler to trace the
"Log File Auto Grow" event and look at the events that precede it in the
trace output. If there are one or two queries that require a lot of log
space, you should be able to identify them.
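Before setting up a trace, two built-in commands can confirm how full the log actually is and whether a long-running open transaction is preventing it from being truncated:

```sql
-- Show log size and percentage used for every database on the instance.
DBCC SQLPERF(LOGSPACE);

-- Report the oldest active transaction in the current database; a
-- long-running open transaction can prevent log truncation.
DBCC OPENTRAN;
```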

Simon

Runaway Log Files

Our production server ran out of disk space over the weekend due to a
runaway log file condition (again).
We had this happen about 2 months ago, prior to SP1, and I reported it here.
MS admitted that they had seen the problem, but were not able to reproduce it.
In the Event Log, I see thousands of entries like:
Event Type: Warning
Event Source: Schedule and Delivery Processor
Event Category: Logging
Event ID: 123
Date: 8/30/2004
Time: 11:55:16 AM
User: N/A
Computer: KYLE
Description:
The report server failed to write to the trace log.
The log files, each about 8 megabytes in size, accumulated until the
10-gigabyte disk capacity was reached; they contain repeated entries like:
ReportingServicesService!dbcleanup!1308!8/30/2004-11:55:16:: i INFO: Cleaned
0 broken snapshots, 0 chunks
ReportingServicesService!runningjobs!1308!8/30/2004-11:55:16:: i INFO:
Execution Log Entry Expiration timer enabled: Cycle: 50683 seconds
ReportingServicesService!dbcleanup!e44!8/30/2004-11:55:16:: i INFO: Expiring
old execution log entries
ReportingServicesService!dbcleanup!e44!8/30/2004-11:55:16:: i INFO:
Expiration of old execution log entries is complete. Removed 0 entries.
I was hoping this was fixed in SP1, which we are running now. Any ideas?
thx
Jeff

I already replied to your other post, but will duplicate the info here. We have
found the bug and you should contact PSS for a fix to the issue.
--
-Daniel
This posting is provided "AS IS" with no warranties, and confers no rights.

Runaway log files

Some of you have run into a problem with the log files being written to and
consuming the entire hard drive. We have identified the issue and have
provided a fix. The KB article itself has not yet been published but the
fix is available through PSS. Please feel free to contact PSS and refer
them to KB 885286.
For everyone who has been waiting for this, thank you for your patience.
--
-Daniel
This posting is provided "AS IS" with no warranties, and confers no rights.

Hi,
When will this fix be available for download?
Erik
This has become a critical issue for the company I work for. A fix is needed
pronto. Is this a matter of days for the fix, or "future thinking"?
At this point the fix is only available through PSS.
--
-Daniel
This posting is provided "AS IS" with no warranties, and confers no rights.
The fix is available now. You will need to contact PSS and reference KB
885286. The fix is currently not available as a public download.
--
-Daniel
This posting is provided "AS IS" with no warranties, and confers no rights.

Wednesday, March 7, 2012

Run an SP without adding anything to the transaction log?

Hi, I have an SP that will basically copy one table's data to another with
some transformation involved. The source table has over 1 billion rows. The
machine has limited disk space, and the transaction log will grow too large
and cause trouble.
Is there a way to run this SP and not have the transactions logged for it?
Thanks,
ADAM

What kind of transformation?
If you can use a SELECT statement to bcp the data to a file, then you can
change the recovery model to bulk-logged (first make a full DB backup), and
use the bcp utility or a BULK INSERT statement to insert the data into the
second table. Change the recovery model back to full when you finish.
AMB
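The bulk-logged approach above might look roughly like this; the database, table, and file names are placeholders, not from the original post:

```sql
-- 1. Take a full backup first, then switch to the bulk-logged model.
BACKUP DATABASE SourceDB TO DISK = 'D:\Backups\SourceDB_full.bak';
ALTER DATABASE SourceDB SET RECOVERY BULK_LOGGED;

-- 2. Load the exported file. TABLOCK is one of the requirements for a
-- minimally logged bulk load.
BULK INSERT dbo.TargetTable
FROM 'D:\Export\transformed.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n', TABLOCK);

-- 3. Switch back to the full recovery model and take another full backup,
-- since bulk-logged operations break the point-in-time restore chain.
ALTER DATABASE SourceDB SET RECOVERY FULL;
BACKUP DATABASE SourceDB TO DISK = 'D:\Backups\SourceDB_full_after.bak';
```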
Hi,

Set the database to the simple recovery model. After that, execute the
INSERT statement in batches of 1,000 or 2,000 rows and commit the transaction
after each batch. This will allow the transaction log to be truncated between
batches and will keep the LDF file from growing.
Thanks
Hari
SQL Server MVP
To start, read up on setting the database's recovery model to simple, even
if only temporarily for this process. Also, see whether setting the database
to SINGLE_USER mode during the process is possible. Dropping the indexes on
the destination table prior to the inserts and then re-creating them
afterward will also reduce the transaction log activity, speed things up, and
reduce index fragmentation.
I have a data warehousing project with a similar situation, except that my
row volume is only around 10,000,000 instead of 1,000,000,000, but that is
still a huge amount of data. What I actually did was create a view of the
source table that performs the needed transformations, and a DTS package
exports the view to a tab-delimited text file. Once the export is complete, I
bulk copy (BCP.EXE) the text file into the destination table. Read up on the
bulk copy feature and how it can be used to move data with minimal logging.
Of everything I tried, this was by far the fastest.
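A rough sketch of the view-plus-bcp approach; all object names, the server name, and the file path are placeholders, and the bcp command lines are shown as comments:

```sql
-- A view applies the transformation on the fly, so bcp can export the
-- already-transformed rows.
CREATE VIEW dbo.vSourceTransformed
AS
SELECT Id, UPPER(Value) AS TransformedValue   -- example transformation
FROM dbo.SourceTable;

-- Export, then re-import, from the command line (trusted connection):
--   bcp "SELECT * FROM SourceDB.dbo.vSourceTransformed" queryout
--       D:\Export\transformed.txt -c -t "\t" -S MyServer -T
--   bcp SourceDB.dbo.TargetTable in D:\Export\transformed.txt
--       -c -t "\t" -S MyServer -T -h "TABLOCK"
```

The TABLOCK hint on the import side helps qualify the load for minimal logging under the simple or bulk-logged recovery model.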