Showing posts with label odbc.

Friday, March 23, 2012

Running 64 Bit instance and 32 bit instance on Same Server

We are using an OEM ODBC connector that, to date, only works with the 32 bit
version of SQL 2005. My production server is a 64 bit version so I am still
maintaining the old server with the 32 bit for this ONE connectivity
requirement. The ODBC / OLEDB connector is used for a LINKED server to a
PROGRESS database.
Can I install the 32 BIT version of SQL on the same (64bit version of)
Windows Server 2003, and should I expect that the OEM ODBC driver would
function in the 32 bit instance as it does on the 32 bit Server?
--
Thanks~
Jim

Dear Jim,
Thank you for posting here.
From your description, I understand that:
Your current ODBC connector works only with the 32 bit version of SQL Server
2005; however, your production server is a 64 bit version, and you would like
to know whether you can install 32 bit SQL Server on the 64 bit Windows
Server 2003 and run both 64 bit and 32 bit SQL instances on it.
If I have misunderstood your concern, feel free to let me know.
Windows Server 2003 64 bit Edition fully supports SQL Server 2005 32 bit
Editions. You can refer to the section "Operating System Requirements
(32-Bit)" of this article "Hardware and Software Requirements for
Installing SQL Server 2005":
http://msdn2.microsoft.com/en-us/library/ms143506.aspx
So, we are able to install the 32 bit instance on the same 64 bit server. In
addition, on the 64 bit system we need to configure the DSN for the 32 bit
driver with the 32 bit ODBC Administrator, %windir%\SysWOW64\odbcad32.exe
(the odbcad32.exe under System32 is the 64 bit version, despite the name).
In most cases, the OEM ODBC connector should work in this scenario. We can
also contact the manufacturer of this ODBC connector to confirm this.
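
For reference, the linked server over such a 32 bit DSN would be defined
roughly like this (a sketch only; the linked server name, DSN, and
credentials below are invented placeholders, not values from Jim's setup):

-- Create a linked server over the OLE DB provider for ODBC (MSDASQL),
-- pointing at a 32 bit system DSN created in SysWOW64\odbcad32.exe.
EXEC master.dbo.sp_addlinkedserver
    @server     = N'PROGRESS_LNK',   -- placeholder linked server name
    @srvproduct = N'Progress',
    @provider   = N'MSDASQL',
    @datasrc    = N'ProgressDSN';    -- placeholder 32 bit system DSN

-- Map local logins to the remote Progress credentials (placeholders).
EXEC master.dbo.sp_addlinkedsrvlogin
    @rmtsrvname  = N'PROGRESS_LNK',
    @useself     = 'false',
    @rmtuser     = N'proguser',
    @rmtpassword = N'progpwd';

-- Smoke test: pass a query through to the Progress side (replace the
-- query text with something valid for your Progress schema).
SELECT * FROM OPENQUERY(PROGRESS_LNK, 'SELECT 1 AS ok');
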
If anything is unclear in my post, please don't hesitate to let me know.
Have a nice day!
Best regards,
Adams Qu
MCSE, MCDBA, MCTS
Microsoft Online Support
Microsoft Global Technical Support Center
Get Secure! - www.microsoft.com/security

|||

Dear Jim,
We wanted to see if the information provided was helpful. Please keep us
posted on your progress and let us know if you have any additional
questions or concerns.
We are looking forward to your response.
Have a nice day!
Best regards,
Adams Qu, MCSE, MCDBA, MCTS
Microsoft Online Support
Microsoft Global Technical Support Center
Get Secure! - www.microsoft.com/security

Running 2+ processes simultaneously causes exponential slowness

If I run a report from our 4GL app (using ODBC) it takes say 5 mins to run.
If I run 2 instances of it simultaneously, they will slow down by a factor
5+. Reads and Duration of some transactions in Profiler will jump up
significantly.
If anything, you might have expected the second process to benefit from data
caching, but for some reason the opposite occurs.
Anyone got any ideas why this might occur?
On SP3 - cannot yet install SP4 because of AWE.

|||
A 5 minute run is extremely long. Running two 5 minute queries will amp
up the cpu and disk activity accordingly and force each of them to take
much longer to complete - not to mention all other queries running. You
usually will not see a change in the number of reads or cpu values for
each. You should see a significant change in duration, however. A 5
minute operation is probably flushing all good data out of cache
multiple times while it scans tables. Not to mention that the disks are
probably quite busy during the operation, making each one take longer as
they contend for physical reads - which are very, very slow.
I would suggest you get the performance of the 5 minute query down to a
few seconds if possible, run it off-hours, or run it off a dedicated
reporting server to keep its activity from degrading the performance of
other queries.
Have a look at the execution plan for the 5 minute query and see if it
can be tuned.
David Gugick
Quest Software
www.imceda.com
www.quest.com

|||

Why do you think a 5 minute report is long? Given the amount of data it's
processing, it's damn quick. It's actually a cursor loop processing tens of
thousands of records, not a single SELECT.
Anyway, it's just an example for this post. The point is that [any] 2
processes running simultaneously take significantly longer than twice as long
to run.

|||

MrTim wrote:
> Why do you think a 5 minute report is long? Given the amount of data
> it's processing, it's damn quick. It's actually a cursor loop
> processing tens of thousands of records, not a single SELECT.
>
Cursors are terrible performers in the SQL Server world. Set based
solutions can be hundreds of times faster. If you can imagine a
production SQL Server running a single query for 5 minutes when other
transactions take milliseconds, it starts to sound like a long time. For
a dedicated reporting server, it may be acceptable.
> Anyway, it's just an example for this post. The point is that [any] 2
> processes running simultaneously take significantly longer than twice
> as long to run.
The expected duration of a query is partly based on the overall system
CPU load. If a single query loads up the CPU to 25% utilization and
completes in about 60ms, running a second query simultaneously causes
each query to take about 100ms. That doesn't even take into account all
the disk activity. When you have long running processes, they are
normally accessing a lot of data. When SQL Server reads data, it loads
it into cache, moving out potentially more important data that would
benefit from staying in memory. Since it's usually the case where data
sizes far exceed available memory, you end up with a lot of disk
thrashing. And in the I/O world there is nothing slower than that.
That's why I suggest you try and tune the queries and avoid cursors
whenever possible. There are usually set-based solutions that can work,
or at least get you most of the way there.
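
To make that concrete, here is a rough sketch of the same aggregation done
both ways (dbo.Orders and its Amount column are invented names):

-- Row-by-row cursor version (the slow pattern under discussion):
DECLARE @total money, @amt money;
SET @total = 0;
DECLARE order_cur CURSOR FAST_FORWARD FOR
    SELECT Amount FROM dbo.Orders;
OPEN order_cur;
FETCH NEXT FROM order_cur INTO @amt;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @total = @total + @amt;    -- work is done once per row
    FETCH NEXT FROM order_cur INTO @amt;
END;
CLOSE order_cur;
DEALLOCATE order_cur;

-- Set-based equivalent: one statement, one scan, no per-row overhead.
SELECT SUM(Amount) AS Total FROM dbo.Orders;
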
David Gugick
Quest Software
www.imceda.com
www.quest.com

|||

You're correct in assuming that something else is going on if n processes run
significantly slower than nX time. Memory thrashing, disk thrashing, or
lock contention are the likely candidates.
Check your free resources and i/o queue while both are running. Also, if
you can drop the cursor entirely, or use a fast-forward readonly cursor (w/
nolock), try that as well.
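
For example (a rough sketch; dbo.Orders and its columns are invented names):

-- Dirty reads for the whole session, instead of per-table NOLOCK hints:
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;

-- Or hint just the cursor's source table. NOLOCK avoids taking shared
-- locks but can return uncommitted rows, so it only suits reports that
-- tolerate approximate results.
DECLARE report_cur CURSOR FAST_FORWARD FOR
    SELECT OrderID, Amount FROM dbo.Orders WITH (NOLOCK);
OPEN report_cur;
-- ... FETCH loop feeding the report goes here ...
CLOSE report_cur;
DEALLOCATE report_cur;
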
"MrTim" wrote:
> Why do you think a 5 minute report is long? Given the amount of data it's
> processing it damn quick. It's actually a cursor loop processing tens of
> thousands of records, not a single SELECT.
> Anyway, it's just an example for this post. The point is that [any] 2
> processes running simultaneously take significantly longer than twice as long
> to run.
>
> "David Gugick" wrote:
> > MrTim wrote:
> > > If I run a report from our 4GL app (using ODBC) it takes say 5 mins
> > > to run. If I run 2 instances of it simultaneously, they will slow
> > > down by a factor 5+. Reads and Duration of some transactions in
> > > Profiler will jump up significanty.
> > >
> > > If anything you might have expected the second process to benefit
> > > from data caching, but for some reason the opposite occurs.
> > >
> > > Anyone got any ideas why this might occur?
> > > On SP3 - cannot yet install SP4 because of AWE.
> >
> > A 5 minute run is extremely long. Running two 5 minute queries will amp
> > up the cpu and disk activity accordingly and force each of them to take
> > much longer to complete - not to mention all other queries running. You
> > usually will not see a change in the number of reads or cpu values for
> > each. You should see a significant change in duration, however. A 5
> > minute operation is probably flushing all good data out of cache
> > multiple times while it scans tables. Not to mention that the disks are
> > probably quite busy during the operation, making each one take longer as
> > they contend for physical reads - which are very, very slow.
> >
> > I would suggest you get the performance of the 5 minute query down to a
> > few seconds if possible, run it off-hours, or run it off a dedicated
> > reporting server to keep it's activity from degrading the performance of
> > other queries.
> >
> > Have a look at the execution plan for the 5 minute query and see if it
> > can be tuned.
> >
> >
> > --
> > David Gugick
> > Quest Software
> > www.imceda.com
> > www.quest.com
> >

|||

Also check to ensure that one process is not blocking another... sp_who,
sp_who2
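
For example (a quick sketch; sp_who2's BlkBy column shows the spid doing the
blocking):

EXEC sp_who2;

-- Or list only the sessions that are currently blocked:
SELECT spid, blocked, waittime, lastwaittype
FROM master.dbo.sysprocesses
WHERE blocked <> 0;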
--
Wayne Snyder, MCDBA, SQL Server MVP
Mariner, Charlotte, NC
www.mariner-usa.com
(Please respond only to the newsgroups.)
I support the Professional Association of SQL Server (PASS) and its
community of SQL Server professionals.
www.sqlpass.org
"MrTim" <MrTim@.discussions.microsoft.com> wrote in message
news:72451A09-E4F8-46E6-A5A4-B4ACC24877BD@.microsoft.com...
> If I run a report from our 4GL app (using ODBC) it takes say 5 mins to
> run.
> If I run 2 instances of it simultaneously, they will slow down by a factor
> 5+. Reads and Duration of some transactions in Profiler will jump up
> significanty.
> If anything you might have expected the second process to benefit from
> data
> caching, but for some reason the opposite occurs.
> Anyone got any ideas why this might occur?
> On SP3 - cannot yet install SP4 because of AWE.

|||

This is running in a dev environment, and there are only these 2 processes
running. I know that if it's run in a production environment other processes
could be blocking it. That sort of thing has all been checked.
And, unfortunately the cursor cannot be avoided. It's a 4GL program and a
cursor loop is required to feed data into the report. When only one report
is running, transactions are generally showing 0-16 ms duration (so quite
quick). Only when the two are in contention do these start rising to 300+ms.
Given no other mitigating factors this still isn't what you'd expect.
"Wayne Snyder" wrote:
> Also check to ensure that one process is not blocking another...sp_who,
> sp_who2
> --
> Wayne Snyder, MCDBA, SQL Server MVP
> Mariner, Charlotte, NC
> www.mariner-usa.com
> (Please respond only to the newsgroups.)
> I support the Professional Association of SQL Server (PASS) and it's
> community of SQL Server professionals.
> www.sqlpass.org
> "MrTim" <MrTim@.discussions.microsoft.com> wrote in message
> news:72451A09-E4F8-46E6-A5A4-B4ACC24877BD@.microsoft.com...
> > If I run a report from our 4GL app (using ODBC) it takes say 5 mins to
> > run.
> > If I run 2 instances of it simultaneously, they will slow down by a factor
> > 5+. Reads and Duration of some transactions in Profiler will jump up
> > significanty.
> >
> > If anything you might have expected the second process to benefit from
> > data
> > caching, but for some reason the opposite occurs.
> >
> > Anyone got any ideas why this might occur?
> > On SP3 - cannot yet install SP4 because of AWE.
> >
>

|||

> Only when the two are in contention do these start rising to 300+ms.
> Given no other mitigating factors this still isn't what you'd expect.
Again, you expect this when two processes are contending over some limited
resource. The first step is to identify which resource that is. What type
of cursor are you using? If your isolation level is too high, it may be a
locking issue. It might be a memory issue, with each query causing the
other to repeatedly swap to disk. Or it could be an i/o issue, as two
queries running simultaneously can turn a fast sequential disk read into a
large number of slow random access reads.
If you can't use a better cursor, then you need to run SQL profiler and see
what's causing the contention.
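
Alongside Profiler, a snapshot of what each session is waiting on can help
pin down the contended resource (a sketch against the SQL Server 2000-era
sysprocesses view):

SELECT spid, status, blocked, waittype, waittime, lastwaittype,
       physical_io, cpu
FROM master.dbo.sysprocesses
WHERE spid > 50;    -- skip system spids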

Wednesday, March 21, 2012

Run Time Database Connection

Hi
I've created a DSN for SQL Server at run time, but when connecting with Crystal Reports it gives an ODBC error. I need to connect using CRAXDRT.Report and CRAXDRT.Application; I'm using VB as the front end.
Thanks in advance.

|||

Generally the errors are descriptive, so please provide details on which ODBC error it is.

Dave
(Me.HomeTown = DarrenLehmann.HomeTown)
Go Aussies.

|||

Hi Spring soft

I solved my problem, thanks for your post.

|||

Hi Tony,
Perhaps you could let us know the solution, for enlightenment of others.
On reflection, I suspect the report had a subreport. That's the most common reason (apart from improperly structured formulae) that raises ODBC errors.

Dave

|||

Of course Dave.

Private m_crxApp As New CRAXDRT.Application
Private m_crxRep As CRAXDRT.Report
Private m_crxTab As CRAXDRT.DatabaseTable

Implements clsSimpleReport

Public Function clsSimpleReport_RunReport(sReportFileName As String, sTitle As String) As Variant
    On Error Resume Next
    Screen.MousePointer = 11
    ' Log on through the Crystal ODBC database driver before opening the report.
    m_crxApp.LogOnServer "p2sodbc.dll", "DSN", "DBNAME", "UID", "PWD"
    Set m_crxRep = m_crxApp.OpenReport(App.Path & "\Reports\" & sReportFileName)
    crvReport.ReportSource = m_crxRep
    crvReport.ViewReport
End Function

My mistake was that instead of p2sodbc.dll in the LogOnServer function I had put some other DLL file; I don't remember which one now.

run SSIS package with ODBC connection via SQL agent

It seems there are a lot of problems running SSIS packages under the SQL agent. I have read the knowledge base articles regarding permission issues etc. but I still can't get my job to run. I can run any package as a job apart from a package that connects to an external database via an ODBC connection. Has anyone had any luck with this, and can you let me in on the secret?

I had a lot of problems with this too; the way I was able to resolve it was to configure the SQL Agent to run as a domain service account, and then log into the server under the domain service account and create my SSIS package. I saved the package to disk, imported it into SSIS, and used the SQL Agent to run a job pointing at the SSIS package, and it then succeeded. In my case it was a security issue when the SQL Agent was running under a different account than the one I had created the package under. Once I used the same account to do everything I was able to get it to work.

|||

Hey Andy

Thanks for the reply. It seems that the issue is that I'm using a 64 bit OS and a 32 bit ODBC driver, and when SQL agent runs it uses the 64 bit program files as opposed to the 32 bit program files, which means it couldn't find my ODBC DSN. So to get around this I had to create a batch file which calls the 'x86' program files.

Have also looked at getting a 64 bit ODBC driver for my postgres database, but it seems that SSIS does point at 32 bit ODBC connections, so I don't see how to include the 64 bit ODBC driver in SSIS.

Ugly and messy!
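
For reference, the same workaround can live in a SQL Agent CmdExec job step
instead of a separate batch file (a sketch; the job name, package path, and
exact Binn location are placeholders for your installation, and the job
itself must already exist):

EXEC msdb.dbo.sp_add_jobstep
    @job_name  = N'Nightly Postgres load',        -- placeholder job name
    @step_name = N'Run package in 32 bit mode',
    @subsystem = N'CmdExec',
    @command   = N'"C:\Program Files (x86)\Microsoft SQL Server\90\DTS\Binn\DTExec.exe" /F "D:\Packages\PostgresLoad.dtsx"';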

|||

Hi there,

Ahhhh yes, I tangled with this one too. What's happening is that SQL Management Studio, the Business Intelligence Studio, and SSIS are compiled as 32 bit applications; the only 64 bit tool I have been able to find is the x64 DTS Wizard, and it needs to be run from a command line or by double clicking it in Explorer. By using this you can connect to 64 bit ODBC drivers, but you can't run any job as an SSIS package because there is no 64 bit SSIS. I was using an Oracle provider and found out it has a SQL Net issue with any path names like (x86), so I had to get a patch for that. So although you can install and run SQL 2005 on the x64 platform, 99% of the functionality is 32 bit.

Crazy stuff huh?

|||

I have an asp.net web application and a web service (both of them are created in VS 2005 - asp.net 2.0). They are located on the same web server. In both web.config files, I have set <authentication mode="Windows"/> and <identity impersonate="true"/>. Also, configured the IIS settings to use Integrated Windows Authentication and unchecked the Anonymous access (for both). The web service is called from the web app, so I have to pass credentials for authentication to the web service. The web service loads and executes a SSIS package. The package and all the other sql objects are located in the sql server 2005 (windows server 2003 - the same server as the web server).

When I run the web service from the development environment (VS 2005), I get whatever I expected. When I call it from the web application, however, the package fails (no error message).

From the SSIS package log file, I found that the user credentials (domain and username) were correctly passed from the web service to SQL Server 2005 at the first event - PackageStart - but the credentials (or operator) changed from domainABC\user123 to NT AUTHORITY\NETWORK SERVICE after PackageStart. Then it complains … either the user, domainABC\serverName$, does not have access to the database, or the database does not exist.

Please help and thank you all!

|||

Jenny,

It sounds like the credentials of your web application pool are being passed to SQL Server; the default account for the application pool is NT AUTHORITY\NETWORK SERVICE. Have you checked that?

|||

I had the same issue: a package using a 32-bit ODBC driver ran fine on my laptop but not on my 64-bit server. The workaround was to schedule it through a SQL Agent job that used a command line to invoke the 32-bit version of dtexec. The bottom line: the package runs in 32-bit mode instead of 64-bit mode.

Rafael

|||

Hi Andy, thanks for the response.

I have worked on something else these two days. I just checked the web application pool: both my web app and web service point to the DefaultAppPool, and its Identity property is set to Predefined - Network Service. Should I create a new web application pool for both my web app and web service to use? I do need impersonation. How do I configure the new web application pool? Thanks!!!

|||

Here is some more info.

In the SSIS package, there are three connection managers –

· A: Microsoft OLE DB Provider for Analysis Services 9.0 -> connectionType=OleDbConnection

· B: .Net Providers \ SqlClient Data Provider -> connectionType=SqlConnection

· C: Native OLE DB \ Microsoft OLE DB Provider for SQL Server -> connectionType=OLEDB

After running the web application and checking the SQL database, I can tell that the package was reached and went through the first two steps, which relate to connection manager B – the ADO.Net connection. The remaining steps failed, and they are related to connection managers A & C. I think the credentials are passed OK but some setting related to Analysis Services is not correct. Any clues?

|||

Jenny,

There are a few different ways to go about this so why don't we try the easiest way first. According to the error message you are receiving, the domain\server$ does not have access to the database. So what's happening here is because your web application pool is running with the credentials of network service, SQL Server is viewing the account as ServerName$. So what you need to do is go into SQL Server>Security and add a new user with access permissions to the database. When you add the new user, type in the domain name and the server name with a $ at the end of it, for example: mydomain\myserver$

Make sure under the user mappings you check the box of the database that the user needs access to and also check the datareader box. This will give your web server access to read the database. If you need write permissions you can go back and check the datawriter box.

Typically you should run your web application under a domain service account, provide the domain service account access to the database, and use Windows authentication, something like this for your connection string: "server=MySQLServer; Integrated Security=SSPI; database=MyDatabase"

There are probably specific reasons why you want to use Impersonate, so give the mydomain\myserver$ example a try and see if you can get it to work.
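
In T-SQL form, those grants look roughly like this (using the example names
from above):

-- Let the web server's machine account (Network Service) in:
CREATE LOGIN [mydomain\myserver$] FROM WINDOWS;
USE MyDatabase;
CREATE USER [mydomain\myserver$] FOR LOGIN [mydomain\myserver$];
EXEC sp_addrolemember N'db_datareader', N'mydomain\myserver$';
-- Uncomment if the application must also write:
-- EXEC sp_addrolemember N'db_datawriter', N'mydomain\myserver$';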

|||

Hi Andy,

FYI: The problem was solved. What I did is listed below:

A. We created a new web app pool and pointed both the web application and the web service to this web app pool (configured not to use the default identity - Predefined: Network Service - but a Configurable identity with a newly created user name).

B. Under SQL Server, added this new user and assigned the necessary rights to it.

C. In the web application and web service, set impersonate=false (instead of true).

D. In the SSIS package, we also have a flat file destination connection which writes the output to a .txt file. I gave read & write rights to this new user. Then it works.

Thank you very much for your kind help!!!

Jenny

|||

Had a somewhat different problem... BIDS is running 32 bit and connecting to 32-bit ODBC drivers (Paradox DB in this case), but when executed, the server keeps looking for 64-bit drivers. While I wish it would look for ALL possible drivers and automagically drop down from 64 to 32 if necessary, setting Run64BitRuntime to False fixed it for now (which is too bad, IBM has 64 bit drivers for the AS400). Haven't scheduled the package yet, though.

|||

Hi,

I'm having the same problem (trying to execute a package located on server A and a web service on server B), but I'm using IIS 5.1, so I think I don't have the web application pool feature.

I also have the package's Package source configured as SSIS Package Store so my path refers to "File System" source type.

From a custom application, I call the web service and it returns "failure" when trying to execute the package (this one is retrieved successfully, started but not executed).

Is there any way to solve this problem in a similar way to Jenny_99's?

regards.

rodri

|||

I had a similar issue and this is the workaround I found.

We installed SQL2005 with SSIS on a 64-bit windows 2003 server. An IBM ISeries for windows (V5R3M with the latest Service Pack) was used for connecting to AS400. There are actually two versions of ODBC drivers on the 64-bit windows (odbcad32.exe). One is located at Windows\System32. Another is at Windows\SysWOW64. Originally I thought the one in System 32 is 32-bit version, and the other one is 64-bit version. But the driver version number under SysWOW64 is actually consistent with the version number in our 32-bit windows 2003 server. So I am not so sure any more. Anyway, what I found out is that DSN under SysWOW64 is actually used when the package is running in BIDS, or SSIS on the server. The one under System32 is used when the package is scheduled as a job. So I created the same DSN on both drivers to cover all bases since I don't know how to force the SQL agent to use a specific driver.


Friday, March 9, 2012

Run DTS in a safe way.

I have two servers: Progress as the transactional server and MSSQL as the warehouse
server.

I made a DTS package that "pumps" data from Progress to MSSQL (via ODBC).
Copying the data has to be done once a day, but sometimes there is a need to
do it on a user's demand.

I'd like to ensure that I properly understood the method of running DTS I've
just found using Google.

As I can see, there are at least two methods (apart from scheduling):

- using dtsrun - which requires the user running the DTS to have admin
privileges to use xp_cmdshell

- using sp_start_job - which requires creating a job prior to running the
DTS.

I am thinking of using the second one as it seems to be more secure - am I right?
Are there any hidden traps? What else should I do?

--
PL
(remove "nie.spamuj.bo.w.ryj" from my email address)

|||

There are a number of ways to run a package:

http://www.sqldts.com/default.aspx?104

To run xp_cmdshell, you don't need sysadmin privileges if you configure a
proxy account - see xp_sqlagent_proxy_account in Books Online - although
even using a proxy, it may still be too risky in some environments.
sp_start_job is also a problem, because you can only start a job that you
own yourself (unless you're a sysadmin, of course).
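
For comparison, the dtsrun route through xp_cmdshell looks something like
this (the server and package names are invented):

-- /S = server, /E = trusted connection, /N = package name.
EXEC master.dbo.xp_cmdshell 'dtsrun /S MYSERVER /E /N "PumpProgressData"';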

One solution is that a user INSERTs into a queue/request table to indicate
that they want to run the package. A scheduled job can then run every few
seconds/minutes/hours, and if the row is in the table (or the flag column is
set etc), then it runs the package. This avoids giving any special
permissions to users.
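
A minimal sketch of that queue-table pattern (all object and job names are
invented):

-- Users need only INSERT permission on this table.
CREATE TABLE dbo.PackageRequests (
    RequestId   int IDENTITY(1,1) PRIMARY KEY,
    RequestedAt datetime NOT NULL DEFAULT GETDATE(),
    Processed   bit NOT NULL DEFAULT 0
);

-- Body of the scheduled job step that polls every few minutes:
IF EXISTS (SELECT 1 FROM dbo.PackageRequests WHERE Processed = 0)
BEGIN
    -- The job that actually runs the DTS package:
    EXEC msdb.dbo.sp_start_job @job_name = N'Pump Progress data';
    UPDATE dbo.PackageRequests SET Processed = 1 WHERE Processed = 0;
END;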

Simon

|||

I would definitely recommend using the SQL Server agent and sp_start_job
as you get all the logging features for the agent - you get protection
against running the same DTS package 2 times at once. You can get
e-mail notification when the job has finished or failed or whatever.

I also like the idea of having a requests table - I might, however, consider
creating a trigger on the table that executes the sp_start_job
stored procedure instead of creating a polling process. It all depends
on how much complexity you can afford and how much tracking / logging
you require.

One of my customers has many DTS packages used for loading data in to a
data warehouse. When a business manager receives some new data from a
data provider they simply place the data on a shared drive and run the
DTS package. I created a simple ASP.NET web application that lists the
DTS packages available and shows the status of each as in the
Enterprise Manager.

In 2 days I had a fully functional system with simple access
permissions and admin screens. If you want to do a similar thing, look for
these stored procedures.

msdb.dbo.sp_help_job
msdb.dbo.sp_start_job
msdb.dbo.sp_stop_job
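
Typical calls look like this (the job name is an invented example):

EXEC msdb.dbo.sp_start_job @job_name = N'Load DW from shared drive';
EXEC msdb.dbo.sp_help_job  @job_name = N'Load DW from shared drive';  -- status
EXEC msdb.dbo.sp_stop_job  @job_name = N'Load DW from shared drive';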