Friday, February 24, 2012

Large System Databases

I manage a SQL Server where the master database is almost 3 times
larger than any other database and the tempdb is even bigger than that.
I have not been monitoring the size of these databases for very long
so I don't have a good baseline. However, it seems like a problem. I
would appreciate any help.
Thanks!
What is large? Does the app use master to store user data?
--
Andrew J. Kelly SQL MVP
"shanghai360" <scott.seely@.sfsoma.com> wrote in message
news:1143653999.425234.24030@.t31g2000cwb.googlegroups.com...
>I manage a SQL Server where the master database is almost 3 times
> larger than any other database and the tempdb is even bigger than that.
> I have not been monitoring the size of these databases for very long
> so I don't have a good baseline. However, it seems like a problem. I
> would appreciate any help.
> Thanks!
>|||The Master database is 3.6 GB and Tempdb is 4.5 GB...perhaps not so big
for some environments.
You are right, our Sitescope monitoring software maintains a log table
in the Master database that is 3.3 GB alone. Is that a good practice?
Thank you for your help Sir!|||No, that is very poor practice. Why would you use the Master database as a
user db? If that is third party software then you should ask for your money
back and tell them to call you when they get a clue. It is impossible to
say why tempdb is that big and if it is needed that big or not without much
more information. Check to be sure there are no long running open
transactions in Tempdb.
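As a rough illustration of that check, a minimal sketch (DBCC OPENTRAN and sp_spaceused are standard commands):
-- Report the oldest active transaction in tempdb, if any
DBCC OPENTRAN ('tempdb')
-- Show how much of tempdb's allocated space is actually in use
USE tempdb
EXEC sp_spaceused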
--
Andrew J. Kelly SQL MVP
"shanghai360" <scott.seely@.sfsoma.com> wrote in message
news:1143749039.920309.50530@.v46g2000cwv.googlegroups.com...
> The Master database is 3.6 GB and Tempdb is 4.5 GB...perhaps not so big
> for some environments.
> You are right, our Sitescope monitoring software maintains a log table
> in the Master database that is 3.3 GB alone. Is that a good practice?
> Thank you for your help Sir!
>|||It turned out that it was the 3rd-party software, and the consultant who
installed it put the table in master. Thanks to your help we were able to
resolve the problem and move the table into a user database. With regard
to tempdb, there were no long-running open transactions... I restarted the
SQL service and it shrunk to a reasonable size; I'm monitoring it now.
Thanks again!
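For anyone in the same spot, a minimal sketch of moving a table out of master into a user database (the table and database names here are hypothetical):
-- Copy the data into the user database
SELECT * INTO Monitoring.dbo.SitescopeLog FROM master.dbo.SitescopeLog
-- After verifying the row counts match, drop the copy in master
DROP TABLE master.dbo.SitescopeLog
The monitoring software would then need to be repointed at the new table.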

Large String Parameters Causing Performance Issues

Hi All,
Bit of a strange one, this. I have a key field in one of my databases that is a
Char(18). I use this as a parameter field in one of my reports.
When I run the report and specify the key field parameter manually the
report runs instantly. If I attempt to set this as a parameter variable and
enter the criteria in the parameter box, the report takes over a minute to
run. The 2 statements are as follows.
1) select sum(sales_value) as sales from production.dbo.epos_transactions
where transaction_ref = '085505010500074995'
When run as above, the report runs instantly.
2) select sum(sales_value) as sales from production.dbo.epos_transactions
where transaction_ref = @TREF
When I enter the parameter and run the report it takes over a minute to run.
Running the query through query analyser causes no issues on any of the
servers involved and the response time on all servers is instant.
I'd be grateful if anyone can shed any light on this.
Many thanks for any help.
--
Nick Colebourn
DBA
United Coop Ltd
Hi Nick,
Could you supply a generic repro for this issue? An RDL for Northwind or a
simple DB. I would love to take a look at it.
| Hi All,
| Bit of strange one this. I have a key field in one of my databases that
is a
| Char(18). I use this as a parameter field in one of my reports.
| When I run the report and specify the key field parameter manually the
| report runs instantly. If I attempt to set this as a parameter variable
and
| enter the criteria in the parameter box, the report takes over a minute
to
| run. The 2 statements are as follows.
|
| 1) select sum(sales_value) as sales from production.dbo.epos_transactions
| where transaction_ref = '085505010500074995'
|
| When run as above, the report runs instantly.
|
| 2) select sum(sales_value) as sales from production.dbo.epos_transactions
| where transaction_ref = @.TREF
|
| When I enter the parameter and run the report it takes over a minute to
run.
|
| Running the query through query analyser causes no issues on any of the
| servers involved and the response time on all servers is instant.
|
| I'd be grateful if anyone can shed any light on this.
|
| Many thanks for any help.
|
| --
| Nick Colebourn
| DBA
| United Coop Ltd
|||Hi Brad,
Many thanks for your reply. I'll see if I can get it to do the same
with a Northwind query and post back ASAP. For your info, I've created an
in-line table valued function containing the sql with the parameter and
called this as the dataset instead of the sql text itself. This works ok
either with static criteria or using a parameter passed across to the
function. The performance issue only seems to happen when you pass the
parameter value directly across to the sql text.
Many thanks
Nick
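A minimal sketch of the inline table-valued function approach Nick describes (the table and column names are taken from the thread; the function name is an assumption):
CREATE FUNCTION dbo.fn_SalesByTransaction (@TREF char(18))
RETURNS TABLE
AS
RETURN
(
    SELECT SUM(sales_value) AS sales
    FROM production.dbo.epos_transactions
    WHERE transaction_ref = @TREF
)
-- Used as the report dataset:
-- SELECT sales FROM dbo.fn_SalesByTransaction(@TREF)
One plausible reason this helps: the function's parameter is explicitly char(18), so the predicate is not pushed into a column-side conversion if the report would otherwise pass the value as nvarchar.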
""Brad Syputa - MS"" wrote:
> Hi Nick,
> Could you supply a generic repro for this issue. An RDL for Northwind or a
> simple DB. I would love to take a look at it.
>
> --
> |
> | Hi All,
> | Bit of strange one this. I have a key field in one of my databases that
> is a
> | Char(18). I use this as a parameter field in one of my reports.
> | When I run the report and specify the key field parameter manually the
> | report runs instantly. If I attempt to set this as a parameter variable
> and
> | enter the criteria in the parameter box, the report takes over a minute
> to
> | run. The 2 statements are as follows.
> |
> | 1) select sum(sales_value) as sales from production.dbo.epos_transactions
> | where transaction_ref = '085505010500074995'
> |
> | When run as above, the report runs instantly.
> |
> | 2) select sum(sales_value) as sales from production.dbo.epos_transactions
> | where transaction_ref = @.TREF
> |
> | When I enter the parameter and run the report it takes over a minute to
> run.
> |
> | Running the query through query analyser causes no issues on any of the
> | servers involved and the response time on all servers is instant.
> |
> | I'd be grateful if anyone can shed any light on this.
> |
> | Many thanks for any help.
> |
> | --
> | Nick Colebourn
> | DBA
> | United Coop Ltd
> |
>

Large string data

I have a web site that allows user to enter large strings into a database (comments, etc). What is the best way to do that? Right now I have them limited to 25 characters and the data type is varchar. Is there a better way?
Thanks!|||You may consider using OPENXML.|||VARCHAR can hold up to 8000 characters; 25 characters is too small for comments. If you do not want to let users enter more than 8000 characters in the comments, it is better to use VARCHAR. Otherwise, you may change to TEXT. Note that the TEXT datatype can sometimes cause problems.
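To make that concrete, a minimal table sketch (names and sizes are illustrative):
CREATE TABLE dbo.UserComments
(
    CommentID int IDENTITY(1,1) PRIMARY KEY,
    -- Up to 8000 characters; switch to TEXT (or varchar(max) on SQL Server 2005 and later) if more is needed
    CommentText varchar(8000) NOT NULL
)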

Large SQL update - effect on SQL 2005 transactional replication

I'm a newbie to Replication and recently setup the following.

Publisher and Distributor on the same SQL 2005 server, and I've got 7 subscribers (SQL 2000 servers) using push subscriptions. I'm replicating 5 SQL tables which don't have too many changes, and these are scheduled to run every 3 hours. In a few days a large one-off SQL update will add an additional 10,000 rows to one of the replicated tables. I was wondering what impact this would have on the above setup, i.e. are there any sort of limitations here. I'm assuming not but thought I would check. I'm thinking it will just cause additional overhead on the server, but the update is being applied when no users will be using the database.

Any feedback greatly appreciated.

Thanks

You shouldn't have any problems; just understand that if you do this in one large transaction, the log reader agent won't pick up the changes until the transaction has been completed/committed, and the distribution agent will not pick up the changes until after the log reader agent has completed.
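If latency ever became a concern, one common variation (a sketch, not from the thread; table and column names are hypothetical) is to apply the one-off change in smaller batches, so the log reader can pick up each batch as it commits:
DECLARE @rows int
SET @rows = 1
WHILE @rows > 0
BEGIN
    -- Each INSERT is its own transaction, so each 1,000-row batch
    -- becomes visible to the log reader as soon as it commits
    INSERT INTO dbo.ReplicatedTable (KeyCol, ValueCol)
    SELECT TOP (1000) s.KeyCol, s.ValueCol
    FROM dbo.StagingRows AS s
    WHERE NOT EXISTS (SELECT * FROM dbo.ReplicatedTable AS r WHERE r.KeyCol = s.KeyCol)
    SET @rows = @@ROWCOUNT
END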

Large sql statement with 5000 insert and update commands

Hi
What can I do to speed up the processing of a large number of insert
and update commands that are run from a Notepad document (a plain-text script)?
The document has, say 5000, individual sql statements that either
update or insert.
Is there something in the database that can be turned on or off to
speed up this processing.
thanks
Newish
By default, each INSERT is an implicit transaction and thus requires a
synchronous log i/o for each statement. If you specify an explicit BEGIN
TRANSACTION at the beginning of the script and a COMMIT at the end, you will
get improved performance because log i/o is reduced and only the log flush
for the COMMIT is synchronous.
Hope this helps.
Dan Guzman
SQL Server MVP
"Newish" <ahussain3@.gmail.com> wrote in message
news:1160325892.791516.228990@.i3g2000cwc.googlegroups.com...
> Hi
> What can I do to speed up the processing of a large amount of insert
> and update commands that are exchanged using a notepad document.
> The document has, say 5000, individual sql statements that either
> update or insert.
> Is there something in the database that can be turned on or off to
> speed up this processing.
> thanks
> Newish
>
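A minimal sketch of Dan's suggestion, assuming the Notepad document is a flat list of statements (the statements shown are placeholders):
BEGIN TRANSACTION

INSERT INTO dbo.SomeTable (SomeCol) VALUES ('value 1')
UPDATE dbo.SomeTable SET SomeCol = 'value 2' WHERE Id = 42
-- ...the remaining ~5000 statements...

COMMIT TRANSACTION
-- Only the COMMIT needs a synchronous log flush, instead of one flush per statement.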

Large sql statement with 5000 insert and update commands

Hi
What can I do to speed up the processing of a large number of insert
and update commands that are run from a Notepad document (a plain-text script)?
The document has, say 5000, individual sql statements that either
update or insert.
Is there something in the database that can be turned on or off to
speed up this processing.
thanks
Newish
Hello Newish,
If you batch the inserts in a transaction you should get an improvement.
Around every 2000 statements, do a BEGIN TRANSACTION and a COMMIT TRANSACTION.
Simon Sabin
SQL Server MVP
http://sqlblogcasts.com/blogs/simons

> Hi
> What can I do to speed up the processing of a large amount of insert
> and update commands that are exchanged using a notepad document.
> The document has, say 5000, individual sql statements that either
> update or insert.
> Is there something in the database that can be turned on or off to
> speed up this processing.
> thanks
> Newish
>
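A sketch of Simon's variation: rather than one giant transaction, commit every couple of thousand statements so no single transaction holds too much log (statement bodies are placeholders):
BEGIN TRANSACTION
-- statements 1 through 2000 ...
COMMIT TRANSACTION

BEGIN TRANSACTION
-- statements 2001 through 4000 ...
COMMIT TRANSACTION
-- and so on to the end of the script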

Large Slowly Changing Dimension

Hello,
I'm somewhat new to data warehousing, and have some basic questions.
Let's say I have a fact table, named sales. The financial analyst
would like to be able to sort the sales information based on several
customer related attributes. Some of these attributes would include
zip code, age, and income. While I can create a customer dimension,
these attributes change over time.
The analysts are really only interested in these customer attributes
at the time of the sale. For example, they only care about where a
customer lived at the time of the sale, but not where they are living
now. Now I know there are different techniques for dealing with
changing dimensions, but I'm left with the impression that it would be
making things more complicated then necessary.
Would it not be best to just simply store this information in the
fact table itself? While I'm aware that fact tables are primarily used
to store numerical information, I don't see the advantage of putting
this information in a separate dimension.
Your input would be much appreciated.
Thanks,
Attila
Hi,
You say the analysts are interested in the customer attributes at the time
of the sale. So these attributes will have to be stored in dimensions to be
able to analyse them using datawarehouse techniques. These must be slowly
changing. So when you add more sales facts, e.g. with a zipcode that doesn't
exist in the zipcode dimension yet, you'll have to add it to the dimension.
The alternative is to drop and recreate all the facts and dimensions, but
that's no option if the customers move or the incomes change in the source
system.
Good luck...
Jeroen.
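For illustration, a minimal Type 2 slowly changing dimension in the spirit of Jeroen's reply (every name and column here is an assumption):
CREATE TABLE dbo.DimCustomer
(
    CustomerKey int IDENTITY(1,1) PRIMARY KEY, -- surrogate key; a new row is added per change
    CustomerID int NOT NULL,                   -- business key from the source system
    ZipCode char(5) NOT NULL,
    Age int NOT NULL,
    Income money NOT NULL,
    EffectiveDate datetime NOT NULL,
    EndDate datetime NULL                      -- NULL marks the current version
)

CREATE TABLE dbo.FactSales
(
    -- The fact row keeps the surrogate key that was current at the time of the sale,
    -- so historical sales retain the customer attributes as they were then
    CustomerKey int NOT NULL REFERENCES dbo.DimCustomer (CustomerKey),
    SaleDate datetime NOT NULL,
    SalesValue money NOT NULL
)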
"Attila" <ThisIsAFakeAddress35@.hotmail.com> wrote in message
news:9516751f.0401290847.23e49471@.posting.google.com...
> Hello,
> I'm somewhat new to data warehousing, and have some basic questions.
> Let's say I have a fact table, named sales. The financial analyst
> would like to be able to sort the sales information based on several
> customer related attributes. Some of these attributes would include
> zip code, age, and income. While I can create a customer dimension,
> these attributes change over time.
> The analysts are really only interested in these customer attributes
> at the time of the sale. For example, they only care about where a
> customer lived at the time of the sale, but not where they are living
> now. Now I know there are different techniques for dealing with
> changing dimensions, but I'm left with the impression that it would be
> making things more complicated then necessary.
> Would it not be best to just simply store this information in the
> fact table itself? While I'm aware that fact tables are primarily used
> to store numerical information, I don't see the advantage of putting
> this information in a separate dimension.
> Your input would be much appreciated.
> Thanks,
> Attila
|||> So these attributes will have to be stored in dimensions to be
able to analyse them using datawarehouse techniques.
Can you elaborate on this part? What specifically will an analyst not
be able to do if I store the customer data in the fact table?
Thanks,
Attila
"Spike" <jeroenaNOSPAM@.hotmail.com> wrote in message news:<bvgf14$sjl$1@.reader08.wxs.nl>...[QUOTE]
> Hi,
> You say the analysts are interested in the customer attributes at the time
> of the sale. So these attributes will have to be stored in dimensions to be
> able to analyse them using datawarehouse techniques. These must be slowly
> changing. So when you add more sales facts, e.g. with a zipcode that doesn't
> exist in the zipcode dimension yet, you'll have to add it to the dimension.
> The alternative is to drop and recreate all the facts and dimensions, but
> that's no option if the customers move or the incomes change in the source
> system.
> Good luck...
> Jeroen.
> "Attila" <ThisIsAFakeAddress35@.hotmail.com> wrote in message
> news:9516751f.0401290847.23e49471@.posting.google.com...|||I suggest reading articles written by Ralph Kimball. Here's one of his
classic articles on slowly changing dimensions.
http://www.dbmsmag.com/9604d05.html
"Attila" <ThisIsAFakeAddress35@.hotmail.com> wrote in message
news:9516751f.0402020643.36d448b7@.posting.google.com...
> able to analyse them using datawarehouse techniques.
> Can you elaborate on this part? What specifically will an analyst not
> be able to do if I store the customer data in the fact table?
> Thanks,
> Attila
>
> "Spike" <jeroenaNOSPAM@.hotmail.com> wrote in message
news:<bvgf14$sjl$1@.reader08.wxs.nl>...
|||There is an entire chapter in Ralph Kimball's book on mini dimensions,
which help to ease the pain of rapidly changing dimensions.
Ashish
On Tue, 17 Feb 2004 15:29:51 -0500, Domenico Discepola
<domenico_discepola@.quadrachemicals.com> wrote:

> I suggest reading articles written by Ralph Kimball. Here's one of his
> classic articles on slowly changing dimensions.
> http://www.dbmsmag.com/9604d05.html
>
> "Attila" <ThisIsAFakeAddress35@.hotmail.com> wrote in message
> news:9516751f.0402020643.36d448b7@.posting.google.com...
> news:<bvgf14$sjl$1@.reader08.wxs.nl>...
>
Using M2, Opera's revolutionary e-mail client: http://www.opera.com/m2/

large size of string in the drill down

Hello,

I have an issue with the drill down. In the report there is a drill down on the Amount column. I am trying to pass the customer names in this drill down, but there are more than 100 customers for that specific case and the drill down is not able to pass all of them.

Is there any other way to pass the large string in the drill down?

You may be able to write some custom code that would compress the string and then uncompress in the drill down.

Here is some example code.

http://www.developerfusion.co.uk/show/1642/

Otherwise try and pass keys instead of names into the drilldown.

cheers,

Andrew
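A sketch of the pass-keys alternative: the drillthrough parameter carries compact customer IDs, and the target report's dataset resolves them back to names (all object names are hypothetical):
-- Target report dataset; @CustomerIDs is the multi-value drillthrough parameter
SELECT c.CustomerID, c.CustomerName
FROM dbo.Customers AS c
WHERE c.CustomerID IN (@CustomerIDs)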

|||Can someone please help on this?

Large selection in multiple parameters

I have a problem and was wondering if someone might have some good
suggestions. I've created a report that shows the number of loans that
have been given to students at various colleges by various vendors.
The report has parameters for vendor and for college, and I created the
parameters as multi-value drop down boxes. The source for the schools
box is a query against my data table for the distinct Schools that
appear.
The problem I'm having is this: There are approximately 4000 schools
in the table, and when the user does a Select All on the web site, the
report chugs away for a while and then returns nothing. It works fine
in Visual Studio (slowly, but it returns everything) but not on the
website. I don't get any error messages.
The WHERE clause in my main script is simply:
Where SchoolName in (@schoolName)
My guess is that there is simply too much being shoved into the
parameter when I try to select all. Just wondering if anyone has a
reasonable way to get around this. I've managed a rather kludgy method
where I have an additional boolean parameter and then logic to bypass
the WHERE if it's true, but I'd rather just have the thing work
normally (via the drop-down) if that's possible.
The drop down for the Vendors works just fine, whether I select one,
many, or all of the vendors. But there are only 30 of those, so the
cause seems to be the number of choices. Is there a set limit to the
number of choices in a drop down, or a set size that can be passed
perhaps?
Using Reporting Services 2005.How about doing a bit of processing before the query gets run. Something
along the lines of
IIF parameter!School.Item(0) = true, select data regardless of school (no
where clause on SchoolName) else select data with the where clause.
I'm not quite sure how to check if the Select All parameter was selected or
not, but it might be food for thought anyway.
Kaisa M. Lindahl Lervik
<cphite@.gmail.com> wrote in message
news:1162419349.183710.148290@.h48g2000cwc.googlegroups.com...
>I have a problem and was wondering if someone might have some good
> suggestions. I've created a report that shows the number of loans that
> have been given to students at various colleges by various vendors.
> The report has parameters for vendor and for college, and I created the
> parameters as multi-value drop down boxes. The source for the schools
> box is a query against my data table for the distinct Schools that
> appear.
> The problem I'm having is this: There are approximately 4000 schools
> in the table, and when the user does a Select All on the web site, the
> report chugs away for a while and then returns nothing. It works fine
> in Visual Studio (slowly, but it returns everything) but not on the
> website. I don't get any error messages.
> The WHERE clause in my main script is simply:
> Where SchoolName in (@.schoolName)
> My guess is that there are simply too much being shoved into the
> parameter when I try to select all. Just wondering if anyone has a
> reasonable way to get around this. I've managed a rather kludgy method
> where I have an additional boolean parameter and then logic to bypass
> the WHERE if it's true, but I'd rather just have the thing work
> normally (via the drop-down) if that's possible.
> The drop down for the Vendors works just fine, whether I select one,
> many, or all of the vendors. But there are only 30 of those, so the
> cause seems to be the number of choices. Is there a set limit to the
> number of choices in a drop down, or a set size that can be passed
> perhaps?
> Using Reporting Services 2005.
>|||What I have done in this case is as follows. I found this function:
CREATE FUNCTION [dbo].[fn_MVParam](@RepParam nvarchar(max), @Delim char(1) = ',')
RETURNS @VALUES TABLE (Param nvarchar(max))
AS
BEGIN
    DECLARE @chrind INT
    DECLARE @Piece nvarchar(max)
    SELECT @chrind = 1
    WHILE @chrind > 0
    BEGIN
        SELECT @chrind = CHARINDEX(@Delim, @RepParam)
        IF @chrind > 0
            SELECT @Piece = LEFT(@RepParam, @chrind - 1)
        ELSE
            SELECT @Piece = @RepParam
        INSERT @VALUES(Param) VALUES(@Piece)
        SELECT @RepParam = RIGHT(@RepParam, LEN(@RepParam) - @chrind)
        IF LEN(@RepParam) = 0 BREAK
    END
    RETURN
END
This will break the comma-separated string passed into your stored proc into
a table. Throw the results of this function into a temp table, and filter your
results with a join, instead of an IN.
Ken
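A sketch of the join-based filter Ken describes, using the function above (the loans table and its column are assumptions):
DECLARE @tbl_Schools TABLE (SchoolName nvarchar(max))

INSERT INTO @tbl_Schools (SchoolName)
SELECT LTRIM(Param) FROM dbo.fn_MVParam(@schoolName, ',')

SELECT l.*
FROM dbo.Loans AS l
INNER JOIN @tbl_Schools AS s ON l.SchoolName = s.SchoolName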
"Kaisa M. Lindahl Lervik" wrote:
> How about doing a bit of processing before the query gets run. Something
> along the lines of
> IIF parameter!School.Item(0) = true, select data regardless of school (no
> whereclause on Schoolname) else select data with whereclause.
> I'm not quite sure how to check if the Select All parameter was selected or
> not, but it might be food for thought anyway.
> Kaisa M. Lindahl Lervik
> <cphite@.gmail.com> wrote in message
> news:1162419349.183710.148290@.h48g2000cwc.googlegroups.com...
> >I have a problem and was wondering if someone might have some good
> > suggestions. I've created a report that shows the number of loans that
> > have been given to students at various colleges by various vendors.
> > The report has parameters for vendor and for college, and I created the
> > parameters as multi-value drop down boxes. The source for the schools
> > box is a query against my data table for the distinct Schools that
> > appear.
> >
> > The problem I'm having is this: There are approximately 4000 schools
> > in the table, and when the user does a Select All on the web site, the
> > report chugs away for a while and then returns nothing. It works fine
> > in Visual Studio (slowly, but it returns everything) but not on the
> > website. I don't get any error messages.
> >
> > The WHERE clause in my main script is simply:
> > Where SchoolName in (@.schoolName)
> >
> > My guess is that there are simply too much being shoved into the
> > parameter when I try to select all. Just wondering if anyone has a
> > reasonable way to get around this. I've managed a rather kludgy method
> > where I have an additional boolean parameter and then logic to bypass
> > the WHERE if it's true, but I'd rather just have the thing work
> > normally (via the drop-down) if that's possible.
> >
> > The drop down for the Vendors works just fine, whether I select one,
> > many, or all of the vendors. But there are only 30 of those, so the
> > cause seems to be the number of choices. Is there a set limit to the
> > number of choices in a drop down, or a set size that can be passed
> > perhaps?
> >
> > Using Reporting Services 2005.
> >
>
>|||Ken Reitmeyer wrote:
> What I have done in the case is as follows. I found this function
> CREATE FUNCTION [dbo].[fn_MVParam](@RepParam nvarchar(max), @Delim char(1) = ',')
> RETURNS @.VALUES TABLE (Param nvarchar(max))AS
> BEGIN
> DECLARE @.chrind INT
> DECLARE @.Piece nvarchar(max)
> SELECT @.chrind = 1
> WHILE @.chrind > 0
> BEGIN
> SELECT @.chrind = CHARINDEX(@.Delim,@.RepParam)
> IF @.chrind > 0
> SELECT @.Piece = LEFT(@.RepParam,@.chrind - 1)
> ELSE
> SELECT @.Piece = @.RepParam
> INSERT @.VALUES(Param) VALUES(@.Piece)
> SELECT @.RepParam = RIGHT(@.RepParam,LEN(@.RepParam) - @.chrind)
> IF LEN(@.RepParam) = 0 BREAK
> END
> RETURN
> END
> This will break the comma separated string passed into your stored proc into
> a table. Thow the results of this function into a temp table, and filter you
> results with a join, instead of an in.
Ken,
When I use this function it works for one selection, but when I select
more than one school it tells me I have too many parameters.|||This is how I am using the function to pass a long parameter list
--temp table to hold parameters
declare @tbl_WorkCenters table
(
    work_center int
)
declare @sql varchar(max)
--@work_centers is the comma-separated parameter list passed in from the report
set @sql = 'select ltrim(param) from fn_MVParam(''' + @work_centers + ''', '','')'
insert into @tbl_WorkCenters
exec (@sql)
Not sure if that answers your question or not.
"cphite@.gmail.com" wrote:
> Ken Reitmeyer wrote:
> > What I have done in the case is as follows. I found this function
> >
> > CREATE FUNCTION [dbo].[fn_MVParam](@RepParam nvarchar(max), @Delim char(1) = ',')
> > RETURNS @.VALUES TABLE (Param nvarchar(max))AS
> > BEGIN
> > DECLARE @.chrind INT
> > DECLARE @.Piece nvarchar(max)
> > SELECT @.chrind = 1
> > WHILE @.chrind > 0
> > BEGIN
> > SELECT @.chrind = CHARINDEX(@.Delim,@.RepParam)
> > IF @.chrind > 0
> > SELECT @.Piece = LEFT(@.RepParam,@.chrind - 1)
> > ELSE
> > SELECT @.Piece = @.RepParam
> > INSERT @.VALUES(Param) VALUES(@.Piece)
> > SELECT @.RepParam = RIGHT(@.RepParam,LEN(@.RepParam) - @.chrind)
> > IF LEN(@.RepParam) = 0 BREAK
> > END
> > RETURN
> > END
> >
> > This will break the comma separated string passed into your stored proc into
> > a table. Thow the results of this function into a temp table, and filter you
> > results with a join, instead of an in.
> Ken,
> When I use this function it works for one selection, but when I select
> more than one school it tells me I have too many parameters.
>

Large scale databases - common practice

Hi,
We are designing a large-scale database application. The current one will
receive information from many inputs constantly.
We expect it to grow in one year to 60GB (on average of 4-5GB/Mo).
The database then (after one year) will have a cleaning procedure that will
delete information and the size of the database will not increase
significantly.
Here are a few questions:
1. What are the guidelines to design and implement such a database? Where
can I find information on the web on such cases?
2. From a database creation point of view, should we use a few files to
create the database or only one? If we use a few files,
what happens to clustered indexes? Where can I find more reading material?
3. The database has the management tables of the application and the data
tables. One data table will hold most of the information in that database. Do
you think I need to divide the table? If so, how do we do it? What is the
common practice in such cases where a huge table holds so much info?
(Currently we have a similar database with 27M records in it.)
How do you build efficient queries? How do you manage the data in the
separate tables? How do you manage indexes? And again, where can I find
material on this?
We know to look inside the BOL but we are looking for common practice
solutions.
TIA!!
Noam
Please don't post independently in separate newsgroups. You can add
multiple newsgroups to the header and then all the answers appear as one.
See my reply in the other newsgroup.
Andrew J. Kelly SQL MVP
"Noam" <Noam@.discussions.microsoft.com> wrote in message
news:2DCD3F8D-DFCF-46F7-BB1C-30355D49B321@.microsoft.com...
> Hi,
> We are designing a large-scale database application. The current one will
> receive information from many inputs constantly.
> We expect it to grow in one year to 60GB (on average of 4-5GB/Mo).
> The database then (after one year) will have a cleaning procedure that
will
> delete information and the size of the database will not increase
> significantly.
> Here are a few questions:
> 1. What are the guide lines to design and implement such a database? Where
> can I find information on the web on such cases?
> 2. From database creation point of view - should we use a few files that
> will create the database or should we use only one ? If we use a few
files,
> what happends to clustered indexes? Where can I find more reading
material?
> 3. The database has the management tables of the application and the data
> tables. One data table will hold most of the information in that database.
Do
> you think I need to divide te table? if so, how do we do it? What is the
> common practice in such cases where a huge table holds so much info
> (Currently we have a similar database with 27M records in).
> how do you build efficient queries? How do you manage the data in the
> separate tables? How do you manage indexes? and again where can I find
> material on this?
> We know to look inside the BOL but we are looking for common practice
> solutions.
> TIA!!
> Noam
>
|||Thank you for your fast reply. This is the first time I posted here and I
will do as you suggest next time!
Thanks.
Noam
"Andrew J. Kelly" wrote:

> Please don't post independently in separate newsgroups. You can add
> multiple newsgroups to the header and then all the answers appear as one.
> See my reply in the other newsgroup.
>
> --
> Andrew J. Kelly SQL MVP
>
> "Noam" <Noam@.discussions.microsoft.com> wrote in message
> news:2DCD3F8D-DFCF-46F7-BB1C-30355D49B321@.microsoft.com...
> will
> files,
> material?
> Do
>
>

Large scale changes of data in SQL Server 2005

Hi
We are using a Helpdesk system which uses SQL Server 2005 as the back
end. Our office is the group HQ for all our subsidiaries and affiliates,
so we tend to get a lot of IT support calls from different
organisations.
When HD agents log the call, a field exists to identify which
organisation this is, e.g. Company A, Company B and so on.
Company A has decided to change their name to Company Z. This needs to
be reflected in the software not just from this point on, but also
historically, i.e. even a call logged in 2005 by an employee of Company A
should now read as a call logged by an employee of Company Z.
What is the best method to change this in the database? Luckily,
Company A has been one of the smaller companies, so the total amount of
calls by them that exist is about 40.
Many thanks in advance.
SJ
Hi
I hope you do have a Company table where you store the names of the companies
and their IDs.
UPDATE Company SET [Name]='Company Z' WHERE [Name]='Company A'
If your database design is proper you don't need to change anything else because
the table is referenced by ID and not by NAME.
<smokejo@.googlemail.com> wrote in message
news:1163971948.529460.64020@.f16g2000cwb.googlegro ups.com...
> Hi
> We are using a Helpdesk system which uses SQL Server 2005 as the back
> end. Our office is the group HQ for all our subsidaries and affiliates,
> so we tend to get a lot of IT support calls from different
> organisations.
> When HD agents log the call, a field exists to identify which
> organisation this is, e.g. Company A, Company B and so on.
> Company A has decided to change their name to Company Z. This needs to
> be reflected in the software not just from this point on, but also
> historically, i.e. even a called in 2005 by an employee of Company A
> should now read as a call logged by an employee of Company Z.
> What is the best method to change this in the database? Luckily,
> Company A has been one of the smaller companies, so the total amount of
> calls by them that exist is about 40.
> Many thanks in advance.
> SJ
>
|||Uri Dimant wrote:

> Hi
> I hope you do have a Company table where you store the names of the companies
> and their IDs.
> UPDATE Company SET [Name]='Company Z' WHERE [Name]='Company A'
> If your database design is proper you don't need to change anything else because
> the table is referenced by ID and not by NAME.
>
We have a Company table and within it columns for CompanyID and
CompanyName.
I have a couple of questions though...
i) How is it possible to 'see' what values there are for the Company
Name?
ii) Where do I run the command that you mentioned?
iii) Is it possible to actually see all the data in the database and,
if so, how can I do that? For instance, in MS Access you can view the
database and all the info in it...
Sorry, I'm not really an SQL man, so have no idea!
Thanks so much in advance...
|||> i) How is it possible to 'see' what values there are for the Company
> Name?
SELECT * FROM Table WHERE NOT EXISTS (SELECT * FROM Company C WHERE
C.CompanyID=Table.CompanyID)

> ii) Where do I run the command that you mentioned?
Query Analyzer

> iii) Is it possible to actually see all the data in the database and,
What did you mean?

> if so, how can I do that? For instance, in MS Access you can view the
> database and all the info in it...
Enterprise Manager?
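For question (i), a simpler sketch using the column names mentioned earlier in the thread (run it in Query Analyzer, or SQL Server Management Studio on SQL Server 2005):
-- List every company currently stored
SELECT CompanyID, CompanyName
FROM Company
ORDER BY CompanyName

-- Then apply the rename
UPDATE Company SET CompanyName = 'Company Z' WHERE CompanyName = 'Company A'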
<smokejo@.googlemail.com> wrote in message
news:1164110500.619811.76700@.j44g2000cwa.googlegro ups.com...
> Uri Dimant wrote:
> We have a Company table and within it columns for CompanyID and
> CompanyName.
> I have a couple of questions though...
> i) How is it possible to 'see' what values there are for the Company
> Name?
> ii) Where do I run the command that you mentioned?
> iii) Is it possible to actually see all the data in the database and,
> if so, how can I do that? For instance, in MS Access you can view the
> database and all the info in it...
> Sorry, I'm not really an SQL man, so have no idea!
> Thanks so much in advance...
>

Large SB messages cannot be dispatched ?

Hi All\Remus

I have a very weird SB problem in production, we currently have a microsoft case open but are not getting anywhere with it at the moment.

Perhaps someone out there or Remus can shed some light.

Let me give you some background: we have a Service Broker application using permanent dialogs between remote store SQL Server Express databases (initiators) that send messages on the same dialog to a central database (target), over a VERY slow network.

Let me reiterate: the network is very slow and experiences frequent timeouts. This is the nature of the network and a reason why we went with Service Broker.

Last week we released a new version of the smart client app that uses SB messaging; the only change to SB was the size of the messages: they went from 20KB to over 200KB. NOTHING else changed; existing dialogs, routes, messages, etc. are still being used.

As soon as the first new large message was sent to central it could not get there; I picked up the following error in Profiler:

This message was dropped because it could not be dispatched on time. State: 2 1 - Sequenced Message.

I have since run many tests. I skipped that message and let the normal process run; messages are successfully delivered until it comes to a large message again, then everything stops and the same error occurs again.

I can send messages up to 30KB; as soon as a message is over 50KB it never gets to the target and all I see is this error in Profiler.

Now I need someone to explain to me what happens in the background. I know SB will break big messages up into smaller pieces, but I do not know at what size this happens or how SB keeps track of the progress of a big message.

From my testing and results I am assuming that the first piece of the message is received and SB keeps checking whether packets are still being received; if SB is not receiving any packets anymore (network timeout), or a message time-to-live timeout expires before the complete message is received, this error occurs and it drops the message and tries again.

This gets me into an infinite loop: due to the slow network and message sizes we never actually get a message delivered; all I get is the error message, constantly.

Is there some sort of timeout happening if the whole message cannot be received in x seconds? Is it due to SB checking the status of the message but getting timeouts?

What I have done as a workaround is no longer putting large binary data into the messages, so they are back to the original size and everything works; smaller binary-data messages go through, so it has nothing to do with the message data. As soon as a message exceeds roughly 40KB it cannot be delivered and I get the error above.

P.S. I tried sending the same binary data from target to initiator as a test; the message never got there either, but the error was different. I got the following error:

This forwarded message was dropped because the message could not be delivered within the message time to live.

I cannot find anything in BOL or on the net about changing or configuring a message time-to-live. However, I do think this is perhaps the issue.

Then this was followed by a lot of Broker Forwarded Message Sent events from the test service, but the message never got there and I never saw the first error again, so I am still confused about that too.

Can anyone please help here ... Thanx

Send me the case number (you still have my mail, don't you?)|||

Hi Remus

the case number is SRQ070831600601.

Nope, sorry, I do not have your email anymore; I lost a laptop in the interim. However, all my contact details are in the case details, and I have sent through Profiler traces, log files etc. on the case. Please contact me or let me know if there is anything else I can do to assist.

This case has been standing still since Friday; they say they cannot find a Service Broker expert engineer currently. I really appreciate the help.

Thank You

|||This has been resolved through the Microsoft case; it requires a hotfix, to be released in October.
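In the meantime, a minimal sketch for anyone hitting the same error, assuming SQL Server 2005: sys.transmission_queue shows messages not yet acknowledged by the target, including the last transmission error text, and sys.conversation_endpoints shows the state of each dialog, which together tell you whether a large message is stuck on the sending side.

-- Run in the sending (initiator) database.
SELECT conversation_handle,
       to_service_name,
       enqueue_time,
       DATALENGTH(message_body) AS message_bytes,
       transmission_status       -- last send error; empty if none recorded yet
FROM sys.transmission_queue
ORDER BY enqueue_time;

-- Dialog states (CONVERSING, ERROR, DISCONNECTED_INBOUND, ...).
SELECT conversation_handle, state_desc, far_service
FROM sys.conversation_endpoints;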

Large Sample Database required for testing

We are creating a search engine in MS SQL using FullText etc.

We need to run some tests against a database that is close to realistic data. We have been able to generate a database of thousands of usernames, first names etc., but we can't figure out where to obtain a database full of data that contains product descriptions, product titles etc.

Does anyone have any idea where we could obtain (or even buy if need be) a large database (over 10,000 records) of sample data that we could run test queries against, so that we can optimise the queries, provide execution-time estimates etc.?

We have downloaded the AdventureWorks sample database, but there are not nearly enough records in it.

Any help would be much appreciated.

Create one on your own; there are many data generation tools out there generating either random data or data based on your own data while obfuscating it. The MS product for this is Visual Studio for Database Professionals, which has a great data generator.

HTH, Jens K. Suessmeyer.


http://www.sqlserver2005.de
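As a minimal sketch of the do-it-yourself route, assuming SQL Server 2005 (the Products table and the seed text are hypothetical): numbering a cross join of a catalog view pads a template out well past the 10,000-row mark.

-- Hypothetical table purely for FullText/query testing.
CREATE TABLE dbo.Products (
    ProductID   int IDENTITY(1, 1) PRIMARY KEY,
    Title       nvarchar(100) NOT NULL,
    Description nvarchar(400) NOT NULL
);

-- Generate 20,000 rows by numbering a cross join of a catalog view.
WITH Numbers AS (
    SELECT TOP (20000)
           ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS n
    FROM sys.all_objects AS a
    CROSS JOIN sys.all_objects AS b
)
INSERT INTO dbo.Products (Title, Description)
SELECT N'Widget model ' + CAST(n AS nvarchar(10)),
       N'Sample description for widget model ' + CAST(n AS nvarchar(10))
     + N', generated only to exercise the search queries.'
FROM Numbers;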

Large report.

My SQL script behind a report produces, say, 10,000 records or more. I
need to show,
at first, only 1st page and - this is important - to pull data from
server only for this 1st page.
In order to avoid performance bottleneck I need to avoid pulling all
data from server! When user clicks next page on client, the client
should extract data from server.
Is there a standard solution for this in SQL Server 2000 Reporting
Services?
It looks like a common situation, but I did not find a positive answer
and HOWTO in this group.
Stan|||You do not have control over this. There is a good reason why. How is
reporting services supposed to apply groups without the data?
I suggest you look at the concept of drill through using jump to report (or
jump to URL).
ms-help://MS.SQLCC.v9/MS.SQLSVR.v9.en/rptsrvr9/html/e746e8e2-8af9-4597-8476-18562a92015e.htm
Bruce Loehle-Conger
MVP SQL Server Reporting Services
<smalevanny@.hotmail.com> wrote in message
news:1142338498.985075.219970@.u72g2000cwu.googlegroups.com...
> My SQL script behind a report produces, say, 10,000 records or more. I
> need to show,
> at first, only 1st page and - this is important - to pull data from
> server only for this 1st page.
> In order to avoid performance bottleneck I need to avoid pulling all
> data from server! When user clicks next page on client, the client
> should extract data from server.
> Is there a standard solution for this in SQL Server 2000 Reporting
> Services?
> It looks like a common situation, but I did not find a positive answer
> and HOWTO in this group.
> Stan
>
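To make the drill-through idea concrete, here is a minimal sketch of a query the jump-to report could use, assuming SQL Server 2000 and a hypothetical Orders table keyed on OrderID: the report takes the last key of the previous page as a parameter, so only one page of rows ever leaves the server.

-- Hypothetical names and page size; pass @LastOrderID = 0 for the first page,
-- then the last OrderID shown to fetch the next page.
CREATE PROCEDURE dbo.GetOrdersPage
    @LastOrderID int = 0
AS
    SET NOCOUNT ON
    SELECT TOP 50 OrderID, CustomerName, OrderDate
    FROM dbo.Orders
    WHERE OrderID > @LastOrderID
    ORDER BY OrderID
GO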

Large report to fit onto 1 page

Hi,

I've browsed the forums and found other posts outlining the same issue, and the answer was no: there is no scale-to-fit option in Reporting Services. Now, a year later, I was wondering if anything had been done about this?

Is there any way to get a 20-column report to fit onto 1 page when printing?

Many thanks

Dave

Have you submitted this suggestion to https://connect.microsoft.com/SQLServer/Feedback ?

If so, let me know the URL and I will also vote on it.

There is nothing in SSRS now that allows "fit to page", but there really should be, especially for the matrix control.

BobP

Large report killing server

We have a report that needs to run 2-3 times a day that is fairly large.
The need is to have it in Excel format and it is about 20 columns by
7-8,000 rows. The data is returned by the db server fairly readily, but
when Reporting Services tries to make it an Excel file, the render
object on the web front end times out. I checked the server during this
time and the CPU usage is 100% and memory usage climbs. Is it normal
for RS to choke on a large report like this?|||Do you have SP1 installed?
Bruce Loehle-Conger
MVP SQL Server Reporting Services
"no one" <noone@.yahoo.com> wrote in message
news:420E4351.E3D5A5CA@.yahoo.com...
> We have a report that needs to run 2-3 times day that is fairly large.
> The need is to have it in Excel format and it is about 20 columns by
> 7-8,000 rows. The data is returned by the db server fairly readily, but
> when Reporting Services tried to make it an Excel file, the render
> object on the web front end timesout. I checked the server during this
> time and the CPU usage is 100% and memory usage climbs. Is it normal
> for RS to choke on a large report like this?
>|||SP1 for what? Reporting Services? SQL Server? Win2k?
"Bruce L-C [MVP]" wrote:
> Do you have SP1 installed?
> --
> Bruce Loehle-Conger
> MVP SQL Server Reporting Services
> "no one" <noone@.yahoo.com> wrote in message
> news:420E4351.E3D5A5CA@.yahoo.com...
> > We have a report that needs to run 2-3 times day that is fairly large.
> > The need is to have it in Excel format and it is about 20 columns by
> > 7-8,000 rows. The data is returned by the db server fairly readily, but
> > when Reporting Services tried to make it an Excel file, the render
> > object on the web front end timesout. I checked the server during this
> > time and the CPU usage is 100% and memory usage climbs. Is it normal
> > for RS to choke on a large report like this?
> >|||SP 1 for Reporting Services. Before digging too much more into this I would
suggest installing SP1. Note that you need to install it at the server and
wherever you develop your reports.
Bruce Loehle-Conger
MVP SQL Server Reporting Services
"no one" <noone@.yahoo.com> wrote in message
news:420FB123.1423F27A@.yahoo.com...
> SP1 for what? Reporting Services? SQL Server? Win2k?
> "Bruce L-C [MVP]" wrote:
>> Do you have SP1 installed?
>> --
>> Bruce Loehle-Conger
>> MVP SQL Server Reporting Services
>> "no one" <noone@.yahoo.com> wrote in message
>> news:420E4351.E3D5A5CA@.yahoo.com...
>> > We have a report that needs to run 2-3 times day that is fairly large.
>> > The need is to have it in Excel format and it is about 20 columns by
>> > 7-8,000 rows. The data is returned by the db server fairly readily,
>> > but
>> > when Reporting Services tried to make it an Excel file, the render
>> > object on the web front end timesout. I checked the server during this
>> > time and the CPU usage is 100% and memory usage climbs. Is it normal
>> > for RS to choke on a large report like this?
>> >
>|||The server guys tell me they installed it at most 6 weeks ago. Did SP1 come out
after that?
"Bruce L-C [MVP]" wrote:
> SP 1 for Reporting Services. Before digging too much more into this I would
> suggest installing SP1. Note that you need to install it at the server and
> wherever you develop your reports.
> --
> Bruce Loehle-Conger
> MVP SQL Server Reporting Services
> "no one" <noone@.yahoo.com> wrote in message
> news:420FB123.1423F27A@.yahoo.com...
> > SP1 for what? Reporting Services? SQL Server? Win2k?
> >
> > "Bruce L-C [MVP]" wrote:
> >
> >> Do you have SP1 installed?
> >>
> >> --
> >> Bruce Loehle-Conger
> >> MVP SQL Server Reporting Services
> >>
> >> "no one" <noone@.yahoo.com> wrote in message
> >> news:420E4351.E3D5A5CA@.yahoo.com...
> >> > We have a report that needs to run 2-3 times day that is fairly large.
> >> > The need is to have it in Excel format and it is about 20 columns by
> >> > 7-8,000 rows. The data is returned by the db server fairly readily,
> >> > but
> >> > when Reporting Services tried to make it an Excel file, the render
> >> > object on the web front end timesout. I checked the server during this
> >> > time and the CPU usage is 100% and memory usage climbs. Is it normal
> >> > for RS to choke on a large report like this?
> >> >
> >|||No, SP1 is older than 6 weeks, but it included some fixes for rendering in
Excel that sound like what your problem is.
--
Wayne Snyder, MCDBA, SQL Server MVP
Mariner, Charlotte, NC
www.mariner-usa.com
(Please respond only to the newsgroups.)
I support the Professional Association of SQL Server (PASS) and its
community of SQL Server professionals.
www.sqlpass.org
"no one" <noone@.yahoo.com> wrote in message
news:42100F1E.8F4DD308@.yahoo.com...
> The server guys tell me they installed it at most 6 weeks ago. Did SP1
> come out
> after that?
> "Bruce L-C [MVP]" wrote:
>> SP 1 for Reporting Services. Before digging too much more into this I
>> would
>> suggest installing SP1. Note that you need to install it at the server
>> and
>> wherever you develop your reports.
>> --
>> Bruce Loehle-Conger
>> MVP SQL Server Reporting Services
>> "no one" <noone@.yahoo.com> wrote in message
>> news:420FB123.1423F27A@.yahoo.com...
>> > SP1 for what? Reporting Services? SQL Server? Win2k?
>> >
>> > "Bruce L-C [MVP]" wrote:
>> >
>> >> Do you have SP1 installed?
>> >>
>> >> --
>> >> Bruce Loehle-Conger
>> >> MVP SQL Server Reporting Services
>> >>
>> >> "no one" <noone@.yahoo.com> wrote in message
>> >> news:420E4351.E3D5A5CA@.yahoo.com...
>> >> > We have a report that needs to run 2-3 times day that is fairly
>> >> > large.
>> >> > The need is to have it in Excel format and it is about 20 columns by
>> >> > 7-8,000 rows. The data is returned by the db server fairly readily,
>> >> > but
>> >> > when Reporting Services tried to make it an Excel file, the render
>> >> > object on the web front end timesout. I checked the server during
>> >> > this
>> >> > time and the CPU usage is 100% and memory usage climbs. Is it
>> >> > normal
>> >> > for RS to choke on a large report like this?
>> >> >
>> >
>|||Go to http://yourserver/Reportserver
It should show version 8.00.878.00 if you have SP1 installed.
Bruce Loehle-Conger
MVP SQL Server Reporting Services
"no one" <noone@.yahoo.com> wrote in message
news:42100F1E.8F4DD308@.yahoo.com...
> The server guys tell me they installed it at most 6 weeks ago. Did SP1
come out
> after that?
> "Bruce L-C [MVP]" wrote:
> > SP 1 for Reporting Services. Before digging too much more into this I
would
> > suggest installing SP1. Note that you need to install it at the server
and
> > wherever you develop your reports.
> >
> > --
> > Bruce Loehle-Conger
> > MVP SQL Server Reporting Services
> >
> > "no one" <noone@.yahoo.com> wrote in message
> > news:420FB123.1423F27A@.yahoo.com...
> > > SP1 for what? Reporting Services? SQL Server? Win2k?
> > >
> > > "Bruce L-C [MVP]" wrote:
> > >
> > >> Do you have SP1 installed?
> > >>
> > >> --
> > >> Bruce Loehle-Conger
> > >> MVP SQL Server Reporting Services
> > >>
> > >> "no one" <noone@.yahoo.com> wrote in message
> > >> news:420E4351.E3D5A5CA@.yahoo.com...
> > >> > We have a report that needs to run 2-3 times day that is fairly
large.
> > >> > The need is to have it in Excel format and it is about 20 columns
by
> > >> > 7-8,000 rows. The data is returned by the db server fairly
readily,
> > >> > but
> > >> > when Reporting Services tried to make it an Excel file, the render
> > >> > object on the web front end timesout. I checked the server during
this
> > >> > time and the CPU usage is 100% and memory usage climbs. Is it
normal
> > >> > for RS to choke on a large report like this?
> > >> >
> > >
>|||Bruce L-C [MVP] wrote:
> Go to http://yourserver/Reportserver
> It should show version 8.00.878.00 if you have SP1 installed.
>
Since I have control of our Dev box, I checked it as well. It did not
have SP1. I downloaded and installed. The report still chokes the
server while trying to render, with 100% CPU and memory usage climbing
above actual RAM (on this box it is only 512 MB). Looks like the SP
didn't help.|||As an experiment / workaround try rendering as CSV. If that works then that
can be loaded into Excel. It is possible that SP2 might make a difference
but I don't know.
--
Bruce Loehle-Conger
MVP SQL Server Reporting Services
"Bryon" <blape@.whittmanhart.com> wrote in message
news:O3VThA4EFHA.2756@.TK2MSFTNGP15.phx.gbl...
> Bruce L-C [MVP] wrote:
> > Go to http://yourserver/Reportserver
> >
> > It should show version 8.00.878.00 if you have SP1 installed.
> >
> >
> Since I have control of our Dev box, I checked it as well. It did not
> have SP1. I downloaded and installed. The report still chokes the
> server while trying to render with 100% CPU and memory usage climbing
> about actual RAM (on this box it is only 512 MB). Looks like the SP
> didn't help.|||Bruce L-C [MVP] wrote:
> As an experiment / workaround try rendering as CSV. If that works then that
> can be loaded into Excel. It is possible that SP2 might make a difference
> but I don't know.
>
sp2?|||SP2 is supposed to be release before too long (Feb/March). It is always
possible that they have put in some more Excel rendering improvements.
--
Bruce Loehle-Conger
MVP SQL Server Reporting Services
"Bryon" <blape@.whittmanhart.com> wrote in message
news:eOTLcE5EFHA.2456@.TK2MSFTNGP10.phx.gbl...
> Bruce L-C [MVP] wrote:
> > As an experiment / workaround try rendering as CSV. If that works then
that
> > can be loaded into Excel. It is possible that SP2 might make a
difference
> > but I don't know.
> >
> sp2?
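If the CSV experiment works out, the export can also be requested directly through URL access, bypassing the report viewer; a sketch, with the server, folder and report names hypothetical:

http://yourserver/ReportServer?/SalesReports/LargeExport&rs:Command=Render&rs:Format=CSV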

Large report export to pdf = corruption error

Hi,
I have a report which can produce 220 pages if the user selects "all" in my
multiple-choice list.
If the user tries to export this report, the PDF file is corrupted.
The same report with only a small number of items selected instead of
"all" (so around 1 to 20 pages) is correct.
The export works fine.
any idea why?
Thanks.
Jerome.On Mar 7, 10:54 am, "Jeje" <willg...@.hotmail.com> wrote:
> Hi,
> I have a report which can produce 220 pages if the user select "all" in my
> multiple choice list.
> if the user try to export this report the PDF file is corrupted.
> the same report with only a small amount of item selected instead-of the
> "all" (so around 1 to 20 pages) is correct.
> The export works fine.
> any idea why?
> Thanks.
> Jerome.
I've never experienced this issue; however, it sounds like a rendering
problem w/the quantity of data being exported to PDF. Have you tried
splitting up the results via the report query/stored procedure to
obtain a manageable amount of exported data? This might be a viable
solution.
Regards,
Enrique Martinez
Sr. SQL Server Developer|||I'm trying to create another report with fewer pages and a better query.
If this doesn't work, we'll change the prompt options to restrict the user
to one value only.
"EMartinez" <emartinez.pr1@.gmail.com> wrote in message
news:1173361141.038474.105280@.n33g2000cwc.googlegroups.com...
> On Mar 7, 10:54 am, "Jeje" <willg...@.hotmail.com> wrote:
>> Hi,
>> I have a report which can produce 220 pages if the user select "all" in
>> my
>> multiple choice list.
>> if the user try to export this report the PDF file is corrupted.
>> the same report with only a small amount of item selected instead-of the
>> "all" (so around 1 to 20 pages) is correct.
>> The export works fine.
>> any idea why?
>> Thanks.
>> Jerome.
>
> I've never experience this issue; however, it sounds like a rendering
> problem w/the quantity of data being exported to PDF. Have you tried
> splitting up the results via the report query/stored procedure to
> obtain a manageable amount of exported data? This might be a viable
> solution.
> Regards,
> Enrique Martinez
> Sr. SQL Server Developer
>

Large Report Dilemma...Need Suggestions

We have a database that has GUIDs that we need to use for looking up records
in a database and displaying a report. Problem is that the user can select
many records to be displayed in the report and we cannot use URL access as
the querystring will be too long. We tried web services, but the maximum
length for variable declaration is 4000 for a varchar variable in a stored
procedure and this can be exceeded also. So question is how can I get around
this and run very large reports? The GUIDs have to stay...I've already lost
that battle. How to get around this and any alternatives will be greatly
appreciated.|||Just to understand your problem.
You have GUID as one of the columns; you use multi-select from RS to select
multiple GUIDs, and once you click View Report it should display some report. Is
this what you are looking for?
If yes, then 4096 is the limitation for the multi-select option, so you cannot
select more than that.
I have used a small query to do this and it worked perfectly OK, of course through
RM. But you said URL; why do you require URL access when you can run from RM?
Is your requirement through URL?
Of course you can use web services, using arrays, and pass the parameters
loaded with GUIDs to the query / proc etc.
One more thing: if you are using SQL Server 2005 there is a new varchar type
called
nvarchar(max); it can go up to 2 GB.
So just let me know if my understanding is correct, so that I can suggest
something.
Amarnath
"Wannabe" wrote:
> We have a database that has GUIDs that we need to use for looking up records
> in a database and displaying a report. Problem is that the user can select
> many records to be displayed in the report and we cannot use URL access as
> the querystring will be too long. We tried web services, but the maximum
> length for variable declaration is 4000 for a varchar variable in a stored
> procedure and this can be exceeded also. So question is how can I get around
> this and run very large reports? The GUIDs have to stay...I've already lost
> that battle. How to get around this and any alternatives will be greatly
> appreciated.|||Thanks for the reply...
The user will not be using RM to run the report. They will use our web
application, which has a datagrid, where they will select however many
records they want a report on (could be hundreds.) Then we will pass the GUID
of each record selected to RS for displaying. Also, we are currently using
SQL 2000, but do have plans to upgrade in about three months.
When I tried web services, with SQL 2000 the maximum length for a varchar variable is
4000 characters, so my parameter list got truncated when I built the WHERE
clause using an IN statement.
"Amarnath" wrote:
> Just to understand your problem.
> You have GUID as one of the column, you use multiselect from RS to select
> multiple GUIDs, once click on view report it should display some report. is
> this what you are looking for ?
> If yes, then 4096 is the limitation for multi-select option so you cannot
> select more than that.
> I have used a small query to do this it worked perfectly ok ofcourse through
> RM. But you said URL, why you require URL when you can run from RM,
> Is your requirement is through URL ?
> Ofcourse you can use web services, using arrays and pass the parameters
> loaded with GUID to the query / proc etc..
> one more thing is if you are using SQL SERVER 2005 there is a new varchar
> called
> nvarchar(max) it can go upto 2 GB.
> So just let know is my understanding is correct ? so that i can suggest
> something.
> Amarnath
>
> "Wannabe" wrote:
> > We have a database that has GUIDs that we need to use for looking up records
> > in a database and displaying a report. Problem is that the user can select
> > many records to be displayed in the report and we cannot use URL access as
> > the querystring will be too long. We tried web services, but the maximum
> > length for variable declaration is 4000 for a varchar variable in a stored
> > procedure and this can be exceeded also. So question is how can I get around
> > this and run very large reports? The GUIDs have to stay...I've already lost
> > that battle. How to get around this and any alternatives will be greatly
> > appreciated.|||If you are writing custom code using asp.net then it will be a tricky way,
what you can do is take the input to a temp table and in your query use
subquery to execte it.
some thing like select ... from ... where xyz in (select guid from #temp)
Amarnath
"Wannabe" wrote:
> Thanks for the reply...
> The user will not be using RM to run the report. They will use our web
> application, which has a datagrid, where they will select however many
> records they want a report on (could be hundreds.) Then we will pass the GUID
> of each record selected to RS for displaying. Also, we are currently using
> SQL 2000, but do have plans to upgrade in about three months.
> When I tried web services, in SQL 2000, the maximum value for a varchar is
> 4000 characters, so my parameter list got truncated when I built the where
> clause using an IN statement.
> "Amarnath" wrote:
> > Just to understand your problem.
> > You have GUID as one of the column, you use multiselect from RS to select
> > multiple GUIDs, once click on view report it should display some report. is
> > this what you are looking for ?
> >
> > If yes, then 4096 is the limitation for multi-select option so you cannot
> > select more than that.
> > I have used a small query to do this it worked perfectly ok ofcourse through
> > RM. But you said URL, why you require URL when you can run from RM,
> > Is your requirement is through URL ?
> >
> > Ofcourse you can use web services, using arrays and pass the parameters
> > loaded with GUID to the query / proc etc..
> >
> > one more thing is if you are using SQL SERVER 2005 there is a new varchar
> > called
> > nvarchar(max) it can go upto 2 GB.
> >
> > So just let know is my understanding is correct ? so that i can suggest
> > something.
> >
> > Amarnath
> >
> >
> > "Wannabe" wrote:
> >
> > > We have a database that has GUIDs that we need to use for looking up records
> > > in a database and displaying a report. Problem is that the user can select
> > > many records to be displayed in the report and we cannot use URL access as
> > > the querystring will be too long. We tried web services, but the maximum
> > > length for variable declaration is 4000 for a varchar variable in a stored
> > > procedure and this can be exceeded also. So question is how can I get around
> > > this and run very large reports? The GUIDs have to stay...I've already lost
> > > that battle. How to get around this and any alternatives will be greatly
> > > appreciated.|||Thanks...I'll try that. I believe that will help.
"Amarnath" wrote:
> If you are writing custom code using asp.net then it will be a tricky way,
> what you can do is take the input to a temp table and in your query use
> subquery to execte it.
> some thing like select ... from ... where xyz in (select guid from #temp)
> Amarnath
>
> "Wannabe" wrote:
> > Thanks for the reply...
> >
> > The user will not be using RM to run the report. They will use our web
> > application, which has a datagrid, where they will select however many
> > records they want a report on (could be hundreds.) Then we will pass the GUID
> > of each record selected to RS for displaying. Also, we are currently using
> > SQL 2000, but do have plans to upgrade in about three months.
> >
> > When I tried web services, in SQL 2000, the maximum value for a varchar is
> > 4000 characters, so my parameter list got truncated when I built the where
> > clause using an IN statement.
> >
> > "Amarnath" wrote:
> >
> > > Just to understand your problem.
> > > You have GUID as one of the column, you use multiselect from RS to select
> > > multiple GUIDs, once click on view report it should display some report. is
> > > this what you are looking for ?
> > >
> > > If yes, then 4096 is the limitation for multi-select option so you cannot
> > > select more than that.
> > > I have used a small query to do this it worked perfectly ok ofcourse through
> > > RM. But you said URL, why you require URL when you can run from RM,
> > > Is your requirement is through URL ?
> > >
> > > Ofcourse you can use web services, using arrays and pass the parameters
> > > loaded with GUID to the query / proc etc..
> > >
> > > one more thing is if you are using SQL SERVER 2005 there is a new varchar
> > > called
> > > nvarchar(max) it can go upto 2 GB.
> > >
> > > So just let know is my understanding is correct ? so that i can suggest
> > > something.
> > >
> > > Amarnath
> > >
> > >
> > > "Wannabe" wrote:
> > >
> > > > We have a database that has GUIDs that we need to use for looking up records
> > > > in a database and displaying a report. Problem is that the user can select
> > > > many records to be displayed in the report and we cannot use URL access as
> > > > the querystring will be too long. We tried web services, but the maximum
> > > > length for variable declaration is 4000 for a varchar variable in a stored
> > > > procedure and this can be exceeded also. So question is how can I get around
> > > > this and run very large reports? The GUIDs have to stay...I've already lost
> > > > that battle. How to get around this and any alternatives will be greatly
> > > > appreciated.
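A minimal sketch of the temp-table idea for SQL 2000, which has no built-in string splitter: the web app passes the selected GUIDs as a single comma-separated varchar parameter, and the procedure walks it with CHARINDEX into a temp table before joining. All names are hypothetical, and a varchar(8000) parameter still caps the list at roughly 200 GUIDs per call, so very large selections would need to be sent in batches.

CREATE PROCEDURE dbo.ReportBySelectedGuids
    @GuidList varchar(8000)   -- comma-separated uniqueidentifier values
AS
    SET NOCOUNT ON
    CREATE TABLE #SelectedGuids (RecordGuid uniqueidentifier PRIMARY KEY)

    DECLARE @pos int, @item varchar(50)
    WHILE LEN(@GuidList) > 0
    BEGIN
        -- Peel off the next comma-delimited token.
        SET @pos = CHARINDEX(',', @GuidList)
        IF @pos = 0 SET @pos = LEN(@GuidList) + 1
        SET @item = LTRIM(RTRIM(LEFT(@GuidList, @pos - 1)))
        IF LEN(@item) > 0
            INSERT INTO #SelectedGuids (RecordGuid)
            VALUES (CAST(@item AS uniqueidentifier))
        SET @GuidList = SUBSTRING(@GuidList, @pos + 1, LEN(@GuidList))
    END

    SELECT r.*
    FROM dbo.Records AS r
    JOIN #SelectedGuids AS s ON s.RecordGuid = r.RecordGuid
GO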

Large query, slow DB responses

Hello,
I'm not a DBA and only use SQL to write the odd stored procedure etc,
so please be patient with me.
We have a, very badly structured, DB which misbehaves every now and
then. We are in the process of re-writing the .NET code and some sps
but until that's done we have to deal with the following problem.
Our site is busiest over the weekend. It's not a big site (usual
number of DB connections over the weekend is ~200). The worst thing is
that the backoffice uses the same DB as the users at home. When a
backoffice query is run during those busy times, it will strain the DB
so much that no one else will be able to access the site.
So, my question is: is there anything within SQL that I can see which
query (or queries) is causing this? I would ultimately like to write an
app the support guys could use to find these things out, but it'd be
nice to know where to start :)
Thank you for your time
KS
KS
Run SQL Server Profiler to identify long-running queries. Once you have
identified them, look at the execution plan of each query.
Does the optimizer use indexes defined on the table?
"KS" <kiki@.dyky.co.uk> wrote in message
news:1143361090.378310.151760@.e56g2000cwe.googlegr oups.com...
Hello,
I'm not a DBA and only use SQL to write the odd stored procedure etc,
so please be patient with me.
We have a, very badly structured, DB which misbehaves every now and
then. We are in the process of re-writing the .NET code and some sps
but until that's done we have to deal with the following problem.
Our site is busiest over the weekend. It's not a big site (usual
number of DB connections over the weekend is ~200). The worse thing is
that the backoffice uses the same DB as the users at home. When a
backoffice query is run during those busy times, it will strain the DB
so much that no one else will be able to access the site.
So, my question is: is there anything within SQL that I can see which
query (or queries) is causing this? I would ultimately like to write an
app the support guys could use to find these things out, but it'd be
nice to know where to start ?
Thank you for your time
KS
|||Thank you for your response, Uri. I did think of using a SQL trace to
see the duration of each execution, but will this add any more
strain? Is it ok to have the trace running the whole day?
Thanks again
|||If you have multiple/hyperthreaded processors, consider changing the 'max
degree of parallelism' configuration option to '1' until you are able to
correct the underlying problem. This will prevent SQL Server from
generating a parallel plan for your BO users that can monopolize the
processors.
Hope this helps.
Dan Guzman
SQL Server MVP
"KS" <kiki@.dyky.co.uk> wrote in message
news:1143361090.378310.151760@.e56g2000cwe.googlegr oups.com...
Hello,
I'm not a DBA and only use SQL to write the odd stored procedure etc,
so please be patient with me.
We have a, very badly structured, DB which misbehaves every now and
then. We are in the process of re-writing the .NET code and some sps
but until that's done we have to deal with the following problem.
Our site is busiest over the weekend. It's not a big site (usual
number of DB connections over the weekend is ~200). The worse thing is
that the backoffice uses the same DB as the users at home. When a
backoffice query is run during those busy times, it will strain the DB
so much that no one else will be able to access the site.
So, my question is: is there anything within SQL that I can see which
query (or queries) is causing this? I would ultimately like to write an
app the support guys could use to find these things out, but it'd be
nice to know where to start ?
Thank you for your time
KS
|||KS
> strain? Is it ok to have the trace running the whole day?
Yes, just make sure that you create the trace on the client side and not on the server
side.
"KS" <kiki@.dyky.co.uk> wrote in message
news:1143376377.890485.156100@.z34g2000cwc.googlegr oups.com...
> Thank you for your response Uri. I did think of using a sql trace to
> see the duration of each execution, but will this put on any more
> strain? Is it ok to have the trace running the whole day?
> Thanks again
>
|||Thank you both
|||Hi
SQL Profiler will allow you to determine which sps are taking the longest
durations, but will need the procedure to complete before giving you the
timings. If you are actually seeing blocking then sp_who2 will display
blocked processes or check out http://support.microsoft.com/kb/224453 and the
sp_blocker_pss80 script. If you are looking for a third party monitoring tool
then check out the Quest offerings http://www.quest.com/sql_server/index.asp
John
"KS" wrote:

> Hello,
> I'm not a DBA and only use SQL to write the odd stored procedure etc,
> so please be patient with me.
> We have a, very badly structured, DB which misbehaves every now and
> then. We are in the process of re-writing the .NET code and some sps
> but until that's done we have to deal with the following problem.
> Our site is busiest over the weekend. It's not a big site (usual
> number of DB connections over the weekend is ~200). The worse thing is
> that the backoffice uses the same DB as the users at home. When a
> backoffice query is run during those busy times, it will strain the DB
> so much that no one else will be able to access the site.
> So, my question is: is there anything within SQL that I can see which
> query (or queries) is causing this? I would ultimately like to write an
> app the support guys could use to find these things out, but it'd be
> nice to know where to start ?
> Thank you for your time
> KS
>
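A minimal sketch of Dan's MAXDOP suggestion, assuming sysadmin rights: 'max degree of parallelism' is an advanced option, so 'show advanced options' has to be enabled first, and the change takes effect immediately without a restart.

-- Cap each query at one processor until the underlying problem is fixed.
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'max degree of parallelism', 1
RECONFIGURE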