Sunday, February 26, 2012

Export to Excel failed with System.OutOfMemoryException

Server: SQL Server 2000 Enterprise running on Windows 2000 Advanced Server, Enterprise Edition of Reporting
Services + SP1
My workstation: Windows 2000 SP4, Microsoft Development Environment v7.1.3088,
Reporting Services Enterprise Edition + SP1.
I am trying to export 55,804 rows to Excel. The table that I am exporting
from has 11 columns.
Columns within the results table:

  Column            DataType   Length
  ----------------  ---------  ------
  Source            varchar    2
  ClaimNumber       varchar    17
  MarketLoc         varchar    7
  Description       varchar    50
  SomeDate          datetime   8
  DetailNo          smallint   2
  ChargeCode        varchar    1
  UnitCount         smallint   2
  ChargedDollars    decimal    5
  AllowedDollars    decimal    5
  NetPaidDollars    decimal    5
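For reference, a CREATE TABLE sketch reconstructed from the column list above; only the storage lengths are listed, so the decimal precision/scale and the nullability here are guesses:

-- Hypothetical reconstruction of the results table, based only on the column
-- list above.  decimal(9,2) is consistent with the 5-byte length shown, but
-- the actual precision/scale and nullability are assumptions.
CREATE TABLE results
(
    Source          varchar(2)    NOT NULL,
    ClaimNumber     varchar(17)   NOT NULL,
    MarketLoc       varchar(7)    NOT NULL,
    Description     varchar(50)   NULL,
    SomeDate        datetime      NULL,
    DetailNo        smallint      NULL,
    ChargeCode      varchar(1)    NULL,
    UnitCount       smallint      NULL,
    ChargedDollars  decimal(9,2)  NULL,
    AllowedDollars  decimal(9,2)  NULL,
    NetPaidDollars  decimal(9,2)  NULL
)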
Because I need to dump this data for 17 different market locations, with
rowcounts ranging from 600 to 56,000 rows, I figured that I would just create
a report within Reporting Services and query the results table (228,825 rows)
for each of the 17 market locations. I would export each market to an Excel
file and then copy/paste the worksheets into one file that I can send to the
end user. This would be easier than selecting the information within Query
Analyzer, saving to a file, importing each file, formatting each column and
repeating the process for each of the 17 groups of data.
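A rough sketch of the kind of per-market dataset query such a report would use follows; the parameter name @MarketLoc and the ORDER BY are assumptions, not taken from the actual report:

-- Sketch of the report's dataset query, run once per market location.
-- @MarketLoc is an assumed report parameter name.
SELECT Source, ClaimNumber, MarketLoc, Description, SomeDate,
       DetailNo, ChargeCode, UnitCount,
       ChargedDollars, AllowedDollars, NetPaidDollars
FROM   results
WHERE  MarketLoc = @MarketLoc
ORDER  BY ClaimNumber, DetailNo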
I started out with a small data set (1500 rows) as a proof of concept so
that I knew the process would work. When I had the report formatted
correctly I successfully exported the data to Excel. Then I moved on to one
of the larger groups (55,804 rows). This is where I had the problems.
I previewed the report within Visual Studio. Total page count: 1,241. The
.rdl.data file is 12MB. I clicked on the disk icon on the toolbar and chose
to export the report to Excel. I have 768MB of RAM in my machine and the
memory usage continued to rise until it reached 1GB. This process took over
20 minutes. At the end of it all I received a dialog box indicating a
System.OutOfMemoryException was thrown. What would happen if I tried
running this on the server? I am guessing that it would consume lots of CPU
cycles and eat up lots of memory. Would it finish, or would it fail? I am
afraid to try :)
Why is the .rdl.data file only 12MB, yet over 1GB of memory was consumed while
converting to Excel? When I save the results to a [text] file within Query
Analyzer the file is 12MB. I could import that into Excel, but then I have to
import x files and format y columns x times.
This should be fairly easy to repro. Does anyone from Microsoft want to
take a look at it? Does anyone else have any ideas?
--
Keith

Bruce Johnson [MSFT] replied:

Keith,
The only recommendation I have at this time is to increase the RAM on the
machine you are using to export to Excel. I do not have a specific
recommendation, but I would suggest a minimum of 1GB of RAM. The SQL Server
2000 Reporting Services team is looking at addressing this issue in upcoming
service packs and releases.
--
Bruce Johnson [MSFT]
Microsoft SQL Server Reporting Services
This posting is provided "AS IS" with no warranties, and confers no rights.
"Keith Kratochvil" <sqlguy.back2u@.comcast.net> wrote in message
news:eo8Ko5qXEHA.2868@.TK2MSFTNGP09.phx.gbl...
> Server: SQL Server 2000 Ent running on Win2k AS, Ent Edition of Reporting
> Services + SP1
> my workstation: Win2k SP4, Microsoft Development Environment v7.1.3088,
> Reporting Services Ent Ed + SP1.
> I am trying to export 55,804 rows to Excel. The table that I am exporting
> from has 11 columns.
> -- Columns within >results< -- DataType Length
> --- -- --
> Source varchar 2
> ClaimNumber varchar 17
> MarketLoc varchar 7
> Description varchar 50
> SomeDate datetime 8
> DetailNo smallint 2
> ChargeCode varchar 1
> UnitCount smallint 2
> ChargedDollars decimal 5
> AllowedDollars decimal 5
> NetPaidDollars decimal 5
> Because I need to dump this data for 17 different market locations ranging
> in rowcounts of 600 rows to 56,000 rows I figured that I would just create
a
> report within Reporting Services and query the results table (228,825
rows)
> table for each of the 17 market locations. I would export each market to
an
> Excel file and then I would copy/paste the worksheets into one file that I
> can send to the end user. This would be easier than selecting the
> information within Query Analyzer, saving to a file, importing each file,
> formatting each column and repeating the process for each of the 17 groups
> of data.
> I started out with a small data set (1500 rows) as a proof of concept so
> that I knew the process would work. When I had the report formatted
> correctly I successfully exported the data to Excel. Then I moved on to
one
> of the larger groups (55,804 rows). This is where I had the problems.
> I previewed the report within Visual Studio. Total page count 1241. The
> rdl.data file is 12MB. I clicked on the disk icon on the toolbar and
chose
> to export the report to Excel. I have 768MB of RAM in my machine and the
> memory usage continued to rise until it reached 1GB. This process took
over
> 20 minutes. At the end of it all I received a dialog box indicating a
> System.OutOfMemoryException was thrown. What would happen if I tried
> running this on the server? I am guessing that it would consume lots of
CPU
> cycles and eat up lots of memory. Would it finish, or would it fail? I
am
> afraid to try :)
> Why is the .rdl.data only 12MB yet over 1GB was consumed while converting
to
> Excel? When I save the results to a [text] file within Query Analyzer the
> file is 12MB. I could import into Excel, but then I have to import x
files
> and format y columns x times.
> This should be fairly easy to repro. Does anyone from Microsoft want to
> take a look at it? Does anyone else have any ideas?
> --
> Keith
Keith replied:

Bruce, thanks for your reply. It is unfortunate that 768MB is not enough
memory to export ~12MB of data to Excel. I am glad that Microsoft enhanced
the versions of Excel that are supported with RS SP1. Now I guess I will
have to wait for SP2 (or SQL Server 2005) for memory optimization.
In the end I got the data into Excel via the "Get External Data | From a
database query" menu. Yeah, I had to do it 17 times, but it worked. That
method saved me the pain of having to perform a "Data | Text to Columns"
and format each column.
Keep up the good work!
--
Keith
"Bruce Johnson [MSFT]" <brucejoh@.online.microsoft.com> wrote in message
news:OmRdOCsXEHA.3664@.TK2MSFTNGP12.phx.gbl...
> Keith,
> The only recommendation I have at this time is to increase the RAM on the
> machine you are using to export to excel. I do not have a specific
> recommendation, but I would suggest a minimum of 1GB of RAM. The SQL
Server
> 2000 Reporting Services team is looking at addressing this issue in
upcoming
> service packs and releases.
> --
> Bruce Johnson [MSFT]
> Microsoft SQL Server Reporting Services
> This posting is provided "AS IS" with no warranties, and confers no
rights.
>
> "Keith Kratochvil" <sqlguy.back2u@.comcast.net> wrote in message
> news:eo8Ko5qXEHA.2868@.TK2MSFTNGP09.phx.gbl...
> > Server: SQL Server 2000 Ent running on Win2k AS, Ent Edition of
Reporting
> > Services + SP1
> > my workstation: Win2k SP4, Microsoft Development Environment v7.1.3088,
> > Reporting Services Ent Ed + SP1.
> >
> > I am trying to export 55,804 rows to Excel. The table that I am
exporting
> > from has 11 columns.
> >
> > -- Columns within >results< -- DataType Length
> > --- -- --
> > Source varchar 2
> > ClaimNumber varchar 17
> > MarketLoc varchar 7
> > Description varchar 50
> > SomeDate datetime 8
> > DetailNo smallint 2
> > ChargeCode varchar 1
> > UnitCount smallint 2
> > ChargedDollars decimal 5
> > AllowedDollars decimal 5
> > NetPaidDollars decimal 5
> >
> > Because I need to dump this data for 17 different market locations
ranging
> > in rowcounts of 600 rows to 56,000 rows I figured that I would just
create
> a
> > report within Reporting Services and query the results table (228,825
> rows)
> > table for each of the 17 market locations. I would export each market
to
> an
> > Excel file and then I would copy/paste the worksheets into one file that
I
> > can send to the end user. This would be easier than selecting the
> > information within Query Analyzer, saving to a file, importing each
file,
> > formatting each column and repeating the process for each of the 17
groups
> > of data.
> >
> > I started out with a small data set (1500 rows) as a proof of concept so
> > that I knew the process would work. When I had the report formatted
> > correctly I successfully exported the data to Excel. Then I moved on to
> one
> > of the larger groups (55,804 rows). This is where I had the problems.
> >
> > I previewed the report within Visual Studio. Total page count 1241.
The
> > rdl.data file is 12MB. I clicked on the disk icon on the toolbar and
> chose
> > to export the report to Excel. I have 768MB of RAM in my machine and
the
> > memory usage continued to rise until it reached 1GB. This process took
> over
> > 20 minutes. At the end of it all I received a dialog box indicating a
> > System.OutOfMemoryException was thrown. What would happen if I tried
> > running this on the server? I am guessing that it would consume lots of
> CPU
> > cycles and eat up lots of memory. Would it finish, or would it fail? I
> am
> > afraid to try :)
> >
> > Why is the .rdl.data only 12MB yet over 1GB was consumed while
converting
> to
> > Excel? When I save the results to a [text] file within Query Analyzer
the
> > file is 12MB. I could import into Excel, but then I have to import x
> files
> > and format y columns x times.
> >
> > This should be fairly easy to repro. Does anyone from Microsoft want to
> > take a look at it? Does anyone else have any ideas?
> >
> > --
> > Keith
> >
> >
>
