
Tuesday, March 27, 2012

Exporting DTS jobs?

SQL Server 2000.

Hey there,

I'm trying to set up a test environment. My production environment has a bunch of Scheduled Jobs which run either stored procedures or DTS jobs.

I've exported the SPs and the Scheduled Jobs, but I'm wondering if there is any way to migrate the DTS jobs?

Cheers...|||Open DTS and select "Save As" from the "Package" menu. Select "Structured Storage File" in the Location drop-down, provide a name, etc., and save.

On the destination machine, right-click on Data Transformation Services in Enterprise Manager and select "Open Package". Save it with Location = SQL Server.|||Me being the lazy bum that I am, I'd use DTSBackup (http://www.sqldts.com/default.aspx?242). I've got developers that really like DTS though, so I get an infernal number of packages on some of my servers.

-PatP|||Me being the lazy bum that I am, ...
-PatP

I think I'm maybe lazier. I created a script that saves my DTS packages every night to a remote (off-server) location:

' **********************************************************************
' * SaveDTS.vbs
' * Hugh Scott
' * 2005/06/28
' *
' * This script logs into a SQL Server, locates all DTS packages and then
' * saves each package as a file to a specified location.
' *
' * Parameters:
' * IN: None
' * OUT: None
' *
' * Usage:
' *
' *
' * Modification History:
' * Date        Initials  Modification
' * 2005/06/28  HMS       Original
' *
' **********************************************************************
Dim oDTS
Dim oConn
Dim oRS
Dim sSQL
Dim iPackages
Dim sServer
Dim sUser
Dim sPassword
Dim bTrusted
Dim sTarget
Dim sName
Dim sFileName
Const DTSSQLStgFlag_UseTrustedConnection = 256

bTrusted = True
sServer = "<set server name here>"          ' placeholder: your SQL Server name
sTarget = "<set backup destination here>"   ' placeholder: folder path, ending in a backslash

Set oConn = CreateObject("ADODB.Connection")
Set oRS = CreateObject("ADODB.Recordset")

If bTrusted Then
oConn.ConnectionString = "Provider=SQLOLEDB.1;Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=msdb;Data Source=" & sServer
Else
oConn.ConnectionString = "Provider=SQLOLEDB.1;UID=" & sUser & ";Password=" & sPassword & ";Persist Security Info=False;Initial Catalog=msdb;Data Source=" & sServer
End if
oConn.Open

sSQL = "SELECT DISTINCT name FROM sysdtspackages"

Set oRS = oConn.Execute(sSQL, iPackages)

While Not oRS.EOF

Set oDTS = CreateObject("DTS.Package")
sName = oRS("name")
' Load the package from msdb, then write it out as a structured storage file
oDTS.LoadFromSQLServer sServer, , , DTSSQLStgFlag_UseTrustedConnection, , , , sName
sFileName = Replace(sName, ":", "")   ' strip characters that are illegal in file names
oDTS.SaveToStorageFile sTarget & sFileName & ".dts"
Set oDTS = Nothing
oRS.MoveNext
Wend

Set oRS = Nothing
oConn.Close
Set oConn = Nothing

Schedule it as a job (set the task type to VB Script). It ensures that you always have a backup of your DTS packages when it comes time to restore the db server.
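The nice thing is that the restore direction can be scripted the same way. Here is a minimal sketch of the reverse trip (untested; the server name and file path are placeholders):

' Load a saved .dts file and push it into the destination server's msdb
Dim oPkg
Set oPkg = CreateObject("DTS.Package")
oPkg.LoadFromStorageFile "\\backupserver\dts\MyPackage.dts", ""   ' "" = no package password
oPkg.SaveToSQLServer "<destination server>", "", "", 256          ' 256 = trusted connection
Set oPkg = Nothing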

Regards,

hmscott|||Nice idea. I may implement this.|||Awesome help guys, thank you all so much! :D|||Open DTS and select "Save As" from the "Package" menu. Select "Structured Storage File" in the Location drop-down, provide a name, etc., and save.

On the destination machine, right-click on Data Transformation Services in Enterprise Manager and select "Open Package". Save it with Location = SQL Server.

I tried HMSCOTT's code, and it yielded the same results as everything else

it is unable to back up my packages correctly; they get restored with too many incorrect settings

the only option I have gotten to work is doing a manual "save as", which is explained above|||... they get restored with too many incorrect settings ...

Can you explain/elaborate? What settings are incorrect?

Be aware that DTS is very environment dependent. You would have to make certain that:

1. Any drivers that a DTS package uses must be the same on the new server as on the old server.

2. Any custom DTS tasks must be present on the new server, just as on the old server.

3. External paths, links and certain files must be identical (for example, an Oracle TNSNames.ora file, or a UNC path); see the sketch after this list.

4. In some cases security settings will have an impact.
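
For #3, a quick pre-flight check from the new server can save some head-scratching. A trivial sketch (the paths are just examples; substitute whatever your packages actually reference):

Dim oFSO
Set oFSO = CreateObject("Scripting.FileSystemObject")
' Check an example Oracle client file and an example UNC path
If Not oFSO.FileExists("C:\oracle\network\admin\TNSNames.ora") Then
    WScript.Echo "TNSNames.ora not found"
End If
If Not oFSO.FolderExists("\\fileserver\exports") Then
    WScript.Echo "UNC path not reachable"
End If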

Regards,

hmscott|||here is a description of the problem:

if I open this particular DTS package and save it to file it is 511B, but if I use any backup program or script (i.e. your script, DTSBackup, a script I wrote, etc...) it is 472B.

when I open both files in Enterprise Manager, the 511B file yields the correct package (obviously for migration I would have to change settings, but no big deal as long as it's the same structure)

the 472B package is completely different; I don't even know how it thinks the translation even comes close. it messes up some success/failure criteria and creates an additional connection??|||hmm, okay, I can't address most of your specific issues. I can only comment on my own experiences and environment.

I just opened up all of the package "structured" files that I have. I opened them from a different server than the one on which they normally reside.

I found the following:
1. In all but one case, the packages restored just fine with no errors.
2. The comments that I had created using the textbox feature were all missing (ugh). Also, in some cases, the objects were shifted on the designer window (moving from the left side of the screen to the right and vice versa).
3. In one case, only two VBScript tasks showed in the designer window (out of maybe 40 tasks). I checked the disconnected edit properties, though, and found everything to be present.
4. When I executed the package (the one with the clear designer window), everything executed in the order in which I would expect it.

I am running SQL Server 2000 with SP4 (patched with hotfix 2040).

I will do some more research at work tomorrow. I'm glad you pointed this out; I exercised the recovery process once last year, but I haven't looked at this in a while.

Regards,

hmscott|||one thing to check would be to look at the size of your structured files for your backup program versus a file that you saved manually

maybe do this for the most complicated packages to see if there is an issue
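
you could even script the comparison. a quick sketch with made-up paths:

Dim oFSO
Set oFSO = CreateObject("Scripting.FileSystemObject")
' Compare a manually saved package against the scripted copy of the same package
WScript.Echo "manual:   " & oFSO.GetFile("C:\DTS\manual\MyPackage.dts").Size & " bytes"
WScript.Echo "scripted: " & oFSO.GetFile("C:\DTS\scripted\MyPackage.dts").Size & " bytes"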

it might just be that my packages are too complicated for a backup program to account for|||See, one night, long, long ago, all of the SQL Server developers fell asleep, and the Access developers snuck in...voila...DTS

Here's a pleasant thought...there is no DTS in 2k5|||hahaha... that is hilarious

what does 05 use to automate external data flows?|||http://www.microsoft.com/sql/prodinfo/overview/whats-new-in-sqlserver2005.mspx

Look for Integration Services...they don't even mention DTS, nor do they mention how (or if) it's possible to migrate existing packages.|||See, one night, long, long ago, all of the SQL Server developers fell asleep, and the Access developers snuck in...voila...DTS

Here's a pleasant thought...there is no DTS in 2k5

Yep, with 2k5, they shot all of the Access developers and brought in the BizTalk developers. Arrrrgh!!!

On the serious side, I figured out what I had done wrong in the earlier example (the one package that would not restore properly): I had a custom task in the package and had not registered the custom task on the server to which I restored the package. Once I took care of that little task, the package opened up just fine. My file sizes are also different between the programmatic save and the manual save. I think that is because the programmatic method saves version information in addition to the structure. When I go to open the package, I see a list of possible entries to open, starting with the most recently saved version.
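
If you want to see which versions are stored for a given package, the DTS object model can enumerate them. A sketch from memory (untested; the server name is a placeholder, and I believe passing False for the second argument returns every stored version rather than just the latest):

Dim oApp, oPkgSQL, oInfos, oInfo
Set oApp = CreateObject("DTS.Application")
Set oPkgSQL = oApp.GetPackageSQLServer("<server>", "", "", 256)   ' 256 = trusted connection
Set oInfos = oPkgSQL.EnumPackageInfos("", False, "")
Set oInfo = oInfos.Next
While Not oInfos.EOF
    WScript.Echo oInfo.Name & "  " & oInfo.VersionID
    Set oInfo = oInfos.Next
Wend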

Again, I can't comment or help on your specific scenario; I'm glad for the opportunity to have checked out the script and verified the restore process. It seems to work fine for me.

:shrug:

Regards,

hmscott

Sunday, February 26, 2012

Export to Excel failed with System.OutOfMemoryException

Server: SQL Server 2000 Ent running on Win2k AS, Ent Edition of Reporting
Services + SP1
my workstation: Win2k SP4, Microsoft Development Environment v7.1.3088,
Reporting Services Ent Ed + SP1.
I am trying to export 55,804 rows to Excel. The table that I am exporting
from has 11 columns.
Columns within >results<:

Column           DataType  Length
---------------  --------  ------
Source           varchar        2
ClaimNumber      varchar       17
MarketLoc        varchar        7
Description      varchar       50
SomeDate         datetime       8
DetailNo         smallint       2
ChargeCode       varchar        1
UnitCount        smallint       2
ChargedDollars   decimal        5
AllowedDollars   decimal        5
NetPaidDollars   decimal        5
Because I need to dump this data for 17 different market locations, ranging
in rowcount from 600 rows to 56,000 rows, I figured that I would just create a
report within Reporting Services and query the results table (228,825 rows)
for each of the 17 market locations. I would export each market to an
Excel file and then I would copy/paste the worksheets into one file that I
can send to the end user. This would be easier than selecting the
information within Query Analyzer, saving to a file, importing each file,
formatting each column and repeating the process for each of the 17 groups
of data.
I started out with a small data set (1500 rows) as a proof of concept so
that I knew the process would work. When I had the report formatted
correctly I successfully exported the data to Excel. Then I moved on to one
of the larger groups (55,804 rows). This is where I had the problems.
I previewed the report within Visual Studio. Total page count 1241. The
rdl.data file is 12MB. I clicked on the disk icon on the toolbar and chose
to export the report to Excel. I have 768MB of RAM in my machine and the
memory usage continued to rise until it reached 1GB. This process took over
20 minutes. At the end of it all I received a dialog box indicating a
System.OutOfMemoryException was thrown. What would happen if I tried
running this on the server? I am guessing that it would consume lots of CPU
cycles and eat up lots of memory. Would it finish, or would it fail? I am
afraid to try :)
Why is the .rdl.data only 12MB yet over 1GB was consumed while converting to
Excel? When I save the results to a [text] file within Query Analyzer the
file is 12MB. I could import into Excel, but then I have to import x files
and format y columns x times.
This should be fairly easy to repro. Does anyone from Microsoft want to
take a look at it? Does anyone else have any ideas?
--
Keith|||Keith,
The only recommendation I have at this time is to increase the RAM on the
machine you are using to export to Excel. I do not have a specific
recommendation, but I would suggest a minimum of 1GB of RAM. The SQL Server
2000 Reporting Services team is looking at addressing this issue in upcoming
service packs and releases.
--
Bruce Johnson [MSFT]
Microsoft SQL Server Reporting Services
This posting is provided "AS IS" with no warranties, and confers no rights.
"Keith Kratochvil" <sqlguy.back2u@.comcast.net> wrote in message
news:eo8Ko5qXEHA.2868@.TK2MSFTNGP09.phx.gbl...
> [original post quoted in full; snipped]|||Bruce, Thanks for your reply. It is unfortunate that 768MB is not enough
memory to export ~12MB of data to Excel. I am glad that Microsoft enhanced
the versions of Excel that are supported with RS SP1. Now I guess I will
have to wait for SP2 (or SQL Server 2005) for memory optimization.
In the end, I got the data into Excel via the "Get External Data
| from a database query" menu. Yeah, I had to do it 17 times, but it
worked. That method saved me the pain of having to perform a "data | text
to columns" and format each column..
Keep up the good work!
--
Keith
"Bruce Johnson [MSFT]" <brucejoh@.online.microsoft.com> wrote in message
news:OmRdOCsXEHA.3664@.TK2MSFTNGP12.phx.gbl...
> [quoted text snipped]