
SSIS - ActiveX script in transformation


sharmada
Starting Member

5 Posts

Posted - 2008-02-06 : 09:33:58
I have a flat file source to which different flat files will be passed as input. It is connected to an OLE DB destination, which changes along with the source file.
But when a new file is given as input, the OLE DB column mappings are not refreshed, and the package shows an error.

This was originally implemented in DTS, where an ActiveX script was used for the transformation.
What should I use in SSIS?

Please help me.

Sharmada

visakh16
Very Important crosS Applying yaK Herder

52326 Posts

Posted - 2008-02-06 : 09:49:57
Try a Script Task in SSIS.

sharmada
Starting Member

5 Posts

Posted - 2008-02-06 : 10:32:28
I tried the script task as a transformation.
Could you please help me with VB code to set the output columns equal to the input columns inside the script task?

Sharmada

Qualis
Posting Yak Master

145 Posts

Posted - 2008-02-06 : 14:51:47
OK, so let me make sure I understand. You have multiple flat files. Do they all have the same format (same columns, same order, etc.)? Are you sending the data to an OLE DB destination that is a different table for each file, or the same table? Are you trying to change the mapping of the data on the fly (Example 1), or are you trying to change the column definitions on the fly (Example 2)? Example 1 you should be able to do with some work in a script transform. Example 2 is not possible, because the data flow metadata cannot be changed at runtime.

Example1:
FileA
In_Column1 -> Out_Column1
In_Column2 -> Out_Column2
In_Column3 -> Out_Column3

FileB
In_Column1 -> Out_Column2
In_Column2 -> Out_Column3
In_Column3 -> Out_Column1

Example2:
FileA
In_Column1 (varchar (50)) -> Out_Column1 (varchar (50))
In_Column2 (int) -> Out_Column2 (int)
In_Column3 (int) -> Out_Column3 (int)

FileB
In_Column1 (int) -> Out_Column2 (int)
In_Column2 (varchar (50)) -> Out_Column3 (varchar (50))
In_Column3 (int) -> Out_Column1 (int)
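
The Example 1 style of remapping can be hard-coded in an SSIS 2005 Script Component configured as a synchronous transform. This is only a sketch: it assumes you have defined input columns named InColumn1-3 and output columns named OutColumn1-3 on the component, so the generated buffer properties carry those names. The FileB mapping above would then look something like:

```vb
' SSIS 2005 Script Component (synchronous transform), VB.NET.
' Column names InColumn1-3 / OutColumn1-3 are placeholders for
' whatever you defined on the component's Inputs and Outputs page.
Public Class ScriptMain
    Inherits UserComponent

    Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
        ' FileB mapping from Example 1: each input column shifts to
        ' a different output column.
        Row.OutColumn2 = Row.InColumn1
        Row.OutColumn3 = Row.InColumn2
        Row.OutColumn1 = Row.InColumn3
    End Sub
End Class
```

Note this only moves data between columns that already exist in the design-time metadata; it cannot add, drop, or retype columns at runtime.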

sharmada
Starting Member

5 Posts

Posted - 2008-02-07 : 03:11:01
All the files have the same format, with the same row and column delimiters, but the columns will be entirely different. The destination tables also change according to the source file. The destination table name and the source file name are passed in as global variables at runtime.
The problem is that the flat file output and the OLE DB table mappings are not refreshed at runtime.
So what should I do in the script transform to refresh this mapping?
We are not altering the data in any way while loading it into the OLE DB destination.
So what do I need to do in the transformation just to make these mappings work at runtime?

I think this comes under Example 1 itself. What do you think?


Thanks for the reply. Please help me.

Sharmada

Qualis
Posting Yak Master

145 Posts

Posted - 2008-02-07 : 13:26:36
So, you have the same number of columns per file? Do the columns contain different data (for example, col1 is a date in file1 and a varchar in file2)? You can get around the source having different data types by making them all varchar, because SQL Server will typecast them for you. You cannot, however, have the source adjust to the number of columns at runtime. If the files have different numbers of columns, your solution is MUCH more complicated. Do your destination tables have different column names and column order? If so, you cannot do this in a single data flow. What information are you using to change the column mappings in DTS? Can you show me the ActiveX script that does the remapping?

sharmada
Starting Member

5 Posts

Posted - 2008-02-08 : 00:15:01
In DTS, inside the ActiveX script, they are just setting the input columns equal to the output columns, and then loading the data into the table.

I have a different number of columns in each file, so I want the mappings refreshed along with the input. The column names and data types are also different.
Yes, the destination table also changes along with the input. The number of columns is again different, but the order is the same as in the file.
Please suggest a solution.
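
For reference, the kind of DTS ActiveX copy transform being described usually looks something like this (a sketch of the classic pattern, not necessarily your exact script):

```vbscript
' DTS ActiveX transformation: copy every source column to the
' destination column in the same ordinal position.
Function Main()
    Dim i
    For i = 1 To DTSSource.Count
        DTSDestination(i) = DTSSource(i)
    Next
    Main = DTSTransformStat_OK
End Function
```

This works in DTS because the columns are bound by ordinal position when the package executes, whereas SSIS fixes the column metadata at design time, which is why the same trick does not carry over.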

Sharmada

Qualis
Posting Yak Master

145 Posts

Posted - 2008-02-08 : 09:02:26
You cannot do this in a single data flow. The reason is that the metadata is SET at design time and CANNOT be adjusted at runtime; this is done to optimize the buffers and pipeline. If you have a set of common file types (e.g. 3 files of type1, 2 of type2, and 9 of type3), you can set up a data flow for each type and then dynamically set the source file and destination table with expressions on the respective components.
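
A sketch of the expression setup described here; the variable names User::SourceFileName and User::DestTableName are placeholders for your own package variables:

```
Flat File Connection Manager
    Properties -> Expressions -> ConnectionString
        @[User::SourceFileName]

Data Flow Task
    Properties -> Expressions -> [OLE DB Destination].[OpenRowset]
        @[User::DestTableName]
```

Alternatively, set the OLE DB destination's data access mode to "Table name or view name variable" and point it directly at the table-name variable. Either way, only the file path and table name vary per run; the column metadata of each data flow stays fixed, which is why you need one data flow per file type.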

sharmada
Starting Member

5 Posts

Posted - 2008-02-10 : 08:00:35
OK, thanks for the suggestions.
I will try to implement this.

Sharmada

tupkan
Starting Member

1 Post

Posted - 2008-06-18 : 14:38:29
Were you able to resolve this? If so, please let me know how; I am in the same situation.

Thanks,
Tupkan
   
