dariopalermo
Starting Member
16 Posts

Posted - 2011-06-04 : 18:00:28
Hi all,

I'm looking for some advice. Here's my problem: 4 different users upload a file called x.txt to our FTP site many times per day, each user in a different directory. Each line of the file contains a fixed-length record (aaaabbbbbbbbbbbccccccccdddddde). I need to:

1) every 5 minutes, check every user's directory for x.txt
2) if x.txt is found, import it into my db*
3) if the import went OK, move the file
4) if the import didn't go OK, rename and move the file
5) log the entire process, step by step, to separate txt files (one per user), so the users can read their logs via FTP

* I need to set the destination db, on a per-user basis, to the production or the test db.

Today I'm doing this with a scheduled batch and bcp, but I feel it gives me very little control over the import phase: I was unable, for example, to filter out duplicate rows (the destination table has a unique index). I'd like to import only unique rows and log in the txt file how many total rows were read, how many records were inserted, and how many dupes were discarded.

Any suggestions?

Thanks
Dario
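For reference, a fixed-width load like this needs a bcp format file rather than the default field terminators; a minimal sketch for the layout above (all names are made up, the widths 4/11/8/6/1 come from the sample record, and the 9.0 version header assumes SQL Server 2005 - adjust to your server version):

9.0
5
1  SQLCHAR  0  4   ""      1  ColA  ""
2  SQLCHAR  0  11  ""      2  ColB  ""
3  SQLCHAR  0  8   ""      3  ColC  ""
4  SQLCHAR  0  6   ""      4  ColD  ""
5  SQLCHAR  0  1   "\r\n"  5  ColE  ""

The scheduled batch then calls something like:

bcp MyDb.dbo.MyTable in x.txt -S MYSERVER -T -f x.fmt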
sunitabeck
Master Smack Fu Yak Hacker
5155 Posts

Posted - 2011-06-05 : 00:20:03
Importing the file (using bcp, BULK INSERT, or another method) into a staging table in the database, and then writing a script to examine and process the data in that staging table, will let you do everything you listed: eliminate duplicate rows, generate log messages, and so on.
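A minimal sketch of that approach, with hypothetical table and column names, assuming ColA is the unique key and the staging table has just been loaded by bcp/BULK INSERT:

DECLARE @total int, @inserted int;

-- how many rows arrived in the file
SELECT @total = COUNT(*) FROM dbo.Staging_X;

-- keep one row per key from the file, and skip keys already in the target
;WITH dedup AS (
    SELECT *, ROW_NUMBER() OVER (PARTITION BY ColA ORDER BY ColA) AS rn
    FROM dbo.Staging_X
)
INSERT INTO dbo.Target_X (ColA, ColB, ColC, ColD, ColE)
SELECT d.ColA, d.ColB, d.ColC, d.ColD, d.ColE
FROM dedup AS d
WHERE d.rn = 1
  AND NOT EXISTS (SELECT 1 FROM dbo.Target_X AS t WHERE t.ColA = d.ColA);

SET @inserted = @@ROWCOUNT;

-- the three numbers for the per-user log file
SELECT @total              AS rows_read,
       @inserted           AS rows_inserted,
       @total - @inserted  AS dupes_discarded;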
jackv
Master Smack Fu Yak Hacker
2179 Posts

Posted - 2011-06-05 : 03:46:39
SSIS is a good option for managing the workflow.

Jack Vamvas
--------------------
http://www.sqlserver-dba.com
visakh16
Very Important crosS Applying yaK Herder
52326 Posts

Posted - 2011-06-05 : 03:49:48
Everything you asked for can be done with SSIS. You need a Foreach Loop to iterate the process for each user, then a Data Flow Task (DFT) for the transfer. You can check for duplicates using a Merge Join or Lookup transformation, and finally a File System Task to rename or move the file.

------------------------------------------------------------------------------------------------------
SQL Server MVP
http://visakhm.blogspot.com/
dariopalermo
Starting Member
16 Posts

Posted - 2011-06-06 : 03:59:08
I'll look into SSIS. For now, I ran the wizard and it helped me import the file; I'll try to modify the package to match my other requirements.

Quick question: will I be able to put an "if" condition at the start of the package? The file I want to import might not be there when I run the package (every 5 minutes), and I want to avoid critical errors in that situation.

Bye,
Dario
jackv
Master Smack Fu Yak Hacker
2179 Posts

Posted - 2011-06-06 : 11:48:33
Yes, you can use the scripting option.

Jack Vamvas
--------------------
http://www.sqlserver-dba.com
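If you'd rather do the check in T-SQL (for example, from an Execute SQL Task) instead of a Script Task, the undocumented but widely used xp_fileexist procedure is one alternative; a minimal sketch, with a hypothetical path:

DECLARE @exists int;

-- xp_fileexist is undocumented; returns 1 if the file is found, 0 otherwise
EXEC master.dbo.xp_fileexist 'C:\ftp\user1\x.txt', @exists OUTPUT;

IF @exists = 1
    PRINT 'x.txt found - run the import';
ELSE
    PRINT 'no file this cycle - skip';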
visakh16
Very Important crosS Applying yaK Herder
52326 Posts

Posted - 2011-06-12 : 04:44:56
quote: Originally posted by dariopalermo

I'll look into SSIS. For now, I ran the wizard and it helped me import the file; I'll try to modify the package to match my other requirements. Quick question: will I be able to put an "if" condition at the start of the package? The file I want to import might not be there when I run the package (every 5 minutes), and I want to avoid critical errors in that situation. Bye, Dario
You can do that check using a precedence constraint expression in SSIS. Just place an Execute SQL Task with a dummy statement before the processing steps, set the precedence constraint's Evaluation Operation to "Expression", and give it an expression like Len(@FileName) > 0 so processing continues only when there's a file.

------------------------------------------------------------------------------------------------------
SQL Server MVP
http://visakhm.blogspot.com/
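For reference, SSIS expression syntax qualifies variables with their namespace, so assuming a package variable User::FileName populated by the Foreach Loop enumerator, the constraint expression would look something like:

LEN(@[User::FileName]) > 0

The downstream tasks then run only when the expression evaluates to true.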