Author: AskSQLTeam (Ask SQLTeam Question | 0 Posts)
Posted - 2006-01-25 : 08:18:09
Karen writes "I have a stored procedure that imports data from 116 text files from various customer sites using BULK INSERT and then does some data aggregation and comparisons. (Unfortunately I am stuck with the text file situation.)

The correct formatting of the files has not been reliable, so I put in a test routine which runs the BULK INSERT for each file and updates a table to show its success or otherwise, to 'screen' the files before the main processing. This works fine BUT as soon as the routine encounters a failure it terminates. The screening process therefore has to be iterative, which is not desirable.

If I create a stored procedure for each of the test file imports and then call these from a 'main' stored procedure using EXECUTE commands, will this still be the case? Is there a better way of checking so many files?

My procedure looks something like...

<clip>
insert into st_fileimport (dealernumber, handheld_file)
values ('01', 'Fail')

BULK INSERT dbo.filetesthandheld
FROM 'foo.csv'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    ROWS_PER_BATCH = 500
)

update st_fileimport
set handheld_file = 'Pass'
where dealernumber = '01'
</clip>

...obviously with lots before and after, and repeated many times!

I apologise for my ignorance; I am an SQL newcomer. Any help would be greatly appreciated.

Thanks,
Karen"
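One way to keep a bad file from killing the whole run (a sketch, not a tested solution): assuming SQL Server 2005 or later, wrap the per-file BULK INSERT in a single reusable stored procedure with TRY...CATCH, so a failure just leaves that dealer's row marked 'Fail' and control returns to the caller. BULK INSERT will not accept a variable for the file name, so the statement is built as dynamic SQL; running it through EXEC() also puts it in a lower scope, which helps the outer TRY...CATCH trap errors such as a missing file. The procedure name usp_ScreenHandheldFile and the @filepath parameter are illustrative; the table and file names come from Karen's snippet. (On SQL Server 2000 there is no TRY...CATCH; the usual fallback is checking @@ERROR after each call, though some BULK INSERT errors abort the batch anyway.)

<clip>
CREATE PROCEDURE dbo.usp_ScreenHandheldFile
    @dealernumber varchar(10),
    @filepath     nvarchar(255)
AS
BEGIN
    -- Build the BULK INSERT dynamically: the FROM clause
    -- cannot take a variable directly.
    DECLARE @sql nvarchar(max)
    SET @sql = N'BULK INSERT dbo.filetesthandheld
                 FROM ''' + @filepath + N'''
                 WITH (FIELDTERMINATOR = '','',
                       ROWTERMINATOR = ''\n'',
                       ROWS_PER_BATCH = 500)'

    BEGIN TRY
        EXEC (@sql)
        -- Import succeeded: flip the screening row to Pass.
        UPDATE st_fileimport
        SET handheld_file = 'Pass'
        WHERE dealernumber = @dealernumber
    END TRY
    BEGIN CATCH
        -- Import failed: leave the row marked Fail and return,
        -- instead of letting the error terminate the caller.
        UPDATE st_fileimport
        SET handheld_file = 'Fail'
        WHERE dealernumber = @dealernumber
    END CATCH
END
</clip>

The main procedure then seeds every dealer's row as 'Fail' and calls the procedure once per file, e.g. EXEC dbo.usp_ScreenHandheldFile @dealernumber = '01', @filepath = 'foo.csv' -- one call per file (or a cursor over a table of file names), so a corrupt file no longer stops the screening of the remaining 115.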
madhivanan
Premature Yak Congratulator
22864 Posts