Isabelle
Starting Member
2 Posts

Posted - 2013-11-26 : 12:28:44
Hello, I'm trying to find the most efficient way of handling failures in a process that looks like this:

CREATE PROCEDURE SP_CALC
AS
BEGIN
    -- ... some logic ...
    -- ... get a set of values from a queue table ...

    -- exec several SPs for the values from the queue
    EXEC SP_1
    EXEC SP_2
    ...
    EXEC SP_N

    -- ... delete the processed values from the queue ...
END

The issue I noticed is that, since there is no error handling in the calling procedure, it is possible for one or more of the SP_N to fail while execution of SP_CALC continues and the queue still gets deleted. The values may therefore never get processed by that SP_N. The success of this process is critical to the functionality of the application it supports; it is resource intensive, runs a few times every minute, and there is error handling in each of the SP_N that causes them to roll back and return control to the caller.

I've tried a few things with the usual TRY..CATCH. My goal is to delete the queue only if all SP_N have succeeded, and to rerun only the failed SP (I experimented with an @retry counter in a loop for a preset number of times). I'm not sure if that's the best way to do it, and I'm not sure how to handle cases where an SP_N doesn't succeed after a couple of retries.

If anyone has worked on a similar problem before, please share your experience.
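For reference, the retry loop mentioned above might be sketched roughly like this; the procedure name, variable names, and retry limit are illustrative, not taken from the original post:

```sql
-- Illustrative sketch only: retry one called procedure a fixed number of times.
DECLARE @retry int = 0,
        @maxRetries int = 3,   -- arbitrary limit for the example
        @succeeded bit = 0;

WHILE @retry < @maxRetries AND @succeeded = 0
BEGIN
    BEGIN TRY
        EXEC SP_1;             -- the procedure being retried
        SET @succeeded = 1;    -- reached only if SP_1 raised no error
    END TRY
    BEGIN CATCH
        SET @retry = @retry + 1;
        -- ERROR_MESSAGE() could be logged here before the next attempt
    END CATCH
END
```

After the loop, @succeeded tells the caller whether the procedure ever completed, which is what the question about "doesn't succeed after a couple of retries" hinges on.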
TG
Master Smack Fu Yak Hacker
6065 Posts

Posted - 2013-11-26 : 12:55:41
Several ways to go here, but it sounds like the least amount of change for you would be to include a return code in all the called SPs. Then in SP_CALC you can decide how to proceed based on the success or failure of each of the called SPs, and only delete your queue if ALL SPs returned success. It wouldn't hurt to wrap the SP calls in TRY..CATCH in case your called SPs' error handling is insufficient for some unexpected error.

You could have a bit flag in SP_CALC like (@deleteQueue bit): set it to true once at the beginning, then on any failure set it to false. Only delete the queue if that bit is still set to true by the time everything else is done.

Be One with the Optimizer

TG
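A minimal sketch of the return-code-plus-flag pattern described above; the queue table name is a placeholder, and each SP_N is assumed to RETURN 0 on success and non-zero on failure:

```sql
-- Illustrative only: combine return codes and TRY..CATCH into one flag.
DECLARE @deleteQueue bit = 1;
DECLARE @rc int;

BEGIN TRY
    EXEC @rc = SP_1;
    IF @rc <> 0 SET @deleteQueue = 0;   -- SP_1 reported failure via return code
END TRY
BEGIN CATCH
    SET @deleteQueue = 0;               -- unexpected error not handled inside SP_1
END CATCH

-- ... repeat the same TRY..CATCH block for SP_2 through SP_N ...

IF @deleteQueue = 1
    DELETE FROM QueueTable;             -- hypothetical queue table name
```

The flag starts true and is only ever flipped to false, so the delete runs only when every call succeeded by both measures.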
Isabelle
Starting Member
2 Posts

Posted - 2013-11-26 : 15:28:15
Hi TG, thanks for your thoughts. This is definitely one way to control the cleanup of the queue. I can add error handling that detects when one of the SP_N fails, forces it to execute again (a predefined number of times, for example), and deletes the queue only if all SP_N have executed successfully.

If the queue doesn't get cleaned up because the records didn't get processed by one SP_N, then the next time SP_CALC runs it will need to process all the new records in the queue plus all the old records, for all of SP_1, SP_2, ..., SP_N again, which is double the work. Is it even possible to minimize the impact on the entire process and design for the case where this one failed SP doesn't end up re-running successfully?

Isabelle
TG
Master Smack Fu Yak Hacker
6065 Posts

Posted - 2013-11-26 : 15:42:42
Hard for me to get (wisely) specific without knowing the big picture. I thought that if a queue row had to be reprocessed, the SPs that were successful on the previous run would have no work to do and would therefore run very quickly. But if that is not the case, then I guess you would have to track, per queue row, which SPs completed successfully and which didn't, in order to re-run only the ones that need to be retried.

Be One with the Optimizer

TG
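One hypothetical way to track per-row, per-SP completion, as a sketch only; the table, columns, and variable here are invented for illustration and are not part of the thread:

```sql
-- Illustrative schema: one status row per (queue row, procedure) pair.
CREATE TABLE QueueStatus (
    QueueId   int     NOT NULL,
    SpName    sysname NOT NULL,
    Succeeded bit     NOT NULL DEFAULT 0,
    PRIMARY KEY (QueueId, SpName)
);

-- After SP_1 succeeds for a given queue row:
UPDATE QueueStatus
SET Succeeded = 1
WHERE QueueId = @queueId AND SpName = 'SP_1';   -- @queueId declared by the caller

-- Delete only queue rows for which no SP is still pending:
DELETE q
FROM QueueTable q                               -- hypothetical queue table
WHERE NOT EXISTS (
    SELECT 1
    FROM QueueStatus s
    WHERE s.QueueId = q.QueueId AND s.Succeeded = 0
);
```

On a retry pass, SP_CALC would consult QueueStatus and re-run only the (row, SP) pairs still marked unsucceeded, which avoids the "double the work" problem Isabelle raised.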