 Changing error severity levels


DaleHughes
Starting Member

7 Posts

Posted - 2002-06-14 : 16:46:40
I changed the severity level of an informational message (#3604 - "Duplicate key was ignored") from severity 10 to severity 14. This is the same severity level as message #2601, the unique index duplicate key violation.

I was anticipating that this message would then be captured in @@ERROR, but I still can't catch that warning. I even stopped and restarted the SQL service. What am I missing?

Thanks in advance for your help.....

jasper_smith
SQL Server MVP & SQLTeam MVY

846 Posts

Posted - 2002-06-16 : 12:52:09
What problem is NOT capturing this Dup Key error causing?
Is this a client or server side problem?
What is being used to execute the SQL (job step, FTS, SQL batch, stored procedure, etc.)?



DaleHughes
Starting Member

7 Posts

Posted - 2002-06-17 : 11:38:45
I have a table with a unique index, with the "ignore duplicates" option selected. When a batch insert is performed via a SQL task in DTS, the DTS package sees the returned informational message #3604 - "Duplicate key was ignored".

#1) Because this message is informational (severity level = 10), @@ERROR is set to zero - the insert was successful.
#2) DTS sees the returned message and treats it as though the statement failed, even though the statement was successful.

So my questions are these:
#1) Is there a way to suppress the message so that DTS does not think the step failed? I've posted this to a few newsgroups and so far no one seems to know how to suppress the informational message.
and
#2) I changed the severity level of error #3604 from 10 to 14, expecting that to cause @@ERROR to then capture the error code, but after the insert it still contains a value of zero.

If I deselect the ignore duplicates option on the unique index, the entire insert statement fails and @@ERROR contains the error value #2601 - "Cannot insert duplicate key row".

If I can get @@ERROR to capture the 3604 value, I can code around the duplicates ignored, and still react to more severe errors.

Any insight or other work-around ideas would be appreciated. Thanks.
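For reference, here's a sketch of the pattern I'm trying to make work (table names match the discussion above; the 3604 branch is the part that never fires today, because 3604 is informational and leaves @@ERROR at zero):

```sql
-- Sketch: insert, then branch on @@ERROR.
DECLARE @err int

INSERT INTO table1
SELECT DISTINCT * FROM table2

-- Capture immediately: any subsequent statement resets @@ERROR.
SET @err = @@ERROR

IF @err = 3604
    PRINT 'Duplicates were ignored - continue'   -- never reached today
ELSE IF @err <> 0
    RAISERROR ('Insert failed with error %d', 16, 1, @err)
```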


robvolk
Most Valuable Yak

15732 Posts

Posted - 2002-06-17 : 12:15:19
How about a staged import process, where the DTS transform puts the raw data into a staging/holding table. You can then insert ONLY the new rows into the table and completely avoid any duplicate warnings. It's a little bit more work but it's still fast and completely reliable...a number of SQL Teamers use this technique:

http://www.sqlteam.com/Forums/topic.asp?TOPIC_ID=13501
http://www.sqlteam.com/Forums/topic.asp?TOPIC_ID=7978
http://www.sqlteam.com/Forums/topic.asp?TOPIC_ID=10126
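A minimal sketch of the staged insert, with placeholder table and column names (your real key columns go in the NOT EXISTS correlation):

```sql
-- Load everything into the staging table first, then move across
-- only the rows whose keys are not already in the destination.
INSERT INTO DestTable (col001, col002, col003)
SELECT s.col001, s.col002, s.col003
FROM StagingTable s
WHERE NOT EXISTS
      (SELECT 1 FROM DestTable d
       WHERE d.col001 = s.col001
         AND d.col002 = s.col002)
```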

Just for future reference, be very careful when editing the system tables...in fact, avoid it at all costs, even if it's just to modify a value. Any changes can have unforeseen consequences, and system tables change from one release to the next.


DaleHughes
Starting Member

7 Posts

Posted - 2002-06-17 : 14:37:16
Thanks for the help, Rob. I'm already using a staging table. I reviewed the links you listed, but what they're missing is that, based on the definition of the unique key (col001, col002), duplicates exist within the source batch of rows. The current statement I'm using is:

insert into table1
select distinct * from table2
where not exists
    (select 1 from table1
     where table1.col001 = table2.col001
       and table1.col002 = table2.col002)

When duplicates exist in the source, multiple rows satisfy the constraint, but after the first row is inserted, the second row fails. Short of using a cursor (Arrrggg!) to determine if each key already exists, I'm trying everything I can think of. I appreciate the warning about system table modifications. I'm now working with an alert on error 3604 that executes a job to perform a raiserror (3604,10,1) with SETERROR. That populates @@ERROR when I run it from Query Analyzer, but it still isn't populating @@ERROR when triggered by the alert.
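For reference, the Query Analyzer test that does populate @@ERROR looks like this (a sketch):

```sql
-- Re-raising 3604 with SETERROR sets @@ERROR in the current session.
RAISERROR (3604, 10, 1) WITH SETERROR
SELECT @@ERROR   -- 3604 here in Query Analyzer
-- But when the same RAISERROR runs in a job fired by an alert, it
-- executes in a different session, so the session that performed the
-- INSERT never sees it.
```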

Ahh, the fun continues.... but if you have any other ideas, please let me know. Thanks



jasper_smith
SQL Server MVP & SQLTeam MVY

846 Posts

Posted - 2002-06-17 : 17:21:04
However, your staging table should not have ANY constraints and should be empty before the load. Then load your data, dedupe it according to your business rules, and move it to your actual destination table.
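As a sketch (placeholder names; MIN() stands in for whatever your real business rule is), the dedupe-and-move step could look like:

```sql
-- Collapse the staging rows to one per (col001, col002) key, then
-- insert only the keys not already in the destination.
-- MIN() is an arbitrary tie-break - substitute your business rule.
INSERT INTO DestTable (col001, col002, col003)
SELECT s.col001, s.col002, MIN(s.col003)
FROM StagingTable s
WHERE NOT EXISTS
      (SELECT 1 FROM DestTable d
       WHERE d.col001 = s.col001 AND d.col002 = s.col002)
GROUP BY s.col001, s.col002
```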

I must say I'm not a big fan of 3604 - it's not an error, but it trips a lot of things up (the SQL Agent TSQL job system was one in particular before SQL 2000 SP1).

HTH
Jasper Smith



DaleHughes
Starting Member

7 Posts

Posted - 2002-06-18 : 10:08:05
The staging table (table2) is empty prior to the load and does not have any constraints - the constraint is the unique index on table1(col001,col002).

This would be SOOOO much easier if I could just get @@ERROR to capture the 3604 event. Using an alert on 3604 to run a job that performs the raiserror isn't working either - I think because the job runs in a different session, so @@ERROR isn't set where it can be queried after the insert.

So if changing the severity level of error 3604 in master.dbo.sysmessages from 10 to 14 isn't triggering @@ERROR to catch the 3604 event, what will? What else do I need to do to have @@ERROR catch it? Sure, "raiserror(3604,10,1) with SETERROR" populates @@ERROR with 3604, but how would I know to issue the raiserror command? The alert does fire when the event occurs, but @@ERROR is still zero when I check it immediately following the insert.

If I can get @@ERROR to capture the 3604 event, it would then be available to ALL my applications, and I wouldn't need to have a lot of extra "deduping" code in every single application.


robvolk
Most Valuable Yak

15732 Posts

Posted - 2002-06-18 : 10:19:12
Well, it might be easier, but it's definitely not better to do it this way. Relying on error messages as part of your overall process is never a good idea. By the time you catch the error, something has gone wrong and you have to fix it, vs. preventing it in the first place. I'm not beating you up on this point, but you wouldn't want to have that philosophy in dealing with an unplanned pregnancy, would you?

It's true that this is a benign error that really doesn't cause any problems, but again, if you know it will happen and can prevent it, you should. It's neither wasted nor redundant effort. And it seems that the error message venue is not working out like you planned, so why not avoid it entirely?


gminor
Starting Member

1 Post

Posted - 2002-06-18 : 14:58:40
I see what you guys are saying, but I've been having the same problem. The advice is very helpful, but there has to be an answer to Dale's original question...

Here is a quick example of my situation:

create table STG_Table (
Col1 smallint,
Col2 smallint,
Col3 smallint
)
go

create table Data_Table (
Col1 smallint primary key,
Col2 smallint,
Col3 smallint
)
go

The data looks like this (comma delimited):
1,1,1
2,2,2
2,1,2

Load this into an empty staging table. Then execute the following SQL:

insert into Data_Table
select distinct * from STG_Table s
where not exists(select 1 from Data_Table d where d.col1 = s.col1)

This will only return unique records, which includes both records with a primary key of 2. This should also ensure that a record with the key doesn't already exist.

However, when you execute this, the NOT EXISTS is checked when it builds the selected recordset, and it tries to insert both records with a key of 2 (because they are still distinct).

This then returns a 3604 and forces DTS to return an error... There has to be a way to get DTS to 'ignore' that warning message... Right?
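One workaround sketch that avoids the 3604 entirely on SQL 2000: collapse the staging rows to one per key before the insert. Note that MIN() here is applied independently per column, so it's an arbitrary pick (and could in general mix values from different source rows), not a business rule:

```sql
-- Collapse STG_Table to one row per Col1, then insert only new keys.
-- With the sample data (1,1,1 / 2,2,2 / 2,1,2) into an empty
-- Data_Table, this inserts two rows: (1,1,1) and (2,1,2).
INSERT INTO Data_Table (Col1, Col2, Col3)
SELECT s.Col1, MIN(s.Col2), MIN(s.Col3)
FROM STG_Table s
WHERE NOT EXISTS (SELECT 1 FROM Data_Table d WHERE d.Col1 = s.Col1)
GROUP BY s.Col1
```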

Thanks for your time.

--grant

   
