How can I handle uniqueness in this situation?

I have a table like this:

create table my_table
(
type1 varchar2(10 char),
type2 varchar2(10 char) 
);

I want uniqueness rules like this:

  1. If the type1 column has the value 'GENERIC', then type2 alone must be unique across the whole table. For example:
    • if one row has type1 = 'GENERIC' and type2 = 'value_x', then no other row may have type2 = 'value_x'.
  2. Otherwise, uniqueness applies to both columns together, i.e. the (type1, type2) pair must be unique. (And rule 1 still holds, of course.)

I tried to do it with a trigger:

CREATE OR REPLACE TRIGGER my_trigger
BEFORE INSERT OR UPDATE
ON my_table
FOR EACH ROW
DECLARE
   lvn_count  NUMBER := 0;
   lvn_count2 NUMBER := 0;
   errormessage clob;
   MUST_ACCUR_ONE EXCEPTION;
   -- PRAGMA AUTONOMOUS_TRANSACTION; -- without this the trigger raises a mutating-table error,
   -- but I can't use it because it misbehaves under simultaneous connections
BEGIN

   IF :NEW.type1 = 'GENERIC' THEN

      SELECT count(1) INTO lvn_count FROM my_table
      WHERE type2 = :NEW.type2;

   ELSE

      SELECT count(1) INTO lvn_count2 FROM my_table
      WHERE type1 = :NEW.type1 AND type2 = :NEW.type2;

   END IF;

   IF (lvn_count >= 1 OR lvn_count2 >= 1) THEN
      RAISE MUST_ACCUR_ONE;
   END IF;

END;

But without the pragma it raises a mutating-table error (ORA-04091), because the trigger queries the same table it fires on. I don't want to use the pragma because of the problems it causes with simultaneous connections.

I also tried to do it with a unique index, but I couldn't get it to work:

CREATE UNIQUE INDEX my_table_unique_ix 
ON my_table (case when type1= 'GENERIC' then 'some_logic_here' else type1 end, type2);  -- I know this doesn't make sense, but maybe there is something similar I could use here.

Examples;

**Example 1**

insert into my_table (type1,type2) values ('a','b'); -- its ok no problem

insert into my_table (type1,type2) values ('a','c'); -- its ok no problem

insert into my_table (type1,type2) values ('c','b'); -- its ok no problem

insert into my_table (type1,type2) values ('GENERIC','b'); -- it should fail, because 'b' already exists (only the second column matters when the first column is 'GENERIC')


**Example 2**

insert into my_table (type1,type2) values ('GENERIC','b'); -- its ok no problem

insert into my_table (type1,type2) values ('a','c'); -- its ok no problem

insert into my_table (type1,type2) values ('d','c'); -- its ok no problem

insert into my_table (type1,type2) values ('d','b'); -- it should fail, because the second column may not equal a type2 value that already exists on a row whose first column is 'GENERIC'

Solution 1:

What you're trying to do is not really straightforward in Oracle. One possible (although somewhat cumbersome) approach is to use a combination of

  • an additional materialized view with refresh (on commit)
  • a windowing function to compute the number of distinct type1 values per type2 group
  • a windowing function to compute the number of GENERIC rows per type2 group
  • a check constraint to ensure that each group either contains only one distinct type1 value, or contains no GENERIC row

This should work:

create materialized view mv_my_table 
refresh on commit
as
select 
 type1,
 type2,
 count(distinct type1) over (partition by type2) as distinct_type1_cnt,
 count(case when type1 = 'GENERIC' then 1 else null end) 
     over (partition by type2) as generic_cnt
from my_table;

alter table mv_my_table add constraint chk_type1 
  CHECK (distinct_type1_cnt = 1 or generic_cnt = 0);

Now, INSERTing a duplicate won't fail immediately, but the subsequent COMMIT will fail because it triggers the materialized view refresh, and that will cause the check constraint to fire.
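For instance, replaying the start of Example 1 in a hypothetical session (assuming the MView and check constraint above are in place), the violation only surfaces at COMMIT time; the exact error stack may vary, but it should include an ORA-02290 check-constraint violation raised during the MView refresh:

insert into my_table (type1,type2) values ('a','b');       -- succeeds
insert into my_table (type1,type2) values ('GENERIC','b'); -- also succeeds, for now
commit;                                                    -- fails here, e.g.:
-- ORA-12008: error in materialized view refresh path
-- ORA-02290: check constraint (..., CHK_TYPE1) violated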

Disadvantages

  • duplicate INSERTs won't fail immediately (making debugging more painful)
  • depending on the size of your table, the MView refresh might slow down COMMITs considerably
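One further point worth noting: the check above only guards the GENERIC rule (rule 1). The plain pair-uniqueness from rule 2 can still be enforced separately with an ordinary unique constraint on the base table (constraint name is illustrative), which coexists with the materialized view and also rejects exact duplicates of GENERIC rows immediately:

alter table my_table add constraint my_table_uq unique (type1, type2);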

Links

For a more detailed discussion of this approach, see AskTom on cross-row constraints