


On 9/11/2018 6:43 PM, Booth Martin wrote:
This has to be simpler than I am making it.

Several random times a day, a file is cleared and then populated again through DDM. When that happens, I want to automatically run a program that updates another file.

A trigger operates on rows, so using one would run the program for every row, which is not a good solution.

I cannot access the original file nor the machine that it is on.


A trigger need not be based on rows.  You can create a trigger that operates FOR EACH STATEMENT.

I created two tables:  TEST_TRIGGER and TRIGGER_FIRED.  The table TRIGGER_FIRED contains a single column, defined as SMALLINT.
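For anyone recreating this, DDL along these lines matches the description; the column names are my own assumptions, since the post only says that TRIGGER_FIRED has a single SMALLINT column:

create table mylib.test_trigger (
  id   integer,           -- any columns will do; the post does not describe them
  data varchar(50)
);

create table mylib.trigger_fired (
  deleted_count smallint  -- the single SMALLINT column mentioned above
);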

I created an AFTER DELETE trigger on the table TEST_TRIGGER as:

create or replace trigger mylib.mytrigger
  after delete on mylib.test_trigger
  referencing old table as o
  for each statement mode db2sql
begin
  insert into mylib.trigger_fired
    select count(*)
      from o;
end;

I then ran:

delete from mylib.test_trigger

I then queried TRIGGER_FIRED, and it held one row whose SMALLINT column contained the number of rows that had been deleted.
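A quick check, using the assumed column name from the sketch above:

select deleted_count
  from mylib.trigger_fired;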

A couple of caveats:

* you cannot use CLRPFM on the table with a delete trigger in place
* clearing the table with TRUNCATE TABLE will not fire the trigger

The only way I could get the trigger to fire was with an unqualified DELETE (a DELETE with no WHERE clause).
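To get back to the original goal of running a program when the file is cleared, the statement trigger can CALL a stored procedure instead of inserting a count. A minimal sketch, assuming a hypothetical program MYLIB/UPDPGM that performs the update of the other file:

create or replace procedure mylib.updpgm ()
  language rpgle                 -- match the program's actual language
  external name 'MYLIB/UPDPGM'   -- hypothetical program name
  parameter style general;

create or replace trigger mylib.mytrigger
  after delete on mylib.test_trigger
  for each statement mode db2sql
begin
  call mylib.updpgm();           -- runs once per DELETE statement
end;

Because the trigger is FOR EACH STATEMENT, the program runs once per clear, not once per deleted row.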

