Issue:
Task events are recorded in repository tables, most notably the dmi_queue_item table, which grows in proportion to the number of tasks executed. Over time this growth can degrade performance, so old queue items need to be deleted periodically.
Requirements:
Periodically delete old records from the dmi_queue_item table, but keep track of each document's workflow history for a certain amount of time beyond the dmi_queue_item cleanup.
Solution:
The first thing to determine is whether the out-of-the-box dm_QueueMgt job can do what's required. This job runs a method that deletes queue items older than a cutoff date. That is useful, because dmi_queue_item records many other repository events that need to be purged on a schedule, but it also discards the workflow history we want to keep. The solution was to create a custom table that holds the required queue items, preserving each document's workflow history, and a new job and method that populate it before the queue items are deleted.
Solution Details:
First, create a custom table using DQL. (Note: this table mirrors the columns of the dm_audittrail table, which is the shape the queue items are mapped into. The UNREGISTER/DROP statements below just clear out any previous version of the table and can be skipped on a first run.)
UNREGISTER TABLE wf_history_s
EXECUTE exec_sql WITH query='DROP TABLE wf_history_s'
EXECUTE exec_sql WITH query='CREATE TABLE wf_history_s
(
r_object_id VARCHAR2(32),
object_type VARCHAR2(32),
id_1 VARCHAR2(32),
string_5 VARCHAR2(200),
string_4 VARCHAR2(200),
string_3 VARCHAR2(200),
string_2 VARCHAR2(200),
string_1 VARCHAR2(200),
workflow_id VARCHAR2(32),
policy_id VARCHAR2(32),
registry_id VARCHAR2(32),
audit_signature VARCHAR2(255),
audited_obj_vstamp INTEGER,
user_name VARCHAR2(32),
time_stamp_utc DATE,
audit_version INTEGER,
chronicle_id VARCHAR2(32),
controlling_app VARCHAR2(32),
object_name VARCHAR2(255),
audited_obj_id VARCHAR2(32),
version_label VARCHAR2(32),
acl_domain VARCHAR2(32),
attribute_list_id VARCHAR2(32),
host_name VARCHAR2(128),
user_id VARCHAR2(32),
i_audited_obj_class INTEGER,
event_source VARCHAR2(64),
event_name VARCHAR2(64),
r_gen_source INTEGER,
owner_name VARCHAR2(32),
time_stamp DATE,
event_description VARCHAR2(64),
session_id VARCHAR2(32),
current_state VARCHAR2(64),
application_code VARCHAR2(64),
acl_name VARCHAR2(32),
attribute_list VARCHAR2(2000),
i_is_archived VARCHAR2(32),
id_5 VARCHAR2(32),
id_4 VARCHAR2(32),
id_3 VARCHAR2(32),
id_2 VARCHAR2(32)
)'
REGISTER TABLE dm_dbo.wf_history_s
(
r_object_id CHAR(32),
object_type CHAR(32),
id_1 CHAR(32),
string_5 CHAR(200),
string_4 CHAR(200),
string_3 CHAR(200),
string_2 CHAR(200),
string_1 CHAR(200),
workflow_id CHAR(32),
policy_id CHAR(32),
registry_id CHAR(32),
audit_signature CHAR(255),
audited_obj_vstamp INT,
user_name CHAR(32),
time_stamp_utc TIME,
audit_version INT,
chronicle_id CHAR(32),
controlling_app CHAR(32),
object_name CHAR(255),
audited_obj_id CHAR(32),
version_label CHAR(32),
acl_domain CHAR(32),
attribute_list_id CHAR(32),
host_name CHAR(128),
user_id CHAR(32),
i_audited_obj_class INT,
event_source CHAR(64),
event_name CHAR(64),
r_gen_source INT,
owner_name CHAR(32),
time_stamp TIME,
event_description CHAR(64),
session_id CHAR(32),
current_state CHAR(64),
application_code CHAR(64),
acl_name CHAR(32),
attribute_list CHAR(2000),
i_is_archived CHAR(32),
id_5 CHAR(32),
id_4 CHAR(32),
id_3 CHAR(32),
id_2 CHAR(32)
)
UPDATE dm_registered OBJECT
SET owner_table_permit = 15,
SET group_table_permit = 15,
SET world_table_permit = 15
WHERE table_name = 'wf_history_s'
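To confirm that the registration and permissions took effect, a quick check can be run against the dm_registered type (the attribute names below are the same ones set in the update above):

SELECT table_name, owner_table_permit, group_table_permit, world_table_permit
FROM dm_registered
WHERE table_name = 'wf_history_s'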
Second, create a custom Documentum method to be executed by the custom queue management job. The class should implement the following steps (a Java sketch follows the list):
a. Populate the workflow history table according to the criteria. Here's an example DQL (m_cutoff below is the retention period in days, the same value used by the delete in step b):
"insert into dm_dbo.wf_history_s " +
"(r_object_id, event_name, time_stamp, user_name, audited_obj_id, string_4, workflow_id, string_3) " +
"SELECT '0000000000000000' as r_object_id, task_name as event_name, date_sent as time_stamp, sent_by as user_name, r_object_id as audited_obj_id, name as string_4 , router_id, task_state as string_3 " +
"FROM dmi_queue_item " +
"WHERE r_object_id not in (select audited_obj_id from dm_dbo.wf_history_s) " +
"AND router_id != '0000000000000000' " +
"AND date_sent < DATEADD(Day, -"+sCutOffDate+", date(today)) " +
"AND delete_flag = 1";
b. If the workflow history table is populated successfully, delete the corresponding dmi_queue_item rows. Here's an example DQL:
"DELETE dmi_queue_item objects " +
"WHERE router_id != '0000000000000000' " +
"AND date_sent < DATEADD(Day, -"+m_cutoff+", date(today)) " +
"AND delete_flag = 1";
c. Write the job report to the repository.
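Putting these steps together, here is a minimal sketch of the method class. It is an illustration under several assumptions: DFC is on the classpath, the caller supplies an authenticated IDfSession (the plumbing Documentum uses to hand a session to a method is omitted), m_cutoff is the retention period in days, and the class name WfHistoryQueueMgt is invented for this example. Writing the job report (step c) is also omitted.

import com.documentum.fc.client.DfQuery;
import com.documentum.fc.client.IDfCollection;
import com.documentum.fc.client.IDfQuery;
import com.documentum.fc.client.IDfSession;

// Hypothetical class name; register it behind a dm_method in the repository.
public class WfHistoryQueueMgt {

    // Retention period in days (90 is an assumed default).
    private int m_cutoff = 90;

    public void execute(IDfSession session) throws Exception {
        // One transaction: queue items are deleted only if the history rows were written.
        session.beginTrans();
        try {
            // (a) Copy qualifying queue items into the workflow history table.
            runDql(session,
                "insert into dm_dbo.wf_history_s " +
                "(r_object_id, event_name, time_stamp, user_name, audited_obj_id, string_4, workflow_id, string_3) " +
                "SELECT '0000000000000000' as r_object_id, task_name as event_name, date_sent as time_stamp, " +
                "sent_by as user_name, r_object_id as audited_obj_id, name as string_4, " +
                "router_id as workflow_id, task_state as string_3 " +
                "FROM dmi_queue_item " +
                "WHERE r_object_id not in (select audited_obj_id from dm_dbo.wf_history_s) " +
                "AND router_id != '0000000000000000' " +
                "AND date_sent < DATEADD(Day, -" + m_cutoff + ", date(today)) " +
                "AND delete_flag = 1");

            // (b) Delete the same queue items now that they are archived.
            runDql(session,
                "DELETE dmi_queue_item objects " +
                "WHERE router_id != '0000000000000000' " +
                "AND date_sent < DATEADD(Day, -" + m_cutoff + ", date(today)) " +
                "AND delete_flag = 1");

            session.commitTrans();
        } catch (Exception e) {
            session.abortTrans();
            throw e;
        }
    }

    private void runDql(IDfSession session, String dql) throws Exception {
        IDfQuery query = new DfQuery();
        query.setDQL(dql);
        IDfCollection result = query.execute(session, IDfQuery.DF_EXEC_QUERY);
        if (result != null) {
            result.close(); // release the collection promptly
        }
    }
}

Running the insert and delete inside one transaction is the design point: if writing to wf_history_s fails, the queue items are left untouched.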
Third, create the custom queue management job.
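Jobs are typically created and scheduled through Documentum Administrator, often by copying an existing job such as dm_QueueMgt. For illustration only, a minimal DQL sketch is below; the object_name and method_name values are assumptions, and the run_mode/run_interval codes (intended here as "every 1 day") should be verified against the dm_job documentation for your Content Server version:

CREATE dm_job OBJECT
SET object_name = 'custom_wf_queue_mgt',
SET method_name = 'custom_wf_queue_mgt_method',
SET start_date = DATE('01/01/2020 02:00:00'),
SET run_mode = 3,
SET run_interval = 1,
SET is_inactive = FALSE

The method_name must match the dm_method object that launches the custom class from the previous step.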
1 comment:
I think it's important to also include a step to see whether the dm_audittrail table could meet those needs as well. That said, the counter-argument to this solution would be that it is easier to audit the needed workflow events and use the dm_audittrail table instead.
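For comparison, if workflow auditing were enabled, the history could be read straight from the audit trail. A hedged sketch follows; dm_startworkflow and dm_completedworkitem are standard workflow audit events, but which events are actually recorded depends on what auditing is registered in the repository:

SELECT event_name, time_stamp, user_name, audited_obj_id, workflow_id
FROM dm_audittrail
WHERE event_name IN ('dm_startworkflow', 'dm_completedworkitem')
ORDER BY time_stamp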