Tuesday, August 26, 2008

Oracle Streams – The ABC’s of a Heartbeat table

-----------------------------------------------------------------------------------------------------------------------------------------------------------------------

NOTE: The following posting was written in collaboration with Frank Ortega. Thanks, Frank, for reviewing my work.

-----------------------------------------------------------------------------------------------------------------------------------------------------------------------


What is a heartbeat table and what is it used for?

A heartbeat table is especially useful for databases that experience periods of low replication activity. It ensures that the replication environment keeps working even when there are only a few replicated changes, and it gives the DBA a quick way to verify that Streams is functioning properly: simply query the date column of the heartbeat table on the destination and confirm that it is current.
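For example (a minimal sketch, assuming the STREAMADM.HEARTBEAT table created below, whose column B holds the last update time):

On the destination:

SQL> select max(b) from streamadm.heartbeat;

If the returned date is recent, changes are flowing all the way through.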

By default, an Oracle Streams capture process requests a Streams checkpoint after every 10 MB of generated redo. However, the Streams metadata is only maintained at that checkpoint if the intervening transactions contain data relevant to Streams capture. Implementing a heartbeat table ensures that active Streams transactions occur regularly on the source database, ensuring the Streams metadata is updated at each checkpoint.


The end result is that CAPTURED_SCN in the Streams metadata (DBA_CAPTURE) is maintained and stays current with the redo being generated, even during typical periods of low replication activity. This ensures the capture process will not fail on database startup after a quiet period because an archived log containing an old SCN required for capture is missing.
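You can watch this yourself on the source with the standard dictionary views (a quick sketch; the columns shown are from the documented DBA_CAPTURE and V$DATABASE views):

SQL> select capture_name, status, captured_scn, applied_scn, required_checkpoint_scn from dba_capture;

SQL> select current_scn from v$database;

With the heartbeat in place, CAPTURED_SCN should keep advancing toward CURRENT_SCN.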


What is involved in doing this?

A.- Create a table in your source database that includes a date or timestamp column. Enable supplemental logging for the table and prepare it for instantiation at the source. Export the table from the source and import it at the destinations with the instantiation option.

B.- Add a rule to capture changes for the heartbeat table at the source. Propagate the changes to the destination.


C.- Make sure that the target destination will apply changes to this table as well by adding a new apply rule.

D.- Set up an automated job to update this table at the source site periodically.



Let's do it

Assumptions: A Streams environment already exists. A capture process (CAPTURE_STREAM) exists on the source and an apply process (APPLY_STREAM) exists on the destinations. Queues (STREAMS_QUEUE) already exist on both the source and the destinations. Table-level replication is occurring. STREAMADM is the Streams administrator account on the source and destinations and, for the examples below, is assumed to own the heartbeat table. The capture and apply processes are stopped before adding rules and restarted when you are done.
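Before starting, you can confirm these pieces exist and check their status (a simple sketch using the standard Streams dictionary views):

On the source:

SQL> select capture_name, queue_name, status from dba_capture;

On the destinations:

SQL> select apply_name, queue_name, status from dba_apply;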


(A)

On the source (connected as STREAMADM, the assumed owner of the heartbeat table):

SQL> create table heartbeat (a number primary key, b date);

SQL> alter table heartbeat add supplemental log data (all) columns;

Prepare the table for instantiation, then export it:

SQL> exec dbms_capture_adm.prepare_table_instantiation(table_name => 'STREAMADM.HEARTBEAT');

exp USERID=streamadm/<password> FILE=strmuser.dmp TABLES=HEARTBEAT OBJECT_CONSISTENT=y ROWS=y

On the destinations:

imp USERID=streamadm/<password> FILE=strmuser.dmp TABLES=HEARTBEAT COMMIT=y LOG=import.log STREAMS_INSTANTIATION=y GRANTS=n
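To confirm the instantiation SCN was recorded at the destination (a quick check, assuming the STREAMADM schema used above):

SQL> select source_object_owner, source_object_name, instantiation_scn from dba_apply_instantiated_objects where source_object_name = 'HEARTBEAT';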

(B)

On the source:

BEGIN
  DBMS_STREAMS_ADM.ADD_TABLE_RULES(
    table_name      => 'STREAMADM.HEARTBEAT',
    streams_type    => 'CAPTURE',
    streams_name    => 'CAPTURE_STREAM',
    queue_name      => 'STREAMADM.STREAMS_QUEUE',
    include_dml     => true,
    include_ddl     => false,
    source_database => 'SOURCEDB');
END;
/

BEGIN
  DBMS_STREAMS_ADM.ADD_TABLE_PROPAGATION_RULES(
    table_name             => 'STREAMADM.HEARTBEAT',
    streams_name           => 'CAPTURE_STREAM',
    source_queue_name      => 'STREAMADM.STREAMS_QUEUE',
    destination_queue_name => 'STREAMADM.STREAMS_QUEUE@DESTDB',
    include_dml            => true,
    include_ddl            => false,
    source_database        => 'SOURCEDB');
END;
/

Note that for ADD_TABLE_PROPAGATION_RULES, streams_name identifies the propagation. If your environment already has a propagation between these queues, use its name here instead.
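One way to confirm the capture and propagation rules were created on the source is the standard DBA_STREAMS_TABLE_RULES view:

SQL> select streams_type, streams_name, table_owner, table_name, rule_name from dba_streams_table_rules where table_name = 'HEARTBEAT';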

(C)

On the destinations:

BEGIN
  DBMS_STREAMS_ADM.ADD_TABLE_RULES(
    table_name      => 'STREAMADM.HEARTBEAT',
    streams_type    => 'APPLY',
    streams_name    => 'APPLY_STREAM',
    queue_name      => 'STREAMADM.STREAMS_QUEUE',
    include_dml     => true,
    include_ddl     => false,
    source_database => 'SOURCEDB');
END;
/
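Per the assumptions above, the capture and apply processes were stopped while adding these rules; restart them now:

On the source:

SQL> exec dbms_capture_adm.start_capture(capture_name => 'CAPTURE_STREAM');

On the destinations:

SQL> exec dbms_apply_adm.start_apply(apply_name => 'APPLY_STREAM');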

(D)

On the source:

SQL> create sequence temp_seq start with 1;

SQL> variable jobno number;
SQL> begin
  dbms_job.submit(:jobno,
    'insert into heartbeat values (temp_seq.nextval, sysdate);',
    sysdate,
    'sysdate + 60/(60*60*24)');
  commit;
end;
/

The interval 'sysdate + 60/(60*60*24)' schedules the job to run once every 60 seconds.
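To verify the job is scheduled and firing, as the job owner on the source:

SQL> select job, what, last_date, next_date from user_jobs;

Then, on the destination, gauge how far behind the apply side is (again assuming the STREAMADM schema used throughout):

SQL> select max(b) as last_heartbeat, round((sysdate - max(b)) * 24 * 60 * 60) as seconds_behind from streamadm.heartbeat;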

