In the last blog, I showed you how to create a standby database through the GUI. We created an exact copy of the Primary database, which runs on filesystems, as the Standby database, which also runs on filesystems. The golden rule for ensuring the configuration works is to first use the existing DDC (Dbvisit Database Configuration) to manually send and apply logs.
Once that manual send and apply succeeds, we can be fully confident in scheduling the shipping and applying of the logs according to our company’s Recovery Time Objective (RTO) and Recovery Point Objective (RPO).
There are two options for scheduling Standby™️ to ship and apply archived logs, or, as many people would say, to “auto sync”.
The first method is via cron in Linux or the Task Scheduler in Windows (a rough cron sketch follows below).
The second method is to run the Standby™️ Daemon process in the background.
We are going to look at option 2.
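Before moving on, here is roughly what option 1 looks like. This is a minimal sketch, assuming the dbvctl utility is installed under /usr/dbvisit/standby and the DDC is named prod (both hypothetical, for illustration only); running dbvctl against the DDC on the Primary ships the logs, and the same command on the Standby applies them:

```
# Primary server crontab: ship archived logs every 10 minutes (interval is an example)
*/10 * * * * /usr/dbvisit/standby/dbvctl -d prod >> /tmp/dbvisit_send.log 2>&1

# Standby server crontab: apply the received logs on the same schedule
*/10 * * * * /usr/dbvisit/standby/dbvctl -d prod >> /tmp/dbvisit_apply.log 2>&1
```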
We recommend you review the parameters related to the Daemon process, which start with DMN_ in the DDC file, specifically the Interval and Monitor_Interval parameters. There is also a blackout feature which helps to pause the sending and applying of archived logs for a particular period of time. The Daemon background process writes to a logfile in the DBVISIT_BASE/standby/log directory with the format dbvisitd_<DDC>.log.
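As a sketch of what those entries can look like in the DDC file (the parameter names below follow the DMN_ prefix mentioned above, and the values are example intervals in seconds; confirm the exact names and defaults in your own DDC file):

```
# Daemon-related settings in the DDC file (example values)
DMN_DBVISIT_INTERVAL = 600      # seconds between automatic log send/apply runs
DMN_MONITOR_INTERVAL = 300      # seconds between daemon monitoring checks
```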
The Daemon background process can be run from the command line on both Linux and Windows. On Linux, the background process is started using the -D option, and on Windows it runs as a Windows Service.
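On Linux the control commands look roughly like this; a minimal sketch, again assuming the dbvctl utility and a DDC named prod, with start/stop/status as the subcommands my installation accepts (verify against your version’s help output):

```
# Start the Daemon for the "prod" DDC (run on both the primary and standby servers)
./dbvctl -d prod -D start

# Check that the Daemon is running
./dbvctl -d prod -D status

# Stop the Daemon
./dbvctl -d prod -D stop

# Follow the Daemon's logfile mentioned above
tail -f $DBVISIT_BASE/standby/log/dbvisitd_prod.log
```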
In the video below we are going to use the Central Console (GUI) to start and run the Daemon process:
Note that my configuration is running on Linux; the Daemon control options are somewhat different on Windows. As you can see, it is super easy to configure and start the Daemon process to take care of the automatic sending and applying of archived logs for your configuration. The Log Gap Report can be used to show whether the required time lag is achieved and maintained according to company requirements, ensuring the Primary and Standby databases stay in sync.
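If you prefer the command line to the Central Console, the Log Gap Report can also be generated with dbvctl. A sketch, assuming the same hypothetical prod DDC and that the -i option produces the report, as it does on my installation (check your version if in doubt):

```
# Generate the Log Gap Report for the "prod" DDC (run on the primary server)
./dbvctl -d prod -i
```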