Falcon provides a feature to replicate Hive metadata and data events from a source cluster to a destination cluster. This is supported for both secure and unsecure clusters through Falcon Recipes.
The following are the prerequisites for using Hive DR:
Note: Set the following properties in hive-site.xml on both the source and destination Hive clusters to enable replication of Hive events:
<property>
    <name>hive.metastore.event.listeners</name>
    <value>org.apache.hive.hcatalog.listener.DbNotificationListener</value>
    <description>event listeners that are notified of any metastore changes</description>
</property>
<property>
    <name>hive.metastore.dml.events</name>
    <value>true</value>
</property>
Perform an initial bootstrap of the tables and databases from the source cluster to the destination cluster.
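The bootstrap itself happens outside Falcon. One common approach is a sketch like the following, using Hive's EXPORT/IMPORT statements together with DistCp; the database name, table name, and HDFS paths here are illustrative, not part of the recipe:

```sql
-- On the source cluster: export the table to a staging directory
-- ("sales_db", "global_sales", and the paths are hypothetical examples)
EXPORT TABLE sales_db.global_sales TO '/apps/hive/staging/global_sales';

-- Copy the exported data to the destination cluster, e.g. with DistCp:
--   hadoop distcp hdfs://source-nn:8020/apps/hive/staging/global_sales \
--                 hdfs://dest-nn:8020/apps/hive/staging/global_sales

-- On the destination cluster: import from the copied staging directory
IMPORT TABLE sales_db.global_sales FROM '/apps/hive/staging/global_sales';
```

Once the destination cluster holds the bootstrapped copy, Falcon's Hive DR recipe only has to replicate incremental metastore events.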
Submit the source and destination cluster entities to Falcon:

$FALCON_HOME/bin/falcon entity -submit -type cluster -file /cluster/definition.xml
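A minimal cluster definition might look like the sketch below. The host names, ports, version numbers, and paths are placeholders for illustration; the registry interface, which points at the Hive metastore, is the one Hive DR depends on:

```xml
<cluster colo="primary-colo" description="Source cluster" name="primaryCluster"
         xmlns="uri:falcon:cluster:0.1">
    <interfaces>
        <interface type="readonly" endpoint="hftp://source-nn:50070" version="2.2.0"/>
        <interface type="write" endpoint="hdfs://source-nn:8020" version="2.2.0"/>
        <interface type="execute" endpoint="source-rm:8050" version="2.2.0"/>
        <interface type="workflow" endpoint="http://source-oozie:11000/oozie/" version="4.0.0"/>
        <interface type="registry" endpoint="thrift://source-metastore:9083" version="0.13.0"/>
        <interface type="messaging" endpoint="tcp://source-mq:61616?daemon=true" version="5.1.6"/>
    </interfaces>
    <locations>
        <location name="staging" path="/apps/falcon/primaryCluster/staging"/>
        <location name="temp" path="/tmp"/>
        <location name="working" path="/apps/falcon/primaryCluster/working"/>
    </locations>
</cluster>
```

A matching definition is submitted for the destination cluster as well, so that the recipe can reference both by name.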
Copy the Hive DR recipe properties, workflow, and template files from $FALCON_HOME/data-mirroring/hive-disaster-recovery to an accessible directory, or to the recipe directory path (falcon.recipe.path=<recipe directory path>). "falcon.recipe.path" must be specified in the Falcon client.properties file. Then update the copied recipe properties file with the attributes required to replicate metadata and data from the source cluster to the destination cluster.
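A sketch of the two files involved is shown below. The falcon.recipe.path key is the one named above; the recipe property keys and values are illustrative only, so consult the template shipped under $FALCON_HOME/data-mirroring/hive-disaster-recovery for the exact keys the recipe expects:

```properties
# $FALCON_HOME/conf/client.properties
falcon.recipe.path=/apps/falcon/recipes/hive-disaster-recovery

# hive-disaster-recovery.properties (copied from the data-mirroring directory)
# Hypothetical attribute names and values for illustration:
sourceCluster=primaryCluster
targetCluster=backupCluster
sourceMetastoreUri=thrift://source-metastore:9083
targetMetastoreUri=thrift://dest-metastore:9083
sourceDatabase=sales_db
sourceTable=global_sales
```

The cluster names used in the properties file must match the names of the cluster entities already submitted to Falcon.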
After the recipe properties file has been updated with the required attributes, either in an accessible directory or in falcon.recipe.path, there are two ways to submit the Hive DR recipe:
1. Pass the properties file explicitly on the command line:

$FALCON_HOME/bin/falcon recipe -name hive-disaster-recovery -operation HIVE_DISASTER_RECOVERY -properties /cluster/hive-disaster-recovery.properties
2. Omit the -properties option and let Falcon pick up the recipe files from falcon.recipe.path:

$FALCON_HOME/bin/falcon recipe -name hive-disaster-recovery -operation HIVE_DISASTER_RECOVERY
Note: