Hive Disaster Recovery

Overview

Falcon provides a feature to replicate Hive metadata and data events from a source cluster to a destination cluster. This is supported for both secure and unsecured clusters through Falcon Recipes.

Prerequisites

The following are the prerequisites to use Hive DR:

  • Hive 1.2.0+
  • Oozie 4.2.0+

Note: Set the following properties in hive-site.xml on both the source and destination Hive clusters to enable replication of Hive events:

    <property>
        <name>hive.metastore.event.listeners</name>
        <value>org.apache.hive.hcatalog.listener.DbNotificationListener</value>
        <description>event listeners that are notified of any metastore changes</description>
    </property>

    <property>
        <name>hive.metastore.dml.events</name>
        <value>true</value>
    </property>

Usage

Bootstrap

Perform an initial bootstrap of tables and databases from the source cluster to the destination cluster.

  • Database Bootstrap
For bootstrapping DB replication, the destination DB must first be created. This step is expected, since DB replication definitions can be set up by users only on pre-existing DBs. Second, export all tables in the source DB and import them into the destination DB, as described in Table Bootstrap.

  • Table Bootstrap
For bootstrapping table replication, after having turned on the DbNotificationListener on the source DB, perform an export of the table, distcp the export over to the destination warehouse, and do an import over there (see the sketch after this list). Refer to the Hive Export-Import documentation for syntax details and examples. This sets up the destination table so that subsequent events on the source cluster that modify the table are replicated.
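
As an illustration, a minimal table bootstrap could look like the following; the database name, table name, staging paths and NameNode addresses are placeholders, not values shipped with Falcon. On the source cluster, export the table from Hive:

    USE source_db;
    EXPORT TABLE table_name TO '/staging/table_name';

Copy the exported data and metadata over to the destination warehouse, for example with distcp:

    hadoop distcp hdfs://source-nn:8020/staging/table_name hdfs://destination-nn:8020/staging/table_name

Then import it on the destination cluster:

    USE destination_db;
    IMPORT TABLE table_name FROM '/staging/table_name';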

Setup cluster definition

    $FALCON_HOME/bin/falcon entity -submit -type cluster -file /cluster/definition.xml
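
For reference, a Falcon cluster entity used for Hive DR typically declares a registry interface pointing at the Hive metastore, in addition to the usual readonly, write, execute, workflow and messaging interfaces. The sketch below is illustrative only; all names, endpoints and versions are placeholders for your environment:

    <?xml version="1.0" encoding="UTF-8"?>
    <cluster colo="primary-colo" description="Source cluster" name="primaryCluster" xmlns="uri:falcon:cluster:0.1">
        <interfaces>
            <!-- Endpoints below are placeholders; point them at your own services -->
            <interface type="readonly" endpoint="hftp://source-nn:50070" version="2.7.0"/>
            <interface type="write" endpoint="hdfs://source-nn:8020" version="2.7.0"/>
            <interface type="execute" endpoint="source-rm:8050" version="2.7.0"/>
            <interface type="workflow" endpoint="http://source-oozie:11000/oozie/" version="4.2.0"/>
            <interface type="messaging" endpoint="tcp://source-activemq:61616?daemon=true" version="5.1.6"/>
            <!-- Hive metastore endpoint, required for Hive replication -->
            <interface type="registry" endpoint="thrift://source-metastore:9083" version="1.2.0"/>
        </interfaces>
        <locations>
            <location name="staging" path="/apps/falcon/primaryCluster/staging"/>
            <location name="temp" path="/tmp"/>
            <location name="working" path="/apps/falcon/primaryCluster/working"/>
        </locations>
    </cluster>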
   

Update recipe properties

Copy the Hive DR recipe properties, workflow and template files from $FALCON_HOME/data-mirroring/hive-disaster-recovery to an accessible directory or to the recipe directory (falcon.recipe.path=<recipe directory path>). "falcon.recipe.path" must be specified in the Falcon client.properties configuration. Then update the copied recipe properties file with the attributes required to replicate metadata and data from the source cluster to the destination cluster for Hive DR.
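
For example, if the recipe files are copied to a directory of your choosing, the Falcon client configuration entry could look like this (the path below is illustrative):

    # client.properties (Falcon client configuration)
    falcon.recipe.path=/apps/falcon/recipes/hive-disaster-recovery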

Submit Hive DR recipe

After updating the recipe properties file with the required attributes, either in an accessible directory or in falcon.recipe.path, there are two ways to submit the Hive DR recipe:

  • 1. Specify the Falcon recipe properties file on the recipe command line.
       $FALCON_HOME/bin/falcon recipe -name hive-disaster-recovery -operation HIVE_DISASTER_RECOVERY
       -properties /cluster/hive-disaster-recovery.properties
   

  • 2. Use the Falcon recipe path specified in the Falcon conf client.properties.
       $FALCON_HOME/bin/falcon recipe -name hive-disaster-recovery -operation HIVE_DISASTER_RECOVERY
   

Note:

  • The recipe properties, workflow and template file names must match the recipe name; the name must be unique, and all the files must be in the same directory.
  • If Kerberos security is enabled on the cluster, use the secure templates for Hive DR from $FALCON_HOME/data-mirroring/hive-disaster-recovery.