Configuring HDFS for Digital Event Persistence
Before you can store events with Digital Event Persistence using HDFS as the storage engine, you must configure the Hadoop cluster by copying the custom Hive SerDe and Joda Date/Time libraries from your Digital Event Persistence installation to your HDFS CDH 5.3.0 distribution.
To configure HDFS CDH 5.3.0 as the storage engine for Digital Event Persistence:
1. In your Software AG installation, locate the Joda Date/Time and the Digital Event Persistence Hive SerDe .jar files:
* joda-time_2.9.3.jar - available in the Software AG_directory\common\runtime\bundles\platform\eclipse\plugins directory.
* com.softwareag.evp.hive.serde_10.1.0.0000-nnnn.jar - available in the Software AG_directory\common\runtime\bundles\evs\eclipse\plugins directory, where nnnn is the build number of your Digital Event Persistence installation.
2. Copy both files to the Hive library directory on all nodes in the Hadoop cluster where Hive is running, for example CDH5.3.0_directory/var/lib/hive/lib.
3. Copy both files to the YARN library directory on all data nodes in the Hadoop cluster, for example CDH5.3.0_directory/var/lib/hadoop-yarn/lib, or CDH5.3.0_directory/var/lib/hadoop-mapreduce/lib if you are using MapReduce MRv1. A scripted example of the copy steps follows this procedure.
4. Restart Hive.
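If you manage many cluster nodes, you can script steps 2 and 3. The following Python sketch distributes the two .jar files over SSH with scp. The Software AG installation path, the node host names, and the nnnn build-number placeholder are assumptions for illustration only; replace them with the values for your environment.

# Minimal sketch: distribute the Hive SerDe and Joda Date/Time jars to the cluster nodes.
# Paths and host names below are placeholders, not defaults from the product.
import subprocess

SAG_BUNDLES = "/opt/softwareag/common/runtime/bundles"  # assumed Software AG_directory layout
JARS = [
    f"{SAG_BUNDLES}/platform/eclipse/plugins/joda-time_2.9.3.jar",
    # Replace nnnn with the build number of your Digital Event Persistence installation.
    f"{SAG_BUNDLES}/evs/eclipse/plugins/com.softwareag.evp.hive.serde_10.1.0.0000-nnnn.jar",
]

HIVE_NODES = ["hive-node1"]                # nodes where Hive is running (placeholder names)
DATA_NODES = ["data-node1", "data-node2"]  # YARN data nodes (placeholder names)
HIVE_LIB = "/var/lib/hive/lib"             # Hive library directory (step 2)
YARN_LIB = "/var/lib/hadoop-yarn/lib"      # use /var/lib/hadoop-mapreduce/lib for MapReduce MRv1 (step 3)

def copy_jars(nodes, target_dir):
    """Copy each jar to target_dir on every node via scp (assumes passwordless SSH)."""
    for node in nodes:
        for jar in JARS:
            subprocess.run(["scp", jar, f"{node}:{target_dir}"], check=True)

copy_jars(HIVE_NODES, HIVE_LIB)
copy_jars(DATA_NODES, YARN_LIB)
# After copying, restart Hive on the Hive nodes (step 4), for example through your cluster manager.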