Using Event Persistence with HDFS
Before you can store events with Event Persistence using HDFS as the storage engine, you must configure the Hadoop cluster by deploying the custom Event Persistence Hive SerDe and the Joda Date/Time libraries from your Event Persistence installation to your Cloudera CDH 5.3 Hadoop distribution.
To configure Hadoop HDFS as the storage engine for Event Persistence
1. In your Software AG installation, locate the Joda Date/Time and the custom Event Persistence Hive SerDe .jar files:
*joda-time_1.6.2.jar - available in the Software AG_directory\common\runtime\bundles\platform\eclipse\plugins directory.
*com.softwareag.evp.hive.serde_9.10.0.0000-nnnn.jar - available in the Software AG_directory\common\runtime\bundles\evs\eclipse\plugins directory,
where nnnn is the build number of your Event Persistence installation.
2. Copy both files to the Hive library directory on all nodes in the Hadoop cluster where Hive is running, for example, HDFS_directory/lib/hive/lib (see the sample commands after this procedure).
3. Copy both files to the YARN library directory on all data nodes in the Hadoop cluster, for example, HDFS_directory/lib/hadoop-yarn/lib, or HDFS_directory/lib/hadoop-mapreduce/lib if you are using MRv1.
4. Restart the Hive services so that they load the new libraries.
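The following shell sketch illustrates steps 2 through 4 on a small Linux cluster. The hostnames (hive01, data01, data02), the Software AG installation path (/opt/softwareag), the CDH library directories, and the hive-metastore and hive-server2 service names are assumptions for illustration only; substitute the values, node lists, and SerDe build number used in your environment.

  # Locate the two libraries in the Software AG installation (step 1).
  SAG_DIR=/opt/softwareag
  JODA_JAR="$SAG_DIR/common/runtime/bundles/platform/eclipse/plugins/joda-time_1.6.2.jar"
  SERDE_JAR=$(ls "$SAG_DIR"/common/runtime/bundles/evs/eclipse/plugins/com.softwareag.evp.hive.serde_9.10.0.0000-*.jar)

  # Copy both files to the Hive library directory on every node where Hive runs (step 2).
  for node in hive01; do
    scp "$JODA_JAR" "$SERDE_JAR" "root@$node:/usr/lib/hive/lib/"
  done

  # Copy both files to the YARN library directory on every data node (step 3);
  # use /usr/lib/hadoop-mapreduce/lib instead if the cluster runs MRv1.
  for node in data01 data02; do
    scp "$JODA_JAR" "$SERDE_JAR" "root@$node:/usr/lib/hadoop-yarn/lib/"
  done

  # Restart the Hive services so they pick up the new libraries (step 4).
  # Service names assume a package-based CDH installation.
  ssh root@hive01 "service hive-metastore restart && service hive-server2 restart"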
