Software AG Products 10.7 | webMethods CloudStreams Documentation
 
Detection and processing of duplicate Salesforce events
If you select ALL as the replay option, Salesforce resends all events stored within the last 24-hour window, so you may receive events that were already delivered. These are called duplicate events. CloudStreams provides a duplicate detection mechanism, enabled by default, that checks whether an event is a duplicate based on the event extractor information defined as part of the event definition.
The duplicate detection mechanism is offered on a best-effort basis. CloudStreams internally uses persistent caches to perform duplicate detection, so you must estimate the maximum load on the system and tune the various cache configurations accordingly. For reliable duplicate detection, size the cache so that it can accommodate the peak event volume.
For example, if you expect a maximum of 10000 events per day from Salesforce and select the ALL replay option, configure the maxElementsInMemory property of the ListenerEventsCache cache to be well above 10000. Set this configuration by editing the SoftwareAG-IS-CloudStreams.xml cache file available at <IS-installation-dir>\IntegrationServer\instances\default\packages\WmCloudStreams\config\resources\caching.
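As an illustration, the cache entry in SoftwareAG-IS-CloudStreams.xml might be adjusted as in the following sketch. This assumes the Ehcache 2.x XML schema; the attribute values other than maxElementsInMemory are placeholders, and any other attributes already present in the shipped file should be left unchanged.

```xml
<!-- Hedged sketch: only maxElementsInMemory is the setting discussed here.
     The other attributes shown are illustrative Ehcache 2.x defaults. -->
<cache name="ListenerEventsCache"
       maxElementsInMemory="15000"
       eternal="false"
       timeToLiveSeconds="86400"
       overflowToDisk="true"/>
```

Here 15000 comfortably exceeds the expected 10000 daily events, and timeToLiveSeconds of 86400 mirrors the 24-hour replay window.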
For more information about Ehcache configuration options and cache sizing, see the Ehcache documentation at http://www.ehcache.org/documentation/.
If the cache is not sized for this load and duplicate Salesforce events go undetected, you must implement custom handling to prevent those duplicate events from being processed.
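One possible shape for such custom handling is a small in-memory filter that remembers the replay IDs of recently processed events and rejects any ID seen before. The following is a minimal sketch, not a CloudStreams API: the class name, the use of the Salesforce replay ID as the deduplication key, and the capacity value are all assumptions you would adapt to your own event definitions.

```java
import java.util.LinkedHashMap;
import java.util.Map;

/**
 * Hypothetical custom duplicate filter (not part of CloudStreams).
 * Remembers the last `capacity` Salesforce event replay IDs in LRU
 * order and flags any ID that has already been processed.
 */
public class DuplicateEventFilter {

    private final Map<Long, Boolean> seen;

    public DuplicateEventFilter(final int capacity) {
        // accessOrder=true gives LRU eviction once capacity is exceeded.
        this.seen = new LinkedHashMap<Long, Boolean>(capacity, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<Long, Boolean> eldest) {
                return size() > capacity;
            }
        };
    }

    /**
     * Records the replay ID and returns true if it was already present,
     * i.e. the event is a duplicate and should be skipped.
     */
    public synchronized boolean isDuplicate(long replayId) {
        return seen.put(replayId, Boolean.TRUE) != null;
    }
}
```

A service that consumes Salesforce events would call isDuplicate with each event's replay ID before processing it. Because the filter is in-memory, it complements rather than replaces the persistent-cache mechanism described above; size its capacity the same way you size ListenerEventsCache.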