The value of a cardinality measure is the number of distinct values (that is, the maximum number of possible steps) of the referenced text dimension at the specified step width (level). A cardinality measure can be defined for one-level, two-level, and n-level dimensions and is defined by the following XML element in the measure configuration file:
...
<crdkidef name="..." dimreferring="...">
  <description language="de" name="..."/>
  <description language="en" name="..."/>
  <refdim name="..." refinement="..."/>
</crdkidef>
...
| XML tag | Description |
| --- | --- |
| name | Internal name of the measure. Referenced in the useki XML tag in the process tree definition. |
| dimreferring | Type of dimension reference. LOOSE: Loose reference. STRICT: Strict reference. Default value: LOOSE |
| refdim | The name XML attribute specifies the name of the dimension to which the calculated cardinality relates; the refinement XML attribute specifies the step width (level) at which the values are counted. One-level dimension: The default value is the default step width of the referenced dimension. |
| kigroup | Measure group to which the measure is assigned. |
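For orientation, here is a minimal sketch of a filled-in definition. The measure name CRD_MATERIAL, the description texts, and the dimension name MATERIAL are placeholders, not part of the standard configuration, and the refinement attribute is omitted on the assumption that the default step width of the referenced one-level dimension applies:

...
<crdkidef name="CRD_MATERIAL" dimreferring="LOOSE">
  <!-- Display names of the measure in German and English -->
  <description language="de" name="Anzahl Materialien"/>
  <description language="en" name="Number of materials"/>
  <!-- Text dimension whose number of distinct values is counted -->
  <refdim name="MATERIAL"/>
</crdkidef>
...

In the process tree definition, the measure would then be referenced by its internal name (here CRD_MATERIAL) in the useki XML tag, as noted for the name attribute above.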
Apart from the measure value itself, only the ranking, previous periods, and planned values can be determined for cardinality measures. Statistical evaluations (minimum, maximum, total, and standard deviation) cannot be displayed. Cardinality measures cannot be used as a dimension, and no filters can be specified for them.
Any additional dimension values resulting from the import of process instance-independent measures are not included in the calculation of cardinality measures. The cardinality of a dimension that is used exclusively by process instance-independent measures is always 0.