What is a normal distribution?

Input

mu = mean value (expected value) and sigma = standard deviation, both specified in days, hours, minutes, and seconds (format 0000:00:00:00)

For each process instance, a duration drawn from the normal distribution is output. Rule of thumb: about two-thirds (≈68%) of all output time spans fall in the range mu - sigma <= time <= mu + sigma (that is, within one standard deviation of the mean).

Example

If you specify two hours as the mean processing time of a function and one hour as the standard deviation, each processing time of the function will lie between one and three hours with a probability of about 68%.
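The example above can be sketched in Python. This is an illustrative simulation, not the product's implementation: durations are drawn with the standard library's `random.gauss`, times are converted to seconds for simplicity, and the function name `sample_duration` is an assumption. Over a large sample, roughly 68% of the values fall within one standard deviation of the mean.

```python
import random

def sample_duration(mu_seconds: float, sigma_seconds: float) -> float:
    """Draw one duration (in seconds) from a normal distribution.

    Note: a normal sample can be negative; a real simulator would
    need to clamp or redraw such values for durations.
    """
    return random.gauss(mu_seconds, sigma_seconds)

# Values from the example: mean 2 hours, standard deviation 1 hour.
mu, sigma = 2 * 3600.0, 1 * 3600.0

random.seed(0)  # fixed seed so the run is reproducible
samples = [sample_duration(mu, sigma) for _ in range(100_000)]

# Fraction of samples in mu - sigma <= time <= mu + sigma: close to 0.68.
within = sum(mu - sigma <= s <= mu + sigma for s in samples) / len(samples)
print(f"Fraction within one sigma (1 to 3 hours): {within:.3f}")
```

Running this prints a fraction near 0.68, matching the rule of thumb stated above.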