pysys.utils.perfreporter module

class pysys.utils.perfreporter.CSVPerformanceFile(contents)[source]

Bases: object

Object to hold the model for a CSV performance file.

If this file contains aggregated results, the number of “samples” may be greater than 1 and the “value” will specify the mean result.

Variables:
  • runDetails (dictionary) – A dictionary containing (key, value) information about the whole test run
  • results (list) – List where each item is a dictionary containing information about a given result
  • COLUMNS (list) – Constant list of the columns in the performance output
  • RUN_DETAILS (string) – The constant prefix identifying information about the whole test run
  • RESULT_DETAILS (string) – The constant prefix identifying detailed information about a given result
COLUMNS = ['resultKey', 'testId', 'value', 'unit', 'biggerIsBetter', 'toleranceStdDevs', 'samples', 'stdDev']
RESULT_DETAILS = '#resultDetails:#'
RUN_DETAILS = '#runDetails:#'
__init__(contents)[source]

Construct an instance of the CSV performance file class.

Parameters: contents – a string containing the contents of the file (can be empty)
static aggregate(files)[source]

Aggregate a list of performance file objects into a single performance file object.

Takes a list of one or more CSVPerformanceFile objects and returns a single aggregated CSVPerformanceFile with a single row for each resultKey (with the “value” set to the mean if there are multiple results with that key, and the stdDev also set appropriately).

Parameters: files – the list of performance file objects to aggregate
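
A minimal usage sketch (the file paths are hypothetical; each file's contents must first be parsed into a CSVPerformanceFile):

    from pysys.utils.perfreporter import CSVPerformanceFile

    files = []
    for path in ['perf_run1.csv', 'perf_run2.csv']:  # hypothetical paths
        with open(path) as f:
            files.append(CSVPerformanceFile(f.read()))

    # a single object with one row per resultKey, holding the mean value
    # and updated samples/stdDev for keys that occurred multiple times
    aggregated = CSVPerformanceFile.aggregate(files)
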
static toCSVLine(values)[source]

Convert a list or dictionary of input values into a CSV string.

Note that no new line character is returned in the CSV string. The input values can be either a list (any nested dictionaries are expanded into KEY=VALUE entries), or a dictionary (or OrderedDict) whose keys will be added in the same order as COLUMNS.

Parameters: values – the input list or dictionary
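
A hedged usage sketch (the values are illustrative only):

    from pysys.utils.perfreporter import CSVPerformanceFile

    line = CSVPerformanceFile.toCSVLine({
        'resultKey': 'HTTP GET latency',
        'testId': 'MyTest_001',
        'value': 0.0123,
        'unit': 's',
        'biggerIsBetter': False,
        'toleranceStdDevs': 2.0,
        'samples': 1,
        'stdDev': 0,
    })
    # no newline is appended, so add one explicitly when writing
    with open('perf.csv', 'a') as f:
        f.write(line + '\n')
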
class pysys.utils.perfreporter.CSVPerformanceReporter(project, summaryfile, testoutdir)[source]

Bases: object

Class for receiving performance results and writing them to a file for later analysis.

Each performance result consists of a value, a result key (which must be unique across all test cases and modes, and also stable across different runs), and a unit (which also encodes whether bigger values are better or worse). Each test can report any number of performance results.

There is usually a single shared instance of this class per invocation of PySys. It is possible to customize the way performance results are recorded by providing a subclass and specifying it in the project performancereporter element, for example to write data to an XML or JSON file instead of CSV. Performance reporter implementations are required to be thread-safe.

The standard CSV performance reporter implementation writes a file of comma-separated values that is both machine- and human-readable, and easy to view and use in any spreadsheet program. After the columns holding the information for each result, the file contains comma-separated key=value metadata about the entire run (e.g. hostname, date/time, etc) and, optionally, metadata associated with each individual test result (e.g. test mode etc). The per-run and per-result metadata is not arranged in columns since its structure differs from row to row.
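
Results typically reach the reporter from within a test case rather than by calling it directly; a minimal sketch, assuming the standard BaseTest helper method reportPerformanceResult (which delegates to the project's performance reporter):

    from pysys.constants import *
    from pysys.basetest import BaseTest
    from pysys.utils.perfreporter import PerformanceUnit

    class PySysTest(BaseTest):
        def execute(self):
            elapsed = 0.0123  # a measured value; measurement code omitted
            self.reportPerformanceResult(elapsed,
                'Time to process 1000 messages', PerformanceUnit.SECONDS)

        def validate(self):
            pass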

__init__(project, summaryfile, testoutdir)[source]

Construct an instance of the performance reporter.

Parameters:
  • project – The project configuration instance.
  • summaryfile – The filename pattern used for the summary file(s)
  • testoutdir – The output directory for this test run
cleanup()[source]

Called when PySys has finished executing tests.

formatResult(testobj, value, resultKey, unit, toleranceStdDevs, resultDetails)[source]

Return an object representing the specified arguments, which will be passed to recordResult to be written to the performance file(s).

Parameters:
  • testobj – the test case instance registering the value
  • value – the value to be reported
  • resultKey – a unique string that fully identifies what was measured
  • unit – identifies the unit the value is measured in
  • toleranceStdDevs – indicates how many standard deviations away from the mean a result must be to count as a regression
  • resultDetails – A dictionary of detailed information that should be recorded together with the result
getRunDetails()[source]

Return a dictionary of information about this test run (e.g. hostname, start time, etc).

Subclasses may wish to override this to add additional items such as version or build number.
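
A hedged sketch of such an override (the buildNumber key and environment variable are hypothetical):

    import os
    from pysys.utils.perfreporter import CSVPerformanceReporter

    class MyPerformanceReporter(CSVPerformanceReporter):
        def getRunDetails(self):
            details = super(MyPerformanceReporter, self).getRunDetails()
            # add an identifier for this run from the environment
            details['buildNumber'] = os.getenv('BUILD_NUMBER', 'local')
            return details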

getRunHeader()[source]

Return the header string for the CSV file.

getRunSummaryFile(testobj)[source]

Return the fully substituted location of the file to which summary performance results will be written.

This may include the following substitutions: @OUTDIR@ (the basename of the output directory for this run, e.g. “linux”), @HOSTNAME@, @DATE@, @TIME@, and @TESTID@. The default is ‘@OUTDIR@_@HOSTNAME@/perf_@DATE@_@TIME@.csv’. If the specified file does not exist it will be created; it is possible to use multiple summary files from the same run. The path will be resolved relative to the pysys project root directory unless an absolute path is specified.

Parameters: testobj – the test case instance registering the value
recordResult(formatted, testobj)[source]

Record results to the performance summary file.

Parameters:
  • formatted – the formatted string to write
  • testobj – object reference to the calling test
reportResult(testobj, value, resultKey, unit, toleranceStdDevs=None, resultDetails=None)[source]

Report a performance result, with an associated unique key that identifies it.

Parameters:
  • testobj – the test case instance registering the value
  • value – the value to be reported
  • resultKey – a unique string that fully identifies what was measured
  • unit – identifies the unit the value is measured in
  • toleranceStdDevs – indicates how many standard deviations away from the mean a result must be to count as a regression
  • resultDetails – A dictionary of detailed information that should be recorded together with the result
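
Continuing the earlier test sketch, a hedged example of the optional arguments (assuming the BaseTest helper accepts the same optional arguments as this method; the resultDetails keys are hypothetical):

    self.reportPerformanceResult(0.0123, 'HTTP GET latency',
        PerformanceUnit.SECONDS,
        toleranceStdDevs=2.0,           # flag results >2 std devs from the mean
        resultDetails={'mode': 'TLS'})  # recorded alongside this result
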
valueToDisplayString(value)[source]

Pretty-print an integer or float value to a moderate number of significant figures.

For large numbers, the method additionally inserts “,” grouping separators.

Parameters: value – the value to be displayed
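
A minimal sketch of comparable formatting (an illustration of the behaviour described above, not necessarily the library's exact algorithm):

    def value_to_display_string(value):
        # large numbers: truncate to an integer and add "," grouping
        if value > 1000:
            return '{0:,}'.format(int(value))
        # smaller numbers: a moderate number of significant figures
        return '%0.4g' % value

    print(value_to_display_string(1234567.89))  # -> 1,234,567
    print(value_to_display_string(0.0123456))   # -> 0.01235
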
class pysys.utils.perfreporter.PerformanceUnit(name, biggerIsBetter)[source]

Bases: object

Class which identifies the unit in which a performance result is measured.

Every unit encodes whether big numbers are better or worse (which can be used to calculate the improvement or regression when results are compared), e.g. better for throughput numbers, worse for time taken or latency numbers. For consistency, we recommend using the pre-defined units of SECONDS (e.g. for latency values) or PER_SECOND (e.g. throughput) where possible.

PER_SECOND = <pysys.utils.perfreporter.PerformanceUnit object>
SECONDS = <pysys.utils.perfreporter.PerformanceUnit object>
__init__(name, biggerIsBetter)[source]

Construct a performance unit.

Parameters:
  • name – the display name of the unit
  • biggerIsBetter – True if bigger values indicate better performance (e.g. throughput), False if they indicate worse (e.g. time taken)
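
For example (a minimal sketch; the custom unit is hypothetical):

    from pysys.utils.perfreporter import PerformanceUnit

    # the pre-defined units
    latency_unit = PerformanceUnit.SECONDS        # smaller is better
    throughput_unit = PerformanceUnit.PER_SECOND  # bigger is better

    # a custom unit for memory measurements, where smaller is better
    MEGABYTES = PerformanceUnit('MB', biggerIsBetter=False)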