pysys.utils.perfreporter¶
Performance number reporting classes, used by pysys.basetest.BaseTest.reportPerformanceResult.
The CSVPerformanceReporter can be used as-is or subclassed for alternative output formats.
PerformanceUnit¶
- class pysys.utils.perfreporter.PerformanceUnit(name, biggerIsBetter)[source]¶
Bases: object
Class which identifies the unit in which a performance result is measured.
Every unit encodes whether big numbers are better or worse (which can be used to calculate the improvement or regression when results are compared), e.g. better for throughput numbers, worse for time taken or latency numbers.
For consistency, we recommend using the pre-defined units where possible. For throughput numbers or rates, that means using PER_SECOND. For latency measurements, that means using SECONDS if long time periods of several seconds are expected, or NANO_SECONDS (=10**-9 seconds) if sub-second time periods are expected (since humans generally find numbers such as 1,234,000 ns easier to skim-read and compare than fractional numbers like 0.001234).
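To illustrate how the biggerIsBetter flag feeds into comparisons, here is a minimal sketch of computing a percentage improvement between a baseline and a new result. The constructor arguments match the class above, but the percent_improvement helper is a hypothetical illustration, not part of the PySys API:

```python
# Illustrative sketch; the improvement calculation is an assumption,
# not the actual PySys comparison logic.
class PerformanceUnit:
    def __init__(self, name, biggerIsBetter):
        self.name = name
        self.biggerIsBetter = biggerIsBetter

# Units analogous to the pre-defined ones described above.
PER_SECOND = PerformanceUnit('/s', biggerIsBetter=True)
SECONDS = PerformanceUnit('s', biggerIsBetter=False)

def percent_improvement(unit, baseline, new):
    """Positive result = improvement, negative = regression."""
    change = 100.0 * (new - baseline) / baseline
    # For time-taken/latency units, a bigger number is worse, so flip the sign.
    return change if unit.biggerIsBetter else -change

percent_improvement(PER_SECOND, 100, 120)  # +20.0: higher throughput is better
percent_improvement(SECONDS, 1.0, 1.2)     # negative: taking longer is a regression
```

The sign flip is the whole point of encoding biggerIsBetter in the unit: the same comparison code works for throughput and latency alike.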
CSVPerformanceReporter¶
- class pysys.utils.perfreporter.CSVPerformanceReporter(project, summaryfile, testoutdir, runner, **kwargs)[source]¶
Bases: object
Class for receiving performance results and writing them to a file for later analysis.
Each performance result consists of a value, a result key (which must be unique across all test cases and modes, and also stable across different runs), and a unit (which also encodes whether bigger values are better or worse). Each test can report any number of performance results.
There is usually a single shared instance of this class per invocation of PySys. It is possible to customize the way performance results are recorded by providing a subclass and specifying it in the project performancereporter element, for example to write data to an XML or JSON file instead of CSV. Performance reporter implementations are required to be thread-safe.
The standard CSV performance reporter implementation writes a UTF-8 file of comma-separated values that is both machine- and human-readable, and easy to view and use in any spreadsheet program. After the columns containing the information for each result, each row contains comma-separated metadata: key=value information about the entire run (e.g. hostname, date/time, etc), and (optionally) information associated with each individual test result (e.g. test mode etc). The per-run and per-result metadata are not arranged in columns, since the structure differs from row to row.
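To make the row layout concrete, here is a minimal sketch built with Python's standard csv module. The column values and the trailing KEY=VALUE metadata entries shown are illustrative assumptions; the real column set is defined by CSVPerformanceFile.COLUMNS:

```python
import csv
import io

# Sketch of one result row: fixed columns first, then per-result
# KEY=VALUE metadata (which is not arranged in columns).
buf = io.StringIO()
writer = csv.writer(buf, lineterminator='\n')
writer.writerow(
    ['MyApp message throughput', '12500', '/s']  # hypothetical resultKey, value, unit
    + ['mode=ssl', 'threads=4']                  # hypothetical per-result metadata
)
line = buf.getvalue().strip()
# line == 'MyApp message throughput,12500,/s,mode=ssl,threads=4'
```

Using the csv module (rather than string joining) means values containing commas or quotes are escaped correctly, which keeps the file loadable in any spreadsheet program.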
After tests have run, the summary file is published with category “CSVPerformanceReport” using the pysys.writer.api.ArtifactPublisher interface.
- Parameters
project – The project configuration instance.
summaryfile (str) – The filename pattern used for the summary file(s). If not specified explicitly, the summary file for the CSVPerformanceReporter can be configured with the project property csvPerformanceReporterSummaryFile. See getRunSummaryFile().
testoutdir (str) – The output directory used for this test run (equal to runner.outsubdir): an identifying string which often contains the platform, or which may be used to distinguish between multiple test runs on the same machine. This is usually a relative path but may be an absolute path.
runner – Passed through to the superclass.
kwargs – Any additional keyword arguments are passed through to the superclass.
- DEFAULT_SUMMARY_FILE = '__pysys_performance/${outDirName}_${hostname}/perf_${startDate}_${startTime}.${outDirName}.csv'¶
The default summary file if not overridden by the csvPerformanceReporterSummaryFile project property, or the summaryfile= attribute. See getRunSummaryFile(). This is relative to the runner output+'/..' directory (typically testRootDir, unless --outdir is overridden).
- formatResult(testobj, value, resultKey, unit, toleranceStdDevs, resultDetails)[source]¶
Retrieve an object representing the specified arguments that will be passed to recordResult to be written to the performance file(s).
- Parameters
testobj – the test case instance registering the value
value – the value to be reported
resultKey – a unique string that fully identifies what was measured
unit – identifies the unit the value is measured in
toleranceStdDevs – indicates how many standard deviations away from the mean counts as a regression
resultDetails – a dictionary of detailed information that should be recorded together with the result
- getRunDetails()[source]¶
Return a dictionary of information about this test run (e.g. hostname, start time, etc).
This method is deprecated; customization of the run details should be performed by changing the runner.runDetails dictionary from the pysys.baserunner.BaseRunner.setup() method.
- getRunSummaryFile(testobj)[source]¶
Return the fully substituted location of the file to which summary performance results will be written.
This may include the following substitutions: @OUTDIR@ (=${outDirName}, the basename of the output directory for this run, e.g. “linux”), @HOSTNAME@, @DATE@, @TIME@, and @TESTID@. The default is given by DEFAULT_SUMMARY_FILE. If the specified file does not exist it will be created; it is possible to use multiple summary files from the same run. The path will be resolved relative to the pysys project root directory unless an absolute path is specified.
- Parameters
testobj – the test case instance registering the value
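The @KEY@ substitution described above amounts to simple token replacement. The following is a hedged sketch of that mechanism (the helper name and values are hypothetical, not the actual PySys implementation):

```python
# Illustrative token substitution for summary file patterns; the real
# getRunSummaryFile() also resolves the path and creates missing files.
def substitute_summary_file(pattern, substitutions):
    """Replace each @KEY@ token in the pattern with its run-specific value."""
    for key, value in substitutions.items():
        pattern = pattern.replace('@%s@' % key, value)
    return pattern

path = substitute_summary_file(
    'perf_@DATE@_@TIME@.@OUTDIR@.csv',  # hypothetical pattern
    {'OUTDIR': 'linux', 'DATE': '2023-06-01', 'TIME': '120000'})
# path == 'perf_2023-06-01_120000.linux.csv'
```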
- recordResult(formatted, testobj)[source]¶
Record results to the performance summary file.
- Parameters
formatted – the formatted string to write
testobj – object reference to the calling test
- reportResult(testobj, value, resultKey, unit, toleranceStdDevs=None, resultDetails=None)[source]¶
Report a performance result, with an associated unique key that identifies it.
- Parameters
testobj – the test case instance registering the value
value – the value to be reported. This may be an int, float, or a character (unicode) string.
resultKey – a unique string that fully identifies what was measured
unit – identifies the unit the value is measured in
toleranceStdDevs – indicates how many standard deviations away from the mean counts as a regression
resultDetails – A dictionary of detailed information that should be recorded together with the result
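The relationship between reportResult, formatResult, and recordResult can be sketched with a self-contained toy reporter. This is an illustration of the call flow only, not the actual PySys implementation (which also performs validation and must be thread-safe):

```python
# Toy reporter illustrating the reportResult -> formatResult -> recordResult
# pipeline; method names mirror the API above, bodies are assumptions.
class MiniReporter:
    def __init__(self):
        self.lines = []  # stands in for the summary file

    def formatResult(self, testobj, value, resultKey, unit,
                     toleranceStdDevs, resultDetails):
        # Flatten the optional resultDetails dict into KEY=VALUE entries.
        details = ';'.join('%s=%s' % kv for kv in sorted((resultDetails or {}).items()))
        return '%s,%s,%s,%s' % (resultKey, value, unit, details)

    def recordResult(self, formatted, testobj):
        self.lines.append(formatted)

    def reportResult(self, testobj, value, resultKey, unit,
                     toleranceStdDevs=None, resultDetails=None):
        formatted = self.formatResult(testobj, value, resultKey, unit,
                                      toleranceStdDevs, resultDetails)
        self.recordResult(formatted, testobj)

r = MiniReporter()
r.reportResult(None, 12500, 'MyApp message throughput', '/s',
               resultDetails={'mode': 'ssl'})
# r.lines == ['MyApp message throughput,12500,/s,mode=ssl']
```

Splitting formatting from recording is what makes subclassing practical: a custom reporter can override just formatResult (for a different serialization) or just recordResult (for a different destination).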
CSVPerformanceFile¶
- class pysys.utils.perfreporter.CSVPerformanceFile(contents)[source]¶
Bases: object
Object to hold the model for a CSV performance file.
If this file contains aggregated results, the number of “samples” may be greater than 1 and the “value” will specify the mean result.
- Variables
runDetails (dict) – A dictionary containing (string key, string value) information about the whole test run.
results (list) – A list where each item is a dictionary containing information about a given result, with values for each of the keys in COLUMNS, for example ‘resultKey’, ‘value’, etc.
RUN_DETAILS (str) – The constant prefix identifying information about the whole test run
RESULT_DETAILS (str) – The constant prefix identifying detailed information about a given result
COLUMNS (list) – Constant list of the columns in the performance output
- static aggregate(files)[source]¶
Aggregate a list of performance file objects into a single performance file object.
Takes a list of one or more CSVPerformanceFile objects and returns a single aggregated CSVPerformanceFile with a single row for each resultKey (with the “value” set to the mean if there are multiple results with that key, and the stdDev also set appropriately).
- Parameters
files – the list of performance file objects to aggregate
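At its core, the aggregation described above is a per-resultKey mean and standard deviation. The following is a simplified sketch of that computation using the standard statistics module; it works on plain (resultKey, value) pairs rather than CSVPerformanceFile objects, so it is an assumption about the arithmetic, not the real aggregate() code:

```python
import statistics

def aggregate_values(results):
    """results: list of (resultKey, value) pairs collected from multiple runs.

    Returns one entry per resultKey with the mean 'value', the number of
    'samples', and the sample 'stdDev' (0.0 when there is only one sample).
    """
    by_key = {}
    for key, value in results:
        by_key.setdefault(key, []).append(value)
    aggregated = {}
    for key, values in by_key.items():
        aggregated[key] = {
            'value': statistics.mean(values),
            'samples': len(values),
            'stdDev': statistics.stdev(values) if len(values) > 1 else 0.0,
        }
    return aggregated

agg = aggregate_values([('throughput', 100.0),
                        ('throughput', 120.0),
                        ('latency', 0.5)])
# agg['throughput'] == {'value': 110.0, 'samples': 2, 'stdDev': ~14.14}
```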
- static toCSVLine(values)[source]¶
Convert a list or dictionary of input values into a CSV string.
Note that no newline character is returned in the CSV string. The input values can either be a list (any nested dictionaries are expanded into KEY=VALUE entries), or a dictionary (or OrderedDict) whose keys will be added in the same order as COLUMNS.
- Parameters
values – the input list or dictionary
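A hedged sketch of the behaviour just described: dictionary inputs are emitted in COLUMNS order, and nested dictionaries inside a list input are flattened to KEY=VALUE entries. The column names here are an assumed subset; the real list is CSVPerformanceFile.COLUMNS, and this is not the actual PySys implementation:

```python
import csv
import io

COLUMNS = ['resultKey', 'value', 'unit']  # assumed subset of the real column list

def to_csv_line(values):
    """Convert a list or dict into a single CSV line with no trailing newline."""
    if isinstance(values, dict):
        # Dict input: emit values in the fixed COLUMNS order.
        cells = [str(values.get(col, '')) for col in COLUMNS]
    else:
        # List input: expand any nested dicts into KEY=VALUE entries.
        cells = []
        for item in values:
            if isinstance(item, dict):
                cells.extend('%s=%s' % kv for kv in item.items())
            else:
                cells.append(str(item))
    buf = io.StringIO()
    csv.writer(buf, lineterminator='').writerow(cells)  # no newline appended
    return buf.getvalue()

to_csv_line({'resultKey': 'throughput', 'value': 120, 'unit': '/s'})
# 'throughput,120,/s'
to_csv_line(['throughput', 120, {'mode': 'ssl'}])
# 'throughput,120,mode=ssl'
```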