Reframe Report for discoverer

1. Test Summary

  • Total Tests: 6

  • Failures: 0

  • Testcases (see the illustrative declaration sketch after this list)

    • Feelpp %my_regression_parameter={'nodes': 32, 'tasks_per_node': 128} %inputs=[]

    • Feelpp %my_regression_parameter={'nodes': 16, 'tasks_per_node': 128} %inputs=[]

    • Feelpp %my_regression_parameter={'nodes': 8, 'tasks_per_node': 128} %inputs=[]

    • Feelpp %my_regression_parameter={'nodes': 4, 'tasks_per_node': 128} %inputs=[]

    • Feelpp %my_regression_parameter={'nodes': 2, 'tasks_per_node': 128} %inputs=[]

    • Feelpp %my_regression_parameter={'nodes': 1, 'tasks_per_node': 128} %inputs=[]
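
The six test cases above differ only in the value of my_regression_parameter, which fixes the number of nodes while keeping 128 tasks per node. The sketch below shows one way such a parameterised family of cases could be declared with ReFrame; the class attributes, executable name and sanity check are illustrative placeholders, not the actual feelpp.benchmarking implementation.

import reframe as rfm
import reframe.utility.sanity as sn
from reframe.core.builtins import parameter, run_before, sanity_function


@rfm.simple_test
class Feelpp(rfm.RunOnlyRegressionTest):
    # One generated test case per entry, matching the (nodes, tasks_per_node)
    # pairs listed above.
    my_regression_parameter = parameter([
        {'nodes': n, 'tasks_per_node': 128} for n in (1, 2, 4, 8, 16, 32)
    ])
    valid_systems = ['*']          # placeholder; the real runs target discoverer
    valid_prog_environs = ['*']
    executable = 'feelpp_app'      # placeholder executable name

    @run_before('run')
    def set_job_resources(self):
        # Derive the MPI task layout from the parameter value.
        self.num_tasks_per_node = self.my_regression_parameter['tasks_per_node']
        self.num_tasks = (self.my_regression_parameter['nodes']
                          * self.num_tasks_per_node)

    @sanity_function
    def check_completion(self):
        return sn.assert_true(True)  # placeholder sanity check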

The generated run report is loaded with the Report helper:

from feelpp.benchmarking.reframe.report import Report

# Load the run report generated for this scenario
report = Report(file_path="docs/modules/discoverer/pages/kub/scenario0/20231201-1430.json")
Results
2.0.3
Main performance variables:

|    |   num_tasks | name                     |      value |
|---:|------------:|:-------------------------|-----------:|
| 20 |         128 | cem_updateForUse         |   0.874377 |
| 21 |         128 | cem_instance_execute     | 128.441    |
| 22 |         128 | cem_instance_simulation  |  56.6162   |
| 23 |         128 | cem_instance_postprocess |  71.7721   |
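
The Report object above encapsulates access to the underlying JSON run report. To inspect that file directly, a minimal sketch along the following lines works, assuming the JSON follows ReFrame's run-report layout (runs, testcases, perfvars); the exact field names depend on the ReFrame version used.

import json

# Path taken from the Report construction above.
with open("docs/modules/discoverer/pages/kub/scenario0/20231201-1430.json") as f:
    data = json.load(f)

# Walk the assumed runs -> testcases -> perfvars hierarchy defensively,
# so missing keys simply yield no output instead of raising.
for run in data.get("runs", []):
    for case in run.get("testcases", []):
        for perfvar in case.get("perfvars") or []:
            print(case.get("name"), perfvar.get("name"), perfvar.get("value"))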

1.1. Performance by Simulation Steps

# Plot the measured time of each simulation step
fig = report.plotPerformanceByStep()
fig.show()
Results: per-step performance figure.
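
To give an idea of what the per-step figure conveys, here is an illustrative plotly sketch (not the Report API) charting the four step timings from the table above; the unit of the reported values is assumed to be seconds.

import plotly.graph_objects as go

steps = ['cem_updateForUse', 'cem_instance_execute',
         'cem_instance_simulation', 'cem_instance_postprocess']
values = [0.874377, 128.441, 56.6162, 71.7721]  # values reported above, assumed seconds

fig = go.Figure(go.Bar(x=steps, y=values))
fig.update_yaxes(title_text='Elapsed time [s] (assumed unit)')
fig.show()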

1.2. Performance by Number of Tasks

# Plot the performance of each step against the number of MPI tasks
fig = report.plotPerformanceByTask()
fig.show()
Results: performance versus number of tasks figure.
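
A strong-scaling view like the one above is typically drawn on log-log axes. The sketch below, using made-up timings, shows the general shape; it is not the Report implementation.

import plotly.graph_objects as go

num_tasks = [128, 256, 512, 1024, 2048, 4096]        # 1 to 32 nodes x 128 tasks/node
elapsed_s = [410.0, 215.0, 118.0, 66.0, 40.0, 28.0]  # hypothetical timings

fig = go.Figure(go.Scatter(x=num_tasks, y=elapsed_s,
                           mode='lines+markers', name='cem_instance_execute'))
fig.update_xaxes(type='log', title_text='Number of tasks')
fig.update_yaxes(type='log', title_text='Elapsed time [s]')
fig.show()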

1.3. Speedup of the Simulation Steps

# Print the speedup table and plot speedup against the number of tasks
print(report.speedup())
fig = report.plotSpeedup()
fig.show()
Results: speedup table and speedup figure.
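
For reference, speedup in the usual definition is the ratio of the reference run time to the run time with N tasks, S(N) = T(N_ref) / T(N), with N_ref the smallest task count run. A minimal pandas sketch with hypothetical timings:

import pandas as pd

df = pd.DataFrame({
    'num_tasks': [128, 256, 512, 1024, 2048, 4096],
    'elapsed_s': [410.0, 215.0, 118.0, 66.0, 40.0, 28.0],  # made-up values
}).sort_values('num_tasks')

t_ref = df['elapsed_s'].iloc[0]                          # reference: smallest run
df['speedup'] = t_ref / df['elapsed_s']
df['ideal'] = df['num_tasks'] / df['num_tasks'].iloc[0]  # linear-scaling reference
print(df)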