Note that older performance information has been moved to Archived Performance Metrics.


Profiled Performance Data

What is this data?

Each of the tests shows the time it takes an end-user to complete a user story. This testing is done with bandwidth and CPU throttled and with a large sample data set, to simulate real-world data and connectivity. Each story is run for a different type of user, identified at the top in the column headers. Each column is a different test run. A sketch of how this throttled setup could be automated appears below.
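The following is a minimal sketch (TypeScript, using Puppeteer and the Chrome DevTools Protocol) of how bandwidth and CPU throttling could be applied while timing one step of a user story. The URL, latency, throughput, and CPU throttling values are illustrative assumptions, not the project's documented test settings.

```typescript
import puppeteer from 'puppeteer';

async function timeThrottledStep(): Promise<void> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  const client = await page.target().createCDPSession();

  // Throttle bandwidth and CPU to simulate real-world connectivity
  // (all values below are assumptions for illustration).
  await client.send('Network.enable');
  await client.send('Network.emulateNetworkConditions', {
    offline: false,
    latency: 400,                          // ms of added round-trip latency
    downloadThroughput: (750 * 1024) / 8,  // ~750 kbps, in bytes/second
    uploadThroughput: (250 * 1024) / 8,    // ~250 kbps, in bytes/second
  });
  await client.send('Emulation.setCPUThrottlingRate', { rate: 4 });

  // Measure only the wait for OpenLMIS to respond: from navigation start
  // until the network goes idle (typing/form-filling time is excluded).
  const start = Date.now();
  await page.goto('https://test.openlmis.example/#!/home', { waitUntil: 'networkidle0' });
  console.log(`Wait time: ${((Date.now() - start) / 1000).toFixed(1)}s`);

  await browser.close();
}

timeThrottledStep();
```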

Each row is a different step in the user story. Each cell captures the time a person spends waiting for the OpenLMIS system to respond. These times are currently measured with a stopwatch, so they are not 100% accurate. Time spent typing or filling in forms is not included in the performance time; we only measure the time spent waiting for OpenLMIS to respond. Some cells include a list of the API calls or downloads involved, along with the wait time or data transfer size for each (see the sketch below for one way those could be captured automatically).
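As a companion to the throttling sketch above, this is a minimal sketch (same TypeScript/Puppeteer setup) of how the per-request wait times and transfer sizes could be recorded instead of read from browser developer tools. The '/api/' path filter and the URL are assumptions about the OpenLMIS deployment, not confirmed details.

```typescript
import puppeteer from 'puppeteer';

async function recordApiCalls(pageUrl: string): Promise<void> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  const client = await page.target().createCDPSession();
  await client.send('Network.enable');

  // Remember when each request started so we can compute its wait time later.
  const started = new Map<string, { url: string; start: number }>();

  client.on('Network.requestWillBeSent', (e: any) => {
    started.set(e.requestId, { url: e.request.url, start: e.timestamp });
  });

  // When a request finishes, log its wait time and encoded transfer size.
  client.on('Network.loadingFinished', (e: any) => {
    const req = started.get(e.requestId);
    if (req && req.url.includes('/api/')) {
      const waitSeconds = e.timestamp - req.start;
      const kilobytes = e.encodedDataLength / 1024;
      console.log(`${req.url}  wait=${waitSeconds.toFixed(2)}s  size=${kilobytes.toFixed(1)}KB`);
    }
  });

  await page.goto(pageUrl, { waitUntil: 'networkidle0' });
  await browser.close();
}

recordApiCalls('https://test.openlmis.example/#!/home');
```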

Overview Charts

This Google document is used to collect manual performance test results and to create these charts, which detail performance metrics for OpenLMIS.

Performance test steps for 3.x (on Malawi dataset)

Note: these performance testing steps apply only if perftest has the Malawi dataset loaded; otherwise you will not be able to use the credentials.

Steps: 


Performance Data Entry

This Google Spreadsheet has been created for entering performance metric data. Summary performance data is on the first sheet, with the more detailed timings on the second sheet.