Non-Functional Requirements - Performance
Please see http://docs.openlmis.org/en/latest/conventions/nfr.html# for the current status of defining the non-functional requirements for OpenLMIS.
Profiles and Working Data Definitions
OpenLMIS is designed to be used in environments with differing levels of connectivity and by users who work with different sizes of working data, i.e. the data that is stored locally for offline or in-process use and/or that a user works with at any given time.
The proposed working data is broken up into a proposal for the current system functionality and a long-term proposal that incorporates what is needed to adequately test the stock management features. The long-term working data will likely require significant architectural changes to some of the services, particularly in the UI and offline areas, which is why it is not being considered for the current working data.
See External User Profiles for more details | System Specs | Mozambique | Malawi | Tanzania | Zambia | Proposed | Long-Term Proposal | OLMIS UAT | Malawi DEV |
---|---|---|---|---|---|---|---|---|---|
Administrator (implementer or overall system admin) | Same as below | | | | | | | Facilities: 2729; Programs: 4; Full-supply: 1040; Non-full supply: 2; Products: 10061; Users: 1204; Facility types: 5 | Facilities: 877; Programs: 6; Full-supply: 271; Non-full supply: 1065; Products: 1495; Users: 513; Facility types: 13 |
N Program Supervisor (interested in national or regional view) | | | | ?? | | | | Facilities: 336; Programs: 1; Full-supply: 24; Non-full supply: 2 | |
District Storeroom Manager (submits requisitions on behalf of facilities) | | | | | | All facilities in the district: an average of 25 | | Facilities: 673; Programs: 2; Full-supply: 1027; Non-full supply: 2 | |
Storeroom Manager (or Store Manager who submits requisitions on behalf of their facility) | | | | | | 4 programs (ARVs, Lab Products, Essential Medicines, HIV test kits) | | Facilities: 673; Programs: 2; Full-supply: 1027; Non-full supply: 2 | |
Warehouse clerk | <need to define> | | | | | | | Facilities: 17; Programs: 1; Full-supply: 13; Non-full supply: 0 | |
Notes:
- Malawi would like to see 6,000 products at the district level (districts have slightly better connectivity: 3G). Malawi has 400 full-supply products and will find out the figure for health facilities (HF).
- Mozambique: up to 1,600 products; a health facility would manage fewer than 500.
- Zambia: 1,100 products at the central level, not FTAP.
- Tanzania: 1,400 products.
- Warehouses carry product lists of 5,000-6,000 items.
Performance
Application Performance Goals
We use the word 'goal' rather than 'requirement' here to denote that these performance goals, at least at the current time, should be considered aspirational rather than hard requirements. The current system has a long way to go to achieve these goals but we believe that we will reach them (or get closer to reaching them) more quickly by setting more aggressive targets rather than more conservative ones.
Rather than defining the goal for each use case/scenario, we are defining an overall baseline and then calling out only the deviations from that baseline. Measuring the Time to First Byte (TTFB) of the server response is a good initial requirement because it limits the external factors that could influence the result, and tests already exist in the OpenLMIS build pipelines to gather this data. Other metric requirements that could be added later include:
- Page Upload Time (PUT): request
- Page Download Time (PDT): request → response → download
- Page Load Time (PLT): request → response → download → client rendering → page ready
These additional metrics would give a more representative measure of how the system feels to actual users; it doesn't mean much to users that the TTFB is <200 ms if the download and rendering take another 59 seconds. Each of these metrics would provide an indicator for a specific part of the overall performance:
Metric | Indicator |
---|---|
Page Upload Time (PUT) | Client-Side Processing Time and Data Volume |
Time to First Byte (TTFB) | Server-Side Processing Time |
Page Download Time (PDT) | Page Data Volume and Overall Request Timeline |
Page Load Time (PLT) | Client-Side Page and Data Processing Time |
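
As one illustration of how TTFB and download time could be gathered automatically, the sketch below samples a single request with Python. This is a minimal sketch, not the existing build-pipeline test, and the endpoint URL is a placeholder rather than an agreed test target.

```python
# Minimal sketch, assuming a reachable HTTP endpoint; not the build-pipeline test itself.
import time
import requests  # third-party HTTP client

def measure(url, session=None):
    """Return (ttfb_seconds, download_seconds, body_bytes) for one GET request."""
    http = session or requests.Session()
    start = time.monotonic()
    # stream=True lets us observe the moment the first byte of the body arrives
    with http.get(url, stream=True, timeout=60) as response:
        first = response.raw.read(1)          # first byte received -> TTFB
        ttfb = time.monotonic() - start
        rest = response.raw.read()            # drain the remainder of the body
        total = time.monotonic() - start
    return ttfb, total, len(first) + len(rest)

if __name__ == "__main__":
    # Placeholder endpoint; a real test would authenticate first and hit the
    # APIs behind the use cases listed below (e.g. requisition search).
    ttfb, total, size = measure("https://uat.openlmis.org/api/facilities")
    print(f"TTFB: {ttfb * 1000:.0f} ms, download: {total:.2f} s, {size} bytes")
```

Note that this measures TTFB and download time from the client's point of view (including connection setup) and covers only TTFB and PDT; PLT would additionally require browser rendering time, e.g. via the browser-driven approach sketched further below.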
A common understanding of the context for these goals will also be helpful, especially:
- When a use-case requires multiple actions, which one or ones should be used?
- When an action results in multiple requests, which one or ones should be used?
- What type and quality of network connection should be simulated?
  - From discussions with Team MtG (see the sketch after this list):
    - Basic 3G network
    - Limited CPU/RAM hardware profile
- What baseline latency should be simulated?
- What options for the above will allow us to most easily standardize and automate the gathering of this performance data?
Note that for the purposes of these goals we may want to keep the overall measurement (PLT) broad and not too closely tied to the underlying requests being made while still being easily reproducible.
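
For the network simulation questions above, one option (a sketch under assumed values, not an agreed standard) is Chrome's built-in network emulation driven through Selenium; the latency and throughput numbers below are placeholders to be confirmed with Team MtG.

```python
# Sketch only: apply a "basic 3G" profile in an automated browser test.
# The numbers are assumptions, not agreed requirements.
from selenium import webdriver

BASIC_3G = {
    "offline": False,
    "latency": 300,                           # added round-trip latency in ms (assumed)
    "download_throughput": 750 * 1000 // 8,   # ~750 kbit/s expressed in bytes/s (assumed)
    "upload_throughput": 250 * 1000 // 8,     # ~250 kbit/s expressed in bytes/s (assumed)
}

driver = webdriver.Chrome()                   # Chrome/Chromium only for this API
driver.set_network_conditions(**BASIC_3G)     # Chrome DevTools network emulation
driver.get("https://uat.openlmis.org")        # placeholder environment URL
# ... run the scripted use case here and record PUT/TTFB/PDT/PLT ...
driver.quit()
```

Browser-level emulation covers the client side only; any baseline server latency would still need to be added at the network layer on the test host, which is part of the open question above.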
Server Hardware Profile
The server hardware profile is assumed to be in line with the recommendations here.
Use Case/Scenario | Profile | Relevant Working Data | Baseline 3.3.1 PUT (sec) | Baseline 3.3.1 TTFB (ms) | Baseline 3.3.1 PDT (sec) | Baseline 3.3.1 PLT (sec) | Proposed PUT (sec) | Proposed TTFB (ms) | Proposed PDT (sec) | Proposed PLT (sec) |
---|---|---|---|---|---|---|---|---|---|---|
(Default) | (Any) | (Profile Working Data) | | | | | | 500 | | 5 |
Login | Storeroom manager | N/A | | | | 36 | | (500) | | (5) |
Initiate Requisition | District Storeroom Manager (see above) | # Processing Periods: 12 (plus general working data above) | | 5480 | 22 | 32 | | (500) | | (5) |
Save (sync) Requisition | (same) | (same) | 2.7 | 7150 | 4.86 | 24 | | (500) | | (5) |
Submit Requisition | (same) | (same) | 2.71 | 7590 | 2.43 | 34 | | (500) | | (5) |
Authorize Requisition | (same) | | 2.71 | 7270 | 2.43 | 22 | | (500) | | (5) |
Approve Requisition | N Program Supervisor | How many Requisitions are waiting for approval? Malawi: average 40 (per Malawi's processes, the districts prefer to approve the forms collectively, so ~40 forms to approve at a time; one outlier would like to approve 80 forms at one go) | 2.6 | 2430 | 2.43 | 25 | | (500) | | (5) |
Batch Requisition Approval | N Program Supervisor | How many Requisitions? Malawi: average 40 | | | | (3.3.0) 104 | | 1000 | | 30 |
Convert to Order (one) | Warehouse clerk | Max number of approved requisitions waiting for approval. Malawi: average 80 | | 1440 | | 5 | | (500) | | (5) |
Convert to Order (multiple) | Warehouse clerk | How many approved requisitions? Malawi: 30-40 | | | | (8) 20 | | (8) 1000 | | (8) 10 |
Filter performance on the Convert to Order page | | Max number of approved requisitions waiting for approval; define the # of variables for the filter | | | | | | | | |
View Requisition (filter performance) | | | | | | | | 250 | | 0.5 |
Fulfill Order | | | | | | | | | | |
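
Continuing from the measure() sketch above, the snippet below shows how the default goals in this table could be checked automatically. The endpoint is again a placeholder, and a full PLT check would need a browser-driven measurement such as the Selenium sketch above.

```python
# Assumed default goals taken from the (Default) row above.
DEFAULT_TTFB_MS = 500   # proposed TTFB goal in milliseconds
DEFAULT_PLT_S = 5       # proposed PLT goal in seconds (needs a browser-based check)

# Placeholder endpoint; reuses the measure() helper sketched earlier.
ttfb, download, _ = measure("https://uat.openlmis.org/api/requisitions/search")
assert ttfb * 1000 <= DEFAULT_TTFB_MS, f"TTFB {ttfb * 1000:.0f} ms exceeds the {DEFAULT_TTFB_MS} ms goal"
print(f"TTFB {ttfb * 1000:.0f} ms, download {download:.2f} s (PLT goal: {DEFAULT_PLT_S} s)")
```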
Scalability
Requirement from CdI: The application should support at least 1700 users. The system should support at least 1000 concurrent users. The system should support 1500 health centers, with one user per health center. The system should also support an additional 50 facilities with an average of 3 users each.
This requirement may need to be more detailed: what does it mean to "support" 1700 users? Likewise, what does usage by 1000 concurrent users actually look like? How many concurrent requests would that likely entail, and what types of concurrent operations are likely to be occurring?
- Concurrent requests per second
- Concurrency requirements for specific use cases
- How many concurrent requisition submissions can the system handle?
- How many concurrent Stock Management changes?
- Number of active sessions
- How much memory does each active session require on the server?
- Reliability under load (processor, requests, etc.)
- Does the server become unresponsive when resource usage is maxed out?
- E.g. how would the system handle a DDoS attack?
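
One way to start answering the concurrency questions above (a rough probe, not a substitute for a proper load-testing tool) is to fire a fixed number of simultaneous requests and record the error rate and latency spread. The URL and concurrency level below are placeholders.

```python
# Rough concurrency probe; placeholder URL and concurrency level.
import concurrent.futures
import statistics
import time
import requests

def one_request(url):
    """Return (status_code_or_None, elapsed_seconds) for a single GET."""
    start = time.monotonic()
    try:
        status = requests.get(url, timeout=30).status_code
    except requests.RequestException:
        status = None
    return status, time.monotonic() - start

def probe(url, concurrency=100):
    # Issue `concurrency` requests at once and summarize the results
    with concurrent.futures.ThreadPoolExecutor(max_workers=concurrency) as pool:
        results = list(pool.map(one_request, [url] * concurrency))
    latencies = sorted(elapsed for _, elapsed in results)
    errors = sum(1 for status, _ in results if status != 200)
    print(f"{concurrency} concurrent requests: {errors} errors, "
          f"median {statistics.median(latencies):.2f} s, max {latencies[-1]:.2f} s")

probe("https://uat.openlmis.org/api/facilities", concurrency=100)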
Availability
- Requirements for system uptime
- Timeliness of reporting data
- Expected maintenance tasks
- Maintenance windows in which to perform these tasks
Network Usage
Client Side:
- Expected number of retries (note the possible interaction with server reliability)
- Timeouts for network connections
- Data Compression
Server Side:
- Network throughput under load
- Concurrent network connections
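
To make the client-side items above concrete, the sketch below shows how a retry count, timeouts, and compression could be expressed as HTTP client settings. The values are assumptions to be agreed on, not requirements, and the OpenLMIS-UI itself runs in the browser rather than through a Python client.

```python
# Illustration of candidate client-side settings; values are assumptions, not requirements.
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
retries = Retry(
    total=3,                          # expected number of retries (assumed)
    backoff_factor=1,                 # exponential backoff between attempts (assumed)
    status_forcelist=[502, 503, 504], # retry only on gateway/overload errors
)
session.mount("https://", HTTPAdapter(max_retries=retries))

response = session.get(
    "https://uat.openlmis.org/api/facilities",   # placeholder endpoint
    timeout=(5, 60),                             # connect / read timeouts in seconds (assumed)
    headers={"Accept-Encoding": "gzip"},         # ask the server for compressed responses
)
```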
Browser
This section outlines general performance metrics for the OpenLMIS-UI running in a web browser.
Description | Expected Performance | Tickets |
---|---|---|
RAM usage | Below 1 GB of RAM for the browser | |
Disk space usage | Will vary based on the profile | |