The reporting and analytics vision is to support OpenLMIS users with both routine reporting and ad-hoc analysis. Users need a way to extract and visualize data. This specification does not define the exact list of reports needed, but rather what types of reports (map, offline, etc.) and what functionality are desired. Please see the Out of Scope section for a clear explanation of what this spec does not include, such as alerts (via email/SMS) and integrations with systems like DHIS2.
This wiki page is an attempt to define the reporting and analysis needs for OpenLMIS users based on the following generalized user personas identified by the community.
The goal is to gain buy-in and agreement from the OpenLMIS community on the needs, requirements and definitions for reporting and analytics within OpenLMIS. Do not focus on the technical solution, as there may be multiple approaches to meeting the desired behaviors; the focus of this spec is to outline what end users will need and want. This spec will support the decision-making process for implementing reporting and analytics within OpenLMIS.
To support this conversation, we define reporting, dashboards and analytics in the following ways:
Reports (also called routine or built-in reporting) contain detailed data in a tabular format and typically display numbers and text only, though they can use visualizations to highlight key data. Some key characteristics of a report include:
- It presents numbers and text in a table.
- It can contain visualizations (like a map, chart or graph), but these are used only to highlight findings in the data.
- It is optimized for printing and exporting to a digital document format such as CSV, Word or PDF.
- It is geared towards people who prefer to read data, for example, lawyers, who would rather read text over interpreting visualizations, and accountants, who are comfortable working with raw numbers.
- It is rendered from a defined query/template.
- It does not require high technical capacity to run or access.
- It is usually viewed or run on a routine basis (more than once a year).
- It usually drives users to an action.
The report designer is typically an engineer.
Routine reporting typically applies to more than one implementation.
Dashboards are a data visualization tool that displays the current status of metrics and key performance indicators (KPIs) for OpenLMIS end users. Dashboards consolidate and arrange numbers, metrics and sometimes performance scorecards on a single screen. Some key characteristics of a dashboard:
- All the visualizations fit on a single computer screen — scrolling to see more violates the definition of a dashboard.
- It shows the most important performance indicators / performance measures to be monitored.
- Interactivity such as filtering and drill-down can be used in a dashboard; however, those types of actions should not be required to see which performance indicators are under performing.
- It is not designed exclusively for high-level decision makers, but rather should be usable by the general workforce, as effective dashboards are easy to understand and use.
- The displayed data is automatically updated without any assistance from the user. The frequency of updates varies by organization and purpose; the most effective dashboards have data updated at least daily.
- It does not require high technical capacity to access.
The dashboard designer is typically an engineer.
A dashboard typically applies to more than one implementation.
Analytics and ad-hoc analysis leverage tools that offer the ability to select ad-hoc date ranges, pick different products, or drill down to more detailed data, making them more dynamic than dashboards and reports. Some key characteristics of data analytics include:
- The user can define desired data element combinations (via a programming language or wizard).
- It is not routine; a given combination of data elements is run infrequently (e.g. once a year).
- It fits on one screen, but there may be scroll bars for tables with too many rows or charts with too many data points.
- It is highly interactive and usually provides functionality like filtering and drill downs.
- It is primarily used to find correlations, trends, outliers (anomalies), patterns, and business conditions in data.
- The data used in an ad-hoc analysis is generally historical, though in some cases real-time data is analyzed.
- An outcome of ad-hoc analysis might be to define/identify performance indicators for use in reporting and/or dashboards.
- It is typically relied on by technically savvy users like data analysts and researchers with knowledge of and experience with report programming languages.
The designer needs to be technically proficient with their BI&A tool of choice (e.g. Excel, DHIS2, Tableau). Per-seat licensing for report designers and consumers may apply depending on the tool chosen.
Ad-hoc analysis is typically more implementation-specific, with less re-use across implementations.
|User||Tech aptitude||Scope of supervision||Kind of reports||Frequency**|
| ||Low||Only responsible for one geographic facility but may have multiple storerooms||Routine reporting||Monthly or more (if available)|
|Intermediate Store Manager (could be district, regional, province): district/regional/provincial health officer, program coordinators, pharmacist|| ||Supervises a subset of facilities or a specific zone. Facilities must be mapped to zones.||Routine reporting||Monthly or weekly depending on replenishment schedules; daily around due dates for requisitions|
|Central Personnel or Central Program personnel|| ||National supervision of all geographic facilities OR national supervision of one program||Routine reporting; some ad-hoc (where VAN is rolled out)|| |
|Administrator (can be from the MIS department of the MOH; sometimes there is both a Technical Administrator and a Managerial Administrator)|| ||Creates new reports and templates and conducts ad-hoc reporting||Ad-hoc requests from MOH and stakeholders ("I want to see...")||Upon request|
|Implementer (most likely a contractor/vendor supporting the system)|| ||May create new reports or customize and alter existing reports based on requests from stakeholders; would use and customize all of the above||Routine (help running or troubleshooting)|| |
|Stakeholders outside OpenLMIS||Varies||May represent directorates (public health, any interest programs), stakeholders, partners and donors like UNICEF, GAVI, etc.||Ad-hoc||Reports may be shared with these users outside OpenLMIS as needed, whether on paper or by forwarding an email containing a PDF|
**If the implementer is using stock management, the frequency of reports would increase with transaction data.
- Create user personas and outline objectives
- Research technical approach/options
Below is a list of questions to be addressed as a result of this requirements document:
|1||How much money does a MOH want to spend on resources to support creating report templates? We want to figure out guardrails on what types of budgets MOHs have to support the creation of reports. This will help us assess reporting tool options. For example, if one solution only allows for highly-specialized resources which cost a lot, we may not go with it. FYI, Josh Zamor.||In progress.|
|2||What type of skill sets can we expect implementers to have to create the report templates for routine reporting?||In progress.|
Community members are encouraged to share key reports and examples for the team to review and keep in mind as we move forward. The examples below assume the perspective of a national-level program manager who wants visibility into the activity of their program by region/district, but who would also like the option of granular, facility-by-facility visibility.
1 - Stockout days
- Ideally would be able to view on a map geographically
- Could show avg. # of stockout days in a district – aggregated data from all facilities in that district to give a picture of the overall stockout rate for that district
- User could click on the facility list and see the specific facilities reporting stockouts
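As an illustration (not OpenLMIS code; the data shapes and names below are assumptions), the district aggregation and facility drill-down described above could be computed like this:

```python
from statistics import mean

# Hypothetical input: stockout days per facility, grouped by district.
facility_stockout_days = {
    "District A": {"Facility 1": 5, "Facility 2": 0, "Facility 3": 10},
    "District B": {"Facility 4": 0, "Facility 5": 2},
}

def district_avg_stockout_days(data):
    """Aggregate view: average stockout days across each district's facilities."""
    return {district: mean(days.values()) for district, days in data.items()}

def facilities_with_stockouts(data, district):
    """Drill-down view: facilities in a district reporting any stockout days."""
    return sorted(f for f, d in data[district].items() if d > 0)

print(district_avg_stockout_days(facility_stockout_days))
print(facilities_with_stockouts(facility_stockout_days, "District A"))
```

The same aggregate values could feed a map view (one figure per district), with the drill-down list shown when a district is clicked.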
2 - Stock levels/Consumption (these are two different indicators)
- Nice to view on a map geographically
- Shows SOH (stock on hand) at the beginning and end of the period for each facility
- Graph could show min/max levels to indicate whether stock delivery is adequate to bring stock levels up to max, or whether facilities are ordering too much or too little
- Can show that a facility stock level is below the min or near stocking out
- Should show value of commodity, even if the commodity is provided to the consumer for free, in order to reinforce idea that stocks have real value
3 - Timeliness and completeness of reporting
- Could show an aggregate for the district of whether the district is reporting on time
- Similar to Stockout Report, user could click on the facility list and see the specific facilities which were not reporting on time
4 - On-time and full deliveries
- Could show aggregate data for a district/region for on time and complete deliveries that month.
- Similar to Stockout Report, user could click on the facility list and see the specific facilities which reported delayed or incomplete deliveries
5 - Inventory aging
- Show items near to expiry (configurable date range, since length of the pipeline will determine how critical a given range is)
- Show items with VVM stage 2 status
6 - Expiry/wastage/Loss (based on reason codes)
- Show items that expired, were damaged, or were lost and require disposal (units and value)
- Show loss rate: total lost / total usable stock used during the period, as a percentage (units and value)
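A minimal sketch of the loss-rate calculation as defined above (the function name and the zero-denominator handling are assumptions, not an OpenLMIS definition):

```python
def loss_rate_percent(total_lost, total_usable_used):
    """Loss rate: total lost / total usable stock used in the period, as a percent."""
    if total_usable_used == 0:
        return 0.0  # assumption: report 0% when no usable stock was used
    return 100.0 * total_lost / total_usable_used

# Example: 25 units lost against 500 usable units used in the period -> 5.0%
print(loss_rate_percent(25, 500))
```

The same calculation applies whether the inputs are counted in units or in monetary value.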
7 - Stocked according to plan (traffic light or similar indicator); this is similar to but distinct from #2
- Show stocks that are within planned parameters (min/max or EOP/max, with config tolerance thresholds) for all commodities managed
- Show stocks that are below threshold for min/EOP
- Show stocks that are above threshold for max
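A sketch of the traffic-light classification, assuming a simple percentage tolerance around the configured min/max (the threshold logic, colors and names are illustrative, not an OpenLMIS definition):

```python
def stock_status(soh, minimum, maximum, tolerance=0.1):
    """Classify stock on hand against min/max with a configurable tolerance."""
    if soh < minimum * (1 - tolerance):
        return "red"     # below threshold for min/EOP
    if soh > maximum * (1 + tolerance):
        return "yellow"  # above threshold for max
    return "green"       # within planned parameters

print(stock_status(40, minimum=50, maximum=200))   # below the min threshold (45)
print(stock_status(120, minimum=50, maximum=200))  # within planned parameters
print(stock_status(230, minimum=50, maximum=200))  # above the max threshold (220)
```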
8 - Forecast accuracy
- Show the ratio of actual consumption by product during a particular period compared to the consumption forecasted for the same period (central level dashboard)
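The ratio described above can be sketched as follows (the function name and the zero-forecast handling are assumptions):

```python
def forecast_accuracy_ratio(actual_consumption, forecasted_consumption):
    """Actual vs. forecasted consumption for the same period; 1.0 = perfect forecast."""
    if forecasted_consumption == 0:
        return None  # undefined when nothing was forecasted
    return actual_consumption / forecasted_consumption

# Example: 900 units actually consumed against a forecast of 1,000 -> 0.9
print(forecast_accuracy_ratio(900, 1000))
```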
9 - Emergency order rate
- Show number of unplanned orders/deliveries in period compared to planned deliveries
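Similarly, a hedged sketch of the emergency order rate (names and edge-case handling are assumptions):

```python
def emergency_order_rate(unplanned_orders, planned_deliveries):
    """Unplanned orders/deliveries in the period relative to planned deliveries."""
    if planned_deliveries == 0:
        return None  # assumption: rate undefined when nothing was planned
    return unplanned_orders / planned_deliveries

# Example: 3 emergency orders against 12 planned deliveries -> 0.25
print(emergency_order_rate(3, 12))
```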
10 - Performance Score Card
- Compare HF/district/region against all peers for period and over time with a balanced scorecard comprising critical KPIs (configurable)
- This is both for self-assessment (How are we doing?) and for supervision/oversight (how are they doing?)
Some other ideas we tossed around but didn’t discuss as thoroughly were:
- A program manager would use this report to measure how much money a facility should have (relating to cost recovery)
- A program manager could also see a graph depicting the total costs of requisitions for their area (to get a clearer picture of where their money is going)
- Could see whether a facility is in the red/black based on cost recovery analysis
- Should show value of inventory, value of goods sold/issued/dispensed, value of goods lost
- Good to show on a map: district level making emergency orders to provincial level, potential risk of stockouts
Out of Scope
- Please note this spec does not include in-application alerts (like an alert to an approver that a requisition is ready for approval); these are addressed within each functional area (like Requisitions) based on business logic.
- This spec does not include the scope of exporting data into another system like DHIS2. If an implementation of OpenLMIS decides to use another application for all reporting needs, then the following feature set would not be used by that implementer.