Date: June 25, 2020
OpenLMIS participants: Rebecca Alban (Unlicensed), Brandon Bowersox-Johnson, Wesley Brown, Christine Lenihan, Dercio Duvane
PSM: Jean Miller, Trace
Jean's question: Can we use Mozambique's work as a baseline so we don't have to duplicate the work that was already done?
Wes: OpenLMIS is built so that the rest of the community can benefit from that work; it will be carried forward for all future implementations of OpenLMIS. All bugs fixed in 3.9 and changes to the core system will be incorporated into all versions of OpenLMIS; work that was done in Moz has flowed into the OpenLMIS project. We want countries to be using the current version of OpenLMIS (the exact number doesn't matter so much).
There are some things that are only being done in Moz, so those will be handled separately
Brandon: Let’s talk about ‘the how’…Do we know that everything Thoughtworks has done to improve the SIGLUS version has been contributed back to core? To what extent is it compatible with core? What are we (VR) seeing in our code review?
Jean: Which of Thoughtworks' work is compliant?
Dercio: we are doing an architecture review, with Chongsun participating as part of core. Dercio's team is not doing code review; it's more an architecture review and an analysis of the features and how they fit into what OpenLMIS already has
Wes: Any functionality that makes its way back into the core system will be reviewed as part of our routine process. But anything that is for the Mozambique-specific service (not going back to core) we don't have access to and are not reviewing. Chongsun has been helping Thoughtworks with the design and helping them determine which things could be implemented in core. Thoughtworks has been more active recently in the Discourse forum and the product committee, but it is still hard to get engagement from them on some things
- Jean's request to look at every feature to determine where it should 'live' is a great idea, but we are not sure that we (the core team? VR?) have the ability to do that; it is beyond our scope
- Jean: So it sounds like it's all up to Thoughtworks's determination? Jean thought that the VR Task Order was to ensure compliance…
- Wes: This is how we read the contract: we looked at the business requirements to get perspective on what was being built and why; then we matched that up with OpenLMIS functionality; wherever there was a gap, we decided what should be done. Then we came up with recommendations. However, Thoughtworks was already well into development, and we brought in Chongsun to try to help them build in a way that was compliant with OpenLMIS contribution guidelines
- We did not think it was our mandate, or within our ability, to tell them to put it into OpenLMIS core. We provided guidance but don't have access to the code to review and verify
- Brandon: the ideal vision is that we get access to the code, or maybe even fund it in other ways if not in the current scope, to review what Thoughtworks did and answer that question; to be able to say "5 of 10 features were done in a way that was compatible; the other 5 were Mozambique-specific"
- It would be great to have visibility into this work, to see what is compatible
- Trace asked whether we would need 24/7 code access or just a period of time ahead of our next report. Brandon: it would be in perpetuity; some read-only access to the code would help us. We think their work should be open source so that PSM can use it in other countries
- Christine: the current Task Order was focused on reviewing the estimates that Thoughtworks had provided and ensuring alignment and upgradability for future versions of OpenLMIS. Even for things that are being done only for Moz, we want to make sure they are using the extension points, so the ability to take future upgrades isn't lost. Sometimes countries face the question: make something configurable so that it can be contributed back, or just build exactly what you need (using the standards). We don't have a mandate or a way to force them to contribute back or to implement new features in the recommended way, but we are focused on encouraging them and ensuring they know HOW to do that. And we aren't auditing to check on it all, because we don't have a way to see it all
- Jean: So how does the communication work if you don't have access to the code? Christine: the team has been working with Thoughtworks to identify what could be contributed back and built off the OpenLMIS code itself, rather than in the separate service where the Moz-specific things are being developed.
- Wes: We can see the functionality in the core system, and the rest only if they decide to share the code
- Jean: So you have a list of functionality that tells you what needs to be compliant and what doesn't, right?
- Wes: yes, we have it in a document from our first assessment. It has information about how OpenLMIS core could be leveraged without requiring customized development
- Jean requested this report
- Jean: How will the version control work in the future?
Jean's current challenge: requirements from Guinea, which they want to compare to the existing OpenLMIS version. They are struggling to get the exact requirements/functionality of the current version of OpenLMIS in order to do a comparison study
- They cannot marry the requirements with the current feature capabilities
- The wording on the website is not clear, and the specific reports in the test environment don't show much
- They have hired people to do an analytical report
- Here are the reports that can currently be generated (title, rows/columns)
- List out requirements (what the features need to be) and functionality, with no ambiguity
- Note which reports are standard versus which require customization
- It would also be good to know about training requirements; we don't offer that training in what we do now, but it would be a good addition. They want help scoping the functionality in an efficient way
- Reports should be like features: the same across countries. All countries use the same criteria for M&E, so the same reports are required across countries
(Christine) Lots of work has been done in Malawi & Angola on the reports
- They do see a lot of variability between what countries are asking for re: reports, so it's good that they are configurable and can be customized
- For PSM indicators, sometimes other data sources are needed other than OpenLMIS
We can possibly work together in the future on:
- Creating detailed requirements (clarifying what features really need to be)
- Knowledge transfer to local or regional firm for supporting
Jean - question around what training could be done for in-country teams on reports and other areas, and how we include that in our TOs
- Christine - VillageReach would be very open to providing this; it is not included in our current contracts, but could absolutely be included in the future, and we can continue conversations about what is needed, what support can be provided, and the approach
Jean - Question related to version control for Core v. country instances. Who is responsible for this? How is it done? How is testing and the release process managed for countries?
- Christine - for versioning, the Core team releases new versions (e.g., 3.9), but with the microservices architecture, that "OpenLMIS 3.9" release actually includes a set of services, each of which has its own version depending on the changes. So the 3.9 release may actually include UI v1.8, Requisitions 1.4, etc.
- For country implementations, the versioning (and releases) are handled by the technical implementation partner. So for Malawi and Angola, VillageReach does this, and I presume ThoughtWorks does (or would) in Mozambique. They can choose different methods of numbering/version tracking, but as a simple example, OpenLMIS-MW v1 included Core version 3.4 plus the additional changes Malawi made: they made some branding and UI changes, so rather than the Core UI service, they have the MW version of the UI. This is the responsibility of whichever technical partner is doing the in-country releases and development.
- For the release process, the Core team has their release process, which includes, as the final part, release testing that any country implementation is invited to participate in. So the Malawi and Angola teams have worked with the Core team to do some of the testing before the release is published, to identify any bugs and issues for the Core team to fix prior to finalizing the release. Once the release is published, it doesn't get automatically pushed to the country implementations, but it is there for them to use. So the development partner for a country implementation can upgrade the country system with the newest core release plus any other country-specific changes they've made but haven't released yet. Then they would go through their own release testing of the country release candidate, to ensure there aren't any bugs or issues, and then they can deploy it to the production system.
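Christine's versioning model can be sketched as a small illustration. This is not the actual OpenLMIS release manifest; the service names and the `auth` version below are hypothetical, while UI 1.8 and Requisitions 1.4 are the examples mentioned above:

```python
# Illustrative sketch of composite releases: a "3.9" core release pins a set
# of independently versioned microservices, and a country implementation
# starts from that set and overrides some services (e.g. a rebranded UI).
# Service names and the "auth" entry are hypothetical examples.

CORE_RELEASE_3_9 = {
    "ui": "1.8",            # UI service version (example from the discussion)
    "requisition": "1.4",   # Requisitions service version (example)
    "auth": "1.2",          # hypothetical service, for illustration only
}

def country_release(core, overrides):
    """Return the effective service set for a country deployment:
    the core release with country-specific services swapped in."""
    release = dict(core)
    release.update(overrides)
    return release

# E.g. a Malawi-style release replaces the core UI with its own branded UI,
# while keeping the other core services unchanged.
mw_v1 = country_release(CORE_RELEASE_3_9, {"ui": "mw-1.0"})
```

The point of the sketch is that the country release is a delta on top of a pinned core release, which is why a new core release can be picked up later by re-applying the country-specific overrides.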
- Jean - are the release processes documented anywhere?
- Christine - Yes, we have those processes for the implementations we support and the Core team has theirs as well. I can share those documents/links. We can also talk more about this in the Issue report/SLA meeting Thursday
Rebecca Alban (Unlicensed) to update the website with more specific product details, particularly re: reporting
Wesley Brown and Rebecca Alban (Unlicensed) to create documentation for Jean and the PSM team (as described above)
Wesley Brown to share the Assessment Report with Jean (mentioned above, from their first Moz assessment), explicitly adding details re: what should be Mozambique-specific versus what should be core
- Brandon Bowersox-Johnson's suggestion: VR may need to expand its current scope to do another level of compliance review. Brandon suggests we get access to the code so that this would be possible. Maybe include Antionio Langa, or someone from the field office, to make sure we are all aligned
- Trace: a larger discussion with Thoughtworks is needed to discuss the access. Maybe this access can be added into the next task order so it is improved next time (to be developed in Sept and contracted in Oct).
- Christine Lenihan to send release documentation info
- Christine Lenihan to send notes/summary
- Christine Lenihan to schedule follow up with PSM Mozambique team