Upload ISA Values


As an administrator, I want to update and upload the Ideal Stock Amount ("ISA") for each product (commodity type) by facility and period.

I will need a csv file so I know where to put the quantities:

  • Facility Code

  • Commodity Type

  • Ideal Stock Amount

  • ISA amounts are by commodity type, not by orderable or lot

  • User will provide the data on a per period basis
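Putting the columns above together with the Period field referenced in the acceptance criteria, an upload file might look like the sketch below (facility codes, commodity type names, and period names are illustrative; an empty ISA cell stands for null):

```csv
Facility Code,Commodity Type,Period,Ideal Stock Amount
HF01,BCG,Jan2017,1000
HF01,Penicillin,Jan2017,0
HF02,BCG,Jan2017,
```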

Acceptance Criteria

  1. Start with proposed RAML

    1. Make sure API endpoint is RESTful

  2. Add API endpoint with logic

    1. Need the "Manage System Ideal Stock Amounts" right created in order to use this endpoint

  3. Under the Administration dropdown entry called "Ideal Stock Amounts", a new action is added to the page: Upload the ISA File (see attached mock-up for layout)

  4. There is validation for the ISA field within the csv upload process: if a user updates an ISA value to a negative amount and uploads the file, this causes an error and the file is rejected. The ISA field can be zero or null.

  5. The only field users will update in the upload file is the ISA amounts. If there are any changes to the other fields (Facility Code, Commodity Type, Period) then the user receives an error message and the file is rejected.

    1. If any code (or, for the Period, name) cannot be found, the upload errors and reports that it is not found

    2. If a row has the same values as another row, it will (probably) overwrite the ISA value of the other row

  6. When user successfully uploads the csv file, they are directed back to the Configure ISA page.

  7. Error Handling acceptance criteria is below:

    1. For files with the wrong format (xls, pdf, etc.) we should get an error explaining that we are unable to parse the file because the format is incorrect

    2. Follows the OpenLMIS style guide for error messaging. (Modal pop-up first with the error description, then the error description on the page above the file input.)

    3. Create test case in Zephyr that documents all validation error scenarios when uploading a file. Add link for test case to this ticket. Example step in test case: Validate file format is correct, if incorrect error stating "Incorrect file format...." is displayed. Another example: If there are errors with the file, the file is not uploaded and records are not updated.

    4. Backend should follow v2 validation methods and rethrow CSV errors as ValidationMessageExceptions

    5. Make sure the SuperCSV parsers for fields are handling mandatory/invalid data as constraint violations (see v2 code above)

    6. If error occurs, file input is cleared.

    7. If upload is successful, show the user an Alert Success with the message key upload.file.successful = "File uploaded successfully. Number of records processed: <number of line items processed>"

  8. Create a performance test for this API endpoint; the endpoint should return in less than 0.5 seconds for the administrator user in demo data for 90% of requests (p90) with 10k values
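Criteria 4 and 5.1 boil down to a per-cell check on the ISA column. A minimal, dependency-free sketch of that check is below; class and exception names are illustrative, not the actual OpenLMIS code. Per criteria 7.4–7.5, the real backend would express this as a SuperCSV cell processor constraint and rethrow the violation as a ValidationMessageException.

```java
// Hypothetical sketch of the ISA-field validation: negative values are
// rejected, zero and empty (null) cells are accepted.
public class IsaFieldValidator {

    /**
     * Parses the ISA column of one CSV row.
     * Returns null for an empty cell, the integer value otherwise.
     * Throws IllegalArgumentException (stand-in for a
     * ValidationMessageException) for negative or non-numeric input.
     */
    public static Integer parseIsa(String cell) {
        if (cell == null || cell.trim().isEmpty()) {
            return null; // ISA may be null
        }
        final int value;
        try {
            value = Integer.parseInt(cell.trim());
        } catch (NumberFormatException e) {
            throw new IllegalArgumentException("ISA value is not a number: " + cell);
        }
        if (value < 0) {
            throw new IllegalArgumentException("ISA value must not be negative: " + cell);
        }
        return value; // zero is allowed
    }

    public static void main(String[] args) {
        System.out.println(parseIsa("10"));
        System.out.println(parseIsa("0"));
        System.out.println(parseIsa(""));
    }
}
```

Rejecting the whole file on the first constraint violation (rather than updating some rows) matches criterion 7.3's "records are not updated" expectation.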


  • Upload process in UI is similar to ticket

  • per commodity type, not orderable or lot

Out of Scope

  • Permissions are a separate ticket

  • Audit logging once a change is submitted is a separate ticket

  • Viewing the uploaded ISA amounts via the UI


Mateusz Kwiatkowski
November 22, 2017, 4:16 PM
Chongsun Ahn
November 16, 2017, 8:10 PM

Normally holding for one minute should give us enough samples if we're hitting the typical performance criteria (<0.5s), but since we've agreed to up the criteria to 3s, we can increase the "hold for" time to two minutes. Assuming we are at about 3 seconds, two minutes should give us about 40 samples, which should be sufficient. Let's change the criteria to 3 seconds and the hold-for time to 2 minutes and see where we are.
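The hold-for change described above might look like this in a Taurus-style scenario file (the scenario name and exact pass/fail criteria syntax are a sketch, not the project's actual perf-test config):

```yaml
execution:
  - concurrency: 1
    hold-for: 2m          # was 1m; ~40 samples at ~3s per request
    scenario: isa-upload  # illustrative scenario name

reporting:
  - module: passfail
    criteria:
      - p90>3s, stop as failed   # relaxed from the original 0.5s criterion
```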

Mateusz Kwiatkowski
November 16, 2017, 1:15 PM

I don't think that 0.5 seconds is achievable for now. We've been discussing the performance of this endpoint, and as I recall the only concern was that while processing one chunk of rows (we do the update asynchronously; the chunk size is 200 for now) there are a lot of SQL statements (about 200). In the review I explained that those SQLs are UPDATEs of each row in the chunk and it has to be this way (maybe we could collect all the updates and send them at once using JDBC directly? but I don't think that is something we want to do). However, some of them were fetching ISA info because Hibernate's save method returns the saved entity. We've written a custom update method and the number of SQL statements dropped a little.
In the review we agreed to change the acceptance time from 0.5 seconds to 3 seconds. For now this endpoint's performance is (it takes longer just after a deploy of the perf test server):
average: 2729ms
median: 2224ms
line90: 4990ms
This test is executed with a 'hold for 1m' duration and only 8 samples, so I think it may not be accurate enough. Maybe the test specs could be changed (increase the hold-for time, get the access token only once, or switch to a fixed number of executions).
I believe you have to decide whether we want to add some improvements to this endpoint in this ticket or in another one, or whether we are OK with this performance. Also whether we want to change the performance test a little.
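The "send them once using JDBC directly" idea floated above could look roughly like the sketch below. Table and column names are invented, and the row representation is simplified to an int pair; the chunk size of 200 comes from the comment.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: send one chunk of ISA updates as a single JDBC batch
// (one round trip) instead of ~200 separate UPDATE statements.
public class IsaBatchUpdater {

    /** Splits the parsed rows into chunks of the given size (200 in the comment above). */
    public static <T> List<List<T>> chunk(List<T> rows, int size) {
        List<List<T>> chunks = new ArrayList<>();
        for (int i = 0; i < rows.size(); i += size) {
            chunks.add(rows.subList(i, Math.min(i + size, rows.size())));
        }
        return chunks;
    }

    /** Sends one chunk as a single batched UPDATE. row[0] = new ISA amount, row[1] = row id. */
    public static void updateChunk(Connection conn, List<int[]> chunk) throws SQLException {
        String sql = "UPDATE ideal_stock_amounts SET amount = ? WHERE id = ?";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            for (int[] row : chunk) {
                ps.setInt(1, row[0]);
                ps.setInt(2, row[1]);
                ps.addBatch();
            }
            ps.executeBatch(); // one round trip per chunk of 200
        }
    }
}
```

Whether the round-trip savings justify bypassing Hibernate is exactly the trade-off the comment is weighing.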

Brandon Bowersox-Johnson
November 16, 2017, 12:55 AM

Also, is Acceptance Criteria #8 finished?

"Create a performance test for this API endpoint, and the endpoint for this test should return in less than 0.5 second for the administrator user in demo data for 90% of tests (p90) for 10k values"

Our team today was remembering that the performance test might not have been completed back before we pulled this ticket off the board. If more performance optimization is needed to meet this target, feel free to reach out if you want any support or collaboration.

Joanna Bebak
November 15, 2017, 2:17 PM

I re-tested, and the issues no longer occur. As for #3, it turned out that the values are updated but appear closer to the bottom of the csv file, not in the same place.


