UI Caching Design

What are we caching?

  • session data - current user, access token, system notifications, permission strings
  • offline data - requisitions, requisition templates
  • metadata stored as session data - currency and locale settings, minimal facilities

What do we want to cache?

  • session data - current user, access token, system notifications, permission strings
  • offline data - requisitions, requisition templates
  • metadata/dictionaries - currency and locale settings, minimal facilities, orderables

Why do we want to cache?

  • improved performance
  • less network traffic (a big benefit for slow connections)
  • fetching the same data repeatedly is redundant
  • offline capabilities

What is wrong with the current approach a.k.a. why do we need a new one?

Proposed redesign

We can put the data into two groups:

  • session data - all the data that should be stored throughout the session but no longer, such as the current user, their permissions, and system notifications
  • metadata/dictionaries/offline data - data that should persist across multiple sessions so that it is available offline and doesn't have to be fetched repeatedly, as it usually takes some time to load

Since we have two different groups of data that can't really be handled by a single mechanism, the proposed design describes a separate mechanism for dealing with each of the groups.

Session caching

For caching session data the openlmisSessionCacheService is used. It has a simple interface, as follows:

angular
    .module('referencedata-user')
    .service('openlmisSessionCacheService', openlmisSessionCacheService);

openlmisSessionCacheService.$inject = [/* Dependencies */];

function openlmisSessionCacheService(/* Dependencies */) {

    this.cache = cache;
    this.get = get;

    /**
     * Caches the response of the method throughout the session.
     * 
     * @param {string}   key      the key the resolved value will be available under
     * @param {Function} fn       the function to fetch the cached data, can return a Promise
     * @param {Object}   options  the options object, options:
     *                            - fetchOnLogin - defines when the data should be fetched (on login or on the first
     *                                             call to the get method with respective key), defaults to true
     */
    function cache(key, fn, options) {
        // method body
    }

    /**
     * Retrieves the cached value stored under the given key.
     *
     * @param  {string}  key  the key of the cached value
     * @return {Promise}      the promise resolved once the data is ready
     */
    function get(key) {
        // method body
    }
}

Underneath the service should rely on LocalDatabase class for storing data and registerPostLogin and registerPostLogout methods of the loginService to fetch data (either on login or on the first call to get) and clear it on logout.
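The service body is left unspecified above; as a rough, framework-agnostic sketch of the intended behavior, an in-memory Map can stand in for LocalDatabase and a clear() call for the registerPostLogout hook (all names, signatures, and the eager-fetch wiring below are assumptions, not the final implementation):

```javascript
// Hedged sketch of the session cache internals. A Map stands in for the
// LocalDatabase class; clear() would be wired to loginService's
// registerPostLogout hook, and the eager fetch to registerPostLogin.
function SessionCache() {
    var store = new Map();      // key -> Promise of the cached value
    var loaders = new Map();    // key -> fetch function registered via cache()

    this.cache = function(key, fn, options) {
        loaders.set(key, fn);
        // fetchOnLogin defaults to true: fetch eagerly on registration
        var fetchOnLogin = !options || options.fetchOnLogin !== false;
        if (fetchOnLogin) {
            store.set(key, Promise.resolve(fn()));
        }
    };

    this.get = function(key) {
        if (!store.has(key)) {
            // lazy path: the first call to get() triggers the fetch
            store.set(key, Promise.resolve(loaders.get(key)()));
        }
        return store.get(key);
    };

    this.clear = function() {
        // wired to the post-logout hook: drop all session data
        store.clear();
    };
}
```

Note that get() always returns a promise, so callers are unaffected by whether the data was fetched eagerly on login or lazily on first access.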

Example usage

Instead of manually fetching and caching the data in each place, we would have the following

(function() {

    'use strict';

    angular
        .module('referencedata-user')
        .run(routes);

    routes.$inject = ['openlmisSessionCacheService', 'UserRepository', 'authorizationService'];

    function routes(openlmisSessionCacheService, UserRepository, authorizationService) {

        function getCurrentUser() {
            return new UserRepository().get(authorizationService.getUser().user_id);
        }

        openlmisSessionCacheService.cache('currentUser', getCurrentUser);

    }

})();

To retrieve the data we would use the following snippet

openlmisSessionCacheService.get('currentUser')
    .then(function(currentUser) {
        // do something with the current user
    });

Metadata/Dictionary/Offline data caching

Caching this group is more complex, as some of the resources are versioned and some are not. To achieve this, a new layer above OpenlmisResource is added that deals with the caching of data. It has the following interface

angular
    .module('openlmis-cached-repository')
    .factory('OpenlmisCachedResource', OpenlmisCachedResource);

OpenlmisCachedResource.$inject = [/* Dependencies */];

function OpenlmisCachedResource(/* Dependencies */) {

    OpenlmisCachedResource.prototype.get = get;
    OpenlmisCachedResource.prototype.query = query;
    OpenlmisCachedResource.prototype.getAll = getAll;
    OpenlmisCachedResource.prototype.update = update;
    OpenlmisCachedResource.prototype.create = create;
    OpenlmisCachedResource.prototype.delete = deleteObject;

    return OpenlmisCachedResource;

    /**
     * @ngdoc method
     * @methodOf openlmis-cached-repository.OpenlmisCachedResource
     * @name OpenlmisCachedResource
     * @constructor
     *
     * @description
     * Creates an instance of the OpenlmisCachedResource class.
     *
     * Configuration options:
     * - paginated - flag defining whether response returned by the query request is paginated; defaults to true
     * - versioned - flag defining whether handled resource is versioned; defaults to false
     *
     * @param {String} uri    the URI pointing to the resource
     * @param {Object} config the optional configuration object, modifies the default behavior making this class
     *                        more flexible
     */
    function OpenlmisCachedResource(uri, config) {
        // implementation
    }

    /**
     * @ngdoc method
     * @methodOf openlmis-cached-repository.OpenlmisCachedResource
     * @name get
     *
     * @description
     * Retrieves an object with the given ID from the cache or from the server.
     *
     * @param  {string}  id        the ID of the object
     * @param  {string}  versionId (optional) the version of the object
     * @return {Promise}           the promise resolving to the matching object, rejected if the ID is not given or
     *                             if the request fails
     */
    function get(id, versionId) {
        // implementation
    }

    /**
     * @ngdoc method
     * @methodOf openlmis-cached-repository.OpenlmisCachedResource
     * @name query
     *
     * @description
     * Returns the response of the GET request or the cached value. Passes the given object as request parameters.
     *
     * @param  {Object}  params the map of request parameters
     * @return {Promise}        the promise resolving to the server response or cached value, rejected if request fails
     */
    function query(params) {
        // implementation
    }

    /**
     * @ngdoc method
     * @methodOf openlmis-cached-repository.OpenlmisCachedResource
     * @name getAll
     *
     * @description
     * Returns the response of the GET request or the cached value in the form of a list. Passes the given object as
     * request parameters.
     *
     * @param  {Object}  params the map of request parameters
     * @return {Promise}        the promise resolving to the server response or cached value, rejected if request fails
     */
    function getAll(params) {
        // implementation
    }

    /**
     * @ngdoc method
     * @methodOf openlmis-cached-repository.OpenlmisCachedResource
     * @name update
     * 
     * @description
     * Saves the given object on the OpenLMIS server. Uses PUT method. Caches the result.
     * 
     * @param  {Object}  object the object to be saved on the server
     * @return {Promise}        the promise resolving to the server response, rejected if request fails or object is
     *                          undefined or if the ID is undefined
     */
    function update(object) {
        // implementation
    }

    /**
     * @ngdoc method
     * @methodOf openlmis-cached-repository.OpenlmisCachedResource
     * @name create
     * 
     * @description
     * Creates the given object on the OpenLMIS server. Uses POST method. Caches the result.
     * 
     * @param  {Object}  object        the object to be created on the server
     * @param  {Object}  params        the parameters to be passed to the request
     * @return {Promise}               the promise resolving to the server response, rejected if request fails
     */
    function create(object, params) {
        // implementation
    }

    /**
     * @ngdoc method
     * @methodOf openlmis-cached-repository.OpenlmisCachedResource
     * @name delete
     * 
     * @description
     * Deletes the object on the OpenLMIS server. Removes the cached object.
     * 
     * @param  {Object}  object the object to be deleted from the server
     * @return {Promise}        the promise resolving to the server response, rejected if request fails or object is
     *                          undefined or if the ID is undefined
     */
    function deleteObject(object) {
        // implementation
    }

}
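For illustration, concrete repositories could then be built on top of this class. The resource URIs below are hypothetical, and a stub constructor stands in for the Angular-injected class so the example is self-contained:

```javascript
// Stub so the example runs without Angular; in the app the real
// OpenlmisCachedResource would be injected instead.
function OpenlmisCachedResource(uri, config) {
    this.uri = uri;
    // apply the documented defaults: paginated = true, versioned = false
    this.config = {
        paginated: !config || config.paginated !== false,
        versioned: !!(config && config.versioned)
    };
}

// hypothetical versioned resource; query responses stay paginated by default
var orderableResource = new OpenlmisCachedResource('/api/orderables', {
    versioned: true
});

// hypothetical non-versioned resource returning unpaginated responses
var reasonResource = new OpenlmisCachedResource('/api/reasons', {
    paginated: false
});
```

Callers would then use orderableResource.get(id) for the latest cached version and orderableResource.get(id, versionId) for a specific version.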

Underneath, the OpenlmisCachedResource will use LocalDatabase for caching the data and OpenlmisResource for communicating with the backend server. The following describes how each of the methods should behave:

  • OpenlmisCachedResource.get 
    • versioned
      • if the version ID is given, the component will first try to fetch the matching object from the local storage; if none is found, a request will be made
      • if the version ID is not given, the component will first fetch the latest version of the matching object from the local storage, then a request with the cached ETag will be sent; if the ETag matches, the cached object will be returned, otherwise the server response will be returned
      • saves the response in the local database
    • non-versioned
      • the component will first fetch the matching object from the local storage, then a request with the cached ETag will be sent; if the ETag matches, the cached object will be returned, otherwise the server response will be returned
      • overwrites the object in the local database
  • OpenlmisCachedResource.query
    • saves results in the local database
  • OpenlmisCachedResource.getAll
    • saves results in the local database
  • OpenlmisCachedResource.update
    • versioned
      • the newly-created version is cached in the local database
      • no previous versions are removed
    • non-versioned
      • the newly-created version is cached in the local database
      • the previously cached version is overwritten
  • OpenlmisCachedResource.create
    • the newly-created object is cached in the local database
  • OpenlmisCachedResource.delete
    • versioned
      • all versions of the deleted resource are removed from the local database
    • non-versioned
      • the cached resource is removed from the local database
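The ETag flow for the non-versioned case (and for the latest-version lookup) can be sketched as follows; `db` and `server` are simplified stand-ins for LocalDatabase and OpenlmisResource, and the response shape (`status`, `etag`, `body`) is an assumption about how the 304 Not Modified handling would surface:

```javascript
// Hedged sketch of the cache-first, ETag-validated get() flow described
// above; not the actual OpenlmisCachedResource implementation.
function cachedGet(db, server, id) {
    var cached = db.get(id);
    // send the cached ETag so the server can answer 304 Not Modified
    return server.get(id, cached ? cached.etag : undefined)
        .then(function(response) {
            if (response.status === 304) {
                // ETag matched: the cached copy is still current
                return cached.body;
            }
            // fresh copy: overwrite the cached object in the local database
            db.put(id, { etag: response.etag, body: response.body });
            return response.body;
        });
}
```

The key property is that a cache hit still costs one round trip, but a 304 response carries no body, which is the bandwidth saving on slow connections that motivates the ETag ticket below.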

Open questions

  • How to prevent redundant calls when using getAll and query methods?
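One possible answer, sketched here as an assumption rather than a decided design, is to deduplicate concurrent calls by keying in-flight requests on their serialized parameters and handing every caller the same pending promise:

```javascript
// Hedged sketch: reuse the in-flight promise for identical parameters so
// concurrent getAll/query calls trigger only one request. Names are
// illustrative, not part of the proposed interface.
function deduplicate(fetchFn) {
    var inFlight = new Map();  // serialized params -> pending Promise

    return function(params) {
        var key = JSON.stringify(params || {});
        if (!inFlight.has(key)) {
            var promise = Promise.resolve(fetchFn(params));
            // forget the promise once it settles, so later calls refetch
            promise.then(
                function() { inFlight.delete(key); },
                function() { inFlight.delete(key); }
            );
            inFlight.set(key, promise);
        }
        return inFlight.get(key);
    };
}
```

This only collapses calls made while a request is pending; whether settled results should also be reused for some time window is a separate caching decision.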

Next steps

  • create a ticket for adding ETag support for endpoints whose responses are supposed to be cached
  • create a ticket for implementing openlmisSessionCacheService
  • create a ticket for implementing OpenlmisCachedResource
  • create a ticket for refactoring places that cache the session data to use the openlmisSessionCacheService
  • create a ticket for refactoring requisition related communication to use OpenlmisCachedResource