Improving DHIS2 interoperability performance in DRC – a case study

Greetings! I am Mahmudul Islam from Bangladesh, Managing Director & Technical Lead of SoftWorks Ltd, a small software company based in Dhaka, Bangladesh.

I would like to invite you all to join the session on the following abstract during the DHIS2 Annual Conference 2022:

Session Title: Interoperability country examples
Date and time (tentative): Wednesday 22 June, 13:00-15:00
Presentation Title: Improving DHIS2 interoperability performance in DRC – a case study

The Democratic Republic of Congo (DRC) Ministry of Health has been using DHIS2 to collect and analyze routine health services data from health facilities since 2014, and the network currently covers 18,000+ facilities. The Chemonics GHSC-TA project assisted the MoH in developing a monthly LMIS reporting template inside the DHIS2 platform for reporting the stock status of key health commodities. The LMIS data entered in DHIS2, along with selected Malaria/HIV/TB aggregate patient data elements, are transferred hourly to InfoMED, an external custom data visualization platform, using the DHIS2 API. Initially the API transfer used JSON format, and it was identified that even one hour of updated DHIS2 data could exceed 450 MB on high-burden days. When the data reaches the external system, every JSON object has to be converted to a SQL statement and executed against the target MySQL database, which creates a huge burden and became a bottleneck for the target system.
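To make the per-record bottleneck concrete, here is a minimal sketch of the kind of JSON-to-SQL conversion described above. The table and column names are assumptions for illustration, not the real InfoMED schema; only the `dataValues` payload shape follows the DHIS2 `/api/dataValueSets.json` format.

```python
# Hypothetical sketch: each dataValue object in a DHIS2 JSON payload is
# mapped to one MySQL upsert statement. Doing this for every record in a
# 450 MB hourly batch is what overloaded the target database.
# (Real code should use parameterized queries rather than string
# interpolation, to avoid SQL injection.)

def datavalue_to_sql(dv):
    """Map one DHIS2 dataValue object to a MySQL upsert statement."""
    return (
        "INSERT INTO stock_data (data_element, period, org_unit, value) "
        f"VALUES ('{dv['dataElement']}', '{dv['period']}', "
        f"'{dv['orgUnit']}', '{dv['value']}') "
        "ON DUPLICATE KEY UPDATE value = VALUES(value);"
    )

# Shape of a (truncated) DHIS2 /api/dataValueSets.json response:
payload = {
    "dataValues": [
        {"dataElement": "fbfJHSPpUQD", "period": "202205",
         "orgUnit": "DiszpKrYNg8", "value": "120"},
    ]
}

statements = [datavalue_to_sql(dv) for dv in payload["dataValues"]]
```

Executing each such statement one by one against MySQL is what became the bottleneck at scale.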

The hourly DHIS2 API extraction format was changed from JSON to CSV, which reduced the data volume by roughly 80%, but the external system still struggled to keep up with the inserts, updates, and deletes because of the huge number of records. Introducing a message queue service, RabbitMQ, substantially reduced the load on the target MySQL database. Every hour, the batch of CSV data coming from DHIS2 is preprocessed and converted into meaningful SQL statements, which are then passed to RabbitMQ. The queue works like a queue in a bank: teller processes handle each message sequentially, each in a separate thread. This relieved the external system of having to execute the incoming data as a single unit. The data transfer API execution was also improved: if an API call fails, the next hour's call extracts an additional hour of data.

Results & Success:
The data transfer process is able to identify sync errors caused by organisation unit or data element mismatches, and can recover from missed API calls. There is also an option to bulk-load a full month's data, which would be nearly impossible to do manually because of the data volume.
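One reading of the catch-up rule described above (each consecutive failed hourly call widens the next call's extraction window by one hour) can be sketched as below. `lastUpdatedDuration` is a real query parameter of the DHIS2 `/api/dataValueSets` endpoint; the widening rule itself is an assumption based on the abstract.

```python
# Hedged sketch: widen the extraction window so that missed hourly API
# calls are covered by the next successful one.

def extraction_window_hours(consecutive_failures):
    """Hours of updated data the next hourly API call should request."""
    return 1 + consecutive_failures


def api_params(consecutive_failures):
    """Query parameters for a DHIS2 /api/dataValueSets catch-up call."""
    hours = extraction_window_hours(consecutive_failures)
    return {"lastUpdatedDuration": f"{hours}h"}
```

For example, if the last two hourly calls failed, the next call would request three hours of updated data, so no batch is skipped.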

Challenges & Lessons Learned:
During DHIS2 version upgrades the data transfer APIs are unavailable, which leads to a backlog of data waiting to be transferred to the external system. Proper matching of facilities and data elements between DHIS2 and the external system is vital for uninterrupted data transfer.


Interesting case of interoperability @Mahmud :+1: