AMQP/RabbitMQ integration for events or data values

I have seen in the documentation that it is possible to have a topic for DHIS2 metadata changes and updates. Is it possible to do the same with data values and events? I am interested in setting up a queue I can use to send event data and have it consumed in DHIS2.

Hi,

Sorry, but this is not currently supported. We are doing a bit of rework on this for 2.32 and would welcome a Jira issue for this if you could write one: https://jira.dhis2.org

It might have to wait for 2.33, but it would be good to have an issue made so it’s easier to track it.


Anthony,

I think there are numerous users who would like a queue system for both event data and tracker enrollments. Can you provide some more details on the environment here - where does the data come from and what’s the frequency/volume?

Regards
Calle


Hi Anthony,

Very interesting.

First off, you will realise that DHIS2 has implemented Kafka support - see the DHIS2 documentation.

However, you need to clarify: is the queue you speak of one-directional? That is, do you want to place data from an external system on a queue to be consumed by DHIS2, as with Kafka above? Or do you want your system to be notified when something changes in DHIS2 so it can pick up new events, in which case you can see what we have done before with our prototype project, GitHub - ITINordic/sadombo: dhis2 epms interop layer. We aim to improve this, perhaps even to use RabbitMQ, and to deal with case data. Either way, you can check out a possible pattern here: https://developer.nhs.uk/library/architecture/integration-patterns/publish-subscribe/
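To illustrate the publish-subscribe pattern linked above, here is a minimal in-memory sketch in Python. The broker class, topic name, and message shape are purely illustrative assumptions, not DHIS2 or RabbitMQ APIs:

```python
from collections import defaultdict

class Broker:
    """A minimal in-memory publish-subscribe broker (illustrative only)."""

    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        """Register a callback to be invoked for every message on the topic."""
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        """Deliver the message to every subscriber of the topic."""
        for callback in self._subscribers[topic]:
            callback(message)

# Usage: an external system publishes an event; a DHIS2-side consumer receives it.
broker = Broker()
received = []
broker.subscribe("dhis2.events", received.append)
broker.publish("dhis2.events", {"program": "abc", "status": "COMPLETED"})
print(received)  # → [{'program': 'abc', 'status': 'COMPLETED'}]
```

A real broker like RabbitMQ adds durability, acknowledgements, and network transport on top of this same decoupling idea: the publisher never needs to know which consumers exist.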

You can also have a look at GitHub - dhis2/dhis2-fhir-adapter: ⛔ [DEPRECATED] DHIS 2 FHIR adapter (please see the active fork at https://github.com/ITINordic/dhis2-fhir-adapter) for some developments regarding interoperability at the patient record level, which are interesting to us. You can also check out Apache Camel, which you could combine with RabbitMQ.

Regards,

Ranga


Calle,

The data is generated by an external system operating at facility level. The system forwards data to a central repository, which has a service that will transform and relay information to DHIS2 for reporting and tracking purposes.

For now the use case will be a push mechanism to DHIS2, although bi-directional would be ideal as well.
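As a sketch of such a push mechanism, the relay service could transform each facility record into an events payload and POST it to the DHIS2 Web API events endpoint. The external record's field names, the UID values, and the helper names below are made up for illustration; only the general event payload shape follows the DHIS2 Web API:

```python
import json
from urllib import request

def to_dhis2_event(record, program_uid, org_unit_uid, data_element_map):
    """Map an external facility record (hypothetical shape) to a DHIS2 event payload."""
    return {
        "program": program_uid,
        "orgUnit": org_unit_uid,
        "eventDate": record["date"],
        "status": "COMPLETED",
        "dataValues": [
            {"dataElement": data_element_map[field], "value": str(value)}
            for field, value in record["values"].items()
        ],
    }

def push_event(payload, base_url, auth_header):
    """POST the event to the DHIS2 Web API (not executed in this sketch)."""
    req = request.Request(
        base_url + "/api/events",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json", "Authorization": auth_header},
    )
    return request.urlopen(req)

# Example transform (all UIDs are placeholders, not real DHIS2 identifiers):
record = {"date": "2019-01-15", "values": {"bcg_doses": 3}}
payload = to_dhis2_event(record, "ProgramUID1", "OrgUnitUID1", {"bcg_doses": "DataElemUID"})
print(payload["dataValues"])  # → [{'dataElement': 'DataElemUID', 'value': '3'}]
```

In a queued setup, the transform would run in a consumer callback and `push_event` would be retried on failure; the sketch just separates the two concerns.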

Regards


Anthony,

So it’s basically case data (event or tracker) flowing between two single servers (uni-directional or bi-directional). That’s a very common scenario, and there are several such systems in place in e.g. South Africa (the most well-known one being MomConnect, where pregnant women/ANC clients are registered using a mobile USSD app and the data flowing to DHIS2).

Up to now, most such data exchange mechanisms have utilised a broker system like HIE/OpenHIM, which basically imports cases from the server receiving them from mobile devices, temporarily stores the data, converts it into one or more DHIS2 Web API calls, and inserts the cases into the DHIS2 server. Not always that easy to get going, though, especially if the systems on “both sides” are dynamic (as in implementing improvements or changes).

Recently we have examples of using HAPI FHIR for basically the same purpose. Volker has done some nice work on FHIR and interoperability - he recently set up a working proof-of-concept of data exchange between OpenMRS (an EMR) and DHIS 2 over FHIR: https://demos.dhis2.org/fhir-demo/index.html . Another possibility is to combine e.g. HIE with FHIR…

What you are suggesting, though, is a more direct channel where source cases are formatted and then inserted into a queue to be consumed by DHIS2 as capacity allows - a solution that has benefits in terms of simplicity. But as indicated above, it might not be optimal in a dynamic/volatile environment…

@Morten Olav Hansen - can you clarify further what might be in the pipeline around this for 2.32/33? An expansion of the Kafka queuing concept?

Regards

Calle

Hi @Calle_Hedberg

So, first of all, we are removing support for Kafka. While it was working nicely, it was a pain to set up (as it requires an external component), and since it was something we could not assume the server had, it ended up not really being used for much.

From 2.32 we have instead implemented queuing using ActiveMQ Artemis. This is embedded by default (which means all parts of DHIS2 can rely on the queue being available), and it is also easy to configure an external AMQP server to be used (either Artemis in standalone mode, or another AMQP-compatible queue like RabbitMQ).

So for 2.32, that is probably where it will end. What will land in 2.33 is a bit trickier to guess, but at the very least the new tracker services will use it for all imports (not using a special endpoint like we did with Kafka); other use cases could be logging, auditing, monitoring, etc.
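For reference, pointing DHIS2 at an external AMQP broker would be done in dhis.conf. The key names below are a sketch from memory and may differ by version, so verify them against the official system administration documentation before use:

```
# dhis.conf (sketch only - verify key names against the docs for your DHIS2 version)
amqp.mode = NATIVE          # EMBEDDED (default, in-process Artemis) or NATIVE (external broker)
amqp.host = 127.0.0.1       # hostname of the external AMQP broker
amqp.port = 5672            # standard AMQP port
amqp.username = guest
amqp.password = guest
```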


Morten
