In this post I would like to share our (@solidlines) experience connecting KoboToolbox and DHIS2. This experience was already shared in a weekly call of the DHIS2 integration working group.
Context
An organization plans to collect surveys in Kobo / KoboToolbox (a free toolkit for collecting data). However, they also want to have the surveys in a DHIS2 instance. To avoid entering the data twice, the idea is to transfer the survey information automatically from Kobo to DHIS2. The destination in DHIS2 is a single event program that was already implemented in the DHIS2 instance.
The Kobo data is available through the Kobo API, but some data curation was needed.
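To make this step concrete, here is a minimal Python sketch of pulling submissions through the KoboToolbox v2 data endpoint. The base URL, asset UID and API token below are placeholders (not values from our setup), and in the actual flow this call is made by a NiFi processor such as InvokeHTTP rather than by a script.

```python
import requests

# Assumptions (not from the original post): server URL, asset UID and token
# are placeholders; adjust them to your own KoboToolbox account.
KOBO_BASE = "https://kf.kobotoolbox.org"
ASSET_UID = "aBcDeFgHiJkLmNoPqRsT"   # hypothetical form/asset UID
API_TOKEN = "your-kobo-api-token"    # hypothetical API token

def fetch_kobo_submissions():
    """Fetch the submissions of one Kobo form via the v2 data endpoint."""
    url = f"{KOBO_BASE}/api/v2/assets/{ASSET_UID}/data/?format=json"
    resp = requests.get(url, headers={"Authorization": f"Token {API_TOKEN}"})
    resp.raise_for_status()
    # The response contains a "results" list with one dict per submission.
    return resp.json()["results"]

if __name__ == "__main__":
    for submission in fetch_kobo_submissions():
        print(submission.get("_id"), submission.get("_submission_time"))
```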
As a requirement, the integration process should run every 2 hours.
In addition, notification emails would be sent for different purposes (the process has started, there was a problem in the DHIS2 payload…), in order to monitor the automatic process.
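Purely as an illustration of the kind of notification involved (in the real pipeline the emails are sent from the NiFi flow itself, for example with a PutEmail processor), a tiny helper could look like this; every host, address and credential below is a placeholder.

```python
import smtplib
from email.message import EmailMessage

def notify(subject: str, body: str) -> None:
    """Send a short monitoring email (placeholder SMTP host and accounts)."""
    msg = EmailMessage()
    msg["Subject"] = subject
    msg["From"] = "integration@example.org"
    msg["To"] = "admin@example.org"
    msg.set_content(body)
    with smtplib.SMTP("smtp.example.org", 587) as server:
        server.starttls()
        server.login("integration@example.org", "app-password")
        server.send_message(msg)

# Example: notify("Kobo->DHIS2: run started", "The 2-hourly import has started.")
```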
For each survey retrieved from Kobo, the process double-checks whether it was already uploaded (using the DHIS2 API). If the survey was not previously uploaded, it sends the data (event payload) to DHIS2 (remember that the event program was already implemented in the DHIS2 instance).
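Below is a minimal Python sketch of these two API interactions. It assumes, as a simplification, that the Kobo submission id is stored in a dedicated data element so duplicates can be detected, and it uses placeholder URL, credentials and UIDs rather than the real ones. In the actual flow these calls are made by NiFi processors.

```python
import requests

# All identifiers below are hypothetical placeholders.
DHIS2_BASE = "https://dhis2.example.org"
AUTH = ("admin", "district")            # replace with real credentials
PROGRAM_UID = "Program1234"             # single event program UID
ORG_UNIT_UID = "OrgUnit9876"
SUBMISSION_ID_DE = "DataElem001"        # data element storing the Kobo submission id

def already_uploaded(kobo_submission_id: str) -> bool:
    """Check whether an event with this Kobo submission id already exists."""
    resp = requests.get(
        f"{DHIS2_BASE}/api/events.json",
        params={"program": PROGRAM_UID, "orgUnit": ORG_UNIT_UID, "paging": "false"},
        auth=AUTH,
    )
    resp.raise_for_status()
    for event in resp.json().get("events", []):
        for dv in event.get("dataValues", []):
            if dv["dataElement"] == SUBMISSION_ID_DE and dv["value"] == str(kobo_submission_id):
                return True
    return False

def push_event(kobo_submission_id: str, event_date: str, data_values: list) -> None:
    """Send one event payload to the /api/events endpoint."""
    payload = {
        "program": PROGRAM_UID,
        "orgUnit": ORG_UNIT_UID,
        "eventDate": event_date,          # e.g. "2024-05-01"
        "status": "COMPLETED",
        "dataValues": data_values
        + [{"dataElement": SUBMISSION_ID_DE, "value": str(kobo_submission_id)}],
    }
    resp = requests.post(f"{DHIS2_BASE}/api/events", json=payload, auth=AUTH)
    resp.raise_for_status()
```

In practice you would filter the existing events server-side instead of pulling them all, but the sketch keeps the duplicate check explicit.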
In this screenshot you can see the ETL process configured in Apache NiFi.
Thanks! I think I will need to try it step by step myself to get the complete picture.
The events endpoint is deprecated from version 2.36, so this method works on earlier versions; does it also work if one uses the /tracker/events endpoint?
Great, so the metadata is all being created in the process as well. Data and metadata are all being created at the same time as the mapping.
> The events endpoint is deprecated from version 2.36, so this method works on earlier versions; does it also work if one uses the /tracker/events endpoint?
Actually, this is still working fine in v40.6.
BTW, if you want to use the new `tracker` endpoint, you only need to change the generated payload template and update the URL of the endpoint.
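For illustration, and reusing the same placeholder instance, credentials and UIDs as in the sketches above, the switch to the new importer is roughly a change of payload shape and URL; this is a sketch, not the exact template generated by the NiFi flow.

```python
import requests

# Placeholders as before; URL and credentials are not from the original post.
DHIS2_BASE = "https://dhis2.example.org"
AUTH = ("admin", "district")

def push_event_new_tracker(program, program_stage, org_unit, occurred_at, data_values):
    """Send one event through the new tracker importer (DHIS2 2.36+)."""
    payload = {
        "events": [
            {
                "program": program,
                "programStage": program_stage,  # the program's only stage for a single event program
                "orgUnit": org_unit,
                "occurredAt": occurred_at,      # replaces "eventDate" of the old payload
                "status": "COMPLETED",
                "dataValues": data_values,
            }
        ]
    }
    # async=false makes the importer return the import report synchronously.
    resp = requests.post(
        f"{DHIS2_BASE}/api/tracker",
        params={"async": "false"},
        json=payload,
        auth=AUTH,
    )
    resp.raise_for_status()
    return resp.json()
```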
> Great, so the metadata is all being created in the process as well. Data and metadata are all being created at the same time as the mapping.
The program configuration was created previously in DHIS2. Let me update the initial post in order to reflect this.