Importing data from external system?

DHIS seems to do a good job of importing data from another DHIS system.
However, I would like to use DHIS as a data warehouse to suck up data from other systems in the country (vertical programs).
I’ve spent some time looking at the XML format, and it looks like it could be emulated by another system, but it will need the id codes for periods, facilities, data elements, etc., so it will be a bit tedious.

Has anyone done work on this problem? I’m thinking of some tool to map the external data to the DHIS dataset which would allow a “drag and drop” match.

Regards,
Mark

···

Mark Spohr, MD

Hi Mark,

I share the same dilemma. I wanted to use DHIS for dashboarding several spreadsheets, but the fundamentals (on my side, not DHIS’s) are missing. Master tables for a few dimensions are needed to transform external data into DHIS:

- a master location table (which corresponds to org units in DHIS2). DHIS2 already provides the placeholders, but local users have to populate it (correctly). I think this is more art than science and involves a lot of policy decisions.

- a master date table (should be easy but is anyone willing to share theirs?)

- for name-based data, a unique numbering system

- an ontology of services rendered to the population (BCG is a vaccine, which is a medicine, etc.)
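As a rough starting point for the master date table, something like the sketch below would generate monthly period rows. The column layout here is invented for illustration and is not DHIS2's own period schema:

```python
# Sketch of a master date (period) dimension for monthly periods.
# Columns (iso period id, start date, end date, quarter) are an assumed
# layout, not DHIS2's actual period table.
import calendar
from datetime import date

def monthly_periods(year):
    """Yield (iso_period, startdate, enddate, quarter) rows for one year."""
    for month in range(1, 13):
        last_day = calendar.monthrange(year, month)[1]  # handles leap years
        yield (f"{year}{month:02d}",
               date(year, month, 1),
               date(year, month, last_day),
               (month - 1) // 3 + 1)

rows = list(monthly_periods(2012))
```

The same pattern extends to quarterly or yearly period types by changing the step and the end-date rule.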

As a policymaker, I now find a lack of local capacity on DHIS. Even if I say "study DHIS2 and plug our data into it", there is still paralysis. Unlike an academic environment, where almost everyone is willing to experiment and fail, government employees are wary of investing in an "untested" application (untested here means more like "I don't know anyone else at the office using that. Who do I turn to if I have a question?").

Challenges in change management....

Alvin

_______________________________________________
Mailing list: https://launchpad.net/~dhis2-users
Post to : dhis2-users@lists.launchpad.net
Unsubscribe : https://launchpad.net/~dhis2-users
More help : https://help.launchpad.net/ListHelp

I’m in the middle of an implementation now and find that an extreme amount of “hand-holding” is required.
However, after they have been through it they generally follow through.

The other issue is getting them to experiment with reports or to even state what they would like to see in a report.

Anyway, I’ll share whatever I come up with on “foreign” data import. I have a feeling that it will be a kludge.

It would be useful to find a description of all of the tables and data elements in DHIS so I could do some queries to get the index ids for the data elements I need to specify.

Mark


Hi Mark,

This is exactly what we’re doing in Rwanda. We’ve set up one instance of DHIS-2 as our HMIS (for routine data entry by health facilities across the country) and a second instance as a national data warehouse/dashboard – more intended for program managers, implementing partners and donors. Bob Jolliffe has been here helping us put together scripts to automatically synchronize subsets of the data between the two instances as new data is entered in the HMIS (I created a special dataset called datawarehouse in HMIS that gets pushed across). We’re also going to use the extended attributes for dataelements and indicators in the data warehouse instance to maintain our metadata dictionary with additional fields such as primary data source, precise definition, intended use, staff responsible for collection, etc.

Bringing data in from other systems is still not easy – though now that many of our other data sources are web-enabled it is practical. As you note, you need to use the code field in each of the major data entities (dataelement, indicator, orgunit) that all systems share. It is not difficult to create a view of the period table that can be used to translate periodids when importing data – for example, here is the SQL that gives you the year, month and quarter for all periods in your period table:

SELECT periodtype.name AS periodtype, period.periodid, period.startdate, period.enddate,
       date_part('year'::text, period.startdate) AS periodyear,
       date_part('month'::text, period.startdate) AS periodmonth,
       CASE
           WHEN date_part('month'::text, period.startdate) = ANY (ARRAY[1, 2, 3]) THEN 1
           WHEN date_part('month'::text, period.startdate) = ANY (ARRAY[4, 5, 6]) THEN 2
           WHEN date_part('month'::text, period.startdate) = ANY (ARRAY[7, 8, 9]) THEN 3
           ELSE 4
       END AS periodquarter
FROM period, periodtype
WHERE period.periodtypeid = periodtype.periodtypeid;

Bob relies on DXF or similar XML import mechanisms – partly because of Postgres’ requirement to assign a unique id to each record across all tables (whose current value is maintained in the hibernate_sequence object) – and it is definitely the safest way to go. I’ve found it is also relatively easy to do with a combination of Excel and a visual query designer like Access – linked to the Postgres tables – as long as I check and increment the current value before and after imports (and nobody else is working with the database)! Of course it depends upon how similar in structure your source data is to DHIS – otherwise you may need to do multiple transformations of the data beforehand. If you are using a lot of category combinations (age/gender, etc.) as opposed to just the default categorycombo, it is also more difficult, because they also need to be mapped to the categorycomboids.
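The code-based mapping step might look roughly like the sketch below. The lookup dictionaries and the external record layout are invented placeholders; in practice the dictionaries would be loaded from the dataelement, organisationunit and period tables (e.g. via the period view above):

```python
# Sketch (hypothetical ids and column names): translate external records to
# DHIS internal ids using the shared "code" field. The dictionaries below
# stand in for queries against the dataelement, organisationunit and period
# tables.
dataelement_ids = {"ANC1": 1001, "ANC2": 1002}     # code -> dataelementid
orgunit_ids = {"FAC001": 2001, "FAC002": 2002}     # code -> organisationunitid
period_ids = {("2012-03-01", "2012-03-31"): 3003}  # (startdate, enddate) -> periodid

def translate(record):
    """Map one external record to DHIS internal ids, or None if unmappable."""
    try:
        return {
            "dataelementid": dataelement_ids[record["element_code"]],
            "sourceid": orgunit_ids[record["facility_code"]],
            "periodid": period_ids[(record["start"], record["end"])],
            "value": record["value"],
        }
    except KeyError:
        return None  # unmapped code: flag for manual review rather than guess

row = translate({"element_code": "ANC1", "facility_code": "FAC001",
                 "start": "2012-03-01", "end": "2012-03-31", "value": "42"})
```

Returning None for unmapped codes, rather than inventing ids, keeps the problem rows visible for manual review.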

A drag and drop interface would be great… but we’re far from it now.

Randy


Just to add to that list of places, we are doing some integration of data coming from Baobab’s BART systems into DHIS2 here in Malawi. We discussed many different methods of data import into DHIS2 and reached conclusions on which solutions might be appropriate in which context when exchanging data between systems.

In Malawi, the datasets are fairly stable now and there is a central DHIS2 system. The Baobab system also has a common set of reports that need to be sent monthly. Hence the Baobab system uses the DHIS2 web-api dataValueSets resource to send data into DHIS. This is a simple XML report of datavalues that has been aggregated monthly and is reported anyway by the Baobab system.

  • One needs to initially do a GET on the organization unit

  • Then GET on the selected dataset (ANC Monthly in our case)

  • Then GET to check the ids of the data elements in a dataset

  • Then create a dataValueSets representation and POST this

We are still testing this out for continuous integration, but it seems easy and low-hanging fruit.
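The final build-and-POST step might be sketched as below. The UIDs, dataset and server URL are placeholders only; in a real run the preceding GETs on the organisation units, the dataset and its data elements supply the actual identifiers:

```python
# Sketch of building a dxf2-style dataValueSet for the web-api.
# All UIDs and the URL below are placeholders, not real identifiers.
import xml.etree.ElementTree as ET

DXF2_NS = "http://dhis2.org/schema/dxf/2.0"

def build_datavalueset(dataset_uid, period, orgunit_uid, values):
    """Serialize {dataElement uid: value} into a dataValueSet XML string."""
    root = ET.Element("dataValueSet", xmlns=DXF2_NS, dataSet=dataset_uid,
                      period=period, orgUnit=orgunit_uid)
    for element_uid, value in values.items():
        ET.SubElement(root, "dataValue", dataElement=element_uid, value=str(value))
    return ET.tostring(root, encoding="unicode")

payload = build_datavalueset("abc123def45", "201203", "xyz987uvw65",
                             {"deUID0000001": 10, "deUID0000002": 25})

# POST step (sketch; needs the third-party 'requests' package and a live server):
# requests.post("https://example.org/api/dataValueSets", data=payload,
#               headers={"Content-Type": "application/xml"},
#               auth=("admin", "district"))
```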

···

Regards,
Saptarshi PURKAYASTHA

My Tech Blog: http://sunnytalkstech.blogspot.com

You Live by CHOICE, Not by CHANCE


Hi Saptarshi and all

I think that's a really critical point. The early stages of implementation tend to see more extreme fluctuations as the codes, datasets and orgunit structures stabilize. It really is a requirement to have these stabilized to a certain extent before trying to link up various systems, to avoid reimplementing solutions over and over.

Then there are distinct but related problems of (i) sharing structural metadata and (ii) sharing data between systems. In the simplest case, structural metadata is just a dataset description, as you describe in your Baobab scenario. For that I am sure you are right - the web api is really well suited, and I suspect it will meet 80% of common use cases, i.e. systems reporting datasets into DHIS. Though I know Morten is at pains to point out that this API too is very recent and will be subject to some change, though probably not too fundamental.

It does start to get more complex when you want to synchronize entire hierarchies, groupsets etc. between systems. These problems are not yet really solved out-of-the-box and generally still require some innovative scripting of custom solutions. Some of these problems are being addressed in the ongoing design process of the web api.

Then there is the question of communicating data. On the XML side there are currently two ways in, and a variety of formats supported or potentially supported. This needs to be both rationalised and better documented, but there are quite a few processes happening simultaneously:

(i) the web api. Simple to use. Supports xml (and json?) datavaluesets. Uses uid identifiers. Datavalueset defined in the dxf2 namespace.

(ii) the stream-based dxf import in the legacy import-export module. Supports the dxf1 xml format which is currently produced on dhis2 output, as well as a dxf2 datavalueset which still has minor differences from the format used in the web api. Most important of these is the ability to use either codes (which might be externally assigned) or DHIS custom uids (a plus). Also, it currently only supports default-category dataelements (a minus). For the moment, data import which uses different disaggregations cannot be done directly via this route.

The other functionality of the stream-based import is the ability to load a custom xslt transform for incoming xml, to transform it to either dxf1 or dxf2. This is the way, for example, that an sdmx-hd cross-sectional dataset is imported as a dxf2 datavalueset, and it works well for that. In fact the basic schema of a dxf2 datavalueset is strongly inspired (and not accidentally!) by the sdmx-hd schema.

In principle this does mean that any datavalueset in an xml format where the codes are somehow mappable can be imported.
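The code-mapping idea can be illustrated with a small sketch. The `<report>` input format and the code map below are invented for illustration; in the import module this job would normally be done by a custom XSLT rather than Python:

```python
# Sketch: map a foreign XML report onto a dxf2-style dataValueSet by code.
# The <report> input format and CODE_MAP are hypothetical examples.
import xml.etree.ElementTree as ET

FOREIGN = """<report facility="FAC001" month="201203">
  <item code="ANC_FIRST" count="10"/>
  <item code="ANC_REVISIT" count="25"/>
</report>"""

CODE_MAP = {"ANC_FIRST": "ANC1", "ANC_REVISIT": "ANC2"}  # foreign -> DHIS code

def to_datavalueset(xml_text):
    """Rewrite a foreign report into a dataValueSet-shaped element tree."""
    src = ET.fromstring(xml_text)
    out = ET.Element("dataValueSet", period=src.get("month"),
                     orgUnit=src.get("facility"))
    for item in src.findall("item"):
        ET.SubElement(out, "dataValue",
                      dataElement=CODE_MAP[item.get("code")],
                      value=item.get("count"))
    return out

dvs = to_datavalueset(FOREIGN)
```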

Outstanding issues which need to be solved (or solved better), as I see it, in no particular order:
(i) harmonising the xml in the web api and the import module
(ii) better support for disaggregated data (without 3rd party systems having to 'understand' categoryoptioncombo)
(iii) enhanced support for synching metadata between systems
(iv) stabilization and documentation of APIs and schemas

At the moment most interoperability problems are solvable, but they require navigating an over-complex labyrinth of undocumented and inconsistent functionality. To be fair, this has also been due to a lack of concrete use cases. I have been involved in a number of "synthetic" scenarios over the past few years where it turned out that either the 3rd party system didn't really exist or the apparent use case wasn't really required at all :-)

The situation overall is greatly improved over the past year with the introduction of uids, the possibility to use codes to map against 3rd party systems, and the beginnings of the web-api.

I also have some useful meat now from working with Randy and team in Rwanda. And there is also a growing interest in interoperating with national facility registry software, which may well become a reality in some countries.

I think it would be really, really useful to start collecting some of these existing use cases - particularly concrete ones such as those described by the contributors to this thread - in some more detail. Including those which are straightforward, those which are doable but difficult, and those which seem to elude us at present.

Regards
Bob


Thanks for all of these great ideas.
The web-api sounds most interesting now. I’ll have to spend some time with it. This may be a good way to ease the difficulty of correlating all of the ids when importing external data.

Mark

···

On Sat, Mar 17, 2012 at 1:47 AM, Bob Jolliffe bobjolliffe@gmail.com wrote:

On 17 March 2012 09:15, Saptarshi Purkayastha sunbiz@gmail.com wrote:

Just to add to that list of places, we are doing some integration of data

coming from Baobab’s BART systems into DHIS2 here in Malawi. We discussed

many different methods of data import into DHIS2 and reached to conclusions

on what solutions might be appropriate to what context when exchanging data

between systems.

In the Malawi, the dataset are fairly stable now and there is a central

DHIS2 system.

Hi Saptarshi and all

I think that’s a really critical point. Early stage of implementation

tends to see more extreme fluctuations as the codes and datasets and

orgunit structures stabilize. It really is a requirement to have

these stabilized to a certain extent before trying to link up various

systems to avoid reimplementing solutions over and over.

Then there are distinct but related problems of (i) sharing structural

metadadata and (ii) sharing data between systems. In the simplest

case structural metadata is just a dataset description as you describe

in your Baobab scenario. For that I am sure you are right - the web

api is really well suited. And I suspect it will meet 80% of common

use cases ie. systems reporting datasets into dhis Though I know

Morten is at pains to point out that this API too is very recent and

will be subject to some change, though probably not too fundamental.

It does start to get more complex when you want to synchronize entire

hierarchies, groupsets etc between systems. These problems are not

yet really solved out-of-the-box and generally still requires some

innovative scripting of custom solutions. Some of these problems are

being addressed in the ongoing design process of the web api.

Then there is the question of communicating data. On the xml side

there are currently 2 ways in and a variety of formats supported or

potentially supported. This needs to be both rationalised and better

documented but there are quite a few processes happening

simultaneously:

(i) the web api. Simple to use. Supports xml (and json?)

datavaluesets. Uses uid indentifiers. Datavaluest defined in dxf2

namespace.

(ii) the stream based dxf import in the legacy import-export module.

Supports the dxf1 xml format which is currently produced on dhis2

output as well as a dxf2 datavalueset which still has minor

differences with the format used in web api. Most important of which

is the ability to use either codes (which might be externally

assigned) as well as DHIS custom uids (a plus). Also currently only

supports default category dataelements (a minus). For the moment data

import which uses different disaggregations cannot be done directly in

this route

The other functionality of the stream based import is the ability to

load a custom xslt transform for incoming xml to transform it to

either dxf1 or dxf2. This is the way, for example, that an sdmx-hd

cross sectional dataset is imported as a dxf2 datavalueset and it

works well for that. In fact the basic schema of a dxf2 datavalueset

is strongly inspired by (and not accidentally!) by the sdmx hd schema.

In principle this does mean that any datavalueset in an xml format

where the codes are somehow mappable can be imported.

Outstanding issues which need to be solved (or solved better) as I see

it in no particular order are:

(i) harmonising of the xml in the web api and the import module

(ii) better support for dissagregated data (without 3rd party systems

having to ‘understand’ categoryoptioncombo)

(iii) enhanced support for synching metadata between systems

(iv) stabilization and documentation of APIs and schemas

At the moment most interoperability problems are solvable but require

navigation of an over complex labyrinth of undocumented and

inconsistent functionality. To be fair, this has also been due to a

lack of concrete use cases. I have been involved in a number of

“synthetic” scenarios over the past few years where it has turned out

that either the 3rd party system didn’t really exist or the apparent

use case wasn’t really required at all :slight_smile:

The situation overall is greatly improved over the past year with the

intoduction of uids, the possibility to use codes to map against 3rd

party systems and the beginnings of the web-api.

I also have some useful meat now from working with Randy and team in

Rwanda. And there is also a growing interest in interoperating with

national facility registry software which may well become reality in

some countries.

I think it would be really, really useful to start collecting some of

these existing use cases - particularly concrete ones such as

described by the contributors to this thread - in some more detail.

Including those which are straightforward, those which are doable but

difficult and those which seem to elude us at present.

Regards

Bob

The Baobab system also has a common set of report that needs

to be sent monthly. Hence the Baobab system uses the DHIS2’s web-api

dataValueSets resource to send data into DHIS. This is a simple XML report

of datavalues that has been aggregated monthly and reported anyways by the

Baobab system.

  • One needs to initially do a GET on the organization unit
  • Then GET on the selected dataset (ANC Monthly in our case)
  • Then GET to check the ids of the data elements in a dataset
  • Then create a dataValueSets representation and POST this

We are still testing this out for continuous integration, but seems easy and

low hanging fruit.


Regards,

Saptarshi PURKAYASTHA

My Tech Blog: http://sunnytalkstech.blogspot.com

You Live by CHOICE, Not by CHANCE

On 17 March 2012 09:05, Wilson,Randy rwilson@msh.org wrote:

Hi Mark,

This is exactly what we’re doing in Rwanda. We’ve set up one instance of

DHIS-2 as our HMIS (for routine data entry by health facilities across the

country) and a second instance as a national data warehouse/dashboard – more

intended for program managers, implementing partners and donors. Bob

Jolliffe has been here helping us put together scripts to automatically

synchronize sub-sets of the data between the two instances as new data is

entered in the HMIS (I created a special dataset called datawarehouse in

HMIS that gets pushed across). We’re also going to use the extended

attributes for dataelements and indicators in the data warehouse instance to

maintain our metadata dictionary with additional fields such as: primary

data source, precise definition, intended use, staff responsible for

collection, etc….

Bringing data in from other systems is still not easy – though now that

many of our other data sources are web enabled it is practical. As you

note, you need to use the code field in each of the major data entities

(dataelement, indicator, orgunit) that all systems share. It is not

difficult to create a view of the period table that can be used to translate

periodids when importing data – for example here is the sql that gives you

the year, month and quarter for all periods in your period table:

SELECT periodtype.name AS periodtype, period.periodid, period.startdate,
       period.enddate,
       date_part('year'::text, period.startdate) AS periodyear,
       date_part('month'::text, period.startdate) AS periodmonth,
       CASE
           WHEN date_part('month'::text, period.startdate) = ANY (ARRAY[1, 2, 3]) THEN 1
           WHEN date_part('month'::text, period.startdate) = ANY (ARRAY[4, 5, 6]) THEN 2
           WHEN date_part('month'::text, period.startdate) = ANY (ARRAY[7, 8, 9]) THEN 3
           ELSE 4
       END AS periodquarter
FROM period, periodtype
WHERE period.periodtypeid = periodtype.periodtypeid;
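For pre-processing external data outside the database, the same derivation can be sketched in Python. The quarter logic mirrors the view's CASE expression; the `YYYYMM` monthly period string is the standard DHIS2 convention, and everything else here is illustrative:

```python
from datetime import date

def period_info(startdate: date) -> dict:
    """Mirror the SQL view's logic: derive year, month and quarter
    from a period's start date (months 1-3 -> Q1, 4-6 -> Q2, ...)."""
    return {
        "year": startdate.year,
        "month": startdate.month,
        "quarter": (startdate.month - 1) // 3 + 1,
        "monthly": f"{startdate.year}{startdate.month:02d}",  # e.g. 201203
    }
```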

Bob relies on DXF or similar XML import mechanisms – partly because of Postgres’ requirement to assign a unique id to each record across all tables, whose current value is maintained in the hibernate_sequence object – and it is definitely the safest way to go. I’ve found it is also relatively easy to do with a combination of Excel and a visual query designer like Access – linked to the Postgres tables – as long as I check and increment the current value before and after imports (and nobody else is working with the database)! Of course it depends upon how similar in structure your source data is to DHIS – otherwise you may need to do multiple transformations of the data beforehand. If you are using a lot of category combinations (age/gender, etc.) as opposed to just the default categorycombo, it is also more difficult, because they need to be mapped to the categorycomboids.

A drag and drop interface would be great… but we’re far from it now.

Randy

From: dhis2-users-bounces+rwilson=msh.org@lists.launchpad.net [mailto:dhis2-users-bounces+rwilson=msh.org@lists.launchpad.net] On Behalf Of Mark Spohr
Sent: Saturday, March 17, 2012 1:25 AM
To: dhis2-users@lists.launchpad.net
Subject: [Dhis2-users] Importing data from external system?

DHIS seems to do a good job of importing data from another DHIS system. However, I would like to use DHIS as a data warehouse to suck up data from other systems in the country (vertical programs).

I’ve spent some time looking at the XML format and it looks like it could be emulated by another system, but it will need to have the id codes for periods, facilities, data elements, etc., so it will be a bit tedious.

Has anyone done work on this problem? I’m thinking of some tool to map the external data to the DHIS dataset which would allow a “drag and drop” match.

Regards,

Mark

Mark Spohr, MD


Mailing list: https://launchpad.net/~dhis2-users

Post to : dhis2-users@lists.launchpad.net

Unsubscribe : https://launchpad.net/~dhis2-users

More help : https://help.launchpad.net/ListHelp




Mark Spohr, MD
mhspohr@gmail.com
+1 530 554 2230

Has anybody been able to leverage SDMX-HD to import data from other systems?

···

On Sat, Mar 17, 2012 at 10:22 PM, Mark Spohr mhspohr@gmail.com wrote:

Thanks for all of these great ideas.
The web-api sounds most interesting now. I’ll have to spend some time with it. This may be a good way to ease the difficulty of correlating all of the ids with the import of external data.

Mark

On Sat, Mar 17, 2012 at 1:47 AM, Bob Jolliffe bobjolliffe@gmail.com wrote:

On 17 March 2012 09:15, Saptarshi Purkayastha sunbiz@gmail.com wrote:

Just to add to that list of places, we are doing some integration of data coming from Baobab’s BART systems into DHIS2 here in Malawi. We discussed many different methods of data import into DHIS2 and reached conclusions on what solutions might be appropriate to what context when exchanging data between systems.

In Malawi, the datasets are fairly stable now and there is a central DHIS2 system.

Hi Saptarshi and all

I think that’s a really critical point. The early stage of implementation tends to see more extreme fluctuations as the codes and datasets and orgunit structures stabilize. It really is a requirement to have these stabilized to a certain extent before trying to link up various systems, to avoid reimplementing solutions over and over.

Then there are distinct but related problems of (i) sharing structural metadata and (ii) sharing data between systems. In the simplest case structural metadata is just a dataset description, as you describe in your Baobab scenario. For that I am sure you are right – the web api is really well suited. And I suspect it will meet 80% of common use cases, i.e. systems reporting datasets into dhis. Though I know Morten is at pains to point out that this API too is very recent and will be subject to some change, though probably not too fundamental.

It does start to get more complex when you want to synchronize entire hierarchies, groupsets etc. between systems. These problems are not yet really solved out-of-the-box and generally still require some innovative scripting of custom solutions. Some of these problems are being addressed in the ongoing design process of the web api.

Then there is the question of communicating data. On the xml side there are currently 2 ways in, and a variety of formats supported or potentially supported. This needs to be both rationalised and better documented, but there are quite a few processes happening simultaneously:

(i) the web api. Simple to use. Supports xml (and json?) datavaluesets. Uses uid identifiers. Datavalueset defined in the dxf2 namespace.

(ii) the stream based dxf import in the legacy import-export module. Supports the dxf1 xml format which is currently produced on dhis2 output, as well as a dxf2 datavalueset which still has minor differences with the format used in the web api. Most important of which is the ability to use codes (which might be externally assigned) as well as DHIS custom uids (a plus). Also currently only supports default category dataelements (a minus). For the moment, data import which uses different disaggregations cannot be done directly via this route.

The other functionality of the stream based import is the ability to load a custom xslt transform for incoming xml, to transform it to either dxf1 or dxf2. This is the way, for example, that an sdmx-hd cross sectional dataset is imported as a dxf2 datavalueset, and it works well for that. In fact the basic schema of a dxf2 datavalueset is strongly inspired (and not accidentally!) by the sdmx-hd schema. In principle this does mean that any datavalueset in an xml format where the codes are somehow mappable can be imported.
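As a toy illustration of "codes are somehow mappable": external codes can be rewritten to DHIS2 codes before import, with anything unmapped set aside for manual review. The mapping table and record shape below are invented, not taken from any of the systems discussed in this thread.

```python
# Rewrite external codes to DHIS2 data element codes before import,
# collecting unmapped records for manual review. Mapping is invented.
CODE_MAP = {
    "BCG_GIVEN": "DE_BCG",
    "OPV0_GIVEN": "DE_OPV0",
}

def map_codes(records):
    mapped, unmapped = [], []
    for rec in records:
        target = CODE_MAP.get(rec["code"])
        if target is None:
            unmapped.append(rec)               # surface for manual mapping
        else:
            mapped.append({**rec, "code": target})
    return mapped, unmapped
```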

Outstanding issues which need to be solved (or solved better), as I see it, in no particular order are:

(i) harmonising of the xml in the web api and the import module
(ii) better support for disaggregated data (without 3rd party systems having to ‘understand’ categoryoptioncombo)
(iii) enhanced support for synching metadata between systems
(iv) stabilization and documentation of APIs and schemas

At the moment most interoperability problems are solvable but require navigation of an over-complex labyrinth of undocumented and inconsistent functionality. To be fair, this has also been due to a lack of concrete use cases. I have been involved in a number of “synthetic” scenarios over the past few years where it has turned out that either the 3rd party system didn’t really exist or the apparent use case wasn’t really required at all :-)

The situation overall is greatly improved over the past year with the introduction of uids, the possibility to use codes to map against 3rd party systems, and the beginnings of the web-api.

I also have some useful meat now from working with Randy and team in Rwanda. And there is also a growing interest in interoperating with national facility registry software, which may well become reality in some countries.

I think it would be really, really useful to start collecting some of these existing use cases – particularly concrete ones such as described by the contributors to this thread – in some more detail. Including those which are straightforward, those which are doable but difficult, and those which seem to elude us at present.

Regards

Bob



Yes, we use SDMX-HD to import data from an openmrs based hospital system in India. We use it very simplistically (and therefore effectively) to read sdmx-hd cross-sectional data messages without bothering to exchange the DSD. The matching of codes is done manually at present because they are few, but this needs to be enhanced to read the codelists. There is an online demo site of this but I am not sure if it is still being used to develop datasets - if this is complete (I think it should be just about) I’ll share the url and you can take a look. It uses this module, which is not cosmetically complete or “nice”, but it produces valid sdmx-hd messages which we can consume: https://github.com/hispindia/SDMXHDataExport

Also we have used sdmx-hd to import data from iHRIS http://www.ihris.org/wiki/SDMX-HD_Data_Export_–_Kenya and to import data from OpenMRS using the module developed by Jembi. This uses an older approach which is not so flexible but has the advantage of supporting categorycombo data. Fixing up multidimensional data import for dxf2 is still to be done.

I am going to complete the export of Rwanda PBF data in SDMX-HD this week as well (hopefully), and will share that information when it’s done. We are using a very simple approach, as there is no real api to that system: sucking the data from the database and formatting it as an sdmx message.

There is a lot in the sdmx-hd standard which is excessive and over-complex and which can be safely ignored while still remaining conformant to the standard. If you look inside the dhis source code in the import/export module you will see that there is a very simple xslt transform which is applied to the incoming sdmx-hd cross-sectional message to produce a dxf2 datavalueset message: http://bazaar.launchpad.net/~dhis2-devs-core/dhis2/trunk/view/head:/dhis-2/dhis-services/dhis-service-importexport/src/main/resources/transform/cross2dxf2.xsl. This is triggered automatically (by recognizing the root element in the xml), so you just import the sdmx file the same way you import any dxf file: http://bazaar.launchpad.net/~dhis2-devs-core/dhis2/trunk/view/head:/dhis-2/dhis-services/dhis-service-importexport/src/main/resources/transform/transforms.xml
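The "recognize the root element, pick a transform" dispatch can be sketched like this. A simplified illustration only: the real mechanism is the XSLT registration in transforms.xml linked above, and the registry entries here are assumptions.

```python
import xml.etree.ElementTree as ET

# Illustrative registry: root element name -> transform to apply before
# import. The real registry lives in transforms.xml; entries here are
# assumptions for the sketch.
TRANSFORMS = {
    "CrossSectionalData": "cross2dxf2.xsl",  # SDMX-HD cross-sectional message
    "dataValueSet": None,                    # already dxf2, import directly
}

def pick_transform(xml_text: str):
    """Return the transform for a document, keyed on its root element."""
    root_tag = ET.fromstring(xml_text).tag.split("}")[-1]  # drop any namespace
    if root_tag not in TRANSFORMS:
        raise ValueError(f"no import route for root element <{root_tag}>")
    return TRANSFORMS[root_tag]
```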

Bob

···

On 19 March 2012 14:10, David Smith dsmith11@stevens.edu wrote:

Has anybody been able to leverage SDMX-HD to import data from other systems?


Holy smokes!! Thanks for the “on-point” answer. I’m very interested to hear about your OpenMRS experience because that’s the system from which I’m going to need to export data to provide to DHIS2. I look forward to hearing about your results with exporting the Rwanda PBF data.

Thanks again.

···

On Mon, Mar 19, 2012 at 11:44 AM, Bob Jolliffe bobjolliffe@gmail.com wrote:

Yes we use use SDMX+HD to import data from openmrs based hospital system in India. Using it very simplistically (and therefor effectively) to read sdmx+hd cross+sectional data messages without bothering to exchange DSD. The matching of codes is done manually at present because they are few, but this needs to be enhanced to read the codelists. There is anonline demo site of this but I am not sure if it is still being used to develop datasets - if this is complete (I think it should be just about) I’ll share the url and you can take a look. Uses this module, which is not cosmetically complete or “nice” but it produces valid sdmx-hd messages which we can consume. https://github.com/hispindia/SDMXHDataExport .

Also we have used sdmx+hd to import data from iHRIS http://www.ihris.org/wiki/SDMX-HD_Data_Export_–_Kenya and to import data from OpenMRS using the module developed by Jembi. This uses an older approach which is not so flexible but has the advantage of supporting categorycombo data. Fixing up multidimensional data import for dxf2 is still to be done.

I am going to complete the export of Rwanda pbf data in SDMX-HD this week as well (hopefully). Will share that information when its done. Using a very simple approach, as there is no real api to that system, of sucking the data from the database and formatting as an sdmx message.

There is a lot in the sdmx-hd standard which is excessive and over complex which can be safely ignored and still remain conformant to the standard. If you look inside the dhis source code in the import/export module you will see that there is a very simple xslt transform which is applied to the incoming sdmx-hd cross-sectional message to produce a dxf2 datavalueset message. http://bazaar.launchpad.net/~dhis2-devs-core/dhis2/trunk/view/head:/dhis-2/dhis-services/dhis-service-importexport/src/main/resources/transform/cross2dxf2.xsl. This is triggered automatically (by recognizing the root element in the xml) so you just import the sdmx file in the same way you import any dxf file. http://bazaar.launchpad.net/~dhis2-devs-core/dhis2/trunk/view/head:/dhis-2/dhis-services/dhis-service-importexport/src/main/resources/transform/transforms.xml

Bob

On 19 March 2012 14:10, David Smith dsmith11@stevens.edu wrote:

Has anybody been able to leverage SDMX-HD to import data from other systems?

On Sat, Mar 17, 2012 at 10:22 PM, Mark Spohr mhspohr@gmail.com wrote:

Thanks for all of these great ideas.
The web-api sounds most interesting now. I’ll have to spend some time with it. This may be a good way to ease the difficulty of correlating all of the ids with the import of external data.

.Mark

On Sat, Mar 17, 2012 at 1:47 AM, Bob Jolliffe bobjolliffe@gmail.com wrote:

On 17 March 2012 09:15, Saptarshi Purkayastha sunbiz@gmail.com wrote:

Just to add to that list of places, we are doing some integration of data

coming from Baobab’s BART systems into DHIS2 here in Malawi. We discussed

many different methods of data import into DHIS2 and reached to conclusions

on what solutions might be appropriate to what context when exchanging data

between systems.

In the Malawi, the dataset are fairly stable now and there is a central

DHIS2 system.

Hi Saptarshi and all

I think that’s a really critical point. Early stage of implementation

tends to see more extreme fluctuations as the codes and datasets and

orgunit structures stabilize. It really is a requirement to have

these stabilized to a certain extent before trying to link up various

systems to avoid reimplementing solutions over and over.

Then there are distinct but related problems of (i) sharing structural

metadadata and (ii) sharing data between systems. In the simplest

case structural metadata is just a dataset description as you describe

in your Baobab scenario. For that I am sure you are right - the web

api is really well suited. And I suspect it will meet 80% of common

use cases ie. systems reporting datasets into dhis Though I know

Morten is at pains to point out that this API too is very recent and

will be subject to some change, though probably not too fundamental.

It does start to get more complex when you want to synchronize entire

hierarchies, groupsets etc between systems. These problems are not

yet really solved out-of-the-box and generally still requires some

innovative scripting of custom solutions. Some of these problems are

being addressed in the ongoing design process of the web api.

Then there is the question of communicating data. On the xml side

there are currently 2 ways in and a variety of formats supported or

potentially supported. This needs to be both rationalised and better

documented but there are quite a few processes happening

simultaneously:

(i) the web api. Simple to use. Supports xml (and json?) datavaluesets. Uses uid identifiers. Datavaluesets are defined in the dxf2 namespace.

(ii) the stream-based dxf import in the legacy import-export module. Supports the dxf1 xml format which is currently produced on dhis2 output, as well as a dxf2 datavalueset which still has minor differences from the format used in the web api. Most important of these is the ability to use either codes (which might be externally assigned) or DHIS custom uids (a plus). Also, it currently only supports default-category dataelements (a minus), so for the moment data import which uses different disaggregations cannot be done directly via this route.

The other functionality of the stream-based import is the ability to load a custom xslt transform for incoming xml, to transform it to either dxf1 or dxf2. This is the way, for example, that an sdmx-hd cross-sectional dataset is imported as a dxf2 datavalueset, and it works well for that. In fact the basic schema of a dxf2 datavalueset is strongly inspired (and not accidentally!) by the sdmx-hd schema. In principle this means that any datavalueset in an xml format where the codes are somehow mappable can be imported.

Outstanding issues which need to be solved (or solved better) as I see

it in no particular order are:

(i) harmonising the xml in the web api and the import module
(ii) better support for disaggregated data (without 3rd party systems having to ‘understand’ categoryoptioncombo)
(iii) enhanced support for synching metadata between systems
(iv) stabilization and documentation of APIs and schemas

At the moment most interoperability problems are solvable, but they require navigating an overly complex labyrinth of undocumented and inconsistent functionality. To be fair, this has also been due to a lack of concrete use cases. I have been involved in a number of “synthetic” scenarios over the past few years where it has turned out that either the 3rd party system didn’t really exist or the apparent use case wasn’t really required at all :-)

The situation overall has greatly improved over the past year with the introduction of uids, the possibility to use codes to map against 3rd party systems and the beginnings of the web-api.

I also have some useful meat now from working with Randy and team in

Rwanda. And there is also a growing interest in interoperating with

national facility registry software which may well become reality in

some countries.

I think it would be really, really useful to start collecting some of

these existing use cases - particularly concrete ones such as

described by the contributors to this thread - in some more detail.

Including those which are straightforward, those which are doable but

difficult and those which seem to elude us at present.

Regards

Bob

The Baobab system also has a common set of reports that need to be sent monthly. Hence the Baobab system uses the DHIS2’s web-api dataValueSets resource to send data into DHIS. This is a simple XML report of data values that are aggregated monthly and reported anyway by the Baobab system.

  • One needs to initially do a GET on the organization unit
  • Then GET on the selected dataset (ANC Monthly in our case)
  • Then GET to check the ids of the data elements in a dataset
  • Then create a dataValueSets representation and POST this

We are still testing this out for continuous integration, but seems easy and

low hanging fruit.
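The final step of that sequence can be sketched in a few lines. This is an illustrative outline only, not Baobab's actual code - the uids below are placeholders borrowed from the DHIS2 documentation examples, and the server interaction is only indicated in comments:

```python
# Illustrative sketch of the last step above: building the dataValueSets
# payload. All uids are made-up placeholders; steps 1-3 would be GETs
# against the web api to look these identifiers up first.
from xml.etree import ElementTree as ET

DXF2_NS = "http://dhis2.org/schema/dxf/2.0"

def build_data_value_set(data_set, org_unit, period, values):
    """Assemble a dxf2 dataValueSet from a dict of dataElement uid -> value."""
    root = ET.Element("dataValueSet", {
        "xmlns": DXF2_NS,
        "dataSet": data_set,
        "orgUnit": org_unit,
        "period": period,
    })
    for de_uid, value in values.items():
        ET.SubElement(root, "dataValue",
                      {"dataElement": de_uid, "value": str(value)})
    return ET.tostring(root, encoding="unicode")

# The resulting string would then be POSTed to the dataValueSets resource
# with Content-Type application/xml and basic authentication.
payload = build_data_value_set("pBOMPrpg1QX", "DiszpKrYNg8", "201203",
                               {"f7n9E0hX8qk": 12, "Ix2HsbDMLea": 14})
```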


Regards,

Saptarshi PURKAYASTHA

My Tech Blog: http://sunnytalkstech.blogspot.com

You Live by CHOICE, Not by CHANCE

On 17 March 2012 09:05, Wilson,Randy rwilson@msh.org wrote:

Hi Mark,

This is exactly what we’re doing in Rwanda. We’ve set up one instance of

DHIS-2 as our HMIS (for routine data entry by health facilities across the

country) and a second instance as a national data warehouse/dashboard – more

intended for program managers, implementing partners and donors. Bob

Jolliffe has been here helping us put together scripts to automatically

synchronize sub-sets of the data between the two instances as new data is

entered in the HMIS (I created a special dataset called datawarehouse in

HMIS that gets pushed across). We’re also going to use the extended

attributes for dataelements and indicators in the data warehouse instance to

maintain our metadata dictionary with additional fields such as: primary

data source, precise definition, intended use, staff responsible for

collection, etc….

Bringing data in from other systems is still not easy – though now that

many of our other data sources are web enabled it is practical. As you

note, you need to use the code field in each of the major data entities

(dataelement, indicator, orgunit) that all systems share. It is not

difficult to create a view of the period table that can be used to translate

periodids when importing data – for example here is the sql that gives you

the year, month and quarter for all periods in your period table:

SELECT periodtype.name AS periodtype, period.periodid, period.startdate,
       period.enddate,
       date_part('Year'::text, period.startdate) AS periodyear,
       date_part('month'::text, period.startdate) AS periodmonth,
       CASE
           WHEN date_part('month'::text, period.startdate) = ANY (ARRAY[1, 2, 3]) THEN 1
           WHEN date_part('month'::text, period.startdate) = ANY (ARRAY[4, 5, 6]) THEN 2
           WHEN date_part('month'::text, period.startdate) = ANY (ARRAY[7, 8, 9]) THEN 3
           ELSE 4
       END AS periodquarter
FROM period, periodtype
WHERE period.periodtypeid = periodtype.periodtypeid;
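The same year/month/quarter derivation can be mirrored on the client side when pre-processing external data before matching it to DHIS period records. A small sketch (not from Randy's setup; the function name is invented):

```python
# Mirrors the year/month/quarter logic of the SQL view above, for use when
# preparing external data before translating to DHIS periodids.
from datetime import date

def period_parts(startdate: date):
    """Return (year, month, quarter) for a period start date."""
    quarter = (startdate.month - 1) // 3 + 1  # months 1-3 -> Q1, 4-6 -> Q2, ...
    return startdate.year, startdate.month, quarter

# e.g. a monthly period starting 1 March 2012 falls in quarter 1
year, month, quarter = period_parts(date(2012, 3, 1))
```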

Bob relies on DXF or similar XML import mechanisms – partly because of

Postgres’ requirement to assign a unique id to each record across all tables

whose current value is maintained in the hibernate_sequence object and it is

definitely the safest way to go. I’ve found it is also relatively easy to

do with a combination of Excel and a visual query designer like Access –

linked to the Postgres tables - as long as I check and increment the current

value before and after imports (and nobody else is working with the

database)! Of course it depends upon how similar in structure your source data is to DHIS - otherwise you may need to do multiple transformations of the data beforehand. If you are using a lot of category combinations (age/gender, etc.) as opposed to just the default categorycombo, it is also more difficult, because they need to be mapped to the categorycomboids.

A drag and drop interface would be great… but we’re far from it now.

Randy

From: dhis2-users-bounces+rwilson=msh.org@lists.launchpad.net

[mailto:dhis2-users-bounces+rwilson=msh.org@lists.launchpad.net] On Behalf

Of Mark Spohr

Sent: Saturday, March 17, 2012 1:25 AM

To: dhis2-users@lists.launchpad.net

Subject: [Dhis2-users] Importing data from external system?

DHIS seems to do a good job of importing data from another DHIS system.

However, I would like to use the DHIS as a data warehouse to suck up data

from other systems in the country (vertical programs).

I’ve spent some time looking at the xml format and it looks like it could

be emulated by another system but will need to have the id codes for

periods, facilities, data elements, etc. so it will be a bit tedious.

Has anyone done work on this problem? I’m thinking of some tool to map the external data to the DHIS dataset which would allow a “drag and drop” match.

Regards,

Mark

Mark Spohr, MD


Mailing list: https://launchpad.net/~dhis2-users

Post to : dhis2-users@lists.launchpad.net

Unsubscribe : https://launchpad.net/~dhis2-users

More help : https://help.launchpad.net/ListHelp




Mark Spohr, MD
mhspohr@gmail.com
+1 530 554 2230



Hi Mark, we have put up some documentation and an example on how to send data values using the web API here:

http://dhis2.org/doc/snapshot/en/user/html/ch24.html

Lars

···

On Sun, Mar 18, 2012 at 3:22 AM, Mark Spohr mhspohr@gmail.com wrote:

Thanks for all of these great ideas.
The web-api sounds most interesting now. I’ll have to spend some time with it. This may be a good way to ease the difficulty of correlating all of the ids with the import of external data.

Yes, I had a few short discussions with some of the openmrs folk down in Rwanda as well. We do need to come up with a scalable and sensible solution which doesn’t involve excessive configuration and handles the process of changing metadata (an unfortunate fact of life) cleanly. It might well be that the best approach will be to define a template of an SDMX data message which is handled on the openmrs side in the same way as I believe people do with excel templates.

Codelists and dataset structure are defined in SDMX through a data structure definition (DSD), but that doesn’t preclude other ways of defining that structure if they work better. Currently I think there are a few minor limitations on both the openmrs and dhis sides which are making this more difficult than it might be, which I’ve referred to above.

···

On 19 March 2012 15:48, David Smith dsmith11@stevens.edu wrote:

Holy smokes!! Thanks for the “on-point” answer. I’m very interested to hear about your OpenMRS experience because that’s the system from which I’m going to need to export data to provide to DHIS2. Look forward to hearing about your results with exporting the Rwanda PBF data.

Thanks again.

On Mon, Mar 19, 2012 at 11:44 AM, Bob Jolliffe bobjolliffe@gmail.com wrote:

Yes, we use SDMX-HD to import data from an openmrs-based hospital system in India, using it very simplistically (and therefore effectively) to read SDMX-HD cross-sectional data messages without bothering to exchange the DSD. The matching of codes is done manually at present because they are few, but this needs to be enhanced to read the codelists. There is an online demo site for this but I am not sure if it is still being used to develop datasets - if this is complete (I think it should be just about) I’ll share the url and you can take a look. It uses this module, which is not cosmetically complete or “nice”, but it produces valid SDMX-HD messages which we can consume: https://github.com/hispindia/SDMXHDataExport .

We have also used SDMX-HD to import data from iHRIS (http://www.ihris.org/wiki/SDMX-HD_Data_Export_–_Kenya) and to import data from OpenMRS using the module developed by Jembi. This uses an older approach which is not so flexible but has the advantage of supporting categorycombo data. Fixing up multidimensional data import for dxf2 is still to be done.

I am going to complete the export of Rwanda PBF data in SDMX-HD this week as well (hopefully), and will share that information when it’s done. We are using a very simple approach, as there is no real api to that system, of pulling the data from the database and formatting it as an SDMX message.

There is a lot in the sdmx-hd standard which is excessive and overly complex and which can be safely ignored while still remaining conformant to the standard. If you look inside the dhis source code in the import/export module you will see that there is a very simple xslt transform which is applied to the incoming sdmx-hd cross-sectional message to produce a dxf2 datavalueset message: http://bazaar.launchpad.net/~dhis2-devs-core/dhis2/trunk/view/head:/dhis-2/dhis-services/dhis-service-importexport/src/main/resources/transform/cross2dxf2.xsl. This is triggered automatically (by recognizing the root element in the xml), so you just import the sdmx file the same way you import any dxf file: http://bazaar.launchpad.net/~dhis2-devs-core/dhis2/trunk/view/head:/dhis-2/dhis-services/dhis-service-importexport/src/main/resources/transform/transforms.xml
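The automatic triggering Bob describes - recognising the root element of the incoming xml and dispatching to the matching transform - can be pictured with a small sketch. The mapping table here is illustrative only; the real associations live in dhis2's transforms.xml:

```python
# Sketch of transform dispatch keyed on the xml root element, as described
# above. The mapping is made up for illustration; dhis2 reads the real
# entries from its transforms.xml resource.
from xml.etree import ElementTree as ET

TRANSFORMS = {
    "CrossSectionalData": "transform/cross2dxf2.xsl",  # sdmx-hd -> dxf2
}

def pick_transform(xml_text):
    """Return (xslt path or None, local root tag) for an incoming document."""
    tag = ET.fromstring(xml_text).tag.split("}")[-1]  # strip any namespace
    return TRANSFORMS.get(tag), tag

xsl, tag = pick_transform("<CrossSectionalData/>")
```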

Bob

On 19 March 2012 14:10, David Smith dsmith11@stevens.edu wrote:

Has anybody been able to leverage SDMX-HD to import data from other systems?

On Sat, Mar 17, 2012 at 10:22 PM, Mark Spohr mhspohr@gmail.com wrote:

Thanks for all of these great ideas.
The web-api sounds most interesting now. I’ll have to spend some time with it. This may be a good way to ease the difficulty of correlating all of the ids with the import of external data.

.Mark


Thanks Lars, this is useful. I should add, looking at this example, that the primary difference currently with the dxf2 datavalueset which is imported through the import module is the addition of an idScheme attribute. In the web api example we have

<dataValueSet xmlns="http://dhis2.org/schema/dxf/2.0" period="201201"
dataSet="pBOMPrpg1QX" orgUnit="DiszpKrYNg8">
  <dataValue dataElement="f7n9E0hX8qk" value="12" />
  <dataValue dataElement="Ix2HsbDMLea" value="14" />
  <dataValue dataElement="eY5ehpbEsB7" value="16" />
</dataValueSet>

In the other stream we have

<dataValueSet xmlns="http://dhis2.org/schema/dxf/2.0"
idScheme="code|uid" ..... />

which allows a client to choose to use the code field or the uid as
the identifier for orgunits, dataelements etc. The idScheme attribute
is optional, and defaults to assuming uid as is the current case with
the web api. There are some cases where the code might be more
appropriate, for example where these codes are coming from a 3rd party
registry or where the client doesn't have the flexibility to map to
dhis2 uids - then we can map to the external system using our code
fields.
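A sketch of the choice this gives a client (purely illustrative - both the codes and the uids below are invented placeholders):

```python
# If the receiving end accepts idScheme="code", external codes can be sent
# as-is; otherwise the client must translate them to dhis2 uids first.
# All identifiers here are invented for illustration.
CODE_TO_UID = {"ANC1": "f7n9E0hX8qk", "ANC2": "Ix2HsbDMLea"}

def identifier_for(code, id_scheme="uid"):
    """Pick the identifier to emit for a data element, per idScheme."""
    if id_scheme == "code":
        return code               # server resolves the code field itself
    return CODE_TO_UID[code]      # client maps code -> uid before sending

first = identifier_for("ANC1", id_scheme="code")
second = identifier_for("ANC1")
```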

Need to get together on skype one of these days soon with Morten and
yourself and bring these things together.

···

On 19 March 2012 16:58, Lars Helge Øverland <larshelge@gmail.com> wrote:

On Sun, Mar 18, 2012 at 3:22 AM, Mark Spohr <mhspohr@gmail.com> wrote:

Thanks for all of these great ideas.
The web-api sounds most interesting now. I'll have to spend some time
with it. This may be a good way to ease the difficulty of correlating all
of the ids with the import of external data.

Hi Mark, we have put up some documentation and an example on how to send
data values using the web API here:

http://dhis2.org/doc/snapshot/en/user/html/ch24.html

Lars

Thanks for this reference. It was what I was looking for.

.Mark

···

2012/3/19 Lars Helge Øverland larshelge@gmail.com

On Sun, Mar 18, 2012 at 3:22 AM, Mark Spohr mhspohr@gmail.com wrote:

Thanks for all of these great ideas.
The web-api sounds most interesting now. I’ll have to spend some time with it. This may be a good way to ease the difficulty of correlating all of the ids with the import of external data.

Hi Mark, we have put up some documentation and an example on how to send data values using the web API here:

http://dhis2.org/doc/snapshot/en/user/html/ch24.html

Lars


Mark Spohr, MD
mhspohr@gmail.com
+1 530 554 2230