currently I am looking at ways of deleting larger amounts of DataValues based on their dimensions OrgUnit, Period, DataElement AND AttributeOptionCombination. The main purpose is to invalidate old values on AttributeOptionCombinations for which there are no longer any values in an updated source system. So far I have tried the following options:
1 - DELETE api dataValues?pe=xxx&ou=xxx&de=xxx&cc=xxx&cp=xxxx: works, but takes a long time, since every deleted record requires its own API call.
2 - POST api dataValueSets?importStrategy=DELETE: should be the most applicable, if it works - does it? I only get ignored records in the result set. Does it make sense to investigate this option further, or is this API not intended for deleting?
3 - POST api dataValueSets?importStrategy=UPDATE with all values set to 0: I have not yet tried this option, but would fall back on it if I can't get (2) to work. I assume that the 0-value records will be eliminated by DHIS2 later on (when? during the analytics table update?)
Could someone please comment on whether any of these options make sense, or whether there is another, more elegant approach?
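To make option 1 concrete, here is a small sketch of how those per-value DELETE URLs would be built. The UIDs are placeholders, not real identifiers from any instance; the point is that each data value needs its own request, which is why this route is slow:

```python
from urllib.parse import urlencode

# Placeholder base URL and UIDs -- substitute your own server and identifiers.
BASE = "https://play.dhis2.org/demo/api"

def delete_url(de, pe, ou, co, cc=None, cp=None):
    """Build the per-value DELETE URL (option 1). One HTTP round trip
    is needed for every single data value to be removed."""
    params = {"de": de, "pe": pe, "ou": ou, "co": co}
    if cc:
        params["cc"] = cc  # attribute category combo
    if cp:
        params["cp"] = cp  # attribute category options
    return f"{BASE}/dataValues?{urlencode(params)}"

# One URL per value -- an HTTP client would then issue DELETE against each.
url = delete_url(de="fbfJHSPpUQD", pe="201510", ou="DiszpKrYNg8", co="pq2XI5kz2BY")
print(url)
```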
you are right in that we do not support bulk deletion through the Web API. The main reason is that we want to avert disaster caused by someone doing it by mistake or through malicious activity. However, we are getting an increasing number of requests for this, so we will consider it.
An effective alternative is to simply use a SQL statement to accomplish it. Is that an option for you?
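A rough sketch of what that SQL route could look like, wrapped in Python as a parameterized statement. The join columns reflect the DHIS2 datavalue schema as I understand it (datavalue.sourceid pointing at organisationunit); verify them against your own database, run inside a transaction, and keep a backup at hand before executing anything like this:

```python
# Sketch only: delete matching rows straight from the datavalue table,
# filtering by data element, org unit, and attribute option combo UIDs.
# Column names are assumptions based on the DHIS2 schema -- verify first.
sql = """
DELETE FROM datavalue dv
USING dataelement de, organisationunit ou, categoryoptioncombo aoc
WHERE dv.dataelementid = de.dataelementid
  AND dv.sourceid = ou.organisationunitid
  AND dv.attributeoptioncomboid = aoc.categoryoptioncomboid
  AND de.uid = %(de)s
  AND ou.uid = %(ou)s
  AND aoc.uid = %(aoc)s;
"""
# Placeholder UIDs for illustration.
params = {"de": "fbfJHSPpUQD", "ou": "DiszpKrYNg8", "aoc": "bRowv6yZOF2"}
# e.g. with psycopg2: cur.execute(sql, params) inside a transaction
print(sql)
```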
regards,
Lars
···
On Wed, Dec 2, 2015 at 5:44 PM, Uwe Wahser uwe@wahser.de wrote:
What do you think of the option of just updating all the obsolete values to 0 and have them eliminated by DHIS2 (I assume that 0-Values are being eliminated unless I block this in the settings)?
Uwe
It is a valid approach to generate a data value set filled with blank values and import it. This will effectively remove the data values.
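As a sketch of that blank-value import, here is a minimal dataValueSet payload with every value set to the empty string, ready to POST to /api/dataValueSets. All UIDs below are placeholders for illustration:

```python
import json

# Sketch of the blank-value import: a dataValueSet whose values are all
# empty strings. Placeholder UIDs -- substitute your own.
data_value_set = {
    "dataSet": "pBOMPrpg1QX",
    "period": "201510",
    "orgUnit": "DiszpKrYNg8",
    "attributeOptionCombo": "bRowv6yZOF2",
    "dataValues": [
        {"dataElement": "fbfJHSPpUQD", "categoryOptionCombo": "pq2XI5kz2BY", "value": ""},
        {"dataElement": "cYeuwXTCPkU", "categoryOptionCombo": "pq2XI5kz2BY", "value": ""},
    ],
}
payload = json.dumps(data_value_set)
# POST payload to <server>/api/dataValueSets with Content-Type: application/json
print(payload)
```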
regards,
Lars
···
On Wed, Dec 2, 2015 at 6:01 PM, Uwe Wahser <uwe@wahser.de> wrote:
I followed the approach of overwriting all obsolete values with '0', which is just as fast as any other update. Of course this only works if I set zeroIsSignificant to true before overwriting and back to false before eliminating the '0' values - in theory. In practice, I am able to start the elimination process in the maintenance app and get a success message, but the '0' value records still remain in the datavalue table.
Is there an issue with setting the zeroIsSignificant flag after data have been entered for that dataElement?
By the way: in my experience, accidental bulk deletion is far less dangerous than randomly changing single values, since you can see immediately that something went wrong (and are able to restore the backup). If you have processes accidentally changing only single values, you might find this out only after months - and that's much harder to correct. It might be worthwhile to rethink this security feature.
Regards,
Uwe
···
On 02.12.2015 at 20:12, Lars Helge Øverland wrote:
On Wed, Dec 2, 2015 at 6:01 PM, Uwe Wahser <uwe@wahser.de> wrote:
That's cheating
True
Just a little update on this (we were having the same issue).
It is true that overwriting with '0' works only if zeroIsSignificant is set to true. What we found to work best was to overwrite with nothing (just "" in the CSV). This works regardless of the zeroIsSignificant attribute. In your CSVs you can simply remove all zeroes in the "value" column - that is, if you don't care about zeroes, obviously.
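A small sketch of that empty-value CSV. The column order follows the CSV layout documented for dataValueSets as I understand it (verify against your DHIS2 version's docs), and the UIDs are placeholders:

```python
import csv
import io

# Sketch: emit a DHIS2 CSV data value set whose "value" column is empty,
# so the import clears the corresponding values. Placeholder UIDs.
header = ["dataelement", "period", "orgunit", "categoryoptioncombo",
          "attributeoptioncombo", "value"]
rows = [
    ["fbfJHSPpUQD", "201510", "DiszpKrYNg8", "pq2XI5kz2BY", "bRowv6yZOF2", ""],
]
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(header)
writer.writerows(rows)
# The resulting text would be POSTed to /api/dataValueSets as text/csv.
print(buf.getvalue())
```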
Cordialement,
GIJSBERT OOMS
ATI Système d’information et gestion de bases de données
Programme d’Appui au Secteur de la Santé (PASS SOUROU)