Data import error - java.lang.String cannot be cast to java.lang.Boolean

We upgraded from 2.30 to 2.33 some time back and have noticed an issue with data import. In troubleshooting, I reverted a test instance to 2.30 to pinpoint when the issue first appears. It seems to have been introduced in 2.31.

When I export a small data set and then import it using 2.30, I have no issue, but when I upgrade to 2.31 and attempt to export/import the same data (or indeed any data), I consistently get the “java.lang.String cannot be cast to java.lang.Boolean” error and the data fails to import (dry run or not).

This is the log on 2.31:

* INFO  2021-01-19 13:22:18,356 Starting data value import, options: ImportOptions{idSchemes=IdSchemes{idScheme=IdScheme{identifiableProperty=UID, attribute=null}, dataElementIdScheme=IdScheme{identifiableProperty=UID, attribute=null}, categoryOptionComboIdScheme=null, categoryOptionIdScheme=null, orgUnitIdScheme=IdScheme{identifiableProperty=UID, attribute=null}, programIdScheme=null, programStageIdScheme=null, trackedEntityIdScheme=null, trackedEntityAttributeIdScheme=null, dataSetIdScheme=null, attributeOptionComboIdScheme=null, programStageInstanceIdScheme=null}, dryRun=true, preheatCache=false, async=true, importStrategy=NEW_AND_UPDATES, mergeMode=REPLACE, skipExistingCheck=false, ignoreEmptyCollection=false, sharing=false, skipNotifications=false, datasetAllowsPeriods=false, strictPeriods=false, strictDataElements=false, strictCategoryOptionCombos=false, strictAttributeOptionCombos=false, strictOrganisationUnits=false, requireCategoryOptionCombo=false, requireAttributeOptionCombo=false, force=false, skipLastUpdated=false}: 00:00:00.000 (Clock.java [taskScheduler-5])
* INFO  2021-01-19 13:22:18,357 [Level: INFO, category: DATAVALUE_IMPORT, time: Tue Jan 19 13:22:18 CAT 2021, message: Process started] (InMemoryNotifier.java [taskScheduler-5])
* INFO  2021-01-19 13:22:18,359 Is ISO calendar: true, skip lock exception check: true (DefaultDataValueSetService.java [taskScheduler-5])
* INFO  2021-01-19 13:22:18,360 Skip audit: false, has authority to skip: true (DefaultDataValueSetService.java [taskScheduler-5])
* INFO  2021-01-19 13:22:18,361 Import options: ImportOptions{idSchemes=IdSchemes{idScheme=IdScheme{identifiableProperty=UID, attribute=null}, dataElementIdScheme=IdScheme{identifiableProperty=UID, attribute=null}, categoryOptionComboIdScheme=null, categoryOptionIdScheme=null, orgUnitIdScheme=IdScheme{identifiableProperty=UID, attribute=null}, programIdScheme=null, programStageIdScheme=null, trackedEntityIdScheme=null, trackedEntityAttributeIdScheme=null, dataSetIdScheme=null, attributeOptionComboIdScheme=null, programStageInstanceIdScheme=null}, dryRun=true, preheatCache=false, async=true, importStrategy=NEW_AND_UPDATES, mergeMode=REPLACE, skipExistingCheck=false, ignoreEmptyCollection=false, sharing=false, skipNotifications=false, datasetAllowsPeriods=false, strictPeriods=false, strictDataElements=false, strictCategoryOptionCombos=false, strictAttributeOptionCombos=false, strictOrganisationUnits=false, requireCategoryOptionCombo=false, requireAttributeOptionCombo=false, force=false, skipLastUpdated=false} (DefaultDataValueSetService.java [taskScheduler-5])
* INFO  2021-01-19 13:22:18,361 Data value set identifier scheme: IdScheme{identifiableProperty=null, attribute=null}, data element: IdScheme{identifiableProperty=null, attribute=null}, org unit: IdScheme{identifiableProperty=null, attribute=null}, category option combo: IdScheme{identifiableProperty=null, attribute=null}, data set: IdScheme{identifiableProperty=null, attribute=null} (DefaultDataValueSetService.java [taskScheduler-5])
* INFO  2021-01-19 13:22:18,362 Identifier scheme: IdScheme{identifiableProperty=UID, attribute=null}, data element: IdScheme{identifiableProperty=UID, attribute=null}, org unit: IdScheme{identifiableProperty=UID, attribute=null}, category option combo: IdScheme{identifiableProperty=UID, attribute=null}, data set: IdScheme{identifiableProperty=UID, attribute=null} (DefaultDataValueSetService.java [taskScheduler-5])
* ERROR 2021-01-19 13:22:18,362 java.lang.ClassCastException: java.lang.String cannot be cast to java.lang.Boolean
        at org.hisp.dhis.dxf2.datavalueset.DefaultDataValueSetService.saveDataValueSet(DefaultDataValueSetService.java:720)
        at org.hisp.dhis.dxf2.datavalueset.DefaultDataValueSetService.saveDataValueSetJson(DefaultDataValueSetService.java:594)
        at org.hisp.dhis.dxf2.datavalueset.tasks.ImportDataValueTask.call(ImportDataValueTask.java:86)
        at org.hisp.dhis.security.SecurityContextRunnable.run(SecurityContextRunnable.java:57)
        at org.springframework.scheduling.support.DelegatingErrorHandlingRunnable.run(DelegatingErrorHandlingRunnable.java:54)
        at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
        at java.util.concurrent.FutureTask.run(Unknown Source)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(Unknown Source)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(Unknown Source)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
        at java.lang.Thread.run(Unknown Source)
 (DefaultDataValueSetService.java [taskScheduler-5])
* INFO  2021-01-19 13:22:18,363 [Level: ERROR, category: DATAVALUE_IMPORT, time: Tue Jan 19 13:22:18 CAT 2021, message: Process failed: java.lang.String cannot be cast to java.lang.Boolean] (InMemoryNotifier.java [taskScheduler-5])

The only difference in the log with 2.30 (apart from the absence of the error) is that there are two new import option flags: strictDataElements=false and ignoreEmptyCollection=false. Where should I be looking for the source of the error? Is this a database structural issue or a data issue?

This seems related to: Error while uploading data in json and xml formats

I know that for us, the java.lang.String / java.lang.Boolean error is related to a decimal assignment of 0 on our program indicators. If that exists, then any visualization we have will error out with this included in the message. It sounds like the other, linked URL may have a possible fix (though it may also not work, if the user below tried it correctly).

Thanks for the response, Matthew. Unfortunately, it appears that's not it.
Even this really simple data file throws the error:

{
  "dataValues": [
    {
      "dataElement": "re0wwrQ7Wjm",
      "period": "202001",
      "orgUnit": "A4VAbVXFBr8",
      "categoryOptionCombo": "PPsWGnAciVz",
      "attributeOptionCombo": "fO50qApzQTJ",
      "value": "83"
    }
  ]
}

As mentioned, even exporting something and then attempting to import the file created by the export function results in the error. EDIT: I tested different formats (XML and CSV) and the same error is thrown, so it's not specific to the JSON format.
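
In case it helps anyone reproduce this outside the Import/Export app, below is a minimal sketch of posting the JSON file above directly to the Web API as a dry run. The base URL, credentials and file name are placeholders of my own; the /api/dataValueSets endpoint and the dryRun parameter are the standard route for data value set imports, so adjust the rest for your instance.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Base64;

public class DataValueSetImport {
    public static void main(String[] args) throws Exception {
        // Placeholder instance URL and credentials - replace with your own.
        String baseUrl = "https://dhis2.example.org";
        String auth = Base64.getEncoder().encodeToString("admin:district".getBytes());

        // The JSON payload shown above, saved to a local file (hypothetical name).
        String payload = Files.readString(Path.of("dataValues.json"));

        // dryRun=true asks the server to validate without persisting anything;
        // on our affected 2.31 instance the dry run was enough to trigger the error.
        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create(baseUrl + "/api/dataValueSets?dryRun=true"))
            .header("Authorization", "Basic " + auth)
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(payload))
            .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());

        // The import summary (or the error message) comes back in the body.
        System.out.println(response.statusCode());
        System.out.println(response.body());
    }
}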

Fair enough @Edward_Robinson, this is unfortunately beyond my experience. It sounds like @toctave and @dmaritim can commiserate. Perhaps (just in case) it would be helpful to follow up with @INyabuto to confirm that he was actually able to accomplish what is being discussed here, and to see what his file looked like in order to do it. Also tagging @Karoline here, as she posted on the other thread as well.

Thanks Matthew, we appreciate your input. @toctave is part of the team I work with, and we've had this issue for some time. We are now concentrating our efforts on what changed between 2.30 and 2.31 and why it suddenly stops working in 2.31. I've just created a test data element and data set to eliminate noise, added a single record, then performed an export and import, and the issue persists. I'll test with a fresh 2.31 install with an identically configured data element and data set and take it from there.
What I have noticed, and what bugs me somewhat, is that the database structure differs slightly in places between my current database and a reference empty one. Many of the differences are simply larger (varchar) columns in my database compared with the reference, which is probably OK, but there are some differences in indexes etc. that need a closer look. The answer may be there. I'll report back as soon as I have anything, but in the meantime, if this rings a bell with anyone, particularly the development team, any feedback would be greatly appreciated!

OK, this appears to be resolved. For anyone else facing this, what worked for us was to reset the system settings for import as follows:

Log in as an administrator and open the System Settings app → Data import, then take a look at the current settings, change them, and set them back the way they were. This forces the systemsettings table to update. It seems something went wrong during the 2.31 upgrade that corrupted the values in this table.
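
If you want to double-check the stored values before or after doing that, one option is to read the data import settings back through the Web API and confirm each one is an actual boolean. Below is a rough sketch under my assumptions: the base URL and credentials are placeholders, and the setting key names are what I believe back the Data import section of the Settings app, so adjust as needed.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Base64;
import java.util.List;

public class CheckImportSettings {
    public static void main(String[] args) throws Exception {
        // Placeholder instance URL and credentials - replace with your own.
        String baseUrl = "https://dhis2.example.org";
        String auth = Base64.getEncoder().encodeToString("admin:district".getBytes());

        // Assumed setting keys for the "Data import" section of the Settings app.
        List<String> keys = List.of(
            "keyDataImportStrictPeriods",
            "keyDataImportStrictDataElements",
            "keyDataImportStrictCategoryOptionCombos",
            "keyDataImportStrictAttributeOptionCombos",
            "keyDataImportStrictOrganisationUnits",
            "keyDataImportRequireCategoryOptionCombo",
            "keyDataImportRequireAttributeOptionCombo");

        HttpClient client = HttpClient.newHttpClient();
        for (String key : keys) {
            HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(baseUrl + "/api/systemSettings/" + key))
                .header("Authorization", "Basic " + auth)
                .header("Accept", "application/json")
                .GET()
                .build();
            HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
            // A healthy value prints as true or false; anything else suggests
            // the stored setting is not a boolean and may need to be re-saved.
            System.out.println(key + " = " + response.body());
        }
    }
}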

Hi Ed Robinson,

We worked together in the past. I have a serious problem with Import/Export on DHIS2 2.37.9.
Let me show you the error:

An error occurred in the DHIS2 application.
The following information may be requested by technical support.

TypeError: Cannot read properties of undefined (reading ‘imported’)
at ht (https://

Can you help me, please?

Hi Farell, apologies, I have just seen this.
Are you, or someone else, able to get a copy of the Tomcat log file when the import is run?
It will be in the Docker container for the server, under logs/catalina.out.
I can guide you through the process.

Thanks

Hi Ed, I copied it, so can I send it to you?

I tried it, but it doesn't work for me. I use 2.37.9.

Hi Farell, what did you try?
I see in the deleted message a URL that includes test.com - where does that domain name come from?

Hi Ed,

I intentionally changed the domain name to test.com; it isn't the real domain name.