Hi, I am trying to export analytics tables in our DHIS2 implementation in Kilimanjaro so that I can start preparing some event reports. Unfortunately, every attempt to export the tables results in an error.
I worked through the relevant Data Administration steps and have attached a PDF containing the relevant Catalina logs and UI output. Everything seems fine until the analytics tables export; specifically, the process fails while creating the event analytics tables. The tail end of the (very large) error message reads: 'ERROR: invalid input syntax for type bigint: "2.6"; nested exception is
org.postgresql.util.PSQLException: ERROR: invalid input syntax for type bigint:
I do not know where to look to try and localise this error. Any ideas?
DHIS2 Catalina and UI logs.pdf (299.6 KB)
Thank you for your patience! I triaged your post and am requesting more info to determine whether this is an actual bug or a configuration issue.
Many thanks. It would be a relief if this turned out to be a configuration issue, but I am at a loss to figure out what it might be!
It seems like DHIS2 is trying to convert the decimal number "2.6" to the PostgreSQL bigint type. This conversion fails because "2.6" is not an integer.
Are you able to identify where this value "2.6" is coming from? It is probably a value coming from a Tracked Entity Attribute; if not, a Data Element.
If you can identify it, try changing this number to an integer value, like "2" or "3", and then run the analytics export again.
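To make the failure concrete: PostgreSQL's bigint parser accepts only integer literals, so "2.6" is rejected even though it is numeric. A minimal sketch (Python's `int()` used here as a stand-in for the database's parsing; the value list is hypothetical) shows the behaviour and how one might pre-screen candidate attribute values before re-running the export:

```python
def is_bigint(text: str) -> bool:
    """Return True if text would parse as a PostgreSQL bigint.
    Like Postgres, Python's int() rejects decimal literals such
    as '2.6', even though they are numeric."""
    try:
        value = int(text)
    except ValueError:
        return False
    # bigint is a signed 64-bit integer.
    return -(2**63) <= value < 2**63

# Hypothetical attribute values pulled from the database:
values = ["2", "2.6", "-7", "3.0"]
bad = [v for v in values if not is_bigint(v)]
# bad -> ['2.6', '3.0']
```

Note that "3.0" also fails: the parser does not accept a fractional part at all, even when it is zero.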
This is exactly the solution, thanks. Inspecting the SQL error output revealed which table update failed. The table description in PostgreSQL showed that two columns were of type bigint and were the likely source of the problem. A metadata dump from DHIS2 then let me see which attributes or data elements those columns corresponded to. One of them contained decimal values even though its value type was incorrectly set to zero-or-positive integer. Changing the element's value type made the problem go away.
Sounds great @swoodworth! Glad to see you could fix the issue.
Thanks for your feedback.
We’re having the same issue. Do you recall which tables/columns were causing the problem?
We had the same issue and found something strange: we have a TE attribute with a value type of Positive Integer. It is only populated by a program rule, which performs a calculation and inserts the result into the attribute. The calculation (erroneously, I assume) sometimes produced a value with decimals, and that decimal value was apparently inserted into the attribute, as evidenced in the program event listing within the Capture app. This decimal value, inserted into a table expecting an integer, then caused analytics to fail.
How can an attribute ever contain data that is not in line with its definition? The attribute may have changed (i.e., it may have been a Number and was switched to an integer type), but if that were the case, how did the decimal values remain as the attribute’s value once the value type changed to integer? And how was a value with decimals permitted to be inserted into an attribute that disallows decimals?
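One guard against the behaviour described above is to round the calculation's result before the program rule assigns it (DHIS2's program-rule expression language provides functions such as d2:round() and d2:floor() for this). A hypothetical Python sketch of the intended logic; note that Python's round() is only a rough analogue, as its behaviour at exact halves differs from d2:round():

```python
def assign_positive_integer(raw: float) -> int:
    """Round a calculated value and check it is valid for a
    Positive Integer attribute before assignment."""
    value = round(raw)
    if value <= 0:
        raise ValueError(f"{value} is not a positive integer")
    return value

assign_positive_integer(2.6)  # -> 3
```

Had the program rule applied a rounding step like this, the attribute would only ever have received whole numbers and the analytics export would not have hit the bigint cast error.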