Hi, I am trying to export analytics tables in our DHIS2 implementation in Kilimanjaro. This is so I can start preparing some event reports. Unfortunately every attempt to export the tables results in an error.
I worked through the relevant Data Administration steps and attach a PDF containing the relevant Catalina logs and UI output. Everything seems fine until the analytics tables export. Specifically, the export process fails while creating the events analytics tables. The tail end of the (very large) error message reads: 'ERROR: invalid input syntax for type bigint: "2.6"; nested exception is org.postgresql.util.PSQLException: ERROR: invalid input syntax for type bigint: "2.6"'.
I do not know where to look to try and localise this error. Any ideas?
It seems like DHIS2 is trying to convert the decimal number "2.6" to the PostgreSQL bigint type. The conversion fails because "2.6" is not an integer.
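You can reproduce the same behaviour directly in psql, which shows it is purely the cast that is failing (this is just an illustration, not a DHIS2 query):

```sql
-- PostgreSQL cannot cast a decimal string to bigint,
-- but an integer string casts without a problem.
SELECT '2.6'::bigint;  -- ERROR: invalid input syntax for type bigint: "2.6"
SELECT '2'::bigint;    -- fine, returns 2
```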
Are you able to identify where this value "2.6" is coming from? It's probably a value coming from a Tracked Entity Attribute. If not, a Data Element.
If you can identify it, try changing this number to an integer value, like "2" or "3", then run the analytics export again.
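If you want to locate such values directly in the database, something along these lines can help. This is only a sketch assuming the classic DHIS2 schema; table and column names may differ between versions, so check it against your own database first:

```sql
-- Find tracked entity attribute values that are not whole numbers
-- even though the attribute is declared with an integer value type.
SELECT tea.name, teav.value
FROM trackedentityattributevalue teav
JOIN trackedentityattribute tea
  ON tea.trackedentityattributeid = teav.trackedentityattributeid
WHERE tea.valuetype IN ('INTEGER', 'INTEGER_POSITIVE', 'INTEGER_ZERO_OR_POSITIVE')
  AND teav.value !~ '^-?[0-9]+$';  -- value contains something other than digits
```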
This is exactly the solution, thanks. Inspecting the SQL error output revealed which table update failed. Looking at the table description in PostgreSQL showed that two columns were of type bigint and were the likely source of the problem. A metadata dump from DHIS2 then let me see which attributes or data elements those columns corresponded to. One of these contained a decimal value even though its value type was incorrectly set to a positive-or-zero integer. Changing the element's value type made the problem go away.
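For anyone following the same trail, the column inspection can also be done with a quick query. The table name below is a placeholder; substitute the analytics_event_<program uid> table named in your own error output:

```sql
-- List the bigint columns in an events analytics table to see which
-- attribute/data element columns analytics is trying to cast to integers.
SELECT column_name, data_type
FROM information_schema.columns
WHERE table_name = 'analytics_event_abc1234def5'  -- hypothetical table name
  AND data_type = 'bigint';
```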
We had this same issue and found something strange: we have a TE attribute with a value type of Positive Integer. It is only populated by a program rule that performs a calculation and assigns the result to the attribute. The calculation (in error, I assume) sometimes produced a value with decimals, and apparently that decimal value was inserted into the attribute, as evidenced in the program event listing within the Capture app. This decimal value, written into a table column expecting an integer, then caused analytics to fail.
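If it helps to track down which rule is doing the assigning, a query along these lines can list ASSIGN actions that target integer-typed attributes. The table and column names here (for example actiontype and data) are assumptions based on the classic DHIS2 schema and may well differ in your version, so treat this as a sketch:

```sql
-- List program rule ASSIGN actions whose target is an integer-typed
-- tracked entity attribute, together with the expression they evaluate.
SELECT pr.name AS program_rule,
       tea.name AS attribute,
       pra.data AS expression
FROM programruleaction pra
JOIN programrule pr
  ON pr.programruleid = pra.programruleid
JOIN trackedentityattribute tea
  ON tea.trackedentityattributeid = pra.trackedentityattributeid
WHERE pra.actiontype = 'ASSIGN'
  AND tea.valuetype IN ('INTEGER', 'INTEGER_POSITIVE', 'INTEGER_ZERO_OR_POSITIVE');
```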
How can an attribute ever contain data that is not in line with its definition? The attribute may have changed (i.e., it may have been a Number and was switched to an Integer), but if that were the case, how did the decimals remain as the attribute's value once the value type changed to Integer? Or how was a value with decimals permitted to be inserted into an attribute which disallows decimals?