Issue with Analytics Table Functionality (There was an error loading data for this item, 409 Conflict, etc.)

Hello,
I am currently experiencing issues related to the Analytics Table.

  1. The issue started suddenly — dashboard items are no longer displaying data and show the message: “There was an error loading data for this item.”

  2. I attempted to trigger the export via Data Administration > Analytics Tables > Start Export, but encountered the following error: “Processing aborted: skipped stage, failing item caused abort. Parallel processing aborted after 2 successful and 1 failed items.”

  3. The browser console logs the following messages:
    https://{our_domain}/hmis/api/41/analytics?dimension=dx%3AkmBtcRwldMW,pe%3A2021%3B2022%3B2023&filter=ou%3AUSER_ORGUNIT&displayProperty=NAME&includeNumDen=false&skipMeta=true&skipData=false 409 (Conflict)
    (anonymous) @ VM1142 plugin-372.11f2ba19.js:2
    fetch @ VM1142 plugin-372.11f2ba19.js:2
    executeResourceQuery @ VM1142 plugin-372.11f2ba19.js:2
    (anonymous) @ VM1142 plugin-372.11f2ba19.js:2
    query @ VM1142 plugin-372.11f2ba19.js:2
    get @ VM1151 plugin-8961.8e451236.chunk.js:2
    Ze @ VM1150 plugin-app.792e000c.chunk.js:1
    Ge @ VM1150 plugin-app.792e000c.chunk.js:1
    (anonymous) @ VM1150 plugin-app.792e000c.chunk.js:1
    (anonymous) @ VM1150 plugin-app.792e000c.chunk.js:1
    (anonymous) @ VM1150 plugin-app.792e000c.chunk.js:1
    ls @ VM1142 plugin-372.11f2ba19.js:2
    Sl @ VM1142 plugin-372.11f2ba19.js:2
    t.unstable_runWithPriority @ VM1142 plugin-372.11f2ba19.js:2
    Bi @ VM1142 plugin-372.11f2ba19.js:2
    _l @ VM1142 plugin-372.11f2ba19.js:2
    (anonymous) @ VM1142 plugin-372.11f2ba19.js:2
    z @ VM1142 plugin-372.11f2ba19.js:2
    k.port1.onmessage @ VM1142 plugin-372.11f2ba19.js:2

  4. On the hmis container, the /var/log/tomcat9/catalina.out file contains the following:
    [2025-04-22 22:48:15] [info] (JdbcAnalyticsManager.java [SimpleAsyncTaskExecutor-4023])
    [2025-04-22 22:48:15] [info] org.springframework.jdbc.BadSqlGrammarException: StatementCallback; bad SQL grammar [select ax."dx",ax."yearly", sum(value) as value from (select "year","pestartdate","peenddate","oulevel","daysxvalue","daysno","value","textvalue","dx",cast('2021' as text) as "yearly","uidlevel1",row_number() over (partition by ax."dx",ax."ou",ax."co",ax."ao" order by peenddate desc, pestartdate desc) as pe_rank from analytics as ax where ax."pestartdate" >= '2021-01-01' and ax."peenddate" <= '2021-12-31' and (ax."value" is not null or ax."textvalue" is not null)) as ax where ax."dx" in ('rl3LRzfHLCf', 'xVbLcH13oiM', 'sLn5DGvILGt', 'rLoq70NmjFS', 'bkYhKFy7pTl', 'aSrlxk53UAk', 'PWdvagH4GfC', 'cs27uzL2L1R', 'gg4NYg9JQ8o', 'XlexaQiXktB', 'MCPIzHz05K9', 'omv4Rem56fv', 'NRIc2gUpZ18', 'QdXkdUYhiEJ', 'iwwbeGnEG42', 'mSXMR8t8g7l', 'IrO4hMe4qOi', 'zvMMdw22Rdc', 'tNdjLRvQ3wx', 'fyOzwIgnW4e', 'y9nC10VS78e', 'sYsPdYMpkvu', 'rGbyndf0eor', 'h0lTXrgfHMK', 'ALwWZmKQ4JC', 'Z8dNLpSohDo', 'kptbCzklb7P', 'pDHs6yfQHal', 'QPvTH2xqqvT', 'rXGUi8fj416', 'TDK6cBmtBPA', 'xBmnd5dwICQ', 'sjGOAxjWdrT', 'BeqoCiI1zzL') and ax."yearly" in ('2021') and ( ax."uidlevel1" in ('v42y4gGt3rN') ) and ax."pe_rank" = 1 group by ax."dx",ax."yearly" ]; nested exception is org.postgresql.util.PSQLException: ERROR: relation "analytics" does not exist
    [2025-04-22 22:48:15] [info] Position: 320

I checked our database tables; the output is below. There are analytics tables, but the yearly data tables only exist with the _temp suffix.
hmis=# \dt
List of relations
Schema | Name | Type | Owner
--------+-----------------------------------------------------+-------+-------
public | _categoryoptioncomboname | table | hmis
public | _categorystructure | table | hmis
public | _dataapprovalminlevel | table | hmis
public | _dataapprovalremaplevel | table | hmis
public | _dataelementcategoryoptioncombo | table | hmis
public | _dataelementgroupsetstructure | table | hmis
public | _dataelementstructure | table | hmis
public | _datasetorganisationunitcategory | table | hmis
public | _dateperiodstructure | table | hmis
public | _indicatorgroupsetstructure | table | hmis
public | _organisationunitgroupsetstructure | table | hmis
public | _orgunitstructure | table | hmis
public | _periodstructure | table | hmis
public | aggregatedataexchange | table | hmis
public | analytics_2021_temp | table | hmis
public | analytics_2022_temp | table | hmis
public | analytics_2023_temp | table | hmis
public | analytics_2024_temp | table | hmis
public | analytics_2025_temp | table | hmis
public | analytics_orgunittarget | table | hmis
public | analytics_rs_categoryoptioncomboname | table | hmis
public | analytics_rs_categorystructure | table | hmis
public | analytics_rs_dataapprovalminlevel | table | hmis
public | analytics_rs_dataapprovalremaplevel | table | hmis
public | analytics_rs_dataelementcategoryoptioncombo | table | hmis
public | analytics_rs_dataelementgroupsetstructure | table | hmis
public | analytics_rs_dataelementstructure | table | hmis
public | analytics_rs_datasetorganisationunitcategory | table | hmis
public | analytics_rs_dateperiodstructure | table | hmis
public | analytics_rs_indicatorgroupsetstructure | table | hmis
public | analytics_rs_organisationunitgroupsetstructure | table | hmis
public | analytics_rs_orgunitstructure | table | hmis
public | analytics_rs_periodstructure | table | hmis
public | analytics_tei_enrollments_zohsmnoepmu | table | hmis
public | analytics_tei_zohsmnoepmu | table | hmis
public | analytics_temp | table | hmis
public | api_token | table | hmis
public | attribute | table | hmis

  5. Also on the hmis container, the /opt/dhis2/logs/dhis.log file shows the following logs:
  • ERROR 2025-04-22T23:14:29,930 StatementCallback; bad SQL grammar [insert into analytics_2025_temp ("dx","co","ao","pestartdate","peenddate","year","pe","ou","oulevel","id","HBmUjbhkBrK","dlA9s71qcS8","aynRBC5LFrl","IomUMgP6Noo","r1OoFlUMbUO","XmVPBsubZtN","P1VW5WMRwqN","I8QdrtkkZDS","C7Tc5WaNPeN","aSdyuRi1iWO","NhGTPBBIw3z","CabaCFFVz6x","EIhZzjji8lL","WVjVawxj3ot","omt7rFCYi1V","KwxnTDQvggY","KUONojmuK5Q","zsQo7WIGAzI","FIjCPMboX7j","cQ9Zf6rOkER","ULUbH4orDPa","b2FcwDoKgEK","GsZDQu8rTJ8","ALYvlZkUvQY","aXvQ6izciaY","ozMdnGEaCYL","p8B1RhUgUmO","QhkYGfkaybP","eCSEuW3szNH","uidlevel1","uidlevel2","uidlevel3","uidlevel4","daily","weekly","weeklywednesday","weeklythursday","weeklysaturday","weeklysunday","biweekly","monthly","bimonthly","quarterly","quarterlynov","sixmonthly","sixmonthlyapril","sixmonthlynov","yearly","financialapril","financialjuly","financialoct","financialnov","approvallevel","daysxvalue","daysno","value","textvalue") select des.dataelementuid as dx,dcs.categoryoptioncombouid as co,acs.categoryoptioncombouid as ao,ps.startdate as pestartdate,ps.enddate as peenddate,ps.year as year,ps.iso as pe,ous.organisationunituid as ou,ous.level as oulevel,concat(des.dataelementuid,'-',ps.iso,'-',ous.organisationunituid,'-',dcs.categoryoptioncombouid,'-',acs.categoryoptioncombouid) as id ,dcs."HBmUjbhkBrK",dcs."dlA9s71qcS8",dcs."aynRBC5LFrl",dcs."IomUMgP6Noo",dcs."r1OoFlUMbUO",dcs."XmVPBsubZtN",dcs."P1VW5WMRwqN",dcs."I8QdrtkkZDS",dcs."C7Tc5WaNPeN",dcs."aSdyuRi1iWO",dcs."NhGTPBBIw3z",dcs."CabaCFFVz6x",dcs."EIhZzjji8lL",dcs."WVjVawxj3ot",dcs."omt7rFCYi1V",dcs."KwxnTDQvggY",dcs."KUONojmuK5Q",dcs."zsQo7WIGAzI",dcs."FIjCPMboX7j",dcs."cQ9Zf6rOkER",dcs."ULUbH4orDPa",dcs."b2FcwDoKgEK",dcs."GsZDQu8rTJ8",dcs."ALYvlZkUvQY",dcs."aXvQ6izciaY",dcs."ozMdnGEaCYL",dcs."p8B1RhUgUmO",dcs."QhkYGfkaybP",dcs."eCSEuW3szNH",ous."uidlevel1",ous."uidlevel2",ous."uidlevel3",ous."uidlevel4",ps."daily",ps."weekly",ps."weeklywednesday",ps."weeklythursday",ps."weeklysaturday",ps."weeklysunday",ps."biweekly",ps."monthly",ps."bimonthly",ps."quarterly",ps."quarterlynov",ps."sixmonthly",ps."sixmonthlyapril",ps."sixmonthlynov",ps."yearly",ps."financialapril",ps."financialjuly",ps."financialoct",ps."financialnov",coalesce(des.datasetapprovallevel, aon.approvallevel, da.minlevel, 999) as approvallevel, cast(dv.value as double precision) * ps.daysno as daysxvalue, ps.daysno as daysno, cast(dv.value as double precision) as value, null as textvalue from datavalue dv inner join analytics_rs_periodstructure ps on dv.periodid=ps.periodid inner join analytics_rs_dataelementstructure des on dv.dataelementid = des.dataelementid inner join analytics_rs_dataelementgroupsetstructure degs on dv.dataelementid=degs.dataelementid inner join analytics_rs_orgunitstructure ous on dv.sourceid=ous.organisationunitid inner join analytics_rs_organisationunitgroupsetstructure ougs on dv.sourceid=ougs.organisationunitid and (cast(date_trunc('month', ps.startdate) as date)=ougs.startdate or ougs.startdate is null) inner join analytics_rs_categorystructure dcs on dv.categoryoptioncomboid=dcs.categoryoptioncomboid inner join analytics_rs_categorystructure acs on dv.attributeoptioncomboid=acs.categoryoptioncomboid inner join analytics_rs_categoryoptioncomboname aon on dv.attributeoptioncomboid=aon.categoryoptioncomboid inner join analytics_rs_categoryoptioncomboname con on dv.categoryoptioncomboid=con.categoryoptioncomboid left join analytics_rs_dataapprovalminlevel da on des.workflowid=da.workflowid and da.periodid=dv.periodid and da.attributeoptioncomboid=dv.attributeoptioncomboid and (ous.idlevel 3 = da.organisationunitid orous.idlevel 4 = da.organisationunitid orous.idlevel 2 = da.organisationunitid orous.idlevel 1 = da.organisationunitid ) where des.valuetype in ('PERCENTAGE', 'UNIT_INTERVAL', 'INTEGER_NEGATIVE', 'NUMBER', 'INTEGER_POSITIVE', 'INTEGER_ZERO_OR_POSITIVE', 'INTEGER') and des.domaintype = 'AGGREGATE' and ps.year = 2025 and dv.lastupdated < '2025-04-22T23:14:20' and dv.value is not null and dv.deleted = false and (dv.value != '0' or des.aggregationtype in ('AVERAGE','AVERAGE_SUM_ORG_UNIT') or des.zeroissignificant = true) and dv.value ~* '^(-?[0-9]+)(.[0-9]+)?$' ]; nested exception is org.postgresql.util.PSQLException: ERROR: syntax error at or near "3"
    Position: 3538 (NotificationLoggerUtil.java [ForkJoinPool-3001-worker-1])
  • WARN 2025-04-22T23:14:29,930 [ANALYTICS_TABLE wGAkagcjrIv] Process aborted after 9.911s: aborted after error: StatementCallback; bad SQL grammar [insert into analytics_2025_temp ... (same statement as in the ERROR above)]; nested exception is org.postgresql.util.PSQLException: ERROR: syntax error at or near "3"
    Position: 3538 (RecordingJobProgress.java [ForkJoinPool-3001-worker-1])
  • INFO 2025-04-22T23:14:30,087 Populating table: 'analytics_2022_temp' [LETTER, LONG_TEXT, PHONE_NUMBER, URL, EMAIL, USERNAME, TIME, TEXT, DATE, DATETIME, AGE] in: 0.166858 sec. (AbstractJdbcTableManager.java [ForkJoinPool-3001-worker-2])
  • ERROR 2025-04-22T23:14:29,930 [ANALYTICS_TABLE wGAkagcjrIv] Item failed after 0.003s: StatementCallback; bad SQL grammar [insert into analytics_2025_temp ... (same statement as above)]; nested exception is org.postgresql.util.PSQLException: ERROR: syntax error at or near "3"
    Position: 3538 (RecordingJobProgress.java [ForkJoinPool-3001-worker-1])
    org.springframework.jdbc.BadSqlGrammarException: StatementCallback; bad SQL grammar [insert into analytics_2025_temp ... (same statement as above)]; nested exception is org.postgresql.util.PSQLException: ERROR: syntax error at or near "3"
    Position: 3538
    at org.springframework.jdbc.support.SQLErrorCodeSQLExceptionTranslator.doTranslate(SQLErrorCodeSQLExceptionTranslator.java:236) ~[spring-jdbc-5.3.37.jar:5.3.37]
    at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:73) ~[spring-jdbc-5.3.37.jar:5.3.37]
    at org.springframework.jdbc.core.JdbcTemplate.translateException(JdbcTemplate.java:1575) ~[spring-jdbc-5.3.37.jar:5.3.37]
    at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:398) ~[spring-jdbc-5.3.37.jar:5.3.37]
    at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:434) ~[spring-jdbc-5.3.37.jar:5.3.37]
    at org.hisp.dhis.analytics.table.AbstractJdbcTableManager.invokeTimeAndLog(AbstractJdbcTableManager.java:427) ~[dhis-service-analytics-2.41.1.jar:?]
    at org.hisp.dhis.analytics.table.JdbcAnalyticsTableManager.populateTable(JdbcAnalyticsTableManager.java:421) ~[dhis-service-analytics-2.41.1.jar:?]
    at org.hisp.dhis.analytics.table.JdbcAnalyticsTableManager.populateTable(JdbcAnalyticsTableManager.java:298) ~[dhis-service-analytics-2.41.1.jar:?]
    at jdk.internal.reflect.GeneratedMethodAccessor984.invoke(Unknown Source) ~[?:?]
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
    at java.base/java.lang.reflect.Method.invoke(Method.java:569) ~[?:?]
    at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:344) ~[spring-aop-5.3.37.jar:5.3.37]
    at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:234) ~[spring-aop-5.3.37.jar:5.3.37]
    at jdk.proxy3/jdk.proxy3.$Proxy427.populateTable(Unknown Source) ~[?:?]
    at org.hisp.dhis.analytics.table.DefaultAnalyticsTableService.lambda$populateTables$2(DefaultAnalyticsTableService.java:241) ~[dhis-service-analytics-2.41.1.jar:?]
    at org.hisp.dhis.scheduling.JobProgress.lambda$runStageInParallel$5(JobProgress.java:527) ~[dhis-api-2.41.1.jar:?]
    at java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197) ~[?:?]
    at java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625) ~[?:?]
    at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509) ~[?:?]
    at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499) ~[?:?]
    at java.base/java.util.stream.ReduceOps$ReduceTask.doLeaf(ReduceOps.java:960) ~[?:?]
    at java.base/java.util.stream.ReduceOps$ReduceTask.doLeaf(ReduceOps.java:934) ~[?:?]
    at java.base/java.util.stream.AbstractTask.compute(AbstractTask.java:327) ~[?:?]
    at java.base/java.util.concurrent.CountedCompleter.exec(CountedCompleter.java:754) ~[?:?]
    at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:373) ~[?:?]
    at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.helpComplete(ForkJoinPool.java:1223) ~[?:?]
    at java.base/java.util.concurrent.ForkJoinPool.helpComplete(ForkJoinPool.java:1915) ~[?:?]
    at java.base/java.util.concurrent.ForkJoinTask.awaitDone(ForkJoinTask.java:433) ~[?:?]
    at java.base/java.util.concurrent.ForkJoinTask.invoke(ForkJoinTask.java:687) ~[?:?]
    at java.base/java.util.stream.ReduceOps$ReduceOp.evaluateParallel(ReduceOps.java:927) ~[?:?]
    at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:233) ~[?:?]
    at java.base/java.util.stream.ReferencePipeline.reduce(ReferencePipeline.java:662) ~[?:?]
    at org.hisp.dhis.scheduling.JobProgress.lambda$runStageInParallel$6(JobProgress.java:537) ~[dhis-api-2.41.1.jar:?]
    at java.base/java.util.concurrent.ForkJoinTask$AdaptedCallable.exec(ForkJoinTask.java:1428) [?:?]
    at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:373) [?:?]
    at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1182) [?:?]
    at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1655) [?:?]
    at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1622) [?:?]
    at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:165) [?:?]
    Caused by: org.postgresql.util.PSQLException: ERROR: syntax error at or near "3"
    Position: 3538
    at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2725) ~[postgresql-42.7.3.jar:42.7.3]
    at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2412) ~[postgresql-42.7.3.jar:42.7.3]
    at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:371) ~[postgresql-42.7.3.jar:42.7.3]
    at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:502) ~[postgresql-42.7.3.jar:42.7.3]
    at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:419) ~[postgresql-42.7.3.jar:42.7.3]
    at org.postgresql.jdbc.PgStatement.executeWithFlags(PgStatement.java:341) ~[postgresql-42.7.3.jar:42.7.3]
    at org.postgresql.jdbc.PgStatement.executeCachedSql(PgStatement.java:326) ~[postgresql-42.7.3.jar:42.7.3]
    at org.postgresql.jdbc.PgStatement.executeWithFlags(PgStatement.java:302) ~[postgresql-42.7.3.jar:42.7.3]
    at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:297) ~[postgresql-42.7.3.jar:42.7.3]
    at com.mchange.v2.c3p0.impl.NewProxyStatement.execute(NewProxyStatement.java:1341) ~[c3p0-0.10.1.jar:?]
    at org.springframework.jdbc.core.JdbcTemplate$1ExecuteStatementCallback.doInStatement(JdbcTemplate.java:425) ~[spring-jdbc-5.3.37.jar:5.3.37]
    at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:383) ~[spring-jdbc-5.3.37.jar:5.3.37]
    ... 35 more
  • ERROR 2025-04-22T23:00:15,632 processing aborted: skipped stage, failing item caused abort. parallel processing aborted after 3 successful and 1 failed items (NotificationLoggerUtil.java [pool-13-thread-738])
  • ERROR 2025-04-22T23:00:15,706 [ANALYTICS_TABLE UGBzN6vnahk] failed after 2.563s: processing aborted: skipped stage, failing item caused abort. parallel processing aborted after 3 successful and 1 failed items (RecordingJobProgress.java [pool-13-thread-738])
    java.lang.RuntimeException: processing aborted: skipped stage, failing item caused abort. parallel processing aborted after 3 successful and 1 failed items
    at org.hisp.dhis.scheduling.RecordingJobProgress.cancellationAsAbort(RecordingJobProgress.java:427) ~[dhis-service-core-2.41.1.jar:?]
    at org.hisp.dhis.scheduling.RecordingJobProgress.failedStage(RecordingJobProgress.java:280) ~[dhis-service-core-2.41.1.jar:?]
    at org.hisp.dhis.scheduling.JobProgress.autoSkipStage(JobProgress.java:418) ~[dhis-api-2.41.1.jar:?]
    at org.hisp.dhis.scheduling.JobProgress.runStageInParallel(JobProgress.java:548) ~[dhis-api-2.41.1.jar:?]
    at org.hisp.dhis.analytics.table.DefaultAnalyticsTableService.populateTables(DefaultAnalyticsTableService.java:237) ~[dhis-service-analytics-2.41.1.jar:?]
    at org.hisp.dhis.analytics.table.DefaultAnalyticsTableService.create(DefaultAnalyticsTableService.java:141) ~[dhis-service-analytics-2.41.1.jar:?]
    at org.hisp.dhis.analytics.table.DefaultAnalyticsTableGenerator.generateAnalyticsTables(DefaultAnalyticsTableGenerator.java:107) ~[dhis-service-analytics-2.41.1.jar:?]
    at org.hisp.dhis.analytics.table.scheduling.AnalyticsTableJob.execute(AnalyticsTableJob.java:70) ~[dhis-service-analytics-2.41.1.jar:?]
    at org.hisp.dhis.scheduling.JobScheduler.runDueJob(JobScheduler.java:231) ~[dhis-service-core-2.41.1.jar:?]
    at org.hisp.dhis.scheduling.JobScheduler.lambda$runIfDue$2(JobScheduler.java:199) ~[dhis-service-core-2.41.1.jar:?]
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
    at java.base/java.lang.Thread.run(Thread.java:840) [?:?]
  • INFO 2025-04-22T23:00:15,710 Populated analytics tables: 00:00:02.958 (Clock.java [pool-13-thread-738])
  • ERROR 2025-04-22T23:00:15,724 Job failed: 'Update analytics table' (DefaultJobSchedulerLoopService.java [pool-13-thread-738])
    java.util.concurrent.CancellationException: null
    at org.hisp.dhis.scheduling.RecordingJobProgress.startingStage(RecordingJobProgress.java:245) ~[dhis-service-core-2.41.1.jar:?]
    at org.hisp.dhis.scheduling.JobProgress.startingStage(JobProgress.java:236) ~[dhis-api-2.41.1.jar:?]
    at org.hisp.dhis.scheduling.JobProgress.startingStage(JobProgress.java:240) ~[dhis-api-2.41.1.jar:?]
    at org.hisp.dhis.analytics.table.DefaultAnalyticsTableService.create(DefaultAnalyticsTableService.java:144) ~[dhis-service-analytics-2.41.1.jar:?]
    at org.hisp.dhis.analytics.table.DefaultAnalyticsTableGenerator.generateAnalyticsTables(DefaultAnalyticsTableGenerator.java:107) ~[dhis-service-analytics-2.41.1.jar:?]
    at org.hisp.dhis.analytics.table.scheduling.AnalyticsTableJob.execute(AnalyticsTableJob.java:70) ~[dhis-service-analytics-2.41.1.jar:?]
    at org.hisp.dhis.scheduling.JobScheduler.runDueJob(JobScheduler.java:231) ~[dhis-service-core-2.41.1.jar:?]
    at org.hisp.dhis.scheduling.JobScheduler.lambda$runIfDue$2(JobScheduler.java:199) ~[dhis-service-core-2.41.1.jar:?]
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
    at java.base/java.lang.Thread.run(Thread.java:840) [?:?]
  • ERROR 2025-04-22T23:00:15,733 java.util.concurrent.CancellationException
    at org.hisp.dhis.scheduling.RecordingJobProgress.startingStage(RecordingJobProgress.java:245)
    at org.hisp.dhis.scheduling.JobProgress.startingStage(JobProgress.java:236)
    at org.hisp.dhis.scheduling.JobProgress.startingStage(JobProgress.java:240)
    at org.hisp.dhis.analytics.table.DefaultAnalyticsTableService.create(DefaultAnalyticsTableService.java:144)
    at org.hisp.dhis.analytics.table.DefaultAnalyticsTableGenerator.generateAnalyticsTables(DefaultAnalyticsTableGenerator.java:107)
    at org.hisp.dhis.analytics.table.scheduling.AnalyticsTableJob.execute(AnalyticsTableJob.java:70)
    at org.hisp.dhis.scheduling.JobScheduler.runDueJob(JobScheduler.java:231)
    at org.hisp.dhis.scheduling.JobScheduler.lambda$runIfDue$2(JobScheduler.java:199)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
    at java.base/java.lang.Thread.run(Thread.java:840)

The issue began on version 2.41.0, right after upgrading from 2.38, which had been working fine until then.
We later upgraded to 2.41.3.1, suspecting a code-related issue, but the same error persists.

I would appreciate your help identifying what might be causing this issue.
If there are any further checks or actions I should take, please let me know.

Thank you!

Hi @sggs

Welcome to the community, and thanks for the detailed post. Sorry to hear that you’re facing this issue.

May I ask whether you also upgraded the PostgreSQL and Java versions? Please make sure you meet the software requirements for versions 2.40+; see the Upgrade Guide - Software Requirements for the New Version.

What are your current PostgreSQL and Java versions? You may need to upgrade PostgreSQL to at least version 13 and the JDK to 17.
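
For example, a quick way to confirm the database version from psql (standard PostgreSQL, nothing instance-specific):

SELECT version();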

I hope this resolves the issue. If it doesn't, we may want to check whether a metadata configuration is causing this (assuming it isn't a software requirements issue).

Thank you!

The PostgreSQL version is 13.11, and the Java version is confirmed to be 17.
How can I check our metadata configuration? Thank you for your kind support.

hmis=# SELECT version();
                                                           version
------------------------------------------------------------------------------------------------------------------------------
 PostgreSQL 13.11 (Ubuntu 13.11-1.pgdg20.04+1) on x86_64-pc-linux-gnu, compiled by gcc (Ubuntu 9.4.0-1ubuntu1~20.04.1) 9.4.0, 64-bit
(1 row)

root@hmis:~# java --version
openjdk 17.0.14 2025-01-21
OpenJDK Runtime Environment (build 17.0.14+7-Ubuntu-120.04)
OpenJDK 64-Bit Server VM (build 17.0.14+7-Ubuntu-120.04, mixed mode, sharing)


@sggs It's a big jump to upgrade from v2.38 directly to v2.41.0. Try upgrading one version at a time: first to v2.39, then from v2.39 to v2.40, and so on. That way, if the issue happens again, you can at least determine which DHIS2 version introduced the change in behaviour.


We have a similar scenario: we upgraded from 2.41.1 to 2.41.3.1 to test the security patch and hit exactly the same analytics failure listed above. We are still investigating, so I cannot shed more light at this stage, but I don't think it is an isolated issue or one caused by a big jump in versions.

Apologies for the incorrect information earlier. At that time, I updated the version from 2.41.0.1 to 2.41.1.0. After the update, the dashboard stopped refreshing properly, and eventually, it disappeared entirely. Could you please advise how we can resolve this issue?

I solved this problem. There was an issue in the JdbcAnalyticsTableManager class: the approval join clause it generated was malformed SQL. After modifying this class, I ran "Generate", then executed the analytics process, and the Analytics Table functioned properly.

Specifically, I modified the ApprovalJoinClause method as follows:

for (OrganisationUnitLevel level : levels) {
    // Append "ous.idlevel<N>" with no space before the level number, and keep
    // spaces around "or", so the generated clause is valid SQL, e.g.
    // "ous.idlevel3 = da.organisationunitid or ous.idlevel4 = ..."
    sql.append("ous.idlevel").append(level.getLevel())
       .append(" = da.organisationunitid or ");
}
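
For comparison, the original code produced a space inside the level column name and ran "or" into the next condition, which is exactly the fragment PostgreSQL rejects in the logs above; the corrected loop produces valid column references:

-- Before the fix (as seen in the failing statement in dhis.log):
-- ... (ous.idlevel 3 = da.organisationunitid orous.idlevel 4 = da.organisationunitid ...) ...
-- After the fix:
-- ... (ous.idlevel3 = da.organisationunitid or ous.idlevel4 = da.organisationunitid ...) ...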

I hope this information is helpful to others who may be experiencing the same issue.
It seems that this problem may be related to the Java version in some cases, though I can’t say for certain.
I do hope this fix will be included in a future release.


Great work! It's good to see you figured it out and shared the solution with the community. :clap: Thanks!