Large metadata import, Tomcat dying silently

I am trying to import a metadata export into a clean instance; both the source and target instances are running version 2.34. The metadata.json file is 66 MB. When attempting the import I get the spinning wheel, and it stays there. The server log adds only one line showing that it attempts to process the uploaded file: “skipping unknown property system”, and nothing else, no errors.

After a while, refreshing the browser shows that Tomcat is no longer running. Still no errors are logged. Postgres’ log shows “connection reset by peer” multiple times.

Has anyone encountered this error before? Should DHIS2 be able to process a file this large? Thank you for your comments and suggestions.

Hi @Eric_Boyd_Ramirez,

Welcome to the community! :tada::tada:

If it’s a performance issue, then maybe some configuration settings need to be changed/updated. Could you share more info about the infrastructure and software environment you are using: Java heap space size, RAM, etc.?

However, it might also be because the metadata import needs more preparation and configuration. Please consider these instructions in a post by @Markus:

Also, I would try a different approach! Since you want an exact copy of the metadata, how about cloning the instance and then using import to delete all the data from the clone? It would be as clean as new, I think! :smiley:

Thank you for your reply @Gassim. About the environment, this is a clean server running Ubuntu 20:
JAVA_OPTS='-Xms4000m -Xmx7000m'
8GB of RAM, 50GB HD.

The full database is about 2.5GB

The idea of cloning sounds interesting. My goal is to have an exact copy without any client data/records; how would I use import to delete those?
Thanks again,

Hi @Eric_Boyd_Ramirez,
You’re welcome! The specifications you mention look good compared to what experts in the community report, so a hint of where to look further is the log line “connection reset by peer”. Maybe you need to increase some of the connection and memory settings. In tomcat-dhis/conf/server.xml check the connectionTimeout; in dhis.conf check connection.pool.max_size and similar options; and likewise, in postgresql.conf check max_connections and work_mem.
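For illustration, these are the kinds of settings I mean; the values below are placeholders to adapt to your server, not recommendations:

```
<!-- tomcat-dhis/conf/server.xml: the HTTP Connector element -->
<Connector port="8080" protocol="HTTP/1.1"
           connectionTimeout="20000" />

# dhis.conf: DHIS2 database connection pool
connection.pool.max_size = 80

# postgresql.conf: PostgreSQL connection and memory settings
max_connections = 200
work_mem = 20MB
```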
I searched for the keyword ‘connection’ in the Installation docs: performing-system-administration - installation

The Import/Export app has a delete option for each of ‘Data, Event, and TEI’, so you could export and then import with delete.

If this doesn’t delete everything, it might at least make the other option easier since the size will be much smaller. :smiley: If you have some time and would like to try it, please give it a shot and post back. I’ve used it before, though not to delete the whole thing, so I hope it works for you.
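If you prefer the API over the app, the same export-then-delete round trip can be done with the Web API. A sketch (endpoint and parameter names are from my reading of the DHIS2 Web API docs, so please double-check them against the 2.34 docs; the host, credentials, and UIDs are placeholders):

```
# Export the aggregate data values for a data set, period and org unit tree...
curl -u admin:district \
  "https://dhis2.example.org/api/dataValueSets.json?dataSet=DATASET_UID&period=2021&orgUnit=ROOT_OU_UID&children=true" \
  -o datavalues.json

# ...then re-import the same payload with the DELETE strategy to remove them
curl -u admin:district -H "Content-Type: application/json" \
  -d @datavalues.json \
  "https://dhis2.example.org/api/dataValueSets?importStrategy=DELETE"
```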

Thanks! :relaxed:


Increasing the connectionTimeout solved the problem; thank you! Now, moving on to the next problem: on the clean server the metadata import report says all objects are ignored, nothing created, and I get about 40 messages saying “Invalid reference [XXXXXXX] (DataEntryForm) on object …”
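For anyone finding this later, the change was on the HTTP Connector in tomcat-dhis/conf/server.xml, along these lines (the timeout value here is illustrative; tune it for your upload size):

```
<Connector port="8080" protocol="HTTP/1.1"
           connectionTimeout="120000"
           redirectPort="8443" />
```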

Looking at the list of processed objects in the report, DataEntryForm is not included; in fact, it is not available in the metadata export list options either. Am I missing something? BTW, there are about 20 custom data entry forms in this instance.

Thank you!

You’re welcome @Eric_Boyd_Ramirez! I’m glad it worked. Could you mark the post as the solution and start the new problem in a new topic?

About the new issue: when you create the new topic, could you please post the template you are using for the import?
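In the meantime, it may help to check whether the forms made it into your export at all. The metadata export endpoint can include object types explicitly via query parameters (this is from my reading of the Web API docs, so verify against 2.34; host and credentials are placeholders):

```
# Export only the data entry forms, to confirm they exist in the source instance
curl -u admin:district \
  "https://dhis2.example.org/api/metadata.json?dataEntryForms=true" \
  -o dataEntryForms.json
```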