Hi,
I am trying to import a metadata export into a clean instance; both the source and the new instance are running version 2.34. The metadata.json file is 66 MB. When I attempt the import I get the spinning wheel and it stays there. The server log adds only one line showing that it attempts to process the uploaded file: "skipping unknown property system", nothing else, no errors.
After a while, refreshing the browser shows that Tomcat is no longer running. Still no errors are logged. The Postgres log shows "connection reset by peer" multiple times.
Has anyone encountered this error before? Should DHIS2 be able to process a file this large? Thank you for your comments and suggestions.
If it's a performance issue then maybe some configuration settings need to be changed or updated. Could you share more info about the infrastructure and software environment you are using: Java heap size, RAM, etc.?
However, it might also be because the metadata import needs more preparation and configuration. Please consider these instructions in a post by @Markus:
Also, I would try a different approach! Since you are making an exact copy of the metadata, how about cloning the instance and then using import to delete all the data from the new instance? It would be as clean as new, I think!
Thank you for your reply, @Gassim. About the environment, this is a clean server running Ubuntu 20:
JAVA_HOME="/usr/lib/jvm/java-8-openjdk-amd64/"
JAVA_OPTS="-Xms4000m -Xmx7000m"
8GB of RAM, 50GB HD.
The full database is about 2.5GB
The idea of cloning sounds interesting. My goal is to have an exact copy without any client data/records, how would I use import to delete these?
Thanks again,
Hi @Eric_Boyd_Ramirez,
You're welcome! The specifications you mention look fine compared with what experts in the community have posted, so a hint of where to look further is the log line "connection reset by peer". Maybe you need to increase some of the connection and memory settings: in tomcat-dhis/conf/server.xml check connectionTimeout; in dhis.conf check connection.pool.max_size and similar options; and in postgresql.conf check max_connections and work_mem. I searched for the keyword "connection" in the installation docs: https://docs.dhis2.org → performing-system-administration → installation
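For orientation, these are the files and keys in question. The values below are illustrative placeholders, not tuned recommendations; adjust them to your hardware:

```
# tomcat-dhis/conf/server.xml -- connectionTimeout is in milliseconds,
# set on the HTTP <Connector> element, e.g.:
#   <Connector port="8080" protocol="HTTP/1.1" connectionTimeout="120000" ... />

# dhis.conf -- database connection pool size:
connection.pool.max_size = 80

# postgresql.conf -- connection and memory settings:
max_connections = 200
work_mem = 20MB
```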
The Import/Export app has a delete option for each of "Data, Event, and TEI", so you could export and then import to delete.
If this doesn't delete everything, then perhaps it will at least make the other option easier, since the size will be much less. If you have some time and would like to try it, please give it a shot and post back. I've tried to use it before, but not to delete the whole thing, so I hope it will work for you.
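As a rough sketch of that export-then-delete flow using the Web API: the endpoint and the strategy parameter are from my recollection of the 2.34 event API docs, and the host and credentials are placeholders, so please verify against your instance before running anything:

```shell
# Compose the export and delete-import URLs (BASE is a placeholder host).
BASE="https://dhis2.example.org"

export_url="$BASE/api/events.json?skipPaging=true"
delete_url="$BASE/api/events?strategy=DELETE"

echo "$export_url"
echo "$delete_url"

# With curl (credentials are placeholders), the flow would be roughly:
#   curl -u admin:district "$export_url" -o events.json
#   curl -u admin:district -H "Content-Type: application/json" \
#        -d @events.json "$delete_url"
```

Something similar should apply to tracked entity instances via /api/trackedEntityInstances, but check the docs for your version first.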
Increasing the connectionTimeout solved the problem; thank you! Now moving on to the next problem: on the clean server the metadata import report says all objects are ignored, nothing created, and I get about 40 messages saying "Invalid reference [XXXXXXX] (DataEntryForm) on object …"
Looking at the list of processed objects in the report, DataEntryForm is not included; in fact, it is not available in the Data Export list options either. Am I missing something? BTW, there are about 20 custom data entry forms in this instance.
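In case it is useful while investigating: the Web API metadata export can include object types that the app's export screen doesn't list. The dataEntryForms parameter below is from my recollection of the 2.34 metadata API's per-type filter parameters, so verify it against your instance's docs:

```shell
# Compose a metadata export URL that asks for data entry forms
# (BASE is a placeholder host).
BASE="https://dhis2.example.org"
meta_url="$BASE/api/metadata.json?dataEntryForms=true"
echo "$meta_url"

# e.g.: curl -u admin:district "$meta_url" -o dataEntryForms.json
```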