There is a PostgreSQL setting that lets you record long-running queries in the PostgreSQL log file. This is sometimes useful for analyzing performance issues.
For example, to log all queries taking more than one minute, add the following line to your postgresql.conf file:
log_min_duration_statement = 60000
You don’t need to restart the database; a reload is enough. On Ubuntu, for example, you would run something like ‘sudo service postgresql reload’.
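As an alternative to editing postgresql.conf by hand and reloading, the same setting can be changed from psql with ALTER SYSTEM (available in PostgreSQL 9.4 and later). A sketch, assuming you can connect as the postgres superuser:

```shell
# Set the threshold (value is in milliseconds). ALTER SYSTEM writes
# to postgresql.auto.conf, which overrides postgresql.conf.
psql -U postgres -c "ALTER SYSTEM SET log_min_duration_statement = 60000;"

# Reload the configuration without restarting the server.
psql -U postgres -c "SELECT pg_reload_conf();"

# Confirm the new value is active.
psql -U postgres -c "SHOW log_min_duration_statement;"
```

Either approach works; use whichever fits how you normally manage your server.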
If it is possible for you to do this, then I would ask you to send the resulting PostgreSQL log file to me and I will see whether there are any useful insights we can glean. Don’t post it here, but you can send it by email to firstname.lastname@example.org. Depending on how busy your system is, the file might be quite big, but it should compress well.
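To shrink the file before mailing it, gzip works well on this kind of repetitive text. A sketch using a stand-in log file, since the real path varies by distribution and PostgreSQL version:

```shell
# Stand-in for the real log; substitute your actual PostgreSQL log
# path (on Ubuntu typically somewhere under /var/log/postgresql/).
LOG=/tmp/postgresql-sample.log
printf 'LOG:  duration: 61000.0 ms  statement: SELECT 1;\n' > "$LOG"

# -k keeps the original file alongside the compressed copy;
# -f overwrites an existing .gz from a previous run.
gzip -kf "$LOG"
ls -l "$LOG" "$LOG.gz"
```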
After you have done this you can disable the setting. What I often do on production systems is to keep it in place with a value of, say, 300000 (5 minutes). That way you still record the really troublesome queries without impacting day-to-day performance too much.
(Note that there is some risk of demographic data leaking into the log file when you do this. Please do check through the file before sending it by mail.)
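One quick way to spot-check is to grep the log for table or column names you know hold personal data before sending it anywhere. A sketch using a stand-in log file and hypothetical identifier names; substitute your real log path and the names used in your schema:

```shell
# Stand-in log file; point this at your actual PostgreSQL log.
LOG=/tmp/postgresql-sample.log
printf 'LOG:  duration: 70123.4 ms  statement: SELECT date_of_birth FROM patients;\n' > "$LOG"

# Case-insensitive search for hypothetical sensitive identifiers.
# Any hits mean the file needs scrubbing before it is emailed.
grep -i -E 'patient|date_of_birth|ssn' "$LOG"
```

This is not a guarantee of course, only a first pass; a manual read-through is still worthwhile.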