Hello @bobj I have two questions regarding dhis2-tools:
First, connecting to postgres remotely, given that it's hosted in one of the lxd containers. I am using Azure Linux VMs and would like to redirect my data pipelines to the postgres container.
Secondly, can you deploy a war file to an existing database, i.e. in a situation where I have migrated a database into my lxd containers?
For the first question, I am assuming that you have a server in Azure and you would like to connect an instance on your computer to that server? I don't really understand what you mean by your data pipelines. In any case, if that is the situation you have several options, but for simplicity and security I would do the following:
Ensure you can SSH to the server
Make sure you can access the SQL container from within the host (I think this is allowed by default in the firewall configuration, and not only from the DHIS2 container)
Connect to the SSH server using port forwarding (see How to Use SSH Port Forwarding {Ultimate Guide}); you will probably end up with something like $> ssh -L 5432:localhost:5432 you@server (this can also be achieved with PuTTY if you need to)
From that moment on, your local connections will be passed to the remote server, so you can get access with any SQL administration tool, or you could have your DHIS2 instance pointing there. The connection will remain open as long as you don't close the tunnel.
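To make it concrete, here is a minimal sketch of the whole flow. The server name, database name and user below are placeholders, and depending on how the container is exposed on the host the tunnel target may need to be the container's bridge address rather than localhost:

$> ssh -L 5432:localhost:5432 you@your-azure-server    # keep this session open, it holds the tunnel
$> psql -h localhost -p 5432 -U dhis -d dhis2          # in a second terminal: goes through the tunnel to the remote postgres

Any SQL administration tool (pgAdmin, DBeaver, etc.) pointed at localhost:5432 would reach the remote database the same way.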
Thanks @jaime.bosque, this was helpful. However, since I am defining the connection from Azure Data Factory's linked service (which does not have an option for SSH tunneling), I opted for network address translation instead, and it worked fine: the data pipelines can now pull directly from the container.
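For reference, one common way this kind of forward is set up with LXD is a proxy device. This is only an illustration (the container name is a placeholder, and it may not be the exact mechanism used), but it achieves the same NAT-style exposure of the container port on the host:

$> lxc config device add postgres pg-proxy proxy listen=tcp:0.0.0.0:5432 connect=tcp:127.0.0.1:5432   # expose the container's 5432 on all host interfaces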
As for the second question: we have always used the native way of installing DHIS2, but with the development of dhis2-tools I find containerization very useful, so I wanted to migrate data into the containers. I found a workaround nonetheless.
Also, just from the security perspective: by enabling NAT you might have exposed your database connection to the outside world, so I would change the configuration file and/or firewall configuration to allow connections only from specific addresses.
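For example, assuming ufw is in use on the host (the address, database and role names below are placeholders; replace 203.0.113.10 with whatever address your Data Factory integration runtime actually connects from):

$> ufw allow from 203.0.113.10 to any port 5432 proto tcp   # allow only the known pipeline address
$> ufw deny 5432/tcp                                        # reject everything else reaching the forwarded port

and/or an equivalent allow-list entry in the container's pg_hba.conf:

host  dhis2  dhis  203.0.113.10/32  md5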
Hi @jthomas. If you are using dhis2-tools-ng, @bobj can probably help you better as he's the one behind that tool. I believe the containers come with some security measures enabled, which probably include a firewall that blocks any connection to the postgres port except the one coming from the dhis2 container.
You probably need to add a rule that allows the connection from the host machine to achieve what you are looking for, and then connect to the machine (via SSH with local forwarding) with tools like pgAdmin. You could also enable the connection from the outside world so you don't need SSH at all, but from a security point of view that's probably a bad idea.
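As a rough sketch of the first part, assuming the firewall inside the postgres container is ufw (I'm not certain that is what dhis2-tools-ng actually configures, so adapt it to whatever is really in place; the container name and bridge address are placeholders):

$> lxc exec postgres -- ufw allow from 10.0.0.1 to any port 5432 proto tcp   # allow the host's address on the LXD bridge

After that, the SSH local-forwarding command shown earlier in this thread, with the tunnel target set to the container's address, should let pgAdmin connect through localhost:5432 on your machine.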
P.S. I'm not sure if pgAdmin allows you to set up SSH tunnels from the connection dialog. Other tools like DBeaver do, so that could simplify what I mentioned above.
I am not sure this is a great idea: doesn't this open up your Postgres database directly to all connections from the Internet? The entire point of having the database containerized is to be able to isolate it from the host machine and the Internet.
We have seen quite a few instances of databases being compromised when left exposed to the Internet.
Totally agree with that @moses_mwale, but I would go even further and say that this approach should not be used even for development purposes on any machine, since it really opens up the database to potential attacks. I am not sure what the limitation with an SSH tunnel combined with a jump host would be on an Azure machine, but on any machine where the Postgres database is exposed directly to the Internet you are really opening yourself up to unwanted attention from hackers.
I would strongly encourage you to seek another alternative using an SSH tunnel combined with a properly configured jump host, whether on production or development.