Anyone using Docker for their VPS company?

I wonder if anyone has a hosting/server company that uses Docker rather than traditional virtualization? And what would be the pros/cons of each system for you?


Well-known member
Yeah, only playing around with it. I wouldn't have the confidence to use it for shared hosting (as a VPS alternative), since the level of isolation isn't the same.


Well-known member
There's a difference between installing and using Docker inside a per-client virtualized VPS, as opposed to dropping VPS virtualization and setting up a dedicated server with a Docker container for each client.
I don't see why anyone would do #1. Isn't Docker an alternative to VPS virtualization?

I'm using a gameserver webpanel, and they just announced that the panel now works with Docker. It's still in beta, but I want to take advantage of this new feature.


Well-known member
I've been reading about these.

So far I've learned you need even more skill to run them.

Docker is more of a container service.

So you put nginx in one, php in another, mysqld in another and your file storage in another and your mobile app in another and your API in another.

This way you can roll out new updates easily and without downtime...

You wouldn't really use VPSes like this.

Each Docker container runs an almost full Linux userland (while sharing the host kernel), though images can be minimised.
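To illustrate that split, here's a minimal Docker Compose sketch, one service per container; the image tags and service names are just placeholders, not anything from this thread:

```yaml
# Hypothetical docker-compose.yml splitting the stack into containers
services:
  web:
    image: nginx:alpine            # web server in its own container
    ports:
      - "80:80"
    depends_on:
      - app
  app:
    image: php:8.2-fpm             # PHP runtime in a second container
    volumes:
      - app-code:/var/www/html
  db:
    image: mysql:8.0               # database in a third container
    environment:
      MYSQL_ROOT_PASSWORD: change-me
    volumes:
      - db-data:/var/lib/mysql     # persistent state outlives the container

volumes:
  app-code:
  db-data:
```

Each service can then be replaced or updated independently (e.g. `docker compose up -d --no-deps app`), which is where the "updates without downtime" idea comes from.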


Well-known member
The biggest problem with Docker is that it's a massively moving target, and it's the sort of thing you need a large team to manage well. Fitting a web forum into it would be challenging due to the shared persistent state.
I was thinking about Docker only for shared gameserver hosting, and the webpanel I mentioned before handles all of that. I just have to install the webpanel on one of my dedicated servers and create gameservers through the panel for each client, and those get created as Docker containers. That's the easy part, though. I was just wondering whether it would be safe this way or not. Safer than, or equal to, VPS virtualization?


Well-known member

As DevOps matures, enterprises have a need to put critical applications running on Docker containers into production. Security folks currently have a hard time deciding whether their Docker use case is ready to be operationalized. Is it a "go", a "no go", or do they need to implement additional controls before putting it into operation?

The answer depends on how you define security. If security for you is a measure of segregation capabilities, then containers are not quite there yet. If you take a step back, look at the bigger picture, and consider the many places where security controls usually apply, then you come to some interesting insights!

Applications deployed in containers are more secure than applications deployed on the bare OS

In short, despite the challenges, Gartner believes that one of the biggest benefits of containers is security. Gartner asserts that applications deployed in containers are more secure than applications deployed on the bare OS and, arguably, on a VM. Although containers will not prevent applications from being compromised, they greatly limit the damage of a successful compromise because applications and users are isolated on a per-container basis so that they cannot compromise other containers or the host OS — as long as a kernel privilege escalation vulnerability does not exist on the host OS.
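To make that per-container isolation concrete, here's a hedged sketch of how a container's attack surface is typically reduced with standard `docker run` hardening flags (the image, limits, and user ID are placeholders, not a recommendation from the article):

```shell
# Sketch: run a container with a reduced attack surface.
# --read-only: immutable root filesystem
# --cap-drop=ALL: drop all Linux capabilities
# --security-opt no-new-privileges: block setuid privilege escalation
# --pids-limit / --memory: per-container resource caps
# --user: run as a non-root user inside the container
docker run -d \
  --read-only \
  --cap-drop=ALL \
  --security-opt no-new-privileges \
  --pids-limit 100 \
  --memory 256m \
  --user 1000:1000 \
  alpine:3.19 sleep 3600
```

Even with all of this, containers still share the host kernel, which is why the kernel privilege-escalation caveat above matters.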

Just read up on the Docker issue tracker to get a feel for what could go wrong, especially at the networking level, and decide :)



Well-known member
I was experimenting with Docker for XenForo with different configurations such as:
  • a monolithic image that packs everything, including a web server, a database server, an email server, and whatnot
  • a micro service style setup in which separate images are used for each service and linked together
The latter is more flexible: it allows easy upgrades, independent scaling of services, and easy clustering. Docker Compose can be used to make the deployment easier. Alternatively, the recently introduced Docker Application Bundle (DAB) can also be used.

A few things need to be considered. One is the decision whether the code is copied inside the image or volume-mounted at run time. The former allows versioning (using image tags) as updates to XF or add-ons are made, but requires building a new image after each change; the latter makes it easier to change the code base, but keeping parity between local and production environments becomes yet another moving part. Another major thing to consider is to not distribute a Docker image containing the XenForo code in a public repository. Additionally, it would be better if the XF-related configuration can be done externally, without any file changes in the downloaded code, to allow easy updates. One way to achieve this is to use environment variables instead of making changes in config.php. Also, the directories that change at run time (such as data and internal-data) should be configured as external volumes if the code is packaged inside the image.
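A minimal sketch of that environment-variable approach, using XenForo's standard config.php database keys (the `XF_DB_*` variable names are made up here for illustration):

```php
<?php
// config.php sketch: pull DB credentials from the container environment
// instead of hard-coding them, so the same image runs anywhere.
$config['db']['host']     = getenv('XF_DB_HOST') ?: 'localhost';
$config['db']['port']     = getenv('XF_DB_PORT') ?: 3306;
$config['db']['username'] = getenv('XF_DB_USER') ?: '';
$config['db']['password'] = getenv('XF_DB_PASS') ?: '';
$config['db']['dbname']   = getenv('XF_DB_NAME') ?: '';
```

The variables would then be set per container, e.g. with `docker run -e XF_DB_HOST=db ...` or an `environment:` block in a Compose file, so no file in the code base needs editing between environments.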

I intend to write a detailed guide to run XenForo in Docker, but thinking of a common ground that works for many webmasters is difficult.
As I understand it, they're secure, right? Sorry, I'm not much of a tech guy, so I can't say I understand the details. Anyway, thank you for your interest, guys. :)