ipv4 docker overlay networks are now supported
Each tenant on kraud has at least one VPC, a "virtual private cloud", as the industry term goes. A VPC is fully isolated from other VPCs and can span multiple datacenters, including on-prem.
using the ipv6 VPC network
Each pod can connect to every other pod within the same VPC (again, also across datacenters) over a wireguard-encrypted channel. A VPC is a layer 3 ipv6 transport, which is the most efficient way to route between many different sites and participants.
Domain names of pods resolve to their ipv6 VPC address by default. You can simply start two pods in the same namespace, say "db" and "app", and then connect to "db" from the app. This works out of the box with almost all modern software.
You can also connect to an app in a different namespace using its FQN, i.e. "app"."namespace".
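As a minimal sketch, assuming a docker context named kraud, a namespace called mynamespace, and placeholder images (adapt the commands to however you normally start pods on kraud):

```shell
# start a database pod named "db" in the current namespace
docker --context kraud run -d --name db postgres:16

# from another pod in the same namespace, "db" resolves to its ipv6 VPC address
docker --context kraud run --rm --name app alpine ping -c 1 db

# from a pod in a different namespace, use the fully qualified name
docker --context kraud run --rm alpine ping -c 1 db.mynamespace
```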
using ipv4
Feedback from you, our users, included the request to use ipv4 between pods instead of ipv6, because some legacy applications do not support v6. Also, some docker compose setups include multiple networks with fixed ip addresses, which do not work well with dynamically assigned v6 addresses.
Starting today, every namespace additionally has a default ipv4 overlay assigned. It is implemented using ip6ip on top of the VPC, so it is not as efficient as the default VPC routing, but it enables convenience features such as assigning fixed ipv4 addresses.
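If you are curious what this looks like from inside a pod, here is a rough sketch using standard iproute2 tooling. The interface names are illustrative, and depending on where the tunnel terminates it may not be visible inside the pod at all:

```shell
# the native ipv6 VPC address
ip -6 addr show dev eth0

# the additional ipv4 overlay address
ip -4 addr show

# ipv4-in-ipv6 tunnels show up as ip6tnl devices, if present in the pod
ip -d link show type ip6tnl
```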
You will notice that docker --context kraud network ls now returns slightly different output.
Since the docker cli lacks many of the features we would need here, we have to condense the information into the network name.
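As an illustration, the network IDs, names, driver, scope, and subnet below are made up, and the exact formatting on kraud may differ:

```shell
$ docker --context kraud network ls
NETWORK ID     NAME              DRIVER    SCOPE
f3a91c2d4b5e   default/default   overlay   global

$ docker --context kraud network inspect default/default
...
"IPAM": {
    "Config": [
        { "Subnet": "10.12.0.0/16" }
    ]
}
...
```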
The first part of the name is the namespace, the same thing returned by kra ns ls, and the second part is the overlay network.
As you can see, this one has an ipv4 subnet assigned. You can specify which one!
For example, in your docker-compose.yaml you can pin pods to fixed addresses.
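A sketch, where the subnet and addresses are placeholders and ipam / ipv4_address are standard compose keys:

```yaml
services:
  db:
    image: postgres:16
    networks:
      backend:
        ipv4_address: 10.89.0.10   # fixed v4 address for this pod
  app:
    image: myorg/app:latest        # placeholder image
    networks:
      - backend

networks:
  backend:
    ipam:
      config:
        - subnet: 10.89.0.0/24     # the v4 range addresses are taken from
```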
Hopefully this brings us closer to direct docker compose compatibility and helps you with legacy applications. We are always happy to hear feedback and feature requests. If you need help, don't hesitate to contact support@kraudcloud.com.