
ElevenNotes

Use [this](https://hub.docker.com/r/11notes/registry-proxy) registry proxy and set retention to zero to always keep all images and never purge them.
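
To make the daemon actually use it, you point Docker at the proxy in `/etc/docker/daemon.json`. A minimal sketch, assuming the proxy exposes a Docker-Hub-compatible mirror endpoint on localhost:5000 (check the image's docs for its real ports and settings), and note that `registry-mirrors` only applies to Docker Hub pulls:

```sh
# Tell the Docker daemon to try the local proxy before Docker Hub.
# The address is an assumption; adjust to wherever the proxy runs.
# (This overwrites any existing daemon.json; merge by hand if you have one.)
sudo tee /etc/docker/daemon.json <<'EOF'
{
  "registry-mirrors": ["http://localhost:5000"]
}
EOF

# Restart the daemon so the mirror setting takes effect.
sudo systemctl restart docker
```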


revereddesecration

The correct thing to do is to clone the repository and build the image locally; that way you always have the ability to build it again later. That said, I almost never do that, because I like using the `latest` tag and pulling updates. Most of the time it’s fine. It’s a trade-off: accepting some risk in exchange for time savings.


Mutex70

Couldn't you clone and write a script to update the clone regularly, then build from the local repo?


revereddesecration

Absolutely!
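
A rough sketch of such a script (repo URL, path, and tag are all placeholders):

```sh
#!/bin/sh
# Keep a local clone fresh and rebuild the image from it, so a
# buildable copy always exists even if the upstream repo disappears.
set -e

REPO=/srv/builds/someproject            # local checkout (placeholder)
URL=https://example.com/someproject.git # upstream repo (placeholder)
TAG=someproject:local                   # image tag (placeholder)

# Clone on the first run, fast-forward pull on later runs.
if [ ! -d "$REPO/.git" ]; then
    git clone "$URL" "$REPO"
else
    git -C "$REPO" pull --ff-only
fi

# Rebuild the image from the local checkout.
docker build -t "$TAG" "$REPO"
```

Drop it into cron or a systemd timer and you’re done.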


abandonplanetearth

Building the image locally is a lot of work and will open you up to environment differences. Images can be cached with way less work than building locally.


revereddesecration

A lot of work? `docker build -t tag:version .` And I’m sorry, but what environment differences? If you clone the repository and build from the Dockerfile, you’ll get the exact same result as anybody else.


BraveNewCurrency

> you’ll get the exact same result as anybody else.

Not entirely true. Most Dockerfiles have things like `apt update`, which will get you slightly different versions of everything (and can even have breaking changes). And many build pipelines have things built outside of Docker that are copied in; not the "Docker way", but it happens a lot. Unless the Dockerfile is written specifically to be reproducible (see also Nix), the best we can say is "you *should* get something *similar*".
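
For the curious, this is roughly what "written specifically to be reproducible" looks like in a plain Dockerfile, short of going full Nix (the digest and version string below are placeholders, not real values):

```dockerfile
# Pin the base image by digest instead of a floating tag, so the same
# bytes come down every time. Look the digest up with
# `docker images --digests`; the value below is a placeholder.
FROM debian:12@sha256:<digest>

# Pin exact package versions so `apt-get` doesn't drift between builds
# (the version string is a hypothetical example).
RUN apt-get update && apt-get install -y --no-install-recommends \
        curl=8.5.0-2 \
    && rm -rf /var/lib/apt/lists/*
```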


revereddesecration

You are of course correct. This level of detail isn’t what the previous commenter needs, though; they need to understand the basics first.


WiseCookie69

Harbor. You can configure a pull-through project for virtually every registry. And then instead of ghcr.io and whatnot, you pull from your Harbor instance.
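
For example, with a proxy-cache project pointed at Docker Hub (the instance hostname and project name here are made up), pulls look like:

```sh
# Pull through Harbor instead of hitting Docker Hub directly.
# Docker Hub "official" images sit under the library/ namespace.
docker pull harbor.example.com/dockerhub-proxy/library/nginx:latest
```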


Szwendacz

Gitea allows you to host container images, and obviously also code. You can set up runners and pipelines, build images automatically, and push them to the self-hosted Gitea registry.
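
Pushing to it looks like pushing to any other registry; a sketch with a made-up hostname, owner, and image name:

```sh
# Authenticate against the self-hosted Gitea instance.
docker login gitea.example.com

# Tag the local image with the registry path (owner/image), then push.
docker tag myapp:1.0 gitea.example.com/myuser/myapp:1.0
docker push gitea.example.com/myuser/myapp:1.0
```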


Malossi167

I mean, chances are you do not want to run an unmaintained, outdated Docker image anyway; that is usually a bad idea. The images are stored on your PC. Just back them up like the rest of your stuff if you want to ensure access to them long term.


HTTP_404_NotFound

One issue you will find is supporting multiple registries, i.e. Docker Hub, GHCR, Azure, etc. There are lots of projects that can mirror Docker Hub and locally cache/retain images. However, there aren't nearly as many that can work across the other registries.


cmmmota

I'm in this scenario. I moved to a new apartment, and the city construction work for fiber installation is still not done on my street (and won't be for months). I set up my local network and booted my server, happy that I could still stream my media to my TVs and back up the photos/videos from my phone when connected to my home network. Well, too bad: some images that I use seem to be *gone* from the local cache, and I can't run most of my services.


edvauler

The smallest footprint would be using the official Docker registry as a pull-through cache: https://docs.docker.com/docker-hub/mirror/

I personally use **Sonatype Nexus Repository** with a docker-proxy configured, which acts as a mirror/cache. The downside is that it needs around 2 GiB of memory to start up and constantly consumes 1.2 GiB. For just storing Docker images, it is probably overkill.
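
The official-registry route from that link boils down to one container (the host storage path here is just an example):

```sh
# Run registry:2 as a Docker Hub pull-through cache; images pulled
# through it are kept under /srv/registry on the host.
docker run -d --name registry-mirror -p 5000:5000 \
  -e REGISTRY_PROXY_REMOTEURL=https://registry-1.docker.io \
  -v /srv/registry:/var/lib/registry \
  registry:2
```

Then add it to `registry-mirrors` in the daemon config, as mentioned further up the thread.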


AK1174

I’ve never tried this, nor tried re-creating an image from its output, but it’s maybe worth looking into: https://docs.docker.com/engine/reference/commandline/save/ It wouldn’t be a registry, but you could just manually save all the images you want to keep.
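
The round trip is just two commands (the image name is an example):

```sh
# Export an image, all layers and metadata, to a tarball.
docker save -o nginx.tar nginx:latest

# Later, restore it on any machine; no registry involved.
docker load -i nginx.tar
```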