
It's absolutely ridiculous how much boilerplate and redownloading happens as a result of Docker. The fact that the apt cache directories aren't cached between builds, or that there isn't a default caching HTTP proxy to save redownloads, is insane. There is so much waste.
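For what it's worth, the apt-cache complaint can be worked around today with BuildKit cache mounts, which persist a directory on the build host across builds. A minimal sketch (the base image and package names here are just examples):

```dockerfile
# syntax=docker/dockerfile:1
FROM debian:bookworm-slim

# The official Debian images ship a docker-clean config that purges the
# apt cache after every install; remove it so the cache can accumulate.
RUN rm -f /etc/apt/apt.conf.d/docker-clean

# These cache mounts survive between builds, so apt only redownloads
# packages when they actually change upstream.
RUN --mount=type=cache,target=/var/cache/apt,sharing=locked \
    --mount=type=cache,target=/var/lib/apt/lists,sharing=locked \
    apt-get update && apt-get install -y build-essential curl
```

This doesn't help across machines, though; for that you still need a shared proxy or mirror.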

I wish everyone who works on Docker had to develop for one day a week without internet access.

@sneak Are you talking about Docker itself, or the people who write Dockerfiles with poor practices?

@sneak This is why I always push for Artifactory/Nexus or some kind of repo I can run on-site. Re-downloading from Docker Hub or wherever is antiquated and, yes, a waste. Download it once, cache it locally; what's so hard about that? We still have checksums, so there's no risk there.
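The "download it once, cache it locally" idea doesn't require a full Artifactory install: the open-source registry image can run as a pull-through cache for Docker Hub. An ops sketch, assuming a host with Docker installed and `/srv/registry-cache` as an arbitrary local storage path:

```shell
# Run the open-source registry as a pull-through cache of Docker Hub.
docker run -d --name hub-mirror -p 5000:5000 \
  -e REGISTRY_PROXY_REMOTEURL=https://registry-1.docker.io \
  -v /srv/registry-cache:/var/lib/registry \
  registry:2

# Then point the Docker daemon at it in /etc/docker/daemon.json:
#   { "registry-mirrors": ["http://localhost:5000"] }
# and restart dockerd. Subsequent pulls check the local cache first
# and only hit Docker Hub on a miss.
```

Content addressing means the mirror can verify layer digests, so the checksum point above holds: a cache hit is cryptographically the same bits.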

@fak3r I don't know why `docker push` doesn't support the S3 API.

@sneak Well, that'd be brilliant too, especially if you have to pull things TO AWS over and over...
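The closest thing that exists today: the Docker CLI can't speak the S3 API directly, but the open-source registry (distribution) supports S3 as a storage backend, so you can `docker push` to a registry whose blobs live in your own bucket. A minimal config sketch; the bucket name and region are placeholders:

```yaml
# config.yml for the open-source registry (distribution),
# storing all layers and manifests in S3 rather than local disk.
version: 0.1
storage:
  s3:
    region: us-east-1            # placeholder region
    bucket: my-registry-bucket   # placeholder bucket name
    # credentials are read from the environment or instance profile
http:
  addr: :5000
```

For the pulling-into-AWS case, running this registry inside the VPC keeps repeat transfers off the public internet entirely.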
