There are a couple of things you could do. One would be to create a base image with all of your dependencies installed and use it in the FROM line of your Dockerfile. Right now you’d need to put that base image on a public registry, as resin.io needs to be able to access it directly, but we’re working on adding build-time secrets support to the platform (which would enable private registries).
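As a rough sketch of that approach (the image and registry names here are hypothetical, and the base image you start FROM will depend on your device type), you would build and push a dependencies-only image once:

```Dockerfile
# base.Dockerfile — built once on your own machine and pushed to a public registry,
# e.g.: docker build -f base.Dockerfile -t myuser/my-base:latest .
#       docker push myuser/my-base:latest
FROM resin/raspberrypi3-python
RUN apt-get update && apt-get install -y package1 package2 package3
```

and then your application Dockerfile only has to pull it and copy your code in:

```Dockerfile
# Dockerfile — the build resin.io runs; all dependencies are already baked in
FROM myuser/my-base:latest
COPY ./my_app.py .
CMD ["python", "my_app.py"]
```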
That said, this might be overkill if all you’re looking for is faster builds. We use Docker layer caching on the build server, so as long as the lines that install your dependencies come earlier in your Dockerfile than the lines you are changing, they will be pulled from the cache rather than re-run.
So you could do this:
RUN apt-get update && apt-get install -y package1 package2 package3
COPY ./my_app.py .
As long as any changes you are making happen in my_app.py, everything up through the RUN line in the Dockerfile will come from the cache after the first build. So the first build might take 15 minutes to install your dependencies, but subsequent builds would take only a few seconds to load from the cache.