Adding large files to an image via Git LFS

I have been working on a custom image and want to drop a bunch of my favorite hi-res images into /usr/share/wallpapers. So I put them in my repo with a files module and committed them using Git LFS to avoid committing large binaries directly. However, when I run my build, BlueBuild just writes the symbolic link for each large file into the wallpapers directory instead of the actual file. I tried setting up LFS in my image using a script module, and have considered writing a custom module to handle LFS. But it seems that, because modules run within the context of a Containerfile, they don't recognize that they are even in a Git repo. Do you have any suggestions for how to go about this? Or should I just put my wallpaper files in a bucket and download them at build time?
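For reference, the relevant bit of my recipe looks roughly like this (paths are illustrative, and the source/destination keys are how I read the files module docs):

```yaml
# recipe.yml (excerpt), rough sketch of the setup described above
modules:
  - type: files
    files:
      # source is assumed to be relative to the repo's files/ directory;
      # the LFS-tracked wallpapers live there
      - source: wallpapers
        destination: /usr/share/wallpapers
```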
Luke Skywunker
This thread might be able to help you. You can also make use of the copy module if you need to.
BlueBuild docs, copy module: "The copy module is a direct translation of the COPY instruction in a Containerfile."
GitHub issue #40449, "Dockerfile COPY from image resolves symbolic links": when building a Docker image by copying files from an existing image, symbolic links in the source image are resolved instead of copied, which can result in the built image being larger than expected.
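If you do go the copy route, a minimal sketch might look like this; the from/src/dest keys are my assumption from the module description above, so double-check the module docs:

```yaml
# sketch of a copy-module entry; key names assumed, verify in the docs
modules:
  - type: copy
    from: ghcr.io/example/wallpaper-stage   # hypothetical source image or stage
    src: /wallpapers
    dest: /usr/share/wallpapers
```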
xyny · 5w ago
The checkout action requires `lfs: true` to be set for LFS files to be fetched. This is not set by the checkout step included by default in the BlueBuild action. You can set the BlueBuild action to skip checkout and do it yourself in a previous build step (if you're using GitHub, of course).
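Something like this if you're on GitHub Actions. The `lfs` input on actions/checkout is real; the input for telling the BlueBuild action to skip its own checkout is a placeholder here, so check the action's README for the actual name (other inputs like signing keys are omitted from this sketch):

```yaml
# .github/workflows/build.yml (sketch)
name: build
on:
  push:
    branches: [main]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      # check out the repo yourself so LFS objects are fetched,
      # not just the pointer files
      - uses: actions/checkout@v4
        with:
          lfs: true

      # then run the BlueBuild action on the already-checked-out tree
      - uses: blue-build/github-action@v1
        with:
          recipe: recipe.yml
          # skip_checkout: true   # hypothetical input name, verify in docs
```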
Luke Skywunker
Ah, this sounds more likely.
joshmock (OP) · 5w ago
Ahh, thank you, I'll try this! I was reading the BlueBuild source to see if it needed to happen there, but the GitHub action makes way more sense.
