We have a computer vision program with automated tests that run on video files.
Those files are not gigantic, but they run to a couple hundred MB, so checking them into Git is not really an option.
What is the best way to bring those files into the CI build env without too much work?
So far, our ideas are:

- put them on a DO droplet and rsync them down (not nice because it might need maintenance)
- pull them from Google Drive (not nice because you cannot just download via curl or so; you need to parse some HTML or use the Drive API, which is a pain)
Short of that, I think your other options are fine. I would probably put these on AWS S3, because that gives you the best balance of cost, simplicity, and speed.
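A minimal sketch of what the S3 route could look like as a CI fetch step. The bucket name and local path are placeholders; credentials would come from your CI provider's secret store as `AWS_ACCESS_KEY_ID` / `AWS_SECRET_ACCESS_KEY`.

```shell
# Hypothetical CI fetch step for the S3 route. The bucket "my-cv-test-assets"
# and local path "test-data/videos" are placeholders for illustration.
BUCKET="my-cv-test-assets"
DEST="test-data/videos"
mkdir -p "$DEST"
# `aws s3 sync` only transfers objects that changed, so repeat builds stay fast.
aws s3 sync "s3://$BUCKET/videos/" "$DEST/" --only-show-errors
```

If you would rather keep AWS credentials out of CI entirely, you can make the objects public-read (or hand out presigned URLs) and fetch them with plain curl instead.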
I believe 1G of storage is free with GitHub and Bitbucket, but you might want to check their limits on bandwidth: GitHub has a 1G/month policy for free accounts, while Bitbucket is a bit more lenient and only mentions bandwidth in its AUP, presumably to deal with people who are really over-using the service. GitLab allows 10G of storage per project (including LFS), and when I looked for bandwidth limits a month or so ago, I could not find any.
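If those limits work for you, the Git LFS route is only a few commands. This is a sketch; `tests/videos` is a placeholder for wherever the clips live in your repo.

```shell
# Sketch of the Git LFS route. "tests/videos" is a placeholder path.
git lfs install                       # set up the LFS filters on this machine
git lfs track "tests/videos/*.mp4"    # writes the pattern to .gitattributes
git add .gitattributes tests/videos/
git commit -m "track test videos via Git LFS"
git push                              # uploads the video blobs to the LFS store
```

After that, CI just needs `git lfs pull` (or a checkout step with LFS enabled) and the videos arrive with the source.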
Of course, most of the above does not apply if you pay for your Git hosting.
As an aside, GitHub has issues with folks using it as a CDN and applies some rate limits. I don't know if or how those apply to Git LFS, but it is something worth researching.