That's low enough that it may cause problems for a lot of infrastructure. Like, I'm pretty sure that the MELPA Emacs package repository builds packages straight out of Git, and a lot of those repos are on GitHub.
Likely the point. If you need more, get an API key.
Or just make authenticated requests. I'd expect that to be well within the capabilities of anyone using MELPA, and 5000 requests per hour shouldn't pose any difficulty considering MELPA only has about 6000 total packages.
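For the curious, authenticating is just a matter of sending a token along with the request. A minimal sketch in Python using the requests library (GITHUB_TOKEN is a placeholder for a personal access token you'd generate yourself in your GitHub settings):

```python
import os

import requests

# A personal access token lifts the limit from 60 to 5,000 requests/hour.
# GITHUB_TOKEN is a placeholder; create a token in your GitHub settings.
token = os.environ["GITHUB_TOKEN"]

resp = requests.get(
    "https://api.github.com/rate_limit",
    headers={
        "Authorization": f"Bearer {token}",
        "Accept": "application/vnd.github+json",
    },
)
resp.raise_for_status()
core = resp.json()["resources"]["core"]
print(f"{core['remaining']} of {core['limit']} requests left this hour")
```

Conveniently, hitting the /rate_limit endpoint doesn't count against your quota, so it's a safe way to check where you stand.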
This is my opinion on it, too. Everyone is crying about the death of GitHub when they're just cutting back on unauthenticated requests to curb abuse... lol, seems like pretty standard practice to me.
Do you think any infrastructure is pulling that often while unauthenticated? It seems like an easy fix either way (in my admittedly non-DevOps opinion).
It's gonna be problematic in particular for organisations with larger offices. If you've got hundreds of devs/sysadmins behind the same public IP address, those 60 requests/hour are shared between all of them - with 300 people, that works out to one request per person every five hours.
Basically, I expect unauthenticated pulls to no longer be possible at my day job, which means repos hosted on GitHub become a pain.
Same problem for CGNAT users, where lots of subscribers share a single public IP.
Quite frankly, companies shouldn't be pulling willy-nilly from GitHub or npm anyway. It's trivial to set up something to cache repos or artifacts, and it also guards against your builds breaking when GitHub itself is down.
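The caching layer doesn't have to be anything fancy, either. Here's a toy read-through cache in Python just to illustrate the idea (the cache directory is made up; a real setup would use something like Artifactory, Nexus, or a pull-through proxy):

```python
import hashlib
import pathlib
import urllib.request

CACHE_DIR = pathlib.Path("/var/cache/artifacts")  # made-up location

def fetch_cached(url: str) -> bytes:
    """Return the artifact at `url`, only hitting the network on a cache miss."""
    CACHE_DIR.mkdir(parents=True, exist_ok=True)
    key = hashlib.sha256(url.encode()).hexdigest()
    cached = CACHE_DIR / key
    if cached.exists():
        # Served locally: no upstream request, and it still works during outages.
        return cached.read_bytes()
    data = urllib.request.urlopen(url).read()
    cached.write_bytes(data)  # one upstream request, reused by every later caller
    return data
```

Every fetch after the first is served locally, which is exactly why the same setup doubles as outage protection.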
It's easy to set up a cache, but what's hard is convincing your devs to use it.
Mainly because, well, things generally work without configuring the cache in your build pipeline, since you'll almost always have some way of accessing the internet anyway.
But there are other reasons, too. You need authentication or a VPN to access a cache like that. Authentication means you have to deal with credentials, which is a pain. A VPN means it's likely slower than downloading directly from the internet, at least while you're working from home.
Well, and it's also just yet another moving part in your build pipeline. If that cache is ever down or broken or inaccessible from certain build infrastructure, chances are it will get removed from affected build pipelines and those devs are unlikely to come back.
Having said that, GitHub is of course effectively promoting caches quite heavily here. That might actually make them worth using for individual devs.
Ah yeah, that's right - I didn't consider large offices. I can definitely see how that'd be a problem.
If I'm using Ansible or something to pull images, it might get that high.
Of course, the fix is to pull once and copy the files over, but I could see this breaking prod for folks who didn't write it that way in the first place.
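As a rough sketch of that "pull once, copy over" pattern (in plain Python rather than Ansible, and with the URL and paths invented purely for illustration):

```python
import shutil
import urllib.request
from pathlib import Path

# All names here are invented; in Ansible this download would be a single
# run_once task delegated to localhost, with per-host copy tasks afterwards.
URL = "https://github.com/example/project/releases/download/v1.0/app.tar.gz"
LOCAL = Path("/tmp/app.tar.gz")
TARGETS = [Path("/srv/host-a"), Path("/srv/host-b")]  # stand-ins for real hosts

# One request against the rate limit, no matter how many targets there are.
if not LOCAL.exists():
    urllib.request.urlretrieve(URL, str(LOCAL))

for target in TARGETS:
    target.mkdir(parents=True, exist_ok=True)
    shutil.copy2(LOCAL, target / LOCAL.name)  # local copies, no further requests
```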
I didn't think of that - also, for nvim you typically pull plugins straight from Git repositories.