Anyone Can Access Deleted and Private Repository Data on GitHub
GitHub's architecture allows access to data from deleted and private repositories, posing security risks. The Cross Fork Object Reference vulnerability enables retrieval of sensitive information even after deletion, necessitating user vigilance.
GitHub's architecture allows access to data from deleted and private repositories, posing significant security risks for organizations. This vulnerability, termed Cross Fork Object Reference (CFOR), enables users to access sensitive data from deleted forks and repositories. Even after a repository is deleted, any data committed to it remains accessible through its forks, because GitHub retains commit data within the shared repository network. For instance, if a user forks a public repository, commits code, and then deletes the fork, the committed data can still be retrieved from the upstream repository. The issue extends to private repositories as well: if a private repository is forked internally and the upstream is later made public, any commits made to the private fork before the visibility change can be accessed by anyone.
The implications are severe, as sensitive information, such as API keys, can be exposed indefinitely. GitHub's design does not align with the common user perception that deleting a repository removes all associated data. Users can access deleted commit data if they know the commit hash, which can be discovered through brute force methods or GitHub's public events API. Truffle Security has highlighted these vulnerabilities and emphasized the need for organizations to rotate keys and be vigilant about data exposure on public repositories. The findings suggest that many users may not fully understand the risks associated with GitHub's repository management, leading to potential security breaches.
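To make the "discoverable via the public events API" point concrete, here is a minimal sketch that lists commit SHAs appearing in a user's recent public push events. It assumes GitHub's documented REST events endpoint and PushEvent payload layout; the helper name and example username are placeholders, not anything from the article.

```python
# Minimal sketch: list commit SHAs that appear in a user's recent public
# push events via GitHub's REST events API. Field names follow the
# documented PushEvent payload; verify against current docs before use.
import requests


def recent_push_shas(user: str, token: str | None = None) -> list[str]:
    headers = {"Accept": "application/vnd.github+json"}
    if token:
        headers["Authorization"] = f"Bearer {token}"
    resp = requests.get(
        f"https://api.github.com/users/{user}/events/public",
        headers=headers,
        timeout=10,
    )
    resp.raise_for_status()
    shas = []
    for event in resp.json():
        if event.get("type") == "PushEvent":
            for commit in event.get("payload", {}).get("commits", []):
                shas.append(commit["sha"])
    return shas


if __name__ == "__main__":
    print(recent_push_shas("octocat"))  # placeholder username
```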
Related
Simple ways to find exposed sensitive information
Various methods to find exposed sensitive information are discussed, including search engine dorking, Github searches, and PublicWWW for hardcoded API keys. Risks of misconfigured AWS S3 buckets are highlighted, stressing data confidentiality.
Leaked admin access token to Python, PyPI, and PSF GitHub repos
The JFrog Security Research team discovered a leaked admin access token for Python repositories on GitHub. PyPI promptly revoked the token, preventing a supply chain attack. Emphasizes the importance of scanning binaries for security.
Binary secret scanning prevents serious supply chain attack on Python ecosystem
The JFrog Security Research team discovered a leaked admin access token for Python repositories on GitHub, prompting swift action from PyPI to revoke the token. This incident underscores the critical need for enhanced security measures.
"GitHub" Is Starting to Feel Like Legacy Software
GitHub faces criticism for performance decline and feature issues like blame view rendering large files. Users find navigation challenging and core features neglected despite modernization efforts. Users consider exploring alternative platforms.
Nation-State Actors Targeting Software Supply Chain via GitHub (2023)
GitHub warns of Lazarus Group, linked to North Korea, targeting cryptocurrency, gambling, and cybersecurity sectors via social engineering. Group aims to breach software supply chains for financial gain. Panther Labs offers security workshop.
- Many users express frustration over GitHub's classification of vulnerabilities as features, emphasizing the need for stricter privacy measures.
- There is a consensus that users should not have to navigate complex security implications of "private" repositories, which can inadvertently expose sensitive data.
- Several commenters highlight the risks associated with the Cross Fork Object Reference vulnerability, suggesting that it undermines the integrity of private repositories.
- Users recommend best practices, such as creating new repositories instead of forking, to mitigate potential security issues.
- Some comments reflect a broader skepticism about trusting cloud services with sensitive data, advocating for local storage solutions instead.
Here is their full response from back then:
> Thanks for the submission! We have reviewed your report and validated your findings. After internally assessing the finding we have determined it is a known low risk issue. We may make this functionality more strict in the future, but don't have anything to announce now. As a result, this is not eligible for reward under the Bug Bounty program.
> GitHub stores the parent repository along with forks in a "repository network". It is a known behavior that objects from one network member are readable via other network members. Blobs and commits are stored together, while refs are stored separately for each fork. This shared storage model is what allows for pull requests between members of the same network. When a repository's visibility changes (Eg. public->private) we remove it from the network to prevent private commits/blobs from being readable via another network member.
In the meantime I'll be calling "private" repos "unlisted"; that seems more appropriate.
GitHub may very well say that this is working as intended, but if it truly is then you should be forced to make both the repo and fork public at the same time.
Essentially "Making repo R public will make the following forks public as well 'My Fork', 'Super secret fork', 'Fork that I deleted because it contained the password to my neighbours wifi :P'.
OK. I'm not sure if the last one would actually be public, but I wouldn't be surprised if that was "Working as intended(TM)" - GitHub SecOps
I've used github for a long time, would not have expected these results, and was unnerved by them.
I'd recommend reading the article yourself. It does a good job explaining the vulnerabilities.
Similar (but less concerning) is the ability to use short SHA1 hashes. You'd have to either be targeting a particular repository (for example, one for which a malicious actor can expect users to follow the tutorial and commit API keys or other private data) or be targeting a particular individual with a public repository who you suspect might have linked private repositories. It's not free to guess something like "07f01e", but not hard either.
If these links still worked exactly the same, but (1) you had to guess 07f01e8337c1073d2c45bb12d688170fcd44c637 and (2) there was no events API with which to look up that value, this would be much, much less impactful.
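For a sense of scale behind "not free, but not hard": a six-character abbreviated hash has only 16^6 ≈ 16.8 million possibilities. The request rate in the sketch below is an assumption purely for illustration.

```python
# Back-of-the-envelope arithmetic for the short-hash guessing argument:
# how many candidates exist for an abbreviated hash of n hex characters,
# and how long an exhaustive probe would take at an assumed request rate.
ASSUMED_REQUESTS_PER_SECOND = 100  # assumption for illustration only

for n in (4, 6, 7, 8):
    candidates = 16 ** n
    days = candidates / ASSUMED_REQUESTS_PER_SECOND / 86_400
    print(f"{n} hex chars: {candidates:>13,} candidates "
          f"(~{days:,.1f} days at {ASSUMED_REQUESTS_PER_SECOND} req/s)")
```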
https://github.com/github-community-projects/private-mirrors
On the other hand, once an API key or password has been published somewhere, you should rotate it anyway.
What would github do after receiving a DMCA request in that case?
That GitHub is telling these companies (and bear in mind that these companies are paying customers of GitHub) "yeah, we don't care that your private proprietary code can be hacked off GitHub by anybody" is incredibly disturbing. Is there really not enough pressure from paying customers to fix this? Is Microsoft just too big to care?
Once we eliminated the references in the tree and all forks (they were all private thankfully), we reached out to BitBucket support, and they were able to garbage collect those commits, and purge them to the point where even knowing the git hashes they were not locatable directly.
That's exactly how you should treat anything made available to the public (and there's no need for the subsequent qualifier that appears in the article—"as long as there is at least one fork of that repository").
The one thing they seem to be able to show is that commits in private branches show up in the parent repository if you know the SHAs. And that seems like a real vulnerability. But AFAICT it also requires that you know the commit IDs, which is not something you can get via brute forcing the API. You'd have to combine this with a secondary hole (like the ability to generate a git log, or exploiting a tool that lists its commit via ID in its own metadata, etc...).
Not nothing, but not "anyone can access private data on GitHub" as advertised.
Making it a "template" repo mostly fixed the issue. That creates a copy instead of a fork. However it still happens from time to time.
Say a private commit depends on a public commit C. Suppose in the public repo, the branch containing C gets deleted and C is no longer reachable from the root. From the public repo's point-of-view, C can be garbage-collected, but GitHub must keep it alive, otherwise the deletion will break the private commit.
It would be "a spooky action at a distance" from the private repo's POV. Since the data was at a time public, the private repo could have just backed up everything. In fact, if that's the case, everyone should always backup everything. GitHub retaining the commit achieves the same effect.
The public repo's owner can't prevent this breakage even if they want to, because there's no way to know the existence of this dependency.
The security issue discussed in the post is a different scenario, where the public repo's owner wants to break the dependency (making the commit no longer accessible). That would make it too risky for anyone to depend on any public code.
My mental model is that all commits ever submitted to GitHub will live forever and if it's public at one time, then it will always be publicly accessible via its commit hash.
I'm not so sure about the "forever" part as git gc is a thing, and at least in 2013 they ran it regularly: https://stackoverflow.com/a/56020315
No idea about nowadays though. There is this blog post:
https://github.blog/engineering/scaling-gits-garbage-collect...
> We have used this idea at GitHub with great success, and now treat garbage collection as a hands-off process from start to finish.
When I looked into it a while back, apparently it is intended behavior, which just seems odd.
Have we stopped naming vulnerabilities cute and fuzzy names and started inventing class names instead? Does this have a logo? Has this issue been identified anywhere else?
For example if the root repo is DMCA’d, or, if repo B forks repo A, then B adds some stuff that causes B to get DMCA’d. Can A still access B?
>But what’s more interesting; GitHub exposes a public events API endpoint. You can also query for commit hashes in the events archive which is managed by a 3rd party, and saves all GitHub events for the past decade outside of GitHub, even after the repos get deleted.
Oof
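The third-party archive referred to is presumably GH Archive (gharchive.org), which publishes hourly dumps of public GitHub events. The sketch below scans one hour's dump for push events touching a given repository; the URL pattern and JSON field names follow GH Archive's published format but should be treated as assumptions, and the example hour and repo name are placeholders.

```python
# Sketch: scan one hourly GH Archive dump (gzipped, newline-delimited JSON)
# for PushEvents on a given repo and collect the commit SHAs they recorded.
import gzip
import io
import json

import requests


def shas_for_repo(hour_url: str, repo_full_name: str) -> list[str]:
    resp = requests.get(hour_url, timeout=60)
    resp.raise_for_status()
    shas = []
    with gzip.open(io.BytesIO(resp.content), mode="rt", encoding="utf-8") as fh:
        for line in fh:  # one JSON event per line
            event = json.loads(line)
            if (event.get("type") == "PushEvent"
                    and event.get("repo", {}).get("name") == repo_full_name):
                shas.extend(c["sha"] for c in event["payload"].get("commits", []))
    return shas


if __name__ == "__main__":
    url = "https://data.gharchive.org/2024-07-24-12.json.gz"  # placeholder hour
    print(shas_for_repo(url, "octocat/Hello-World"))          # placeholder repo
```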
1. If a URL would be in the “[t]his commit does not belong to any branch of this repository, and may belong to a fork outside of the repository” and that URL uses a shortened commit hash, return 404 instead. Assuming no information leakage via timing, this would make semi-brute-force probing via short hashes much harder.
GitHub is clearly already doing the hard work for this.
2. A commit that was never public should not become public unless it is referenced in a public repository.
This would require storing more state.
> Any commits made to your private fork after you make the “upstream” repository public are not viewable.
Does that mean a private repo that has never been or will be public isn’t accessible? That scenario wasn’t mentioned.
(there is a checkbox allowing that when you are opening a PR, which I bet almost no one noticed)
I reported that years ago, and all they changed was that they extended the documentation about this "feature".
my main issue was that you cannot easily revoke this access, because the target repo can always reopen the PR and regain write access.
but they basically stated "works as intended"
Accessing commits on a private fork when its upstream is made public
The other 2 are just common sense... push something to a public repo and it's public forever. Everyone knows once something's on the internet it's already too late to make it secret again.
1) Never store secrets in any repo ever! As soon as you discover that it's happened, rotate the key/credential/secret asap!!
2) Enterprises that rely on forking so that devs can collaborate are fucked! Protecting IP by way of private repos is now essentially broken on GH!
3) what the actual fuck github!!??
If I force push and orphan a commit, I expect that will get garbage collected and be gone forever.
Or if I commit a file I shouldn't have and rewrite my repo history and push up a whole new history, is the old history still hanging out forever?
If true, then it seems that there is no way to delete any commits at all from any repo that has any forks?
Github is a software distributing network. Like the app store, or Steam. They grant you access to licensed content, which you self license, and then they facilitate access for you. Based on the honor system. But some things can just be assumed to be true for the sake of simplicity and liability.
For example, If I make a repo public and then take it private the hashes that were obtained while it was open are still open. If I make a repo that's closed and open it, the whole thing is open.
If you fork a public repo and make private commits on it to a software distributor like Github, that is probably just going to end in a violation of the license. In this scenario, Github is saving you from yourself.
If anyone's wondering: Organizations that require SAML are included in your organizations even when you don't have a SAML session when signing in elsewhere via OAuth, unlike generalized per-organization app authorizations, where GitHub can actually hide organization membership. The only way to find out whether a user has a SAML session is for the consuming app to request the membership with your token and interpret a 403 as "no SAML session". As far as I know, only Tailscale implemented this. This really sucks for apps like SonarCloud, where someone can now view work code from the personal GitHub account they thought was so cleanly separated from their professional one.
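A rough sketch of the check described above, assuming GitHub's org-membership endpoint and the convention that a 403 carrying an X-GitHub-SSO header means the token has no active SAML session for that org; the function name is made up for illustration.

```python
# Sketch: probe org membership with the user's OAuth token and interpret a
# 403 carrying an X-GitHub-SSO header as "no active SAML session".
import requests


def has_saml_session(token: str, org: str, username: str) -> bool | None:
    resp = requests.get(
        f"https://api.github.com/orgs/{org}/members/{username}",
        headers={
            "Accept": "application/vnd.github+json",
            "Authorization": f"Bearer {token}",
        },
        timeout=10,
    )
    if resp.status_code == 403 and "X-GitHub-SSO" in resp.headers:
        return False   # token works, but the SAML SSO requirement isn't satisfied
    if resp.status_code == 204:
        return True    # member, and the token is authorized for the org
    return None        # not a member, or membership is hidden
```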
> I submitted a P1 vulnerability to a major tech company showing they accidentally committed a private key ... They immediately deleted the repository,
That is a ridiculous response to a compromised key. The repository should not have been "deleted", the key should have been revoked.
Imagine if you lost a bag with 100 keys to your house. Upon realising you desperately try to search for the bag only to find it's been opened and the keys spread around. You comb through the grass and forests nearby collecting keys and hoping you find them all.
Or you just change the locks and forget about it.
If you upload something, anything, to a computer system you do not own you need to consider it no longer secret. It's as simple as that. Don't like it? Don't do it.
I detest things like delete buttons in messaging apps and, even worse, email recall in Outhouse-style email apps. They just give people a false sense of security. I've been accidentally sent someone's password several times on Teams. Yeah you deleted the message, but my memory is very good and, trust me, I still know your password.
If there's a security problem here it's in people believing you can delete stuff from someone else's system, or in systems that make it look like you can. The solution is the same though: education. Don't blame GitHub. Don't force them to "fix" this. That will only make it worse, because there are still a million other places people will upload stuff to that also won't actually delete it.
Turns out I found out you could even invite external collaborators into your fork and totally bypass enforced SSO.
Even if you block forking of your main repo, the existing forks remain active and can still pull from upstream.
It feels like if you need proper security, you have to go with enterprise
Ultimately I don't think it's feasible to break this behaviour and the most we can hope for is a big red warning when something counterintuitive happens.
What gives?
Trusting some company will actually delete your stuff is kind of naive in my opinion.
As for the example of people forking and putting an API key in the repo: I would never let my people do this. Once you push, it will be "out there".
But that's just me...
How I manage this is that every time I want to open-source a previously private feature, I take the changeset diff and apply that to the files in the public repository. Same features, but plausibly different hash.
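In git terms, that workflow looks roughly like the sketch below (driving git from Python; the repo paths, revision range, and commit message are placeholders): export the private change as a plain diff and commit it fresh in the public repo, so the published commit shares no object history with the private fork.

```python
# Sketch of the "copy the diff, not the history" workflow: take the
# changeset from the private repo as a plain diff and commit it anew in
# the public repo, yielding commits with unrelated hashes.
import subprocess

PRIVATE_REPO = "/path/to/private-repo"   # placeholder
PUBLIC_REPO = "/path/to/public-repo"     # placeholder

# 1. Capture the private changeset as a plain diff.
diff = subprocess.run(
    ["git", "-C", PRIVATE_REPO, "diff", "main~3..main"],  # placeholder range
    check=True, capture_output=True, text=True,
).stdout

# 2. Apply it to the public repo's index and working tree...
subprocess.run(
    ["git", "-C", PUBLIC_REPO, "apply", "--index"],
    input=diff, check=True, text=True,
)

# 3. ...and commit it fresh, with no shared history.
subprocess.run(
    ["git", "-C", PUBLIC_REPO, "commit", "-m", "Add feature X"],  # placeholder
    check=True,
)
```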
That, unfortunately, sounds like the result of publishing something on the Internet. Not GitHub's fault.
If you’re not gonna share it then it hardly matters. Use a backup drive.
Git is distributed. You don’t have to put your dotfiles on GitHub. Local is enough.
Morally it seems even worse: Crowdsec did it by accident, while GitHub has known about this for years now.
it doesn't matter if it's behaving as intended or how forks are involved
also, point 1 implies that GitHub likely doesn't properly GC their git storage, which could have all kinds of problematic implications beyond point 1 w.r.t. purging accidentally leaked secrets or PII....
all in all it just shows GitHub might not take privacy and security seriously... which is kinda hilarious given that customers using private repos tend to be the paying customers
Sounds like a win for foss
“Private” means it should be available only to the specific involved parties.
If you implement any other behavior for these concepts, you are implementing anti-patterns.
We need to be precise and consistent in the wording of the functions we provide, so that it is easy to understand what is going on without having to interpret documentation to fully understand it.
Insane
Also Microsoft: It's a feature!
----
This is clearly documented and can be explained even to non-technical managers.
From my POV, calling that a vulnerability is just trying to build hype.
I think that having a quote from here on the visibility-change settings page would make it even clearer: https://docs.github.com/en/pull-requests/collaborating-with-...
... yeah if <Do Work> is push your keys to GitHub.
"Private repositories" were never private as I said before. [0]
And if that entity has a complex system of storage and retrieval of data by and for many users, that changes frequently, without public scrutiny - it should be assumed that data breaches are likely to occur.
So I don't see it as very problematic that GitHub's private repositories, or deleted repositories, are only kind-sorta-sometimes private and deleted.
And it's silly that the article refers to one creating an "internal version" of a repository - on GitHub....
Still, interesting to know about the network-of-repositories concept.
If you know the hash of some data, then you either already have the data yourself, or you learned the hash from someone who had the data.
If you already have the data, there is no vulnerability - since you cannot learn anything you don't already have.
If you got the hash from someone, you could likewise have gotten the data from them.
People do need to be aware that 'some random hex string' is in fact the irrevocable key to all the data behind that hash - but that's kinda inherent to git's design. Just like I don't tell everyone here on HN my login password - the password itself isn't sensitive, but both of us know it accesses other things that are.
If github itself was leaking the hash of deleted data, or my plaintext password, then that would be a vulnerability.