Security is not part of most people's jobs
Chris Siebenmann argues that security takes a back seat in most workplaces because people are hired, promoted, and rewarded for job performance rather than security adherence, leaving security practices with weak incentives and little feedback.
In a blog post dated June 25, 2024, Chris Siebenmann discusses the role of security in people's jobs. He argues that in most workplaces security is not a priority for employees, who are primarily focused on getting their jobs done to avoid being fired. Siebenmann points out that people are typically hired, promoted, and rewarded based on their job performance rather than their adherence to security protocols. Security thus becomes overhead and may even hinder employees from earning promotions or bonuses. The author emphasizes that what a workplace rewards is what employees will prioritize, often at the expense of security measures. Siebenmann suggests that the lack of incentives for prioritizing security, combined with the focus on job-related skills, explains the neglect of security practices in many organizations. The post concludes by noting how hard it is to get feedback on the long-term consequences of design and programming decisions, given high turnover rates in the industry.
Related
Simple ways to find exposed sensitive information
Various methods to find exposed sensitive information are discussed, including search engine dorking, Github searches, and PublicWWW for hardcoded API keys. Risks of misconfigured AWS S3 buckets are highlighted, stressing data confidentiality.
Why SMBs Don't Deploy SSO
Small and medium-sized businesses (SMBs) hesitate to deploy Single Sign-On (SSO) due to perceived lack of operational benefits compared to costs. Encouragement for free essential security features and simplifying SSO adoption processes is highlighted.
BeyondCorp (2014)
Google's BeyondCorp approach rethinks enterprise security by moving away from traditional perimeter security to enhance protection in the changing tech environment. Visit the link for more details on this innovative strategy.
No Matter What They Tell You, It's a People Problem (2008)
The article emphasizes the crucial role of people in software development, citing teamwork, communication, and problem-solving skills as key factors for project success. It highlights the importance of job satisfaction and team cohesion, underlining the significance of positive personal relationships within development teams.
Why I Attack
Nicholas Carlini, a computer science professor, focuses on attacking systems due to a passion for solving puzzles. He categorizes vulnerabilities as patchable or unpatchable, stresses responsible disclosure, and highlights the importance of going public to prevent future exploitation.
Part of the issue with software engineering as a field is we keep telling ourselves that university isn’t vocational training, even though we act as if it is. So it’s entirely possible (even likely) that a new grad hasn’t heard of any of the OWASP top 10 vulnerabilities, but they will know how to reverse a linked list. Organizations can harden themselves by making sure everyone has a minimum understanding of the most common threats. If you save a password in plaintext it’s malpractice at this point.
Second thing is you need to have someone in the organization whose job isn’t beholden to expediency. Studies have to pass an ethics review before they’re greenlit. Maybe a product should at least get some eyeballs before it’s implemented.
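On the plaintext-password point above: the baseline alternative is a salted, deliberately slow key-derivation function, so a database leak doesn't hand attackers every credential. A minimal sketch using only Python's standard library (`hashlib.scrypt`); the cost parameters shown are illustrative, not a tuned recommendation:

```python
import hashlib
import hmac
import os

# Illustrative scrypt cost parameters; tune for your hardware.
SCRYPT_PARAMS = dict(n=2**14, r=8, p=1)

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Derive a salted digest. Store the salt and digest, never the password."""
    salt = os.urandom(16)  # unique random salt per password
    digest = hashlib.scrypt(password.encode(), salt=salt, **SCRYPT_PARAMS)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Re-derive with the stored salt; compare in constant time."""
    candidate = hashlib.scrypt(password.encode(), salt=salt, **SCRYPT_PARAMS)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("hunter2")
print(verify_password("hunter2", salt, digest))  # True
print(verify_password("wrong", salt, digest))    # False
```

The per-password salt defeats precomputed rainbow tables, and `hmac.compare_digest` avoids leaking information through comparison timing. In production, a maintained library (e.g. bcrypt or argon2 bindings) that also encodes the parameters into the stored string is the usual choice.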
Quite often, the rare secure systems are built by great product teams who understand, and are passionate about, user data and the security of the product, and who architect the system accordingly.
I'm jaded and think our best hope is that the good guys can find the vulnerabilities before the bad guys. We can't depend on the companies to do the right thing when they have no incentive. There are no negative consequences for companies with bad security currently.
The bad guys get to attack-at-will and the good guys have to beg permission, "please Equifax, you have all my data, may I please test the effectiveness of your security for myself?" They say no, and I'll risk going to prison if I try.
I'm jaded. I think we don't do this because it would embarrass many powerful people. We might also realize many of our institutions are incapable of building secure systems, which would be a national embarrassment.
This is important though. It's no exaggeration to say this is a matter of national security.
Currently we sacrifice national security so companies can avoid being embarrassed.
Culture is "here's how we do things." I suspect that incentives work a lot better if they're reinforcing culture than if they're working against it. Going against cultural practices requires strong incentives, and even then it's hard.
I'm reminded of businesses trying to keep employees from holding doors open for strangers who look the part. This is going against ingrained cultural practices about common courtesy and incentives alone are likely not enough. If it's important enough, you might need a doorman.
Which makes security spending like entertainment spending: when you have extra money, you spend it to make yourself and potentially your customers feel good. When the economy is bad, you lie about your security posture just as you lie about how much you care about the customer in general.
The audits would grade the work against the company's accepted level of security. For those who fall below it, measures are taken; for those above it, there's an unbounded, significant bonus: the stronger the employee's security practices, the higher the bonus.
A company might also set an upper limit for unapproved, revenue-affecting practices, of course.
Everyone does enough to not be accused of gross negligence, but really I have not seen anyone pay more than lip service. And I don't blame them. No matter how much this hurts to say as a security professional.
Stepping back from security, consider where else this kind of problem arises. I don't remember who, but a decade or so back one of the service branch secretaries said something to the effect that he'd rather reduce the rate at which servicemembers rape each other than win wars. Compliance with anti-sexual-harassment policies and contribution to a positive culture became direct bullet points you had to meet on an officer evaluation report to get promoted. Is this right? I don't claim to know, personally, but consider the arguments. Nominally, what a military cares about is winning wars, but how much does what any individual officer does contribute to that? Throughout the Global War on Terror years, we regularly destroyed the entire leadership and fighting apparatus of elements in Iraq and Afghanistan that opposed US strategic goals. But we couldn't forcibly install competent local governments with no loyalty to the prior regimes, and we couldn't eliminate the sympathizers, supporters, and even direct contributors to opposing efforts who took refuge in countries the US military had no authority to operate in. In that sense, winning or losing was outside the scope of what the military could even do. It was only one part of a larger national strategy with many civilian pieces that could also fail.
On the other hand, if they believed that reducing rape rates could promote long-term credibility and public image for the military and it was something that was actually under the military's own full control to do, then it arguably makes sense to place emphasis there.
IT security is kind of like that. Software companies nominally care about profit above all else, or in reality scoring personal profit for their owners, whether or not that is because the company itself ever earns accounting profit. But how on earth do you assess the contribution of any individual developer to that outcome? We're looking here at people being promoted and rewarded based on quickly shipping features that seem to work and address some use case, but is that ultimately even what causes a company's owners to get wealthy? The company with the best software doesn't always win, and the company that wins doesn't even always make its owners the richest.
Like reducing rape in the military, reducing security vulnerabilities in software is more of a social goal than an organizational goal. It's rare that it ultimately matters when companies have very large, very public breaches. It's a cost when it happens, but typically one that is drowned out by other considerations. Many large incumbents effectively can't fail over any short period of time of their own accord. They need to be outcompeted, and simply being more secure is not enough of a differentiator for users to switch. If we want to force companies to give a shit anyway and reward employees based on security, we need to make it matter to the organizations. Or you could better professionalize software development and make attention to security a matter of licensing, the same way you don't leave it up to financial advisors and attorneys to act in the best interests of their clients by organizational reward. You take away their ability to get a job at all if they don't do it, by means outside of the employers themselves.
Ultimately, the military didn't really care about soldiers being raped until public outcry forced them to care. I don't think software companies are going to care, either, until public pressure hits some kind of breaking point, probably involving the force of government.