July 14th, 2024

Six Dumbest Ideas in Computer Security

In computer security, common misconceptions like "Default Permit," "Enumerating Badness," and "Penetrate and Patch" hinder effective protection. Emphasizing a "Default Deny" policy and proactive security design is crucial.


In the realm of computer security, the focus on innovation and new technologies often overshadows fundamental flaws in security practices. The article highlights the six most misguided ideas in computer security, starting with the concept of "Default Permit," where systems allow everything except a few restricted items, leading to vulnerabilities. The second idea, "Enumerating Badness," involves trying to track and block every known threat, which becomes impractical in the face of the vast number of malicious entities on the internet. A third, "Penetrate and Patch," emphasizes fixing vulnerabilities as they are discovered, rather than designing systems with security in mind from the start. These flawed approaches persist in the industry despite their ineffectiveness. The article stresses the importance of adopting a "Default Deny" policy, focusing on known good entities, and designing systems with security as a core principle to enhance overall protection.
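The "Default Permit" vs. "Default Deny" contrast the article draws can be sketched in a few lines. This is a hypothetical illustration only; the port sets and function names are made up, not from the article:

```python
# "Enumerating Badness": block only what we already know is bad.
BLOCKED_PORTS = {23, 135, 445}
# "Enumerating Goodness": permit only what we have explicitly approved.
ALLOWED_PORTS = {22, 80, 443}

def default_permit(port: int) -> bool:
    # Anything not explicitly blocked gets through, including tomorrow's threat.
    return port not in BLOCKED_PORTS

def default_deny(port: int) -> bool:
    # Anything not explicitly allowed is dropped; new services fail closed.
    return port in ALLOWED_PORTS

# A port nobody thought about (say, an exposed database on 5432):
assert default_permit(5432) is True    # slips through the blocklist
assert default_deny(5432) is False     # rejected until someone approves it
```

The asymmetry is the article's whole point: the blocklist must anticipate every threat, while the allowlist only has to describe the finite set of things you actually run.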

Related

Why I Attack

Nicholas Carlini, a computer security researcher, focuses on attacking systems due to a passion for solving puzzles. He categorizes vulnerabilities as patchable or unpatchable, stresses responsible disclosure, and highlights the importance of going public to prevent future exploitation.

Security is not part of most people's jobs

Chris Siebenmann discusses the lack of security priority in workplaces, where job performance overshadows security adherence. Rewards for job skills often neglect security, hindering its importance and feedback mechanisms in organizations.

Windows: Insecure by Design

Ongoing security issues in Microsoft Windows include vulnerabilities like CVE-2024-30080 and CVE-2024-30078, criticized for potential remote code execution. Concerns raised about privacy with Recall feature, Windows 11 setup, and OneDrive integration. Advocacy for Linux desktops due to security and privacy frustrations.

Windows: Insecure by Design

The article discusses ongoing security issues with Microsoft Windows, including recent vulnerabilities exploited by a Chinese hacking group, criticism of continuous patch releases, concerns about privacy invasion with Recall feature, and frustrations with Windows 11 practices. It advocates for considering more secure alternatives like Linux.

The IT Industry is a disaster (2018)

The IT industry faces challenges in IoT and software reliability. Concerns include device trustworthiness, complex systems, and security flaws. Criticisms target coding practices, standards organizations, and propose accountability and skill recognition.

53 comments
By @tptacek - 3 months
We're doing this again, I see.

https://hn.algolia.com/?q=six+dumbest+ideas+in+computer+secu...

You can pick this apart, but the thing I always want to call out is the subtext here about vulnerability research, which Ranum opposed. At the time (the late 90s and early aughts) Marcus Ranum and Bruce Schneier were the intellectual champions of the idea that disclosure of vulnerabilities did more harm than good, and that vendors, not outside researchers, should do all of that work.

Needless to say, that perspective didn't prove out.

It's interesting that you could bundle up external full-disclosure vulnerability research under the aegis of "hacking" in 2002, but you couldn't do that at all today: all of the Big Four academic conferences on security (and, obviously, all the cryptography literature, though that was true at the time too) host offensive research today.

By @lobsang - 3 months
Maybe I missed it, but I was surprised there was no mention of passwords.

Mandatory password composition rules (excluding minimum length) and forced password rotation, as well as all attempts at "replacing passwords," are inherently dumb in my opinion.

The former have obvious consequences (people writing passwords down, reusing the same passwords, appending a 1), leading to the latter, which have horrible, confusing UX (no, I don't want to have my phone or a random token generator on me any time I try to do something) and fall back to passwords anyway.

Please just let me choose a password of greater than X length, containing or not containing any characters I choose. That way I can actually remember it when I'm not using my phone/computer, in a foreign country, etc.
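The commenter's preference for length over composition rules is easy to quantify. A rough sketch, where the alphabet size and word-list size are illustrative assumptions (95 printable ASCII characters; a 7776-word Diceware-style list):

```python
import math

def entropy_bits(alphabet_size: int, length: int) -> float:
    """Bits of entropy for a string chosen uniformly at random."""
    return length * math.log2(alphabet_size)

# 8 characters drawn from ~95 printable ASCII symbols
# (the typical "upper, lower, digit, symbol" composition rule):
complex_short = entropy_bits(95, 8)    # ~52.6 bits

# A 5-word passphrase from a 7776-word list: longer, but memorable:
passphrase = entropy_bits(7776, 5)     # ~64.6 bits

assert passphrase > complex_short
```

Under these assumptions the memorable passphrase carries roughly 12 more bits of entropy, i.e. about 4000 times more guesses for an attacker, which is the usual argument for minimum length over character-class rules.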

By @kstrauser - 3 months
Hacking is cool. Well, gaining access to someone else's data and systems is not. Learning a system you own so thoroughly that you can find ways to make it misbehave to benefit you is. Picking your neighbor's door lock is uncool. Picking your own is cool. Manipulating a remote computer to give yourself access you shouldn't have is uncool. Manipulating your own to let you do things you're not supposed to be able to is cool.

That exploration of the edges of possibility is what moves the world ahead. I doubt there's ever been a successful human society that praised staying inside the box.

By @Hendrikto - 3 months
This is full of very bad takes.

> I know other networks that it is, literally, pointless to "penetration test" because they were designed from the ground up to be permeable only in certain directions and only to certain traffic destined to carefully configured servers running carefully secured software.

"I don't need to test, because I designed, implemented, and configured my system carefully." might be the actual worst security take I ever heard.

> […] hacking is a social problem. It's not a technology problem, at all.

This is security by obscurity. Also it‘s not always social. Take corporate espionage and nation states for example.

By @CM30 - 3 months
I think the main problem is that there's usually an unfortunate trade off between usability and security, and most of the issues mentioned as dumb ideas here come from trying to make the system less frustrating for your average user at the expense of security.

For example, default allow is terrible for security, and the cause of many issues in Windows... but many users don't like the idea of having to explicitly permit every new program they install. Heck, when Microsoft added that confirmation, many considered it terrible design that made the software way more annoying to use.

'Default Permit', 'Enumerating Badness' and 'Penetrate and Patch' are all, unfortunately, defaults because of this. Because people would rather make it easier/more convenient to use their computer/write software than do what would be best for security.

Personally I'd say that passwords in general are probably one of the dumbest ideas in security though. Like, the very definition of a good password likely means something that's hard to remember, hard to enter on devices without a proper keyboard, and generally inconvenient for the user in almost every way. Is it any wonder that most people pick extremely weak passwords, reuse them for most sites and apps, etc?

But there's no real alternative sadly. Sending links to email means that anyone with access to that compromises everything, though password resets usually mean the same thing anyway. Physical devices for authentication mean the user can't log in from places outside of home that they might want to login from, or they have to carry another trinket around everywhere. And virtually everything requires good opsec, which 99.9% of the population don't really give a toss about...

By @moring - 3 months
> Think about it for a couple of minutes: teaching yourself a bunch of exploits and how to use them means you're investing your time in learning a bunch of tools and techniques that are going to go stale as soon as everyone has patched that particular hole.

No, it means that you learn practical aspects alongside theory, and that's very useful.

By @Zak - 3 months
I'd drop "hacking is cool" from this list and add "trusting the client".

I've seen an increase in attempts to trust the client lately, from mobile apps demanding proof the OS is unmodified to Google's recent attempt to add similar DRM to the web. If your network security model relies on trusting client software, it is broken.
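The server-side alternative to trusting the client can be sketched with a hypothetical HMAC-signed token (names and the secret are made up for illustration): the server derives and verifies everything itself, so a modified client can't forge a valid credential.

```python
import hashlib
import hmac

# Hypothetical server-side secret; it never ships to the client.
SERVER_SECRET = b"example-secret-keep-off-the-client"

def issue_token(user: str) -> str:
    """Server mints a token binding the username to a signature."""
    sig = hmac.new(SERVER_SECRET, user.encode(), hashlib.sha256).hexdigest()
    return f"{user}:{sig}"

def verify_token(token: str) -> bool:
    """Server recomputes the signature; the client's claim alone proves nothing."""
    user, _, sig = token.partition(":")
    expected = hmac.new(SERVER_SECRET, user.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)  # constant-time comparison

token = issue_token("alice")
assert verify_token(token)
assert not verify_token("alice:forged-signature")
```

The point matches the comment: whatever attestation the client offers, the only checks that count are the ones the server can perform without believing the client.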

By @munchausen42 - 3 months
About "Default Deny": "It's not much harder to do than 'Default Permit,' but you'll sleep much better at night."

Great that you, the IT security person, sleeps much better at night. Meanwhile, the rest of the company is super annoyed because nothing ever works without three extra rounds with the IT department. And, btw., the more annoyed people are, the more likely they are to use workarounds that undermine your IT security concept (e.g., think of the typical 'password1', 'password2', 'password3' passwords when you force users to change their password every month).

So no, good IT security does not just mean unplugging the network cable. Good IT security is invisible and unobtrusive for your users, like magic :)

By @trey-jones - 3 months
Most security-oriented articles are written by extremely security-minded people. These people in my experience ignore the difficulties that a purely security-oriented approach imposes on users of the secure software. I always present security as a sliding scale. On one end "Secure", and on the other "Convenient". Purely Secure software design will almost never have any users (because it's too inconvenient), and purely Convenient software design will ultimately end up the same (because it's not secure enough).

That said, this is a good read for the most part. I heavily, heavily disagree with the notion that trying to write exploits or learn to exploit certain systems as a security professional is dumb (under "Hacking is C00L"). I learned more about security by studying vulnerabilities and exploits and trying to implement my own (white hat!) than I ever did by "studying secure design". As they say, "It takes one to know one." or something.

By @billy99k - 3 months
This is a mostly terrible 19-year-old list.

Here is an example:

"Your software and systems should be secure by design and should have been designed with flaw-handling in mind"

Translation: If we lived in a perfect world, everything would be secure from the start.

This will never happen, so we need to utilize the find and patch technique, which has worked well for the companies that actually patch the vulnerabilities that were found and learn from their mistakes for future coding practices.

The other problem is that most systems are not static. It's not release a secure system and never update it again. Most applications/systems are updated frequently, which means new vulnerabilities will be introduced.

By @teleforce - 3 months
This article is quite old and has been submitted probably every year since it was published, with past submissions running to multiple pages.

For modern version and systematic treatment of the subject check out this book by Spafford:

Cybersecurity Myths and Misconceptions: Avoiding the Hazards and Pitfalls that Derail:

https://www.pearson.com/en-us/subject-catalog/p/cybersecurit...

By @dale_glass - 3 months
"We're Not a Target" isn't a minor dumb. It's the standpoint of every non-technical person I've ever met. "All I do with my computer is to read cooking recipes and upload cat photos. Who'd want to break in? I'm boring."

The best way I found to change their mind is to make a car analogy. Who'd want to steal your car? Any criminal with a use for it. Why? Because any car is valuable in itself. It can be sold for money. It can be used as a getaway vehicle. It can be used to crash into a jewelry shop. It can be used for a joy ride. It can be used to transport drugs. It can be used to kill somebody.

A criminal stealing a car isn't hoping that there are Pentagon secrets in the glove box. They have a use for the car itself. In the same way, somebody breaking into your computer has uses for the computer itself. They won't say no to finding something valuable, but it's by no means a requirement.

By @Arch-TK - 3 months
> #3) Penetrate and Patch

This is one of the reasons why I feel my job in security is so unfulfilling.

Almost nobody I work with really cares about getting it right to begin with, designing comprehensive test suites to fuzz or outright prove that things are secure, using designs which rule out the possibility of error.

You get asked: please look at this gigantic piece of software, maybe you get the source code, maybe it's written in Java or C#. Either way, you look at <1% of it, you either find something seriously wrong or you don't[0], you report your findings, maybe the vendor fixes it. Or the vendor doesn't care and the business soliciting the test after purchasing the software from the vendor just accepts the risk, maybe puts in a tissue paper mitigation.

This approach seems so pointless that it's difficult to bother sometimes.

edit:

> #4) Hacking is Cool

I think it's good to split unlawful access from security consultancy.

You don't learn nearly as much about how to secure a system if you work solely from the point of view of an engineer designing a system to be secure. You can get much better insight into how to design a secure system if you try to break in. Thinking like a bad actor, learning how exploitation works, etc. These are all things which strictly help.

[0]: It's crazy how often I find bare PKCS#7 padded AES in CBC mode. Bonus points if you either use a "passphrase" directly, or hash it with some bare hash algorithm before using various lengths of the hash for both the key and IV. Extra bonus points if you hard code a "default" password/key and then never override this in the codebase.
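The footnote's anti-pattern can be sketched with stdlib hashlib alone. This is an illustration, not a recipe: the names, password, and iteration count are assumptions, and real code would use a vetted library and an authenticated (AEAD) mode rather than bare CBC.

```python
import hashlib
import os

password = b"correct horse"

# The anti-pattern from the footnote: one bare, unsalted hash of the
# passphrase, sliced up to serve as both AES key and IV. Deterministic,
# identical across installs, and trivially brute-forced offline.
digest = hashlib.sha256(password).digest()
weak_key, weak_iv = digest[:16], digest[16:32]

# A salted, iterated KDF (stdlib PBKDF2) makes offline guessing costly
# and per-deployment unique; a per-message IV would come from
# os.urandom(16), never from the password.
salt = os.urandom(16)
strong_key = hashlib.pbkdf2_hmac("sha256", password, salt, 600_000, dklen=32)

# Same password always yields the same weak key; the salted derivation
# differs every time the salt does.
assert hashlib.sha256(password).digest()[:16] == weak_key
```

The "bonus points" cases in the footnote are exactly the weak path above, sometimes with MD5 in place of SHA-256 and a hard-coded default password on top.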

By @cwbrandsma - 3 months
Penetrate and Patch: because if it doesn't work the first time, then throw everything away, fire the developers, hire new ones, and start over completely.
By @oschvr - 3 months
> "If you're a security practitioner, teaching yourself how to hack is also part of the "Hacking is Cool" dumb idea."

lol'd at the irony of the fact that this was posted here, Hacker News...

By @nottorp - 3 months
> #4) Hacking is Cool

Hacking is cool. Why the security theater industry has appropriated "hacking" to mean accessing other people's systems without authorization, I don't know.

By @dingody - 3 months
I don’t entirely agree with the author’s viewpoint on “Hacking is Cool.” There was a time when I thought similarly, believing that “finding some system vulnerabilities is just like helping those system programmers find bugs, and I can never be better than those programmers.” However, I gradually rejected this idea. The appeal of cybersecurity lies in its “breadth” rather than its “depth.” In a specific area, a hacker might never be as proficient as a programmer, but hackers often possess a broad knowledge base across various fields.

A web security researcher might simultaneously discover vulnerabilities in “PHP, JSP, Servlet, ASP.NET, IIS, and Tomcat.” A binary researcher might have knowledge of “Windows, Android, and iOS.” A network protocol researcher might be well-versed in “TCP/IP, HTTP, and FTP” protocols. More likely, a hacker often masters all these basic knowledge areas.

So, what I want to say is that the key is not the technology itself, nor the nitty-gritty of offense and defense in some vulnerabilities, but rather the ability to use the wide range of knowledge and the hacker’s reverse thinking to challenge seemingly sound systems. This is what our society needs and it is immensely enjoyable.

By @esjeon - 3 months
> Educating Users

isn't dumb, because "users" have proven to be the weakest link in the whole security chain. Users must be aware of workplace security, just as they should be trained in workplace safety.

Also, there's no security to deal with if the system is unusable by its users. The trade-off between usability and security is simply unsolvable, and education is a patch for that problem.

By @iandanforth - 3 months
And yet the consequence of letting people like this run your security org is that it takes a JIRA ticket and multiple days, weeks, or forever to be able to install 'unapproved' software on your laptop.

Then if you've got the software you need to do your job you're stuck in endless cycles of "pause and think" trying to create the mythical "secure by design" software which does not exist. And then you get hacked anyway because someone got an email (with no attachments) telling them to call the CISO right away, who then helpfully walks them through a security "upgrade" on their machine.

Caveats: Yes there is a balance and log anomaly detection followed by actual human inspection is a good idea!

By @tonnydourado - 3 months
I've seen "Penetrate and Patch" play out a lot in software development in general. When a new requirement shows up, or technical debt starts to grow, or performance issues appear, the first instinct of a lot of people is to try and find the smallest, easiest possible change to achieve the immediate goal, and just move on to the next user story.

That's not a bad instinct by itself, but when it's your only approach, it leads to a snowball of problems. Sometimes you have to question the assumptions, to take a step back and try to redesign things, or the new addition just won't fit, and the system just becomes wonkier and wonkier.

By @woodruffw - 3 months
Some of this has aged pretty poorly -- "hacking is cool" has, in fact, largely worked out for the US's security community.
By @gnabgib - 3 months
(2015) Previous discussions:

2015 (114 points, 56 comments) https://news.ycombinator.com/item?id=8827985

2023 (265 points, 202 comments) https://news.ycombinator.com/item?id=34513806

By @AtlasBarfed - 3 months
#1) Great, least privilege, oh wait, HOW LONG DOES IT TAKE TO OPEN A PORT? HOW MANY APPROVALS? HOW MANY FORMS?

Least privilege never talks about how the security people in charge of granting back permissions do their jobs at an absolute sloth pace.

#2) Ok, what happens when goodness becomes badness, via exploits or internal attacks? How do you know when a good guy becomes corrupted without some enumeration of the behavior of infections?

#3) Is he arguing to NOT patch?

#4) Hacking will continue to be cool as long as modern corporations and governments are oppressive, controlling, invasive, and exploitative. It's why Hollywood loves the Mafia.

#5) ok, correct, people are hard to train at things they really don't care about. But "educating users", if you squint is "organizational compliance". You know who LOVES compliance checklists? Security folks.

#6) Apparently, there are good solutions in corporate IT, and all new ones are bad.

I'll state it once again: my #1 recommendation to "security" people is PROVIDE SOLUTIONS. Parachuting in with compliance checklists is stupid. PROVIDE THE SOLUTION.

But security people don't want to provide solutions, because they are then REALLY screwed when inevitably the provided solution gets hacked. It's way better to have some endless checklist and shrug if the "other" engineers mess up the security aspects.

And by PROVIDE SOLUTIONS I don't mean "offer the one solution for problem x (keystore, password management), and say fuck you if someone has a legitimate issue with the system you picked". If you can't provide solutions to various needs of people in the org, you are failing.

Corporate Security people don't want to ENGINEER things, again they just want to make compliance powerpoints to C-suite execs and hang out in their offices.

By @dasil003 - 3 months
I was 5 years into a professional software career when this was written, at this point I suspect I'm about the age of the author at the time of its writing. It's fascinating to read this now and recognize the wisdom coming from experience honed in the 90s and the explosion of the internet, but also the cultural gap from the web/mobile generation, and how experience doesn't always translate to new contexts.

For instance, the first bad idea, Default Permit, is clearly bad in the realm of networking. I might quibble a bit and suggest Default Permit isn't so much an idea as the natural state of things when one invents computer networking. But clearly Default Deny was a very good idea, and a critical one, necessary for the internet's growth. It makes a lot of sense in the context of global networking, but it's not quite as powerful in other security contexts. For instance, SELinux has never really taken off, largely because it's a colossal pain in the ass and the threat models don't typically justify the overhead.

The other bad idea that stands out is "Action is Better Than Inaction". I think this one shows a very strong big company / enterprise bias more than anything else—of course when you are big you have more to lose and should value prudence. And yeah, good security in general is not based on shiny things, so I don't totally fault the author. That said though, there's a reason that modern software companies tout principles like "bias for action" or "move fast and break things"—because software is malleable and as the entire world population shifted to carrying a smartphone on their person at all times, there was a huge land grab opportunity that was won by those who could move quickly enough to capitalize on it. Granted, this created a lot of security risk and problems along the way, but in that type of environment, adopting a "wait-and-see" attitude can also be an existential threat to a company. At the end of the day though, I don't think there's any rule of thumb for whether action vs inaction is better, each decision must be made in context, and security is only one consideration of any given choice.

By @dvfjsdhgfv - 3 months
> The cure for "Enumerating Badness" is, of course, "Enumerating Goodness." Amazingly, there is virtually no support in operating systems for such software-level controls.

Really? SELinux and AppArmor have existed since, I don't know, the late nineties? The problem is not that these controls don't exist, it's that they make using your system much, much harder. You will probably spend some time "teaching" them first, then actually enable them, and still fight with them every time you install something or make other changes to your system.

By @jeffrallen - 3 months
mjr (as I always knew him from mailing lists and whatnot) seems to have given up on security and enjoys forging metal instead now.

> Somewhere in there, security became a suppurating chest wound and put us all on the treadmill of infinite patches and massive downloads. I fought in those trenches for 30 years – as often against my customers (“no you should not put a fork in a light socket. Oh, ow, that looks painful. Here, let me put some boo boo cream on it so you can do it again as soon as I leave.”) as for them. It was interesting and lucrative and I hope I helped a little bit, but I’m afraid I accomplished relatively nothing.

Smart guy, hope he enjoys his retirement.

By @ricktdotorg - 3 months
it's 2024! if you run your own infrastructure in your own DC and your defaults are NOT:

- to heavily VLAN via load type/department/importance/whatever your org prefers

- default denying everything except common infra like DNS/NTP/maybe ICMP/maybe proxy arp/etc between those VLANs

- every proto/port hole poked through is a security-reviewed request

then you are doing it wrong.

"ahh but these ACL request reviews take too long and slow down our devs" -- fix the review process, it can be done.

spend the time on speeding up the security review, not on increasing your infrastructure's attack surface.

By @voidUpdate - 3 months
I'm not sure if I'm completely misunderstanding #4 or if it's wrong. Pentesting an application is absolutely a good idea, and it's not about "teaching yourself a bunch of exploits and how to use them," in the same way programming isn't just "learning a bunch of sorting algorithms and how to use them." It's about knowing why an exploit works, and how it can be adapted to attack something else and find a new vulnerability; then it goes back to the programming side of working out why that vulnerability exists and how to fix it.
By @bawolff - 3 months
> If you're a security practitioner, teaching yourself how to hack is also part of the "Hacking is Cool" dumb idea. Think about it for a couple of minutes: teaching yourself a bunch of exploits and how to use them means you're investing your time in learning a bunch of tools and techniques that are going to go stale as soon as everyone has patched that particular hole.

I would strongly disagree with that.

You can't defend against something you don't understand.

You definitely shouldn't spend time learning some script-kiddie tool, that is pointless. You should understand how exploits work from first principles. The principles mostly won't change or at least not very fast, and you need to understand how they work to make systems resistant to them.

One of the worst ideas in computer security in my mind is cargo culting - where people just mindlessly repeat practises thinking it will improve security. Sometimes they don't work because they have been taken out of their original context. Other times they never made sense in the first place. Understanding how exploits work stops this.

By @mrbluecoat - 3 months
> sometime around 1992 the amount of Badness in the Internet began to vastly outweigh the amount of Goodness

I'd be interested in seeing a source for this. Feels a bit like anecdotal hyperbole.

By @motohagiography - 3 months
what has changed since 2005 is that these ideas are no longer dumb, but describe the factors of the dynamic security teams have to manage now. previously, security was an engineering and gating problem when systems were less interdependent and complex, but now it's a policing and management problem where there is a level of pervasive risk that you find ways to extract value from.

I would be interested in whether he still thinks these are true, as if you are doing security today, you are doing exactly these things.

- educating users: absolutely the most effective and highest return tool available.

- default permit: there are almost no problems you can't grow your way out of. there are zero startups, or even companies, that have been killed by breaches.

- enumerating badness: when managing a dynamic, you need measurements. there is never zero badness, that's what makes it valuable. the change in badness over time is a proxy for the performance of your organization.

- penetrate and patch: having that talent on your team yields irreplaceable experience. the only reason most programmers know about stacks and heaps today is from smashing them.

- hacking is cool: 30 years later, what is cooler, hacking or marcus?

By @amelius - 3 months
> "Let's go production with it now and we can secure it later" - no, you won't. A better question to ask yourself is "If we don't have time to do it correctly now, will we have time to do it over once it's broken?"

I guess the idea is generally that if we go in production now we will make profits, and with those profits we can scale and hire real security folks (which may or may not happen).

By @mikewarot - 3 months
My "security flavor of the month" is almost universally ignored... Capability Based Security/Multilevel Secure Computing. If it's not ignored, it's misunderstood.

It's NOT the UAC we all grew to hate with Windows 8, et al. It's NOT the horrible mode toggles present on our smartphones. It's NOT AppArmor.

I'm still hoping that Genode (or HURD) makes something I can run as my daily driver before I die.

By @cratermoon - 3 months
The real dumbest idea in computer security is "we'll add security later, after the basic functionality is complete"
By @jrm4 - 3 months
Missed the most important.

"You can have meaningful security without skin-in-the-game."

This is literally the beginning and end of the problem.

By @lsb - 3 months
Default permit, enumerating badness, penetrate and patch, hacking is cool, educating users, action is better than inaction
By @kazinator - 3 months
Penetrate and Patch is a useful exercise, because it lets the IT security team deliver some result and show they have value, in times when nothing bad is happening and everyone forgets they exist.
By @jibe - 3 months
"We're Not a Target" deserves promotion to major.
By @kuharich - 3 months
By @Jean-Papoulos - 3 months
> As a younger generation of workers moves into the workforce, they will come pre-installed with a healthy skepticism about phishing and social engineering.

hahahahahaha

By @al2o3cr - 3 months

> My guess is that this will extend to knowing not to open weird attachments from strangers.

I've got bad news for ya, 2005... :P
By @jojobas - 3 months
> My prediction is that the "Hacking is Cool" dumb idea will be a dead idea in the next 10 years.

19 years later, hacking is still cool.

By @janalsncm - 3 months
> hacking is cool

Hacking will always be cool now that there's an entire aesthetic around it: The Matrix, Mr. Robot, even The Social Network.

By @rkagerer - 3 months
#7 Authentication via SMS
By @uconnectlol - 3 months
> Please wait while your request is being verified...

Speaking of snakeoil

By @michaelmrose - 3 months
> Educating Users

This actually DOES work; it just doesn't make you immune to trouble, any more than a flu vaccine means nobody gets the flu. If you drill it into people's heads and coach people who make mistakes, you can decrease the number of people who do dumb, company-destroying things by specifically enumerating the exact things they shouldn't do.

It just can't be your sole line of defense. For instance, if Fred gives his creds out for a candy bar, 2FA keeps those creds from working, and you educate and/or fire Fred, then not only did your second line of defense succeed, your first one is now stronger without Fred.

By @tracerbulletx - 3 months
Hacking is cool.
By @amelius - 3 months
Also needs mention:

- Having an OS that treats users as "suspicious" but applications as "safe".

(I.e., all Unix-like systems)

By @grahar64 - 3 months
"Educating users ... If it worked, it would have worked by now"
By @chha - 3 months
> If you're a security practitioner, teaching yourself how to hack is also part of the "Hacking is Cool" dumb idea. Think about it for a couple of minutes: teaching yourself a bunch of exploits and how to use them means you're investing your time in learning a bunch of tools and techniques that are going to go stale as soon as everyone has patched that particular hole.

If only this was true... Injection has been on Owasp Top 10 since its inception, and is unlikely to go away anytime soon. Learning some techniques can be useful just to do quick assessments of basic attack vectors, and to really understand how you can protect yourself.
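The injection class this comment mentions is easy to demonstrate with a throwaway in-memory database. A hypothetical sketch (table, rows, and payload are made up) showing why the technique hasn't gone stale:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def lookup_unsafe(name: str):
    # Classic injection: attacker-controlled input spliced into the SQL text.
    return conn.execute(
        f"SELECT secret FROM users WHERE name = '{name}'"
    ).fetchall()

def lookup_safe(name: str):
    # Parameterized query: the input is treated as data, never as code.
    return conn.execute(
        "SELECT secret FROM users WHERE name = ?", (name,)
    ).fetchall()

payload = "' OR '1'='1"
assert lookup_unsafe(payload) == [("s3cret",)]  # WHERE clause always true: dumps rows
assert lookup_safe(payload) == []               # no user literally named that
```

Knowing the payload is the cheap part; the durable knowledge is the distinction between the two functions, which is exactly the "understand why it works" point being made.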

By @umanghere - 3 months
> 4) Hacking is Cool

Pardon my French, but this is the dumbest thing I have read all week. You simply cannot work on defensive techniques without understanding offensive techniques - plainly put, good luck developing exploit mitigations without having ever written or understood an exploit yourself. That’s how you get a slew of mitigations and security strategy that have questionable, if not negative value.