November 24th, 2024

The Two Factions of C++

C++ faces internal conflict between modern tech companies and legacy systems, with concerns over its evolution, safety issues raised by the U.S. government, and a growing shift towards Rust.


The C++ programming language is currently facing significant internal conflict, characterized by two distinct factions: modern tech companies advocating for evolution and legacy systems resistant to change. The C++ Evolution Working Group has recently agreed on principles that prioritize backward compatibility and avoid breaking changes, which some view as a regression in the language's development. Meanwhile, major tech firms like Google and Microsoft are increasingly adopting Rust, a language perceived as safer and more modern, leading to concerns about C++'s future viability. The U.S. government has also issued warnings against using memory-unsafe languages like C++, further complicating its standing in the industry.

The divide between modern C++ users, who leverage advanced tooling and automated refactoring, and those reliant on outdated systems is growing. This rift is exacerbated by the C++ committee's focus on maintaining legacy compatibility, which hinders the introduction of innovative features. As a result, many in the community are questioning the language's ability to adapt to contemporary programming needs, with some suggesting that the dream of a unified, dialect-free C++ is no longer feasible.

- C++ is experiencing a divide between modern tech companies and legacy systems.

- The C++ committee prioritizes backward compatibility, limiting language evolution.

- Major tech firms are shifting towards Rust for its safety and modern features.

- The U.S. government has issued warnings against using C++ due to safety concerns.

- The future of C++ is uncertain as the community grapples with its ability to adapt.

AI: What people are saying
The discussion surrounding the article on C++ reveals several key themes and concerns among commenters.
  • There is a divide between modern tech companies and legacy systems, with many expressing frustration over C++'s evolution and tooling.
  • Some commenters advocate for the adoption of Rust as a safer alternative, while others defend the current trajectory of C++.
  • Concerns about the lack of standardized tooling and package management in C++ are prevalent, with many suggesting this hampers its usability.
  • Commenters highlight the importance of automated testing and modern development practices in maintaining and evolving C++ codebases.
  • The departure of influential figures like Herb Sutter from Microsoft raises worries about the future direction of C++ standards and community engagement.
51 comments
By @saurik - 5 months
I compile a lot of C++ code from a lot of places, and the only time I run into code that somehow simply doesn't work on newer versions of C++ and where the developers aren't even sure if they will accept any patches to fix the issue as they claim it "isn't supported" to use a newer version of C++--even for the public headers of a library--is, you guessed it: code from Google.

Meanwhile, most of the C++ code from Google seems to be written in some mishmash of different ideas, always at some halfway point along a migration between something ancient and something passable... but never anything I would ever dare to call "modern", and thereby tends to be riddled with state machines and manual weak pointers that lead to memory corruption.

So... I really am not sure I buy the entire premise of this article? Honestly, I am extremely glad that Google is finally leaving the ecosystem, as I generally do not enjoy it when Google engineers try to force their ridiculous use cases down peoples' throats, as they seem to believe they simply know better than everyone else how to develop software.

Like... I honestly feel bad for the Rust people, as I do not think the increasing attention they are going to get from Google is going to be at all positive for that ecosystem, any more than I think the massive pressure Google has exerted on the web has been positive or any more than the pressure Google even exerted on Python was positive (not that Python caved to much of it, but the pressure was on and the fact that Python refused to play ball with Google was in no small part what caused Go to exist at all).

(FWIW, I do miss Microsoft's being in the space, but they honestly left years ago -- Herb's presence until recently being kind of a token consideration -- as they have been trying to figure out a tactical exit from C++ ever since Visual J++ and, arguably, Visual Basic, having largely managed to pivot to C# and TypeScript for SDKs long ago. That said... Sun kicking Microsoft out of Java might have been really smart, despite the ramifications?)

By @bagxrvxpepzn - 5 months
To the people who work on C++ standards: I approve of the current C++ trajectory and please ignore all of the online noise about "the future of C++." To anyone that disagrees severely with the C++ trajectory as stated, please just consider another language, e.g. Rust. I don't want static lifetime checking in C++ and if you want static lifetime checking, please use Rust. I am not a government contractor, if you are a government contractor who must meet bureaucratic risk-averse government requirements, please use Rust. I have an existing development process that works for me and my customers, I have no significant demand for lifetime checking. If your development process is shiny and new and necessitates lifetime checking, then please use Rust. To Rust advocates, you can have the US government and big tech. You can even have Linux. Just leave my existing C++ process alone. It works and the trade offs we have chosen efficiently accomplish our goals.
By @adambatkin - 5 months
Something that Rust got _really_ right: Editions. And not just that they exist, but that they are specified per module, and you can mix and match modules with different Editions within a bigger project. This lets a language make backwards incompatible changes, and projects can adopt the new features piecemeal.

If such a thing came to C++, there would obviously be limitations around module boundaries, when different modules used a different Edition. But perhaps this could be a way forward that could allow both camps to have their cake and eat it too.

Imagine a world where the main difference between Python 2 and 3 was the frontend syntax parser, and each module could specify which syntax ("Edition") it used...

By @pornel - 5 months
There will eventually be only one faction left using C++ — the legacy too-big-to-refactor one.

The other faction, which has lost faith in WG21 and wants a newer, safer, more nimble language with powerful tooling, is already heading for the exits.

Herb has even directly said that adding lifetime annotations to C++ would create "an off-ramp from C++"[1] to the other languages — and he's right, painful C++ interop is the primary thing slowing down adoption of Rust for new code in mixed codebases.

[1]: https://www.open-std.org/jtc1/sc22/wg21/docs/papers/2024/p34...

By @fefe23 - 5 months
Oh no! Herb Sutter is leaving Microsoft?!

That does not bode well for Microsoft. At least from the outside perspective it looks like he was the adult in the room, the driving force behind standards adoption and even trying to steer C++-the-language towards a better vision of the future.

If he is gone, MSVC will again be the unloved bastard child it has long been before Herb's efforts started to pay off. This is very disheartening news.

I'm happy he held out for this long even though he was being stonewalled every step of the way, like when Microsoft proposed std::span and it was adopted but minus the range checking (which was the whole point of std::span).

Now he has been pushing for a C++ preprocessor. Consider how desperate you have to be to even consider that as a potential solution for naysayers blocking your every move.

By @danpalmer - 5 months
Python similarly has 2-3 factions in my experience: teams doing engineering in Python and using all the modern tooling, linting, packaging, types, testing, etc.; teams doing data science and using modern but different tooling (e.g. Anaconda); and teams that don't get on board with any of the language health initiatives and are on unsupported language versions with no packaging, tooling, linting, etc.

Javascript/Node/Typescript has even more identifiable factions.

I think developing factions around these things is unfortunately normal as languages grow up and get used in different ways. Rust has arguably tried to stay away from this, but the flip side is a higher learning curve because it just doesn't let certain factions exist. Go is probably the best attempt to prevent factions and gain wide adoption, but even then the generics crowd forced the language to adopt them.

By @torginus - 5 months
I have been saying this for more than a decade, but the number one thing that killed C++ as an attractive modern language is (the lack of) modules. The ability to include C++ code and libraries from others (perhaps with transitive dependencies) would allow an actual community of devs and companies to spring up around the language.

Instead we have greybeards and lone warriors, and million-line legacy codebases, half of which have their own idea on what a string or a thread is.

By @hypeatei - 5 months
One thing I cannot stand about C++ is the fractured nature of everything. Compilers, build tools, package management, etc... It feels like you need to be a wizard just to get a project compiling and start writing some code.
By @munificent - 5 months
I really really like this article. I think the two camps the author describes very much reflect my experience over the past couple of decades at a dotcom startup, then a game developer, and now at Google.

However, I think the author is a little off on the root cause. They emphasize tooling: the ability to build reliably and cleanly from source. That's a piece of it, but a relatively small piece.

I think the real distinguishing factor between the two camps is automated testing. The author mentions testing a couple of times, but I want to emphasize how critical that is.

If you don't have a comprehensive set of test suites that you are willing to rely on when making code changes, then your source code is a black box. It doesn't matter if you have the world's greatest automated refactoring tools that output the most beautiful looking code changes. If you don't have automated tests to validate that the change doesn't break an app and cost the company money, you won't be able to land it.

Working on a "legacy C++ app" (like, for example, Madden NFL back when I was at EA) was like working on a giant black box. You could fairly confidently add new features and new code onto the side. But if you wanted to touch existing code, you needed a very compelling reason to do so in order to outweigh the risk of breaking something unexpectedly. Without automated tests, there was simply no reliable way to determine if a change caused a regression.

And, because C++ is C++, even entirely harmless seeming code changes can cause regressions. Once you've got things like reinterpret_cast<>, damn near any change can break damn near anything else.

So people working in these codebases behave sort of like surgeons with a "do no harm" philosophy. They touch as little as possible, as non-invasively as possible. Otherwise, the risk of harming the patient is too high.

It's a miserable way to program long-term. But it's really hard to get out of that mess once you're in it. It takes a monumental amount of political capital from engineering leadership to build a strong testing culture, re-architect a codebase to be testable, and write all the tests.

A lot of C++ committee changes aimed at legacy C++ developers are about "how can we help these people that are already in a mess survive?" That's a very different problem than asking, "Given a healthy, tested codebase, how can we make developers working in it go faster?"

By @mgaunard - 5 months
If you're comparing Herb Sutter and the Google people at the standard committee, there is one thing that was clear: Herb was good at getting people to agree on compromises that served everybody, while Google was mostly claiming they knew better than everybody else and pushing their own agenda.
By @AlotOfReading - 5 months
Profiles aren't a mess because they're intended for legacy codebases instead of big tech monorepos. They're a mess because they're not a serious effort. There's no actual vision of what problems they're trying to solve or what the use cases are, or even what kind of guarantee profiles are going to make.
By @shultays - 5 months

  “We must minimize the need to change existing code. For adoption in existing code, decades of experience has consistently shown that most customers with large code bases cannot and will not change even 1% of their lines of code in order to satisfy strictness rules, not even for safety reasons unless regulatory requirements compel them to do so.” – Herb Sutter

  with large code bases cannot and will not change even 1% of their lines of code in order to satisfy strictness rules
Do people really say this? Voice this in committee? I have been at a few companies, including one fairly large one, and all were happy to upgrade to newer standards, looked forward to it, and already spend a lot of time updating their build systems. Changing 1% of the code on top of that is probably not that much in comparison.
By @Mike4Online - 5 months
C++ is not just C++ but also the C preprocessor, the STL, the linker, the C libraries and SDKs you can't help but depend on, the build system, the build scripts, the package manager, the IDEs and IDE add-ons, the various quirks on various platforms, etc. That's on top of knowing the code base of your application.

Being really good at C++ almost demands that you surrender entire lobes of your brain to mastering the language. It is too demanding and too dehumanizing. Developers need a language and a complete tool chain that is designed as a cohesive whole, with as little implicit behavior, special cases and clever tricks as possible. Simple and straightforward. Performance tweaks, memory optimizations and anything else that is not straightforward should be done exclusively by the compiler, i.e., we should be leveraging computers to do what they do best, freeing our attention so we can focus on the next nifty feature we're adding.

Zig is trying to do much of this, and it is a huge undertaking. I think an even bigger undertaking than what Zig is attempting is needed. The new "language" would also include a sophisticated IDE/compiler/static-analyzer/AI-advisor/Unit-Test-Generator that could detect and block the vast majority of memory safety errors, data races and other difficult bugs, and reveal such issues as the code is being written. The tool chain would be sophisticated enough to handle the cognitive load rather than force the developer to bear that burden.

By @PittleyDunkin - 5 months
> Nimble, modern, highly capable tech corporations that understand that their code is an asset. (This isn’t strictly big tech. Any sane greenfield C++ startup will also fall into this category.)

Oh I see, this is a fantasy.

By @seanhunter - 5 months
Ports of massive legacy codebases are possible and they happen. They can be extremely difficult, and they take will and effort, but they can get done. The idea that you have to slow down the development of the language standard for people who won't port to the new version is weird: those people won't be updating their compilers anyway.

How do I know this? I migrated a codebase of about 20m lines of C++ at a major investment bank from pre-ansi compilers to ansi conformance across 3 platforms (Linux, Solaris and Windows). Not all the code ran on all 3 platforms (I'm looking at you, Solaris) but the vast majority did. Some of it was 20 years old before I touched it - we're talking pre-STL not even just pre ansi. The team was me + one other dude for Linux and Solaris and me + one other different dude for windows, and to give you an idea the target for gcc went from gcc 2.7[1] to gcc 4[2], so a pretty massive change. The build tooling was all CMake + a bunch of special custom shell we had developed to set env vars etc and a CI/CD pipeline that was all custom (and years ahead of its time). Version control was CVS. So, single central code repo and if there was a version conflict an expert (of which I was one but it gives me cold sweats) had to go in, edit the RCS files by hand and if they screwed up all version control for everyone was totally hosed until someone restored from backup and redid the fix successfully.

While we were doing the port, to make things harder, there was a community of 667 developers[3] actively developing features on this codebase, and it had to get pushed out come hell or high water every 2 weeks. Also, this being the securities division of a major investment bank, if anything screwed up real money would be lost.

It was a lot of work, but it got done. I did all my work using vim and quickfix lists (not any fancy pants tooling) including on windows but my windows colleague used visual C++ for his work.[4]

[1] Released in 1995

[2] Released in 2005

[3] yes. The CTO once memorably described it to me as "The number of the beast plus Kirat". Referring to one particularly prolific developer who is somewhat of a legend on Wall Street.

[4] This was in the era of "debugging the error novel" so you're talking 70 pages of ascii sometimes for a single error message with a template backtrace, and of course when you're porting you're getting tens of thousands of these errors. I actually wrote FAQs (for myself as much as anything) about when you were supposed to change "class" to "typename", when you needed "typedef typename" and when you just needed "typedef" etc. So glad I don't do that any more.

By @omoikane - 5 months
> Stories of people trying their best to participate in the C++-standard committee process across multiple years

This links to:

https://thephd.dev/finally-embed-in-c23

It was a fascinating story, particularly how people finally came to terms with accepting that a seemingly ugly way of doing things really is the best way (you just can't "parse better").

The feature itself is interesting too.

https://gcc.godbolt.org/z/jGajc6xd5

By @liontwist - 5 months
“Governments are telling you to stop using C++”.

This invokes the imagery of a 1950s Apollo era scientist saying something serious. But I promise you there is no visionary low level language authority in the background. It’s just a staffer being influenced by the circle of blogs prominent on programming Reddit and twitter.

> no overhead principle

It’s actually nice to hear they are asserting a more conservative outlook and have some guiding design principle.

Bjarne is more of a super-bureaucrat than a designer. In the early days he pulled C++ into whatever language movements were popular. For a while it looked like Rust was having that influence.

But the outcome has been a refinement of C++ library safety features which are moderate and easy to adopt.

By @Mond_ - 5 months
Woah, my post made it to the front page and I'm late. Hi!

In hindsight I would've probably written a few things differently, but I really didn't want to fall into a trap of getting stuck editing.

By @serjts - 5 months
The real, ever-present, and probably future nail in the coffin of C++ is the lack of a standard package manager and build system. The rest is just whatever happened to be picked up by social media and news, as it is easier and flashier to talk about.
By @__d - 5 months
The author doesn’t appear to consider the use of binary-only (typically commercial licensed) libraries. There’s still a need for an ABI and “modern tooling” isn’t a magic wand.
By @uluyol - 5 months
I think the discussions in these threads show how accurate the framing of this article is. You have some people celebrating Google and friends (slowly) leaving the C++ ecosystem, and others continuing to emphasize the flaws that have driven companies away from it in recent history (safety being #1 on the list).
By @BD103 - 5 months
Also see "On 'Safe' C++", which goes deeper into many of the insights brought up by this article. <https://news.ycombinator.com/item?id=42186475>
By @minetest2048 - 5 months
Any mirrors/archives? DNS not resolving for me

EDIT: found one on wayback: https://web.archive.org/web/20241124225457/https://herecomes...

By @vlovich123 - 5 months
I’m not sure I understand the whole ABI argument. Isn’t the raison d’être for namespace versions precisely to evolve the language? Why can’t the existing implementations be copied into a std::v2 but with a changed ABI. Existing ABI issues are non-issues because the old code remains while new code will by default compile against v2 picking up all the goodies and can downgrade the types they actually use across ABI in the places they need by changing the namespace version used for a given compilation unit via compile-time flags (or something along these lines)?

Were namespace versions determined not to solve this problem? That would be the most ironic thing of all if the change management mechanism used around the C++11 std::string transition is either unused, untrusted, or unworkable for the purpose for which it was intended.

By @bayindirh - 5 months
I personally like these discussions about C++. Yes, I think C++ should continue to be C++. I also like it that way.

On the other hand, having a bit more transparency into the working groups and their way of doing things may allow the process to become a bit more efficient and approachable, and maybe would allow shedding some of the problems which have accumulated due to being so isolated from the world.

Some of the alleged events really leave a bad taste in the mouth, and really cast a shadow of doubt over the future of C++.

Lastly, alienating people by shredding their work and bullying them emotionally is not the best way to build a next generation of caretakers for one of the biggest languages in the world. It might not fall overnight, but it'll certainly rot from its core if not tended properly. Nothing is too big to fail.

By @zamalek - 5 months
> Relatively modern, capable tech corporations that understand that their code is an asset.

I strongly disagree with this. The more code you have, the more resources you have to spend maintaining it. There is a very relevant example close by in the post: the bit about Google having a clang-based tool that can refactor the entire codebase. Great! The problem is, an engineer had to spend their time writing that, and you had to pay that engineer money - all because you have an unmanageable amount of code.

The real tech asset is processes: the things you have figured out in order to manage such an ungodly amount of code. Your most senior engineers, specifically what's in their heads, are an asset too.

By @nottorp - 5 months
Two factions? Considering C++ has everything, I'd assume there are tens of factions.
By @bluGill - 5 months
Languages should not have a package management system. They all have an "all the world is my language" blind spot and fail hard when you have anything else. Sometimes you can build plugins in a different language, but they still assume the one true language is all you want.

package management belongs to the os - or at least something else.

don't get me wrong, package management is a real problem and needs to be solved. I'm arguing against a language package manager we need a language agnostic package manager.

By @up2isomorphism - 5 months
It does not require particularly careful inspection to see that, with all these zillions of features coming into C++20, C++ still does not have a straightforward string split function. And I still feel printf is more reliable and easier to use than all these "modern" fmt facilities.

There must be some extremely ideological reason behind these horrible “modern” C++ standards.

There were some good trends during C++11, but it is completely out of control now.

By @ramshanker - 5 months
I am working on a new C++ project in 2024 for my part time project. And this article provided me enough information to battle future "Why not use XYZ instead" discussion. ;)

My rationale for using C++ in 2024: (A) Extreme computational performance desired. (B) I learned C++ 20 years ago. (C) C++ has good enough cross-platform (OS) compatibility.

By @29athrowaway - 5 months
C++ is dead by entropy. So complex nobody can truly learn it anymore.
By @KerrAvon - 5 months
I feel the need to point out that `const` is a viral annotation in C++
By @chris_wot - 5 months
I think he has this about right. The project I contribute to (and no, I'm not a massive contributor) is LibreOffice and it is a C++ codebase. It has a decent build system that is easy for anyone to run out of the box. It uses modern C++17+ code, and though it has a lot of legacy code, it is being constantly modified by people like Noel Grandin via clang plugins (along with a lot of manual effort).

This code was originally developed in the late 1980s.

A good packaging tool would have helped a lot.

By @mgaunard - 5 months
The main problem with bad C++ tooling is often the same: a modular system that relies on importing/exporting binaries and then tracking binary versions when combining applications.

You should only track source versions and build things from source as needed.

By @icameron - 5 months
> The C++ committee seems pretty committed (committeed, if you will)

I will not, thanks.

By @softwaredoug - 5 months
Related - Is C doing anything about memory safety so it can be called memory safe?
By @alkonaut - 5 months
What many newer programming platforms (I deliberately don't say "languages") got right is that you can't design a language in a vacuum. If you design a language and leave the implementation open, you'll iterate too slowly and eventually grind to a halt or diverge in implementations.

A good programming platform has to consist of tooling, which includes package managers, compilers, linters, etc. Ideally, in this orbit you would also have "Language Servers" or similar. At the very least, the compiler and language should be written with this in mind, e.g. written from the ground up to support incremental compilation and so on.

Go, C#, and Rust all have tooling-first and more importantly first-party tooling. The people who design the language MUST be able to iterate quickly with the people who make the compiler, who in turn should be able to walk down the hall and talk to the people who make the package manager, the package manager repository, and so on.

By @xdmr - 5 months
A plague o' both your houses!
By @titzer - 5 months
Replace C++ with asbestos (no, I'm serious, not just snark), and we're basically having exactly the same conversation that's gone on for decades in the meatspace world, with analogous players, sunk cost/investment calculus, and migration consternation. The only part of the conversation that is missing is the liability conversation and damages.

And I do take asbestos as a serious example. Asbestos is still manufactured and used! Believe it or not, there are "safe" uses of asbestos and there are protocols around using them. Never mind the fact that there is a lot of FUD and dishonesty about where exactly the line cuts on what is safe versus not safe... for example, we are finding out how brake dust affects the wider environment as we crawl out from under the tent of utter misinformation of a highly motivated entrenched industry.

I feel like this is not a new human phenomenon. We made particularly poor choices in what tech we became dependent on, and lo and behold, the entrenched interests keep telling us it's not that bad and we should keep doing it because...reasons.

It will eventually play out the way it must; C++ might seem a lot more innocuous than asbestos, and in some ways that's true, but it resists all effort to reform it and will probably end up needing to just be phased out.

By @cryptonector - 5 months
> Speaking of big tech, did you notice that Herb Sutter is leaving Microsoft, and that it seems like MSVC is slow to implement C++23 features, and asking the community for prioritization.

Uh, they took decades to implement a bunch of C99 features. Is that predictive? I suspect it is.

By @glitchc - 5 months
So the gist of the article is this: The C++ committee should take charge of tooling and implement standardized tooling that matches the standards. Okay, but that won't stop the existence of other tooling, including old tooling, and it won't fix the problem of legacy code. So what's the point? Why bother? Plus unsafe memory calls are mainly found in libraries and applications, not the core language. How will standardized tooling fix that or any of the existing problems for that matter?
By @nuancebydefault - 5 months
When I comment on HN topics that are related to C++, there's a very high chance that I get downvoted. Anyways I can't help it, I will comment here...

I feel it would be best for the C++ language if its development stopped. There's no way to fix its current problems. The fact that it stayed compatible with previous iterations over so many years is an accomplishment, almost a miracle, and it should be cherished. Deviating from that direction doesn't make sense; keeping it does not make sense either.

By @pphysch - 5 months
The "standardization" of C++, SQL, et al. are some of the most catastrophic examples of premature abstraction in software development.

Programming languages benefit far more from a robust implementation, tooling, and good technical documentation (which may read like a standard) than from a prescriptive standard. The latter generates enormous waste, for what?

By @paulddraper - 5 months
> We’re basically seeing a conflict between two starkly different camps of C++-users:

> * Relatively modern, capable tech corporations that understand that their code is an asset. (This isn’t strictly big tech. Any sane greenfield C++ startup will also fall into this category.)

> * Everyone else. Every ancient corporation where people are still fighting over how to indent their code, and some young engineer is begging management to allow him to set up a linter.

Well said.

And because of this, a lot of the first group is leaving for greener pastures.

By @physicsguy - 5 months
C++ is still important in domains where performance is really critical.

I also think there's a place where it can easily be supplanted, but cross-platform native software currently has Qt, and bindings for it in other languages are mixed.

In performance-critical things, Rust still doesn't feel like the final answer, since you end up cloning a lot and refactors are very painful. Go obviously has its issues, since SIMD support is non-existent and there is limited control over garbage collection, though it works well for web APIs.

By @EVa5I7bHFq9mnYK - 5 months
What about performance? The appeal of C was that it translated nicely to PDP-11 instructions with virtually no overhead. Then the appeal of C++ was that it translated nicely to C code (in fact, the first versions of C++ were just a preprocessor, passing the job down to the actual C compiler), and you could still insert ASM code if needed.

All these new features introduce some run-time overhead, it seems.

By @dillon - 5 months
My naive opinion is a commitment to not break the ABI is a good thing not just for everyone else but for C++ as well. Languages like C#, Swift and Python (maybe even Rust?) have tools to integrate with C++ fairly deeply and cleanly. If C++ commits to being stable enough then there won’t be a reason to rewrite some amount of C++ into something else. It’s not a surprise that big tech is trying to move away from C++ and that’s not necessarily bad and remaining stable means the transition isn’t rushed. In the meantime people who enjoy and excel at writing C++ still can. Just seems like an overall positive thing to commit to.