I've done a bit of C++ coding in my time. The feature list of the language is so long at this point that it is pretty much impossible for anyone new to learn C++ and grok the design decisions anymore. I don't know whether it's a good thing to keep adding and extending, or whether C++ should sail into the sunset like Fortran and others before it.
Don't get me wrong. There is still a time and a place for Fortran. And this will also likely always be the case for C++. But I'm no longer sure it is entirely wise to choose it for a new project.
The feature list of the language is so long at this point that it is pretty much impossible for anyone new to learn C++ and grok the design decisions anymore.
Even if it is possible, it's a high bar. The height of that bar matters in bringing new people in.
I have seen decades of would-be "C++ killers" come and go. I think that in the end, it is C++ that kills C++. The language has just become unusably large. And that's one thing that cannot be fixed by extending the language.
I have seen decades of would-be "C++ killers" come and go. I think that in the end, it is C++ that kills C++.
I think you're right.
I am, admittedly, a card-carrying member of the C++ curmudgeon club. But I would gladly gravitate to a sexy new C++ subset for my projects, if one gains some momentum.
I do a lot with Go right now instead.
But I would love to join a community effort to choose reasonable, safe default C++ libraries for a bunch of use cases, if one gained enough traction to cover mine.
C++ often innovates first, and the mainstream adapts it later. And it's kind of a Swiss Army knife: you don't need to use and learn everything, just pick what you need. Unless you need to get into an old existing code base...
Just an idea: the language could be divided into multiple standard levels, where each level has more features and functionality. It would essentially be a "restricted", "standard", and "full" version of the language, where "full" is basically what it is now and the others are constrained versions with less functionality (no multiple inheritance, and so on). But at that point, if you don't use the language in full, why bother with it at all? Just thinking a bit...
You don't need to use and learn everything, just pick what you need.
I used to think the same, but now I think you should at least skim through everything. Otherwise you may reinvent the wheel a lot, and there are many use cases where you really don't want to do that. (But C++ makes it so easy; I was constantly tempted to just build what I wanted instead of checking whether it already existed.)
Given how long and widely C++ has been a dominant language, I don't think anyone can reasonably expect to get rid of all the unsafe code, regardless of approach. There is a lot of it.
However, changing the proposition from "get good at Rust and rewrite these projects from scratch" to "adopt some incremental changes using the existing tooling and skills you already have" would lower the barrier to entry considerably. I think this more practical approach would be likely to reach far more projects.
There have been plenty of interop options between C++ and just about anything for decades. If languages like D, which made it piss-easy, couldn't change people's minds, nothing can. Ditching C++ is the only way forward.
If you're hoping for the standard library to have things built on evolving standards and ecosystems, like HTTP clients, then I doubt that will ever happen. There are plenty of examples of why that would be a terrible idea (urllib, std::regex, etc.).
Probably not going to happen. I will say that it's less bad than you might think, because there is a more or less unofficial extended stdlib, i.e. high-quality, widely used libraries which are maintained by people on the Rust team.
But yeah, I'm involved in a somewhat larger project, and we cracked 1,000 transitive dependencies a few weeks ago. I can tell you for free that I don't personally know the maintainers of all of those.
If this was more of a security-critical project, there's probably a dozen or so direct dependencies that we would have implemented ourselves instead.
This has been one of my biggest frustrations while learning Rust. I'm coming from .NET, which has an incredible wealth of official System and Microsoft libraries, all of which are robust and well documented.
Rust, on the other hand, has a bare-minimum std library, with everything else implemented by the community. There isn't even a std async library. It's insane.
Even the popular community libraries are severely lacking in documentation or inexplicably unmaintained.
Rust has a ton of potential but it desperately needs some broad funding to align the fundamentals to a decent standard.
Google started work on Carbon due to the difficulty of getting the C++ standards committee to accept any real, fundamental changes to the language. If Google, a grandmaster at manipulating standards committees, couldn't get something passed, I don't foresee this proposal getting anywhere.
The C++ standards committee don't see memory safety or UB as a problem. If they did, they wouldn't keep introducing new footguns, e.g. forgetting return_void() in a coroutine. They still think everyone should just learn the entire C++ spec and not make mistakes.
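For anyone who hasn't hit that particular footgun, here is a minimal sketch; "task" is just an illustrative name, but the promise members are the standard customization points from <coroutine>:

    #include <coroutine>

    struct task {
        struct promise_type {
            task get_return_object() { return {}; }
            std::suspend_never initial_suspend() noexcept { return {}; }
            std::suspend_never final_suspend() noexcept { return {}; }
            void unhandled_exception() {}
            // void return_void() {}  // forget this one member and...
        };
    };

    task demo() {
        co_await std::suspend_never{};
        // ...flowing off the end of the coroutine here is undefined
        // behavior at runtime, not a compile-time error.
    }

The version without return_void() still compiles; the mistake typically only shows up when the program misbehaves at runtime.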
It boggles the mind that any language - let alone a systems programming language that most of the world's infrastructure is built upon - wouldn't adjust its specification to eliminate undefined behavior wherever possible. And C++'s instances all seem to be in the worst possible places, too.
I'm a bit skeptical that a borrow checker in C++ can be as powerful as the one in Rust, since C++ doesn't have lifetime annotations. Without lifetime annotations, you have to do whole-program analysis to get equivalent checks, which isn't even possible if you're, e.g., loading dynamic libraries, and is prohibitively slow otherwise. Without that, you can only really do local analysis, which is of course good, but not that powerful.
Lifetime annotations in the type system are the right call, since they allow library authors to impose ownership-related invariants on their consumers. I doubt C++ will add them to its type system, though.
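To make the local-analysis point concrete, here is a small illustration (longer and pick are made-up names). Nothing in the first signature says which argument the returned reference borrows from, so a checker that only sees declarations cannot verify callers:

    #include <string>

    // Does the result alias a or b? The signature doesn't say; a local
    // checker must either reject every such call or analyze the body.
    const std::string& longer(const std::string& a, const std::string& b) {
        return a.size() >= b.size() ? a : b;
    }

    const std::string& pick() {
        static std::string durable = "lives forever";
        std::string fleeting = "dies at the end of this scope";
        // Dangles whenever fleeting is the longer string, but local
        // analysis of pick() alone cannot prove it either way.
        return longer(durable, fleeting);
    }

In Rust the borrow is spelled out in the signature itself (fn longer<'a>(a: &'a str, b: &'a str) -> &'a str), which is what keeps the check local and fast.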
Read the proposal: lifetime annotations, the Rust standard library (incl. basic types like Vec, Arc, ...), first-class tuples, pattern matching, destructive moves, unsafe - it's all in there.
The proposal is really to bolt on Rust to the side of C++, with all the compatibility problems that brings by necessity.
Ah OK, I had just read the article and not the proposal. I'm surprised they went that far, but as I wrote, I think lifetime annotations are a good idea. I hope the C++ people find a way to add them to the language that actually works well, which sounds like an incredibly difficult task.
On one hand, I'm pleased that C++ is answering the call for what I'll call "safety as default", since, as The Register and everyone else have pointed out, if safety constructs are "bolted on" like an afterthought, then of course they're not going to see very high adoption. Contrast this with Rust and its "unsafe" keyword, which marks all the places where the minimum safety of the language might not hold.
On the other hand, while this Safe C++ proposal adopts a similar notion of an "unsafe" context, it also adds a "safe" keyword, to specify that a function will conform to compile-time safety checks. But as the proposal readily admits:
Rust’s functions are safe by default. C++’s are unsafe by default.
While the proposal will surely continue to evolve before being implemented, I foresee a situation similar to C, where code that lacked initial const-correctness struggled to work with newer code and libraries. In this case, it would be the "unsafe" keyword that proliferates everywhere just to call older, unsafe code from newer, safe callers.
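The const-correctness analogy is worth spelling out. A sketch of the dynamic, with made-up names (legacy_find, modern_find):

    #include <cstring>

    // Legacy API from before const-correctness: it never mutates the
    // buffer, but its signature cannot promise that.
    char* legacy_find(char* haystack, char needle) {
        return std::strchr(haystack, needle);
    }

    // A const-correct caller must cast at the boundary, and casts like
    // this spread to every seam between new and old code.
    const char* modern_find(const char* s, char c) {
        return legacy_find(const_cast<char*>(s), c);
    }

Swap const_cast for an unsafe block and the shape of the problem is the same: the annotation only helps to the extent the rest of the codebase carries it.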
Rust has the advantage that there isn't much (if any) legacy Rust to keep up, which means the volume of unsafe code in Rust programs is minimal, making them safer overall today. But for Safe C++, there's going to be a lot of unsafe legacy C++ code, and that reduces the overall safety benefit for programs, at least for the time being.
Even as this proposal progresses, the question of whether to start rewriting some code anew in Rust remains relevant. But this is still exciting as a new option to raise the bar in memory safety in C++.
Null safety is orders of magnitude simpler than memory safety. Kotlin is a null-safe language by default. Java is infamously not. Anyone who has worked on a mixed-language Kotlin project can tell you how quickly null safety becomes a pain once guarantees break down - and that's in a language where these issues are flagged instantly and you can "fix" the problem in a couple of characters! Mixed memory-safe/unsafe codebases would be a nightmare in comparison.
Also, C++'s ecosystem consists of deeply entrenched libraries with ancient codebases. Safe C++ might be useful in a decade or two if library maintainers could be pushed to make the switch (good luck with that, if it's half as much of a paradigm shift as Rust), but by then there will probably be multiple competing language features that claim to solve the same problem. It's the C++ Way™.
As much legacy code as has been written in C++, I'd think a total rewrite might ironically be better in terms of passing on important institutional knowledge, at least where large-scale production codebases are concerned. Attacking it bit by bit (pun not intended) might even take longer than just ripping things up and starting anew.
The big downside is that, for backwards compatibility, the default must still be unsafe code. Ideally this could be toggled with a compiler flag, rather than having to wrap most code in "safe" blocks (like Rust, but backwards).
One potential upside that people don't seem to be discussing is that the safe subset could also be the place to finally start cutting down the bloat of C++. We could encourage most developers to write exclusively in the safe subset, and aim to make that the "much smaller and cleaner language" trying to get out of C++.
Anyway, I've since come to know that the proposal was not part of a damage-control campaign, but rather a single person's attempt at proposing a real, concrete solution. He misguidedly thought that there was actually interest in real solutions. There wasn't, and there isn't.
The empire is continuing with its strategy of scamming people into believing that it will produce, at some unspecified point, complete magic-mushroom guidelines and real, specified, and implemented profiles.
The proposal is destined to become perma-vaporware. The dreamy guidelines are going to be perma-WIP, the magical profiles are going to be perma-vapordocs (as in they will never actually exist, not even in theoretical form), and the bureaucracy checks will continue to be cashed.
So not only was there no concrete strike back, it wasn't even the empire that did it.