Yeah, but that makes it sound like they’re all equal, and there hasn’t been any progression, which is untrue. You’re either insane or a historical reenactor if you write something new in COBOL.
I think Rust is genuinely a huge leap forwards compared to C/C++. Maybe one day it will be shitty and obsolete, and at the very least it will become a boring standard option, but for now…
Rust is already obsolete, compared to Stingpie’s excellent assembly language, paired with object oriented programming!
This is the SEALPOOP specification:
Well, that’s going to get cancelled. Think of the non-binary folk out there!
Don’t worry, they have DreamBerd
I now want a community-led historical reenactment of loose-tie-wearing software devs in the ’60s, where they’re just chain-smoking and banging out COBOL or Fortran punch cards
!retrocomputing@lemmy.sdf.org
I don’t think much happens in person, but the community for it definitely exists.
C++ when it was new was exactly like this. Rust still hasn’t had 30 years of legacy; all these Rust prophets will shit on its name in 15 years, when they have to maintain huge codebases with it
Besides, C++ is very likely to adopt memory safety
Yeah, that’s my guess too.
As to whether C++ can update enough to steal its thunder, I feel less qualified to answer. It’d be pretty impressive if they managed to preserve backwards compatibility and do that at the same time, though.
Now it seems the way is unique_ptr and shared_ptr, and std::any to replace void*. At least that’s how it seems to me.
I think it’s more like a “significant” step in language design that could enable a “huge” leap in software quality
I mean, until Electron is rewritten in Rust, so people with Stockholm syndrome can still write painful JavaScript desktop apps…
Tauri gets us quite a long way there
Programming peaked with Lisp (and SQL for database stuff).
Every “progression” made since Lisp has been other languages adding features to (partially, but not quite completely) do stuff that could already be done in Lisp, only less well implemented (though probably with fewer parentheses).
Spoken like a true Lisp fan. I dunno, I really like static typing, and too many brackets get tiresome.
You can use a typed Lisp; there are plenty of them, ranging in complexity from Typed Racket to Shen. Or make your own type system in 50-100 lines when you actually need one.
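Just to give a sense of the scale being claimed: a toy type checker for a tiny expression language genuinely fits in a few dozen lines. Here is a rough, hypothetical sketch, written in Haskell rather than a Lisp; it has nothing to do with Typed Racket’s or Shen’s actual machinery.

```haskell
-- Toy type checker for a tiny expression language: integers, booleans,
-- addition, and if-expressions. Purely illustrative of the "50-100 lines" claim.
data Ty = TInt | TBool deriving (Eq, Show)

data Expr
  = IntLit Int
  | BoolLit Bool
  | Add Expr Expr
  | If Expr Expr Expr

check :: Expr -> Either String Ty
check (IntLit _)  = Right TInt
check (BoolLit _) = Right TBool
check (Add a b)   = do
  ta <- check a
  tb <- check b
  if (ta, tb) == (TInt, TInt)
    then Right TInt
    else Left "Add expects two Ints"
check (If c t e)  = do
  tc <- check c
  tt <- check t
  te <- check e
  if tc /= TBool
    then Left "If condition must be Bool"
    else if tt /= te
      then Left "If branches must agree"
      else Right tt

main :: IO ()
main = do
  print (check (If (BoolLit True) (IntLit 1) (IntLit 2)))  -- Right TInt
  print (check (Add (IntLit 1) (BoolLit False)))           -- Left "Add expects two Ints"
```

A real system (inference, polymorphism, occurrence typing) is obviously much more work, but the basic shape really is this small.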
There have been “improvements”, but from my perspective these “improvements” could fundamentally turn out to be mistakes down the line.
Assembly has produced some insane pieces of software that couldn’t have been produced like that with anything else.
Maybe types in programming languages are bad because they are kinda misleading, as the computer doesn’t even give a shit about what is data and what is code.
Maybe big projects are just a bad idea in software development and any kind of dependency management is the wrong way.
I like modern languages, and types and libraries are nice to have, but I am a student not of the future but of the past.
That’s a valid argument, but a very weak one. If we are not completely sure something is an improvement in all aspects, are we just to dismiss it altogether?
Yeah, you could dismiss combustion engines for the same reason, or, like, carpentry. You wouldn’t be wrong; they have caused problems down the line at various points (modern climate change, medieval deforestation), but you bet I’d still call them an advance on mule power, or on no carpentry.
This is pretty much a nullification of the idea of technological progress existing at all, which is a kinda hot take.
@Tartas1995@discuss.tchncs.de, so you can reply in the right place.
I see your perspective, and I think you’re kinda missing mine, which I am to blame for.
I’m not saying there weren’t improvements. I am saying that, given the uncertainty about their “goodness”, maybe we shouldn’t idolize them. You can appreciate the attempt to create memory-safe code through a programming language without thinking the bare-metal code should be written in that language. You can like a typeless, easy-to-write language like JS without thinking desktop apps should be written in it. You can like the idea behind functional programming while believing that any application is in the end about side effects, and that a purely functional application is therefore impossible.
You can approach the whole topic as an area of study and of possible technological advances, instead of as a dogma.
Oh, well I can agree with that.
It’s a bit of a tangent, but if you’re doing something completely deterministic and non-interactive, like computing a digit of pi, it’s great in practice as well. I use Haskell semi-regularly for that kind of thing.
You could argue printing the output is a side effect, but is a side effect followed by termination really “side”?
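For what it’s worth, that kind of program really can be a single pure expression with one print at the end. A minimal sketch, using a crude Leibniz-series approximation of pi rather than a real digit-extraction algorithm:

```haskell
-- The entire computation is a pure expression; the only side effect is the
-- final print, right before the program terminates.
main :: IO ()
main = print (piApprox 1000000)

-- Crude Leibniz-series approximation: pi = 4 * (1 - 1/3 + 1/5 - 1/7 + ...).
-- Good enough to show the shape, not a serious digit-of-pi algorithm.
piApprox :: Int -> Double
piApprox n = 4 * sum [ (-1) ^^ k / fromIntegral (2 * k + 1) | k <- [0 .. n] ]
```

Everything up to the print is referentially transparent; swap piApprox for any other pure function and the structure stays the same.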