i like when my strongly typed language can type itself, why should i have to type extra words because the compiler is stupid?
So that next time your coworker uses the wrong type, the compiler can scream at him: “NO, I WON’T COMPILE THIS YOU DUMBASS. LOOK, JOHN SAID ON LINE 863 THAT IT SHOULD BE A DOUBLE, NOT A FLOAT, FOR FUCK’S SAKE”
Tell me you are a Java dev without telling me you are a Java dev 😂
As a JS dev, I can only wish we had those types 🥲
you can still have that without having to declare the type manually. check out Swift or OCaml for example
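Even TypeScript itself will infer most of this for you; a minimal sketch (variable names made up for illustration):

```typescript
// TypeScript infers these types from the initializers; no annotations needed.
let count = 42;        // inferred as number
let label = "answer";  // inferred as string

count = 7;             // fine: still a number
label = label + "!";   // fine: still a string
// count = "seven";    // compile error: Type 'string' is not assignable to type 'number'
```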
deleted by creator
In the world of C and pointer arithmetic this makes perfect sense /s
I’m not sure if you’re being rhetorical or not, but “string|number” is definitely correct here. A computer could definitely figure this out, but typing is for the benefit of the coders more than the code itself. It’s basically functional documentation
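A minimal TypeScript sketch of that “functional documentation” idea (the function name and body are made up for illustration):

```typescript
// The union annotation documents exactly what callers may pass.
function formatId(id: string | number): string {
  // TypeScript narrows the union after the typeof check.
  if (typeof id === "number") {
    return id.toFixed(0); // here id is known to be a number
  }
  return id.trim();       // here id is known to be a string
}

formatId(42);      // ok
formatId("0042");  // ok
// formatId(true); // compile error: boolean is not assignable to string | number
```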
deleted by creator
Yeah that’s what I’m saying, I hate it when coworkers will assign everything as “any” just to avoid the scary red squigglies. Oh well I guess that’s what code reviews are for 🙃
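For anyone lucky enough not to have seen it in the wild, a sketch of how `any` silences the squigglies and all the checking with them (example names are hypothetical):

```typescript
// With `any`, the red squigglies disappear... along with all of the checking.
const user: any = { name: "John" };
user.nmae.toUpperCase(); // typo compiles fine, throws at runtime

// With a real type, the same typo is caught at compile time.
interface User { name: string }
const typedUser: User = { name: "John" };
// typedUser.nmae.toUpperCase(); // compile error: Property 'nmae' does not exist on type 'User'
```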
Type error unless there’s an implementation of `+` that specifies adding together an integer and a string.
💯% accurate. funny how the typescript developer thinks this is some kind of “gotcha!”… like maybe just try a language besides typescript and find out for yourself 😆
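For what it’s worth, TypeScript is one of the languages whose `+` does accept a number and a string (it concatenates), so the meme’s error only fires when the string result is assigned back to a number; a quick sketch:

```typescript
let x = 1;          // x is inferred as number
const y = x + "1";  // allowed: number + string is defined, yields the string "11"
console.log(y);     // "11"
// x = x + "1";     // compile error: Type 'string' is not assignable to type 'number'
```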
deleted by creator
my complaint is that typescript is stupid, yes. so why wouldn’t i compare it to what other, less stupid languages do?
on the plus side, at least now i know that the ad-hominem minded devs came here too, and brought their righteousness with them.
deleted by creator
“brought” 😏
OCaml 😍
Exactly. Most languages I know of that allow this at all will coerce the “1” to an integer and give x = 2. They get away with this because they define the “+” operator as taking numbers only as arguments, so if you hand them `x = x + "cheese"` they’ll error out.
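For contrast, a sketch of how JavaScript/TypeScript land on the same inputs: no coercion of “1” to a number, just concatenation, with TypeScript moving the error to compile time:

```typescript
// JavaScript never coerces "1" to a number for +; it concatenates instead.
console.log(1 + "1");      // "11", not 2
console.log(1 + "cheese"); // "1cheese" (no runtime error at all)

// TypeScript accepts both lines above, but flags the reassignment:
let x = 1;
// x = x + "cheese";       // compile error: Type 'string' is not assignable to type 'number'
```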