These days, kids identify them by the aspect ratio.
And video quality. Watching some historical videos from my childhood, like tv shows on youtube… the quality is pure potato. Either the archiving is terrible, or we just accepted much worse quality back then.
People always said that Betamax was better quality than VHS. What never gets mentioned is that regular consumer TVs at the time weren’t capable of displaying the difference in quality. To the average person they were the same.
Here is an alternative Piped link(s): https://piped.video/hGVVAQVdEOs
Piped is a privacy-respecting open-source alternative frontend to YouTube.
I’m open-source, check me out at GitHub.
VHS won vs Betamax partly because a VHS VCR was cheaper to make and weighed less than a Betamax machine, so lower material costs.
VHS won because you could record more than 30 minutes on a tape
You kinda can tell though. CRTs didn’t really use pixels, so it’s not like watching on today’s video equipment.
CRT screens definitely used pixels, but they updated per horizontal line rather than per pixel. This is why earlier flatscreen LCDs were worse than CRTs in a lot of ways: sample-and-hold displays keep each pixel lit until the next refresh, which adds motion blur, whereas a CRT gives you a fresh, briefly flashed image each frame.
I have heard that pixels in CRTs are round and LCD/LED are square, that’s the reason why aliasing is not too noticeable on CRTs. Is this true or another internet bs?
They’re not round persay, but they aren’t as sharp, so they have more light bleed into one another, giving a natural anti-aliasing effect. This is why some old games, where the art is designed to account for this blurring, look wrong when played on pixel-perfect modern TVs.
It’s per se, not persay. From the Latin ‘per’ meaning ‘by’ and ‘sē’ meaning ‘itself’.
Noted.
Ummm… what? How do you think CRTs showed the picture?
What they’re referring to is that analogue CRTs don’t really have a fixed horizontal resolution. The screen has a finite number of horizontal lines (i.e. rows) which it moves down through on a regular-timed basis, but as the beam scans across horizontally it can basically be continuous (limited by the signal and the radius of the beam). This is why screen resolutions are referred to by their vertical resolutions alone (e.g. 360p = 360 lines, progressive scan [as opposed to interlaced]).
I’m probably wrong on the specifics, but that gives the gist and enough keywords to find a better explanation.
[EDIT: A word.]
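To put rough numbers on the “lines, not pixels” point, here’s a back-of-the-envelope sketch assuming NTSC timings (the figures are approximations, not exact broadcast spec values):

```python
# Rough NTSC numbers (assumptions for illustration):
# 525 total scan lines per frame, ~480 of them visible,
# ~29.97 frames per second, drawn as two interlaced fields.
total_lines = 525
visible_lines = 480
frame_rate = 30000 / 1001  # ~29.97 Hz

# Horizontal scan frequency: how many lines the beam draws per second.
line_rate = total_lines * frame_rate
print(f"line rate ~ {line_rate:.0f} Hz")  # ~15734 Hz

# The vertical resolution is fixed by the line count (hence names like
# "480i"), but along each line the beam sweeps continuously -- there is
# no fixed pixel grid, only the bandwidth of the signal and the beam width.
print(f"visible lines = {visible_lines}")
```

That ~15.7 kHz line rate is also the source of the high-pitched whine some people could hear from old CRT TVs.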
VHS was capable of decent quality; people just had a lot of bad equipment.
Some TV shows (if they were crazy) were shot on film, so you could re-digitize them now in 4K or 8K and they’d look amazing. But there was also a lot of junk out there.
And as others have mentioned, if you do an awful job of digitizing it, you can take something that looked good and throw all of that quality away. But if the tape wasn’t stored in good condition, it can struggle to be digitized at all, even when done properly.
There’s a lot of archival video that is just terrible. Digital video compression has damaged a lot of old footage that’s been shared over the years, especially through YouTube’s encoders, which will straight up murder videos to save bandwidth. There’s also a lot of stuff that just doesn’t look great when it’s upscaled from magnetic media that’s 320x240 at best.
However, there’s also a lot of stuff that was bad to begin with and just took advantage of things like scanlines and dithering to make up for poor video quality. Take old games for example. There’s a lot of developers who took advantage of CRT TVs to create shading, smoothing, and the illusion of a higher resolution that a console just wasn’t capable of. There’s a lot of contention in the retro gaming community over whether games looked better with scanlines or if they look better now without them.
Personally, I prefer them without. I like the crisp pixelly edges, but I was also lucky enough to play most of my games on a high quality monitor instead of a TV back then. Then emulators, upscaling, and pixel smoothing became a thing…
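On the upscaling point: a toy nearest-neighbour sketch of why stretching low-resolution video can’t recover detail (illustrative only; real upscalers are far more sophisticated, but the core limit is the same — no new information is created, existing samples are just stretched):

```python
# Nearest-neighbour upscaling of a tiny "image" (a 2x2 grid of
# brightness values) to 4x4. Each output pixel just copies the
# nearest source sample, so edges turn into blocks.
def upscale_nearest(img, factor):
    return [
        [img[y // factor][x // factor]
         for x in range(len(img[0]) * factor)]
        for y in range(len(img) * factor)
    ]

src = [[0, 255],
       [255, 0]]
for row in upscale_nearest(src, 2):
    print(row)
# Each source pixel becomes a 2x2 block -- bigger, not sharper.
```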
Here is an alternative Piped link(s): https://piped.video/vscKaVByjRU
It was filmed at poor quality, and the film can degrade over time. It was archived that way because the source was 💩
I watch a lot of hockey. Games from the 2000s are full-on potato. I don’t remember them looking that bad back then.
Hockey is definitely the sport helped the most by HD video.
All sports have been, plus the rise of faster-refresh LCDs, since those early flat screens blurred a lot.
pure potato
Lol
deleted by creator
Here is an alternative Piped link(s): https://piped.video/watch?v=wCDIYvFmgW8&
I do by audio quality. We currently live in the age of badly understandable dialogues.
I noticed when watching Good Omens on Amazon Prime that they offer a language option “Original + Dialogue Boost”.
It works wonders. Almost feels like back in the days again when TV shows wanted dialogue to be understood.
Or you could just turn on dialog boost on your amplifier.
I think most people have given up and have subtitles on all the time.
This is actually because our microphones became better
Sure, microphones got better, but there is more to it. One huge factor is mixing for cinemas rather than for home theaters, or worse, TV speakers.
No, the video actually goes into that. Directors think it’s “more real” to have mumbled dialogues. But they seem to misinterpret that as “more mumble = more good”.
It’s a combination of both. Studios typically will mix the end result for the highest-end sound setups, which most people don’t actually have. If you’re lucky enough to have a full surround with the ability to properly dial in equalizer and other settings, you probably won’t have a problem hearing the dialogue even when it’s mumbled. But on conventional tv speakers, it can easily get lost in the mix.
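A minimal sketch of why a “dialogue boost” option helps: in a typical 5.1-to-stereo fold-down, the centre channel (where dialogue usually sits) is attenuated by about −3 dB before being mixed into left/right, where loud effects can swamp it. Boosting the centre before the fold makes speech stand out. The coefficients below are illustrative, not any particular studio’s or format’s spec:

```python
import math

# Illustrative 5.1 -> stereo fold-down for a single audio sample.
# Dialogue normally lives in the centre channel; a common convention
# mixes it into L/R at roughly -3 dB (a factor of 1/sqrt(2)).
CENTER_GAIN = 1 / math.sqrt(2)

def downmix(left, right, center, boost=1.0):
    # boost > 1.0 is the "dialogue boost": raise the centre channel
    # before folding it into the stereo pair.
    c = center * CENTER_GAIN * boost
    return left + c, right + c

# One sample: quiet dialogue (0.2) under loud effects (0.8).
plain = downmix(0.8, 0.8, 0.2)
boosted = downmix(0.8, 0.8, 0.2, boost=2.0)
print(plain, boosted)
```

With the boost, the dialogue’s share of the stereo output roughly doubles while the effects stay put, which is the whole trick.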
This video was exactly what first came to mind when I read “badly understandable dialogues”! It bothers me that as we got better mics, the actors became more unintelligible instead of the other way around, as one would predict.
Here is an alternative Piped link(s): https://piped.video/watch?v=VYJtb2YXae8
Does anyone actually randomly send you nude girls? Genuinely curious
No. Doesn’t look like it.
It’s just as well. Where would you even put them all?
I hear this all the time, and maybe I just don’t watch THAT many shows/movies, but I haven’t come across anything where the actors sound like they’re mumbling. Do you have a few examples I could look up?
I’ve used subtitles for most of my adult life, ever since having kids. First it was so I could watch without waking the baby, and then it was so I could follow along over all the noise in the house. And I never went back. So as sound mixing changed and got muddier, I guess I didn’t notice, because I was already used to not being able to hear half the dialogue anyway.
… or how blurry the image is (SD vs HD).
You mean 4k vs HD, right?
Radio vs TV for Boomers
B&W vs Color for Gen X
SD vs HD would be Millennials
4K vs HD for Zoomers
I’m late gen X so that doesn’t fly.
And I’m late boomer, so I’m b&w to color.
I’m a millennial and I can’t really tell the difference between SD and HD. Do you mean like when YouTube switches to 360p instead of 1080?
Just FYI:
Ultra HD is 4K
Full HD is 1080p (2K)
HD is 720p (1K)
SD is 480p
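For reference, the usual 16:9 pixel grids behind those names (a rough sketch; the widescreen SD width is an approximation, and cinema “DCI 4K” is actually 4096x2160 rather than the consumer 3840x2160):

```python
# Common marketing names vs actual 16:9 pixel grids.
resolutions = {
    "SD (480p)":       (854, 480),    # widescreen SD; 4:3 SD is 640x480
    "HD (720p)":       (1280, 720),
    "Full HD (1080p)": (1920, 1080),
    "QHD (1440p)":     (2560, 1440),  # the one people argue about calling "2K"
    "UHD (4K)":        (3840, 2160),  # ~4000 pixels wide, hence "4K"
}
for name, (w, h) in resolutions.items():
    print(f"{name:16s} {w}x{h} = {w * h / 1e6:.1f} megapixels")
```

Note that UHD has exactly four times the pixels of Full HD (double in each dimension), which is why “4K” feels like such a jump.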
I recall some people calling 1440p 2K and it really grinds my gears
That’s a big problem for stuff that was originally shot on video. Old stuff shot on film can look pretty good when digitized.
But then you also have that very specific window of time when a lot of stuff especially SFX was done on video that can’t be upscaled. Babylon 5 fans weep.
When I was a kid I used to think black and white meant the TV show or whatever used to be in color, but since it got old it turned black and white. My thought process was that they changed color just like old people’s hair turns grey… This was 35 years ago, before the internet.
deleted by creator
No, that was just for effect. Notice that all the scenes playing in Kansas are B&W (even the ones at the end), and all of Oz was in Color. It gave the place an extra kind of quality above the B&W pictures they were used to. I have heard that people in the cinemas gasped in surprise when the switch happened.
Yeah, but it’s more complicated than that. They colorized a lot of movies after the fact, and the colors were always extremely bright, kinda like when people would color their hair extremely bright. On the contrary, I’m not very bright
Even early 16:9 stuff looks pretty dated now if it hasn’t been remastered to 1080/4k.
Laughs in Australian 576i free-to-air TV
deleted by creator
That’s such a trip. Only a 6 year difference between the two of you, yet you experienced the dawn of something and they didn’t, and it shapes both of your perspectives so much.
Even though it technically applies to transistors, Moore’s Law has been a good barometer for the increase of complexity and capabilities of technology in general. And now because of your comment I’m kinda thinking that since the applicability of that law seems to be nearing its end, it’s either tech will stagnate in the next decade (possible, but I think unlikely), or we may be due for another leapfrog into a higher level of sophistication (more likely).
Retaliate! Hand them a rotary phone and ask them to order a pizza.
Bonus: If they actually managed to phone someone, ask them to send an SMS with it next ;-)
There are places you can still call for delivery.
Can’t send a text with a landline though.
Of course, that was the joke ;-) Does not hurt to watch them trying to figure this out.
The real differentiator is pacing and editing.
I sometimes watch old movies and it gets infuriating how long they talk around the same fact that everyone already agrees on. Yes, he was killed with a knife because it’s still stuck in his head, now move on!
The pacing of the oldest movies comes from the theater. Watch a live play and it will seem well paced and ‘natural.’ Fast cuts from TV ads made people want a faster pace in television shows. Back in the day, ‘Miami Vice’ was cutting edge because they were the speediest.
Especially in action scenes. I used to watch the 2010 version of Hawaii Five-0, and sometimes a channel showed the old version with the same name. They are so incredibly different in pacing and the amount of violence. I really liked the old one in that regard, much less shooting and blood.
It’s not so black and white anymore, is it
Re-watching Buffy the Vampire Slayer with my kids in new hi-def, and you can clearly and easily see the stunt doubles now, and the SFX look really dated now that you can see them clearly.
It’s amazing what old CRTs would let you get away with.
It’s not so much what they got away with but working with the tools they had. It is the same for pixel art in the early gen consoles.
The SFX were limited by the tools they had (and budget), but there were a lot of aspects of set design and stunt doubles where they could get away with more on an SD TV show than in a movie shot on film. When HDTV started, even news shows were forced to drastically improve the quality of their set pieces and makeup, because small details could now be seen.
Lotta old shows are re-formatted just to have the wider screen, since they would still film at higher res for movies or just because. It’s not just an indication of age if something is still only in 4:3; it’s an indication of thrift, or just a general lack of giving a shit about the future.
A lot of old Disney animated shows are now widescreen. Seinfeld is also widescreen and HD; it was probably shot on film.
Watching Friends on Netflix in 4k is really trippy, since there are shows filmed just a little later on digital that look so bad.
Also the quality of the show itself. Go watch old Thomas the Tank Engine and compare it to the new one.
Can always tell when a show is 4:3 aspect. Recently I’ve noticed some modern TV shows adopting the theater aspects of flat (1.85:1) or scope (2.4:1) which I think is pretty cool. The last episode of Strange New Worlds I watched was in scope, that’s some high end filming.
SNW is really top tier production quality across the board. The camera work, the sound, music, design, everything is goddamned impeccable, and that extends to the post production. So much thought goes into every part of it, and I really have to give Paramount its kudos for enabling that level of attention to detail in all aspects of the franchise right now. If I told a fellow Trekkie in the 90s that we would ever see the day, they would laugh.
I identified them by awkward haircuts and clothing styles. I knew something was off / wrong, but it wasn’t until adulthood that I was able to piece it together.
I have a relative who says their children won’t watch 2D animated features because they are old
They still make 2D cartoons
I think the distinction might be film versus series. Most movies are that bubbly CGI look but the streaming shows are mostly still 2D.
Asteroid City switched between aspect ratios as well as switching between black&white as they swapped between the TV story and the ‘real’/cinema story.
would you recommend the film? it’s been quite under discussed for a Wes Anderson film…
Even for fans of his films, you have to be prepared for the weirdness to be dialled up to 11 in this one. It’s the cinema equivalent of “I’m so meta, even this acronym”.
Any of his others would be an easier and maybe more satisfying watch. It’s a nice enough story of course, with the usual silly and neurotic characters and bizarre beautiful sets - just don’t be surprised when people come out of the cinema looking confused.