I’ve written about my issues with television technology before, and nothing much has changed, except what the manufacturers are pushing. The technology of the moment seems to be 4K UHD.
Let’s back up a step or two and consider the current state of things.
Traditional broadcast and home video equipment in the United States refreshed the screen about 30 times per second. Films typically used a slightly slower rate of 24 frames per second. Most people consider both rates good enough, though under the right conditions some viewers are sensitive to much higher refresh rates. Once you get to 60 Hz (frames per second), the majority of people can’t detect any improvement, and above 100 Hz virtually nobody can.
For resolution, there’s a similar limit. At typical reading distance, your eyes can only resolve individual dots down to about 300 dots per inch, and the farther away you sit, the coarser the detail you can make out. In a typical living room you’re sitting far enough from the television that you can’t see the difference between 720p and 1080p on a 40-42 inch set.
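To put rough numbers on that claim, here’s a quick back-of-the-envelope sketch. It assumes the common rule of thumb that the eye resolves about one arcminute of angle; the function name and the 42-inch example are mine, just for illustration.

```python
import math

def max_resolvable_distance(diagonal_in, horiz_px, vert_px, acuity_arcmin=1.0):
    """Distance (in inches) beyond which adjacent pixels blur together,
    assuming the eye resolves about one arcminute of angle."""
    aspect = horiz_px / vert_px
    # Screen width from the diagonal, via the Pythagorean theorem.
    width_in = diagonal_in * aspect / math.sqrt(1 + aspect ** 2)
    pixel_pitch = width_in / horiz_px            # inches per pixel
    theta = math.radians(acuity_arcmin / 60)     # arcminutes -> radians
    return pixel_pitch / theta                   # small-angle approximation

for label, (h, v) in {"720p": (1280, 720), "1080p": (1920, 1080)}.items():
    feet = max_resolvable_distance(42, h, v) / 12
    print(f'42" {label}: pixels merge beyond ~{feet:.1f} feet')
```

For a 42-inch set this works out to roughly 5.5 feet for 1080p and about 8 feet for 720p. Sit farther back than that, as most living-room viewers do, and the extra pixels are invisible.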
Did this stop the industry from pumping out sets with 240 Hz refresh rates? How about 24 inch TVs with 1080p resolution? Sure, it’s possible you’re hanging that 24 inch set on the wall next to the kitchen table, and you might be able to justify it there, but let’s be honest: it’s a marketing gimmick. “More is always better,” right?
I’m getting long-winded, so I’ll just quickly summarize my remaining point: available broadcast and cable bandwidth isn’t sufficient for 1080p at a 60 Hz refresh, the current Full HD standard. You need a better-than-average Internet connection or a Blu-ray player to view it.
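The arithmetic behind that claim is straightforward. Here’s a rough sketch; the ~19.4 Mbps figure is the payload of a single US ATSC broadcast channel, and the rest are ballpark numbers, not spec values.

```python
# Rough bitrate math for 1080p at 60 frames per second.
width, height, fps = 1920, 1080, 60
bits_per_pixel = 24                  # 8-bit RGB, ignoring chroma subsampling

raw_bps = width * height * fps * bits_per_pixel
print(f"Uncompressed 1080p60: {raw_bps / 1e9:.1f} Gbps")

ATSC_CHANNEL_BPS = 19.39e6           # payload of one US broadcast channel
ratio = raw_bps / ATSC_CHANNEL_BPS
print(f"Compression needed to fit one broadcast channel: ~{ratio:.0f}:1")
```

Squeezing a roughly 3 Gbps raw signal into one channel takes on the order of 150:1 compression, and broadcast MPEG-2 can’t get anywhere near that at watchable quality. That’s why over-the-air HD tops out at 1080i or 720p rather than 1080p at 60 Hz.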
If you’re not even thinking about a 4K set, you can stop here. I won’t be offended. The same goes if you’ve already spent the money on a 4K set. But if you’re still thinking about it and the calendar hasn’t hit 2018 yet, you might want to give this a read: The industry wants you to go 4K, but the professionals won’t be joining you