The smart(shit)ification of TVs pisses me off.
I absolutely hate "smart" TVs! You can't even buy a quality "dumb" panel anymore. I can't convince the rest of my family and friends that the only things those smarts bring are built-in obsolescence, ads, and privacy issues.
I make it a point to NEVER connect my new 2022 LG C2 to the Internet, as any possible improvements from firmware updates will be overshadowed by garbage like ads in the UI, removal of existing features (warning: reddit link), privacy violations, possible attack vectors, non-existent security, and constant data breaches of the manufacturers that threaten to expose every bit of personal data that they suck up. Not to mention increased sluggishness after tons of unwanted "improvements" are stuffed into it over the years, as the chipset ages and can no longer cope.
I'd much rather spend a tenth of the price of my TV on a streaming box (Roku, Shield TV, etc.) and replace those after similar things happen to them in a few years. For example, the display of my OG 32-inch Sony Google TV from 2010 ($500) still works fine, but the OS has long been abandoned by both Sony and Google, and since 2015-16 even basic things like the YouTube and Chrome apps don't work anymore. Thank goodness I can set the HDMI port as the default start-up, so I never need to see the TV's native UI, and a new Roku Streaming Stick ($45) does just fine on this 720p panel. Plus, I'm not locked into the Roku ecosystem. If they begin (continue?) enshittifying their products, there are tons of other options available at a similar price.
Most people don't replace their TVs every couple of years. Hell, my decade-old 60-inch Sharp Aquos 1080p LCD TV that I bought for $2200 back in 2011 still works fine, and in all this time I've only had to replace the streamer driving it twice: Sony Google TV Box -> Nvidia Shield TV 2015 -> Nvidia Shield TV 2019. I plan to keep it in my basement until it dies completely before replacing it. The Shield TV goes to the LG C2 so that I never have to see LG's craptastic UI.
Sorry, just felt the need to vent. Would be very interested in reading the community's opinions on this topic.
You actually can buy quality dumb TVs, but you have to do the legwork of researching what are often referred to as "commercial displays." I see them everywhere in businesses, showing ads and menus. They're sometimes a little pricier, but they're usually built a little "beefier" too, as they're expected to deal with rougher usage, like in a restaurant.
However, the other solution is the one you've already mentioned where you never plug the Smart TV into the internet, and instead bypass the "smart" on the TV with your own streaming boxes.
I think as more people realize there is a market for dumb TVs, you'll start to see that market grow more and more until they're no longer just "commercial displays." Just gotta get enough people buying them and not buying Smart TVs.
I think if enough people never gave them Internet access, the manufacturers would start adding in cellular modems to ensure they get the data flowing (that is, data on your viewing habits and sending you ads).
Having worked in this field, I can tell you how it usually operates: You want the most data for the least amount of investment. As soon as your operational costs start to eat into your already thin margins, the equation falls apart.
Complex solutions designed to capture data from that 1-3% of users who actively avoid it end up costing a lot more money than their data is actually worth. In order to make this particular solution work, you need to make enough money selling whatever tiny amount of data you get from those 1-3% of users to cover the cost of putting a cellular modem in all of your TVs plus the ongoing cost of paying various regional cellular networks to deliver that data to you. You are likely tripling or quadrupling the total cost of your data collection operation and all you have to show for it is a rounding error. And that is before we factor in the fact that these users likely aren't using the built in streaming apps, so the quality of the data you get from them is below average.
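The economics above can be sketched as a quick back-of-the-envelope calculation. All the numbers here are made up for illustration; the point is the shape of the math, not the specific figures.

```python
# Hypothetical numbers: a modem in every TV, but only the small slice of
# users who block internet access ever generates revenue through it.
tvs_sold          = 1_000_000
opt_out_rate      = 0.02    # the ~1-3% who never connect the TV to wifi
modem_unit_cost   = 5.00    # extra hardware cost paid on EVERY unit
cellular_per_year = 1.00    # carrier fees per modem-connected TV per year
data_value_year   = 10.00   # data/ad revenue per tracked opt-out user per year
years             = 5

extra_cost = (tvs_sold * modem_unit_cost
              + tvs_sold * opt_out_rate * cellular_per_year * years)
extra_revenue = tvs_sold * opt_out_rate * data_value_year * years

print(f"extra cost:    ${extra_cost:,.0f}")    # modem in every TV dominates
print(f"extra revenue: ${extra_revenue:,.0f}")  # tiny user slice can't cover it
```

Because the modem cost lands on every unit shipped while the revenue only comes from the small opt-out slice, the scheme loses money under almost any plausible numbers, which is the commenter's point.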
That's what they do with CPAP machines.
I feel like the market is only going to grow at the top end: the audiophile/videophile sort of areas, with large, high-quality panels and top-end feature sets.
The low end tends to be partly subsidized by the “smart” features. Think TVs that show ads in the menu, or Amazon or Google screens that want you to use their services because it’s “easy” and they’re “right there” so maybe people will subscribe. Couple that with the “feature” that it’s already built in so it saves you an extra box/purchase for people who want cheap TVs, and I don’t see it going away anytime soon.
Exactly this.
Manufacturers are NOT INTERESTED in selling low-cost dumb TVs when they can sell smart TVs and get long-term returns. They are even willing to sell the TVs at cost because they will monetise later with ads and selling your data.
Manufacturers don't want you to have a dumb TV, they want everyone to go smart - which is part of why business-targeted dumb panels are priced higher - to disincentivise regular end-customers from buying them.
The paradox being that if there were "premium" smart TVs for people like us - with proper support, privacy, customization options, and no crap like ads - we'd probably buy them, and pay a premium for them.
But that's just too much work for them and they probably don't even realize that kind of market exists.
They aren't very good though. They are durable, but usually expensive and missing a lot of features you might actually want for that price tag. For example, I've yet to find any OLED "commercial displays" that support Dolby Vision, VRR, and eARC.
It's way cheaper and easier to just buy the TV you want and not connect it to your wifi.
Computer monitors should work too, and are more readily available. Just dig through the business-oriented monitors and ignore the gaming ones, as cable providers aren't really going to have anything that can take advantage of high refresh rates.
My personal experience with computer monitors is that they work great except they always seem to cheap out on speakers if they have built in speakers. Tiny, tinny things whose volume is always way too low.
I don't mind having separate speakers, but once in a while it would be nice to not need them.
Since I'm going to be skipping the TV part with my HTPC anyway, why not simply use a computer monitor? Nowadays you can also get a 40+" monitor, and that should be big enough for most people. These things might not have any speakers, so you may need to plug it into an audio system to make it all work.
The other option is to buy the smart TV, turn off the networking, and hook it up to a Shield, Apple TV, or Roku. All those box makers are going to support the devices longer than TV manufacturers, and the streaming apps can't ignore them.
so is using something like an Apple TV or Roku box actually more secure than just using the apps directly on the TV?
Last time I looked for commercial dumb TV, a SHARP was like $4000 for a 65" 1080p or something :-/
$910 for a 65" 4k Samsung display.
https://www.samsung.com/us/business/displays/4k-uhd/qe-series/qe65t-series-65-lh65qetelgcxgo/
I did this for a long time on my old Vizio TV, but the experience was notably worse with external devices compared to built-in, due to the limited framerate support over HDMI. This led to awkward juddering when e.g. trying to play 23.976fps movies with only 30hz or 60hz output. It also meant built-in video features like motion interpolation did not work effectively.
I guess this is less of an issue today with VRR support on high-end TVs, but still, a lot of devices you might connect to a TV don't support VRR.
Your streaming box was either not configured properly, or was very low cost.
The most likely solution is that you need to turn on a feature on your streaming box that sets the output refresh rate to match that of the content you are playing. On Apple TVs it is called "match frame rate". I know Rokus and Android TV devices have similar options.
Newer TVs can detect when 24 fps content is being delivered in a 60 hz signal and render it to the panel correctly, but this doesn't usually work if you have the selected input set to any low-latency modes ("Game", "PC", etc)
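The judder described above comes down to simple arithmetic: 60 / 24 = 2.5, so in a fixed 60 Hz output each film frame can't be held for an equal number of refreshes, and players fall back to an uneven 3:2 (or 2:3) cadence. A small sketch, with a hypothetical helper that just computes how many refreshes each frame is held for:

```python
def pulldown_cadence(content_fps: int, refresh_hz: int, frames: int):
    """Return the number of display refreshes each content frame is held for
    when content_fps material is shown on a fixed refresh_hz output."""
    holds = []
    shown = 0  # refreshes emitted so far
    for i in range(1, frames + 1):
        # frame i should ideally end at refresh i * refresh_hz / content_fps
        target = int(i * refresh_hz / content_fps)
        holds.append(target - shown)
        shown = target
    return holds

print(pulldown_cadence(24, 60, 8))   # uneven 2:3 pattern -> visible judder
print(pulldown_cadence(24, 120, 8))  # every frame held 5 refreshes -> smooth
```

This is also why 120 Hz panels (and "match frame rate" modes that switch the output to 24 Hz) sidestep the problem entirely: the refresh rate becomes an integer multiple of the content rate, so every frame is held for the same duration.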
This is one of the drawbacks of the widespread adoption of HDMI; it has quite a few quirks. Something like DisplayPort would be better, but it's far less common. Such is life.
This is good to know, thank you for the info. I am getting worried about my increasingly old TV (15+ years) and I do not want a smart TV to replace it.