
My Samsung TV got more and more unusable with every update. Over the years, saved apps like YouTube started to disappear every time it woke up. Then it would default to the Samsung TV app rather than the last app used. The Samsung TV app happened to be on the Baywatch channel every time my young children started the stupid thing. Finally, after it took 2 minutes to load the YouTube app, I factory-reset the device, disconnected it from the internet, and put a Beelink mini PC in front of it. Works flawlessly.

Samsung product life cycle support seems like planned obsolescence.



I have a similar experience with my high-end Samsung TV from 2013. The TV itself still works perfectly so I'm not replacing it soon (still 1080p, not 4K, but...), but over time, Samsung has steadily removed key features with each update. When I first bought it, it supported Skype video calls (and now the integrated webcam can't be used at all), IPTV streaming, and various third-party apps — all of which are now gone.

NEVER BUYING A SAMSUNG TV AGAIN


Microsoft removed support for Skype on TV, not Samsung.

Most apps get removed because the people writing them don't want to support them anymore. The Samsung framework from 2013 was always trouble and it doesn't support many current W3C features that you'd want as a developer. Most people I know are drawing the line at supporting 2014 or 2016 Samsung devices.

Could Samsung update their devices to ensure they still supported modern frameworks? Possibly, but they don't really get any revenue from providing OS upgrades and those devices suck in terms of RAM and CPU.


I hate this idea that software "rots" all by itself when it's just left on a device and is impossible to keep working. I would at the very, very least expect my device to work exactly as it did on day one, for the next 50 years, assuming I don't change the software. It's bits on a flash drive! It doesn't rot, outside some freak cosmic ray from space flipping a bit.

If you're saying the software stops working because the backend it talks to goes away, well that's a deliberate choice the company is making. All they have to do is have a proper versioning system and not touch the backend service, and it should also work forever.


I certainly hate that idea as well, but I also accept a pretty decent amount of that because of interactions with the greater world outside of one company’s direct control.

For instance, suppose a streaming service starts requiring a new login method. They have to update their apps to use this new API. If there are and have been over a dozen different distinct smart television operating systems in the past 15 years, and there will be a dozen more in the next 15 years, it’s unreasonable to expect that even companies the size of say, Netflix, are going to reach far enough back in their history to update all those apps. They probably don’t have developers who understand those systems anymore.

And also, the software distribution mechanisms for each of those platforms are probably no longer intact either in order to receive an update. While it’s true that my Panasonic Blu-ray player that I bought in 2009 is still perfectly functional, and has a Netflix app, I assume it doesn’t work and that Panasonic would be hard pressed to distribute me a working updated app.

The only way things would be much different would be if technology progressed at a far slower pace, so there had been no need to adopt any breaking changes to how the app is built, how the apps and firmware was distributed, etc.


There are several examples I've seen of firmware on devices failing because of bit rot, so that's not true. We used to design devices so that the bootloader was pulled from NOR instead of NAND because of this. Then the device could be recovered using a USB stick.

Most people don't encounter it because their device was updated at least once. People should be less trusting of flash drives than they are; I recently pulled three USB flash sticks out of storage and two of the three are now unhappy.

There's a strong argument that consumer electronics should be able to be upgraded more incrementally, including things like baseline updates for certificates. One of the things about TVs and these systems is that they usually run on something like OverlayFS to avoid corruption of the base OS and to protect its security/integrity. Updates focus on replacing the underlying image, which is often cryptographically signed as well. If you screw something up on a device that's in a customer's home then you're going to spend a lot of money fixing it (the manufacturers have their war stories in this regard), so they're very risk averse.

As for freezing the backend: you can't. Your API will evolve, and if your database changes then your backend services will need to be touched. That database will change; some metadata or label will need to change. Even if you keep the API the same you'll need to maintain the legacy backend. Then you need that service running, consuming compute, for years, even if there's hardly anyone using it and it's costing money. Then you need security patches for the backend service because the framework or the OS needs upgrading. Eventually the backend framework will be EoL/EoS and you'll need to spend to upgrade. It's like saying we'll keep a Java backend running on a public-facing API well beyond its life; log4j, anyone?


Certificates expire.


Google learned this the hard way with the recent Chromecast outage [0]

[0]: https://www.googlenestcommunity.com/t5/Streaming/Regarding-a...


I think there is a strong argument for simply not checking certificate expiry dates in embedded hardware.

Just keep using the expired certificate forever.

Sure, that means that if someone leaks the private key, everyone worldwide needs to do a firmware update to be secure again.

But that's probably less user harm than everyone worldwide needing to do a firmware update to replace an expired cert, and having a dead device otherwise.
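
To make that concrete, here's a rough sketch of the "pin the cert, skip the expiry check" idea (Python purely for illustration; the host and fingerprint below are made-up placeholders):

  # Sketch only, not production code; host and pinned value are hypothetical.
  import hashlib
  import socket
  import ssl

  HOST = "updates.example.com"   # stand-in for the device's backend host
  PORT = 443
  PINNED_SHA256 = "ab:cd:..."    # certificate fingerprint shipped in the firmware

  def fetch_with_pinned_cert():
      # Skip the normal chain/hostname/expiry validation; rely on the pin instead.
      ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
      ctx.check_hostname = False
      ctx.verify_mode = ssl.CERT_NONE
      with socket.create_connection((HOST, PORT)) as sock:
          with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
              der = tls.getpeercert(binary_form=True)
              fp = ":".join(f"{b:02x}" for b in hashlib.sha256(der).digest())
              if fp != PINNED_SHA256:
                  raise ssl.SSLError("peer cert does not match pinned fingerprint")
              # The pin matches, so carry on even if the cert expired years ago.
              tls.sendall(b"GET / HTTP/1.1\r\nHost: " + HOST.encode() + b"\r\nConnection: close\r\n\r\n")
              print(tls.recv(4096).decode(errors="replace"))

The device still gets an encrypted, authenticated connection via the pin; what it gives up is the CA expiry/rotation model, which is exactly the trade-off being argued about here.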


1) You can't pass a PCI-DSS audit if you have expired certificates. 2) You can't always tell the CDN providers what to do with certs. 3) We've seen examples where new root certificates mean older devices don't know about CAs like Let's Encrypt.


At the very least the user should be able to override the failing certificate check. So much "security" cargo culting is intentionally planned failure.


99% of consumers don't understand what that means and if we normalise the average consumer bypassing certificate checks that's definitely a bad thing.


So don't burn CA pubkeys into your binaries without means for user override. If the software can persist a user-specific analytics ID it can support user certs. This is a solved problem.
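
For illustration, a minimal sketch of what that could look like (Python, with hypothetical paths): load the vendor's shipped CA bundle, then merge in whatever the user has installed into a writable location.

  # Sketch only: trust the firmware's shipped CA bundle plus any certs the user
  # has dropped into a writable directory. Both paths are hypothetical.
  import os
  import ssl

  BUILTIN_CA_BUNDLE = "/usr/share/firmware/ca-bundle.pem"  # shipped, read-only
  USER_CA_DIR = "/data/user-certs"                          # user-writable

  def make_tls_context():
      ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
      ctx.load_verify_locations(cafile=BUILTIN_CA_BUNDLE)
      if os.path.isdir(USER_CA_DIR):
          for name in sorted(os.listdir(USER_CA_DIR)):
              if name.endswith(".pem"):
                  # Repeated calls add to the trust store instead of replacing it.
                  ctx.load_verify_locations(cafile=os.path.join(USER_CA_DIR, name))
      return ctx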


Yeah, but how many people would do that? You, me, and maybe a thousand other similarly minded people here. That's sadly a fart in the wind for such companies and not worth creating more friction and risk (i.e., folks hack their under-warranty TVs until they stop working, then come back asking for free replacements and tarnishing the brand).

I wish there were some trivial, real-life applicable solution to this that big companies would be motivated to follow, but I don't see it. Asking most users to be tinkering techies or outright hackers ain't realistic; many people these days don't accept basic aspects of reality if it doesn't suit their current comfy view, so don't expect much.


But we could do it for our friends and families. A repair shop could do it too. Instead of a full brick.


Here in South Korea, everyone who uses online banking has to renew and reissue banking certificates every year. While I'm not convinced the certificate process is 100% safe, using certificates is one good concept in the sh*t show of Korean online "security" malware that users are required to install.


You can add as many user-defined, custom trust anchors as you want, they’re not going to make an expired server TLS certificate work.

Don’t get me wrong, allowing users to add their own trust anchors is absolutely a good thing. But it wouldn’t change anything if the vendor did what GP suggested, which is that the vendor "[does] not touch the backend service." Because one day, their TLS certificate would expire, and they would technically no longer be able to deliver security updates even if the user wanted them.


Not my problem as a buyer. Build the infrastructure to make certificates and everything else work for a reasonably long time. Service is part of the contract.


That's the point, there are no substantive contracts between you and the OS. If we want apps to be responsible for root certs that's interesting, but then the app needs some root of trust with the OS anyway.


> Not my problem as a buyer.

Mentioning that certificates expire was directed against GP’s unreasonable demand that the vendor "do not touch the backend service." This doesn’t have anything to do with the buyer.


This is exactly why "Smart" TVs don't make any sense. My in-laws have a perfectly fine Sony TV; it's not 4K, but the HD picture quality is still amazing. Apps have slowly started to disappear as they are no longer being updated, and new ones aren't being added.

I don't know how this works, but either Sony or the streaming service must be making the apps, and neither seems interested in maintaining apps for a 10+ year old TV. So when the streaming services update their backends, older TVs don't get updated applications.

Smart TVs make absolutely no sense: the streaming services are moving too fast, so you'll need a cheaper box, or a product that is supported for a decade.


100%. I think most people should probably stop thinking of smart TV apps as an obvious or reasonable thing to use, and instead view them like the ads you sometimes find in the box when you buy something. They’re basically just ads for streaming services, and they’re mainly there to try to trick you into connecting the TV to the Internet so that it can gather data for them.

In the event that one wants the app functionality, they’ll always be better off with a streaming stick. Even in respectable brands of TVs like Sony, the SoCs are weaker than what you find in that $40 “Chromecast with Google TV,” so they’re pretty horrible to use even while they are current and supported.


My experience with LG wasn't any better. Over about a year the TV became increasingly unresponsive. You'd start it, after 30 seconds the sound and picture would appear, and for about 2 full minutes it would not react to inputs whatsoever (except turning off). So if you happened to turn the TV off with the volume high, you could not start it in the evening without it blasting for 2+ minutes at night. Abhorrent.


LGs, while still smart TVs, are relatively competent at being dumb TVs. Your only other options these days (sans rescuing a dumb TV from e-waste) are commercial panels and projectors.


If you just use an HDMI input and attach some streaming box to it, Samsung TVs work just fine. Just never touch the remote and only interact with the source and everything works.


We have a 4K TV from Philips (really, TP Vision), which has Android TV, but you can just set it to an HDMI input and then it works as a dumb TV.

Being a Philips (TP Vision), it also has Ambilight, which is nice.

It’s a few years old though, so no guarantees that newer Philips (TP Vision) models work the same way.


Still appreciating my 2011 high-end Samsung TV. I believe that was the last non-smart model year. It could stream videos from a network share but that's about it.

Judging by current trends, I will have to replace the attached Chromecast before the TV breaks.


What bothers me even more is that they are constantly spying on me (phoning home, what I'm watching, ...) and pushing advertisements to my TV. My next TV will probably not be connected to the internet.


Why wait for the next TV when you can just disconnect the darn existing box now?


I use a pi-hole to block the spying. My experience with Amazon's FireOS & Roku is they phone home a lot.


The issue is not Samsung per se, it is the smart TV crap we can't get rid of.

With luck there are still some old TVs in remaining stock, and that is about it.


But you can at least for now still use those "smart" TVs as dumb displays for whatever device you want and just ignore the fact that the TV is running a full android stack or similar. There really is no need to scrounge for older devices with inferior display tech.


It's kinda common knowledge at this point that almost all Smart TVs suck, especially Samsung. I went the Samsung route as well; the TV itself is fine, but the software is horrible.

The solution (that I hope everyone knows about by now) is to buy an Apple TV and connect it. Once the TV starts, it shows Apple TV from the get-go and not any of the Samsung stuff.


> The solution (that I hope everyone knows about by now) is to buy an Apple TV and connect it. Once the TV starts, it shows Apple TV from the get-go and not any of the Samsung stuff.

Or just connect the TV to your PC where you have the freedom to run whatever software you want. Why replace one crappy "smart" device with another.


Well, I'm not sure what use you'd get out of Skype integration when Skype itself is being axed in a couple of months.


Why are other apps gone?


Contrary to lots of other opinions here, I bought a 65" Samsung TV at the beginning of covid and I sincerely don't have any significant complaints. The remote is easy to use, launching apps is straightforward, connecting an ARC soundbar was no problem, nor was connecting a Chromecast and an Xbox, and it "just works". Every once in a blue moon (maybe twice a year-ish) I've had to power cycle it to fix a wifi connectivity issue, which may well just be a result of DHCP lease expiration on my network.

I have a modern Sony Bravia, too, which is running "Google TV" natively. On the plus side, the UI is just about identical to what you get with a Google TV dongle (which I also have, plugged into an old 32" monitor in front of my bike trainer), but it's also a really heavy interface that's increasingly rich in ads. If your household is like mine, and holds subscriptions to a half dozen or more streaming services, some of which are bundled and some of which are either discounted or comped via entirely different subscriptions (mobile phone) or memberships (credit card), it's really not helpful to have Google show me subscriptions I might want to add on to my Google TV sub, nor do I appreciate seeing ads for content from things I don't subscribe to. Also, the Sony remote has about 50 buttons -- not a fan.

All things considered, I end up having to fiddle with the Sony TV far more frequently than the Samsung one, usually because of network or app issues.

We have an old Roku stick plugged into an old TV in a spare room, too, and it's almost intolerably slow. Its primary use case is to plug into our projector for backyard movies in nice weather, so I keep it around, but man is it dog slow.


> don't have any significant complaints.

Are you happy with it spying on you?

That's what all Samsung televisions do, and there is no way to turn it off. They advertise on their own web page that they monitor the content viewed on their televisions for targeted advertising.

This isn't via some sort of metadata, they take screenshots at regular intervals and upload them to very insecure hosting.

I hope you never look at any "sensitive" content on your TV!


I don't really care because I only use the TV to access other streaming apps, and I know they already see everything I watch anyway. I don't have either cable TV or a cable-like alternative (YTTV, Roku Live, Sling, etc). Periodically I'll use it to cast something to, but it's usually my kid's soccer matches from a website on a laptop.

Fwiw, to the best of my awareness, I don't receive any advertising from my Samsung TV other than perhaps the strip of suggested things to watch (half of which are "continue watching" linked to watch history in the app I'm hovering over) that lives above the app list. This is wildly different from Google TV, which has a core value prop of embedding advertising right in your face.


You sure sound sure of yourself for not knowing what you’re talking about.

It takes less than a minute to disable ad tracking and ACR on a Samsung TV.

Settings > General > Terms > disable two checkboxes.


Do you have a source for the screenshot thing?


It's called automatic content recognition (ACR). Most systems take low-resolution (about 640x480 or 320x240) black-and-white screenshots at regular intervals, compress them to death, and upload that to big brother. That's more than enough to determine what specific kink or style of porn you're into, if you make the mistake of thinking that watching that kind of stuff in the privacy of your own home is private.

https://www.zdnet.com/home-and-office/home-entertainment/how...


A quick search points me to [1]. Granted, it does not contain further links but there should be enough names to find more.

1. https://eandt.theiet.org/2024/12/11/smart-tvs-take-screensho...


I never worked for Samsung, but I built TVs for JVC and LG, among many other brands. I don't work in consumer electronics anymore but a decade ago that was my field.

TVs are a wildly unprofitable business. It's astoundingly bad. You get 4-6 months to make any profit on a new model before it gets discounted so heavily by retailers that you're taking a bath on each one sold. So every dollar in the BOM (bill of materials) has to be carefully considered, and not far back the CPUs in practically every TV were single core or dual core, and still under 1GHz. Bottom-of-the-bin ARM cores you'd think twice about fitting to a cheap tablet.

The apps sit within a custom framework which was written before HTML5 was a standard. Or, hey, want to write in an old version of .NET? Or Adobe Stagecraft, another name for Adobe Flash on TV?

Apps get dropped on TVs because the app developers don't want to support ancient frameworks. It's like asking them to still support IE10. You either hold back the evolution of the app, or you declare some generation of TV now obsolete. Some developers will freeze their app, put it in maintenance mode only and concentrate on the new one, but even then that maintenance requires some effort. And the backend developers want to shutdown the API endpoints that are getting 0.1% of the traffic but costing them time and money to keep. Yes, those older TVs are literally 0.1% or less of use even on a supported app.

After a decade in consumer electronics, working with some of the biggest brands in the world (my work was awarded an Emmy), I can confidently say that I never saw anyone doing what could be described as 'planned obsolescence'. The single biggest driver for a TV or other similar device being shit is cost, because >95% of customers want a cheap deal. Samsung, LG and Sony are competing with cheap white-label brands where the customer doesn't care what they're buying. So the good brands have to keep their prices somewhere close to the cheap products in order to give the customers something to pick from. If a device contains cheap components, it was because someone said "If we shave $1 off here, it'll take $3 off the shelf price." I once encountered a situation where a retailer, who was buying cheap set-top boxes from China to stick a now-defunct brand name on, argued to halve the size of an EEPROM. It saved them less than 5c on each box made.

For long life support of the OS and frameworks, aside from the fact that the CPU and RAM are poor, Samsung, LG and Sony don't make much money from the apps. It barely pays to run the app store itself, let alone maintain upgrades to the OS for an ever increasing, aging range of products.

And we as consumers have to take responsibility for the fact that we want to buy cheap, disposable electronics. We'll always look for the deal and buy it on sale. Given the choice of high quality and cheap, most people choose cheap. So they're hearing the message and delivering.


Yeah, but is there a way for consumers to compare the compute performance of any given TV?

If OEMs differentiated their TVs based on compute performance, consumers might be able to make an informed choice. (See smartphones: consumers expect a Galaxy Sxx to have faster compute than a Galaxy Axx.)

If not, consumers just see TVs with similar specs at different prices, so of course they’re going to pick the cheaper one.


It's really hard to get these things across to consumers.

This is why we ended up with phrases like "Full HD".

The average consumer doesn't know what these numbers mean; people who read Hacker News aren't the 99%. Phones have helped a little bit with spreading the idea of newer = better, but ask the average person how many cores their phone has or how much RAM? They don't know.

Also, it's hard to benchmark TV performance as a selling point. Perhaps sites like rtings need to have UX benchmarks as well? They could measure channel change times, app load times, etc. That might create some pressure to compete.


>I can confidently say that I never saw anyone doing what could be described as 'planned obsolescence'. The single biggest driver for a TV or other similar device being shit is cost, because >95% of customers want a cheap deal.

You are literally the first person I have ever seen say this online, besides myself. I have worked in hardware for years and can vouch that there is no such thing as planned obsolescence, but obsession over cost is paramount. People think LED bulbs fail because they are engineered that way, but really it's because they just buy whatever is cheapest. You cannot even really support a decent mid-grade market because it just gets eviscerated by low cost competitors.


I was in a meeting with a senior guy from one of the top Asian brands and I said "We're getting out of TVs, we've lost $x millions and that's enough."

He said "Hah, we can lose way more than that!"


Thanks for sharing. Without insight beyond being a consumer, I do think there's room for disruption (ideally from within the industry itself) compared to 10 years ago.

Comparing models from 2005/2015/2025, for example: most people literally can't tell 4K from 1080p, and anything new in the HD race mostly feels like a scam. The software capabilities are all there. I think that to differentiate from the no-name stuff, longevity is going to become a more significant differentiator.


We tried to disrupt the market, back about 10 years ago.

One of the significant problems is that 80% of TV SOCs are made by one company, MStar (or their subsidiary). And there's only a handful of companies who make the motherboards with those chipsets. Anyone entering the market either buys those or isn't competitive. It's hard to be competitive because everything is so concentrated and consolidated. Since ST Microelectronics and Broadcom left the TV chip market it became a much less diverse market.

We were an established company who made software for STBs, we had done a ground-up build of what was probably the most capable and powerful framework for TV/DVRs. The new design was commissioned from us by a well known open source Linux distro, who then decided they didn't want to continue with the project after they realised that getting into TV OS's was hard. We then took on ownership of that project but getting investment or even commitments from buyers was impossible.

The retailers and TV brands wanted to rehash the same thing over and over because that was tried and tested. It didn't matter that we made something that was provably better and used modern approaches, it wasn't worth the effort for them. If you can't order about 500,000 TVs then you're not going to get anyone to make anything custom for you these days and you'll not make a profit.

--

It was a DVR/TV framework that was designed by people who had worked for big names in the TV business with a clean slate. It would handle up to 16 different broadcast networks (e.g. satellite, terrestrial, cable) and up to 255 tuners, even hot pluggable. Fast EPG processing and smart recording to either internal storage or USB storage. It was user friendly and allowed for HTML5 apps. We pushed it as much as we could but eventually on the brink of financial ruin the company was sold to someone who had no interest in what had been built. I will always feel that something great was lost.


The problem is getting that jank even when you buy the expensive models, though.


But then they're running on the same common platform as the models half the price. More than 95% of the cost of the TV is in the panel itself; a fancy model is usually just a bigger model, maybe with a different, higher-end panel. But the CPU inside is nothing special, because that keeps costs down to compete with the cheap 60in TV you saw while shopping for groceries.


> TVs are a wildly unprofitable business... not far back the CPUs in practically every TV were single core or dual core

Explain to me then how an Apple TV device for $125 (Retail! not BOM!) can be staggeringly faster and generally better than any TV controller board I've seen?

I really want to highlight how ludicrous the difference is: My $4,000 "flagship" OLED TV has a 1080p SDR GUI that has multi-second pauses and stutters at all times but "somehow" Apple can show me a silky smooth 4K GUI in 10 bit HDR.

This is dumbass hardware-manufacturer thinking of "We saved 5c! Yay!" Of course, now every customer paying thousands is pissed and doesn't trust the vendor.

This is also why the TVs go obsolete in a matter of months, because the manufacturers are putting out a firehose of crap that rots on the shelves in months.

Apple TV hasn't had a refresh in years and people are still buying it at full retail price.

I do. Not. Trust. TV vendors. None of them. I trust Apple. I will spend thousands more with Apple on phones, laptops, speakers, or whatever they will make because of precisely these self-defeating decisions from traditional hardware vendors.

I really want to grab one of these CEOs by the lapels and scream in their face for a little while: "JUST COPY APPLE!"


> Explain to me then how an Apple TV device for $125 (Retail! not BOM!) can be staggeringly faster and generally better than any TV controller board I've seen?

This is the result of Apple being vertically integrated and reusing components from other product lines in products like Apple TV. The SoCs used in the Apple TV are from lower-tier bins of chips produced for mobile applications.

With the Apple TV, you are getting a SoC that is effectively the same as a recent-year iPhone. With most other Smart TV devices you are getting a low computational power SoC, Raspberry Pi tier, with processing blocks that are optimized for the video playback and visual processing use cases.

Apple also does this with the iPhone where the non-flagship variants will reuse components or designs from prior years.

Television/Smart TV manufacturer margins are in the single-digit percentages, and the Samsung and LG TV businesses are significantly threatened since their high-volume products have been commoditized by competition from Chinese producers. Most potential customers are shopping based on screen size per dollar, rather than specs like peak luminance and contrast ratios. Flagship TV products like "The Wall" are low-volume halo products. Lifestyle products like "The Frame" exist because they are able to differentiate to certain segments of customers that place enough value on the packaging aesthetics to buy a higher-priced product with better margins for the manufacturers.

Most other hardware device manufacturers are jealous of Apple's margins. Nvidia would probably be one of the few exceptions.

Thin margins on commodity tier products drive these manufacturers to cut their BOM costs as much as possible, even if it makes the product worse in other ways. This is also the big driver for why ads are appearing as part of the Smart TV experience at the device/screen level. Vizio for example shared that they made more money from their ACR business than they did from the device sales themselves. There are companies with business models based around giving you the screen for "free" in exchange for permanent ad-space. Even adjacent products and companies like Roku have business models where they are selling their hardware at near break-even cost points because their business model is built around 'services' from having a large user audience.


Budget mobile phones exist, and make a profit. These have 4G radios, screens, batteries, cameras, and storage.

There is no excuse for TV manufacturers when selling premium devices costing thousands of dollars.


Greater than 95% of the cost of a TV is in the panel.

TV panels must have a near 0% defect rate, and a single piece of dust during manufacture will render the finished panel e-waste. The risk of a defect goes up exponentially with panel size, because the surface area where a defect can occur becomes bigger. It follows the same logic as why chip companies introduced chiplets: smaller die sizes improve the yield and they can throw away less silicon.

A TV panel is basically a 50in chip, and a mobile phone display is a 6in chip.


Samsung also has access to competitive mobile SoCs through vertical integration though.


In theory they do have access and should, but in practice they don't.

Samsung's flagship mobile phone products tend to ship with Qualcomm Snapdragon SoCs in competitive markets, such as USA/North America, versus their "in-house" Exynos SoC used in markets where consumers tend to have less choice (e.g. Samsung S-series phones with Snapdragon for USA, Exynos for EU and KDM markets).


We bought a Samsung TV in 2016 and it slowly became unusable by mid-2020. Fortunately it got dropped by the movers and we were able to justify buying a new TV (LG). The LG UI/UX is awful though, I wish we'd bought a Sony. LG TVs don't have a way to simply select "HDMI1/2/3/4"; you're stuck using its "smart" detection system, which can only be reset by physically unplugging the HDMI cables from the back of the TV, which is never easy to get to. Apparently the solution is to buy Sony and just pay the extra price.

I have a "smart" Samsung TV in my home office, but it's never been plugged into the network; it has a Chromecast and various networked devices plugged into it as a "dumb TV". That has been working out great: the TV still turns on/off easily and is as fast as the day I bought it (makes sense, it's still running the factory firmware).


> LG TVs don't have a way to simply select "HDMI1/2/3/4"; you're stuck using its "smart" detection system, which can only be reset by physically unplugging the HDMI cables from the back of the TV, which is never easy to get to. Apparently the solution is to buy Sony and just pay the extra price.

Another possible solution is to only use one input on the TV. Connect an A/V receiver to that one input and connect all your other devices to the A/V receiver. Then you should only need to deal with switching inputs on the TV if you want to watch over the air TV using the TV's tuner. You can probably even get rid of that need by getting a stand-alone TV tuner and hooking that up to the A/V receiver.

Many A/V receivers have network interfaces that you can use to control them if for some reason you don't want to use their remote. Most Denon receivers for example have an HTTP server that presents a web-based interface if you browse to it from a computer or mobile device.

They also run a simple HTTP based API that is easy to use from scripts. For example here is a shell script that gets the current volume setting of mine:

  URL=http://192.168.0.xx/goform/AppCommand.xml
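  # Ask the receiver for its current master volume by POSTing a GetVolumeLevel command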
  cat > tmp.$$ <<HERE
  <?xml version="1.0" encoding="utf-8"?>
  <tx>
    <cmd id="1">GetVolumeLevel</cmd>
  </tx>
  HERE
  curl -X POST -H "Content-Type: text/xml" --upload-file tmp.$$ $URL
  rm tmp.$$
which when run gives me this at the moment:

  <?xml version="1.0" encoding="utf-8" ?>
  <rx>
  <cmd>
  <volume>-45.0</volume>
  <disptype>RELATIVE</disptype>
  <dispvalue>-45.0dB</dispvalue>
  </cmd>
  </rx>


But this breaks DRM if that's something you need.


It generally should be OK if you get an A/V receiver that implements the current HDMI and HDCP and related standards.


I had a Samsung QLED TV, and still had to upgrade the firmware once. Thankfully you can do this by USB storage without connecting the TV to the Internet. The preloaded firmware had audio issues where sound would drop out, even when playing through the built-in speakers, and I haven't seen that issue happen since upgrading the firmware.


I also had the Baywatch bug. Neo QLED right?

Every time you’d start the TV, it’d switch to the Samsung Baywatch 24/7 stream.

So inappropriate for the children.


>So inappropriate for the children.

The bug, or Baywatch itself?


This describes essentially all Samsung products: really cool for the first few months, then a progressively accelerating slide straight into the trash.

I'm never buying any Samsung products again if I can avoid it. A forced update bricked my damn phone when it forcibly restarted while I was showing something to a client.

Samsung doesn't give a shit. They'll trash the device you paid for and tell you to suck it up and buy a new one.


Yep, I stopped using Samsung products not too long ago.

Reminds me of the time when a Samsung VP (or whatever his title was) showed up at a Microsoft Build conference to promote their TVs and the shiny new Tizen .NET Framework that shipped inbox. I asked if they planned to backport it to last year’s model—which I had just purchased—so we could test with and target existing TVs in the market. He looked me straight in the eye and, with a smarmy grin, said (paraphrasing), 'No, we want consumers to buy new TVs.' I walked away disgusted and abandoned any idea of targeting that platform.

Similarly, I vaguely recall a Samsung event that had leadership--CEO?--flat out say they wanted consumers to buy new TVs every year or so. I couldn't immediately find the quote though.


I pulled my Samsung Smart TV off the network a while ago, precisely because it was getting slower and slower over time. The allegations of spying pushed me over, but the apparent belief that they own my TV would also have done it.

I want a separation between my display device and the thing serving it anyhow, but that's just me in my techie world. The fact that performance got worse with each update, though, that's just over the line for everyone. I mean, if you're going to babble about how you're upgrading my experience, shouldn't you, you know, upgrade my experience instead of constantly downgrading it? My experience gets downgraded, but gee golly, it sure seems like yours is getting upgraded.

Well. It's really not that hard to not plug in the ethernet cable.

My Roku boxes have also had the same trajectory over the years. As time rolls on, they just get slower and slower with each update. Slowly, but surely. How exactly this is accomplished I'm not even sure, it's not like they're overflowing with new features or doing bold new computations for my benefit. They just get a little bit slower every effing time. But at least replacing my Roku boxes is $20-40 now. Hey, sure, OK, a $40 thing probably can't be expected to work 5 years from now. If nothing else, video codecs do march on and specs may exceed what the hardware decoders can handle. OK. My $1000+ TV does not get that grace. It damned well better be able to turn on in less than 30 seconds, even 10 years, 20 years from now. No excuses.


I had a similarly negative experience, sadly. Samsung managed to break HDMI-CEC in the final firmware update for one of their TVs, and wouldn't allow downgrading.

Which tends not to be great for a TV one wants to use with a Chromecast or similar media box...


I find it appalling that no matter how much money you spend on a Samsung TV, you'll get banner ads in the fucking source switcher. Absolute total disregard for their users.

LG still has bits that are ultimately ads, but at least they're less egregious, presented as suggested content in a Home view that already aggregates content from various sources. Not ads for fucking McDonalds and similar. At least that was the case as of a couple of years ago—I disconnected my LG from the internet the day I got an Apple TV and never looked back.

Just let me buy a large class leading display without trying to insert yourself into my life, please. I'm already paying through the nose for it.


I had a smart TV that gradually got slower and slower until it became basically useless. I figured it was just running out of RAM as apps got larger with updates over the years.


Sounds like every Android vendor, with Google leading the pack.

(disclaimer: maybe 5-10 years ago)





