Hacker News: mattbee's comments

I thought the same about my 2020 Ryzen, until I started working with the Unity editor two months ago.

I'm reminded of the dead parrot sketch - this thing wouldn't "voom" if I put four million volts through it.


My Samsung has one, but it opens a slow, animated, dynamic menu based on HDMI data. So depending on what devices are turned on, the input you want might have moved in the list.

When I turn the PS5 off before the TV, the TV forgets the input ever existed. So it goes back to the TV's default channel - which oh, is GB News (the Fox News of the UK).

So every time I play I have to watch a bit of bilious TV while I scroll through the input list.

Truly a cursed device.


My Samsung does that too. The most enraging thing though is they fucked up standard HDMI. Like, I can't even plug in my laptop or steam deck. It will show up fine for about two seconds, and then the TV cycles it off. It seems to be looking for some kind of two-way communication or signal which it never gets. But plugging a laptop HDMI in has worked on EVERY SINGLE TV I'VE HAD FOR MANY YEARS, maybe decades, yet it doesn't work on my new TV. Aggravating.


Try turning off Input Signal Plus. That seems to attempt to do some fancier detection of the HDMI signal that breaks some things.


Thank you, I will definitely give this a try!


HDMI CEC is truly infuriating. I have turned it off everywhere (TV, receiver, google tv box, PlayStation). I'd much rather power on (or off) three things manually with three different remotes if it works consistently, as opposed to playing the "which device refused to power on/off via HDMI Control this time" game, every single time.


It would be better to display the menu immediately, organized by physical port with a number for each one, so that you can press Input and then the number and it switches immediately (even if nothing is connected, e.g. in case you will connect something later). Unfortunately, too much modern equipment has excessive animations and other cruft, and the numbers on the remote control will not work, because the manufacturers will not make them work. (Unfortunately, some TV remote controls (such as the one in the article) do not even have numbers, and even when they do, they often do not work with all of the functions that they should work with.)


You should not have to - but if you're talking about Samsung's shitty home hub streaming service, that can be turned off (I think you disable "Autorun Home Hub" or Smart Hub or...).

I was sick of getting Fear TV or some garbage poorly streamed to me when all I care about is Apple TV, Roku, or Art mode on my Frame TV.


I blocked all Samsung ads in my router for my TV. Still got that SamsungTV channel which always started playing. Also got rid of that by putting it behind a child lock, iirc. Now, if for some reason my Apple TV does not properly switch on, I do not get that SamsungTV channel anymore, but a black screen instead.

It is terrible UX. After the Samsung Galaxy S I swore off Samsung, but somehow got tricked by the looks of a Frame.

Never again. And the TV could be thrown out of the window, but my better half still likes TV.


My LG is like that too (well, not quite that bad), but holding down a number key is a shortcut to that input. Huge annoyance mitigator once I learned that.


Just as with cars, I do not put it past TV manufacturers to sell a dedicated touch-screen TV remote before the decade is out.

Imagine:

* an unpredictably modal interface

* chugging, tasteless animations

* software updates every few weeks

* terrible battery life

* a constant glow out of the corner of your eye

* easily broken

But you can sell ads on it. You know it makes awful sense.


Oh, hey, I was working on that back in 2014 for one of the big TV manufacturers. The project was ultimately cancelled.

It was nice for things like switching HDMI inputs; you could dynamically update the name and icon, making it more intuitive for someone who had never used the TV before and didn't know what was plugged into which port. You could also adjust settings more easily, without everyone having to watch on the big screen as you dug through menus to find the obscure setting to tweak.

But your complaints were equally valid, and were a concern at the time.

I would have liked to see it ship, if just to see if customers liked it. A traditional remote still worked too. But oh well.


When I press the "input" button on my remote, the TV displays a list of HDMI ports and what is plugged into them. Why would I want to be looking at my remote for that information? I'm already remote-controlling the best display device I own.


Look at fancy pants input button over here.

I have a ~2022 Samsung OLED, and it doesn't have an input button that I can find. I have to go into the home ribbon menu to find the inputs.


Oh, how I hate the new Samsung remotes!

They have very few buttons, which you can't tell apart in the dark (unless you remember the layout) and everything must be done through the UI which tries to upsell you some streaming service everywhere.


The Wii U gamepad had this functionality. It was pretty handy for the reasons you describe.

https://en-americas-support.nintendo.com/app/answers/detail/...

What TVs should be adding, though, is Wiimote functionality. Build the IR array into the bezel and let me point the remote to select something with a cursor, with arrow keys as a fallback if I'm lazy.


LG has had this for years with their Magic Remote.


Was it LG? I bought a lot of their discontinued Android-powered “smart” remotes for a project a few years ago. Their usefulness for other applications was unfortunately limited by a battery life of less than 30 minutes - I assume they were meant to live on the included Qi-powered stand.


>you could dynamically update the name and icon

You could do that sanely, with an e-ink display on a button.


Yeah, the Logitech Harmony remotes that combined real buttons with a touch screen, particularly the Harmony One, were amazing. You had buttons for all of the common stuff, like volume, play, pause, numbers, and so on, but then you also had a touch screen so you could directly trigger actions that can't have physical buttons because they're different between individual setups.


This was one hell of a remote control

https://www.amazon.com/Logitech-Harmony-Elite-Remote-Control...

even if it was pricey. (Used to be able to get them refurbed at a decent price...) The touchscreen works really well, you can even use it to control the cursor on a PC. It has the buttons you'd expect on a remote control. It can run your Philips Hue, CD changer, Blu-ray player, TV, everything. Makes the dominant paradigm of Apple, Netflix, Spotify and all that look like garbage, but I guess a lot of people now don't have anything to control with it anymore.

The configuration of my system got messed up and I didn't bother to fix it because I thought they'd discontinued it; the latest I've seen is that they quit manufacturing it but are still keeping the database up, so I might try bringing it up again.


My more well-to-do uncle has an older version of this for his absurd setup. Idk whether it was the remote's fault or not, but the system was so stupidly complicated it burned into my brain that I'd just much rather not have any of the materialistic garbage it attempted to control. Not judging exactly, because everyone has different preferences, but I just couldn't envision myself loving what amounts to the digestion of video enough to try and wrangle any of it.

A TV and a receiver? Sure, fine. But also the PlayStation, movie server, regular cable input, Roku and Netflix and the "Smart" features of the TV for some reason. So many redundant boxes and services.


For some, the setup itself becomes a hobby to tinker with rather than a means to an end.


This is a typical setup for lots of folks. This is the average home theater.


Maybe it's because I'm not part of the demographic that wants to own a house with a basement or extra rooms in the suburbs to begin with, but I can see how it would be quite a nice setup if you already had the house, money, and interest in media.


Ultimately it would feel kind of baller, just seems like a lot of stuff


Random bit of trivia: the older versions of these had the interface implemented in Flash. That's right, Adobe Flash Player on a remote.


Nowadays, devices like the Broadlink RM4 Pro fill the same niche. They can learn both IR and RF protocols to remotely control most household devices (not just audio/video stuff).

The difference is that it does not come with a remote - instead there is a phone app which can be used to control it directly, or it can work with Alexa/Home Assistant.

I think it's a great way to "smarten" some older "dumb" devices.

https://community.home-assistant.io/t/getting-started-with-b...


I'll note my displeasure that buttons appear to be disappearing for "smart lighting" products where I assume they expect that you don't want light switches and want to control everything from a phone.

Philips used to make a great switch, for instance, that was powered by the piezoelectric effect and didn't need batteries. I got a good deal on mine, but regular pricing was unreasonable and it's been replaced by something nowhere near as cool. Sengled switches work well and are well priced, but whenever the power goes out (10x a year at my location) they drain the battery trying to contact the hub, so I've spent on CR2032s whatever I saved.

(I'm glad Philips avoided making the own goal of selling off the Hue brand, because we'd still like to have a few Western brands in this space. I mean, they are getting their ass kicked by the likes of

https://us.govee.com/products/govee-curtain-lights?srsltid=A...

as it is. Without a Western competitor to DJI, we'll probably lose Boeing and Airbus within my lifetime.)


Agree - I had one of these and it was the only way to get all my hi-fi gear and other things working together. Everything just worked.


I currently use this remote and love it!


Vizio did this back in the 2018 time frame (ask me how I know). It's a small Android tablet with the Vizio app on it. You can also just download the app onto your phone and use it that way. It was so unpopular that Vizio eventually relented and handed out normal remotes. As far as I know, their modern TVs continue to use a standard remote.


I still have my 2018 Vizio that came with the stupid android tablet. On day one I blocked internet access for the TV, plugged in an Apple TV, and put the android remote in a drawer where it sits to this day.


I bought a very cheap Vizio TV around that time (I was in college) that didn't include a remote in the box. You had to use the app.

I'm sure I'm messing up some of the details, but --

The TV needed to be connected to a network for the app to work. The university required you to register the device's MAC address before it could join the network. The TV had an ethernet port, and its MAC was printed on a sticker on the back of the TV, so I was able to get that going. But it wasn't convenient to keep an ethernet cable routed to the TV (the room was awkward) so my roommate and I wanted to get it on the WiFi.

There was literally no way to open the TV's OSD and view the WiFi MAC address with the Vizio app. You needed a physical remote to access that part of the UI.

IIRC we ended up finding an old WiFi access point and connected the TV to it in order to view its WiFi MAC in the access point's admin UI.

They could have just given us a damn remote in the box! It was infuriating.


Well, my TV is its own remote: it's one of my tablets or my phone or my laptop connected to a Raspberry Pi 3B with a TV hat, connected to the antenna on the roof. It runs tvheadend and I run its client on my devices, TVH Client on Android and TVH Player on Linux and Windows. Those devices are smart and can also run Netflix, YouTube etc without the TV spying on me. Each app can do its own spying but at least for YouTube there are alternative players that are more well behaved.

If there is another person at home I can boot a second Raspberry Pi connected to another cable from the antenna, or connect one of those devices to the HDMI input of my TV that sits unused in a corner of my living room. It's not usual to watch something with other people nowadays.

With this arrangement everybody can watch TV anywhere in the house and carry it wherever they go without having to pause the stream.


I gotta say: Palm Pilots with IR blaster apps such as OmniRemote check most of these boxes, but we loved them. In 2004.


It's been quite a while, but I vaguely remember my original PSP-1001 had an IR blaster program (app?) on it as well


Phone remotes do indeed exist! Both Roku and Apple TV (and definitely others; that’s just my direct experience) have them. They’re ok, handy when you need it.

So I gotta imagine Samsung/LG will eventually just make apps anyway. Why bother with a touch screen remote that has to be a similar size anyway?


https://www.sony.com/electronics/support/remote-controls-rem...

I had one of these and oddly loved it, a bit of a geek toy.


I've been in the process of making my own macro pad recently, maybe it's time to make my own remote as well.


They already have phone apps.


That's essentially what the Philips Pronto remotes were.


how about a smart-remote

it's just 1 button and it does what the built-in AI predicts you wanna do.


Why even have a button?


Visionary.


This is the "you can already build such a system yourself quite trivially by getting an FTP account, mounting it locally with curlftpfs, and then using SVN or CVS on the mounted filesystem" comment, but about potatoes.


We should all do our part to stop perpetuating the meme about that comment, which is pretty unfair.

https://news.ycombinator.com/item?id=42392302


People really need to stop bringing this up. We've got billionaires mocking this guy. The OpenAI CEO tried to make an example of "haters" out of him.

https://zedshaw.com/blog/2018-03-25-the-billionaires-vs-bran...

This isn't right, "good sport" or not. It doesn't reflect well on this community.


I needed to triple check if I was in a time loop, but I guess that's just the icing on the joke. In that sense: "why not just rcs? much simpler."


Dropbox really is bad and didn’t deserve to be successful. Its success is more a product of ZIRP than a comment about its utility. Why do we act like this “lore” of HN is something we all have to refer to to prove how “wrong” we were?


Saying that Dropbox is bad is the second coming of the original meme saying it’s trivial.

Yall will never learn.


Self-flagellation because we can't imagine any other future. "The way things are is the only way things could have ever been" - you, implicitly.

So tired of folks who uncritically think "this is the best way that things could have been". Voltaire literally wrote a whole book called "Candide" which tries to call out this exact thought-cliche.


You can't have it both ways. Either Dropbox is trivial and should have been replaced, or it's not and deserves its success.


I think people think that comment is wrong because ner-ner, Dropbox was in fact a big success, and just because it's _possible_ to implement something on your boutique computer doesn't mean it won't have value for people who can't do that.

[I totally agree if Microsoft had more vision and were competing on an OS that actually worked better for the owner of the computer in 2007, they wouldn't have ceded core parts of the computer's function to some two-bit VC-fuelled bodgers]

But to me the comment is wrong because - well - it wasn't and it isn't trivial to run a version control system on top of a network filesystem! Synchronous filesystems over the internet are awful. If the commenter had ever tried this for more than about 2 hours they'd find it hangs on the first network wobble, and probably loses data.

Anyhow we're getting off-topic. Do you think French fries deserve their success?


Given that Trump seems hellbent on taking Canada, I'd say it's worth it just for Americans to fix their French fry norms and make them "worth their success".


Dropbox was an excellent useful little app when it first released. It absolutely deserved to be successful. Back then seamless file synchronization was unheard of, and laptops weren't too useful so it was the only reasonable option.


But HN was wrong and Dropbox was quite successful. Their "deservedness" is irrelevant, the fact they actually did the thing is all that matters.


I think that's the norm in UK secondaries too. My 11yo is allowed to take his phone to school, but the policy is it stays switched off, in the locker, until the end of the school day.


If you've received the output of an AGPL-3 program after requesting it through a network, you have interacted with that program through a network.

Why does it matter that it's gone through several API layers, a queue, an email system that takes 3 days or whatever? That's what a network is.

The demands of the license seem broad and clear.


That applies to the slicer case.

But what about bills/statements etc. - something that generates reports on a timer? That's not subject to the license, right?

But what if there is a "paperless billing" checkbox on a webpage? Does this count as "requested through the network"?

What if there is a "long / short bill" button that customizes the report?

What if normally the bill is monthly, but checking the checkbox for the first time starts a "backfill" process which generates PDFs right away?


Yeah, the point of this long "incendiary" [sic] blog post seems to be "hang on to the output for a bit before passing it on to the user and you have successfully defeated AGPLv3", except I don't think a reasonable judge would agree with that at all. Also, the OP seems to think "receiving output from a program" doesn't count as "interaction", and I don't agree with that.


That feels overly broad. Where is the boundary? Is it a network if you receive a USB drive after making a phone call to order something? Send a physical letter via pigeon?


Sorry, I elided the quote from the licence. It says "computer network" not just "network".


For completeness it should be noted that it doesn't cover all users interacting with the program. It only covers those who are "remotely" interacting with it.

I don't recall seeing any discussion of what counts as remote.


It feels like fly is trying to repeat a growth model that worked 20 years ago: throw interesting toys at engineers, then wait for engineers to recommend their services as they move on in their careers.

Part of that playbook is the old Move Fast & Break Things. That can still be the right call for young projects, but it has two big problems:

1) AWS successfully moved themselves into the position of "safe" hosting choice, so it's much rarer for engineers to have influence on something that's seen by money men as a humdrum, solved problem;

2) engineers are not the internal influencers they used to be, being laid off left and right the last few years, and without time for hobby projects.

(maybe also 3) it's much harder to build a useful free tier on a hosting service, which used to be a necessary marketing expense to reach those engineers).

So idk, I feel like the bar is just higher for hosting stability than it used to be, and novelty is a much harder sell, even here. Or rather: if you're going to brag about reinventing so many wheels, they need to not come off the cart as often.


I've definitely been to a LAN party where IP addresses were written on clothes pegs by the entrance. You take a peg on your way in, clip it to your ethernet cable, configure that IP statically!



ReiserFS4 didn't get merged because of his empathy issues, and those issues are also presumably contributors to his murdering his wife.

Man, I have worked with some egotistical programmers and given some bad performance reviews. But "could also be a murderer" never crossed my mind.


This is pleasingly insane, congratulations! Is there a program to test the fairness of a given die or coin? Is that a program that's even feasible to write?


You've always got the standard way to get fair random numbers from a fairness-unknown coin. Flip it twice. Restart if you get both heads or both tails. If you get H then T or T then H, those are equally probable, so take the first one of those as the final outcome.

This generalizes to a die of N sides. Roll it N times. If you don't get all N distinct results, restart. If you do, then take the first result as your final outcome.

(That may take a lot of trials for large N. It can be broken down by prime factorization, like roll 2-sided and 3-sided objects separately, and combine them for a d6 result.)
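For the curious, the coin and die procedures above can be sketched in a few lines of Python (the bias weights below are made up, purely for illustration):

```python
import random

def fair_roll(biased_roll, n):
    """Von Neumann-style extraction for an n-sided biased (but independent)
    die: roll n times, and if all n results are distinct, every ordering is
    equally likely, so the first result is uniform. Otherwise throw the
    batch away and start over. n=2 is the coin case."""
    while True:
        rolls = [biased_roll() for _ in range(n)]
        if len(set(rolls)) == n:
            return rolls[0]

# The prime-factorization trick: a fair d6 from a fair d2 and a fair d3,
# each extracted from the same biased d6 (illustrative weights).
biased_d6 = lambda: random.choices(range(6), weights=[3, 1, 1, 1, 1, 1])[0]
fair_d2 = lambda: fair_roll(lambda: biased_d6() % 2, 2)  # a biased coin, debiased
fair_d3 = lambda: fair_roll(lambda: biased_d6() % 3, 3)  # a biased d3, debiased
d6 = lambda: 3 * fair_d2() + fair_d3()                   # uniform over 0..5

results = [d6() for _ in range(6000)]
```

Each face should come up close to 1/6 of the time even though the underlying die is heavily biased toward one face; the cost is all the rerolls thrown away along the way.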


Hmm my intuition isn't agreeing with this. Does this have a name so I can read more about it?



I have the humility to admit that this, despite everything I pretend to know, has always escaped my understanding.

Someone please (jump?) at the chance to explain this one to me.

(assume i failed 9th grade 3 times)


The key assumption is that T and H may not have the same probability, but each flip isn't correlated with past or future flips. Therefore, TH and HT have the same probability. So you can think of TH as "A" and HT as "B" then you repeatedly flip twice until you get one of those outcomes. So now your coin outputs A and B with equal probability.


I feel like I am missing something so obvious that I feel the need to correct wiki, but that likely means I am fundamentally missing the point.

"The Von Neumann extractor can be shown to produce a uniform output even if the distribution of input bits is not uniform so long as each bit has the same probability of being "one"->[first] and there is no correlation between successive bits.[7]"

As long as the person doesn't favor which of the two bits they chose is "first", then it should appear as random.

But that is self-defeating, as if the person had the capability to unbiased-ly choose between two binaries, they wouldn't need the coin.

But since the only way to determine the variation from expectation is repeatedly increasing sample size, I don't see how doing it twice, and just taking encoding of the bits, then...

Is the magic in the XOR step? To eliminate the most obvious bias (1v5 coin), until all that could had been left was incidental? Then, always taking the first bit, to avoid the prior/a priori requisite of not having a fair coin/choosing between two options?

and it clicked. Rubber duck debugging, chain of thought, etc.

I will actually feel better now.


>To eliminate the most obvious bias (1v5 coin), until all that could had been left was incidental?

There is only one coin, flipped _twice_; not a running occurrence, but in couples, perfectly simulating two coins functionally.

Once a literal couple of flips eventually gives a XOR'd result - they differ, no matter how biased the coin - the ordering within that pair is random.

Two sides to a coin, no matter how random, still half the chance.

(for lurkers cringing at my subtle mis-understanding)


Maybe I don't understand why or what you don't understand but...

Say you have a biased coin. It lands heads 55% of the time (but you don't know that.) Then the probabilities are:

HH = (0.55 * 0.55) = 0.3025

TT = (0.45 * 0.45) = 0.2025

HT = (0.55 * 0.45) = 0.2475

TH = (0.45 * 0.55) = 0.2475

If you disregard the HH and TT results then the equal probabilities of HT and TH result in a perfect binary decider using a biased coin. You assign HT to one result and TH to the other.
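The same arithmetic, as a quick check in Python:

```python
p, q = 0.55, 0.45  # heads/tails probabilities from the example above
pairs = {"HH": p * p, "TT": q * q, "HT": p * q, "TH": q * p}

# HT and TH are equal whatever p is, so among the kept pairs
# each side wins exactly half the time:
print(pairs["HT"] / (pairs["HT"] + pairs["TH"]))  # 0.5
```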


Maybe this intuitive "proof" will help.

Coins and dice and datums (solid objects with detectable outcomes) may, or may not have bias, it depends on how they were made and on manufacturing defects that resulted. But, at a minimum, such bias can oftentimes be side-stepped or bypassed.

Consider this argument from Johnny von Neumann.

Suppose you have a single biased coin with these outcome probabilities:

A) Heads (1) 60% (Call this probability p.)

B) Tails (0) 40% (The probability of this outcome is q=(1-p), by definition.)

Now let us apply this algorithm to sequential tosses for this coin:

1) Toss the coin twice.

2) If you get heads followed by tails, return 1. (Say this outcome occurs with probability p’.)

3) If you get tails followed by heads, return 0. (The probability of this outcome is q’=(1-p’), by definition.)

4) Otherwise, ignore the outcome and go to step 1.

The bit stream that results is devoid of bias. Here’s why. The probabilities of obtaining (0 and 1) or (1 and 0) after two tosses of the coin are the same, namely p(1-p). On the other hand, if (1 and 1) or (0 and 0) are thrown, those outcomes are ignored and the algorithm loops around with probability 1 – 2p(1-p). So, the probability (p’) of getting a 1 using this algorithm after any sequential two tosses of the coin is p’ = p(1-p) + p’(1-2p(1-p)). The solution of which is p’=1/2, and since q’=(1-p’), then q’=1/2. A fair unbiased toss!

In fact, the example bias numbers given above don’t matter for the argument to hold (note that after solving for p’ it is independent of p). The outcome of the algorithm is a fair toss (in terms of the (0 and 1)-bit stream that results), regardless of the actual bias in the coin for a single toss. All the bias does is have an effect on the efficiency with which the bit stream is created, because each time we toss heads-heads or tails-tails we loop around and those two tosses are thrown away (lost). For an unbiased coin the algorithm is 50% efficient, but now has the guarantee of being unbiased. For a biased coin (or simply unknown bias) the algorithm is less than 50% efficient, but now has the guarantee of being unbiased.

This algorithm is trivial to implement for the Satoshi9000.
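Here's a quick simulation of steps 1-4 (not the Satoshi9000's actual code, just a sanity check of the algebra), which bears out both the fairness and the efficiency figures:

```python
import random

def von_neumann_bits(p, n_bits):
    """Steps 1-4 above: toss a coin with P(heads)=p twice; return 1 on
    heads-then-tails, 0 on tails-then-heads, and loop on the rest."""
    bits, discarded = [], 0
    while len(bits) < n_bits:
        a, b = random.random() < p, random.random() < p
        if a and not b:
            bits.append(1)       # heads-tails -> 1
        elif b and not a:
            bits.append(0)       # tails-heads -> 0
        else:
            discarded += 1       # HH or TT: back to step 1
    return bits, discarded

bits, discarded = von_neumann_bits(0.60, 50_000)  # the 60/40 coin above
print(sum(bits) / len(bits))                      # close to 0.5, despite the bias
print(len(bits) / (len(bits) + discarded))        # close to 2p(1-p) = 0.48
```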


Thank you so much for explaining it with a concrete example, now even I understand it :)

This really is a useful idea.


  >Maybe I don't understand why or what you don't understand but...
Small mis-step because of an extremely biased heads example (99% H, 1% T).

When imagined, the first result is 99% Heads...until you finally flip a Tails.

We had to do this exact thing in 6th grade, and I picked proving 5%...fml.

I forgot that they are discrete pairs, not continuous (like my head canon).

The XOR is the magic. Always has been.


It may be more likely that H or T happens (an unfair coin), but in a pair of H and T, both HT and TH are equally likely. Therefore which is "first" is equally likely H or T.

Only holds if no spooky effects change results based on last result. (like a magic die that counts upwards or a magic coin that flips T after H no matter what)

P(TH) = p(T)*p(H) = P(HT)


Your second paragraph is correct and may be where the previous poster's intuition was disagreeing, that the method doesn't necessarily hold for repeated iterations in a physical system where one trial starts from where the last one ended.

It's not even really "spooky" - all you need is a flipping apparatus that's biased towards an odd number of rotations, and so then THTH is more common than THHT and you get a bias towards repeating your last result.


Exactly right, I was thinking an unfair coin could have "memory" but then the method doesn't hold.


What about a 'dirty' coin or die, where the dirt falls off during the run?


That's a clever point. But I think a corner case.

I suspect that when the user is loading coins or dice in the machine, they would notice any dirt that was significant enough to look as though it might be a problem.

And oil deposits from your fingerprints I would imagine are so minuscule as to be insignificant in creating varying bias.

Even then, in both cases, you could wipe the objects with an alcohol swab before putting them into the shaker cups.

It could be argued, I suppose, that every micro-collision of the coin or die with the cup removes a few atoms, but I would suggest that its effect on the bias of the coin or die over time is again minuscule. Indeed, unmeasurable over a full sequence of cycles (128 for example) of the machine when generating a Bitcoin key.

But an interesting point. Keep 'em coming!


That'd do it.

P(H|N) != P(T|N)

And

P(H|N) != P(H|N-1) (and vice versa)

Means that

P(HT) = P(H|N-1, T|N) != P(TH)


I love the slow pace of the video, including a few minutes' presentation of all available programs. And indeed, there are programs to test dice and coin bias:

* https://youtu.be/bJiOia5PoGE?si=IEhbNJk0C0-7_2Nj&t=229

* https://youtu.be/bJiOia5PoGE?si=3Se3lYFVAAkElx0w&t=245


You can measure the Shannon entropy of a sequence
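For example, a per-symbol estimate in Python. One caveat: this only sees symbol frequencies, so it can't catch correlations between successive flips - a perfectly alternating HTHT... sequence still scores a full 1 bit per symbol:

```python
from collections import Counter
from math import log2

def shannon_entropy(seq):
    """Empirical Shannon entropy in bits per symbol (frequencies only;
    blind to ordering and correlations between successive symbols)."""
    n = len(seq)
    return -sum(c / n * log2(c / n) for c in Counter(seq).values())

print(shannon_entropy("HTTHHTHT" * 100))  # 1.0: heads and tails equally frequent
print(shannon_entropy("HHHHHHHT" * 100))  # ~0.54: a heavily biased coin
```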


....you can do that using our universe's physical constants too.

Care to elaborate? Or link?

I mean, everything that is, is just temporarily displaced homogeneous complexity, allowable between the fluctuations of gradients of temperature, allowing the illusion of work to appear as more than just energy dissipating into the ether of expanding space-time, dragged by the irreconcilable idea of "gravity".

But that doesn't help bake an Apple pie from scratch, as Carl Sagan would put it.


