GeneralMayhem's comments | Hacker News

It's much weirder now.

The current holder of that domain is using it to host a single page that pushes anti-vax nonsense under the guise of fighting censorship... but also links to the actual PuTTY site. Very weird mix of maybe-well-meaning and nonsense.


The guy behind that page and Bitvise appears to have gone totally crazy during the pandemic. On his blog, he said in 2021 "I forecast that 2/3 of those who accept Covid vaccines are going to die by January 1, 2025."

And in 2022, he wrote "Covid-19 is mostly snake venom added to drinking water in selected locations. There may also be a virus, but the main vehicle of hospitalizations is boatloads of powder, mixed in during 'water treatment.' Remdesivir, the main treatment for Covid, is injected snake venom. mRNA vaccines hijack your body to make more snake venom."


> mixed in during 'water treatment.' Remdesivir, the main treatment for Covid, is injected snake venom. mRNA vaccines hijack your body to make more snake ven

Whaaaaat the fuuuuuuck

Can anyone debug this statement?? I'm not looped into this weird realm of paranoid delusion enough to recognize what they're referring to here.


There's no sense debugging the output when the hardware that produced it is clearly defective.


> MCP promises to standardize AI-tool interactions as the “USB-C for AI.”

Ironically, it's achieved this - but that's an indictment of USB-C, not an accomplishment of MCP. Just like USB-C, MCP is a nigh-universal connector with very poorly enforced standards for what actually goes across it. MCP's inconsistent JSON parsing and lack of protocol standardization are closely analogous to USB-C's proliferation of cable types (https://en.wikipedia.org/wiki/USB-C#Cable_types); the superficial interoperability is a very leaky abstraction over a much more complicated reality, which IMO is worse than just having explicitly different APIs/protocols.


I'd like to add that the culmination of USB-C's failure was Apple's removal of USB-A ports from the latest M4 Mac mini, where identical-looking ports on the exact same device now have vastly different capabilities, opaque to the end user of the system months past the initial hype of the release date.

Previously, you could reasonably expect a USB-C port on an Apple Silicon desktop or laptop to be USB4 40Gbps Thunderbolt, capable of anything and everything you might want to use it for.

Now, some of them are USB3 10Gbps. Which ones? Gotta look at the specs or tiny icons, I guess?

Apple could have chosen to keep the self-documenting USB-A ports to signify the 10Gbps limitation of some of these ports (conveniently, USB-A tops out at exactly 10Gbps, making it perfect for the use case of having a few extra "low-speed" ports at very little manufacturing cost), but instead they've decided to further dilute the USB-C brand. Pure innovation!

And the end user will likely still have to use USB-C to USB-A adapters anyway, because the majority of thumb drives, keyboards and mice still require a USB-A port — even the ones that use USB-C on the keyboard/mouse itself. (But, of course, that's all irrelevant because you can always spend 2x+ as much for a USB-C version of any of these devices, and the fact that the USB-C variants are less common or inferior to the USB-A ones doesn't matter when hype and fanaticism are more important than utility and usability.)


As far as I know, please correct me if I'm wrong, the USB spec does not allow USB-C to C cables at all. The host side must always be type A. This avoids issues like your cellphone supplying power not just to your headphones but also to your laptop.


No, you're thinking about USB-A to USB-A, which is definitely prohibited by the spec. (Whereas USB-C to USB-C cables are most certainly not disallowed.)

What's disallowed is for a non-host to have a USB-A port; hence USB-A to USB-A is impossible, because one side of the cable would have to be connected to a "device" that's not acting in host mode.

Only the host is allowed to have USB-A.

This is exactly why USB-A is superior to USB-C for host-only ports on embedded devices like routers (as well as auxiliary USB ports on your desktop or monitor).

Generally, many modern travel routers have one USB-C and one USB-A port. Without any documentation or pictograms, you can be relatively sure that the USB-A port is for data and the USB-C port is for power (hopefully through USB-PD), since USB-A couldn't possibly be used to power up the router: USB-A is a host-only port.

USB-C is great for USB-OTG and the bidirectional modes, when the same port could be used for both the host and the peripheral device functions, like on the smartphones. https://en.wikipedia.org/wiki/USB_On-The-Go

If the port can ONLY be used in host-mode, and does NOT support Alt Mode, Thunderbolt, or bidirectional USB-PD, then USB-A is a far more fitting connector, to signify all of the above.


Yeah, I laughed out loud when I read that line. Mission accomplished, I guess?


Uptime and reliability are not the same thing. Designing a bridge doesn't require that the engineer be working 99.9% of minutes in a day, but it does require that they be right in 99.9% of the decisions they make.


Another way to think about it is that, if the engineer isn't right in 99.9% of decisions, the bridge will have 99.9% uptime.


That's pretty bad for a bridge haha


Your first example has to do with the fact that tuples are copied by value, whereas lists are "copied" by reference. This is a special case of an even larger (IMO) misfeature, which is that the language tries very, very hard to hide the concept of a pointer from you. This is a rampant problem in memory-managed languages; Java has similar weirdness (although it's at least a bit more consistent since there are fewer primitives), and Go is doubly odd because it does have a user-controllable value vs. pointer distinction but then hides it in a lot of cases (with the . operator working through pointers, and anything to do with interfaces).

I think the whole thing does a disservice to novice or unwary programmers. It's supposed to be easier to use because you "don't have to worry about it" - but you really, really do. If you're not familiar with most of these details, it's way too easy to wander into code that behaves incorrectly.
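
For concreteness, a minimal sketch of the behavior being described (variable names are made up, not from the original example):

    xs = [1, 2, 3]
    ys = xs           # ys is another name for the same list object
    ys.append(4)
    print(xs)         # [1, 2, 3, 4] -- the "copy" was never a copy

    t = (1, 2, 3)
    u = t             # the same sharing happens with a tuple, but since
    u = u + (4,)      # tuples can't be mutated, + just builds a new one
    print(t)          # (1, 2, 3) -- unchanged, so it *feels* copied by value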


    > This is a special case of an even larger (IMO) misfeature, which is that the language tries very, very hard to hide the concept of a pointer from you.
When I came to Python from Perl, it only took me about one day of Python programming to realize that Python does not have references the same way that Perl does. This is not flame bait. Example early questions that I had: (1) How do I create a reference to a string to pass to a function? (2) How do I create a reference to a reference? In the end, I settled on using a list of size one to accomplish the same. I use a similar trick in Java, but with an array of size one. In hindsight, it is probably much easier for junior programmers to understand the value and type system in Python compared to Perl. (Don't even get me started about the readability of Perl.) Does anyone still remember the 'bless' function in Perl to create a class? That was utterly bizarre to me coming from C++!
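
The "list of size one" trick mentioned above looks roughly like this (function and variable names are hypothetical):

    def bump(counter_box):
        counter_box[0] += 1   # mutate the box; the caller sees the change

    count = [0]
    bump(count)
    print(count[0])           # 1 -- a poor man's reference to an immutable int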


> Your first example has to do with the fact that tuples are copied by value, whereas lists are "copied" by reference.

My mental model for Python is that everything is '"copied" by reference', but that some things are immutable and others are mutable.

I believe that's equivalent to immutable objects being 'copied by value' and mutable ones being '"copied" by reference', but "everything is by reference" more accurately reflects the language's implementation.
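
A quick way to see that assignment never copies anything, mutable or not (a small sketch, names invented):

    a = [1, 2]
    b = a
    print(a is b)     # True -- same object, no copy was made

    s = "hello"
    t = s
    print(s is t)     # True -- also the same object; immutability just
                      # means the sharing can never be observed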


Yeah, I know that's how it works under the hood - and why you have things like all integers with values in [-5, 256] being assigned to the pre-allocated objects - but I don't think it's a particularly useful model for actually programming. "Pass-by-reference with copy-on-write" is semantically indistinguishable from "pass-by-value".


There is no copy on write and no pass by reference.

Python is "pass by value", according to the original, pedantic sense of the term, but the values themselves have reference semantics (something that was apparently not contemplated by the people coming up with such terminology — even though Lisp also works that way). Every kind of object in Python is passed the same way. But a better term is "pass by assignment": passing a parameter to an argument works the same way as assigning a value to a variable. And the semantic distinctions you describe as nonexistent are in fact easy to demonstrate.

The model is easy to explain, and common in modern programming languages. It is the same as non-primitive types in Java (Java arrays also have these reference semantics, even for primitive element types, but they also have other oddities that arguably put them in a third category), or class instances (as opposed to struct instances, which have value semantics) in C# (although C# also allows both of these things to be passed by reference).

The pre-allocated integer objects are a performance optimization, nothing to do with semantics.

The model is useful for programming, because it's correct. We know that Python does not pass by reference because you cannot affect a caller's local variable, and thus cannot write a "swap" function. We know that Python copies the references around, rather than cloning objects, because you still can modify the object named by a caller's local variable. We know that no copy-on-write occurs because we can trivially set up examples that share objects (including common gotchas like https://stackoverflow.com/questions/240178).
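
Those demonstrations look roughly like this (a sketch; names are hypothetical, and the last snippet is the gotcha from the linked Stack Overflow question):

    def swap(a, b):
        a, b = b, a            # rebinds the local names only

    x, y = 1, 2
    swap(x, y)
    print(x, y)                # 1 2 -- no pass-by-reference

    def add_item(items):
        items.append(99)       # mutates the object the caller also sees

    xs = []
    add_item(xs)
    print(xs)                  # [99] -- the reference was copied, not the object

    rows = [[]] * 3            # three references to one shared list
    rows[0].append("x")
    print(rows)                # [['x'], ['x'], ['x']] -- no copy-on-write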


> I don't think it's a particularly useful model for actually programming

I think “everything is by reference” is a better model for programming than “you need to learn which objects are by reference and which are by value”. As you say, the latter is the case in Go, and it’s one of the few ways the language is more complex than Python.

You could argue that in Python you still have to learn which objects are mutable and which are immutable - but if it weren't for bad design like `+=` that wouldn't be necessary. An object would be mutable if-and-only-if it supported mutating methods.


> I think “everything is by reference” is a better model for programming than “you need to learn which objects are by reference and which are by value”.

The model is: everything is a reference (more accurately, has reference semantics); and is passed by assignment ("value", in older, cruder terms).

> but if it weren’t for bad design like `+=` that wouldn’t be necessary. A object would be mutable if-and-only-if it supported mutating methods.

`+=` is implemented by `__iadd__` where available (which is expected to be mutating) and by `__add__` as a fallback (https://stackoverflow.com/questions/2347265). The re-assignment of the result is necessary to make the fallback work. Reference for your "major wtf" is https://stackoverflow.com/questions/9172263.
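
A small sketch of that dispatch, using a list (which defines `__iadd__`) and a tuple (which doesn't, so `+=` falls back to `__add__` and rebinds the name):

    xs = [1, 2]
    alias = xs
    xs += [3]            # list.__iadd__ mutates in place
    print(alias)         # [1, 2, 3] -- the alias sees the change

    t = (1, 2)
    alias_t = t
    t += (3,)            # no __iadd__: effectively t = t + (3,)
    print(alias_t)       # (1, 2) -- the original tuple is untouched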


SF city and county are actually the same legal entity, not just the same land. It's officially called the City and County of San Francisco, and it's just as unusual as it sounds. The mayor also has the powers of a county executive with both a sheriff's department (county police to run the jails) and police department (city law enforcement) reporting to him; the city government runs elections like other counties; the Board of Supervisors - which is the typical county legislative structure - also serves as city council. (Denver, Colorado works the same way, I think.)


Philadelphia is another example.


I don't think that's the point. If non-technical people are able to make a product happen by asking a machine to do it for them, that's fine. But they're not engineering. It simply means that engineering is no longer required to make such a product. Engineering is the act of solving problems. If there are no problems to solve, then maybe you've brought about the product, but you haven't "engineered" it.

I don't think that memorizing arcane Linux CLI invocations is "engineering" either, to be clear.


If I were to "build" the next big app entirely using llms, never writing a line of code, did I create it and do I own it?

If you answered yes that's really all that matters imo. Label me what you want.


If you hired people to build that product, you never wrote a line of code. No, you didn’t build it. Your team did. You’re not magically a software engineer, you hired someone else to do it.

Is there a product? Yep. Do you own it? Maybe. But again, you’re not suddenly the engineer. A project manager? Maybe.


> No, you didn’t build it

That's why I used the word create. I would be responsible for the creation of the product, so imo I created it. I'm the creator. It wouldn't exist without my vision, direction, and investment (of time and/or money).

Like a movie producer: they don't actually "build" the movie. They use their money to pay people to manifest a movie, and at the end of it they have created a movie and get a share of the profits (or losses) that come with it.

No, they shouldn't call themselves cinematographers, but they can say that they "produced" the movie and nobody takes issue with that.

> Do you own it? Maybe.

If I paid for it then absolutely I own it. I get to keep the future profits because I took the risk. The people that "built" it get nothing more than what I paid them for their labor (unless I offered them ownership shares).


i think people are trying to make this difficult when it’s honestly super simple.

yes, you can make a product. no, it does not suddenly magically make you a musician.

you did the equivalent of hiring someone else to do it. you did not do it.

if you claim you wrote the novel, you’re lying. someone else did. if someone takes credit for work someone else did, they’re lying. it’s honestly not complicated. at all.


you're not countering what i'm saying, so i think we agree.

i'm just adding that (as an "engineer") i don't care what you call me, or what i call myself, because nobody cares and it doesn't matter. i'm commodified labor. replaceable. with no claim on anything. and nobody will ever agree on the correct title anyway.

what actually matters imo is who the owner is.


yeah, it sounds like we both agree with the original post.

it literally doesn’t make someone an engineer.

it's not difficult to understand but for some reason when it's said it pisses certain people off.

i suspect many of the people upset want to convince themselves they’re suddenly magically a musician, architect, engineer, novelist, programmer, etc… when it just couldn’t be further from the truth. they’re just doing the equivalent of sending a dm to a coder friend and the friend is the actual programmer.

i think some people don’t appreciate being told the truth.


It's not just time-value. It's also not just tying/advertising (although it is some of that - if I'm getting a ton of "free" points to American, I'm more likely to fly with them). It's both of those, and so much more.

Loyalty points work like gift cards in that huge numbers of them go unredeemed for any value, so selling them is just printing money. And unlike gift cards, which are typically denominated in currency, airline points don't have a fixed exchange rate to USD, so the airline can sell them to Chase or whatever for $0.01, and then if it needs to rebalance the books to shed the outstanding liability it can easily adjust the point costs of flights to make them only worth $0.009 - it's the same as a price hike, but in a way that's less noticeable to most customers most of the time. And that's assuming they don't just sell the points at an outright profit to begin with.

You can find a number of analyses showing that airlines operate at a loss if you set aside the miles-economy revenue streams. United famously got a line of credit secured against their loyalty program in 2020, in which they and their creditors valued the loyalty program at more than the value of the entire company of United Airlines - which would naively imply that the actual airline, the part of the company that owns large expensive machines and actually sells a product to consumers, had negative value.

Here's a longer overview with numbers and sources - https://www.youtube.com/watch?v=ggUduBmvQ_4


Kind of like how GM's credit arm was briefly more profitable than the actual manufacturing.


If GM also owned an oil company and shares in Apple, those parts of it would also probably be more profitable than making cars.


The tradeoff on short domestic flights is that it encourages more - and larger - carry-ons, which slows down boarding/deplaning and therefore adds to turnaround time. If I didn't have to pay for checked bags, I'd often prefer to have mine checked, especially if I have a connection - but since I do, I'll squeeze everything into a carry-on roller bag instead. Personally, it only takes me an extra second or two, but when you have a whole family doing this and only one parent who can actually reach the overhead bins, it bogs down the whole aisle.


This is why I love it when airlines charge for carry-on bags, like Spirit does. Everyone just has a teeny little backpack. Getting on and off is a breeze.


That'd be a very impressive service record - Neptune is right around ten thousand times as far as the moon.


That's just short of 20 years' worth of use if Earth-Moon is your work-home commute; that's a pretty good analogy, actually.
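
Back-of-the-envelope, using the ten-thousand-trips ballpark from above and assuming a round-trip commute on roughly 260 working days a year (assumptions mine):

    ONE_WAY_MOON_TRIPS = 10_000      # Neptune is ~10,000 Moon distances away
    WORKDAYS_PER_YEAR = 260          # assumed: 5-day weeks, no holidays

    # a daily work-home commute covers two Moon distances (there and back)
    years = ONE_WAY_MOON_TRIPS / (2 * WORKDAYS_PER_YEAR)
    print(round(years, 1))           # ~19.2 -- "just short of 20 years"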


Extra impressive then, since you'd be making what's typically been a 6-day round trip every day!


At least in the remake, I think the dance minigame was randomized, but you could pretty easily play it perfectly. The dance partner does a little hand gesture to show you what to do next, and then you just hit the key in time with the music.


Yep, I believe the original didn't have the hand gestures.

