
ChatGPT to this day still doesn't have the single simplest feature: forking a chat from a message.

That's something even the most barebones open-source wrappers have had since 2022. Probably even earlier, since the ERP stuff people played with predates ChatGPT by about two years (even if it was very simple).

Gemini lacks it too, by the way.



Well, apparently three years later they finally did it. I'd asked about it so many times that I didn't even bother checking whether they'd added it.

Though I'm not sure they didn't sneak it in as part of an A/B test, because the last time I checked was in October and I'm pretty sure it wasn't there.


I believe they announced “branch in new chat” on Sept 5th, so you’re not far off.

ChatGPT has conversation branches. Or do I misunderstand?

Just edit a message and it’s a new branch.


I'm not aware of a feature to access the previous versions of a message after editing.

This is a big use case for me that I've gotten used to while using Open-WebUI: being able to easily branch conversations, or edit a message with information from a few messages downstream to 'compact' the chat history. It has a tree view, too, which works pretty well (the main annoyance is interface jumps that never seem to line up properly).

This feature has spoiled me for most other interfaces, because it is so wasteful, context-wise, to keep re-stating upstream assumptions while the context window drifts farther and farther from the initial goal of the conversation.

I think a lot more could be done with this, too - some sort of 'auto-compact' feature in chat interfaces that pulls the important parts of the last n messages verbatim, without 'summarizing' them (since in a chat interface the user's specific wording is often important and gets lost when summarized). A rough sketch of what I mean is below.
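Something like this, conceptually (a toy sketch of my own, not Open-WebUI's actual implementation; all the names here are made up): every message keeps a pointer to its parent, so branching is just attaching a new child to any earlier node, and compacting is replaying one path through the tree while dropping messages marked as stale - verbatim, no summarization.

    # Toy sketch of a branching, compactable chat history (hypothetical names).
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Msg:
        role: str                  # "user" or "assistant"
        content: str
        parent: Optional["Msg"] = None
        keep: bool = True          # mark False to drop this message when compacting

    def path_to(leaf: Optional[Msg]) -> list[Msg]:
        """Walk parent pointers back to the root; this is one branch of the tree."""
        out = []
        while leaf is not None:
            out.append(leaf)
            leaf = leaf.parent
        return list(reversed(out))

    def fork(at: Msg, role: str, content: str) -> Msg:
        """Branch the conversation by attaching a new message to any earlier node."""
        return Msg(role, content, parent=at)

    def compact(leaf: Msg) -> list[dict]:
        """Replay the branch verbatim, skipping messages marked as stale."""
        return [{"role": m.role, "content": m.content} for m in path_to(leaf) if m.keep]

The tree view is then just rendering each node's children, and the context waste goes away because only the compacted path ever gets sent to the model.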


The web app has < and > icons to flip between different branches.

I don't see them in the mobile app, though.


You can click the three dots on any response and choose "Branch in new chat". Not sure when it was added, but it exists.

Yeah, I was corrected above. Good, but not good that it took them three years.

This is a constant frustration for me with Gemini, especially since things like Deep Research and Canvas mode lock you in, seemingly arbitrarily. LLMs, to my understanding, are Markovian prompt-to-prompt - there is no hidden state between turns - so I don't see why this is an issue at all.
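To be concrete (a minimal sketch under my understanding of the usual stateless chat-completion request format; the helper name is made up): the client resends the whole message list on every request, so forking is literally just reusing a prefix of that list.

    # Minimal sketch: forking a chat is just reusing a prefix of the message list.
    history = [
        {"role": "user", "content": "Plan a trip to Norway."},
        {"role": "assistant", "content": "Sure - summer or winter?"},
        {"role": "user", "content": "Winter, for the northern lights."},
        {"role": "assistant", "content": "Then aim for Tromso in January..."},
    ]

    def fork_from(history, index, new_user_message):
        """Keep everything up to `index`, then diverge with a new user message."""
        return history[: index + 1] + [{"role": "user", "content": new_user_message}]

    # Two branches from the same point; neither one is "locked in".
    # Each branch is just the list you'd send as the next request.
    branch_a = fork_from(history, 1, "Actually, summer. Best hikes?")
    branch_b = fork_from(history, 1, "Stick with winter, but compare Iceland.")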

Now I'm curious: does this situation qualify as force majeure for major firms? "Hey, you know, actually our entire consumer base just disappeared overnight. Crazy, huh?" And will various governments have to intervene to save them when/if that happens?

Never mind that they'll all be busy handing money to OpenAI - at some point someone somewhere has to notice that something is very wrong.


And what if they just do it anyway? What are they going to do, sue them? Make them scrub every git repository on the planet?


It would be easy to prove that it's not technically possible, since Git is decentralized. But fines... oh, those fines could be enormous. Possibly, AMD could get barred from implementing HDMI at all - all the HDMI Forum has to do is stop selling the spec to AMD specifically.


Both are deprecated, though. And both say something unexpected in their repositories: one suggests you use Docker Desktop (what?!), the other that you try Fedora (what?!!). Am I taking crazy pills?


So much this. People don't realize that when 1 trillion (10 trillion, 100 trillion, whatever comes next) is at stake, there are no limits to what these people will do to get it.

I will be very surprised if there aren't at least several groups or companies scraping these "smart" and snarky comments to find weird edge cases they can train on, turn into a demo, and then sell as an improvement. Hell, they would've done it if 10 billion were at stake; I can't really imagine (and I have a vivid imagination, to my horror) what Californian psychopaths would do for 10 trillion.


That's fine - good, even. AFAIK, for at least some of these tasks, dev teams do a lot of manual tuning of the model (it's rumored that counting the r's in "strawberry" was "fixed" this way, as a general case of course). The more random standalone hacks there are in the model, the more likely it is to start failing unpredictably somewhere else.


I'm not worried about it, because they won't waste their time on it (individually RL'ing on a dog with 5 legs). There are fractally many ways of testing this inability, so the only way to fix it is to solve the problem wholesale.

Similar to the pelican-on-a-bike SVG: the models that do well at that test do well at all SVG generation, so even if they are targeting that benchmark, they still have to make the whole model better to score better.


So if JavaScript were made on this day, today, it would be named AIScript? Got it. Totally hard to understand the level of hype.


Then I sure hope Nvidia completely ceases to exist, like SGI, which, ironically, was decimated by Nvidia and cheap consumer hardware.


Unlikely. Unless some new technology comes around that completely invalidates existing GPUs and Nvidia cannot pivot to it quickly enough, there's just no way. They're too big, too rich, too powerful. They basically own the dedicated GPU market, with AMD holding maybe a piddly 10% at best.


So, when will someone fork it? Call it MaxIO or whatever. I might even submit a couple of small patches.

My only blocker is that a fork should maintain compatibility and an upgrade path from earlier versions.


To be fair, their previous behavior and attitude towards the open-source license suggest that MinIO would possibly engage in at least a little bumptious legal posturing against whoever chose to fork it.


This is nauseating. I hope someone forks KDE into, like, KDElive or something so I can keep using X11 and not have to bother with that piece of junk, Wayland.

The whole thing is the wrong direction for free software and is being pushed by a couple of big corps in their own interests.


I had a simple Proxmox/k8s cluster going, and fitting the nodes with RAM was the last item on my list. It was cheapo ol' DDR4.

Where I live, the price for my little cluster project has gone up from around ~$400 in July (for a 5-node setup) to almost $2,000 right now. I just refreshed the page and it's up another 20% day over day. Welp. I guess the nodes are going to stay with 8 GB sticks for a while.

