Is it for those who spend on anything to increase their self-worth? I don't see how else it could be useful. And these color palettes are actually pretty awful.


Nice addition.

Slightly more CPU-heavy than Kitty.

On macOS the yabai tiling manager has problems dealing with native tabs; that's gonna annoy me, just like with Finder.

Also on macOS, try pressing "Cmd + Shift + \"; it works in Finder too. It's there, it's cool, but I never use it.


That's neat but how do I select a panel afterwards? Arrow keys do nothing and hjkl start a search.


On my system it's ctrl+tab to cycle through tabs, then escape to open the one that is currently highlighted.


I was once foolish enough to upload a lot of personal photos to what was then Picasa Web Albums, integrated with the desktop Google Picasa software, back in 2007, but deleted all of them years later. To this day I keep wondering whether Google still keeps all those photos somewhere in a data lake.

https://en.wikipedia.org/wiki/Picasa_Web_Albums


Just use the programming language to build itself; it is even possible with C [0].

[0] https://github.com/tsoding/nob.h
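
For reference, a minimal nob-style build program looks roughly like this; the macro and function names are what I recall from a recent nob.h, so check the repo for the exact API:

    // nob.c -- bootstrap once with `cc -o nob nob.c`, then just run ./nob
    #define NOB_IMPLEMENTATION
    #include "nob.h"

    int main(int argc, char **argv)
    {
        // recompiles this build program itself whenever nob.c changes
        NOB_GO_REBUILD_URSELF(argc, argv);

        Nob_Cmd cmd = {0};
        nob_cmd_append(&cmd, "cc", "-Wall", "-Wextra", "-o", "main", "main.c");
        if (!nob_cmd_run_sync(cmd)) return 1;

        return 0;
    }

The only bootstrap step is compiling nob.c once with any C compiler; after that the build program keeps rebuilding itself.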

If it is painful, ditch that language.


Very annoying marketing, and it pretends to be anything other than just a wrapper around llama.cpp.


Can you ollama haters stop with this bullshit?

Does llama.cpp do dynamic model loading and unloading? Will it fetch a model you request but haven't downloaded? Does it provide SDKs? Does it ship startup services? There's space for things that wrap llama.cpp and solve many of its pain points. You can find piles of reports from people struggling to build and compile llama.cpp for one reason or another who then clicked an Ollama installer and it worked right away.
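
To make the load/unload point concrete: that is exposed through Ollama's HTTP API, so a client never has to care which model is currently resident. A rough libcurl sketch, assuming a local Ollama on the default port 11434 and an already-pulled model (the model name here is just an example):

    /* POST to Ollama's /api/generate: Ollama loads the model on demand if it
     * is not in memory, and "keep_alive": 0 asks it to unload the model right
     * after responding. Build with: cc ollama_demo.c -lcurl */
    #include <curl/curl.h>

    int main(void) {
        curl_global_init(CURL_GLOBAL_DEFAULT);
        CURL *curl = curl_easy_init();
        if (!curl) return 1;

        const char *body =
            "{\"model\":\"llama3\",\"prompt\":\"Say hi\","
            "\"stream\":false,\"keep_alive\":0}";

        struct curl_slist *hdrs =
            curl_slist_append(NULL, "Content-Type: application/json");

        curl_easy_setopt(curl, CURLOPT_URL, "http://localhost:11434/api/generate");
        curl_easy_setopt(curl, CURLOPT_HTTPHEADER, hdrs);
        curl_easy_setopt(curl, CURLOPT_POSTFIELDS, body);

        /* libcurl's default write callback dumps the JSON response to stdout */
        CURLcode res = curl_easy_perform(curl);

        curl_slist_free_all(hdrs);
        curl_easy_cleanup(curl);
        curl_global_cleanup();
        return res == CURLE_OK ? 0 : 1;
    }

llama-server, by contrast, loads the model you give it at startup and keeps it resident for the life of the process.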

It's also a free OSS project giving all this away; why are you being an ass and discouraging them?


Sure, llama.cpp does not do all of that, except that it does let you fetch a model from public, free-to-use endpoints; it does that. But an SDK? Fuck that. Load, unload, and startup services: who even needs that? All this value is minuscule compared to the core functionality provided by ggml/llama.cpp.

But this submitted link is not even about all of that; it is about the one thing llama.cpp really does not do: write more lines of marketing material than lines of code. That is what this marketing is: copy for lines of code that really just wrap 10x more lines of code underneath, without making that clear as day.


It's not even worth countering all these lies you're telling. Enjoy your self-inflicted irrational misery.


They're going to go corporate


Only after using ungoogled-chromium did I realise how dumb the Chrome omnibar is without Google autocomplete glued in: it is not configurable at all, and local search over visited/bookmarked pages simply does not work.

It is abysmal compared to LibreWolf (Firefox),

but Chromium feels faster and handles video files better than Firefox.


If you let users watch two videos and pick which one is more interesting, this will go very bad in no time, as it did with Facemash, Zuckerberg's early Hot-or-Not-style site.


What would be the ChatGPT equivalent for world models, the thing that makes their utility really blow up the way it did for LLMs?


Text-to-Roblox, maybe?


This will turn into a gamified form-filling habit for no profit; what is the point?

If you mechanically open your phone, at least do something useful in it:

read a quote https://github.com/jameshnsears/QuoteUnquote

track a habit https://github.com/iSoron/uhabits

learn vim https://play.google.com/store/apps/details?id=develop.exampl...

c++ quirks https://github.com/vsklamm/CppQuiz

or else


Imagine the butthurt of US hawks if this were a China- or Russia-based product; think about American Exceptionalism.


All three of them pose an actual security risk, but as a first-world-country citizen I'm more than sure I'd want the American one out of the trio.

