Hacker News | freetonik's comments

I was also suspicious of Cloudflare as a full platform, but now it's one of my favorite ways to develop and scale web applications. I have implemented Minifeed[1] (and Exotext[2]) entirely in Cloudflare Workers (except for full-text search, for which I use a self-hosted instance of Typesense; though in my testing, Cloudflare's D1 database does come with full-text search enabled: it's SQLite-compatible, and it works well!).

I also didn't want any kind of rich frontend layer, so all my HTML is generated on the backend. I don't even use complex templating libraries; I just have a few pure functions that return HTML strings. The only framework in use is Hono, which just makes HTTP handling easier, although the standard handlers Cloudflare offers are fine too; they take maybe 2-3 times more lines of code compared to Hono.
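To illustrate, the "pure functions returning HTML strings" approach can look something like this. This is only a rough sketch of the pattern, not Minifeed's actual code; `escapeHtml`, `renderPost`, and `renderPage` are names I made up:

```typescript
// A value type for whatever the page renders (illustrative).
interface Post {
  title: string;
  url: string;
  summary: string;
}

// Escape user-controlled text before interpolating it into HTML.
function escapeHtml(s: string): string {
  return s
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;");
}

// A pure function: Post in, HTML string out. No templating library needed.
function renderPost(p: Post): string {
  return `<article>
  <h2><a href="${escapeHtml(p.url)}">${escapeHtml(p.title)}</a></h2>
  <p>${escapeHtml(p.summary)}</p>
</article>`;
}

// Compose small functions into a full page the handler can return as-is.
function renderPage(title: string, body: string): string {
  return `<!doctype html><html><head><title>${escapeHtml(title)}</title></head><body>${body}</body></html>`;
}
```

A handler (whether Hono's or a plain Workers `fetch`) then just returns `new Response(renderPage(...), { headers: { "content-type": "text/html" } })`.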

D1 is a fine database. Queues are fantastic for my purposes (cron-scheduled fetches of thousands of RSS feeds). The vector database is great: I generate embeddings for each fetched blog post and store them there, which allows me to generate "related" posts and blogs. R2 is simple S3-compatible object storage, though I don't have many files to store. Deployments and rollbacks are straightforward, and the SQLite database even has time travel when needed. (I've also tried Workflows instead of Queues, but found them unstable while in open beta; I haven't tried them since they became generally available.)
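The cron-to-Queues fan-out looks roughly like this. Again a hedged sketch, not my actual code: `enqueueFeeds` and the feed URLs are invented, and the real queue binding would come from wrangler config rather than the interface stubbed here. (At the time of writing, Cloudflare's `sendBatch` accepts at most 100 messages per call, hence the chunking.)

```typescript
// Minimal stand-in for the Cloudflare Queues producer binding.
interface Queue<T> {
  sendBatch(messages: { body: T }[]): Promise<void>;
}

// Split a list into fixed-size chunks (sendBatch has a per-call message limit).
function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}

// Called from a cron-triggered handler: push every feed URL onto the queue
// in batches; a separate consumer Worker then fetches each feed.
async function enqueueFeeds(feedUrls: string[], queue: Queue<string>): Promise<number> {
  const batches = chunk(feedUrls, 100);
  for (const batch of batches) {
    await queue.sendBatch(batch.map((url) => ({ body: url })));
  }
  return batches.length;
}
```

The consumer side is just a `queue(batch, env)` export that fetches each URL and acks the message, with retries handled by the platform.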

I know this might sound like an ad or something; I have nothing to do with Cloudflare. In fact, I couldn't even get through to the initial interview for a couple of their positions :/ It's just that I've always had this cloud over my head every time I needed to create and maintain a web project. The Ruby on Rails + Heroku combo was probably the easiest in this regard, abstracting away most of the stuff I hate dealing with (infra, DB, deployment, etc.), but it was still not as robust and invisible, and also pricey (Heroku). Cloudflare Workers is an abstraction that fits my mindset well: it's like HTTP-as-a-service. I just have to think in terms of HTTP requests and responses, while the other building blocks are provided to me as built-in functions.

Minifeed has been chugging along for 2+ years now with almost 100% uptime, while running millions of background jobs of various types. And I didn't have to think about separate services, workers, scaling, and so on. I am well aware of how vendor-locked-in the project is at this point, but I have never enjoyed web development as much as I do now.

The only two big missing pieces for me are authentication/authorization and email. Cloudflare has an auth solution, but I think it's designed for enterprise; I just didn't get it and ended up implementing the simple old-school "tokens in DB + cookie" approach. For email, they have announced a new feature, so I hope I can migrate away from Amazon SES and finally forget the nightmare of logging into the AWS console (I have written step-by-step instruction notes for myself that feel like a "how to use the TV" note for an old, technically unsavvy person).

[1] https://minifeed.net/

[2] https://exotext.com/


> with the claim that we should move from “validating architecture” to “validating behavior.” In practice, this seems to mean: don’t look at the code; if tests and CI pass, ship it.

But... tests and CI are also code. They may be buggy, they may not cover enough, etc. They are also likely written by an LLM in this scenario. So it's more like a move from “validating architecture” to “LLM-based self-validation”.


Not wide in the number of sources (yet), but I'm curating a directory/reader/search engine of personal blogs, and the "Global" view shows the latest posts across 1300+ feeds: https://minifeed.net/global

Super cool, I might want to connect this to my swipe engine [1] (see comment below). Since you have already curated so many feeds this is the source I was looking for (and trying to build myself [2]). I’ll shoot you an email.

[1] https://philippdubach.com/standalone/rss-tinder/ [2] https://news.ycombinator.com/item?id=46602227


Feel free to grab the ones I have collected over at https://blogroll.org. There's an OPML export option that includes only the ones with an RSS feed listed.

Is there a feed of your global feed? A meta feed? If I download an OPML, it will just go out of date, right?

A feed of the posts of the global feed, or a feed of newly added blogs?

Either way I was thinking to implement those.


A feed of the posts from the global feed

How can one submit their blog?


>If you are writing complex web applications with state, local processing of data and asynchronous interactions it's not enough.

>Next objection usually is: do you need complex apps on the client?

It's not even an objection, it's a question I ask and almost never hear a coherent answer to. The vast majority of web applications I use every day (online banking, GitHub, forums, social media, admin interfaces of various developer tools, etc.) don't really need to be dynamic and frontend-rich. I don't care if submitting a form refreshes the page. Funnily enough, a full page refresh with a full round trip on "old school websites" is often faster than a dynamic SPA interaction.

I don't care that when I click "delete", the item may not disappear from the screen immediately. I don't want to see some in-between state descriptions like "Deleting..." because I know it's a lie in a distributed, eventually consistent system. Just tell me the truth: the request has been sent. I can then refresh the page and see the new current state, whatever it is.
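The "just tell me the request has been sent" flow is essentially the classic POST/redirect/GET pattern. A rough sketch of what such a handler could look like (paths, names, and the injected `deleteItem` callback are all illustrative, and in practice the delete could just as well be enqueued):

```typescript
// Handle a plain HTML <form method="post"> delete: perform (or enqueue) the
// delete, then 303-redirect back to the list page. The browser's fresh GET
// of the list then shows whatever the current state actually is.
async function handleDelete(
  request: Request,
  deleteItem: (id: string) => Promise<void>,
): Promise<Response> {
  const form = await request.formData();
  const id = String(form.get("id") ?? "");
  await deleteItem(id);
  // 303 See Other tells the browser to re-GET the target instead of re-POSTing.
  return Response.redirect("https://example.com/items", 303);
}
```

No client-side state to reconcile: the page you land on after the redirect is the truth as the server currently knows it.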

I really don't understand this desire to make websites behave like local apps while in reality they aren't.


> it's a question I ask and almost never hear a coherent answer to.

There are a lot of coherent answers though.

One is that responding with HTML encumbers the server with brittle UI over the wire when it could instead be a simpler API server that any client can talk to.

Returning data instead of UI from the server is a clean separation of concerns.

There's nothing incoherent about that.


That's the theory. In practice, if your UI is changing a lot, the data your UI needs is also changing a lot, meaning your data API will either have a lot of churn or you'll have to allow a lot of flexibility in how untrusted clients can use it, which introduces its own pile of issues.

> I don't care that when I click "delete", the item may not disappear from the screen immediately.

The disconnect here between tech people and non-tech people is that most users do care about stuff like this.

I run a popular website as a solo project so all the feedback/complaints are routed to me, and one thing I've learned is that users really don't want websites to "feel old". Sure, they want it to be fast, but they also want all the bells and whistles like loading indicators and animations.

If you show Hacker News to someone who's not a developer, especially if they're under 30-35, their reaction to the layout and functionality will be visceral disgust. I really can't stress enough how much modern users hate the traditional plain HTML look. If you're trying to convince users to use your site and it looks or functions anything like HN, they'll get angry and close the tab within seconds to look for an alternative. Even if you've made a SPA with plenty of bells and whistles, users will still get upset if anything feels "clunky", which is often user-speak for "this component needs animations and a transition state". They don't know or care that all the fancy stuff increases the complexity of the codebase.

Every software project hits a point where the super clean abstractions the developers came up with start to clash with the messy way it's used in the real world. This is the frontend version of that. We have no choice but to give users the UX they want.


> I really don't understand this desire to make websites behave like local apps while in reality they aren't.

Since the dawn of time, humans have sought to eliminate mushy laggy UIs, such as having to wait for a full page reload whenever you click a button. I don't like SPAs either, but dunno, I don't find it particularly hard to understand how we got here.


Immediate feedback informs the visual language in order to convey meaning in an easier way to a larger public. You may know what an eventually consistent system is, but many users don't, and they want visual information abstracted into something they can understand. It's reassuring.

Also, not everything can be reduced to static forms. Charts with knobs, drag-and-drop interfaces, and interactive diagrams are all useful visual aids that you would like to erase because... they don't conform to your naive views on how things should look?


Surprisingly, Linux-based distros often have a better chance of running old games than Windows does.

Homepage https://rakhim.org/ (with links to my books, talks, projects, and social media)

Blog https://rakhim.exotext.com/


Continuing to work on Minifeed (https://minifeed.net/), a directory, reader, and search engine for personal blogs. The indexing & searching backend is Typesense, and I'm moving from their paid managed cloud service[0] to a self-hosted VM. It was very easy to start with the reasonably priced managed service, but with the number of feeds/posts growing, I've decided to self-host. I'm also using Typesense Dashboard, a nice visual tool for basic administration[1].

Overall, Minifeed keeps chugging along, fetching new posts every day from almost 2k feeds. I'm hoping to find some nice and ethical monetization strategy for it this year.

[0] https://cloud.typesense.org/

[1] https://github.com/bfritscher/typesense-dashboard


Hey OP, I hope you're doing better (I just read some of your earlier posts in the blog).


My blog directory/search engine [1] runs on Cloudflare Workers as well. I was able to get pretty good results, too. For example, the listing of 1200+ blogs [2], each with its 5 latest posts, loads in ~500ms. A single post with a list of related posts loads in ~200ms. Yeah, it's still a lot, but it includes all the normal web app logic like auth middleware, loading user settings, and state; everything is rendered server-side, with no rich frontend code (apart from htmx for a couple of buttons that make simple one-off requests like "subscribe to blog" or "add to favorites"). A static page (like /about) usually loads in ~100ms.

This is a bit stochastic because of regions and dynamic allocation of resources. So, e.g., if you're the first user from a large geographic region to visit the website in the last several hours, your first load will be longer.

My other project (a blog platform) contains a lot of optimizations, so posts [3] load pretty much as fast as that example from the thread, i.e. 60-70ms.

1. https://minifeed.net/

2. https://minifeed.net/blogs

3. https://rakhim.exotext.com/but-what-if-i-really-want-a-faste...


Looks like Unity code. Not sure if it's Visual Studio or VS Code, but yeah, it was baffling to me how weirdly bad C# support is in either IDE. Maybe something is wrong with my setup, but autocompletions indeed suck (in addition to just-wrong picks, the editors often suggest a symbol that doesn't make sense from a typing perspective, as if there weren't any language servers or IntelliSense or whatever).

VS Code would also eat the curly brace at the end of a class declaration when auto-generating a method skeleton.

I gave up and installed Rider. So far so good.


They say it's VS Code in the article. I can't say I've seen anything that egregious happen with Unity in Visual Studio.

It's stuff like this, though, that keeps me from using VS Code for code editing (I use it only for editing Markdown and JSON files). I guess I don't know what I'm missing, but it's never been a smooth experience for me. If I'm on Windows, I tend to stick with Visual Studio.

Maybe I should consider rider...

