Hacker News | brownjohnf's comments

> I think the point is just that for most applications, the reverse proxy server is not a performance bottleneck

I'm not commenting either way on Caddy vs. other solutions, but whether or not a reverse proxy is a pure performance bottleneck, it can become a cost issue. If a reverse proxy is capable of handling twice as much traffic as another solution (through some combination of simultaneous connections and raw speed), it'll cost half as much to operate. Especially at scale, those costs can really matter.

Raw speed for its own sake is only sometimes the most important factor.

Edit: grammar.


For me, reverse proxy capacity is more about surviving machine failures. I'll need N+2 of them per region per zone regardless of how efficient they are. For my simple personal site, I run Envoy on 3 machines limited to 64M of RAM and it easily supports 10,000 qps per instance with many more concurrent connections (for clients downloading the requested document slowly). One instance alone is enough for all the capacity I desire (and I have rate limits to prevent one IP from using more than its fair share of the limited capacity), so I pay for 128M of RAM that I don't need simply to survive VM failures during a deployment.

I guess my point is that even an inefficient proxy is going to be light on resource usage, and you will always need extras. At some scale, the inefficiency matters, but at most scales it really doesn't. So if Caddy is easy to operate, I'd say go for it. (But personally I use envoy + cert-manager. More flexible and less magic.)
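
As an aside, the per-IP fair-share limiting mentioned above is essentially a token bucket. Here's a rough, standalone sketch of the idea in Rust (hypothetical code for illustration only; in practice you'd configure the proxy's built-in rate limit filter rather than hand-roll this):

    use std::collections::HashMap;
    use std::net::IpAddr;
    use std::time::Instant;

    struct Bucket { tokens: f64, last: Instant }

    struct PerIpLimiter {
        burst: f64, // bucket capacity: short burst allowed per IP
        rate: f64,  // sustained requests/sec allowed per IP
        buckets: HashMap<IpAddr, Bucket>,
    }

    impl PerIpLimiter {
        fn allow(&mut self, ip: IpAddr) -> bool {
            let now = Instant::now();
            let b = self.buckets.entry(ip)
                .or_insert(Bucket { tokens: self.burst, last: now });
            // Refill in proportion to elapsed time, capped at the burst size.
            let elapsed = now.duration_since(b.last).as_secs_f64();
            b.tokens = (b.tokens + elapsed * self.rate).min(self.burst);
            b.last = now;
            // Each admitted request spends one token; an empty bucket rejects.
            if b.tokens >= 1.0 { b.tokens -= 1.0; true } else { false }
        }
    }

    fn main() {
        let mut limiter = PerIpLimiter {
            burst: 10.0, rate: 100.0, buckets: HashMap::new(),
        };
        let ip: IpAddr = "203.0.113.7".parse().unwrap();
        assert!(limiter.allow(ip)); // a fresh IP starts with a full bucket
    }

The nice property is that a single client can burst briefly but can't sustain more than its configured share, which is what keeps one IP from eating the whole 10,000 qps budget.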


When I started out with tmux (and I echo the parent comment’s sentiment about it being a core piece of my workflow) I found https://pragprog.com/titles/bhtmux2/tmux-2/ to be extremely useful. It’s very brief and digestible, and sets you up to make any further customizations you want.


> In safe languages like C# or Java there’s no unsafe anywhere, not even in standard libraries

My (limited) understanding of the unsafe keyword in Rust is that it’s just indicating that the compiler cannot guarantee that the block is safe, not that it’s necessarily dangerous. By this standard, every single line of any C program is unsafe. Depending on the guarantees of other languages, this is true to varying degrees. I’m not familiar enough with Java and C# to comment on them specifically, but I’m not sure they try to provide the same guarantees that Rust does.

I think all Python, Ruby, JS, etc. would be considered unsafe. The unsafe keyword in Rust seems more like the equivalent of saying “my static analyzer couldn’t guarantee the safety of this bit; it may or may not be totally fine”.
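
For illustration, a minimal (hypothetical) Rust sketch of what the keyword actually marks; note that creating the raw pointer is safe, and only the unchecked dereference needs the block:

    fn main() {
        let x: u32 = 42;
        let p = &x as *const u32; // making a raw pointer is safe
        // The compiler can't verify this dereference, so we assert
        // its soundness ourselves; this particular use is perfectly fine.
        let y = unsafe { *p };
        assert_eq!(y, 42);
    }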

That said, I do agree that unsafe blocks can be red flags. At least they can help guide you in where to start looking for potential issues.


The design of a hammer has actually changed substantially over time. More recently, there's been a trend towards specialization in the framing hammer alone. 100 years ago most hammers looked like the classic hammers I remember from childhood: relatively small, sharply curved claws, a smooth face. As production framing exploded in the US after WWII, framers were pushing for more efficiency.

The framing hammer got heavier, with a longer handle, broader face, straighter claw, and a waffled face that grips nails better (which is also why most loose framing nails have a cross-hatched pattern: so they can mate with the hammer face). From there, materials science really kicked in, and we saw steel-handled models, followed by fiberglass and other composite handles.

The latest developments (that I'm aware of) are products like the Stiletto (http://www.stiletto.com/p-80-ti-bone-iii-hammer-with-milled-...), which leverage materials like titanium to reduce weight while maintaining driving power, and include a replaceable steel face to prolong hammer life and allow using different faces for different applications.

Modern hammers with advanced material properties and functions can cost hundreds of dollars, but deliver much higher efficiency with less fatigue and a longer life. I compare that with the Sears hammer in my grandfather's garage and see a whole new generation of evolution.

There's a great article about hammer history at Fine Homebuilding: https://www.finehomebuilding.com/project-guides/framing/hamm...

edited: Fix link formatting.


Noted. What has been stable for a long time and remains frustratingly static and unrefined is the field of surgical instruments. There are a few exceptions, but most haven't changed in the 20 years I have been a surgeon. The regulatory costs of developing new instruments and the inertia of manufacturers don't help. Once a manufacturer gains market share, they try to keep it by not innovating, which keeps their development costs down; the consumer volume is comparatively low. I don't get the feeling there is any sense of patient altruism.

I have been using the same 4 mm diameter, 25 cm long endoscope since 2007, when the first iPhone came out. It is very simple, made of glass fibres, with a Sony 3-chip camera that is 10 years old. We get some very slow progression on video output and have just got 4K. Compared to consumer tech, the advances are ridiculously slow. We need a camera (the iPhone camera is small enough) on a steerable stick. How hard an engineering task can that be?


A couple of weeks ago I noticed we had several active farmers here, today a surgeon with twenty years of experience :-)

I'm happy to see so many different people here in addition to programmers, techies, VCs, etc.


> The regulatory costs of developing new instruments and the inertia of manufacturers don't help.

I work on (mostly non-critical) medical devices, and I would like to point out that regulation is there for a purpose.

If anything, the 737 Max fiasco should remind people what can happen when regulation is treated as a cost that needs to be cut, in a field where there can be real hazards and where some people in the chain do not have the best interests of the public in mind.

And yes, in the medical industry too, there can be some people who care more about optimizing profits than about patients.

Maybe some regulatory rules are undue, but hopefully they are not the majority.

In the case of what I work on, regulation does not prevent us from using state-of-the-art CPUs and GPUs. R&D may need somewhat longer cycles than consumer electronics to get a return on investment, but let's be honest: it is not scandalous to get some features a few years after similar things appear in consumer electronics, especially in cases where improving X or Y has diminishing returns.

And yes, it is easier to build things when you don't have to care about, e.g., the ability to disinfect materials, and when you can tolerate more bugs.


Half of that is the major manufacturers, and the other half is the FDA. These huge companies, which can afford to spend lots of money, intentionally get the FDA to set a high bar with lots of expensive testing in the regulations. This creates a HUGE barrier to entry. Right now I'm working with a pathology lab that is buying a pathology slide scanner. This is basically a slide-handling robot with a high-end camera hooked to a PC with an image viewer. The combined system costs over $300,000 from companies like Leica and Philips. The software is incredibly basic; it's essentially ThumbsPlus with IrfanView, and that's $100,000 alone, aside from the scanner. But they charge that because it's incredibly expensive for a startup to come in and challenge their pricing.


I don't suppose you'd count something like the da Vinci robots as a step forward? I got to try out a demo unit once (on a dummy! not on a real human), really cool stuff. I'm really glad I tried it _after_ I had my laparoscopic appendectomy - it was somehow a little terrifying how clumsy it felt holding the "regular" tools used, compared to using the robot.


Yes, though they have been around now for about 20 years, are big (no good for e.g. neurosurgery or ENT), and cost $2m. You don't need that for a lap appendicectomy: just a better scope and a good surgeon. Robots have no advantage over human dexterity. They have a few niche roles, e.g. where we can't get our hands in, such as the prostate.


The metaphor of a hammer has barely changed over time. What you're describing are implementation improvements - not trivial, but you could take a modern hammer back a few centuries and it would still be recognisable as a hammer. Albeit a very unusual one.

UX/UI has the same issues. Everything is a metaphor anyway. You don't get to choose whether your interface is a metaphor, because there is no other option for interfaces. You only get to choose the type of metaphor, and its affordances - from hand-editing binary in a "file" (...which is also a metaphor) to voice recognition.

There are some good points in the article, but they're maybe 10% of the way to a full understanding of this issue. Most of the complaints are about inconsistencies and expert-level operation (written scripting) vs beginner-level operation. But there's also a point about contextual metadata.

Modern operating systems are pretty bad at all of the above, but that's because designing intuitive and powerful interfaces that incorporate expert-level features with some workable built-in intelligence - and preferably some form of composability - is incredibly hard.

It's so hard it's barely been attempted, never mind done successfully. So most operations in userland are explicitly task-oriented. Their settings are customisable, but not the menu of operations on offer.

As a non-expert if you want to rename a folder full of files, you buy a file renamer. You don't try to write a script, because even trivial scripting requires a level of comfort with abstractions that most users simply don't have.

Experts do have that skill level, but they can already use $scripting_lang.

It's possible to imagine an OS that would be more open-ended and wouldn't silo data inside task-oriented applications. But this runs into huge problems with efficient representations and the most appropriate schema for each domain, and just the concept on its own is far outside anything most users would want to deal with.


That’s a really good counterexample, because it’s still easily and immediately recognisable as a hammer. Anyone picking it up knows how to use it without even thinking about it. In fact, they might not even consciously notice the changes and how they improve the tool, except that it seems better. That’s great incremental UX improvement.

A lot of the critics of the lack of innovation in computer UX design don’t want incremental improvements to the existing building blocks of modern UIs; they want to tear it all down and start from scratch. They want VR interfaces, Jeff Raskin’s The Humane Environment, button-less UIs, etc. They don’t care about better hammers; they want nail screwdrivers or handle-less hammers.


I think this supports the OP's argument that we're refining, not inventing. Everything you've said here is about refining; it's not a completely new way of attaching materials together.


Yeah, I probably should have clarified that I intended no comment on the OP's point about innovation in computer UI. I just wanted to point out that there's actually a surprising amount of evolution in the design of the hammer.

Something that seems to be so simple, and has existed for thousands of years, can still be made better. I'm not a professional carpenter, but I've used a hammer a lot to do things like framing, and can confirm that many of these innovations are meaningful in function, not just form.


Going back to the OP, though, which talked about settling on the WIMP model: you're not really contradicting their point.

If you take a hammer from 1920 and lay it next to the most jazzed up hammer from 2020, they would be recognised as the same tool/having the same general purpose. A carpenter from 1920 wouldn't need to change the way he used a hammer if he picked up the 2020 model, even if the 2020 model might enable new ways of actually using it (or improve old ways of using it).

So while there is evolution and development going on, we're not replacing the hammer metaphor as it were.

The WIMP model has also seen evolution and refinement, but it's still recognisable as the same model. I think the analogy holds.


It may be splitting hairs, but I think certain changes to hammers certainly qualify as invention, including the crosshatch innovation, the materials science involved in both fiberglass and titanium handles, and the improved weight distribution. It's not clear to me what would be considered a complete reimagining of the hammer, as a hammer is such a broad category of tool. Is a mallet a hammer? When I smack something with the backside of an impact driver, is it a hammer? I sure use wrenches as hammers occasionally.

So, what's the line between inventing and refining?


A slide hammer, a dead blow hammer, a flooring hammer, a nail gun, a staple gun, liquid nails/glue, screws and a screw driver (powered or not), a jackhammer, a power chisel.

If the idea behind a hammer is to use the momentum of a relatively large mass to drive a relatively small mass into a material, then a piece of steel on a handle is just the simplest thing you can manufacture: admittedly a versatile one, but not necessarily the best one. If your task as the user is to join two materials together, the solution won’t necessarily even look like a hammer and nail (glue, screws). If the goal is to separate material, like you might with a chisel, then depending on the material you might not be using a manual hammer but something that looks very different, like a saw, a file, a jackhammer, etc.

What the person who mentioned the evolution of framing hammers is pointing out is refinement of the hammer as it is. Creating a tool better suited to the user’s task is closer to what TFA is about.


The marketing terms here are continuous innovation (that doesn't change user behavior) vs discontinuous innovation (that changes user behavior).


I just want to say thank you for posting this. I had no idea about the evolution of hammers, and this little bit of depth from an unfamiliar domain brightened my day :)


Right, but a hammer is still a hammer, right?

It's not like we're bashing nails with bricks and calling it an MVP replacement to the hammer.

What you described isn't innovation, it's iteration.


They're actually not that different. Specialization notwithstanding, it's a handle with a perpendicular weight at the end. About 40% of the way down this page is one from 20,000 years ago, instantly recognizable as a hammer: http://oldeuropeanculture.blogspot.com/2015/12/baba-hammer-a...


And furthermore, most nails aren't driven with a hammer at all, since the invention of the nail gun.


“Longer this, shorter that, better materials” is refinement, not “substantial redesign”.

The point is the essential form and function has remained the same for thousands of years. Knives, forks, spoons etc. are still knives, forks, spoons etc.


Thanks. I feel I know everything about hammers now.


I think the common terseness of many of the core suite of original Unix tools actually reflects a strong focus on human, not machine, ergonomics. I still appreciate the speed and ease of typing them, and like many other aspects of the CLI, it's optimized for users who know it well and use it heavily. Once you're familiar with the names, it's not challenging to remember that mv = move, wc = word count, etc. Terminals of the era also still actually printed mechanically, so keeping command length short was a major ergonomic win for round-trip speed.

As a sibling comment mentions, these commands were (are) commonly composed into scripts. As the name implies, however, a script is just a playbook for a series of commands to run. Given the terminals of the era, I'm sure short commands/variables/etc. were appreciated in scripts as well, but it seems to me that the primary motivation for optimizing input speed would be the use of these commands in an interactive environment.

A few examples of these core short program names: ls, cat, cp, rm, wc, uniq, cmp, diff, od, dd, tail, tr, etc.


I find it interesting how far GUI and CLI drift apart in this area. Powerful GUI software for specialised tasks is often overloaded with buttons and toolbars everywhere, because the user needs to be able to click them. The terminal is the complete opposite: instead of clicking through menus to find the right option (or using a ton of keyboard shortcuts), you have to know what to type. But it's also very efficient and flexible, and in exchange for more difficult discoverability of features it circumvents menus completely.


That's true, but I think GUI drifts back when you consider the arcane keyboard shortcuts a good GUI has for power users.


These old commands are also terse because the user was very often working at a teletype at 110 baud, or some other very slow type of terminal.


I'm a backend developer at MobileDevHQ, and just checked our db. We're currently tracking 520,684 searches.

