> The kinds of topic being discussed are not "is DRY better than WET", but instead "could we put this new behavior in subsystem A? No, because it needs information B, which isn't available to that subsystem in context C, and we can't expose that without rewriting subsystem D, but if we split up subsystem E here and here..."
Hmm, sounds familiar...
Bingo knows everyone's name-o
Papaya & MBS generate session tokens
Wingman checks if users are ready to take it to the next level
Galactus, the all-knowing aggregator, demands a time range stretching to the end of the universe
EKS is deprecated, Omega Star still doesn't support ISO timestamps
It's an infuriatingly accurate sketch. A team should usually have responsibility for no more than one service. There are plenty of situations where that's not possible or desirable (don't force your Kafka Connect service into your business-logic service), but it's the ideal, IMO. More services mean more overhead. But someone read a blog post somewhere and suddenly we have four microservices per dev. Fun times.
This is the kind of situation you get into when you let programmers design the business information systems, rather than letting systems analysts design the software systems.
I don't think I've ever worked on a project that had "system analysts". You might as well say "this is what happens when you don't allow sorcerers to peer into the future". Best I've ever had are product managers who maybe have a vague idea of what the customer wants.
Well, that's just the problem, innit. In decades past, systems analysts performed a vital function, viewing the business and understanding its information flows as a whole and determining what information systems needed to be implemented or improved. Historically, in well-functioning information-systems departments, the programmer's job was confined to implementation only. Programming was just a translation step, going from human requirements to machine readable code.
Beginning in about the 1980s or so, with the rise of PCs and later the internet, the "genius programmer" was lionized and there was a lot of money to be made through programming alone. So systems analysts were slowly done away with and programmers filled that role. These days the systems analyst as a separate profession is, as you say, nearly extinct. The programmers who replaced the analysts applied techniques and philosophies from programming to business information analysis, and that's how we got situations like with Bingo, WNGMAN, and Galactus. Little if any business analysis was done, the program information flows do not mirror the business information flows, and chaos reigns.
In reality, 65% of the work should go into systems analysis and design, well before a single line of code is written. The actual programming takes up maybe 15% of the overall work. And with AI, you can get it down to maybe a tenth of that: using Milt Bryce's PRIDE methodology for systems analysis and development will yield specs precise enough to serve as context that an LLM can use to generate the correct code with few errors or hallucinations.
I worked for a somewhat large bank that used to do this "systems analysis" job in its early days. I don't recall what they called that process step, but the idea was the same. Besides the internal analysts, they used to hire consultancies full of experienced ladies and gentlemen to design larger projects before coding started.
Sometimes they were hired only to deliver specifications, sometimes the entire system. The software they delivered was quite stable, but that's beside the point. There sure were software issues, but I was impressed by how those problems were usually contained in their respective originating systems, rarely breaking other software. The entire process was clear enough, and the interfaces between the fleet of Windows/Linux/mainframe programs were extremely well documented. Even the most disorganized and unprofessional third-party suppliers had an easier time writing software for us. It wasn't a joy, but it was rational; there was order. I'm not trying to romanticize the past, but, man, we sure un-learned a few things about how to build software systems.
Nobody wants to wait for those cycles to happen in the sorts of businesses that feature most prominently on HN. That flow works much better for "take existing business, with well-defined flows, computerize it" than "people would probably get utility out of doing something like X, Y, Z, let's test some crap out."
Now, at a later stage in those companies, yes, part of the reason for the chaos is that nobody knows or cares to reconcile the big picture, but there won't be economic pressure on that without a major scaling-back of growth expectations. Which is arguably happening in some sectors now, though at the same time the AI wave is making other sectors frothier than ever in the "just try shit fast!" direction.
But while growth expectations are high, design-by-throwing-darts like "let's write a bunch of code to make it easy to AB test random changes that we have no theory about to try to gain a few percent" will often dominate the "careful planning" approach.
> Nobody wants to wait for those cycles to happen in the sorts of businesses that feature most prominently on HN.
Bryce's Law: "We don't have enough time to do things right. Translation: We have plenty of time to do things wrong." Which was definitely true for YC startups, FAANGs, and the like in the ZIRP era, not so much now.
Systems development is a science, not an art. You can repeatably produce good systems by applying a proven, tested methodology. That methodology has existed since 1971 and it's called PRIDE.
> That flow works much better for "take existing business, with well defined flows, computerize it" than "people would probably get utility out of doing something like X,Y,Z, let's test some crap out."
The flows are the system. Systems development is no more concerned with computers or software than surgery is with scalpels. They are tools used to do a job. And PRIDE is suited to developing new systems as well as upgrading existing ones. The "let's test some crap out" method is exactly what PRIDE was developed to replace! As Milt Bryce put it: "do a superficial feasibility study, do some quick and dirty systems design, spend a lot of time in programming, install prematurely so you can irritate the users sooner, and then keep working on it till you get something accomplished." (https://www.youtube.com/watch?app=desktop&v=SoidPevZ7zs&t=47...) He also proved that PRIDE is more cost-effective!
The thing is, all Milt Bryce really did was apply some common sense and proven principles from the manufacturing world to systems development. The world settled upon mass production using interchangeable parts for a reason: it produces higher-quality goods cheaper. You would not fly in a plane with jet engines built in an ad-hoc fashion the way today's software is built. "We've got a wind tunnel, let's test some crap out and see what works, then once we have a functioning prototype, mount it on a plane that will fly hundreds of passengers." Why would a company trust an information system built in this way? It makes no sense. Jet engines are specced, designed, and built according to a rigorous repeatable procedure and so should our systems be. (https://www.modernanalyst.com/Resources/Articles/tabid/115/I...)
> Which is arguably happening in some sectors now, though the AI wave is making other sectors even more frothy than ever at the same time in the "just try shit fast!" direction.
I think the AI wave will make PRIDE more relevant, not less. Programmers who do not upskill into more of a systems analyst direction will find themselves out of a job. Remember, if you're building your systems correctly, programming is a mere translation step. It transforms human-readable specifications and requirements into instructions that can be executed by the computer. With LLMs, business managers and analysts will soon be able to express the inputs and outputs of a system or subsystem directly, in business language, and automatically get executable code! Who will need programmers then? Perhaps a very few, brilliant programmers will be necessary to develop new code that's outside the LLMs' purview, but most business systems can be assembled using common, standard tools and techniques.
Bryce's Law: "There are very few true artists in computer programming, most are just house painters."
The problem is, and always has been, that all of systems development has been gatekept by programmers for the past few decades. AI may be the thing that finally clears that logjam.
In the construction world, it's basically the separation between architects and builders.
Sure you can definitely build things and figure out things along the way. But for any sufficiently complex project, it's unlikely to yield good results.
IMO programs are 90% data or information, and modern software vastly underutilizes that concept.
If you know what data you need, who needs it, and where it needs to go, you have most of your system designed. If you just raw dog it then stuff is all over the place and you need hacks on hacks on hacks to perform business functions, and then you have spaghetti code. And no, I don't think domain modeling solves it. It often doesn't acknowledge the real system need but rather views the data in an obtuse way.
Per Fred Brooks: "Show me your flowcharts, but keep your tables hidden, and I shall continue to be mystified. Show me your tables, and I won't need to see your flowcharts; they'll be obvious."
It's telling that PRIDE incorporates the concept of Information Resource Management, or meticulous tracking and documentation of every piece of data used in a system, what it means, and how it relates to other data. The concept of a "data dictionary" comes from PRIDE.
No, this is the situation you get into when you have programmers build a system, the requirements of that system change 15 times over the course of 15 years, and then you never give those programmers time to go back and redesign, so they keep having to stack new hacks and kludges on top of the old hacks and kludges.
Anyone who has worked at a large company has encountered a Galactus, that was simply never redesigned into a simple unified service because doing so would sideline other work considered higher priority.
Even so, wouldn't you expect that you could crush an open, empty beer bottle by putting a heavy enough weight on it? A human can't do it, but I would expect an elephant could.
The beer in a full bottle puts quite a lot of pressure outward, but that little bit of air is probably enough to cause it to implode at some point.
I'll be honest; I have no idea how to estimate that. I'm sure there are folks on here who can (and might). It's probably not as deep as you'd think.
It is kinda neat, but OpenSCAD's limitations are the main thing that motivated me to write this Python library to generate 3D meshes using signed distance functions:
Could you please elaborate on how this is different from the other Python-based modeling tools, build123d[0] and CadQuery[1]?
I recently also got annoyed with OpenSCAD and its limitations and therefore started experimenting with Build123d. I'm very much a beginner in the CAD space and would love to understand what inspired you to build sdf.
My basic understanding is that STL files are essentially like bitmap images and store a list of triangles and their positions, whereas STEP files are more like vector art, where there is a list of instructions on how to build the model. Most CAD GUI programs also operate on a similar model to vector art, in that they record a list of operations one on top of another. It's why STEP files are a standardized format and can be imported/exported from most GUI-based CAD builders. I think.
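If I understand it right, this toy sketch (plain Python, just for illustration, not any particular library's API) is roughly what an STL boils down to: a flat "soup" of triangles with no shared-vertex topology:

    # Two facets approximating a unit square: one normal plus three vertices each.
    unit_square = [
        # (facet normal, (v0, v1, v2))
        ((0, 0, 1), ((0, 0, 0), (1, 0, 0), (1, 1, 0))),
        ((0, 0, 1), ((0, 0, 0), (1, 1, 0), (0, 1, 0))),
    ]
    # The corner (0, 0, 0) is repeated in both triangles: STL stores vertices
    # per facet, so importers have to stitch the connectivity back together.
    # A STEP file instead records the modeling operations that built the shape.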
Given that SDF also seems like it builds only STL files (I could be wrong), wouldn't learning build123d or CadQuery work better if one cares about compatibility with existing GUI based CAD modeling software?
Additionally, at least build123d offers a similar conceptual model to using Fusion 360 and FreeCAD (I have limited experience here): essentially, you sketch something in 2D on a particular plane, then apply some operation to convert it to 3D in a particular manner, the simplest being extruding. This means the mental model of how to construct something is very similar across GUI-based CAD programs and build123d, and that makes it easier for me to jump between GUI-based and code-based CAD modeling.
I'd love to understand your point of view, and learn more.
It seems like you already understand the differences. I wasn't aware of those other projects. build123d looks pretty neat.
Like most of my projects, this was just for fun and I mainly made it for myself. I'm a DIY kind of guy when it comes to software. I just throw things up on GitHub in case anyone else can get some use or inspiration out of it.
> It seems like you already understand the differences.
Honestly, I was about to ask the same questions as the parent comment. Whenever I'm interested in something, I usually look at what tools already exist out there; seeing a new tool mentioned that I've never heard of before, my reflex is to ask "oh neat, what makes it different from the existing tools?". I don't think the question was ill-intended, just genuine curiosity; assuming that you wrote your library because you had no idea build123d existed, rather than because you were unsatisfied with it and wanted to tackle the problem differently, is a bit of a leap.
I know it wasn't ill-intended, but my answer is largely the same. I like the idea of using SDFs to define models and this was just a fun little side project. And FWIW, my project predates build123d.
SDFs are very neat up until the point where you need to build parts that have very precise specifications.
Something like two precisely interlocking gears, with a tooth profile that is the developed curve of the opposing tooth, is a nightmare to build with SDFs.
Or precise fillets.
Or hard intersections and differences.
Very useful for doing soft, squishy shapes, less so for hard CAD.
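To make that concrete, here's a minimal sketch (plain Python, illustration only, not taken from any of the libraries mentioned here) of how booleans are usually done on SDFs. The zero level set of min/max is the exact hard boolean, but away from the surface the result is only a bound on the true distance, which is one reason precise fillets and offsets near those sharp edges get fuzzy:

    import math

    def sphere(cx, cy, cz, r):
        # Exact signed distance to a sphere: negative inside, positive outside.
        return lambda x, y, z: math.sqrt((x - cx)**2 + (y - cy)**2 + (z - cz)**2) - r

    def union(a, b):        return lambda x, y, z: min(a(x, y, z), b(x, y, z))
    def intersection(a, b): return lambda x, y, z: max(a(x, y, z), b(x, y, z))
    def difference(a, b):   return lambda x, y, z: max(a(x, y, z), -b(x, y, z))

    s1 = sphere(0.0, 0.0, 0.0, 1.0)
    s2 = sphere(1.0, 0.0, 0.0, 1.0)
    cut = difference(s1, s2)

    # On the surface this is the hard boolean, but elsewhere min/max only bound
    # the true distance, so anything computed from those distances near the
    # sharp edge (fillets, offsets, erosion) drifts.
    print(cut(-0.5, 0.0, 0.0))  # negative: this point is inside the cut solid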
Also, a suggestion: in your project, please consider using Wavefront OBJ as an output format; it is a much, much better choice than STL (STL can't represent the actual topology of the object, so it has to be reconstructed).
> You can even load an existing 3D mesh and operate on it as an SDF. Great for hollowing, chopping, eroding/dilating, etc. existing models.
This has my instant interest. Multiple times I have wanted to take an existing .STL file and cut a hole in it or add another object to it, and I have never had success.
I've tried things like Meshlab, but while the interface has what appears to be a hundred different functions, attempting to use anything returns some error code that requires a PhD to understand and none of the "repair" functions seem to help.
I mean seriously: Mesh inputs must induce a piecewise constant winding number field.
How the hell am I supposed to accomplish that on a STL file?
Sounds great in theory until you actually try it and discover that any time an STL touches another object the F6 render craps out with "Manifold conversion failed: NotManifold".
People should share the original files, or at least STEP files, along with the STL files. But if you must work with STL, Fusion works brilliantly for this. You can open the STL file, which gives you the usual mesh that's hard to work with. You then convert that mesh to a solid object, on which you can use "direct modeling". It's not the same as a parametric object, but the editing features are quite powerful and sort of mind-blowing. [1]
If you have the paid version of Fusion, you can run "feature detection" to turn things like holes, fillets, extrusions etc. into dedicated features which are even easier to edit. [2]
That feature requires getting pyopenvdb installed, which can be a headache, and I never really updated the README with examples, but it does work. There is one example script:
If it's a one-time thing, Prusa Slicer (and some other slicers too, probably) allow adding and subtracting simple shapes. So if, for example, you need to add a hole for a screw, you can do it directly in the slicer without messing with (and breaking the mesh of) an STL.
Blender also has a high learning curve but you typically don't need a PhD to understand the errors (instead you just watch youtube videos and copy what they do).
Removing faces from an STL and adding other objects is quite straightforward. Previously, Autodesk had Meshmixer and 123D; I guess Meshmixer is still available (https://meshmixer.org/), and I found it to be great for quick editing of the type you're describing.
See also my site fncad, https://fncad.github.io ! It's basically intended as "SDFs in your browser, with realtime preview and OpenSCAD-like syntax". I mostly use it for 3D models for printing.
What do you use for sdf meshing? I never really got the perf where I wanted it.
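For context, the naive baseline would be uniform grid sampling plus marching cubes, something like this sketch (using scikit-image; just for illustration, not what fncad or sdf necessarily do):

    import numpy as np
    from skimage import measure  # scikit-image

    def sdf_sphere(x, y, z, radius=1.0):
        return np.sqrt(x**2 + y**2 + z**2) - radius

    # Sample the field on a uniform grid, then extract the zero level set.
    n, lo, hi = 64, -1.5, 1.5
    xs = np.linspace(lo, hi, n)
    x, y, z = np.meshgrid(xs, xs, xs, indexing="ij")
    field = sdf_sphere(x, y, z)

    verts, faces, normals, _ = measure.marching_cubes(field, level=0.0)
    verts = lo + verts * (hi - lo) / (n - 1)  # grid indices -> world units
    print(len(verts), "vertices,", len(faces), "triangles")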
They long pre-date the demoscene, going back centuries in mathematics. Ray tracing/casting of implicit surfaces (described with SDFs and more general signed functions) for computer graphics goes back to the 1960s and 70s. The 1990s demoscene 2D metaball effects were based on computer graphics work by Jim Blinn for Cosmos in 1980. Most current applications are based on that long ongoing research. (I did my PhD in implicit surface stuff, so I've seen tons of academic papers on it going back ages, and I never ran into demoscene methods in that context.)
Not sure about SDFs, but ray casting/tracing goes back a long way, having been used to design sundials thousands of years ago. A method of ray casting was published in the 1600s showing how to trace out the outline of the Moon on the Earth during a solar eclipse.
Bless you for your service, sir! I have used `sdf` to create a whole bunch of stuff (buttons for my mother, tealight holders, etc.) and `gg` gets used in a bunch of places (including a couple of bots).
I always hated that video mode, but it looks good here when the magenta and cyan are limited just to links and headers!