Hacker News | greenavocado's comments

What version?

1.13.6, so should not be affected by the malware

B-rep (boundary representation) modelers (like build123d via OpenCascade) represent solids explicitly as faces, edges, and vertices, with topology tying the geometry together

OpenSCAD uses Constructive Solid Geometry (CSG), which represents objects as boolean combinations of primitives (union, difference, intersection). It does not maintain an explicit boundary/face structure internally.
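To make the distinction concrete, a minimal OpenSCAD sketch: the model *is* the boolean expression tree, and the boundary mesh only materializes at render/export time:

```openscad
// CSG: the model is this boolean expression itself,
// not a face/edge/vertex data structure
difference() {
    cube(10, center = true);  // primitive solid
    sphere(r = 6);            // primitive subtracted from it
}
```

You can't ask this model "give me face #3 and fillet its edges" the way you can in a B-rep kernel; there is no persistent face to point at.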


My fever dream for the past two decades has been an interstate "road train" roll-on/roll-off station network where cars are towed between cities at a moderate, comfortable speed (45-55 mph) on extremely long flatbed trailers, so people don't have to pay attention to the road between cities and can sleep or relax.

640K is all anybody actually needs

https://news.ycombinator.com/item?id=18120477


The problem with your claim that the plebs are incapable of research because they don't have equipment and are dumb is the wholesale erosion of belief in institutions after the COVID "vaccine" situation

I assume you are an expert in some domain. How would you feel if someone unfamiliar with your domain came in and started questioning your expert judgment? Even in your domain, being an expert probably means having access to and expertise with specialized equipment. Without that, I can't imagine having the expertise to judge what is correct and what is wrong in that domain.

I reject the scare quotes you're putting around the word vaccine.

The COVID vaccine is a triumph of human ingenuity and we should all feel incredibly proud it exists. It was the moon landing of our time.

More broadly, vaccines have probably saved more human lives than any other medical technology in history.


You think far too highly of cavemen

It is essential to purchase and configure Home Assistant (https://www.home-assistant.io/) compatible devices around the home whenever possible if you want a "smart home" that will last. Everything else is an Internet of Shit treadmill that lasts at most a few years before it falls off and is replaced by a new piece of e-waste.

The caveat here is that it needs to be local. I have a few things that work with HA, but they basically hijack the app's cloud login tokens...

That is terrifying

> can a liberal democracy organize a "just" version of a purge ?

Absolutely, it happened before on January 30, 1933


Explain why you think making a single commit is related to any source code sharing obligation? You completely failed to establish why making a single commit is indicative of it being garbage. Your statements are a series of non-sequiturs so far and thus I can't take you seriously.

> Explain why you think making a single commit is related to any source code sharing obligation?

When you share code it's presumably for people to use. It is often useful to have commit history to establish a few things (trust in the author, see their thought process, debug issues, figure out how to use things, etc).

> You completely failed to establish why making a single commit is indicative of it being garbage.

A single commit doesn't mean it's garbage. It erodes trust in the author and the project. It makes it hard for me to use the code, which is presumably why you share code.

My garbage code response was in regards to the growing trend to code (usually with ai) some idea, slap an initial commit on it and throw it on GitHub (like using a napkin and tossing it in the rubbish bin).


Here's the thing: get used to single big commits. Eventually, somebody is going to try to train on specific change sets. This'll enable models to learn specific authors' mannerisms, idiosyncrasies, etc. Single large commits create an information-asymmetry boundary, which is about the only defense a creator has in a world of willful infringement to train algorithms to replace or devalue them in the market. It sucks... But this is the world we're growing into now.

Multiply-accumulate, then clamp negative values to zero. Every even-numbered variable is a weighted sum plus a bias (an affine transformation), and every odd-numbered variable is the ReLU gate (max(0, x)). Layer 2 feeds on the ReLU outputs of layer 1, and the final output is a plain linear combination of the last ReLU outputs

    // inputs: u, v
    // --- hidden layer 1 (3 neurons) ---
    let v0  = 0.616*u + 0.291*v - 0.135
    let v1  = if 0 > v0 then 0 else v0
    let v2  = -0.482*u + 0.735*v + 0.044
    let v3  = if 0 > v2 then 0 else v2
    let v4  = 0.261*u - 0.553*v + 0.310
    let v5  = if 0 > v4 then 0 else v4
    // --- hidden layer 2 (2 neurons) ---
    let v6  = 0.410*v1 - 0.378*v3 + 0.528*v5 + 0.091
    let v7  = if 0 > v6 then 0 else v6
    let v8  = -0.194*v1 + 0.617*v3 - 0.291*v5 - 0.058
    let v9  = if 0 > v8 then 0 else v8
    // --- output layer (binary classification) ---
    let v10 = 0.739*v7 - 0.415*v9 + 0.022
    // sigmoid squashing v10 into the range (0, 1)
    let out = 1 / (1 + exp(-v10))
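For anyone who wants to poke at it, here is a direct Python transliteration of the listing above (same weights; the ReLU gate written as max(0, x)):

```python
from math import exp

def forward(u: float, v: float) -> float:
    """Forward pass of the tiny 2-3-2-1 network from the listing above."""
    # hidden layer 1: affine transforms, each gated by ReLU
    v1 = max(0.0, 0.616*u + 0.291*v - 0.135)
    v3 = max(0.0, -0.482*u + 0.735*v + 0.044)
    v5 = max(0.0, 0.261*u - 0.553*v + 0.310)
    # hidden layer 2 feeds on the ReLU outputs of layer 1
    v7 = max(0.0, 0.410*v1 - 0.378*v3 + 0.528*v5 + 0.091)
    v9 = max(0.0, -0.194*v1 + 0.617*v3 - 0.291*v5 - 0.058)
    # output: plain linear combination, squashed into (0, 1) by the sigmoid
    v10 = 0.739*v7 - 0.415*v9 + 0.022
    return 1.0 / (1.0 + exp(-v10))
```

Because the last layer ends in a sigmoid, the result is always strictly between 0 and 1; thresholding at 0.5 gives the binary class.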

    let v0  = 0.616*u + 0.291*v - 0.135
    let v1  = if 0 > v0 then 0 else v0

is there something 'less good' about:

    let v1  = if v0 < 0 then 0 else v0 
Am I the only one who stutter-parses "0 > value" vs my counterexample?

Is the Yoda condition somehow better?

Shouldn't we write: let v1 = max 0 v0

