silentplummet's comments | Hacker News

This is fair, but let's recognize that the discussion takes place in the context of a few well known individuals who have made an enviably successful career as freelance victims, who stridently prevaricate about the persecution they've experienced in the industry, but by all accounts don't actually do any work on technology at all.

One wrote a story for a mediocre video game and publicly claims to be a game developer. Is she a 'woman in tech'? It's as if I painted a mural on a house under construction one time and then went around claiming a career in carpentry. But don't you dare claim I'm not legitimately a carpenter, because hey, stop persecuting me!

The dichotomy isn't false; it's quite real, and it seems obvious to me that there are a handful of lamentably visible charlatans to blame for it. On the other hand, I could list dozens of women in the industry whose output I greatly respect and who seem to enjoy great success, but you never hear a peep out of them. It's almost as if the more "tangentially involved" one is with tech, the more vocal one becomes about this supposed persecution...


We detached this comment from https://news.ycombinator.com/item?id=10761246 and marked it off-topic.


That's a pretty fast turnaround time. 6 minutes! I don't think the Air Force can even scramble jets that fast.

Would you care to point out exactly what part of the comment is "off-topic" pursuant to `the experience of women in tech`?


The timing is random.

The comment is trying to convert this thread into the most well-rehearsed flamewar of the last several years. That's not just off topic, it's arson. The HN guidelines explicitly ask you not to do such things on HN: https://news.ycombinator.com/newsguidelines.html.

Also, please stop posting inflammatory rhetoric to HN generally. That's not what this site is for.


[dead]


We've banned this account for serial trolling.


OP doesn't go far enough: it's everything he mentioned, plus an insider contract to move product manufactured by businesses owned by Bush's skull society buddies.

Research who Michael Chertoff is and his role in the Bush administration. Who do you suppose conveniently owns the corporations that manufacture the rapey scanners and other equipment that had to be procured for the security matinee?


>Americans' apathy and, frankly, ignorance are what gave scumbags a series of blank checks with immunity. Americans didn't do anything after learning about the Iraq WMDs, the 2008 frauds, the Snowden leaks, and so on. Largely nothing but griping.

Hold on, I think you're giving the average American way too much credit here. My mom has at various times said that the police state is a good thing, the FBI should be able to view anybody's data for any reason, and groups the state names as terrorists should be denied freedom of speech.


Yeah, there are plenty of those... an even bigger problem.


This is exactly correct. I flagged the submission.


@jerf, @silentplummet (all of 39 days): the article is posted on "thecrimson", so expect it to be a bit bio-ish. What title would you give the article? For me, I only change the title if it's too long. This is a fault of story posters web-wide. A bugbear of mine - there is no universal editor on the web.


Well, since you ask, I wouldn't post it as it doesn't seem to have anything of HN interest ("people exist who have done interesting things" isn't all that great an article to start with, and it being in a marketing context cuts it down even more), but apparently 47 people disagreed, so shrug. I do sometimes wonder if there's a substantial contingent of people who upvote things based on the title alone, bolstered by the occasional appearance of commenters who have clearly read only the title, but there's no way for me to prove it either way. (And that's an observation well beyond just this one article, btw, not a targeted thing.)


"I wouldn't post it as it doesn't seem to have anything of HN interest"

I add anything I find interesting. You'd be surprised what the HN crowd reads. A lot of the time I'll add two stories: first a journal article with an in-depth description of some topic, followed by a general science article that points to the journal. Rarely does the source article get upvoted to the first page. Not always, though; it depends on the topic.

The article added here is further information on the article "What killed the dinosaurs? Dark matter, says theoretical physicist Lisa Randall" (https://news.ycombinator.com/item?id=10714657). I was curious what kind of background Randall has to theorise an idea linking asteroid impacts, dinosaur extinction, and dark matter.


The aviation community appreciates the importance of robust/utilitarian/minimal transmission of information. Look at the format of TAFs and METARs, for instance: the raw form in which the weather data is actually transmitted.

It's incomprehensible to the layman because there is an extreme economy of transmitted data. To decode it you have to know a fairly large vocabulary of domain-specific symbols and do some computation to retrieve the correct wind speed and direction, which are encoded.

This is so you can always get your weather, one byte at a time if needed.

KDEN 101353Z 33009KT 10SM FEW080 FEW180 FEW220 04/M09 A2985 RMK AO2 SLP079 T00441089
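
If you're curious, here's a minimal sketch of decoding just the wind group from the report above ("33009KT": from 330 degrees true at 9 knots); a real decoder also handles variable winds ("VRB"), metric units ("MPS"), and plenty of other cases:

  import re

  # Wind group: 3-digit direction (or VRB), 2-3 digit speed,
  # optional gust ("G25"); knots-only in this sketch.
  WIND_RE = re.compile(r"^(VRB|\d{3})(\d{2,3})(?:G(\d{2,3}))?KT$")

  def decode_wind(group):
      m = WIND_RE.match(group)
      if m is None:
          raise ValueError("not a wind group: " + group)
      direction, speed, gust = m.groups()
      return {
          "direction_deg": None if direction == "VRB" else int(direction),
          "speed_kt": int(speed),
          "gust_kt": int(gust) if gust else None,
      }

  print(decode_wind("33009KT"))
  # {'direction_deg': 330, 'speed_kt': 9, 'gust_kt': None}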


When we use a computer programming language, we are directing an abstract machine to interpret and manipulate data to get a result that we want. We have all the knowledge we need about the machine because we conceived and designed it ourselves.

DNA is like a programming language for a biological computer, a living cell. However, we don't know nearly everything there is to know about a living cell. We can't predict its mechanisms. There is no debugger. The compiler didn't come with an instruction manual. The code bootstraps itself into its own machine and runs in an environment we can't predict. And the syntax has been obfuscated and optimized by a genetic algorithm that's been running in parallel on quintillions of cores for a billion years.

Because the code executes on an unknown machine in an unpredictable physical environment, many features we might expect to see in a programming language are missing. This might be what he meant by "shallow".


>DNA is like a programming language for a biological computer, a living cell. However, we don't know nearly everything there is to know about a living cell. We can't predict its mechanisms. There is no debugger. The compiler didn't come with an instruction manual. The code bootstraps itself into its own machine and runs in an environment we can't predict. And the syntax has been obfuscated and optimized by a genetic algorithm that's been running in parallel on quintillions of cores for a billion years.

And, just to finish off, the machine is stochastically nondeterministic.


I mostly meant that there is tons of encapsulation in computer programming. One could be a full-stack engineer who goes all the way from soldering a transistor (or designing an IC from scratch) to writing a web app that uses WebSockets, which sit on top of HTTP, which is on top of TCP/IP sockets, served by Ruby sitting on top of Linux, which is virtualized by Amazon AWS and managed by a hypervisor sitting on top of a cluster of computers, all talking to each other via TCP/IP... and there are so many layers 'deep' to that cake.

And this is a personal feeling, but there is less encapsulation in biology. There are fewer 'categories' of things that build on top of each other that you have to learn, but those categories are immense, and the knowledge in each of them is incomplete. I suppose you could say the knowledge in some of programming is 'incomplete' by virtue of closed-source encapsulation (trust us, this hardware works like you think it does), but that is somewhat artificial.


Thanks for that explanation. Do you really think the stack is smaller in biology? Biology has been optimized over three billion years - if we've already invented more layers than there are in biology then are we not overthinking it?

This is what I came up with in a hurry for biology:

  ...elements
  atoms
  chemicals
  nucleic acids
  genetic circuitry
  peptides
  proteins
  multiprotein complexes
  microcompartments
  organelles
  cells
  clusters of differentiation
  organs
  organisms
  communities...
Care to fill in or improve the list?


> Because the code executes on an unknown machine in an unpredictable physical environment, many features we might expect to see in a programming language are missing. This might be what he meant by "shallow".

Without knowing how much more there is to know about biology, how can we expect to see certain features or not? What about tasks that are supremely efficient in biology but resource-intensive "in silico"? I'm having a hard time fathoming biology as shallow in any way. The fact that it's bootstrapped and live, that you don't get to restart the computer or cut the flow of information, makes it all the less shallow to me, unless I'm misunderstanding how that word was used.


A stain on the history of human civilization. A few men will become rich beyond imagination by a skillful appeal to base laziness and apathy. It's the renaissance of the disposable generation. Just toss the cup away. Toss the whole maker away when it breaks, too.

Make no attempt to contemplate where all of this trash ends up.

What's wrong with taking a few minutes to make the coffee and clean up after yourself? Did you know coffee grounds make a great nutritious supplement for household plants and gardens?


Eh - there is actually plenty of room for pretty much all the trash humanity could possibly create. A hole ten miles square and a couple of hundred feet deep could hold all of the U.S.'s trash for the next 100 years (this is not a controversial statement). And recycling is energy-inefficient for pretty much everything but aluminum and steel.

Keurig solved a huge problem in offices - how to always have coffee available when people en masse are pretty bad at refilling the pot. No amount of hand-wringing will make people better at things like refilling the coffee.


Like space, energy is also a non-argument: there's plenty of it.

The thing we don't have is infinite petroleum resources to make single-use plastic cups. The other thing we don't need is unnecessary amounts of toxic production polluting the air, water, and land far and wide beyond the hole in the ground where the waste goes.


What was the problem? There were already machines that filled themselves with water. Just toss coffee in a filter and go!

I promise you, it's NOT that hard.


And yet, people still don't make a new pot when they take the last of the old.


There's a reason one of the classic books in conversion optimization is called "Don't Make Me Think".


I've read some convincing arguments calculating that Keurigs are less wasteful than brewing: the tiny amount of coffee in a pod, versus the large scoop of grounds you use in a coffee pot, translates to huge resource savings in growing the beans.


You know, I hate the waste of a Keurig, but it has advantages as well as costs. Remember how bad office coffee used to be? Over-cooked, lukewarm, foul? All that goes away with a K-cup machine. I despise the waste, but man is it a colossal improvement in quality.


I buy brands that make their pods biodegradable. This one specifically is cheap and doesn't harm the environment. I can throw these into my garden and they disappear pretty quickly: http://www.amazon.com/San-Francisco-Bay-OneCup-Coffees/dp/B0...

The problem with taking a few minutes is that some of us who drink coffee don't have a few minutes. Not a morning person - and that's why I drink it in the first place.


> Toss the whole maker away when it breaks, too.

Just like a laptop.


Indeed. Just like every electrical item. Integrated circuits aren't easy to repair, or cost-effective to repair for that matter.

As an aside, I don't think Keurig machines are well made, so more of them get thrown away than there should be. But that's an issue with Keurig; I don't really expect your average consumer to be popping open electronics and repairing things (regardless of what the item is).


>Did you know coffee grounds make a great nutritious supplement for household plants and gardens?

Nope. I don't drink coffee, as I dislike the taste. Chewing roasted coffee is more fun than drinking it.


Sometimes you really do legitimately have a lot of static, global state. For instance, consider a program that needs to reference local, national, and/or global geography and its metadata, on a wide scale, randomly. All the countries have subdivisions, and subdivisions of subdivisions, and so on all the way down, which are all inter-referential. You can easily hit 100 MB of state that is essentially constant, and needs to be indexed 50 different ways for millions of function calls per user action that would access it.

Why not manage access to such things in a singleton class?
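
Concretely, a sketch of what I mean (the data and names here are hypothetical stand-ins; the real thing holds ~100 MB and dozens of indexes): build the structure once, then share it read-only.

  class Geography:
      _instance = None

      @classmethod
      def instance(cls):
          if cls._instance is None:
              cls._instance = cls()
          return cls._instance

      def __init__(self):
          # Stand-in for the constant reference data, loaded once.
          self._children = {"US": ["US-CO", "US-NM"], "US-CO": ["Denver"]}
          # ...plus however many other indexes the access patterns need.

      def subdivisions(self, place_id):
          return self._children.get(place_id, [])

  # Millions of calls per user action hit the in-memory index directly:
  print(Geography.instance().subdivisions("US-CO"))  # ['Denver']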


Singletons are fine, but it's almost always better to initialize them lazily rather than eagerly, to save on startup time. As a bonus, if you have no eager global initialization in your language, you can make import completely side-effect-free, which is a really nice simplification that I wish more languages adopted.
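
For instance, in Python (a sketch; lru_cache is just one of several ways to get the laziness):

  from functools import lru_cache

  # Eager (what to avoid): this would run the moment the module
  # is imported.
  # BIG_TABLE = {i: i * i for i in range(1_000_000)}

  @lru_cache(maxsize=None)
  def big_table():
      # Built once, on first call; every later call is a cache hit.
      return {i: i * i for i in range(1_000_000)}

  # Importing this module is now side-effect-free; the cost is paid
  # by whichever caller touches big_table() first.
  print(big_table()[1234])  # 1522756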


The slow startup from imports is my biggest annoyance with python.

We had a decent-sized library at a previous company that pulled in modules that defined huge register maps, wrapped C++ libraries, etc.

I wrapped all imports in a lazy importer that was triggered by the first attribute access. It brought our script startup times from 3 seconds down to a fraction of a second.

It blows me away that this isn't the default behavior for ALL modules.
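
The shape of it was roughly this (a sketch, not our actual code; the stdlib's importlib.util.LazyLoader is a more complete take on the same idea):

  import importlib

  class LazyModule:
      def __init__(self, name):
          self._name = name
          self._module = None

      def __getattr__(self, attr):
          # Only called for attributes not found on the wrapper itself,
          # so the real import is deferred until first use.
          if self._module is None:
              self._module = importlib.import_module(self._name)
          return getattr(self._module, attr)

  json = LazyModule("json")          # cheap: nothing imported yet
  print(json.dumps({"lazy": True}))  # the real import happens here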


That behaviour feels to me like it may result in faster startup, but it would also result in less predictable performance for code bases with somewhat random access patterns, such as web applications.

You could, I suppose, do some cache warming to make sure the first user request isn't slowed down, but it's one more thing to think about.


>"I wrapped all imports in a lazy importer that was triggered by the first attribute access."

Well, putting code at the root of your module is generally the cause of such things, I would argue. Granted, I don't know how that applies to "register maps" and "wrapped C++ libraries", but I'd imagine you should be encapsulating them away anyway, and that would include fixing the large startup time by design.


If this were the default, any change could completely upend the initialization order of your app. "Explicit is better than implicit."


As long as these data are immutable, sharing them is easy.

If you want hundreds of megs of shared mutable state, a database is the proper solution.


And make 300,000 queries over TCP, like getting the list of county names in a state or the list of place names in a county? My actual use case involves fuzzy-matching an arbitrary, user-determined subset of 18,000,000+ unsanitized data records against geographical place names so they can be assigned geometries.

I'd like the program to finish in 15 seconds or less, please.


If you're making 300K queries over TCP to a database in order to do a calculation, then I'd say you need a much better data structure and/or algorithm. Either that, or do the bulk of the calculation in the database in PL/SQL or T-SQL, or pre-calculate beforehand so that your online queries are just lookups instead of actual calculations.


You know there is such a thing as querying a database without going over a network, right?


It's moot.

The train of the discussion, if you go and read the OP's link and its inner links, goes like this:

- Singletons are bad.
- Why are singletons bad?
- They're not "real" OO, they're global state, they obfuscate dependency, etc., etc., etc.
- But what if I just legitimately have a ton of global state?
- Use a database! Use a filesystem!

The last point in the chain admits that the first point is mistaken. "Use a database" is just saying "use someone else's code to solve your problem". What if the database is implemented using singletons? What if it uses code that isn't OO at all? All you've accomplished is to say "OO can't solve your problem, use something external". In fact, my problem is solved just fine by using a singleton.


>essentially constant

An immutable singleton is fine. The other concern is performance, but if you don't have a performance problem, there is no point.


Who cleans up after anything we do, really? Our economic systems lubricate the exchange of work and capital by abstracting the cost of cleanup, which is rarely well understood to begin with, out of view.

Observe any abandoned or unused commercial structure in your town, such as the old building Walmart vacated to build a bigger Walmart 1000 feet down the road. Did the price of building either structure include the cost of cleaning them up?

Thermodynamics guarantees that literally everything we might try to do makes a mess (a net entropic gain), and that includes cleanup efforts. Cleaning up is really just shifting messes around. We spend a little extra energy to make a neat pile but at the cost of producing a little extra poop and carbon dioxide.

We kick the entropy can down the road a thousand different ways every day. And then we do things like ship cotton across the whole god-#@%^ Pacific Ocean so that Chinese slave^H sweatshop laborers can assemble it into clothes, so that we can ship them back across the ocean again and buy shirts. The ludicrous inefficiency of our corporate masters sacrificing both natural resources and our country's prosperity to save a buck is conveniently abstracted out of sight by glorious global capitalism.

Make no attempt to look at the man behind the curtain.


> Observe any abandoned or unused commercial structure in your town, such as the old building Walmart vacated to build a bigger Walmart 1000 feet down the road. Did the price of building either structure include the cost of cleaning them up?

The cost of cleanup typically falls upon the person who builds there next.

If that new Walmart is built on a property that already had a strip mall, first they tear down the strip mall and bear that cost.

The new Apple campus is a great example. They tore down a bunch of old HP buildings and had to bear that cost. They reduced the cost by grinding up the old buildings to make concrete for the new building. Also, they probably discounted the purchase price of the land to account for the cleanup, so in some respect, HP bore that cost too.


> The cost of cleanup typically falls upon the person who builds there next.

Yes, and that causes the issue of abandoned sites, when the desirability of the site is lower than the price of cleaning it up. Worse, there are not necessarily negative consequences that would push the market to avoid such situations. For heavily polluted sites like gas stations, chemical plants, etc., the owner of the land has generally already made a profit, so disposable land is a viable business model. That's not even counting that, after enough years, the site can look clean enough to be resold.

At some point it was discussed in Europe to require a viable (i.e., funded-upfront) reconversion plan for land used for things like gas stations or landfills. The trigger in that case was an incident involving a block of flats built on some forgotten nastiness by an unscrupulous builder. Not quite sure if anything actually happened.


While I agree with the sentiment, your "entropy" metaphor is quite wrong. Cleanup efforts lower the overall entropy of the Earth; they do not "kick it down the road" (and that's perfectly fine for an open system with an external source of power like the Sun).

If you are worried about the Sun dying in a few billion years, then yeah, you are technically right.


Cleaning up may lower the entropy of the thing being cleaned, but there's no way it lowers the entropy of the Earth.


Could the energy we get from the sun make it so that we did in fact reduce entropy in some cases?


I think you're misunderstanding entropy; it only really applies in a closed system. By using energy from the sun to "decrease" entropy on earth, what you've done is just expand the system to include energy from the sun. You've converted solar energy into kinetic and potential energy (moving things around, ordering them, building walls, etc.). That transformation isn't perfectly efficient: friction, electrical resistance, etc. create waste heat, which dissipates and can no longer be used. You've cleaned up one area of the system (decreasing entropy there), but the means you used to do so cost more than you gained. Entropy ALWAYS increases.


Entropy always increases in a closed system. The entropy of a part of a system (i.e., an open subsystem, e.g., the Earth) is a well-defined quantity (it is even a thermodynamic potential, hence completely independent of the history of the system and dependent only on its current state) and can decrease (even the most basic thermodynamics textbook has plenty of examples).

And yes, the entropy of the Earth is decreased by cleanup efforts. Only when you include the Sun do you get a system in which the total entropy increases. But that would be a quite useless accounting, given that the Sun is an infinite source of energy for practical purposes.
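
Rough numbers, to make that concrete (a back-of-the-envelope sketch; the P/T approximation for radiative entropy flux, the 0.3 albedo, and the 255 K effective temperature are all simplifications):

  import math

  # Earth absorbs low-entropy sunlight and re-radiates the same power
  # as high-entropy thermal infrared, so it continuously exports
  # entropy to space -- the budget for all local decreases.
  R_EARTH = 6.371e6        # m
  SOLAR_CONST = 1361.0     # W/m^2 at Earth's orbit
  ALBEDO = 0.3
  T_SUN = 5778.0           # K, solar photosphere
  T_EARTH = 255.0          # K, effective radiating temperature

  power = SOLAR_CONST * math.pi * R_EARTH**2 * (1 - ALBEDO)  # ~1.2e17 W
  export = power * (1 / T_EARTH - 1 / T_SUN)                 # W/K
  print(f"entropy exported to space: ~{export:.1e} W/K")
  # ~4.6e14 W/K available to pay for local tidying-up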


As the sun's energy keeps us warm and entropy increases a lot when the temperature rises, getting rid of the sun would reduce Earth's entropy.


What do you propose is the alternative to capitalism? It's the only system that seems to have reasonably worked.


Slightly off topic, but I don't think the argument "capitalism is the system that worked best" is valid. It's not that it's wrong, per se, but I think it's an implication of mankind's evolution and not of the capitalist system itself.

In more concrete terms: if you pick any given point in time, you'd probably be able to say the same thing without being wrong. At date X, the social system in practice is/was the one that had worked best up to that point X; that's because we've been evolving somewhat continuously, not because of the system per se.


Depends - what is your definition of "worked"? Mercantilism was actually successful in many ways.


A thousand years earlier: What do you propose as the alternative to feudalism?


Bit depth is not a measure of quantity, but of dynamic range. In other words, what matters is not only how many quantization steps you have, but how large a space you are trying to map them over.

The examples in the article point to this conclusion but don't quite state it explicitly. The relationship between the size of the space and quantization effects is demonstrated, for a physical interpretation of the space, by the horizontal greyscale bar that stretches and contracts.

But what if you stretch and contract the dynamic range of your monitor itself? Each bit in the encoding space (naively) offers a doubling of dynamic range in the natural space, so even your 30-bit encoding can be stretched thin if you display it on a monitor that outputs a contrast ratio many times greater than what we are used to.

For instance, imagine a monitor that could output light perceptually as bright as the afternoon sun, next to effectively infinite blackness. Will 30 bits be enough when 'stretched' across these new posts of dynamic range, or will banding (quantization) still be visually evident when examining a narrow slice of the space?
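
A back-of-the-envelope sketch of the stretching effect, assuming naive linear quantization and a ~1% Weber fraction as the threshold for a visible step (both simplifications; real encodings use gamma or PQ curves precisely to spend the bits more evenly):

  import math

  def banded_stops(bits, contrast, weber=0.01):
      # Linear step size across a contrast:1 luminance range,
      # measured in units of the black level.
      step = (contrast - 1.0) / (2 ** bits - 1)
      # Banding is visible wherever step / L > weber, i.e. below:
      l_visible = step / weber
      return math.log2(max(l_visible, 1.0)), math.log2(contrast)

  for bits in (8, 10, 16):
      affected, total = banded_stops(bits, contrast=10**6)
      print(f"{bits}-bit linear over 10^6:1 -> banding in the bottom "
            f"{affected:.1f} of {total:.1f} stops")

Even 16 linear bits band badly over a sun-to-black range; that's the stretching problem in a nutshell.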


HDR monitors do exist; BrightSide Technologies was showing them at SIGGRAPH over ten years ago, and it looks like consumer OLEDs are starting to get on board [1]. And they really did want 10 bits per channel. But even that seems lightweight.

10 bits per channel will carry us for a while. Apparently Dolby bought BrightSide, and now they are pushing for 12 bits. 16-bit ints will probably be enough for home use in practice. Internally, most games that do HDR rendering use 16-bit floats for their intermediate frame buffers; that format is popular in film production as well. I would be surprised if consumer display tech ever bothered to go float16-over-DVI. But maybe it will get cheap enough eventually that we might as well have the best :)

[1] http://www.avsforum.com/forum/40-oled-technology-flat-panels...

