bwarp's comments

It'll be another ceremony. A parade of metal and glass. A celebration of our supposed progress. Another device for consumption. A window into a walled world. Another false need. Another life not really improved despite their insistence that it is.

Please at least spare a thought for those around the world who are too busy trying to stay alive on this day to worry about a new monolith to worship.

I like technology, but I'm a human first.


I think that may be a bit over-the-top...


Possibly, but isn't the entire carnival of Apple the same?

Tell me, which bit of my post was not factual?


Which bit was not factual? There isn't a single factual statement in there except in the most absurdly pedantic sense, it's all mocking hyperbole.


> Another life not really improved despite their insistence that it is


We put on a great show because we're all gonna die.


I can't get behind this mindset. First off, who are you to say that these sorts of devices don't enrich people's lives? For this conversation, I'll concede that tablets are largely for consumption. At what point would it be enriching?

Under your rationale, if you always think of who has it worse than you, you could never have a valid complaint. "Man, I will be so disappointed if the iPad 3 doesn't have LTE." "Don't you think about starving children in Africa?"


Psychs (both types) who are funded by medical insurance are the issue. They're effectively being paid a commission by the insurer to find anything that could be wrong with you so they can eat. The moment you get sick in the US, you are a land grab for cash. It's just wrong.

Here in the UK, the free ones on the NHS are pretty good! If you don't want treatment, no one is going to lose any money (in fact they're saving cash) so you're fine.


This is where we are. We use Crucible/Fisheye as well, which really just makes things painfully hard.


We actually use Crucible and our process is very smooth. Every development task must have a review created for it (usually spanning several commits) and one other person reviews the code. When the reviewer is satisfied, they give their approval and the task is closed. We do reviews for nearly all committed code.


Considering the web server takes less than 5% of the execution time in my stack, I'm not bothered and 99.99% of you probably shouldn't be either.

It'd be a bigger performance improvement to rewrite the stack entirely in C than to change the web server, and that's just illogical.
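
If you want to sanity-check that sort of split on your own stack, here's a minimal sketch (assuming a Python WSGI stack, which is purely illustrative and not necessarily what you run): profile a single request served by the stdlib wsgiref server against a deliberately slow app and compare the cumulative times.

    # Illustrative sketch, assuming a Python WSGI stack: profile one request
    # and compare time spent in the application against the server machinery.
    import cProfile
    import threading
    import urllib.request
    from wsgiref.simple_server import make_server

    def slow_app(environ, start_response):
        # Stand-in for "the rest of the stack": burn some CPU in app code.
        total = sum(i * i for i in range(500_000))
        body = str(total).encode()
        start_response("200 OK", [("Content-Type", "text/plain"),
                                  ("Content-Length", str(len(body)))])
        return [body]

    if __name__ == "__main__":
        server = make_server("127.0.0.1", 8001, slow_app)   # bound, listening
        client = threading.Thread(
            target=lambda: urllib.request.urlopen("http://127.0.0.1:8001/").read())
        client.start()
        profiler = cProfile.Profile()
        profiler.enable()
        server.handle_request()          # serve exactly one request
        profiler.disable()
        client.join()
        # Sorted by cumulative time, slow_app dominates; wsgiref's own request
        # handling is a small slice of the total.
        profiler.print_stats("cumulative")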


RAM, separate GPU+core and profit, by the looks of it.


The CPU alone is probably $25 in high volume, and these guys are not high volume. (Guesstrapolated from an iSupply teardown of a kindle fire.)

If you want $50 then you are looking at a Raspberry Pi grade CPU. Those are ok for getting data in and out of a H.264 decoder, but you would not mistake them for modern workstation performance. (But good news, they might announce shipping in 15 hours.)


A complete understanding, if you ask me, is one which requires no impenetrable "black-box abstractions" to be substituted anywhere in the abstraction hierarchy. In the case of a CPU, you would have to have a full gate-level understanding of it and an understanding of how the gates operate.
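
To make "gate level" concrete, here is a toy sketch (Python, purely illustrative, not from the original post): a one-bit full adder expressed in nothing but NAND gates, which is the lowest rung such a "no black boxes" understanding would have to reach for every part of a CPU.

    # Toy illustration only: a one-bit full adder built from nothing but NAND
    # gates, roughly the level a "no black boxes" view has to reach.
    def nand(a, b):
        return 0 if (a and b) else 1

    def xor(a, b):
        t = nand(a, b)
        return nand(nand(a, t), nand(b, t))

    def and_(a, b):
        return nand(nand(a, b), nand(a, b))

    def or_(a, b):
        return nand(nand(a, a), nand(b, b))

    def full_adder(a, b, carry_in):
        partial = xor(a, b)
        total = xor(partial, carry_in)
        carry_out = or_(and_(a, b), and_(partial, carry_in))
        return total, carry_out

    # Exhaustive check against ordinary integer addition.
    for a in (0, 1):
        for b in (0, 1):
            for c in (0, 1):
                s, cout = full_adder(a, b, c)
                assert cout * 2 + s == a + b + c

Scale that up to a full ALU, caches and a pipeline and you can see why nobody holds the whole thing in their head.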

To be honest, the "complete understanding" died way before the original post. It was the moment that they started building computers with integrated circuits inside them (before 1970) and complexity rocketed exponentially. It died doubly so the moment that electrical engineering and the practical side of computer science diverged into "them" and "us".

If you want to use something which you understand, you will probably need to buy a crate of transistors, resistors and diodes and wirewrap yourself an entire computer from scratch PDP-7 style.

This fact is a warning: We really are building things with too deep abstraction hierarchies causing knowledge to be divided. One day we will have no hope of comprehending any of it in a lifetime.


Then blacksmiths in the Middle Ages did not 'understand' the forging of swords. Only modern materials science allowed us to 'understand' why forging creates harder metal.

This is a semantic discussion about the meaning of 'understanding': does it mean you can globally explain how the system works and could come to understand the smallest detail of every part? Or does it mean you understand the smallest detail of every part?

The latter is a nonsensical definition: if that is the case, then nobody understands processors, because nobody understands transistors, because nobody understands quantum mechanics, because nobody understands why the fundamental forces act in certain ways. Nobody understands Newton's laws, nobody understands where babies come from and nobody understands what it means to perform a 'computation'[1].

Of course, that means the former was also a nonsensical definition.

[1] http://plato.stanford.edu/entries/church-turing/


> Then blacksmiths in the Middle Ages did not 'understand' the forging of swords. Only modern materials science allowed us to 'understand' why forging creates harder metal.

The interesting way to construe the article's claim is not that it's impossible to know everything, but that it's impossible to know everything that people already know about the field you work in.

Were there blacksmiths who knew everything anyone knew about forging swords? Did Newton or Da Vinci know everything anyone knew about the various fields they were expert in? Are there farmers now who know everything anyone knows about how farming works? The article claims that at some point it became a certainty that programmers cannot know everything that anyone knows about how to use the tools they use and what those tools do. The stack is too complex. That's at least a sensible and interesting claim.


No blacksmith, farmer or famous scientist/artist/Renaissance man in history has ever known everything there was to know (about the field they worked in). Even back then, there was already more knowledge being produced than they could possibly ever obtain. As with us, the ultimate problem is a lack of time. Whether the time required to travel to a neighbouring city to learn from their guild or the time required to read a paper on the internet, the problem is time.

All stacks have always been too complex. To produce optimally, a farmer would have to understand everything there is to know about meteorology, biology, sociology, economics and earth sciences concerning his specific area; no farmer ever has.

In that sense of 'understand', nobody has ever completely understood any system they worked with/in. Given a system and a person, you can come up with a legitimate detail question about the system that you also believe the person couldn't answer.


Semantically speaking, understanding is simply knowing enough to recreate something, preferably with your own acquired skills and knowledge. We're all just fancy parrots wrapped up in monkey bodies.

To be honest, the bedrock abstraction should stop at "what humans can realistically create with their own hands from nothing". You can make your own transistor quite easily and Ebers-Moll provides a nice set of rules to work with.
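
For the curious, a rough sketch of what Ebers-Moll buys you (Python; the parameter values are typical textbook numbers picked for illustration, not measurements of a real device):

    # Ebers-Moll (transport form) for an NPN transistor. Parameter values are
    # illustrative assumptions, not measured from any real part.
    import math

    V_T = 0.02585      # thermal voltage at ~300 K, volts
    I_S = 1e-14        # saturation current, amps (assumed)
    BETA_F = 100.0     # forward current gain (assumed)
    BETA_R = 1.0       # reverse current gain (assumed)

    def ebers_moll(v_be, v_bc):
        """Return (I_C, I_B, I_E) for the given junction voltages."""
        f = math.exp(v_be / V_T) - 1.0   # base-emitter diode term
        r = math.exp(v_bc / V_T) - 1.0   # base-collector diode term
        i_c = I_S * (f - r) - (I_S / BETA_R) * r
        i_b = (I_S / BETA_F) * f + (I_S / BETA_R) * r
        return i_c, i_b, i_c + i_b

    # Forward-active example: V_BE = 0.65 V, collector junction reverse biased.
    i_c, i_b, i_e = ebers_moll(0.65, -5.0)
    print("I_C = %.3e A, I_B = %.3e A, beta ~ %.0f" % (i_c, i_b, i_c / i_b))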

The quantum physicists and philosophers can remain arguing about technicalities then and let the rest of the world observe, understand and create.


> what humans can realistically create with their own hands from nothing

could medieval blacksmiths really create a sword from nothing but their own hands? how would they get the iron? would they have the knowledge to locate veins of iron and then mine that iron? could they build the kiln and forge the equipment necessary for forging?

arguably, by this standard, even farmers could not farm. farmers may know how to plant their crops, but without current crops to gather seeds from, would they know how to find the strains of plants that suit farming and then gather the seeds from those plants?


Well actually, I reckon yes. Much to my parents' latent terror (I didn't tell them until they came back from holiday), at the age of 15 a friend and myself built a small blast furnace using some ceramic pots, bits of stone lying around and a hoover. I managed to get 50g of what looked like pig iron out of that bugger before it basically fell to bits and set fire to the lawn. That was from about 2kg of ore I found at the bottom of a cliff face next to a beach at Skegness. Was great fun! (flux was limestone from the shed, coke was 3 large bags of barbeque charcoal)

I'm sure that and the rest of the process wasn't beyond people with a higher budget and requirements...

As for plants - it's all knowledge and experience. There is no abstraction. Eat this, don't eat that. I grow quite a few edible plants myself and there is little abstraction.


> You can make your own transistor quite easily

Not from scratch, no. Not even with all the intermediate knowledge available but none of the tooling and technology. Unless by "quite easily" you mean "in under a dozen generations".



You're joking right? Taking apart an existing diode to build a transistor from it comes nowhere close to making the transistor from scratch. Not to mention the clamps and files, the microscope, the phosphor bronze for the contacts, ...

This video is great, but it's a century or two of progress away from "from scratch".


I do all my "fun" computing these days on a BBC Micro, in assembly language. I don't have total understanding of the ICs, it's true. But in its 64k of addressable memory (32k RAM), I've a pretty good idea of where everything is and what happens when. Very satisfying.


Very cool. I still have the old Osborne 1 on which I learned to program.

Interesting historical point - the BBC Micro was designed by Acorn Computers, the company behind the ARM processors that are so ubiquitous today.


Indeed, the original ARM was designed on a BBC Micro + 6502 Co-Pro. Amazing what "real work" you can get done on one :-)

http://en.wikipedia.org/wiki/ARM_architecture#History


I was the proud owner of an ARM copro [1] many years ago (I still have the Master it was plugged into) and the first Acorn RISC machine (an A310 with 512k RAM if I remember correctly).

They were and still are extremely powerful and productive machines.

[1] http://en.wikipedia.org/wiki/BBC_Micro_expansion_unit#ARM_Ev...


Returning to your original post,

> We really are building things with too deep abstraction hierarchies causing knowledge to be divided

Abstraction is necessary, true, but it's not clear to me what the abstraction level we have now really gets us. In other words, say we had a BBC Micro with a 2GHz 6502 in it. What productive computing tasks that we do now could it not do? Or let's imagine an Atari ST with a 2GHz 68000, to get us a little more memory. What could it not do, that we need to do now? I'm struggling to think of anything.


It doesn't get us anything at all other than a fucking huge rabbit hole to stare down every time you do anything. Let's look at a pretty naff case for the .NET CLR on x86 for Windows Workflow:

application -> XAML -> framework -> server -> container -> C# -> CIL bytecode -> x86 instructions -> microcode.

Now Forth on a 68000:

Forth screen -> 68k instructions

To be honest, for what I consider to be life and death stuff, a 10MHz 68000 is good enough (I have one in my TI92 calculator).


The main thing is that you'd need a new set of abstractions for security, and then you'd need to implement HTML5 on it anyway to do all the things we can do on a computer now.


> to do all the things we can do on a computer now

Such as what? 99% of websites are a) screens where you enter something into predefined fields, to be stored in a database and/or b) screens that nicely format for display the things that you or other people have previously entered. They were doing that in the 1970s. Only the buzzwords change.


HTML5 is a piss poor evolutionary abstraction of what was effectively SGML and scripting carnage.

If you could start again, would you really end up with HTML5?


Any idea where someone (such as me) born too late to have an original Micro might be able to get hold of one?


If you're in the UK, there's a liquid market in BBCs, C64s and similar on eBay. There are a few sellers who refurb them (e.g. new power supplies, cleaned up cases etc). You can easily adapt a BBC to use SCART too (the lead will cost about a fiver) and use it on a modern TV, if you don't fancy using a big old CUB monitor.


I hope that the Raspberry Pi will be a re-run of the BBC Micro; they could do a lot worse than bundling it with a modern version of BBC BASIC, which was very advanced for its time.


Brandy Basic is pretty much that: http://sourceforge.net/projects/brandy/


Get a Cub monitor though - it's not the same without one!


I have two :-)


Cool - I bow before your Eliteness! (pun intended ;-)


Good for you!

I (the parent of your post) actually have a BBC Master (and the advanced reference manuals) lying around still for precisely that reason. It's quite a handy and very powerful little machine to be honest.

It even runs LISP (AcornSoft LISP).


Will the game change once again when 3D printers are able to print circuits?


Not until the printers can make their own CPUs and complex parts such as threaded rods. That is a long time off.

3D printers are promoted as supposedly printing themselves, i.e. as self-replicating. They are not: they print only a small fraction of their own non-complex parts.


It's an awful community (no pun intended). There are some of the worst people on the Internet hiding behind it.


My god I remember that guy from the late 90's. There was an Internet fad around then.

Now I feel old.


At the risk of sounding like a whinging old man, do we really need 1Gbit to the home at the moment?

If you take the UK as an example, a lot of the country (outside major cities and metropolitan areas) is still stuck on 512kbit. The same is true worldwide with even parts of SA on dialup. Shouldn't we be concentrating on throwing more resources into getting these connections usable rather than feeding crazy large bandwidth to the rich?

As broadband speeds are driven pretty much by consumer demand, isn't it better to have more people connected than an elite few?

On the same subject, I'm sitting here on approximately 12Mbit and I genuinely have no problems streaming HD iPlayer with three computers on the connection. I don't need any faster and it costs a whopping $20 a month equivalent (unmetered consumption).

Also, if you consider the cost of bandwidth and caps thrown on people in Europe, a gig connection would suck up your entire allocation in about 48 seconds...
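
Back-of-the-envelope on that, assuming a cap of roughly 6 GB, which is my illustrative assumption rather than any particular ISP's figure:

    # Rough check of the "48 seconds" figure, assuming a ~6 GB monthly cap.
    cap_bytes = 6 * 1000**3              # 6 GB in decimal units (assumption)
    link_bits_per_second = 1 * 1000**3   # 1 Gbit/s
    print(cap_bytes * 8 / link_bits_per_second)   # -> 48.0 seconds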


An interesting argument, and a straw-man one. Let's deal with the straw man first. Just because you have no use for the extra speed doesn't mean others do not. In addition, systems and uses for the speed will only be built when the number of people with access to it is growing. Therefore, there will be early adopters for whom the speed gains little initially, but over time we will develop more services that need it.

Now let's deal with the remote-people problem. There are already efforts underway to more efficiently serve outlying areas. These generally revolve around WiMax and things like Lightspeed (although that looks dead for the moment). People who live far from cities will not be served as well. That's the way it will always be, due to the economics of population density, hence the reason that serving those environments typically includes long-range wireless. Second, since when do we need to get everyone to the same level before we move forward? Would we be where we are now in personal computing if we had stopped in 1996 to make sure everyone had a computer before we built faster ones? We can continue in the same way we have, with universal access fees supporting access for non-city dwellers, but that's no reason to prevent progress in cities.


I didn't eat meat for about 5 years. One day I just bought a quarter pounder with cheese from McDonalds for no particular reason other than I felt crappy after being sick for two weeks.

That was the tastiest thing I'd ever eaten!

At the same time, I don't know why I do eat meat now but I seem to crave/need it and just can't shift that. I don't usually actually like it either ironically (apart from that one quarter pounder above).


> I don't know why I do eat meat now but I seem to crave/need it

A friend of mine was vegetarian for many years and one day, about two years ago, she collapsed and the doctor told her she needed to eat meat again because her body had stopped processing protein properly and she wasn't getting enough from non-meat alternatives anymore (or something like that, don't remember the exact details now).

