Alan Kay on re-creating Xerox PARC's design magic at CDG (fastcodesign.com)
163 points by sandij on Oct 3, 2015 | 83 comments


PARC was in the glory days of corporate research labs. PARC, Bell Labs, IBM Almaden, Sarnoff Labs, HP Labs, etc. Research labs that were part of big companies. I was lucky enough to visit many of those in their prime.

I got to tour PARC in 1975, years before Steve Jobs did, when I took a summer course in computer architecture taught by William McKeeman. He was the architect of a generation of Burroughs CPUs, and knew everybody in CPU design. So I met Alan Kay before anybody had heard of him. He had a vision, but it wasn't quite what most people think it was. He said the big advantage they had was that they were funded heavily enough to build the single-user computers of the future now. The Alto was over $20K (some said $50K) per unit, which was insanely expensive for a single-user computer. It took another decade to get the cost down. Kay's group saw it as their job to get the software ready for when the hardware came down in price.

Kay was thinking that the killer app for personal computers was going to be simulations. He later had a demo graphical hospital simulation, which was a discrete-event simulator where patients came in with a complaint ("I am a victim of Bowlerthumb"), and went through Admitting, Examination, Surgery, etc. out to Discharge. Smalltalk is based on Simula, an Algol-derived simulation language, and was originally intended for discrete-event simulation. Document preparation and mail were a sideline.[1]
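
(For readers who haven't met the term: a discrete-event simulator keeps a queue of timestamped events, pops them in time order, and lets each event schedule future ones. A minimal sketch in Python - the stage names echo the hospital demo above, but the timings and structure are my own illustration, not PARC's code:)

    import heapq
    import random

    STAGES = ["Admitting", "Examination", "Surgery", "Discharge"]

    def hospital_sim(n_patients=3, seed=1):
        rng = random.Random(seed)
        events = []  # min-heap of (time, tiebreak, patient, stage index)
        for i in range(n_patients):
            # Patients arrive at random times and start at Admitting.
            heapq.heappush(events, (rng.uniform(0, 5), i, "patient-%d" % i, 0))
        seq = n_patients
        while events:
            t, _, patient, stage = heapq.heappop(events)
            print("t=%5.2f  %s enters %s" % (t, patient, STAGES[stage]))
            if stage + 1 < len(STAGES):
                # Completing a stage schedules the next one at a future time.
                heapq.heappush(events, (t + rng.expovariate(1.0), seq, patient, stage + 1))
                seq += 1

    hospital_sim()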

Kay was operating in an empty world. Almost nobody else was throwing money at what software should look like a decade or two hence, for a class of machines that didn't exist yet. That was a huge advantage. Anything good that was done there advanced the state of the art.

Kay's group was only a small part of PARC. There were other people in the large building working on copier technology and the physics behind xerography. (Unfortunately for Xerox, they didn't invent organic photoconductors, which made xerography machines much smaller and cheaper. IBM did.) Kay's group had considerable engineering resources to draw upon, machine shops and electronics shops and chemistry labs that could make things. They were able to have their own CRTs made for the Alto. It's a lot easier to invent when you have that kind of substantial engineering backup. That's why they were able to build a laser printer - they were in an engineering facility that could build both a CRT and a copier. Kay's group just did the software and some of the electronics.

That's hard to reproduce - all that engineering backup. It's only available within a big business that makes real stuff. Today, Samsung and Fujitsu have labs like that, but few US companies, at least in the electronics/computer sector, do. There are a few military operations with such capabilities - China Lake Naval Weapons Center is one; they can design, build, and flight-test something in-house.

So that's why it's hard to reproduce PARC - you need an empty field of research, and heavy engineering backup.

[1] https://www.computer.org/csdl/mags/co/1977/03/01646405.pdf


> The Alto was over $20K...

That is about $89K in 2015 dollars, for those curious. If it really was $50K, then it was closer to $222K in 2015 dollars.
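
(If you want to redo that arithmetic, it's just a CPI ratio. A back-of-the-envelope sketch in Python; the index values are approximate figures I'm supplying, not from the comment above:)

    # US CPI-U, approximate: ~53.8 in 1975, ~237 in 2015.
    CPI_1975, CPI_2015 = 53.8, 237.0

    def to_2015_dollars(amount_1975):
        return amount_1975 * CPI_2015 / CPI_1975

    print(to_2015_dollars(20000))  # ~88,000
    print(to_2015_dollars(50000))  # ~220,000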

If I had the discretionary income of a multi-billionaire, then the hell with conspicuous displays of wealth, give me a dedicated team and lab making one-off, custom hardware and software for me to augment my intelligence and knowledge.


Add Kodak to the list.


Yeah, Kodak had their moment. [1]

[1] https://en.wiktionary.org/wiki/Kodak_moment


> and heavy engineering backup

Has the cost of that come down to where it's feasible at smaller scale?


The linked Bloomberg article has a hilarious quotation at the bottom:

> Exploring odd things isn’t likely to help SAP attract more cloud customers in the short term, says Bill Hostmann, an analyst at researcher Gartner. “What SAP really needs is more execution,” he says. “A lot of people took inspiration from PARC. Did Xerox directly benefit from that? Not really.”

Bill Hostmann has no clue. Xerox got the laser printer out of PARC, which ended up as a multibillion dollar business for them.


SAP ain't Xerox though. The idea of SAP hiring a bunch of maverick UI geniuses is pretty mind-boggling.

This is a company whose products are almost the complete antithesis of good UI design (they're also the antithesis of software integration: you don't customize their software for your organization, you customize your organization for their software). If there's one corporate software experience I've had that was worse than Lotus Notes, it's being forced to use SAP (and at the time I was also being forced to use Perforce for a marketing website).

I guess this is a sign SAP (or someone at SAP) at least recognizes they've got a real problem, but I doubt they'll be able to turn that boat around.


> The idea of SAP hiring a bunch of maverick UI geniuses is pretty mind-boggling.

Except that in the late 90's they did exactly that. They brought in Frog and what they produced was quite exceptional given the state of native UI at the time.

Not saying SAP is the paragon of innovation, just that even a broken clock is right twice a day.


For a while.

Xerox didn't realize the potential of what they had. It took time for that vision to become practical, and by then it was too late for Xerox. They failed to capitalize on their entire portfolio.


The bigger problem for Xerox was that in 1975, about five years after establishing PARC, the DOJ forced them to license their entire patent portfolio to Japanese competitors. That dramatically changed the dynamic at Xerox, which had been able to think big ideas while riding its copier monopoly.

Xerox, AT&T, and to a lesser extent Google are strong examples of competitive markets being bad for innovation. Bell Labs didn't survive the AT&T breakup in the same form, and I doubt Google would be able to screw around with self-driving cars and wearable computing if they didn't have network effects protecting them from competition.


Peter Thiel likes to make this contrarian point: competition forces you into a deflationary downward spiral where you must spend all your energy fighting with and therefore duplicating your competitors. An example of the latter would be Google Plus, which was an attempt to compete with Facebook by badly imitating them.

Government is the ultimate monopoly, and government research can spend time trying to send people to space, look back to the origins of the universe, and so on.


The problem with Xerox was that their research group produced things far beyond the company's ability to market and execute. Xerox executives thought about the world in terms of printers, not in terms of revolutionary computers. They could have made trillions of dollars if they had executives who truly understood and valued innovation.


They only barely understood the world in terms of printers, too. If they'd capitalized on their entire portfolio they could have taken over the world and we'd probably be a decade beyond where we are now.


"Kay has also tried to help companies re-bottle the lightning that crackled within Xerox PARC during his tenure there."

I worked for a while in one of these companies. It was really nice having Alan Kay around, and some of the other geniuses he was working with at the time.

But there was also a massive amount of money at stake, and far too much heavy shit going down on a daily basis for any of the more weird guys (like me) to get very far. Although I did make a ton of money for that company (me and one of the other weirdos), when it came time for the rest of the developers to join in it was like "what the fk is this?" haha. Good times.


Interesting, what does this mean: "far too much heavy shit going down on a daily basis for any of the more weird guys (like me) to get very far"?


Heavy shit: serious faces, quiet but stern money making, regular momentous meetings in small rooms with closed doors.

Weird guys: hairy, randomly communicative, oddly passionate.


Sounds like copious amounts of MBAs were involved.


I can usually spot a company that has too many MBAs. I once heard a teacher say, "There's nothing more useless to an organization than an MBA! Oops, I didn't mean that the way it came out." This was a business school teacher.

I don't believe that, but today, with so many of us knowing the tricks and whatnot they teach in these programs, if I owned a company, my MBA count would be low. The accounting/statistics aspect of these programs is fine, but the cheap sales/staff ploys that these programs teach are just irritating, and very noticeable.


I have always wondered if a TV/Movie director or producer would make a better manager for a software team than an MBA.


People like Catmull of Pixar have tried to write books about this.

Managing creative people - which may include some software people - is not the same as managing a pool of lawyers or accountants.

Software has a problem because too many managers think all software people are interchangeable development units.

In reality there's a universe of psychological difference between corporate biz-logic devs and the kind of creative lunatics who used to work at PARC - and all shadings between the extremes. Put good people in the wrong environment and they'll be worse than useless.

Same for all the other many possible dimensions.


> Software has a problem because too many managers think all software people are interchangeable development units.

I've observed that a lot of bad software development managers were previously failed software developers. Perhaps one reason they view us as interchangeable is because developers of that level of quality are indeed interchangeable?

Related to my observation that offshoring can make sense if you note that most corporate development projects fail (not much less than half outright, more than half when you add the "declare victory" messes), and that offshoring is a cheaper way to fail.


Absolutely. Anyone who can ship real things would be superior.

The MBA is basically capitalism's answer to the Soviet "party apparatchik."


> My attempts to do Xerox PARC-like things over the years, like Interval [Research Corporation] with [Microsoft co-founder] Paul Allen, most of them have failed for one of two reasons. It’s either so antithetical to the corporate culture that nobody really wants it. Or in Paul Allen’s case, he wanted to monetize everything. He treated [Interval] as an innovation center or product engineering division. He never understood the process of doing invention research. A company should have an invention center because it’s a wild card.

I went out to dinner at SIGGRAPH in the early 90's with some people from Xerox PARC and some people from Interval Research. When one of the Interval people quipped "We're the Xerox PARC of the 90's", one of the Xerox PARC people took issue with that and corrected them: "Actually, we consider ourselves to be the Xerox PARC of the 90's."


I had this experience in my own company for a very, very brief time, when we had enough money for it. We were trying to do innovation in our R&D department first, but after that I tried to turn it into something more. Unfortunately, in a commercial entity (as is indicated in the article) this is extremely hard to do: when the time for new budgets arrived, my co-founders would ask where the tangible (as in money) results were. Results did come, and in the end it actually broke even, but the freedom was stripped too soon to really make something happen. Sounds like the ultimate job, to be honest.

And that at SAP, of all places. I worked with SAP software and consultants, and as many here who have worked with SAP will probably agree, that is a far cry from this invention lab...


If you want freedom to do research without yearly commercial results expected there are still places like Microsoft Research and Oracle Labs where you can do this.


I can't imagine Larry Ellison letting anything out of Oracle Labs unless they invent a new way to permanently tie customers into existing legacy software.


I work at Oracle Labs and almost everything I do is done fully in the open, open source on GitHub, with lots of papers, blog posts and conference talks to explain how it all works.



Yeah, but Larry is really imaginative at finding ways to use awesome technologies to do that.

(haha only serious ...)


They're probably working on longevity treatments for a patient base of one: Larry Ellison.


Agreed, but I saw (and see) actual benefit for slightly smaller companies as well. I think any 200+ employee company should have both innovation and invention departments. Even services companies benefit from this imho.


Xerox PARC systems were great, and it is a pity that the mainstream adopted only a few pieces of them rather than the systems as such.

Every time I dig into their archives I come away marvelling at how programming might have felt on those workstations: memory-safe programming languages for the whole stack (+ Assembly of course), automatic memory management, an OS-wide REPL, visual debuggers, code correction, modular systems...

Especially since Smalltalk and Oberon (Native and BlueBottle) allowed me to have a glimpse of it.


Yep. The standard we accept for what constitutes a "workable" development environment has essentially trended towards "what is the crappiest workflow that can be tolerated by a professional programmer working 40 hours per week on it".

There's no real incentive to market programming tools that have a higher standard for usability, because there's no ecosystem of sensible "there is consensus that this should be a thing and this is a sane way to do it" programming tools, so you can't work at that level anyway. Someday there will be a market again for sane development environments, but right now everyone just powers through the learning process until they get to the "my mind is warped enough that I can comprehend a build file this evil" level, because that's where the money is anyway. It's a chicken-and-egg problem. No ecosystem, no market.

It's just an accident of history that such a thing ever existed in the first place. A waypoint on the way to the era of industrial mindfuck programming.


Question: Why don't we see this rate of invention in the open source community? It is already (apart from its corporation-sponsored part) an artistic community; people work on stuff they like, there's no business pressure, and so on.


Lots of reasons. OSS usually tries to duplicate something that's already there, or make small improvements. There's a lot more chance of a result that others will find impressive when doing something incremental, because you are building on all the stuff that's already there. In fact, academia is largely that way too: do a small increment.

Doing something that's true invention is far more difficult, and the results are far less impressive in the short to mid term. As an example, you probably won't find Objective-Smalltalk [1] very impressive; I would say that a good part of the reason for this is that it is really trying to invent something new. This is slow/difficult because you must eschew the easy/obvious answers. It is difficult to communicate for the same reasons: everything is subtly strange.

[1] http://objective.st/


Some of academia looks at completely novel approaches rather than increments, but the main issue there is that after the budget is gone, or the PhD has been issued, the project is left dead in the water, since a new project, a new budget, and a new paper to publish are required. There's no such thing as a "10 year project" in academia.


> OSS usually tries to duplicate something that's already there, or make small improvements.

Rust? I'm sure there are many other counter examples too.


Rust is hardly groundbreaking. It's a mix of imperative/functional styles that are well known, thrown in with pointer ownership, which had been researched and toyed with in various forms for years before. It's a small improvement on C and C++. Making a practical implementation is praiseworthy, but I'd say this hardly counts as an invention in open source, and it's particularly unrepresentative of the FOSS community anyway because it's backed by a big company with a big research pot.

The kind of game changers Kay is talking about are not that easily approachable. You won't be able to take your existing knowledge of language X and suddenly see how it mostly applies to Y too. If that were the case then you've not really changed the paradigm, only given a glimpse of how it could look from the existing one.

And the reason you don't see many of these kinds of innovations in the FOSS world (although they definitely exist), is because they don't gain traction. If something is clearly new and takes significant effort to learn, very few people are going to take the time to investigate it. Meanwhile, solutions which fit well into the existing paradigm are easily accessible by masses of developers, and they flourish. This is probably one of the main reasons that real invention is rare: people are after fame, and adoption rates aren't going to go up quickly if you challenge existing conventions.


Thanks, you put it much better than I did.


Because the majority of the open source community are either people building up their CVs or working on side projects.

Innovation like the one at Xerox PARC needs lots of money.

Imagine how much an Interlisp-D, Smalltalk, or Mesa/Cedar workstation would have cost in the 70's versus a plain PDP-11. That is also a reason (among many others) why the market didn't adopt their technologies.


> Innovation like the one at Xerox PARC needs lots of money.

Rather, I'd say it needs people working full time. With things like basic income, this can be decorrelated from "money" a good deal.


Check this video running in another thread. This is a well-known Alan Kay talk; he has given it a few times already.

https://www.youtube.com/watch?v=NdSD07U5uBs

As an example of the type of research being done: you must be willing to build, at today's prices, something that will be considered a commodity in 15 years' time.

You won't get there with basic income.


Basic Income can be a great stepping stone to cultural improvements, which include things like tech. It is true that some things need an allocation of resources which Basic Income doesn't directly address, but a society with Basic Income might be more reliably capable of getting it done.


I agree with basic income for everyone.

I just don't agree it is enough to innovate at Xerox PARC scale, as what is being discussed is funding, not salaries.


The researchers at PARC needed lots of money. For example, they built the Alto. They then used the Alto as a tool to design lots of other great things.

A product like the Alto doesn't get done by a few hipsters subsisting on "basic income".

https://en.wikipedia.org/wiki/Xerox_alto


Building an Alto then required lots of money, you can prototype fairly complicated things now on a cheap FPGA.


Yes. Of course.

But, to quote Wayne Gretzky:

   I skate to where the puck is going to be,
   not where it has been.
You can build today's complicated things cheaply. But if you really want to achieve a quantum leap you need to think 10 years out. It is not cheap to build today the prototypes of things that will be desirable in 10 years. That is what Xerox PARC was doing.

It is tempting to look at an FPGA and say that it only costs $10. Or maybe even $1000 for the top end. But that's the tip of the iceberg. Designs that go into a high end FPGA could easily require 10 man-years of engineering time. That's for a single FPGA hardware design itself, not the system it goes into, not the associated software.

Once you need to do those sorts of things you won't find the right people by recruiting basic income hipsters that are lounging around at your local Starbucks. And you will need money to pay the considerable non-salary expenses, even if you could convince all the people to work for nothing but equity.

Right now, it's relatively cheap to do software-only things. Which explains why there are so many of those sorts of startups. But it's not cheap to design and build physical things.


I was sufficiently good at building stuff in FPGAs and writing the software to use it that I don't need to work anymore. I see the kind of decisions that I need to make now on what projects to work on as the same as those that would need to be made by creative people receiving basic income.

People seem happy to donate to Kickstarter projects, the amounts needed would be smaller if the participants already had enough to live on.


You are missing the point.

What would be the Alto of today, designed with hardware from 2030, not with a cheap FPGA from 2015?


I don't think I am missing the point. The Alto hardware was fairly simple, most stuff was done in software.

What do you want to build today that is too expensive to try out?


> What do you want to build today that is too expensive to try out?

Virtual reality environments with good graphics. Or maybe something like Microsoft HoloLens: glasses which work like an AR HUD and track your movements, your gestures, and maybe voice. This requires a lot of custom hardware (Microsoft developed ASICs to do this in realtime).

Also... it would be awesome to port Smalltalk to 3D.


> I don't think I am missing the point. The Alto hardware was fairly simple, most stuff was done in software.

It is fairly simple to you today, given that the Alto exists.

It wasn't that simple for those guys in 1968, trying to imagine what a 199x computer would look like (Alan's words). That was their goal.

> What do you want to build today that is too expensive to try out?

That is why they are called inventions, and why there are patents to assign to them.

Nobody knows without doing the proper research.


"The Alto hardware was fairly simple, most stuff was done in software."

It was pretty simple. I've used and programmed an Alto, back when Stanford had a few around. It was basically a reworked Data General minicomputer in a small rackmount case: 16-bit, word-oriented, programmable in BCPL or Mesa at the low level and Smalltalk at the higher level. The removable hard disk was the same as a DEC RK05 cartridge disk. The Ethernet interface was very simple, and coax Ethernet was electrically simple. Alan Kay referred to Ethernet as "an Alohanet with a captive ether". The CPU was microcoded, and cycles were stolen from the CPU's microcode engine to run the peripherals. None of this was pushing the state of the art. The CPU hardware was a minimum viable product.

The keyboard, mouse, and display were all new, and nicely engineered. Most of the hardware effort went into those. The keyboard had nice key switches and a massive metal casting. The display was the first good black-on-white display, with a big portrait-format screen. The original mouse wasn't that great.


Thanks for the insight.


> What do you want to build today that is too expensive to try out?

That, as they say, is the $64,000 question.

I sure don't know. I'm not creative enough. Do you know?

My intuition is that there is about 0.01% of the population that is visionary enough to properly answer that question.


Maybe the real question is how your 0.01% can find each other in order to discuss ideas.

From what I have read, the people at PARC got there by similar routes: there was one group from SDS and another from Evans & Sutherland. I'm not sure we still have the same kind of "landmarks" that will attract the right people.


> Maybe the real question is how your 0.01% can find each other in order to discuss ideas.

I'm one of that 0.01%.

I encourage others to contact me to get the discussion going.


> That, as they say, is the $64,000 question.

You are more right than you think. The initial manufacturing cost of an Alto (adjusted for inflation) was...

http://m.wolframalpha.com/input/?i=12000+dollars+from+1973&x...


Wow. That's quite close.

I'm sure you get the reference, but since we're dealing with a worldwide audience, here's where it originates: https://en.wikipedia.org/wiki/The_$64,000_Question

It was a meme: "a common catchphrase for a particularly difficult question or problem"


> Imagine how much an Interlisp-D, Smalltalk, or Mesa/Cedar workstation would have cost in the 70's versus a plain PDP-11.

Eh, if they'd put Altos into serial production, instead of small batches adding up to 2,000 units, it wouldn't have been vastly more expensive. There was nothing both exotic and wildly expensive about them compared to contemporary PDP-11s plus a graphics console. The extras were a mouse, a chord keyboard, a network adapter, and more memory per person than was normal for a PDP-11.


Open source/free software philosophy is solution oriented. A successful open source project serves immediate needs and revolves around implementation.

If you have a problem, you write code that solves the problem and give it to others to use. Software should be at least minimally usable to be adopted. It should work with other software.

Look at the software at Alan Kay's VPRI: http://www.vpri.org/ They don't write software for you to use. They build software to use as a platform to test ideas. If some of those ideas are good, someone may rewrite the whole thing to fit into the current software infrastructure.


Because of money. OSS is usually written as a hobby in developers' spare time, because doing it full time rarely pays the bills: http://www.zdnet.com/article/electricity-bill-threatens-surv...!


Assuming that you're right that we're not (and I'm not convinced; a lot of amazing stuff is coming out of the open source community):

Time.

I'm working on a Ruby ahead-of-time compiler. I started in 2008. I slowed myself down a lot with all the blog articles I've written about it, but even so, at this point it's only now getting close to being able to compile itself.

The reason it's been such slow going is that in all those years I've only put in the equivalent of a few months of full-time work on it. It's not my only project, and working on my side projects has to compete with spending time with my son and other leisure activities.

It takes a lot more people to get the equivalent output of a lab of full-time staff, even more so because this slows down interaction and communication as well.


Inertia, Communication, Noise.

The landscape is very different from what it was in the 70s. They were working with a clean slate, before half of the human population had computer terminals in front of them. Their battle was to get the terminal in front of people, but I'd argue that the biggest accelerator of adoption has been social media, not the way GUIs behave or programming languages function. The web is now the legacy software we're stuck with if we want any impact at this scale.

Say, for example, you had a completely new idea for a general-purpose operating system which simplified things greatly, but was unlike Unix and had no web browser. Now what? People aren't interested - they want their web browser.


Long-term focused thought. I've been running a consultancy for ten years; my current research project, which I believe will shortly become a product, has been around for over ten. We mostly make our money from open source users finding us. But that sort of invention is time consuming and hard; I'm only working on said project because shadow.cat is capable of letting me.


What makes you think we don't?

Most innovation is coming directly from the OSS community: hardware, software, systems, etc.


Examples, please. Particularly software innovations.


The thing to recreate would, in my opinion, not be Xerox PARC, but the SRI bootstrapping idea.

https://www.youtube.com/watch?v=agdPQuFr0yg


Point 2 uses the exact phrase I have used to describe my experience of visiting Valve: "artist colony." Take smart, creative people who often don't fit well elsewhere and let them go.


There's a difference between:

Department of Simulation Research

and:

Department of Research Simulation


One of their projects was on the HN front page a few days ago: https://news.ycombinator.com/item?id=10293368.


This is a thing that happened once. What if it just cannot be recreated?


Or, what if it really is simple to recreate, but we can't do it because we try too hard, in the wrong way? If points 2-4 are the keys to recreating something PARC-like (and I believe they are), then the problem is pretty obvious: those points, especially 2, are the ones most companies try to avoid.

Keeping smart people on a basic income with no obligations sounds unprofessional. Like something we do with children, not with adults. Adults need to have deadlines. Their goals should be related to how their company will make money. Everything not related to the company making money is not important and should not be done at work. -- It's because of bullshit like this that we don't get much real innovation. Companies, and even whole societies, don't feel like subsidizing a playground for smart adults that occasionally spawns ideas, which usually can't be monetized directly or entirely.

But we need that, because real inventions happen in the context of a problem, whether real or invented, or sometimes after the fact. The problem needs to be a terminal value for this to work. That is, "solving X" is a good problem; "making money by solving X" is a very bad problem. And if you put "making money" as an input to the invention process, you get shit like obnoxious ads, disguised Ponzi schemes, and growth hacking.


It created trillions in value. Why not try?


Created for whom, and... did it, really, all by itself? I think it's kinda overblown, to put it mildly.


Well, this is the problem with our industry. Any physicist will know who Niels Bohr was and what he did. Not our industry. Computer Science has become a field for twenty-year-olds whose memories span months, or ego-driven Sheldon Coopers who can't be bothered to credit anyone for anything.

Frankly, I'm embarrassed to work in this industry.


Yeah, Alan Kay says things to that effect, too. Another way to look at it is, assigning all the credit for those "trillions in value" to Xerox PARC (my oh my, creating so much value and capturing so little? how did that happen?) is a warped way to look at the history of computing and thus is the perfect example of the problem that makes you "embarrassed to work in this industry."

As to Niels Bohr... I think the transistor or the concept of decidability are more like what physicists do/achieve than OO or WIMP interfaces. One can disagree with that but one can also agree with that, I think, even if their age is above 20, their memory spans whole years etc. etc.


Actually, not every physicist knows what Niels Bohr did. I know a very smart and hardworking physicist who did some complicated math on the details of a particular renormalization procedure in quantum field theory but could not explain how Bohr's model of hydrogen works.

As for the knock on Computer Science, aren't you confusing computer science with a programming job? Do you really mean that the science is done by 21-year-olds with no long-term memory? I find that hard to believe.


Xerox didn't do it all by themselves. Listen to one of Alan's talks and he'll tell you how they stood on the shoulders of guys like Doug Engelbart and Ivan Sutherland. Xerox didn't invent the research atmosphere that facilitates giant-leap innovations; they brought in people from DARPA/SRI who were already doing it.

Don't get me wrong, Xerox deserves a lot of credit for making some huge contributions to that sphere, but we know they didn't do it all by themselves.


This is like hearing that a major record label gave The Beatles (all four living) a recording contract. You would scratch your head and ask why it didn't happen earlier.


I agree - universities stress a particular programming language rather than taking a more generalist approach. One of my friends transferred from a university in Queensland, Australia to Sydney. She could not transfer the credits for one of her courses (Intro to Comp Sc) because she had taken the course in Java and not in Python, even though if you look at the concepts, there was significant overlap.


Here's a fair question to ask Alan Kay: He had roles as a "fellow" at Apple and Disney and HP.

What did he do to recreate Xerox PARC's "magic" while at these places?


He was a fellow at Atari too, without helping them very much. Alan's a great guy, a great speaker, and he made fundamental contributions to OO with Smalltalk. But probably his single biggest contribution was that he funded the creation of the Alto out of his group's budget. (So as a VC rather than as a scientist or manager.)



