Why Google isn't our Bell Labs (begun.co)
143 points by zchry on Dec 17, 2013 | 122 comments



The Bell Telephone Company/AT&T was also a privately owned, publicly traded company with an obligation to maximize profits for shareholders.

I think there are probably a lot of differences between Bell Labs and Google. Some of them are due simply to differences in historical context between the '60s and the 2000s; some are probably due to how Bell's monopoly affected its R&D approaches, sure. It would be interesting to delve into this.

But they are definitely not about the "fundamental differences between a publicly traded company and a state-sanctioned monopoly," the OP is just confused. Bell/AT&T was a publicly traded company, as well as a state-sanctioned monopoly.

Also, in the histories of Unix that I've read, Bell/AT&T hardly wanted to contribute Unix to the 'public domain'; they wanted to market/commercialize it, they just failed -- and never really realized the potential market value of what they had. There were also issues of AT&T being forbidden from entering some aspects of the computer business by a 1950s antitrust consent decree. (The UNIX developers, on the other hand, definitely wanted to share it, and often had to act under the radar to do so. Another historical difference is that they could get away with that.)

I think the author is perhaps guilty of romanticizing Bell Labs, just as they accuse others of romanticizing Google!


Glad you point out that Bell Labs' gifts to the public domain were involuntary (they distributed Unix source code without giving away the rights, then took them back; UC Berkeley then created a free replacement so they could keep teaching OS courses, and AT&T tried to sue them into oblivion and lost).

If you want to make the argument that Google isn't Bell Labs, it's probably better to point out that Google isn't tackling really fundamental research projects like inventing the transistor and the laser. Even producing Go isn't on par with C, because C was amazing and unique for its time, whereas Go is just part of a flood of new languages.

Bell Labs:UNIX::Google:Linux distribution.


I've always had the impression that MSR (Microsoft Research) was generally doing much more fundamental research than Google.

Google still has a ways to go to even be on par with HP Labs or PARC, let alone Bell Labs.

Let's not forget all the research IBM Research Labs did over the years (less so now due to focusing on profit, but they were still doing more fundamental research than Google even in recent memory).


Isn't Google doing fundamental research on AI?


No. They give prizes, maybe fund researchers, but they don't assign people to do fundamental research. Unless you call the YouTube cats and driverless cars fundamental.


Google isn't doing basic research like measuring and discovering the background radiation from the origins of the universe.


Actually, Google has helped to discover planets, among other things

See, e.g., http://research.google.com/pubs/author37653.html


That was an accident that turned out to be pretty damned amazing


"chance favors the prepared mind"


But if they wanted to, it's more than within their ability. I'm sure they have plenty of idle computing time to do so, with the proper physical sensors providing data.


Computing power (or lack thereof) is definitely not the limiting reagent in science today.


A few years back I worked on data taking for the CMS detector at the LHC. We were Tier 1 in the US; we took the data from Tier 0 at the physical detector and distributed it to 800+ orgs around the world. We had 6000+ Linux machines and a staff of 10.

To say computing power isn't the limiting factor is disingenuous. Yes, you can go spin up thousands of instances in Amazon and store PB of data. Show me your budget to do so though.


I think this is only true if you don't consider trillion-fold cost reduction as an option.


> they distributed unix source code without giving away the rights, then took them back

I always found it ironic (or perhaps subversive, not sure which) that the guys in the Unix room later based 8th Edition Research Unix on 4.1cBSD.


What do you think made C such an amazing language compared to, say, Pascal and Fortran?

I like C and how it handles pointers and arrays but was it really such a departure from what was available then?


I'm not dissing Pascal or Fortran. But C was tied to UNIX (and a toolchain such as lex/yacc) in a way that allowed UNIX to be quickly and cheaply ported, which made it revolutionary. Fortran and Pascal lacked similar infrastructure (eventually there was Oberon, which was kind of like the Lisp or Smalltalk stacks, but it never took off).

Lisp and Smalltalk would be the comparables, but Lisp was not portable the way C is and Smalltalk was very slow.


Very true that AT&T was no saint, but it should come as no surprise that Bell Labs is romanticized when one looks at how epic their output was. From a sample of their employees we have: information theory, Karnaugh maps, the transistor, the laser, photovoltaics, CCDs, C, R, Unix, and for kicks, stumbling upon the CMB, enriching our understanding of the universe, its age, and its size. It's mind-boggling. No company today comes anywhere near that in terms of fundamentally rearranging knowledge and society.

I would argue that if Bell Labs was 8 or 9 on the Richter Scale then Microsoft Research, at a 4 or 5, is the closest we have today in terms of independence and diversity of research unfettered by profit concerns.

One guess for why Bell Labs was able to achieve so much is that on one hand they had decades of fundamental new insights by brilliant physicists and mathematicians ripe for the picking (as evinced by multiple cases of identical independent inventions), and on the other hand they had the backing of a monopoly of incredible proportions. The planets don't often align like that.

Monopolies and excess profit are almost always a sign of market failure. However, some have argued that one benefit of a monopoly is that it allows firms to reinvest some of those above-normal profits and reduced costs in R&D (the same people typically argue that monopolies are temporary because of creative destruction). If you consider that only companies like Google, Samsung, GE, IBM, and Microsoft can even contemplate such broad-based research, then it seems there is merit to the idea. Yet the terrain of something as complex as innovative output should not have a single, global optimum.


At the risk of sounding like that guy who said that everything that could be invented, was, I'd add a third point that Bell Labs' heyday was right during a golden era of discovery where there was a ton of low hanging fruit to be had. It was really a perfect storm of science.

I doubt this could be repeated today even if you somehow copy pasted the Bell Labs from 1960 including their staff and culture. I'd love to be wrong about this :-(


3D printing? Drones - flying and driving? Robotics? CERN? Always on, location-aware connectivity? Quantum Computing? Automatic speech and text translation? Amazon-scale logistics calculations? Recommendation engines?

How are we not in a goldener age of discovery now?


I appreciate your positive outlook :-) There is no doubt a great deal of technical advancement under way today, but it seems mostly applied, incremental work. Perhaps we're nearing the end of a Kuhn-like cycle of scientific advancement and some kind of breakthrough will kick things off again.


How is automating driving incremental?

Quantum computing? As a sometimes-Bayesian statistician - do you know the torment I will be able to unleash on the scientific world once MCMC inference is practically free? Just imagine... spending all your time carefully choosing distributions for your data and uncertainty, and the implementation of the inference is an afterthought? I need a moment alone.


Other monopolies have produced great things: Xerox's PARC labs, IBM.

But not always: Standard Oil, East India Company.

Maybe opportunity (as you note) plays a part.


> East India Company

Another issue might be that we are focussing a bit too much on technical innovation. Because if you mean the Dutch East India Company (better known as the VOC), there is no denying that it was very innovative from an economics standpoint.

http://en.wikipedia.org/wiki/Dutch_East_India_Company


Perhaps one reason for the change is also that finding cool new stuff was "easier" 50 years ago. In my opinion all the low-hanging fruit is already taken, and in order to invent a new transistor you need more than just a few guys sitting in a room with little equipment.


Additionally the author is wrong about Google having to maximize shareholder returns.

Google has dual-class shares, so the only people Google really has to please are Larry Page and Sergey Brin.


The parent is spot on. Bell Labs was the first to leave the Multics project.

It took several years for Thompson, Ritchie, McIlroy and Ossanna to get funding.

They did it under the radar with a PDP-7 garbage-picked by Thompson, while sending letter after letter to get funding for a DEC PDP-10 or Sigma 7, which they would use to create the OS and play games/simulations (e.g., Space Travel). Mind you, computer time in the late '60s was upwards of $75 an hour.

After all is said and done, Bell Labs at the time was part of a government-regulated national telecommunications monopoly and was not allowed to sell it. They could license it, though, and did. Berkeley only had to pay for the tapes since they were a school.

FWIW, they didn't get the funding by asking to program an unspecified OS but by proposing to create a system specifically designed for editing and formatting text. The big buzzword back then was "word processing". Also, by then the higher-ups already knew that Ken and Dennis had built something on their own. AT&T bought them a PDP-11, and the first end-user applications were for the AT&T patent department, to be used by secretaries.

Thus, during 1969, we began trying to find an alternative to Multics. The search took several forms. Throughout 1969 we (mainly Ossanna, Thompson, Ritchie) lobbied intensively for the purchase of a medium-scale machine for which we promised to write an operating system; the machines we suggested were the DEC PDP-10 and the SDS (later Xerox) Sigma 7. The effort was frustrating, because our proposals were never clearly and finally turned down, but yet were certainly never accepted. Several times it seemed we were very near success. The final blow to this effort came when we presented an exquisitely complicated proposal, designed to minimize financial outlay, that involved some outright purchase, some third-party lease, and a plan to turn in a DEC KA-10 processor on the soon-to-be-announced and more capable KI-10. The proposal was rejected, and rumor soon had it that W. O. Baker (then vice-president of Research) had reacted to it with the comment `Bell Laboratories just doesn't do business this way!' --Ritchie 1979 http://cm.bell-labs.com/who/dmr/hist.html

Towards the end of the '80s, Berkeley students took a nod from Richard Stallman and put their distribution on the net, which was mostly rewritten at that point (it was Keith Bostic who proposed the removal of non-AT&T code).

AT&T UNIX was licensed at about $200k, while the BSD patchset was just a piece of software you added, if for nothing else, for the TCP/IP stack, vi, job control, curses, mail services, and csh.

I don't for a minute believe that Google or AT&T are in any way for the greater good of society and research, any more than I would believe that Apple or Microsoft care about education and/or removing vendor lock-in. It's not like any of the aforementioned monopolies have invested in real problems like cancer research and/or a cure.

If you want to explore the innovators, it's best to look at the McIlroys, Thompsons, Ritchies, Bostics, Stallmans, and so forth, who really did innovate and change our lives for the better.


AT&T operated under the idea of a steady rate of return for its shareholders, versus simply maximizing the rate of return. So there was money to pay for the labs; each local Bell operating company paid, IIRC, 1.5% of its revenues to the labs for the right to use the work they produced.


I think that's an interesting observation about changes in the past few decades in how corporations see their responsibilities to stakeholders.

I think it's true that 30-40 years ago, compared to now, corporations more often were guided by things other than simply maximizing share prices. It's not unique to AT&T, and has nothing to do with AT&T not being a publicly traded corporation, which of course it was.


I may be a bit biased, because I'm a Googler... but looking through this thread I see all these people downplaying Google's contributions to Go, Android, and Dart, as if they were minor...

But what about all of the papers that have changed the way the world does computing? Some choice links below:

MapReduce http://research.google.com/archive/mapreduce.html

BigTable http://research.google.com/archive/bigtable.html

Dremel Paper http://research.google.com/pubs/pub36632.html

Chubby Paper http://research.google.com/archive/chubby.html

Urs Holzle on OpenFlow: http://youtu.be/VLHJUfgxEO4

Megastore Paper http://research.google.com/pubs/pub36971.html

Spanner Paper http://research.google.com/archive/spanner.html

Granted, some of these will take a while before having implementations outside of Google... but you can't deny the impact of this work...


Papers that have changed the way the world does computing? You must be kidding. A few thousand companies using (largely) poorly constructed products based on some papers is not changing the world of computing.

Bell Labs created the first transistor, for goodness' sake. The photovoltaic cell. The first gas laser! Wifi! TDMA and CDMA. The CCD. Hell, half the concepts of modern operating systems came from Bell Labs.

Mind you, Bell Labs had 90 years to do all this, but to even pretend like Google compares is hubris.


Most companies running data centers are now copying the way Google runs data centers; Google essentially reinvented large-scale data center management. Most of the online services that most people on planet Earth consume now leverage, in some part, the knowledge that was published in the papers listed above.

You can look at data centers before and after Google the same way people look at smartphones before and after the iPhone.

And Android didn't change things? Practically every new non-Apple consumer electronics device that has any kind of UI uses the AOSP.


> Most companies running data centers are now copying the way Google runs data centers

Most companies don't run data centers. Of the ones that do, the vast majority don't look a thing like Google, because very, very few companies have computing requirements that look anything like Google's.

> And Android didn't change things?

Changing the marketplace doesn't mean innovating. You can be very successful (as Google is) without leaving so much as a footnote in history of invention.


> Most companies don't run data centers.

Most companies don't build transistors. Just sayin'.


> Most companies running data centers are now copying the way Google runs data centers

Most large tech companies perhaps but once you get out of a few major players this sadly isn't yet the case.

> Google essentially reinvented large scale data center management.

Almost everything Google has done is an evolutionary refinement of long-running practice in HPC data centers, which had long been doing automated provisioning and management, custom power & heat management, etc., for the same reasons: they couldn't afford not to.


Could you disambiguate something for me? I was under the impression that more people ran things the Amazon way than the Google way. That is, I see various places which have OpenStack and Amazon EC2/S3 portability, and I assumed they looked more towards Amazon for how to run this sort of thing than Google. (E.g., to my limited understanding, the Amazon details are better published.)

While there's some mix and match, and Google and Amazon no doubt learn a bit from each other, how do you conclude that Google is the main reference?

For example, if I search for "how to run a data center" I find http://www.forrester.com/Five+Data+Center+And+IT+Infrastruct... , which talks about "Amazon. Salesforce.com. Microsoft. Rackspace" in the abstract, and does not mention Google.


Bell Labs didn't invent either the transistor or the laser: https://news.ycombinator.com/item?id=6911981

I give Bell huge credit for the work they did - in most ways that is more important. But don't undersell Google's work in the fundamentals of AI science and their contributions to software engineering.


They also didn't invent the photovoltaic cell (that was 19th-century tech; Bell Labs was founded in 1925), nor Wifi (that was NCR, which was acquired by AT&T after they did WaveLAN, not to mention ignoring significant work by other parties like CSIRO), and the theory behind CDMA also predates Bell Labs' work on it.

They were certainly a giant, even the giant, but the GP is giving them credit for other people's work. If the argument is "invention counts, refinement doesn't" in an attempt to exclude Google, well, Bell loses out a lot as well.


I don't know what the few years after the transistor's first appearance actually seemed like, but we've got 65+ years of perspective working for us to appreciate it.

Those "papers" do represent actual inventions, no matter how condescendingly you say the word "papers". And inventions only a handful of years old at that.

I'd venture a guess that many of AT&T's inventions were (or would be, given a wider audience like we have today) looked down upon in a similar way - there were probably people who said "Ha! A toy like that? A couple of screws and some silly wires? We've already got _____ that can get that done; that clunky thing will never be practical."

---

Then again, I also think it's a good thing that with more resources at our disposal we can take such a critical eye to work being produced by capitalistic companies, and as practices and inventions enter the industry, more easily take them for what they are really worth.


Keep up the good work! Google is a giant standing on the shoulders of giants. The naysayers with their devil's advocate arguments are just keeping you on your toes!

Google did not invent the search engine; they reinvented it. We can thank Google rather than anyone else (including Tim Berners-Lee) for placing the wealth of the world's information at our fingertips. Walk into any office and you will see Google Maps on screen somewhere. Again, Google did not invent maps, but it is Google Maps that everyone uses.

Thanks to the efforts of Google we no longer need to know 'general knowledge', i.e. the stuff that wins prizes in quiz shows. We can use our brains for storing more useful stuff. This is tantamount to evolution. As mentioned, keep up the good work!


According to Wikipedia, Bell Labs most notably contributed to the invention of "radio astronomy, the transistor, the laser, the charge-coupled device (CCD), information theory, the UNIX operating system, the C programming language". Google doesn't come close to such groundbreaking innovations [1]. Granted, there's not a whole lot of those these days but it can't be a coincidence that innovation slowed down at the same moment Bell Labs was downsized. Google doesn't help with that.

[1] MapReduce is probably the most important thing Google researched but it only helps with companies' computing load, much like AT&T's telephony protocol improvements we never talk about.


> But what about all of the papers that have changed the way the world does computing?

Is this a joke?


It's like comparing invention of a new hemorrhoids drug to the discovery of DNA.

One has niche uses and the other creates new industries and revolutionizes science. It's hardly the same.


You guys did not invent map-reduce!!


There is map-reduce the algorithm, and then there is map-reduce the distributed computing mechanism. Unfortunately they share a name, but Google engineers definitely did invent the map-reduce being referred to here.
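For what it's worth, here's a rough single-machine word-count sketch of that programming model (in Go, purely illustrative; nothing here is Google code): map emits key/value pairs, the framework groups them by key, and reduce folds each group. Everything interesting about Google's MapReduce is what this sketch leaves out -- the distributed shuffle, partitioning, scheduling, and fault tolerance across thousands of machines.

    package main

    import (
        "fmt"
        "strings"
    )

    type KV struct {
        Key   string
        Value int
    }

    // map phase: one input record -> zero or more intermediate pairs
    func mapFn(doc string) []KV {
        var out []KV
        for _, w := range strings.Fields(doc) {
            out = append(out, KV{Key: strings.ToLower(w), Value: 1})
        }
        return out
    }

    // reduce phase: all values for one key -> a single result
    func reduceFn(key string, values []int) KV {
        sum := 0
        for _, v := range values {
            sum += v
        }
        return KV{Key: key, Value: sum}
    }

    func main() {
        docs := []string{"the quick brown fox", "the lazy dog", "the fox"}

        // "shuffle": group intermediate values by key (in-memory here;
        // in the real system this is a distributed sort across workers)
        groups := make(map[string][]int)
        for _, doc := range docs {
            for _, kv := range mapFn(doc) {
                groups[kv.Key] = append(groups[kv.Key], kv.Value)
            }
        }

        for key, values := range groups {
            fmt.Println(reduceFn(key, values)) // e.g. {the 3}
        }
    }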


They most certainly did not, any more than Apple invented the smartphone — the core algorithm would have been familiar to a mainframe COBOL programmer, after all, and the distributed aspect was common under other names in scientific computing or relational databases. I'd be somewhat surprised if they didn't first hear about it at some place like Stanford and realize how much promise there was in a cleaner, less crufty implementation than the various HPC job schedulers or hacked-up shell scripts managed by hapless grad students around the world.

None of the above is intended to diminish their achievements: in fact, I think the reverse is true because the industry history is littered with people who had great ideas and never really implemented them well. Big ideas just aren't worth as much as the many engineer-years of hard work it took Google to go from a concept to massive-scale HPC being a routine, commodity service. It's easy to say “Google succeeded because Larry and Sergey had two key Big Ideas” but that ignores the huge accomplishment they made in actually building a sustainable company which took those ideas to massive scale.


Those contributions are great, but most of them will fade out after several years.


Go, Android, and Dart are minor.

The biggest impact is Android, and that's revolutionary how exactly?


Search and maps weren't revolutionary? Easy to miss, I guess - after all, who ever uses those niche items?


It's inventions like the Karnaugh Map and the MOSFET that changed how the world does computing. I don't mean to downplay the technical merits of BigTable or Megastore, but all they really changed is how salesmen sell things.


Actually, map-reduce as a programming paradigm predates Google by a long way - Dialcom/BT were using it back in the '80s for billing systems.


In reading The Idea Factory[1] it became incredibly clear that Bell Labs only released and licensed a great deal of this technology as the result of various antitrust settlements that plagued the company throughout its entire existence. Also, part of the role of the labs appears to have been to give the company something to "show off" whenever congress or the DOJ complained about the extraction of monopoly rents. I highly recommend the book; it was really fascinating to see the degree to which many of our assumptions about the functioning of the labs and its relationship with the corporation are in fact historically inaccurate.

[1] http://www.amazon.com/The-Idea-Factory-American-Innovation/d...


Indeed "The Idea Factory" is a very good book recommendation. In the conclusion of the book the author argues that the Howard Hughes Medical Institute[1] may be the closest existing research organization to Bell Labs. While much smaller in size than Bell Labs it shares a focus on basic research and is well funded for the long haul.

[1] http://en.wikipedia.org/wiki/Howard_Hughes_Medical_Institute


Another theme of The Idea Factory is relevant to this discussion. The Bell System was the hardest thing people tried to build in the 20th century, and it couldn't have been done without valve amplifiers, transistors, information theory, satellites, and masers. Perhaps Google's technical staff won't become that good unless Google sets out to cure cancer, end war, or something equally ambitious. And they don't have to do it out of altruism: if Americans had to pay a trillion dollars a year for a war avoidance system, they'd be silly not to.


I think that is the point of the article. Bell Labs had to release their research, Google does not have to. Well maybe the author of the article should replace "state sanctioned" by "state tolerated" in his description of Bell's monopoly but the key point remains - the difference between Google and Bell Labs is (was) the government involvement.


Thanks for the book recommendation, I swear I get the best nonfiction books from HN.


> Google, as a publicly traded company, has an obligation to maximize profit for shareholders — and there’s nothing wrong with that!

Why is there always this knee-jerk instinct to disclaim any critique of unfettered capital? It unhinges the basic thesis of his argument to say "there's nothing wrong with that." The claim is obviously that there is something -- albeit loosely defined -- wrong with it, which is why the comparison is being made in the first place. Let's feel free to have a real discussion about whether there's something wrong with the primary motive of enterprise to be maximizing shareholder profit, shall we?


> Why is there always this knee-jerk instinct to disclaim any critique of unfettered capital?

As a dude who makes this disclaimer when trying to make a more subtle point: it acts as a sort of preemptive dismissal of the just-as-kneejerk "SOCIALIST!" or "ENTITLEMENT!" critiques. A way to separate critiques of a business model itself from someone saying "in this one context, this business model has some ramifications that aren't so hot" is useful.


Except that a nasty ramification is itself evidence in favor of a broad critique of the business model. And what's wrong with socialists?


> Except that a nasty ramification is itself evidence in favor of a broad critique of the business model.

I guess, but it doesn't necessarily mean that. You can probably extrapolate any small point into a larger one if you want, but it'd be disingenuous to paint any "this thing isn't perfect" point as a broad critique.

> And what's wrong with socialists?

I didn't say there was. It's not uncommon for people to loudly dismiss things that even hint at questioning their world view with eye-roll-inducing cries of socialism. If we're not actually discussing socialism, this can be a time-wasting distraction, and it's often worthwhile to nip that tangent in the bud.


People say that to justify the "yea, capitalism!" statement that is about to follow. But it's not true. Google doesn't have an obligation to do squat, at least not legally (and most who claim this use the phrase "legal obligation"). Sure, the board can vote out some C-levels (maybe), but no one is going to jail. The next time you hear someone make that claim, know that what follows is likely as equally uninformed or follows an agenda.


Not to mention, the entire premise is idiotic. Knowing what will produce most profits is not a simple arithmetic problem. Google could dump half their profit into advanced research and just claim "it'll pay off". Pretty much any action outside embezzlement could be claimed to be done to increase profits.


I think what we can assume is that over the long run, and averaging over all public corporations, shareholders are willing and to some degree able to enforce their interest in profit maximization. That doesn't mean it's true for every single corporation at every single moment in time. And you're right, it's impossible to know whether any individual action maximizes profits or not.


And I think it's actually an obligation to look after the shareholders' interests, presumably under the threat of the top dogs being replaced. To my mind that's an important but subtle difference from the "legal obligation to maximize profits".


Bell Labs most certainly did not release Unix into the public domain:

http://en.wikipedia.org/wiki/USL_v._BSDi


Technically accurate, but it's not that simple.

Under a 1956 consent decree in settlement of an antitrust case, AT&T (the parent organization of Bell Labs) had been forbidden from entering the computer business. Unix could not, therefore, be turned into a product. Indeed, under the terms of the decree, Bell Labs was required to license its non-telephone technology to anyone who asked.

AT&T made Unix available to universities and commercial firms, as well as the United States government, under licenses. The licenses included all source code, including the machine-dependent parts of the kernel.

-- http://en.wikipedia.org/wiki/History_of_Unix

So, while they did not actually release it to public domain, they got about as close as they could have without actually doing so.

Remember the GPL didn't exist until 1989. Also, I would speculate the generally free access to Unix V probably helped bring about the modern environment of open source, which didn't really exist then.


According to Wikipedia[1], AT&T/Bell Labs UNIX license fees (for non-academic use) during the '70s and '80s ranged from $20,000—$200,000, and the terms were not even close to public domain, free, or open.

[1] http://en.wikipedia.org/wiki/Computer_Systems_Research_Group


I like how people write a blog post, get tens of comments and feedback, and yet haven't done the simplest research required to say something in public with this certainty.


> In fact, despite being a part of a state sanctioned monopoly, Bell Labs produced a staggering amount of freely-available knowledge

Wasn't it because of the state-sanctioned monopoly that Bell Labs released so much? I thought that was part of the deal they made with the government.


Yea, I came here to ask the same question. I don't understand the presence of "despite" in that sentence.


Solution: we make Google a state sanctioned monopoly?


Google to Bell Labs is not an apt comparison. The better one would be Google to AT&T. Both Google and AT&T have components (like Bell Labs) that produce open technology. The author's omission of Go, Dart, and Android is glaring.

This article will look even sillier in the future when we're being taxied around in our driverless cars, after Google has had the decades that Bell Labs had to make its accomplishments.


I think the big difference is that Bell Labs did much more fundamental research: basic science, physics, material physics, semiconductors. Google is putting a good deal of money into R&D, into engineering research like the self driving car and now robotics, but not basic science the way Bell Labs was.


People seem to be bringing up Go, Dart, and Android as things to debate over, but there are some other potential things to consider, like MapReduce. In fact, I'd argue that MapReduce has had a far more profound impact than any of the previous three. It's hard to find a large-scale data processing pipeline that isn't essentially built on top of the principles that MapReduce established.


Do you really believe that? Map and reduce were available in FP languages in the 1970s! Yes, Google has a nice implementation for doing it in a distributed compute cluster, but there was no new discovery there.


As I mentioned in another post, the map-reduce concept that has been around for quite some time is not the same as the MapReduce process that Google has implemented. Google has a way to run it super-efficiently on data centers, with distributed disks, processors, networks, etc. That is Google's oft-emulated MapReduce.


And that's also why people don't get the expected magic out of clones that implement only a part of said stack.


I remember writing mappers in PL/1G, which appeared to be much easier than this bloody Java :-)


Surely everyone knows that the people who created UNIX, C, and Unicode are the same people who have created Go. Right? Ken Thompson? Rob Pike? You certainly know who these people are. Do you care what company they currently work for?


Ignoring the obvious difference in reach and success, in what way are Go, Dart and Android not equivalent to C, C++ and Unix in terms of open software projects?


* Go is just a less crufty C with a few niceties (modules, gc, coroutines; a short sketch of the coroutine bit follows at the end of this comment).

* Dart, imho, is just an attempt to capitalise on Java expats, like they did with Android, but this time in the browser. It's sadly a much less interesting language than Javascript, which it aims to replace. It does however fit the bill of getting more enterprise friendly software running in the browser and in the cloud.

* Android is a pretty crappy Java runtime running on top of Linux. It's not at all interesting.

What all of these things lack in comparison to C, C++, Unix, and even Java for that matter, is broad, industry-wide repercussions. None of them is the culmination of years of careful research. If you look for the huge public-facing industry epochs out of Google, you're looking at marketing and social change, not individual technologies.
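To make the "coroutines" nicety from the first bullet concrete, here's a minimal Go sketch (my own illustration, not tied to any Google project) of the lightweight concurrency that plain C doesn't give you out of the box; the runtime scheduler and garbage collector do the bookkeeping a C version would have to do by hand.

    package main

    import (
        "fmt"
        "sync"
    )

    func main() {
        jobs := []int{1, 2, 3, 4, 5}
        results := make(chan string, len(jobs))

        var wg sync.WaitGroup
        for _, n := range jobs {
            wg.Add(1)
            go func(n int) { // one cheap goroutine per job
                defer wg.Done()
                results <- fmt.Sprintf("job %d: squared = %d", n, n*n)
            }(n)
        }

        wg.Wait()      // wait for every goroutine to finish
        close(results) // then drain the buffered channel

        for r := range results {
            fmt.Println(r)
        }
    }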


Android has industry-wide repercussions. I'm too young to remember but my history says that UNIX was regarded as a crappy, uninteresting operating system for years.

Go's dealing with a much more established computing environment than C, so it's not going to take over the industry overnight, but a lot of us think it hits the niche between C and Python very well, and could (with enough time) dislodge Java as the choice for most server-side development.

Agreed on dart.

More importantly, look at how much Google stuff is not public: their whole distributed computing infrastructure. That's why they're not Bell Labs.


I always loathe these attempts to compare how "innovative" the past is with the present, when the relative ease of mining entirely new fields of endeavor is not considered. One might as well complain that humanity's days of invention are clearly behind us, because in the past thousand years we've come up with at most one invention on par with Fire, The Written Word, and Agriculture. And piffle, Computing is hardly anything more than an obvious extension of other things anyhow.

We do not get to discover brand new fields of endeavor every day. Of course Go isn't as "innovative" as C... probably no computer language can be as innovative as C ever again. (Or Lisp, or a couple of others.) Even if one were to somehow be constructed (or simply pulled from the future somehow) it would almost certainly still have some sort of pedigree that could be traced whereby people could poo-poo its innovativeness. This is not a weakness, this is a strength of the richness of the field and how much exploration we've done. We don't get to discover new fundamental things every day precisely because we've done such a good job of exploration in so many fields, not because we've lost the ability to explore.


I disagree that there is no room for PL innovation left. I think the situation with Go is that the authors were deliberately trying not to innovate too much in the interests of familiarity.

There's lots of programming language innovation going on, you just don't see it in languages like Dart and Go. Which, by the way, is fine; Go has plenty of reason to exist without being innovative from a PL standpoint. (I would point to languages like Clojure, Kotlin, Scala, and C# as examples of industry innovation in programming languages.)


I guess this is sort of flame-baity, but what did C# really innovate in? Even the highly-lauded LINQ is a few functional features + reified code (and rather ad-hoc at that). I know LINQ is award-winning, and probably changed the industry by putting functional concepts in front of tons of people that wouldn't have otherwise used it. But is there anything even remotely new in C#'s actual language design? (Yes, anyone can be a critic.)


async/await and "where" clauses for variance/existential types, perhaps?


Async workflow was in F# about 5 years before C#, and implemented purely as a library - no hardcoded keywords needed. As I understand, any language with monad syntax can create such a feature.

The new generic variance is actually interesting. AFAIK, the MSR team had that as part of the spec, and it's been sitting in the CLR since 2.0. It's curious that C# is the only language besides MSIL to expose the feature.


I agree, but I do tend to think true innovation moves around in ebbs and flows, comes in spurts, and is mostly the product of hard work and necessity.

Computing has reached a point where resources have become so cheap we don't know what to expend them all on. There's still a hard compsci core and masses of unsolved problems, but most of us live in this nasty ecosystem of commodity "innovations" where it's not even clear there's an incremental improvement (and if there is, it can be a matter of opinion), let alone something that's going to indisputably change the way most humans live for the next half a century.

I don't suppose it's a bad thing, it's just that once a technology reaches a certain level of awesome we expect the flow of goodness to continue, and blinker up and squint at it long after looking for novelty and ingenuity in really small details.

The only way I can get a sense of perspective sometimes is to talk to my grandparents, who washed in a tin bath and witnessed TV, washing machines, microwaves, and even modern soaps, shampoos and detergents come into the home. I swear my gran still revels in modern food processes that I take for granted, and sometimes even despise. Somehow incremental improvements in programming language semantics don't seem to compare to the fact that 2 years ago I had laser eye surgery, or that we have the capability to print chips at unfathomable nanoscopic sizes.


You're right that the past was not 'more innovative', but I think it's misguided to think we are not discovering new things every day - we certainly are. They're just not noticed immediately because of relative 'obscurity' to an existing paradigm we have built around imperative and structured models of computing. This is the real difference between the past and now - in the past they were free to explore without restraint, but now we are carrying baggage that can't simply be dropped (the requirement to remain compatible with C, JS or whatnot, and worse, the unwillingness of many programmers to learn new things).

Even C had baggage, from (B)CPL and Algol. C wasn't some magical innovation that appeared, but the result of incremental improvements to an existing paradigm, and the popularity of C++, Java and the likes happen for the same reason.

Conversely, LISP was hugely innovative because it introduced a new paradigm for thought and experiment, which led to even more new ideas appearing that wouldn't have been discovered if we were only stuck in an imperative world, but some of those ideas were then able to be brought back into the imperative world which brings about more innovations. Likewise, integrating procedural ideas into the mathematical world has brought about new innovations too.

I don't see languages like Dart as very innovative because they barely add anything new to the mindset. It's largely existing ideas that have been tried before, getting rid of some of the bad ideas of past, and putting it together in a neat package, however, it's still very much in the same existing paradigm, and incremental improvements to an existing paradigm only tend to solve small problems.

Bigger problems will be solved by completely new ways of thinking, but one needs to be able to abandon existing frameworks in order to explore them, then worry later about how to integrate the good ideas into existing models. This can be seen with LINQ - an innovation which cleverly linked ideas from FP into C#, but which no imperative-only programmer could've ever imagined within their limited paradigm.

I think the reason people are so disappointed with Dart, and somewhat with Go, is that there's a huge amount of innovation in programming languages outside the popular paradigms, which they have largely ignored for the sake of making 'simple' languages that appeal to popularity; but many hackers are more concerned with discovery and advancing our subject.


I'm old enough to remember and as far as I can recall it was never considered "crappy" or "uninteresting". The command-line interface was considered a bit dangerous (rm /<return> OOPS!) A lot of people were trying to get their hands on it to escape from the stranglehold of IBM and the other non research-friendly OSes.


You aren't that old, son: http://web.mit.edu/~simsong/www/ugh.pdf


Unix was certainly an improvement on Multics.


Anything on a billion devices (in five years) is very interesting. Android lacks broad repercussions, wut. Because it wasn't crafted by CS titans at the dawn of computing?


No, because it's a kernel Google didn't invent, with a runtime that uses a language they didn't create, and it borrowed a lot of ideas from a competing OS that beat them to market.

Android is important. Is it really innovative? I don't see how.


The comment you're replying to wrote:

"Ignoring the obvious difference in reach and success, in what way are Go, Dart and Android not equivalent to C, C++ and Unix in terms of open software projects?"

You replied with:

"What all of these things lack in comparison to C, C++, Unix, and itself Java for that matter, is broad, industry wide repercussions."

Which is exactly the one thing the original comment wasn't asking for. Your criticisms may or may not be true. However, I think it's fair to ask what differences exist while IGNORING the magnitude of the success of each company's projects. The article asserted that Google would never release its work to the public in the way that Bell Labs did because it is a publicly traded company. To counter this assertion, the original comment listed present day examples of open source projects that Google has released. The success (or failure) of those projects doesn't negate the fact that Google releases a lot of code to the public.


I reaffirmed the differences in scope because I lean toward the opinion that the achievements at Bell Labs were substantial on technical merit alone, to such a degree that I don't think it would have necessarily been less historic if they had been private inventions. I don't think you can make a meaningful comparison avoiding this aspect.

Go, Dart and Android, to a great extent, have also been 'set free' yet they will never match some of the technologies out of Bell that we haven't even mentioned.


> Dart

> much less interesting language than Javascript

What's so interesting about Javascript? It's prototype-based OOP that raises the shittiness of dynamic typing to a whole new level? :-)

I mean, seriously, just the fact that Dart brings static typing and sane class-based OOP to web front-end is enough for it to be considered worthwhile.


Dart is emphatically not statically-typed.

https://www.dartlang.org/articles/why-dart-types/

Under the heading, "Background: Dart is a dynamically typed language, and proud of it":

"Dart is dynamically typed, in the tradition of LISP, Smalltalk, Python, and JavaScript. Users of dynamically typed languages, and in particular users of JavaScript for web programming, will understand why we chose this. If you are more of a static typing person, you may not be convinced—but let’s save this for another discussion. For now, let’s take this as a starting point: Dart is dynamically typed."


It's optionally statically typed. You can choose to use types if you want. That alone is a huge improvement.


You should really read the link that I posted there (note as well that this is documentation from the official Dart site). Scroll down to the heading, "Why is the static typing unsound?" to determine why Dart is really and truly not statically typed by any stretch of the imagination. The TL;DR is that, by definition, static type systems are pessimistic and dynamic type systems are optimistic, and Dart's type system is profoundly optimistic.

If you truly want a Javascript alternative with optional typing, try TypeScript instead.


All six of those things were or would have been novel in the 1970s but not so much today.


Go is simply the follow-on to Limbo which was developed at Bell Labs.


C is simply the follow-on to B which was developed at Bell Labs.

B was simply the follow-on to BCPL which was developed at the University of Cambridge.

BCPL was simply the follow-on to CPL which was developed at the University of Cambridge.

CPL was simply the follow-on to Algol 60 ...


And yet Google does release a lot of stuff. They are doing a lot to advance the state of the art in day-to-day computing. Operating systems, programming languages, lots of stuff.


While this is true, I think Microsoft Research is much more akin to Bell Labs than Google (though I'm biased). See e.g. this list of top CS papers [1].

[1] http://jeffhuang.com/best_paper_awards.html


MS Research is all kinds of awesome in my book just for sponsoring a lot of the Haskell guys.


Is Google's research (http://research.google.com/pubs/papers.html) inferior?


In my opinion, yes, but others are free to disagree (and "inferior" is subjective). In addition to the kinds of applied research that Google also does (machine learning, etc.), MSR also does a lot of fundamental research much further afield, like that done by the biology group at MSR Cambridge [1], or the quantum computing work done in the quantum computing group and Station Q [2] (see also Scott Aaronson's relevant blog post [3]).

[1] http://research.microsoft.com/en-us/groups/biology/

[2] http://research.microsoft.com/en-us/groups/quarc/

[3] http://www.scottaaronson.com/blog/?p=1471



Don't get me wrong, Google does lots of great research across a variety of fields. And I'm neither a biologist nor a quantum physicist, so I can't claim to know exactly how important either MSR's or Google's research is in those fields, but my distinct impression is that Microsoft is investing more and doing more in each of them (again, see the Aaronson post for the take of one person familiar with the state of quantum physics research). And not that it's especially important, but for at least the first paper you cite it doesn't look like any of the research was actually done at Google.


It's really strange to discuss putting weights of importance on topics that neither you nor I can tangibly quantify.

I'm not even sure how anyone justifies claiming MS or Google research is _inferior_ to the other's. If we have that much insight into what is superior, then surely we've already won and know where to throw our bucks?

Regardless, Google already does plenty of work in these fields. This year, they've expanded to partnerships with NASA, USRA, and D-Wave with a new AI Lab -- https://www.youtube.com/watch?v=CMdHDHEuOUE.


Well, if we are to trust the data provided on your parent's link, yes, Google's research is inferior. At least in terms of number of "best papers awards", which is of course only a metric.


It seems strange how people will really jump on these sort of discussions. Are they more for entertainment? Do people feel like they gain a lot of information? Simply the fun of sparring? It would be cool to better understand. Maybe there's a business model here? Can't think of anything catchy.

"Why is ..." vs. "Why isn't..."


The underlying analogy is hopelessly flawed. Google is all of Google, yet Bell Labs was but part of AT&T. So when we speak of Google we get the frontline business operations and sales teams and marketing departments and product support engineers as part of the package, while Bell Labs is a jewel box containing fewer messy details.


So how many Nobel Prizes has Google generated so far?


Armchair-expert articles like these reach a conclusion first and justify it later. Google has obviously given a LOT back to the community. They had every reason to keep BigTable or MapReduce or GWT as trade secrets. On the other hand, Bell Labs never intended all its research to go into the public domain. In reality a lot of things "escaped" into the public domain because they apparently did not see any competitive benefit in them. Saying that only Google does this is also simply false: Facebook and Twitter have given a lot back to the community. Similarly, Microsoft Research has published probably more research (many times including research that offered a competitive advantage) than very likely any other company on the planet: http://research.microsoft.com/en-us/groups/science/publicati....


They may not be Bell Labs, but they are the closest that this generation has.


Isn't MSR a true and proper dedicated research lab?


Both Microsoft and IBM put Google to shame when it comes to pure research labs. Hell, companies like Boeing are probably closer to Bell Labs than Google is.


Google is still barely a decade old. Give them a few more years. Maybe they'll come up with something that will please you guys!


The Bell Labs Song [1] neatly lists all the innovations -- and the list goes on and on, ranging from physics and cosmology, through low-level electronics and programming, to systems and organization design and applied math concepts.

Oh, and four Nobel Prizes.

[1] http://www.youtube.com/watch?v=IFfdnFOiXUU


Is this even worth debating? No one at Google comes even close to the greatness of, say, Shannon or Deming.


Applied Minds is our Bell Labs. And there are a couple of other companies I know of. Large corporations don't qualify.



