Ideas are easy to find, it's just that the bounds that we have been limited to work within, both economically and societally, have become tighter and tighter. Today, the leadership of the vast majority of private institutions have only the goal of increasing their own economic power, and similarly there is constant media production reminding people how important the almighty economy is, causing people to focus on any potential fear of economic pain. Regret being one of the most powerful emotional responses that we have.
We don't have to look far in any direction for solutions that improve our education systems, our research output, and the general welfare of humanity. Unfortunately, the forces of corruption will always be strong, and it will take a more creative and imaginative people to implement them.
> Ideas are easy to find, it's just that the bounds that we have been limited to work within, both economically and societally, have become tighter and tighter.
An idea I've been having lately, and one that might be unpopular on this site, is that we've found all the low-hanging in electronics, networking, and software. So the main reason why it seems like there are no new ideas is because we're only looking at places where there are few new ideas to be found.
If we stop equating "tech" with computers, and instead search the vast parameter space of new technologies, we'll quickly find lots of interesting new ideas. Unfortunately, that also means we've more or less reach the end of the growth era of Silicon Valley, and from here on out SV just becomes another old fashion industry with minimal avenues for new growth.
I think we're still really far from using the potential of software. Just look how terribly inefficient and error-prone most processes in, for example, hospitals or government agencies still are. There might be no more quicksorts to discover, but there is still plenty of work to do, most of it fairly trivial.
Medical and government are the 2 examples I would think of as the least trivial places to work in. There is nothing to be discovered or invented in these sections. We all know what they need, its just close to impossible to get over the political hurdles to actually do it.
There's plenty to invent even there, even in the theoretical mathematical areas! There are all sorts of interesting ideas around provably correct elections implemented via things like zero knowledge proofs or calculations from various sorts of private keys.
Can I give you some sort of receipt when you vote that can't be used to figure out who you voted for but can be used to determine that your vote was tallied correctly? Quite possibly!
There are many, many things that these sectors need that are not yet invented. And quite frankly, while an existing solution might work in for their cases, unless it works in the confines of the regulatory environment of medical and government, then it’s not an actual solution.
Those peculiarities are part of the yet-to-be-discovered solutions, not just annoying barriers to adopting existing products.
Its not a discovery. Its just being highly influential in the organization and actually wanting something to be done when it isn't profitable/would reduce your budget for the future.
Gaining that influence is a different type of problem than what was being discussed, yes. But health care has a huge set of competing political forces influencing it, and the key to figuring out how to untangle them effectively is to approach things systematically and creatively, just like with any other problem. The problem shouldn't be dismissed as a candidate for systematic examination or given less credence than other types of problems just because it's a political one; we have yet to discover how to build health care organizations that untangle those forces effectively.
There are still millions of ideas. You just need to look around a little more and massive problems (opportunities) are everywhere. In my job (tech leadership consulting, dev, etc) we deep dive on a new company/industry every few weeks when kicking off new projects and it seems like every niche industry or business has like 50 grotesque inefficiencies and incumbent crappy solutions providers just waiting for someone with the right design and software skills to come by, make the investment, and grab the business.
If you are lacking in inspiration, go work for a consulting company for a year and your idea bucket will be full to overflowing.
I’ve gone to doctors the last couple of years for minor, but not-small maladies and they have no treatments. Even in medicine there are common problems that plague big portions of our populations but no one wants to solve. I’m guessing it might be the same in the industries commonly discussed here.
They may not be businesses, but they could certainly be features.
I'm not sure if treating lots of minor maladies is a good idea. I'd think that there's a certain threshold of severity, below which treatment is riskier than just leaving it be, and that a lot of the "unsolved problems" you're talking about are things that don't meet that threshold.
One of the reasons the US spends so much on healthcare is that we have an ideology of "treatment at all costs"[0] when doing nothing is often the right solution.
I don't know. There are minor maladies that don't need treatment because they get better on their own after some minor suffering, but there are a lot more minor maladies that over time only get worse. For example anything related to bad joints, or arteriosclerosis. Any effect of aging.
I generally agree with you. But in the way that our defense budget is a significant portion of our federal budget, leaving many other state and federal needs un-fund-able, I'd suggest that there's a similar ratio of "elephant" medical research areas starving us of solutions to maladies that may be deserving of treatment.
This was originally in response to your parent, but gonna attach it to your comment instead, cause you're right.
I don't believe we've found the low-hanging fruit at all, it's still all over the place.
There are hundreds of pieces of software and hardware that are expensive, or unintuitive, or user-hostile, many are all of the above. HN complains every day about bad software. Is this really the best we've got?
There's just no way. What it does take is people willing to take a step off the usual treadmill and getting their talent and ability sucked into large companies and willing to take a risk to build their own stuff. That ability is there, the will is not. Some people have 30,000 HN comments. You're telling me 10k of those detailed thoughtful comments aren't absorbing time that can be put into building better stuff?
Hardware is thousands of times more powerful and cheaper than it was 20 years ago. Software is likewise more capable, or at the very least, libraries are way more abundant.. Say what you want about new languages, but I'll take the option of Rust, Nim, Go, Clojure, etc, than what existed 15 years ago any day. Not that that stuff was bad, but today you have the option, and it's fucking cheap to build.
Why aren't we building? Why do we act like we need millions of dollars in funding to do stuff you can do in your underwear? Do we really believe all the good ideas are gone or have been executed well? That seems crazy to me. I just think we're lacking imagination and we've adopted a woe is me attitude (I know I struggle with this all the time).
That's just software, where greenfield stuff is cheaper than anything in the world, where implementing your thoughts is a matter of sitting down. If you've worked in any business, you know of the inefficiencies within, and if you've worked in a big one, you really know about them.
There is so much low hanging fruit! Just about any product you can pick up in your house, any regular process you go through, there's a major opportunity to look at that and think about how that can be done better and cheaper and more reliably. What's more, that's about the easiest thing to do, because it's just working off what already exists.
Sorry if I'm not more specific, but there's gonna be people looing back in ten or twenty or two years and wondering how so many people missed x or y or z, which in retrospect is so obvious.
I think its a case of diminishing returns - yes there is sucky legacy software everywhere, but the ROI was much greater when introducing software in a space where there was none, or the incumbent system hasn't met growing needs.
Agreed. We're not finding something as big as the Internet again. Smartphones more or less look like the same across generations now. Innovation seems to be much more gradual than it use to be.
I work in big business (a large bank) and see gross inefficiencies. I also see vendors aggressively trying to sell us solutions to these inefficiencies. I.e. the solutions already exist and creating them may not be the hard part - the hard part is to get companies to adopt them, given how disruptive a rollout of many of these solution is.
Agreed, I joined a consulting company earlier in the year for this exact purpose. I believe technologist have to be more willing to get out of their comfort zone in order to apply their skills to underserved industries. It's a risk and a great outcome isn't guaranteed but they have to be taken at some point.
I'm not sure confident those inefficiencies are as easy to solve as you think. I've work on several projects where the goal was to overhaul a really old webapp. The end result usually ends up being at at the same level of performance as the original design. Worse, management often demands a vast increase in features, which makes the product slower, not faster.
Not to say we don't find big performance gains here and there. But the trend does seem to be heading towards optimization and not revolutionary improvements.
True, but revolutionary improvements are almost by definition so rare they don't form a trend. And revolutionary improvement is never what people really want regardless of what they may say.
Here's my story about revolutionary improvement in business software.
I've spent the last five years or so in the "enterprise blockchain" space. This term has been progressively diluted over the years; a typical enterprise blockchain deployment has no blocks or chains. Even so the nub of the concept remains. There are hashes, signatures, transactions etc.
Why did the business world go so crazy about blockchains? Because it sounded like revolutionary change. These executives mostly don't care about cryptocurrencies or payments, but they cared a lot about the idea of peer to peer business processes. Simply moving structured information between organisations is by no means a solved problem and worse, it's one that very rarely gets attention from the wider software industry. There was EDI in the 1980s. Then nothing for a long time until XML came along, and a whole bunch of inter-firm protocols were "upgraded" to use XML. Then nothing until blockchain. This is one reason why you so often hear of business processes that still use fax machines, or emailing PDFs/excel files around. IT works well for sharing inside the organisation and pretty terribly between it.
Here's what the conventional pre-blockchain approach to inter-org business process improvement looks like:
1. Identify a shared business process problem.
2. Create a consortium or shared for-profit venture to implement it, get the most likely customers to buy in. This can easily take years because the customers are likely to be competitors.
3. Hope that this consortium somehow manages to get lucky and hires competent tech people, because sure as hell the firms that want the software don't have them. This step often goes wrong.
4. Hope that somehow this group creates a usable, reliable product that actually solves their business need. This step also often goes wrong and the newly birthed software firm produces something too far from the real problem.
5. Hope that the original members of the consortium actually do the work to start using the solution they paid for. This often takes years.
6. Finally hope that they never get hacked, that your controlling government doesn't end up in a fight with theirs that would mean they can't store your data, hope they don't become dominated by one competitor in the group or grind to a halt due to having a captive market i.e. no incentives to improve anything. Inevitably some of these things happen.
Classic example: SWIFT. But there are many others.
So the idea that firms could just connect directly to each other in an agile manner with interoperable apps seemed like a promise of revolutionary change. Execs like that, it makes them seem bold and visionary.
In practice what you see in these projects is very similar. They don't work in an open source manner, they still create new software firms or non-profit orgs via shared investments. The main difference is that step 5 gets more expensive because now the participants are expected to do some IT work themselves to run P2P nodes. But by the time they reach step 5 the executives are exhausted and just want a low-risk point-and-click hosted experience. Their own IT departments can't handle anything that doesn't look exactly like what they're used to and nobody in charge has enough technical knowledge to argue with them. Plus the executives have ticked the "we are innovative" box and don't want to control their own data anymore, they just want to cut a cheque and have someone else do the work.
So the theoretically revolutionary shift to p2p business processes ends up being P2P in name only. One company ends up writing the software, hosting the nodes and generally just being a conventional SaaS business.
I've come to the conclusion that there won't be any revolutionary steps forward in the business software space, ever, because in the end businesses outside the tech sector have virtually no tolerance for technology risk. They don't understand it, don't want it and really wish the whole concept of "innovation" (read: risky tech projects) would go away.
I believe the next advancements not to be technological. A combination of neuroscience and cognitive psychology can alter the human fabric of society in various directions. For an example of this, we can look at media produced in the United States and look at it as a cognitive psychology experiment that attempts to foment rage, which short circuits not just logical thought, but also the ability to feel compassion for others (emotions are powerful and deep in our psychology - compassion and empathy is a highly advanced mental activity, involving complicated parts of the prefrontal cortex).
This is one of the reasons that I think advertising and news media that takes advantage of high arousal emotions should be immediately outlawed as psychological experimentation, and that education should teach people how emotions can be used against them, and how to manage them effectively.
The first lesson of this teaching would be that all decisions, even economic ones (and ESPECIALLY economic ones) are made emotionally.
Any unprofitable problem can be redefined from "How to Solve Unprofitable Problem X for Me or Friends, Etc." to "How to Solve Problem X Profitably".
Even small problems may be hidden economic launch pads, given that the ingenuity to make their solutions profitable is likely to provide surprising insights.
Generic approaches to making an unprofitable niche more profitable to serve:
- Generalize the problem in some novel but useful way, so the solution has increased market size
- Generalize to a user customizable solution, to offload costs of complete specialization
- Reduce the costs of creating the solution in some way, such as new specialized tooling
- Divide up the problem/solution into subproblems/solutions whose potential value as a flexible toolkit is greater than a monolithic solution.
- Identify a small market that having adopted a small unprofitable solution will now be easy customers for related products, i.e. amortizing the initial cost across the long term value of customer contacts
Thinking deeply about the specifics of a problem once may result in more novel specific approaches.
Some problems just aren't that valuable to solve. I use tons of neat little tools like jsondiff.com that just aren't really worth paying for. Monetising those products would make them less pleasant to use, and I'd move to something else.
Some things can just remain simple and free, and that's fine.
Just tried it. Nice. Couple of "issues"
a) Answers are carrying over from one question to the next and not getting cleared automatically
b) When I answer a question, the animation disables the controls and I have to click again to enable and move to next question.
Tested on 74.0 (64-bit) on OSX. These might not be "issues" and how you defined the experience for your little one. If so, feel free to ignore.
Thanks for checking it out. a) seems like a bug, but I can't recreate. After I click "next" for the next problem, the answer field is empty. Let me test some more b) yes, this is indeed a weird UX and my daughter said it is "annoying". But it exists so that the kid gets a pause to know whether the answer was right or wrong. I should maybe show it for 3 seconds and proceed to next automatically.
Will upload new version of the site tomorrow now that I have a full two users for the app :)
Is there reason to think that people in the past were more selfless and less economically motivated?
I'd wager that throughout history, most great ideas and optimizations came from people who were either seeking profit or other kinds of status (eg recognition from peers).
> Is there reason to think that people in the past were more selfless and less economically motivated?
Not sure if more selfless but the common ownership of agricultural/pastoral land was a thing in the rural parts of my country (Romania) until relatively recently.
The community-owned rural pastures (called "islaz") are the last to go right at this moment, it's like seeing the English enclosures play out with 200 years difference right in front of one's eyes (the "islaz" from my parents' village has been sold out to a private entity only 4 or 5 years ago). We also used to have common ownership and common use of the agricultural land in the Middle Ages, with some remnants of that system still extant in some mountainous communities until the late 1700s - early 1800s. They were called "obști" [1]
> The obște (pl. obști) was an autonomous agricultural community of the Romanians/Vlachs during the Middle Ages. Mixing private and common ownership, the communities generally employed an open field system. The obști were usually based on one or more extended families. This system of organization was similar throughout the Vlach-inhabited areas and it generally receded as overlords assumed more power over the rural communities and as the peasants lost their freedom by becoming serfs.
Probably hard to quantify economic motivation over long periods of time with very different cultural environments. Economic motivation seems a strange label for what would have been basic survival for early agricultural societies. I think what feels strange now is that we've had these massive increases in productivity over 150 years, but it hasn't translated to more leisure or less stress.
I also think there's a difference between an inventor coming up with an idea in hopes of selling it, versus the modern VC-funded apparatus of blitzing social media, adtech, and blackhat SEO to generate. In the past there were gate-keepers to distribution which have in many cases been disintermediated, and for a while there was a golden age where small independent producers could get internet distribution because the old giants (Apple, Microsoft) were too big and slow to capture it. But once Google and Facebook came of age, and VCs figured out the playbook, independents are once again at a disadvantage.
The last 150 years have absolutely seen huge increases in leisure time. Notice that in the Bible, there is only one day of Sabbath, the rest of the week is work. A 6.5 day work week was standard for millennia. This idea of a relaxed bucolic existence like a Monet painting was always a myth.
"Less stress" isn't defined enough to discuss, but I for one will say I don't stress about finding food over winter. My ancestors did at that time though.
Thats not exactly true, actually our ancestors rested a lot more than we do now.
The same myth is that serfdom is/was worse than 9-7 office work.. when in fact you could work 1 day for someone else and 6 days for yourself. Now you work 5 days for someone else and 2 days for yourself.
Serfdom is much more then just "paying taxes in work time". It was basically totalitarian dictatorships and also quite ineffective economically. Abolition increased agricultural productivity and peasants' nutrition.
When we are discussing in the context of “how much work i have to put in for someone else vs how much for myself” - my point still stands. Peasents had to work for their feudal lord a lot less than I for example have to work for company shareholders.
TBH with pretty absurd ratio right now.
You get paid for your work. You can change company you work for. If you don't want to work for company, you can try running self-sufficient farm - which is how peasant gained food.
Except that peasant did not owned the land. Could not make his farm larger nor smaller. If lord decided you will produce more grains instead of vegetables, peasant had to oblige regardless of own preferences. And peasants had to give part of produce to lord. And had to use Lords mills and what not instead of choosing them in free market.
Nor could peasant change occupation or marry without Lords permission.
There is a lot of shallow talking in what you said because things are not black/white. And yes I can change the company I work for also reducing workdays amount but that will directly make my life unsustainable and I dont have an option to buy 50 morgs of field to start a farm (this costs around few million dollars where I live) and thats the area lords were giving to their peasents..
So yea your argument is:
- “just go and die if you dont like working 3 times as much as your ancestors”
Resorting to insults when you dont have arguments is kind of weak technique.
The problem with underperforming agriculture and lack of nutrition was persistent in that system. You seem to assume that peasants under serfdom had easy low work access to food and everything. That is not the case. The land does not produce food for whole year without you working it a lot.
Peasants in serfdom had hard time to produce enough food for themselves.
Peasant did not tended to few million dollars worth of land. Lords were not giving land to peasants. Rather, peasants were sold along with land and belonged to the land. That is massive difference - serfdom was one step above slavery.
You still completely missed the point again, which convinces me this discussion has no sense. We were not discussing the live conditions back then and now but how much you had to work for your land lord vs for yourself. Disparity is huge... and its just stupid to deny that...
> Is there reason to think that people in the past were more selfless and less economically motivated?
Perhaps, but perhaps not. The point isn't about some innate feature of humans that has only increased recently, but rather that throughout history humans (and the great classes therein) have faced material (not only economic) hardships. It is material conditions which drive history.
I think there is a huge dividing line between inventing to seek profit and seeking "other kinds of status". The only thing in common these motivations have is that they arise due to our relations with other humans. Likewise, many motivations, depending on the predominant point of view (which certainly does change with time and place) can be viewed either altruistically or selfishly. I'd wager that a cynical society, or one dominated by the totalizing logic of capital ("accumulate!") would have a tendency to see previous motivations as selfish, because it's very hard to empathize with people through history. In other words, we view things in a certain way because it is unimaginable to us to see it any other way, since we have no experience with those times and places and the motivations therein.
The ancient world is filled with proverbs and sayings and stories of heroism for some kind of greater good, certainly without looking for some kind of profit, and more often than not, without looking for some kind of recognition. As to whether these are just stories we tell or whether there is some truth to them is an important question. It's probably worth asking anthropology, or at the very least sociology. The latter has discovered huge changes in public consciousness since the industrial revolution.
> Is there reason to think that people in the past were more selfless and less economically motivated?
They were certainly less consumerist and didn't have the constant onslaught of advertising telling them to buy shiny new things. They were far more self reliant and able to repair things for themselves, which tends to constantly breeds it's own innovations. Lots of little tricks and custom devices tend to crop up (over engineering isn't just for programmers) yet very few will get commercialize, at best they'll be passed along to future generations but many will get lost.
> 'd wager that throughout history, most great ideas and optimizations came from people who were either seeking profit or other kinds of status
I'd wager most were out of necessity and/or laziness and most died out or were replaced by newer technology. Swords, scythes and shoes didn't have single inventors, they had generations of hackers tweaking them.
I don't think your parent comment meant "is is not the case that there exists at least one person in the past who was more selfless and less economically motivated".
In some ways this reminds me of Joseph Tainter's Collapse of Complex Societies. It's been a while since I've read it, but I recall he correlated increasing social complexity with a decrease in innovation. Although, I think he measured "innovation" as patents, copyrights, and other intellectual property which I think is a poor measure of ingenuity or innovation. Still, interesting hypothesis...
"The span of my existence in the valley is going to collapse to about a third, maybe less in your lifetime and the challenges that you are going to have in the office and the opportunities is going to be very very significant but it's going to require a lot of you much more than of me or of anybody in the past because the technology moved actually fairly slowly"
Agreed. In the world of business, there has been a glut of leaders who arrive at positions of leadership without any formal training in leadership, and so they rule with "instincts" which are sometimes no more than primitive impulses or ego. This limits how far any project can go.
Just for a moment, assume there is an abundance of people with interesting ideas that deserve to be pursued, but for some reason these people are being stifled. Why are they being stifled, and are there ways we can better support them, so they can move forward?
On this theme, and speaking of a specific startup where I worked, where some of us were deeply excited about the project, I was recently writing a response to some criticism I received in response to "How To Destroy A Startup In Three Easy Steps":
Leimgruber phrased it best, in his review on Amazon, so I quote him here, to answer everyone:
Personally, I find the book most interesting not for the absurdly lousy management characters, but for giving a glimpse into the mind of a person that accepts this kind of treatment as okay, shoulders unreasonable burdens, and seems repeatably drawn into difficult situations with the corresponding drama that inevitably ensues.
This begs the question for me (and likely much of this book's readership):
Why are many talented software developers drawn to solving impossible problems, drinking unhealthy amounts of coffee, neglecting their sleep and personal lives, and constantly trying to fix everthing and everyone around them while ignoring their own psychosocial needs?
To my mind, the interesting question runs in the other direction. Why is bad leadership so common? Why is it so universally accepted? To anyone who suggests that we should quit our jobs after some disagreements with management, I would ask why is it that we need to leave? Why doesn't the leadership leave? Shouldn't management resign, if they are unfit to get the mission done?
Some questions have large implications. Why are so many leaders so completely self-destructive? If Milton had simply been greedy, in a rational way, he would have allowed me to work on the technology that might have eventually generated a lot of money for him. But I find that business leaders are rarely rational. Impulses and ego seem to be the most common forms of decision making. Why is this accepted?
The second comment I'd like to respond to was written by "Antoni" on Goodreads:
I loved the first 80% of it, which is enough to give a positive opinion I guess. What I didn't like really is that the book is written from the perspective of startup employee, not the founder. So there's only part of the story. Only information that the writer assumes. He uses a lot of exaggerations as well that are fun to read and enjoyable but some dialogues are hard to believe to be true.
I would recommend it to people working in tech startups, to feel good about the environment that they work for rather than take some valuable lesson from the book since it's more about management tyranny, mobbing and lack of transparency rather than actual reasons why startup failed from a perspective of a person that had full picture (instead of an employee).
In response, please consider these four ideas:
1.) The failure of any venture is always a complex event, and no one can easily say why it failed. Consider when an airplane crashes, it often takes an army of investigators years to figure out why the accident occurred, even though the investigators are guided by the experience of all previous airplane failures. A startup with an entirely novel idea will be too unique for anyone to easily diagnose its failure. There are too many variables, and too many embedded assumptions.
2.) A good leader over-communicates in a crisis, and every day is a new crisis for a startup. Above all else, the leadership needs to "listen real loud." A startup is either a transparent learning organization or it is dead. Milton's crass hoarding of secrets was a self-inflicted injury. While there might be some other reasons why the startup failed, it is absolutely true that our lack of communication was the starting point of all the other problems that we faced. Since I was central to the technology effort, the startup could only succeed if I was well-informed about our real needs. Keeping me in the dark was a problem for the whole company. I'm confused how anyone could complain that this book is about "lack of transparency rather than the actual reasons why the startup failed." I've tried to be clear about this, but I'll repeat it here again: lack of transparency was one of the reasons why the startup failed. We can debate whether it was the most important factor, but it was obviously a significant factor.
3.) Antoni says they wished the book was told "from the perspective of a person that had the full picture (instead of an employee)." Possibly I failed to emphasize this enough, but no one at Celelot had the full picture. Just like the three blind men in the fable, we were each touching a different part of the elephant, and we were reaching different conclusions about its shape. I was holding onto the technology, so I believed one thing, while Milton was holding onto the sales leads, so he believed something else. This much is completely normal at all businesses, it is a problem with a standard solution: lots of honest communication. Sadly, honesty was lacking. A series of lies were told about the company's finances, so myself and Kwan were constantly guessing at the truth. At some points we felt we were working at a well-capitalized firm, other times we thought the whole place was about to run out of money. But neither Milton nor John knew much about the company, either. At no point did Milton sit down and have a good faith conversation with me about the status of the code at the company. At first I was elated with the level of autonomy I'd been granted, then later I realized that the leadership was operating with assumptions that were out of line with reality. A ship captain who has no idea of their location near the coast is a ship captain who is about to run aground, and likewise, Milton's ignorance of our progress meant the whole company was slipping toward hidden reefs. If Milton were to write a book about Celelot, he could fill in his side of things, but his side of things would not represent the total truth.
4.) We have suffered a glut of books that aim to build a cult of personality around certain entrepreneurs. This tendency has gone furthest with Steve Jobs. What is remarkable is that this trend should get going at a time when innovation from Silicon Valley is clearly decreasing. In his 1995 book, The HP Way, David Packard describes the process by which he and Bill Hewlett grew Hewlett–Packard. In their rejection of standard corporate hierarchies and their hunger for input from everyone, they were clearly blazing a radically new path in both management style and technology. It is noteworthy that when they were at their most creative, in the 1940s, 1950s, and 1960s, no one set out to create a cult of personality around them. In 1968, when Robert Noyce and Gordon Moore founded Intel, no one thought to write them up as heroic characters, yet it was in that era that their technology was creating the most profound shifts in industry. At some point after 2000 the rate of innovation in Silicon Valley began to slow, and yet this was the era when the rhetoric about visionary geniuses and innovation began to take on the tone formerly reserved for artists and military conquerors. Real leadership is rare, so we should celebrate it whenever it appears, but we should remember that it comes as often from the lower ranks as the upper ranks, so a series of books that only looks at the upper ranks must automatically leave us with a skewed picture of reality. My point is, we need more honesty about what is actually happening in these companies. We need fewer books written by or about founders, and more books written by those who are in the trenches, working every day to build something new. Above all else, we need better documentation of the ways that management often sabotages the workers' efforts to invent the future.
----------------------------
A final point for Hacker News to consider: some of the distortions we are seeing, regarding misalignment of leadership styles, and in particular misalignment between leadership's short-term goals and what the team is committed to building, are political questions that very much go back to the question "Are Ideas Getting Harder to Find?"
Now as always, there are a lot of people with interesting ideas. Are they getting the support they need to move forward with their ideas? Or are they being stifled?
There's a lot of interesting stuff happening in the crypto community that is like what you're talking about, especially DAOs, which are being used to democratically power everything from decentralized finance to non-profit fundraising platforms.
The article says... "that research effort is rising substantially while research productivity is declining sharply". No surprise there. Nowadays, universities have to become paper mills in order to get funding. Also... risky research and cross-disciplinary research is a surefire way not to get tenure.
As to the question, where new ideas come from... I teach creativity... the literature is in near agreement on this topic. New ideas come mostly from recombining old ideas. Essentially, this is an act of playful association. New Ideas also come from new technologies. For example... the neodymium magnet is the reason we had the Sony Walkman.
As one of the Hackaday comments mentioned, the original Walkman must have used samarium cobalt magnets rather than neodymium magnets. SmCo magnets were known since the early 1960s but neodymium magnets were discovered in 1984, 5 years after the Walkman launched:
Cross-disciplinary conceptual mining is a great way to spot the "low-hanging fruit" waiting for disruption in any industry. The silos of knowledge are high and thick, just like the skulls of the industry thought leaders who say things can't be done in some different manner.
I've worked as a developer in a number of industries, and as an MBA consultant in three times as many. I'm in a developer job at the moment, but when I was actively consulting I'd just learn how a client ran its operations, and the glaring opportunities for substantial expense-eliminating and productivity-enhancing changes would glow like the sun.
> the literature is in near agreement on this topic. New ideas come mostly from recombining old ideas.
Absolutely. Not just in literature/arts but in science as well. The ideas of atoms, physics, etc. are a "rehash" of ancient Greek ideas. They are obviously not the same, but there is no doubt that the study of the pre-Socratics influenced science.
> New Ideas also come from new technologies.
Even more true. Everything from the invention of letters and numbers to telescopes and microscopes led to new ideas, new arts, and new sciences.
Also, another major component is wealth. Research, new ideas, etc. require wealth, so that you have the time and resources to pursue and expand on these ideas. Ancient Greece was a wealthy slave-owning society, which allowed its citizens the freedom to pursue noble ideas rather than slaving away to make ends meet. This is where we get the idea of a "liberal" education. Liberal here doesn't mean the education is free, but rather that education is for those who are free (i.e., wealthy slave owners). Wealth frees you from life's mundane tasks. It's also why every major civilization or flourishing of knowledge and ideas was always preceded by immense wealth (America, ancient Greece, Mesopotamia, ancient Egypt, the Italian Renaissance, the Enlightenment, etc.). People mistakenly assume you get knowledge and then wealth, but historically it's always been the other way around. You get wealthy first and then you pursue knowledge.
So, take all the ideas, compute their cross-product, sort, and voila! Found all of them, though as it is itself a new technology, recurse... this will take a while...
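Tongue in cheek, but the enumeration described above is easy to sketch, and the recursion really is the punchline: each combination counts as a new idea, so the pool grows quadratically per generation. The idea list here is purely illustrative:

```python
from itertools import combinations

# A toy version of the joke: "find all ideas" by enumerating every
# pairwise combination of existing ideas, then sort. The idea names
# below are arbitrary placeholders.
ideas = ["wheel", "engine", "computer", "network"]

new_ideas = sorted(" + ".join(pair) for pair in combinations(ideas, 2))
print(len(new_ideas))  # 4 ideas yield C(4, 2) = 6 pairs

# Now recurse: the combinations are themselves ideas, so the next
# generation draws from 4 + 6 = 10 ideas (45 more pairs), then 55
# ideas (1485 pairs), and so on. "This will take a while" indeed.
```

The quadratic blow-up per generation is why the parent's "recurse" step never terminates in practice.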
I've become extremely interested in this notion lately. When I have discussions about intellectual property with random people, the dominant idea is that patents promote innovation. People get angry if I suggest that might not be true, as if they think I want all creative people to go bankrupt or something.
But it's increasingly questionable to me. Open source software has shown the great wealth of innovation that occurs when we collaborate instead of hiding our work.
I think we could (probably) sharply increase the rate of innovation if we let go of intellectual property restrictions. I really want economists and other thinkers to take seriously this notion that intellectual property restrictions are deeply harmful to society and study it in detail. Unfortunately entrenched interests benefit from the status quo (naturally) and I feel they would be in opposition here. But we can still work to open source our output. It’s harder when open source competes with closed source, but if we can learn how to use open source to our advantage we might be able to win.
In any case I think it is desperately important that we take seriously the idea that IP laws might be doing more harm than good. Most people I speak to take it as a given that we need them, and I think we’re shooting ourselves in the foot.
> I think we could (probably) sharply increase the rate of innovation...
As long as we're looking at deeply held assumptions, it's probably also a good idea to figure out what the ideal pace of innovation is. None is certainly bad, but too fast also causes problems for people: technological products are expensive and will be shorter-lived as innovation speeds up. At some point, the cost to society of all that churn probably balances against the benefits.
It’s a good point. I would argue however that reducing or eliminating intellectual property restrictions also significantly eases the difficulty in repairing and upgrading old hardware. So obsolete technology is more easily upgraded or repurposed.
I think you'd lose a lot of people with the open source argument. Open source software is many wonderful things but it's not innovative. Been using and writing it for over 20 years: the projects that fundamentally change the conversation are nearly always proprietary (or sometimes open sourced for strategic reasons but developed as proprietary). Open source hits are usually clones.
Linux -> clone of UNIX, on desktop sometimes macOS
Git -> clone of BitKeeper
Android -> developed in a proprietary way with minimal community, heavily iOS inspired
GIMP -> clone of Photoshop
OpenOffice -> clone of MS Office
qemu -> clone of VMWare
etc.
It makes sense. Open source projects exist to 'scratch an itch' and are rarely optimised to maximise the size of the userbase. There's little incentive for open source developers to innovate: more users can be as much of a downside as an upside. Mostly it just equals more support headaches. Proprietary products are all about getting lots more users, so there's an incentive to try new things.
How do you solve for 'work for 5 years on open source project, get it to some degree of popularity, watch AWS making a service out of your code without paying you a dime'?
50% of revenue might make sense for Unity, where Unity provides clear value. But it absolutely would not make sense for some small library that gets used for a few operations. It also would not make sense for high profit work. Personally I’m seeking an end where everyone shares basically everything they do and no one extracts anything out of the duplication and re use of information.
That end is incredibly sad. We still have to pay our mortgage and our dentists, but can't ask for a fair pay for our highly skilled work. I guess we deserve the ad-based monetization model that has taken over the world. Brave new world, let's shill for megacorps with all the clickbait we can possibly muster.
This is a misunderstanding of my goal. Even if we share everything we can there is always novel work someone needs done, and we can get paid to do it. But also if we share everything we can then most of life will become much cheaper. My end goal is to reduce the marginal cost of living to near zero and then arrange society to provide for everyone. In that case you wouldn’t need to pay the dentist or the mortgage. But IP restrictions add costs everywhere, and those costs must be eliminated. Perhaps we can’t eliminate a mortgage, but we can make your car cheaper to repair and we can make the goods you need much cheaper. Remember my goal is to make life better for all. And I know it is counter intuitive. That’s why I say we need economists to study it. But the goal isn’t sad, the goal is to thrive. The status quo, where a majority of people slave away their entire lives even in the wealthiest nation on earth... that is sad.
> But also if we share everything we can then most of life will become much cheaper.
This is an extraordinary claim that demands actual evidence. Eliminating IP costs might make things cheaper, but physical goods (like food) will still need material, energy, and labor inputs to produce for the foreseeable future. How do you plan to bring these costs down to "near zero"?
If you want economists to study this, you’ll need to convince them it’s plausible. That requires at least some back-of-the-envelope reasoning from data that already exists, even if it’s not a perfect example of the policies you’re advocating. There are records from times & places where IP law was either nonexistent or unenforced; what do they say about your theories? If you expect things to be different this time, what leads you to think that (with specifics)?
You're looking for the GPL. Richard Stallman shared your end goal. The GPL has been in decline for a long time though; most people concluded it was a bad idea.
The GPL does not solve our cultural problems. We’ve got to be doing advocacy for open collaboration and study its effects on our economic systems. If people do not understand the importance of open source the license won’t do much to help. And you’re right that GPL has its detractors. It’s highly restrictive. I use CC0/BSD licenses now. It’s perhaps overly permissive but it’s good to experiment.
Thank you for your question. I want to challenge your assumptions without sounding combative, and provide what I think is a solution that prevents the conflict from existing in the first place.
First of all I will point out that if one individual creates something useful and then a major service provider picks it up, this can mean very good things for society. If the service is good, hopefully AWS can provide it to many customers and help many people gain access to the utility this creative person has helped realize. There are situations where the big company disturbs the project so much it fails, but I assume you’re actually asking what happens when someone else profits off of your work.
Well, in this theoretical scenario I am also imagining that all parties share. This would mean that even if AWS takes on the code and sells cloud services with it, it would only be competing against other cloud providers who could provide an identical service. That is to say, there is no one-way source code exchange; it should be mutual. In this case, I would argue society only stands to gain from AWS entering the fray. It is a pure positive and not a dilemma directly.
Then there is the question of the individual. This one individual worked hard on something for five years, and then a big company swooped in and took the profit they could have made. But if it is open source in my scenario, then anyone could have swooped in at any time, and the creator should have known that was a possibility. They never should have sunk more effort into the project than they were comfortable with, knowing an alternative provider could have taken over the market. That creative person should instead have joined groups of other engineers working on projects that people need. Perhaps they contribute to a project that eventually gets taken up by AWS, but it would not damage their livelihood, as they would just shift focus to another.
We play by the rules we’re given. In a society where everyone shares, creative projects do not strictly belong to one individual. We would not make choices that exposed us to risks we did not in some way accept.
Some would say that without this person being willing to sink five years into something for their own profit, they would never be creative, and this would be a loss for society. But I don't believe that. I'm two and a half years into a project that I've licensed CC0 & BSD. Anyone could swoop in at any time and bring it to production, potentially earning great profit. And I would find it a great honor if someone did that. So you see, the creative drive exists outside of the need for profit.
As far as how people survive: there is a great deal of work that cannot be digitally duplicated. You can pirate a musician's albums, but you can't pirate a live show. In a world where there were no IP restrictions and we all shared, people would find income doing novel work instead of duplicating something someone had already done. One good hack for this is to automatically provide for everyone: universal basic services. Then there are no starving artists.
Sidebar: IP laws are not strictly necessary to enforce this. You are not decent to your neighbor because the law requires it. If society values open source and collaboration and boycotts those who would use IP law selfishly, we can accomplish this all with no change in laws. Some kind of union system can help enforce these boycotts and formalize their terms. Though some laws do need to change to reduce the benefits of playing selfishly.
Another term for this is the Hegelian dialectic [0]. It is very similar to your term "playful association," but it describes how the "first" old idea, the thesis, allows there to be a counter-idea, the antithesis. These two opposing ideas merge into one idea, the synthesis. A consequence of this framing, compared with playful association, is that these new ideas are more tangible and feel less random.
An example of this playing out: first came Disco, the thesis. People who didn't like Disco came up with Rock, the antithesis. The combination of the two became Pop, the synthesis. And the music timeline continues (don't overthink the music choices; I'm completely guessing).
I’m seconding the request for any literature on the subject! I’m very interested in the harm intellectual property restrictions might be doing to our rate of innovation.
Tons of books out there; it's very difficult to sort the wheat from the chaff. The following is what I found useful. It's a mixed bag, academic and coffee-table.
This started off as a blog and developed into a wonderful and useful little book. In my opinion, creative rituals are a very underestimated component of creativity.
- Sternberg, R. J. (2001). What is the common thread of creativity? Its dialectical relation to intelligence and wisdom. American Psychologist.
A very readable 3 page essay.
- Christensen, C. (2013). The innovator's dilemma: when new technologies cause great firms to fail.
Perhaps the most well-known book on this list. It was this book that addressed the idea of 'disruptive innovation'... innovation that completely destroys some businesses, and creates others. He uses the railroad as an example, but clearly the internet falls into this category as well.
- Seelig, T. (2012). inGenius: A crash course on creativity.
This evolved from the course notes from one of the first university courses on the subject of creativity. A bit annoying to navigate (her chapter headings are 'too creative') but it has good stuff.
On top of this, there is almost everything that Edward de Bono wrote (the lateral thinking guru).
This may be why it seems like it's harder to find new ideas now. It was relatively easy during the waves of "do X on a computer", "do X on the Web", and "do X on a phone", but now we're waiting on the next transformative idea that will allow everything old to be reinvented again.
Well... I designed the course for the school, but only taught it once. Frankly, it was not a roaring success. One reason was that I was asked to deliver it online. With my recent (obliged) experience in online teaching, I think I could do a better job if I did it again. I incorporated much of the material into other courses, to some success.
To answer your question directly, like all teaching, it requires that the assignments and exercises be appropriately designed. No matter how much I tried to design an assignment that embodied the ideas I was teaching, students would still come up with really un-creative responses. I came to the conclusion that the only way to guarantee success was to mentor them individually.
Most creativity courses just deliver the 'received wisdoms' (i.e. they are theory courses). Very few actually try to foster creativity. The book inGenius: A crash course on creativity (Tina Seelig) was written by someone who did.
Here is the course description for a creativity course delivered at the University of New South Wales. It's not bad, but for a creativity course, the assignments seem a bit unimaginative.
The author of "Borrowing Brilliance" (Borrowing Brilliance: The Six Steps to Business Innovation by Building on the Ideas of Others) proposes some thoughts about creativity: for example, that it's a combination of ideas from another area.
The funny thing about inventions is that nobody needs them until they have them.
So throwing more and more researchers at one topic won't change anything as long as there is no adoption of the research.
In my opinion there are too many inventions and not enough products around them that people want to use.
Adoption is often correlated with business need, so adoption is the biggest problem right now: as soon as people start using an invention, they will tell you whether it improves their lives or is a stupid idea.
We don't need more researchers; we need more testers and producers, and this is where availability comes in. When a researcher puts a patent on his research, availability is near zero, so the invention is stopped in time.
Also, don't forget: Great Inventions Are Often Overlooked.
What's your point? If the virus starts mutating too fast, then you need to produce a vaccine for every mutation that is deadly. For example, the flu vaccine is produced twice a year, and not for every strain, only for those that are most dangerous. Maybe there is a need to develop new protection techniques instead of injecting dead virus into the human body. Over the years it might become true that the only way for humans to survive on this planet is gene editing like CRISPR.
There are already first trials to edit the genes of adult humans, which you can read about here:
https://ir.editasmedicine.com/news-releases/news-release-det...
Some time in (I think) the past few months, someone posted a link to an AT&T or Bell Labs executive talking to a team of his engineers about the state of innovation at the phone company. The context was the 1970s, and a number of innovations -- direct dialing and touch-tone dialing were, I believe, two of them. It transpired that these had been developed decades before deployment -- direct dialing being a 19th-century invention, touch-tone dating from the 1920s or '30s, as I recall.
The upshot being that innovation had already slowed tremendously.
The video begins, approximately, with the speaker relating a story of how an earlier executive had announced to a previous group, "Gentlemen, this company was destroyed last night".
If this rings any bells, Ma or otherwise, I'd appreciate the link or reference.
NB: the title is misleading. This is not about pure ideas; it is about economic growth, which, to be measurable (significant), needs to be exponential.
No doubt, there are more ideas now than ever. Technology is a great enabler. But finding economically impactful ideas... that does seem to be getting harder, from my armchair view.
Value of ideas is massively overstated in biz school lit, imo... Ideas are fine, but it takes a lot of work to prove that one actually works, and then a lot more work to bring it to production.
All of the work involved needs financial and material support to happen. It's easy to have an armchair idea, but to put a hundred+ hours into the very first realization is extremely difficult if it's not your day job. (And repeat for the thousands(?) of hours necessary for bringing to market...)
In my most anti-elitist frame of mind, I expect there's a mismatch between who has good ideas and who has time to prove things out... Elite business schools are populated (on average) by walking Dunning-Kruger effects, powered by trust funds and a lack of consequences: of course they have a shortage of good ideas... Meanwhile, people with the actual experience and expertise to produce good ideas have day jobs that keep them from executing.
This. It's also worth remembering that the cost to bring products to market, particularly in the software/technology industry, continues to rise. As larger players continue to expand their portfolios of offerings with years of engineering effort behind them, it's significantly more difficult for a small startup to produce something compelling enough for customers to use it over the other options on the market.
In other words, the bar for quality and capability in all products we use on a daily basis continues to rise. That has the direct effect of raising the expectations that consumers and businesses have for new products they want to purchase, and thus the cost to build them. That translates to more time needed to deliver something meaningful to market, which is also likely correlated with higher amounts of VC funding concentrated in a smaller number of companies (pre-COVID)
That's an interesting point. Yes, for economic growth to be measurable, the nominal gains (i.e. dollar-value gains) have to be exponential--but there are a few reasons to think that as the economy grows, innovations that lead to productivity improvements should become cheaper.
First, the productivity gain should scale with economic growth. A technology that leads to a 5% improvement in the productivity of one farmer leads to a 5% improvement in the productivity of 1000 farmers.
Second, the fixed costs of innovation--capital costs and research costs--should in many cases increase sublinearly relative to the above (linearly increasing) returns on investment. Thus relative to available capital (which increases as the economy grows) these innovations should become cheaper.
Finally, some per-unit production costs may also decline as scale increases.
As a simple example, if a farmer makes $100/year using a donkey and $200/year using a tractor, a tractor costs $10, and a tractor factory costs $10000...well, you get the idea.
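To make the arithmetic behind "well, you get the idea" concrete, here is a minimal sketch. The dollar figures are the hypothetical ones from the example above (a tractor doubles a farmer's income from $100 to $200 a year, a tractor costs $10, and the factory costs $10,000):

```python
def net_gain(num_farmers, gain_per_farmer=100, tractor_cost=10,
             factory_cost=10_000):
    """Annual net gain once every farmer owns a tractor.

    The revenue gain scales linearly with adoption, while the factory
    is a one-time fixed cost, so the innovation becomes relatively
    cheaper as the economy grows.
    """
    total_gain = num_farmers * gain_per_farmer        # $200 - $100 each
    total_cost = factory_cost + num_farmers * tractor_cost
    return total_gain - total_cost

print(net_gain(50))     # -5500: too few farmers to justify the factory
print(net_gain(1_000))  # 80000: same invention, now clearly worth it
```

Break-even here is about 112 farmers, after which the fixed research and capital costs are an ever-smaller fraction of the returns, which is the sublinear-cost point made above.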
So I don't think this is explicable by economic growth, unless there's a facet of growth that hinders innovation (or its application).
Ironically, as technology and business become better and better at monetizing new ideas, this makes things more expensive for individuals to bring ideas to market.
First, the fastest way to wealth for large competitors (whether businesses or investors) is to jump into markets validated by someone small, with overwhelming advantages in terms of customer relationships, the ability to fill out solution spaces, funds for deep optimization, etc.
Second, even if the small inventor's expected mean return has gone up, so has the standard deviation, so risk has gone up.
Anything that discourages or inhibits individuals and small teams from following through on new ideas will have a big impact. Because most good ideas are by individuals and small teams, and few of us have the ear of CEO's or other efficient fund decision makers.
Why do you think the variance in returns has increased? Because of the first point?
I don't think I buy the first point, though. Intellectual property rights are probably at an all-time high. Is the point that many types of innovation fall outside copyrightable/patentable innovation, that enforcement is imbalanced, or something else? (What about the prototypical 800-lb established competitor who simply buys the startup?)
It is tough for any small group to protect itself from a motivated, well funded, and well informed competitor even using copyright, trade marks and patents.
Part of the reason is that it can take vast sums and years to enforce IP rights against deep pockets.
The other reason, is that no matter how much IP you formally protect, most of any business is not its IP, it is the sales and support channels and services, etc. Even a lean software startup’s code is likely mostly the non-IP code that delivers and supports the IP actually solving a problem.
This means well funded and well informed competitors have a good chance of solving the same problem using a slightly different solution. You saved them work by defining the problem, providing an example solution, and validating the market for solutions to that problem.
Probably the best way for a small team to protect any really valuable IP is to keep it a trade secret while growing the business until it has the depth of resources, customer engagement, market visibility, and organizational depth such that deep-pocketed copycats might be successful too, but without stopping your momentum.
Obviously, this strategy depends on being able to deliver new IP to market without disclosing it exists. But customers care more about the results you can give them than how you do it. So innovative code buried in an app may not attract much or any scrutiny.
Aiming initially at underserved or previously unserved markets also avoids attracting competitive attention.
Edit: I think variance in returns has gone up because information tech is making it easier and easier for new organizations to serve larger markets, so more small companies will make it big.
And at the same time, competitors are more likely to notice anything significant, and there are many more competitors observing and reacting globally.
And we now have massive global competitors like Google, Facebook and Amazon that are getting harder to avoid as they attempt and often succeed in dominating one market after adjacent market.
So expected (in the statistical sense, I.e. mean) returns have gone up, but the ease that a startup can be crushed is also higher.
There is a kind of conservation of risk and reward in economics, so it isn’t unexpected that bigger opportunity attracts players who in turn create bigger risks for new players.
The problem isn't finding ideas, but finding ideas that other people haven't already thought of and written about, or patented. Indeed, the crawl to the precipice of knowledge in a domain can take years. There are only so many shortcuts; no matter how much pruning of the tree of knowledge you do, the path will only get longer. In fact, a great deal of effort goes into re-inventing the discoveries of others, hidden from the re-inventors by language and disciplinary perspective.
It may just be a change in the way that we work. This paper may just as well be titled "Are institutions getting less effective?" or "Are research institutions getting less effective?". Hell, it might even be that our smartest people aren't researching and the amount of brainpower we throw at research problems has actually decreased even though the effort/money we throw at research problems has increased.
> It may just be a change in the way that we work. This paper may just as well be titled "Are institutions getting less effective?" or "Are research institutions getting less effective?".
Lots of progress might be stymied by entrenched interests using laws, regulations, or some form of corporate bureaucracy/cronyism as a "moat" to ward off competition.
The short shrift paid to EVs by auto makers and car dealers before Tesla came along is one example. (Yes, I know that batteries also had to improve.) Intel's practices before AMD stepped up its game recently can be seen as another example. Then there's SpaceX.
> Hell, it might even be that our smartest people aren't researching and the amount of brainpower we throw at research problems has actually decreased
Is research now a "2nd tier" career? Are you thinking this is based on economic competition from industry?
Hasn’t the trend always been to favor industrial application rather than fundamental research, in terms of pay at least? Or do you mean second tier in terms of status/prestige?
People get paid based on what they contribute to the economy, not what they contribute to science.
No, I'm thinking more in the context that research professorships are not what they once were. They have effectively become a kind of sweatshop labor to the academic research institutions for which the reward is a PhD and then the delightful prospect of becoming an adjunct professor.
Such tremendous effort for little reward tends to dissuade many of the brightest minds from research careers and towards other careers like finance or consulting.
My read of his comment is that fruitful research was historically done by people within the aristocracy after their fundamental needs for comfort were met.
In my mind, it seems there's no such thing as an "original" idea, given that all ideas are made from or composed of other ideas (as that's how you describe them). Therefore originality seems to be just a new combination of old ideas you've already been exposed to.
As an example, can you ever imagine a color that's not composed of a combination of colors you've already seen before? A truly original color?
Otherwise, if you're trying to communicate an idea to another, and cannot describe the idea using existing ideas, it might be considered truly original. But what does one do in this case? You show or demonstrate it to them...
As so, ideas seemingly come from observation - or more specifically - our senses, and any subsequent ideas built from thereof.
Thus, if there's a limit to what we can sense and observe, then ideas would also be limited or "finite", and we would naturally find ideas becoming "harder to find" eventually...
I believe Ludwig Wittgenstein mentioned something like this in his work Tractatus Logico-Philosophicus.
I see where you are coming from, and I get that your argument is that new ideas are (often?) extensions of existing ones, but I think this is a bit of an oversimplification. It seems like everything can be described as an abstraction of something else on a larger/smaller scale. Which philosophically would mean that the universe as a whole has the same value as any atom or quark within it.
But even if we ignore the philosophical aspect of it, it seems a bit like Duell's "Everything that can be invented has been invented" statement from the late 19th/early 20th century. A statement which hasn't aged too well, for better or worse.
> I think this is a bit of an oversimplification. It seems like everything can be described as an abstraction of something else
Correct, any new ideas you abstract are composed of pre-existing ideas you have. Trace any of those ideas to their roots and you'll realize all ideas come from what you originally had to subject your senses to (i.e. seeing it, hearing it), otherwise they're some abstracted combination thereof.
Although my prior post was an attempt at a logical proof of sorts, the only real caveat I find in it is this part: if there's a limit to what we can sense and observe. This is not proven as far as I can tell (which is why I prefaced it with an 'if').
Again, I think Ludwig Wittgenstein's work may say all of this better, but then again I find it hard to grasp everything it tries to convey.
EDIT: As for Duell's statement, perhaps it's taken too literally. I can see they had an equivalent to email (e.g. regular mail), atom bombs (regular bombs) and AI (basic schemes & algorithms) back in those days, just not literal equivalents. But his statement is not the argument I'm making in my prior post. Rather it's more "our ideas are limited by what we sense".
It's kind of understandable, isn't it? The easiest problems get solved first, so people have to "go up the stack" if you will. Unfortunately, going up the stack means one needs to get more and more educated, or in some cases needs to embed oneself in a specific context. But in general, people are naturally ingenious; they will always come up with new ideas.
In the most recent past, the one "low tech" idea that blew me away is the concept of influencers. People understood Instagram and then understood the general psychology of people and then exploited it to create vast empires. I mean previously clout was something only celebrities had after doing extraordinary things like act in movies or be the best in the world in what they do. But people found a way to commoditize it and democratize it. And they didn't need any formal education for it. All they needed was a phone.
> The easiest problems get solved first, so people have to "go up the stack" if you will.
The implication of this is that we're running out of problems to solve, and eventually we'll have fixed everything. That ignores the fact that a solution to a problem very often creates new problems, and some of them will be equivalent to the previous easy problems. In some fields you could be solving the "easy" problems forever.
There are certainly phases of progress in particular fields. E.g. in physics, we had relativity and quantum mechanics... and no further revolutions for quite a while. If Einstein's genius made some of it possible... he also didn't make further progress. He was older, yes, but also perhaps there was less to find? Other people were smart enough to find what he found, but weren't bold enough.
Looking at the bigger picture, Einstein had unexplained empirical evidence (speed of light constant in all directions, inconsistent with waves in an ether). So that was a driver.
Looking bigger again, even with better equipment, that senses further, we can only discover new things if they can be sensed with that equipment.
(Without evidence) my opinion is we will have periods of slow progress and periods of rapid progress. We are currently in a period of stagnation. We need empirical data, intelligent analysis, and boldness.
The title is misleading and inflammatory: The article itself is making a much narrower claim about the monetary value (economic growth) derived from equivalent amounts of research effort.
In that context, they claim it takes more effort now than years ago to achieve the same level of economic benefit.
This is far different than anything having to do with the # of "ideas", which is too abstract of a notion to be meaningful the way they use it.
Even if they're correct that it now takes more research to produce equivalent economic benefit, then through one concept of "idea" we actually have more than ever: researchers aren't doing the same bit of research over & over, they're doing new bits. Each bit easily fits into the concept of an "idea", and so in that sense ideas are plentiful, it just takes more of them than before to achieve a given level of economic growth.
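To make the distinction concrete, here's a toy sketch of the accounting the paper does. All the numbers are invented for illustration (the growth rate and the researcher multiple are assumptions, not the paper's figures):

```python
# Toy version of the paper's accounting identity with assumed numbers:
#   research productivity = growth achieved / effective researchers
growth = 0.02             # ~2% TFP growth per year, treated as roughly constant
researchers_then = 1.0    # normalized research effort, earlier era
researchers_now = 20.0    # assume 20x more effective researchers today

productivity_then = growth / researchers_then   # growth per unit of effort, then
productivity_now = growth / researchers_now     # growth per unit of effort, now

# Same total growth ("ideas" in the aggregate), but each unit of research
# effort now accounts for far less of it -- which is the paper's claim,
# not a claim that fewer ideas exist.
print(productivity_then / productivity_now)
```

The point of the sketch: "ideas are harder to find" in the paper's sense is a statement about the denominator growing, not about the numerator of new ideas shrinking.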
The pigeonhole principle [0] is also becoming more of a factor due to the sheer volume of production.
Especially in domains with a limited branching factor. For example, in music, if you only have 12 tones in an octave to play with, along with variable length and spacing, you start to run out of "good" melodies under 20 notes after 400 years.
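A rough back-of-envelope version of that counting argument (the constraints below are invented for illustration; real melodic "goodness" is far messier than this):

```python
# Counting short melodies under increasingly "musical" constraints.
# All constraints are illustrative assumptions, not musicology.
LENGTH = 20

any_note = 12 ** LENGTH        # any chromatic note at each position
diatonic = 7 ** LENGTH         # restrict to a single 7-note scale
# Restrict further: each note within +/-2 scale degrees of the previous
# (treating scale degrees modulo the octave, so always 5 choices).
stepwise = 7 * 5 ** (LENGTH - 1)

print(f"{any_note:.2e}")   # ~3.8e+21
print(f"{diatonic:.2e}")   # ~8.0e+16
print(f"{stepwise:.2e}")   # ~1.3e+14
```

Even generous counts collapse by many orders of magnitude once mild "musicality" constraints apply, which is the pigeonhole worry in a nutshell.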
Timbre and genre are very important things. The remix doesn't need to be as good as the original since it's way better suited to be mixed in an electro swing DJ set. The different context creates an entirely new niche in which the original has no place.
Disclaimer: I googled for something random and sifted a bit through the results - not a fan of dubstep remixes, so I chose electro swing instead :)
Isn't this just increasing the branching factor? How hard will it be to find new melodies a million years from now? Easier than today, the same, or harder?
It is, but it is increasing the branching factor by much more than finding all the "good" melodies in 400 years.
Here's a good example of music that stands on it own right despite using the melody of despacito: https://www.youtube.com/watch?v=ydqReeTV_vk - I'm pretty sure that most people at HN will appreciate this one whether they like despacito or not :)
I think the relevant thing about the pigeon hole principle is that it deals with fundamentally discrete things. Sure, there is a finite number of scores using twelve notes in a finite period of time, but that's just not how we judge or compare music. I would say that there's so many branching factors in music that quantifying one piece is akin to measuring a coastline; the smaller your scale, the longer the coastline. Beyond timbre, there's other cultural factors like, who wrote the music. In a million years we might have umpteen songs with the same score, but they'll still occupy different pigeon holes in the collective consciousness.
A good example is astrophysics. There are so many nice theories around, but they are all wrong. If you want to theorycraft the universe now, you need to deal with an incredible amount of very precise data, advanced maths, and billion-dollar machines. If you don't, the best you get is to have your theory proven wrong by an actual specialist; most of the time it is "not even wrong".
Researchers are now working on a subdomain of a subdomain, and they spend decades studying just to reach the level where they can start having good ideas.
Most of the easy good ideas are already taken. What remain are the hard ones, and those that require a lot of luck and/or hard work to materialize.
I think this is slightly specific to astrophysics and cosmology. In Neuroscience, it definitely feels like there is plenty of headway to be made by individuals with good ideas, although I admit that it feels as though ideas are harder to come by as more is published and the bar is continually raised.
Had a similar discussion yesterday and I am aching for that conceptual freshness.
What are all the non-Elon, non-Bezos plebs going to do for a day job? Our industries are too mature for easy growth. Cars, computers, space are done for the plebs; we need another industry of easy growth to pull this ship along. If history repeats, manufacturing will become king again.
In the arts with near infinite creative control lately we are seeing many retreads of past ideas, we need some great fresh ideas.
AI tooling and general purpose automation will kick off a customisable manufacturing system, but it's not enough. Where are all the brains at.... finance?
Good machines compress time for tasks. Dishwashers still need human stackers despite reducing an all-day job down by the river to 3 minutes of stacking/unstacking dishes.
Compressing design work from hours or weeks of thinking about different forms down to a few minutes of machine-learning pattern-recognition CPU cycles means we have to move our knowledge work from massaging clay into new Porsche designs to higher-level abstractions about what function in society cars can fulfill. The abstractions get higher, we can fit more into our time on this planet, and the detail-oriented will be required to keep all of this from breaking down and succumbing to entropy.
Humans at the bottom of the chain will have nothing to do except hold the world together: people falling out of work, machines breaking, and money and time being wasted on excessive drug use and manufacturing for pleasure instead of progress. Too much of that can grind this potential for new life to a halt.
Agriculture is direly needed in the dirty thirties if we are to avoid repeating the hunger caused by the dust bowls of the 1930s. A green base is fundamental to surviving the crash of the roaring twenties. Automating farming and increasing food production to multiple times what the US can demand is a life-saving purpose. God only hopes you have 'nothing' to do when it comes to crass commercial works. Building self-sustainable farming for homes, so that families can survive on and off grid, also has potential for great application.
I haven't read the article yet, but I think it's just harder to reach the border of human knowledge. You have to specialize hard to reach it in a reasonable time.
Sadly, that pushes smart people away from general, cross-discipline careers. Only in young disciplines like machine learning are there still a lot of sparks jumping over from other disciplines.
Computing had been in the works since Babbage, over hundreds of years. Once the tech was in place, there were infinite jobs to be had making sure humanity could democratize and exploit computing to its highest potential.
What's coming down the hundreds-of-years pipe that we can bust open for this half-century? The 40s were famously the fighting forties, if we don't get on it we're going to be forced to take what others have in an attempt to access the infinite or stave off nihilism.
Nukes are underutilized but likely to be surpassed by fusion if the recent builds and smaller experiments pay off. With infinite computing power and orders of magnitude cheaper energy we employ everybody to do the same as before but bigger and better. Material science could unlock a bunch of new avenues for improvement. Biomed is kicking off but that's unlikely to employ everybody. I can't see it all, there's gotta be something.
Not sure how realistically viable it is, but I hope that quantum computing will bring some of that freshness back. But again, it will heavily depend on how much of it is "legit" and how much is just hype.
An alternate way to look at the problem is that perhaps we should be doing a better job of educating our children to absorb the existing state of knowledge. The hope would be that they are prepared to contribute new ideas before their naive optimism is crushed and they head toward seeking rent on old ideas. The underlying theory here is that new ideas are discovered by reinterpretation of the underlying data with a "purer" or partially forgotten version of previous explanations.
Don't ideas naturally breed more ideas? Perhaps it takes a certain type of mind, but from my perspective, ideas are synthesized from life experience. That life experience could be prehistoric or modern, but the flow of information in life is going to generate ideas. Ideas are needed to solve problems, solving problems ensures survival, so we naturally think in terms of ideas much of the time.
Illustration: A prehistoric person invents the basket: They gather apples from a tree that is a mile from their cave, but can only carry 6 apples and must make several trips. On one trip, they pass a tree with a nest, the nest has eggs and the idea happens: eggs:apples, nest:[basket]. They fabricate a basket out of a hide, and can then get all the apples they need in one trip.
In Moore's law, I'd argue that keeping pace in fact requires many more ideas to get to the next node than it took to get to the current node. The fabrication complexity is exponentially greater, not linearly greater, and more ideas are required to get there. Getting to node x+1 indicates much greater progress than was seen getting to node x, because exponentially more ideas were required to succeed. Moreover, you had to have all the ideas of x to build on in order to get to x+1.
I argue that we are not generating fewer ideas as 'harder to find' would imply. But rather, exponentially more, and, the resulting achievements are all the more spectacular.
I'm reminded of the discovery of the 'modern' variation on the Fisher-Yates shuffle algorithm. Both the original algorithm and the improvement made in the variation are so simple that they can be trivially understood, and make one think Oh yeah. Neat. And kinda obvious. (I'd forgotten the particulars so I had to look it up on Wikipedia. Took me very little time to piece it back together.)
Clearly it's not totally obvious, or it would have made its way into the original paper. Still, the barrier to a Wikipedia-worthy computer science discovery has risen somewhat since then.
Edit: Come to think of it, this is a good example for explaining what we mean by 'improved algorithm' to non-software folks.
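For the curious, a sketch of both variants in Python (following the standard textbook/Wikipedia description of the algorithm, not any particular paper's notation):

```python
import random

def fisher_yates_original(items):
    """The original 1938 pencil-and-paper method: repeatedly strike out a
    randomly chosen remaining item and write it down. With a list, each
    strike-out shifts the remaining elements, so this is O(n^2)."""
    pool = list(items)
    out = []
    while pool:
        out.append(pool.pop(random.randrange(len(pool))))
    return out

def fisher_yates_modern(items):
    """The modern in-place variant (Durstenfeld): swap position i with a
    random position j <= i, working down from the end. O(n) time, and
    exactly the kind of 'oh yeah, neat, kinda obvious' improvement
    described above."""
    a = list(items)
    for i in range(len(a) - 1, 0, -1):
        j = random.randint(0, i)  # 0 <= j <= i, inclusive
        a[i], a[j] = a[j], a[i]
    return a
```

Both produce a uniformly random permutation; the modern variant is essentially what Python's own `random.shuffle` implements.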
Good ideas are easy to find in any niche, but low central bank interest rates make price discovery extremely hard, so most of the time it's just not worth implementing those ideas, as people are not compensated well enough for executing on them.
I believe that right now any project that gets people closer to hard money is a good direction.
I work in health care and make okay money, but definitely not a lot. Ideas are often easy to come up with: identify a problem and an idea of how to solve it. Over my career I have had several ideas I truly believe would greatly improve the health of the clients I see. However, to try to make those ideas come to life would cost a great deal of time and money, so I have never really chased after them. I truly believe there are many, many ideas left to discover; it truly is just an economic factor holding them back. So I would say ideas are not getting harder to find, just harder to make happen. If I made any sort of medical device, I would have to have it certified in so many ways I wouldn't even know where to begin.
Sure, health care is even worse than any other field because of heavy regulation and the health cost bubble in the US.
But you can always accumulate gold/silver/Bitcoin, the weakest points of the fiat system, from the moderate salary that you have. It's still important, though, first to understand thoroughly what you're buying.
Forgive me, but I don't understand your point. Is it that it's so easy to borrow money that there's no need to sift good ideas from bad? And if so, that seems like it would result in more bad ideas being funded, not fewer good ideas.
Yet the paper claims to present evidence to the contrary (sort of; they're talking about innovation gains, not "ideas")--that the cost of gains is increasing. I don't see how that would arise from low interest rates?
Money (in the form of buying power) can't be created from nothing, so whenever a bad idea/execution is funded, it takes away buying power from companies that have better execution, but less access to cheap loans.
As an example, I personally know a CEO who had a great idea and great execution, but a new competitor startup was funded with millions of dollars of cash and threatened to sue him in the US. He didn't have money for lawyers, so he was in a bad position very fast, even though he was quite certain the competitor didn't have any case against him.
Another example is cheap loans driving up ad prices and rent in SF, which I have seen often talked about here.
I was specifically talking about money defined by the Austrian theory of money (buying power / store of value). Just because monetary units (currency) are created, it doesn't increase the buying power of the total amount of monetary units. For example, if people buy houses with additionally created currency (even if it's debt), the buying power of one unit of that currency goes down, which can be seen by the number of units needed to buy a house going up.
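A toy numerical version of that dilution claim, with all quantities invented for illustration:

```python
# If currency units double while the goods (houses) don't, the
# units-per-house price doubles: more units, same total buying power.
houses = 100
currency_units = 1_000_000
price_per_house = currency_units / houses       # 10,000 units per house

currency_units_after = 2_000_000                # new units created as loans
price_after = currency_units_after / houses     # 20,000 units per house

# Total buying power (houses purchasable with all units) is unchanged:
assert currency_units / price_per_house == currency_units_after / price_after
```

The sketch only shows the mechanical identity, of course; whether newly created currency actually all chases the same fixed stock of goods is the contested economic assumption.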
The problem with banks is that they have a good reason to conflate unit of account with store of value: they make lots of profit on people not understanding the difference.
Money is made up in all cases. It's an abstraction of relative demand. Even if governments abandoned fiat money, they could still pull money from thin air.
Mostly because governments use violence to change the relative price of anything. If a government made owning a gold coin or a bitcoin a crime punishable by death, I think its buying power would go down. Similarly, by forcing citizens to pay taxes in a specified currency at the threat of jail time, a new currency is created.
Sure, money is a man made belief, Yuval Noah Harari's books are great at explaining this :)
Generally, gold confiscation doesn't mean purchasing power going down, just transferring it to the government. It's still quite scary, though, as I wouldn't be surprised if it happened again :(
The scariest thing is if all the countries in the world try to ban gold and Bitcoin at the same time to preserve the fiat system.
I'm hopeful that as Schnorr signatures will make coinjoins more practical, it will be easier to use as well, thereby making ownership of Bitcoins less visible to governments. Wasabi wallet is great at doing coinjoins, but when I tried it, it was just too slow.
And even if both companies theoretically have access to cheap loans, it would still have a negative effect on companies that focus more on execution and ideas rather than on raising lots of money.
This is well known in the Austrian School. It's not that cheap money drives out good ideas, but it dilutes them, because cheap money allows bad ideas to persist - there is no Schumpeterian creative destruction.
The corollary is that average productivity declines, because inefficient zombie companies are still encouraged to walk the earth, which is exactly what we have seen since the GFC triggered money printing and artificially low interest rates. Of course, non-Austrian economists are perplexed, and create many spurious explanations, which is an ironic example of the ineffectiveness of (economic) research.
[The US legal system is a separate issue of rent seeking by an entrenched lobby, which creates and preserves laws and legal frameworks that reward litigation. US healthcare is another anomalous parasitic example. Other countries do not have these problems to the same extent, but they often have other types of rent seeking, bureaucracy, inefficiency and corruption to compensate.]
This is a libertarian talking point. The idea is that government control of money leads to less private innovation. We had the industrial revolution in the late 19th century while on the gold standard and now with government-controlled money, most of the long term innovation like renewable energy, nukes, the internet are all funded by the government.
Low interest rates certainly lead to non-optimal housing development. I completely support massive government funding of energy, nuclear, technology and research.
But I don't support the NY Fed and the bank CEOs controlling the open markets, setting the interest rates and flooding dumb "elite" investors with cheap money so they can build bad systems. Having 1 of 12 FRBs dominate the monetary system is a flawed idea, especially when that 1 is Wall Street. Financial engineering is dangerous and antithetical to real research, development and progress.
That's an argument. But it seems to me still a far cry from arguing that cheap capital leads to less innovation--in effect, that dumb money drives out the smart.
Me too. I'd be ok with lots of governments not developing nukes either. Or those that already have to significantly reduce their nukes.
Nukes suck.
Atomic energy though is pretty awesome. We could already have 100% stable, constant CO2-free energy if an entire generation of Americans didn't grow up watching the Simpsons
Well technically yes. It effectively died in the 1930s when England stopped using it and the US forced citizens to hand over all gold. Its peak in usage was ~1870 to ~1910.
We were still on the standard through the 70s, but FDR and company did devalue the dollar with respect to gold during the 30s to finance all the New Deal programs. Bretton Woods in 1944 re-established the gold standard for the dollar and then all other currencies were pegged to the dollar and that continued until 1971, when Nixon "temporarily" suspended it, effectively creating the modern fiat boondoggle we have today.
Maybe we should take an evolutionary perspective on this? Some ideas may be good in any environment and I expect they would be harder to find over time. Other ideas are probably good or bad depending on their environment.
If the environment is fairly stable then we would expect good ideas to get harder to find (evolution slows down), but when there is a big change, I expect it would be a lot easier to find new ideas that only made sense recently.
Which is to say, I think the answer to this question is probably "no" in the short term, and we should see more ways that creative thinking pays off.
I wonder if this is partially a function of decreased attention spans. It takes more discipline than ever to turn off distractions and focus on innovative solutions to hard problems.
EDIT: This quote seems relevant[1]:
> It is fair to say that, in general, no problems have been exhausted; instead, men have been exhausted by the problems.
I managed to build a profitable small business and mostly automate it. But now that I'm back in the search for another, more ambitious idea to work on, it is way easier to be distracted by all the electronics around me than to ponder deeply about ideas.
I tried reading a novel the other day; I couldn't make it through 3 pages without reading some random BS on the interwebs.
Damn capitalism for incentivizing companies to make everything as addictive as possible...
Good ideas that are beneficial, yes. However, there are a lot of less optimal solutions for the human race out there, some of which are even in production, e.g. tracking people, social media.
Note that the article is talking about research ideas for researchers to research, not business ideas for entrepreneurs to develop. I clicked expecting the latter. It doesn't surprise me that academic research fields dry up over time.
Intelligent design theory is a giant unexplored pool of ideas. I am currently looking into bioinformatics applications and there are just so many! Ask me questions if interested!
We stand on the shoulders of giants...
... and there are many barriers to access to those giants these days.
(paywalls, patent overuse, market exclusion by .1%, ideological blind spots in communities, too many standards, too many standards behind paywalls or other blocking infrastructure)
The last is compounded by community in-fighting: people fighting over which standard the product should look like.
PS: I work in wireless. Some of the barriers are there because people do really insecure new ideas that risk people's security. That's generally a good reason for standards and standard bodies.
tl;dr We haven't had any deep conceptual revolutions nor massive new sources of cheap energy for a while
Estimating how much innovation to expect depends on how you set the baseline. There are good reasons for considering the first half of the twentieth century the most revolutionary period in history, both theoretically and technologically (not unrelated areas), while at the same time harnessing vast reserves of hydrocarbons that gave us millions of years of stored sunlight to burn as fast as we could build machines to consume it.

The theoretical revolutions involved working out the consequences of electromagnetism, Einstein, and quantum mechanics, with computation powering the calculations. Electrification, real-time long-distance communication, cars, trucks and planes... We were riding out the consequences of the easiest applications of these novelties, and now it's getting incrementally harder to squeeze out the benefits.
No. I think it's that ideas are harder to get funding for. Everyone wants results and wants them now. This short-term pressure creates noise, and noise is expensive.
Options for us as humans are growing: today you can travel by plane, travel to space, buy something from around the world, learn whatever you want online...

Hyperproductivity is here, but both the market and the options are increasing too, so I'm not sure what math could realistically be used, but we certainly can't foresee the death of ideas in the near future.
Yes, but sites like:
https://github.com/iamadamdev/bypass-paywalls-chrome
https://sci-hub.tw/
exist and make it trivial to get around draconian publishing practices. I'm not advocating not paying for journalism---it's just that not everyone realistically can, and paywalling them seems like a bad solution.
A more plausible reason, at least to me, is the increasing specialization of fields and lack of cross-field communication.
Also, ideas seem to be flowing plentifully at least in math and TCS, where old and hard conjectures are still being proved with astonishing regularity.
In the aftermath of WWII the so-called "Men in Black" were formed, not to police aliens (what a conceit!) but to sequester dangerous technology like free energy and anti-gravity.
With great responsibility comes great power.
There are all kinds of awesome technologies but each one is as dangerous as it is powerful, and we are fucking nuts, so it's kept a secret. Rex Research is one of the "holes" where the knowledge is allowed to leak out as a kind of pressure-release valve.
Can you back this up with sources? This sounds like paranoid conspiracy theory.
Most inventions are "invented" in multiple places at nearly the same time. If free energy is possible, how was each and every discovery squashed, nation by nation? How do you stop it from spreading once that cat is out of the bag?
We don't have to look far in any direction for solutions that improve our education systems, our research output, and the general welfare of humanity. Unfortunately, the forces of corruption will always be strong, and it will take a more creative and imaginative people to implement them.