Just like a psychological diagnosis is often based on proximity to a cluster of symptoms rather than specific ones that must be met, you can't point to one particular line that makes it AI. It is the combination of all of the sentences that aligns with our perception of AI.
Am I missing something or does the article fail to explain the point of Arrow’s Theorem? Is it satisfied for the discrete case, provably impossible, or what?
> While this applies to discrete rankings and voter preferences, one might wonder if it’s a unique property of its discrete nature in how candidates are only ranked by ordering. Unfortunately, a similarly flavored result holds even in the continuous setting! It seems there’s no getting around the fact that voting is pretty hard to get right.
I agree, it could do with a little more proofreading. Arrow's theorem states that no voting system which ranks candidates can satisfy all of the given conditions.
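To make the flavor of the impossibility concrete, here's a minimal sketch of the Condorcet paradox that motivates Arrow's theorem. The ballots are hypothetical, purely for illustration; the theorem itself is a more general impossibility result about ranked voting rules:

    # Condorcet paradox: pairwise majority voting over ranked ballots
    # can yield a cyclic "social preference", i.e. no consistent ranking.
    from itertools import combinations

    # Each ballot ranks candidates best-to-worst (made-up data).
    ballots = [
        ["A", "B", "C"],
        ["B", "C", "A"],
        ["C", "A", "B"],
    ]

    def majority_prefers(x, y):
        """True if a majority of ballots rank x above y."""
        wins = sum(1 for b in ballots if b.index(x) < b.index(y))
        return wins > len(ballots) / 2

    for x, y in combinations("ABC", 2):
        winner, loser = (x, y) if majority_prefers(x, y) else (y, x)
        print(f"majority prefers {winner} over {loser}")

    # Prints: A over B, C over A, B over C. That's a cycle, so the
    # "will of the majority" isn't even a ranking. Arrow's theorem
    # generalizes this: no ranked voting rule avoids such pathologies
    # while satisfying all of the fairness conditions at once.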
More curved means more optical aberrations. The Rubin is a so-called three-mirror anastigmat design that minimizes astigmatism, coma, and spherical aberration. (Chromatic aberration is not a problem in reflectors because dispersion only occurs when light is refracted.) A two-mirror design couldn’t be used in such a wide-field telescope without severe image quality issues, at least comparatively speaking.
I don’t think it is. What you describe is similar to any other industry disruption, and I don’t think those are unethical. I’d actually argue that preventing disruption is often (not always) unethical, because you artificially prolong an inefficient or inferior alternative.
So you're saying that we should stop pursuing art and prose? Because when you fine-tune Midjourney with 30 or so images from an artist, it can create any image in that artist's style.
You removed the value and authenticity of that artist in 30 minutes, you applauded it, and defended that it should be the norm.
OK then, we can close down all entertainment businesses and generate everything with AI, because it can mimic styles, clone sounds, animate things with Gaussian splats, and so on.
Maybe we can hire coders to "code" films? Oh sorry, ChatGPT can do that too. So we need a keypad then, one that only the wealthiest can press. Press 1 for a movie, 2 for a new music album, 3 for a new book, and so on.
We need 10 buttons or so, as far as I can see. Maybe I can ask ChatGPT 4 to code one for me.
Doesn't matter. You pay the artist for their style of rendering things. Consider XKCD, PHD Comics, Userfriendly, etc. At least 50% of the charm is the style; the remaining 50% is the characters and the story arc.
You can't copyright the style of a Rolex, but people pay a fortune to get the real deal. Same thing.
> My word, the lawsuits that would arise between artists...
Artists imitate/copy artists as a compliment, at least in the illustration and comics world. Unless you do it in bad faith, I don't think artists are gonna do that. Artists have a sense of humor to begin with, because art is making fun of this world, in a sense.
No, you pay them for the finished product. The STYLE is independent. Lots of artists have similar styles. They don't all pay each other for copying their styles.
Every artist has their own style, because it's their way of creating the product.
Pixar, Disney and Dreamworks have different styles, same for actors, writers, and designers, too. You can generally tell who made what by reading, looking, listening, etc.
I can recognize a song by Deep Purple or John Mayer or Metallica, just by their guitar tone, or their mastering profile (yes, your ear can recognize that), in a couple of seconds.
If style were that easy, we could have 50 Picassos, 200 John Mayers, 45 Ara Gulers (a photographer) that you couldn't tell apart, but it doesn't work that way.
XKCD had a couple of guest artists for personal reasons. It was very evident, even though the drawing style was the same.
People, art, and hand-made things are much more complex than they look. Many programmers forget this because everything is rendered in their favorite font, but no two hand-made things are ever the same. Eat the same recipe from two different cooks: even if you measure out the ingredients and hand them over beforehand, you'll get different tastes.
Style is a reflection of who you are. You can maybe imitate it, but you can't be it.
Heck, even two people implementing the same algorithm in the same programming language don't write the same thing.
> Style is a reflection of who you are. You can maybe imitate it, but you can't be it.
Isn't this an argument that AI-generated artwork will never be more than a lesser facsimile? That'd suggest that human-made works will always be more sought-after, because they're authentic.
It will be, and human-made things will always be better and more sought-after; however, capitalism doesn't work that way.
When the replacements become "good enough", they'll push out the better things by being cheaper and 90% of the way there. I have some hand-made items and they're a treat to hold and use. They perform way better than their mass-produced counterparts, they last longer, they feel human, and no, they're not inferior in quality. In fact it's the opposite, but most of them are not cheap, and when you want to maximize profits, you need to reduce your costs, ideally to zero.
Honestly, that'll be boring. I don't want to be the star of a movie; that's not what pulls me in.
I want to see what the person has imagined, what the story carries from the author, what the humans in it added to it and what they got out of it.
When I read a book, I look from another human's eyes, with their thoughts and imagination. That's interesting and life-changing actually. Also, the author's life and inner world leaks into the thing they created.
The most notable example for me is Neon Genesis Evangelion. Its psychological aspects (which hit very hard, actually) are a reflection of Hideaki Anno's clinical depression. You can't fake this even if you want to.
This is what makes human creation special. It's a precipitation of a thousand and one things in an unforeseen way, and this is what feeds us, even though we are not aware of it and love to deny it at the same time.
"This is what makes human creation special.", that's a load of garbage. There is nothing inherently special about human creation. Some AI artwork I've seen is incredible, the fact it was AI generated didn't change its being an incredible piece of art.
Thinking our creation has some kind of 'specialness' to it is like believing in a soul, or some other stupid thing. It's pure hubris.
Actually, I'm coming from a gentler point of view: "Nature and living things are much more complex than we anticipate".
There are many breakthroughs and realizations in science which excite me more than "this thing called AI": Bacteria have generational memory. Bees have a sense of time. Mitochondria (and cells) inside a human body communicate and try to regulate aging and call for repairs. Ants have evolved antibiotics, and expel the ones with incurable and spreadable diseases. Bees and ants have social norms; they have languages. Plants show more complex behavior than we anticipated. I'm not even getting into primates and birds, because the titles alone would fill a short chapter.
While some of them might be very simple mechanisms at the chemical level, together they make a much more complex system, and the nature we live in is far more sophisticated than we know, or want to acknowledge.
I'm not looking at this from a "humans are superior" perspective. Instead, I'm looking at it from an "our understanding of everything is too shallow" perspective. Instead of trying to understand or acknowledge that we're living in a much more complex system on a speck of dust in vast emptiness, we connect a bunch of silicon chips, dump everything we ever babbled into a "simulated neural network", and it gives us semi-nonsensical, grammatically correct half-truths.
That thing can do it because it puts one word after another via a very complex, weighted randomization learned from how we do it, imitating us blindly, and we think we've understood and unlocked what intelligence is. Then we applaud ourselves because we're one step closer to stripping a living thing of its authenticity and making Ghost in the Shell a reality.
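(For what it's worth, here's a toy sketch of that "weighted randomization". The probabilities are made up for illustration; real models condition on far more context, but the principle is one word sampled after another:)

    # Toy next-word sampler: pick each word at random, weighted by
    # probabilities "learned" from text. The table is invented purely
    # for illustration.
    import random

    next_word_probs = {
        "the":  {"cat": 0.5, "dog": 0.3, "idea": 0.2},
        "cat":  {"sat": 0.7, "ran": 0.3},
        "dog":  {"sat": 0.4, "ran": 0.6},
        "idea": {"ran": 1.0},
    }

    def generate(word, steps):
        out = [word]
        for _ in range(steps):
            dist = next_word_probs.get(word)
            if not dist:
                break
            words, weights = zip(*dist.items())
            word = random.choices(words, weights=weights)[0]
            out.append(word)
        return " ".join(out)

    print(generate("the", 2))  # e.g. "the cat sat": fluent, yet nothing understood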
Living things form themselves over a long life with sight, hearing, communication, interaction and emotions, at the very least, yet we assume that a couple million lines of code can do much better because we poured in a quadruple-distilled, three-times-diluted version of what we have gone through.
If you ask me, this is pure hubris if anything ever was.
Then the market will decide, won't it? Why the fuss about generative AI then? If you're so confident about its inferiority, you shouldn't have to worry about it, right? The better product will win, right?
The market does not choose the superior product. It might choose the least common denominator, the cheapest product, the product that got on the market the earliest, or the one with the richest backers, but not "the superior product".
The first part is debatable, unless you qualify it as "superior at making their creator money".
The market selects for that, and only that. Other qualities of the product are secondary, making any statements to the effect of "the best product [outside the context of simply making the most money] will win" misguided at best.
What will actually happen is people will think "meh good enough", shitty AI art will become the norm, and we'll be boiling frogs and not realize how shitty things have become.
Yes, that is true. I 100% agree. It is needed without a doubt.
For one moment, let's think of it this way. You are an engineer with 20 years of experience, making whatever money you are making. Suddenly, your skills are invalidated by a new disruption. And you have another friend in the same situation.
Fortunately for you, luck played out and you could transition! You found your way to a life with meaning and value. Your joy and your everyday life continued as they were.
But the other friend enjoyed the process and liked doing what they were doing, and there was no suitable transition for them. Humans are adaptable, but to them, nothing mattered anymore, because their whole existence no longer offered any value. The sole act of doing was robbed from them, WITHOUT ANY ALTERNATIVE. The experience and value of a person were rendered worthless.
Can you relate to that feeling? If yes, thank you.
If no, your words are empty and hold no value.
Artists went through a similar phase during the invention of photography. Now it is rather soul-crushing, because anything an artist makes can easily be replicated, rendering the whole artistic journey moot.
> Can you relate to that feeling? If yes, thank you.
> If no, your words are empty and hold no value.
Being sympathetic towards those people doesn't mean you should bend to their will if you don't believe it's the right thing to do. I can be sympathetic to a child who cries over not being able to ride a roller coaster because they aren't tall enough without thinking the height requirement should be removed.
I think the big difference is that it's not a direct replacement - it feeds off of the existing people while making it much harder for them to make a living.
It would be as if instead of cars running on gasoline, they ran on chopped up horseflesh. Not good for the horses, and not sustainable in the long term.
Some "disruptions" are unethical, some are not. It's about what they actually consist of. Labelling many things as "industry disruption" abstracts beyond usefulness.
Do you really feel that way universally? Would it be ethical to disrupt the pharmaceutical industry by removing all restrictions around drug trials? Heck, you could probably speed things up even further if you could administer experimental drugs to subjects without their consent.
Obviously this is a bit facetious, but basing your ethical framework on utilitarianism and _nothing_ else is pretty radical.
If having those restrictions makes the world worse overall, then it would be ethical to remove them. But I assume the restrictions are designed by intelligent people with the intention of making the world better, so I don’t see any reason to think that’s the case.
I agree that the current crop of artists are worse off with AI art tools being generally available. But consumers of art, and people who like making art with AI art tools, are better off with those tools being available. To me it's clear that the benefit to the consumers outweighs the cost to the artists, and I would say the same if it were coders being put out of jobs instead. You can prove this to yourself by applying it to anything else that's been automated. Recorded music playback put thousands of musicians out of work, but do you really regret that it was invented?
P.S. Adobe Firefly is pretty competent and is trained only on material that Adobe has a license to. If copyright were the real reason people didn't like AI art tools, you would see artists telling everyone to get Adobe subscriptions instead of Midjourney.
> If having those restrictions makes the world worse overall, then it would be ethical to remove them
Worse how? As defined by whom?
You could make a pretty compelling argument that "the world" would be better off by, e.g., forcing cancer patients through drug trials against their will. We basically could speed run a cure to cancer!
These longtermist, ends justify the means, ideas can easily turn extremely gross.
Don't even try to stop my grocery-store-sample-hoarding robot army, Wegmans! You're being unethical in your pathetic attempt to prevent my sampling disruption!
Summary: Wolfram's model doesn't handle violations of Bell's Inequality, which have been observed in nature. Violations of Bell's Inequality require true randomness in the underlying universe. Hypergraph state machines assume an underlying mechanism that has no randomness. In other words, it functions like the kind of hidden-variables theory ruled out by Bell's theorem. They haven't gotten anything like Hilbert space (quantum states) out of Wolfram's model.
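(For reference, the standard CHSH form of Bell's inequality makes the constraint concrete. This is textbook material, not something specific to Wolfram's model:)

    % For any local hidden-variable theory, the correlations E between
    % detector settings (a, a') and (b, b') must satisfy
    \[
      S = \left| E(a,b) + E(a,b') + E(a',b) - E(a',b') \right| \le 2,
    \]
    % while quantum mechanics predicts, and experiments confirm, values
    % up to Tsirelson's bound:
    \[
      S_{\mathrm{QM}} = 2\sqrt{2} \approx 2.83.
    \]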
There's no doubt that he really, really wants to find a theory of everything. But he hasn't found one, which he admits himself.
What he claims to have found is a framework which can be used to produce such a theory. But what he's pushing seems to be little more than the idea that simple rules can have complex results - the same thing he's been pushing for years. He seems to have moved on from cellular automata, though, essentially acknowledging that his previous ideas were wrong, at least in the specifics.
But the richness that can arise from simple rules is hardly a new idea. Many programming languages demonstrate that. The lambda calculus demonstrated it in the 1930s, but no-one seriously claims that it underlies the physics of the universe.
There's actually a surprising amount in physics that can be proved mathematically from first principles, given some basic premises. The constant speed of light gives us special relativity (and Pythagoras' theorem gives us its equations.) Noether's Theorem gives us conservation laws. The inverse square law follows from the simple mathematics of an abstract 3D space.
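(For instance, the textbook light-clock sketch: a photon bounces vertically over height L inside a clock moving at speed v, and Pythagoras applied to the photon's path gives time dilation directly:)

    % From the ground, during half-period t the clock moves vt sideways
    % while the photon travels ct along the hypotenuse:
    \[
      (ct)^2 = (vt)^2 + L^2 .
    \]
    % In the clock's rest frame that same leg takes t_0 = L/c, so
    \[
      t = \frac{L/c}{\sqrt{1 - v^2/c^2}} = \frac{t_0}{\sqrt{1 - v^2/c^2}},
    \]
    % i.e. the Lorentz factor drops straight out of Pythagoras' theorem.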
Just producing a single new result along those lines would be a major leap forward for Wolfram's ideas. But so far, I'm not aware of anything like that.
And when he does try to relate his ideas to physics, it usually amounts to nothing more than speculation and handwaving. Here's an example from the linked post, about the nature of space:
> "Well, I think it’s very much like the picture above. A whole bunch of what are essentially abstract points, abstractly connected together. Except that in the picture there are 6704 of these points, whereas in our real universe there might be more like 10^400 of them, or even many more."
This is pretty meaningless, since it has no connection to anything observable or testable.
All sorts of mathematics can be used to model a problem. An example is string theory, which can be used to derive correct results about the universe. But that doesn't mean that the strings it postulates exist, because there are other models that can derive the same results without strings.
Wolfram seems inclined to conflate the particular mathematical models he's using with the physics of what he's modeling.
There are cases where that can make sense, like the examples I gave above - in those examples, an observable result about the universe is derived mathematically without needing to postulate new, unobserved theoretical entities (like strings). Such derivations can actually explain something, not just describe or model it.
But in the string theory case (just to use it as a convenient relevant example), what we're almost certainly dealing with is a model that happens to work because it is equivalent in certain important ways to other models that also work. (Occam's razor can come in handy in these cases.)
Wolfram's very approach, where he starts with some framework and then tries to fit it to physics, seems almost guaranteed to produce this kind of result: he may be able to model something, but there's no reason to think that the framework he's using is a uniquely meaningful reflection of physical reality. It's very much a case of "When all you have is a hammer, everything looks like a nail." Wolfram is obsessing over a particular type of hammer.
I'm not an expert at all, but I think "one model to rule them all" is a sci-fi problem that no serious physicist really works on... because rules are models to describe the universe. Knowing a unique model that matches exactly would mean we are able to know everything about the universe. But how can one know that there are no more phenomena to be discovered, and that all models match exactly?
I can't say exactly what Wolfram may be thinking, but part of it may just be solving the discrepancies between general relativity and quantum physics.
It's assumed that that must be possible, simply because the universe manages to make it work somehow. But our mathematical models of those two theories aren't fully compatible. That's something that serious physicists definitely think about, although there probably aren't that many working on solving it directly.
The other thing that seems to interest Wolfram is finding fundamental causes - e.g. why gravity exists in the first place, etc. He seems to think that can be derived from his particular rules.
This is something that many physicists deliberately avoid, on the theory that physics is about modeling and predicting what exists, not about explaining why it exists.
It's not really as simple as that - e.g. the examples I gave earlier contradict that idea - but it's certainly the case that many physicists would consider Wolfram's goals to be unscientific in a sense, because it's not likely to be possible to get evidence for claims about why e.g. space or gravity exist. Of course, we can't properly assess that until we see such a claim, which hasn't yet been produced.
Finally, I think you're right that "one model" is never really going to apply. If you look at existing physics there are all sorts of different principles that apply at different scales and to different phenomena, and there are reasons for that. It's only really to address issues like the GR/QM discrepancy that some sort of better compatibility between models is needed.
With all due respect, I think you're wildly out of touch with how most people use "phones" these days. Actual calls probably represent about 5% of my phone use. The term "phone" is essentially a historical misnomer at this point.
Maybe, being in my mid-30s, I'm too old to be up to date on this, but what do people who aren't "out of touch" use their phones for nowadays? Some form of 'apps', I'm guessing.
And for those apps to serve their purpose (excepting the likes of Candy Crush and Genshin Impact) what do those apps need? Internet connectivity right? Which is received via the same modem as the calls, right?
So a sucky modem will negatively impact battery life and app usage in challenging environments just as much as, or even more than, call capabilities, right?
The confusion was probably just about saying it's a phone first. I'm in my mid 30s as well and the only time I call someone is when I contact my mother. I think with the younger generation it's at a point where calling someone is seen as intrusive.
If you are calling me you are taking my choice away on when to deal with whatever you've got. Or at least are interrupting me with the attempt itself. So it better be important.
So on point. And even when it actually rings, 90% of the time I just dodge the call. The remaining 10% is PagerDuty. All my real contacts use some form of messaging app instead.
Won't work where I live now, in south-central Europe (Austria), where everything is archaic and most services are still handled via phone calls like in the past: appointments for plumbers, doctors, recruiter call-backs to job applications, etc.
Almost none of those services here use email or apps as the default; everyone seems to prefer synchronous communication, so they first try to reach you by phone. If you don't answer or call back, they'll assume you're ghosting them and don't need the appointment/service anymore, and you'll miss out on important issues. Sometimes they'll leave a voicemail telling you to call them back when you can, so you can never really escape calls here without emigrating.
You can say we both live in completely different bubbles but the existence of one does not invalidate the other.
I'm from Poland. It's just that over the years most of these services became messenger contacts, for various reasons, the most common being the need to reliably exchange pictures.
Sure but when you're out and about and don't have WiFi you'll be relying on your modem to send/receive pictures and all kinds of data.
Probably not an issue if you're all the time in cities with great coverage but you'll feel it when you go in "the woods" or well shielded buildings or basements.
Love Austria btw., and appreciate the preciseness of your language. It's been years since I worked with it, but consuming technical content in it always felt like a vacation for my brain.
>Love Austria btw., and appreciate the preciseness of your language.
I'm not Austrian, I'm an immigrant here, and it's not "my language" nor anyone else's I presume, it's still just German, albeit with an Arnold Schwarzenegger accent on top.