> Wait, that can't be right, it would be ludicrous. Once a game passes the threshold it pays per-install permanently?
Yes and no. You need to meet _both_ thresholds, cumulative (lifetime) installs _and_ yearly(!) revenue. I (!)'d the yearly part there, because you still need to be pulling in a yearly $1M of revenue (I'm assuming Unity Pro here cause the math is simpler) after your 1M of installs.
So while there are some edge cases here that are legitimately ludicrous, it's not the case that you're on the hook for the game in perpetuity, because if your game falls off a cliff and you make $500k in revenue next year, you owe nothing in runtime fees. In other words, you're not incentivized to take it off the market after 1M installs unless the runtime fees made it so you started losing money on the game after your $1M of revenue-- there are some examples where this is possible but none of them are very realistic.
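The dual-threshold logic above can be sketched in code. To be clear, this is an illustration, not Unity's actual fee schedule: the $0.15/install rate and the flat per-install pricing are simplifying assumptions (the real schedule reportedly tiers by volume and market), but the both-thresholds-must-hold structure is the point.

```python
# Hypothetical sketch of the dual-threshold runtime fee described above.
# Threshold values and the flat per-install rate are assumptions for
# illustration only.

LIFETIME_INSTALL_THRESHOLD = 1_000_000
YEARLY_REVENUE_THRESHOLD = 1_000_000   # USD, trailing-year revenue
FEE_PER_INSTALL = 0.15                 # USD, illustrative rate only

def runtime_fee(lifetime_installs: int,
                yearly_revenue: float,
                new_installs_this_period: int) -> float:
    """A fee is owed only when BOTH thresholds are met."""
    if lifetime_installs < LIFETIME_INSTALL_THRESHOLD:
        return 0.0
    if yearly_revenue < YEARLY_REVENUE_THRESHOLD:
        return 0.0  # revenue fell off a cliff -> nothing owed this period
    return new_installs_this_period * FEE_PER_INSTALL

# Game past 1M installs but only $500k revenue last year: owes nothing.
print(runtime_fee(2_000_000, 500_000, 50_000))    # 0.0
# Same game with $1.2M trailing revenue: pays per new install.
print(runtime_fee(2_000_000, 1_200_000, 50_000))  # 7500.0
```

Note the second branch: it's what makes the "in perpetuity" worry mostly moot, since a game whose revenue has collapsed drops back to owing nothing.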
It's also not clear whether it's $1M in the previous 12 months or in the past calendar year, whether they have any right to audit us, or which financial entity is on the hook. Is it the entity that pays for Unity, the publisher, or the distributor?
What about the contractors we pay to do a few months of work at the end, who use their own licenses? What about the folks who do our PS5 and Xbox ports for us?
Unity attempted to clarify their position on Game Pass, telling devs not to worry because Microsoft will pay, but that makes me more worried: MS will just pull those games. I think there are 25 million Game Pass subscribers, and that's a lot of 20c installs.
We were hoping for another stint in Game Pass as a follow up to Void Bastards.
> The problem Unity have created is if something can be made with Unity it will get crowded out with clones in five minutes.
This really has nothing to do with Unity. Flappy Bird could have been built on any platform and you would still have a million clones of it. Because it takes a day to make it. It's just as easy to clone that game in Unreal Engine, fwiw.
Unity didn't create the concept of the quickly built game, nor is Unity responsible for society incentivizing this type of game dev. If anything, the new runtime fees will disincentivize this type of game, so maybe that's a good thing?
How big is your team? How do you code review? How do you unit test? How are you maintaining 3 year old blueprints?
This will work just fine for small indie games and teams where the code is thrown out after a year, but there's no scaling this. I've worked with enough 10 year old Max/MSP patches to know that you will have an unmaintainable mess of wires that is as good as garbage after a while.
Larger teams typically use Blueprints to prototype but then rewrite in C++. This is a perfectly fine use case, but ultimately you still need it "written down" in code for maintainability.
tl;dr Blueprints are a tool in the process of writing C++, not a replacement for it.
This isn't really a solid argument since stockholder investment isn't even "money"; it's collateral in the form of market valuation against which Unity borrows. The actual "money" (dollars in a bank account) comes from lenders who look at a bunch of factors and make determinations about how much $$$ to give. That money isn't free, but it's got very little to do with "the shareholders" in general.
> it's got very little to do with "the shareholders" in general.
No, it's got everything to do with shareholders, because ultimately the money the shareholders invested in the company will need to be recouped by them, by any method they can find.
And in the cutthroat business of game engines, Unity is trying to squeeze out every bit of juice it can. It still has some advantages over Godot at the moment (such as a big install base and pre-existing network/momentum, which are hard to dissipate), and my guess is that Unity is trying to get revenue up before Godot eats their lunch in a few more years.
Keep in mind that in addition to the "learning curve" arguments, there is also the functional developer ergonomics of things like live reload (where you can maintain memory state) that are simply not possible to do with C++ without heavy limitations or customized tooling.
Being able to fix a bug without resetting memory state is a huge ergonomic advantage in game development where generating the right memory state can be incredibly complex and depend on a ton of very specific and hard-to-reproduce factors. Not to mention recompiling and restarting a game can be incredibly slow.
Yes, but it has many known limitations and isn't nearly as reliable as a runtime that has a full GC and virtualization optionality, which you really need in order to fully track what state can be evicted and what needs to stay.
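For a toy illustration of why state-preserving reload matters, here's a minimal Python sketch; the module name `hotfix_demo` and its buggy/fixed bodies are invented, and real engine live reload (C# domain reload, C++ live-coding tools) is far more involved than this:

```python
# Minimal sketch of fixing a bug at runtime while keeping memory state.
# Everything here (module name, the "bug") is invented for illustration.
import importlib, pathlib, sys, tempfile

sys.dont_write_bytecode = True  # avoid stale .pyc caching on fast rewrites
tmp = tempfile.mkdtemp()
sys.path.insert(0, tmp)
mod_path = pathlib.Path(tmp, "hotfix_demo.py")

# Version 1: has a bug (damage is doubled).
mod_path.write_text("def damage(base): return base * 2  # bug\n")
import hotfix_demo

# Build up hard-to-reproduce "game state" while the buggy code runs.
game_state = {"player_hp": 100 - hotfix_demo.damage(10)}  # 80, too low

# Fix the bug on disk and reload WITHOUT restarting the process:
mod_path.write_text("def damage(base): return base  # fixed\n")
importlib.reload(hotfix_demo)

# game_state survived the reload; new calls use the fixed code.
game_state["player_hp"] -= hotfix_demo.damage(10)
print(game_state["player_hp"])  # 70
```

The dict survives the reload untouched, which is exactly the ergonomic win described above: you fix the function without regenerating the state that got you there.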
Citation needed? Raw speed is likely similar; the cost overhead comes from GC cycles and the general approach to managing memory primarily in the heap vs stack, although C# can stackalloc if you're really diligent. Note that this is the same problem that blazing fast alternatives like Go have at competing with C/C++. These languages are mostly equivalent to C/C++ in speed, but lose the benchmark shootouts because of GC.
Calling Go blazing fast would be a stretch at best... The path to performance in C# is the same as in Rust or C++: struct generics (aka templates), hot paths kept as short as possible, static partitioning of the work, sometimes hands-on memory management, and minimizing locking.
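As a rough illustration of the "collector work, not raw speed" point (in Python rather than C# or Go, purely for brevity): a hot loop that keeps short-lived containers alive will trigger generational collections even though the arithmetic itself is cheap.

```python
# Demonstrates GC pressure from container churn. Numbers are from
# CPython's generational collector, used here only as an analogy for
# the C#/Go point above.
import gc

def churn(n):
    """Allocate lots of short-lived containers, as a hot game loop might."""
    junk, acc = [], 0
    for i in range(n):
        chunk = [i, i + 1, i + 2]   # tracked container -> collector pressure
        acc += sum(chunk)
        junk.append(chunk)
        if len(junk) > 1000:        # keep a rolling window of objects alive
            junk.clear()
    return acc

before = gc.get_stats()[0]["collections"]
result = churn(200_000)
after = gc.get_stats()[0]["collections"]
print("result:", result)
print("gen0 collections triggered during the loop:", after - before)
```

The additions themselves are trivial; the collector passes riding along with them are the overhead the benchmark shootouts end up measuring.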
But you need to consider that so have hosting costs-- proportionately too. Hosting data was incredibly expensive 10 years ago. If the math was working then, it should at least be pretty close to working now.
For some segments that's true. Image hosting wasn't incredibly expensive 10 years ago. It was very much on the lighter side in cost compared to MegaUpload or YouTube type services.
Image file sizes increased dramatically as smartphones started producing photos that would have been considered massive 15 years ago, which ate up much of the gains from cheaper hardware.
It's easier to run a one-click image host (like early Imgur) as a solo operator today than it was back then. It's not much cheaper, though, once you account for the larger image files (unless you severely limit file size, which won't be popular with users, most of whom just blast out four billion smartphone photos and want to upload them as-is without thinking about any of that).
Bandwidth is a heck of a lot cheaper these days (I remember a previous employer paying $10k/mo for a 100Mbps circuit in San Francisco 10ish years ago). That said, while prices are much lower, people are realizing that not all bandwidth is created equal: good connectivity to some regions can still be ludicrously expensive, for example delivering to Singapore or Australia, or getting content from the USA to South America with reasonable reliability and low latency.
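Back-of-the-envelope math on that $10k/100Mbps figure, assuming (optimistically) a fully saturated circuit all month:

```python
# Per-GB cost of a $10k/mo 100Mbps circuit at full saturation.
# Real utilization would make the per-GB cost even higher.
MBPS = 100
SECONDS_PER_MONTH = 30 * 24 * 3600                    # 2,592,000
gb_per_month = MBPS / 8 * SECONDS_PER_MONTH / 1000    # Mbps -> MB/s -> GB
cost_per_gb = 10_000 / gb_per_month
print(f"{gb_per_month:,.0f} GB/month, ${cost_per_gb:.2f}/GB")
# -> 32,400 GB/month, $0.31/GB
```

Compare that to commodity CDN/egress pricing today, often quoted somewhere in the cents-per-GB range (an assumption; it varies widely by provider and region), and the drop is an order of magnitude or more.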
Fastmail nets around $50/year (if you pay annually) for mail + your own domain, and if you don't need the domain, it's less.
Another $50/year really isn't that big of a stretch for some people, but I can see how saving up to 70% on email would be a pretty easy choice for those who don't really care that much about email and might even use gmail if it weren't for needing a custom domain.
Can you explain what is worse about link tracking? If you're explicitly making a request to a remote service, how would you expect it to not be tracked?
2. The link now goes to marketing.newsletter.com instead of example.com/original-story, which hides a signal I'd otherwise get about the quality of the citation. Wouldn't you hate it if every Wikipedia citation went to bit.ly instead?
3. My "intent" is to read the information behind the original link; if I can't do that without being tracked by the sender of the email, I'm less likely to click the link.
4. Email isn't meant to be tracked! Clicking a link in an email I got should not notify the sender in any way whatsoever. If this is true for my personal emails, why is it not true for company emails?
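For context on what's actually happening with these links, here's a sketch of the typical rewrite-and-redirect mechanism. The `marketing.newsletter.com` domain echoes the example above; the query parameter names are invented:

```python
# Sketch of newsletter link tracking: the sender rewrites every link to
# bounce through its own redirect service, which logs the click before
# forwarding. Parameter names ("url", "rid") are assumptions.
from urllib.parse import urlencode, urlparse, parse_qs

TRACKER = "https://marketing.newsletter.com/click"

def wrap_link(original_url: str, recipient_id: str) -> str:
    """What the sender does to each link before the email goes out."""
    query = urlencode({"url": original_url, "rid": recipient_id})
    return f"{TRACKER}?{query}"

def resolve(tracked_url: str) -> str:
    """What the redirect service does on click: log, then forward."""
    params = parse_qs(urlparse(tracked_url).query)
    # A real service would record params["rid"] here -- that's the tracking.
    return params["url"][0]

tracked = wrap_link("https://example.com/original-story", "user-42")
print(tracked)
print(resolve(tracked))  # back to https://example.com/original-story
```

This is also why hovering over the link shows the tracker's domain rather than the real destination, which is exactly the lost citation-quality signal from point 2.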
Getting good signal-to-noise ratio from ANY interview, forget the big cos, is really difficult. I've seen great programmers passed over because they were _not_ given a chance to whiteboard and show their technical skill. At the end of the day, there is a bit of randomness to the process of sizing anyone up in an hour, and people will make bad calls.
That said, I'm skeptical of your particular story. As a process rule, AWS doesn't provide interview feedback, so I'm pretty sure you don't _actually_ know why you were rejected; you can only guess. If you were rejected in a phone screen it might be obvious, but if you made it to an on-site, I can guarantee (based on experience being in the room during hiring decisions at AWS) that they don't dismiss someone over one bad whiteboarding result: multiple people collect and report results, only one person, the bar raiser, has veto power, and I've never seen it used for something as banal as not writing perfect code.
I also don't know what happened in your interview, but consider it's also possible that you completely bombed it: failed to ask questions, failed to communicate properly, or made more mistakes than you even realize.
It's certainly great that you went on to do great things, but maybe at the time you interviewed you weren't yet doing those great things...