But that's the problem here. The vast majority of the spend is going to be on Nvidia chips, which have a shelf life of 3-5 years. They are not making any significant long-term investments.
During the dot-com bubble, telecom companies spent tens of billions of dollars laying down cables and building out the modern public internet infrastructure that we are still using today. Even if a lot of those companies failed, we still greatly benefited from some of the investments they made.
For this bubble, the only long-term investment benefits seem to be the electricity build-out and a renewed interest and investment in nuclear.
Most (if not all) of Oracle's investments are in chips and data centers.
I was going to make this exact comment about stacking plates. I think most servers/ex-servers also do this regardless of age. It's even easier to do than returning a shopping cart.
The only downside of this in the US is that homeless people tend to hang around Aldi asking people if they can return their cart to get the coin. Most of them are friendly and thankful, but every once in a while an aggressive person has made me very uncomfortable.
I also expect Aldi management isn't thrilled about homeless people camping outside their stores.
It already has. Any tech company that is pre-IPO and still raising funding rounds is a "startup". I'm surprised no one has come up with a separate term for companies at that stage.
Still, I think there needs to be a specific term for a company that has recently had a funding round and will likely IPO in the future, like Stripe. That's a different category from a startup, or from a privately owned company that will never IPO, like Koch Inc.
I think it makes sense in this instance. Because this occurred in us-east-1, the vast majority of affected customers are US-based. For most people, it's easier to do the timezone conversion from PT than from UTC.
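For what it's worth, if you'd rather skip the mental math entirely, here's a minimal TypeScript sketch (the timestamp below is made up for illustration) that takes a PT-announced time and renders it in whatever zone you care about:

    // Hypothetical incident start time, announced as 11:49 AM PST (UTC-8).
    const incidentStart = new Date("2025-01-01T11:49:00-08:00");

    // Render it in the reader's own local time zone.
    console.log(incidentStart.toLocaleString());

    // Or pin it to a specific zone, e.g. US Eastern or UTC.
    console.log(incidentStart.toLocaleString("en-US", { timeZone: "America/New_York" }));
    console.log(incidentStart.toLocaleString("en-US", { timeZone: "UTC" }));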
us-east-1 is an exceptional Amazon region; it hosts many global services as well as services which are not yet available in other regions. Most AWS customers worldwide probably have an indirect dependency on us-east-1.
I think what you're referring to is APIs or libraries, most of which I wouldn't consider abstractions. I would hope most senior front-end developers are capable of building a date library for their use case, but in almost all cases it's better to use the built-in Date class, a library like moment, etc. But that's not an abstraction.
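To make that concrete, here's a rough sketch (the helper name is hypothetical) of the "roll your own" version next to what the built-in APIs already give you:

    // The kind of bespoke formatter people end up hand-rolling:
    function formatDateManually(d: Date): string {
      const pad = (n: number) => String(n).padStart(2, "0");
      return `${d.getFullYear()}-${pad(d.getMonth() + 1)}-${pad(d.getDate())}`;
    }

    // Usually the better call: the platform already handles this, including
    // locales and time zones. en-CA's short style happens to be YYYY-MM-DD.
    const formatter = new Intl.DateTimeFormat("en-CA", { dateStyle: "short" });

    console.log(formatDateManually(new Date()));
    console.log(formatter.format(new Date()));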
Because if LLM inference is going to be a bigger priority for the majority of companies, they're going to go where they can get the best performance-to-cost ratio, and AWS is falling behind on this. So companies (especially new ones) are going to start using GCP or Azure, and if they're already there for their LLM workloads, why not run the rest of their infrastructure there?
It's similar to how AWS became the de facto cloud provider for newer companies. AWS struggled to convince existing Microsoft shops to migrate; instead, most of those companies just migrated to Azure. If LLMs/AI become a major factor in how new companies decide on their default cloud provider, they're going to pick GCP or Azure.
It really depends on the query. I'm not a Google query expert, but I'm above average. I've noticed that phrasing a query in a certain way to get better results just no longer works. Especially in the last year, I have found that it returns results that aren't relevant at all.
The problem is that people have learned to fill their articles/blogs with as many word combinations as possible so that they show up in as many Google searches as possible, even when they're not relevant to the main question. The article will have just one subheading that is somewhat relevant to the search query, even though the information under that subheading is completely irrelevant.
LLMs have ironically made this even worse, because now it's so easy to generate slop and have Google rank it highly. I used to be very good at phrasing a search query the right way, quoting the right words/phrases, or filtering by site. Those techniques no longer work.
So I have turned to ChatGPT for most of the queries I would typically have used Google for, especially with the introduction of annotations: now I can verify the source it derived the answer from. It's a far better experience in most circumstances compared to Google.
I have also found ChatGPT to be much better than other LLMs at understanding nuance. There have been numerous occasions where I have pushed back against ChatGPT's answer and it has responded with something like "You would be correct if your input/criteria is X. But in this case, since your input/criteria is Y, this is the better solution for Z reasons".
It looks like this will support physical buttons. From the article: "Drivers can also use onscreen controls, physical buttons, or Siri to manage both standard vehicle functions like the car’s radio and climate, as well as advanced, vehicle-specific features and controls like audio system configurations or performance settings, right from CarPlay, giving them a more fluid and seamless experience."
And if you look at the picture from the article, all the physical buttons are there.