Hacker News | guax's comments

> It's traffic that is completely below ground which means that no streets above are burdened with it, it's all electric so no emissions

So are modern subways. Cost is a major point though: subways are designed to move way more than 30k people a day for much less, but the cost of building them is much higher.

This is only 1.7 miles and a novelty; I don't know if the differences hold for Tesla in other places or when scaling up. My suspicion is that they don't.

I also wonder: if you used the same tunnel they built but modified the cars to run by themselves using traditional automation techniques, would the operation get cheaper while the shortcomings become more glaring?


Vegas now has 5 stations and is 2.2 miles. Can you realistically compare it to a billion-dollar-a-mile subway line that would take a decade (or more) to build?

30k a day is nearly a million a month, and costs are low by comparison (no expensive subway cars, etc.).


This is nothing compared to Tokyo Metro, or even the New York subway. Stop comparing failures to other failures like they are successes.

You are comparing the Vegas Loop to a tunnel in Tokyo, the biggest metro on earth.

That's like saying a car is slow because it's not a spaceship.


Can you cite that Tokyo is the biggest metro?

When I say metro I'm talking about the metropolitan area, not the subway.

Tokyo has 37 million people, so it's comical to compare it to Las Vegas, which has less than 700k.

https://en.m.wikipedia.org/wiki/List_of_largest_cities


Hey. That's my network name.


Funnily enough, weed is more legal in the US than in Norway.


Not federally it isn't.


They stopped giving a F and started to give a S (lots of it)


I think any of the "off the shelf" Gotek emulators should suffice for this. They're made for people to keep playing games on old hardware, and I would assume copy protection and other shenanigans would be the crème de la crème of abusing the hardware.

This is to get rid of the media only. You'll still be using the original compute hardware. But it would be an interesting step.

I feel that most of the desire to upgrade is cultural rather than technical. People love to talk about the floppies being used, while that's just a small part of the equation. The cost and risk of creating a new system with the same reliability expectations are hard to justify when the incumbent has decades of iteration behind it. For systems that do not require more performance or energy efficiency, the accounting on upgrading looks very different.


So less than 10x already.

Question: how familiar are you with the technologies being used? My experience so far is that AI has been useful for things I don't have a good understanding of, but when I do, it's a different ball game, mostly because coding it directly seems faster since I know exactly the behaviour I am looking for and I am not having to deal with unintended consequences.

I see it as the Cheshire Cat thing from Alice in Wonderland: when you don't know where you're going, any road will take you there. So it's been great for exploratory work and prototyping.


Yeah, I'm very familiar with the tech, I've been interested in games dev and web dev for a few decades now. So you could be right, that the models aren't ready to "play on their own" yet.

I tried doing a Warcraft 1 clone, but that felt too complex for the model being used (OpenAI 4.1). That model was just the default setting in Copilot.

I dug a little deeper this morning, and it turns out I hadn't actually enabled my Copilot 'pro' mode, which grants access to some more current or dev-focused models. So I'll take them for a spin to see what they're capable of.

My goal here is to roughly get a sense for when a task is too complex for an "agent" to handle.

I also want to try adding custom tools to suit certain project needs. For example, Unreal Engine has a Python editor interface, so I'd like to have an agent drive the editor to build something. I have my doubts.
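For reference, driving the editor through that interface looks roughly like this (a minimal sketch, assuming the Python Editor Script Plugin is enabled; library names like EditorLevelLibrary have shifted a bit between engine versions, and the actor label is just a hypothetical example):

    # Runs inside the Unreal editor's embedded Python, not a standalone interpreter.
    import unreal

    # Spawn a bare StaticMeshActor at the origin of the currently open level.
    actor = unreal.EditorLevelLibrary.spawn_actor_from_class(
        unreal.StaticMeshActor, unreal.Vector(0.0, 0.0, 0.0))
    actor.set_actor_label('AgentSpawnedActor')  # hypothetical name

    # Enumerate the level's actors so an agent could inspect the current state.
    for a in unreal.EditorLevelLibrary.get_all_level_actors():
        unreal.log(a.get_name())

An agent that can emit snippets like this and read back the log could, in principle, iterate on a level without ever touching the UI.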

Once I have a feeling for what level of complexity can be handled, I'll see if I can manage the tools better using this understanding, by breaking large and complex projects into appropriate chunks of work / complexity.

/endBrainDump


What used to take a week now can be done in just 5 days.


My guess is that the discussion trended around performance rather than correctness because compilers are pretty well understood. Why LLMs output what they do is not understood by anyone to the same degree.


Whoosh noises, flashback bells...

In the future no one will have to code. We'll compile the business case from UML diagrams!


Only market share will change that. Proton making Linux gaming this good is an incredible Trojan horse for a more open OS market. The more people use it, the more incentive companies have to support it entirely, not only by making the game run well on the compatibility layer but by making it native.

My pipe dream is that Proton becomes so successful that it kills itself by making Linux gaming profitable enough to be first-party supported. My long-held wish of having a decent gaming and development environment will finally come true, just in time for me to not care so much about one or the other.

