I would not use such strong rhetoric as the GP, but I believe they probably mean we should lean towards using the Gall-Peters projection, which preserves relative areas, but not angles (and not lengths either, strictly speaking).
(There are of course other projections with other interesting features; or you could take the same projection but center the world differently etc.)
Why? Why are lengths and areas more important than angles? You have to choose one; it's essentially arbitrary. Personally I find it more useful to know what is parallel to what, and what is at which angle to what, than some size. We have globes, so we know what the "real size" of Greenland looks like... this has always been a silly argument from the overzealous online looking to right wrongs that don't exist.
> Why is lengths and areas more important than angles?
Well, of course the answer is "it depends on what it is you want to learn from the map." If you're driving around and want to navigate, you'll probably take Mercator. But if you want to compare sizes of objects (like lakes or forests or islands or world states), especially when zoomed out, you'll prefer Gall-Peters.
Many argue, and I tend to agree, that when looking at a map of the whole world, you are typically better served with Gall-Peters in terms of what your interest is, and in fact, people _do_ use Mercator maps to semi-consciously compare sizes of things - and have false impressions about geo-politics because of it.
This comment is inaccurate! Web Mercator causes such large errors in geolocation that the NGA had to issue an advisory about it [1].
There is a whole science behind map projections and Google ignored it entirely when they created Web Mercator, which was a hack to divide the world into a quad tree. It was vaguely clever and utterly stupid at the same time.
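For context on the "quad tree" remark: the Web Mercator tiling scheme slices the world into a 2^z × 2^z grid of square tiles at each zoom level z, so every tile splits into four children at the next zoom. A minimal sketch of the standard "slippy map" tile math (this is the conventional EPSG:3857 formula, not anything specific to the comment above):

```python
import math

def latlon_to_tile(lat_deg: float, lon_deg: float, zoom: int) -> tuple[int, int]:
    """Map a WGS84 lat/lon to Web Mercator (EPSG:3857) tile indices.

    At zoom level z the world is a 2^z x 2^z grid of tiles; each tile
    subdivides into 4 at zoom z+1, which is what makes it a quad tree.
    """
    n = 2 ** zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    # asinh(tan(lat)) is the Mercator y-coordinate in radians.
    y = int((1.0 - math.asinh(math.tan(math.radians(lat_deg))) / math.pi) / 2.0 * n)
    return x, y
```

At zoom 0 everything lands on tile (0, 0); the origin (0°, 0°) falls on tile (1, 1) at zoom 1, just past the grid's center lines. Note the formula treats the Earth as a sphere, which is exactly the simplification the NGA advisory was about.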
Hi - I understand you feel strongly; your Web Mercator input is interesting. I would just focus on the intellectually interesting part - people might not get it; you can't control that or compel them to.
You've been repeating essentially the same comment, writing in all caps (in some comments), complaining about downvotes, telling everyone they are idiots one way or another. None of those things are likely to be welcome.
yes doesn't do string concatenation, at least not in the loop that matters. It just prepares a buffer of bytes once and writes it to stdout repeatedly.
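The trick described above can be sketched in a few lines: build the output buffer once, then hand the same bytes to the kernel over and over. A hypothetical Python imitation of GNU `yes` (the buffer size and helper names are illustrative, not from the actual coreutils source):

```python
import os
import sys

def make_buffer(word: str = "y", bufsize: int = 64 * 1024) -> bytes:
    """Prepare the output buffer exactly once: as many whole
    'word\\n' lines as fit in bufsize bytes."""
    line = (word + "\n").encode()
    return line * (bufsize // len(line))

def fast_yes(word: str = "y") -> None:
    """Write the prebuilt buffer to stdout forever -- no string
    concatenation happens inside the hot loop."""
    buf = make_buffer(word)
    fd = sys.stdout.fileno()
    while True:
        os.write(fd, buf)
```

The per-iteration cost is just one `write(2)` syscall on a large buffer, which is why this kind of `yes` saturates a pipe far faster than a naive `print` loop.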
I usually do a Kubernetes cluster on top of VMs. But sometimes when I really want to scale, the standard cloud serverless platforms all support /dev/null out of the box. (Except for Windows...)
Still need an adapter library though! Fortunately there are about 7 competing implementations on npm and most of them only have 5-6 transitive dependencies.
I'm working to make private hosting easier. I've been running a software development agency in Melbourne for 10+ years and have been building this platform in the background to help automate and standardise the hosting needs for our clients.
We're now getting ready to launch a web portal for others to manage their own private hosting in a simpler way. The product also includes a directory of off-the-shelf applications which can be launched in a few clicks (eg. Deepseek chatbot).
If you're interested in being part of our closed beta in March, reach out! (e: james at below domain)
Exactly, is OP really the one who should be influencing others' preferences? Most couldn't give two shits about the technological perils that await them just over the horizon - and should you really be the one to inform them of those horrors? Just relax - and embrace the book of faces. The movement will be swift and relatively painless, mostly.
I think you've managed to hit the nail on the head.
Rather than accept the rather human nature of people writing in their own style and making mistakes, we'd prefer to filter it through a dispassionate void first.
It's rather embarrassing how quickly we're willing to toss away the human elements of writing.
Agreed. LLM writing style is disgustingly bland and "offensively inoffensive" like Corporate Memphis. Would rather have actual human style, mistakes and all.
Comments like this one are so predictable and incredulous. As if the current state of the art is the final form of this technology. This is just getting started. Big facepalm.
Have you already noticed the trend of image search results for porn containing inferior AI slop porn?
I have. It sucks. The world we're headed for maybe isn't one we actually wind up wanting in the end.
I like the idea of increasingly advanced video models as a technologist, but in practice, I'm noticing slop and I don't like it. Having grown up on porn, when video models are in my hands, the addiction steers me in the direction of only using the technology to generate it. That's a slot machine so addictive it's akin to the leap from the dirty magazines of old to the world of internet porn I witnessed growing up. So, porn addiction on steroids. I found it eventually damaging enough to my mental health that I sold my 4090. I'm a lot better off now.
The nerd in me absolutely loves Generative models from a technology perspective, but just like the era of social media before it, it's a double edged sword.
No, I'm providing a personal anecdote that some members of society that do have, or may develop, the same or similar problems are having both the (perceived) good and the bad aspects of those problems seriously magnified by this technology. This can have personal consequences, but also the consequences can affect the lives of others.
Hence, a certain % of the population will be negatively affected by this. I personally think it's worth raising awareness of.
I hope they're right. If the technology improves to such a degree that meaningful content can be produced then it could spell global disaster for a number of reasons.
Also I just don't want to live in a world where the things we watch just aren't real. I want to be able to trust what I see, and see the human-ness in it. I'm aware that these things can co-exist, but I'm also becoming increasingly aware that as long as this technology is available and in development, it will be used for deception.
That's exactly what I mean, all of those methods take some human effort, there is a human involved in the process. Now we face a reality that it might take no human effort to do... well, anything. Which is terrifying to me.
I do believe that humans are restless, and even when there is no longer any point to create, and it is far easier to dictate, we still will, just because we are too driven not to.
You know, there are still offline art forms like concerts, theater, opera, installations, etc., so I wouldn't see it that negatively. And we have nearly 100 years of music and film we can enjoy. So maybe video is a dying art form for humans to act in, but there is so much more.
The most predictable comment is yours, especially since you completely missed the point of the original comment which had nothing to do with the video quality.
It's interesting. They have had plenty of time and resources available to mount solid competition. Why haven't they? Is it a talent hiring problem, or some more fundamental problem with their engineering processes? The writing has been on the wall for GPGPU for more than 10 years. Definitely enough time to catch up.
It's a commitment problem IMO. Nvidia stuck with CUDA for a long time before it started paying back what it cost. AMD and Intel have both launched and killed initiatives a couple of times each, abandoning them within a few years because adoption didn't happen overnight.
If you need people to abandon an ecosystem that's been developed steadily over nearly 20 years for your shiny new thing in order to keep it around, you'll never compete.