These cameras may have been originally sold to municipalities as a way to find stolen cars, but from one year to the next, federal agencies have decided (1) that their main goal is finding arbitrary noncitizens to deport, and (2) that they're entitled to the ALPR data collected by municipalities in order to accomplish this goal. The technology isn't any different, but because of how it was deployed (on Flock's centralized platform), it was trivial to flip a switch and turn it into a mass surveillance network.
> decided that their main goal is finding arbitrary noncitizens to deport
In the vast majority of cases this means: "enforcing immigration law." A presidential administration that deems it politically expedient to admit illegal immigrants by turning a blind eye doesn't change the law of the land.
> that they're entitled to the ALPR data collected by municipalities in order to accomplish this goal
"Entitled" to purchase something that is being sold on the market for a fair price? Why wouldn't they be entitled to purchase this info if a vendor wishes to sell it to them?
Maybe, but I don't think there's much evidence that cameras with sharing disabled were getting pulled by DHS, and I think, because of how the cameras work, it would be a big deal if they had. Flock also has extreme incentives not to let that happen. We'll see, I guess: contra the takes on threads like this, I don't think the cameras are going anywhere any time soon. I think small progressive and libertarian enclaves will get rid of their cameras while remaining landlocked in a sea of municipalities expanding theirs.
> I think small progressive and libertarian enclaves will get rid of their cameras while remaining landlocked in a sea of municipalities expanding theirs.
Flock will just start putting cameras up on private property and selling the data to the Federal government. Municipalities can do very little to stop this, and local governments are pretty poor at keeping their true reasons out of public forum deliberation. Loophole methods of prohibition ("Can't put up camera masts") are easily thwarted in court.
I think this is equally true of writing. Once you see something written one way, it's very hard to imagine other ways of writing the same thing. The influence of anchoring bias is quite strong.
A strong editor is able to overcome this anchoring bias, imagine alternative approaches to the same problem, and evaluate them against each other. This is not easy and requires experience and practice. I am starting to think that a lot of people who "co-write" with ChatGPT are seriously overestimating their own editing skills.
I sympathize with people who find writing difficult. But, putting myself in GP's shoes, I can't imagine trying to read my father's LLM-generated memoir. How could I possibly understand it as _his_ work? I would be sure that he gave the LLM some amount of information that would render it technically unique, but there's no way I could hear his voice in words that he didn't choose.
If you're writing something for an audience of one, literally nothing matters more than the connection between you and the reader. As someone with a father who's getting on in years, even imagining this scenario is pretty depressing.
- how many people who currently work at your company had to go through an AI interviewer to get the job?
- do referrals have to go through an AI interviewer too?
To me, this just smacks of a tool that increases the cost of cold-submitting your resume so companies can optimize for "preferred" hiring paths like internal referrals.
Fan/AC remotes have some of the worst, most unexpected UX I've ever seen. I still remember an AC remote at an Airbnb I stayed at a few years ago that, instead of having temperature control buttons labeled "up"/"down" or "hotter"/"colder", had two buttons labeled "too hot" and "too cold". The "too hot" button decreased the temperature setting on the thermostat and the "too cold" button increased it.
This labeling makes some sense if you stop to think about it, but I can't shake the thought that this was the first time I encountered an interface that asked me to describe how I was currently feeling, rather than letting me tell it what I wanted it to do.
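To make the inversion concrete, here's a minimal sketch of that mapping; the function name and the one-degree step are my own assumptions, not anything from the actual remote:

```python
# Hypothetical sketch: the button label describes the user's state,
# so the action taken is the opposite of what the label names.
STEP_C = 1.0  # assumed setpoint change per press

def press(button: str, setpoint_c: float) -> float:
    """Return the new thermostat setpoint after a button press."""
    if button == "too hot":
        return setpoint_c - STEP_C  # user is hot -> cool the room
    if button == "too cold":
        return setpoint_c + STEP_C  # user is cold -> warm the room
    return setpoint_c               # unknown button: no change
```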
The author is pretty upfront that the conclusion was meant as a reference to a character in the movie "Hoosiers", not that any of the personalities named were literally drunks or bad fathers.
Idk, I feel like they're doing the thing where you list people you don't like, then list other, worse people and kind of imply the first set is related to / just as bad as the second set.
It's a motte and bailey where if people accuse you of doing that you retreat to saying "no see they're separate lists".
Maybe the author just forgot which three interchangeable white guys he named at the top of the article, up there with the DEI topic that he broached but then never came back to. Maybe he thought he had written "Elon Musk", about whom the bad father parallel is a little easier to insinuate from the public record.
Or maybe whenever he reads a headline about a billionaire, he just files it under one golem in his head called Zuckermuskezosdriessen. A golem which also includes James Damore (???).
After all, we're dealing with someone who writes sentences like, "the vast majority of your fellow students were men, and they were more or less all the same person as you." This is not an author who sees two people of the same demographic as separate individuals whose sins need to be litigated individually. If Musk is a bad father, what should it matter that Zuck seems to be a fine one?
I also recently had this experience! I remembered a recurring bit from an older comedy film (a customer in a shop keeps saying "Kumquats!") and tried to prompt ChatGPT 4o into identifying it. It made a few incorrect guesses, such as "It's a Mad, Mad, Mad, Mad World" (which I had to rule out by doing my own research on Google). I found the answer myself (W.C. Fields' "It's a Gift") with a minute or so of Googling.
Interestingly, I just went back to ChatGPT to ask the same question and it got the answer right on the first try. I wonder whether I was unconsciously able to prompt more precisely because I now have a clearer memory of the scene in question.
Notably, the jury pool was from North Dakota, a state where many people have direct or indirect ties to the oil and gas industry. In fact:
> During jury selection, potential jurors appeared to largely dislike the protests, and many had ties to the fossil fuel industry. In the end, more than half the jurors selected to hear the case had ties to the fossil fuel industry, and most had negative views of anti-pipeline protests or groups that oppose the use of fossil fuels.
You mean conspiring to cause immense economic harm to an already impoverished part of the country will make your organization unpopular in that region? shocked pikachu
Playing economically broke locals against the protesters of a multinational energy giant gets a "shocked pikachu"? Is anyone reading this really that simple-minded? This seems like superficial or purposefully oversimplified feedback.
"You could have a model of Harvard Business School that is like:
1. Harvard Business School teaches you skills that would make you good at running a company.
2. There are lots of companies that could use those skills.
3. But you don’t want to run those companies, because they make, like, ball bearings.
4. You want to run a fancy company; you want to run a hedge fund or a tech startup or something.
5. Meanwhile, the people currently running the ball bearings company would not be all that excited about you, a fresh-faced business school graduate who has never run anything, coming in to run their company, even if you did learn a lot of useful skills at Harvard.
6. Therefore various industries exist whose principal business is laundering ball bearings companies into opportunities that appeal to Harvard Business School graduates. You wrap the ball bearings company in a name like “private equity” and suddenly it is legible to the Harvard students, so they flock to it.
7. Those industries are also in the business of getting the ball bearings companies to accept the Harvard Business School graduates, which in practice means not so much “make the ball bearings company excited about its new Harvard CEO” but rather “buy the ball bearings company and install new management.”"