I frequently see news stories where attorneys get in trouble for using LLMs because they cited hallucinated case law. If they hadn't been caught, that would look the same as using them "productively".
Asking the LLM for relevant case law and checking up on it - productive use of LLM. Asking the LLM to write your argument for you and not checking it - unproductive use of LLM. It's the same as with programming.
>Asking the LLM for relevant case law and checking up on it - productive use of LLM
That's a terrible use for an LLM. There are several deterministic search engines attorneys use to find relevant case law, where you don't have to check to see if the cases actually exist after it produces results. Plus, the actual text of the case is usually very important, and isn't available if you're using an LLM.
Which isn't to say they're not useful for attorneys. I've had success getting them to do some secretarial and administrative things. But for the core of what attorneys do, they're not great.
For law firms, building their own repositories of case law, having LLMs search via summaries, and then diving into the selected cases to extract pertinent information seems like an obvious use case for an LLM-based solution.
The orchestration of LLMs that will be reading transcripts, reading emails, reading case law, and preparing briefs with sources is unavoidable in the next 3 years. I don't doubt that multiple industry-specialized solutions are already under development.
Just asking chatGPT to make your case for you is missing the opportunity.
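To make the summaries-first, full-text-second flow concrete, here's a minimal runnable sketch. Everything in it is hypothetical: the keyword-overlap scorer stands in for an LLM ranking/extraction step (so the example runs without any API), and the case IDs and texts are invented.

```python
# Toy two-stage retrieval: rank cheap case summaries first, then pull
# full text only for the top matches and extract pertinent passages.
# A real system would use an LLM or embeddings where score() is used.

def score(query_terms, text):
    """Count how many query terms appear in the text (LLM stand-in)."""
    words = set(text.lower().split())
    return sum(1 for t in query_terms if t in words)

def search_then_extract(query, summaries, full_texts, top_k=2):
    terms = [t.lower() for t in query.split()]
    # Stage 1: rank short summaries, keep only the top_k case IDs.
    ranked = sorted(summaries, key=lambda cid: score(terms, summaries[cid]),
                    reverse=True)
    selected = ranked[:top_k]
    # Stage 2: dive into the full text of the selected cases and keep
    # only the sentences that mention any query term.
    results = {}
    for cid in selected:
        sentences = full_texts[cid].split(". ")
        results[cid] = [s for s in sentences if score(terms, s) > 0]
    return results
```

The point of the split is cost: the expensive step (reading full opinions) only runs on the handful of cases the cheap summary pass surfaced.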
If anyone is unable to get Claude 3.7 or Gemini 2.5 to accelerate their development work, I have to doubt their sentience at this point. (Or, more likely, doubt that they're actively testing these things regularly.)
Law firms don't create their own repos of case law. They use a database like Westlaw or Lexis. LLMs "preparing briefs with sources" would be a disaster and wholly misunderstands what legal writing entails.
I find it very useful to review the output and consider its suggestions.
I don’t trust it blindly, and I often don’t use most of what it suggests; but I do apply critical thinking to evaluate what might be useful.
The simplest example is using it as a reverse dictionary. If I know there’s a word for a concept, I’ll ask an LLM. When I read the response, I either recognize the word or verify it using a regular dictionary.
I think a lot of the contention in these discussions is because people are using it for different purposes: it's unreliable for some purposes and it is excellent at others.
> Asking the LLM for relevant case law and checking up on it - productive use of LLM.
Only if you're okay with it missing stuff. If I hired a lawyer, and they used a magic robot rather than doing proper research, and thus missed relevant information, and this later came to light, I'd be going after them for malpractice, tbh.
Surely this was meant ironically, right? You must've heard of at least one of the many cases involving lawyers doing precisely what you described and ending up presenting made up legal cases in court. Guess how that worked out for them.
The uses that they cited to me were "additional pair of eyes in reviewing contracts," and, "deep research to get started on providing a detailed overview of a legal topic."
Google has gradually been moving things from AOSP (open) to GApps (closed). Some of the things that have been moved are fairly essential for a mobile operating system (like a location provider and an SMS app). Projects building on AOSP now have to provide replacements, or declare them out of scope and punt.
Regarding the Anonym acquisition, every adtech acquisition is a reverse-acquisition. Google didn't really go evil until they got reverse-acquired by DoubleClick.
It's true that Apple has the only independent browser engine that has enough users to make developers cater to it. But it's also true that Mozilla has seats on the relevant standards bodies, and on privacy-related issues, their presence helps act as a counterweight along with Apple.