I could see this being incredible if it shipped with a set of performance-related queries, or ran EXPLAIN ANALYZE and offered some interpreted results.
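Roughly what I have in mind, as a minimal sketch: run EXPLAIN (ANALYZE) for a query, pull the headline numbers out of the JSON plan, and hand those to whatever does the interpretation. Assumes Postgres with psycopg2; the DSN, table, and function names here are just placeholders, not anything this tool actually exposes.

```python
# Sketch only: run EXPLAIN ANALYZE and extract the numbers you'd want
# interpreted (actual vs. estimated rows, total time). Assumes Postgres
# and psycopg2; DSN and query below are illustrative placeholders.
import psycopg2


def explain_analyze(dsn: str, query: str) -> dict:
    """Return the JSON plan produced by EXPLAIN (ANALYZE, BUFFERS)."""
    with psycopg2.connect(dsn) as conn:
        with conn.cursor() as cur:
            cur.execute(f"EXPLAIN (ANALYZE, BUFFERS, FORMAT JSON) {query}")
            # FORMAT JSON yields one row with a single json column;
            # psycopg2 parses it into a list containing one plan dict.
            return cur.fetchone()[0][0]


def summarize(plan: dict) -> str:
    """Pick out the headline numbers a human or model would interpret."""
    root = plan["Plan"]
    return (
        f"node={root['Node Type']} "
        f"actual_time={root['Actual Total Time']}ms "
        f"rows={root['Actual Rows']} (estimated {root['Plan Rows']})"
    )


if __name__ == "__main__":
    plan = explain_analyze("dbname=test", "SELECT * FROM orders WHERE total > 100")
    print(summarize(plan))
```

Even just surfacing "estimated 100 rows, got 2 million" from that output would cover a lot of real-world performance triage.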
Can this be run fully locally with a local LLM?