I might add support for other model providers (like Google and Azure) in the future. That said, I'm trying to keep the project's scope small because that makes it easier for me to maintain.
Anyway, I think adding new LLM providers is pretty straightforward if you want to do it yourself. You just need to implement the BaseLLM interface (see `cogitator/model/base.py`) for your provider. After that, you use it just like you'd use OllamaLLM or OpenAILLM.
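To make that concrete, here's a minimal sketch of what a custom provider might look like. Note this is an illustration only: the stand-in `BaseLLM` below (with a single `generate` method) and the `EchoLLM` class are hypothetical; the real abstract methods are whatever `cogitator/model/base.py` actually defines.

```python
from abc import ABC, abstractmethod


# Hypothetical stand-in for cogitator's BaseLLM; the real interface
# in cogitator/model/base.py may define different/additional methods.
class BaseLLM(ABC):
    @abstractmethod
    def generate(self, prompt: str) -> str:
        """Return the model's completion for a prompt."""


# Toy provider: a real one would call the provider's API client here
# (e.g. Google's or Azure's SDK) instead of echoing the prompt back.
class EchoLLM(BaseLLM):
    def __init__(self, prefix: str = "echo: ") -> None:
        self.prefix = prefix

    def generate(self, prompt: str) -> str:
        return self.prefix + prompt


llm = EchoLLM()
print(llm.generate("hello"))  # prints "echo: hello"
```

Once the subclass satisfies the base interface, it should be usable anywhere the library accepts an OllamaLLM or OpenAILLM instance.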
Sounds like a great idea. In the next release, I'll add a visualization of how the pieces work and relate to each other, and possibly some benchmark results as well.
Also, are you using this tool as part of another project? It'd be interesting to see what the main applications of CoT prompting are (the examples are great, but a little basic).
I'm not using it in a larger project at the moment. The examples are mainly there to help people get started quickly. As for applications, they're somewhat context-dependent, but I might add one or two larger examples later if I find the time.
I'm developing an open-source Rust library called Graphina for graph data science. The library is in a very early stage of development, so it may not be ready for serious use yet. However, early versions of many core features and algorithms have already been implemented.
I'm announcing the project here to invite contributions and suggestions from the community. I'm still relatively new to Rust, so I would especially appreciate any constructive feedback you might have to help improve Graphina.
Did you bother searching for the name you chose? Or consider the consequences of not doing so (https://softwareengineering.stackexchange.com/questions/6943...), or, maybe most importantly, avoid confusing the very people you're trying to get attention from?
Because you're coming off like you really just don't care about this project and don't have any professional standards.