Run a local open-source LLM from Orchestrator - like Jan.ai
I want the ability to run a local LLM (without internet access) and leverage it in Orchestrator for end-user impact.
Think Jan.ai, which allows the end user to run Mistral, Llama 2, TinyLlama, and the like.
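For context on how this could plug in: Jan exposes an OpenAI-compatible API on localhost (port 1337 by default), so an Orchestrator-side integration could talk to it the same way it would talk to a hosted model. Below is a minimal sketch of that idea; the port, endpoint path, and model name (`mistral-ins-7b-q4`) are assumptions based on Jan's defaults, not a confirmed Orchestrator feature.

```python
import json
import urllib.request

# Assumed Jan default: an OpenAI-compatible chat endpoint on localhost.
JAN_URL = "http://localhost:1337/v1/chat/completions"


def build_request(prompt, model="mistral-ins-7b-q4"):
    """Build an OpenAI-style chat completion request for the local server.

    The model name is whatever model the user has loaded in Jan; the
    one above is only an example.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    return urllib.request.Request(
        JAN_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )


def ask_local_llm(prompt):
    """Send the prompt to the locally running model; no internet required."""
    with urllib.request.urlopen(build_request(prompt)) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Because the request shape is the standard OpenAI chat format, the same call would work against any local server that speaks that protocol, not just Jan.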