Recently I have been using Perplexity AI. It was working great, but they have now reduced the token limit, which has badly degraded the quality of its output.
I am also a Perplexity Pro user, and I am facing quality issues with the responses their supported models generate.
Are there any online LLM models (the project as a whole) which I can deploy as self-hosted?
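
To be clear about what I mean by "self-hosted": ideally something I can run on my own machine and query the same way I would query a hosted API. A rough sketch of the setup I have in mind is below, assuming an Ollama-style OpenAI-compatible endpoint on localhost; the URL, key, and model name are just placeholders, not a specific recommendation.

```python
# Sketch: an open-weight model served locally behind an OpenAI-compatible
# endpoint (e.g. Ollama, vLLM, or llama.cpp's server). The base_url,
# api_key, and model name are assumptions for illustration only.
from openai import OpenAI

# Ollama's default local port; vLLM and llama.cpp servers expose a similar /v1 API.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="not-needed")

response = client.chat.completions.create(
    model="llama3",  # whichever open-weight model is pulled/served locally
    messages=[{"role": "user", "content": "Give me a short summary of vector databases."}],
)
print(response.choices[0].message.content)
```

So essentially: which open projects or model stacks would let me replace the hosted service with something like the above?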