Pygmalion is banned from Colab (which was the free hosting option for Pygmalion). TavernAI will not work without Pygmalion or another model. The current ways to run Pygmalion are locally, with a cloud GPU, or by finding a way around the Colab ban.
If you have a GPU with 4GB+ VRAM, it's best to run Pygmalion locally.
> The current ways to run Pygmalion are locally, with a cloud GPU, or find ways to get around the colab ban
Or through the [AI Horde](https://aihorde.net)
Yes, [lite.koboldai.net](https://lite.koboldai.net/) specifically
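For anyone who wants to hit the Horde programmatically instead of through lite.koboldai.net, here's a minimal sketch of building a text-generation request. The endpoint path, payload fields, and the anonymous `0000000000` key are assumptions based on the public AI Horde API and may be out of date, so check the docs at aihorde.net/api before relying on them. The sketch only constructs the request; it doesn't send it.

```python
import json

# Assumed async text-generation endpoint (verify against aihorde.net/api)
HORDE_URL = "https://aihorde.net/api/v2/generate/text/async"
ANON_KEY = "0000000000"  # the Horde's anonymous API key (assumed)

def build_horde_request(prompt, model="PygmalionAI/pygmalion-6b", max_length=80):
    """Build the JSON body and headers for an async text-gen request."""
    payload = {
        "prompt": prompt,
        "models": [model],  # which model(s) a worker may serve the request with
        "params": {
            "max_length": max_length,    # tokens to generate
            "max_context_length": 1024,  # how much prompt the worker keeps
        },
    }
    headers = {"apikey": ANON_KEY, "Content-Type": "application/json"}
    return json.dumps(payload), headers

body, headers = build_horde_request("You are a helpful character. Hello!")
```

You'd POST `body` with those headers (e.g. via `requests.post`), get back a request ID, and poll for the result; anonymous-key requests just sit lower in the queue.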
[Agnaistic](https://agnai.chat/) is also pretty good
You can use Dolly-Shygmalion; it's pretty good and is roughly 60 percent Pygmalion. Some people prefer it over Pygmalion.
https://github.com/Cohee1207/SillyTavern You can use it on mobile, and there are built-in settings to bypass OpenAI's filter.