I have a new micro-project.

It lets you use a venice.ai lifetime Pro account with local apps that communicate over the Ollama API (open-webui, or continue.dev in your VS Code / JetBrains IDE).
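
For illustration, here's a minimal sketch of what talking to it over the Ollama chat API looks like. The base URL (Ollama's default port 11434) and the model name are assumptions, not necessarily what the proxy exposes; adjust them to your setup.

```python
# Minimal sketch: any app that speaks the Ollama API can point at the proxy.
# The base URL and model name below are assumptions (Ollama's defaults);
# substitute whatever the proxy actually exposes.
import requests

OLLAMA_BASE = "http://localhost:11434"  # assumed Ollama-compatible endpoint

resp = requests.post(
    f"{OLLAMA_BASE}/api/chat",
    json={
        "model": "llama3.1:405b",  # assumed model name
        "messages": [
            {"role": "user", "content": "Write a one-liner that reverses a string."}
        ],
        "stream": False,  # single JSON response instead of a stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["message"]["content"])
```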

Check out the demo video; it's quite nice.

I can also provide a pro account if you don't want to fiddle with MOR tokens.

I started doing this because llama-3.1-405b is a really great model (better than ChatGPT for many coding tasks, I think), but I can't run it locally on my laptop.

With this, I have everything set up to use it from my local apps, with the best open-source model available today.

https://pay.cypherpunk.today/apps/26zEBNn6FGAkzvVVuDMz3SXrKJLU/crowdfund