llama.cpp now has a web interface

github.com — Simple webchat for server by tobi · Pull Request #1998 · ggerganov/llama.cpp
I put together a simple web chat that demonstrates how to use the SSE(ish) streaming in the server example. I also went ahead and served it from the root URL, to make the server a bit more approach...
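For context on what "SSE(ish) streaming" means here: the llama.cpp server example streams completion chunks as Server-Sent-Events-style lines of the form `data: {json}`. The sketch below is a minimal, hypothetical parser for that kind of stream; the exact JSON fields (`content`, `stop`) are assumptions about the payload shape, not a definitive spec of the server's output.

```python
import json

def parse_sse_stream(raw: str) -> list[dict]:
    """Collect JSON payloads from SSE-style `data: {...}` lines.

    Assumes each event is a single line prefixed with `data: `,
    roughly how the llama.cpp server example streams chunks.
    """
    events = []
    for line in raw.splitlines():
        line = line.strip()
        if line.startswith("data: "):
            events.append(json.loads(line[len("data: "):]))
    return events

# Hypothetical sample of what a streamed completion might look like:
sample = (
    'data: {"content": "Hello", "stop": false}\n'
    '\n'
    'data: {"content": " world", "stop": true}\n'
)
chunks = parse_sse_stream(sample)
text = "".join(c["content"] for c in chunks)  # -> "Hello world"
```

A browser client would do the same thing incrementally, appending each `content` fragment to the page as it arrives and stopping when a chunk reports `stop: true`.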

[comments | sourced from HackerNews]