Extending Context Window of Large Language Models via Positional Interpolation
Paper released by Meta a few days ago, detailing a method for extending the context or "memory" of an LLM up to 32k tokens. What's interesting is that they cite: https://kaiokendev.github.io/
This is a blog post by a guy who, working in his spare time, independently arrived at the same method; he calls it SuperHOT.
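For anyone curious about the core idea: positional interpolation rescales position indices so an extended context is squeezed into the position range the model was trained on, rather than extrapolating past it. Here's a minimal sketch of that scaling applied to rotary (RoPE) angles; the function name and the 2048→8192 lengths are just illustrative, not from either write-up.

```python
import numpy as np

def rope_angles(positions, dim=64, base=10000.0, scale=1.0):
    # Rotary position embedding angles: angle[m, i] = scale*m * base^(-2i/dim).
    # scale < 1 implements positional interpolation: positions are compressed
    # into the original training range instead of extrapolated beyond it.
    inv_freq = base ** (-np.arange(0, dim, 2) / dim)  # shape (dim/2,)
    return np.outer(positions * scale, inv_freq)      # shape (len(positions), dim/2)

train_len, extended_len = 2048, 8192  # hypothetical lengths
positions = np.arange(extended_len)

# With scale = train_len / extended_len, position 8191 maps to an
# effective position of ~2047.75, inside the range seen during training.
angles = rope_angles(positions, scale=train_len / extended_len)
```

After this rescaling, a short fine-tune is still needed so the model adapts to the compressed positions, but that's far cheaper than training for long context from scratch.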
It's really exciting how AI/machine learning can be advanced by relatively ordinary people putting in hard work, without the resources of Microsoft and the like.
I've gotten so much out of Terraria over the years, it makes me feel a bit guilty having paid less than £5 for it. Truly a great game.