Meta has released Llama 3.1. It seems to be a significant improvement over an already quite good model. It is now multilingual, has a 128k context window, has some form of tool-calling support and, overall, performs better on benchmarks than its predecessor.

With this release, they also introduced a 405B-parameter model, along with updated 70B and 8B versions.

I’ve been using the 3.0 version and was already satisfied, so I’m excited to try this.

  • BazsalanszkyOPA
    24 months ago

    Are you using Mistral 7B?

    I also really like that model and its fine-tunes. If licensing is a concern, it’s definitely a great choice.

    Mistral also has a new model, Mistral NeMo. I haven’t tried it myself, but I’ve heard it’s quite good. It’s also licensed under Apache 2.0, as far as I know.