June 30th, 2024

WhatsApp Android beta reveals Llama 3 405B option

WhatsApp is updating to version 2.24.14.7 on Google Play Beta, adding an option to choose between Meta AI Llama models. Users will be able to pick the model that best fits their needs for a tailored AI experience.


WhatsApp is rolling out version 2.24.14.7 through the Google Play Beta Program. The update introduces a setting that will let users choose which Meta AI Llama model handles their requests. The feature is still under development and not yet available to beta testers. It offers a choice between the default Llama 3-70B, tuned for quick prompts and faster responses, and the upcoming Llama 3-405B for more complex queries. Users will be able to pick the model that best suits their needs, trading response speed against capability, and gaining more control over their AI interactions. Stay tuned for further updates on the release of this feature.
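
As a rough illustration of the kind of model choice described above: the model tiers below mirror the article, but the client class, method names, and routing logic are all hypothetical, not WhatsApp's or Meta's actual implementation.

```python
from dataclasses import dataclass

# Hypothetical sketch of client-side model selection; only the model names
# come from the article, everything else is an assumption for illustration.
MODELS = {
    "fast": "llama-3-70b",       # default: quicker responses
    "advanced": "llama-3-405b",  # opt-in: more capable, slower
}

@dataclass
class MetaAIClient:
    preferred_tier: str = "fast"

    def choose_model(self, override: str | None = None) -> str:
        """Return the model ID for this request, honoring a per-request override."""
        tier = override or self.preferred_tier
        return MODELS.get(tier, MODELS["fast"])

    def ask(self, prompt: str, override: str | None = None) -> str:
        model = self.choose_model(override)
        # A real client would call a backend service here; this only shows routing.
        return f"[{model}] would answer: {prompt!r}"

client = MetaAIClient(preferred_tier="fast")
print(client.ask("What's the weather like?"))                   # quick prompt -> 70B
print(client.ask("Plan a two-week trip", override="advanced"))  # complex query -> 405B
```

The point of the sketch is simply that the default tier covers quick prompts, while the larger model is only used when the user explicitly opts in.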

5 comments
By @paxys - 5 months
It's weird to me that Llama exists on WhatsApp only as a fully independent chatbot and the app keeps assuring you that it can't read any of your other messages. Reading my other messages is exactly what I want it to do! Help me make sense of long group threads. Summarize plans/events/dates. Remind me when something important comes up that needs my attention. Pull up info from old threads that might be relevant for me.

So much of everyone's life is happening in messaging apps, and it's a perfect surface for "AI" to actually help. Yet every big player keeps pushing their "generate AI art from a prompt" feature front and center as if anyone really cares.

By @wkat4242 - 5 months
This will definitely not be running locally on your phone, that's for sure. Both 70B and 405B are way too big for that.

8B would be feasible given enough RAM, but probably a bit too slow for most users' liking, which I guess is why they don't even offer a local option (rough memory numbers below). And when you go cloud, the bigger models will be better, of course.

I wonder if Meta will use this to finally monetise WhatsApp.
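
To make the size claims in this comment concrete, here is a minimal weights-only memory estimate. The 16-bit and 4-bit quantization levels are assumptions, not confirmed deployment details, and real usage adds KV cache and runtime overhead on top.

```python
# Rough memory needed just for model weights at different quantization levels.
# Approximations only: weights, no KV cache, no runtime overhead.
def weight_memory_gb(params_billion: float, bits_per_weight: int) -> float:
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9  # decimal GB

for params in (8, 70, 405):
    for bits in (16, 4):
        print(f"{params:>3}B @ {bits:>2}-bit: ~{weight_memory_gb(params, bits):,.0f} GB")
```

Even at 4-bit, 70B needs roughly 35 GB and 405B around 200 GB for weights alone, far beyond any phone, while 8B at about 4 GB is borderline feasible on a high-RAM device but likely slow, which matches the comment's reasoning.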

By @OutOfHere - 5 months
Is this going to beat both GPT-4 Turbo and GPT-4o in benchmarks?
By @Gualdrapo - 5 months
Meanwhile they made it almost impossible to restore a local chat backup. Yay!