June 29th, 2024

The super effectiveness of Pokémon embeddings using only raw JSON and images

Embeddings are vital in AI, and Pokémon data can be encoded into them for comparison. JSON data from a Pokémon API was optimized for encoding, and embeddings were generated for over 1,000 Pokémon. The resulting similarities revealed relationships based on type and generation, showcasing the effectiveness of embeddings in data analysis.

Embeddings, numerical vectors that represent an object's characteristics, are crucial in AI. Pokémon data was encoded into embeddings using a text model, allowing the Pokémon to be compared with one another. The process involved obtaining structured JSON data from a Pokémon API, optimizing it for encoding, and generating embeddings for over 1,000 Pokémon, which were saved in a tabular format for analysis. By calculating cosine similarities, relationships between Pokémon were explored: results showed similarities between Pokémon of the same type or generation, and Pikachu notably had high similarity with other Electric-type Pokémon. Further comparisons revealed patterns across different types and generations. The study highlighted the effectiveness of embeddings in representing and comparing complex data like Pokémon attributes, and showcased the potential of text embeddings for organizing and analyzing diverse datasets efficiently.
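
The end-to-end workflow can be sketched in a few lines: pull a Pokémon's JSON, strip and minify it into a single text string, encode it with a text embedding model, and compare the resulting vectors with cosine similarity. The model choice and field pruning below are illustrative assumptions, not the article's exact configuration.

    # Hypothetical sketch: embed minified Pokémon JSON and compare with cosine similarity.
    import json
    import requests
    from sentence_transformers import SentenceTransformer

    def fetch_pokemon(name: str) -> str:
        """Fetch a Pokémon's raw JSON from PokéAPI and minify it to one text string."""
        data = requests.get(f"https://pokeapi.co/api/v2/pokemon/{name}").json()
        for key in ("sprites", "game_indices", "moves"):  # drop bulky, low-signal fields (illustrative choice)
            data.pop(key, None)
        return json.dumps(data, separators=(",", ":"))

    model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed model, not necessarily the article's
    names = ["pikachu", "raichu", "charizard"]
    embeddings = model.encode([fetch_pokemon(n) for n in names], normalize_embeddings=True)

    # With unit-normalized vectors, cosine similarity reduces to a dot product.
    similarities = embeddings @ embeddings.T
    print(dict(zip(names, similarities[0])))  # Pikachu vs. each Pokémon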

Related

ESM3, EsmGFP, and EvolutionaryScale

EvolutionaryScale introduces ESM3, a language model simulating 500 million years of evolution. ESM3 designs proteins with atomic precision, including esmGFP, a novel fluorescent protein, showcasing its potential for innovative protein engineering.

Surprise, your data warehouse can RAG

A blog post by Maciej Gryka explores "Retrieval-Augmented Generation" (RAG) to enhance AI systems. It discusses building RAG pipelines, using text embeddings for data retrieval, and optimizing data infrastructure for effective implementation.

AI Revolutionized Protein Science, but Didn't End It

Artificial intelligence, exemplified by AlphaFold2 and its successor AlphaFold3, revolutionized protein science by predicting structures accurately. AI complements but doesn't replace traditional methods, emphasizing collaboration for deeper insights.

What's better: Neural nets wider with less layers or thinner with more layers

Experiments compared Transformer models with varying layer depths and widths. Optimal performance was achieved with a model featuring four layers and an embedding dimension of 1024. Balancing layer depth and width is crucial for efficiency and performance improvement.

JEPA (Joint Embedding Predictive Architecture)

Yann LeCun's Joint Embedding Predictive Architecture (JEPA) enhances AI by emphasizing world models, self-supervised learning, and abstract representations. JEPA predicts future states by transforming inputs into abstract representations, handling uncertainty, and enabling complex predictions through multistep or hierarchical structures. Several models like I-JEPA, MC-JEPA, and V-JEPA have been developed to process visual data and improve AI's understanding of images and videos, moving towards human-like interaction with the world.

13 comments
By @bc569a80a344f9c - 4 months
Very nice! This took me about 30 minutes to re-implement for Magic: The Gathering cards (with data from mtgjson.com), and then about 40 minutes or so to create the embeddings. It does rather well at finding similar cards for when you want more than a 4-of, or of course for Commander. That's quite useful for weirder effects where one doesn't have the common options memorized!
By @rahimnathwani - 4 months
There seem to be a lot of properties that are numeric or boolean, e.g.

    "base_happiness": 50,
    "capture_rate": 190,
    "forms_switchable": false,
    "gender_rate": 4,
    "has_gender_differences": true,
    "hatch_counter": 10,
    "is_baby": false,
    "is_legendary": false,
    "is_mythical": false,
Why not treat each of those properties as an extra dimension, and have the embedding model handle only the remaining (non-numeric) fields?

Is it because:

A) It's easier to just embed everything, or

B) Treating those numeric fields as separate dimensions would mean their interactions wouldn't be considered (without PCA), or

C) Something else?
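
For illustration, the hybrid approach the comment asks about might look roughly like this, with the numeric and boolean properties appended as extra dimensions after the text embedding (field names come from the snippet above; the model and scaling choices are assumptions):

    import numpy as np
    from sentence_transformers import SentenceTransformer

    NUMERIC_KEYS = ["base_happiness", "capture_rate", "gender_rate", "hatch_counter"]
    BOOLEAN_KEYS = ["forms_switchable", "has_gender_differences", "is_baby", "is_legendary", "is_mythical"]

    model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed model, for illustration only

    def hybrid_vector(species: dict, text_fields: str) -> np.ndarray:
        """Embed only the non-numeric fields, then append scaled numeric/boolean dimensions."""
        text_emb = model.encode(text_fields, normalize_embeddings=True)
        numeric = np.array([species[k] for k in NUMERIC_KEYS], dtype=float)
        numeric /= np.abs(numeric).max() + 1e-9  # crude scaling; a real pipeline would standardize
        flags = np.array([float(species[k]) for k in BOOLEAN_KEYS])
        return np.concatenate([text_emb, numeric, flags])

As option B notes, a cosine similarity over a vector like this treats the appended dimensions independently of the text, so interactions between them are not captured.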

By @jszymborski - 4 months
I would be interested in how this might work with just looking for common words between the text fields of the JSON file weighted by e.g. TF-IDF or BM25.

I wonder if you might get similar results. I'd also be interested in the comparative computational resources it takes: encoding takes a lot of resources, but I imagine look-up would be a lot less resource intensive (i.e., time and/or memory).
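
A lexical baseline along those lines could be sketched with the rank_bm25 package; the whitespace tokenization and toy corpus below are stand-ins for the flattened JSON text:

    from rank_bm25 import BM25Okapi

    # Toy corpus standing in for each Pokémon's flattened JSON text.
    docs = [
        "pikachu electric static lightning-rod yellow mouse",
        "raichu electric static lightning-rod orange mouse",
        "charizard fire flying blaze solar-power dragon",
    ]
    bm25 = BM25Okapi([d.split() for d in docs])

    # Score every document against a "query" Pokémon's tokens; no embedding model involved.
    scores = bm25.get_scores("pikachu electric mouse".split())
    print(scores)

Look-up here is a sparse term-matching pass rather than a dense encode, so it is cheap, but it only rewards shared surface vocabulary.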

By @refulgentis - 4 months
Almost everyone uses MiniLM-L6-v2.

You almost certainly don't want to use MiniLM-L6-v2.

MiniLM-L6-V2 is for symmetric search: i.e. documents similar to the query text.

MiniLM-L6-V3 is for asymmetric search: i.e. documents that would have answers to the query text.

This is also an amazing lesson in...something: sentence-transformers spells this out, in their docs, over and over. Except never this directly: i.e. it has a doc on how to make a proper search pipeline, and a doc on the correct model for each type of search, but not a doc saying "hey use this"

And yet, I'd wager there's $100+M invested in vector DB startups who would be surprised to hear it.
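
For what it's worth, the practical difference can be seen by swapping models; the names below are examples from the sentence-transformers model list, used purely for illustration rather than as a recommendation:

    from sentence_transformers import SentenceTransformer, util

    # Symmetric search: "find documents similar to this text".
    # Asymmetric search: "find documents that answer this query".
    symmetric = SentenceTransformer("all-MiniLM-L6-v2")
    asymmetric = SentenceTransformer("multi-qa-MiniLM-L6-cos-v1")  # example asymmetric (query -> answer) model

    query = "Which Pokémon evolves with a Thunder Stone?"
    passage = "Pikachu evolves into Raichu when exposed to a Thunder Stone."

    for label, model in (("symmetric", symmetric), ("asymmetric", asymmetric)):
        q, p = model.encode([query, passage], normalize_embeddings=True)
        print(label, float(util.cos_sim(q, p)))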

By @bfung - 4 months
> minimaxir uses Embeddings!

> It’s super effective!

> minimaxir obtains HN13

By @axpy906 - 4 months
Nice article. I remember the original work. Can you elaborate on this one, Max?

> Even if the generative AI industry crashes
By @moralestapia - 4 months
Nice.

Can you compare distances just like that on a 2D space post-UMAP?

I was under the impression that UMAP makes metrics meaningless.
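
One way to sanity-check that concern is to compare nearest neighbors in the original embedding space against neighbors in the 2-D projection; the sketch below uses random unit vectors as a stand-in for the real embeddings (UMAP roughly preserves local neighborhoods, but absolute distances in the projected space aren't directly comparable to the original cosine similarities):

    import numpy as np
    import umap  # umap-learn

    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 384))  # stand-in for 384-dim text embeddings
    X /= np.linalg.norm(X, axis=1, keepdims=True)

    coords = umap.UMAP(n_components=2, metric="cosine").fit_transform(X)

    nearest_original = set(np.argsort(-(X @ X[0]))[1:11])  # top 10 by cosine similarity
    nearest_projected = set(np.argsort(np.linalg.norm(coords - coords[0], axis=1))[1:11])
    print(len(nearest_original & nearest_projected), "of 10 neighbors preserved")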

By @flipflopclop - 4 months
Great post, really enjoyed the flow of narrative and quality deep technical details
By @Woshiwuja - 4 months
arceus being as close as rampardos to mew is kinda funny
By @vasco - 4 months
> man + women - king = queen

Useless correction, it's king - man, not man - king.

By @ramonverse - 4 months
really cool read!
By @jpz - 4 months
Great article - thanks.