Launching today

TRIBE v2
Predict brain responses to video, audio, and text
TRIBE v2 is Meta’s multimodal brain encoding model that predicts fMRI brain responses to video, audio, and text. Built for neuroscience researchers, AI researchers, and brain-modeling teams exploring in-silico experiments.

TRIBE v2 is one of the most interesting AI research demos I have seen recently because it moves AI closer to modeling how humans respond to the world.
What it is: a multimodal brain encoding model from Meta that predicts fMRI brain responses to video, audio, and text stimuli.
Problem → Solution: Neuroscience experiments are expensive, slow, and hard to scale because they often require scanning participants while they respond to stimuli. TRIBE v2 gives researchers a way to simulate brain-response predictions from natural inputs like videos, audio, and language, making it easier to explore hypotheses before running full human studies.
What makes it different: It combines video, audio, and language into a unified model for brain-response prediction, with code, demo access, model weights, and a research paper available for the community. The GitHub repo also includes a Colab demo and supports inference from video, audio, or text.
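The core idea, per-modality features projected into a shared space, fused, and read out as per-voxel fMRI predictions, can be sketched in toy NumPy code. Everything here is an illustrative assumption: the dimensions, the random weights, and the function name `predict_responses` are invented for the sketch and do not reflect TRIBE v2's actual architecture or API.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (illustrative assumptions, not TRIBE's real sizes)
D_VIDEO, D_AUDIO, D_TEXT = 64, 32, 48   # per-modality feature sizes
D_SHARED = 16                            # shared fusion space
N_VOXELS = 1000                          # predicted fMRI targets
T = 10                                   # timepoints (fMRI TRs)

# Random projections stand in for learned model parameters
W_video = rng.standard_normal((D_VIDEO, D_SHARED)) / np.sqrt(D_VIDEO)
W_audio = rng.standard_normal((D_AUDIO, D_SHARED)) / np.sqrt(D_AUDIO)
W_text = rng.standard_normal((D_TEXT, D_SHARED)) / np.sqrt(D_TEXT)
W_out = rng.standard_normal((D_SHARED, N_VOXELS)) / np.sqrt(D_SHARED)

def predict_responses(video_feats, audio_feats, text_feats):
    """Project each modality into a shared space, fuse by averaging,
    then read out one predicted response per voxel per timepoint."""
    shared = (video_feats @ W_video
              + audio_feats @ W_audio
              + text_feats @ W_text) / 3.0
    return shared @ W_out  # shape: (T, N_VOXELS)

# Fake stimulus features for T timepoints of a clip
pred = predict_responses(
    rng.standard_normal((T, D_VIDEO)),
    rng.standard_normal((T, D_AUDIO)),
    rng.standard_normal((T, D_TEXT)),
)
print(pred.shape)  # (10, 1000)
```

The point of the sketch is the shape of the problem: time-aligned stimulus features in, a matrix of predicted voxel responses out, which is what lets researchers compare predictions against real scans.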
Key features:
Predicts fMRI brain responses from video, audio, and text
Uses a unified multimodal Transformer architecture
Includes an interactive demo
Provides open code and model weights
Supports Colab-based exploration and brain visualizations
Designed for in-silico neuroscience research
Benefits:
Helps researchers explore brain-response hypotheses faster
Reduces reliance on running every early experiment in an fMRI scanner
Makes multimodal brain modeling easier to test and reproduce
Gives AI researchers a practical look at brain-inspired modeling
Opens a path toward more scalable neuroscience experimentation
Who it’s for: Neuroscience researchers, AI researchers, computational cognitive scientists, and teams exploring brain-response prediction.
The bigger picture here is not “AI reading minds.” It is AI helping researchers model how the brain responds to the world, so neuroscience can move faster and test more ideas digitally.
I hunt the latest and greatest launches in tech, SaaS, and AI; follow to be notified.