Business/Technology

Is superintelligence the next big thing? Meta AI predicts how people react to sight and sound.

News Mania Desk / Piyal Chatterjee / 27th March 2026

TRIBE v2 (Trimodal Brain Encoder), a foundation model created by Meta, is designed to forecast how the human brain reacts to nearly any sight or sound. According to Meta, the model could make new neuroscience research easier to carry out because it mimics how the brain responds to sight, sound and language. Meta also frames the breakthrough as a step towards superintelligence, a stage of AI that surpasses human intelligence and responds to the physical world much as humans do.

TRIBE v2 predicts brain activity through a three-stage pipeline. In the first step, the model converts sounds, visuals and text into numerical representations so it can analyse them. “Today, we’re releasing TRIBE v2. This foundation model acts as a digital mirror of human brain activity in response to sight, sound and language – transforming months of lab work into seconds of computation,” Meta wrote in a blog post.

In the second step, it combines this information and identifies general patterns in how humans process information. Finally, the system predicts which parts of the brain are likely to activate when a person sees, hears or reads something and connects those patterns to actual brain activity.
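The three stages described above can be sketched in code. This is a minimal, hypothetical illustration only: the embedding sizes, weights and function names are invented for clarity and do not reflect TRIBE v2's actual architecture, which Meta has released separately.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes -- chosen for illustration, not TRIBE v2's real dimensions.
EMBED_DIM = 64     # per-modality embedding size
N_PARCELS = 1000   # number of brain regions ("parcels") to predict

def encode_modality(raw: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Stage 1: turn raw audio/video/text features into a numeric embedding."""
    return np.tanh(raw @ weights)

def fuse(embeddings: list) -> np.ndarray:
    """Stage 2: combine per-modality embeddings into one shared representation."""
    return np.concatenate(embeddings)

def predict_brain_response(fused: np.ndarray, readout: np.ndarray) -> np.ndarray:
    """Stage 3: map the fused representation to predicted activity per parcel."""
    return fused @ readout

# Toy inputs standing in for audio, video and text features of one stimulus.
audio, video, text = (rng.normal(size=32) for _ in range(3))
w_a, w_v, w_t = (rng.normal(size=(32, EMBED_DIM)) * 0.1 for _ in range(3))
readout = rng.normal(size=(3 * EMBED_DIM, N_PARCELS)) * 0.1

fused = fuse([encode_modality(audio, w_a),
              encode_modality(video, w_v),
              encode_modality(text, w_t)])
prediction = predict_brain_response(fused, readout)
print(prediction.shape)  # one predicted activity value per brain parcel
```

The key design idea the article describes is visible in the shape of the code: each modality is encoded separately, the encodings are merged into a single representation, and only then is that representation mapped onto brain regions.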

Meta says TRIBE v2 provides a much more detailed map of brain activity compared to the earlier version. The model is trained on more people and larger datasets, which helps it work better in new situations and produce more accurate predictions than previous approaches.

When scientists record brain activity using fMRI, the data is not always perfectly clean. Individual scans can contain noise caused by factors such as movement or other signals unrelated to thinking or perception.

Rather than simply recording raw inputs, TRIBE v2 predicts how a person’s brain should react when they see or hear something. Meta claims that because real fMRI scans are noisy, the model’s prediction can occasionally match the average brain response more closely than any single scan does.
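The claim that a prediction can beat a single noisy scan is a statistical point, and a small simulation makes it concrete. Everything here is a toy assumption: the "true" response, the noise levels and the accuracy of the model are all invented to show the effect, not taken from Meta's results.

```python
import numpy as np

rng = np.random.default_rng(42)

# Assumed setup: an idealised "true" average response across 500 brain regions,
# which each individual scan measures with substantial noise.
true_response = rng.normal(size=500)
scan_noise = 1.0     # noise in a single fMRI scan (assumed)
model_error = 0.3    # residual error of a good model's prediction (assumed)

def noisy_scan() -> np.ndarray:
    """One fMRI scan: the true response corrupted by measurement noise."""
    return true_response + rng.normal(scale=scan_noise, size=true_response.shape)

# A model that captures the true signal up to a small error term.
model_prediction = true_response + rng.normal(scale=model_error,
                                              size=true_response.shape)

def corr(a: np.ndarray, b: np.ndarray) -> float:
    """Pearson correlation between two response patterns."""
    return np.corrcoef(a, b)[0, 1]

single = noisy_scan()
group_average = np.mean([noisy_scan() for _ in range(20)], axis=0)

print(f"single scan vs. group average:      r = {corr(single, group_average):.2f}")
print(f"model prediction vs. group average: r = {corr(model_prediction, group_average):.2f}")
```

Because per-scan noise is independent while the model's prediction tracks the underlying signal, the prediction correlates more strongly with the averaged group response than any one noisy scan does, which is the effect Meta describes.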

The TRIBE v2 paper, code, and model weights have been made available as open source by Meta. According to the company, this action is meant to speed up research in three important fields: healthcare, artificial intelligence, and neuroscience.

