Brain2GAN: Feature-disentangled neural encoding and decoding of visual perception in the primate brain

PLoS Comput Biol. 2024 May 6;20(5):e1012058. doi: 10.1371/journal.pcbi.1012058. eCollection 2024 May.

Abstract

A challenging goal of neural coding is to characterize the neural representations underlying visual perception. To this end, multi-unit activity (MUA) of macaque visual cortex was recorded in a passive fixation task upon presentation of faces and natural images. We analyzed the relationship between MUA and latent representations of state-of-the-art deep generative models, including the conventional and feature-disentangled representations of generative adversarial networks (GANs) (i.e., z- and w-latents of StyleGAN, respectively) and language-contrastive representations of latent diffusion networks (i.e., CLIP-latents of Stable Diffusion). A mass univariate neural encoding analysis of the latent representations showed that feature-disentangled w representations outperform both z and CLIP representations in explaining neural responses. Further, w-latent features were found to be positioned at the higher end of the complexity gradient, indicating that they capture visual information relevant to high-level neural activity. Subsequently, a multivariate neural decoding analysis of the feature-disentangled representations resulted in state-of-the-art spatiotemporal reconstructions of visual perception. Taken together, our results not only highlight the important role of feature disentanglement in shaping high-level neural representations underlying visual perception but also serve as an important benchmark for the future of neural coding.
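The encoding and decoding analyses summarized above amount to fitting regularized linear maps between the generative models' latent representations and the recorded multi-unit activity, in opposite directions. The sketch below illustrates this idea with ridge regression on hypothetical arrays; the array names, dimensions, and regularization choice are assumptions made for illustration and are not taken from the paper's exact pipeline.

```python
# Minimal sketch of latent-to-MUA encoding and MUA-to-latent decoding,
# assuming hypothetical preprocessed arrays:
#   latents: (n_stimuli x n_latent_dims), e.g. StyleGAN w-latents of the stimuli
#   mua:     (n_stimuli x n_sites), recorded multi-unit activity per site
# Shapes and the use of ridge regression are illustrative assumptions.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_stimuli, n_latent_dims, n_sites = 4000, 512, 960   # hypothetical sizes
latents = rng.standard_normal((n_stimuli, n_latent_dims))
mua = rng.standard_normal((n_stimuli, n_sites))

lat_tr, lat_te, mua_tr, mua_te = train_test_split(
    latents, mua, test_size=0.1, random_state=0)

# Encoding: predict each recording site's response from the latent features
# (mass-univariate in the sense that performance is scored per site).
encoder = Ridge(alpha=1.0).fit(lat_tr, mua_tr)
pred_mua = encoder.predict(lat_te)
site_corr = [np.corrcoef(pred_mua[:, i], mua_te[:, i])[0, 1]
             for i in range(n_sites)]
print(f"median encoding correlation across sites: {np.median(site_corr):.3f}")

# Decoding: predict the latent representation from the multivariate neural
# response; a predicted w-latent would then be fed to the GAN generator
# to reconstruct the perceived image.
decoder = Ridge(alpha=1.0).fit(mua_tr, lat_tr)
pred_latents = decoder.predict(mua_te)   # e.g. pass to a StyleGAN synthesis network
```

In this framing, comparing z-, w-, and CLIP-latents reduces to swapping the feature matrix and comparing per-site encoding performance, while image reconstruction quality reflects how well the decoded latents drive the generator.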

MeSH terms

  • Animals
  • Brain / physiology
  • Computational Biology
  • Macaca mulatta
  • Male
  • Models, Neurological*
  • Neural Networks, Computer
  • Neurons / physiology
  • Photic Stimulation
  • Visual Cortex* / physiology
  • Visual Perception* / physiology

Grants and funding

This work was funded by the Dutch Research Council (https://www.nwo.nl/en). UG, YG and MvG were supported by grant numbers 024.005.022 (DBI2 project, Gravitation programme) and 17619 (INTENSE project, Crossover programme), and PP was supported by grant numbers OCENW.XS22.2.097 and VI.Veni.222.217. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.