Akseli Ilmanen

#5 Bernstein conference 2023: Computational neuroscience posters

Updated: Oct 23, 2023




Two weeks ago, I visited the Bernstein conference in Berlin. I had lots of fun, particularly at the poster sessions, where I met William, Movitz, and Shervin. I met with each of them later and recorded the following conversations (on park benches again^^).



(00:02:53) - William Walker (Gatsby Computational Neuroscience Unit, London) had a poster on 'Representations of State in Hippocampus Derive from a Principle of Conditional Independence'. We discuss how current deep learning struggles with generalization, lacks priors, and could benefit from learning latent, conditionally independent representations (similar to place cells).


(00:32:53) - Movitz Lenninger (KTH Royal Institute of Technology, Stockholm) had a poster on 'Minimal decoding times for various shapes of tuning curves'. He was puzzled as to why neurons with periodic tuning curves (such as grid cells) are so rare in the brain, given their superior accuracy. He posits there may be a trade-off between accuracy and encoding time.


(00:55:04) - Shervin Safavi (Max Planck Institute for Biological Cybernetics, Tübingen) had a poster on linking efficient coding and criticality. We introduce those concepts and talk about why noise is a feature, not a bug. Shervin is also starting a new lab at TU Dresden, where he wants to understand the computational machinery of cognitive processes, and he is looking for interdisciplinary-minded applicants!



  • William's publications:

    • Walker et al., 2023 - Unsupervised representation learning with recognition-parametrised probabilistic models preprint

    • Walker et al., 2023 - Prediction under Latent Subgroup Shifts with High-Dimensional Observations preprint


  • Movitz's LinkedIn

  • Movitz's poster from another conference:

  • Movitz's publications:

    • Lenninger et al., 2022 - How short decoding times, stimulus dimensionality and spontaneous activity constrain the shape of tuning curves: A speed-accuracy trade-off preprint

    • Lenninger et al., 2023 - Are single-peaked tuning curves tuned for speed rather than accuracy? paper


  • Shervin's Website

  • Twitter: @neuroprinciples

  • For Shervin's new lab: interest mailing list

  • Shervin's publications:

    • Safavi et al., 2022 - Multistability, perceptual value, and internal foraging paper

    • Safavi et al., 2023 - Signatures of criticality in efficient coding networks preprint


  • Synchronization of metronomes video


  • My Twitter @akseli_ilmanen

  • Email: akseli.ilmanen[at]gmail.com

  • The Embodied AI Podcast, my blog, other stuff

  • Music: Space News, License: Z62T4V3QWL