About me

I research how the brain learns to represent visual images in a noisy world.

I'm a PhD candidate at New York University working jointly with Eero Simoncelli and Cristina Savin.

I use machine-learning and dynamical-systems techniques to study how the visual system learns to represent images while coping with noise both in the environment and in the activity of neurons themselves. I have a BPhil in mathematical biology and neuroscience from the University of Pittsburgh.

In my free time, I am an avid amateur fiction writer, frequently contributing to student-run journals. In addition, I co-organize NYU NeuWrite, an organization dedicated to helping neuroscience students with public-facing writing, and volunteer for ScAAN, an NYU-based science advocacy group.

Resume

CV (PDF)

Contact Info

Colin Bredenberg

4 Washington Pl, New York, NY
cjb617@nyu.edu

Education

New York University

2017 - Present

PhD student in Computational Neuroscience.

Supervisors: Cristina Savin, Eero Simoncelli

University of Pittsburgh

2015 - 2017

Research assistant

Supervisor: Brent Doiron

University of Pittsburgh

2013 - 2014

Research assistant

Supervisor: Bita Moghaddam

University of Pennsylvania

2013 - 2017

Research assistant

Supervisors: John Trojanowski, Virginia Lee

Science Outreach & Communication

NYU NeuWrite Lead Officer

2019 - 2021

NeuWrite is a group that brings together scientists and journalists in NYC to foster creative and exceptional science communication. We strive to communicate complex scientific ideas in a manner that is accessible and engaging while also scientifically accurate.

Scientist Action and Advocacy Network (ScAAN)

2018 - 2021

As a pro bono data scientist at ScAAN, I help visualize public transportation data in collaboration with a non-profit that aims to improve bus service in Baltimore, MD.

Academic Work

Publications

Bredenberg, C., Simoncelli, E., & Savin, C. (2020). Learning efficient task-dependent representations with synaptic plasticity. Advances in Neural Information Processing Systems, 33.

Robinson, J. L., Lee, E. B., Xie, S. X., Rennert, L., Suh, E., Bredenberg, C., ... & Hurtig, H. I. (2018). Neurodegenerative disease concomitant proteinopathies are prevalent, age-related and APOE4-associated. Brain.

Bredenberg, C. (2017). Examining heterogeneous weight perturbations in neural networks with spike-timing-dependent plasticity (BPhil Thesis, University of Pittsburgh).

Presentations

Bredenberg C., Simoncelli E. P., and Savin C. (2021, February). Impression learning: Online predictive coding with synaptic plasticity. Talk, Cosyne 2021. Abstract | Talk

Bredenberg C., Savin C., and Kiani R. (2020, February). Recurrent neural circuits overcome partial inactivation by compensation and relearning. Poster presentation, Cosyne 2020. Abstract | Poster

Bredenberg C., Simoncelli E. P., and Savin C. (2019, December). Learning efficient, task-dependent representations with synaptic plasticity. Poster presentation, NeurIPS Workshop on Biological and Artificial Reinforcement Learning. Paper | Poster

Bredenberg C., Simoncelli E. P., and Savin C. (2019, February). Learning efficient, task-dependent representations with synaptic plasticity. Poster presentation, Cosyne 2019. Abstract | Poster

Bredenberg C., Doiron B. (2017, February). Examining weight perturbations in plastic neural networks. Poster presentation, Cosyne 2017. Abstract

Bredenberg C., Doiron B. (2016, October). Examining variably diffuse weight perturbations in plastic neural networks. Poster presentation, University of Pittsburgh's Science 2016.

Suh E., Bredenberg C., and Van Deerlin V. (2014, October). Screening for mutations in frontotemporal degeneration with a targeted next-generation sequencing panel. Poster presentation, International Conference on Frontotemporal Dementia.

Teaching Experience

TA - Introduction to Time Series Analysis

NYU, Fall 2020

Dr. Cristina Savin

TA - Introduction to Neural Science

NYU, Fall 2018

Dr. Anthony Movshon

Projects

Current projects

Abstract

Early sensory areas in the brain are faced with a task analogous to the scientific process itself: given raw data, they must extract meaningful information about its underlying structure. This process is particularly difficult because the true underlying structure of the data is never revealed, so representation learning must be largely unsupervised. Here, we provide a theoretical account of how learning to infer latent structure can be implemented in neural networks using local synaptic plasticity. To do this, we derive a learning algorithm in which synaptic plasticity is driven by a local error signal, computed by comparing stimulus-driven responses to internal model predictions (the network's 'impression' of the data).

Paper | Poster
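The core idea, synaptic updates driven by a local mismatch between a stimulus-driven response and the network's own prediction, can be caricatured in a few lines of numpy. Everything here (the sizes, the tanh nonlinearity, the single fixed stimulus, updating only the prediction weights) is an illustrative assumption, not the derivation in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes and nonlinearity -- illustrative assumptions, not the paper's setup
n_stim, n_latent = 8, 4
W = rng.normal(scale=0.5, size=(n_latent, n_stim))  # fixed stimulus-driven weights
G = np.zeros((n_latent, n_latent))                  # learned prediction weights
lr = 0.05

def impression_step(x, G):
    """One plasticity step: the error comparing the stimulus-driven
    response to the internal prediction is local to each neuron."""
    r_stim = np.tanh(W @ x)          # stimulus-driven response
    r_pred = np.tanh(G @ r_stim)     # the network's 'impression' of that response
    err = r_stim - r_pred            # local error signal
    G += lr * np.outer(err, r_stim)  # Hebbian update gated by the local error
    return np.linalg.norm(err)

x = rng.normal(size=n_stim)  # one fixed stimulus, for simplicity
errors = [impression_step(x, G) for _ in range(300)]
```

In this caricature only the prediction weights learn; the full algorithm in the paper derives plasticity for the stimulus-driven weights as well, from phases of stimulus-driven and internally generated activity.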

We construct a stochastic recurrent neural circuit model that can learn efficient, task-specific sensory codes using a novel form of reward-modulated Hebbian synaptic plasticity. We illustrate the flexibility of the model by training an initially unstructured neural network to solve two different tasks: stimulus estimation, and stimulus discrimination. The network achieves high performance in both tasks by appropriately allocating resources and using its recurrent circuitry to best compensate for different levels of noise.

Abstract | Poster
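The three-factor logic behind reward-modulated Hebbian plasticity can be sketched on a toy stimulus-estimation task: the weight change follows the correlation between presynaptic input and the postsynaptic noise fluctuation, gated by reward relative to a running baseline. The linear population, sizes, and learning rate below are assumptions for illustration, not the circuit model in the abstract:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy linear population doing stimulus estimation (sizes are illustrative)
n = 5
W = rng.normal(scale=0.1, size=(n, n))
lr, sigma = 0.01, 0.1
reward_avg, rewards = 0.0, []

for _ in range(2000):
    x = rng.normal(size=n)           # stimulus
    xi = sigma * rng.normal(size=n)  # noise in the neural responses
    r = W @ x + xi                   # noisy population response
    reward = -np.sum((r - x) ** 2)   # estimation reward (negative squared error)
    # Three-factor rule: (reward - baseline) x postsynaptic fluctuation x presynaptic input
    W += lr * (reward - reward_avg) * np.outer(xi, x)
    reward_avg += 0.1 * (reward - reward_avg)  # running reward baseline
    rewards.append(reward)
```

Because the noise fluctuation is correlated with the reward it caused, this update climbs the reward gradient on average, and the readout drifts toward reproducing the stimulus.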

Technical advances in artificial manipulation of neural responses have precipitated a surge in studies of the causal contributions of brain circuits to cognition and behavior. However, since the underlying neural computations are often distributed across multiple regions, the interpretation of these experimental results is fraught with difficulty. Here we use recurrent neural networks (RNNs), trained to emulate the computations that underlie discrimination of random-dot motion direction, to study the diverse effects neural inactivation can have on computation.
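The manipulation studied here can be mocked up directly: take a recurrent network, silence a subset of its units, and compare readouts. The network below uses random weights standing in for a trained one (the sizes, weights, and masking scheme are assumptions for illustration), but the mechanics of partial inactivation are the same:

```python
import numpy as np

rng = np.random.default_rng(2)

# A small RNN with random weights standing in for a trained network
n = 50
W_rec = rng.normal(scale=1.0 / np.sqrt(n), size=(n, n))
w_in = rng.normal(size=n)
w_out = rng.normal(size=n) / np.sqrt(n)

def run(inputs, mask):
    """Run the RNN on an input stream; `mask` zeroes the responses
    of the inactivated units at every time step."""
    r = np.zeros(n)
    for u in inputs:
        r = mask * np.tanh(W_rec @ r + w_in * u)
    return w_out @ r, r

inputs = rng.normal(size=20)                # a random input stream
out_full, _ = run(inputs, np.ones(n))
mask = (rng.random(n) > 0.3).astype(float)  # inactivate ~30% of units
out_lesioned, r_lesioned = run(inputs, mask)
```

Comparing the intact and lesioned readouts across many inactivation patterns, and again after retraining the surviving units, is the kind of analysis the abstract describes.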

Auditory perceptual learning reliably enhances cortical representations of task-relevant stimuli in trained animals relative to naive ones. We developed an experimental and computational framework for describing how sensory representations change during auditory perceptual learning. We recorded from a population of neurons throughout the duration of auditory conditioning, using two-photon imaging of layer 2/3 excitatory neurons in the auditory cortex of mice. To make sense of our observations, we trained a network model with reward-modulated Hebbian synaptic plasticity to solve the same task. We found that the simulated network learns at rates similar to those of real animals and captures the across-animal variability in tuning.
