David C Sterratt

Lecturer, School of Informatics; Deputy Director of Learning & Teaching (Operations)

Background

Following an undergraduate degree in Physics and a PhD in computational neuroscience, I've been a researcher, University Teacher and Lecturer at the University of Edinburgh since 2000. My research is in the area of computational neuroscience: using mathematical and computational models of parts of the nervous system to help understand how it develops, learns and functions. With Bruce Graham, Andrew Gillies, Gaute Einevoll and David Willshaw, I'm co-author of the textbook Principles of Computational Modelling in Neuroscience (2nd Edition, CUP, 2023, following 1st Edition in 2011). I have developed and maintain a number of software packages.

I have teaching interests in data science, statistics and sustainability. I co-designed and organise the large second-year undergraduate course Informatics 2 - Foundations of Data Science, and supervise projects in data science as well as in my research areas. Since 2023 I have been teaching the new course Modelling of Systems for Sustainability with Nigel Goddard.

I was Energy Coordinator for the Informatics Forum from 2011 to 2019.

Qualifications

Fellow of the Higher Education Academy (HEA)

Responsibilities & affiliations

School of Informatics Deputy Director of Learning & Teaching (Operations)

Undergraduate teaching


INF2-FDS: Informatics 2 - Foundations of Data Science, as course developer, lecturer and course organiser (2020-)

Modelling of Systems for Sustainability, course lecturer (2023-)

Informatics Project Proposal (Graduate Apprentice), lecturer (2021-)

Undergraduate 4th-year project supervision, including for Graduate Apprentices in Data Science

Postgraduate teaching

MSc Project supervision

Open to PhD supervision enquiries?

Yes

Current PhD students supervised

  • Domas Linkevicius (co-supervised with Melanie Stefan)
  • Susana Román García (co-supervised with Melanie Stefan)

Research summary

Multiscale modelling of biochemical networks within electrical models of neurons

Synaptic plasticity depends on the interaction between electrical activity in neurons and the synaptic proteome, the collection of over 1000 proteins in synapses. To construct models of synaptic plasticity with realistic numbers of proteins, we combine rule-based models of molecular interactions in the synaptic proteome with compartmental models of electrical activity in neurons. Rule-based models allow interactions between the combinatorially large number of protein complexes in the postsynaptic proteome to be expressed straightforwardly. Simulations of rule-based models are stochastic and thus can deal with the small numbers of proteins and complexes in the synapse. Compartmental models of neurons are expressed as systems of coupled ordinary differential equations and solved deterministically. Our KappaNEURON software incorporates stochastic rule-based models (implemented in SpatialKappa) into deterministic compartmental models (implemented in NEURON).

  • Sterratt, D. C., Sorokina, O. and Armstrong, J. D. (2015). ‘Integration of rule-based models and compartmental models of neurons’. In O. Maler, Á. Halász, T. Dang and C. Piazza, eds., Hybrid Systems Biology: Second International Workshop, HSB 2013, Taormina, Italy, September 2, 2013 and Third International Workshop, HSB 2014, Vienna, Austria, July 23-24, 2014, Revised Selected Papers, vol. 7699 of Lecture Notes in Bioinformatics, pp. 143–158. Springer International Publishing, Cham. doi: 10.1007/978-3-319-27656-4_9. Preprint at arXiv:1411.4980
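To make the hybrid scheme concrete, here is a deliberately minimal sketch in R (my own illustration; it does not use the KappaNEURON, SpatialKappa or NEURON APIs, and all names and parameter values are invented): a forward-Euler update of a membrane-potential ODE is interleaved with Gillespie-style stochastic steps for a single hypothetical binding reaction, with the calcium pool shared between the two levels.

```r
## Toy hybrid simulation: a deterministic membrane-potential ODE with a
## fixed time step, interleaved with Gillespie-style stochastic steps
## for a hypothetical binding reaction Ca + P <-> CaP.
## Illustrative only; this is not the KappaNEURON API.
set.seed(1)

dt    <- 0.1     # ms, deterministic step
E_L   <- -65     # mV, leak reversal potential
g_L   <- 0.1     # leak conductance (arbitrary units)
k_on  <- 1e-5    # per-pair binding rate (1/ms)
k_off <- 0.01    # unbinding rate (1/ms)

v   <- E_L       # membrane potential (mV)
ca  <- 1000      # free calcium ions (a molecule count, so SSA applies)
P   <- 50        # unbound protein molecules
CaP <- 0         # calcium-bound protein molecules

for (step in 1:1000) {
  ## Deterministic level: leak current plus a constant drive, forward Euler
  v <- v + dt * (-g_L * (v - E_L) + 2)
  ## Crude voltage-gated calcium influx once the cell is depolarised
  if (v > -50) ca <- ca + rpois(1, 5 * dt)

  ## Stochastic level: Gillespie steps until the time budget dt is spent
  t_local <- 0
  repeat {
    a1 <- k_on * ca * P   # propensity of Ca + P -> CaP
    a2 <- k_off * CaP     # propensity of CaP -> Ca + P
    if (a1 + a2 == 0) break
    tau <- rexp(1, a1 + a2)
    if (t_local + tau > dt) break
    t_local <- t_local + tau
    if (runif(1) < a1 / (a1 + a2)) {
      ca <- ca - 1; P <- P - 1; CaP <- CaP + 1
    } else {
      ca <- ca + 1; P <- P + 1; CaP <- CaP - 1
    }
  }
}
cat("Bound protein after 100 ms:", CaP, "\n")
```

In KappaNEURON itself the deterministic side is handled by NEURON and the stochastic rule-based side by SpatialKappa; the sketch above only illustrates the interleaving of the two solvers.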

Development of the nervous system: Modelling the development of neural topographic maps

During early development in vertebrates, topographic maps form between retinal ganglion cells and their targets, the optic tectum/superior colliculus and the lateral geniculate nucleus. Both chemical markers, expressed in gradients, and electrical activity have been shown to influence the growth and pruning of connections between retinal ganglion cells and target cells. Chemical markers, such as Ephs and ephrins, are expressed in the membranes of retinal ganglion cell axons at levels that depend on the location of the cell body in the retina. Complementary gradients of Ephs and ephrins are expressed throughout the target region, and repulsive interactions between Ephs and ephrins are thought to inhibit axonal branching in particular parts of the target, leading to a diffuse topographic map. Electrical activity combined with synaptic plasticity is thought to refine this mapping.
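To make the gradient-matching idea concrete, here is a toy sketch in R (my own illustration, not one of the published models): EphA rises exponentially across the retina, ephrin-A falls across the target, and each axon terminates where the receptor-ligand product hits a set point. Complementary gradients alone then yield an ordered map.

```r
## Toy gradient-matching map: complementary exponential gradients plus
## a set point produce a linear retina-to-colliculus mapping.
x <- seq(0, 1, length.out = 101)         # retinal position
y <- seq(0, 1, length.out = 101)         # collicular position
EphA    <- function(x) exp(2 * x)        # receptor gradient on RGC axons
ephrinA <- function(y) exp(2 * (1 - y))  # counter-gradient in the target

## The repulsive signal grows with the receptor-ligand product; each
## axon settles where the signal matches a set point (here exp(2)),
## in the spirit of servomechanism-style models.
signal <- outer(EphA(x), ephrinA(y))     # 101 x 101 matrix
y_map  <- y[apply((signal - exp(2))^2, 1, which.min)]
plot(x, y_map, type = "l",
     xlab = "retinal position", ylab = "collicular termination site")
```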

An important issue is how to quantify and analyse topographic maps. In Willshaw, Sterratt and Teriakidis (2014), we devised a new computational method, the “lattice method”, to analyse experimental data. Our method reveals hidden order in some maps that were previously thought to be disordered.

There are a number of existing computational models of the establishment of the mapping between the retina and its targets, which have shown how both marker-based and activity-based mechanisms can set up topographic maps. In the years since these models were devised, a great deal of experimental evidence has accumulated, and it is important to assess whether existing models can account for the new data, as we do in our review (Hjorth et al., 2015). The key challenge is to integrate as many of the biological constraints as possible into models that can still explain the large-scale topographic organisation of the connections.

Sterratt (2013) uses an existing model to examine how crucial it is for countergradients of ephrins in the retina and Ephs in the superior colliculus to be tuned to each other, in the presence and absence of a compensatory mechanism. Sterratt and Hjorth (2013) provide a critical commentary on Grimbert and Cang’s (2012) model of the development of retinotopy.

Computational reconstruction of retinae

Our program Retistruct morphs a flat surface with incisions (a dissected retina) onto a curvilinear surface (the original retinal shape). Retistruct has been used by various groups, and is still in development. The IntactEye software (Hjorth et al., 2015) uses two orthogonal images of the intact retina to locate focal injections of a dye.

  • Sterratt, D. C., Lyngholm, D., Willshaw, D. J. and Thompson, I. D. (2013). ‘Standard anatomical and visual space for the mouse retina: Computational reconstruction and transformation of flattened retinae with the Retistruct Package’. PLoS Computational Biology 9(2): e1002921. doi: 10.1371/journal.pcbi.1002921

  • Hjorth, J. J. J., Savier, E., Sterratt, D. C., Reber, M. and Eglen, S. J. (2015). ‘Estimating the location and size of retinal injections from orthogonal images of an intact retina’. BMC Neuroscience 16:80.

Distance-dependent synaptic plasticity

In hippocampal CA1 cells, synapses that are further from the cell body are larger than those closer to the cell body. In collaboration with Dr Arjen van Ooyen (Sterratt & van Ooyen 2002, 2004; Sterratt, Groen, Meredith & van Ooyen, 2012), I have been investigating whether this distance-dependent scaling could arise as a result of voltage and calcium signals elicited by synaptic inputs to the dendrites of hippocampal CA1 cells. To achieve this, we used the NEURON simulation package to implement a detailed compartmental model of a CA1 cell incorporating spines into which calcium can flow via NMDA receptors and voltage-dependent calcium channels.

In contrast to earlier modelling work, we fed more naturalistic patterns of synaptic activity into the model and our results suggest that the magnitude of calcium signals could provide the information needed for synapses to be scaled. In a separate strand of work (Sterratt & Willshaw, 2008), I have analysed the expected improvement in the performance of a CA1 cell as an associative memory due to scaling synapses appropriately for distance.
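As a back-of-the-envelope sketch of why such scaling helps (my own illustration, with assumed round numbers rather than values from the model): in a passive cable, a steady voltage signal attenuates roughly as exp(-x/λ) on its way to the soma, so holding the somatic response constant requires synaptic strength to grow with distance.

```r
## Passive-cable intuition for distance-dependent synaptic scaling.
## lambda and the distances are assumed, illustrative values.
lambda <- 500                    # space constant (um)
x      <- seq(0, 350, by = 50)   # synapse distance from soma (um)
atten  <- exp(-x / lambda)       # steady-state voltage attenuation
g_rel  <- 1 / atten              # relative strength for equal somatic impact
round(data.frame(distance_um = x, attenuation = atten, rel_strength = g_rel), 2)
```

The question the simulations address is whether local voltage and calcium signals carry enough information for synapses to find this kind of scaling themselves.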

Learning and forgetting in associative memories

How can memory circuits in the brain continue to learn memories throughout life? In computational models of memory networks, memories are stored by changing the strength of specific synapses in the network. As the network learns more and more memories, it "fills up" and the quality of recall of all memories reduces. I've analysed and simulated computational models of memory that incorporate synaptic weight decay, so that just after a new memory is learned, all the synapses are made a bit weaker. These networks learn newly presented memories and forget old ones. The number of memories the network can store depends on whether synapses are weakened quickly, slowly or at an optimal rate. Too fast, and memories disappear quickly. Too slow, and traces of old memories start to interfere with recall, and the memory "fills up".

  • Sterratt, D. C. and Willshaw, D. (2008). ‘Inhomogeneities in heteroassociative memories with linear learning rules’. Neural Computation 20:311-344.
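A minimal sketch of the decay idea (a generic Hopfield-style network with exponential weight decay; sizes and rates are illustrative, and this is not the model analysed in the paper above):

```r
## Palimpsest memory: Hebbian storage with exponential weight decay,
## so recent memories are recalled better than old ones.
set.seed(1)
N     <- 100                      # neurons
decay <- 0.98                     # every synapse weakens by 2% per memory
n_pat <- 60
pats  <- matrix(sample(c(-1, 1), N * n_pat, replace = TRUE), n_pat, N)

W <- matrix(0, N, N)
for (m in 1:n_pat) {
  W <- decay * W + outer(pats[m, ], pats[m, ]) / N  # decay, then Hebbian update
  diag(W) <- 0
}

## Recall quality: overlap with the stored pattern after one synchronous
## update from a cue with 10 flipped bits
recall <- function(m) {
  cue <- pats[m, ]
  idx <- sample(N, 10)
  cue[idx] <- -cue[idx]
  mean(sign(W %*% cue) == pats[m, ])
}
round(sapply(c(1, 30, 60), recall), 2)  # oldest, middling, newest memory
```

With decay too strong, even the newest memory fades quickly; with no decay, interference from all stored patterns degrades recall across the board.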

Familiarity memory

Familiarity memory is the type of memory involved in identifying that a stimulus (e.g. someone met in the street) is familiar without being able to recall any further facts about the stimulus (e.g. the person’s name, or where we know them from). Humans have a tremendous capacity for this, and in a neural network model the number of stimuli that can be identified as familiar scales with the square of the number of neurons. This capacity is much greater than the capacity for recalling memories, which scales only linearly with the number of neurons. In collaboration with Dr Andrea Greve and Dr Mark van Rossum, I have investigated optimal learning rules for familiarity detection (Greve, Sterratt, Donaldson, Willshaw & van Rossum, 2009).
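A toy version of the idea (my own generic Hebbian sketch, not the learning rules analysed in the paper): after Hebbian storage, the quadratic "energy" of a pattern in the weight matrix is larger for stored patterns than for novel ones, even when far more patterns have been stored than there are neurons.

```r
## Familiarity from Hebbian weights: stored patterns give a larger
## "energy" x' W x than novel ones.  Sizes are illustrative.
set.seed(1)
N <- 200
n_stored <- 500                   # more patterns than neurons
stored <- matrix(sample(c(-1, 1), N * n_stored, replace = TRUE), n_stored, N)
W <- crossprod(stored) / N        # sum of Hebbian outer products

familiarity <- function(x) drop(t(x) %*% W %*% x) / N

seen  <- familiarity(stored[1, ])
novel <- familiarity(sample(c(-1, 1), N, replace = TRUE))
c(seen = seen, novel = novel)     # seen > novel, despite n_stored > N
```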


Retistruct and associated R packages

  • Retistruct is an R package I wrote to morph a flat surface with cuts (a dissected flat-mount retina) onto a curvilinear surface (a standard retinal shape).
  • geometry, which I maintain and develop, provides R with an interface to the convex hull and Delaunay triangulation functions of the qhull library; see the short example after this list.
  • RImageJROI provides functions to read ImageJ Region of Interest (ROI) files, to plot the ROIs and to convert them to spatstat spatial patterns.
  • RTriangle is a port I wrote of Jonathan Shewchuk’s Triangle library to R.
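For a quick taste of the qhull-backed functions (assuming a current CRAN installation of geometry; the point set is arbitrary):

```r
## Convex hull and Delaunay triangulation of random points in the
## plane, via the geometry package's interface to qhull.
library(geometry)
set.seed(1)
p <- matrix(rnorm(60), ncol = 2)  # 30 random 2-D points
hull <- convhulln(p)              # hull facets (edges, in 2-D), as row indices
tri  <- delaunayn(p)              # Delaunay triangles, one per row
nrow(hull); nrow(tri)
```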

Multiscale modelling

  • KappaNEURON incorporates stochastic rule-based models of the synaptic proteome (implemented in SpatialKappa) into deterministic compartmental models of neurons (implemented in NEURON).