The workshop will be held in person.
The Dynamics On and Of Complex Networks (DOOCN) workshop series aims at exploring statistical dynamics on and of complex networks. Dynamics on networks refers to the different types of processes that take place on networks, like spreading, diffusion, and synchronization. Modeling such processes is strongly affected by the topology and temporal variation of the network structure, i.e., by the dynamics of networks.
In recent years, tremendous progress has been made in understanding neural activity at the population level. On one side, this has been made possible by newly developed investigation methods (e.g., Neuropixels probes and large-scale imaging) that allow the simultaneous recording of the activity of tens to hundreds of thousands of neurons in awake, behaving mice, as well as by established dynamic-clamp protocols. On the other side, it has been driven by the development of highly refined mean-field models able to describe the population activity of spiking neural networks encompassing realistic biological features, from different forms of synaptic dynamics to plasticity and adaptation at the neural level. The aim of this workshop is to gather neuroscientists, mathematicians, engineers, and physicists working on the characterization of population activity from different points of view, ranging from the analysis of experimental data to simulations of large ensembles of neurons, and from next-generation neural mass models to dynamical mean-field theories. The workshop will foster exchange and discussion of recent developments in this flourishing field. This motivates the focus of the 2025 edition: “Population activity: the influence of cell-class identity, synaptic dynamics, plasticity and adaptation”.

The 16th edition of the DOOCN workshop, “DOOCN-XVI: Population activity: the influence of cell-class identity, synaptic dynamics, plasticity and adaptation”, will be held on July 8–9, 2025 in conjunction with CNS*2025 Florence, the 34th Annual Computational Neuroscience Meeting, which will take place during 5–9 July 2025 in Florence, Italy.
Abstract: Maintaining a balance between excitation and inhibition (E/I) is fundamental for stable neuronal dynamics and proper brain function. The efforts to maintain this balance are reflected in the conserved ratios of excitatory and inhibitory neurons across different brain regions and development. I will discuss how neuronal networks with artificially altered E/I cell-count ratios can adapt by modifying their connectivity to restore the equilibrium.
At the level of synaptic strength, theoretical studies suggest that a combination of plasticity rules can drive the emergence of E/I co-tuning (detailed balance) in neurons receiving independent, low-noise inputs. I will explore how recurrent network structures influence this process. Furthermore, at the level of single neurons with complex morphologies, I will demonstrate how the spatial distribution of excitatory and inhibitory inputs across the dendritic tree affects both the stability of learned orientation preference and the sharpness of tuning.

Abstract: Excitatory-inhibitory balance governs how information is processed and transmitted through cortical circuits. In this talk, we show how the input-output properties of a balanced circuit can be naturally explained by control theory. We explain how excitatory-inhibitory interactions sculpt network controllability through the lens of balanced amplification. We capture the spatiotemporal features of cell-type specific control in spatially extended networks by introducing two new observables: controllability horizon and Gramian dimensionality. Extending these insights to networks trained on cognitive tasks, we highlight control theory as a unifying principle for understanding both information processing and causal circuit manipulations.
Abstract: Spatial information is encoded by location-dependent hippocampal place cell firing rates and sub-second, rhythmic entrainment of spike times. These rate and temporal codes have primarily been characterized in low-dimensional environments under limited cognitive demands, but how is coding configured in complex environments, when individual place cells signal several locations, individual locations contribute to multiple routes, and functional demands vary? Quantifying CA1 population dynamics of male rats during a decision-making task, we show that the phase of individual place cell spikes relative to the local theta rhythm shifts to differentiate activity in different place fields. Theta phase coding also disambiguates repeated visits to the same location during different routes, particularly preceding spatial decisions. Using unsupervised detection of cell assemblies alongside theoretical simulations, we show that integrating rate and phase coding mechanisms dynamically recruits units to different assemblies, generating spiking sequences that disambiguate episodes of experience and multiplex spatial information with cognitive context.
Abstract: Depending on the question at hand, different neuron models are chosen to model the activity in neural networks; among the most basic descriptions, for example, are rate neurons and binary neurons. Because the mathematical frameworks for these descriptions are traditionally quite different, a direct comparison has been difficult so far. Here, we develop a generalized path-integral framework to study neuron models on the same footing, the only requirement being that the neuron is described by some input-output mechanism [Kühn, Keup, Dahmen & Helias, PRX 2021]. In this framework, we investigate the statistics of networks of binary and rate neurons, finding that they can be matched given the right amount of noise. The dynamics, however, is qualitatively different: whereas the dynamics of the rate network is regular for weak recurrent connectivity and chaotic for strong enough recurrency, it is always chaotic for the binary network, at least in the thermodynamic limit. Given that chaos is often considered detrimental for computation, it might appear that binary neurons are, in this sense, of little use. However, we show here that for classification tasks it is precisely the chaotic dynamics that leads to a (transient) improvement of performance, because distances between data points within classes are less strongly enhanced by the expansive dynamics than the (larger) distances between different classes. Only on long time scales does the chaotic activity mix the trajectories of all initial conditions and render classification impossible. The same mechanism can be observed for networks with other types of model neurons in their respective chaotic regimes.
We conclude with an outlook on how these theoretical considerations relate to recent experimental results: in Neuropixels data from mice, the decorrelation of neural trajectories in a binary network, together with the population firing rate, explains the decay of separability over time between visual and tactile stimuli.
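The transient benefit of chaotic expansion for classification can be glimpsed in a toy simulation (a minimal sketch in the spirit of classic random rate networks, not the path-integral framework of the talk; all parameter values are illustrative): two nearly identical initial conditions of a chaotic rate network are pulled apart by the expansive dynamics.

```python
import numpy as np

def simulate(J, x0, n_steps=2000, dt=0.05):
    """Euler-integrate the rate dynamics dx/dt = -x + J @ tanh(x)."""
    x = x0.copy()
    for _ in range(n_steps):
        x = x + dt * (-x + J @ np.tanh(x))
    return x

rng = np.random.default_rng(0)
N, g = 200, 2.0                          # gain g > 1: chaotic regime at large N
J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))

x0 = rng.standard_normal(N)
x1 = x0 + 1e-6 * rng.standard_normal(N)  # almost identical initial condition

d_init = float(np.linalg.norm(x1 - x0))
d_final = float(np.linalg.norm(simulate(J, x1) - simulate(J, x0)))
# the tiny initial separation is strongly amplified by the chaotic dynamics
```

Within a class, points that start close stay comparatively close for a while, which is what can produce a transient classification advantage before the mixing described above sets in.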
Abstract: Everyday behavior requires processing temporally extended sequences of stimuli, an ability that critically relies on Working Memory (WM). Yet, how WM supports the encoding and retrieval of a sequence of stimuli is presently unknown. Existing models rely on associative learning to encode or retrieve temporal information and are thus unable to explain how people can reproduce short, but otherwise arbitrary (i.e., novel), sequences of stimuli immediately. Here, we propose that both the stimuli and their relative times of occurrence are rapidly encoded by transient, non-associative synaptic plasticity acting over multiple time scales. To substantiate this proposal, we extend our previously proposed synaptic theory of WM to include synaptic augmentation, besides short-term depression and facilitation, consistently with experimental observations. Like facilitation, augmentation builds up with repetitive pre-synaptic activation but persists for much longer. We find that the long time scales associated with augmentation naturally lead to the emergence of a temporal gradient in the synaptic efficacies. This gradient, in turn, can be easily read out and used to immediately replay, at normal speed or in a time-compressed way, novel sequences. The theory proposed is consistent with multiple behavioral and neurophysiological observations.
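As a schematic illustration of the core idea (a toy sketch, not the actual model of the talk: the update rules, jump sizes, and time constants below are invented for illustration), a facilitation-like variable `u` and an augmentation-like variable `a` are both pumped up by a presynaptic burst, but `a` decays on a much longer time scale and thus retains a trace of the burst after `u` has relaxed back to baseline:

```python
import numpy as np

# Hypothetical constants (not from the talk): facilitation relaxes in
# ~1.5 s, augmentation in ~20 s.
U, tau_f, tau_a = 0.2, 1.5, 20.0
dt = 0.01
spike_times = np.arange(0.0, 1.0, 0.05)   # a 1-s presynaptic burst at 20 Hz

def run(t_end):
    """Return (u, a) at time t_end after the burst starts."""
    u, a = U, 0.0
    t, i = 0.0, 0
    while t < t_end:
        if i < len(spike_times) and t >= spike_times[i]:
            u += U * (1.0 - u)            # facilitation jump at each spike
            a += 0.1 * (1.0 - a)          # augmentation jump at each spike
            i += 1
        u += dt * (U - u) / tau_f         # fast relaxation to baseline U
        a += dt * (0.0 - a) / tau_a       # slow relaxation to zero
        t += dt
    return u, a

u_late, a_late = run(5.0)   # 4 s after the burst has ended
# facilitation is nearly back to baseline, augmentation is still large
```

A sequence of such bursts, one per stimulus, would leave augmentation traces of graded strength, which is the kind of temporal gradient the abstract refers to.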
Abstract: Dendritic morphology is a key indicator of cell class; however, compared to point-neuron models, there are relatively few analytical results on the role of space in the integration of synaptic input. Here, a feedforward cascade of partial differential equations is presented that describes the voltage-statistic dynamics of a minimal dendritic model. Though minimalistic, the model exhibits interesting properties such as a fast response to modulated afferent drive and morphological resonances. Moving to the level of interconnected networks, a space-time rate-based model is derived from an underlying stochastic network comprised of spiking neurons. A biophysically reasonable choice of spatial coupling is chosen that includes distance dependencies for both amplitude and signal delay. The resulting integro-differential representation reduces to a partial-differential equation for the network rate featuring both wavelike and diffusive components. This approach provides a tractable framework and efficient numerical scheme for analysing the network response to spatiotemporal stimulation patterns.
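For readers unfamiliar with spatial neuron models, the flavour of the dendritic part can be conveyed by the classic passive cable equation, tau dV/dt = -V + lam^2 d2V/dx2 (a generic textbook sketch, not the feedforward PDE cascade of the talk; grid sizes and constants are illustrative): a localized input both leaks away and spreads along the cable.

```python
import numpy as np

# Passive cable, tau * dV/dt = -V + lam**2 * d2V/dx2, integrated with
# explicit finite differences on a 1-D grid (illustrative units and sizes).
tau, lam = 20.0, 1.0
dx, dt = 0.1, 0.01
n_x, n_t = 200, 2000

V = np.zeros(n_x)
V[n_x // 2] = 1.0                 # brief localized input at the midpoint

for _ in range(n_t):
    lap = (np.roll(V, 1) - 2.0 * V + np.roll(V, -1)) / dx**2
    lap[0] = lap[-1] = 0.0        # crude sealed-end boundaries
    V = V + dt * (-V + lam**2 * lap) / tau
# the pulse has decayed through the leak and diffused along the cable
```

The explicit scheme is stable here because dt * lam**2 / (tau * dx**2) = 0.05 is well below the usual 1/2 threshold for diffusion problems.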
Abstract: Mean-field models [1] of neural population dynamics are central to theoretical neuroscience. However, cortical columns consist of a finite number of neurons ($N = 10^2$–$10^4$), requiring realistic models to account for finite-size fluctuations [2,3]. This endogenous noise can induce transitions and coherence, phenomena well studied in isolated or coupled populations but less understood in spatially extended systems. We investigate a two-dimensional cortical field, where each lattice node is a population of N excitatory spiking neurons. Each neuron has a membrane potential V that integrates, with leakage, the input current due both to the pre-synaptic barrage of spikes it receives and to the inhibitory potassium current responsible for spike-frequency adaptation [4,5]. Populations are interconnected with a probability that decays exponentially with distance. For optimal population sizes N, finite-size fluctuations coupled with spatial interactions generate coherent oscillations absent in the thermodynamic limit ($N \to \infty$). We characterize this novel noise-induced phase transition using standard tools from non-equilibrium statistical physics and explore the system's dynamics in the bifurcation diagram of local excitability versus adaptation strength. Our findings depend primarily on global and local connectivity parameters, rather than single-neuron details; we therefore expect them to be general and ubiquitous. Lastly, based on these results, we use our cortical field model to address the transition from sleep, dominated by slow global waves, to the asynchronous state characteristic of wakefulness, offering insights into this fundamental problem in neuroscience and challenging the current understanding [6,7].
[1] Brunel, N., and Hakim, V., Fast global oscillations in networks of integrate-and-fire neurons with low firing rates, Neural Comput. 11.7 (1999) 1621-1671.
[2] Vinci, G. V., Benzi, R., and Mattia, M., Self-consistent stochastic dynamics for finite-size networks of spiking neurons, Phys. Rev. Lett. 130.9 (2023) 097402.
[3] Mattia, M., and Del Giudice, P., Population dynamics of interacting spiking neurons, Phys. Rev. E 66.5 (2002) 051917.
[4] Gigante, G., Mattia, M., and Del Giudice, P., Diverse population-bursting modes of adapting spiking neurons, Phys. Rev. Lett. 98.14 (2007) 148101.
[5] Mattia, M., and Sanchez-Vives, M. V., Exploring the spectrum of dynamical regimes and timescales in spontaneous cortical activity, Cogn. Neurodyn. 6.3 (2012) 239-250.
[6] Sanchez-Vives, M. V., Massimini, M., and Mattia, M., Shaping the default activity pattern of the cortical network, Neuron 94.5 (2017) 993-1001.
[7] di Santo, S., Villegas, P., Burioni, R., and Muñoz, M. A., Landau-Ginzburg theory of cortex dynamics: Scale-free avalanches emerge at the edge of synchronization, Proc. Natl. Acad. Sci. USA 115.7 (2018) E1356-E1365.

Abstract: Spike-frequency adaptation (SFA) is a fundamental neuronal mechanism that accounts for the fatigue due to spike emission and the consequent reduction of firing activity. We have studied the effect of this adaptation mechanism on the macroscopic dynamics of excitatory and inhibitory networks of quadratic integrate-and-fire (QIF) neurons coupled via exponentially decaying post-synaptic potentials [1]. In particular, we have studied the population activities by employing an exact mean-field reduction, which gives rise to next-generation neural mass models. This low-dimensional reduction allows for the derivation of bifurcation diagrams and the identification of the possible macroscopic regimes emerging both in a single and in two identically coupled neural masses.
In single populations, SFA favors the emergence of population bursts in excitatory networks, while it hinders tonic population spiking in inhibitory ones. The symmetric coupling of two neural masses, in the absence of adaptation, leads to the emergence of macroscopic solutions with broken symmetry, namely chimera-like solutions in the inhibitory case and antiphase population spikes in the excitatory one. The addition of SFA leads to new collective dynamical regimes exhibiting cross-frequency coupling (CFC) between the fast synaptic timescale and the slow adaptation one, ranging from antiphase slow-fast nested oscillations to symmetric and asymmetric bursting phenomena. The analysis of these CFC rhythms in the $\theta$–$\gamma$ range has revealed that a reduction of SFA leads to an increase of the $\theta$ frequency accompanied by a decrease of the $\gamma$ one. This is analogous to what has been reported experimentally for the hippocampus and the olfactory cortex of rodents under cholinergic modulation, which is known to reduce SFA. In a PING configuration, where SFA affects the excitatory population only, it is possible to observe the emergence of relaxation oscillations due to the interplay between the nonlinear dynamics of the firing rate and the self-inhibition modulated by SFA. A characterization of Up and Down states, together with that of the spike-adding process, is provided in the PING configuration for different parameters.
[1] A. Ferrara, D. Angulo-Garcia, A. Torcini, and S. Olmi, Population spiking and bursting in next-generation neural masses with spike-frequency adaptation, Physical Review E 107(2), 024311 (2023).

Abstract: Cortical activity in vivo displays relaxational time scales much longer than the underlying neuronal and synaptic time scales. The mechanisms responsible for such slow dynamics are not understood. Here, we show that slow dynamics naturally, and robustly, emerges in dynamically balanced networks of spiking neurons. This only requires partial symmetry in the synaptic connectivity, a feature of local cortical networks observed in experiments. The symmetry generates an effective, excitatory self-coupling of the neurons that leads to long-lived fluctuations in the network activity, without destroying the dynamical balance. When the excitatory self-coupling is suitably strong, the same mechanism leads to multiple equilibrium states of the network dynamics. Our results reveal a novel dynamical regime of the collective activity in spiking networks, a regime where the memory of the initial state persists for very long times and ergodicity is broken.
Abstract: Sensory information is integrated across a distributed brain network for processing and perception. Recent studies have identified distinct spatiotemporal patterns of cortical activation for early and late components of sensory-evoked responses, with the early component linked to stimulus features and the late component associated with perception. To explore how brain state influences sensory-evoked activation, we used isoflurane to modulate brain states and performed wide-field calcium imaging in Thy1-GCaMP6f mice during multi-whisker stimulation. Our findings show that anesthesia levels strongly impact the spatiotemporal features and functional connectivity (FC) of sensory-activated networks. As anesthesia levels decrease, responses become more complex, and the late component emerges, raising questions about the potential for perception during unconscious states. In a subsequent study, we explored the effects of anesthesia and age on cortical function in an ASD model. Indeed, brain network dysfunction is recognized as a central feature of autism spectrum disorders (ASDs), with altered functional connectivity being a key marker. We conducted a longitudinal study of FC across three brain states and three ages in the Shank3b mouse model of autism. Using wide-field calcium imaging, we monitored cortical activity in Shank3b mutant mice from late development (P45) to adulthood (P90). Our results show that network hyperconnectivity emerges in juvenile Shank3b mice and expands to the dorsal cortex in adulthood. Notably, FC imbalances are more pronounced in awake states and shift toward hypoconnectivity under anesthesia. These findings highlight the importance of anesthesia in detecting autism-related FC alterations and suggest a potential target for non-invasive treatments of ASD.
Abstract: Neural dynamics is determined by the transmission of discrete synaptic pulses (synaptic shot noise) among neurons. However, neural responses are usually obtained within the diffusion approximation, which models synaptic inputs as continuous Gaussian noise. Here, we present a rigorous mean-field theory that encompasses synaptic shot noise for sparse balanced inhibitory neural networks driven by an excitatory drive. Our theory predicts new dynamical regimes, in agreement with numerical simulations, which are not captured by the classical diffusion approximation. Notably, these regimes feature self-sustained global oscillations emerging at low connectivity (in-degree) via either continuous or hysteretic transitions and characterized by irregular neural activity, as expected for balanced dynamics. For sufficiently weak (strong) excitatory drive (inhibitory feedback) the transition line displays a peculiar re-entrant shape revealing the existence of global oscillations at low and high in-degrees, separated by an asynchronous regime at intermediate levels of connectivity. The mechanisms leading to the emergence of these global oscillations are distinct: drift-driven at high connectivity and cluster activation at low connectivity. The frequency of these two kinds of global oscillations can be varied from slow (around 1 Hz) to fast (around 100 Hz), without altering their microscopic and macroscopic features, by adjusting the excitatory drive and the synaptic inhibition strength in a prescribed way. Furthermore, the cluster-activated oscillations at low in-degrees could correspond to the gamma rhythms reported in mammalian cortex and hippocampus and attributed to ensembles of inhibitory neurons sharing few synaptic connections [G. Buzsaki and X.-J. Wang, Annual Review of Neuroscience 35, 203 (2012)].
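The difference between discrete synaptic pulses and their Gaussian (diffusion) approximation shows up already in a single leaky integrator (a toy illustration, not the mean-field theory of the abstract; all parameters are invented): inhibitory shot noise produces a visibly skewed stationary voltage distribution, whereas the variance-matched Gaussian approximation would be symmetric.

```python
import numpy as np

rng = np.random.default_rng(1)

tau = 0.02        # membrane time constant (s)
J = -0.5          # inhibitory pulse amplitude
nu = 250.0        # presynaptic rate (Hz); nu * tau = 5 pulses per tau
dt, n_steps, burn_in = 1e-4, 400_000, 50_000

counts = rng.poisson(nu * dt, size=n_steps)   # discrete pulse arrivals
V, Vs = 0.0, np.empty(n_steps)
for step in range(n_steps):
    V = V + dt * (-V / tau) + J * counts[step]
    Vs[step] = V
samples = Vs[burn_in:]                        # discard the transient

mean_th = J * nu * tau                        # stationary mean (exact)
skew = float(((samples - samples.mean()) ** 3).mean() / samples.std() ** 3)
# shot noise: the skewness is clearly negative; replacing the pulses by
# Gaussian noise with matched mean and variance would give zero skewness
```

With few large pulses per membrane time constant the mismatch is pronounced; increasing `nu` while shrinking `J` drives the shot-noise statistics toward the Gaussian limit.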
Abstract: Nerve impulses, the currency of information flow in the brain, are generated by an instability of the neuronal membrane potential dynamics. Neuronal circuits exhibit collective chaos that appears essential for learning, memory, sensory processing, and motor control. However, the factors controlling the nature and intensity of collective chaos in neuronal circuits are not well understood. Here we use computational ergodic theory to demonstrate that basic features of nerve impulse generation profoundly affect collective chaos in neuronal circuits. Numerically exact calculations of Lyapunov spectra, Kolmogorov-Sinai entropy, and upper and lower bounds on attractor dimension show that changes in nerve impulse generation in individual neurons moderately impact information encoding rates but qualitatively transform phase space structure. Specifically, we find a drastic reduction in the number of unstable manifolds, Kolmogorov-Sinai entropy, and attractor dimension. Beyond a critical point, marked by the simultaneous breakdown of the diffusion approximation, a peak in the largest Lyapunov exponent, and a localization transition of the leading covariant Lyapunov vector, networks exhibit sparse chaos: prolonged periods of near stable dynamics interrupted by short bursts of intense chaos. Analysis of large, more realistically structured networks supports the generality of these findings. In cortical circuits, biophysical properties appear tuned to this regime of sparse chaos. Our results reveal a close link between fundamental aspects of single-neuron biophysics and the collective dynamics of cortical circuits, suggesting that nerve impulse generation mechanisms are adapted to enhance circuit controllability and information flow.
Abstract: To understand the emergent dynamics of macroscopic neural activity, we need simple coarse-grained models that link to the network of spiking neurons at the microscopic scale. Coarse-grained models, consistent with microscopic dynamics, are also essential for simulating large-scale brain activity. The intermediate mesoscopic scale of finite-size networks is particularly critical as it captures fluctuations. However, deriving simple coarse-grained models for biologically plausible, finite-size spiking networks with pronounced spike-history effects (refractoriness, adaptation) remains a largely unresolved theoretical problem. In my presentation I will introduce a low-dimensional stochastic mean-field model for finite-size networks of integrate-and-fire spiking neurons. I will begin by reviewing an accurate stochastic integral equation for the population activity of these networks with and without spike-frequency adaptation (Schwalger et al. 2017). I will then demonstrate how this equation can be used to derive low-dimensional stochastic differential equations for neural population dynamics. The theoretical reduction is based on a novel eigenfunction expansion. The low-dimensional neural-mass model successfully reproduces the non-stationary dynamics and fluctuations of mesoscopic population activities, enabling rapid simulations of large cortical circuits at the mesoscopic scale.
Abstract: The ability to make decisions according to context is a hallmark of intelligent behavior. The prefrontal cortex (PFC) is known for processing contextual information, but many questions remain open. This is especially the case for "strategic" behavior, where the context follows from abstract rules rather than dedicated input cues. Here, we report on modeling results for the neural basis of two strategies called "repeat-stay" and "change-shift", respectively. These strategies have been observed in monkeys performing certain types of context-dependent tasks; in the task discussed here, one of three targets must be chosen based on an instruction stimulus and the outcome of previous trials. The same stimulus may instruct different decisions and the same decision may result from different stimuli, requiring the ability to develop strategic rules that span multiple trials. We found that the activity of populations of PFC neurons undergoes sharp transitions across latent neural states encoding task variables such as strategy, decision, action, reward, and previous-trial decisions. We built and compared two models able to perform this task: a recurrent neural network (RNN) trained via backpropagation through time, and a multi-modular spiking network (MMSN) containing realistic ingredients of real cortical networks. Both models successfully attain levels of behavioral performance comparable to the monkeys'; however, the RNN seems to learn specific combinations of task conditions, while the MMSN more closely reflects the strategies used by the monkeys. In particular, the MMSN provides an explanation of the idiosyncratic bias observed in "free choice" trials, a subset of trials wherein the subjects can exercise a free choice between two equally valid options. In the MMSN, free choice resembles a biased stochastic process driven by fast spiking noise generated internally by the network dynamics, arising from heterogeneities in both connectivity and input distributions.
In contrast, the RNN follows a stereotyped sequence of choices, resulting in the absence of variability in free choice trials. The MMSN also reproduces the sequence of latent states observed in the activity of PFC neurons. These states are separated by sharp transitions that are mostly driven by task events, but can occasionally occur spontaneously due to the fast spiking noise generated by the network. These temporally misplaced transitions provide an explanation for specific behavioral errors. In summary, the combination of realistic modeling features, including a modular structure, neural interaction through spikes, and heterogeneities in both connectivity and input distributions, provides a mechanistic explanation of strategic behavior in context-dependent tasks. This is joint work with Tianshu Li, Fabrizio Ceccarelli and Aldo Genovesio.
Abstract: Over the last three decades, the field of brain connectivity has explored the function of the white matter. Beyond the specialization of cortical regions, the cortex has been found to be organised into a modular and hierarchical architecture that supports the coexistence of segregation and integration of information. Different attempts to explain the emergence of modularity in neural networks have been proposed in the past, from different angles and under different assumptions. Here, we introduce a model constrained by minimal biologically plausible ingredients (e.g., local STDP) and aimed at reproducing realistic behaviour (e.g., stable asynchronous irregular activity pre- and post-training). In particular, we construct a network of quadratic integrate-and-fire spiking neurons, subjected to stimuli focused on different populations. We find that only the combination of Hebbian and anti-Hebbian inhibitory plasticity allows the formation of a stable modular organization in the network. During the training phase, the two inhibitory neuronal populations spontaneously take on differentiated functional roles: the anti-Hebbian inhibitory populations mediate memory selectivity, while the Hebbian inhibitory neurons balance the firing rate of the network (homeostasis), without the need for additional homeostatic rules. These results mirror empirical evidence reporting that, for example, PV and SOM interneurons mediate homeostasis and stimulus selectivity, respectively. Due to its simplicity and its small number of assumptions, the introduced model represents a starting point for further investigations of the role played by STDP in shaping the brain via memory storage and maintenance.
Abstract: Spiking neural network models are traditionally downscaled in terms of their numbers of neurons and synapses. However, this distorts the corresponding predictions of network activity. Leveraging the simulation technology of NEST and the availability of supercomputers, we develop point neuron spiking network models of the cerebral cortex with the full density of neurons and synapses in each local circuit. The models distinguish excitatory and inhibitory neural populations in the different cortical layers and focus on resting-state activity. We find our way from a model of all vision-related areas in a hemisphere of macaque cortex to a version of this model with joint excitatory/inhibitory clusters, and a model of a human cortical hemisphere. Furthermore, I will present two recent models of local cortical microcircuits. The combined work shows how network properties such as synaptic strengths, clustered connectivity, and target specificity of connections influence the ongoing spiking activity.
The first Dynamics On and Of Complex Networks workshop (DOOCN I) took place in Dresden, Germany, on 4th October 2007, as a satellite workshop of the European Conference on Complex Systems 07. The workshop received a large number of quality submissions from authors pursuing research in multiple disciplines, making the forum truly interdisciplinary. Around 20 speakers discussed the dynamics on and of different systems exhibiting a complex network structure, from biological, linguistic, and social systems to various technological systems like the Internet, the WWW, and peer-to-peer systems. The organizing committee published some of the highest-quality original submissions as an edited volume from Birkhäuser, Boston, describing the contemporary state of research in complex networks.
After the success of DOOCN I, the organizers launched Dynamics On and Of Complex Networks – II (DOOCN II), a two-day satellite workshop of the European Conference on Complex Systems 08. DOOCN II was held in Jerusalem, Israel, on 18th and 19th September 2008.
DOOCN III was held as a satellite of ECCS 2009 at the University of Warwick, UK, on 23rd and 24th September. In continuation, DOOCN IV was held, again as a satellite of ECCS 2010, at the University Institute Lisbon, Portugal, on 16th September.
DOOCN V was held as a satellite of ECCS 2011 at the University of Vienna on 14th–15th September 2011.
DOOCN VI took place in Barcelona, as a satellite of ECCS 2013, and focused on Semiotic Dynamics in time-varying social media. Like DOOCN I, the other five DOOCN workshops attracted a large number of participants, including prominent scientists in the field.
DOOCN VII, held in Lucca as a satellite of ECCS 2014, focused on Big Data aspects. DOOCN VIII was held in Zaragoza, also with a focus on Big Data aspects.
The 9th edition of DOOCN was held in Amsterdam at the Conference on Complex Systems (CCS) with the theme “Mining and learning for complex networks”.
The 2017 edition of DOOCN was held in Indianapolis, USA, in conjunction with NetSci 2017.
The 2018 edition, DOOCN XI, was held in Thessaloniki, Greece, at the Conference on Complex Systems (CCS) with the theme “Machine learning for complex networks”.
The 2019 edition, DOOCN XII, was held in Burlington, Vermont, USA, in conjunction with NetSci 2019 with the theme “Network Representation Learning”.
The 2020 edition, DOOCN XIII, was held online in conjunction with NetSci 2020 with the theme “Network Learning”.
The 2023 edition, DOOCN XIV, was held online in conjunction with Statphys28 with the theme “Cascading Failures in Complex Networks”.
The 2024 edition, DOOCN XV, was held in conjunction with Compeng 2024 with the theme “Dynamical Multiscale Engineering of Network Architecture”.
The organizing committees of the DOOCN workshop series have published three Birkhäuser book volumes based on selected talks from the series.