Research
Journal papers
X.-Y. Zhang#,*, H. Lin#, Z. Deng, M. Siegel, E. K. Miller and G. Yan. Communications Biology, 2026. (# equal contribution; * corresponding author)
While neural trajectories often appear as low-dimensional manifolds at a global scale, they are locally highly tangled. Much like the fibers of a rope, what appears as a smooth curve from afar reveals intricate interweaving upon closer inspection. We show that capturing these fine-grained features requires high-dimensional embeddings and nonlinear transformations, which enable our ANN model to spontaneously align with the functional specialization of brain regions without any prior functional knowledge.
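To make the rope analogy concrete, here is a minimal sketch (synthetic data, not the paper's analysis pipeline): a trajectory that is essentially a 2-D loop globally but carries small, fast "wiggles" spread across many ambient dimensions, with effective dimensionality measured by the PCA participation ratio. The 50-dimensional embedding and wiggle amplitudes are illustrative choices.

```python
# Minimal sketch (not the paper's pipeline): a trajectory that is
# low-dimensional globally can still be high-dimensional locally.
import numpy as np

def participation_ratio(X):
    """Effective dimensionality of points X (n_samples x n_features)."""
    lam = np.linalg.eigvalsh(np.cov(X.T))      # covariance eigenvalues
    return lam.sum() ** 2 / (lam ** 2).sum()

rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 5000)

# Hypothetical "rope": a circle (globally ~2D) plus fast small-amplitude
# wiggles spread across many ambient dimensions (local tangling).
n_dim = 50
X = np.zeros((t.size, n_dim))
X[:, 0], X[:, 1] = np.cos(t), np.sin(t)
for d in range(2, n_dim):
    X[:, d] = 0.05 * np.sin((50 + d) * t + rng.uniform(0, 2 * np.pi))

print("global PR:", participation_ratio(X))      # stays close to 2
window = X[:100]                                 # a short local segment
print("local  PR:", participation_ratio(window)) # substantially higher
```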
X.-Y. Zhang and C. Tang. PNAS, 2025.
Artificial neural networks can adapt to tasks while freely exploring possible solutions, similar to how humans balance curiosity with goal-driven behavior. We show that during training, such networks naturally operate near a critical state. This state emerges from a balance between randomness and task relevance and leaves measurable signatures, including stable power-law statistics in parameter updates and multiscale patterns in the geometry of the loss landscape. Our findings reveal that neural network learning is a nonequilibrium process shaped by fundamental statistical principles, offering a general explanation for scaling laws in parameter updates and guiding the design of more interpretable and efficient intelligent systems.
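As a hedged illustration of the kind of measurement behind these signatures, the sketch below trains a toy MLP with SGD, records per-step parameter-update magnitudes, and crudely estimates a tail exponent from the survival function on log-log axes. The architecture, task, and fitting procedure are illustrative assumptions, not the authors' setup.

```python
# Hedged sketch: inspect the tail of per-step parameter-update
# magnitudes during SGD training. Model, data, and the log-log fit
# are illustrative choices, not the paper's exact setup.
import numpy as np
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 1))
opt = torch.optim.SGD(model.parameters(), lr=0.05)
loss_fn = nn.MSELoss()

updates = []
for step in range(2000):
    x = torch.randn(32, 20)
    y = x[:, :1] * x[:, 1:2]                  # a simple nonlinear target
    before = [p.detach().clone() for p in model.parameters()]
    opt.zero_grad()
    loss_fn(model(x), y).backward()
    opt.step()
    for p, b in zip(model.parameters(), before):
        updates.append((p.detach() - b).abs().flatten())
u = torch.cat(updates).numpy()
u = u[u > 0]

# Crude tail-exponent estimate: slope of the survival function on
# log-log axes over the largest 1% of updates (a Hill-type estimator
# would be more principled).
tail = np.sort(u)[-u.size // 100:]
ranks = np.arange(tail.size, 0, -1) / u.size
slope = np.polyfit(np.log(tail), np.log(ranks), 1)[0]
print(f"tail exponent estimate: {-slope:.2f}")
```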
X.-Y. Zhang*, S. Bobadilla-Suarez, X. Luo, M. Lemonari, S. L. Brincat, M. Siegel, E. K. Miller and B. C. Love*. Nature Communications, 2025. (* corresponding author)
Although effects were most prominent in frontal areas, representations stretched along task-relevant dimensions in all sites considered: V4, MT, lateral prefrontal cortex (PFC), frontal eye fields (FEF), lateral intraparietal cortex (LIP), and inferotemporal cortex (IT). Spike timing was crucial to this neural code. A deep learning model trained on the same visual input and rewards as the monkeys, despite lacking selective attention or any other explicit control mechanism, also stretched its representations along task-relevant dimensions simply by minimizing error during learning, indicating that stretching is an adaptive strategy.
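A minimal sketch of how such "stretching" can be quantified, under assumed choices (a tiny network and a task where only feature 0 determines reward, neither from the paper): after training, hidden-layer representations should move farther for stimulus changes along the task-relevant feature than along the irrelevant one.

```python
# Hedged sketch of the "stretching" measurement. Architecture and task
# are illustrative, not the paper's.
import torch
import torch.nn as nn

torch.manual_seed(0)
hidden = nn.Sequential(nn.Linear(2, 32), nn.Tanh())
head = nn.Linear(32, 1)
params = list(hidden.parameters()) + list(head.parameters())
opt = torch.optim.Adam(params, lr=1e-2)

for _ in range(2000):
    x = torch.rand(64, 2)                     # two stimulus features in [0, 1]
    y = (x[:, :1] > 0.5).float()              # reward depends on feature 0 only
    opt.zero_grad()
    nn.functional.binary_cross_entropy_with_logits(head(hidden(x)), y).backward()
    opt.step()

def rep_distance(a, b):
    """Mean hidden-layer distance between two stimulus batches."""
    with torch.no_grad():
        return (hidden(a) - hidden(b)).norm(dim=1).mean().item()

probe = torch.rand(1000, 2)
step = torch.zeros(1000, 2)
step[:, 0] = 0.2                              # move along the relevant feature
print("relevant  :", rep_distance(probe, probe + step))
step = torch.zeros(1000, 2)
step[:, 1] = 0.2                              # move along the irrelevant one
print("irrelevant:", rep_distance(probe, probe + step))
```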
X.-Y. Zhang#,*, Y. Yao#, Z. Han and G. Yan. Communications Physics, 2025. (# equal contribution; * corresponding author)
Most real-world complex networks are spatially embedded, with nodes positioned in physical space. In such systems, distance-dependent connectivity shapes not only the efficiency of information transmission but also the network's robustness and vulnerability. Here, we systematically examine how spatial distance influences network robustness using k-core percolation across two models of long-range connectivity. For structurally connected components, we find that long-range connectivity can trigger explosive phase transitions, but with a delayed threshold. For spatial neighborhoods, two core phenomena emerge: spatial diffusion and clustering. Across connectivity models, spatial neighborhoods consistently exhibit self-organized criticality throughout the percolation process.
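For readers unfamiliar with k-core percolation, here is a minimal sketch on a spatially embedded graph: a random geometric graph plus sparse long-range links drawn with a distance-decaying probability, followed by random node removal and iterative pruning of nodes with degree below k. The power-law kernel and all parameters below are illustrative choices, not the paper's exact models.

```python
# Hedged sketch: k-core percolation on a spatially embedded graph.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
n, k = 2000, 3
G = nx.random_geometric_graph(n, 0.04, seed=0)        # short-range edges
pos = nx.get_node_attributes(G, "pos")

# Add sparse long-range links with a distance-decaying probability
# (illustrative kernel: p ~ 0.02 * d^-1.5).
for _ in range(4 * n):
    i, j = (int(v) for v in rng.integers(n, size=2))
    if i == j or G.has_edge(i, j):
        continue
    d = np.hypot(*(np.array(pos[i]) - np.array(pos[j])))
    if rng.random() < 0.02 * d ** -1.5:
        G.add_edge(i, j)

for f in np.linspace(0.0, 0.6, 13):                   # removal fraction
    H = G.copy()
    H.remove_nodes_from(rng.choice(n, size=int(f * n), replace=False))
    core = nx.k_core(H, k)                            # iterative pruning
    print(f"f = {f:.2f}  k-core size = {core.number_of_nodes()}")
```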
Geometric Scaling Law in Real Neuronal Networks
X.-Y. Zhang, J. M. Moore, X. Ru and G. Yan. Physical Review Letters, 2024. Editors' Suggestion; featured in Physics and in the Physical Review Letters Collection of the Year 2024
We uncovered a fundamental and elegant scaling law in synapse-resolution Drosophila connectomes. This discovery challenges the well-known exponential distance rule previously established in inter-areal brain networks and carries functional significance, consistent with maximum-entropy information communication and with functional criticality that balances integration and segregation. Our findings establish a direct link between brain geometry and topology, hinting at new opportunities for developing brain-geometry-inspired artificial intelligence.
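A hedged sketch of the model comparison at stake: a power law p(d) ∝ d^-α is a straight line in log-log space, while the exponential distance rule p(d) ∝ e^(-d/λ) is straight in lin-log space, so binned residuals can distinguish the two. Synthetic power-law distances stand in for the connectome measurements here.

```python
# Hedged sketch: contrast an exponential distance rule with a power law
# on binned distance data. Synthetic data, illustrative only.
import numpy as np

rng = np.random.default_rng(0)
d = rng.pareto(2.0, 20000) + 1.0              # synthetic power-law distances

bins = np.logspace(0, np.log10(d.max()), 30)
counts, edges = np.histogram(d, bins=bins, density=True)
mid = np.sqrt(edges[:-1] * edges[1:])         # geometric bin centers
mask = counts > 0

# Straight line in log-log space => power law; in lin-log space => exponential.
pl_resid = np.polyfit(np.log(mid[mask]), np.log(counts[mask]), 1, full=True)[1]
ex_resid = np.polyfit(mid[mask], np.log(counts[mask]), 1, full=True)[1]
print("power-law residual  :", pl_resid)
print("exponential residual:", ex_resid)      # larger => worse fit here
```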
X.-Y. Zhang, J. Sun and G. Yan. Physical Review Research, 2021.
We explored a general model of temporal networks and analytically proved that the weight variation of a link is equivalent to attaching a virtual driver node to that link. Consequently, the temporality of link weights can significantly increase the dimension of the controllable space and markedly reduce the control cost.
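To unpack "control cost": for a discrete-time linear network x_{t+1} = A_t x_t + B u_t, the minimum input energy to steer the state from the origin to a target x_f is x_f^T W^+ x_f, where W is the controllability Gramian. The 3-node network and the particular oscillating link weight below are illustrative assumptions; the sketch simply compares the cost under static versus time-varying weights.

```python
# Hedged sketch: minimum control energy via the controllability Gramian
# for a discrete-time linear network, static vs time-varying weights.
import numpy as np

def control_energy(A_seq, B, x_f):
    """Minimum-energy cost to steer 0 -> x_f over len(A_seq) steps."""
    n, T = B.shape[0], len(A_seq)
    W = np.zeros((n, n))
    for t in range(T):
        Phi = np.eye(n)                       # state transition from t+1 to T
        for s in range(t + 1, T):
            Phi = A_seq[s] @ Phi
        W += Phi @ B @ B.T @ Phi.T
    return float(x_f @ np.linalg.pinv(W) @ x_f)

A = np.array([[0.0, 0.5, 0.0],
              [0.0, 0.0, 0.5],
              [0.2, 0.0, 0.0]])
B = np.array([[1.0], [0.0], [0.0]])           # single driver node
x_f = np.ones(3)
T = 12

static = [A] * T
varying = []
for t in range(T):                            # oscillate one link weight
    At = A.copy()
    At[1, 2] = 0.5 + 0.4 * np.sin(t)
    varying.append(At)

print("static  cost:", control_energy(static, B, x_f))
print("varying cost:", control_energy(varying, B, x_f))
```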
X.-Y. Zhang, X. Ru, Z. Liu, J. M. Moore and G. Yan. Journal of Physics: Complexity, 2025.
X.-Y. Zhang*, G. Yan and J. M. Moore*. Journal of Complex Networks, 2025. (* corresponding author)
Z. Liu, X. Ru, J. M. Moore, X.-Y. Zhang and G. Yan. IEEE Transactions on Network Science and Engineering, 2024.
Conference papers
X. Ru, X.-Y. Zhang, Z. Liu, J. M. Moore and G. Yan. NeurIPS, 2023. Spotlight
X. Ru, J. M. Moore, X.-Y. Zhang, Y. Zeng and G. Yan. AAAI, 2023. Oral
Presentations
- Invited talk at the 8th National Conference on Statistical Physics and Complex Systems 2025, Ningbo, China.
  Heavy-tailed update arises from information-driven self-organization in non-equilibrium learning.
- Poster presentation at the 17th Annual Meeting of the Chinese Neuroscience Society 2024, Suzhou, China.
  Adaptive stretching of representations across brain regions and deep learning model layers.
- Oral presentation at the 20th China Networks Science Forum 2024, Beijing, China.
  Geometric scaling law in real neuronal networks.
- Spotlight at International Conference NeurIPS 2023, New Orleans, USA.
  Attentive transfer entropy to exploit transient emergence of coupling effect.
- Oral presentation at International Conference NetSci 2022, Shanghai, China.
  Link weight variation offers superiority in controlling temporal networks.
- Oral presentation at International Conference NetSci-X 2018, Hangzhou, China.
  Structural origin of co-susceptibility in cascading failures. We found that both structural closeness and high-order correlations can lead to co-susceptibility, prompting us to propose a new statistical quantity, based on structure alone, to assess the co-susceptibility of node pairs in an arbitrary network; a toy version of the measurement is sketched below.
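As a toy version of the co-susceptibility measurement (a simple threshold cascade on a random graph, standing in for the talk's cascading-failure model), one can run many cascades with random seed failures and correlate the failure indicators of node pairs:

```python
# Hedged sketch: empirical co-susceptibility from repeated cascades.
# The threshold cascade and all parameters are illustrative assumptions.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
G = nx.erdos_renyi_graph(200, 0.04, seed=0)
n, runs = G.number_of_nodes(), 500
fails = np.zeros((runs, n), dtype=bool)

for r in range(runs):
    failed = set(rng.choice(n, size=5, replace=False))  # random seed failures
    frontier = True
    while frontier:                           # propagate until no new failures
        frontier = False
        for v in range(n):
            if v in failed:
                continue
            nbrs = list(G.neighbors(v))
            if nbrs and sum(u in failed for u in nbrs) / len(nbrs) > 0.5:
                failed.add(v)
                frontier = True
    fails[r, list(failed)] = True

# Co-susceptibility of a pair = correlation of their failure indicators.
C = np.corrcoef(fails.T.astype(float))
i, j = np.unravel_index(np.nanargmax(C - np.eye(n)), C.shape)
print(f"most co-susceptible pair: ({i}, {j}), corr = {C[i, j]:.2f}")
```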