Research Article

The Relationship between Sparseness and Energy Consumption of Neural Networks

Guanzheng Wang,1 Rubin Wang,1,2 Wanzeng Kong,2 and Jianhai Zhang2

1 Institute for Cognitive Neurodynamics, School of Science, East China University of Science and Technology, 130 Meilong Road, Shanghai 200237, China
2 Key Laboratory of Brain Machine Collaborative Intelligence of Zhejiang Province, Hangzhou Dianzi University, Zhejiang, China

Correspondence should be addressed to Rubin Wang; rbwang@ecust.edu.cn

Received 3 August 2020; Accepted 29 September 2020; Published 25 November 2020

Academic Editor: J. Michael Wyss

Copyright © 2020 Guanzheng Wang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Signaling accounts for about 50–80% of the total energy consumed by neural networks. A network with many active neurons consumes much energy; a network with few active neurons consumes very little. The ratio of active neurons to all neurons in a network, that is, its sparseness, therefore affects the network's energy consumption. Laughlin's studies show that the sparseness of an energy-efficient code depends on the balance between signaling and fixed costs. However, Laughlin gave neither an exact ratio of signaling to fixed costs nor the ratio of active neurons to all neurons in the most energy-efficient neural networks. In this paper, we calculated the ratio of signaling costs to fixed costs from physiological experimental data and found it to lie between 1.3 and 2.1. We then calculated the ratio of active neurons to all neurons in the most energy-efficient neural networks and found it to lie between 0.3 and 0.4.
Our results are consistent with the data from many relevant physiological experiments, indicating that the model used in this paper may capture neural coding under real conditions. The calculation results of this paper may be helpful to the study of neural coding.

1. Introduction

Recent studies have shown that the firing of a single neuron is sufficient to influence learning and behavior [1, 2]. This result challenges the long-standing understanding that a behavioral response requires the firing of thousands of neurons. These findings provide the basis and support for the "sparse coding" hypothesis, which argues that a small number of neurons are enough to encode information [3–5]. In a sparse coding mode, only a small fraction of neurons are activated during signaling, and most neurons are responsible only for network connectivity [6–8]. Since sparse coding requires only a few firing neurons and little energy, it is an energy-efficient neural coding method [9, 10]. This energy-efficient coding pattern increases the ratio of neuron-encoded information and greatly improves energy efficiency [11, 12]. Although the sparse coding hypothesis for neural networks in the cerebral cortex has not yet been confirmed, it has been shown that sparse coding maximizes energy efficiency [13–15].

Wang et al. studied the information carried by neurons and the energy they consume [13]. They found that neurons are not most energy-efficient when coding the maximum information, and that the ratio of signaling to fixed costs affects the total energy consumed by neurons. Laughlin studied the sparseness and representational capabilities of neural networks [14] and found that, when neural networks have similar representational capabilities, the sparseness of the most energy-efficient coding pattern depends on the ratio of signaling to fixed costs. However, Wang et al. and Laughlin did not consider the exact ratio of signaling to fixed costs.
Wang et al.'s study assumes that the ratio of signaling to fixed costs is between 10 and 200, and Laughlin studied only three cases, with ratios of 1, 10, and 100.

Hindawi Neural Plasticity, Volume 2020, Article ID 8848901, 13 pages, https://doi.org/10.1155/2020/8848901
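The link between the cost ratio and the optimal sparseness quoted in the abstract can be illustrated numerically with a Laughlin-style efficiency measure: representational capacity per unit energy, taking capacity as the binary entropy of a neuron's activation probability. The specific functional form below (entropy divided by fixed-plus-signaling cost, with signaling cost normalized to 1 per active neuron) is our illustrative assumption in the spirit of this framework, not an equation reproduced from the paper:

```python
import numpy as np

def binary_entropy(s):
    # Shannon entropy (bits) of a neuron that is active with probability s
    return -s * np.log2(s) - (1 - s) * np.log2(1 - s)

def optimal_sparseness(fixed_to_signal):
    # Maximize efficiency = capacity / energy = H(s) / (fixed + s),
    # with the per-neuron signaling cost normalized to 1, by a grid search.
    s = np.linspace(0.001, 0.999, 9999)
    efficiency = binary_entropy(s) / (fixed_to_signal + s)
    return s[np.argmax(efficiency)]

# Signaling-to-fixed cost ratios of 1.3 and 2.1 (the paper's estimated range),
# i.e., fixed-to-signaling ratios of 1/1.3 and 1/2.1.
for ratio in (1.3, 2.1):
    print(ratio, round(optimal_sparseness(1 / ratio), 3))
```

Under this toy model, both cost ratios yield an optimal sparseness in the 0.3–0.4 range, consistent with the abstract's reported result; a cheaper signaling cost relative to fixed cost pushes the optimum toward denser activity.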