
A set of networks that satisfies given structural characteristics can be treated as a network ensemble. [1] Introduced by Ginestra Bianconi in 2007, the entropy of a network ensemble measures the level of order or uncertainty of the ensemble. [2]

The entropy of a network ensemble is the logarithm of the number of graphs in the ensemble. [3] Entropy can also be defined for a single network; basin entropy, for instance, is the logarithm of the number of attractors of a single Boolean network. [4]

Employing approaches from statistical mechanics, the complexity, uncertainty, and randomness of networks can be described by network ensembles subject to different types of constraints. [5]

Gibbs and Shannon entropy

By analogy with statistical mechanics, microcanonical and canonical ensembles of networks are introduced. A partition function Z of an ensemble can be defined as:

Z = \sum_{\{a_{ij}\}} \prod_{k} \delta\big(F_k(\{a_{ij}\}) - C_k\big) \exp\Big( \sum_{i<j} \big[ h_{ij}\,\Theta(a_{ij}) + r_{ij}\, a_{ij} \big] \Big)

where F_k({a_ij}) = C_k is the constraint, and a_ij (a_ij ≥ 0) are the elements of the adjacency matrix, with a_ij > 0 if and only if there is a link between node i and node j. Θ(a_ij) is a step function with Θ(a_ij) = 1 if a_ij > 0, and Θ(a_ij) = 0 if a_ij = 0. The auxiliary fields h_ij and r_ij have been introduced in analogy with the bath in classical statistical mechanics.

For simple undirected networks, the partition function can be simplified as [6]

Z = \sum_{\{a_{ij}\}} \prod_{k} \delta\big(F_k(\{a_{ij}\}) - C_k\big) \exp\Big( \sum_{i<j} h_{ij}\,\Theta(a_{ij}) \Big)

where a_ij ∈ {0, w}, w is the index of the weight, and for a simple network w = 1.
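To make the construction concrete, the following minimal Python sketch (not taken from the cited references) evaluates the simplified partition function by brute-force enumeration over all simple undirected graphs on a few nodes. Purely as an illustrative assumption, the constraint F({a_ij}) = C is taken to be a fixed total number of links L, and the auxiliary fields h_ij are supplied as an optional dictionary; with all fields set to zero, Z reduces to the number of graphs satisfying the constraint.

```python
from itertools import combinations, product
from math import exp

def partition_function(n, L, h=None):
    """Brute-force Z for simple undirected graphs on n nodes.

    Illustrative constraint: the total number of links equals L.
    h maps a node pair (i, j), with i < j, to the auxiliary field h_ij;
    missing pairs default to 0.  For a simple network Theta(a_ij) = a_ij,
    so the field term is simply sum over pairs of h_ij * a_ij.
    """
    h = h or {}
    pairs = list(combinations(range(n), 2))      # all candidate links
    Z = 0.0
    # One term per assignment a_ij in {0, 1} for every pair (i, j).
    for a in product((0, 1), repeat=len(pairs)):
        if sum(a) != L:                          # delta-function constraint
            continue
        field_term = sum(h.get(pair, 0.0) * a_ij
                         for pair, a_ij in zip(pairs, a))
        Z += exp(field_term)
    return Z

if __name__ == "__main__":
    Z0 = partition_function(4, 3)                # all fields switched off
    print("graphs with N = 4 nodes and L = 3 links:", int(Z0))   # 20
    print("with a field on the pair (0, 1):",
          partition_function(4, 3, h={(0, 1): 1.0}))
```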

In the following, microcanonical and canonical ensembles are illustrated for simple undirected networks.

For a microcanonical ensemble, the Gibbs entropy Σ is defined by:

N\Sigma = \ln \mathcal{N}

where \mathcal{N} indicates the cardinality of the ensemble, i.e., the total number of networks in the ensemble.
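As a concrete case (an illustrative choice, not an example from the references), for the microcanonical ensemble of simple undirected graphs with N nodes and exactly L links the cardinality is the binomial coefficient C(N(N−1)/2, L), so the Gibbs entropy can be computed in closed form:

```python
from math import comb, log

def gibbs_entropy(N, L):
    """N*Sigma = ln of the number of simple undirected graphs
    with N nodes and exactly L links."""
    cardinality = comb(N * (N - 1) // 2, L)      # choose L of the possible pairs
    return log(cardinality)

if __name__ == "__main__":
    print(gibbs_entropy(4, 3))                   # ln 20 ~ 2.996, matching the
                                                 # brute-force count above
    print(gibbs_entropy(100, 250))
```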

The probability p_ij of having a link between nodes i and j, with weight w, is given by:

p_{ij} = \frac{\partial \ln Z}{\partial h_{ij}}\bigg|_{h=0}

For a canonical ensemble, the entropy is presented in the form of a Shannon entropy:

S = -\sum_{i<j} \big[ p_{ij} \ln p_{ij} + (1 - p_{ij}) \ln (1 - p_{ij}) \big]
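A short sketch of this expression, assuming (for illustration) a canonical ensemble in which each pair of nodes is linked independently with its own probability p_ij:

```python
from math import log

def shannon_entropy(p):
    """Shannon entropy S of a canonical ensemble of simple undirected
    graphs, where p[i][j] is the probability of a link between i and j."""
    N = len(p)
    S = 0.0
    for i in range(N):
        for j in range(i + 1, N):
            q = p[i][j]
            if 0.0 < q < 1.0:                    # pairs with q in {0, 1} add nothing
                S -= q * log(q) + (1.0 - q) * log(1.0 - q)
    return S

if __name__ == "__main__":
    # Homogeneous example: G(N, p) with N = 4 and link probability 0.5
    N, prob = 4, 0.5
    print(shannon_entropy([[prob] * N for _ in range(N)]))   # 6 * ln 2 ~ 4.159
```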

Relation between Gibbs and Shannon entropy

The network ensemble G(N, L) with given numbers of nodes N and links L, and its conjugate canonical ensemble G(N, p), are characterized as microcanonical and canonical ensembles, and they have Gibbs entropy Σ and Shannon entropy S, respectively. The Gibbs entropy in the G(N, L) ensemble is given by: [7]

N\Sigma = \ln \binom{N(N-1)/2}{L}

For the G(N, p) ensemble,

p = \frac{2L}{N(N-1)}

Inserting p into the Shannon entropy and applying Stirling's approximation gives: [6]

N\Sigma \simeq S - \frac{1}{2} \ln\!\big[ 2\pi L (1 - p) \big]

The relation indicates that the Gibbs entropy Σ and the Shannon entropy per node S/N of random graphs are equal in the thermodynamic limit N → ∞.
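A numerical illustration of this limit, under the same assumptions as the sketches above (the link density p is held fixed while N grows, and L is the nearest integer number of links):

```python
from math import comb, log

def gibbs_vs_shannon_per_node(N, p=0.1):
    """Return (Sigma, S/N) for the G(N, L) and G(N, q) ensembles,
    with L = round(p * N(N-1)/2) and q = L / (N(N-1)/2)."""
    M = N * (N - 1) // 2                         # number of possible links
    L = round(p * M)
    q = L / M
    sigma = log(comb(M, L)) / N                  # Gibbs entropy per node
    s_per_node = -M * (q * log(q) + (1 - q) * log(1 - q)) / N
    return sigma, s_per_node

if __name__ == "__main__":
    for N in (10, 50, 200, 400):
        sigma, s = gibbs_vs_shannon_per_node(N)
        print(N, round(sigma, 4), round(s, 4))   # the two values converge
```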

See also

References

  1. ^ Levin, E.; Tishby, N.; Solla, S.A. (October 1990). "A statistical approach to learning and generalization in layered neural networks". Proceedings of the IEEE. 78 (10): 1568–1574. doi: 10.1109/5.58339. ISSN  1558-2256. S2CID  5254307.
  2. ^ Bianconi, Ginestra (2008). "The entropy of randomized network ensembles". EPL (Europhysics Letters). 81 (2): 28005. arXiv: 0708.0153. Bibcode: 2008EL.....8128005B. doi: 10.1209/0295-5075/81/28005. ISSN  0295-5075. S2CID  17269886.
  3. ^ Menichetti, Giulia; Remondini, Daniel (2014). "Entropy of a network ensemble: definitions and applications to genomic data". Theoretical Biology Forum. 107 (1–2): 77–87. ISSN  0035-6050. PMID  25936214.
  4. ^ Krawitz, Peter; Shmulevich, Ilya (27 September 2007). "Entropy of complex relevant components of Boolean networks". Physical Review E. 76 (3): 036115. arXiv: 0708.1538. Bibcode: 2007PhRvE..76c6115K. doi: 10.1103/PhysRevE.76.036115. PMID  17930314. S2CID  6192682.
  5. ^ Bianconi, Ginestra (27 March 2009). "Entropy of network ensembles". Physical Review E. 79 (3): 036114. arXiv: 0802.2888. Bibcode: 2009PhRvE..79c6114B. doi: 10.1103/PhysRevE.79.036114. PMID  19392025. S2CID  26082469.
  6. ^ a b Anand, Kartik; Bianconi, Ginestra (13 October 2009). "Entropy measures for networks: Toward an information theory of complex topologies". Physical Review E. 80 (4): 045102. arXiv: 0907.1514. Bibcode: 2009PhRvE..80d5102A. doi: 10.1103/PhysRevE.80.045102. PMID  19905379. S2CID  27419558.
  7. ^ Bogacz, Leszek; Burda, Zdzisław; Wacław, Bartłomiej (1 July 2006). "Homogeneous complex networks". Physica A: Statistical Mechanics and Its Applications. 366: 587–607. arXiv: cond-mat/0502124. Bibcode: 2006PhyA..366..587B. doi: 10.1016/j.physa.2005.10.024. ISSN  0378-4371. S2CID  119428248.