A set of networks that satisfies given structural characteristics can be treated as a network ensemble. [1] Introduced by Ginestra Bianconi in 2007, the entropy of a network ensemble quantifies the level of order or uncertainty of the ensemble. [2]
The entropy of a network ensemble is the logarithm of the number of graphs in the ensemble. [3] Entropy can also be defined for a single network: the basin entropy is the logarithm of the number of attractors in one Boolean network. [4]
Employing approaches from statistical mechanics, the complexity, uncertainty, and randomness of networks can be described by network ensembles with different types of constraints. [5]
By analogy to statistical mechanics, microcanonical ensembles and canonical ensembles of networks are introduced for the implementation. A partition function $Z$ of an ensemble can be defined as:

$$Z = \sum_{\{a_{ij}\}} \prod_{k} \delta\big(F_k(\{a_{ij}\})\big)\, \exp\Big(\sum_{i<j} \big[h_{ij}\,\Theta(a_{ij}) + r_{ij}\,a_{ij}\big]\Big),$$

where $F_k(\{a_{ij}\})$ is the $k$-th constraint, and $a_{ij}$ ($a_{ij} \ge 0$) are the elements of the adjacency matrix, with $a_{ij} > 0$ if and only if there is a link between node $i$ and node $j$. $\Theta(x)$ is a step function, with $\Theta(x) = 1$ if $x > 0$, and $\Theta(x) = 0$ if $x = 0$. The auxiliary fields $h_{ij}$ and $r_{ij}$ have been introduced as an analogy to the bath in classical mechanics.
For simple undirected networks, the partition function can be simplified as [6]

$$Z = \sum_{\{a_{ij}\}} \prod_{k} \delta\big(F_k(\{a_{ij}\})\big)\, e^{\sum_{i<j} h_{ij}(\alpha)\,\Theta(a_{ij})},$$

where $a_{ij} \in \{0, \alpha\}$, $\alpha$ is the index of the weight, and for a simple network $\alpha = 1$.
Microcanonical ensembles and canonical ensembles are demonstrated with simple undirected networks.
For a microcanonical ensemble, the Gibbs entropy $\Sigma$ is defined by:

$$N\Sigma = \log \mathcal{N},$$

where $\mathcal{N}$ indicates the cardinality of the ensemble, i.e., the total number of networks in the ensemble.
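As an illustrative sketch (not from the source), the cardinality of the microcanonical ensemble of simple undirected graphs with a fixed number of links can be checked by brute-force enumeration for small networks, since it equals a binomial coefficient:

```python
import math
from itertools import combinations

# Microcanonical ensemble of simple undirected graphs on N labelled nodes
# with exactly L links.  Its cardinality is C(N(N-1)/2, L).
N, L = 4, 3
pairs = list(combinations(range(N), 2))   # the N(N-1)/2 possible links

# Brute-force count: choose which L of the possible links are present.
cardinality = sum(1 for _ in combinations(pairs, L))
assert cardinality == math.comb(N * (N - 1) // 2, L)   # C(6, 3) = 20

# Gibbs entropy per node: Sigma = (1/N) * log(cardinality)
sigma = math.log(cardinality) / N
print(cardinality, round(sigma, 4))   # 20 0.7489
```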
The probability of having a link between nodes $i$ and $j$, with weight $\alpha$, is given by:

$$\pi_{ij}(\alpha) = \frac{\partial \log Z}{\partial h_{ij}(\alpha)}.$$
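For the fixed-link-number ensemble, symmetry over labelled node pairs implies each possible link is present with marginal probability L divided by the number of possible pairs; this can be verified by enumeration (own illustration, not from the source):

```python
from itertools import combinations
from fractions import Fraction

# In the microcanonical ensemble, every graph with exactly L links is
# equally likely, so by symmetry each possible link is present with
# probability pi_ij = L / (N(N-1)/2).  Brute-force check for N=4, L=3.
N, L = 4, 3
pairs = list(combinations(range(N), 2))
graphs = list(combinations(pairs, L))        # all graphs with L links

target = pairs[0]                            # any fixed pair (i, j)
pi = Fraction(sum(target in g for g in graphs), len(graphs))
assert pi == Fraction(L, len(pairs))         # 3/6 = 1/2
print(pi)
```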
For a canonical ensemble, the entropy is presented in the form of a Shannon entropy:

$$S = -\sum_{i<j} \big[\pi_{ij} \log \pi_{ij} + (1 - \pi_{ij}) \log(1 - \pi_{ij})\big].$$
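A minimal sketch (own illustration, assuming simple undirected graphs): the Shannon entropy of a canonical ensemble sums the binary entropy of each possible link, given a matrix of link probabilities:

```python
import math

def shannon_entropy(p):
    """S = -sum_{i<j} [p_ij log p_ij + (1 - p_ij) log(1 - p_ij)]
    for a canonical ensemble with symmetric link-probability matrix p."""
    n = len(p)
    s = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            pij = p[i][j]
            if 0.0 < pij < 1.0:     # links with p in {0, 1} contribute 0
                s -= pij * math.log(pij) + (1 - pij) * math.log(1 - pij)
    return s

# Example: N = 4 nodes with uniform link probability p = 0.5.
N, p = 4, 0.5
probs = [[p if i != j else 0.0 for j in range(N)] for i in range(N)]
print(shannon_entropy(probs))   # 6 pairs * log(2) = 4.1589...
```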
A network ensemble $G(N, L)$ with given numbers of nodes $N$ and links $L$, and its conjugate canonical ensemble $G(N, p)$, are characterized as microcanonical and canonical ensembles, and they have the Gibbs entropy $\Sigma$ and the Shannon entropy $S$, respectively. The Gibbs entropy in the $G(N, L)$ ensemble is given by: [7]

$$N\Sigma = \log \binom{\tfrac{N(N-1)}{2}}{L}.$$

For the $G(N, p)$ ensemble,

$$\pi_{ij} = p = \frac{2L}{N(N-1)}.$$

Inserting $\pi_{ij}$ into the Shannon entropy: [6]

$$S = -\frac{N(N-1)}{2}\big[p \log p + (1 - p) \log(1 - p)\big].$$

The relation $\lim_{N\to\infty} S/(N\Sigma) = 1$ indicates that the Gibbs entropy $\Sigma$ and the Shannon entropy per node $S/N$ of random graphs are equal in the thermodynamic limit $N \to \infty$.
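This asymptotic agreement between the two entropies can be checked numerically (a sketch under the stated $G(N,L)$ / $G(N,p)$ correspondence, with an arbitrarily chosen sparse regime of $L = N$ links):

```python
import math

def gibbs_NSigma(N, L):
    # N*Sigma = log of the number of simple graphs with N nodes, L links
    return math.log(math.comb(N * (N - 1) // 2, L))

def shannon_S(N, L):
    # S for the conjugate canonical ensemble with p = 2L / (N(N-1))
    M = N * (N - 1) // 2
    p = L / M
    return -M * (p * math.log(p) + (1 - p) * math.log(1 - p))

for N in (10, 100, 1000):
    L = N       # sparse regime: as many links as nodes
    ratio = gibbs_NSigma(N, L) / shannon_S(N, L)
    print(N, round(ratio, 4))   # ratio approaches 1 as N grows
```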