I recently finished reading Decoding Reality by Vlatko Vedral, a book for the layman (me!) on the application of information theory to “reality”, and it left me wondering whether information theory has found any applications in chemistry. Well, quite by accident I happened upon a paper by Noorizadeh that proposes an information-based metric for evaluating aromaticity!1 (I know what you’re thinking – we need another aromaticity metric like we need another hole in the head.) I don’t want to suggest that this metric, which he calls “Shannon aromaticity” after the inventor of information theory, will supplant established ones (like aromatic stabilization energy or NICS). But the application here is an interesting one.

Shannon defined entropy in the information sense as

S = -Σi pi ln pi

where pi is the probability of outcome i. This can be converted into a quantum analogue in terms of the electron density:

S[ρ] = -∫ ρ(r) ln ρ(r) dr.
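A quick numerical check of the discrete formula (plain Python, not anything from the paper): the entropy reaches its maximum, ln(N), when all N outcomes are equally probable, and drops as the distribution becomes more lopsided.

```python
import math

def shannon_entropy(probs):
    """S = -sum(p_i ln p_i) for a discrete probability distribution."""
    return -sum(p * math.log(p) for p in probs if p > 0)

uniform = [1 / 4] * 4            # four equally likely outcomes
skewed = [0.7, 0.1, 0.1, 0.1]    # same four outcomes, one dominant

print(shannon_entropy(uniform))  # ln(4) ≈ 1.386, the maximum for N = 4
print(shannon_entropy(skewed))   # ≈ 0.940, lower: less evenly spread
```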

Noorizadeh suggests evaluating the electron density at the bond critical points of an aromatic ring, converting these densities into probabilities, and summing the -pi ln pi contributions over the bond critical points. An ideal aromatic ring would have the maximum value Smax = ln(N), where N is the number of bonds in the ring. The Shannon aromaticity (SA) is then defined as the difference between this maximum, ln(N), and the entropy summed over the bond critical points. A small SA value indicates an aromatic ring; a large value indicates an antiaromatic ring.
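As a rough sketch (my own, not the authors’ code), the whole recipe fits in a few lines of Python, assuming you already have the electron densities at the ring’s N bond critical points from a QTAIM analysis:

```python
import math

def shannon_aromaticity(bcp_densities):
    """Shannon aromaticity (SA) of a ring from the electron densities
    at its N bond critical points (BCPs).

    The densities are normalized into probabilities p_i, the entropy
    S = -sum(p_i ln p_i) is evaluated, and SA = ln(N) - S. SA -> 0 for
    equal BCP densities (delocalized, aromatic); larger SA signals
    bond alternation and reduced aromaticity.
    """
    total = sum(bcp_densities)
    probs = [rho / total for rho in bcp_densities]
    entropy = -sum(p * math.log(p) for p in probs if p > 0)
    return math.log(len(bcp_densities)) - entropy

# Six equal BCP densities (a benzene-like ring) give SA ≈ 0;
# alternating densities (localized bonds) give SA > 0.
print(shannon_aromaticity([0.31] * 6))        # ≈ 0
print(shannon_aromaticity([0.35, 0.25] * 3))  # > 0
```

The density values above are made-up placeholders purely to show the arithmetic; in practice they would come from a wavefunction analysis program.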

The paper shows a strong correlation between the new SA metric and the old warhorses – ASE, NICS, and HOMA – for a variety of aromatic, antiaromatic, and nonaromatic systems. The new metric is easy to compute and perhaps offers a fresh way of thinking about a very old concept: aromaticity.


(1) Noorizadeh, S.; Shakerzadeh, E., "Shannon entropy as a new measure of aromaticity, Shannon aromaticity," Phys. Chem. Chem. Phys., 2010, 12, 4742-4749, DOI: 10.1039/b916509f.