Title
Mutual redundancies in interhuman communication systems: Steps toward a calculus of processing meaning
Abstract
The study of interhuman communication requires a more complex framework than Claude E. Shannon's (1948) mathematical theory of communication because "information" is defined in the latter case as meaningless uncertainty. Assuming that meaning cannot be communicated, we extend Shannon's theory by defining mutual redundancy as a positional counterpart of the relational communication of information. Mutual redundancy indicates the surplus of meanings that can be provided to the exchanges in reflexive communications. The information is redundant because it is based on "pure sets," i.e., without subtraction of mutual information in the overlaps. We show that in the three-dimensional case (e.g., of a triple helix of university-industry-government relations), mutual redundancy is equal to mutual information (R_xyz = T_xyz); but when the dimensionality is even, the sign is different. We generalize to the measurement in N dimensions and proceed to the interpretation. Using Niklas Luhmann's (1984-1995) social systems theory and/or Anthony Giddens's (1979, 1984) structuration theory, mutual redundancy can be provided with an interpretation in the sociological case: Different meaning-processing structures code and decode with other algorithms. A surplus of "absent" options can then be generated that add to the redundancy. Luhmann's "functional subsystems" of expectations or Giddens's "rule-resource sets" are positioned mutually, but coupled operationally in events or "instantiated" in actions. Shannon-type information is generated by the mediation, but the "structures" are re-positioned toward one another as sets of potentially counterfactual expectations. The structural differences among the coding and decoding algorithms provide a source of additional options in reflexive and anticipatory communications.
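The abstract's central quantitative claim — that mutual redundancy equals the multivariate mutual information for odd dimensionality and carries the opposite sign for even dimensionality — can be illustrated with a small computation. The following is a minimal sketch, not the authors' implementation: it assumes discrete observations, estimates Shannon entropies (in bits) from relative frequencies, computes the multivariate mutual information T by inclusion-exclusion over the marginal entropies, and flips the sign for even N as stated in the abstract. Function and variable names are illustrative.

```python
import itertools
import math
from collections import Counter


def entropy(counts):
    """Shannon entropy (in bits) of a frequency table given as a Counter/dict."""
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values() if c > 0)


def marginal(data, dims):
    """Frequency table of the marginal distribution over the given dimensions."""
    return Counter(tuple(row[d] for d in dims) for row in data)


def interaction_information(data, dims):
    """Multivariate mutual information T over the listed dimensions,
    via inclusion-exclusion over the entropies of all non-empty marginals:
    T = sum over non-empty subsets S of (-1)^(|S|+1) * H(S)."""
    n = len(dims)
    T = 0.0
    for k in range(1, n + 1):
        for subset in itertools.combinations(dims, k):
            T += (-1) ** (k + 1) * entropy(marginal(data, subset))
    return T


def mutual_redundancy(data, dims):
    """Mutual redundancy R as stated in the abstract: equal to T when the
    number of dimensions is odd, and of opposite sign when it is even."""
    sign = 1 if len(dims) % 2 == 1 else -1
    return sign * interaction_information(data, dims)


# Toy example: rows are co-occurrence observations in three dimensions
# (categories standing in, e.g., for university, industry, government).
data = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0), (1, 1, 1)]
print(interaction_information(data, (0, 1, 2)))  # T_xyz (can be negative)
print(mutual_redundancy(data, (0, 1, 2)))        # equals T_xyz for N = 3
print(mutual_redundancy(data, (0, 1)))           # opposite sign of T_xy for N = 2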
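```

For N = 3 (e.g., the triple helix of university-industry-government relations) the two functions return the same value, mirroring R_xyz = T_xyz in the abstract, while for N = 2 they differ only in sign.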
Year
2014
DOI
10.1002/asi.22973
Venue
Periodicals
Keywords
information theory
Field
Information theory, Data mining, Computer science, Mathematical theory, Counterfactual thinking, Multivariate mutual information, Redundancy (engineering), Mutual information, Artificial intelligence, Pointwise mutual information, Structuration theory
DocType
Journal
Volume
65
Issue
2
ISSN
2330-1635
Citations
16
PageRank
1.20
References
14
Authors
2
Name                Order    Citations    PageRank
Loet Leydesdorff    1        4987         381.86
Inga A. Ivanova     2        48           6.71