mab1 - "The Media is the Message"

If one sees the message (the "innovation") of a medium as a change in the recipient's thinking, then, on the level of raw data, information theory could provide some useful means for expressing the "innovation" of a medium in technical terms.

To simplify the discussion, I restrict it to the domain of grayscale images.

The entropy of a grayscale image is calculated as

E = -\sum_{v=0}^{255} p(v) \log_2 p(v)

v ...... grayscale value v
p(v) ... histogram value of grayscale value v
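A minimal sketch of this formula in Python/NumPy, assuming an 8-bit grayscale image and p(v) taken as the histogram normalized to a probability distribution; the names here are my own:

import numpy as np

def entropy(image):
    """Entropy E of an 8-bit grayscale image, in bits."""
    hist = np.bincount(image.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()   # p(v): normalized histogram
    p = p[p > 0]            # 0 * log2(0) is taken as 0
    return float(-np.sum(p * np.log2(p)))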

The idea is to express the "innovation" of an image as the additional information it adds to a set of images. Given a set of images i_1, i_2, ..., i_{n-1}, what is the innovation I of an additional image i_n? A trivial approach would be to calculate I as

I(i_n) = E(i_n)
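With the entropy() sketch above, this trivial measure is simply entropy(i_n) applied to the new image alone.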
But does low entropy automatically imply less additional information? That depends on the context. A black image in a sequence of surveillance images might be regarded as a faulty output without any value, or it might add the information that passersby block the view of the camera. From the information theoretic point of view, a black image supplies a change in image information just like any other image.

An approach to overcome this problem is to calculate I as

I(i_n) = E(i_1 \oplus i_2 \oplus \dots \oplus i_{n-1} \oplus i_n) - E(i_1 \oplus i_2 \oplus \dots \oplus i_{n-1})
The \oplus operator can be interpreted as a concatenation of images, which leads to an accumulated histogram. This, however, entails the problem of "negative innovation", which is not very intuitive: for example, concatenating a black image to a set whose accumulated histogram is spread evenly concentrates probability mass at v = 0, so the joint entropy drops and I(i_n) becomes negative.
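A sketch of this measure, reusing entropy() from above and treating concatenation as pooling all pixels into one histogram; the function name and the noise-plus-black-frame demo at the end are made up to illustrate "negative innovation":

def innovation_concat(images, new_image):
    """E(i_1 + ... + i_{n-1} + i_n) - E(i_1 + ... + i_{n-1}), '+' meaning concatenation."""
    pooled = np.concatenate([im.ravel() for im in images])
    joint = np.concatenate([pooled, new_image.ravel()])
    return entropy(joint) - entropy(pooled)

rng = np.random.default_rng(0)
noise = [rng.integers(0, 256, (64, 64), dtype=np.uint8) for _ in range(3)]
black = np.zeros((64, 64), dtype=np.uint8)
print(innovation_concat(noise, black))   # prints a negative value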

A non-information-theoretic approach I thought of was to compare a new image with the average of the images in the set:

avg = \frac{1}{n-1} \sum_{k=1}^{n-1} i_k

I(i_n) = \sum_{x=0, y=0}^{width, height} | avg(x, y) - i_n(x, y) |
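A sketch of this comparison, assuming all images share one shape; the sum runs over every pixel coordinate (x, y), and innovation_avg is my own name:

def innovation_avg(images, new_image):
    """Sum of absolute differences between i_n and the average image of the set."""
    avg = np.mean([im.astype(float) for im in images], axis=0)
    return float(np.sum(np.abs(avg - new_image.astype(float))))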

This method can easily be applied to extracted feature values, not only to raw pixel data.

It would of course also be possible to apply the entropy to avg, maybe this way:

I(i_n) = E(avg \oplus i_n)
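One possible reading of this last variant, again under my own naming: round the average back to integer grayscale values so that the accumulated histogram of avg \oplus i_n stays well-defined:

def innovation_avg_entropy(images, new_image):
    """E(avg + i_n): entropy of the average image concatenated with i_n."""
    avg = np.mean([im.astype(float) for im in images], axis=0)
    avg_u8 = np.clip(np.rint(avg), 0, 255).astype(np.uint8)
    return entropy(np.concatenate([avg_u8.ravel(), new_image.ravel()]))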