On Calculating Entropy

When we discussed Pasquali patches that converge and the stationary surfaces they converge to, we argued that such a system stabilizes because it neither loops in any shape or form nor diverges.  This claim we called entropy.  We did not, however, provide a mechanism to calculate it.  We do that now.

The way we propose to calculate the entropy is by measuring the deviation from the stationary surface.  Thus, for the case where we have a Pasquali patch p_t(x,y), t \in \mathbb{Z}^+, we could measure entropy by

 S(t) = \int_{[0,1]^2} \vert p_t(x,y) - p_\infty(x) \vert \, dA

that is, by taking the absolute deviation from the stationary surface and integrating it over the unit square (notice we use the area element dA).  Since in this case time is indexed by the integers rather than running continuously, letting t \in \mathbb{R} implies using the Pasqualian.  Thus we would have

 S(t) = \int_{[0,1]^2} \vert p(x,y,t) - p_\infty(x) \vert \, dA

It seems fairly clear that 0 \le S(t) < \infty, and here we must agree that S(t) = 0 means the system sits on the stationary surface: it is stable and has the most entropy.
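
To make this concrete, here is a minimal numerical sketch in Python that approximates S(t) by a midpoint Riemann sum over the unit square.  The patch p and stationary surface p_inf below are hypothetical placeholders (not part of the original discussion), chosen only so that p(x,y,t) visibly converges to p_inf(x) as t grows.

import numpy as np

# A minimal numerical sketch (not the original construction): approximate
#   S(t) = integral over [0,1]^2 of | p(x, y, t) - p_inf(x) | dA
# with a midpoint Riemann sum on an n x n grid.  The patch p and the
# stationary surface p_inf are hypothetical placeholders chosen only so
# that p converges to p_inf as t grows.

def p_inf(x):
    """Hypothetical stationary surface (constant in y)."""
    return np.ones_like(x)

def p(x, y, t):
    """Hypothetical Pasquali patch that converges to p_inf as t grows."""
    return p_inf(x) + np.exp(-t) * np.cos(2 * np.pi * x) * np.sin(2 * np.pi * y)

def entropy_S(t, n=400):
    """Midpoint-rule approximation of S(t) over the unit square."""
    u = (np.arange(n) + 0.5) / n      # cell midpoints in [0, 1]
    x, y = np.meshgrid(u, u)          # grid over [0, 1]^2
    dA = 1.0 / (n * n)                # area of each grid cell
    return np.sum(np.abs(p(x, y, t) - p_inf(x))) * dA

if __name__ == "__main__":
    for t in (0.0, 1.0, 2.0, 5.0):
        print(f"S({t:g}) ~ {entropy_S(t):.6f}")

Under this placeholder patch the computed values decay toward 0 as t increases, which, under the convention above, corresponds to the system approaching the stationary surface and hence maximum entropy.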

We may, however, rather than take the absolute difference from the stationary surface, choose to emphasize larger deviations and (heavily) discount smaller ones.  Thus we could suggest

 S^{\square}(t) = \int_{[0,1]^2} \left( p(x,y,t) - p_\infty(x) \right)^2 \, dA

for t \in \mathbb{R}.
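
Continuing the sketch above, the squared-deviation entropy only swaps the integrand; the helper below takes the patch and stationary surface as callables (for instance the hypothetical p and p_inf defined earlier).

import numpy as np

def entropy_S_square(t, patch, stationary, n=400):
    """Midpoint-rule approximation of the squared-deviation entropy on [0,1]^2."""
    u = (np.arange(n) + 0.5) / n      # cell midpoints in [0, 1]
    x, y = np.meshgrid(u, u)          # grid over [0, 1]^2
    dA = 1.0 / (n * n)                # area of each grid cell
    return np.sum((patch(x, y, t) - stationary(x)) ** 2) * dA

For example, entropy_S_square(1.0, p, p_inf) evaluates the squared-deviation entropy of the placeholder patch at t = 1; since that patch's deviation from p_inf has magnitude below 1, squaring discounts it further, and this version falls off faster than S(t).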


  1. June 10th, 2014 at 13:46 | #1

    I don't follow completely, but I'm trying to figure out if this is the same as Fisher Information (variance).

  2. June 11th, 2014 at 08:14 | #2

    Hey! Welcome back! ;)

  3. EastwoodDC
    July 7th, 2014 at 14:50 | #3

    I'm still around, but not spending so much time with my RSS feeds as I used to. You should turn on the option to subscribe to comments, then I won't have to hunt the post down again to see the reply. :-)

