It seems odd that entropy is usually only defined for a system in a single 'slice' of time or spacelike region

Matilda Webb

Answered question

2022-05-19

It seems odd that entropy is usually only defined for a system in a single 'slice' of time or spacelike region. Can one define the entropy of a system defined by a 4d region of spacetime, in such a way that yields a codimension one definition which agrees with the usual one when the codimension one slice is spacelike?

Answer & Explanation

garcialdariamcy4q

Beginner · 2022-05-20 · Added 15 answers

You are thinking about Boltzmann's definition of entropy, I guess?
In Boltzmann's definition, entropy is just the logarithm of the number of microstates compatible with given macroscopic variables. Stated at that level of generality, nothing seems to exclude counting states with different time coordinates, or, in your more general setting, on different time-slices. The question is what this would correspond to, and whether it makes sense to do it: you would have to specify the time development of the macroscopic variables and count the number of microscopic trajectories compatible with those macroscopic trajectories.
As a matter of fact, so-called dynamical entropies do exist. Heuristically, they count the density of phase-space trajectories of a system, whereas the Boltzmann entropy counts only the number of states accessible under given macroscopic constraints.
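To make the Boltzmann counting concrete, here is a minimal toy sketch (my own illustration, not part of the answer above): for N two-level spins, take the macroscopic variable to be the number n of excited spins; the number of compatible microstates is the binomial coefficient C(N, n), and the entropy in units of k_B is ln of that count.

```python
from math import comb, log

def boltzmann_entropy(N, n):
    """Entropy (in units of k_B) of the macrostate with n of N spins excited.

    Omega = C(N, n) is the number of microstates compatible with the
    macroscopic constraint; Boltzmann's formula gives S = ln(Omega).
    """
    omega = comb(N, n)  # number of compatible microstates
    return log(omega)

N = 100
# The entropy is largest for the 'most mixed' macrostate n = N/2 ...
print(boltzmann_entropy(N, 50))  # ~66.78
# ... and zero for the unique macrostate with no spin excited.
print(boltzmann_entropy(N, 0))   # 0.0
```

The same counting idea is what a dynamical entropy generalizes: instead of states at one instant, one would count trajectories compatible with a prescribed macroscopic history.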
studovnaem4z6

Beginner · 2022-05-21 · Added 7 answers

As far as I know, entropy is defined for systems with Hamiltonian dynamics, that is, systems with an explicit time evolution.
In classical mechanics (where position and momentum depend on time), there is the Boltzmann entropy S = k_B ln Ω (Ω being the 'number' of accessible states).
In (non-relativistic) quantum mechanics (where the wavefunction depends on time), there is the von Neumann entropy S = −k_B Tr(ρ ln ρ) (ρ being the density matrix).
More generally, there is the information-theoretic Shannon entropy S = −Σ_i p_i ln p_i (p_i being the probability that the system is in the i-th state). Maybe in Quantum Field Theory there is some kind of 4d entropy, but I am not sure. In any case, the fundamental property 'entropy is a non-decreasing function of time' is meaningful only if S is a function of time.
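As a small numerical check of how these formulas relate (my own sketch, not part of the answer): the Shannon entropy of a probability distribution {p_i} can be computed directly, and for a density matrix that is diagonal in some basis, the von Neumann entropy −Tr(ρ ln ρ) reduces to the Shannon entropy of its eigenvalues.

```python
from math import log

def shannon_entropy(p):
    """Shannon entropy S = -sum_i p_i ln p_i, in units of k_B (natural log).

    Terms with p_i = 0 contribute nothing, since p ln p -> 0 as p -> 0.
    """
    return -sum(pi * log(pi) for pi in p if pi > 0)

# Eigenvalues of a diagonal 3-state density matrix (a mixed state):
rho_eigenvalues = [0.5, 0.25, 0.25]

# Von Neumann entropy of this diagonal rho = Shannon entropy of its
# eigenvalues = (3/2) ln 2.
print(shannon_entropy(rho_eigenvalues))  # ~1.0397

# A pure state (one eigenvalue equal to 1) has zero entropy:
print(shannon_entropy([1.0, 0.0, 0.0]))  # 0.0
```

In all three cases the entropy is attached to the state of the system at one instant, which is the point of the remark above: its time dependence enters only through the time dependence of the state.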
