The concept of a state in physics is surprisingly subtle and tricky, involving many potential layers of abstraction and encapsulation.  And no other discipline within physics demonstrates this subtlety quite as poignantly as does classical thermodynamics. 

Take, for example, a collection of water molecules.  If the collection is empty, that is to say there are no molecules in it, then quantum field theory tells us that the state of the system is described by the vacuum field $| 0 \rangle$, with all of the quantum field operators, and the ephemeral fluctuations winking into and out of existence, present in that description. If the collection holds a single water molecule and we are interested in the vibrational and rotational modes, the ionization and bonding angle, and similar physical quantities, then the state is better described by the quantum many-body wave function $\Psi (\left\{ {\vec r}_i \right\}, t)$.  If the collection has a small number $N$ of water molecules and we are interested in their interaction with their container and with each other via collisions, then the state comprises the individual positions and velocities ${\bar S} = \left[ {\vec r}_1, \ldots, {\vec r}_N, {\vec v}_1, \ldots, {\vec v}_N \right]$, evolved according to the usual Newtonian laws.  If there are a vast number of water molecules and they are in equilibrium, then the state is described by the partial differential equations governing the thermodynamic functions of density $\rho$, pressure $P$, and temperature $T$ (with the usual extension to hydrodynamics should there be hydrodynamic flows and gradients).

Given the elastic nature of the concept of state, stretching and bending as needed, it should hardly be a surprise that confusion and misuse might arise across the physics community. 

Peter Enders (in a 2009 article in the journal Entropy entitled Gibbs’ Paradox in the Light of Newton’s Notion of State) argues that the resolution of the Gibbs’ paradox – which has been a theme in this column’s analysis of how entropy is defined and understood – lies in the careful consideration of state.

According to Enders’s analysis, the Gibbs paradox arises when one uses what he calls the Lagrange-Laplace concept of state, defined as a collection of “the dynamical variables positions and velocities or momenta of all bodies involved”.  He argues that this notion of state counts the interchange of two “identical” particles as “representing two different states”.

Enders asserts that a Newtonian concept of state produces no paradoxical increase in entropy because

…the state of a body is given by its momentum vector ${\vec p}$. In case of several bodies without external interaction, their total momentum, ${\vec p}_{tot} = {\vec p}_1 + {\vec p}_2 + \ldots $ is conserved. And it is invariant against the interchange of bodies of equal mass if $m_2 = m_1$. \[ {\vec p}_{tot} = m_1 \left( {\vec v}_1 + {\vec v}_2 \right) + \ldots \; \]
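Enders’s observation can be sketched numerically (this is my illustration, not code from his article): take several bodies of equal mass, swap any two of them, and the total momentum – and hence the state in his Newtonian sense – is unchanged.

```python
import numpy as np

# A sketch of permutation invariance: equal-mass bodies, arbitrary velocities.
rng = np.random.default_rng(0)
m = 1.0                       # equal masses, m_1 = m_2 = ... = m
v = rng.normal(size=(5, 3))   # velocities of five bodies in 3D

p_tot = m * v.sum(axis=0)     # p_tot = m (v_1 + v_2 + ...)

# Interchange bodies 1 and 2 -- the "two different states" of the
# Lagrange-Laplace bookkeeping.
v_swapped = v.copy()
v_swapped[[0, 1]] = v_swapped[[1, 0]]
p_tot_swapped = m * v_swapped.sum(axis=0)

print(np.allclose(p_tot, p_tot_swapped))  # True: the interchange changes nothing
```

The swap merely relabels which body carries which velocity; every quantity built symmetrically from the collection, like ${\vec p}_{tot}$, is blind to it.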

He links the Newtonian state with the Hamiltonian and, thereby, with statistical mechanics and thermodynamics.  As for the usual explanation that the factor of $1/N!$ is required by the quantum indistinguishability of particles, Enders says firmly

The factor $1/N!$ is thus not due to the (questionable) indistinguishability of quantum particles, but due to the permutation invariance of the classical Hamiltonian.
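The bookkeeping behind the paradox can be made concrete with a short sketch (my own, under the standard ideal-gas treatment with constant terms dropped): without the $1/N!$, the entropy goes as $S \sim N k \ln V$ and merging two identical samples yields a spurious mixing entropy $2 N k \ln 2$; with the $1/N!$ (via Stirling, $\ln N! \approx N \ln N - N$), it goes as $S \sim N k [\ln(V/N) + 1]$ and the mixing entropy vanishes.

```python
import numpy as np

k = 1.0    # Boltzmann constant in arbitrary units
N = 1e23   # particles on each side of the partition
V = 1.0    # volume on each side

def S_uncorrected(N, V):
    # Entropy without the 1/N! factor: S ~ N k ln V (constant terms dropped)
    return N * k * np.log(V)

def S_corrected(N, V):
    # With 1/N! and Stirling's approximation: S ~ N k [ln(V/N) + 1]
    return N * k * (np.log(V / N) + 1)

# Remove the partition between two identical samples: N -> 2N, V -> 2V.
dS_uncorrected = S_uncorrected(2*N, 2*V) - 2 * S_uncorrected(N, V)
dS_corrected   = S_corrected(2*N, 2*V)   - 2 * S_corrected(N, V)

print(dS_uncorrected / (N*k))  # 2 ln 2: the spurious entropy of mixing
print(dS_corrected / (N*k))    # 0: entropy is extensive, no paradox
```

The sketch is agnostic about where the $1/N!$ comes from – quantum indistinguishability or, as Enders argues, the permutation invariance of the classical Hamiltonian – it only shows that the factor is what removes the paradox.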

He summarizes his analysis by saying

Gibbs’ paradox concerning the mixing entropy can be resolved completely within classical physics. This result is important for the self-consistency of classical statistical mechanics as well as for the unity of classical physics.

Now, one needn’t find Enders’s analysis compelling (although I’ll confess that I do) to appreciate the logical and conceptual subtleties associated with what exactly is meant by the word state.