Entropy, denoted S, is the thermodynamic measure of disorder or randomness: the more disordered a system, the higher its entropy. For example, an unfolded amino acid chain has higher entropy than the properly folded protein, because the linear chain is more flexible and less organized than a tightly packed folded structure. The second law of thermodynamics states that the entropy of an isolated system never decreases. This means that, without outside input, everything tends toward greater disorder. Truly isolated systems rarely occur naturally, so thermodynamics often examines the change in entropy of the entire universe, which includes the entropy changes of both the system being studied and its surroundings. A process for which the entropy of the universe increases, that is, one with ΔS greater than zero, occurs spontaneously. A process for which the entropy of the universe decreases (ΔS less than zero) is not spontaneous and requires an input of energy to occur.
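The spontaneity criterion described above can be written compactly as follows (a standard textbook formulation, restated here for reference; the subscripts "univ", "sys", and "surr" denote the universe, the system, and the surroundings):

```latex
\Delta S_{\mathrm{univ}} \;=\; \Delta S_{\mathrm{sys}} + \Delta S_{\mathrm{surr}}

\Delta S_{\mathrm{univ}} > 0 \;\Rightarrow\; \text{process is spontaneous}

\Delta S_{\mathrm{univ}} < 0 \;\Rightarrow\; \text{process is nonspontaneous (requires energy input)}
```

Note that a process may decrease the entropy of the system itself (as in protein folding) and still be spontaneous, provided the entropy of the surroundings increases by more, keeping ΔS of the universe positive.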