In thermodynamics we learn entropy as a measure of the energy of a system, per unit temperature, that is unavailable for doing work. Statistically, on the other hand, entropy is a measure of the number of ways in which a system can be arranged. I am confused about why these two seemingly different concepts are identical (or at least proportional) to each other.
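For concreteness, here is a minimal statement of the two definitions in question, assuming the standard notation ($\delta Q_{\mathrm{rev}}$ for heat exchanged reversibly, $\Omega$ for the number of microstates compatible with the macrostate):

$$dS = \frac{\delta Q_{\mathrm{rev}}}{T} \quad \text{(Clausius, thermodynamic)}, \qquad S = k_B \ln \Omega \quad \text{(Boltzmann, statistical)}.$$

So the puzzle is why the first, defined purely through heat and temperature, agrees (up to the constant $k_B$) with the second.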
- Does this answer your question? How is $\frac{dQ}{T}$ measure of randomness of system? – Tobias Fünke Jun 14 '22 at 10:24
- This could be of interest, too. – Tobias Fünke Jun 14 '22 at 10:25
- The textbook by Bergersen and Plischke spends some time relating the two. – Connor Behan Jun 14 '22 at 13:30
- Does this answer your question? How to derive Shannon Entropy from Clausius Theorem? Also this: https://physics.stackexchange.com/a/710156/247642 – Roger V. Jun 14 '22 at 13:58
- Also 1, 2, 3, 4, 5. – Chemomechanics Jun 14 '22 at 15:03