It depends on the source you are watching. The "normal" human eye cannot see things change faster than about 60 Hz. That's why you probably don't notice your light bulbs flickering: the mains frequency is 60 Hz (at least in the US).

If you are watching a progressive-scan source (i.e. 1080p, 720p, or anything else with a "p" at the end), you will not see a difference between 85 Hz and 100 Hz. If you are watching an interlaced source (i.e. 1080i, 480i, etc.), you may detect a slight difference if you look for it. Maybe. I could be wrong on this next statement, but interlacing is performed at a 60 Hz field rate (you only get a full picture at a rate of 30 Hz), so any difference should be negligible. The size of the TV doesn't matter here as much as the source you are using and how fast your eye-to-brain connection is.
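If it helps, here is a rough sketch of that field/frame arithmetic. The source names and rates below are just illustrative examples, not any standard API:

```python
# Sketch of the interlaced vs. progressive arithmetic described above.
# An interlaced source sends half of each picture (one field) per cycle,
# so two 60 Hz fields combine into 30 full pictures per second.
# A progressive source sends a complete picture every cycle.

def full_pictures_per_second(field_rate_hz: float, interlaced: bool) -> float:
    """How many complete pictures per second the source actually delivers."""
    return field_rate_hz / 2 if interlaced else field_rate_hz

# Hypothetical example sources, all at a 60 Hz base rate.
sources = {
    "1080p": (60.0, False),
    "720p":  (60.0, False),
    "1080i": (60.0, True),
    "480i":  (60.0, True),
}

for name, (rate, interlaced) in sources.items():
    print(f"{name}: {full_pictures_per_second(rate, interlaced):.0f} full pictures/s")
```

Running that prints 60 full pictures/s for the "p" sources and 30 for the "i" ones, which is why bumping the display from 85 Hz to 100 Hz buys you nothing the source isn't already providing.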
As always, if this question is posed for a new purchase decision, we can talk specs and theory all day. In the end, TRY BEFORE YOU BUY! Go to your local "Buy More" and look at the LCDs side by side (and set the brightness, contrast, and color-temp settings to "stun" instead of "kill"). See for yourself whether there is a difference. Display frequency is only part of the equation for the "best picture." You usually won't go wrong if you follow this advice.