While you may get 'good' image quality running certain resolutions, it's still not optimal. It depends on how well the monitor is able to scale the image.
See, now I completely agree with you, and had you stated that from the get-go, there would have been no argument, since that is factual.
As for the refresh rate, I wouldn't run the monitor at anything but the regular 60 Hz. LCDs do not flicker, and going past 60 is just asking for trouble, really.
I have six LCDs here at home alone: an Apple Cinema HD 24", a MacBook Pro 15", a Samsung BW226, an HP v2007, and an HP 17" laptop, and they all run 60 Hz at native resolution as the only trouble-free setting.
The available refresh rate at 1920x1080 is also limited by single-link DVI, which is used on most cheaper LCDs: http://en.wikipedia.org/wiki/Digital_Visual_Interface
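To put rough numbers on that limit, here is a minimal sketch. It assumes single-link DVI's 165 MHz maximum pixel clock and approximate CVT reduced-blanking totals of 2000 x 1111 for a 1920x1080 mode (real timings vary slightly with refresh rate), then checks which refresh rates fit under the cap:

```python
# Rough sketch: why single-link DVI caps 1920x1080 near 60 Hz.
# Assumes CVT reduced-blanking raster totals of 2000 x 1111 for
# 1920x1080 (an approximation; exact totals depend on the timing
# standard and refresh rate).
SINGLE_LINK_DVI_MHZ = 165.0  # single-link DVI maximum pixel clock

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock (MHz) needed to scan the full raster refresh_hz times/sec."""
    return h_total * v_total * refresh_hz / 1e6

for hz in (60, 75):
    clock = pixel_clock_mhz(2000, 1111, hz)
    fits = clock <= SINGLE_LINK_DVI_MHZ
    print(f"{hz} Hz -> {clock:.2f} MHz, fits single-link DVI: {fits}")
```

Under these assumed timings, 60 Hz needs about 133 MHz and fits comfortably, while 75 Hz needs roughly 167 MHz and exceeds the single-link limit, which is why cheaper 1080p panels top out at 60 Hz over DVI.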
True, LCDs don't "flicker" like the old CRTs, but you can get one to appear as if it's flickering if you set the refresh rate too low, and you can make it really misbehave if you go too high. Running only "supported" refresh rates for the corresponding resolutions will not adversely affect the monitor; it's when you push those boundaries with "unsupported" refresh rates that you run into trouble. If you're running at the "native"/"recommended"/"maximum" resolution, then yes, you do stand a greater chance of having problems by running refresh rates higher or lower than the recommended one.
It is not required to run an LCD monitor at its native/recommended/maximum resolution by any means. Depending on what you're doing, it is sometimes better to run a lower resolution with a higher refresh rate, as long as your monitor has a decent dot pitch, can handle the scaling properly, and you don't exceed the maximum refresh rate for that monitor.
I don't suppose you've ever had to support anyone with less than perfect vision?