Aces High Bulletin Board
General Forums => Hardware and Software => Topic started by: Max on January 05, 2020, 04:45:05 PM
-
Wanted to try using my Samsung 32" 1080p TV to replace my Samsung 27" gaming monitor. I've tried a few initial tweaks via the remote's *picture* menu, which help, but it's still not as crisp as my 27". Any suggestions from folks who've gone down this road?
Thanks :aok
-
Use your monitor.
If the TV has a gaming mode you could try that but the monitor is better.
-
Only the latest TVs with HDMI 2.1 rival gaming monitors.
-
I'm not going to debate whether a monitor is better than a TV, although I do prefer using the "right" tool for each task.
There's a simple reason why the image on the TV isn't as crisp as on the monitor. Assuming both are 1080p, the pixels on the 32" TV are about 18.5% larger than on the 27" monitor. Further, as the viewing distance for a TV is much longer, the panel can be made coarser (cheaper, older technology), meaning there's more stuff between the pixels than on a monitor designed to be viewed at arm's length.
You can think of the screen as two million boxes with a colour-changing lamp in each box. On a 27" monitor the grid of the boxes is barely visible even with perfect eyesight. On a 150" screen the grid would be the size of the mesh used for mosquito nets on windows. Thus, if you want the image on a larger screen to be as crisp as on a smaller one, you need more pixels to fill the area. That in turn requires more oomph from your video card.
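To put rough numbers on that, here's a quick back-of-the-envelope sketch (plain Python, assuming both panels are 16:9 at 1920x1080; the function name is just for illustration):

```python
import math

def pixel_pitch_mm(diagonal_in, horiz_px=1920, vert_px=1080):
    """Approximate pixel pitch (mm per pixel) for a 16:9-ish panel."""
    aspect = horiz_px / vert_px                        # 1920/1080 = 16:9
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    return width_in * 25.4 / horiz_px                  # inches -> mm, per pixel

p27 = pixel_pitch_mm(27)    # ~0.311 mm
p32 = pixel_pitch_mm(32)    # ~0.369 mm
print(f'27": {p27:.3f} mm/px   32": {p32:.3f} mm/px')
print(f'32" pixels are {100 * (p32 / p27 - 1):.1f}% larger')   # ~18.5%
```

At the same resolution the ratio is simply 32/27, which is where the 18.5% comes from; the crispness only comes back if the pixel count grows along with the diagonal.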
-
Thoughts on this monitor...?
https://www.amazon.com/Sceptre-Edge-Less-FreeSync-DisplayPort-C275B-144RN/dp/B07N6ZBCVY/ref=sr_1_2?keywords=144%2Bhz%2B27%22%2Bgame%2Bmonitor&qid=1578349999&sr=8-2&th=1
-
How is it better than your Samsung? Is it just the 144Hz?
Do you have an AMD video card?
-
Yes, 144 Hz is deemed preferable to 60. My card is a GTX 1660 Ti.
-
This one might be a good compromise for the price; "only" 85 Hz but more surface area and more pixels to see: https://www.amazon.com/Sceptre-C305W-2560UN-30-inch-DisplayPort-Build/dp/B07XZNXWGS
And here's one with all three: size, speed and sharpness: https://www.amazon.com/VIOTEK-GNV30CB-30-Inch-Curved-Monitor/dp/B07WHSW195/ The brand is unknown to me, but the reviews look promising.
-
Yes, 144 Hz is deemed preferable to 60. My card is a GTX 1660 Ti.
You might check your fps in AH with v-sync off to see if you get 144 consistently.
The monitor you listed is FreeSync, for AMD. A GTX card won't benefit from that, unless something changed, but it doesn't cost extra either like G-Sync does.
-
My current computer monitor (not TV) is a Samsung C500 Series S27C500H 27-Inch Screen LED-Lit Monitor. Not sure what AMD has to do with it. As for turning off v-sync, doesn't that cause "rubber bullets"?
-
My current computer monitor (not TV) is a Samsung C500 Series S27C500H 27-Inch Screen LED-Lit Monitor. Not sure what AMD has to do with it. As for turning off v-sync, doesn't that cause "rubber bullets"?
edit: had them backwards. AMD has a tech called FreeSync and Nvidia has one called G-Sync, which are basically their own implementations of V-sync. An Nvidia card doesn't benefit from FreeSync-enabled monitors and vice versa, but V-sync is the older, generic implementation that works on all cards.
Wiley.
-
I have a 27in Dell G-Sync 2K monitor. It's perfect, just make sure you have a GeForce video card and enable it in the Nvidia display settings.
<S> Max
-
This is what I have...great price too!
https://www.bestbuy.com/site/dell-27-led-qhd-g-sync-monitor-black/5293502.p?skuId=5293502&ref=212&loc=1&extStoreId=54&ref=212&loc=DWA&gclid=EAIaIQobChMIrffFl6by5gIVC9vACh3y4wtsEAQYASABEgKlVPD_BwE&gclsrc=aw.ds
-
My current computer monitor (not TV) is a Samsung C500 Series S27C500H 27-Inch Screen LED-Lit Monitor. Not sure what AMD has to do with it. As for turning off v-sync, doesn't that cause "rubber bullets"?
Turning V-sync off is for testing purposes only, to see how high your video card could go with a matching monitor. With V-sync ON the card draws complete frames at the monitor's pace; with V-sync OFF it starts drawing a new frame right after the previous one has been sent to the monitor, which can put several partial images on the screen at once (tearing), which is not ideal. If your monitor is limited to 60 Hz, that means a max of 60 FPS no matter how powerful your video card is; the card will wait if needed. With V-sync off you can see how fast your card could drive a faster monitor.
AMD FreeSync and Nvidia G-Sync are adaptive synchronizing technologies which, very basically, vary the monitor's refresh rate to match the frame rate the video card is actually producing. They both work, but you'd be somewhat limited to compatible monitors/video cards by choosing one or the other.
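If it helps, here's a very rough toy model of the difference (plain Python, illustrative numbers only; real drivers also do triple buffering and other tricks):

```python
import math

def delivered_fps(gpu_fps, panel_hz, mode):
    """Very rough model of how many distinct frames per second reach the screen."""
    if mode == "vsync_off":
        return gpu_fps                        # everything is pushed out, tearing and all
    if mode == "vsync_on":
        if gpu_fps >= panel_hz:
            return panel_hz                   # hard cap at the panel's fixed refresh rate
        # classic double-buffered V-sync: a missed refresh means waiting for the
        # next tick, so the rate snaps down to panel_hz/2, panel_hz/3, ...
        return panel_hz / math.ceil(panel_hz / gpu_fps)
    if mode == "adaptive":                    # the FreeSync / G-Sync idea
        return min(gpu_fps, panel_hz)         # panel refreshes whenever a frame is ready
    raise ValueError(mode)

# card managing 50 FPS on a 60 Hz panel: plain V-sync drops to 30, adaptive stays at 50
for mode in ("vsync_off", "vsync_on", "adaptive"):
    print(f"{mode}: {delivered_fps(50, 60, mode):.0f} FPS")
```

That drop from 50 to 30 with plain V-sync is exactly what the adaptive technologies exist to avoid, and the vsync_off case is why turning it off is the honest way to measure the card's headroom.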
-
This is what I have...great price too!
https://www.bestbuy.com/site/dell-27-led-qhd-g-sync-monitor-black/5293502.p?skuId=5293502&ref=212&loc=1&extStoreId=54&ref=212&loc=DWA&gclid=EAIaIQobChMIrffFl6by5gIVC9vACh3y4wtsEAQYASABEgKlVPD_BwE&gclsrc=aw.ds
Is it VESA mounting compatible? It might not be important with a single monitor, but once someone starts working with more than one monitor, those desk-stand-only models become a PITA.
Just something to think about.
:cheers:
-
If you have a TV in your RV, try doing 10 miles per hour. You can then tell folks your TV gets 52800 fps.
-
I have 2 Sanyos I use for computer monitors..
A 55 that I use for the sim pit, and a 32 for my GP/browser computer..
The 55 is hooked up to the 1060 vid card, thru the HDMI#1 primary input..
The TV and the card self adjusted, so I don't get screen stretch or squeeze..
I thought that was cool as hell..
The 32 is connected via the old-school PC input, as my browser GP/PC doesn't use a vid card.. It also self adjusted, but it had a few issues with screen size..
After experimenting with the manual TV picture settings, I ended up hitting the auto adjust menu button over and over; each repeat of the auto adjust changed it little by little..
Finally it worked itself out! Works good..
I don't know if the process will work for a Samsung, tho!