Author Topic: Monitors  (Read 22040 times)

Offline Xessive

  • Gold Member
  • *
  • Posts: 9,920
    • XSV @ deviantART
Monitors
« on: Sunday, September 03, 2017, 01:41:46 PM »
Monitors: the windows to our personal computing devices. The one you have can determine the very basis of how you interface with your machine.

I'm often torn between resolution, refresh/response time, and colour accuracy.

A couple of years ago I picked up the Asus PG278Q, aka the Asus ROG Swift. It's a 1440p screen with a 1ms response time that can crank out up to 144Hz, with G-Sync support.

It's a TN panel, but I was surprised by how accurate the colour representation appears on it, considering one of the common downsides of TN panels is their poor colour accuracy. Its viewing angles are not great, but as a desktop monitor I'm generally only ever going to view it from one angle anyway. This was later addressed with its refresh, the Asus PG279Q, which has a gorgeous IPS panel. However, TN panels are renowned for their speed. Overall, I've been very happy with it.

Fairly recently, I wound up with a 27" curved monitor (Samsung CFG70), which I have been using for the past few months. It's from Samsung's first lineup of "gaming" monitors: a 1080p screen with FreeSync, 144Hz, and an IPS panel. Gorgeous screen, but it suffers from some IPS glow (which is random and inevitable). The curved screen was striking at first; I went in assuming it was yet another gimmick whose novelty would soon wear off, but at the distance I normally sit from a monitor it actually felt a tad more immersive. It eventually became the norm and I adjusted to it fairly quickly.

Even though it was a lower resolution than I had been using, I took the opportunity to run games at 1080p and really get some performance out of them over the past few months. Great times, but it's now time to come back to my beloved Asus ROG Swift. This is when I noticed the difference.

Apparently, my eyes were so accustomed to the concave curves of the Samsung CFG70 that now this monitor looks convex! I feel like I'm looking at a classic bubble TV screen! ;D The optical illusion is blowing my mind! I know it's in fact a perfectly flat screen but it looks and feels convex!

I'm sure this will soon wear off as well, but it's an interesting experience, having assumed curved screens were just gimmicks.

Offline gpw11

  • Gold Member
  • *
  • Posts: 7,182
Re: Monitors
« Reply #1 on: Tuesday, September 19, 2017, 08:31:50 PM »
Ha, that's pretty hilarious

Offline Xessive

  • Gold Member
  • *
  • Posts: 9,920
    • XSV @ deviantART
Re: Monitors
« Reply #2 on: Thursday, September 21, 2017, 05:59:49 AM »
Yeah, it was weird for a few days.

I've readjusted to the flat screen and all is well.

I think curved screens make more sense for ultrawide monitors (21:9 ratio).

Offline iPPi

  • Senior Member
  • *
  • Posts: 3,159
  • Roar!
Re: Monitors
« Reply #3 on: Tuesday, January 23, 2018, 12:56:47 AM »
A bit of a bump, but my monitor of 12 years has finally died.  It was a 21" HP monitor with a resolution of 1680x1050 -- at the time, it was quite a decent monitor.  Over the years, I picked up a 20" (in 2010) and a 23" in 2013 and have been using a three monitor setup.  Both of these monitors are cheap -- the 20" cost $100 and the 23" came with my PC that I ordered from Dell back then.

I think now might be a good time to get a new monitor, and the latest Alienware AW3418DW has caught my eye.  It's a curved 3440x1440 IPS display with G-Sync.  It's been getting decent reviews.


Offline ren

  • Veteran
  • ****
  • Posts: 1,672
Re: Monitors
« Reply #4 on: Tuesday, January 23, 2018, 07:52:57 AM »
I bought a Dell U2415 on sale a few months ago and love it. So much that I may buy another one to complement it.

A 34" curved monitor sounds amazing but I prefer having a two or three monitor setup over that.

I suppose it depends what you do on the monitor though. I use it for work, not gaming, so multi-monitor beats size.

Offline iPPi

  • Senior Member
  • *
  • Posts: 3,159
  • Roar!
Re: Monitors
« Reply #5 on: Wednesday, January 24, 2018, 12:05:52 AM »
Yea, one of the drawbacks of having an ultrawide is that it's a single screen with tons of real estate but sometimes productivity can be a challenge.  I've heard there are ways to split the screen space up either using a tool or even Windows' built-in snap features. 

In any event, I will probably keep the 23" on the side as well, so that to whatever extent I work on my desktop PC, I still can.  I do all my work on my work laptop anyway.

Will be waiting for Dell to have a sale on the monitor in Canada.  It's way too expensive at regular price, which is a dealbreaker.

Offline Xessive

  • Gold Member
  • *
  • Posts: 9,920
    • XSV @ deviantART
Re: Monitors
« Reply #6 on: Wednesday, January 24, 2018, 01:55:31 AM »
The Alienware AW3418DW is a gorgeous monitor and generally scored well in reviews. I haven't tried it myself, but it was on my list as a potential option when I was looking at ultrawides. The other three I was looking at were from Asus, Acer, and LG, with LG having the most affordable option (the LG 29UC88, close to $300).

The main reasons I ended up opting against ultrawide are cost, gaming, and that the trade-off did not add significantly to productivity. I was still doing fine with a dual monitor setup. In fact, I ended up going with a multi-system setup (two PCs linked) thanks to helpful utilities like Multiplicity. But that's another topic.

The cost of ultrawide is still too high, and while I can see it being useful for certain fields, it's not all that utilitarian for most people.

For me, I'd rather spend that amount on a significantly higher spec 16:9 screen (whether it's 4K or 1440p).

Unless you're in dire need, I wouldn't recommend buying a new monitor just yet. I'd wait until March (if not sooner), when Asus and Acer will announce their new monitors and the release dates of their previously announced HDR monitors. You'll either find something new you want or the prices will drop on something you're eyeing now.

Offline Quemaqua

  • Old Salt
  • Administrator
  • Forum god
  • *
  • Posts: 16,498
  • Don't touch the panda.
    • Bookruptcy
Re: Monitors
« Reply #7 on: Thursday, January 25, 2018, 10:15:30 PM »
Curiosity: why is having more monitors better for productivity as opposed to one big one? I've gotten used to my dual-monitor setup, but the primary advantage is being able to full-screen certain applications on the other screen easily, or snap things to the edges of that screen.

Genius-level flashes of inspiration with below-average technique. - Kacho Arino

Offline gpw11

  • Gold Member
  • *
  • Posts: 7,182
Re: Monitors
« Reply #8 on: Saturday, January 27, 2018, 07:09:54 PM »
I find it really easy to just have a PDF open full screen on one screen and a spreadsheet, CAD program, or whatever else open on the other.  That said, yeah, I could just snap to split screen on a larger monitor.

Offline ren

  • Veteran
  • ****
  • Posts: 1,672
Re: Monitors
« Reply #9 on: Monday, January 29, 2018, 09:34:51 AM »
Snapping is good but it really only helps you do a 50/50 split. With dual monitors you can snap on each of them which gives way more options.

Dual monitors also give you a separate taskbar for each screen which makes it a lot easier to manage windows than an alt+tab screen that goes on forever.

Snapping is also just not ideal for some applications; operating in full screen can be easier.

Offline Xessive

  • Gold Member
  • *
  • Posts: 9,920
    • XSV @ deviantART
Re: Monitors
« Reply #10 on: Saturday, February 03, 2018, 02:13:29 AM »
For design, I usually have the main work on my primary screen and toolkits along with any reference images on the other screen. It just allows me to give maximum real estate to the artwork on a single screen.

This is also effective using multiple (or virtual) desktops but I tend to get frustrated quickly when I'm repeatedly switching or alt-tabbing.

Offline gpw11

  • Gold Member
  • *
  • Posts: 7,182
Re: Monitors
« Reply #11 on: Saturday, February 03, 2018, 11:14:08 PM »
Not at all related to this thread really, but I swapped one of my office monitors with my home monitor.  Mainly because I work from home a fair bit and figured I should move the 24" there and could use my old 21" at the office with little problem (in combination with my existing 24").

My home monitor was a pretty high end (for the time) Samsung from like 2006 or so.  The office one I moved is a budget BenQ from the last couple of years. I only really noticed the difference between them since switching, but the newer one just has so much better colour reproduction and viewing angles.  Plus, a way lower response time (like 2ms compared to 6ms), but fuck me, do I wish there was an easy way to turn the backlight down.

Offline Cobra951

  • Gold Member
  • *
  • Posts: 8,934
Re: Monitors
« Reply #12 on: Sunday, February 04, 2018, 12:12:51 PM »
My screen's backlight doesn't even get a normal adjustment control.  Rather, it falls under energy saving, with 4 levels:  High, Medium, Low and Off.  These are not brightness levels, but rather energy-saving labels.  So the scale is backward from what you'd think.  It's really stupid.  Off for full brightness, High for dimmest.  I was on High for a long time, and more recently went to Medium.  The full brightness seared my retinas, especially when the screen was new. 

I'm sure you have some sort of backlight adjustment you can live with most of the time, then use the monitor's software controls for brightness, contrast, etc.  I have 4 presets for the software visual adjustments, of which I use Normal and Movie almost exclusively.  Normal has the white level at 80%, for regular computer activities, like right now.  Movie I have set to use the full range of brightness (with white at 100%), and I use that for videos and gaming.

Offline iPPi

  • Senior Member
  • *
  • Posts: 3,159
  • Roar!
Re: Monitors
« Reply #13 on: Saturday, February 10, 2018, 07:38:07 PM »
Ordered the AW3418DW as it is on sale in Canada.  Now the wait -- apparently won't get here until March!

Offline Xessive

  • Gold Member
  • *
  • Posts: 9,920
    • XSV @ deviantART
Re: Monitors
« Reply #14 on: Sunday, February 11, 2018, 09:37:52 AM »
Quote from: iPPi
Ordered the AW3418DW as it is on sale in Canada.  Now the wait -- apparently won't get here until March!
Nice! Get us some shots of your setup!

Offline MysterD

  • Forum god
  • *
  • Posts: 18,049
  • OWNet 4 Eternity & Beyond
Re: Monitors
« Reply #15 on: Sunday, February 18, 2018, 08:47:07 AM »
I have a Samsung U28E590D, a 28'' 4K monitor with a TN panel. Personally, I love it.

I do often bounce down to 1440p for games, since the GTX 970 won't handle most games at 4K 60fps for me. Though, b/c of the blur that can occur from running 1440p (or anything below native) on a 4K screen, make sure you crank your AA, AF, and/or anything else that can sharpen the image up until it isn't blurry for you.
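
The blur mostly comes down to scaling math: 1440p doesn't divide evenly into a 4K panel, so the scaler has to interpolate. Quick sanity check (just the arithmetic, assuming a straight scale to the panel; most scalers interpolate regardless, but integer ratios at least map cleanly):

Code: [Select]
panel = (3840, 2160)  # 4K (UHD) panel

# All of these are 16:9, so both axes scale by the same factor.
for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440), "720p": (1280, 720)}.items():
    sx, sy = panel[0] / w, panel[1] / h
    clean = sx.is_integer() and sy.is_integer()
    print(f"{name}: scales by {sx:g} -> {'integer mapping (can stay sharp)' if clean else 'non-integer (interpolated, softer)'}")

# 1080p: scales by 2 -> integer mapping (can stay sharp)
# 1440p: scales by 1.5 -> non-integer (interpolated, softer)
# 720p: scales by 3 -> integer mapping (can stay sharp)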

Offline gpw11

  • Gold Member
  • *
  • Posts: 7,182
Re: Monitors
« Reply #16 on: Sunday, February 18, 2018, 10:58:26 PM »
Quote from: Cobra951
My screen's backlight doesn't even get a normal adjustment control.  Rather, it falls under energy saving, with 4 levels:  High, Medium, Low and Off.  These are not brightness levels, but rather energy-saving labels.  So the scale is backward from what you'd think.  It's really stupid.  Off for full brightness, High for dimmest.  I was on High for a long time, and more recently went to Medium.  The full brightness seared my retinas, especially when the screen was new.

I'm sure you have some sort of backlight adjustment you can live with most of the time, then use the monitor's software controls for brightness, contrast, etc.  I have 4 presets for the software visual adjustments, of which I use Normal and Movie almost exclusively.  Normal has the white level at 80%, for regular computer activities, like right now.  Movie I have set to use the full range of brightness (with white at 100%), and I use that for videos and gaming.

This helped actually.  I can't find a direct control but I can mess with the presets, although previously I didn't realize they were adjusting the backlight as well.  There's a low blue light mode which seems to help and an eco mode which is great, but seems to auto adjust - which is pretty fucking distracting. 

Offline Cobra951

  • Gold Member
  • *
  • Posts: 8,934
Re: Monitors
« Reply #17 on: Monday, February 19, 2018, 12:04:50 PM »
It depends on the monitor.  On mine, backlight is strictly its own thing under energy saving.  All my presets and full-range controls are software adjustments only (meaning they only affect the LCD elements).  Not ideal by far.

I have never met a TN panel that I liked.  I hate the extremely narrow vertical viewing angle, with colors inverting when the viewpoint is lower than the screen.  I know they have the lowest display lag, but I'll take a few more ms of that to get a screen that shows the same colors whether I'm standing up, sitting up straight, or slumping down in my chair.

Offline Cobra951

  • Gold Member
  • *
  • Posts: 8,934
Re: Monitors
« Reply #18 on: Sunday, August 12, 2018, 10:57:13 PM »
Well, my trusty 11-year-old Samsung 23" TV died.  I woke up at around 3:30 AM to see . . . nothing!  The audio from some satellite station was still playing, but all I could see was a dim backlit rectangle.  Nothing I tried for over an hour brought it back.  No display of any kind, including the OSD.  My guess is that enough dust accumulated on the electronic innards through the top vents to finally overheat the display chip and fry it.  RIP!

So I stayed up another hour on my laptop (thank God I got one around last Christmas) scouring the monitor offerings listed in the Microcenter site.  I narrowed it down to 2, and ended up with this ASUS 279QL after checking it out at my local MC store.  Pretty much exactly what I wanted, given that it was going to be a monitor and not a full TV.  27" with almost no bezel, it's only slightly wider than the 23" Samsung.  1080p, 60 Hz, though I've read it can be driven at 75 Hz.  It's either IPS or AMVA+, depending on where you go read about it.  Rock-steady colors at all viewing angles, with brightness dropping off as they get extreme.  Beautiful picture, with 6 different presets intended for different purposes.  All but one (sRGB) are user-modifiable.  HDMI, DisplayPort and D-Sub (VGA) inputs, separately selectable in the OSD.  Digital audio through HDMI and DP, with audio out through a standard headphone jack.  (Internal speakers suck, as usual.)  I went with DP, and the sound through my Logitech 2.1's is so much better than the integrated analog audio I've been getting for nearly 2 years.  I was expecting higher dynamic range (a much better noise floor and no harsh distortion for loud SFX), but I was not prepared for the much-tighter bass and overall clarity.

The picture is the real star.  Damn, this thing looks good.  I've spent the better part of the weekend fiddling with settings on both the monitor and Nvidia Control Panel, and I think I'm finally coming to grips with how to make it look its best.  I need basically 2 modes: one that looks good when doing what I'm doing right now without searing my retinas, and another for balls-to-the-wall high contrast for games, movies, TV shows, etc.  I'm almost there.  Still tweaking, but pretty happy.

The surprise was the stand.  This thing allows 4 degrees of motion: height, and 3 axes of rotation.  It can be turned around into portrait mode (9:16).  I doubt I'll ever use that, but who knows.

The sharpness, contrast and size are such great improvements.  The downside is no legacy analog connections, and no TV tuner.  Since I went with DP, I could get an HDMI hub, and use it to plug in both the Xbox One and the 360.  For now, I will probably just plug in the One (haven't bothered yet).  The Wii will probably go into the closet.  I haven't even turned it on in years, but it still feels like a casualty.  The PS2 was disconnected long ago.  I will probably get a cheap TV, and run the coax to it, at some point.  Come to think of it, I can borrow the little one in the kitchen.  Now I just need to find something to prop it up on in my room.

Edit:  A couple more details:  Response time is 5 ms, and input lag, according to this review, is 12 ms.  Every official description I've read says that cables beyond VGA are optional, but that's not the case.  I got HDMI, DisplayPort and audio cables, and several adapters, most notably HDMI to DVI and full-size HDMI to mini.  Come to think of it, I think the only thing I didn't get was VGA.  I'll have to look in the bag of goodies to be sure.
« Last Edit: Monday, August 13, 2018, 05:26:05 AM by Cobra951 »

Offline Cobra951

  • Gold Member
  • *
  • Posts: 8,934
Re: Monitors
« Reply #19 on: Tuesday, August 14, 2018, 12:41:02 PM »
In general, some thoughts.

Quote from: MysterD
V-Sync is a waste, unless you get lots of tearing. You lose too many frames with V-Sync and all. If you have little to no tearing, use a program like MSI Afterburner or Nvidia Inspector and cap it to 60fps - provided you don't take crazy roller-coaster type hits that go up and down a lot (i.e. go from 60 to 25fps, then up to 45fps, then back to 30fps, then... well, you get the drift).

Adaptive Sync is not a bad idea either, as long as it isn't bouncing back and forth b/t 60fps and 30fps all the time.

Fast Sync is good too for Nvidia users - but you have to make sure you still don't get graphical tearing. It's like running No Sync, except the "useless" frames get thrown out. Doesn't always work, though - it did not play nice w/ Homefront: The Revolution for me. When using it, cap to 60fps (or wherever you like) w/ Afterburner or Nvidia Inspector.

G-Sync monitors are great for Nvidia users, if you're willing to afford that expense in $$ - since you don't get tearing & you don't lose performance/frames.

Thanks for those tips.  As long as the hardware performs as well as needed or better, I like sticking to vsync.  For the last 2 years, that's been the case, because of my lower resolution.  Very little trouble maintaining a locked 60 fps on everything I've been playing.  Now I'm going to have to make compromises in some cases, and as luck would have it, the game I'm into right now seems a tough, unoptimized case.

Thing is I hate tearing and flagging.  I'll put up with them when I have no choice, but I'm not going to suffer through them when I don't.  Your approach, it seems to me, practically guarantees those nasties in true fullscreen mode.  And a hard 60 fps cap imposed in an app often makes it more difficult for the system to maintain that frame rate than it is with triple-buffering and vsync.  With the latter, the process can get slightly ahead of the game when able, giving it more wiggle room.

The whole purpose of Adaptive Sync is to prevent the sudden drop from 60 to 30 fps.  If the system can't generate a frame in under 16.7 ms, it will skip the sync on it.  Theoretically anyway.  I don't really know the inner workings.
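
For the record, the 16.7 ms is just 1000 ms divided by the 60 Hz refresh. A toy sketch of the "skip the sync" idea - purely illustrative, not how the driver actually does it:

Code: [Select]
def frame_budget_ms(refresh_hz):
    # Time available per refresh: ~16.7 ms at 60 Hz, ~6.9 ms at 144 Hz.
    return 1000.0 / refresh_hz

def adaptive_vsync(frame_time_ms, refresh_hz=60):
    # Toy model: frames that fit the budget wait for the sync; slow frames get
    # shown immediately (tearing) instead of being held a full extra refresh.
    if frame_time_ms <= frame_budget_ms(refresh_hz):
        return "synced"
    return "unsynced (tear instead of dropping to a 30fps cadence)"

print(round(frame_budget_ms(60), 1))   # 16.7
print(adaptive_vsync(14.0))            # synced
print(adaptive_vsync(22.0))            # unsynced (tear instead of dropping to a 30fps cadence)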

I actually saw some Freesync and GSync monitors while searching around, and at Microcenter as well.  They were all too expensive, and higher-res than I wanted.  Pretty soon, I imagine anything that isn't a cheapie will be 1440p or 4K.

Offline MysterD

  • Forum god
  • *
  • Posts: 18,049
  • OWNet 4 Eternity & Beyond
Re: Monitors
« Reply #20 on: Tuesday, August 14, 2018, 04:35:19 PM »
There are other problems also w/ V-Sync ON.

Input lag. If it's waiting for frames to sync up perfectly b/c it's got too many frames queued - boom, input lag. It can really affect your gameplay and throw you off w/ your controls, movement, and whatnot.

You can also get stuttering from V-Sync on if your framerate goes below your monitor's refresh rate. Big framerate hits don't help either, as you can get slapped w/ stuttering from that.

Bethesda's games on their overly duct-taped engine (i.e. Fallout 3/4/NV & Skyrim) can definitely get hit w/ all of this stuff.

V-Sync ON is normally best when you have a framerate matching your monitor and you lose NOTHING - i.e. when you will always remain at 60fps on your 60hz monitor.

FreeSync is an open solution for syncing. AMD supports it, though Nvidia doesn't, so for now it's only for AMD cards. But not all FreeSync monitors are created equal, as there's no real guideline or curation done on them. No royalties in FreeSync either. My 4K monitor supports FreeSync - but eh, Nvidia doesn't support it officially.

G-Sync is Nvidia's solution for syncing. It's expensive and it's great; I have it on my SC15 laptop. Input lag is at a minimum, frame stuttering is minimized & there's no screen tearing. It's the best syncing method out there - and it really shows, as it's pricey as hell. This is something Nvidia actually curates: a special module they developed gets put in the monitor to help control the frames and work in tandem with the GPU at variable framerates, keeping issues to a minimum. Plus, G-Sync is how they make $ with royalties too - so, yup, that's all why it's so pricey.

IMHO, there's nothing like running a fast-paced action game like Prey 2017 at 120Hz w/ 120fps w/ G-Sync, taking a hit down to 90fps, and having it feel like nothing ever happened (no stutter, no issues, no tearing, nothing) - it still runs smooth like it never took a hit, since you're above 60fps the whole time. I notice & feel the hits w/out G-Sync when I go from say 60fps down to 50fps, but don't notice or feel them going from 120fps to 90fps w/ G-Sync. Nothing like G-Sync, IMHO.
« Last Edit: Tuesday, August 14, 2018, 05:07:42 PM by MysterD »

Offline Cobra951

  • Gold Member
  • *
  • Posts: 8,934
Re: Monitors
« Reply #21 on: Wednesday, August 15, 2018, 10:32:44 AM »
Yeah, I completely understand the idea of a variable refresh rate controlled by the rendering software.  I'm surprised it took so long to come about, because LCD doesn't need constant refreshing like CRT, and LCD has been ubiquitous for over 10 years.  A 1Hz refresh will look the same as a 240Hz refresh on a still image.  Nvidia are the spoilers here.  We've had a VESA standard through DisplayPort 1.2a since 2013; it's free to implement, and AMD calls their implementation FreeSync, which they also extended to HDMI.  (Had to look some of those details up, though I was right, off the cuff, about the broad strokes.)  Enter the proprietary G-Sync from the anticonsumer fuckwads at Nvidia, and we get barriers to entry and to compatibility, muddying the waters and delaying universal adoption of adaptive sync.  So many of us still stick with fixed refresh, and its special challenges to smooth rendering.

To reply to your other points:  Yes, there is up to 16.7-33.3 ms of additional lag with vsync at 60 Hz.  Having worked on real-time computer rasterizing code, I get the picture quite clearly.  Everything's a tradeoff.  I've made mine.  I already touched upon the sudden drops in frame rate when frames don't get rendered in time for the next sync.  I also mentioned the solution, which Nvidia calls Adaptive Vsync (not to be confused with the subject of the paragraph above).  Whenever a frame takes longer than 16.7 ms to render, the sync gets skipped.  If all frames take too long, it's practically the same as having vsync turned off.  If all frames come in under that magic number, the motion is smooth as butter.  Best of both worlds.
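
Those lag bounds fall straight out of the refresh interval - each frame sitting in the queue can wait up to one full interval before it reaches the screen. Rough worst-case arithmetic (a simplification; real pipelines vary):

Code: [Select]
def vsync_added_lag_ms(refresh_hz, queued_frames):
    # Worst case: each queued frame waits up to one full refresh interval.
    return queued_frames * (1000.0 / refresh_hz)

for q in (1, 2):
    print(f"{q} queued frame(s) at 60 Hz: up to {vsync_added_lag_ms(60, q):.1f} ms extra")
# 1 queued frame(s) at 60 Hz: up to 16.7 ms extra
# 2 queued frame(s) at 60 Hz: up to 33.3 ms extra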

Offline MysterD

  • Forum god
  • *
  • Posts: 18,049
  • OWNet 4 Eternity & Beyond
Re: Monitors
« Reply #22 on: Wednesday, August 15, 2018, 02:53:01 PM »
Only thing w/ Adaptive Sync is - well, you can get cut back to 30fps on the Syncing. If I wanted console gaming's weak framerate standards, I would've bought an XB1 or PS4. :P

The only good thing w/ it adapting you down to 30fps is... well, you keep the stuttering issue to a minimum. That's the only good I can really see from this, unless the game requires you to be hard-capped at 30fps or crap goes out the window (i.e. Dark Souls: Prepare To Die PC version - hit detection, physics, and other stuff go out the window when not capped at 30fps, if not modded properly w/ other mods).

It's okay if Adaptive Sync, say, cuts back to 45fps, so you're still above 30fps... but that isn't good for something like Fallout 3/4/NV & Skyrim (damn that duct-taped engine), where the games really need you to be at intervals of 30fps - i.e. at 30fps or 60fps - b/c otherwise it ruins the physics and animations and can cause frame stuttering, input lag, and other stuff.

Sure, A-Sync is fine to me, namely when you're at 60fps... if that's your cap. But I really don't want to be capped back to 30fps - bleh @ that. 45fps is okay to be A-Synced to, as long as other crap (hit detection, animations, graphics, physics, etc) doesn't go out the window like it can in some games (i.e. see the games mentioned above).

But really though - if you like A-Sync...or any other methods, for that matter, use it. It's all about what you like.

There are definitely trade-offs w/ every method out there.

Offline Cobra951

  • Gold Member
  • *
  • Posts: 8,934
Re: Monitors
« Reply #23 on: Wednesday, August 15, 2018, 03:09:45 PM »
Hmm, why would you get dropped to 30 fps with adaptive?  The sync is based on the refresh, which is going to be at least 60 Hz (on normal monitors).  It's doing exactly what I expected for me with NMS.  When it can't maintain 60 fps, I get some tearing, but still a lot better than 30 fps.  I know what 30 fps looks like on a 60Hz screen all too well.

Again, I'm not capped by the game, or some tool like MSI Afterburner.  The only thing capping my frame rate is vsync.  If two different processes are hampering the flow of frames at the same time, yeah, I can see how you might end up with worse performance.

Edit:  OK, to sum up after some thought:  If you're capping your frame rate with a game option, or with an external tool like MSIA, I don't think you want vsync of any kind.  That would mean that two separate forms of frame-rate control are going to step on each other's toes.  You're most likely going to see serious frame drops.  Choose one (option/tool) or the other (vsync).  Maybe this is why we're not quite connecting in this conversation.  We're looking at the problem from two different sides.

Offline MysterD

  • Forum god
  • *
  • Posts: 18,049
  • OWNet 4 Eternity & Beyond
Re: Monitors
« Reply #24 on: Wednesday, August 15, 2018, 03:16:24 PM »
Quote from: Cobra951
Hmm, why would you get dropped to 30 fps with adaptive?  The sync is based on the refresh, which is going to be at least 60 Hz (on normal monitors).  It's doing exactly what I expected for me with NMS.  When it can't maintain 60 fps, I get some tearing, but still a lot better than 30 fps.  I know what 30 fps looks like on a 60Hz screen all too well.

Depends on the game. Some PC games like Batman: Arkham Knight can take some crazy hits, even when capped to 60fps. I've taken hits from 60fps down to 25fps back to 45fps then down to 25 then up to 60fps w/ that unoptimized piece of junk.

I had to cap that game at 30fps internally (in the game options) b/c it is poorly put together. Granted, I ain't tried it on my 6GB GTX 1060 laptop - but it ran like ass on my desktop w/ either the GTX 960 4GB or GTX 970. And this was even after its final patch.

Awful, awful, awful port.
Really good game, though - for the most part.

And who would want to be synced to 30fps w/ fast-paced games? I don't.

While it might be fine for something like FF13 that's basically turn-based, it would be horrible for something like Bayonetta PC. 60fps (or better) is where it's at for something crazy-fast like Bayonetta PC.

Offline Cobra951

  • Gold Member
  • *
  • Posts: 8,934
Re: Monitors
« Reply #25 on: Thursday, August 16, 2018, 08:24:11 AM »
Yeah, I remember reading about the lousy Arkham Knight PC port.  That was back before I got my current game PC.  I played it on the Xbox, at 30 fps.  I was used to it then, and the game ran very well there.  One of my favorites.  Shame it didn't make the transition to PC properly.

You replied so quickly to my last post that I think you missed my edit.  Basically, I agree with you.  If you're going to cap your frame rate, you don't want vsync, normal or Nvidia's adaptive (or AMD's dynamic) vsync.  Vsync (of any pre-FreeSync kind) works best as the one and only frame-rate cap.  I don't have any experience with FreeSync/GSync, so I'll leave those alone.  Subject for another day.

New drinking game:  Take a shot every time you read "sync" in this thread.  Ouch!

Offline Cobra951

  • Gold Member
  • *
  • Posts: 8,934
Re: Monitors
« Reply #26 on: Thursday, September 06, 2018, 07:09:54 AM »
OK, so it turns out the big performance killer is windowed borderless mode.  Not an issue at 768p, but certainly is at 1080p.  After trying out Far Cry 5 briefly that way at native res, and shaking my head, I switched to true fullscreen, and it made a world of difference.  While I still get some drops below 60 fps, it behaves very well, and looks amazing.

I have to smack myself in the head.  Here I was starting to think my 1060 wasn't all that, at my new screen's 1080p res.  Weeks later, it comes to me practically in a dream that I had set Far Cry 5 to render internally at 1.3x resolution.  That works out to 998p, if the screen res is 768p, which worked fine.  But at a screen res of 1080p, that's 1404p--damn near 1440p.  No wonder I was getting frame drops.  Dropping the internal resolution scaling down to 1.0 yields a perfectly locked 60 fps at 1080p, always, in FC5.  Fook Mi.  So much time wasted with the wrong impression.
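
For anyone checking the math: the scale factor applies to both axes, so the pixel load grows with its square. A quick sketch, assuming a 1366x768 base for "768p":

Code: [Select]
def scaled_res(width, height, scale):
    # Resolution scaling multiplies both axes, so pixel count grows by scale**2.
    return round(width * scale), round(height * scale)

for base_w, base_h in ((1366, 768), (1920, 1080)):
    w, h = scaled_res(base_w, base_h, 1.3)
    ratio = (w * h) / (base_w * base_h)
    print(f"{base_h}p at 1.3x -> {h}p ({ratio:.2f}x the pixels to render)")
# 768p at 1.3x -> 998p (1.69x the pixels to render)
# 1080p at 1.3x -> 1404p (1.69x the pixels to render)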

Offline MysterD

  • Forum god
  • *
  • Posts: 18,049
  • OWNet 4 Eternity & Beyond
Re: Monitors
« Reply #27 on: Friday, September 14, 2018, 07:27:05 PM »
Quote from: Cobra951
I have to smack myself in the head.  Here I was starting to think my 1060 wasn't all that, at my new screen's 1080p res.  Weeks later, it comes to me practically in a dream that I had set Far Cry 5 to render internally at 1.3x resolution.  That works out to 998p, if the screen res is 768p, which worked fine.  But at a screen res of 1080p, that's 1404p--damn near 1440p.  No wonder I was getting frame drops.  Dropping the internal resolution scaling down to 1.0 yields a perfectly locked 60 fps at 1080p, always, in FC5.  Fook Mi.  So much time wasted with the wrong impression.

More or less, a 6GB GTX 1060 performs about 11-14% better than a "4GB" GTX 970. It's insanely capable at 1080p (i.e. often High settings at 60fps-120fps, depending on the game & what syncing you're using). At 1440p on a GTX 970, you're likely hitting 40-60fps at Medium; speaking from experience here.

When you say "1.3x resolution", you're basically doing downsampling, like Nvidia's DSR (Dynamic Super Resolution) technique. Your card really is rendering at 1404p and then scaling it back down to what your display is (1080p, b/c that's the max of your screen) - but you still get spit back a much better & clearer image than standard 1080p, b/c you started much higher from the get-go (1404p). So you get slapped w/ the 1404p performance cost from your card, even though the image gets downscaled back to 1080p. You'll get something that looks, more or less, somewhere b/t 1080p and 1404p. Usually, downsampling/DSR like this is used when you have insanely good performance (often above your monitor's refresh rate) and you want to make the image look even better on a lower-resolution monitor - i.e. say you have an 11 GB GTX 1080 Ti (which is built for 60fps at 4K) but you don't have a 4K screen (i.e. you are using a 1080p 60Hz monitor, for some reason).

With more VRAM on the 1060 (if you got the 6GB version), you should also be able to run higher settings than the 970, since the 1060 has a bigger VRAM buffer. Where the GTX 970 would start choking once a game's VRAM usage hits around 3.5GB or more (b/c of the way its memory was partitioned b/t the really fast 3.5GB and the ultra-slow other 0.5GB, which may or may not get used depending on the situation and what's being allocated), the 6GB 1060 won't choke (to a slideshow) if you're hitting 3.5GB or 4GB of VRAM. That really matters at higher resolutions (if you're attempting them) like 1440p or 4K, or if you run a poorly optimized game that can eat VRAM for breakfast (i.e. Batman AK) even at 1080p.
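
To put rough numbers on the workload side of it - pure pixel-count arithmetic, not a benchmark:

Code: [Select]
resolutions = {
    "1080p": (1920, 1080),
    "1404p (1.3x scale)": (2496, 1404),
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels per frame ({w * h / base:.2f}x 1080p)")
# 1080p: 2,073,600 pixels per frame (1.00x 1080p)
# 1404p (1.3x scale): 3,504,384 pixels per frame (1.69x 1080p)
# 1440p: 3,686,400 pixels per frame (1.78x 1080p)
# 4K: 8,294,400 pixels per frame (4.00x 1080p)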

Offline Xessive

  • Gold Member
  • *
  • Posts: 9,920
    • XSV @ deviantART
Re: Monitors
« Reply #28 on: Sunday, September 23, 2018, 11:31:02 AM »
I'm usually willing to sacrifice some resolution for better visual fidelity and performance overall.

My current monitor has a default resolution of 2560x1440 and I generally run things at full res. But if bringing it down to 1080p can net some extra smoothness out of the framerate or allow me to kick some setting up a notch at the same rate, I'll go for 1080p.

Considering that I sit around 60cm away from my screen (or a full metre when I lean back), on the desktop I notice the resolution drop, but in a game, particularly something with a lot of movement, it's hard to notice.
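
There's some math behind the distance part too: what matters is pixels per degree of visual angle, and around 60 px/deg is a common rule of thumb for where extra resolution stops being visible. A rough sketch, assuming a 27" 16:9 panel:

Code: [Select]
import math

def pixels_per_degree(h_pixels, diag_in, distance_cm, aspect=(16, 9)):
    # Panel width from its diagonal, then horizontal pixels per degree of
    # visual angle at the given viewing distance.
    width_cm = diag_in * 2.54 * aspect[0] / math.hypot(*aspect)
    fov_deg = math.degrees(2 * math.atan((width_cm / 2) / distance_cm))
    return h_pixels / fov_deg

for dist in (60, 100):
    for name, px in (("1440p", 2560), ("1080p", 1920)):
        print(f"{name} at {dist} cm: ~{pixels_per_degree(px, 27, dist):.0f} px/deg")
# Roughly 48 vs 36 px/deg at 60 cm, and ~77 vs ~58 px/deg at 1 m - so the drop
# is easy to see up close, and much harder to see leaning back (or in motion).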

Offline Cobra951

  • Gold Member
  • *
  • Posts: 8,934
Re: Monitors
« Reply #29 on: Monday, September 24, 2018, 10:03:19 AM »
It seems 1440p is the sweet spot now for upscale PC gaming.  4K is too much in at least some cases even for the best cards to maintain 60 fps at the higher game settings that so many players can't seem to do without.  1080p is plenty good enough for me, and it's the standard for movies, TV shows, and other content.  (On the other hand, if you have lots of 720p content, it scales better into 1440p.)  The problem is that really good monitors don't often come in that res anymore, do they?  1440p and 4K have become the home of all the best 16:9 monitors.

With fixed-pixel panels (i.e., everything since CRT), the native res is unquestionably the best--visible immediately.  I don't want to scale if I absolutely don't have to.  Lots to consider.  For now, I'm happy with what I got.

Offline MysterD

  • Forum god
  • *
  • Posts: 18,049
  • OWNet 4 Eternity & Beyond
Re: Monitors
« Reply #30 on: Monday, September 24, 2018, 07:03:25 PM »
Quote from: Cobra951
4K is too much in at least some cases even for the best cards to maintain 60 fps at the higher game settings that so many players can't seem to do without.

The regular RTX 2080 and the 2080 Ti do just fine at 4K at 60fps or better.
Only problem is... well, the price: it's around $800 to $1200.

Quote from: Cobra951
It seems 1440p is the sweet spot now for upscale PC gaming. 1080p is plenty good enough for me, and it's the standard for movies, TV shows, and other content.
I think most will be happy with either 1080p or 1440p at 60fps or better.

Offline MysterD

  • Forum god
  • *
  • Posts: 18,049
  • OWNet 4 Eternity & Beyond
Re: Monitors
« Reply #31 on: Thursday, November 01, 2018, 02:47:54 PM »
So, after 2 months or so of waiting, my ViewSonic 240hz G-Sync 1080p 25'' XG2560 arrived yesterday from Office Max / Office Depot (it was $180 when I ordered it; normally it is $480 MSRP).

This thing's amazing. Runs great. Performance is insane.

Had Overwatch running at around 155fps at 1080p w/ G-Sync on my desktop PC (i7 950; GTX 970; 16 GB RAM; W7 64-bit). Insane. Gonna do more testing later and all - but this thing's awesome.

Moved my 27'' 4K 60Hz monitor over and connected it to my SC15 laptop. Connected it via HDMI (at 30Hz, meh) for now - until my Mini DP to DP cable arrives (for 60Hz @ 4K support).

Offline idolminds

  • ZOMG!
  • Administrator
  • Forum god
  • *
  • Posts: 11,939
Re: Monitors
« Reply #32 on: Thursday, November 01, 2018, 10:19:01 PM »
I ended up getting a Dell S2209W for $5 at Goodwill the other day. 22 inch 1080p. Only had a minor scratch that I don't even notice unless I'm specifically looking for it, and even then it requires a solid white background to spot. It's larger and higher res than my old monitor so I'm pretty happy with it. I'd like to get a FreeSync monitor at some point but I mean come on, $5.

Also going to pick up some extra cables and see about hooking up my old monitor to finally run a dual monitor setup.

Offline Cobra951

  • Gold Member
  • *
  • Posts: 8,934
Re: Monitors
« Reply #33 on: Friday, November 02, 2018, 08:51:53 AM »
$5!  /thread

Offline MysterD

  • Forum god
  • *
  • Posts: 18,049
  • OWNet 4 Eternity & Beyond
Re: Monitors
« Reply #34 on: Saturday, November 03, 2018, 10:09:39 AM »
Quote from: idolminds
I ended up getting a Dell S2209W for $5 at Goodwill the other day. 22 inch 1080p. Only had a minor scratch that I don't even notice unless I'm specifically looking for it and even then requires a solid white background to spot. Its larger and higher res than my old monitor so I'm pretty happy with it. I'd like to get a freesync monitor at some point but I mean come on, $5.

Also going to pick up some extra cables and see about hooking up my old monitor to finally run a dual monitor setup.

For $5, why the heck not?
Sounds like an insane score to me.

I do recommend gamers, at some point, get a high refresh rate monitor (i.e. 90 hz or above) and make sure it has some sort of Adaptive Sync tech (i.e. Freesync for AMD, G-Sync for Nvidia).

Especially if they're packing some of the better cards - i.e. on NVidia, that would be 970's or above in the 900 series, 1060's or above in the 1000 series, and 2000 series (only high-end 2070's and above are out now).

The high refresh rate is really important: games w/ intense scenes - lots of characters on screen, effects, explosions, and especially online w/ more players - stay ultra-smooth even when you take a nasty framerate hit.

If you're going at say 90fps and a scene's so intense it knocks you down by 30fps (right back to 60fps) - yeah, you'll never notice or feel the hit. It's refreshing so fast, your eyes and mind won't catch the slow-down. I don't feel or see momentary slow-downs, stutters, or anything in Overwatch anymore, even when players jump into a room - the game just keeps going. No slow-down. Just go. At 155fps, I don't catch any slowdowns; it's sick.
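
Part of why is simple frame-time math - at high framerates each lost frame adds very little time per frame, and adaptive sync just refreshes a touch later instead of stuttering. A quick comparison, just arithmetic:

Code: [Select]
def frame_time_ms(fps):
    return 1000.0 / fps

for before, after in ((120, 90), (90, 60), (60, 50), (60, 30)):
    delta = frame_time_ms(after) - frame_time_ms(before)
    print(f"{before} -> {after} fps: each frame arrives {delta:.1f} ms later")
# 120 -> 90 fps: each frame arrives 2.8 ms later
# 90 -> 60 fps: each frame arrives 5.6 ms later
# 60 -> 50 fps: each frame arrives 3.3 ms later
# 60 -> 30 fps: each frame arrives 16.7 ms later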

Keep in mind, I also have faster Internet here too.

Pair the fast refresh rates w/ Adaptive Sync tech like FreeSync or G-Sync and you'll never notice or feel the lag or the screen-tear. Pretty much, FreeSync and G-Sync are built to stop those other major problems: lag and tearing.

High refresh rates monitors & Adaptive Sync tech = game changing technology.

Offline MysterD

  • Forum god
  • *
  • Posts: 18,049
  • OWNet 4 Eternity & Beyond
Re: Monitors
« Reply #35 on: Saturday, November 03, 2018, 04:09:42 PM »
My Mini-Display Port to (Regular) Display Port cable arrived today.

Great.

Now my 4K 27'' Samsung monitor, which I've moved over to the SC15 laptop (since I threw the 1080p 240Hz G-Sync monitor over to my desktop w/ the 970), can run at 60Hz max... instead of the 30Hz it tops out at w/ an HDMI cable.
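
The 30Hz-over-HDMI limit is really a bandwidth ceiling - assuming the monitor's HDMI port is 1.4 and the DP side is DisplayPort 1.2, which is typical for that generation of 4K screens - and the arithmetic is easy to check:

Code: [Select]
def video_data_rate_gbps(w, h, hz, bits_per_pixel=24):
    # Active-pixel data only; real links also carry blanking overhead on top.
    return w * h * hz * bits_per_pixel / 1e9

links = {"HDMI 1.4": 8.16, "DisplayPort 1.2": 17.28}  # usable Gbit/s, nominal

for hz in (30, 60):
    need = video_data_rate_gbps(3840, 2160, hz)
    verdict = ", ".join(f"{name}: {'fits' if need < cap else 'too much'}" for name, cap in links.items())
    print(f"4K @ {hz} Hz needs ~{need:.1f} Gbit/s -> {verdict}")
# 4K @ 30 Hz needs ~6.0 Gbit/s -> HDMI 1.4: fits, DisplayPort 1.2: fits
# 4K @ 60 Hz needs ~11.9 Gbit/s -> HDMI 1.4: too much, DisplayPort 1.2: fits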

Fallout 4 is running all smooth here at 60fps at 1080p with that 27'' monitor connected to my SC15. Some games should remain at 60fps b/c stuff goes wrong above 60fps (i.e. physics go nuts in Creation Engine games), so the 60Hz 4K 27'' monitor's perfect for games like that and for games like WWE 2K18 (which forces 60fps max).

Great! :)

Offline idolminds

  • ZOMG!
  • Administrator
  • Forum god
  • *
  • Posts: 11,939
Re: Monitors
« Reply #36 on: Saturday, November 10, 2018, 05:50:55 PM »


Ignore my messy wires. But I am very pleased right now. The HDMI cable I had to buy to hook them both up cost more than the monitor itself. In fact, I only paid $5 for my previous monitor so this dual monitor setup was $10 for the screens and like $12 for the cables, haha.

Offline Cobra951

  • Gold Member
  • *
  • Posts: 8,934
Re: Monitors
« Reply #37 on: Saturday, November 10, 2018, 10:11:51 PM »
You need to hook me up with your suppliers.  I feel like a sucker for paying $200 for ONE monitor now.

Offline Quemaqua

  • Old Salt
  • Administrator
  • Forum god
  • *
  • Posts: 16,498
  • Don't touch the panda.
    • Bookruptcy
Re: Monitors
« Reply #38 on: Monday, November 12, 2018, 01:35:01 AM »
How the hell did you get it so cheap? I'm gonna need at least one new monitor when I get back to California. 2 if my house burns down.

Genius-level flashes of inspiration with below-average technique. - Kacho Arino

Offline idolminds

  • ZOMG!
  • Administrator
  • Forum god
  • *
  • Posts: 11,939
Re: Monitors
« Reply #39 on: Monday, November 12, 2018, 07:16:54 PM »
I just got lucky at Goodwill. I wouldn't rely on it, heh. Well, unless you want 4:3 monitors, because they usually have a couple of those.