A computer components & hardware forum. HardwareBanter

Monitor question



 
 
  #1  
Old March 15th 21, 04:25 AM posted to alt.comp.hardware.pc-homebuilt
Bill[_41_]
external usenet poster
 
Posts: 24
Default Monitor question

I have an Asus Strix 750 Ti GPU. It has 2GB of onboard video memory and
uses 8GB of shared system memory. According to its specifications, it
can apparently handle 3840x2160 resolution.

I have been happily running a monitor (Dell UltraSharp 2407WFP) at its
native 1920x1200 resolution. It's got about 12 years on it. I don't
really do computing that pushes the video card harder than casual games.

If I were to upgrade to a 4K monitor (3840x2160), would I likely be
pleased or unsatisfied with the results? By rough math, I think I would
use about four times as much memory (but I don't know how to see how
much I am actually using now!). That's assuming 4K uses 32-bit color
too. Mainly I like "sharp, crisp text". Somehow, after reading some
reviews, I ended up considering the Dell 2721Q monitor (which is almost
$500). It seems that as you get bigger screens, you need finer
resolution to get "sharp, crisp text"! ;) (duh!) Anyone following GPU
news knows that this is a rather poor time to be in the market for a GPU.

Any comments or suggestions based on your experience are welcome!

Bill
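The rough memory math above can be sanity-checked with a quick calculation (framebuffer only; a GPU's actual memory use also covers textures, extra buffers, and desktop composition, so these figures are lower bounds):

```python
# Single-framebuffer size at 32-bit color (4 bytes per pixel).
# Actual VRAM use is far higher (double/triple buffering, textures,
# desktop composition); treat these figures as lower bounds.
def framebuffer_mib(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 2**20

old = framebuffer_mib(1920, 1200)  # current native resolution
new = framebuffer_mib(3840, 2160)  # 4K UHD

print(f"1920x1200: {old:.1f} MiB")  # 8.8 MiB
print(f"3840x2160: {new:.1f} MiB")  # 31.6 MiB
print(f"ratio: {new / old:.1f}x")   # 3.6x -- close to the rough 4x guess
```

So the "about 4 times as much" estimate is right for the framebuffer itself: the exact pixel-count ratio is 3.6x.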
  #2  
Old March 15th 21, 04:37 AM posted to alt.comp.hardware.pc-homebuilt
Bill[_41_]

Bill wrote:
[snip]



P.S. I should add that I intend to use a DisplayPort (v1.4) connection
instead of DVI.
  #3  
Old March 15th 21, 05:36 AM posted to alt.comp.hardware.pc-homebuilt
Paul[_28_]

Bill wrote:
[snip]



P.S. I should add that I intend to use a DisplayPort (v1.4) connection
instead of DVI.


You would check the standards support of your existing video card, and
see whether the standard supports 60 Hz operation at the resolution of
interest. The Wikipedia articles on HDMI and DisplayPort have tables
covering this.

One possible issue is the version of HDCP. They don't figure it out in
that thread, so I don't know exactly what the issue is. Will the OS
agree to run 4K without HDCP? The hardware likely allows it, but the OS
tunes for maximum DRM. Video cards have had "added evil" built in to
stop copying, and the OS only has to tap into those calls to sew up the
copy hole.

https://forums.tomshardware.com/thre...50-ti.2621525/
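For a rough feel of why those tables matter, compare the uncompressed 4K60 data rate against approximate link payload capacities (figures taken from the Wikipedia HDMI/DisplayPort tables; blanking overhead is ignored here, so real requirements run somewhat higher):

```python
# Minimum uncompressed video data rate for a given mode at 24-bit color,
# ignoring blanking overhead (real pixel clocks run ~10-20% higher).
def data_rate_gbps(w, h, hz, bits_per_pixel=24):
    return w * h * hz * bits_per_pixel / 1e9

need = data_rate_gbps(3840, 2160, 60)
print(f"4K60 needs at least {need:.1f} Gbit/s of payload")  # ~11.9

# Approximate maximum payload rates (Gbit/s), per the Wikipedia tables:
links = {
    "HDMI 1.4":        8.16,   # enough for 4K30, not 4K60
    "HDMI 2.0":       14.4,
    "DisplayPort 1.2": 17.28,  # what a GTX 750 Ti is generally listed as having
    "DisplayPort 1.4": 25.92,
}
for name, cap in links.items():
    print(f"{name}: {'OK' if cap >= need else 'too slow'} for 4K60")
```

Which matches Larc's report below: over DisplayPort, a 750 Ti drives 4K at 60 Hz, while its HDMI 1.4 port would top out at 30 Hz.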

And no, I don't have a 4K monitor here. I've got two monitors on my
desk (run by two computers), and there isn't room for some huge
monitor. The reason the two monitors are on my desk is that the second
computer runs video conferencing; the second computer sits further
away, so less of its noise gets into my microphone. My lashup is there
so I can video conference without fan noise from the first computer
being powered up.

I will be glad when, some day, this video-conference fetish has ended.
The last video conference was a flop, when I couldn't log into the
damn thing. I had to take a phone call instead (which, as it happens,
is all that was required anyway).

Paul
  #4  
Old March 15th 21, 08:22 PM posted to alt.comp.hardware.pc-homebuilt
Bill[_41_]

Paul wrote:
[snip]


You would check the standards support of your existing video card, and
see whether the standard supports 60 Hz operation at the resolution of
interest. The Wikipedia articles on HDMI and DisplayPort have tables
covering this.

One possible issue is the version of HDCP. They don't figure it out in
that thread, so I don't know exactly what the issue is. Will the OS
agree to run 4K without HDCP? The hardware likely allows it, but the OS
tunes for maximum DRM. Video cards have had "added evil" built in to
stop copying, and the OS only has to tap into those calls to sew up the
copy hole.

https://forums.tomshardware.com/thre...50-ti.2621525/


And no, I don't have a 4K monitor here. I've got two monitors on my
desk (run by two computers), and there isn't room for some huge
monitor.


In my "office" I have two desks: a desk with a computer on it, and a
kitchen table I have been using as a desk for 36 years now. Time flies!
Your post is helpful! It will, at the very minimum, motivate me to
learn what HDCP is; I've run into that before! :) Paying a premium
price for a monitor to run at a non-premium resolution doesn't make
sense, so I will get this sorted out.

I strive to build quiet computers. I start with a quiet power supply
and include a quiet GPU (with "semi-passive cooling"): its fan only
kicks in when it needs to. There is a bit more to the complete strategy
than this, but it starts during the "design" stage. You know more about
computers than I do, so there is no sense in me rambling on... ;)
Everything is a compromise...




The reason the two monitors are on my desk is that the second
computer runs video conferencing; the second computer sits further
away, so less of its noise gets into my microphone. My lashup is there
so I can video conference without fan noise from the first computer
being powered up.

I will be glad when, some day, this video-conference fetish has ended.
The last video conference was a flop, when I couldn't log into the
damn thing. I had to take a phone call instead (which, as it happens,
is all that was required anyway).

Paul


  #5  
Old March 15th 21, 02:42 PM posted to alt.comp.hardware.pc-homebuilt
Larc[_3_]

On Mon, 15 Mar 2021 00:25:57 -0400, Bill wrote:

| [snip]
|
| Any comments or suggestions based on your experience are welcome!

I had a 27" 4K monitor plugged into the DisplayPort on an EVGA GTX 750 Ti for a few
weeks while I was waiting for another GPU, and 3840x2160 @ 60 Hz worked perfectly. It's
not a great setup for gaming, but 4K streaming is outstanding.

Larc
  #6  
Old March 15th 21, 08:27 PM posted to alt.comp.hardware.pc-homebuilt
Bill[_41_]

Larc wrote:
On Mon, 15 Mar 2021 00:25:57 -0400, Bill wrote:

| [snip]
|
| Any comments or suggestions based on your experience are welcome!

I had a 27" 4K monitor plugged into the DisplayPort on an EVGA GTX 750 Ti for a few
weeks while I was waiting for another GPU, and 3840x2160 @ 60 Hz worked perfectly. It's
not a great setup for gaming, but 4K streaming is outstanding.

Larc


Thank you Larc! That is encouraging.
  #7  
Old March 15th 21, 09:05 PM posted to alt.comp.hardware.pc-homebuilt
Bill[_41_]

Larc wrote:

| Any comments or suggestions based upon your experience is welcome!

I had a 27" 4K monitor plugged into the DisplayPort on an EVGA GTX 750 Ti for a few
weeks while I was waiting for another GPU, and 3840x2160 @ 60 Hz worked perfectly. It's
not a great setup for gaming, but 4K streaming is outstanding.

Larc


Larc, when you used DisplayPort, did you have to switch to getting
audio via the monitor (aux?), or were you still able to get audio (5.1
in my case) directly from the various audio connectors on the card, as
you did when you were using DVI? To my thinking, it sounds like a lot
to ask to get 5.1 through an "aux" connection. Thanks!

Bill
  #8  
Old March 16th 21, 12:11 AM posted to alt.comp.hardware.pc-homebuilt
Bill[_41_]

Bill wrote:

Larc, when you used DisplayPort, did you have to switch to getting
audio via the monitor (aux?), or were you still able to get audio (5.1
in my case) directly from the various audio connectors on the card, as
you did when you were using DVI? To my thinking, it sounds like a lot
to ask to get 5.1 through an "aux" connection. Thanks!

Bill



Here is an attempt to clarify my question: can I use DisplayPort and
the audio output jacks on my GPU at the same time?

BTW, I found the Dell 4K model S2721QS to be more in line with my
needs and budget. It's supposed to be around $340, but it's not
currently in stock anywhere. In case anyone else is looking, I think
it's the "sweet spot" between price and features (if you don't require
USB jacks on your monitor).
  #9  
Old March 16th 21, 01:55 AM posted to alt.comp.hardware.pc-homebuilt
Bill[_41_]

Bill wrote:

Here is an attempt to clarify my question: can I use DisplayPort and
the audio output jacks on my GPU at the same time?



It occurs to me now that the audio jacks are on the mainboard, so this
should be a no-brainer. It's curious how the GPU could even get the
audio; maybe in a different application of DisplayPort (i.e., in a
different device)...
  #10  
Old March 16th 21, 03:43 AM posted to alt.comp.hardware.pc-homebuilt
Paul[_28_]

Bill wrote:
Bill wrote:

Here is an attempt to clarify my question: Can I use DisplayPort and
the output audio jacks on my GPU at the same time?



It occurs to me now that the audio jacks are on the mainboard, so this
should be a no-brainer. It's curious how the GPU could even get the
audio--maybe in a different application of the DisplayPort (i.e. in a
different device)...


Modern digital display standards multiplex audio into the video stream.

VGA is analog and doesn't have it.

DVI is digital and doesn't have it.

HDMI and DP have it: 8-channel LPCM (check the Wikipedia articles for
more details). There are Dolby options, but nobody cared. With
8-channel Linear Pulse Code Modulation there is no compression; it's
just eight channels sent in the clear.

Receiving devices (computer monitors with speakers, TV sets, TV sets
with a passthrough soundbar) receive 8-channel LPCM. They don't need a
license for it, and there's no royalty to pay to Dolby.
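To see how modest uncompressed LPCM is next to the video payload, the arithmetic is simple (48 kHz and the bit depths below are typical values, not a quote from either spec):

```python
# Raw bitrate of uncompressed multichannel LPCM audio.
# 48 kHz, 16/24-bit are common values; the HDMI and DisplayPort
# standards allow other sample rates and depths as well.
def lpcm_mbps(channels, sample_rate_hz, bits_per_sample):
    return channels * sample_rate_hz * bits_per_sample / 1e6

stereo = lpcm_mbps(2, 48_000, 16)   # plain stereo at 48 kHz
full   = lpcm_mbps(8, 48_000, 24)   # 8-channel, 24-bit

print(f"2ch/16-bit: {stereo:.2f} Mbit/s")  # 1.54
print(f"8ch/24-bit: {full:.2f} Mbit/s")    # 9.22
# Either way, a rounding error next to the ~12 Gbit/s video payload,
# which is why the link can carry it uncompressed.
```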

You select digital audio in the Windows playback options. The word
"NVidia" might be involved (because you want the muxed audio on the
video cable, not Realtek digital via S/PDIF or TOSLink). You'd set the
audio model to 2-channel audio, because the monitor has two speakers,
and you want the mixdown (the head model) to be used for any audio
transforms. Your source material could be 5.1, and you want the sub
signal mixed back into the two speakers for a fuller sound.

Computer monitor speakers usually suck, and the time I've taken to
write this is likely wasted. Your regular computer speakers, hooked to
the 1/8" Line Out and friends, likely sound better, because you can use
ported, amplified bookshelf speakers instead. For example, Skybuck had
a 500 W amp setup with ten channels of Class D amplification: three
channels in parallel driving the sub, and seven channels for the other
speakers. Putting the 3 W monitor speakers to shame :-) My setup is
quite a bit less than that.

To use regular computer speakers, you use Analog Line Out and a setting
of 2-channel stereo if you have two speakers, etc. Popular audio models
are 2.0 (stereo), 2.1 (stereo + sub), 5.1 (four + center + sub), and
7.1 (six + center + sub).

If you send computer -> TV set -> soundbar, then the audio model
selected in Windows would be whatever the soundbar's model is. If the
soundbar claims to be 5.1, you'd set Windows to 5.1. The soundbar is
going to sound better than typical LCD monitor speakers.

No matter how you get audio, it's always overpriced. And the
high-powered setups, like the one Skybuck used to have, don't
necessarily last, because the amp is placed inside the sub, where it
tends to "cook", for want of a better word. Even if the volume isn't
turned up, the air inside the sub can be toasty warm. Amps really
belong in their own cabinet (like my home-made one; it never gets even
a little bit warm).

Paul
 






Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2024, Jelsoft Enterprises Ltd.
Copyright ©2004-2024 HardwareBanter.
The comments are property of their posters.