#71
On Tue, 30 Aug 2005 03:26:02 GMT CJT wrote in Message id: :

> keith wrote:
>> On Mon, 29 Aug 2005 18:13:19 -0700, David Schwartz wrote:
>>> "CJT" wrote in message ...
>>>> David Schwartz wrote:
>>>>> Unless you need resolution over 1280x1024 or need a ridiculously
>>>>> large viewing angle, there are LCDs that serve perfectly for both
>>>>> graphics editing and games. For example, the NEC 2010X is totally
>>>>> suitable to both applications.
>>>> 1280x1024 isn't exactly hires any more.
>>> There are very few games that support resolutions above that. For
>>> normal desktop work, 1280x1024 is more than adequate. Personally, I
>>> prefer to have two LCD monitors, each 1280x1024, using the second
>>> one only when circumstances require it.
>> As others here will attest, I've been using a 3200x1600 desktop at
>> work for almost five years. One display is the laptop's LCD, the
>> other is a 20" monitor. 1280x1024 is *NOT* adequate (though I live
>> with two 19" CRTs at this resolution, each, here at home). What
>> percentage of PC users do you think have a resolution over
>> 1280x1024? What percentage have tried it? What percentage have ever
>> gone back? Sheesh, I still see people with 1024x768 on 20" monitors
>> at 60Hz! Is that what we should all aspire to? ...the least common
>> denominator?
> Yeah, you da man ... NOT!

Tsk, tsk. Take your spankings over PC power consumption like a man,
puddles. For your sake, I hope that you have a better grasp of your
crank than you do of computers.
#72
Trent wrote:
> On Tue, 30 Aug 2005 03:26:02 GMT CJT wrote in Message id: :
> [snip: the resolution exchange quoted in full in #71 above]
> Tsk, tsk. Take your spankings over PC power consumption like a man,
> puddles. For your sake, I hope that you have a better grasp of your
> crank than you do of computers.

Hey! Intel and Apple agree that power consumption of PCs is a problem,
so I'm in good company.

--
The e-mail address in our reply-to line is reversed in an attempt to
minimize spam. Our true address is of the form .
#73
David Schwartz wrote:
> Unless you need resolution over 1280x1024 or need a ridiculously
> large viewing angle, there are LCDs that serve perfectly for both
> graphics editing and games. For example, the NEC 2010X is totally
> suitable to both applications.

I just don't like the fact that they are optimized for one resolution.
I like to be able to change resolutions without suffering large
display-quality degradation.
#74
"chrisv" wrote in message
... David Schwartz wrote: Unless you need resolution over 1280x1024 or need a ridiculously large viewing angle, there are LCDs that serve perfectly for both graphics editing and games. For example, the NEC 2010X is totally suitable to both applications. I just don't like the fact that they are optimized for one resolution. I like to be able to change resolutions without suffering large display-quality degradation. Chris, I have a 19" LCD with native 1280x1024 resolution. At Keith's urging, I have on three occasions made a valiant effort to switch my desktop viewing to that resolution. I mean, I tried hard, adjusting icon sizes, font sizes, etc. On each occasion, after wasting the better part of a day I've had to switch back to 1024x768, which is _not_ native resolution but is the only resolution I'm able to put up with. Different people have different preferences. Keith thinks I'm a neanderthal. He's probably right. ;-) |
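[An aside for anyone who would rather script this experiment than dig
through the display applet on each attempt: on Windows the mode switch
can be done programmatically. A minimal sketch using the Win32
EnumDisplaySettings/ChangeDisplaySettings calls - the 1280x1024 target
is just the native mode under discussion, and error handling is kept
minimal:]

    /* Sketch: switch the primary display to an assumed native mode.
       Build with any Win32 C toolchain (links against user32). */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        DEVMODE dm;
        ZeroMemory(&dm, sizeof(dm));
        dm.dmSize = sizeof(dm);

        /* Read the current mode so we only change what we must. */
        if (!EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &dm)) {
            fprintf(stderr, "could not query current display mode\n");
            return 1;
        }

        dm.dmPelsWidth  = 1280;   /* assumed native width  */
        dm.dmPelsHeight = 1024;   /* assumed native height */
        dm.dmFields     = DM_PELSWIDTH | DM_PELSHEIGHT;

        /* CDS_TEST asks the driver whether the mode would work,
           without actually switching. */
        if (ChangeDisplaySettings(&dm, CDS_TEST) != DISP_CHANGE_SUCCESSFUL) {
            fprintf(stderr, "1280x1024 not supported here\n");
            return 1;
        }
        ChangeDisplaySettings(&dm, 0);  /* 0 = this session only,
                                           registry left untouched */
        return 0;
    }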
#75
On Mon, 29 Aug 2005 15:14:56 -0700, "David Schwartz" wrote:

> So many people say the reverse, including myself. I can only suspect
> that one of the following is the case:
> 1) You had a really good CRT monitor and a really crappy LCD monitor.
> 2) Your video card was really crappy.
> 3) You didn't position the LCD monitor in a way that would make you
>    comfortable.
> 4) You really like wasting tons of desk space and looking at a blurry
>    image.
>
> DS

The usual stupid assumptions posted by clueless *******. There are a
few reasons why CRT is superior to LCD in image quality. If you can't
see it, then maybe you need to clean your glasses.
#76
On Mon, 29 Aug 2005 19:15:35 -0400, George Macdonald wrote:

> Maybe he's into photography and games,

Correct, I'm into both, and CRT is superior in both instances.
#77
On Mon, 29 Aug 2005 16:28:07 -0700, "David Schwartz" wrote:

> Unless you need resolution over 1280x1024 or need a ridiculously
> large viewing angle, there are LCDs that serve perfectly for both
> graphics editing and games. For example, the NEC 2010X is totally
> suitable to both applications.
>
> DS

What if I need to run games at lower resolutions? What if I like my
monitor to actually be capable of showing subtle gradation in tones?
What if I prefer superior colour accuracy?
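[A footnote on the tonal-gradation point: many inexpensive TN panels
of this period drive each subpixel with 6 bits and dither to
approximate the rest, while a CRT fed from a common 8-bit RAMDAC gets
all 256 levels per channel. A quick worked comparison - the bit depths
are assumed as typical examples, not specs for any particular
monitor:]

    /* Sketch: distinct tones per channel and total colours, for an
       assumed 6-bit TN panel vs. an 8-bit CRT signal path. */
    #include <stdio.h>

    int main(void)
    {
        const int bits_lcd = 6;  /* assumed: typical budget TN panel */
        const int bits_crt = 8;  /* assumed: common 8-bit RAMDAC     */

        long levels_lcd = 1L << bits_lcd;   /*  64 levels/channel */
        long levels_crt = 1L << bits_crt;   /* 256 levels/channel */

        /* Three channels (R, G, B), so total colours = levels^3. */
        printf("LCD: %ld levels/channel, %ld colours\n",
               levels_lcd, levels_lcd * levels_lcd * levels_lcd);
        printf("CRT: %ld levels/channel, %ld colours\n",
               levels_crt, levels_crt * levels_crt * levels_crt);
        return 0;
    }
    /* Prints 64 vs 256 levels: 262,144 vs 16,777,216 colours. The
       missing steps show up as banding in smooth gradients. */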
#78
In comp.sys.ibm.pc.hardware.chips Praxiteles Democritus wrote:
> The usual stupid assumptions posted by clueless *******.

That's more than a little impolite, and counterproductive if you
actually wanted to convince someone.

> There are a few reasons why CRT is superior to LCD in image quality.

There are, but you leave them unstated, and one gets the impression
that you don't know them. I usually equate impoliteness with
ignorance.

As I understand it, many gamers still prefer CRT over LCD:

1) CRT phosphors have lower persistence than LCD pixels, producing
   less afterimage during motion ("ghosting").

2) LCD pixels are extremely sharp. This is great for text, but
   unpleasant for images. The slight blur of CRTs mimics natural
   vision and avoids hyperpixelation.

There has been considerable improvement in (1), but (2) still
operates. For a simple demonstration, try watching a DVD on an LCD
vs. a CRT.

--
Robert
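[To put rough numbers on point (1), compare the pixel's transition
time against the frame period. A small sketch - the 25 ms response
figure is an assumed, era-typical spec, not a measurement of any
particular panel:]

    /* Sketch: how many frames an LCD pixel transition smears across,
       given an assumed response time and refresh rate. */
    #include <stdio.h>

    int main(void)
    {
        double refresh_hz  = 60.0;  /* typical LCD refresh           */
        double response_ms = 25.0;  /* assumed era-typical response  */

        double frame_ms       = 1000.0 / refresh_hz;     /* ~16.7 ms */
        double frames_smeared = response_ms / frame_ms;  /* ~1.5     */

        printf("frame period : %.1f ms\n", frame_ms);
        printf("transition   : %.1f ms (~%.1f frames)\n",
               response_ms, frames_smeared);
        return 0;
    }
    /* A transition spanning ~1.5 frames means a moving edge is still
       settling when the next frame arrives - the visible ghost. CRT
       phosphor decay is far shorter than a frame, hence no trail. */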
#79
"chrisv" wrote in message ... David Schwartz wrote: Unless you need resolution over 1280x1024 or need a ridiculously large viewing angle, there are LCDs that serve perfectly for both graphics editing and games. For example, the NEC 2010X is totally suitable to both applications. I just don't like the fact that they are optimized for one resolution. I like to be able to change resolutions without suffering large display-quality degradation. Depending on how you set them, you can get them to degrade at least reasonably nicely. But yeah, you want to stick with the native resolution if you can possibly do it. DS |
#80
On 29 Aug 2005 16:58:42 -0700, "Robert Myers" wrote:

> George Macdonald wrote:
>> On 29 Aug 2005 07:33:07 -0700, "Robert Myers" wrote:
>>> George Macdonald wrote:
>>>> On 27 Aug 2005 06:36:43 -0700, "Robert Myers" wrote:
>>>>> George Macdonald wrote:
>>>>>> On 26 Aug 2005 07:37:06 -0700, "Robert Myers" wrote:
>>>>>>> No matter what power management trickery does for you most of
>>>>>>> the time, you've got to be able to cool the thing when it's
>>>>>>> operating at peak performance.
>>>>>> Well we know that Intel stubbed its toes there at 4GHz and while
>>>>>> the end of scaling seems to be accepted as imminent, it's not
>>>>>> clear how far other mfrs can go, nor in what time scale.
>>>>> What I'm talking about is also more than what we normally think
>>>>> of as power management - more like distributed dynamic adaptive
>>>>> clocks - there may be a better term for that. 100% load is
>>>>> difficult to categorize there and of course "clock rate" becomes
>>>>> meaningless as a performance indicator.
>>>> AMD has said that it intends to continue to push clock speeds on
>>>> single core CPUs, and its current offerings do not suffer anywhere
>>>> near the same heat stress as Intel's even at "100% load"; if AMD
>>>> can get to 4GHz, and maybe a bit beyond with 65nm, they are quite
>>>> well positioned. All I'm saying is that I'm not ready to swallow
>>>> all of Intel's latest market-speak on power/watt as a new metric
>>>> for measuring CPU effectiveness. They certainly tried to get too
>>>> far up the slippery slope too quickly - it still remains to be
>>>> seen where the real limits are and which technology makes a
>>>> difference.
>>> Let's not get into another Intel/AMD round. As it stands now, Intel
>>> is likely to put its efforts at pushing single thread performance
>>> into Itanium. Who knows how long that emphasis will last.
>> It was an honest and *correct* comment on Intel's technology choices
>> - no need to have any "round" of anything... and certainly not about
>> Itanium.
> Translation: Intel isn't likely to want to play. That may have no
> bearing on AMD's decision-making whatsoever, but, if AMD wants to go
> after x86 users with need for single-thread performance, I suspect
> they will have the market all to themselves. The gamers who have
> historically carried users hungry for single-threaded performance
> will all have moved to multi-core machines, because that's where
> they'll be getting the best performance, because all of the software
> will have been rewritten for multiple cores. IBM will stay in the
> game because IBM wants to keep Power ahead of Itanium on SpecFP, and
> the x86 chips you'll be looking to buy, if they're available, will be
> priced like Power5, or whatever it is by then. You know, that
> monopoly thing.

Hmm, and you said you didn't want to get into another "Intel/AMD"
round... and yet, there you go again. I was only stating a documented,
acknowledged fact - your prognostications are not relevant. The game
makers have already stated that they don't expect to get much out of
multi-core - it looks to me like a single high-speed core is what is
needed there for a (long) while yet. Hell, dual CPUs have been
available for long enough and they have not tweaked any gaming
interest.

> I don't think the market is going to be there to pay for the kind of
> workhorse you say you need. Power consumption is a huge economic
> consideration in HPC: as it stands now, it doesn't pay to run
> clusters more than about 3 years because it is more expensive to pay
> for the power to continue running them than it is to pay for
> replacements that save power.

I don't need a super-computer - I want the fastest PC I can get for a
price-point which is usually a notch back from the very highest
clock-speed, not some compromised thing which fits a marketing
strategy for people's living rooms.

> My explanation as to why the other usual customers for
> single-threaded performance won't be there in large numbers, either.

So far I haven't seen an "explanation" for anything here... other than
a blind willingness to follow the latest Intel marketing angle. A
different programming paradigm/style is not going to help them -
expectation of success is not obviously better than that of new
semiconductor tweaks, or even a new technology, coming along to allow
another 100x speed ramp over 10 years or so.

>> When I hear talk of new compiler technology to assist here, I'm
>> naturally skeptical, based on past experiences.
> Well sure. The compiler first has to reverse engineer the control and
> dataflow graph that's been obscured by the programmer and the
> sequential language with bolted-on parallelism that was used. If you
> could identify the critical path, you'd know what to do, but, even
> for very repetitive calculations, the critical path that is optimized
> is at best a guess.

>> It's not static - compilers don't have the right info for the job...
>> and compiler-compilers won't do it either.
> That's why you do profiling.

It makes me wonder sometimes when you spout some buzzword like that as
though it is known to work well for all general purpose code working
on all possible data sets. <shrug> People who use compilers know this.

--
Rgds, George Macdonald
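[For readers wondering what "profiling" buys a compiler in practice:
GCC exposes exactly this feedback loop through its -fprofile-generate
and -fprofile-use flags. A minimal sketch of the workflow - the
program and input names are made up for illustration:]

    gcc -O2 -fprofile-generate hotpath.c -o hotpath
    ./hotpath typical-input.dat
    gcc -O2 -fprofile-use hotpath.c -o hotpath

[The first build instruments the binary, the run records branch and
call counts to profile data files, and the rebuild uses those counts
to lay out and optimize the hot paths. Whether the training run is
representative of "all general purpose code working on all possible
data sets" is, of course, exactly the objection raised above.]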
Similar Threads
Thread | Thread Starter | Forum | Replies | Last Post |
Intel found to be abusing market power in Japan | chrisv | General | 152 | March 26th 05 06:57 AM |
Gigabyte GA-8IDML with Mobile CPU? | Cuzman | General | 0 | December 8th 04 02:39 PM |
HELP: P4C800-E Deluxe, Intel RAID and Windows detection problems | Michail Pappas | Asus Motherboards | 2 | November 20th 04 03:18 AM |
Intel Is Aiming at Living Rooms in Marketing Its Latest Chip | Vince McGowan | Dell Computers | 0 | June 18th 04 03:10 PM |
New PC with W2K? | Rob | UK Computer Vendors | 5 | August 29th 03 12:32 PM |