A computer components & hardware forum. HardwareBanter


Intel drops HyperThreading



 
 
  #61  
Old August 29th 05, 11:14 PM
David Schwartz


"Praxiteles Democritus" wrote in message
news
On Mon, 29 Aug 2005 08:57:25 -0500, chrisv
wrote:


I dread the day my beautiful Sony F500R 21" CRT ceases to work, and
I've no choice but to get an LCD to replace it...


I went LCD but only for a while. I hated it so much I went back to
CRT. I use the LCD now on my second PC which I rarely use.


So many people say the reverse, including myself. I can only suspect
that one of the following is the case:

1) You had a really good CRT monitor and a really crappy LCD monitor.

2) Your video card was really crappy.

3) You didn't position the LCD monitor in a way that would make you
comfortable.

4) You really like wasting tons of desk space and looking at a blurry
image.

DS


  #62  
Old August 30th 05, 12:15 AM
George Macdonald

On Mon, 29 Aug 2005 15:14:56 -0700, "David Schwartz"
wrote:


"Praxiteles Democritus" wrote in message
news
On Mon, 29 Aug 2005 08:57:25 -0500, chrisv
wrote:


I dread the day my beautiful Sony F500R 21" CRT ceases to work, and
I've no choice but to get an LCD to replace it...


I went LCD but only for a while. I hated it so much I went back to
CRT. I use the LCD now on my second PC which I rarely use.


So many people say the reverse, including myself. I can only suspect
that one of the following is the case:

1) You had a really good CRT monitor and a really crappy LCD monitor.

2) Your video card was really crappy.

3) You didn't position the LCD monitor in a way that would make you
comfortable.

4) You really like wasting tons of desk space and looking at a blurry
image.


Maybe he's into photography and games, in which case there is no single LCD
which will do the job for him. With LCDs, you choose your panel technology,
be it TN/film, xVA or IPS, for how well it handles one of your tasks, and
live with any/all compromises elsewhere. Every time it looks like the latest
twist or tweak is going to be the universal fix, like S-IPS, there's always
something about it which annoys.

--
Rgds, George Macdonald
  #63  
Old August 30th 05, 12:15 AM
George Macdonald

On 29 Aug 2005 07:33:07 -0700, "Robert Myers" wrote:

George Macdonald wrote:
On 27 Aug 2005 06:36:43 -0700, "Robert Myers" wrote:

George Macdonald wrote:
On 26 Aug 2005 07:37:06 -0700, "Robert Myers" wrote:


No matter what power management trickery does for you most of the time,
you've got to be able to cool the thing when it's operating at peak
performance.


Well, we know that Intel stubbed its toe there at 4GHz, and while the end of
scaling seems to be accepted as imminent, it's not clear how far other mfrs
can go, nor on what time scale. What I'm talking about is also more than
what we normally think of as power management - more like distributed
dynamic adaptive clocks - there may be a better term for that. 100% load
is difficult to categorize there and of course "clock rate" becomes
meaningless as a performance indicator.
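
To put rough numbers on why adaptive clocking matters: first-order CMOS
switching power goes as P = C * V^2 * f, and because a slower clock usually
tolerates a lower supply voltage, backing off the frequency buys back power
faster than it costs performance. A minimal Python sketch - every constant
below is invented purely for illustration, not a measurement of any real CPU:

# First-order CMOS dynamic-power model: P = C * V^2 * f.
# All values are illustrative assumptions.
def dynamic_power(c_eff, volts, freq_hz):
    """Switching power (W) from effective capacitance (F), voltage, clock."""
    return c_eff * volts ** 2 * freq_hz

C_EFF = 20e-9                                # assumed effective capacitance
base = dynamic_power(C_EFF, 1.4, 3.6e9)      # hypothetical 3.6 GHz part
scaled = dynamic_power(C_EFF, 1.2, 2.8e9)    # ~22% slower, slightly lower V
print(f"base {base:.0f} W, scaled {scaled:.0f} W ({scaled / base:.0%} of base)")

With these assumptions a 22% clock cut gives roughly a 43% power cut, which
is exactly why "clock rate" stops meaning much once clocks adapt dynamically.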

AMD has said that it intends to continue to push clock speeds on single
core CPUs and its current offerings do not suffer anywhere near the same
heat stress as Intel's even at "100% load"; if AMD can get to 4GHz, and
maybe a bit beyond with 65nm, they are quite well positioned. All I'm
saying is that I'm not ready to swallow all of Intel's latest market-speak
on performance per watt as a new metric for measuring CPU effectiveness. They
certainly tried to get too far up the slippery slope too quickly - it still
remains to be seen where the real limits are and which technology makes a
difference.

Let's not get into another Intel/AMD round. As it stands now, Intel is
likely to put its efforts at pushing single thread performance into
Itanium. Who knows how long that emphasis will last.


It was an honest and *correct* comment on Intel's technology choices - no
need to have any "round" of anything... and certainly not about Itanium.

I don't think the market is going to be there to pay for the kind of
workhorse you say you need. Power consumption is a huge economic
consideration in HPC: as it stands now, it doesn't pay to run clusters
more than about 3 years because it is more expensive to pay for the
power to continue running them than it is to pay for replacements that
save power.
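
The power-versus-replacement comparison is a one-liner once the inputs are
pinned down. A back-of-envelope Python sketch - every figure below is an
invented assumption, not data from this thread:

# Does replacing a cluster pay for itself in power savings alone?
# All constants are assumptions for illustration.
NODES = 200
WATTS_PER_NODE = 400          # assumed draw per node, incl. cooling overhead
DOLLARS_PER_KWH = 0.10        # assumed electricity tariff
NODE_COST = 2000              # assumed price of a replacement node
HOURS_PER_YEAR = 24 * 365

def annual_power_bill(nodes, watts_each):
    return nodes * watts_each / 1000.0 * HOURS_PER_YEAR * DOLLARS_PER_KWH

old_bill = annual_power_bill(NODES, WATTS_PER_NODE)
new_bill = annual_power_bill(NODES, WATTS_PER_NODE * 0.5)  # assume 2x better
payback_years = NODES * NODE_COST / (old_bill - new_bill)
print(f"saving ${old_bill - new_bill:,.0f}/yr, payback {payback_years:.1f} yr")

Whether break-even lands at three years or ten depends entirely on the real
draw, tariff and hardware prices; the arithmetic itself is trivial.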


I don't need a super-computer - I want the fastest PC I can get for a
price-point which is usually a notch back from the very highest
clock-speed, not some compromised thing which fits a marketing strategy for
peoples' living rooms.

I hope we are arriving at a moment of truth. Programming styles are
going to have to change. Either that, or we're going to have a lot of
idle cores. I think programming styles are going to change. Don't ask
me how. The possibilities are endless. Threaded programming has to
move out of the sandbox and off the Linux kernel list and into the
real world.


As you well know there are algorithms & methods which just do not adapt to
multi-thread benefits. There are people who have spent a goodly portion of
their lives trying to improve the picture there, with little success.
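
Amdahl's law makes that limit quantitative: if only a fraction p of the
runtime can be parallelized, n cores give a speedup of 1/((1-p) + p/n),
which flattens out quickly. A minimal sketch:

# Amdahl's law: the serial fraction caps multi-core speedup.
def amdahl_speedup(p, n):
    """p = parallelizable fraction of runtime, n = number of cores."""
    return 1.0 / ((1.0 - p) + p / n)

for p in (0.5, 0.9, 0.99):
    cells = "  ".join(f"{n:>4} cores: {amdahl_speedup(p, n):6.2f}x"
                      for n in (2, 4, 16, 1024))
    print(f"p={p:<4}  {cells}")

A half-serial algorithm never beats 2x however many cores ship, and even at
90% parallel, 1024 cores top out below 10x - hence the idle cores.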


Other than for the compiler-builders and a few marginalized players
that everyone ignores, I have such a high level of contempt for the
ad-hockery that has passed for research that I regret we are not on
comp.arch where I could say something that would provoke a flame. As
it is, I don't want to waste my energies here. When the discussion
turns to graph and process algebras, I'll be paying close attention.
Until then, it's more of the same: people who really should be playing
chess instead of inventing elaborate embedded bugs and making guesses
about what will and will not work.


I was talking about the algorithm folks who have tried to adapt/invent
methods to take "advantage" of multi-core/multi-CPU systems.

A different programming paradigm/style is not going to help them - the
expectation of success is not obviously better than that of some new
semiconductor tweak, or even a new technology, coming along that allows
another 100x speed ramp over 10 years or so. When I hear talk of new
compiler technology to assist here, I'm naturally skeptical, based on past
experience.


Well sure. The compiler first has to reverse engineer the control and
dataflow graph that's been obscured by the programmer and the
sequential language with bolted-on parallelism that was used. If you
could identify the critical path, you'd know what to do, but, even for
very repetitive calculations, the critical path that is optimized is at
best a guess.
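
For what "identify the critical path" means concretely: given a task graph
with per-task costs, the critical path is the most expensive dependency
chain, and it bounds any schedule no matter how many cores are available. A
toy Python sketch - the graph, costs and task names are all invented:

# Critical path of a small task DAG via memoized longest-path search.
import functools

COST = {"load": 2, "parse": 3, "fft": 8, "filter": 4, "write": 1}
DEPS = {"load": [], "parse": ["load"], "fft": ["parse"],
        "filter": ["parse"], "write": ["fft", "filter"]}

@functools.lru_cache(maxsize=None)
def finish(task):
    """Earliest finish time for a task, assuming unlimited cores."""
    return COST[task] + max((finish(d) for d in DEPS[task]), default=0)

critical = max(finish(t) for t in COST)
total_work = sum(COST.values())
print(f"total work {total_work}, critical path {critical}, "
      f"best possible speedup {total_work / critical:.2f}x")

The catch, as both posters note, is that for real programs neither DEPS nor
COST is reliably visible to a compiler ahead of time.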


It's not static - compilers don't have the right info for the job... and
compiler-compilers won't do it either.

There are also economic issues to be addressed here for the software
industry: if you have a 200 unit server farm running a major database
operation, which can suddenly be reduced to say a 10 unit quad-core
cluster, how much do you want your software costs reduced? Software
companies would have to do a *lot* more work for less $$??:-)
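
That pricing squeeze is easy to make concrete, and it is roughly why
per-core rather than per-box licensing appeared. A sketch with invented
prices:

# Invented figures: what consolidation does to per-box license revenue.
OLD_BOXES, NEW_BOXES, CORES_PER_NEW_BOX = 200, 10, 4
PER_BOX_FEE = 5000            # assumed annual license fee per server

print(f"per-box:  ${OLD_BOXES * PER_BOX_FEE:,} -> ${NEW_BOXES * PER_BOX_FEE:,}")

# A vendor that re-prices per core claws part of it back:
PER_CORE_FEE = 2500           # assumed
print(f"per-core: ${NEW_BOXES * CORES_PER_NEW_BOX * PER_CORE_FEE:,}")

Whatever the real numbers, a 20x drop in unit count is revenue the software
vendor has to recover somewhere.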

That's Intel's sales pitch for Itanium.


Whatever that means, it ain't working - software people (that I know) hate
it and believe it's completely wrong-headed.

--
Rgds, George Macdonald
  #64  
Old August 30th 05, 12:28 AM
David Schwartz


"George Macdonald" wrote in message
...

Maybe he's into photography and games, in which case there is no single LCD
which will do the job for him. With LCDs, you choose your panel technology,
be it TN/film, xVA or IPS, for how well it handles one of your tasks, and
live with any/all compromises elsewhere. Every time it looks like the latest
twist or tweak is going to be the universal fix, like S-IPS, there's always
something about it which annoys.


Unless you need resolution over 1280x1024 or need a ridiculously large
viewing angle, there are LCDs that serve perfectly for both graphics editing
and games. For example, the NEC 2010X is totally suitable to both
applications.

DS


  #65  
Old August 30th 05, 12:58 AM
Robert Myers


George Macdonald wrote:
On 29 Aug 2005 07:33:07 -0700, "Robert Myers" wrote:

George Macdonald wrote:
On 27 Aug 2005 06:36:43 -0700, "Robert Myers" wrote:

George Macdonald wrote:
On 26 Aug 2005 07:37:06 -0700, "Robert Myers" wrote:


No matter what power management trickery does for you most of the time,
you've got to be able to cool the thing when it's operating at peak
performance.

Well, we know that Intel stubbed its toe there at 4GHz, and while the end of
scaling seems to be accepted as imminent, it's not clear how far other mfrs
can go, nor on what time scale. What I'm talking about is also more than
what we normally think of as power management - more like distributed
dynamic adaptive clocks - there may be a better term for that. 100% load
is difficult to categorize there and of course "clock rate" becomes
meaningless as a performance indicator.

AMD has said that it intends to continue to push clock speeds on single
core CPUs and its current offerings do not suffer anywhere near the same
heat stress as Intel's even at "100% load"; if AMD can get to 4GHz, and
maybe a bit beyond with 65nm, they are quite well positioned. All I'm
saying is that I'm not ready to swallow all of Intel's latest market-speak
on performance per watt as a new metric for measuring CPU effectiveness. They
certainly tried to get too far up the slippery slope too quickly - it still
remains to be seen where the real limits are and which technology makes a
difference.

Let's not get into another Intel/AMD round. As it stands now, Intel is
likely to put its efforts at pushing single thread performance into
Itanium. Who knows how long that emphasis will last.


It was an honest and *correct* comment on Intel's technology choices - no
need to have any "round" of anything... and certainly not about Itanium.

Translation: Intel isn't likely to want to play. That may have no
bearing on AMD's decision-making whatsoever, but, if AMD wants to go
after x86 users with a need for single-thread performance, I suspect they
will have the market all to themselves. The gamers who have
historically driven the demand for single-threaded performance will
all have moved to multi-core machines, because that's where they'll be
getting the best performance once all of the software has been
rewritten for multiple cores. IBM will stay in the game because IBM
wants to keep Power ahead of Itanium on SpecFP, and the x86 chips
you'll be looking to buy, if they're available, will be priced like
Power5, or whatever it is by then. You know, that monopoly thing.

I don't think the market is going to be there to pay for the kind of
workhorse you say you need. Power consumption is a huge economic
consideration in HPC: as it stands now, it doesn't pay to run clusters
more than about 3 years because it is more expensive to pay for the
power to continue running them than it is to pay for replacements that
save power.


I don't need a super-computer - I want the fastest PC I can get for a
price-point which is usually a notch back from the very highest
clock-speed, not some compromised thing which fits a marketing strategy for
peoples' living rooms.


My explanation as to why the other usual customers for single-threaded
performance won't be there in large numbers, either.

snip

A different programming paradigm/style is not going to help them - the
expectation of success is not obviously better than that of some new
semiconductor tweak, or even a new technology, coming along that allows
another 100x speed ramp over 10 years or so. When I hear talk of new
compiler technology to assist here, I'm naturally skeptical, based on past
experience.


Well sure. The compiler first has to reverse engineer the control and
dataflow graph that's been obscured by the programmer and the
sequential language with bolted-on parallelism that was used. If you
could identify the critical path, you'd know what to do, but, even for
very repetitive calculations, the critical path that is optimized is at
best a guess.


It's not static - compilers don't have the right info for the job... and
compiler-compilers won't do it either.

That's why you do profiling.
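
In practice that means measuring where the time actually goes instead of
guessing at the critical path. A minimal, hypothetical sketch using Python's
built-in cProfile purely as a stand-in for whatever profiler fits the
toolchain at hand:

# Profile a toy workload and print the top entries by cumulative time.
import cProfile
import pstats

def hot(n):
    return sum(i * i for i in range(n))

def cold(n):
    return sum(range(n))

def workload():
    hot(2_000_000)
    cold(2_000_000)

cProfile.run("workload()", "profile.out")
pstats.Stats("profile.out").sort_stats("cumulative").print_stats(5)

The profile, not the programmer's guess, says which chain is critical -
though it is only as representative as the input it was run on.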

RM

  #66  
Old August 30th 05, 01:16 AM
CJT

David Schwartz wrote:

"George Macdonald" wrote in message
...


Maybe he's into photography and games, in which case there is no single LCD
which will do the job for him. With LCDs, you choose your panel technology,
be it TN/film, xVA or IPS, for how well it handles one of your tasks, and
live with any/all compromises elsewhere. Every time it looks like the latest
twist or tweak is going to be the universal fix, like S-IPS, there's always
something about it which annoys.



Unless you need resolution over 1280x1024 or need a ridiculously large
viewing angle, there are LCDs that serve perfectly for both graphics editing
and games. For example, the NEC 2010X is totally suitable to both
applications.

DS


1280x1024 isn't exactly hi-res any more.

--
The e-mail address in our reply-to line is reversed in an attempt to
minimize spam. Our true address is of the form .
  #67  
Old August 30th 05, 02:13 AM
David Schwartz


"CJT" wrote in message
...

David Schwartz wrote:


Unless you need resolution over 1280x1024 or need a ridiculously
large viewing angle, there are LCDs that serve perfectly for both
graphics editing and games. For example, the NEC 2010X is totally
suitable to both applications.


1280x1024 isn't exactly hi-res any more.


There are very few games that support resolutions above that. For normal
desktop work, 1280x1024 is more than adequate. Personally, I prefer to have
two LCD monitors, each 1280x1024, using the second one only when
circumstances require it.

What percentage of PC users do you think have a resolution over
1280x1024?

DS


  #68  
Old August 30th 05, 04:26 AM
CJT

keith wrote:

On Mon, 29 Aug 2005 18:13:19 -0700, David Schwartz wrote:


"CJT" wrote in message
...


David Schwartz wrote:


Unless you need resolution over 1280x1024 or need a ridiculously
large viewing angle, there are LCDs that serve perfectly for both
graphics editing and games. For example, the NEC 2010X is totally
suitable to both applications.


1280x1024 isn't exactly hi-res any more.


There are very few games that support resolutions above that. For normal
desktop work, 1280x1024 is more than adequate. Personally, I prefer to have
two LCD monitors, each 1280x1024, using the second one only when
circumstances require it.



As others here will attest, I've been using a 3200x1600 desktop at
work for almost five years. One display is the laptop's LCD, the other is
a 20" monitor. 1280x1024 is *NOT* adequite (though I live with two
19" CRTs at this resolution, each, here at home).


What percentage of PC users do you think have a resolution
over 1280x1024?



What percentage have tried it? What percentage have ever gone back?
Sheesh, I still see people with 1024x768 on 20" monitors at 60Hz! Is that
what we should all aspire to? ...the least common denominator?

Yeah, you da man ... NOT!

--
The e-mail address in our reply-to line is reversed in an attempt to
minimize spam. Our true address is of the form .
  #69  
Old August 30th 05, 04:30 AM
keith

On Mon, 29 Aug 2005 04:17:39 +0000, CJT wrote:

keith wrote:

On Sun, 28 Aug 2005 22:18:15 +0000, CJT wrote:


CJT wrote:


Tony Hill wrote:


snip

Computers do not.


... only Gigawatts



Even Intel processors don't draw Gigawatts!


Sure they do, when hundreds of millions of them are involved.


They are not all on at once. Your numbers are pulled out of your ass to
promote your politics. Yes, a lie!

You pull numbers out of your ass,
then expect people to change their lives around your silly wishes. ...a
typical totalitarian liberal.


How did I know you'd soon resort to an attempt at ad hominem?


Nope, just a simple fact. You pull numbers from your ass to prove a
point. You are a fraud.

Oh, well ...


Indeed!

-------------
Tony Hill
hilla underscore 20 at yahoo dot ca



BTW, that stoves and electrical heaters use lots of electric power is
one reason why one should heat and cook with gas. :-)



Another idiotic statement. ...and if gas isn't available? What if everyone
used gas? Please, do grow up.

You must not cook, or you'd realize how much better gas is.


No, I don't. You must not realize that not everyone has access to
natural gas, nor agrees with your silly lies.

--
Keith

  #70  
Old August 30th 05, 04:35 AM
keith

On Mon, 29 Aug 2005 18:13:19 -0700, David Schwartz wrote:


"CJT" wrote in message
...

David Schwartz wrote:


Unless you need resolution over 1280x1024 or need a ridiculously
large viewing angle, there are LCDs that serve perfectly for both
graphics editing and games. For example, the NEC 2010X is totally
suitable to both applications.


1280x1024 isn't exactly hi-res any more.


There are very few games that support resolutions above that. For normal
desktop work, 1280x1024 is more than adequate. Personally, I prefer to have
two LCD monitors, each 1280x1024, using the second one only when
circumstances require it.


As others here will attest, I've been using a 3200x1600 desktop at
work for almost five years. One display is the laptop's LCD, the other is
a 20" monitor. 1280x1024 is *NOT* adequite (though I live with two
19" CRTs at this resolution, each, here at home).


What percentage of PC users do you think have a resolution
over 1280x1024?


What percentage have tried it? What percentage have ever gone back?
Sheesh, I still see people with 1024x768 on 20" monitors at 60Hz! Is that
what we should all aspire to? ...the least common denominator?

--
Keith
 







