A computer components & hardware forum. HardwareBanter



Intel drops HyperThreading



 
 
  #51  
Old August 28th 05, 11:18 PM
CJT
external usenet poster
 
Posts: n/a
Default

CJT wrote:

Tony Hill wrote:

On Sun, 28 Aug 2005 03:01:51 GMT, CJT wrote:


keith wrote:

Your argument is silly, to the extreme. ...or at least your "facts"
are.


Computers use a lot of electric power. Much of it is wasted. You can
quibble about the numbers all you like, but you can't escape those basic
facts.




A STOVE uses a lot of electric power. An air conditioner uses a lot
of electric power. Electrical heaters use a lot of electric power.

Computers do not.



... only Gigawatts


-------------
Tony Hill
hilla underscore 20 at yahoo dot ca




BTW, that stoves and electrical heaters use lots of electric power is
one reason why one should heat and cook with gas. :-)

--
The e-mail address in our reply-to line is reversed in an attempt to
minimize spam. Our true address is of the form .
  #52  
Old August 29th 05, 01:58 AM
keith

On Sun, 28 Aug 2005 22:18:15 +0000, CJT wrote:

CJT wrote:

Tony Hill wrote:

On Sun, 28 Aug 2005 03:01:51 GMT, CJT wrote:


keith wrote:

Your argument is silly, to the extreme. ...or at least your "facts"
are.


Computers use a lot of electric power. Much of it is wasted. You can
quibble about the numbers all you like, but you can't escape those basic
facts.



A STOVE uses a lot of electric power. An air conditioner uses a lot
of electric power. Electrical heaters use a lot of electric power.

Computers do not.



... only Gigawatts


Even Intel processors don't use gigawatts! You pull numbers out of your ass,
then expect people to change their lives around your silly wishes. ...a
typical totalitarian liberal.



-------------
Tony Hill
hilla underscore 20 at yahoo dot ca




BTW, that stoves and electrical heaters use lots of electric power is
one reason why one should heat and cook with gas. :-)


Another idiotic statement. ...and if gas isn't available? If everyone
used gas? Please, do grow up.

--
Keith
  #53  
Old August 29th 05, 05:17 AM
CJT

keith wrote:

On Sun, 28 Aug 2005 22:18:15 +0000, CJT wrote:


CJT wrote:


Tony Hill wrote:


On Sun, 28 Aug 2005 03:01:51 GMT, CJT wrote:



keith wrote:


Your argument is silly, to the extreme. ...or at least your "facts"
are.


Computers use a lot of electric power. Much of it is wasted. You can
quibble about the numbers all you like, but you can't escape those basic
facts.



A STOVE uses a lot of electric power. An air conditioner uses a lot
of electric power. Electrical heaters use a lot of electric power.

Computers do not.


... only Gigawatts



Even Intel processors don't use gigawatts!


Sure they do, when hundreds of millions of them are involved.
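For what it's worth, the arithmetic only needs an installed base and an average draw to reach gigawatts. Both figures below are illustrative guesses, not measurements:

```python
# Back-of-envelope check of the "gigawatts in aggregate" claim.
# Both inputs are assumptions for illustration, not measured data.

installed_pcs = 300e6     # assumed worldwide PC installed base
avg_system_watts = 70     # assumed average draw per running system

total_gw = installed_pcs * avg_system_watts / 1e9
print(f"Aggregate draw: about {total_gw:.0f} GW")  # 21 GW with these inputs
```

Even cutting both assumptions in half still leaves several gigawatts, which is the point being argued.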

You pull numbers out of your ass,
then expect people to change their lives around your silly wishes. ...a
typical totalitarian liberal.


How did I know you'd soon resort to an attempt at ad hominem?

Oh, well ...




-------------
Tony Hill
hilla underscore 20 at yahoo dot ca



BTW, that stoves and electrical heaters use lots of electric power is
one reason why one should heat and cook with gas. :-)



Another idiotic statement. ...and if gas isn't available? If everyone
used gas? Please, do grow up.

You must not cook, or you'd realize how much better gas is.

--
The e-mail address in our reply-to line is reversed in an attempt to
minimize spam. Our true address is of the form .
  #54  
Old August 29th 05, 08:30 AM
Grumble

Gnu_Raiz wrote:
Seems to be a dead link; I was unable to download the PDF
with Firefox under Linux.


http://www.tollygroup.com/ts/2005/In...-March2005.pdf
  #55  
Old August 29th 05, 01:43 PM
George Macdonald

On 27 Aug 2005 06:36:43 -0700, "Robert Myers" wrote:

George Macdonald wrote:
On 26 Aug 2005 07:37:06 -0700, "Robert Myers" wrote:


Nobody would argue about the importance of power consumption for
servers, for HPC, or for mobile applications. The question is: why
worry for stationary machines, such as those used for gaming? As far
as I can tell, because there is no other way to get more performance
into an acceptable power and cooling envelope, and I'm assuming that
machines used for gaming will continue to have an insatiable demand for
greater performance.


I see this view as somewhat out of date wrt current CPUs - I don't know how
well Intel's latest P4 power management is working but, from what I observe
with Athlon64s, a current CPU should spend very little time at 100% power
rating even when used for gaming. I don't game myself, but my current
Athlon64 3500+ spends most of its time just idling along at 1GHz/1V with a
reported temp which is just 1 or 2 degrees C above system temp... which
makes it 34/35C with an ambient of ~22-25C. You really have to pound on it
to get it to stay at its rated 2.2GHz/1.4V. If your average gamer spends
say 4 hours/day on solid gaming, which will not in itself need 100% CPU
steady load, I doubt that their overall CPU load is much above 80%, if
that.
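As an aside, the idle/full-load figures quoted above line up with the usual CMOS rule of thumb that dynamic power scales as C*V^2*f. A rough sketch (switched capacitance assumed constant, leakage ignored; both are simplifications):

```python
# Rough dynamic-power comparison using the standard CMOS approximation
# P_dyn ~ C * V^2 * f. Capacitance is assumed constant across states
# and leakage is ignored -- both simplifying assumptions.

def relative_dynamic_power(v, f, v_ref, f_ref):
    """Dynamic power at (v, f) as a fraction of power at (v_ref, f_ref)."""
    return (v ** 2 * f) / (v_ref ** 2 * f_ref)

# Figures quoted above: idling at 1 GHz / 1.0 V, rated 2.2 GHz / 1.4 V.
idle_fraction = relative_dynamic_power(1.0, 1.0, 1.4, 2.2)
print(f"Idle dynamic power is roughly {idle_fraction:.0%} of full-load power")
```

With those numbers the idle state burns under a quarter of the full-load dynamic power, which is why the chip sits only a degree or two above system temp most of the day.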

No matter what power management trickery does for you most of the time,
you've got to be able to cool the thing when it's operating at peak
performance.


Well we know that Intel stubbed its toes there at 4GHz and while the end of
scaling seems to be accepted as imminent, it's not clear how far other mfrs
can go, nor in what time scale. What I'm talking about is also more than
what we normally think of as power management - more like distributed
dynamic adaptive clocks - there may be a better term for that. 100% load
is difficult to categorize there and of course "clock rate" becomes
meaningless as a performance indicator.

AMD has said that it intends to continue to push clock speeds on single
core CPUs and its current offerings do not suffer anywhere near the same
heat stress as Intel's even at "100% load"; if AMD can get to 4GHz, and
maybe a bit beyond with 65nm, they are quite well positioned. All I'm
saying is that I'm not ready to swallow all of Intel's latest market-speak
on power/watt as a new metric for measuring CPU effectiveness. They
certainly tried to get too far up the slippery slope too quickly - it still
remains to be seen where the real limits are and which technology makes a
difference.

The only way to get it, as I currently understand
the situation, is more cores operating at a point that is less than
optimal from the POV of single-thread performance. It will be
interesting to see how long power-no-consideration single-thread
workhorses will survive. I expect them to become an endangered species
for all but the most specialized applications (very high-end HPC,
for example).


It's been my impression that game makers are not optimistic about getting
that much performance out of multiple threads/processors - they haven't so
far, with only HT, and even just dual core is going to be hard to get
benefits worth talking about. IOW single core performance is going to
matter for quite a while yet.
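Amdahl's law puts a hard ceiling on what those game makers could hope for: if only a fraction p of a frame's work can be threaded, n cores can never beat 1/((1-p)+p/n). A sketch, with p as an assumed figure:

```python
# Amdahl's law sketch of why dual core buys games little. The parallel
# fraction p below is an assumption for illustration, not a measurement.

def amdahl_speedup(p, n):
    """Upper bound on speedup with parallel fraction p on n cores."""
    return 1.0 / ((1.0 - p) + p / n)

# Suppose 30% of a game's per-frame work can be threaded:
for cores in (2, 4):
    print(f"{cores} cores: at most {amdahl_speedup(0.30, cores):.2f}x")
```

With 30% parallel work, a second core is worth under 20%, and doubling again to four cores adds barely 10% more; hence the pessimism about multi-core gains in games.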

I hope we are arriving at a moment of truth. Programming styles are
going to have to change. Either that, or we're going to have a lot of
idle cores. I think programming styles are going to change. Don't ask
me how. The possibilities are endless. Threaded programming has to
move out of the sandbox and off the linux kernel list and into the
real world.


As you well know there are algorithms & methods which just do not adapt to
multi-thread benefits. There are people who have spent a goodly portion of
their lives trying to improve the picture there, with little success. A
different programming paradigm/style is not going to help them -
expectation of success is not obviously better than that of new
semiconductor tweaks, or even technology, which allows another 100x speed
ramp over 10 years or so coming along. When I hear talk of new compiler
technology to assist here, I'm naturally skeptical, based on past
experiences.

There are also economic issues to be addressed here for the software
industry: if you have a 200 unit server farm running a major database
operation, which can suddenly be reduced to say a 10 unit quad-core
cluster, how much do you want your software costs reduced? Software
companies would have to do a *lot* more work for less $$?? :-)

--
Rgds, George Macdonald
  #56  
Old August 29th 05, 02:57 PM
chrisv

Tony Hill wrote:

Ohh, and the observant among you will probably notice that BY FAR the
biggest power savings you can get in a PC is by replacing a CRT
monitor with an LCD, especially if you're using a large 19" or 21"
CRT.


I dread the day my beautiful Sony F500R 21" CRT ceases to work, and
I've no choice but to get an LCD to replace it...

  #57  
Old August 29th 05, 03:33 PM
Robert Myers

George Macdonald wrote:
On 27 Aug 2005 06:36:43 -0700, "Robert Myers" wrote:

George Macdonald wrote:
On 26 Aug 2005 07:37:06 -0700, "Robert Myers" wrote:


Nobody would argue about the importance of power consumption for
servers, for HPC, or for mobile applications. The question is: why
worry for stationary machines, such as those used for gaming? As far
as I can tell, because there is no other way to get more performance
into an acceptable power and cooling envelope, and I'm assuming that
machines used for gaming will continue to have an insatiable demand for
greater performance.

I see this view as somewhat out of date wrt current CPUs - I don't know how
well Intel's latest P4 power management is working but, from what I observe
with Athlon64s, a current CPU should spend very little time at 100% power
rating even when used for gaming. I don't game myself, but my current
Athlon64 3500+ spends most of its time just idling along at 1GHz/1V with a
reported temp which is just 1 or 2 degrees C above system temp... which
makes it 34/35C with an ambient of ~22-25C. You really have to pound on it
to get it to stay at its rated 2.2GHz/1.4V. If your average gamer spends
say 4 hours/day on solid gaming, which will not in itself need 100% CPU
steady load, I doubt that their overall CPU load is much above 80%, if
that.

No matter what power management trickery does for you most of the time,
you've got to be able to cool the thing when it's operating at peak
performance.


Well we know that Intel stubbed its toes there at 4GHz and while the end of
scaling seems to be accepted as imminent, it's not clear how far other mfrs
can go, nor in what time scale. What I'm talking about is also more than
what we normally think of as power management - more like distributed
dynamic adaptive clocks - there may be a better term for that. 100% load
is difficult to categorize there and of course "clock rate" becomes
meaningless as a performance indicator.

AMD has said that it intends to continue to push clock speeds on single
core CPUs and its current offerings do not suffer anywhere near the same
heat stress as Intel's even at "100% load"; if AMD can get to 4GHz, and
maybe a bit beyond with 65nm, they are quite well positioned. All I'm
saying is that I'm not ready to swallow all of Intel's latest market-speak
on power/watt as a new metric for measuring CPU effectiveness. They
certainly tried to get too far up the slippery slope too quickly - it still
remains to be seen where the real limits are and which technology makes a
difference.

Let's not get into another Intel/AMD round. As it stands now, Intel is
likely to put its efforts at pushing single thread performance into
Itanium. Who knows how long that emphasis will last.

I don't think the market is going to be there to pay for the kind of
workhorse you say you need. Power consumption is a huge economic
consideration in HPC: as it stands now, it doesn't pay to run clusters
more than about 3 years because it is more expensive to pay for the
power to continue running them than it is to pay for replacements that
save power.
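That replace-vs-run claim can be sanity-checked with a back-of-envelope tally; every figure below is an illustrative assumption, not data from a real cluster:

```python
# Sketch of the replace-vs-run argument: three years of electricity for
# one cluster node, including cooling overhead, compared against the
# node's purchase price. All inputs are assumptions for illustration.

node_watts = 300         # assumed draw per node under load
pue = 2.0                # assumed cooling/overhead multiplier
kwh_price = 0.10         # assumed electricity price, $/kWh
hours = 3 * 365 * 24     # three years of continuous operation

energy_cost = node_watts * pue / 1000 * hours * kwh_price
print(f"3-year power+cooling cost per node: ${energy_cost:.0f}")
```

With these inputs the three-year power bill approaches the price of a cheap node, so a replacement that halves the draw pays for a large part of itself.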

Maybe that leaves a hole for someone to build something that genuinely
deserves to be called a supercomputer. Maybe the Japanese will do it,
but you'd better have a high limit on your credit card.

There are other possibilities. Asynchronous processing is one. There
are some semi-viable efforts, and power savings without a performance
hit is one of the likely payoffs.

The only way to get it, as I currently understand
the situation, is more cores operating at a point that is less than
optimal from the POV of single-thread performance. It will be
interesting to see how long power-no-consideration single-thread
workhorses will survive. I expect them to become an endangered species
for all but the most specialized applications (very high-end HPC,
for example).

It's been my impression that game makers are not optimistic about getting
that much performance out of multiple threads/processors - they haven't so
far, with only HT, and even just dual core is going to be hard to get
benefits worth talking about. IOW single core performance is going to
matter for quite a while yet.

I hope we are arriving at a moment of truth. Programming styles are
going to have to change. Either that, or we're going to have a lot of
idle cores. I think programming styles are going to change. Don't ask
me how. The possibilities are endless. Threaded programming has to
move out of the sandbox and off the linux kernel list and into the
real world.


As you well know there are algorithms & methods which just do not adapt to
multi-thread benefits. There are people who have spent a goodly portion of
their lives trying to improve the picture there, with little success.


Other than for the compiler-builders and a few marginalized players
that everyone ignores, I have such a high level of contempt for the
ad-hockery that has passed for research that I regret we are not on
comp.arch where I could say something that would provoke a flame. As
it is, I don't want to waste my energies here. When the discussion
turns to graph and process algebras, I'll be paying close attention.
Until then, it's more of the same: people who really should be playing
chess instead of inventing elaborate embedded bugs and making guesses
about what will and will not work.

A different programming paradigm/style is not going to help them -
expectation of success is not obviously better than that of new
semiconductor tweaks, or even technology, which allows another 100x speed
ramp over 10 years or so coming along. When I hear talk of new compiler
technology to assist here, I'm naturally skeptical, based on past
experiences.


Well sure. The compiler first has to reverse engineer the control and
dataflow graph that's been obscured by the programmer and the
sequential language with bolted-on parallelism that was used. If you
could identify the critical path, you'd know what to do, but, even for
very repetitive calculations, the critical path that is optimized is at
best a guess.


There are also economic issues to be addressed here for the software
industry: if you have a 200 unit server farm running a major database
operation, which can suddenly be reduced to say a 10 unit quad-core
cluster, how much do you want your software costs reduced? Software
companies would have to do a *lot* more work for less $$?? :-)

That's Intel's sales pitch for Itanium.

RM

  #58  
Old August 29th 05, 03:58 PM
Felger Carbon


"chrisv" wrote in message
...
Tony Hill wrote:

Ohh, and the observant among you will probably notice that BY FAR

the
biggest power savings you can get in a PC is by replacing a CRT
monitor with an LCD, especially if you're using a large 19" or 21"
CRT.


I dread the day my beautiful Sony F500R 21" CRT ceases to work, and
I've no choice but to get an LCD to replace it...


To paraphrase a saying from the yachting folk, "the two best days of
your computing life are the day you installed your 21"CRT and the day
you replaced it with an LCD." ;-)


  #59  
Old August 29th 05, 11:07 PM
Praxiteles Democritus

On Mon, 29 Aug 2005 08:57:25 -0500, chrisv
wrote:


I dread the day my beautiful Sony F500R 21" CRT ceases to work, and
I've no choice but to get an LCD to replace it...


I went LCD but only for a while. I hated it so much I went back to
CRT. I use the LCD now on my second PC which I rarely use.
  #60  
Old August 29th 05, 11:14 PM
David Schwartz


"Praxiteles Democritus" wrote in message
news
On Mon, 29 Aug 2005 08:57:25 -0500, chrisv
wrote:


I dread the day my beautiful Sony F500R 21" CRT ceases to work, and
I've no choice but to get an LCD to replace it...


I went LCD but only for a while. I hated it so much I went back to
CRT. I use the LCD now on my second PC which I rarely use.


So many people say the reverse, including myself. I can only suspect
that one of the following is the case:

1) You had a really good CRT monitor and a really crappy LCD monitor.

2) Your video card was really crappy.

3) You didn't position the LCD monitor in a way that would make you
comfortable.

4) You really like wasting tons of desk space and looking at a blurry
image.

DS


 






Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2024, Jelsoft Enterprises Ltd.
Copyright ©2004-2024 HardwareBanter.
The comments are property of their posters.