HardwareBanter forum » Processors » Intel

Intel drops HyperThreading
#31 - August 27th 05, 05:03 AM - Tony Hill

On Fri, 26 Aug 2005 04:49:56 GMT, CJT wrote:

Tony Hill wrote:

On Thu, 25 Aug 2005 17:36:17 GMT, CJT wrote:


George Macdonald wrote:

Its not even close - you can get a benchmark comparison .pdf here
http://www.tollygroup.com/DocDetail....cNumber=205107


Now show a study _not_ sponsored by Intel. And that addresses the watts
of power used by each processor.



Ok.. how's this?

http://www.tomshardware.com/cpu/20020605/



Quoting from the "Conclusions" page of that article:

"Looking at performance and power output in terms of a ratio, the C3
blows away its competitors."


It's "competitors" were discontinued about 3 years ago. The C3, on
the other hand, is still VIA's current processor (albeit with a
slightly tweaked core). The ULV Celeron-M has 1/3rd the power
consumption of the Celeron 667 used in this comparison, quite a bit
lower than the VIA C3, while it's performance is significantly higher.

-------------
Tony Hill
hilla underscore 20 at yahoo dot ca
#32 - August 27th 05, 05:03 AM - Tony Hill

On Fri, 26 Aug 2005 13:04:45 GMT, CJT wrote:
Robert Myers wrote:
As to comparison of "ratio" with desktop chips, the only exercise that
makes any sense is to compare power consumption at equal performance or
performance at equal power consumption.

RM

Here's something current and on point:

http://www.computerworld.com/hardwar..._PM&nid=104017


And this relates to VIA how? I (and many others) have been saying
for some time now that performance/watt would become a critical
metric for processors, but Green Destiny isn't exactly a champion
here, even in the HPC sector. Compare Green Destiny's performance
to BlueGene/L and you'll see that the Transmeta chips are rather
ho-hum. Heck, even if you were to compare the rather low-performance
Transmeta chips to Intel ULV Pentium-M (or Celeron-M) chips of
similar power consumption, you would find that the Transmeta
solution leaves much to be desired.

Trying to tie this back in to VIA chips: Transmeta and VIA have
similar performance/watt numbers, but VIA gets there at a much lower
cost. That is the real reason why I'm somewhat fond of VIA chips:
they are DIRT-CHEAP. Intel's ULV chips offer much better performance
for similar power consumption, but even the Celeron-M chips are MUCH
more expensive than VIA's alternative.

-------------
Tony Hill
hilla underscore 20 at yahoo dot ca
#33 - August 27th 05, 05:03 AM - Tony Hill

On Fri, 26 Aug 2005 16:10:15 GMT, Rob Stow wrote:

Robert Myers wrote:
Nobody would argue about the importance of power consumption for
servers, for HPC, or for mobile applications. The question is: why
worry about stationary machines, such as those used for gaming?


Because while the additional power consumption might be
insignificant or irrelevant on a personal basis, it is *very*
significant on a national or global basis.

If 100 million home computers in North America are replaced with
machines that need an additional 100 Watts each, then an
additional 10 GW of generating capacity is needed.


Just to keep such numbers in perspective, in 2000 the US had a total
generating capacity of 819 GW. As such, you're really talking about a
1.2% increase relative to the country's total generating capacity.
That is, of course, assuming that these hypothetical computers are ALL
operating at 100% full power 24 hours a day, 7 days a week.
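
If you want to check the math, here's a quick back-of-the-envelope
sketch (Python, using the numbers above):

  # Back-of-the-envelope check of the 10 GW / 1.2% figures above.
  computers = 100e6        # hypothetical home computers in North America
  extra_watts = 100.0      # additional draw per machine, worst case
  us_capacity_gw = 819.0   # total US generating capacity in 2000, in GW

  extra_gw = computers * extra_watts / 1e9
  print("extra demand: %.0f GW" % extra_gw)                # 10 GW
  print("share of capacity: %.1f%%" % (100 * extra_gw / us_capacity_gw))  # ~1.2%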

In the real world, the difference between a 60W processor and a 100W
processor gets lost in the noise when looking at national power
consumption figures.

More likely
double that when you consider the fact that in most parts of
North America still more power is going to be wasted by air
conditioners working just a little harder to remove that
additional 100 Watts from the house/apartment/office.


And in a few parts of North America (such as both the area where you
live and where I live) that 100W of extra power offsets what we would
otherwise be spending on heating our homes for 6 months of the year. I
use AC about 10 days a year; I have the heat on for about 150 days of
the year. This wouldn't be true for some of the population centers in,
for example, California, but it's still tough to fold heating/cooling
costs into the comparison.

Increasingly power-hungry TVs and home computers are not fully to
blame for North America's growing energy crisis, but they are a
significant and highly symbolic factor.


TVs and home computers aren't really very power hungry, regardless of
what type you're talking about. The shift towards laptop computers
and LCD monitors is probably enough to counterbalance any increase in
the power consumption of processors. Similarly, improvements in TVs
mean that a brand-new 50" TV probably doesn't consume much more power
(if any at all) than an old 20" TV from 15 or 20 years ago.

Power consumption today, much like for MANY years now, is dominated by
heating and cooling. Whether it's your air conditioner in the summer,
heater in the winter, or simply appliances like your stove and your
refrigerator. If you want to track where your power is being used,
look for things that either heat or cool.

If you want to look at power-hungry toys today, they are out there on
our roads, not in our homes. A recent increase in the average fuel
consumption per vehicle as well as a constantly increasing number of
vehicles on the road are the real energy consumers in North America.
Faster computers fall WELL down on the list.

-------------
Tony Hill
hilla underscore 20 at yahoo dot ca
#34 - August 27th 05, 06:18 AM - CJT

Tony Hill wrote:

snip
TVs and home computers aren't really very power hungry, regardless of
what type you're talking about. The shift towards laptop computers
and LCD monitors is probably enough to counterbalance any increase in
the power consumption of processors. Similarly, improvements in TVs
mean that a brand-new 50" TV probably doesn't consume much more power
(if any at all) than an old 20" TV from 15 or 20 years ago.


If you have multiple PCs, and run them 24*7, you can easily spend
$50/month on electricity for them.

Power consumption today, much like for MANY years now, is dominated by
heating and cooling. Whether it's your air conditioner in the summer,
heater in the winter, or simply appliances like your stove and your
refrigerator. If you want to track where your power is being used,
look for things that either heat or cool.

If you want to look at power-hungry toys today, they are out there on
our roads, not in our homes. A recent increase in the average fuel
consumption per vehicle as well as a constantly increasing number of
vehicles on the road are the real energy consumers in North America.


Agreed, but that's another problem, rather than absolution for wasting
power on inefficient computers.

Faster computers fall WELL down on the list.

-------------
Tony Hill
hilla underscore 20 at yahoo dot ca



--
The e-mail address in our reply-to line is reversed in an attempt to
minimize spam. Our true address is of the form .
#35 - August 27th 05, 01:42 PM - George Macdonald

On 26 Aug 2005 07:37:06 -0700, "Robert Myers" wrote:

Del Cecchi wrote:


Well, if the processor is an extra 100 watts (large number) and
electricity is 15 cents/kWh (on the high side) and the processor draws
the extra 100 watts even when nobody is using it, it comes to 36
cents/day, or about 10 dollars/month. So the extra electricity to
assure optimum game play over 6 months equals the cost of a game. On
the other hand, the high-speed internet connection costs 40 dollars per
month. QED: power consumption in a gaming PC is not a significant
economic factor.
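
(For anyone checking along, Del's arithmetic spelled out as a quick
Python sketch:)

  # Del's numbers: 100 W extra draw, always on, at 15 cents/kWh.
  extra_kw = 0.100   # extra processor draw in kW
  rate = 0.15        # $/kWh, deliberately on the high side

  per_day = extra_kw * 24 * rate
  per_month = per_day * 30
  print("$%.2f/day, about $%.2f/month" % (per_day, per_month))  # $0.36, ~$10.80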

Nobody would argue about the importance of power consumption for
servers, for HPC, or for mobile applications. The question is: why
worry about stationary machines, such as those used for gaming? As far
as I can tell, because there is no other way to get more performance
into an acceptable power and cooling envelope, and I'm assuming that
machines used for gaming will continue to have an insatiable demand for
greater performance.


I see this view as somewhat out of date wrt current CPUs - I don't know how
well Intel's latest P4 power management is working but, from what I observe
with Athlon64s, a current CPU should spend very little time at 100% power
rating even when used for gaming. I don't game myself, but my current
Athlon64 3500+ spends most of its time just idling along at 1GHz/1V with a
reported temp which is just 1 or 2 degrees C above system temp... which
makes it 34/35C with an ambient of ~22-25C. You really have to pound on it
to get it to stay at its rated 2.2GHz/1.4V. If your average gamer spends
say 4 hours/day on solid gaming, which will not in itself need 100% CPU
steady load, I doubt that their overall CPU load is much above 80%, if
that.

The only way to get it, as I currently understand
the situation, is more cores operating at a point that is less than
optimal from the POV of single-thread performance. It will be
interesting to see how long power-no-consideration single-thread
workhorses will survive. I expect them to become an endangered species
for all but the most specialized applications (very high-end HPC,
for example).


It's been my impression that game makers are not optimistic about getting
that much performance out of multiple threads/processors - they haven't so
far , with only HT, and even just dual core is going to be hard to get
benefits worth talking about. IOW single core performance is going to
matter for quite a while yet.

--
Rgds, George Macdonald
#36 - August 27th 05, 01:42 PM - George Macdonald

On Fri, 26 Aug 2005 13:04:45 GMT, CJT wrote:

Robert Myers wrote:


As to comparison of "ratio" with desktop chips, the only exercise that
makes any sense is to compare power consumption at equal performance or
performance at equal power consumption.

RM

Here's something current and on point:

http://www.computerworld.com/hardwar..._PM&nid=104017


Geeeez - these guys are wasting $$ on "research" for what you can get
straight out of the box with any current desktop CPU, or even an
Opteron-based cluster.

--
Rgds, George Macdonald
#37 - August 27th 05, 02:36 PM - Robert Myers

George Macdonald wrote:
On 26 Aug 2005 07:37:06 -0700, "Robert Myers" wrote:


Nobody would argue about the importance of power consumption for
servers, for HPC, or for mobile applications. The question is: why
worry about stationary machines, such as those used for gaming? As far
as I can tell, because there is no other way to get more performance
into an acceptable power and cooling envelope, and I'm assuming that
machines used for gaming will continue to have an insatiable demand for
greater performance.


I see this view as somewhat out of date wrt current CPUs - I don't know how
well Intel's latest P4 power management is working but, from what I observe
with Athlon64s, a current CPU should spend very little time at 100% power
rating even when used for gaming. I don't game myself, but my current
Athlon64 3500+ spends most of its time just idling along at 1GHz/1V with a
reported temp which is just 1 or 2 degrees C above system temp... which
makes it 34/35C with an ambient of ~22-25C. You really have to pound on it
to get it to stay at its rated 2.2GHz/1.4V. If your average gamer spends
say 4 hours/day on solid gaming, which will not in itself need 100% CPU
steady load, I doubt that their overall CPU load is much above 80%, if
that.

No matter what power management trickery does for you most of the time,
you've got to be able to cool the thing when it's operating at peak
performance.

The only way to get it, as I currently understand
the situation, is more cores operating at a point that is less than
optimal from the POV of single-thread performance. It will be
interesting to see how long power-no-consideration single-thread
workhorses will survive. I expect them to become an endangered species
for all but the most specialized applications (very high-end HPC,
for example).


It's been my impression that game makers are not optimistic about getting
that much performance out of multiple threads/processors - they haven't so
far, with only HT, and even just dual core is going to be hard to get
benefits worth talking about. IOW single core performance is going to
matter for quite a while yet.

I hope we are arriving at a moment of truth. Programming styles are
going to have to change. Either that, or we're going to have a lot of
idle cores. I think programming styles are going to change. Don't ask
me how. The possibilities are endless. Threaded programming has to
move out of the sandbox and off the Linux kernel list and into the
real world.
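
To make that concrete, here's a toy sketch in Python (the workload
and names are invented purely for illustration) of the kind of
restructuring I mean: the same CPU-bound work written once serially,
once farmed out across all the cores:

  # Toy illustration only: identical CPU-bound work done serially,
  # then spread across however many cores the machine has.
  from multiprocessing import Pool

  def work(n):
      # Stand-in for a CPU-bound task (one physics step, one AI tick, ...).
      return sum(i * i for i in range(n))

  if __name__ == "__main__":
      tasks = [2000000] * 8

      serial = [work(n) for n in tasks]    # one core grinds through everything

      with Pool() as pool:                 # one worker process per core
          parallel = pool.map(work, tasks)

      assert serial == parallel            # same answers, spread across cores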

RM

#38 - August 27th 05, 03:31 PM - keith

On Sat, 27 Aug 2005 05:18:23 +0000, CJT wrote:

Tony Hill wrote:

snip
TVs and home computers aren't really very power hungry, regardless of
what type you're talking about. The shift towards laptop computers
and LCD monitors is probably enough to counterbalance any increase in
the power consumption of processors. Similarly, improvements in TVs
mean that a brand-new 50" TV probably doesn't consume much more power
(if any at all) than an old 20" TV from 15 or 20 years ago.


If you have multiple PCs, and run them 24*7, you can easily spend
$50/month on electricity for them.


Not "easily". Most don't have, nor a need for, aa dozen machines running
24x7. _Very_ few are spending $10/mo on electricity for their computer.
The range, clothes dryer, and AC are the biggies ($250 bill here last
month). Much of the country doesn't pay the electric rates we do here
either. A friend in Florida tells me he pays about $.04/kWh, vs $.13 here.

Power consumption today, much like for MANY years now, is dominated by
heating and cooling. Whether it's your air conditioner in the summer,
heater in the winter, or simply appliances like your stove and your
refrigerator. If you want to track where your power is being used, look
for things that either heat or cool.

If you want to look at power-hungry toys today, they are out there on
our roads, not in our homes. A recent increase in the average fuel
consumption per vehicle as well as a constantly increasing number of
vehicles on the road are the real energy consumers in North America.


Agreed, but that's another problem, rather than absolution for wasting
power on inefficient computers.


Define "inefficient" as it relates to computers. Twenty years ago a
computer with roughly the same power as an Opteron would take thousands of
times more power. I'd call the Opteron rather "efficient", in comparison.

--
Keith
#39 - August 27th 05, 03:47 PM - Del Cecchi


"CJT" wrote in message
...
Tony Hill wrote:

snip
TVs and home computers aren't really very power hungry, regardless of
what type you're talking about. The shift towards laptop computers
and LCD monitors is probably enough to counterbalance any increase in
the power consumption of processors. Similarly, improvements in TVs
mean that a brand-new 50" TV probably doesn't consume much more power
(if any at all) than an old 20" TV from 15 or 20 years ago.


If you have multiple PCs, and run them 24*7, you can easily spend
$50/month on electricity for them.


So, set them to hibernate after an hour or so of non-use. Or power them
down at night. What are you doing that you need multiple PCs 24x7?

snip


#40 - August 27th 05, 06:47 PM - CJT

keith wrote:

On Sat, 27 Aug 2005 05:18:23 +0000, CJT wrote:


Tony Hill wrote:

snip

TVs and home computers aren't really very power hungry, regardless of
what type you're talking about. The shift towards laptop computers
and LCD monitors is probably enough to counterbalance any increase in
the power consumption of processors. Similarly, improvements in TVs
mean that a brand-new 50" TV probably doesn't consume much more power
(if any at all) than an old 20" TV from 15 or 20 years ago.


If you have multiple PCs, and run them 24*7, you can easily spend
$50/month on electricity for them.



Not "easily". Most don't have, nor a need for, aa dozen machines running
24x7. _Very_ few are spending $10/mo on electricity for their computer.


I think you're wrong. 200 watts, 24x7, at 10 cents/kWh, is about $14 a
month, and a single machine can use that much. Then add a monitor and
some network hardware.
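
The numbers aren't hard to check; here's a quick sketch (Python) using
my figures:

  # One always-on 200 W machine at 10 cents/kWh.
  kw = 0.200     # machine draw in kW
  rate = 0.10    # $/kWh
  per_month = kw * 24 * 30 * rate
  print("$%.2f/month per machine" % per_month)  # $14.40; a few such boxes
                                                # plus monitors reach $50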

The 24x7 factor is subject to discussion, but I think lots of people
leave their machines running. And I think lots of people have more than
one.


The range, clothes dryer, and AC are the biggies ($250 bill here last
month). Much of the country doesn't pay the electric rates we do here
either. A friend in Florida tells me he pays about $.04/kWh, vs $.13 here.


Power consumption today, much like for MANY years now, is dominated by
heating and cooling. Whether it's your air conditioner in the summer,
heater in the winter, or simply appliances like your stove and your
refrigerator. If you want to track where your power is being used, look
for things that either heat or cool.

If you want to look at power-hungry toys today, they are out there on
our roads, not in our homes. A recent increase in the average fuel
consumption per vehicle as well as a constantly increasing number of
vehicles on the road are the real energy consumers in North America.


Agreed, but that's another problem, rather than absolution for wasting
power on inefficient computers.



Define "inefficient" as it relates to computers. Twenty years ago a
computer with roughly the same power as an Opteron would take thousands of
times more power. I'd call the Opteron rather "efficient", in comparison.

Twenty years ago people did word processing with a 4 MHz 8080 (or even
less). I would argue that most computer power is wasted.

--
The e-mail address in our reply-to line is reversed in an attempt to
minimize spam. Our true address is of the form .
 



