Thread: Intel, AMD...
October 21st 04, 11:45 PM
JK



Eli Kane wrote:

JK wrote:

In my opinion, which is not all that important, there is no real
difference between the competing processors from AMD and Intel, except
the price.


Not quite. Look at the Doom 3 benchmarks, for example. Even a $1,000
Pentium 4 3.4 GHz EE doesn't come close to the performance of a $285
Athlon 64 3500+ running Doom 3.

http://www.anandtech.com/cpuchipsets...spx?i=2149&p=7


True, it does beat it with 82.7 fps vs. 77.3 fps. I do state further down
that you get more bang for your buck with the AMD. My point is that if you
were given two machines running Doom 3, one with the AMD and the other with
the Intel, you would be very hard pressed to tell which was which based on
gameplay.


Very funny. If one machine had an Athlon 64 FX-55 and the other had a
Pentium 4 3.4 GHz EE, you would definitely notice a difference.

The performance difference between their top-line chips is not large enough
to really notice, in my opinion. The nice thing is that not only do you get
a few more fps with AMD, it also costs far less. It's just that the price diff


Or perhaps the performance difference at each price point?

is the major difference, not the (noticeable) performance. Stated another
way, suppose the AMD cost $1000 and the Intel cost $185. Would you shell out
that much more money for those extra 5.4 frames per second, or would 77.3 be
perfectly fine?
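
For what it's worth, here's a quick back-of-the-envelope comparison in
Python, using only the prices and frame rates already quoted in this thread
(no new benchmark numbers):

# Value comparison using the figures quoted in this thread:
# $285 Athlon 64 3500+ at 82.7 fps vs. $1,000 Pentium 4 3.4 GHz EE at 77.3 fps.
chips = {
    "Athlon 64 3500+":     {"price": 285,  "fps": 82.7},
    "Pentium 4 3.4GHz EE": {"price": 1000, "fps": 77.3},
}

for name, c in chips.items():
    print(f"{name}: {c['fps'] / c['price']:.3f} fps per dollar")

# Relative gaps between the two chips
print(f"Price gap:      {1000 / 285:.1f}x")
print(f"Frame-rate gap: {(82.7 - 77.3) / 77.3 * 100:.1f}%")

That works out to roughly 0.29 fps per dollar for the Athlon versus about
0.08 for the P4 EE: the frame rates are about 7% apart, the prices about
3.5x apart.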

It is misleading to say that there is a 7% performance difference in those
benchmarks at 1280x1024. It is accurate to say there is a 7% frame rate
difference, but performance is arguably something the user actually notices,
and as frame rates rise, the same gap matters less and less. The gap between
32 fps and 26.6 fps is the same 5.4 fps, but at that level you will notice
lag and jerky movement on screen at 26.6 fps. The same 5.4 fps in these
benchmarks will not be noticed.
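
To put rough numbers on that, here's a small Python sketch that turns those
frame rates into per-frame times (milliseconds per frame), which is closer
to what the player actually feels; the two pairs are just the figures
already mentioned above:

# The same 5.4 fps gap means very different things at different frame rates.
# Frame time (ms per frame) tracks what the player feels more directly.
pairs = [(82.7, 77.3),   # the benchmark numbers from this thread
         (32.0, 26.6)]   # the low-frame-rate example above

for high, low in pairs:
    dt_high = 1000.0 / high   # ms per frame at the higher rate
    dt_low  = 1000.0 / low    # ms per frame at the lower rate
    print(f"{high:.1f} vs {low:.1f} fps: "
          f"{dt_high:.1f} vs {dt_low:.1f} ms/frame, "
          f"extra {dt_low - dt_high:.1f} ms per frame")

The same 5.4 fps costs less than a millisecond per frame up around 80 fps,
but more than six milliseconds per frame down around 30 fps, which is where
you start to feel it.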

One thing to note: if you look at the performance curves on the first page
of that article, you can see that moving from a Radeon 9800 Pro to a GeForce
6800 with the same CPU makes a HUGE difference. At 1280x1024 you go from
about 30 fps to about 75 fps, which shows that the GPU is doing the heavy
lifting.

By the way, as far as full motion is concerned, the human eye perceives it
as full at somewhere around 30 to 32 fps. The idea is to keep the game-engine
side from stuttering so that the GPU can maintain full motion. I would be
interested in benchmarks that record the number and length of stalls in
gameplay per unit of time, rather than just an average frame rate.
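
Something like that would be straightforward to compute from a per-frame
timing log. The sketch below is purely hypothetical: the 50 ms stall
threshold and the idea of a dumped list of frame times are my own
assumptions, not something any existing benchmark tool reports:

# Hypothetical stall analysis over a list of per-frame times in milliseconds,
# e.g. dumped by a game's timedemo or logging mode. Any frame longer than
# STALL_MS counts as a stall; report how many there were and how long they
# lasted, instead of a single average frame rate.
STALL_MS = 50.0  # assumed threshold: > 50 ms (under 20 fps) feels like a hitch

def stall_report(frame_times_ms):
    total_s = sum(frame_times_ms) / 1000.0
    stalls = [t for t in frame_times_ms if t > STALL_MS]
    return {
        "avg_fps": len(frame_times_ms) / total_s,
        "stall_count": len(stalls),
        "stalls_per_minute": len(stalls) / (total_s / 60.0),
        "worst_stall_ms": max(stalls) if stalls else 0.0,
        "time_stalled_ms": sum(stalls),
    }

# Example: mostly smooth 13 ms frames with a few long hitches mixed in.
sample = [13.0] * 2000 + [120.0, 85.0, 250.0]
print(stall_report(sample))

Two machines can post the same average fps while one of them hitches far
more often, which an average alone will never show.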


A $185 Athlon 64 3200+ (Socket 754) beats the $1,000 Pentium 4 3.4 GHz EE
running Business Winstone 2004.


Yeah, I concede that point for heavy business apps like database transaction
servers and e-commerce apps. But as I said, I was stating opinions about the
average user.

Eli