ATI and HL2 team has screwed nvidia


banshee
September 16th 03, 11:07 PM
ATI and the HL2 team have screwed nvidia.

They did it on purpose. What gaming company would release the first
official benchmarks showing poor frame rates on old drivers, and
refuse to use the ones nvidia told them to use?

I smell $$$

First of One
September 17th 03, 12:12 AM
Ones that nVidia "told them to use", or ones that are publicly available?

--
First of One
Formula SAE Racing: http://fsae.utoronto.ca/

"banshee" > wrote in message
...
> ATI and HL2 team has screwed nvidia.
>
> They did it on purpose, what gaming company would release the first
> offical benchmarks that had poor frames on drivers that where old, and
> refused to use to the ones that nvida told them to use.
>
> I smell $$$
>

who be dat?
September 17th 03, 12:14 AM
You need to jerk your biased ass out of the twilight zone and come back to
reality. It's not just Half-Life these problems are showing up in. They
are appearing elsewhere. Nvidia is in trouble. Your nonsense isn't going
to help them out.

Chris Smith

"banshee" > wrote in message
...
> ATI and HL2 team has screwed nvidia.
>
> They did it on purpose, what gaming company would release the first
> offical benchmarks that had poor frames on drivers that where old, and
> refused to use to the ones that nvida told them to use.
>
> I smell $$$
>

Bean
September 17th 03, 12:15 AM
From what I've read around some of the news sites, the newer 51.xx
drivers are beta, so Valve doesn't want them to run the benchmark with
beta drivers. If those drivers were WHQL'd, then I think Valve would
let them use the newer drivers. Also, I heard the drivers don't improve
the framerate by that much anyway, 10-15 fps; the 5900 Ultra is still behind
with the newer drivers. Plus the new drivers are only for specific game
optimizations. nvidia will have to optimize a new set of drivers for every
new DX9 game that comes out, and that is bad.



"banshee" > wrote in message
...
> ATI and HL2 team has screwed nvidia.
>
> They did it on purpose, what gaming company would release the first
> offical benchmarks that had poor frames on drivers that where old, and
> refused to use to the ones that nvida told them to use.
>
> I smell $$$
>

that bloke
September 17th 03, 12:26 AM
Or maybe it's nVidia's fault for being crap?

Their next drivers will only lower the quality of HL2 anyway.

Then they're going to have to do that for every DX9 game.

magnulus
September 17th 03, 12:27 AM
The only people who have been screwing us are NVidia. It's sad, because I
really like NVidia, and I'd be reluctant to go back to an ATI card. NVidia
knew their hardware couldn't run full-precision 32-bit shaders at any decent
speed, but they implemented them anyway. ATI, on the other hand, runs at
24-bit and you can't see the difference, and it's still better than NVidia's
implementation. If NVidia could run the shaders at 24-bit, it would be a
non-issue, IMO.
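
To make the precision argument concrete: DirectX 9's HLSL compiler accepts a
partial-precision flag, which is the knob that lets NVidia's NV3x chips drop
from full precision to 16-bit floats for speed, while ATI's parts simply keep
running at their native 24-bit. The sketch below is purely illustrative; it is
not Valve's or NVidia's code, the tiny shader in it is made up, and it assumes
the DirectX 9 SDK headers and the d3dx9 library are available.

// Minimal sketch: compile the same ps_2_0 shader at full precision and
// again with the partial-precision hint that NV3x-class hardware needs
// to run at a reasonable speed. Link against d3d9.lib and d3dx9.lib.
#include <d3d9.h>
#include <d3dx9.h>
#include <cstdio>

// A made-up, trivially simple pixel shader used purely for illustration.
static const char kShaderSrc[] =
    "sampler2D baseMap : register(s0);              \n"
    "float4 main(float2 uv : TEXCOORD0) : COLOR     \n"
    "{                                              \n"
    "    return tex2D(baseMap, uv) * 0.5f;          \n"
    "}                                              \n";

static bool Compile(DWORD flags, const char* label)
{
    ID3DXBuffer* code = NULL;
    ID3DXBuffer* errors = NULL;
    HRESULT hr = D3DXCompileShader(kShaderSrc, sizeof(kShaderSrc) - 1,
                                   NULL, NULL, "main", "ps_2_0",
                                   flags, &code, &errors, NULL);
    if (FAILED(hr)) {
        printf("%s: compile failed: %s\n", label,
               errors ? (const char*)errors->GetBufferPointer() : "unknown");
    } else {
        printf("%s: %lu bytes of shader bytecode\n",
               label, (unsigned long)code->GetBufferSize());
    }
    if (code)   code->Release();
    if (errors) errors->Release();
    return SUCCEEDED(hr);
}

int main()
{
    // Default: full precision (fp32 on GeForce FX, fp24 on Radeon R3xx).
    Compile(0, "full precision");
    // Partial precision hint: lets NV3x use fp16 registers, trading some
    // accuracy for a large speedup. R3xx hardware ignores the hint.
    Compile(D3DXSHADER_PARTIALPRECISION, "partial precision");
    return 0;
}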

Just an example: try running Dusk Ultra or Last Chance Gas. Notice the low
framerate. Now imagine making a game with 4-5 characters and a whole world
that looks like that. Oh, and turn on anisotropic filtering and watch the
framerate absolutely die.

And Gabe Newell is right: Valve can afford to implement two separate
codepaths and create a separate set of optimized shaders. But many smaller
developers, the bread and butter of computer gaming, just aren't going to
have the time or money. Thank you, NVidia. I guess if a developer doesn't
have millions and doesn't participate in your PR campaign, they're chopped
liver.

NVidia has exactly 14 days until Halo comes out, and then they are ****ed if
they don't have a solution. I for one won't play Halo, Half-Life 2, or Deus
Ex in anything but their full dynamic range. I've seen the difference in
the screenshots, and it makes me a believer. The difference is greater than
going from DX7 to DX8; it's like a whole new world of realism. Forget
dynamic shadows, bump mapping, specular highlights: nothing beats a clean,
pure image free of the muddy colors so common to computer games.

I for one am tired of buying a new videocard every year and paying 300
bucks for it. Even if I sold off my GeForce FX, it probably wouldn't even
make enough to buy a Radeon 9600.

Roger Squires
September 17th 03, 12:33 AM
> ATI and HL2 team has screwed nvidia.

Yes, and Carmack and Nvidia did precisely the same thing with the early
Doom3 benchmarks. Remember those, when it was shown 'conclusively' that the
5800 beat the 9800 hands down? Even Carmack got suspicious of all the
Nvidia tweaking surrounding those carefully controlled benchmarking
sessions. 'Ultrashadow' my ass!

What goes around comes around, Nvidia's bad karma has caught up with
them.

rms

Lenny
September 17th 03, 12:36 AM
> ATI and HL2 team has screwed nvidia.

Fairly good troll, man. That, or you're just plain silly. If that's the case,
go read this: http://www.beyond3d.com/forum/viewtopic.php?t=7873

If you still believe ATi and Valve screwed Nvidia after you've checked out
that report, I've got more news for you that might upset you: Santa Claus
doesn't really exist either!

Larry Roberts
September 17th 03, 04:39 AM
On Wed, 17 Sep 2003 08:07:51 +1000, banshee > wrote:

>ATI and HL2 team has screwed nvidia.
>
>They did it on purpose, what gaming company would release the first
>offical benchmarks that had poor frames on drivers that where old, and
>refused to use to the ones that nvida told them to use.
>
>I smell $$$

I'm sorry for ya, but ATI couldn't pay the kind of money that
Valve would lose by coding the game to run well only for ATI hardware
owners. Valve is in the business of making money. They do it by making
PC games that most gamers want to play. They would not jeopardize their
company's reputation on something as stupid as alienating the customers who
are their "bread & butter".
We all like to think the "man" is out to put us down, but
Nvidia just "dropped the ball", and now they are taking flak for it
from all directions. I really don't think drivers are gonna fix the
problem. The FX 5800 was a "constipated turd", and the FX 5900 is just
a "softer turd" by comparison when it comes to DX9 shader technology.

SpaceWalker
September 17th 03, 05:55 AM
You are an idiot....


"banshee" > wrote in message
...
> ATI and HL2 team has screwed nvidia.
>
> They did it on purpose, what gaming company would release the first
> offical benchmarks that had poor frames on drivers that where old, and
> refused to use to the ones that nvida told them to use.
>
> I smell $$$
>

Dark Avenger
September 17th 03, 12:19 PM
banshee > wrote in message >...
> ATI and HL2 team has screwed nvidia.
>
> They did it on purpose, what gaming company would release the first
> offical benchmarks that had poor frames on drivers that where old, and
> refused to use to the ones that nvida told them to use.
>
> I smell $$$

Let me guess... you own an FX card... loser!

But be happy, nvidia will surely build enough cheats and
"optimalisations" into their drivers to let you play
Half-Life 2... at the cost of a whole lot of image quality.

Oh, and prepare to never really run DX9: it's too slow, since the card is
actually UNABLE to run DX9. So you get a lame mix between DX8 and DX9.
Hell... you get what you pay for!

Oh yes, the FX is expensive... yes... ever heard of "buying a cat in
the bag"?

But hell, I can only hope the NV40 truly gets around all those
problems they know they have. But for now... the FX is, simply put, not a
DX9 card!

Lenny
September 17th 03, 01:05 PM
> Oh also prepare to never be able to run DX9, to slow since the card is
> actually UNABLE to run DX9. So you get a lame mix between DX8 and DX9.
> Hell...you get where you pay for!

Actually, all GFFX cards are perfectly able to run DX9, and that of course
includes Half-Life 2. They just don't do it very quickly, hence Nvidia's
feverish "optimization" efforts and Valve's "mixed" rendering mode. You CAN
run the game in straight DX9, which according to the benches released on the
web reduces performance to around half of the equivalent ATi product. Still,
it runs, if only slowly.

> problems they know they. But for now... the FX is simple said not an
> DX9 card!

Well, it IS a DX9 card so there. ;) Please get your facts straight.
Thanks...
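
For what it's worth, the "mixed" mode described above comes down to the game
picking a renderer path per card at startup. Below is a rough, illustrative
sketch of that kind of decision; it is not Valve's actual logic, the vendor
check and the path names are my own assumptions, and it only uses the standard
Direct3D 9 caps and adapter queries (DirectX 9 SDK, d3d9.lib).

// Sketch: choose between a full DX9 path, a cheaper "mixed" path, and a
// DX8.1 fallback based on what the installed card reports.
#include <d3d9.h>
#include <cstdio>

enum RenderPath { PATH_DX81, PATH_DX9_MIXED, PATH_DX9_FULL };

static RenderPath PickRenderPath()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d)
        return PATH_DX81;

    D3DCAPS9 caps;
    D3DADAPTER_IDENTIFIER9 id;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);
    d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id);
    d3d->Release();

    // No ps_2_0 support at all: stay on the DX8.1 path.
    if (caps.PixelShaderVersion < D3DPS_VERSION(2, 0))
        return PATH_DX81;

    // ps_2_0 is there, but on NV3x (PCI vendor 0x10DE) full-precision DX9
    // shaders are slow, so a "mixed" path with cheaper shaders is used.
    // (Illustrative heuristic only, not the game's real decision logic.)
    if (id.VendorId == 0x10DE)
        return PATH_DX9_MIXED;

    return PATH_DX9_FULL;
}

int main()
{
    static const char* names[] = { "DX8.1", "DX9 mixed", "DX9 full" };
    printf("selected render path: %s\n", names[PickRenderPath()]);
    return 0;
}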

magnulus
September 17th 03, 04:16 PM
"Lenny" > wrote in message
...
> Actually, all GFFX cards are perfectly able to run DX9, and that of course
> includes Half-Life 2. They just don't do it very quickly, hence Nvidia's
> feverish "optimization" efforts and Valve's "mixed" rendering mode. You
CAN
> run the game in straight DX9, which according to the benches released on
the
> web reduces performance to around half of the equivalent ATi product.
Still,
> it runs, if only slowly.

Maybe it's a "DX9" card in the strictest sense, but it has the worst
price/performance of any card in years. Face it, NVidia didn't count on
developers rapidly shifting over to DX9. They thought they could build a
DX8-to-DX9 transition card, one that would beat ATI in benchmarks (and it
does, in DX8). But people don't buy 400-dollar graphics cards just for
current games; they expect the card to still be good in two years. Needless
to say, NVidia fooled a lot of people.

The only thing I have learned from this is how dumb it is to buy high-end
computer hardware for "future proofing" your system. If you think about it,
picking up a Radeon 9600 for around 110-120 bucks and upgrading in a year or
two will cost you a lot less than buying a 400-dollar card that gets outdated
in 6 months or a year when the whole industry paradigm shifts.

NVidia isn't going to fool me twice. Aside from 4X AA and
8X anisotropic filtering, the FX 5900 is a paperweight. And frankly I think
the Radeon's antialiasing looks better, so it's back to ATI
for me. NVidia just isn't going to be able to fix this problem.

Bratboy
September 17th 03, 09:31 PM
Baah, put the mouse down and step away from the computer before you hurt
yourself. The only people screwing Nvidia users are NVIDIA. They made the
decision not to stick to standards, and they told end users the card was
perfect DX9 when it isn't. Valve wasted a bunch of extra time, manpower, and
money (which you can bet will be passed on to everyone who buys the game)
trying to solve Nvidia's screwup, which they shouldn't have to do in the
first place, and you then have the NERVE to say Valve is screwing you. Look
to whoever made the chip on your card that can't do things the standard way
and needs special handling just to match the standard version.

Larry Roberts
September 18th 03, 04:46 AM
The GFX cards are DX9 cards. They will run DX9 games, but they
just do it slower than what should be expected from a card like the
GFX 5900 Ultra. We'll have to wait till the next Nvidia card to see if
they can truly make a card that beats out the competition "straight up"...
or find a way to cheat without getting caught ;)

On 17 Sep 2003 04:19:21 -0700, (Dark Avenger)
wrote:

>banshee > wrote in message >...
>> ATI and HL2 team has screwed nvidia.
>>
>> They did it on purpose, what gaming company would release the first
>> offical benchmarks that had poor frames on drivers that where old, and
>> refused to use to the ones that nvida told them to use.
>>
>> I smell $$$
>
>I may guess..you own a FX card...loser!
>
>But be happy, nvidia will surely build enough cheats and
>"optimalisations" in their drivers to make you being able to play
>Halflife 2.....at the cost of a whole lot of Image Quality.
>
>Oh also prepare to never be able to run DX9, to slow since the card is
>actually UNABLE to run DX9. So you get a lame mix between DX8 and DX9.
>Hell...you get where you pay for!
>
>Oh yes, the FX is expensive...yes....ever heard about "buying a cat in
>the bag"
>
>But hell, I can only hope the Nv40 truly gets around all those
>problems they know they. But for now... the FX is simple said not an
>DX9 card!

Nada
September 18th 03, 12:03 PM
"Roger Squires" > wrote:
> > ATI and HL2 team has screwed nvidia.
>
> Yes, and Carmack and Nvidia did precisely the same thing with the early
> Doom3 benchmarks. Remember those, when it was shown 'conclusively' that the
> 5800 beat the 9800 hands down?

The wind out of the 5800 was so strong that the testers with ATi cards
couldn't even get into the room when the benchmark started.

> Even Carmack got suspicious of all the
> Nvidia tweaking surrounding those carefully controlled benchmarking
> sessions. 'Ultrashadow' my ass!

After the benchmark tests, there was a pajama party held for the VIPs,
where people from Nvidia put on shadow-puppet theatre with
flashlights and their hands. The crowd-favorite shadow was the alligator.

> What goes around comes around, Nvidia's bad karma has caught up with
> them.
>
> rms

Just wait. GeForce6 Blowtorch will be the best **** you'll ever see.