  #22  
Old April 29th 08, 02:26 AM posted to comp.sys.intel,comp.sys.ibm.pc.hardware.chips
Robert Myers
Default AMD planning 45nm 12-Core 'Istanbul' Processor ?

On Apr 28, 7:29 pm, " wrote:

> On Mon, 28 Apr 2008 10:22:52 -0700 (PDT), Robert wrote:

>> [snip] Without AMD, one of three things would have happened:
>>
>> 1. Intel would have had the resources to deliver a satisfactory
>> Itanium product and accompanying compiler on schedule.


> Itanium was stillborn, even more so than Netbust. Its only purpose
> was to move everyone and their mother-in-law away from x86 and in the
> process thereof screw all other chipmakers (chiefly AMD, since others
> were, and still are, almost non-entities). While AMD and some other
> guys have a license to churn out x86 compatible product, no licenses
> were ever planned for IA64. Thanks to AMD and their Opteron product
> beating Itanic on performance for a mere fraction of the price, it
> didn't happen.

I've lived a long time now, and I've seen a lot of predictions come
and go. Anyone who wants to make emphatic statements about what was
inevitable should take a good, hard look at all the successful
predictions of the ramifications of the attack of the killer micros.

In all this "I knew all along" talk about Itanium, I've heard a few
insightful comments indicating that people actually understood
something of importance about the actual architecture, and not what
they've heard from others. If anyone *really* understood what went
wrong with Itanium, it would make the case study of all case studies
for business schools interested in the development and management of
technology. As it is, I don't think anyone really knows.

By comparison, it's pretty easy to see what went wrong with Netburst
and, among other things, we have public statements by its principal
architect, who no longer works for Intel. Even so, it's a puzzle as
to why Intel missed the importance of power consumption.

>> 2. Intel would have been forced into a partial retreat to x86, anyway.


> It took AMD64 (later renamed x86-64 to make it more digestible to
> Intel) to do so. In all Intel roadmaps, x86 was to be relegated to
> the low end of the market and then obsoleted in a matter of a few
> years, if not months.


The more fool Intel. The "low end" is where all the action is. To
see that, you have only to look at Blue Gene that was built with "low
end" processors. Low end or high wasn't what mattered. Power
consumption did. Intel has it figured out by now, and they're pouring
resources into low-power processors. If Intel hadn't had a credible
x86 candidate, what would it have done? I have no idea. To give up
on the low-power market is essentially to give up on the future
because the cost of computation is going to be dominated by the cost
of electricity, including the costs of cooling.
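To put rough numbers on that claim, here is a back-of-the-envelope sketch. Every figure below is an illustrative assumption, not data from this thread:

```python
# Illustrative only: lifetime electricity cost of a server, including
# cooling overhead. All figures are assumptions for the arithmetic.
watts = 400            # assumed average draw, cooling included
hours = 4 * 365 * 24   # four years of continuous operation
price_per_kwh = 0.12   # assumed electricity price in $/kWh

energy_cost = watts / 1000 * hours * price_per_kwh
print(round(energy_cost, 2))  # roughly $1,680 over four years at these assumptions
```

At those assumed figures, the electricity bill alone is on the order of the purchase price of a commodity server, which is the sense in which power, not peak performance, dominates the economics.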


>> 3. Sparc or Power would be holding a much larger market share under any
>> number of possible licensing and manufacturing arrangements.


> Sparc is big iron stuff, it just doesn't scale down to the desktop, let
> alone the laptop. And Power... It could not even hold on to Apple, the
> only desktop/laptop maker ever to use it. Ironically, it was dumped to
> make way for Intel's x86 products.

Sparc would have much of the market share for servers now dominated by
x86. The disappearance of Power is simply a matter of money. If the
market is entirely consumed by x86, no one will want to put the
resources into it necessary to compete with Intel's offerings. To be
sure, IBM has never been very much interested in that market, anyway.

I personally believe that what we've got is the worst of all possible
worlds: AMD on death's door, Microsoft holding on to its monopoly
catering to just one ISA, and, to all intents and purposes, zero
diversity in processor architecture.


> Some prefer diversity, others prefer a standard. Looks like you are not
> in the business of writing software, otherwise you'd know what a pain
> in the a$$ cross-platform compatibility is.
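For what it's worth, the pain being alluded to usually shows up as per-architecture dispatch in portable code. A minimal sketch, using only the standard library (the dispatch table here is illustrative, not exhaustive; the machine names follow `platform.machine()` conventions):

```python
# Minimal sketch of cross-platform dispatch: the same operation may need
# a different code path per ISA. Machine strings are those commonly
# reported by platform.machine(); the table is illustrative only.
import platform

def isa_label() -> str:
    machine = platform.machine().lower()
    if machine in ("x86_64", "amd64"):
        return "x86-64 code path"
    if machine.startswith("ia64"):
        return "IA-64 code path"
    if machine.startswith(("ppc", "power")):
        return "POWER code path"
    if machine.startswith("sparc"):
        return "SPARC code path"
    return "generic fallback"

print(isa_label())
```

Every new ISA in the market adds a branch like these, plus a build and test matrix entry, which is exactly the maintenance burden being complained about.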

I have very little sympathy for the concerns of software developers.
We'd be much better off with longer software development cycles so we
had less bad software.

Robert.