HardwareBanter -- a computer components & hardware forum



AMD planning 45nm 12-Core 'Istanbul' Processor ?



 
 
  #41  
Old May 1st 08, 06:14 AM posted to alt.comp.hardware.amd.x86-64,alt.comp.hardware.overclocking.amd,de.comp.hardware.cpu+mainboard.amd,comp.sys.intel,comp.sys.ibm.pc.hardware.chips
Zootal
external usenet poster
 
Posts: 87
Default AMD planning 45nm 12-Core 'Istanbul' Processor ?


The Pentium III was little more than a Pentium II with the L2
cache chips integrated on-die and running faster. The P2 was
little more than a slot repackaging of the Pentium Pro, which was
a completely new effort for Intel, having nothing in common with
the original Pentium and Pentium MMX.

In many ways the P4 has nothing in common with the P3 or Core,
and looks much more like a dressed-up, overclocked original Pentium.

-- Robert


That is pretty much my understanding. The P4's NetBurst architecture was a
dead end, and not even cranking the clock up to 3.8GHz delivered the
performance people wanted. The Core did not inherit from the P4; it was
based on the PPro/II/III architecture.


  #42  
Old May 2nd 08, 07:35 PM posted to comp.sys.intel
John Dallman
external usenet poster
 
Posts: 18
Default AMD planning 45nm 12-Core 'Istanbul' Processor ?

In article ,
(Zootal) wrote:

That is pretty much my understanding. The P4's NetBurst architecture
was a dead end, and not even cranking the clock up to 3.8GHz
delivered the performance people wanted. The Core did not inherit from the
P4; it was based on the PPro/II/III architecture.


At this point, the different senses of "architecture" start to
obscure the discussion.

I'm a software guy. To me, 32-bit x86 is an architecture, and
386/486/Pentium/PPro/PII/PIII/P4/Core are all implementations of that
architecture, with varying performance characteristics and gradual
expansion of the instruction and register sets. To a hardware designer,
NetBurst and Core are different architectures. The fact that they run
very nearly the same instruction set isn't particularly important; their
circuitry and design approaches are very different, so they're different
architectures.

As best I understand it, Intel consider the Core to be a separate
architecture from both PPro/II/III and NetBurst. The design philosophy
is much more like that of the PPro/II/III than it is like the NetBurst,
but given the older architecture was designed for 350nm and the Core for
65nm, the way a lot of the circuitry is done is rather different.
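
To make the two senses concrete, here is a minimal sketch of the
software-side view, assuming GCC or Clang on x86 (__get_cpuid is their
standard <cpuid.h> helper). The same CPUID instruction executes on
NetBurst and Core alike, because both implement the x86 architecture;
what differs is the family/model signature each microarchitecture
reports: family 15 for NetBurst, family 6 for the P6 line and Core.

#include <cpuid.h>   /* GCC/Clang helper for the CPUID instruction */
#include <stdio.h>

int main(void)
{
    unsigned eax, ebx, ecx, edx;

    /* Leaf 1 returns the processor signature in EAX. */
    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx))
        return 1;

    unsigned base_family = (eax >> 8) & 0xF;
    unsigned family = base_family;
    unsigned model  = (eax >> 4) & 0xF;
    if (base_family == 0xF)                  /* extended family bits */
        family += (eax >> 20) & 0xFF;
    if (base_family == 0x6 || base_family == 0xF)
        model |= ((eax >> 16) & 0xF) << 4;   /* extended model bits  */

    /* Family 15 = NetBurst (Pentium 4); family 6 = P6 line and Core. */
    printf("x86 family %u, model 0x%X\n", family, model);
    return 0;
}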

--
John Dallman

"C++ - the FORTRAN of the early 21st century."
  #43  
Old May 9th 08, 05:56 PM posted to comp.sys.ibm.pc.hardware.chips,comp.sys.intel
Robert Myers
external usenet poster
 
Posts: 606
Default AMD planning 45nm 12-Core 'Istanbul' Processor ?

On Apr 29, 10:27 am, Sebastian Kaliszewski wrote:
Robert Myers wrote:
I have very little sympathy for the concerns of software developers.
We'd be much better off with longer software development cycles so we
had less bad software.


ROTFL!

You've got things 180 degrees reversed from reality. The reality is that
making software development harder won't make a better software product,
nor will it change software development cycles.

It is a relatively common misconception among those who are clueless about
software development that the length of development cycles per se has any
meaningful effect on final product quality.


You made an erroneous inference from what I wrote because you
seriously underestimated how little I think of software developers.
If software developers are *slowed down*, there will be less bad
software because there will be less software.

It didn't *have* to turn out this way, but it did, and *you* are part
of the problem because, apparently, you think you know how to write
good software using languages and tools currently in use.

I may be wrong. If you are writing in a language and using tools that
allow checking of your programs for formal correctness, and if you
actually use those tools, please accept my apologies. Otherwise, you
are just another member of the club of gunslingers that call
themselves software developers and talk big, probably because they've
spent too much time blowing people away in video games.

There is a multitude of software
systems with strong simultaneous requirements of high quality
and short development cycles. And those requirements are met.


rotfflmao. The extra f is intentional, as I'm sure your humor is not.

The importance of
cycle time is conditional on the actual development methodology employed.


Once again, if you are using tools and methods that practically no one
actually uses, please accept my apologies. If you are relying on your
own personal brilliance and rigor, or that of your colleagues, you are
deluding yourself.

Robert.
  #44  
Old May 12th 08, 04:31 PM posted to comp.sys.ibm.pc.hardware.chips,comp.sys.intel
Sebastian Kaliszewski[_3_]
external usenet poster
 
Posts: 20
Default AMD planning 45nm 12-Core 'Istanbul' Processor ?

Robert Myers wrote:
I have very little sympathy for the concerns of software developers.
We'd be much better off with longer software development cycles so we
had less bad software.


ROTFL!

You've got things 180 degrees reversed from reality. The reality is that
making software development harder won't make a better software product,
nor will it change software development cycles.

It is a relatively common misconception among those who are clueless
about software development that the length of development cycles per se
has any meaningful effect on final product quality.


You made an erroneous inference from what I wrote because you
seriously underestimated how little I think of software developers.


Who cares what some clueless newsgroup poster thinks.


If software developers are *slowed down*, there will be less bad
software because there will be less software.

It didn't *have* to turn out this way, but it did, and *you* are part
of the problem because, apparently, you think you know how to write
good software using languages and tools currently in use.

I may be wrong.


You are.


If you are writing in a language and using tools that
allow checking of your programs for formal correctness, and if you
actually use those tools, please accept my apologies. Otherwise, you
are just another member of the club of gunslingers that call
themselves software developers and talk big, probably because they've
spent too much time blowing people away in video games.


You're clueless about the software development process. Unlike you, I know
how to do formal verification of software, and I do it when it's *needed*.
The important point is that it's *needed* exceptionally rarely. That's
because formal verification:
1. is very expensive, and
2. does not guarantee total correctness -- it only reduces the chance of
error, and in reality that reduction is directly proportional to the cost
increase.

It makes sense only in life-critical systems (where the cost of any hard
error is measured in millions). There it is in fact the preferred way (and
the cheapest one, as Lockheed's experience shows). But in other situations
it's simply too expensive.
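
To put a number on that cost cliff, a toy sketch (my own illustration,
not any particular methodology): brute-force checking, the cheapest
stand-in for verification, already needs about 2^31 cases just to cover
every ordered pair of 16-bit inputs to one trivial function. The same
loop over 64-bit inputs would outlive the machine, which is why real
formal verification has to reason symbolically and costs what it costs.

#include <assert.h>
#include <stdint.h>
#include <stdio.h>

/* Overflow-safe midpoint: never computes lo + hi directly,
   so it cannot wrap even when both inputs are near the maximum. */
static uint16_t midpoint(uint16_t lo, uint16_t hi)
{
    return (uint16_t)(lo + (hi - lo) / 2);
}

int main(void)
{
    /* Brute-force "verification": every ordered 16-bit pair,
       roughly 2^31 cases -- minutes of CPU time, but utterly
       intractable if the inputs were 64-bit. */
    for (uint32_t lo = 0; lo <= UINT16_MAX; lo++) {
        for (uint32_t hi = lo; hi <= UINT16_MAX; hi++) {
            uint32_t m = midpoint((uint16_t)lo, (uint16_t)hi);
            assert(lo <= m && m <= hi);       /* stays in range  */
            assert((hi - m) - (m - lo) <= 1); /* nearly balanced */
        }
    }
    puts("midpoint correct for all ordered 16-bit pairs");
    return 0;
}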


There is a multitude of software
systems with strong simultaneous requirements of high quality
and short development cycles. And those requirements are met.


rotfflmao. The extra f is intentional, as I'm sure your humor is not.


Go buy a clue and stop making an idiot of yourself publicly.


The importance of
cycle time is conditional on the actual development methodology employed.


Once again, if you are using tools and methods that practically no one
actually uses, please accept my apologies. If you are relying on your
own personal brilliance and rigor, or that of your colleagues, you are
deluding yourself.


No, I'm relying on the cost of software production versus the cost of
errors times their expected probability of occurrence during the lifetime
of the software.
Making software immune to an error costing $1e+9, with immunity probability
greater than 1-1e-4 (that is, four nines) over a 10-year lifetime, should
not cost more than $1e+5 (that's about one man-year spent on improving
quality). If it costs more, it's time to cut corners and accept a higher
error probability, reducing production cost.
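
A back-of-envelope version of that break-even rule, using the figures
above (assumptions for illustration, not measured data):

#include <stdio.h>

int main(void)
{
    double cost_of_error = 1e9;  /* dollars lost per hard failure     */
    double max_fail_prob = 1e-4; /* allowed over the 10-year lifetime */

    /* Expected loss = the most it is rational to spend preventing it. */
    double budget = cost_of_error * max_fail_prob;
    printf("justifiable hardening budget: $%.0f\n", budget); /* $100000 */
    return 0;
}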
And even then it's pointless to run that software on hardware with a fault
probability worse than that of the software (i.e., not for home use, nor a
small company -- your standard PC will die with probability well above 0.5,
not below 0.0001, within that timeframe, even if you do regular
professional system maintenance and replace all parts on a planned
schedule). Then all of that must also be immune to operator errors --
likewise impossible in a home or small-to-midsize business environment.
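
A quick sanity check of that hardware figure; the 7% annual failure rate
below is an assumed, illustrative number, since only the 10-year outcome
("well above 0.5") is claimed above:

#include <math.h>
#include <stdio.h>

int main(void)
{
    /* 7%/year is assumed for illustration, not taken from the post. */
    double annual_fail = 0.07;
    double years = 10.0;

    /* Survival compounds per year; failure is the complement. */
    double p_die = 1.0 - pow(1.0 - annual_fail, years);
    printf("P(PC dies within %.0f years) = %.2f\n", years, p_die); /* ~0.52 */
    return 0;
}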

So what you propose is a horrendous waste of time and money. It makes no
sense whatsoever.


Sebastian Kaliszewski
--
"Never underestimate the power of human stupidity" -- L. Lang
 



