A computer components & hardware forum. HardwareBanter


AMD



 
 
  #12  
Old February 20th 12, 03:31 PM posted to alt.comp.hardware.pc-homebuilt
Michael Black[_2_]

On Mon, 20 Feb 2012, Dave C. wrote:


The CPU, as most currently know it, is a dinosaur, destined to
quickly become extinct. This will affect both AMD and Intel. AMD
knew that a long time ago, which is exactly why AMD bought ATI.


I think you're referring to "x86 compatibility" here, rather than CPU
as a concept in general.



No, I'm referring to the CPU as a concept in general. I know it's hard to swallow if you're hearing this for the first time. But the CPU is really not needed (at all) in a computer system. The best example I've heard of recently to illustrate my point is the supercomputer that the U.S. Air Force created out of a bunch of game consoles:
http://www.popsci.com/technology/art...ir-forces-new-supercomputer-made-1760-playstation-3s


I think you're misinterpreting things.

A graphics processor is a CPU that's been sculpted to deal with graphics. In the old days there were CRT controllers, and while they streamlined things like scrolling text up the screen, they still required the CPU to do much of the work. Then, as computing got more graphical, a second CPU was in effect added to take the load off the primary CPU, so the primary CPU could do what it's supposed to do while the secondary one handled the graphics. A general-purpose second CPU was likely used at first; then came processors designed specifically for graphics, and thus the GPU came along.

Yes, these GPUs have become better and better for all those gamers, but in essence they are still built for graphics processing.

For them to be "general purpose CPUs", their instruction set would have to be different, which means they would not be as good at graphics. And if you go back to one processor, then it slows things down.
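
To give a feel for it, here is a minimal sketch of what running a general-purpose calculation on today's GPUs looks like (assuming NVIDIA's CUDA toolkit; the array size and names are purely illustrative). The work has to be expressed as thousands of tiny threads rather than one sequential instruction stream, which is a very different model from a CPU's:

// Minimal CUDA sketch: a general-purpose calculation (y = a*x + y)
// expressed as one tiny thread per array element.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];               // each thread handles one element
}

int main()
{
    const int n = 1 << 20;
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float)); // unified memory, for brevity
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; i++) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, x, y);   // roughly a million threads
    cudaDeviceSynchronize();

    printf("y[0] = %.1f (expect 5.0)\n", y[0]);
    cudaFree(x);
    cudaFree(y);
    return 0;
}

(It builds with NVIDIA's nvcc compiler; the equivalent job on a CPU would be a single for loop.)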

Now, I can see graphics being included in CPUs, the way memory management and math coprocessors have been added, but it wouldn't be about changing the CPU; it would be about adding the graphics processor to the chip the CPU is on.

Nobody wants to go back to the days of the CPU handling all the graphics, since that would slow things down now that graphics are so intense on so many computers.

Michael
  #13  
Old February 20th 12, 06:24 PM posted to alt.comp.hardware.pc-homebuilt
Paul

Michael Black wrote:

Nobody wants to go back to the days of the CPU handling all the graphics, since that would slow things down now that graphics are so intense on so many computers.

Michael


Intel experimented with that idea in the Larrabee chip. While that chip was not released to the market, I'll bet it is still alive and kicking inside an Intel lab somewhere, mainly for applications in servers or scientific computing. It probably won't come back as a graphics chip or an APU look-alike or anything.

http://en.wikipedia.org/wiki/Larrabe...roarchitecture)

Larrabee was to use the x86 instruction set with Larrabee-specific extensions.

Larrabee was to feature cache coherency across all its cores.

Larrabee was to include very little specialized graphics hardware,
instead performing tasks like z-buffering, clipping, and blending
in software, using a tile-based rendering approach.

The GPU doesn't have cache coherency between cores (that would absolutely kill performance), while the Larrabee does, which makes it more suited to general computing. The GPU also has poor memory bandwidth for ordinary computing tasks (a forum participant here measured it for himself). Under the right conditions, and with the right test code, the aggregate memory performance can be made to hit the stated datasheet bandwidth (however many GB/sec that happens to be). But, just like on a CPU, if you write a single thread of code for a GPU and make the code do random accesses, performance sucks big time. It sucks worse than on a CPU. That's because memory design hasn't kept up with the improvements in CPU design.
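
For what it's worth, the effect is easy to demonstrate. The sketch below (CUDA again; the array size and stride are arbitrary choices, and absolute numbers depend on the card) times a copy where neighbouring threads read neighbouring words against one where the reads jump all over the array, which defeats the coalescing that the datasheet bandwidth figure depends on:

// Sketch: coalesced vs. scattered global-memory reads on a GPU,
// timed with CUDA events. Both kernels move the same number of bytes.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void copy_coalesced(const float *in, float *out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        out[i] = in[i];                           // adjacent threads read adjacent words
}

__global__ void copy_scattered(const float *in, float *out, int n, int stride)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        out[i] = in[(long long)i * stride % n];   // reads are spread out, little coalescing
}

int main()
{
    const int n = 1 << 24;
    float *in, *out;
    cudaMalloc(&in,  n * sizeof(float));
    cudaMalloc(&out, n * sizeof(float));
    cudaMemset(in, 0, n * sizeof(float));

    dim3 block(256), grid((n + 255) / 256);
    cudaEvent_t t0, t1;
    cudaEventCreate(&t0);
    cudaEventCreate(&t1);

    cudaEventRecord(t0);
    copy_coalesced<<<grid, block>>>(in, out, n);
    cudaEventRecord(t1);
    cudaEventSynchronize(t1);
    float ms_seq;
    cudaEventElapsedTime(&ms_seq, t0, t1);

    cudaEventRecord(t0);
    copy_scattered<<<grid, block>>>(in, out, n, 1021);
    cudaEventRecord(t1);
    cudaEventSynchronize(t1);
    float ms_rnd;
    cudaEventElapsedTime(&ms_rnd, t0, t1);

    printf("coalesced: %.2f ms   scattered: %.2f ms\n", ms_seq, ms_rnd);
    cudaFree(in);
    cudaFree(out);
    return 0;
}

On a typical card the scattered copy comes out several times slower, even though both kernels touch the same amount of data.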

This is a partial block diagram of the Larrabee. Compared to a regular Intel CPU, the processor features in-order execution, which isn't as good as the out-of-order execution of current processors. That was done to reduce the size of each core, so more of them could be fitted on the die for parallelism.

http://upload.wikimedia.org/wikipedi..._bl oack).PNG

And this, apparently, is a wafer full of Larrabees.

http://news.mydrivers.com/Img/20091104/S11204977.jpg

I think the idea is that they can put more than one block into a chip. I doubt it's tiled exactly like the Wikipedia picture, so it's hard to say whether this represents two "blocks" or not, or whether they put all the cores onto a single internal bus.

http://www.techpowerup.com/img/10-05-31/110c.jpg

Paul

  #14  
Old February 20th 12, 07:48 PM posted to alt.comp.hardware.pc-homebuilt
Loren Pechtel[_2_]

On Mon, 20 Feb 2012 00:33:02 +0000 (UTC), "Dave C."
wrote:

To simplify a bit...a GPU is an incredibly powerful CPU that usually does
not process information the same way that a CPU does. But it CAN. It's
just a matter of coding. Instruct the GPU to act as a CPU also...and then
what do you use the CPU for?
The CPU as we know it is DONE!


The basic problem is that the GPU got that performance through extreme parallelism and by ditching compatibility. Neither translates very well into a chip to run your main machine.
  #15  
Old February 21st 12, 08:04 AM posted to alt.comp.hardware.pc-homebuilt
Red Cloud

On Feb 20, 2:22 pm, "Dave C." wrote:
As the saying goes, failing to plan is planning to fail. If Intel is getting ready to compete in the APU market, they sure are being secretive about it! Intel makes great CPUs, no argument there. Too bad the CPU is going the way of the 8-track tape player. AMD is gearing up to produce the next round of APUs which will power future desktop computer systems. Where is Intel in the future? Right now, I don't even see Intel in the future, at all...

I think the best place for feedback on your idea, would be something like comp.arch.

Paul

What idea are you referring to, Paul? I haven't had any ideas about computer architecture. I am simply commenting on current hardware trends. In case you haven't noticed, the CPU (as we currently use the term) is almost extinct. That's not my idea. It is something I myself read about many years ago. Since then, I have watched it happening, and commented on it. But it wasn't ever my idea.

Within 10 or 20 years (at most) we will be building single-chip desktop computers. The single chip will be video card, CPU, RAM and possibly storage as well (SSD) all integrated into one chip. I've seen this coming for many years. The only thing I was confused on is what the single chip would be called. AMD cleared that up when they started shipping their "APU" chips to retail channels.

So there ya have it. CPUs are out, APUs are in. Not my idea. But it's not future tech or vaporware, either. It's starting to happen RIGHT NOW.



I was working as a PC tech during the '90s... I couldn't believe it when you said the CPU market could go extinct, but it's not an out-of-line opinion. It makes some sense, considering the PC market is driven by graphics-heavy gaming. I guess that's why the GPU market has a clearer road ahead than the CPU market.


  #16  
Old February 21st 12, 04:15 PM posted to alt.comp.hardware.pc-homebuilt
SC Tom


"Dave C." wrote in message . ..

As the saying goes, failing to plan is planning to fail. If Intel is
getting ready to compete in the APU market, they sure are being
secretive about it! Intel makes great CPUs, no argument there. Too
bad the CPU is going the way of the 8-track tape player. AMD is
gearing up to produce the next round of APUs which will power future
desktop computer systems. Where is Intel in the future? Right now,
I don't even see Intel in the future, at all...


I think the best place for feedback on your idea, would be something
like comp.arch.

Paul


What idea are you referring to, Paul? I haven't had any ideas about
computer architecture. I am simply commenting on current hardware
trends. In case you haven't noticed, the CPU (as we currently use the
term) is almost extinct. That's not my idea. It is something I myself
read about many years ago. Since then, I have watched it happening, and
commented on it. But it wasn't ever my idea.

Within 10 or 20 years (at most) we will be building single-chip desktop
computers. The single chip will be video card, CPU, RAM and possibly
storage as well (SSD) all integrated into one chip. I've seen this
coming for many years. The only thing I was confused on is what the
single chip would be called. AMD cleared that up when they started
shipping their "APU" chips to retail channels.

So there ya have it. CPUs are out, APUs are in. Not my idea. But it's not future tech or vaporware, either. It's starting to happen RIGHT
NOW.


Here's an example of what you are referring to:

http://h10010.www1.hp.com/wwpc/us/en...414.html?dnr=1

Reviews:
http://www.pcmag.com/article2/0,2817,2392628,00.asp

http://www.zdnet.com/reviews/product...b-hdd/34847213


It seems to have some nice features, even if I would prefer a larger screen. The 9-cell battery life is pretty awesome, but I'm sure a larger screen would cut into that (my Gateway M6850-fx 15.4" laptop gets about 90 minutes, so anything over 3 hours would be superb to me). I'm not in the market for a new laptop yet, but when I am, I might look into something like this. The only real con I have with it is whether it's possible to upgrade the APU to something more powerful as newer ones are produced. That may be another reason for my short battery life: I put a faster CPU in mine about a year after I bought it.
--
SC Tom

  #17  
Old February 21st 12, 06:40 PM posted to alt.comp.hardware.pc-homebuilt
Paul

Dave C. wrote:
As the saying goes, failing to plan is planning to fail. If Intel is
getting ready to compete in the APU market, they sure are being
secretive about it! Intel makes great CPUs, no argument there. Too
bad the CPU is going the way of the 8-track tape player. AMD is
gearing up to produce the next round of APUs which will power future
desktop computer systems. Where is Intel in the future? Right now,
I don't even see Intel in the future, at all...

I think the best place for feedback on your idea, would be something
like comp.arch.

Paul


What idea are you referring to, Paul? I haven't had any ideas about
computer architecture. I am simply commenting on current hardware
trends. In case you haven't noticed, the CPU (as we currently use the
term) is almost extinct. That's not my idea. It is something I myself
read about many years ago. Since then, I have watched it happening, and
commented on it. But it wasn't ever my idea.

Within 10 or 20 years (at most) we will be building single-chip desktop
computers. The single chip will be video card, CPU, RAM and possibly
storage as well (SSD) all integrated into one chip. I've seen this
coming for many years. The only thing I was confused on is what the
single chip would be called. AMD cleared that up when they started
shipping their "APU" chips to retail channels.

So there ya have it. CPUs are out, APUs are in. Not my idea. But it's not future tech or vaporware, either. It's starting to happen RIGHT
NOW.


They're not likely to implement memory in an SOC. There are yield issues with putting large amounts of silicon in a single chip. You can build MCMs, and I have some experience with those. We proposed a project to put a large number of chips into an MCM (more than is attempted commercially at present), and the big issue there is KGD (known good die). The probability of assembling a working module without rework becomes pretty poor with enough chips. Commercial MCMs currently, in high volume, put two to four silicon die on a substrate. (Even some of your Intel processors are MCMs.) "Dicing" the design like that improves yield, because each piece can be tested before assembly.
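
To put rough numbers on it (assumed figures, purely for illustration): if each die independently has a 95% chance of being known-good, a 4-die MCM comes out working about 0.95^4 ≈ 81% of the time, while a 20-die module drops to 0.95^20 ≈ 36% before any rework. That's why the high-volume parts stay at two to four die.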

I would agree that the CPU, GPU, Northbridge, and Southbridge could potentially be put on the same die (no Flash SSD, no DRAM). One enabler for that would be a separate memory component that is large and needs few I/O pins. Currently, putting memory buses on computers sucks up a lot of I/O pads. Solutions to that include some of Rambus Corporation's patent portfolio, or FBDIMM buffer technology (which reduces I/O a bit).

The economics of putting large amounts of DRAM or Flash onto a single large SOC just aren't there. Even with sparing and laser disabling of defective sections, the odds of getting a good die just aren't high enough. Constructing an MCM will certainly work, and with some compromises, maybe that will be the right solution.

The proof of the economics is what we see in "real world" products. MCMs have been around for years; they are not a new idea, and their economics are well known to the people making them.

One company has put large numbers of silicon die into a common MCM package: IBM, years ago, placed a bunch of chips into a sealed MCM with water-cooling spigots on top of it. The module power dissipation was around 1000W, and it was part of some kind of mainframe. The module probably cost a fortune to make, but mainframes also go for millions of dollars, so in that case it's a good use of the technology. It allows the core of the mainframe to be a bit more compact.

(MCM, with lid and cooling plate removed - it's possible there is a
ceramic substrate with up to 600 routing layers, underneath. IBM makes
amazing substrates!)

http://mixeurpc.free.fr/SITE_x86-gui...P-13)%2002.jpg

The IBM technology tends to use more expensive materials, to make it possible to package more logic into the same package. Commercial high-volume MCMs stick to two to four silicon die, so the parts can be manufactured with plastic or organic packaging. With that few parts, KGD isn't as much of a problem, and they probably would not attempt to rework (repair) a part to make it work. They roll the dice, see if it works, and throw it away if it doesn't. With good quality control, they get a decent percentage of working parts.

Paul
  #18  
Old February 21st 12, 10:03 PM posted to alt.comp.hardware.pc-homebuilt
Dave C.[_3_]

Peter wrote:

In article, Dave C. says...

To simplify a bit...a GPU is an incredibly powerful CPU that usually
does not process information the same way that a CPU does. But it
CAN. It's just a matter of coding. Instruct the GPU to act as a CPU
also...and then what do you use the CPU for?



This bit needs explaining. If, as you say, they can make GPUs that
are, in essence, just extremely powerful CPUs (basically the same
technology), then why are they not also just making extremely powerful
CPUs using the same technology, now?


Simply stated, why kill profits sooner than you have to? People are used to buying two expensive chips for each computer system...or paying the premium price for prebuilt systems with two chips each. So then you might wonder, why go to a single chip at all? The answer to that is that CPU technology has hit a wall, of sorts. You can't get over the wall to offer significantly better performance without tight integration...or taking the motherboard mostly out of the circuit path.


It would be like a car manufacturer who managed to come up with an engine that was twice as powerful, without being any bigger or using any more fuel, but held back because they knew that in five years' time they could improve the fuel economy by 50%.


I don't think anybody is deliberately holding back. If they are, it is
Intel, obviously. It's easy to see progress in the direction of tight
integration of multiple chips onto a single die. Heck, go to Newegg and
search for A6-3500 (for example). Then keep in mind that, probably
within 10 years AT MOST, this particular APU line (in retail channels
now) won't even have a CPU core on it.

AMD is committed to this technology, and APUs are indeed the way the
technology is headed. Is Intel planning to compete with an APU of their
own? So far, I haven't seen any hint of that, even.


  #19  
Old February 21st 12, 10:07 PM posted to alt.comp.hardware.pc-homebuilt
Dave C.[_3_]



Nobody wants to go back to the days of the CPU handling all the graphics, since that would slow things down now that graphics are so intense on so many computers.

Michael


Don't take offense, but you are thinking backwards. I wouldn't want the
CPU to handle the graphics, either. Luckily, that's not where the tech is
headed. Soon, it will be the GPU that handles the general CPU functions.
Will this slow down graphics processing? Slightly. But GPUs are becoming
so powerful... loading a GPU with CPU tasks would be the functional
equivalent of removing a sound card from a system. Most users wouldn't
even notice the decrease in performance. Even benchmarks would be
minimally affected.
  #20  
Old February 21st 12, 10:11 PM posted to alt.comp.hardware.pc-homebuilt
Dave C.[_3_]


To simplify a bit...a GPU is an incredibly powerful CPU that usually
does not process information the same way that a CPU does. But it
CAN. It's just a matter of coding. Instruct the GPU to act as a CPU
also...and then what do you use the CPU for?
The CPU as we know it is DONE!


The basic problem is that with the GPU they got that performance by
extreme parallelism and by ditching compatibility. Neither translates
too well into a chip to run your main machine.


It doesn't matter. The GPUs of today are so powerful that even if they
aren't as efficient as they COULD be in performing CPU functions...they can
still take over the CPU functions without significantly affecting overall
system performance.

 



