#11
On Wed, 12 Oct 2005 06:21:40 +0200, koko wrote:
> On Tue, 11 Oct 2005 01:17:01 +0200, Nathan Bates wrote:
>> The twist is that this x86 CPU will be based on photonic technology.
>
> Photonics? Can you say bling bling? I just hope they got themselves
> lots of room for those flashlights and mirrors.

I understand flashlights (when the power company wants the bill paid), but mirrors?

--
Keith
#12
"koko" wrote in message
news On Tue, 11 Oct 2005 01:17:01 +0200, Nathan Bates wrote: The twist is that this x86 CPU will be based on photonic technology. Photonics can you say bling bling? I just hope they got themselves lots of room for those flashlights and mirrors. No need for flashlights, but they will need orange smoke. -- ... Hank http://home.earthlink.net/~horedson http://home.earthlink.net/~w0rli |
#13
In article ,
Evgenij Barsukov wrote:
> YKhan wrote:
>> As soon as I hear photonics in relation to CPUs, I immediately think
>> scam.
>>
>> Yousuf Khan
>
> Intel recently demonstrated an in-silicon infrared laser, so I think it
> will not remain this way much longer. There are not many other ways to
> go if you want to keep increasing bandwidth inside the chip.
>
> Regards,
> Evgenij

The problem is not (or mostly not) the generation of the light; it is the switching based on the light. Pretty much all the technologies available to do this suck in one way or another.

And, of course, it's pretty much irrelevant whether they get a photonics CPU to work or not. Hook up your fancy 100GHz CPU to existing RAM, and the results aren't going to impress anyone much.

Maynard
#14
keith writes:
> Jeez! After we get nuclear fusion tackled we'll not need photonics.
> Processors can then scale to 1.21GW and there won't be any need for
> photonics. Indeed, my money would be on Mr. Fusion first. After all,
> it has a 50 year head start.

Well, if you have a 50 year head start, but are nowhere near the finishing line, perhaps you are running in the wrong direction? :-)

-k
--
If I haven't seen further, it is by standing in the footprints of giants
#15
On Thu, 13 Oct 2005 06:30:38 GMT, Maynard Handley wrote:
> In article , Evgenij Barsukov wrote:
>> YKhan wrote:
>>> As soon as I hear photonics in relation to CPUs, I immediately think
>>> scam.
>>>
>>> Yousuf Khan
>>
>> Intel recently demonstrated an in-silicon infrared laser, so I think it
>> will not remain this way much longer. There are not many other ways to
>> go if you want to keep increasing bandwidth inside the chip.
>>
>> Regards, Evgenij
>
> The problem is not (or mostly not) the generation of the light; it is
> the switching based on the light. Pretty much all the technologies
> available to do this suck in one way or another.
>
> And, of course, it's pretty much irrelevant whether they get a photonics
> CPU to work or not. Hook up your fancy 100GHz CPU to existing RAM, and
> the results aren't going to impress anyone much.

Hey, you could hook the bugger up directly to some of that holographic memory I've been hearing great things about... for the past x0 years (choose x according to your age).

--
Rgds, George Macdonald
#16
In article , fammacd=! says...
> On Thu, 13 Oct 2005 06:30:38 GMT, Maynard Handley wrote:
> [snip]
>> The problem is not (or mostly not) the generation of the light; it is
>> the switching based on the light. Pretty much all the technologies
>> available to do this suck in one way or another.
>>
>> And, of course, it's pretty much irrelevant whether they get a photonics
>> CPU to work or not. Hook up your fancy 100GHz CPU to existing RAM, and
>> the results aren't going to impress anyone much.
>
> Hey, you could hook the bugger up directly to some of that holographic
> memory I've been hearing great things about... for the past x0 years
> (choose x according to your age).

I first heard about holographic storage "proposed"[*] for the Illiac IV in the mid '60s.

[*] not sure it was serious

--
Keith
#17
On Thu, 13 Oct 2005 03:31:09 +0200, keith wrote:
> I understand flashlights (when the power company wants the bill paid),
> but mirrors?

You need a lot of little shiny objects in this line of business.

--
the penguins are psychotic
aka just smile and wave
#18
On Thu, 13 Oct 2005 12:45:28 +0200, George Macdonald wrote:
> Hey, you could hook the bugger up directly to some of that holographic
> memory I've been hearing great things about... for the past x0 years
> (choose x according to your age).

You mean the diamond/ruby/gas-based kind? The ones working in labs today?

--
the penguins are psychotic
aka just smile and wave
#19
So how exactly does holographic memory work?

Yousuf Khan
#20
In comp.arch Maynard Handley wrote:
> And, of course, it's pretty much irrelevant whether they get a photonics
> CPU to work or not. Hook up your fancy 100GHz CPU to existing RAM, and
> the results aren't going to impress anyone much.

That was what came to mind here. If you ignore that, however, and also assume that a first-generation photonic product wouldn't be able to support the complexity of a modern x86 chip, would there be a market for a 100GHz 8086? Or 80386, etc.? Would you be willing to run Windows 3.1 again if it ran 20x faster than XP on a traditional modern x86?

G.