A computer components & hardware forum. HardwareBanter

photonic x86 CPU design



 
 
#11 - October 13th 05, 02:31 AM - keith

On Wed, 12 Oct 2005 06:21:40 +0200, koko wrote:

On Tue, 11 Oct 2005 01:17:01 +0200, Nathan Bates
wrote:

The twist is that this x86 CPU will be based on photonic technology.
Photonics


can you say bling bling?
I just hope they got themselves lots of room for those flashlights and
mirrors.


I understand flashlights (when the power company wants the bill paid), but
mirrors?

--
Keith
#12 - October 13th 05, 03:28 AM - Hank Oredson

"koko" wrote in message:
On Tue, 11 Oct 2005 01:17:01 +0200, Nathan Bates
wrote:

The twist is that this x86 CPU will be based on photonic technology.
Photonics


can you say bling bling?
I just hope they got themselves lots of room for those flashlights and
mirrors.



No need for flashlights, but they will need orange smoke.

--

... Hank

http://home.earthlink.net/~horedson
http://home.earthlink.net/~w0rli


#13 - October 13th 05, 07:30 AM - Maynard Handley

Evgenij Barsukov wrote:

YKhan wrote:
As soon as I hear photonics in relation to CPUs, I immediately think
scam.

Yousuf Khan

Intel recently demonstrated an in-silicon infrared laser, so I think
it will not remain this way much longer. There are not many other
ways to go if you want to keep increasing bandwidth inside the chip.

Regards,
Evgenij


The problem is not (or mostly not) the generation of the light, it is
the switching based on the light. Pretty much all the technologies
available to do this suck in one way or another.
And, of course, it's pretty much irrelevant whether they get a photonics
CPU to work or not. Hook up your fancy 100GHz CPU to existing RAM, and
the results aren't going to impress anyone much.
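The memory-wall point above can be made quantitative with Amdahl's law: only the compute-bound fraction of a workload benefits from a faster core, so a CPU that is 25x faster (call it the "100GHz" part) barely moves the needle on a memory-bound program. A rough sketch, using purely illustrative numbers for the memory-bound fraction:

```python
# Back-of-envelope Amdahl's law: the memory-bound share of runtime
# does not benefit from a faster CPU. The fractions and speedup
# factor below are illustrative assumptions, not measurements.

def effective_speedup(compute_fraction: float, cpu_speedup: float) -> float:
    """Overall speedup when only the compute-bound fraction of
    runtime is accelerated; memory-bound time stays the same."""
    return 1.0 / ((1.0 - compute_fraction) + compute_fraction / cpu_speedup)

# A workload spending 60% of its time waiting on RAM gains only
# about 1.6x overall from a 25x-faster core:
print(effective_speedup(0.4, 25.0))  # ≈ 1.62
```

The closer the compute fraction gets to 1.0, the closer the overall gain gets to the raw CPU speedup, which is why the RAM side of the system has to keep pace for a photonic CPU to matter.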

Maynard
#14 - October 13th 05, 07:40 AM - Ketil Malde

keith writes:

Jeez! After we get nuclear fusion tackled we'll not need photonics.
Processors can then scale to 1.21GW and there won't be any need for
photonics. Indeed, my money would be on Mr. Fusion first. After all,
it has a 50 year head start.


Well, if you have a 50 year head start, but are nowhere near the
finishing line, perhaps you are running in the wrong direction?

:-)

-k
--
If I haven't seen further, it is by standing in the footprints of giants
#15 - October 13th 05, 11:45 AM - George Macdonald

On Thu, 13 Oct 2005 06:30:38 GMT, Maynard Handley
wrote:

Evgenij Barsukov wrote:

YKhan wrote:
As soon as I hear photonics in relation to CPUs, I immediately think
scam.

Yousuf Khan

Intel recently demonstrated an in-silicon infrared laser, so I think
it will not remain this way much longer. There are not many other
ways to go if you want to keep increasing bandwidth inside the chip.

Regards,
Evgenij


The problem is not (or mostly not) the generation of the light, it is
the switching based on the light. Pretty much all the technologies
available to do this suck in one way or another.
And, of course, it's pretty much irrelevant whether they get a photonics
CPU to work or not. Hook up your fancy 100GHz CPU to existing RAM, and
the results aren't going to impress anyone much.


Hey, you could hook the bugger up directly to some of that holographic
memory I've been hearing great things about... for the past x0 years
(choose x according to your age).

--
Rgds, George Macdonald
#16 - October 13th 05, 01:21 PM - Keith R. Williams

George Macdonald wrote:
On Thu, 13 Oct 2005 06:30:38 GMT, Maynard Handley
wrote:


snip

The problem is not (or mostly not) the generation of the light, it is
the switching based on the light. Pretty much all the technologies
available to do this suck in one way or another.
And, of course, it's pretty much irrelevant whether they get a photonics
CPU to work or not. Hook up your fancy 100GHz CPU to existing RAM, and
the results aren't going to impress anyone much.


Hey, you could hook the bugger up directly to some of that holographic
memory I've been hearing great things about... for the past x0 years
(choose x according to your age).

I first heard about holographic storage "proposed"[*] for the
Illiac IV in the mid '60s.
[*] not sure it was serious

--
Keith
#17 - October 13th 05, 01:52 PM - koko

On Thu, 13 Oct 2005 03:31:09 +0200, keith wrote:

I understand flashlights (when the power company wants the bill paid),
but mirrors?


You need a lot of little shiny objects in this line of business.

--
the penguins are psychotic
aka just smile and wave
#18 - October 13th 05, 03:04 PM - koko

On Thu, 13 Oct 2005 12:45:28 +0200, George Macdonald
wrote:

Hey, you could hook the bugger up directly to some of that holographic
memory I've been hearing great things about... for the past x0 years
(choose x according to your age).


You mean diamond/ruby/gas-based? The kind working in labs today?


--
the penguins are psychotic
aka just smile and wave
#19 - October 13th 05, 09:17 PM - Yousuf Khan

So how exactly does holographic memory work?

Yousuf Khan

#20 - October 13th 05, 10:43 PM - Gavin Scott

In comp.arch Maynard Handley wrote:
And, of course, it's pretty much irrelevant whether they get a photonics
CPU to work or not. Hook up your fancy 100GHz CPU to existing RAM, and
the results aren't going to impress anyone much.


That was what came to mind here.

If you ignore that however and also assume that a first generation
photonic product wouldn't be able to support the complexity of a
modern x86 chip, would there be a market for a 100GHz 8086? Or
80386, etc.? Would you be willing to run Windows 3.1 again if it
ran 20x faster than XP on a traditional modern x86?

G.
 



