#21
"Gavin Scott" wrote in message ...
| If you ignore that however and also assume that a first generation
| photonic product wouldn't be able to support the complexity of a modern
| x86 chip, would there be a market for a 100GHz 8086? Or 80386, etc.?
| Would you be willing to run Windows 3.1 again if it ran 20x faster than
| XP on a traditional modern x86?
| G.

It doesn't work that way. The question is based on lots of false premises.

DS
#22
Gavin Scott wrote:
| In comp.arch Maynard Handley wrote:
| | And, of course, it's pretty much irrelevant whether they get a
| | photonics CPU to work or not. Hook up your fancy 100GHz CPU to
| | existing RAM, and the results aren't going to impress anyone much.
| That was what came to mind here. If you ignore that however and also
| assume that a first generation photonic product wouldn't be able to
| support the complexity of a modern x86 chip, would there be a market
| for a 100GHz 8086? Or 80386, etc.? Would you be willing to run Windows
| 3.1 again if it ran 20x faster than XP on a traditional modern x86?

The question would be irrelevant if it reflected the reality of the hardware. Given a CPU 20x faster than the state of the art at that time, you could run an x86 emulator on it and the real instruction set wouldn't matter; it would still be faster than current hardware. If it ran some subset of x86 natively, that's a bonus, not a requirement. You don't say "cost effective", but the assumption is there, I think; 20x faster in any usable way would probably do it.

If a photonic computer of reasonable speed and cost existed, I suspect there would be a quick port of at least one OS to it, and x86 would become history in the high-end market. Clearly, if you port something like Linux you get a lot of tools which either just work or work with little effort. That's because 32- and 64-bit versions are both in use, so many of the portability problems have been fixed already. You could assume that any new CPU built today would be little-endian two's complement, so even though some programs make assumptions about that, they are unlikely to fail. I believe 32-bit Windows was ported to Alpha, so given a reason it could be ported to a new CPU as well. I'm not sure that even a chance to own the workstation market would be reason enough: the Alpha market was too small to support itself, and most buyers don't spend the money to buy the fastest CPU available now.

The real question is whether this new system would be so fast that people would go to the effort to port anything BUT Linux, since it would have to make sen$e to do so. Linux would get ported because (a) there are lots of recent ports to serve as examples, (b) people will do it to prove they can, and (c) you can get a grad student to do almost anything for a little money and a thesis topic.

-- 
bill davidsen
SBC/Prodigy Yorktown Heights NY data center
http://newsgroups.news.prodigy.com
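[The emulation argument above boils down to a plain fetch/decode/execute loop. Here is a toy sketch of that loop for a made-up three-instruction machine; every name in it (the `emulate` function, the MOV/ADD/HALT opcodes, the register set) is invented for illustration and has nothing to do with real x86 or any actual emulator.]

```python
# Toy sketch of the interpreter loop an ISA emulator is built around.
# A made-up three-instruction machine, not real x86.

def emulate(program):
    """Run a list of (opcode, operands...) tuples until HALT; return registers."""
    regs = {"ax": 0, "bx": 0}
    pc = 0                       # program counter
    while True:
        op, *args = program[pc]  # fetch and decode
        pc += 1
        if op == "MOV":          # MOV dst, imm: load an immediate
            dst, imm = args
            regs[dst] = imm
        elif op == "ADD":        # ADD dst, src: dst += src
            dst, src = args
            regs[dst] += regs[src]
        elif op == "HALT":       # stop and hand back machine state
            return regs
        else:
            raise ValueError(f"unknown opcode {op!r}")

prog = [("MOV", "ax", 2), ("MOV", "bx", 40), ("ADD", "ax", "bx"), ("HALT",)]
```

A real emulator's dispatch loop has the same shape, just with hundreds of opcodes plus flags and memory. The point above is that on a host 20x faster than the state of the art, even this interpretation overhead still leaves you ahead of native execution on contemporary hardware.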
#23
Nick Maclaren wrote:
In article .com, "Nathan Bates" writes:
| As soon as I hear photonics in relation to CPUs, I immediately think
| scam.
|
| Manufacturing a CPU on silicon was an enormous scam.
| Just melt worthless sand into tiny wafers and sell each one for $1,000.

A long time back, someone wrote an article about the forthcoming silicon shortage, if computer use kept expanding.

| Seriously, here's an intriguing article mentioning
| Intel, AMD, FreeScale, and Transmeta regarding photonics:
| http://www.extremetech.com/article2/...1779951,00.asp

Sigh. Most of that is about the electro-optical converters, which are produced in large numbers today but are not scalable.

If Luxtera or anyone else can manage to integrate those with CPUs, it would make a massive difference to interconnects and might even be used inside a chip to reduce latency. My first thought was HyperTransport. But 10GHz isn't that fast... people are using that for ethernet in labs, based on silicon.

| Thru design to tape-out, a new x86 CPU based on CMOS technology
| will take 3 years min to develop. A photonic x86 will take much
| longer, maybe 5..7 years, and expect a rate of advancement of photonic
| technology.
| Sounds like the classic gamble for Silicon Valley VCs.

Complex optical logic is another game. Yes, maybe 5-7 years. But also maybe 50-70. Sane people don't believe tight schedules for developing new, known-to-be-difficult technology.

-- 
bill davidsen
SBC/Prodigy Yorktown Heights NY data center
http://newsgroups.news.prodigy.com
#24
Yousuf Khan wrote:
| Bill Davidsen wrote:
| | I spent a few decades at a major R&D lab; research is proving it can
| | be done, development is finding out how. There's a lot of engineering
| | needed to get photonic computing going. Just as a first thought, I
| | would think that a RISC design would be easier to emulate if that
| | were a goal.
| Or some kind of an embedded RISC core, with no FPU or stuff like that.

We are in agreement on that one. Note my earlier post on x86 emulation.

-- 
bill davidsen
SBC/Prodigy Yorktown Heights NY data center
http://newsgroups.news.prodigy.com
#25
On Fri, 14 Oct 2005 00:43:28 +0200, Bill Davidsen wrote:
| Yousuf Khan wrote:
| | Bill Davidsen wrote:
| | Or some kind of an embedded RISC core, with no FPU or stuff like that.
| We are in agreement on that one. Note my earlier post on x86 emulation.

Maybe, just maybe, someone got their hands on this one's blueprints (espionage?)
http://www.lenslet.com/products.asp

-- 
the penguins are psychotic
aka just smile and wave
#26
On Thu, 13 Oct 2005 13:17:54 -0700, Yousuf Khan wrote:
| So how exactly does holographic memory work?

Do you mean "how is it *supposed* to work"? Remember, you can cut a hologram in half and still have a complete picture, though the S/N ratio is reduced. With this in mind, it's really a method of spreading information over a wide area. Many redundancy methods do similar things. It does sound kewll though. ;-)

-- 
Keith
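[The "spreading information over a wide area" idea can be illustrated with a toy numerical model; this is my own sketch of the analogy, not how holographic media actually encode data, and all the names in it are invented. One value is smeared across many noisy samples the way a hologram smears an image across the whole plate: any subset of the samples recovers the value, but a smaller subset averages away less noise, i.e. lower S/N.]

```python
import random

def record(value, copies=1000, noise=1.0, seed=0):
    """Store `value` redundantly as many independently noisy samples."""
    rng = random.Random(seed)
    return [value + rng.gauss(0.0, noise) for _ in range(copies)]

def reconstruct(samples):
    """Recover the value from any subset of samples by averaging."""
    return sum(samples) / len(samples)

plate = record(42.0)
whole = reconstruct(plate)        # read the whole "plate"
half = reconstruct(plate[:500])   # "cut the hologram in half"
# Both estimates are close to 42.0, but the half-plate reconstruction
# averages fewer samples, so its expected error (the S/N) is worse.
```

Like cutting the hologram: no subset holds "its piece" of the picture, every sample holds a little of all of it, and what degrades gracefully is the signal-to-noise ratio.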
#27
On Thu, 13 Oct 2005 16:04:31 +0200, koko wrote:
| On Thu, 13 Oct 2005 12:45:28 +0200, George Macdonald wrote:
| | Hey, you could hook the bugger up directly to some of that holographic
| | memory I've been hearing great things about... for the past x0 years
| | (choose x according to your age).
| you mean Diamond/ruby/gas based? The one working in labs today?

"Working?" What does it do, exactly?

-- 
Keith
#28
On Thu, 13 Oct 2005 08:40:55 +0200, Ketil Malde wrote:
| keith writes:
| | Jeez! After we get nuclear fusion tackled we'll not need photonics.
| | Processors can then scale to 1.21GW and there won't be any need for
| | photonics. Indeed, my money would be on Mr. Fusion first. After all,
| | it has a 50 year head start.
| Well, if you have a 50 year head start, but are nowhere near the
| finishing line, perhaps you are running in the wrong direction? :-)

Well, CNF hasn't been around 50 years. Perhaps you'd rather invest in that? ;-)

-- 
Keith
#29
On Thu, 13 Oct 2005 22:42:20 +0000, Bill Davidsen wrote:
[snip]
| If Luxtera or anyone else can manage to integrate those with CPUs, it
| would make massive difference to interconnects and might even be used
| inside a chip to reduce latency. My first thought was HyperTransport.
| But 10GHz isn't that fast... people are using that for ethernet in
| labs, based on silicon.

With how many NAND4s in between clocks? GHzGHz.

-- 
Keith
#30
On Thu, 13 Oct 2005 14:52:44 +0200, koko wrote:
| On Thu, 13 Oct 2005 03:31:09 +0200, keith wrote:
| | I understand flashlights (when the power company wants the bill
| | paid), but mirrors?
| You need a lot of little shiny objects in this line of business.

I think you've been staring at too many "little shiny objects".

-- 
Keith