ARM-based desktop computer ? (Hybrid computers ?: Low + High performance ;))


Skybuck Flying[_3_]
June 8th 10, 10:58 AM
Hello,

Today Apple "released" the iPhone 4.0... I believe it has something like a
1.0 GHz processor...

I find that quite impressive, 1.0 GHz in such a small package and
non-overheating ???

Maybe too good to be true ?

I wonder what the future will bring ?...

Will we see the rise of "low power/low heat/low noise desktop computers"
being powered by ARM-based processors ?

Is this the end of Windows because it doesn't work on ARM processors ?

Can Intel Atom processors compete with ARM processors ?

What's AMD's answer to Atom and ARM ?

Can an AMD/Intel single 1.0 to 2.0 GHz core be compared to an ARM 1.0 to 2.0
GHz core ? Would they both be about as fast... or would one win over the
other ?

To me 1.0 to 2.0 GHz seems to be the magical number/milestone/border/hurdle
towards a good to great desktop experience.

For 99.9% of my daily PC activity 1.0 to 2.0 GHz would be enough... this
almost includes video processing at modest resolutions 640x480 or so...
maybe 800x600, maybe even 1024x768... further enhancements/optimizations
might enable very large resolutions too but don't count on it ;)

For 1920x1200... 4.0 GHz is probably needed to run smooth and cool
(strangely enough)... Or a really cool 2.0 GHz processor ;)

Only gaming really needs stronger graphics cards and stronger CPUs to do
more...

However software/technology does advance, so maybe I could be wrong and maybe
people will need more processing power... but I don't think so...

Therefore assuming all people need more processing power is a bit
dangerous...

A good secondary strategy is to focus on low power/low heat/low noise/weaker
processors to accommodate the non-gaming, non-high-performance
tasks/crowd ;)

I do want a low heat, low noise, low power computer, but I also want a
strong, high-performance computer which can do heavy tasks.

I would love to have a computer which can be totally quiet thanks to, for
example, an ARM processor or maybe even an Atom processor.

I would also love it if the fans only go on when they're really needed, like
gaming or maybe huge videos.

Thus I guess a system which can do both would be ideal for me.

My current PC is already able to do this a little bit:

AMD Dual Core Processor and NVIDIA 7900 GTX graphics card.

But these two technologies do not take it far enough.

The processor still needs a fan to spin.

The graphics card still needs a fan to spin.

The desktop still needs fans to be constantly on... <- This is the biggest
problem probably.

Therefore what is needed is:

1. A motherboard which can control the desktop fans and even shut them down.

2. Processors/Graphics cards which can do the same.

3. Special software which can regulate this, or special hardware (a rough
sketch of such software follows this list).

4. Debuggers to make sure no evil "shut the fans down during heat" bug is in
there to kill hardware ;)

5. Temperature meters everywhere for safety...

6. Emergency shutdown in case of emergency/accidental overheat.

7. Fan spin up failure detection.

8. Maybe even blocked air flow detection.

9. Maybe even unacceptable noise detection and throttling of hardware to
reduce noise in return for lower performance.

10. This would require microphones which might be too privacy-paranoid ;) So
not a good idea.

11. Maybe even built-in temperature displays in/on the desktop case to show
the constant temperature of hardware at different locations in the case, to
feel "safe" :)
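
Most of points 1 through 7 are really just a small monitoring loop plus a fan
curve. A minimal sketch of the idea in Python, assuming a Linux box whose
motherboard sensor chip is exposed through the kernel's hwmon sysfs interface
(the hwmon number, sensor names and PWM files below are placeholders and vary
per board, so check /sys/class/hwmon/*/name on your own system first):

#!/usr/bin/env python3
# Toy fan controller: read a temperature from hwmon, pick a PWM duty cycle,
# and flag point 7 (fan commanded on but not spinning).  Needs root.
import time
from pathlib import Path

HWMON = Path("/sys/class/hwmon/hwmon0")   # placeholder: your sensor chip
TEMP  = HWMON / "temp1_input"             # millidegrees Celsius
PWM   = HWMON / "pwm1"                    # duty cycle, 0..255
FAN   = HWMON / "fan1_input"              # measured RPM

def read_int(path: Path) -> int:
    return int(path.read_text().strip())

def duty_for(temp_c: float) -> int:
    """Fan curve: off below 45 C, full speed above 75 C, linear in between."""
    if temp_c < 45.0:
        return 0
    if temp_c > 75.0:
        return 255
    return int(255 * (temp_c - 45.0) / 30.0)

while True:
    temp_c = read_int(TEMP) / 1000.0
    duty = duty_for(temp_c)
    PWM.write_text(str(duty))   # pwm1_enable usually has to be set to manual first
    time.sleep(2)
    if duty > 0 and read_int(FAN) == 0:
        print("fan commanded on but not spinning -- throttle or shut down here")

The emergency-shutdown part of point 6 (and usually a firmware fan curve much
like the one above) is normally handled by the BIOS/EC already, so software
like this mostly just has to cover the "stop the fans completely when idle"
policy.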

Ultimately HEAT is bad though... even for the high performance situation.

HEAT is unpleasant for human beings... it can become too hot in summer.

Assuming HEAT can simply be expelled from the CASE and not be a problem could
be the wrong thing to do.

HEAT also leads to bigger fans on buildings which is bad too.

However...

In the winter HEAT can work as a heating device... and the problem is less
of an issue... it can actually be nice.

Therefore producing more HEAT in winter is more acceptable... unless melting
the polar caps is a bad idea ! ;) :)

And yup it could be bad... many countries facing floodings ! ;) :)

So maybe ultimately HEAT = BAD = EVIL.

Try to use materials and designs which give great processing power but little
to no heat ;)

New inventions are made all the time....

Are intel/amd/ati/nvidia up to the task ?

Or will ARM take the cookie and the cake ?! ;) :)

(Just some random thoughts of me on the 1.0 GHz in a tiny package ;) :):):)
There is even talk of 1.5 GHz in iphone 5.0 wow ! ;) :))

Please feel free to comment within the lines and fill in the blanks,
misconceptions, pipe-dreams, yes/no etc ;) :)

Bye,
Skybuck =D

Skybuck Flying[_3_]
June 8th 10, 11:30 AM
One problem which I see people mention is:

x86 software does not work on ARM...

A solution for this problem is the following (Not my idea, but some crazy
noob ?):

An x86 compiler which compiles x86 to ARM code.

It's a bit a crazy idea perhaps...

But x86 is an instruction set/assembly language after all as well...

And languages can be ported/translated right ? ;)

Then for example Microsoft or the users themselves could do it.

Microsoft's Windows on ARM could detect that the executable being installed
or run is actually an x86 executable...

Windows then starts the x86-to-ARM compiler and compiles the x86 binary to
ARM binary... saves it and then runs it.
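
A rough sketch of that detect / translate-once / cache / run flow, written as
toy Python just to make the steps concrete (the "MZ" check is the real PE file
signature, but the "x86-to-arm-translate" tool is purely hypothetical):

import hashlib
import subprocess
from pathlib import Path

CACHE = Path("~/.x86_to_arm_cache").expanduser()

def is_x86_pe(exe: Path) -> bool:
    # Windows PE executables start with the "MZ" DOS header; a real loader
    # would also read the COFF machine field to see which CPU they target.
    return exe.read_bytes()[:2] == b"MZ"

def run_executable(exe: Path) -> None:
    if not is_x86_pe(exe):
        subprocess.run([str(exe)])                 # already native, just run it
        return
    CACHE.mkdir(exist_ok=True)
    key = hashlib.sha256(exe.read_bytes()).hexdigest()
    translated = CACHE / (key + ".arm")
    if not translated.exists():
        # one-time translation; "x86-to-arm-translate" is a made-up tool name
        subprocess.run(["x86-to-arm-translate", str(exe), "-o", str(translated)])
    subprocess.run([str(translated)])              # reuse the cached ARM binary

Static translation like this breaks down for self-modifying or JIT-heavy
programs, which is why real systems tend to translate at run time instead, as
comes up further down the thread.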

With ARM it is possible to add additional co-processors, so maybe
co-processors could handle some x86-specific tasks... for compatibility's
sake or so...

Maybe even an ARM/x86 hybrid ! LOL :)

Or how about the ultimate crazy ****:

PowerPC/ARM/x86/Motorola/ATI/Nvidia hybrid ?! LOL.

Take the best of all or so :)

Bye,
Skybuck =D

Joel Koltner
June 8th 10, 06:27 PM
"Skybuck Flying" > wrote in message
b.home.nl...
> I find that quite impressive, 1.0 GHz in such a small package and
> non-overheating ???

That's transistor scaling for you.

> Will we see the rise of "low power/low heat/low noise desktop computers"
> being powered by ARM-based processors ?

To some extent, yes. It's already happening -- look at the success of Intel's
Atom CPUs.

> Is this the end of Windows because it doesn't work on ARM processors ?

No. Windows (NT and beyond) has always had a "hardware abstraction layer" --
you'll recall that NT and XP used to come in versions for MIPS and Alpha CPU
architectures. Writing an ARM HAL is no big deal at all for Microsoft --
there are rumors they might have already started looking in that direction, e.g.:
http://www.engadget.com/2009/05/01/arm-ceo-hints-at-possible-windows-7-support-for-arm-processors/

> Can intel atom processors compete with ARM processors ?

Sure. "Compete" means so much more than just MIPS per dollar or MIPS per watt
today -- pricing is very important as well. Look at how "successful"
Microsoft was in almost completely killing Linux's inroads into netbooks after
they started offering XP Home to netbook manufacturers for <$10/copy.

> What's AMD's answer to atom and arm ?

They'd tell you Neo, although to date it doesn't compete all that well.

> For 1920x1200... 4.0 GHz is probably needed to run smooth and cool
> (strangely enough)... Or a really cool 2.0 GHz processor ;)

Well, if you force the CPU to do all the rendering, perhaps so. But plenty of
systems ran at those resolutions back in the days of hundreds-of-MHz CPUs with
the help of graphics processor ICs to do the "heavy lifting" for the display.
Intel has, for years now, been fighting a battle with the GPU manufacturers
over whether rendering is best done in their CPUs or in the GPUs. (Notice how
Intel doesn't have any discrete GPUs to compete with those from AMD or nVidia...)

> However software/technology does advance so maybe I could be wrong an maybe
> people will need more processing power... but I don't think so...

Most of technology today is driven much more by what people "want" than what
they "need." Twenty years ago who would have thought that for ~$50 then (~$99
now) people would be getting CLOCK RADIOS that consist of a 454MHz CPU, 64MB
of RAM, and 54Mbps wireless networking? But here we are today... and here's
Chumby: http://www.costco.com/Browse/Product.aspx?Prodid=11529965

> I would love to have a computer which can be totally quite thanks to for
> example a ARM processor or maybe even an ATOM processor.

There are entire web sites devoted to building silent PCs. If you're willing
to live with largish heatsinks, it's actually not particularly difficult to
do -- fans are ubiquitous because they're small and cheap and effective
compared to the brute-force alternative.

> I would also love it if the fans only go on when it's really needed like
> gaming or maybe huge video's.

Pretty much all laptops and some desktop machines do this.

> Therefore what is needed is:
> 1. A motherboard which can control the desktop fans and even shut them down.
> 2. Processors/Graphics cards which can do the same.
> 3. Special software which can regulate this or special hardware.

Already done.

> 4. Debuggers to make sure no evil "shut fans down during heat" is in there
> to kill hardware ;)

Mmm... ok...

> 5. Temperature meters everywhere for safety...

Already done.

> 6. Emergency shutdown in case of emergency/accidental overheat.

Already done (by BIOS).

> 7. Fan spin up failure detection.

Already done (usually by BIOS).

> 8. Maybe even blocked air flow detection.

Generally not worth the cost -- the system has enough thermal inertia that the
overheat detectors will indirectly detect this before anything fries.

> 9. Maybe even unacceptable noise detection and throttling of hardware to
> reduce noise in return lower performance.

Somewhat done. E.g., most hard drives have "best acoustic performance" and
"best performance" modes.

> 10. This would require microphones which might be too privacy-paranoya ;) So
> not a good idea.

I suppose so.

> 11. Maybe even build in temperature displays in/on the desktop case to show
> constant temperature of hardware at different locations
> in the case to feel "safe" :)

Already available.

> Therefore producing more HEAT in winter is more acceptable... unless melting
> the polar caps is a bad idea ! ;) :)

To a pretty good approximation, your computer produces no more heat in winter
than in summer... it's just easier to move around when there's a large
temperature difference.

> So maybe ultimately HEAT = BAD = EVIL.

No, "lack of heat gradients" might be bad or evil, but heat itself is neither.
All the heat that will ever be is already here in the universe -- and the
universe is a Good Thing -- it's just that it's slowly all mixing together,
and once it's the same everywhere... we're cooked. :-)

Keeping as much heat locked up in, say, chunks of coal rather than just
heating up the air is rather useful.

> Are intel/amd/ati/nvidia up to the task ?

Yes.

> Or will ARM take the cookie and the cake ?! ;) :)

No.

---Joel

Joel Koltner
June 9th 10, 02:47 AM
"Skybuck Flying" > wrote in message
b.home.nl...
> The motherboard has only one temperature sensor as far as I know ?

For a motherboard in 2006, yeah, that's pretty likely.

> I also like to be able to see which parts of the motherboard are becoming
> the hottest.

If you save up your allowance you could purchase a thermal imager? :-)

> Also my hardware from 2006 doesn't fully shutdown the fans ;) the spindle
> just slowly...
>
> Yet you say it has already be done... I doubt it... but if I am wrong...:

I've seen many a laptop that completely stops its fans when they're not
needed... but personally not a desktop. I don't know why...

> Also my AMD X2 3800+ Dual Core CPU definetly does not detect CPU spin-up
> failure ! ;)

Blame your BIOS.

> Also I rather prefer not clunky big heatsinks... it's just heavy... risk of
> breaking motherboard... and it don't look so nice... it might also
> obstruct the airflow if it needs to scale up.... Big Clunky Heatsinks are
> definetly a NO-NO for me ;) :) =D <- They are windscreens... windscreens are
> evil inside a pc ;) :) I need all the wind I can get in my PC to cool it
> down... unless I am in the desert or so which I am not (yet) lol :)

That's kinda the problem with PCs... since it's "any motherboard, any CPU
cooler, any case, hopefully it'll be cool enough?" you tend to end up with a
lot of brute-force solutions like big fans rather than more elegant designs
where heat flow is more precisely engineered to go certain places. (Even the
old IBM PS/2 line had some very nice ducting in it...)

> Well you have made some claims that some to even all if this has already be
> done... I highly doubt that... but please do provide links to prove me wrong
> ;)

Check out http://www.silentpcreview.com/ -- those guys are serious about quiet
computing.

> Lastly it's amazing to see how fast Apple has launched new products.... like
> 4 iphones in just 3 years ? Plus an iPad and maybe some PC like thingies...
>
> Doesn't sound like much... but I think it is... it requires all of this
> enginering of hardware and software... quite impressive ?!?

People working at Apple all sign a contract stipulating that Steve Jobs gets
to bowhunt you and your family on his private island if your performance
review is deemed unsatisfactory.

But seriously, yes, Apple's execution has been impressive -- and while I don't
think that much of the man personally, one has to give credit that a large
part of it is directly linked to Jobs.

> I do wonder what happened to Steve Jobs though... he so thin ?!? Did all
> that WIFI give him cancer or so ?!?

No, but he had a liver transplant last year. Takes the wind out of most
everyone for awhile...

---Joel

MooseFET
June 9th 10, 03:25 AM
On Jun 8, 6:30 pm, "Skybuck Flying" > wrote:
> One problem which I see people mention is:
>
> x86 software does not work on ARM...
>
> A solution for this problem is the following (Not my idea, but some crazy
> noob ?):
>
> An x86 compiler which compiles x86 to ARM code.

Such programs already exist. It is a clever trick that is used to make
fast simulations of the ARM on a PC. Doing it the other way can also
be done. It wouldn't be super fast, but if you weren't trying to run
a complete Windows OS, it could be fast enough to be used.

Since the ARM can be had as part of an FPGA, you could add extra
stuff to the standard ARM to make the process go a little faster.

Andrew Reilly[_2_]
June 9th 10, 06:48 AM
On Tue, 08 Jun 2010 19:25:33 -0700, MooseFET wrote:

> On Jun 8, 6:30 pm, "Skybuck Flying" > wrote:
>> One problem which I see people mention is:
>>
>> x86 software does not work on ARM...
>>
>> A solution for this problem is the following (Not my idea, but some
>> crazy noob ?):
>>
>> An x86 compiler which compiles x86 to ARM code.
>
> Such programs already exist. It is a clever trick that is used to make
> fast simulations of the ARM on a PC. Doing it the other way also can be
> done. It wouldn't be super fast but if you weren't trying to run a
> complete Windows OS, it could be fast enough to be used.

Back in '87 or so I had an Acorn RISC "PC", which had an ARM-2, and a "PC
emulator". It simulated an 8088 and the PC's basic hardware well enough
that I was able to use it to run a "scientific" word processor to write
my undergraduate thesis. The "feel" was about as fast as an original
4.77MHz PC, but I didn't run any benchmarks. I'm fairly sure that it
would have been a straight interpreter: the machine didn't really have
enough RAM to be mucking about with JIT compilation. This on a chip with
no cache, no 16-bit memory operations, and which ran the processor clock
at 4MHz or 8MHz depending on whether the DRAM-fetch in progress at the
time was in-page or doing a row access...

I thought it was quite a spectacular achievement.

Cheers,

--
Andrew

Skybuck Flying[_3_]
June 9th 10, 08:34 AM
> Check out http://www.silentpcreview.com/ -- those guys are serious about
> quiet computing.

Hmm... that's mustard after the meal (too little, too late)...

Computer hardware needs to be designed from the start for low heat/low noise
and so forth... :)

> But seriously, yes, Apple's execution has been impressive -- and while I
> don't think that much of the man personally, one has to give credit that a
> large part of it is directly linked to Jobs.

He has gained some respect from me... he seems a more honest guy than I had
expected him to be... at least in his presentations.

However if the world turns into one big cancer-infected place because of all
the mobile phones and wifi's and gsm's and so forth, then nope :)

May he rot in hell then forever as well ;) :)

>> I do wonder what happened to Steve Jobs though... he so thin ?!? Did all
>> that WIFI give him cancer or so ?!?
>
> No, but he had a liver transplant last year. Takes the wind out of most
> everyone for awhile...

What was wrong with his ex-liver ? Cancer from the wifi ? ;) :) What did he
do with his ex-liver ? Bottle it for memories ? :P***

Ain't he afraid of getting cancer from all that wifi ?

Bye,
Skybuck.

Torben Ægidius Mogensen[_2_]
June 9th 10, 09:07 AM
Andrew Reilly > writes:


> Back in '87 or so I had an Acorn RISC "PC", which had an ARM-2, and a "PC
> emulator".

That must have been the Archimedes A310. The RISC PC did not come out
until some time in the 90's, and this used an ARM610.

> It simulated an 8088 and the PC's basic hardware well enough
> that I was able to use it to run a "scientific" word processor to write
> my undergraduate thesis. The "feel" was about as fast as an original
> 4.77MHz PC, but I didn't run any benchmarks. I'm fairly sure that it
> would have been a straight interpreter: the machine didn't really have
> enough RAM to be mucking about with JIT compilation.

It was, indeed, an interpreter, but of the 80186 instruction set. File
transfers etc. were a lot faster than on a 4.77MHz PC, but a few things
were a bit slower. The overall speed was fine for running the
occasional DOS application (I used it mostly for games), but for serious
work, you would use native applications.

> This on a chip with no cache, no 16-bit memory operations, and which
> ran the processor clock at 4MHz or 8MHz depending on whether the
> DRAM-fetch in progress at the time was in-page or doing a row
> access...
>
> I thought it was quite a spectacular achievement.

Indeed it was. Nowadays, you would use a JIT (similar to Digital's
fx!32), so the speed would be better. ARM uses arithmetic flags similar
to x86, so it is easier for ARM to emulate x86 efficiently than it is
for, say, MIPS to do so.
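
To make the flags point concrete, here is the bookkeeping an interpreter has
to do (or the code a JIT has to emit) for one 32-bit x86 ADD -- an
illustrative Python model only, not taken from any particular emulator:

MASK32 = 0xFFFFFFFF

def x86_add32(a, b):
    """Model of x86 `ADD r32, r32`: the result plus the ZF/SF/CF/OF flags it sets."""
    r = (a + b) & MASK32
    flags = {
        "ZF": r == 0,                                   # zero
        "SF": bool(r & 0x80000000),                     # sign
        "CF": (a + b) > MASK32,                         # unsigned carry out
        "OF": bool((~(a ^ b) & (a ^ r)) & 0x80000000),  # signed overflow
    }
    return r, flags

# An ARM translator can map all of the above to a single ADDS instruction,
# because ARM's N/Z/C/V flags line up with SF/ZF/CF/OF for addition.  On a
# flag-less ISA such as MIPS, each flag a later branch might consume has to
# be computed with separate instructions, which is where the overhead comes in.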

Torben

Joel Koltner
June 9th 10, 05:51 PM
"Skybuck Flying" > wrote in message
b.home.nl...
> However if the world turns into one big cancer infected place because of all
> the mobile phones and wifi's and gsm's and so forth than nope :)

I guarantee you that, whatever the potential health hazards posed by WiFi,
GSM, etc. may be, there are orders of magnitude more lives saved by wireless
technology than lost due to it.

Robert Myers
June 9th 10, 07:06 PM
On Jun 9, 12:51 pm, "Joel Koltner" >
wrote:
> "Skybuck Flying" > wrote in message
>
> b.home.nl...
>
> > However if the world turns into one big cancer infected place because of all
> > the mobile phones and wifi's and gsm's and so forth than nope :)
>
> I guarantee you that, whatever the potential health hazards posed by WiFi,
> GSM, etc. may be, there are orders of magnitudes more lives saved by wireless
> technology than lost due to it.

I don't know. How often do you drive around people who drive while
using a wireless gadget? I think I'd want to do some research before
making any guarantees.

Robert.

Joel Koltner
June 9th 10, 07:56 PM
"Robert Myers" > wrote in message
...
On Jun 9, 12:51 pm, "Joel Koltner" >
wrote:
>> I guarantee you that, whatever the potential health hazards posed by WiFi,
>> GSM, etc. may be, there are orders of magnitudes more lives saved by
>> wireless
>> technology than lost due to it.
>I don't know. How often do you drive around people who drive will
>using a wireless gadget? I think I'd want to do some research before
>making any guarantees.

It's a money-back guarantee, where if I'm wrong, I give you back all the money
you paid for my opinion. :-)

Skybuck Flying[_3_]
June 10th 10, 05:03 AM
"Joel Koltner" > wrote in message
...
> "Skybuck Flying" > wrote in message
> b.home.nl...
>> However if the world turns into one big cancer infected place because of
>> all the mobile phones and wifi's and gsm's and so forth than nope :)
>
> I guarantee you that, whatever the potential health hazards posed by WiFi,
> GSM, etc. may be, there are orders of magnitudes more lives saved by
> wireless technology than lost due to it.

What kind of hog-wash is this ? :)

Bye,
Skybuck.

Dave Platt
June 10th 10, 06:15 AM
In article e.nl>,
Skybuck Flying > wrote:

>> I guarantee you that, whatever the potential health hazards posed by WiFi,
>> GSM, etc. may be, there are orders of magnitudes more lives saved by
>> wireless technology than lost due to it.

>What kind of hog-wash is this ? :)

I suspect that Joel was referring to (e.g.) the number of lives which
have been saved, because somebody was able to call for help quickly on
a cellular telephone, rather than having to drive five miles down the
road to the nearest vandalized payphone. Getting help on the way one
or two minutes faster makes a big difference in the survival rates for
severe trauma, heart attacks, etc.

Skybuck, before you go accusing WiFi and cellphones and wireless in
general of causing cancer, you really ought to do some actual
*research* on the subject, OK? Go look up the actual studies
published in the last five years, and see if there's any real
correlation between the use of these technologies, and the incidence
of cancer in their users.

I realize that actually doing research (even second-hand) would take
time away from gaming... but you might find it enlightening enough to
be worthwhile.

[And, for crying out loud, Steve Jobs did *not* invent cellphones or
WiFi, and I don't know of any evidence to suggest that the
availability of the iPhone has increased cell-phone usage above what
it would have been if the iPhone had never existed. You really ought
to have a good reason to issue oaths of damnation against somebody!]

--
Dave Platt > AE6EO
Friends of Jade Warrior home page: http://www.radagast.org/jade-warrior
I do _not_ wish to receive unsolicited commercial email, and I will
boycott any company which has the gall to send me such ads!

nik Simpson
June 10th 10, 07:51 PM
On 6/10/2010 12:15 AM, Dave Platt wrote:

>
> [And, for crying out loud, Steve Jobs did *not* invent cellphones or
> WiFi, and I don't know of any evidence to suggest that the
> availability of the iPhone has increased cell-phone usage above what
> it would have been if the iPhone had never existed. You really ought
> to have a good reason to issue oaths of damnation against somebody!]
>

Given the well documented problems of the iPhone on AT&T's network, it
may have even reduced the number of calls, well completed calls anyway ;-)
--
Nik Simpson

Skybuck Flying[_3_]
June 11th 10, 02:33 AM
"Dave Platt" > wrote in message
...
> In article e.nl>,
> Skybuck Flying > wrote:
>
>>> I guarantee you that, whatever the potential health hazards posed by
>>> WiFi,
>>> GSM, etc. may be, there are orders of magnitudes more lives saved by
>>> wireless technology than lost due to it.
>
>>What kind of hog-wash is this ? :)
>
> I suspect that Joel was referring to (e.g.) the number of lives which
> have been saved, because somebody was able to call for help quickly on
> a cellular telephone, rather than having to drive five miles down the
> road to the nearest vandalized payphone. Getting help on the way one
> or two minutes faster makes a big difference in the survival rates for
> severe trauma, heart attacks, etc.

Ok valid points, but I prefer not to get into such situations in the first
place:

1. Accidents along the road: don't be on the road, don't be in a car, plane,
or bus, etc.

2. Stay healthy, eat healthy, breathe healthy, be healthy.


> Skybuck, before you go accusing WiFi and cellphones and wireless in
> general of causing cancer, you really ought to do some actual
> *research* on the subject, OK? Go look up the actual studied
> published in the last five years, and see if there's any real
> correlation between the use of these technologies, and the incidence
> of cancer in their users.

I suspect that the energy in the wifi/gsm/wireless signals goes through the
human body and might trigger DNA changes in certain cells/parts of the body.

The problem is that scientists probably can't scan the entire body for these
changes ?!

If they could scan for such changes then maybe they could prove that
wireless energy is indeed causing DNA changes and therefore could increase
the risk of cancer.

> I realize that actually doing research (even second-hand) would take
> time away from gaming... but you might find it enlightening enough to
> be worthwhile.

I don't game as much as I used to... mostly because pirates like me have
been cut off from online gaming ;) :)

And also maybe I've grown out of it a bit ;)

Of course I will play Doom 4, Quake 5, Crysis 2, Battlefield 3, Call of
Juarez 3 (maybe), the Alien vs Predator 2 remake, Company of Heroes 4 (maybe
even the free online version) and Red Alert 4 ;) (Mostly/especially if they
have new graphics technology ;) and perhaps even sound technology ;))

Those games are mandatory ! =D

> [And, for crying out loud, Steve Jobs did *not* invent cellphones or
> WiFi, and I don't know of any evidence to suggest that the
> availability of the iPhone has increased cell-phone usage above what
> it would have been if the iPhone had never existed. You really ought
> to have a good reason to issue oaths of damnation against somebody!]

Steve Jobs is promoting iPhones/iPads/iPods/iMacs and what not... and
promoting all this wireless stuff... without even blinking about it or
thinking about it.
(And of course integrating it into his products without even blinking or
thinking about it ;))

So seriously he is to blame as well...

He would go free / go to heaven if he did the opposite: warning people to
watch out for those wireless signals ! ;)

But nope... none of that... so that makes him guilty in my book ! ;) :)

Perhaps his own "inventions" will take care of him for all of us...

Perhaps he is driving a nice drive-by-wire car and soon he will face death
by wireless energy ****ing up his CAR !

Such irony would be sweet ! =D LOL.

Bye,
Skybuck ;) =D

Skybuck Flying[_3_]
June 11th 10, 02:34 AM
"nik Simpson" > wrote in message
...
> On 6/10/2010 12:15 AM, Dave Platt wrote:
>
>>
>> [And, for crying out loud, Steve Jobs did *not* invent cellphones or
>> WiFi, and I don't know of any evidence to suggest that the
>> availability of the iPhone has increased cell-phone usage above what
>> it would have been if the iPhone had never existed. You really ought
>> to have a good reason to issue oaths of damnation against somebody!]
>>
>
> Given the well documented problems of the iPhone on AT&T's network, it may
> have even reduced the number of calls, well completed calls anyway ;-)

Yeah good point as well..

Bad wireless/mobile phone service might actually cause lives to be lost...

Instead of trying to get the damn mobile phone working, which of course
fails...

One could have gone to the nearest real phone, gotten some decent service,
and saved lives ! ;)

Bye,
Skybuck =D

Joel Koltner
June 12th 10, 12:15 AM
"Skybuck Flying" > wrote in message
b.home.nl...
> I suspect that the energy in the wifi/gsm/wireless signals go through the
> human body and might trigger DNA changes to certain cells/parts of the body.

Yeah, you and plenty of other people.

It's been extensively researched; the results are generally somewhere between
"it seems quite harmless" and "pretty inconclusive, really hard to say." So
while no one would suggest it's 100% certain that such low-energy EM waves are
harmless, it does seem pretty clear that if they do create harm, it's a very,
VERY small risk in the grand scheme of things.

At some point you have to decide if the conveniences of modern technology are
worth the risk given science's best assessment of what those risks are. None
of your great-great-great grandparents was ever killed by driving down an
interstate highway too fast... although it wouldn't have been unheard of for
them to be killed by something as simple as a relatively small cut on, say,
their foot while walking along a beach that then became infected and
eventually killed them. But just as surely as they'd love to have had
penicillin -- thereby decreasing their risk of death -- they just as surely
would have liked automobiles, despite the well-known increase in the risk of
death from them (especially for young guys like *you*, Skybuck!).

Does it bother you to stand in front of a light bulb? You're getting
*hundreds* of watts there at *many terahertz* after all... makes your WiFi
gear seem absolutely puny!
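
For a rough sense of scale, a back-of-the-envelope comparison of the energy
per photon, E = hf (values rounded):

\begin{align*}
E_{\text{2.4 GHz Wi-Fi}} &\approx 1.6\times10^{-24}\ \text{J} \approx 1\times10^{-5}\ \text{eV}\\
E_{\text{600 THz visible light}} &\approx 4.0\times10^{-19}\ \text{J} \approx 2.5\ \text{eV}\\
E_{\text{typical C--C bond}} &\approx 3.6\ \text{eV}
\end{align*}

So even a visible-light photon from the bulb falls a bit short of what it
takes to break a typical carbon-carbon bond, and a Wi-Fi photon is roughly
five orders of magnitude below that -- which is why the only well-established
effect of microwaves on tissue is plain heating.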

> The problem is that scientists probably can't scan the entire body for these
> changes ?!

The problem is that when you're looking for causal effects down in the noise,
you need a lot of time and a lot of people to average results out enough to
draw any meaningful conclusions and hence such studies are quite expensive for
a questionable benefit.

> I don't game as much as I used to... mostly because pirates like me have
> been cut-off from online gaming ;) :)
>
> And also maybe I grown out of it a bit ;)

Would it be too much to ask that you've grown to appreciate that theft of
intellectual property is just about as bad as theft of material goods?

---Joel

Skybuck Flying[_3_]
June 12th 10, 01:06 AM
What I am worried about is that wifi/gsm/umts signals might become the
"asbestos" of the 21st century.

Asbestos is very dangerous and carcinogenic; if only they had known better
back in those days, it wouldn't have become such a major problem/plague.

It seems none of the lessons of asbestos have been learned by the electronics
industry.

And no, I do not feel guilty about downloading games which I know I would
never have bought anyway...

Bye,
Skybuck.

Robert Myers
June 12th 10, 01:17 AM
On Jun 11, 7:15 pm, "Joel Koltner" >
wrote:
> "Skybuck Flying" > wrote in message
>
> b.home.nl...
>
> > I suspect that the energy in the wifi/gsm/wireless signals go through the
> > human body and might trigger DNA changes to certain cells/parts of the body.
>
> Yeah, you and plenty of other people.
>
> It's been extensively researched; the results are generally somewhere between
> "it seems quite harmless" and "pretty inconclusive, really hard to say." *So
> while no one would suggest it's 100% certain that such low-energy EM waves are
> harmless, it does seem pretty clear that if they do create harm, it's a very,
> VERY small risk in the grand scheme of things.
>
> At some point you have to decide if the conveniences of modern technology are
> worth the risk given science's best asesssment of what those risks are. None
> of your great-great-great grandparents was ever killed by driving down an
> interstate highway too fast... although it wouldn't have been unheard of for
> them to be killed from something as simple a relatively small cut on, say,
> their foot while walking along a beach that then became infected and
> eventually killed them. But just as surely as they'd love to have had
> penicillin -- thereby decreasing their risk of death -- they just as surely
> would have liked automobiles, despite the well-known increase in the risk of
> death from them (especially for young guys like *you*, Skybuck!).
>
> Does it bother you to stand in front of a light bulb? You're getting
> *hundreds* of watts there at *many terahertz* after all... makes your WiFi
> gear seem absolutely puny!
>

As a professor of chemistry at one of our lesser local institutions
pompously informed me, photons from cell phone radiation aren't strong
enough to break the relevant chemical bonds and thus lead to possible
mutations (like, gosh, I never would have known that from all that time
studying physics). What apparently didn't occur to him is that the
structural kinetics of proteins *could* be affected by the relatively
low energy but coherent radiation from wireless devices. It would be
really premature to conclude that there is no risk. People were dying
of variant Creutzfeldt-Jakob disease (the human form of mad cow disease)
in significant numbers by the time clinicians opened their minds wide
enough to accept prions as a cause.

I agree that the kind of wild speculation you are responding to is
unhelpful, but the history of environmental hazards to health is
littered with premature dismissals of potential risks.

Robert.

Jasen Betts
June 16th 10, 12:11 PM
On 2010-06-08, Skybuck Flying > wrote:

> Is this the end of Windows because it doesn't work on ARM processors ?

wince.


--- news://freenews.netfront.net/ - complaints: ---

Maddoctor
July 19th 10, 04:07 PM
"Skybuck Flying" > wrote in message
b.home.nl...
> Hello,
>
> Today Apple "released" the iPhone 4.0... I believe it has something like a
> 1.0 GHz processor...
>
> [snip]

I'm pretty sure AMD has allowed nVIDIA to integrate its GPUs with AMD
processors, especially the family 10h core. The main cause was that Dirk
Meyer hates Intel.



--- news://freenews.netfront.net/ - complaints: ---

nik Simpson
July 19th 10, 08:47 PM
On 7/19/2010 10:07 AM, Maddoctor wrote:
>
> I'm pretty sure AMD has allowed nVIDIA to integrate its GPU to AMD
> processors especially family 10h core. The main cause was Dirk Meyer hates
> Intel.
>

Pretty sure you are wrong there. AMD has its own graphics business that
competes directly with NVidia, so making life easy for NVidia would come
under the "cutting off your nose to spite your face" category.


--
Nik Simpson

Maddoctor
July 22nd 10, 05:13 AM
No, AMD believes competition is good. AMD wants to secure the discrete GPU
market by allowing nVIDIA to do its own APUs.
"nik Simpson" > wrote in message
...
> On 7/19/2010 10:07 AM, Maddoctor wrote:
>>
>> I'm pretty sure AMD has allowed nVIDIA to integrate its GPU to AMD
>> processors especially family 10h core. The main cause was Dirk Meyer
>> hates
>> Intel.
>>
>
> Pretty sure you are wrong there, AMD has it's own graphics business that
> competes directly with NVidia, so making life easy for NVidia would come
> under the "cutting of your nose to spite your face" category
>
>
> --
> Nik Simpson



--- news://freenews.netfront.net/ - complaints: ---

Neil Harrington[_4_]
July 22nd 10, 01:49 PM
"nik Simpson" > wrote in message
...
> On 7/19/2010 10:07 AM, Maddoctor wrote:
>>
>> I'm pretty sure AMD has allowed nVIDIA to integrate its GPU to AMD
>> processors especially family 10h core. The main cause was Dirk Meyer
>> hates
>> Intel.
>>
>
> Pretty sure you are wrong there, AMD has it's own graphics business that
> competes directly with NVidia, so making life easy for NVidia would come
> under the "cutting of your nose to spite your face" category

I don't think so. Yes, AMD bought out ATI four years ago, so now produces
ATI graphics hardware in addition to (and sometimes combined with) its own
hardware. But AMD surely realizes that NVIDIA remains a very popular maker
of graphics hardware, so making it easy for them to integrate would not hurt
their business at all as far as I can see -- it would just give them
"another stall in the marketplace" so to speak. You can already use NVIDIA
cards in AMD systems anyway, so why not?

GMAN[_13_]
July 22nd 10, 10:37 PM
In article >, "Neil Harrington" > wrote:
>
>"nik Simpson" > wrote in message
...
>> On 7/19/2010 10:07 AM, Maddoctor wrote:
>>>
>>> I'm pretty sure AMD has allowed nVIDIA to integrate its GPU to AMD
>>> processors especially family 10h core. The main cause was Dirk Meyer
>>> hates
>>> Intel.
>>>
>>
>> Pretty sure you are wrong there, AMD has it's own graphics business that
>> competes directly with NVidia, so making life easy for NVidia would come
>> under the "cutting of your nose to spite your face" category
>
>I don't think so. Yes, AMD bought out ATI four years ago, so now produces
>ATI graphics hardware in addition to (and sometimes combined with) its own
>hardware. But AMD surely realizes that NVIDIA remains a very popular maker
>of graphics hardware,

Not after their GPU and chipset fiasco. I sit here with an HP tx1000 that
cost over $1300 some years ago, and that now will not boot due to fried
nvidia hardware.

There are endless people suffering from this. Dell, HP, even Apple have
laptops with this hardware in them.

MitchAlsup
July 22nd 10, 11:17 PM
On Jul 19, 10:07 am, "Maddoctor" > wrote:
> I'm pretty sure AMD has allowed nVIDIA to integrate its GPU to AMD
> processors especially family 10h core. The main cause was Dirk Meyer hates
> Intel.

I do not believe that nVidia has integrated with AMD chips; it is a
surprisingly difficult engineering effort -- witness the delay from the ATI
acquisition to integrated chips.

The main cause 'for' buying ATI was that sooner or later the low end
processors were/are going to need a 'pretty' powerful graphics engine,
and these are not really candidates for either multi-core or
daughtercard graphics, which add too much cost (to the low end). This
decision was primarily Hector's and Dirk is just trying to make the
best of what landed on his plate.

Mitch

nik Simpson
July 23rd 10, 02:28 PM
On 7/21/2010 11:13 PM, Maddoctor wrote:
> No, AMD believes competition is good. AMD wants to secure discreete GPUs
> market by allowing nVIDIA to do its own APUs.

There's a big difference between allowing NVidia graphics cards and
chipsets to work with AMD processors, and allowing NVidia to do an
integrated CPU/GPU. The former just requires a functional PCIe
implementation; the latter is a major engineering project.
--
Nik Simpson

Neil Harrington[_4_]
July 24th 10, 10:28 PM
"GMAN" > wrote in message
...
> In article >, "Neil
> Harrington" > wrote:
>>
>>"nik Simpson" > wrote in message
...
>>> On 7/19/2010 10:07 AM, Maddoctor wrote:
>>>>
>>>> I'm pretty sure AMD has allowed nVIDIA to integrate its GPU to AMD
>>>> processors especially family 10h core. The main cause was Dirk Meyer
>>>> hates
>>>> Intel.
>>>>
>>>
>>> Pretty sure you are wrong there, AMD has it's own graphics business that
>>> competes directly with NVidia, so making life easy for NVidia would come
>>> under the "cutting of your nose to spite your face" category
>>
>>I don't think so. Yes, AMD bought out ATI four years ago, so now produces
>>ATI graphics hardware in addition to (and sometimes combined with) its own
>>hardware. But AMD surely realizes that NVIDIA remains a very popular maker
>>of graphics hardware,
>
> Not after their GPU and chipset fiasco. I sit here with a HP tx1000 that
> cost
> over $1300 years ago , that now will not boot due to fried nvidia
> hardware.

That is the Nvidia chipset that included the GPU on a single chip, right?

>
> There are endless people suffering from this. Dell/HP/ even Apple have
> laptops
> with this hardware in them.

I suppose it depends on how exactly the integration would be carried out,
and under whose name it would be marketed. From the OP's post that isn't
clear to me.

I've read there was some talk of Intel buying Nvidia, which might suggest
they were thinking of doing the same thing AMD did with ATI. Maybe AMD
considered making an arrangement of some sort with Nvidia just to head Intel
off at the pass, so to speak. Dunno, just guessing.