  #3  
Old December 10th 08, 06:21 PM posted to alt.comp.hardware.overclocking,alt.electronics
omattos
 
Posts: 3
Non-deterministic CPUs

> This kind of redundancy will always be more expensive in time and material
> than reducing the error rate by merely operating components 'in spec'.


I was thinking more of redundancy for performance. Say I have a task
that must be completed in 1 second, but takes 2 seconds on the best
currently available processor, and the task can't be parallelized
(i.e. every part of the task depends on the previous part). By running
multiple processors faster than they were designed for, I can get the
job done in 1 second. And by having several processors run the same
job simultaneously, I can check which result is correct by a majority
vote.
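The voting step described above is straightforward to sketch. This is just a rough illustration, not anything from the original post; the function name `majority_vote` and the use of Python are my own assumptions:

```python
from collections import Counter

def majority_vote(results):
    """Given the outputs of several redundant (overclocked) runs of the
    same computation, return the value produced by a strict majority,
    or None if no value has a majority."""
    value, count = Counter(results).most_common(1)[0]
    return value if count > len(results) // 2 else None

# e.g. three overclocked runs, one of which glitched:
# majority_vote([42, 42, 17]) -> 42
```

Note that with only two processors a disagreement can be detected but not resolved; three or more are needed for the vote to pick a winner.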

This relies on the idea that as the clock speed moves out of the
part's designed operating region, reliability goes down, and the
further out of that region it is pushed, the further reliability
drops.
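How reliable does each overclocked run have to be for the vote to pay off? Assuming independent failures (my assumption, not the post's), the chance that a strict majority of n runs is correct follows a binomial sum, which a few lines make concrete:

```python
from math import comb

def p_majority_correct(p, n):
    """Probability that a strict majority of n independent runs are
    correct, when each run is correct with probability p."""
    k_min = n // 2 + 1  # smallest count that is a strict majority
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(k_min, n + 1))

# With p = 0.9 per run and n = 3 runs:
# 0.9**3 + 3 * 0.9**2 * 0.1 = 0.972, better than any single run.
```

So three runs at 90% individual reliability give about 97% overall, while three runs at 60% give only about 65%: the scheme only helps when each overclocked run is still correct more often than not.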