Quote:
Consumed power during runtime is nothing compared to what is
wasted during manufacturing. Therefore all these CPUs should have
a much higher carbon footprint than the x86's.
Eh? That is probably false. I don't know the daily yields of CPU factories, but I'd guess it's at least in the hundreds of CPUs/day (probably in the tens of thousands, but let's play safe). Assuming a factory draws a few hundred kW of power (probably MW, but again let's play safe), that means a new CPU "cost" maybe a few tens of kWh of energy to produce. I did try to find numbers for these, but Google wasn't much help :-/
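Just to make the back-of-envelope math explicit, here's a quick sketch. The factory power draw and daily yield are my conservative guesses from above, not real data:

```python
# Rough per-CPU manufacturing energy, using the conservative guesses above.
# Both input figures are assumptions, not measured data.
factory_power_kw = 300   # assumed factory draw: a few hundred kW
cpus_per_day = 300       # assumed yield: hundreds of CPUs/day

energy_per_day_kwh = factory_power_kw * 24  # kWh consumed per day
energy_per_cpu_kwh = energy_per_day_kwh / cpus_per_day

print(f"~{energy_per_cpu_kwh:.0f} kWh per CPU")  # ~24 kWh
```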
However, assuming people use their computers for years -esp. in a cluster like we're talking about- running 24/7, let's say 2 years before an upgrade, I think the comparison is pointless. An ARM CPU drawing 7W would consume far more than a few kWh over its two years of runtime (~122 kWh, to be exact). Compare that to a modern Intel CPU at 80W, which amounts to ~1.4 MWh over the same 2 years.
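Same kind of sketch for the runtime side; the wattages are the ones from my post, and I'm assuming 24/7 operation for 2 years:

```python
# Runtime energy over 2 years of 24/7 operation.
HOURS_2_YEARS = 24 * 365 * 2  # 17,520 hours

for name, watts in [("ARM", 7), ("Intel", 80)]:
    kwh = watts * HOURS_2_YEARS / 1000
    print(f"{name}: {kwh:.1f} kWh")  # ARM: 122.6 kWh, Intel: 1401.6 kWh (~1.4 MWh)
```

So even with generous assumptions for the factory, runtime energy dwarfs manufacturing energy within a couple of years.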
Quote:
The x86's won't stall in development if you think that? In five years they may still be ahead of ARM..
Maybe, maybe not. Considering their recent pace, I think ARM beats Intel in rate of progress -not in absolute terms. What I mean is that ARM CPUs have been improving faster than Intel's did when Intel was reaching the same performance levels.
It will definitely be an interesting race, that's for sure. :)