I've been reading about IBM's new technology that stores a bit on just twelve atoms - good enough to store a whole byte on 96 atoms. I can follow the arithmetic that far. Then the numbers decide to get funky. The linked story and another on CNN say that current technology takes about a million atoms to store a bit. The CNN story adds that it takes half a billion to store a byte. Hmmm?
These numbers imply, say both stories, that we should now be able to achieve data densities 100 times greater than present technology. Elsewhere, the linked story says it takes only 1/83,000 as much space to store a bit under the new scheme. Hmmm again?
I seem to get 1,000,000/12, or roughly 83,333, which at least squares with the 83,000 figure, and 500,000,000/96, or a little more than 5,208,333. Neither of those is 100. No wonder we old people have so much trouble keeping up with technology.
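For anyone who wants to check the stories' figures themselves, here is a quick sketch of the arithmetic. The constants are the numbers quoted above; the variable names are mine.

```python
# Atoms-per-bit figures as quoted in the two stories.
old_atoms_per_bit = 1_000_000       # "about a million atoms to store a bit"
old_atoms_per_byte = 500_000_000    # CNN: "half a billion to store a byte"
new_atoms_per_bit = 12              # IBM's twelve-atom bit
new_atoms_per_byte = 96             # 8 bits x 12 atoms

# Per-bit improvement: close to the story's "1/83,000 as much space".
print(old_atoms_per_bit / new_atoms_per_bit)    # about 83,333

# Per-byte improvement, taking CNN's half-billion figure at face value.
print(old_atoms_per_byte / new_atoms_per_byte)  # about 5,208,333

# The stories' own numbers also disagree with each other: a million
# atoms per bit implies only 8 million per byte, not half a billion.
print(old_atoms_per_byte / (old_atoms_per_bit * 8))  # a factor of 62.5 apart
```

Neither ratio comes anywhere near the "100 times greater" headline figure, and the two old-technology numbers are off from each other by a factor of 62.5 besides.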