'Atomisation'
There are several problems with Atom:
i) You need the chipset to do anything with an Atom. In applications that need two chips anyway, this isn't really a disadvantage, but traditionally ARM chips have also suited applications where one chip is sufficient - where the display element isn't complex, or where the display already has its own driver. An ARM chip is always likely to win in these areas.
ii) The low end of the Atom lineup is somewhat expensive, and the high end is very expensive compared with other potential solutions.
iii) And I'm not quite clear whether this is just unclear marketing or genuine corporate hubris, but Atom isn't really a product for general embedded applications. For that, you need features like counter/timers, A/D converters, flexible interrupt architectures, and so on. Intel 'markets' Atom as if it can cope with general embedded work when it is only really suitable for a distinct subset: applications where low-ish power is a necessity but not very low power, and where you need a somewhat complex display but not a very complex or a trivial one.
On the other hand, Atom is an x86 instruction-set part. It is difficult to know how important that is. If you need a proprietary technology - something like Flash - and the proprietary hegemon behind that technology will not support anything other than x86, then you need x86 (and what about Nehemiah??). On the other hand, if your build standards encompass things like embedded Linux (or similar), Qtopia, etc., maybe you don't need x86 at all.
In a sense, this is Intel's mission for Atom: stop the true embedded processors from eating into x86's market from the bottom, because if something like ARM moves from smartphones up to 'internet in your pocket' devices, Intel probably can't win that market back while maintaining its margins. If, on the other hand, Intel can keep the really low-powered devices from gaining critical mass in this area, then in, say, two years it can introduce a proper product.