10 years ago IBM had an awesome manufacturing capability in chips. I'm not sure they still do. They do tend to be among the leaders in announcing advanced R&D, but what are the costs behind the "new capabilities" (e.g., Cu interconnects) they like to tout? IMHO, Intel is just as advanced but doesn't announce when the first chip functions, rather when the process is both economical and scaling requires it.
Since IBM shut down almost all of its real manufacturing fabs (Fishkill, etc.), does it really have the capability to support AMD in a cost-effective way? I really don't know if they can do it even if the AMD process is robust (and I have serious doubts about that).
They used to be the leader, but they may end up being doomed if they jump on the AMD bandwagon. Only time will tell; I hope you're wrong, because I'm hangin' way out there on Intel.