No...that R&D number isn't for combined CPUs and GPUs; it is what Intel is spending on CPU and GPU development, combined into one number.
I have responded (again, very politely) to this in a new thread titled "Marsavian and Wallisweaver". I urge you to read it.
Go to slide 40 and take a look at the breakdown Intel shows for R&D per segment. Intel claims PCCG + DCG R&D adds up to about $2 billion, technology development (process tech) is over $2 billion, "shared processor and graphics" is over $2 billion, and software R&D is over $1 billion.
So, to play it safe, let's assume Intel's "shared" processor + GPU R&D splits evenly between CPU and GPU. This implies processor R&D in excess of $1 billion per year. That is on top of the $2 billion in combined PCCG and DCG R&D, which, once again, let's call $1 billion after we strip out all of the PCCG portion (which may be a mistake, but let's play it conservative).
So, now you've got:
$2 billion/year in process R&D
$1 billion/year in "DCG"-related R&D
$1 billion/year in processor-related R&D (again, the GPU is used in some Xeon E3s, but we'll let that slide for the sake of argument).
This implies an annual R&D run rate of ~$4 billion/year for a standalone DCG, or 37% of revenues. Now, if we assume gross margins of 80% on an ~$11 billion sales base, and we assume that SG&A maybe comes to...$2 billion (it is currently $8 billion per YCharts for all of Intel...I assume it drops dramatically in a DCG-only company), then we are looking at a company with the following profile:
Revs: $11 billion
R&D: $4 billion
SG&A: $2 billion
Gross margin $: $8.8 billion
Operating income = $8.8B - $4B - $2B = $2.8 billion, or ~25% of revenues.
Now, if you want to argue SG&A would be meaningfully less than that...let's call it $1 billion, then you get to $3.8 billion in operating income, or ~34.5% of revenues, roughly in line with the estimate that I gave you the other day.
Still much lower than today's juicy 47% operating margin.
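To make the arithmetic above easy to check (or to plug in your own assumptions), here is a minimal back-of-envelope sketch. All inputs are the figures from this post, not Intel's actual segment financials:

```python
# Back-of-envelope model for a hypothetical standalone DCG.
# All values in $ billions; every input is an assumption from the post.

def standalone_dcg(revenue=11.0, gross_margin_pct=0.80, rd=4.0, sga=2.0):
    """Return (operating income, operating margin) for the sketch."""
    gross_profit = revenue * gross_margin_pct
    op_income = gross_profit - rd - sga
    return op_income, op_income / revenue

# Base case: $2B SG&A
income, margin = standalone_dcg(sga=2.0)
print(f"${income:.1f}B operating income, {margin:.0%} of revenues")
# prints "$2.8B operating income, 25% of revenues"

# Leaner case: $1B SG&A
income, margin = standalone_dcg(sga=1.0)
print(f"${income:.1f}B operating income, {margin:.0%} of revenues")
# prints "$3.8B operating income, 35% of revenues"
```

Swap in your own R&D or SG&A numbers and the conclusion is the same: hard to get back to a 47% operating margin.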
Samsung has been very clear that it has big ambitions as a foundry and in the design of SoCs. Unless it utterly and completely fails, I expect that its MO is to use as few third-party components as possible. Whether it succeeds or not is another matter.
"You don't have the information to make even a half-way accurate estimate. The only reason you try is to prop up a conclusion you have already made."
If it took $200M-$300M to support the development of one 32nm CPU + accompanying chipset per year, then how much do you think it would cost to support:
1. A pipeline of at least 3 low power CPU cores built on 14nm and later?
2. A pipeline of at least 3 "big" CPU cores built on 14nm and later?
3. Chipsets for the products in (2), also on increasingly expensive-to-design-on nodes.
4. Various flavors of SoCs built on the CPU cores in (1), particularly as Intel attacks storage, networking, and comms infrastructure with more customized processors. This includes all custom accelerator blocks.
5. Various flavors of products built on the CPU cores in (2), including those for 1-socket, 2-socket, 4-socket, and 8-socket+ systems? Remember that Intel also plans to start integrating various interconnect fabrics and accelerators into these products, too.
6. Funky new types of memory like the eDRAM found in "Crystalwell"
So you've got CPUs that are more expensive to develop than the Itanium chips, a pipeline that is much wider/deeper than the 2-product Itanium pipeline, a world that is demanding custom accelerators/IP, more sophisticated types of Intel-designed memory, process nodes that simply cost more to design on each generation as the complexity grows, and so on and so forth.
If you think that moving all of these expenses into DCG would be wildly less than $1 billion, then I don't know what to tell you.
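To see why $1 billion is a floor rather than a stretch, here's a tally with purely illustrative per-bucket costs, anchored only to the ~$200M–$300M/yr figure cited for one 32nm CPU + chipset. Every number below is my guess for illustration, not an Intel disclosure:

```python
# Illustrative annual CPU R&D tally (in $ millions). All per-bucket
# costs are guesses scaled off the ~$200M-$300M/yr Itanium datapoint;
# none are actual Intel figures.

buckets = {
    "3 small-core (Atom-class) designs": 3 * 150,
    "3 big-core designs": 3 * 250,
    "chipsets for big-core products": 200,
    "SoC derivatives + custom accelerator blocks": 300,
    "multi-socket server implementations": 300,
    "new memory tech (e.g. eDRAM/Crystalwell)": 100,
}

total = sum(buckets.values())
print(f"Illustrative total: ${total}M/yr")
# prints "Illustrative total: $2100M/yr"
```

Even if you cut every one of those guesses in half, you're still at or above the $1 billion mark.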
Also, re CCDO and IDC, I know people on these teams. These teams are absolutely gigantic as Intel has some of the largest and most skilled physical design teams in the world. Also, don't forget that we are in a very competitive CPU environment (and inflation is at work), so engineers today are probably paid much more in dollar terms today than they were during Itanium.
"If it takes ~$200m annually to keep a cpu pipeline going then double that if you want to include Atom as well as Core although in my view of the new server-only Intel Atom/Avoton would be dropped as they offer no performance/power advantage over LV Xeons"
No, it took ~$200M annually according to Chipguy to do Tukwila chipset + Poulson (IIRC). The Xeon/Core (particularly E3, and the new Broadwell SoC/Avoton line) development pipeline is broader and deeper and the flavors of implementations of those cores in finished products are much wider, and I am confident that CCDO/IDC (where the Core CPUs are designed) are much larger teams than the Itanium teams. Also, consider that going forward Intel is designing on 22nm/14nm/10nm which should be much more difficult to design on than the 32nm that the last Itanium was built on.
$1 billion is not unreasonable at all.
Apple signed on to whomever it is using to build its chips LONG ago. You do need to know what process you're actually building your chip on, after all.
Samsung is bolstering its in-house chip development capabilities. Very doubtful you'll see Samsung go "all Intel" on anything.
See, I write you up something courteous and polite and you respond with yet more insults. Having a discussion with you is pointless.
"But, Ash you are the only one who moans, groans and pulls his hair out over every possible negative Intel future event as if they have already happened"
Actually, I moan and groan that Intel's track record in mobility hasn't been stellar, and I moan and groan that they grossly overestimated PC demand in both late 2012 and through about half of 2013. I also am unhappy that they forecast some pretty impressive market share gains during 2013 in tablets/phones that never materialized. Intel, which is generally known for its impeccable forecasting skills, has been getting it wrong pretty often lately.
With BK, the reset button has been hit, and Wall Street is cautiously optimistic, waiting to see what moves he makes and how competitive he can make Intel the company...and Intel the stock. I think he'll do well, but it'll take a few beat 'n raise quarters to really get WS excited about the company/stock again.
"But for people who bought at $55 when you said it was going to $60 not so much. Anyone can play the hindsight game, Ash."
How have your attempts to play the foresight game worked out?
You were pumping them and claiming that Intel was "brilliant" for "fabbing ARM chips", so you're equally guilty of buying into it.
I still think that ARMH has a solid chance of hitting $60 if this bull market continues, but not high enough of a chance that I'd bet my own money on it.
But would I be going short here? Nope. Let me ask you something...if you so truly believe that ARMH is a bubble about to pop, then why not short the stock? Unlike your small beer-money put, I once had a fairly substantial short position in ARMH which I was very lucky to have been able to cover when Intel won the Galaxy Tab 3. When all was said and done, I made a decent amount of money being short ARMH as I had botched a number of trades beforehand.
Are you an investor, or do you simply enjoy ruling the message boards? Stop issuing the useless "Humpty Report" and tell us what trades you're making and why. If you're not actively trading, provide fundamental analysis and participate in the interesting discussions that occasionally pop up on here. The exchange between myself and mars would have been a great time for you to demonstrate your knowledge of tech that you claim WS so badly lacks ;-)
The jury is still out on whether Intel is going to build ARM competitors to its own products. Foundry deals take *years* to come to fruition, as anybody investing in this space should be aware.
The guy pumps an NVDA and QCOM deal when these two players are direct competitors to Intel and both sell royalty-bearing ARM chips. Absolutely unbelievable that this guy claims to know more than everyone else.