Raja Koduri, the highly rated former senior vice president and chief architect of AMD's Radeon Technologies Group, has moved over to Intel to drive its integrated graphics efforts.
Officially, Koduri's title at Intel is senior vice president of the Core and Visual Computing Group, general manager of edge computing solutions, and chief architect.
Confirmation of his shift from AMD to its larger rival Intel came late last night, after a day of speculation following his announcement that he was leaving AMD. That announcement had itself come after a period of extended leave, which began following the launch of AMD's Vega-based line of GPUs, at the beginning of October.
According to Intel: "Koduri leads the expansion of Intel's leading position in integrated graphics for the PC market with discrete graphics solutions for a broad range of computing segments. He also leads differentiated IP [intellectual property] across computing, graphics, media, imaging and machine intelligence capabilities for the client and data centre segments, artificial intelligence, and emerging opportunities like edge computing."
AMD CEO Lisa Su took over as head of AMD's Radeon Technologies Group during Koduri's sabbatical, but the company will now be looking to hire a replacement to deliver the graphics roadmap he sketched out.
His return to AMD in 2013 was widely heralded as just the thing to reinvigorate the company's GPU developments, uniting both hardware and software under Koduri at AMD for the first time.
However, the launch last month of AMD's next-generation GPUs, based on the Vega micro-architecture, wasn't quite such a major success, and not just because it had been much delayed. The products more or less only match Nvidia's latest 10-series parts, which have been on the market for more than a year. And because virtual-currency miners have been buying up stocks of their preferred GPUs by the plane-load, the price of AMD-based graphics cards has been pushed to a level where, pound for pound, they compare poorly against Nvidia-based cards.
The decision to go with high-bandwidth memory (HBM) over GDDR5 also put AMD at a disadvantage. In theory, HBM modules should offer higher bandwidth (one of the keys to GPU performance) while using less power and occupying a smaller form factor than GDDR5. In practice, however, the technology has proved more challenging to integrate and more expensive, while so far delivering few of the promised benefits.
At Apple, where he was employed from 2009 to 2013, Koduri worked on building fast mobile GPU hardware that fitted in with Apple's ARM-based A-series microprocessors.