Intel has snatched rival AMD’s former SVP and Chief Architect of its Radeon GPU division, Raja Koduri, and tasked him with heading up the new Core and Visual Computing Group, a division that Intel hopes will deliver discrete GPU cards and integrated graphics to counter Nvidia’s incursion. It looks like Intel is about to try to out-muscle Nvidia’s video cards with its own GPUs.
Koduri, the public face of the Radeon group, bowed out a few months ago, saying he planned to recover from the Ryzen and Vega projects and take some family time. It seems Koduri had a new kind of family in mind, however, as he was poached for the new job by Intel. AMD won’t be amused, but it is an endorsement of its former staffer that Intel is putting him in charge of a group squarely aimed at preventing Nvidia from tearing chunks out of Intel’s business.
Intel is talking about extending its integrated GPUs into edge devices, which is hardly revolutionary, considering they are already on board the CPUs it hopes to ship to power these sorts of gateways and monitoring devices. However, the company is also planning to develop high-end GPUs – hopefully with more success than the i740 and Larrabee (which eventually morphed into the x86-based Xeon Phi, now losing ground to Nvidia).
Qualcomm’s new Centriq 2400 CPU is another threat that Intel needs to mitigate, as are server-grade CPUs from Cavium, with both Google and Microsoft supporting the ARM-based initiatives. Microsoft’s Project Olympus and its work in the Open Compute community are notable examples, with the second-largest cloud computing player saying it plans to move some of its workload onto ARM CPUs.
Those ARM chips might not be used in the most demanding applications, perhaps appearing only in storage boxes, where ARM’s low-power competence could help slash energy bills for data center operators. But Microsoft has also moved to make Windows compatible with ARM for laptops and desktops – a move Intel has warned Microsoft about, threatening a lawsuit over x86 emulation on ARM.
For a long time, Intel has been able to view all data center compute market growth as assured sales for its Xeon CPUs – the workhorse behind server-based applications. However, newer AI and ML workloads currently favor GPU-based processing, and might eventually move to ASICs and other purpose-built chips like Google’s TPU (Tensor Processing Unit).
Because those new applications contribute to overall growth in demand for data center processing, Intel has to view them as threats to its Xeons. Today, a couple of Xeons might be used in a server rack that houses dozens of GPU accelerator cards from the likes of Nvidia or AMD, whereas a few years ago, Intel would have expected the same rack to be packed to the gills with Xeons in a CPU-only architecture. That paradigm has shifted, and Intel knows it.
In a similar vein, edge computing could damage the overall demand for data center processing of any kind. The bandwidth cost of moving data from the edge to the cloud could act as a strong disincentive for developers, and for latency-sensitive applications there are clear benefits to making data-based decisions at the edge, as the application doesn’t have to ship data to a cloud and then await instructions.
Intel and AMD have also just partnered to develop a new part for laptops and tablets, which combines an Intel CPU with a Radeon GPU in a single package – aimed at manufacturers searching for a powerful graphics option in a sufficiently thin form factor. The exact specifications of the two components are not clear, but Intel’s Embedded Multi-Die Interconnect Bridge (EMIB) technology is responsible for linking the two processors.
The move shows a united front against Nvidia in mobile devices, and comes despite historic hostility between the pair – AMD has long been the underdog, upset at the perceived abuse of Intel’s dominant x86 market position. Demand for PCs has been sluggish in recent years, with forecasts differing but converging on a stall and decline, and a new generation of ultra-thin laptops with powerful graphics capabilities could help turn that around.
Apple is also an AMD fan, and these new parts may well find their way into its PCs. There were also rumors that Apple was considering moving from Intel to AMD for its laptop CPUs – which might have prompted the deal.
Intel doesn’t have much to worry about in the PC market from AMD, thanks to its gargantuan R&D budget and current dominance. Anything AMD’s CPUs (the new Ryzen range) throw at Intel can be countered by a price cut or by releasing the next feature or design that Intel has been sitting on in its labs. While its integrated Iris and GT GPUs do the job for basic tasks, discrete GPUs have been required in desktops for any sort of graphics-intensive work – a paradigm unlikely to change any time soon.
With the new group, it isn’t clear whether Intel plans to adapt Iris to create a PCIe-card product, or to use an entirely new GPU design. Iris doesn’t have a great reputation among GPUs, but if Intel starts rolling out new GPUs, we would expect AMD to respond with some sort of legal challenge – given that it never got the chance to put Koduri on gardening leave. There also seems to be no non-compete clause, which has allowed him to waltz over to Intel.
Intel’s Chief Engineering Officer, Murthy Renduchintala, said “we have exciting plans to aggressively expand our computing and graphics capabilities, and build on our very strong and broad differentiated IP foundation. With Raja at the helm of our Core and Visual Computing Group, we will add to our portfolio of unmatched capabilities, advance our strategy to lead in computing and graphics, and ultimately be the driving force of the data revolution.”
As for Koduri, he said in a series of tweets that he had spent more than two-thirds of his adult life with Radeon, and that the AMD team will always be family. “It will be a massive understatement to say that I am beyond excited about my new role at Intel. I haven’t yet seen anything written that groks the magnitude of what I am pursuing. The scale of it is not even remotely close to what I was doing before.”
Source article posted by Rethink Technology Research.