X-bit labs: Can you name the advantages that Intel’s x86 instruction set will bring to graphics processors?
: One of the benefits of the Larrabee architecture’s inherent programmability is that it does not rely on fixed pipelines set out by versions of existing APIs. We expect to support future versions of DirectX and OpenGL. Larrabee’s instruction set will be published, and most programmers are already very familiar with IA. Add support for irregular data structures, cache coherency, and the ability to do software rendering – which gives you the option of using the most appropriate renderer for any single game, any single frame or even any single triangle – and you have a pretty compelling set of features.
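The per-frame renderer choice described above can be sketched in a few lines. This is a hypothetical illustration of the idea, not Larrabee code: the function name, the frame statistics, and the thresholds are all invented for the example, assuming only a software pipeline that can switch strategies at frame granularity.

```python
# Hypothetical sketch: a software pipeline picking a rendering strategy
# per frame based on that frame's measured characteristics.
# All names and thresholds here are illustrative, not a real Larrabee API.

def choose_renderer(frame_stats):
    """Pick a rendering strategy for one frame."""
    if frame_stats["avg_triangle_area"] < 4:       # many tiny triangles
        return "tile_based_rasterizer"
    if frame_stats["overdraw_factor"] > 3:         # heavy overdraw
        return "deferred_shading"
    return "immediate_mode_rasterizer"             # default path

frames = [
    {"avg_triangle_area": 2,  "overdraw_factor": 1.2},
    {"avg_triangle_area": 40, "overdraw_factor": 5.0},
    {"avg_triangle_area": 25, "overdraw_factor": 1.1},
]
for f in frames:
    print(choose_renderer(f))
```

The point is that with a fixed-function pipeline this decision is made by the hardware designer once; with an all-software pipeline it can be made per frame, or even per triangle.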
[Schematic block diagram of Intel Larrabee graphics processor]

X-bit labs: Do you think that GPGPU technology will ever go mainstream? Does it make sense for Intel and software developers to invest in it?
: GPGPU technology like CUDA allows you to use the GPU as an accelerator to the CPU. Such accelerators are not new; there is a place for them, but given the issues they will remain niche. The drawbacks: code development and optimization require major investment; actual application-level performance falls far short of the kernel-level performance often claimed, because of bottlenecks; and code often does not scale forward to the next generation, resulting in costly re-work. Intel’s approach is to offer the increasing parallelism of a GPU with full CPU architecture compatibility – all of the benefits with none of the drawbacks.
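The gap between kernel-level and application-level speedup mentioned above is just Amdahl's law: if the accelerated kernel is only a fraction of total runtime, the rest of the application bounds the overall gain. A minimal illustration (the numbers are made up for the example):

```python
# Amdahl's law: overall speedup when only a fraction of the runtime
# is accelerated. This is why a headline kernel speedup rarely
# translates into the same application-level speedup.

def overall_speedup(kernel_fraction, kernel_speedup):
    """Whole-application speedup from accelerating one kernel."""
    return 1.0 / ((1.0 - kernel_fraction) + kernel_fraction / kernel_speedup)

# Example: a 100x kernel speedup, where the kernel is 60% of runtime.
print(round(overall_speedup(0.6, 100.0), 2))  # prints 2.46
```

Even an infinite kernel speedup in that example could never push the application past 1 / (1 - 0.6) = 2.5x.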
X-bit labs: Did you decide to make Larrabee x86 compatible because in that case it would be easier for the chip to compete on the market of accelerated computing (or GPGPU, if you prefer)?
: Intel Architecture is the gold standard for a programmable device and is well understood by developers everywhere. IA is also what Intel does best. For Intel to build a truly programmable GPU, using IA technology is the obvious choice – so long as it can be done in an extremely power-conscious manner.
X-bit labs: Considering the fact that Larrabee GPU is very similar to the CPUs, can we talk about convergence between CPUs and GPUs?
: Well, graphics are being integrated into the processor in a future Nehalem processor – but I can’t see the need for discrete graphics cards going away any time soon for those who demand the best 3D performance.
X-bit labs: What are the primary constraints for GPU performance today, power consumption, memory bandwidth, anything else?
: Power consumption, memory bandwidth, and shader execution are all performance constraints for GPUs today. Which one dominates varies from workload to workload, and also from platform to platform (Extreme, Pro, mobile, etc.). This is central to Larrabee’s all-software pipeline and general-purpose resources: we adjust to the workload and apply resources where they are needed.
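Applying general-purpose resources where the workload needs them could look like the sketch below. This is a hypothetical illustration, not Larrabee's scheduler: the stage names, timings, and proportional-allocation policy are all assumptions made for the example.

```python
# Hypothetical sketch: a software pipeline shifting general-purpose
# cores between stages in proportion to each stage's measured time,
# so whichever stage dominates a workload gets the most resources.

def allocate_cores(total_cores, stage_times):
    """Split a core budget across stages proportionally to stage cost."""
    total = sum(stage_times.values())
    return {stage: max(1, round(total_cores * t / total))
            for stage, t in stage_times.items()}

# Example: a shader-bound workload on a 32-core part.
print(allocate_cores(32, {"vertex": 2.0, "raster": 3.0, "shade": 11.0}))
# prints {'vertex': 4, 'raster': 6, 'shade': 22}
```

A fixed-function GPU bakes this split into silicon; a software pipeline on general-purpose cores can rebalance it per workload.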
X-bit labs: How often do you plan to update your graphics products family? ATI and Nvidia tend to update the high-end families two times a year, but Intel introduces microprocessors for high-end markets more often.
: Well we haven’t even announced the first Larrabee product, so I think it is a bit premature to answer this question. We will do the right thing.
X-bit labs: What is your opinion about external graphics adapters? There are a couple of them on the market aimed at laptops (e.g., Amilo Booster, Asustek XG Station), but maybe that market is going to grow substantially in the future?
: Potentially a terrific idea. Imagine being on the go using integrated graphics, benefiting from excellent battery life and a small form factor – then coming home, plugging your high-performance laptop into an external card and getting full hardcore 3D gaming benefits. I think quite a lot of mobile power users would like this scenario – especially when coupled with a mobile Intel Core 2 Extreme processor.