GC_PaNzerFIN
Work and study @ Technical Uni
+528|6407|Finland

Intel is planning to make a low-end combined CPU/GPU chip for cheap computers in the near future (codename "Timna").

AMD's Fusion technology is likely to be a high-end solution (high-end graphics accelerator cores in a CPU! wow). They are planning to use multiple combined CPU/GPU chips, and co-processors could be installed into the PCI-E slots that nowadays are used for GFX cards.

more: http://www.videsignline.com/news/showAr … amp;pgno=1
3930K | H100i | RIVF | 16GB DDR3 | GTX 480 | AX750 | 800D | 512GB SSD | 3TB HDD | Xonar DX | W8
Bertster7
Confused Pothead
+1,101|6574|SE London

Looks like Intel will make loads of money from a combined low-end package, whereas AMD will slump further behind by targeting the less lucrative high end of the market.

Although if AMD could corner the entire gaming market sector, they could do pretty well.

It looks like Intel are onto the real high profit area though.


That's all just marketing talk though. Hurray for combined CPU/GPUs.

Last edited by Bertster7 (2007-05-16 16:18:46)

GR34
Member
+215|6538|ALBERTA> CANADA
So I could buy a GFX card that is my computer processor and GFX card???
Bertster7
Confused Pothead
+1,101|6574|SE London

GR34 wrote:

So I could buy a GFX card that is my computer processor and GFX card???
Other way round.
GC_PaNzerFIN
Work and study @ Technical Uni
+528|6407|Finland

GR34 wrote:

So I could buy a GFX card that is my computer processor and GFX card???
More like a CPU card, or a normal CPU that works as both CPU and GFX - for example, a 4-core CPU that has two normal cores and two GFX cores.
3930K | H100i | RIVF | 16GB DDR3 | GTX 480 | AX750 | 800D | 512GB SSD | 3TB HDD | Xonar DX | W8
Scorpion0x17
can detect anyone's visible post count...
+691|6759|Cambridge (UK)
This is just a stupidly bad idea.

For us gamers at least.

Berster's right - Intel almost certainly have the right approach - combined CPU/GPU for low-end cheap systems.

I don't see the point of AMDs high-end solution - I'm planning to probably upgrade my video card to DX10 at some point and that will be the third (or will it be fourth?) video card I've had in this box.

If you buy into a combined CPU/GPU you're just spending extra money on a component you can't take out and upgrade (though I'd assume the GPU part will be disableable).

Hmm... Actually... I could see AMDs approach working if it were for the console market... but otherwise, pointless for PC gamers.
Bertster7
Confused Pothead
+1,101|6574|SE London

Scorpion0x17 wrote:

This is just a stupidly bad idea.

For us gamers at least.

Berster's right - Intel almost certainly have the right approach - combined CPU/GPU for low-end cheap systems.

I don't see the point of AMDs high-end solution - I'm planning to probably upgrade my video card to DX10 at some point and that will be the third (or will it be fourth?) video card I've had in this box.

If you buy into a combined CPU/GPU you're just spending extra money on a component you can't take out and upgrade (though I'd assume the GPU part will be disableable).

Hmm... Actually... I could see AMDs approach working if it were for the console market... but otherwise, pointless for PC gamers.
That's not what I meant.

It is extremely likely there will be enormous performance benefits from such a combination. But I feel that the approach AMD appear to be taking will be less profitable.
Scorpion0x17
can detect anyone's visible post count...
+691|6759|Cambridge (UK)

Bertster7 wrote:

Scorpion0x17 wrote:

This is just a stupidly bad idea.

For us gamers at least.

Berster's right - Intel almost certainly have the right approach - combined CPU/GPU for low-end cheap systems.

I don't see the point of AMDs high-end solution - I'm planning to probably upgrade my video card to DX10 at some point and that will be the third (or will it be fourth?) video card I've had in this box.

If you buy into a combined CPU/GPU you're just spending extra money on a component you can't take out and upgrade (though I'd assume the GPU part will be disableable).

Hmm... Actually... I could see AMDs approach working if it were for the console market... but otherwise, pointless for PC gamers.
That's not what I meant.

It is extremely likely there will be enormous performance benefits from such a combination. But I feel that the approach AMD appear to be taking will be less profitable.
Oh, ok... Then you're not right...

I doubt the performance benefit will be that big - a decent amount of fast on-board video RAM is far more important for performance than any improvement in the communication between the CPU and GPU - there's very little communication between the two - basically a game says "Here's a bunch of triangles, here's the textures, shaders and so on I want applying to them, please render that for me" - there may then be some data transfer from system memory to video memory (which is why it's so important to have plenty on board ('cos then you just put everything in video RAM from the start)) - then the GPU goes off and renders away - that's it.
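To put rough numbers on that division of labour, here's a toy sketch. All the figures are hypothetical, purely to illustrate the scale difference between per-frame command traffic and re-streaming assets over the bus every frame:

```python
# Toy model (hypothetical numbers): once assets are resident in VRAM, the
# per-frame CPU->GPU traffic is just draw-call commands, which is tiny
# compared to re-streaming the whole asset set across the bus each frame.

ASSET_SIZE_MB = 256      # textures + geometry, uploaded once at load time
DRAW_CALL_BYTES = 64     # command + state data per draw call (assumed)
CALLS_PER_FRAME = 2000

def per_frame_bus_traffic_mb(assets_resident: bool) -> float:
    """Approximate MB crossing the CPU->GPU bus each frame."""
    commands = DRAW_CALL_BYTES * CALLS_PER_FRAME / 1e6
    if assets_resident:
        return commands                  # everything already in video RAM
    return commands + ASSET_SIZE_MB      # worst case: re-stream everything

resident = per_frame_bus_traffic_mb(True)    # ~0.128 MB/frame
streamed = per_frame_bus_traffic_mb(False)   # ~256 MB/frame
```

With these (made-up) numbers the resident case moves a few thousand times less data per frame, which is the whole argument for plenty of on-board video RAM.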

That's why I think Intel are going for the low-end low-cost market and AMD could well be going for the console market (where fewer fixed components is better).
jsnipy
...
+3,276|6515|...

It could be a very substantial performance jump - isn't this why things have evolved from ISA -> PCI -> AGP -> PCI-E? To get more bandwidth between the CPU and the VPU?
Scorpion0x17
can detect anyone's visible post count...
+691|6759|Cambridge (UK)

jsnipy wrote:

It could be a very substantial performance jump - isn't this why things have evolved from ISA -> PCI -> AGP -> PCI-E? To get more bandwidth between the CPU and the VPU?
No, it's to get more bandwidth between system memory and the GPU and its video memory. There is very little direct communication between the CPU and GPU.
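A bit of back-of-envelope arithmetic shows why that bandwidth matters for moving assets: the time to copy a 128 MB texture set from system memory across each bus generation (peak bandwidths are the approximate theoretical figures, real throughput is lower):

```python
# Approximate theoretical peak bandwidths per bus generation, in MB/s.
BANDWIDTH_MBPS = {
    "ISA (16-bit)": 8,
    "PCI": 133,
    "AGP 8x": 2133,
    "PCIe x16 (1.0)": 4000,
}

def transfer_ms(size_mb: float, bus: str) -> float:
    """Milliseconds to move size_mb at the bus's peak bandwidth."""
    return size_mb / BANDWIDTH_MBPS[bus] * 1000.0

# Copying 128 MB of textures from system memory to video memory:
times = {bus: transfer_ms(128, bus) for bus in BANDWIDTH_MBPS}
# ISA: ~16 s, PCI: ~1 s, AGP 8x: ~60 ms, PCIe x16: ~32 ms
```

That's a three-orders-of-magnitude drop from ISA to PCIe, and it's all system-memory-to-video-memory traffic, not CPU-to-GPU chatter.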
unnamednewbie13
Moderator
+2,053|6765|PNW

Scorpion0x17 wrote:

This is just a stupidly bad idea.

For us gamers at least.

Berster's right - Intel almost certainly have the right approach - combined CPU/GPU for low-end cheap systems.

I don't see the point of AMDs high-end solution - I'm planning to probably upgrade my video card to DX10 at some point and that will be the third (or will it be fourth?) video card I've had in this box.

If you buy into a combined CPU/GPU you're just spending extra money on a component you can't take out and upgrade (though I'd assume the GPU part will be disableable).

Hmm... Actually... I could see AMDs approach working if it were for the console market... but otherwise, pointless for PC gamers.
The Fusion chips aim to increase performance-per-Watt for applications such as 3D graphics, digital media and technical computing. In a press statement, AMD suggested the processors will leverage both its coherent HyperTransport interconnect as well as PCI Express to link to external co-processors.
Actually, savings in power consumption would be the ticket for those gamers who are also addicted to laptop computing.
Stormscythe
Aiming for the head
+88|6542|EUtopia | Austria
Well, in times when dual-CPU mainboards have become a reality even for desktops, there's nothing that would keep me from buying a mobo with a CPU socket, a GPU socket and two different sets of RAM slots.

Still, the latencies of normal memory are just too high for efficient GPU use (and I assume the usual system memory would be used, right?). This is possibly an alternative for notebooks and HTPCs, but not for gaming setups.
GC_PaNzerFIN
Work and study @ Technical Uni
+528|6407|Finland

Stormscythe wrote:

Well, in times when dual-CPU mainboards have become a reality even for desktops, there's nothing that would keep me from buying a mobo with a CPU socket, a GPU socket and two different sets of RAM slots.

Still, the latencies of normal memory are just too high for efficient GPU use (and I assume the usual system memory would be used, right?). This is possibly an alternative for notebooks and HTPCs, but not for gaming setups.
From what I've heard, they might be planning to integrate extra cache memory into the GPU cores, like in CPUs and console GPU chips. Something like 512 KB L1 and 3 MB L2 in a GPU FTW! (no more laggy latency between RAM and GFX, lol)
3930K | H100i | RIVF | 16GB DDR3 | GTX 480 | AX750 | 800D | 512GB SSD | 3TB HDD | Xonar DX | W8
Scorpion0x17
can detect anyone's visible post count...
+691|6759|Cambridge (UK)

unnamednewbie13 wrote:

Scorpion0x17 wrote:

This is just a stupidly bad idea.

For us gamers at least.

Berster's right - Intel almost certainly have the right approach - combined CPU/GPU for low-end cheap systems.

I don't see the point of AMDs high-end solution - I'm planning to probably upgrade my video card to DX10 at some point and that will be the third (or will it be fourth?) video card I've had in this box.

If you buy into a combined CPU/GPU you're just spending extra money on a component you can't take out and upgrade (though I'd assume the GPU part will be disableable).

Hmm... Actually... I could see AMDs approach working if it were for the console market... but otherwise, pointless for PC gamers.
The Fusion chips aim to increase performance-per-Watt for applications such as 3D graphics, digital media and technical computing. In a press statement, AMD suggested the processors will leverage both its coherent HyperTransport interconnect as well as PCI Express to link to external co-processors.
Actually, savings in power consumption would be the ticket for those gamers who are also addicted to laptop computing.
That's a good point - and I had much the same thoughts last night (well, this morning) while sitting in bed before going to sleep. It would make a lot of sense if these chips are being aimed primarily at the mobile market - I could see them working well in both laptop and handheld devices - and that applies to both the high-end AMD and low-end Intel solutions equally, just at different ends of that sub-market.

Also, another thought I just had - this would make more sense if they were more general-purpose VPUs (vector processing units) rather than GPUs (graphics processing units). For those not familiar with the term, vectors are the basic building blocks of 3D maths - and so are used extensively in 3D graphics rendering, but they're also used a lot in physics processing, as well as numerous other areas of 'technical computing'. Now, a combined physics processor and CPU - that's something I would buy into, and it's something that needs the higher CPU<=>VPU bandwidth.
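To illustrate the point that the same vector arithmetic underpins both fields, here's a tiny sketch - the same 3-component operations drive a lighting calculation and a physics step (all numbers are just example inputs):

```python
# The same vector primitives serve graphics (dot products for lighting)
# and physics (scaled adds for integration) - which is what would make a
# general-purpose VPU useful beyond rendering.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def scale(v, s):
    return tuple(x * s for x in v)

def add(a, b):
    return tuple(x + y for x, y in zip(a, b))

# Graphics: Lambertian shading term from surface normal and light direction.
normal = (0.0, 1.0, 0.0)
light_dir = (0.0, 1.0, 0.0)
brightness = max(0.0, dot(normal, light_dir))   # light hits face-on

# Physics: one Euler integration step of a projectile under gravity.
pos, vel, dt = (0.0, 10.0, 0.0), (5.0, 0.0, 0.0), 0.1
gravity = (0.0, -9.8, 0.0)
vel = add(vel, scale(gravity, dt))
pos = add(pos, scale(vel, dt))
```

Two totally different workloads, one instruction mix - exactly the case for a shared vector unit.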

Last edited by Scorpion0x17 (2007-05-17 05:08:20)

Scorpion0x17
can detect anyone's visible post count...
+691|6759|Cambridge (UK)

[69th_GFH]GC_PaNzerFIN wrote:

Stormscythe wrote:

Well, in times that dual CPU mainboards have become reality even for desktop types, there's not even anything that would keep me from buying a mobo with a CPU and GPU socket and two different RAM slots

Still, the latencies of normal memory are just too high for efficient GPU use (and I assume that the usual system memory would be used, right?). This is something that's possibly an alternative for notebooks and HTPCs but not for gaming setups
From what I've heard, they might be planning to integrate extra cache memory into the GPU cores, like in CPUs and console GPU chips. Something like 512 KB L1 and 3 MB L2 in a GPU FTW! (no more laggy latency between RAM and GFX, lol)
Internal cache isn't going to help that much - yeah, it'll help some, but not much - think about it - the top end X2900 is reportedly going to have 1GB high-speed video ram on it - and future games will fully utilise that - 3.5mb of internal cache in a combined CPU/GPU isn't going to give that much improvement.
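The cache-vs-working-set argument can be put in rough numbers. For streaming access with little reuse, the hit rate is roughly bounded by cache size over working set, so a few MB of cache barely dents the effective latency (all latencies and sizes below are illustrative, not measured):

```python
# Illustrative model: when a frame's working set dwarfs the on-die cache
# and access is mostly streaming, hit rate is roughly capped at
# cache_size / working_set, so average latency stays near the miss cost.

def streaming_hit_rate(cache_mb: float, working_set_mb: float) -> float:
    """Upper-bound hit rate for streaming access with little reuse."""
    return min(1.0, cache_mb / working_set_mb)

def effective_latency_ns(hit_rate: float,
                         hit_ns: float = 5.0,
                         miss_ns: float = 200.0) -> float:
    """Average memory latency given a hit rate (example latencies)."""
    return hit_rate * hit_ns + (1.0 - hit_rate) * miss_ns

# 3.5 MB of combined cache against a 700 MB frame working set:
hit = streaming_hit_rate(3.5, 700.0)       # 0.5% hit rate
latency = effective_latency_ns(hit)        # barely below the pure-miss cost
```

With these numbers the cache shaves under 1% off the average latency, which is the "it'll help some, but not much" claim made concrete.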
GC_PaNzerFIN
Work and study @ Technical Uni
+528|6407|Finland

Scorpion0x17 wrote:

[69th_GFH]GC_PaNzerFIN wrote:

Stormscythe wrote:

Well, in times when dual-CPU mainboards have become a reality even for desktops, there's nothing that would keep me from buying a mobo with a CPU socket, a GPU socket and two different sets of RAM slots.

Still, the latencies of normal memory are just too high for efficient GPU use (and I assume the usual system memory would be used, right?). This is possibly an alternative for notebooks and HTPCs, but not for gaming setups.
From what I've heard, they might be planning to integrate extra cache memory into the GPU cores, like in CPUs and console GPU chips. Something like 512 KB L1 and 3 MB L2 in a GPU FTW! (no more laggy latency between RAM and GFX, lol)
Internal cache isn't going to help that much - yeah, it'll help some, but not much - think about it - the top end X2900 is reportedly going to have 1GB high-speed video ram on it - and future games will fully utilise that - 3.5mb of internal cache in a combined CPU/GPU isn't going to give that much improvement.
AMD knows that memory is the problem, and I guess they plan to change the CPU-RAM link to a low-latency, high-speed next-gen HyperTransport with a ring bus. And who knows what kind of tech they are hiding at AMD-ATI - using their ring-bus memory technology could increase performance a lot. They are not going to release high-end technology that will get pwned by even mid-range GFX cards, are they? xD

Intel is smarter and releases low-end stuff where it really doesn't matter how slow it is. ^^
3930K | H100i | RIVF | 16GB DDR3 | GTX 480 | AX750 | 800D | 512GB SSD | 3TB HDD | Xonar DX | W8
Bertster7
Confused Pothead
+1,101|6574|SE London

Scorpion0x17 wrote:

Bertster7 wrote:

Scorpion0x17 wrote:

This is just a stupidly bad idea.

For us gamers at least.

Berster's right - Intel almost certainly have the right approach - combined CPU/GPU for low-end cheap systems.

I don't see the point of AMDs high-end solution - I'm planning to probably upgrade my video card to DX10 at some point and that will be the third (or will it be fourth?) video card I've had in this box.

If you buy into a combined CPU/GPU you're just spending extra money on a component you can't take out and upgrade (though I'd assume the GPU part will be disableable).

Hmm... Actually... I could see AMDs approach working if it were for the console market... but otherwise, pointless for PC gamers.
That's not what I meant.

It is extremely likely there will be enormous performance benefits from such a combination. But I feel that the approach AMD appear to be taking will be less profitable.
Oh, ok... Then you're not right...

I doubt the performance benefit will be that big - a decent amount of fast on-board video ram is far more important for performance than any improvement in the communication between the CPU and GPU - there's very little comunication between the two - basically a game says "Here's a bunch of triangles, here's the textures, shaders and so on I want applying to it, please render that for me" - there may then be some data transfer from system memory to video memory (which is why it so important to have plenty on board ('cos then you just put everything in video ram from the start)) - then GPU goes off and renders away - that's it.

That's why I think Intel are going for the low-end low-cost market and AMD could well be going for the console market (where fewer fixed components is better).
Looking more at what they are doing, I reckon AMD's plan is crap. All the block diagrams I've seen from AMD seem very basic - simple integration of GPU and CPU on the same die. This, as you rightly say, will not improve performance for the type of dense linear algebra that is currently performed on GPUs. It may, again as you say, have important applications in mobile environments.

Intel's ideas look far more interesting. In the short term, Intel hope to target the low end of the market. Long term, Intel are looking at ways of increasing parallelism in graphics and of moving to more irregular algorithms (just as Nvidia are trying to do the opposite with technologies like CUDA - making GPUs more capable of performing irregular tasks, that is). Intel's plans involve clever use of multiple smaller cores, which would require massive changes to programming techniques to be efficient, yet could increase performance immensely.

https://www.beyond3d.com/images/articles/IntelFuture/Image10.jpg

Combined with a wide vector FPU, this system could provide staggering performance, though a significant performance decrease should be expected for single-threaded code.

Intel predict that graphics will move away from rasterisation and start to be ray traced, which would improve image quality immeasurably, though how they will manage to render ray traced scenes in real time I have no idea - though this type of technology should pave the way for that to be possible.

If Intel are right then they will have obtained a massive lead in pioneering technology for a shift that will make most existing graphics processing techniques fairly redundant and will make them loads of money.
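For anyone wondering what the ray-tracing shift actually computes, the core primitive is the ray-sphere (or ray-triangle) intersection test - real-time ray tracing means running billions of these per second, which is where lots of small parallel cores would come in. A toy version:

```python
# Toy ray-sphere intersection: the basic primitive a ray tracer evaluates
# billions of times per frame. Assumes `direction` is normalised.

import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along the ray to the nearest hit, or None on a miss."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * x for d, x in zip(direction, oc))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4.0 * c          # discriminant (a = 1 for unit direction)
    if disc < 0:
        return None                  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None      # nearest intersection in front of origin

# A ray down the z-axis toward a unit sphere centred 5 units away:
hit = ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)
miss = ray_sphere_hit((0, 0, 0), (0, 1, 0), (0, 0, 5), 1.0)
```

Each pixel traces at least one such ray, plus shadow and reflection rays per bounce - embarrassingly parallel, but a huge total workload, which is why real-time ray tracing has stayed out of reach.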
Scorpion0x17
can detect anyone's visible post count...
+691|6759|Cambridge (UK)

Bertster7 wrote:

Scorpion0x17 wrote:

Bertster7 wrote:

That's not what I meant.

It is extremely likely there will be enormous performance benefits from such a combination. But I feel that the approach AMD appear to be taking will be less profitable.
Oh, ok... Then you're not right...

I doubt the performance benefit will be that big - a decent amount of fast on-board video RAM is far more important for performance than any improvement in the communication between the CPU and GPU - there's very little communication between the two - basically a game says "Here's a bunch of triangles, here's the textures, shaders and so on I want applying to them, please render that for me" - there may then be some data transfer from system memory to video memory (which is why it's so important to have plenty on board ('cos then you just put everything in video RAM from the start)) - then the GPU goes off and renders away - that's it.

That's why I think Intel are going for the low-end low-cost market and AMD could well be going for the console market (where fewer fixed components is better).
Looking more at what they are doing, I reckon AMD's plan is crap. All the block diagrams I've seen from AMD seem very basic - simple integration of GPU and CPU on the same die. This, as you rightly say, will not improve performance for the type of dense linear algebra that is currently performed on GPUs. It may, again as you say, have important applications in mobile environments.

Intel's ideas look far more interesting. In the short term, Intel hope to target the low end of the market. Long term, Intel are looking at ways of increasing parallelism in graphics and of moving to more irregular algorithms (just as Nvidia are trying to do the opposite with technologies like CUDA - making GPUs more capable of performing irregular tasks, that is). Intel's plans involve clever use of multiple smaller cores, which would require massive changes to programming techniques to be efficient, yet could increase performance immensely.

http://www.beyond3d.com/images/articles … mage10.jpg

Combined with a wide vector FPU, this system could provide staggering performance, though a significant performance decrease should be expected for single-threaded code.

Intel predict that graphics will move away from rasterisation and start to be ray traced, which would improve image quality immeasurably, though how they will manage to render ray traced scenes in real time I have no idea - though this type of technology should pave the way for that to be possible.

If Intel are right then they will have obtained a massive lead in pioneering technology for a shift that will make most existing graphics processing techniques fairly redundant and will make them loads of money.
Hmm... interesting...

Fully ray-traced games are a long way off, I would have thought - have you seen the PS3 ray-tracing demo? IBM (IIRC) linked up three PS3s over a gigabit network, installed Linux and a distributed real-time ray tracer on them, and while it's sweeeeeeet, all that power could only manage to ray-trace a single car!

Last edited by Scorpion0x17 (2007-05-17 06:03:06)

Milk.org
Bringing Sexy Back
+270|6769|UK
From what I read in a PC mag a few months ago, this could mean huge increases in graphics performance in the future.
vpyroman
Aeon Supreme commander
+16|6609|UCF
The gaming market is small for PCs. The real $$ is in workstations for schools, businesses and corporations. So one chip that does both is a good thing - perhaps even smaller desktops, like that new Dell that's not a tower and not a laptop.
Bertster7
Confused Pothead
+1,101|6574|SE London

vpyroman wrote:

The gaming market is small for PCs. The real $$ is in workstations for schools, businesses and corporations. So one chip that does both is a good thing - perhaps even smaller desktops, like that new Dell that's not a tower and not a laptop.
Exactly. Which is why Intel currently dominate the GPU market.
kylef
Gone
+1,352|6486|N. Ireland
I knew AMD would do this. Rumours went round as to why they actually bought ATi, and now we have the reason! I think it is a fantastic idea, although I'm not sure how it will pan out.
