AMD SAYS ITS PURCHASE OF ATI GIVES IT AN ADVANTAGE IN COMBINED CHIPS

苦逼热狗

By Dean Takahashi
Mercury News

A revolution in three-dimensional graphics on the personal computer has made possible everything from Pixar's animated films to frighteningly realistic video games. Fueling that revolution is the once-humble graphics chip, now an increasingly important part of every PC.

For years, the core of the PC has been the microprocessor, the brains that orchestrate the computer's basic functions. But the graphics chip, which creates the visual effects on the computer's display, is playing a bigger and bigger role. In fact, some believe that the graphics chip itself could be a threat to the microprocessor. One of these chips might absorb the other one day.

The result: computers will still deliver the gee-whiz 3-D animations in movies like ``Cars'' and stunning visuals for video games, but at lower costs because two chips can be combined into one. That could lead to laptops that cost as little as $100. A single chip that combines both functions also could save space, leading to handheld computers with high-quality graphics.

The rivalry between these chips -- dubbed the CPU for central processing unit and GPU for graphics processing unit -- has set off a high-stakes chess game among PC chip giants Intel, AMD and Nvidia to gain an edge in graphics processing and, as a result, control processing inside a PC.

Next to the microprocessor, the graphics chip is the highest value chip in many PCs, and it can be the most expensive chip in high-end entertainment and game computers.

Since 2000, the average price of an Intel microprocessor in a $2,000 PC has fallen from $487 to $328, according to International Data Corp. At the same time, the average price of a graphics chip has risen from $31 to $38.

Intel, the world's biggest maker of PC microprocessors, and its rival, Advanced Micro Devices, have taken notice of the growing importance of graphics in computing.

AMD and Intel are each figuring out ways to combine graphics and microprocessor functions into a single chip.

AMD, the No. 2 maker of PC microprocessors after Intel, made the first offensive move when it agreed this summer to buy graphics chip maker ATI Technologies for $5.4 billion.

Intel is beefing up its own internal graphics expertise, and its researchers believe they have found a way to make the graphics chip obsolete.

Meanwhile, Nvidia, the last stand-alone graphics chip maker, is figuring out a strategy to remain an independent company and to make sure that graphics remains a separate chip in the PC.

The winner of the graphics processing contest could cut the loser out of the PC entirely.

``If I were them, I would be nervous,'' said Jerry Bautista, director of a research lab at Intel, referring to Nvidia. ``We see a trend. They have been a growing part of the PC budget. We can pull the graphics back on the microprocessor.''

`Last man standing'

David Kirk, chief scientist at Nvidia in Santa Clara, said: ``We make our living by being nervous all of the time. We're very happy with our position as the last man standing in graphics.''

Each of the major players -- Intel, AMD-ATI and Nvidia -- is taking a different path.

Once the merger closes, which is expected this fall, AMD and ATI say they plan to combine a GPU and a CPU into a single chip as early as 2007. A combined chip could be cheaper than two separate chips, but it would probably not deliver the best performance. As a result, a combined chip would first be used as the brains of low-cost computers for developing countries. Both companies would still make stand-alone CPUs and GPUs for more expensive computers, but over time the combination chips could become a bigger part of overall sales.

Intel, by contrast, has the ability to assemble a combination chip on its own. It already has a large team of graphics experts in Folsom; those designers create the communications chips known as ``chip sets,'' which have built-in graphics functions. But Intel executives also believe they can steal graphics processing away from the GPU with the promise of better computation.

Nvidia, meanwhile, expects demand for better graphics to keep growing, and that will give stand-alone graphics chips a bright future.

``Whenever someone tries to do a hybrid chip, the first one is usually pretty bad,'' Nvidia's Kirk said.

CPUs and GPUs handle processing tasks in different ways. A CPU is a general-purpose device that runs at very high speeds and can tackle many different kinds of tasks, one after another. A graphics chip does the same kind of task -- creating a piece of an image -- over and over, and it often handles dozens of those tasks at the same time. Because the two designs are built around such different workloads, it has been hard to combine the two chips to date.
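
To make that contrast concrete, here is a rough conceptual sketch in Python with NumPy -- not anything either chip actually runs, and the ``shading'' step and pixel values are made up for illustration. The CPU-style path walks through work items one after another, while the GPU-style path applies the identical operation to a whole block of pixels at once:

import numpy as np

def shade_pixel(value):
    # Toy "shading" step: the same small calculation, applied to one pixel.
    return min(255, value + 10)

# CPU-style: one item after another; each step could be a different task entirely.
pixels = list(range(1000))
serial_result = [shade_pixel(p) for p in pixels]

# GPU-style: the identical operation applied to the whole block of pixels at
# once (data parallelism), the kind of repetitive work graphics chips excel at.
pixel_block = np.arange(1000)
parallel_result = np.minimum(255, pixel_block + 10)

assert serial_result == parallel_result.tolist()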

Moore's Law, the decades-old prediction by Intel co-founder Gordon Moore, favors the notion that the two chips will eventually be combined into one. The law postulates that manufacturing and chip design will advance so that chip makers can double the number of transistors on a chip every couple of years. They can use the extra transistors either to improve performance or to combine two chips to cut costs and enhance efficiency.
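
As a back-of-the-envelope illustration of that doubling (the starting count, the years and the two-year period below are assumptions for the example, not figures from the article):

def projected_transistors(start_count, start_year, target_year, years_per_doubling=2):
    # Double the transistor count once per doubling period.
    doublings = (target_year - start_year) // years_per_doubling
    return start_count * 2 ** doublings

# A hypothetical chip with 100 million transistors in 2006, projected out to 2012:
print(projected_transistors(100_000_000, 2006, 2012))  # 3 doublings -> 800,000,000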

A step closer

Intel and AMD have both recently turned to a technique that puts two microprocessor cores on the same chip. So it isn't far-fetched to think that graphics components could be put on future microprocessors, says Justin Rattner, chief technology officer at Intel.

But Jen-Hsun Huang, CEO of Nvidia, believes the microprocessor will absorb the graphics chip only if graphics technology stands still. He expects graphics technology to evolve so quickly that it will still make sense to dedicate an entire chip to it, so the two chips should stay separate for the foreseeable future.

``You end up getting the worst of both worlds'' by combining the two functions onto a single chip, he said.

But Intel researchers say that changing technology may naturally shift graphics processing from the graphics chip to the PC's microprocessor. Intel's Bautista believes that will happen within several years as a technique dubbed ``ray tracing'' wins favor among 3-D graphics programmers.

With ray tracing, a computer creates a scene by shooting a line, or ray, out from a single point of view. If the ray hits an object, the ray proceeds no further, and the computer knows to draw that object at that point in three-dimensional space. The computer casts these rays out over and over until it has built up a complete image of the 3-D scene. With this technique, the computer never wastes time drawing objects that are hidden behind other objects.
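
A minimal sketch of that idea in Python -- the scene, names and numbers here are illustrative, not from the article. Each ray keeps only the nearest object it hits, so anything hidden behind that object is never drawn:

import math

def ray_sphere_hit(origin, direction, center, radius):
    # Distance along the ray to the sphere's surface, or None on a miss.
    # 'direction' is assumed to be a unit vector, so the quadratic's a term is 1.
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    b = 2 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def trace(origin, direction, spheres):
    # Keep only the nearest hit; spheres hidden behind it are never drawn.
    nearest = None
    for center, radius, color in spheres:
        t = ray_sphere_hit(origin, direction, center, radius)
        if t is not None and (nearest is None or t < nearest[0]):
            nearest = (t, color)
    return nearest[1] if nearest else "background"

# Two spheres on the same line of sight: the ray stops at the closer one.
scene = [((0, 0, 5), 1.0, "red"), ((0, 0, 10), 1.0, "blue")]
print(trace((0, 0, 0), (0, 0, 1), scene))  # prints "red"; the blue sphere is occluded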

Bautista believes that ray tracing is better done on a central processor and could thus make the graphics processor obsolete. Pixar used a lot of ray tracing in its most recent film, ``Cars,'' a move that Bautista interprets as a sign of things to come.

Effects in games

Nvidia's Kirk is skeptical that ray tracing will take hold to the degree that a separate graphics chip becomes unnecessary. He notes that graphics chips are still needed to do most of the visual effects in computer games.

Since the AMD-ATI combined company will own all of the assets required, it will have the most options for creating stand-alone chips or combined chips. It could thus be ready for any new development, said Patrick Moorhead, an AMD vice president.

Dave Orton, ATI's chief executive, says initiatives to create handheld computers and $100 laptops for developing countries will put more pressure on costs, which in turn will make the combination of the microprocessor and the graphics processor more likely.

``It's a question of when,'' he said.
Contact Dean Takahashi at dtakahashi@mercurynews.com or (408) 920-5739.
 
they all say that ...
 