John Carmack’s QuakeCon 2005 keynote

Scott Wasson

JOHN CARMACK’S QUAKECON address has become something of an annual benchmark for the industry, summing up the news of his latest work and the state of game development’s cutting edge, as intertwined as those two things are. Carmack spoke live and in person at QuakeCon this year, after missing the last two due to illness in 2003 and the birth of his child in 2004. He was back in typical form, delivering his speech without notes, seemingly off the top of his head; even so, it was crystal clear and reasonably well organized, as one might expect from a renowned programmer.

Handicapping the next-gen consoles
Carmack started out by offering his assessment of the current state of PC gaming hardware, noting that he is largely satisfied with current trends. He marveled for a moment over the advances made in recent years, especially in graphics, and looked forward to continued progress on most fronts. Ever the PC guy, Carmack observed that the upcoming generation of game consoles looks to be very powerful, but said that PCs will, of course, be much more powerful than these consoles thanks to PC hardware’s rate of progress.

The quintessential PC game programmer then dropped a bit of a bombshell by announcing that he had recently moved his primary development efforts over to the Xbox 360, and that he expected to continue development there for the next six months—although the PC version of id Software’s next game will still be released first. One of his reasons for the move, Carmack said, was the headache of driver issues on the PC platform. The several layers of abstraction on the PC make it hard to nail down exact graphics performance because the programmer is held at a distance from the hardware. By contrast, the Xbox 360’s more direct approach was “refreshing.” Carmack also praised Microsoft’s development environment as easily the best of any of the consoles, thanks to the company’s background as a software provider.

As for the PlayStation 3, Carmack liked the sound of the noises Sony has been making about it being something of an open platform, and he suggested that an open PS3 could become a computing platform something like the Amiga of old, with an excellent, fixed graphics subsystem. HDTV displays offer the fidelity to make this happen where it couldn’t on past consoles.

Carmack was less pleased with the PowerPC processors for the new consoles, questioning the choice of an in-order CPU architecture. He estimated the console CPUs’ performance at about 50% that of a modern x86 processor and expressed skepticism about the returns of multi-core designs and multithreaded software, especially in the short term. Graphics accelerators are a great example of parallelism working well, he noted, but game code is not similarly parallelizable. Carmack cited his Quake III Arena engine, whose renderer was multithreaded and achieved up to 40% performance increases on multiprocessor systems, as a good example of where games would have to go. (Q3A’s SMP mode was notoriously crash-prone and fragile, working only with certain graphics driver revisions and the like.) Initial returns on multithreading, he projected, will be disappointing.
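
For context, Q3A’s SMP mode split work roughly along these lines: the game and renderer front end build a frame’s drawing commands on one processor while a second processor submits the previous frame’s commands, so only part of each frame’s work ever overlaps. Here is a minimal, hypothetical C++ sketch of that kind of split (not id’s code; the names and structure are invented for illustration):

    #include <functional>
    #include <thread>
    #include <utility>
    #include <vector>

    struct DrawCommand { int mesh; int shader; };

    // Game logic plus renderer front end: build this frame's command list.
    void runGameAndFrontEnd(std::vector<DrawCommand>& out) {
        out.push_back({1, 7});   // stand-in for real scene traversal
    }

    // Renderer back end: submit a finished command list to the graphics API.
    void runRendererBackEnd(const std::vector<DrawCommand>& cmds) {
        for (const auto& c : cmds) { (void)c; /* driver submission happens here */ }
    }

    int main() {
        std::vector<DrawCommand> frameA, frameB;
        runGameAndFrontEnd(frameA);                        // prime the pipeline
        for (int frame = 0; frame < 3; ++frame) {
            std::thread backend(runRendererBackEnd, std::cref(frameA));
            frameB.clear();
            runGameAndFrontEnd(frameB);                    // overlap game/front-end work
            backend.join();                                // one sync point per frame
            std::swap(frameA, frameB);
        }
        return 0;
    }

Because only the back-end submission runs in parallel with the rest of the frame, the gains from a second processor top out well short of doubling, which squares with the roughly 40% figure Carmack cited.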

Part of the problem with multithreading, Carmack argued, is knowing how to use the power of additional CPU cores to enhance the game experience. A.I., for instance, can be effective when very simple, as some of the original Doom’s logic was: it amounted to less than a page of code, yet players ascribed complex behaviors and motivations to the bad guys. More complex A.I., however, seems hard to improve to the point where it really changes the game. More physics detail, meanwhile, threatens to make games too fragile as interactions in the game world become more complex.
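
To give a feel for how little code that kind of early-Doom-style logic needs, here is a short, hypothetical C++ sketch of a monster “brain” (not id’s actual code; the states and thresholds are invented):

    #include <cstdlib>

    enum class MonsterState { Idle, Chase, Attack, Pain };

    struct Monster {
        MonsterState state = MonsterState::Idle;
        float distanceToPlayer = 999.0f;
        bool sawPlayer = false;

        // One "think" per tick: a few states and a dice roll are all it takes.
        void think() {
            switch (state) {
            case MonsterState::Idle:
                if (sawPlayer) state = MonsterState::Chase;          // wake on sight
                break;
            case MonsterState::Chase:
                if (distanceToPlayer < 8.0f && (std::rand() % 256) < 80)
                    state = MonsterState::Attack;                    // occasionally lunge
                break;
            case MonsterState::Attack:
                state = MonsterState::Chase;                         // swing, then resume pursuit
                break;
            case MonsterState::Pain:
                state = MonsterState::Chase;                         // flinch, then keep coming
                break;
            }
        }
    };

A handful of states and a random roll per tick are enough for players to read hunting, lunging, and flinching into the result.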

Physics acceleration’s prospects
Carmack said he considers the prospects for the upcoming physics acceleration chip on the PC iffy, because physics presents a very fundamental problem that graphics doesn’t have: it isn’t easily scalable for level of detail. Either an object in the game is a true physics object with which other objects can interact, or it isn’t. Carmack predicted this constraint would lead to a number of physics-accelerated titles where acceleration affects only elements, such as flowing water, that are peripheral to the core gameplay experience.
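
The asymmetry Carmack is pointing at can be sketched simply: rendering detail has a knob to turn, while the physics decision is essentially all-or-nothing. A hypothetical C++ illustration (names and numbers invented):

    #include <algorithm>

    // Rendering detail degrades gracefully: pick a coarser mesh as distance grows.
    int chooseMeshLod(float distanceToCamera) {
        return std::min(3, static_cast<int>(distanceToCamera / 25.0f));  // 0 = full detail, 3 = billboard
    }

    // Physics offers no such slider: an object either participates in the
    // simulation (can be pushed, stacked, shot apart) or it is static scenery.
    enum class PhysicsRole { Simulated, StaticScenery };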

Another of his concerns about physics acceleration was speed, not in the sense of peak processing throughput but in terms of the immediacy of real-time interactions. Carmack recalled that the first pre-3dfx graphics chips made Quake feel slower due to lag between user input and visual output, and he worried that the first generation of physics chips could cause similar problems, leaving us with games that are fragile and slow. He also happily conceded that he is more of a graphics guy than a physics guy, and admitted that his worries about the bar being raised in physics are probably similar to the worries others have had about his own standard-setting work in graphics.

Cultivating creativity
The subject of bar-raising was, in fact, one of the overarching themes of his speech, as Carmack displayed a passion for cultivating creativity in game development and making development accessible for aspiring game designers and programmers. He acknowledged that today’s games, for which id Software has helped create the expectations, require budgets of tens of millions of dollars and the creation of B-movie-class creative assets. The big budgets, he said, prevent risk-taking. His recent work in developing a game for cell phones had convinced him these relatively simple platforms could be a good place for a lower-budget project—at least for now, until phones become more capable and budgets balloon there, as well.

In a move intended to address this situation, Carmack announced that id Software will, very shortly, be making the source code for Quake III Arena available to the public under the GNU General Public License. This release will include not just the game code, but the development utilities, as well. Carmack looked forward to the possibility of a company doing commercial development work with the engine and actually shipping a game with source code on the CD, as required by the terms of the GPL.

What’s next for graphics
Although the general trajectory and rate of progress in PC hardware pleased him, Carmack did end his speech with a couple of wish-list items for the graphics companies. First and foremost on that list was full virtualization of texture mapping in graphics hardware. Carmack decried the “fallacy” that “procedural synthesis will be worth a damn,” arguing that programmers spending hours creating procedural shaders isn’t the best way forward; instead, he said, tools should unleash artists. He called the current tiled-texture approach a crude form of compression, and argued that true unique texturing would be a massive leap in visual fidelity over current practices. To that end, Carmack asked for virtual page tables in graphics hardware with 64-bit addressing.
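
The request is essentially for texture addressing to work the way virtual memory does: a huge virtual address space of texture pages, with a page table mapping the handful that are actually resident in video memory. A much-simplified software sketch of the concept (not a hardware spec and not id’s implementation; names and sizes are invented):

    #include <cstdint>
    #include <unordered_map>

    // A huge virtual texture address space divided into fixed-size pages,
    // with only a small working set resident in video memory at any time.
    constexpr uint64_t kPageSize = 128;                  // texels per page side

    struct PhysicalPage { uint32_t slot; };              // index into the resident page pool

    // Page table: virtual page ID -> resident physical page (if any).
    std::unordered_map<uint64_t, PhysicalPage> pageTable;

    // Resolve a texel in the virtual texture to a resident page, "faulting"
    // (asking the streaming system for the page) when it isn't present.
    PhysicalPage lookupTexel(uint64_t u, uint64_t v, void (*requestPage)(uint64_t)) {
        uint64_t pageId = ((u / kPageSize) << 32) | (v / kPageSize);
        auto it = pageTable.find(pageId);
        if (it == pageTable.end()) {
            requestPage(pageId);                         // page miss: stream it in later
            return PhysicalPage{0};                      // meanwhile, fall back to a placeholder page
        }
        return it->second;
    }

What Carmack is asking for is hardware that performs this indirection itself, so artists can paint every surface uniquely without the engine juggling tiled textures by hand.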

Carmack’s other wish-list item was that some attention be paid to the problems with handling small batches of data on today’s GPUs. He said the graphics companies’ typical answers to these problems, large batches and instancing, don’t make for great games.
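
The small-batch problem boils down to per-draw overhead: each draw call costs a roughly fixed amount of CPU and driver time, so a scene made of thousands of small, distinct objects is bottlenecked on submission long before the GPU runs out of shading power. A back-of-the-envelope C++ illustration (all numbers invented):

    #include <cstdio>

    int main() {
        // Hypothetical figures, purely for illustration.
        const double usPerDrawCall = 30.0;   // fixed CPU/driver cost per draw call, in microseconds
        const int smallBatches     = 5000;   // thousands of small, distinct objects
        const int instancedDraws   = 50;     // the same objects grouped into instanced draws

        std::printf("naive:     %.1f ms of CPU spent just submitting draws\n",
                    smallBatches * usPerDrawCall / 1000.0);
        std::printf("instanced: %.1f ms of CPU spent submitting draws\n",
                    instancedDraws * usPerDrawCall / 1000.0);
        return 0;
    }

Instancing collapses that overhead, but only when the objects really are identical copies of one another, which is the limitation Carmack is objecting to.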

John Carmack’s past pleas for graphics hardware changes have led to a number of developments, including the incorporation of floating-point color formats into DirectX 9-class graphics chips. We’ll have to watch and see how the graphics companies address these two problems over the next couple of years.