Nintendo 64 programming characteristics

The Nintendo 64's programming characteristics describe the elements of writing software for the Nintendo 64 (N64) gaming system.

History

The Nintendo 64 was released in 1996. At the time, The Economist described the system as "horrendously complex".[1] The difficulties were said to be a combination of oversight on the part of the hardware designers, limitations on 3D graphics, technology limits of that time, and manufacturing issues.

As the Nintendo 64 reached the end of its lifecycle, hardware development chief Genyo Takeda referred to its programming challenges using the word hansei (Japanese: 反省 "reflective regret"). Takeda said, "When we made Nintendo 64, we thought it was logical that if you want to make advanced games, it becomes technically more difficult. We were wrong. We now understand it's the cruising speed that matters, not the momentary flash of peak power."[2]

Memory

The console uses high-bandwidth but high-latency Rambus DRAM (RDRAM) connected to the Reality Coprocessor (RCP) in a unified memory architecture.[3] The R4300 CPU accesses RAM through the RCP and has no DMA controller of its own. Apart from the Game Pak, which is memory-mapped into the CPU's address space, all data transfer occurs via DMA to and from the RDRAM; the Reality Signal Processor, Reality Display Processor, peripheral interface, serial interface, and audio interface each have their own DMA controller.
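
A minimal sketch of such a transfer, moving a block of Game Pak ROM data into RDRAM through the peripheral interface (PI) DMA engine, is shown below. It assumes the standard libultra SDK headers and calls; the buffer name, transfer size, and message-queue setup are illustrative, not code from any cited game.

    #include <ultra64.h>

    #define DMA_BUF_SIZE 4096

    /* DMA destinations in RDRAM must be 8-byte aligned; a u64 array guarantees that. */
    static u64         dmaBuffer[DMA_BUF_SIZE / sizeof(u64)];
    static OSMesgQueue dmaMsgQ;
    static OSMesg      dmaMsg;
    static OSIoMesg    dmaIoMsg;

    /* Copy `size` bytes from Game Pak ROM at `romAddr` into RDRAM via the
       PI DMA engine; the CPU never touches the data during the transfer. */
    void load_from_cart(u32 romAddr, u32 size)
    {
        osCreateMesgQueue(&dmaMsgQ, &dmaMsg, 1);

        /* Invalidate the destination in the CPU data cache so stale lines
           are not written back over the freshly DMA'd data. */
        osInvalDCache(dmaBuffer, (s32)size);

        /* Queue the ROM-to-RDRAM transfer, then block until the PI signals
           completion on the message queue. */
        osPiStartDma(&dmaIoMsg, OS_MESG_PRI_NORMAL, OS_READ,
                     romAddr, dmaBuffer, size, &dmaMsgQ);
        osRecvMesg(&dmaMsgQ, NULL, OS_MESG_BLOCK);
    }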

Characteristics

Texture cache

The texture cache is 4 KB in size. Its small size led developers to stretch small textures over comparatively large surfaces, which the console's bilinear filtering can only blur. When mipmapping is used, texture width restrictions and the extra storage needed for the mipmap levels limit the largest mipmap level to 2 KB. Toward the end of the Nintendo 64's market cycle, some developers precomputed their textures using multi-layered texturing and heavily clamped small texture pieces to simulate larger textures. Examples of this workaround are found in Rare's Perfect Dark, Banjo-Tooie, and Conker's Bad Fur Day, and in Factor 5's Indiana Jones and the Infernal Machine.[4] Some games with non-realistic aesthetics use plain colored Gouraud shading instead of texturing on certain surfaces (e.g., Super Mario 64).[5]
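
As a rough illustration of these size limits, the sketch below checks whether a texture fits the 4 KB cache, or the 2 KB largest-level budget when mipmapping is used. The bit depths correspond to standard texel formats; the helper itself is purely illustrative.

    #include <stdio.h>

    /* Illustrative check against the 4 KB texture cache (TMEM), and against
       the 2 KB limit on the largest level when mipmapping is used. */
    static int fits_in_tmem(int width, int height, int bits_per_texel, int mipmapped)
    {
        int bytes  = (width * height * bits_per_texel) / 8;
        int budget = mipmapped ? 2048 : 4096;
        return bytes <= budget;
    }

    int main(void)
    {
        /* A 64x32 16-bit (RGBA16) texture is exactly 4096 bytes: it fills the
           cache on its own, but is too large as the base level of a mipmap chain. */
        printf("64x32 RGBA16, no mipmaps: %s\n",
               fits_in_tmem(64, 32, 16, 0) ? "fits" : "too big");
        printf("64x32 RGBA16, mipmapped:  %s\n",
               fits_in_tmem(64, 32, 16, 1) ? "fits" : "too big");
        return 0;
    }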

The big strength was the N64 cartridge. We use the cartridge almost like normal RAM and are streaming all level data, textures, animations, music, sound and even program code while the game is running. With the final size of the levels and the amount of textures, the RAM of the N64 never would have been even remotely enough to fit any individual level. So the cartridge technology really saved the day.

Factor 5, Bringing Indy to N64, IGN[4]

Fill rate

Many Nintendo 64 games are fill-rate limited rather than geometry limited. For example, Z-buffering, when enabled, consumes a significant share of the memory bandwidth that is otherwise needed for textures and the framebuffer. Performance can be improved by moving this function onto the RSP and CPU using custom microcode,[6][4] and significant gains can be made by using microcode appropriate to each game. The Nintendo 64's polygons-per-second rating is about 160,000 with hardware features enabled.[7] Some of the more polygon-intense Nintendo 64 games include World Driver Championship, Turok 2: Seeds of Evil, and Indiana Jones and the Infernal Machine.[4]
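
To make the bandwidth cost concrete, the back-of-the-envelope sketch below estimates the extra RDRAM traffic a 16-bit Z-buffer adds for a 320×240 frame. The overdraw factor and frame rate are assumptions chosen for illustration, not measured figures.

    #include <stdio.h>

    int main(void)
    {
        /* Every drawn pixel triggers a Z read and (usually) a Z write in RDRAM
           on top of the color-buffer write. Assumed: 320x240 16-bit buffers,
           average overdraw of 2.5, and a 30 fps target. */
        const double pixels   = 320.0 * 240.0;
        const double overdraw = 2.5;
        const double fps      = 30.0;
        const double z_bytes  = pixels * overdraw * (2 + 2);  /* Z read + Z write */
        const double c_bytes  = pixels * overdraw * 2;        /* color write      */

        printf("Z traffic per frame:     %.1f KB\n", z_bytes / 1024.0);
        printf("Color traffic per frame: %.1f KB\n", c_bytes / 1024.0);
        printf("Extra bandwidth from Z:  %.2f MB/s at %.0f fps\n",
               z_bytes * fps / (1024.0 * 1024.0), fps);
        return 0;
    }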

Microcode

The Reality Signal Processor (RSP) accepts microcode,[8] through which a developer can access different operations, create new effects, and optimize for speed or quality. The RSP is a RISC processor, less capable than the CPU but equipped with an 8-lane 16-bit vector engine. Effective use of this engine is governed by the microcode, which defines a short instruction sequence for each complex operation. While promoting custom microcode as a feature, Nintendo initially refused to share information on how to use the related tools, fearing that it would be copied by competitors. During the console's last few years, however, Nintendo shared the microcode information with a few developers. Nintendo's official code tools were basic, with no debugger and poor documentation.
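
The fragment below is a scalar C sketch of what a single lane-wise RSP vector multiply does across its eight 16-bit lanes. Real microcode is written in RSP assembly, and the fixed-point behaviour here (a simple Q0.15-style multiply) glosses over the RSP's accumulator and rounding details.

    #include <stdint.h>

    /* One RSP vector instruction operates on eight 16-bit lanes at once;
       this loop mimics a single lane-wise fixed-point multiply. */
    typedef struct { int16_t lane[8]; } vreg;

    static vreg vmul_fixed(vreg a, vreg b)
    {
        vreg out;
        int i;
        for (i = 0; i < 8; i++) {
            /* Q0.15-style multiply, truncated back to 16 bits. */
            int32_t p = (int32_t)a.lane[i] * (int32_t)b.lane[i];
            out.lane[i] = (int16_t)(p >> 15);
        }
        return out;
    }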

SGI's default microcode for the Nintendo 64 is called "Fast3D", which some developers claimed was poorly profiled for use in games. Although it generates more than 100,000 high-accuracy polygons per second, the microcode is optimized for accuracy rather than speed, and performance suffers. Nintendo's "Turbo3D" microcode allows 500,000–600,000 normal-accuracy polygons per second, but because of the resulting graphical degradation Nintendo officially discouraged its use. Companies such as Factor 5,[4] Boss Game Studios, and Rare were able to write custom microcode that reportedly runs their game engines better than SGI's standard microcode.
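
Dividing the quoted throughput figures by a frame rate makes the trade-off easier to see; the 30 frames-per-second target in the sketch below is an assumption, not a figure from the cited sources.

    #include <stdio.h>

    int main(void)
    {
        /* Per-frame polygon budgets implied by the per-second figures above,
           assuming a 30 fps target. */
        const double fps = 30.0;
        printf("Fast3D  (~100,000 polys/s): ~%.0f polys per frame\n", 100000.0 / fps);
        printf("Turbo3D (~500,000 polys/s): ~%.0f polys per frame\n", 500000.0 / fps);
        return 0;
    }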

One of the best examples of custom microcode is Factor 5's N64 port of the PC game Indiana Jones and the Infernal Machine. The Factor 5 team aimed for the high-resolution 640×480 mode[9] because of its visual crispness. The machine was said to be operating at its limits while running at 640×480. The Z-buffer could not be used, because it alone consumed the already constrained texture fill rate. To work around the 4 KB texture cache, the programmers devised custom texture formats and tools; each texture was analyzed and fitted to the format best suited to its performance and quality needs. They took advantage of the cartridge as a texture streaming source to squeeze as much detail as possible into each environment and to work around RAM limitations. They also wrote microcode for real-time lighting, because the microcode supplied by SGI was not optimized for the task and because they wanted more lighting than the PC version had. Factor 5's microcode allows almost unlimited real-time lighting and significantly boosts the polygon count. In the end, the N64 version is said to be more feature-rich than the PC version and is considered one of the unit's most advanced games.[4]
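
The sketch below illustrates the kind of per-texture analysis described above, choosing the cheapest of the standard texel formats a texture can use. The heuristic thresholds and the helper are hypothetical; Factor 5's actual formats and tools are not public.

    /* Hypothetical per-texture format chooser: pick the smallest standard
       texel format that can represent a texture without obvious loss. */
    typedef enum {
        FMT_I4,      /*  4-bit intensity                      */
        FMT_IA8,     /*  8-bit intensity + alpha              */
        FMT_CI4,     /*  4-bit color index (16-entry palette) */
        FMT_CI8,     /*  8-bit color index (256-entry palette)*/
        FMT_RGBA16   /* 16-bit direct color                   */
    } texel_format;

    static texel_format choose_format(int unique_colors, int is_grayscale, int needs_alpha)
    {
        if (is_grayscale)
            return needs_alpha ? FMT_IA8 : FMT_I4;
        if (unique_colors <= 16)
            return FMT_CI4;        /* small palette fits the tight cache budget */
        if (unique_colors <= 256)
            return FMT_CI8;
        return FMT_RGBA16;         /* fall back to full direct color            */
    }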

Factor 5 again used custom microcode in games such as Star Wars: Rogue Squadron and Star Wars Episode I: Battle for Naboo. In Star Wars: Rogue Squadron, the team tweaked the microcode for a landscape engine to create the alien worlds. For Battle for Naboo, they used what they had learned from Rogue Squadron and made the game run at 640×480, also implementing enhancements for particles and the landscape engine. Battle for Naboo has a long draw distance and large amounts of snow and rain, even in its high-resolution mode.[10]

References

  1. "Nintendo Wakes Up." The Economist Aug 03 1996: 55-. ABI/INFORM Global; ProQuest Research Library. Web. 24 May 2012.
  2. Croal, N'Gai; Kawaguchi, Masato; Saltzman, Marc. "It's Hip To Be Square." Newsweek 136.10 (2000): 53. MasterFILE Premier. Web. 23 July 2013.
  3. "Difference Between RDRAM and DDR". Retrieved 2009-01-15.
  4. "Bringing Indy to N64". IGN. 2000-11-09. Retrieved September 24, 2013.
  5. "Super Mario Galaxy". Retrieved 2009-01-11.
  6. "Hidden Surface Removal" (PDF). Archived from the original (PDF) on March 4, 2009. Retrieved April 24, 2014.
  7. Next Generation, issue 24 (December 1996), page 74
  8. "Nintendo 64". Archived from the original on 2007-07-10. Retrieved 2009-01-14.
  9. "Indiana Jones and the Infernal Machine". IGN. December 12, 2000. Retrieved September 24, 2013.
  10. "Interview: Battling the N64 (Naboo)". IGN64. 2000-11-10. Retrieved 2008-03-27.