What is the record word size for a gaming system?
Related: What does 8-bit / 16-bit actually refer to?
I remember the "bit wars" of the late 1980s and early 1990s. The Genesis/Mega Drive and the TurboGrafx-16 were heavily marketed as having twice as many "bits" as old-school Famicoms and such. A few years later, Atari returned with its monstrous "64 bit"!!1!11one Jaguar system which was pwned by the Nintendo 64 and Sony PlayStation.
Starting around 2000, word size seems to have faded out of the world of gaming marketing and criticism. It became all about having wowie-zowie immersive 3D graphics, raw bit count be damned. Without a spec sheet, I couldn't tell you how many bits a PlayStation 2, Xbox 360, or 3DS has, nor do I really care.
So, my question is, how high has word size actually gone in terms of gaming consoles? Was there ever a 128-bit console? 256? 512? 1024? How about a 2048-bit monstrosity? If I am dreaming of creating the world's first 4096-bit Mega Word Pwnage Rhinocerous9000 XL 2022, would that be a record or would I just be an also-ran in the lost Bit Wars?
To be clear, I'm not asking for a definition or explanation of word size. I'm asking a historical question. I'm also not asking why bit size is no longer prominent in marketing; I'm more interested in whether bit size continued to grow behind the scenes or whether "more bits" is pretty much dead in terms of gaming.
64-bit, really. All current-generation consoles are based on 64-bit processor architectures (AMD Zen x86-64 on the PS5 and Xbox; Nvidia Tegra AArch64 for the Nintendo Switch). Yes, they support SIMD instructions that can operate on up to 256 bits of data at once, but those are parallel operations on chunks of 64 bits or less (very often, 32-bit and even 16-bit floats).
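To make that concrete, here is a minimal C sketch using plain x86 AVX intrinsics (generic compiler code, nothing console-specific): one 256-bit instruction is really eight independent 32-bit float operations running side by side.

```c
// A single 256-bit AVX add is eight parallel 32-bit float adds,
// not one 256-bit-wide arithmetic operation.
#include <immintrin.h>
#include <stdio.h>

int main(void) {
    float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    float b[8] = {10, 20, 30, 40, 50, 60, 70, 80};
    float c[8];

    __m256 va = _mm256_loadu_ps(a);    // load 8 floats into one 256-bit register
    __m256 vb = _mm256_loadu_ps(b);
    __m256 vc = _mm256_add_ps(va, vb); // one instruction, eight 32-bit additions
    _mm256_storeu_ps(c, vc);

    for (int i = 0; i < 8; i++)
        printf("%g ", c[i]);           // prints 11 22 33 44 55 66 77 88
    printf("\n");
    return 0;
}
```

Compile with `-mavx` (gcc/clang): the register is wide, but every lane is still a 32-bit value.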
As mentioned in pinckerman's answer, manufacturers around the year 2000 marketed their new systems as "128-bit" because the 90s had trained consumers to associate a console's "bitness" with generational improvements, and they wanted to one-up the "64-bit" generation... but thankfully they learned the futility of that and didn't take it any further.
Fundamentally, in gaming and everywhere else, "more bits" is dead. There are a few niche scientific applications where extended floating-point precision is needed, and some applications like cryptography that deal with integers thousands of bits long, but for 99.9% of everything 64 bits is plenty — and for the rest, you can always do the operations in software, at a speed penalty. The push instead is for more throughput, more parallelism, and more power efficiency.
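For the "do it in software" case, a hand-rolled sketch of the idea (not taken from any particular crypto library): a 128-bit addition on a 64-bit machine is just two 64-bit limb additions plus carry propagation, and bignum libraries extend the same pattern to thousands of bits, which is exactly where the speed penalty comes from.

```c
// Software 128-bit addition on a 64-bit machine: two 64-bit limbs plus a carry.
// Bignum/crypto libraries do the same thing with many more limbs in a loop.
#include <stdint.h>
#include <stdio.h>

typedef struct { uint64_t lo, hi; } u128;  // low and high 64-bit limbs

static u128 add128(u128 a, u128 b) {
    u128 r;
    r.lo = a.lo + b.lo;
    uint64_t carry = (r.lo < a.lo);   // unsigned overflow of the low limb
    r.hi = a.hi + b.hi + carry;
    return r;
}

int main(void) {
    u128 a = { UINT64_MAX, 0 };       // 2^64 - 1
    u128 b = { 1, 0 };
    u128 s = add128(a, b);            // expect 2^64: lo = 0, hi = 1
    printf("hi=%llu lo=%llu\n", (unsigned long long)s.hi, (unsigned long long)s.lo);
    return 0;
}
```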
Footnote: actually, if you want to be a stickler, many x86 processors since the 1980s, and essentially 100% of them since the mid-90s, have hardware support for 80-bit floating-point numbers. For various reasons, this has not led anyone to call anything with an Intel-compatible FPU an "80-bit machine".
The 6th generation of consoles is known as the 128-bit era.
The Dreamcast and the PlayStation 2 were the last systems to use the term "128-bit" in their marketing to describe their capability.
The PlayStation 2's CPU (known as the "128-bit Emotion Engine") has a 64-bit core with a 32-bit FPU coupled to two 128-bit Vector Units. The PS2 also has an internal 10-channel DMA bus which is fully 128 bits wide.
The Dreamcast has a 64-bit double-precision superscalar SuperH-4 RISC CPU core with a 32-bit integer unit using 16-bit fixed-length instructions, a 64-bit data bus allowing a variable width of 8, 16, 32, or 64 bits, and a 128-bit floating-point bus.
Increasing the number of bits for a component wouldn't really help you out and would probably just make it harder to design your console. The PS2's CPU was really just capable of doing four 32-bit calculations at once, so once again the "128-bit" label was marketing rather than bitness in the sense people would expect. Video games don't really need to go above 64 bits in the current era.
So I really doubt any consoles were ever advertised as 256-bit (or above), although modern x86-based consoles with CPUs supporting AVX do have 256-bit SIMD, and 256-bit data paths between load/store units and cache. (AMD Jaguar handles 256-bit AVX instructions internally as two 128-bit operations, but later CPUs like Zen 2 and Intel Haswell truly have 256-bit data paths and execution units.)
And I doubt any consoles ever existed with what CPU architects would really describe as 128-bit CPUs; that was only ever marketing based on SIMD width.
Even modern number-crunching servers don't use the full 64 bits of physical or virtual addresses that are possible with x86-64 and AArch64, although on paper RISC-V defines a version of that ISA with 128-bit addresses and integer registers. If you define bitness as pointer width / address-space size, and/or integer register width, 64-bit is huge.
For throwing more data around with each instruction, modern architectures use separate SIMD registers, not wider integer registers.
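As a rough illustration of that split (assuming an x86-64 compiler with the AVX headers available), pointers and general-purpose integers stay at 64 bits while the wider data lives in a separate SIMD register type:

```c
// Pointer and integer widths stay at 64 bits on x86-64; the 256-bit __m256
// type maps to the separate SIMD (YMM) register file, not to wider GPRs.
#include <immintrin.h>
#include <stdio.h>

int main(void) {
    printf("pointer:         %zu bits\n", 8 * sizeof(void *));     // 64 on x86-64
    printf("long long (GPR): %zu bits\n", 8 * sizeof(long long));  // 64
    printf("__m256 (SIMD):   %zu bits\n", 8 * sizeof(__m256));     // 256
    return 0;
}
```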
The Nuon hybrid DVD player / console had a VLIW CPU focused on 128-bit SIMD instructions to get the necessary power. On the other hand, its short lifespan and handful of games meant that it never really had much opportunity to show off its skills beyond Jeff Minter's creations.