ChatGPT vs Claude vs Gemini: Which AI Builds the Best NBA 2K26-Style Game in 1 Hour?

15 May 2026
Three top AI models—ChatGPT, Gemini, and Claude—were each given one hour to build an NBA 2K26-style basketball game from scratch. Here’s how they handled code, graphics, gameplay, and which one came out on top.

What happens if you ask three of today’s most powerful AI models to build an NBA 2K-style basketball game from scratch in just one hour? This experiment put ChatGPT, Gemini, and Claude head-to-head to see which AI could ship the most fun, polished NBA 2K26-style prototype under intense time pressure.

Each model got the same massive prompt (over 2,500 lines), the same reference images, and the same one-hour timer. The result is a surprisingly good look at how these AIs differ when it comes to real-world coding, 3D game logic, and feature design.

How the Challenge Worked

All three AIs were asked to create a playable 3D basketball game in Unity that felt like a simplified NBA 2K26. The rules were:

• Same long, detailed prompt for all three models
• Same three reference images (court, UI, and style hints)
• One hour per model, including iterations and bug fixes
• The human only pasted prompts, imported assets, and hit Play

Each AI had to handle core systems like player movement, dribbling, shooting, passing, fouls, free throws, stamina, and basic game flow. On top of that, they were pushed to add things like cinematic intros, replays, and camera systems—exactly the kind of polish that separates a tech demo from an actual game.

ChatGPT’s NBA 2K26: Solid Foundation, Rough Edges

ChatGPT (running in an extended, high-reasoning mode) went first. It returned a huge amount of code very quickly—about 2,000 lines for the first version and roughly 4,400 lines by the end of the hour.

Graphics and Player Models

The initial build used a downloaded basketball stadium model, which already looked surprisingly good in-game. The first player characters, however, looked more like Wii-style avatars than NBA pros.

To fix that, ChatGPT was asked to describe a realistic player model: a 6'6" athletic player with a lean, muscular build, long arms, and broad shoulders. That description was then fed into an AI 3D model generator to produce a custom basketball player, which was wired back into the game.

Gameplay and Features

ChatGPT’s game quickly evolved from a basic dribbling test into a full mini-basketball experience:

• Dribbling and shooting, with reasonably responsive controls
• Passing between teammates (with some awkward moments, like the ball rolling on the ground)
• Defenders that guard players without sticking too close
• A stamina bar that drains as you play
• A working free throw system with two-shot attempts

Later prompts asked ChatGPT to make blocking harder, improve player spacing so they line up like a real NBA formation, and refine the overall flow. By the final build, there was even a cinematic intro before tip-off.

The main issues: some animations were stiff, legs didn’t always move while dribbling, and the overall feel was more “good prototype” than “polished arcade game.” Still, it was fully playable and genuinely fun, earning a rough score of 6.5/10.
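
Mechanics like ChatGPT's stamina bar are easy to sketch in isolation. The original game was built in Unity (so the real code would be C#), but here is a minimal Python illustration of the idea, with all names and numbers hypothetical rather than taken from the generated project:

```python
class Stamina:
    """Sketch of a stamina meter: drains while sprinting, regenerates at rest."""

    def __init__(self, maximum=100.0, drain_rate=15.0, regen_rate=5.0):
        self.maximum = maximum
        self.current = maximum
        self.drain_rate = drain_rate  # units lost per second while sprinting
        self.regen_rate = regen_rate  # units recovered per second at rest

    def update(self, dt, sprinting):
        """Advance the meter by dt seconds and return the new value."""
        if sprinting:
            self.current = max(0.0, self.current - self.drain_rate * dt)
        else:
            self.current = min(self.maximum, self.current + self.regen_rate * dt)
        return self.current

    def can_sprint(self):
        return self.current > 0.0
```

In a real game loop, `update` would be called every frame with the frame delta, and `can_sprint` would gate the sprint input once the bar empties.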

Gemini’s NBA 2K26: Better Physics and Cinematics

Next up was Gemini 3.1 Pro, again using the same prompt and reference images. Gemini’s first version came in at about 1,250 lines of code—less than ChatGPT’s—but the visual quality jumped immediately.

Visuals and Rim Physics

Right from the main menu and first playthrough, Gemini’s court looked cleaner and more realistic than ChatGPT’s initial attempt. The standout feature: rim physics.

Shots interacted with the hoop in a much more satisfying way. The ball would bounce off the rim, roll around, and drop in or out with believable motion. Even early on, simple jump shots felt good because of how the rim behaved.
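
The core of rim physics like this is just reflecting the ball's velocity off the rim's surface normal with some energy loss. A tiny Python sketch of that restitution bounce (a simplification of what a full physics engine does, not Gemini's actual code):

```python
def bounce(velocity, normal, restitution=0.6):
    """Reflect a velocity vector off a unit surface normal, losing energy.

    Uses v' = v - (1 + e) * (v . n) * n, where e is the restitution
    coefficient (e < 1 means the ball loses speed on each bounce).
    """
    dot = sum(v * n for v, n in zip(velocity, normal))
    if dot >= 0:  # already moving away from the surface: no bounce
        return tuple(velocity)
    return tuple(v - (1 + restitution) * dot * n
                 for v, n in zip(velocity, normal))
```

A ball falling straight down at 10 units/s onto an upward-facing rim surface would leave at 6 units/s upward with the default restitution of 0.6, which is roughly the "lively rim" feel the article describes.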

Gemini also generated a custom player description, which was turned into a 3D model and imported. Using Gemini’s code assistant, that model was hooked into the game with animations.

Gameplay, Replays, and Camera Systems

After several rounds of bug fixing and refinement (ending at around 2,700 lines of code), Gemini’s game added a lot of flair:

• Dribbling that lined up much better with the player’s hands
• Clear pass indicators showing who you’re targeting
• Throw-ins where you can choose where to inbound the ball
• Fouls and a free throw bar for timing shots
• A dedicated dunk button that triggers a cinematic slam (even if the animation looked a bit goofy)
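
A free throw bar like Gemini's typically works by oscillating a marker and scoring the release based on distance from a sweet spot. A hypothetical Python sketch of that timing mechanic (the period and falloff values are made up for illustration):

```python
import math

def bar_position(t, period=1.2):
    """Marker position in [0, 1] at time t; 0.5 is the sweet spot."""
    return 0.5 + 0.5 * math.sin(2 * math.pi * t / period)

def shot_accuracy(release_t, period=1.2):
    """1.0 for a release exactly at the sweet spot, falling off linearly
    to 0.0 at either end of the bar."""
    miss = abs(bar_position(release_t, period) - 0.5)
    return max(0.0, 1.0 - miss / 0.5)
```

Releasing at t = 0 (marker centered) gives perfect accuracy; releasing a quarter-period later, when the marker is pinned at one end, gives zero.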

One of Gemini’s biggest wins was presentation. It added:

• A stylish main menu with players on screen
• Cinematic intro sequences (even if players spun like helicopters at one point)
• Multiple camera angles, including a top-down view
• Instant replays with dramatic camera cuts after big plays like dunks

The time-out sequences were also surprisingly immersive, with players circling up and the game pausing like a real NBA timeout. Overall, the game felt more like a broadcast-style experience than a simple prototype, earning an estimated 8/10.

If you’re interested in more deep-dive comparisons of these models beyond gaming, you may like this breakdown of how wrappers and tooling affect performance in ChatGPT vs Claude vs Gemini.

Claude’s NBA 2K26: Clean Animations and First-Person Mode

Claude Opus 4.6 went last, running in an extended, high-thinking mode. Expectations were high, and Claude delivered a strong showing—especially in animation quality and advanced camera work.

Core Gameplay and Animations

Claude’s first version, about 1,200 lines of code, already looked decent and played smoothly. It included:

• Shooting with a nice, clean release animation
• A stamina bar
• Planned crossovers and shot fakes (though some didn’t fully work at first)

Like the other models, Claude described a 6'6" muscular player, which was turned into a 3D model and wired into the project. Initially, all players looked identical—teammates and opponents alike—making it confusing to tell sides apart.

After a large “mega prompt” to add missing features, Claude expanded the project to around 4,000 lines of code and brought in many of the elements seen in the other games:

• Rim mechanics and realistic bounces
• Slow-motion replays with multiple camera angles after scores
• Timeouts and fouls, complete with a timing bar for free throws
• More advanced dribbling moves like between-the-legs animations (even if they were a bit scuffed)

First-Person Mode and Camera Control

One of Claude’s most impressive features was a first-person mode. A final prompt asked for a smooth first-person camera that didn’t glitch, and Claude delivered a view where you can see the ball while dribbling and take shots from the player’s perspective.

On top of that, you could cycle through different camera angles with right-click, shifting between broadcast-style views and more immersive perspectives. Catch-and-shoot animations were especially clean, making jump shots feel satisfying.
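
Cycling camera modes on right-click is a small state machine. A minimal Python sketch of the idea (mode names are illustrative guesses, not Claude's actual identifiers):

```python
class CameraSelector:
    """Cycles through a fixed list of camera modes, wrapping around."""

    def __init__(self, modes=("broadcast", "top_down", "first_person")):
        self.modes = list(modes)
        self.index = 0

    @property
    def current(self):
        return self.modes[self.index]

    def on_right_click(self):
        """Advance to the next camera mode and return it."""
        self.index = (self.index + 1) % len(self.modes)
        return self.current
```

In the actual game, each mode name would map to a camera rig with its own position, follow target, and field of view; the selector just decides which rig is active.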

The final build also included:

• A main menu with an animated basketball
• An intro cutscene with text overlays (including a funny moment where a player trips over nothing)
• Solid dunk animations and smooth ball control on the ground

Claude did hit a session limit during development, requiring a paid reset to keep going—something worth noting if you’re planning long coding sessions. Still, the final result was polished, fluid, and arguably had the best shooting feel of the three.

For a broader look at Claude’s coding strengths in more serious projects, you might also find this long-form test of Claude Code vs Google’s tools useful.

Which AI Won the NBA 2K26 Challenge?

All three AIs managed to build a surprisingly complete basketball game in just one hour, but they each had different strengths:

• ChatGPT: Fastest to produce a large, feature-rich codebase. Good overall gameplay loop, but weaker visuals and animations. Approx. rating: 6.5/10.
• Gemini: Best rim physics and broadcast-style presentation. Strong replays, timeouts, and camera work, with a fun arcade feel. Approx. rating: 8/10.
• Claude: Cleanest shooting and catching animations, plus a standout first-person mode and flexible camera system. Very polished feel overall.

The real takeaway isn’t just which AI “won,” but how far AI coding assistants have come. In about an hour, each model helped build a 3D sports game with custom models, physics, UI, replays, and multiple camera angles—something that used to take a small team days or weeks.

If you’re a developer, this kind of experiment shows how powerful these tools can be as coding partners, especially for rapid prototyping. And if you’re a gamer, it’s a glimpse of how AI might shape the next generation of sports games—faster iteration, more experimental modes, and maybe even custom games built on demand.
