By Josh Proto
Dec 17, 2025

How a Multiplayer Game Was Built in One Day (And Won the Hackathon)

Hackathons are often treated as a paradox. On one hand, they're celebrated as spaces for creativity, experimentation, and rapid iteration. On the other, most hackathon projects never ship, never scale, and rarely reflect how real software gets built. We wanted to challenge that pattern.

During a recent internal hackathon, our dev teams set out with a clear constraint: build a game that's playable, online, and real in a single day, a working game that real people could join, interact with, and break. The twist was that each team could use an AI code assistant to make its idea a reality. What followed was a crash course in modern AI-assisted development, disciplined scope control, and the realities of building real-time software under extreme time pressure.

By the end of the day, our dev teams had shipped six online, playable, multiplayer games. The winner was a multiplayer Minesweeper game built by Hank and Jonathan, complete with spectators, chat, real-time updates, and a leaderboard. This post breaks down how the winning team did it, what almost went wrong, and what they'd do again. If you're a founder, engineering leader, or developer navigating AI-accelerated workflows, this is a practical case study in how to build fast without letting your AI take you off the rails.

The Hackathon Challenge: Ship Something Real

The rules were simple:
  • One day
  • Small teams
  • Theme: Games
  • Output: something playable and online
There were no bonus points for refined architecture or clever abstractions. The only metric that mattered was whether people could actually use what you built, and whether they had fun. Hank and Jonathan aligned early on one principle: whatever they built needed to be multiplayer. That decision raised the difficulty immediately. Multiplayer systems introduce shared state, synchronization problems, and failure modes that single-player experiences never encounter. Under time pressure, those problems surface quickly.

Choosing Minesweeper and Making It Multiplayer

Early brainstorming included drawing games, card simulations, and competitive typing challenges. All of them were interesting. All of them were too complex. The team reframed the question. What is the smallest possible game that still forces real multiplayer behavior? Minesweeper stood out. The rules are simple, the grid is deterministic, win and loss conditions are clear, and most players already understand how it works. The twist was the multiplayer mechanic. Players would take turns placing mines and revealing tiles on a shared board. Every action changed the information available to everyone else. One player's mistake could end the game for all. Even at the whiteboard stage, the idea produced tension and humor. That reaction was an early signal that the design had potential!
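The core of that mechanic can be sketched as a small state machine. This is an illustrative TypeScript sketch, not the team's actual code; all names (`GameState`, `applyAction`, and so on) are assumptions for the example. It captures the two alternating moves described above: placing a hidden mine, or revealing a tile at your own risk.

```typescript
type Cell = { mine: boolean; revealed: boolean; placedBy?: string };

type GameState = {
  board: Cell[][];           // shared grid, identical for every player
  players: string[];         // turn order
  turn: number;              // index into players
  status: "active" | "lost";
  loser?: string;
};

type Action =
  | { kind: "place"; row: number; col: number }   // hide a mine for the others
  | { kind: "reveal"; row: number; col: number }; // take the risk

function applyAction(state: GameState, player: string, action: Action): GameState {
  if (state.status !== "active") return state;
  if (state.players[state.turn] !== player) return state; // not your turn

  const cell = state.board[action.row][action.col];
  if (action.kind === "place") {
    if (cell.mine || cell.revealed) return state;         // invalid placement
    cell.mine = true;                                     // mutates in place for brevity
    cell.placedBy = player;
  } else {
    if (cell.revealed) return state;
    cell.revealed = true;
    if (cell.mine) {
      // One player's mistake ends the game for everyone.
      return { ...state, status: "lost", loser: player };
    }
  }
  // Advance the shared turn pointer.
  return { ...state, turn: (state.turn + 1) % state.players.length };
}
```

Because every action flows through one reducer, each move changes the information available to everyone else, which is exactly what made the design tense at the whiteboard stage.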

The Early Technical Decisions that Jumpstarted Development

Before writing code, the team made two decisions that shaped the rest of the day. The first was choosing a stack that supported real-time behavior without heavy setup. The second was creating a lightweight planning artifact to keep decisions coherent as the pace accelerated.

For the frontend, the team relied on familiar tools. Under tight deadlines, familiarity reduces friction; there is no time to relearn basics. Hank and Jonathan used a familiar React-based setup with modern tooling that allowed fast iteration, predictable builds, and painless deployment. The application was bundled with Vite, which provided rapid hot reloading during development and a straightforward production build pipeline. Styling was kept deliberately lightweight. Rather than introducing a heavy component library, the UI was composed with simple layout primitives and minimal custom styles. This kept visual complexity low and made it easier to reason about state changes during debugging, especially once multiple players and spectators were interacting with the board simultaneously.

The backend decision mattered more. To support real-time multiplayer gameplay, the system needed shared state and live updates. Building that infrastructure from scratch would have consumed the entire day. Instead, the team chose Supabase, which provided managed Postgres, real-time subscriptions, a CLI for schema management, and automatic type generation. It offered just enough structure to support the game without slowing iteration. That balance proved critical.
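To make the real-time piece concrete, here is a hedged sketch of how updates from Supabase might be consumed on the client. The `moves` table, row shape, and function names are assumptions for illustration, not the team's schema. The subscription wiring (shown as a comment) follows the supabase-js v2 channel API; the part worth keeping pure, and testable, is the reducer that folds an incoming row into local state.

```typescript
// With supabase-js v2, the wiring would look roughly like:
//
//   supabase
//     .channel("moves")
//     .on("postgres_changes",
//         { event: "INSERT", schema: "public", table: "moves" },
//         (payload) => setState((s) => applyMove(s, payload.new as MoveRow)))
//     .subscribe();
//
// The reducer itself stays pure:

type MoveRow = { game_id: string; player: string; row: number; col: number; seq: number };

type BoardState = { revealed: Set<string>; lastSeq: number };

function applyMove(state: BoardState, move: MoveRow): BoardState {
  // Realtime events can arrive late or twice; a monotonically increasing
  // sequence number makes the reducer idempotent.
  if (move.seq <= state.lastSeq) return state;
  const revealed = new Set(state.revealed);
  revealed.add(`${move.row},${move.col}`);
  return { revealed, lastSeq: move.seq };
}
```

Keeping the state transition separate from the subscription callback is also what makes this kind of logic debuggable across the multiple browser windows described later.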

Planning as a Living Artifact

Rather than starting in the editor, the team created a planning document using AI-assisted planning tools. The document outlined gameplay mechanics, turn structure, schema assumptions, and deployment steps. It was intentionally incomplete. The goal was direction, not detail. As the day progressed, the document evolved. Decisions were revised. Constraints were clarified. Assumptions were corrected. When AI context windows filled up, the document became a stable reference point. It allowed the team to rehydrate intent quickly and avoid drifting into contradictory implementations. That discipline paid off later in the day, when the team was racing the submission deadline.

AI as an Accelerator, Not an Authority

AI tools, specifically the Cursor IDE, were used heavily throughout the day, but the team never handed the project over to the AI wholesale. The tools were effective at scaffolding components, generating boilerplate, and drafting basic logic. They were less effective at reasoning about multiplayer behavior, race conditions, and edge cases. The team compensated with a tight feedback loop: generate code, deploy it, test it in the browser, and refine the user experience based on what they actually saw. This loop repeated dozens of times and accelerated their progress.

When the Game "Worked" but Still Wasn't Right

By late morning, the game technically functioned. The board rendered. Turns advanced. Mines exploded. Wins and losses triggered. It was also deeply flawed. Players could win by detonating someone else's mine. Flags leaked information they shouldn't. Turns desynchronized under certain conditions. State updates behaved correctly in isolation but failed under concurrency. These were not syntax errors. They were behavioral failures. Fixing them required careful reasoning about what information should be shared, what should remain hidden, and how spectators differ from players. These decisions could not be outsourced to Cursor. They required real developer insight.
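One way to think about those fixes is as a projection problem: each viewer should receive a filtered view of the server-side state, never the raw board. The sketch below shows one possible information-hiding policy in TypeScript; the team's actual rules may differ, and every name here is illustrative.

```typescript
type ServerCell = { mine: boolean; revealed: boolean; placedBy?: string };
type ClientCell = { revealed: boolean; mine?: boolean };
type Viewer = { id: string; role: "player" | "spectator" };

function projectCell(cell: ServerCell, viewer: Viewer): ClientCell {
  if (cell.revealed) {
    // Revealed cells are public, including whether they held a mine.
    return { revealed: true, mine: cell.mine };
  }
  if (viewer.role === "player" && cell.placedBy === viewer.id) {
    // A player remembers the mines they placed themselves.
    return { revealed: false, mine: true };
  }
  // Everyone else, spectators included, sees an unrevealed cell as opaque.
  return { revealed: false };
}
```

The key design choice is that hiding happens at projection time, not in the client: if the full board never leaves the server, a leaky flag or a curious spectator opening DevTools cannot reveal information they shouldn't have.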

Debugging Multiplayer Reality

Multiplayer bugs rarely appear in isolation; they emerge from the interaction between players, so testing locally was insufficient. Many issues only surfaced when multiple clients interacted with the deployed system. The team relied on multiple browser windows, incognito sessions, and live deployments to reproduce failures. At times, debugging involved tracking four simultaneous views of the game to understand how state propagated.

The Decision That Saved the Project

Roughly an hour before the hackathon ended, the team enforced a hard cutoff: no new features, no schema changes, and no experiments. From that point forward, the focus shifted to stabilizing a polished build they could present. That decision prevented poorly tested features from causing cascading failures late in the day. With the core stable, the team added spectator mode, chat, and a dramatic loss animation, features that significantly improved engagement without overloading the system.

The Result

By the end of the day, the project was fully playable. Players could create games, others could join, and spectators could watch and chat. State updates propagated in real time. The demo invited participation rather than passive observation: during final presentations, people joined games instead of just watching them, and the audience loved it. The project was awarded first place, with the judges remarking on how the live spectator and chat features elevated the simple concept of Minesweeper into an engaging multiplayer experience that was fun to both play and watch.

What This Hackathon Revealed

This project illustrates a shift underway in modern software development. With raw coding speed no longer the constraint thanks to AI code assistants, what is the bottleneck for software development? This project suggests the answer is still discipline, planning, and organization. AI tools make it easy to generate code quickly, but they do not guarantee coherence, correctness, or resilience. Teams that combine AI acceleration with proper planning, validation, and their own expertise will outperform those that rely on pure AI automation.
Josh Proto
Cloud Strategist

Josh is a Cloud Strategist passionate about helping engineers and business leaders navigate how emerging technologies like AI can be skillfully used in their organizations. In his free time you'll find him rescuing pigeons with his non-profit or singing Hindustani & Nepali Classical Music.
