A modern deployment platform for developers

Node.js · AWS

About the Project

From Idea to MVP: Building a One-Click Next.js Hosting Platform

About a month ago, I had a crazy idea. What if you could take any public Next.js GitHub repo and deploy it with a single click—no login, no config, no dashboard—and get a temporary deployment link valid for 30 minutes? Something like Vercel, but simpler, faster to test ideas, and way more lightweight.

🌱 The Spark

Vercel is incredible. They built Next.js and a great developer experience around it. But sometimes I just want a dead-simple way to preview a repo—no accounts, no UI, just fire-and-forget deployment. Especially useful for:

  • Testing PRs or forks
  • Sending a demo to a friend/client
  • Quickly sharing internal tools

The goal? Build a throwaway deployment platform optimized for speed and simplicity. Temporary hosting, no clutter.

🧱 The Architecture (MVP)

The system breaks into three core services:

  • 🔼 Upload Service
  • 🔧 Build & Deploy Service
  • 🌐 Runtime Hosting (Lambda)

1. Upload Service

This is the user entry point. It accepts a GitHub repo URL and pushes a build job onto an AWS SQS queue. That's it: super lean.

Why decoupled? Because I want the frontend or any other tool to use it as a webhook-style API. It just queues jobs.
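Here's a minimal sketch of that entry point, assuming an Express server and AWS SDK v3 (the route name, queue URL, and response shape are my placeholders, not a real API):

```js
const express = require("express");
const { randomUUID } = require("crypto");
const { SQSClient, SendMessageCommand } = require("@aws-sdk/client-sqs");

const app = express();
app.use(express.json());

const sqs = new SQSClient({ region: "us-east-1" });
const QUEUE_URL = process.env.BUILD_QUEUE_URL; // placeholder

// Accept a public GitHub repo URL and enqueue a build job
app.post("/deploy", async (req, res) => {
  const { repoUrl } = req.body ?? {};
  if (!repoUrl || !/^https:\/\/github\.com\/[\w.-]+\/[\w.-]+/.test(repoUrl)) {
    return res.status(400).json({ error: "A public GitHub repo URL is required" });
  }

  const deploymentId = randomUUID();
  await sqs.send(new SendMessageCommand({
    QueueUrl: QUEUE_URL,
    MessageBody: JSON.stringify({ deploymentId, repoUrl }),
  }));

  // The caller gets an id it can use to construct the temporary preview link
  res.json({ deploymentId });
});

app.listen(3000);
```

Keeping this service down to "validate and enqueue" is what makes it usable from anywhere, curl included.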

2. Build & Deploy Service

This is the engine of the platform. It polls SQS for jobs, clones the GitHub repo, runs npm install + next build, creates a deployment bundle, uploads static and public files to S3, and deploys a Lambda function for SSR if needed. A rough sketch of the worker loop follows.
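This is only an outline under my own assumptions: uploadArtifacts and deploySsrLambda are hypothetical helpers standing in for the S3 and Lambda steps, and error handling is trimmed.

```js
const { SQSClient, ReceiveMessageCommand, DeleteMessageCommand } = require("@aws-sdk/client-sqs");
const { execSync } = require("child_process");
const path = require("path");
const os = require("os");
const fs = require("fs");

const sqs = new SQSClient({ region: "us-east-1" });
const QUEUE_URL = process.env.BUILD_QUEUE_URL; // placeholder

async function pollForever() {
  for (;;) {
    const { Messages } = await sqs.send(new ReceiveMessageCommand({
      QueueUrl: QUEUE_URL,
      MaxNumberOfMessages: 1,
      WaitTimeSeconds: 20, // long polling keeps the idle loop cheap
    }));

    for (const msg of Messages ?? []) {
      // repoUrl was validated by the upload service before being queued
      const { deploymentId, repoUrl } = JSON.parse(msg.Body);
      const workDir = fs.mkdtempSync(path.join(os.tmpdir(), deploymentId));

      // Clone and build in an isolated temp directory
      execSync(`git clone --depth 1 ${repoUrl} .`, { cwd: workDir, stdio: "inherit" });
      execSync("npm install", { cwd: workDir, stdio: "inherit" });
      execSync("npx next build", { cwd: workDir, stdio: "inherit" });

      // Hypothetical helpers for the S3 upload and Lambda deployment steps:
      // await uploadArtifacts(deploymentId, workDir);
      // await deploySsrLambda(deploymentId, workDir);

      await sqs.send(new DeleteMessageCommand({
        QueueUrl: QUEUE_URL,
        ReceiptHandle: msg.ReceiptHandle,
      }));
    }
  }
}

pollForever().catch(console.error);
```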

Challenges:

  • Handling SSR required setting output: 'standalone' in next.config.js so the build emits a self-contained server bundle (config below)
  • Zipping artifacts before upload cut down the number of S3 PUT requests
  • Each file needed the correct MIME type as its Content-Type for static hosting to work (see the upload sketch after this list)
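For reference, the standalone bundle is a one-line switch in next.config.js:

```js
// next.config.js
module.exports = {
  output: "standalone", // emits a self-contained server bundle under .next/standalone
};
```

And the MIME fix is a per-file Content-Type lookup during upload. A sketch, assuming the mime-types package and a placeholder bucket/key layout:

```js
const fs = require("fs");
const mime = require("mime-types");
const { S3Client, PutObjectCommand } = require("@aws-sdk/client-s3");

const s3 = new S3Client({ region: "us-east-1" });

// Upload one static file with the Content-Type browsers expect
async function uploadStaticFile(deploymentId, localPath, relativeKey) {
  await s3.send(new PutObjectCommand({
    Bucket: "my-deploy-artifacts",         // placeholder bucket
    Key: `${deploymentId}/${relativeKey}`, // e.g. abc123/_next/static/chunks/main.js
    Body: fs.createReadStream(localPath),
    ContentType: mime.lookup(localPath) || "application/octet-stream",
  }));
}
```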

3. Runtime Hosting (Lambda + S3)

I deploy one Lambda per deployment for SSR support. The Lambda reads files from S3, and a Function URL serves the deployment. Deployment links expire after 30 minutes, and S3 lifecycle rules clean up the stored artifacts.
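To make that concrete, here's a minimal sketch of a per-deployment handler behind a Function URL. The bucket, environment variables, and key layout are my assumptions, and a real handler would forward non-static requests to the standalone Next.js server instead of returning 404:

```js
const { S3Client, GetObjectCommand } = require("@aws-sdk/client-s3");
const mime = require("mime-types");

const s3 = new S3Client({});
const BUCKET = process.env.DEPLOY_BUCKET;        // placeholder
const DEPLOYMENT_ID = process.env.DEPLOYMENT_ID; // baked in when the Lambda is created

exports.handler = async (event) => {
  // Function URLs use the HTTP API v2 event shape, so the path is in rawPath
  const rawPath = event.rawPath === "/" ? "/index.html" : event.rawPath;

  try {
    // Serve static assets straight out of S3
    const obj = await s3.send(new GetObjectCommand({
      Bucket: BUCKET,
      Key: `${DEPLOYMENT_ID}${rawPath}`,
    }));
    const bytes = await obj.Body.transformToByteArray();
    return {
      statusCode: 200,
      headers: { "content-type": mime.lookup(rawPath) || "application/octet-stream" },
      body: Buffer.from(bytes).toString("base64"),
      isBase64Encoded: true,
    };
  } catch {
    // Not a static file: the real handler hands these to the
    // standalone Next.js server for SSR instead of 404ing
    return { statusCode: 404, body: "Not found" };
  }
};
```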

💰 Costs & Optimizations

In early testing, I hit the S3 free tier limit. Here's how I optimized:

  • Zip + upload static & public files to reduce PUT requests
  • Use S3 lifecycle rules to auto-delete expired deployments (with one caveat, noted below)
  • Explore caching with CloudFront
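The caveat on lifecycle rules: S3 lifecycle expiration works at day granularity, so a rule can delete artifacts after one day at the earliest, and the 30-minute link cutoff has to be enforced elsewhere (for example, by the Lambda checking the deployment's age). A sketch of the rule itself, with a placeholder bucket:

```js
const { S3Client, PutBucketLifecycleConfigurationCommand } = require("@aws-sdk/client-s3");

const s3 = new S3Client({ region: "us-east-1" });

async function configureExpiry() {
  await s3.send(new PutBucketLifecycleConfigurationCommand({
    Bucket: "my-deploy-artifacts", // placeholder bucket
    LifecycleConfiguration: {
      Rules: [{
        ID: "expire-deployments",
        Status: "Enabled",
        Filter: { Prefix: "" },  // apply to every deployment prefix
        Expiration: { Days: 1 }, // Days is the minimum unit S3 accepts
      }],
    },
  }));
}

configureExpiry().catch(console.error);
```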

With these, I expect to stay within ~$5–$10/month even at 10k+ deploys/day.

🛣️ What's Next?

  • Add SES notifications when a build is ready
  • Make the Lambda smarter (support middleware, rewrites)
  • Add a simple web UI (copy-paste repo URL and go)

And maybe... open it up to the world.