Hardware, software, and workflow to create world-class content for The Light Park year-round — with full AI integration and maximum cost efficiency over time.
The warehouse is open. The crew is back. And the content opportunity is happening right now — not in October. The 2026 season story begins in the warehouse, months before a single car rolls through the gate.
DJ Polar Ice narrates a 4-episode behind-the-scenes series from the warehouse — crew prepping gear, building the DJ booth, testing light sequences, and giving the final walkthrough before trucks roll. This content builds the Jingle Jam audience before the season opens and gives Gnomies a reason to follow along all year. It can't be made without a dedicated machine to edit, render, and produce it.
The Light Park is building a brand that extends well beyond October–January. Jingle Jam, the DJ battle concept, Gnomie Light, and the 2027 Finale expansion all require consistent, high-quality visual content year-round. Right now every piece of content requires cloud AI tokens, external tools, and workarounds. This proposal builds a permanent content production capability — faster, cheaper, and fully in-house.
Generate AI video clips in Kling/Runway (cloud). Edit, composite, score, and finalize everything locally on a dedicated Mac Studio. Over time, move more and more generation work local too. Token spend drops every year as local capability grows.
Every Light Park venue is surveyed to centimeter accuracy using Emlid GNSS receivers and exported as georeferenced DXF — the industry standard for precision site data. Our CAD team imports that survey data into SketchUp, where the entire show is modeled in exact real-world coordinates: every prop, every lane, every display, every arch, positioned precisely where it will physically exist on opening night.
That georeferenced 3D model is then imported into Twinmotion — an Unreal Engine-powered photorealistic rendering platform — where the full show experience is built out: night atmosphere, synchronized LED lighting, moving vehicles, fog effects, and the DJ booth in full operation. The result is a photorealistic render of a complete show that doesn't yet exist.
Aerial drone footage of the real venue — captured with the Antigravity A1 360° drone — is composited against the render using matching GPS coordinates. The virtual show sits precisely on the real lot, to the centimeter, before a single prop has been installed.
This is pre-visualization at the level of major theme park and live event production — built entirely in-house, at a fraction of the cost.
RTK GPS survey of every venue. Exported as georeferenced DXF. The foundation every other tool builds on.
Every prop, lane, display, and structure modeled in exact real-world position. Reusable year over year.
Unreal Engine-powered photorealistic rendering. Imports SketchUp directly. Generates night lighting, atmosphere, vehicles, and the full show experience before it's built.
Drone footage of the real venue with GPS metadata. Composited against the Twinmotion render using matching coordinates. Virtual show, real location, perfect alignment.
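The coordinate-matching step above can be sketched in code. This is a minimal illustration, not the production pipeline: it converts WGS84 GPS fixes (the kind found in drone metadata and RTK survey exports) into a shared local East-North-Up frame, which is what lets footage and render align on the same lot. All coordinates below are hypothetical placeholders.

```python
import math

# WGS84 ellipsoid constants
A = 6378137.0              # semi-major axis (m)
F = 1 / 298.257223563      # flattening
E2 = F * (2 - F)           # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h):
    """Convert WGS84 geodetic coordinates to Earth-centered XYZ (meters)."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = A / math.sqrt(1 - E2 * math.sin(lat) ** 2)
    x = (n + h) * math.cos(lat) * math.cos(lon)
    y = (n + h) * math.cos(lat) * math.sin(lon)
    z = (n * (1 - E2) + h) * math.sin(lat)
    return x, y, z

def ecef_to_enu(point, origin):
    """Express an ECEF point in a local East-North-Up frame at `origin` (lat, lon, h)."""
    lat, lon = math.radians(origin[0]), math.radians(origin[1])
    ox, oy, oz = geodetic_to_ecef(*origin)
    dx, dy, dz = point[0] - ox, point[1] - oy, point[2] - oz
    east  = -math.sin(lon) * dx + math.cos(lon) * dy
    north = (-math.sin(lat) * math.cos(lon) * dx
             - math.sin(lat) * math.sin(lon) * dy
             + math.cos(lat) * dz)
    up    = (math.cos(lat) * math.cos(lon) * dx
             + math.cos(lat) * math.sin(lon) * dy
             + math.sin(lat) * dz)
    return east, north, up

# Hypothetical example: a drone GPS fix expressed relative to a surveyed base point
base  = (29.7604, -95.3698, 12.0)   # made-up venue origin (lat, lon, height m)
drone = (29.7605, -95.3697, 42.0)   # made-up drone position
e, n, u = ecef_to_enu(geodetic_to_ecef(*drone), base)
print(f"east={e:.2f} m, north={n:.2f} m, up={u:.2f} m")
```

Once both the survey DXF and the drone footage live in the same local frame, the Twinmotion camera can be placed at the drone's position and the two layers composited with no manual eyeballing.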
A dedicated Mac Studio for content production — separate from Bob's personal machine. Handles local AI generation, video editing, rendering, and audio work without impacting daily operations.
192GB unified memory. The most powerful Mac ever made for this kind of work. Runs the largest open-source AI models locally, full 8K video rendering, DaVinci Resolve, ComfyUI, and audio processing simultaneously — without breaking a sweat. No ceiling. Built for years of growth.
Replacing the two existing wall-mounted monitors with matched LG ultrawides to complete a consistent four-monitor setup. VESA-compatible for existing wall mounts.
34" curved ultrawide (3440×1440). VESA 100×100 wall mount compatible. USB-C 96W charging. Matches existing LG monitors on desk. No smart TV features — no remote needed, wakes/sleeps with the Mac automatically.
Hollywood-grade video editor used on major films. Built-in AI tools: Magic Mask, Speed Warp, voice isolation, auto color grading. Best AI feature set of any editor. No subscription — pay once, own forever.
Premiere Pro, After Effects, Photoshop, Illustrator, Firefly AI. Essential for social media assets, character art, motion graphics, and marketing materials. Full suite — everything in one login.
Runs locally on the Mac Studio. Image generation (Flux, SD), video generation (Wan 2.1), audio processing — all without cloud tokens. Gets more powerful as models improve. Free forever.
Kling Pro at $25.99/month — 150 AI video generations. Veo for highest-quality cinematic clips. Used for new character videos, promos, and content that needs cloud-scale compute. Token cost declines every year as local capacity grows.
| Item | Type | Cost | Notes |
|---|---|---|---|
| Mac Studio M4 Ultra (192GB, 2TB SSD) | One-time | $4,399 | Dedicated content production machine — maximum AI capability, no ceiling |
| 2× LG 34" Ultrawide (34WP85C-B) | One-time | $1,100 | VESA wall-mount, matches existing setup |
| Antigravity A1 Infinity Bundle + Propeller Guards | One-time | $1,623 | 8K 360° drone. Films empty venue lots during the day — AI builds the full light show over the footage as a time-lapse. Nine venues, nine reveals. Also: warehouse fly-throughs, aerial build coverage, in-show footage. 20% off through April 16. FAA registration required ($5). |
| 2× GoPro HERO 13 Black | One-time | $800 | Build time-lapses, multi-angle coverage, weatherproof, 4K/8× slo-mo |
| DaVinci Resolve Studio | One-time | $295 | Own forever, no subscription |
| One-Time Hardware + Software | One-time | $8,217 | |
| Adobe Creative Cloud | Annual | $600/yr | Full suite — Photoshop, Premiere, After Effects, Illustrator |
| AI Tokens (Minimal Use) | Monthly | $124/mo · $1,488/yr | Cloud AI for highest-quality generations requiring cutting-edge models. Majority of work runs locally — tokens reserved for premium output. |
| Annual Operating Cost | Annual | ~$2,576/yr | Adobe $600 + AI tokens $1,488 + ElevenLabs $132 + Suno $96 = $2,316 · rounded up for misc |
| Year 1 Total | | ~$10,793 | Year 2+ drops to ~$1,140/yr |
As local AI capability grows, cloud token spend drops. The Mac Studio pays for itself within 2-3 years just in avoided cloud costs — before factoring in the value of content produced.
Year 1: mostly cloud AI for video, local for images and editing. Year 2: local video generation (Wan 2.1 or a successor) handles 50%+ of clips. Year 3: cloud AI reserved for truly cutting-edge generations. Token spend drops 60-70% from Year 1 to Year 3 while output quality and volume increase.
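As a sanity check, the totals in the cost table can be reproduced directly. A minimal sketch using the table's own line items; the 30% Year 3 token share is an illustrative assumption consistent with the 60-70% reduction estimate above, not a committed figure.

```python
# Line items from the cost tables above.
ONE_TIME = 4_399 + 1_100 + 1_623 + 800 + 295      # Mac Studio, monitors, drone, GoPros, Resolve
ADOBE, ELEVENLABS, SUNO, TOKENS_Y1 = 600, 132, 96, 1_488
# The table's "rounded up for misc" buffer: $2,576 minus the itemized $2,316.
MISC = 2_576 - (ADOBE + ELEVENLABS + SUNO + TOKENS_Y1)

assert ONE_TIME == 8_217                          # matches the one-time subtotal

def operating(token_share):
    """Annual operating cost at a given fraction of Year 1 cloud-token spend."""
    return ADOBE + ELEVENLABS + SUNO + MISC + TOKENS_Y1 * token_share

year1_total = ONE_TIME + operating(1.0)
print(f"Year 1 total: ${year1_total:,.0f}")       # ~$10,793, as in the table
print(f"Year 3 at 30% tokens: ${operating(0.30):,.0f}/yr")
```

The same function makes it easy to test other scenarios, e.g. how far token spend must fall for operating cost to reach a given target.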
| Content Type | Platform | Cadence |
|---|---|---|
| Jingle Jam character videos | TikTok, Instagram, YouTube | Weekly during season |
| DJ battle promo clips | TikTok, Instagram | Weekly pre-season |
| Gnomie Light product videos | TikTok, website | Pre-launch + ongoing |
| Show recap / highlight reels | All platforms | Weekly during season |
| Behind-the-scenes content | TikTok, Instagram | Weekly during build |
| Character introduction videos | All platforms | Pre-season rollout |
| Music videos (Jingle Jam anthem) | YouTube, TikTok | Season launch |
| Venue-specific clips (9 shows) | Local targeting | Per venue opening |
| Year 2 teaser / Finale reveal | All platforms | End of 2026 season |