Puget Systems founder Jon Bach sees a fundamental shift in where creative compute happens. Workstations aren't disappearing — they're moving off desks and into server rooms, giving teams full hardware performance without being physically tethered to a tower.
At NAB 2026, Bach outlined the trends reshaping creative hardware: on-prem rack workstations, memory shortages driven by AI demand, and the economics of when to move workloads off the cloud and onto your own hardware.
Workstations Without the Desk
One of Puget's fastest-growing segments is the rack workstation — the same high-performance hardware, but mounted in a data center or server room instead of sitting on someone's desk. Teams remote into their dedicated hardware from home, a coffee shop, or another office.
"You're not performance compromised. You're not sharing hardware with anyone else. It's still your hardware, full performance, it's just not at your desk," Bach said. The approach also lets teams push hardware harder: rack-mounted systems can run their fans at full speed in a dedicated space, which matters now that high-end GPUs can draw 600 watts.
Rack workstations serve as a stepping stone between local desktop computing and full cloud virtualization. Teams get the familiarity and predictability of owning their hardware while gaining the flexibility of remote access.
The Cloud-to-Local Economics
Bach's framework for deciding between cloud and local hardware is straightforward: start in the cloud, figure out what you actually need, and move to owned hardware once the math works.
"The cloud is a fantastic place to get started and it's a fantastic place to do the big stuff that you need really big hardware to do. But it also has performance costs, it has dollar costs, it has speed costs of latency," Bach said.
The tipping point is typically around a nine-month payback period. Once a team has dialed in their workflow in the cloud and can predict their compute needs, buying equivalent hardware often pays for itself in under a year. The challenge is knowing what to buy — which is exactly what the cloud testing phase answers.
"You really need to know what your needs are and what your workflow is before then. And the cloud is great for that. And the cloud will tell you when it's time to move because you'll find your configuration, you'll find what you want to do and you'll scale it up and then you'll get the bill," Bach said.
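Bach's nine-month rule of thumb is easy to sanity-check with a back-of-envelope calculation. The sketch below is a minimal illustration; the function name and all dollar figures are hypothetical assumptions for the example, not Puget Systems or cloud-vendor pricing.

```python
# Back-of-envelope payback estimate for moving a workload off the cloud.
# All figures used here are illustrative assumptions, not real pricing.

def payback_months(hardware_cost: float, monthly_cloud_cost: float,
                   monthly_ownership_cost: float = 0.0) -> float:
    """Months until owned hardware costs less than continued cloud spend.

    hardware_cost: up-front price of equivalent on-prem hardware.
    monthly_cloud_cost: steady-state cloud bill for the workload.
    monthly_ownership_cost: power, space, and support for the owned system.
    """
    monthly_savings = monthly_cloud_cost - monthly_ownership_cost
    if monthly_savings <= 0:
        return float("inf")  # owning never pays off at these rates
    return hardware_cost / monthly_savings

# Example: a $13,500 workstation replacing a $1,700/month cloud spend,
# with ~$200/month in power and upkeep, pays back in nine months.
months = payback_months(13_500, 1_700, 200)
print(round(months))  # → 9
```

The point of the exercise mirrors Bach's advice: the inputs (steady-state cloud bill, known configuration) are exactly what the cloud testing phase reveals, and only once they stabilize does the payback estimate mean anything.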
AI Demand and Memory Shortages
AI workloads are creating supply chain pressure that affects all hardware buyers, not just AI companies. Memory shortages are the most visible symptom, driven by the massive RAM requirements of large language models and AI training pipelines.
Bach noted that shortage situations trigger unpredictable buyer behavior — hoarding inventory, timing purchases differently, and accelerating orders out of fear. Those behavioral effects compound the actual supply constraint, making pricing and availability more volatile than the underlying supply numbers would suggest.
Small Teams, Big Results
Bach highlighted the collaboration with Corridor Digital on Corridor Key, their AI-powered green screen tool, as an example of where creative computing is heading. Corridor built a production-grade tool without being a software company, something Bach attributes to AI lowering the barrier between having an idea and shipping a product.
"We've talked with some Hollywood studios where in order to make a feature film, you have to become this business and you have to have thousands of people on payroll and you have to have an HR department and finance and all of this stuff. And AI helps make everyone more effective so that you can have a small passionate team and still have an end result that's similar to those larger teams," Bach said.
The pattern extends beyond VFX. Bach's view is that the compute continuum runs from phones to local workstations to cloud, and the right mix depends on the specific workflow, budget, and team size.


