Context
Ethereal Engine was created independently as a web-based 3D engine aimed at bringing multiplayer experiences with near-native performance and graphical quality to any platform compatible with open web standards like WebGL and WebXR. Like Unity or Unreal Engine, it provided a scene editor environment and the ability to build and distribute scenes to run anywhere.

After acquisition by Infinite Reality, Ethereal Engine expanded to become the core product offering of the company, which focused on using immersive technologies to empower small businesses with an on-ramp to the spatial web, much like Squarespace had simplified creating modern websites.
New Perspective
I joined the company after that acquisition, before public launch, to lead the design of the engine's editor and publishing experience. My years of experience with 3D creation tools and game engines gave me useful technical context and behavioral expectations, but for our target customers it was more important to align with familiar, everyday interactions and bring clarity to complex multidimensional operations than to match the conventions of other 3D tools.

The editor interface is roughly organized into quadrants, from top-left: the Viewport, Scene Hierarchy, Properties, and Assets Panel. Its growth to this point had been largely developer-driven; the design had never been considered from a top-down perspective or critiqued by anyone outside core engineering. My assessment was that it was packed with features, but inconsistent UI and unclear information architecture buried useful capabilities under leftover debugging tools.

As general release quickly approached, I scoped design improvements per quadrant to expedite implementation. Completing a smaller portion and distributing new features to beta testers early allowed us to gather feedback and test new ideas before committing to a full redesign.
Untangling Assets
Our first major undertaking was to address the tangled file system. Early testing revealed that creators could not tell the difference between core engine files, project files and user-imported assets. Much of what appeared in the file browser was irrelevant or dangerous to modify. To clarify ownership and reduce clutter, I merged the Files and Assets tabs into a single hierarchy. In this unified browser, “Assets” became a special class of file that could live either in the engine's library or in the project, while irrelevant files were hidden from view.
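One way to picture the unified browser is as a single entry model where "asset" is a flag on a file rather than a separate tab, and engine internals are simply never surfaced. The sketch below is illustrative only; the type and function names are my own, not the engine's actual API.

```typescript
// Hypothetical model for a unified file/asset browser: "Asset" is a
// special class of file that may live in the engine's library or in
// the project, and engine internals are hidden from creators.

type FileOrigin = "engine-library" | "project" | "user-import";

interface BrowserEntry {
  path: string;
  origin: FileOrigin;
  isAsset: boolean; // assets get previews and drag-to-scene behavior
  hidden: boolean;  // engine internals never shown in the browser
}

// Only surface entries that are safe and relevant for creators to touch.
function visibleEntries(entries: BrowserEntry[]): BrowserEntry[] {
  return entries.filter((e) => !e.hidden);
}

const entries: BrowserEntry[] = [
  { path: "/engine/shaders/pbr.glsl", origin: "engine-library", isAsset: false, hidden: true },
  { path: "/library/props/chair.glb", origin: "engine-library", isAsset: true, hidden: false },
  { path: "/project/logo.png", origin: "user-import", isAsset: true, hidden: false },
];

console.log(visibleEntries(entries).map((e) => e.path));
// engine internals are filtered out of the listing
```

Treating visibility as data rather than as separate tabs means one hierarchy can serve both library and project content without exposing anything dangerous to modify.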

To improve perceived performance, long folders were loaded progressively on scroll and paginated dynamically. The order of animation and presentation of UI elements was crucial to making this process feel clear and responsive, and I fully prototyped the system in usable code for implementation. See footnote [2] below for an interactive prototype.
Part of iR Studio's strategy to appeal to non-technical customers included a massive catalog of pre-built 3D assets—architectural elements, props, displays, visual effects—that could be assembled into a complete scene without importing any custom models. The new combined browser architecture clarified those assets' relationship to the Engine and to individual projects. A new Search tab provided a greatly expanded filtering and advanced search interface, including saved searches and custom tagging, which relieved some of the tedium of finding and keeping track of assets.
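Conceptually, a saved search is just a reusable bundle of search parameters applied to the same filter function as an ad-hoc query. A minimal sketch, with illustrative names and data of my own:

```typescript
// Sketch of catalog search: match a text query and require every
// user-applied tag. A saved search is a named preset of the same inputs.

interface CatalogAsset {
  name: string;
  tags: string[];
}

interface SavedSearch {
  label: string;
  query: string;
  tags: string[];
}

function search(catalog: CatalogAsset[], query: string, tags: string[]): CatalogAsset[] {
  return catalog.filter(
    (a) =>
      a.name.toLowerCase().includes(query.toLowerCase()) &&
      tags.every((t) => a.tags.includes(t)) // all tags must match
  );
}

const catalog: CatalogAsset[] = [
  { name: "Oak Display Shelf", tags: ["prop", "retail"] },
  { name: "Neon Sign", tags: ["prop", "effect"] },
];

// Re-running a saved search is just re-applying its stored parameters.
const retailProps: SavedSearch = { label: "Retail props", query: "", tags: ["retail"] };
console.log(search(catalog, retailProps.query, retailProps.tags).map((a) => a.name));
// only the retail-tagged asset matches
```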

Direct Manipulation
The initial Viewport interaction model centered on gizmos for translate/rotate/scale operations and arbitrary, non-standard keyboard shortcuts. Every non-geometric entity (lights, spawn points, portals, collisions) had a different visual representation, making selection unclear and unpredictable. Studying our target audience, we concluded they were far more likely to be familiar with lightly creative productivity tools like PowerPoint, Keynote, and Canva than with professional 3D modeling software. I rebuilt the Viewport's entity representations around simplified, standardized icons and a direct manipulation model that prioritized natural gestures: point and click to select objects, drag to move them freely, and hold modifier keys to access alternative modes. These changes made navigating and editing a 3D scene feel more like editing a slide deck than programming a game.
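The heart of that model is a tiny decision: a plain drag always translates, and modifier keys opt into the other transforms. The mapping below is a sketch; which keys map to which modes here is my illustrative choice, not the shipped bindings.

```typescript
// Sketch of the simplified drag interaction: translate by default,
// with modifier keys switching to the less common transforms.

type DragMode = "translate" | "rotate" | "scale";

interface PointerInput {
  altKey: boolean;
  shiftKey: boolean;
}

function dragMode(input: PointerInput): DragMode {
  if (input.altKey) return "rotate";   // hold Alt to rotate (illustrative binding)
  if (input.shiftKey) return "scale";  // hold Shift to scale (illustrative binding)
  return "translate";                  // default: drag moves the object freely
}

console.log(dragMode({ altKey: false, shiftKey: false })); // "translate"
```

Because the default gesture needs no mode switch at all, a first-time user can rearrange a scene the same way they would rearrange objects on a slide.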

Direct manipulation and larger, simpler elements at the origin point of nodes were also chosen for suitability on touch devices and, eventually, in a WebXR editor. While these design choices were well informed by research, they represented a major change to how the editor worked—and were potentially controversial for the Engine core team, who had only known the old way. My design process heavily emphasizes native prototyping; I use it to make design decisions, but it's also invaluable for aligning the team on big changes. See footnote [3] for a fully interactive prototype of the new Viewport navigation experience.
Pruning Components
iR Engine is built on an entity-component system (ECS) where functionality is added via components. In the Editor interface, each component on an entity adds a section to its Properties panel; some components also automatically add supporting components, which bring their own properties in turn. Up to this point, Properties had been assembled in an ad-hoc manner, exposing every possible option in whatever way was quickest for development—a kitchen sink of unvalidated text inputs and every possible form and alignment of button, checkbox, and pulldown menu.
Imposing order on the Properties panel required two parallel efforts: editing down the content, and systematizing its implementation.

To reduce the complexity of the Properties presented, we first combined components with their dependencies. A “video” component, for example, also required a “media” component to specify its file source—removing it would break the functionality, so who would ever do that intentionally? We merged them wherever possible, reducing the opportunity for unintentional breakage and strengthening the relationship between dependent components. Next, I audited the options within each component, simplifying or removing any with unclear purposes or that were only useful for debugging. If default values performed better than any adjustment, the option was removed.
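The dependency behavior described above amounts to a transitive closure over component requirements: adding a component pulls in everything it needs, so there is never a dangling half of a pair to delete. A minimal sketch, assuming a registry shape of my own invention (the "video"/"media" names follow the example in the text, not the engine's actual API):

```typescript
// Sketch of component dependencies: adding a component transitively
// adds everything it requires, so dependent pairs always travel together.

interface ComponentDef {
  name: string;
  requires: string[]; // dependencies auto-added alongside this component
}

const registry: Record<string, ComponentDef> = {
  video: { name: "video", requires: ["media"] }, // video needs a media source
  media: { name: "media", requires: [] },
};

function addComponent(entity: Set<string>, name: string): void {
  const def = registry[name];
  if (!def || entity.has(name)) return; // unknown or already present
  entity.add(name);
  for (const dep of def.requires) addComponent(entity, dep); // recurse into dependencies
}

const entity = new Set<string>();
addComponent(entity, "video");
console.log([...entity]); // "video" plus its "media" dependency
```

Presenting the merged pair as one Properties section is then a UI decision layered on top: the data model guarantees the dependency exists, and the panel simply renders both sets of options together.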

Disparate (or arbitrary) styles within and between Properties panels were reconciled into a single pattern library. Our design system was built to be shared among several product surfaces: the SaaS dashboard, web scene viewer, mobile apps—and the Editor, which had some special needs not served by the existing system. We needed a set of components that responded well to adjustable panel sizes and maximized information density while remaining clear and orderly. I led the design team to reconfigure the system around simpler components, variables and other tokens that could be branched to create more appropriate subsystems when needed. We also introduced Storybook to better include designers in the development process and build visual QA into code reviews.

The Properties panel automatically populates to reflect the active selection in the Viewport, but because the Assets panel can hold a simultaneous selection of its own, the interface as a whole had an unclear hierarchy of “active-ness.” To resolve this, I introduced an Inspector tab in the Properties panel, giving both selection types an equivalent reflection of state and removing modal behaviors.
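One way to think about the Inspector is as a single source of truth that both panels write into, with the most recent selection winning. The sketch below is my own framing of that idea; the shape and names are illustrative, not the engine's implementation.

```typescript
// Sketch of unified selection state: the Viewport and the Assets panel
// both record selections here, and the Inspector renders whichever
// was touched most recently.

type Selection =
  | { kind: "entity"; id: string }   // something selected in the Viewport
  | { kind: "asset"; path: string }  // something selected in the Assets panel
  | null;

class SelectionState {
  private current: Selection = null;

  selectEntity(id: string): void {
    this.current = { kind: "entity", id };
  }

  selectAsset(path: string): void {
    this.current = { kind: "asset", path };
  }

  // The Inspector tab reads from this single source of truth.
  inspectorTarget(): Selection {
    return this.current;
  }
}

const sel = new SelectionState();
sel.selectEntity("lamp-01");
sel.selectAsset("/library/props/chair.glb");
console.log(sel.inspectorTarget()); // the most recent selection wins
```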
Setup Assistant
Even with a refined editor, product leadership felt that the Editor concept was too intimidating for target customers. To lower the barrier to entry, the design team built a wizard-style onboarding flow that emphasized pre-built templates and a Shopify connection for populating e-commerce content. The Setup Assistant was presented as an equal option when creating new scenes and projects.

Qualitative user testing revealed clear limitations of the wizard flow that undercut its goal of simplifying onboarding: it went too far. Creators felt railroaded into prescriptive environments that didn't fit their branding, and the low visual quality of template assets, poor rendering performance, and unpredictable controls were off-putting. The majority of new users dropped off at the first step that showed template models in a 3D view.
iR Studio launched to general availability with some adjustments to the Assistant. Another team in the company had been building an LLM-powered interface, and some of the conceptual work I had partnered with them on was integrated as a contextual-understanding step to drive template suggestions and default values, meant to increase success rates. By launch, however, it was clear that an agentic approach to constructing scenes was far more accessible and appealing to potential customers, and I shifted focus to lead the development of that product. Part two of this story continues in the World Builder case study—see footnote [4] or find it in the Projects list!
