Designing user interfaces (UIs) is a critical step when launching products, building portfolios, or personalizing projects, yet end users without design expertise often struggle to articulate their intent and to trust their design choices. Existing example-based tools either promote broad exploration, which can overwhelm users and lead to design drift, or require adapting a single example, which risks design fixation.
We present UI Remix, an interactive system that supports mobile UI design through an example-driven workflow. Powered by a multimodal retrieval-augmented generation (MMRAG) model, UI Remix enables iterative search, selection, and adaptation of examples at both the global (whole-interface) and local (component) levels. To foster trust, it presents source transparency cues such as ratings, download counts, and developer information. In an empirical study with 24 end users, UI Remix improved participants’ ability to achieve their design goals, facilitated effective iteration, and encouraged exploration of alternative designs.
UI Remix supports three interaction modes (Chat, Search, Apply) through two core modules: a retriever and a generator. Users can iterate between global remix (adapting the whole interface) and local remix (refining a selected component), while transparency cues help them assess example credibility.
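To make the retrieve-then-remix loop concrete, the following minimal Python sketch shows how such a pipeline could be wired together. Everything here is an illustrative assumption rather than the authors' implementation: UIExample, Retriever, remix, and the toy embed function are hypothetical names, the embedding is a deterministic stand-in for a real multimodal encoder over text and screenshots, and the generator is any callable (a real system would use a multimodal LLM).

import hashlib
from dataclasses import dataclass
from typing import Callable, Optional

import numpy as np


@dataclass
class UIExample:
    """One retrievable example, carrying the transparency cues UI Remix surfaces."""
    screenshot_path: str
    spec: str          # declarative layout / component tree
    rating: float
    downloads: int
    developer: str


def embed(text: str, image_path: Optional[str] = None) -> np.ndarray:
    """Toy stand-in for a multimodal encoder: a unit vector seeded by the text.
    A real retriever would jointly encode the spec and the screenshot."""
    seed = int.from_bytes(hashlib.sha256(text.encode("utf-8")).digest()[:4], "big")
    v = np.random.default_rng(seed).standard_normal(64)
    return v / np.linalg.norm(v)


class Retriever:
    """Ranks corpus examples by cosine similarity to the user's query."""

    def __init__(self, corpus: list[UIExample]):
        self.corpus = corpus
        self.index = np.stack([embed(ex.spec, ex.screenshot_path) for ex in corpus])

    def search(self, query: str, k: int = 3) -> list[UIExample]:
        q = embed(query)
        scores = self.index @ q  # all vectors are unit-norm, so this is cosine similarity
        return [self.corpus[i] for i in np.argsort(-scores)[:k]]


def remix(llm: Callable[[str], str], current_ui: str, example: UIExample,
          scope: str, component: Optional[str] = None) -> str:
    """Global remix adapts the whole interface; local remix edits one component."""
    if scope == "global":
        instruction = f"Adapt the whole current UI toward this example:\n{example.spec}"
    else:
        instruction = (f"Modify only the '{component}' component of the current UI, "
                       f"borrowing from this example:\n{example.spec}")
    return llm(f"Current UI:\n{current_ui}\n\n{instruction}")


if __name__ == "__main__":
    corpus = [
        UIExample("cards.png", "grid of cards with bottom nav", 4.6, 12000, "acme"),
        UIExample("list.png", "plain list with search bar", 4.1, 3400, "indiedev"),
    ]

    def echo_llm(prompt: str) -> str:  # stand-in generator: echoes its prompt
        return prompt

    hit = Retriever(corpus).search("card-based home screen", k=1)[0]
    print(remix(echo_llm, "single-column list", hit, scope="global"))

One plausible mapping of the interaction modes onto this sketch: Search calls Retriever.search, Apply calls remix with scope="global" or scope="local", and Chat drives both through conversational turns with the generator.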
Demo video illustrating example retrieval, global remix, and local remix workflows.
@inproceedings{wang2026uiremix,
  author    = {Wang, Junling and Lan, Hongyi and Su, Xiaotian and Dogan, Mustafa Doga and Wang, April Yi},
  title     = {UI Remix: Supporting UI Design Through Interactive Example Retrieval and Remixing},
  booktitle = {Proceedings of the 31st International Conference on Intelligent User Interfaces (IUI '26)},
  year      = {2026},
  address   = {Paphos, Cyprus},
  doi       = {10.1145/3742413.3789154}
}