* Streamlining UI/UX design workflow with generative AI
AI Designer focuses on leveraging generative AI to streamline digital design workflows by addressing inefficiencies and roadblocks. The goal is to give designers more time for creative processes and critical thinking, leading to better design outcomes.
* AI Designer generating a series of screens (user flow) based on a written prompt and regenerating a selected design.
* Overview
Intro
I carried out this project from February to May 2023 as a key part of my Master’s thesis. It was an individual project, with no affiliation with Figma or Midjourney. As the sole contributor, I took on various roles, including UX Research, UX Design, Interaction Design, Prototyping, Usability Testing, and Visual Design, using Figma and Midjourney as my primary tools.
Background
In this project, I used the Midjourney AI model as a case study, focusing on UI design tasks, to explore the potential and current constraints of generative AI within a digital design workflow. I ran an experiment to test the model's understanding of design language: I created design tasks with varying description strategies and parameters and used them to generate images. The test began with a fundamental design request and gradually introduced specific design criteria, such as different layout configurations. Each output was then evaluated with a binary check for whether the requested features were present or absent. The experiment offered insight into how well the model handles different design parameters, and the results showed that the current model's understanding of the given tasks, and its ability to generate relevant design images, is limited. This finding prompted me to investigate how a proficiently trained AI could be seamlessly integrated into the UI/UX design process, and what the implications of such an integration might be.
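As a rough illustration of how such a binary evaluation can be tabulated, the sketch below shows one possible shape for the data. The criterion names and TypeScript types are assumptions made for this example only, not the actual thesis material.

```typescript
// Hypothetical sketch: tabulating a binary (present / absent) evaluation of
// generated images against the design criteria requested in each prompt.
// Criterion names are illustrative, not the actual test set.

type Criterion = "three-column layout" | "bottom navigation" | "card grid";

interface PromptResult {
  prompt: string;                      // the written design request sent to the model
  checks: Record<Criterion, boolean>;  // whether each requested feature appeared in the output
}

// Share of generated images in which each criterion was satisfied.
function successRates(results: PromptResult[]): Record<Criterion, number> {
  const rates = {} as Record<Criterion, number>;
  if (results.length === 0) return rates;
  for (const c of Object.keys(results[0].checks) as Criterion[]) {
    const hits = results.filter((r) => r.checks[c]).length;
    rates[c] = hits / results.length;
  }
  return rates;
}
```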
I led a workshop on a private Discord channel in which six mid-to-senior-level UI/UX and product designers wrote text prompts and used Midjourney to generate corresponding design images. I observed firsthand how the designers handled their tasks when integrating this tool into their design process, and which challenges emerged. Each designer generated at least ten images, amounting to 75 prompts in total. My goal was to understand designers' communication strategies when interacting with an AI model and to assess its impact on their work process in terms of efficiency, creativity, and speed, as well as any potential negative effects.
/ DESIGN LANGUAGE
Quantitative analysis of the 75 prompts revealed an emerging design language for interacting with AI, one that blends clear communication, UI element specification, problem-solution framing, and visual depiction. Because the AI requires visual concepts to be articulated in text, designers face a new communication challenge that may demand new skills. This language accelerates concept exploration and iteration; its success depends both on the AI's ability to translate textual prompts into usable design images and on the designer's skill in crafting those prompts.
/ USER INTERVIEWS
In the week following the workshop, I conducted two-part interviews with the participants. My main goal was to gain an understanding of the designers' current workflows, evaluate the impact and effectiveness of integrating the Midjourney AI Model into their processes, and identify potential areas for improvement or optimization in both their traditional practices and their interactions with the new AI tool. Interestingly, while most designers typically used ChatGPT early in their design process, their experience with Midjourney during the workshop shifted the focus more towards the visual and interaction design and prototyping phases. After the interviews, I analyzed and synthesized the data to identify user pain points and derive key insights.
/ KEY FINDINGS
Integrated AI Experience: Designers need and expect generative AI technology to blend seamlessly into their workspace, understand intricate design prompts, and efficiently support ideation, creation, and iterative refinement.
Workflow Efficiency: Designers need a streamlined workflow that simplifies the transition from low to high-fidelity designs. This means reducing the labor-intensive tasks associated with intricate detailing, ensuring precise alignment, and minimizing the need for numerous design iterations. Additionally, designers seek ways to reclaim valuable creative time by easily sourcing inspiration from various platforms.
Control over Iterative Design: Designers seek to maintain control over the iterative design process, even with the integration of AI technology. This entails the need for a user-friendly system that allows for straightforward iterations, modifications, and feedback. Also, designers must retain the ability to make granular adjustments without unintentionally affecting unrelated elements when making modifications through the AI system.
Usage of Existing Design Assets: Designers require the AI to effectively leverage their existing design libraries and design systems within their workspace. This capability ensures a streamlined design creation process, promoting consistency and efficiency throughout the design workflow.
Ideation
During ideation, the research insights shaped my design approach. Given Figma's wide use in the design community, I decided to integrate the generative AI technology as an intelligent design assistant within this platform. This integration leverages Figma's extensive user base and provides a seamless, enhanced design experience within a workspace that many designers are already comfortable with. I centered my design process on the use of generative AI in the visual design, interaction design, and prototyping phases, where it showed the most potential according to designer feedback and the workshop findings.
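To illustrate how such an assistant could live inside Figma, here is a minimal, hypothetical sketch of a plugin entry point. The prompt modal markup (__html__), the stand-in backend fakeGenerateUserFlow, and all layout values are assumptions for this sketch; only the Figma Plugin API calls themselves (figma.showUI, figma.ui.onmessage, figma.createFrame, figma.createText, figma.loadFontAsync) are standard, and this is not the prototype's actual implementation.

```typescript
// Hypothetical sketch of the AI assistant delivered as a Figma plugin.
// Requires the standard Figma plugin typings; `figma` and `__html__` are
// globals provided by the plugin runtime.

figma.showUI(__html__, { width: 360, height: 480 }); // movable prompt modal

figma.ui.onmessage = async (msg: { type: string; prompt: string }) => {
  if (msg.type !== "generate") return;

  // Stand-in for a call to a hypothetical generative backend that turns the
  // written prompt into a list of named screens (a user flow).
  const screens = await fakeGenerateUserFlow(msg.prompt);

  // Load the default font once so text nodes can be labeled.
  await figma.loadFontAsync({ family: "Inter", style: "Regular" });

  let x = 0;
  for (const screen of screens) {
    const frame = figma.createFrame();
    frame.name = screen.name;
    frame.resize(375, 812); // mobile-sized artboard
    frame.x = x;
    x += 375 + 80;          // lay the flow out left to right

    const label = figma.createText();
    label.characters = screen.name;
    frame.appendChild(label);
  }
};

// Placeholder for the generative step; a real integration would call a model.
async function fakeGenerateUserFlow(prompt: string): Promise<{ name: string }[]> {
  return ["Onboarding", "Home", "Details"].map((name) => ({ name: `${name} (${prompt})` }));
}
```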
task flows
I specified several design tasks based on user needs and created corresponding task flows to dig deeper into the user journey. These informed the AI-integrated functionalities, the design of the user interface, and the interaction patterns.
* Task flows showcasing potential user navigation paths in an AI-integrated design workspace.
wireframes
To determine the optimal integration of the AI Designer into Figma, I generated wireframes featuring various layouts and collected feedback from multiple designers. The preferred design introduces an interactive AI assistant in the form of a movable modal. Its flexibility and unobtrusiveness ensure that the AI Designer remains accessible without cluttering the user's workspace.
* The wireframes that depict different layout options for the AI designer in Figma.
prototypes
Final Designs
After several iterations informed by usability tests, the final version of the design tool was completed. The AI Designer prototype showcases a range of key functionalities intended to assist designers throughout their process. It can generate artifacts using existing styles, components, or libraries, and it can create a user flow from a written prompt. The tool can update existing design work using the same or a different prompt, and reference images can be used as visual prompts to inspire the style of new designs. Users can navigate a version log for each design and iterate on generated designs, with each iteration maintaining its own design and prompt log. Additionally, the AI Designer lets users create multiple variations of a component based on a written prompt, allowing for versatile design exploration.
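To make the versioning behavior concrete, the sketch below shows one possible data model for the per-design version and prompt log described above. All names and fields are illustrative assumptions, not the prototype's actual implementation.

```typescript
// Hypothetical sketch of a per-frame version log: each generated design keeps
// its own history of prompts and outcomes, and a duplicated frame inherits a
// copy of that history before continuing with its own new versions.

interface DesignVersion {
  id: string;
  prompt: string;              // the written prompt that produced this version
  referenceImageUrl?: string;  // optional visual prompt used for style inspiration
  createdAt: Date;
}

interface VersionLog {
  frameId: string;             // the design frame this log belongs to
  versions: DesignVersion[];   // ordered from oldest to newest
}

// Duplicating a design for further iteration: the copy retains the original's
// version log and then accumulates its own subsequent versions.
function duplicateWithHistory(log: VersionLog, newFrameId: string): VersionLog {
  return { frameId: newFrameId, versions: [...log.versions] };
}
```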
* Design Generation from Existing Sources: AI Designer utilizes local styles, components, or published libraries to generate designs
* User Flow Generation: AI Designer generating a series of screens (user flow) based on a written prompt.
* Regenerating Designs: AI Designer regenerating an outcome for an existing design using the same or a different prompt.
* Incorporating Visual Inspiration: AI Designer utilizing a reference image as a prompt to inspire the visual style of a generated design.
# Reference image by Anastasia Golovko on Dribbble, used for color palette inspiration in this project.
* Navigating Design Versions: Users can access the version log for a specific design frame and switch between different versions of a generated design.
* Design Iteration: Users can duplicate AI-generated designs for manual or AI-guided refinements, with each copy retaining the original's version log.
* Component Variation: AI Designer generating numerous component variants, like buttons, from a written prompt.