Midjourney

Imagining a desktop experience outside Discord

Overview

This case study outlines the steps I took to design a desktop view for Midjourney. As the sole designer, I was responsible for every aspect of the UX process, from user research to visual design. I'll walk you through the key steps of the journey, explaining my thought process and the decisions I made along the way.

UX research

UI design

Visual design

Highlight

Reimagining the Midjourney experience on desktop

Context

Midjourney is one of the leading AI tools for generating images from a prompt: you enter a prompt and get a corresponding image. But you have to do all of this from their Discord server, and that is where the challenge lies. Discord can be confusing to navigate, and generating a prompt through it can be fairly complex for the average user. I set out to simplify this process.

Midjourney on Discord

Comment from a Twitter user

I’m going to kindly disagree. Midjourney is mostly natural language just an unfortunate UX. We will always need advanced controls, whether that’s parameters in Midjourney or UI affordances.

Comment from a Twitter user

I’ve been subscribed to Midjourney and have tried both. At first I thought that Midjourney’s UI and UX (discord) was clunky, but I’ve grown to it. The results are…

What I had in mind going into this…

Here are the major points I was looking to get across.

  • Discord can be very confusing.

  • Users might find it hard to understand what is going on and how to go about the most critical task: generating an image.

  • There is no feedback mechanism in case users don’t get the result they are looking for. They should be asked whether they got the right result and offered ways to work around it.

  • There should be a more intuitive way to add parameters to a prompt. The current way will only work for a select few.

  • There should be an intuitive way to change your settings.

  • After an image is generated, the various options should be easier to understand. I had to take a tutorial to figure out how to use them.

What I did…

Here are the major issues I addressed with this design. My focus was on the key features of Midjourney.

Isolating the chats

Generating an image can be a bit confusing in Discord. You start by typing “/imagine” and your keyword, which is pretty straightforward. The major issue is that you then have to search through the chat to find the generated message, which is even more of a problem when lots of people are sending requests (which is, well, every time).

My solution

Give each chat its own place in the sidebar. To avoid congestion, you can create a new chat whenever you want to start a session unrelated to the current one.

Forget slash commands

Currently, to generate an image, you need to start with /imagine or /blend, followed by your prompt. This is pretty straightforward, but a redesign could remove the need for it altogether.

My solution

Just type in what you want and get a response. Switch from “Generate” (which you use to get your image) to “Blend” when you need to merge two images seamlessly. Then type in your prompt and you are good to go.

More intuitive settings

Currently, you access your settings by typing “/settings” in the chat. This is simple enough, but it could be better: more information should be presented to users, as some of these settings can be hard to understand.

My solution

Since your settings affect the next result, each chat comes with its own. Changing the settings only affects the current chat, and new chats inherit the last saved settings. More information on each setting is also provided.

A better way to customize your response

To change the way Midjourney generates images, users need to enter parameters such as --aspect and --fast, among a long list of others. This can be confusing, as users have to constantly refer to the docs to use this feature.

My solution

On your chat bar, you have an option to customize the generated images. It opens a modal where each parameter is only a click away and well explained.

When a parameter is active, it shows up at the top of your chat bar. Dismiss it by clicking the 'x' icon.

Generating images

Four images are generated every time you send a request.

What do these mean?

After Midjourney generates four images, you get these options: U1, U2, U3, U4, V1, V2, V3, V4, and rerun. Anyone using this for the first time will find it hard to understand what they mean.

My solution

Arrange the options after generating an image. They are written in full so anyone can understand them, and each command is placed under its corresponding image. If you still need to learn more, you can use the docs.

Don’t like what you are seeing?

Edit your prompt and ask Midjourney to run it back.

Even better: provide a feedback mechanism where users can tell Midjourney what they didn’t like about the generated image, then ask it to rerun with that feedback in mind. This can improve the satisfaction users get from the images.

Creating a similar image

The V option lets you create a similar-looking image from the initially generated one. Since users are looking for something similar, there is an opportunity to provide more value by asking whether the variation they got is closer to what they have in mind.

My solution

When users click "Is this similar to what you want?" they see the dialog below. This feedback can also help Midjourney going forward.

After upscaling an image…

After upscaling an image, you get the following options. Once again, most users need to read the docs or watch a tutorial to understand what these mean.

My solution

Merge similar commands and provide a clearer description of what they mean. Since most people will download an image right after upscaling it, the download button should be prominent. Liking an image saves it to their profile.

Varying an upscaled image

Your profile

Manage your account details

Dark theme

I also made a dark theme for this

Reflections

Looking back on this project

What went well?

  • Research! Even as a user of Midjourney, I had to do extensive research to deepen my understanding of the platform.

  • Setting up a well-defined style guide to help me as I design. This was an opportunity to test out Variables in Figma.

What could I have done better?

  • I could have gathered more user feedback to see what works and what doesn't.

  • Although this project was entirely about showing how the experience could be improved on desktop, designing a mobile view would greatly enhance the user experience.

First project

Mintlify—Enhancing the experience of a documentation platform.