I approach user experience work in five steps: User Research, Proposed Solutions Explorations, Wireframing, Prototyping, and Testing. It’s a process built on three basic tenets of UX: execution, evaluation, and iteration based on data-driven decisions.
I’m fairly structured in this approach, largely because big UX projects are like elephants: the only way to eat one is a bite at a time. I find it enormously helpful to understand the whole of the project for what it is and where it can be improved (i.e. where it’s weakest, so you can start nomming on those parts first).
User Research

I generally conduct user research for new products to see what kinds of problems might be solved with a new design. Metrics help determine where there’s room for improvement. Sometimes I’ll dig deeper and look at how the flow fits within the gestalt of the user journey, so I can understand the context in which the new product will live.
Proposed Solutions Explorations
Through discussion and design writing, I usually arrive at one or two scenarios that might solve the existing design problem. Weighing these against the gathered data, I validate with stakeholders whether we should test one of the approaches, make adjustments, or try something wholly new.
Wireframing

I wireframe after ensuring the business requirements are met. Wireframes are the bare bones of how the user flow, user interface, and user interactions should look and behave. Oftentimes this involves mapping out the user flow as well, so we can understand the interaction points.
Prototyping

After a quick prototype is built, we test the design to see if it works by putting it in the hands of the people who will use it. From there we iterate, evaluate the results, and apply the visual design.
A final prototype is built, the design files are cleaned up, and everything is handed off to development to build the product.
Testing

I like to monitor the success of the product in collaboration with the analytics team. We set KPIs at the outset of any project, before designing begins, to measure its success, but it’s in the testing phase that we really get to see if the design flies.
One final point: when our gains don’t match our KPIs, it’s expected that we will iterate on the existing designs to make incremental improvements. I typically work with the analytics team to flag issues early, but I also like to establish touchpoints for review after a design launches, to see what performs well and what can be improved.