Considerable experience conducting research to ascertain product/service issues, user needs, and improvement goals, and to track performance. The following are a few examples of these tasks, including the various tools utilized:

Analytics Review

Often one of the first tasks conducted is understanding, analytically, how a product or service is performing. This allows for:

  1. Learning: Reviewing the metrics helps with gaining an understanding of the app/system and how it’s being used.
  2. Gap Analysis: Comparing metrics of ‘actual’ user usage against ‘desired’ business usage goals and addressing the gaps.
  3. KPIs: Setting a baseline so the UX and business sides can later quantify, via Key Performance Indicators (KPIs), the Return On Investment (ROI) of improvements.
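The gap-analysis step above can be sketched in a few lines of Python. This is a minimal illustration, not a real pipeline: the metric names and values are invented purely for the example.

```python
# Illustrative sketch: comparing 'actual' usage metrics against
# 'desired' business targets to surface gaps. All metric names and
# numbers below are hypothetical, not from a real product.

actual = {"daily_active_users": 1200, "task_completion_rate": 0.68,
          "avg_session_minutes": 4.2}
desired = {"daily_active_users": 2000, "task_completion_rate": 0.85,
           "avg_session_minutes": 6.0}

def gap_report(actual, desired):
    """Return each metric's shortfall as a fraction of its target."""
    return {m: round((desired[m] - actual[m]) / desired[m], 2)
            for m in desired}

print(gap_report(actual, desired))
# → {'daily_active_users': 0.4, 'task_completion_rate': 0.2, 'avg_session_minutes': 0.3}
```

The fractional shortfalls then double as the KPI baseline: rerunning the same report after a release quantifies how much each gap has closed.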

Competitive Analysis

This tool allows goals to be ascertained and produced in both qualitative and quantitative terms. ‘Judging’ of the existing interface, as well as competitors’, typically occurs based on UX heuristics and experience as well as end-user feedback. The qualitative portion provides general and specific design goals, while the quantitative portion helps to define requirements.


Content Strategy

Determining the best content strategy usually involves a card-sorting exercise with the Product Owners, SMEs, and end users. Conduct this exercise with those individuals via:

  1. Composition: Asking which existing elements should remain, which should be added, and which removed
  2. Terminology: Asking whether the wording for the elements makes sense and, if not, what they recommend
  3. Mental Map: How would they group the items?
  4. Prioritization: How would they triage the elements from most to least important in supporting the business goals?
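The prioritization step lends itself to a simple tally: each participant ranks the elements from most (1) to least important, and the lowest average rank wins. A minimal sketch, with purely hypothetical element names and rankings:

```python
# Illustrative sketch: tallying a card-sort prioritization exercise.
# Each inner list is one participant's ranking, most to least
# important. Element names and orderings are invented for the example.
from collections import defaultdict

rankings = [
    ["search", "dashboard", "reports", "settings"],   # participant 1
    ["dashboard", "search", "reports", "settings"],   # participant 2
    ["search", "reports", "dashboard", "settings"],   # participant 3
]

def average_rank(rankings):
    """Order elements by their mean rank across participants."""
    totals = defaultdict(int)
    for order in rankings:
        for rank, item in enumerate(order, start=1):
            totals[item] += rank
    return sorted(totals, key=lambda item: totals[item] / len(rankings))

print(average_rank(rankings))
# → ['search', 'dashboard', 'reports', 'settings']
```

The same tally works for the Mental Map step by counting how often participants place two items in the same group.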

Cognitive Walkthrough / Observation / Interviews

From my early days working in the aerospace industry designing cockpits for helicopter aircrews (below), through to designing current-day education and financial applications, considerable effort is always put into understanding users’ needs and barriers to completing tasks.

Routinely conduct cognitive walkthroughs, observations, and interviews with end users before, during, and after development to ensure task completion rates improve …and users have a much more enjoyable experience!



Utilize personas to some extent during the process to put a face to the major roles, ensure their needs are met, and consider how a system would change for those roles.

That said, personas are only a guide rail; in more recent years, with information overload being a real concern, most of my conceptualization aims to address highly individualized personalization of content.

Usability Testing

Depending on a project’s timeframe, it’s beneficial to conduct usability testing multiple times during development AND after development to capture KPIs for further iterations. Below are examples of:

  1. Methodology: The use cases focused on, the demographics of participants, etc.
  2. Findings: A summary of user likes and dislikes, barriers to task completion, quotes, etc.
  3. Actions: A prioritized list of next steps based on importance and the feasibility of making the changes.
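Capturing the testing KPIs round over round is what makes the iteration measurable. A minimal sketch, with hypothetical task names and completion rates:

```python
# Illustrative sketch: recording task-completion KPIs across usability
# testing rounds so improvement between iterations can be quantified.
# Task names and rates are invented for the example.

rounds = {
    "round_1": {"open_account": 0.55, "transfer_funds": 0.40},
    "round_2": {"open_account": 0.80, "transfer_funds": 0.70},
}

def improvement(before, after):
    """Change in completion rate per task between two rounds."""
    return {task: round(after[task] - before[task], 2) for task in before}

print(improvement(rounds["round_1"], rounds["round_2"]))
# → {'open_account': 0.25, 'transfer_funds': 0.3}
```

Feeding these deltas back into the prioritized Actions list helps show which fixes actually moved the needle.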