How to Conduct a UX Audit
A UX audit reveals the hidden friction points that cost you conversions. This guide walks you through a systematic evaluation process that identifies what's broken, what's underperforming, and what to fix first.
Prerequisites
- Google Analytics or equivalent analytics platform with at least 90 days of data
- Heatmap and session recording tool (Hotjar, Microsoft Clarity, or FullStory)
- Accessibility testing tools (axe DevTools, WAVE, or Lighthouse)
- A screen reader for manual accessibility testing (VoiceOver or NVDA)
- Spreadsheet or project management tool for documenting and prioritizing findings
Preparing for the Audit
A UX audit without preparation is just browsing your own website with a critical eye. Proper preparation transforms it from a subjective exercise into an objective evaluation that produces actionable findings.
Start by defining audit scope and objectives. Are you auditing the entire site or specific conversion funnels? What metrics define success? A homepage audit might focus on bounce rate and scroll depth; a checkout audit might focus on cart abandonment rate and time-to-purchase. Narrow scope produces sharper insights than trying to audit everything at once.
Gather quantitative data before you start evaluating anything visually. Pull Google Analytics reports covering the last 90 days: page-level traffic, bounce rates, exit rates, conversion paths, and device breakdown. Identify your top 10 highest-traffic pages, top 10 highest-bounce-rate pages, and the primary conversion funnel steps. This data tells you where to focus. A usability problem on a page with 50 monthly visitors matters less than one on a page with 50,000.
Set up heatmap and session recording tools (Hotjar, Microsoft Clarity, or FullStory) at least two weeks before beginning the audit so you have behavioral data to reference. Watching 20-30 session recordings on your key pages reveals patterns that analytics alone can't: where users hesitate, what they click on that isn't clickable, where they scroll past important content, and where they rage-click out of frustration.
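The traffic-weighting idea above can be sketched in a few lines. This is an illustrative example, not a real analytics integration: the page list, column names, and the visits-times-bounce-rate "exposure" heuristic are all assumptions standing in for your own export and prioritization rule.

```python
# Sketch: rank pages by "exposure" (monthly visits x bounce rate) to pick
# audit focus areas. The sample data below is illustrative; in practice you
# would load rows from a Google Analytics export instead.

def rank_focus_pages(pages, top_n=10):
    """Sort pages by visits x bounce rate, highest exposure first."""
    return sorted(pages, key=lambda p: p["visits"] * p["bounce_rate"],
                  reverse=True)[:top_n]

pages = [
    {"url": "/", "visits": 50000, "bounce_rate": 0.42},
    {"url": "/pricing", "visits": 12000, "bounce_rate": 0.61},
    {"url": "/blog/old-post", "visits": 200, "bounce_rate": 0.85},
]

for p in rank_focus_pages(pages):
    print(f'{p["url"]}: exposure {p["visits"] * p["bounce_rate"]:.0f}')
```

Note how the low-traffic blog post sinks to the bottom despite its high bounce rate, which is exactly the point of weighting by traffic.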
Define Scope & Objectives
Choose specific pages or funnels to audit and define what success looks like with measurable metrics.
Gather Quantitative Data
Pull 90 days of analytics data including traffic, bounce rates, exit rates, and conversion paths for every page in scope.
Install Behavioral Tools
Set up heatmaps and session recordings at least two weeks before the audit to collect real user behavior data.
Identify Focus Areas
Prioritize high-traffic and high-bounce-rate pages. Fix problems where the most users encounter them.
Heuristic Evaluation
A heuristic evaluation assesses your interface against established usability principles. Jakob Nielsen's 10 usability heuristics, published in 1994 and still the industry standard, provide a systematic framework for identifying usability problems without requiring user testing.
Walk through each page in your audit scope and evaluate it against these core heuristics: visibility of system status (does the user know where they are and what's happening?), match between system and real world (does the interface use language and concepts familiar to users?), user control and freedom (can users easily undo mistakes or navigate back?), consistency and standards (do similar elements behave the same way throughout?), error prevention (does the interface prevent errors before they occur?), recognition rather than recall (is information visible when needed rather than requiring memory?), and aesthetic and minimalist design (is every element necessary?).
Score each heuristic on a severity scale: 0 (not a usability problem), 1 (cosmetic only), 2 (minor usability problem), 3 (major problem that needs fixing), 4 (usability catastrophe that must be fixed immediately). Document each finding with a screenshot, the violated heuristic, severity rating, affected page, and a recommended fix. This structured approach prevents the evaluation from devolving into a list of personal preferences. Having 2-3 evaluators conduct independent reviews and then comparing findings increases coverage significantly: Nielsen's research shows that a single evaluator catches only about 35% of usability problems, while 3-5 evaluators catch 60-75%.
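The diminishing-returns curve behind the multiple-evaluator recommendation can be made concrete with Nielsen's coverage model, where n independent evaluators find a 1 - (1 - L)^n fraction of problems and L is the single-evaluator detection rate. The 0.35 default below comes from the figure quoted above; published values vary by study, so treat the exact percentages as illustrative.

```python
# Sketch of the evaluator-coverage curve: with a single-evaluator detection
# rate L, n independent evaluators find 1 - (1 - L)**n of the problems.
# L = 0.35 matches the figure quoted in the text; real studies vary.

def coverage(evaluators, single_rate=0.35):
    """Fraction of usability problems found by n independent evaluators."""
    return 1 - (1 - single_rate) ** evaluators

for n in (1, 2, 3, 5):
    print(f"{n} evaluator(s): {coverage(n):.0%}")
```

The curve flattens quickly, which is why 3-5 evaluators is the usual recommendation rather than 10.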
Nielsen's 10 Heuristics
Evaluate against established principles: system status visibility, real-world match, user control, consistency, error prevention, recognition over recall, and minimalist design.
Severity Scoring
Rate each finding from 0 (not a usability problem) to 4 (usability catastrophe). This prioritizes fixes by impact rather than personal preference.
Structured Documentation
Document each finding with a screenshot, violated heuristic, severity score, affected page URL, and recommended fix.
Multiple Evaluators
A single evaluator catches only 35% of problems. Use 3-5 independent evaluators and compare findings for comprehensive coverage.
User Flow Analysis
User flow analysis traces the actual paths visitors take through your site and compares them to the paths you intended them to take. The gap between intended and actual behavior reveals where your design fails to guide users effectively.
Start with your primary conversion funnel. Map the ideal path from entry point to conversion: for example, homepage to service page to pricing to contact form submission. Then pull the actual path data from Google Analytics (Path Exploration in GA4, or Behavior Flow in Universal Analytics). You'll almost certainly find that real users take detours, backtrack, or exit at unexpected points. High exit rates at a specific funnel step indicate a UX problem at that step.
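Finding the worst funnel step is a simple calculation once you have per-step counts. A minimal sketch, assuming illustrative step names and counts in place of numbers pulled from your own analytics report:

```python
# Sketch: compute step-to-step drop-off from funnel counts to find where the
# biggest exit happens. Step names and counts are illustrative sample data.

funnel = [
    ("homepage", 10000),
    ("service page", 4200),
    ("pricing", 1800),
    ("contact form submitted", 310),
]

def drop_offs(steps):
    """Return (from_step, to_step, drop_rate) for each funnel transition."""
    out = []
    for (name_a, n_a), (name_b, n_b) in zip(steps, steps[1:]):
        out.append((name_a, name_b, 1 - n_b / n_a))
    return out

for a, b, rate in drop_offs(funnel):
    print(f"{a} -> {b}: {rate:.0%} drop-off")
```

With these sample numbers, the pricing-to-contact transition loses the most users, so that step would be the first place to look for friction.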
Session recordings bring user flows to life in a way analytics can't. Watch 30-50 sessions of users attempting to complete your primary conversion action. Note every point of hesitation (cursor pausing, slow scrolling), confusion (backtracking, clicking non-interactive elements), and abandonment (closing the tab, navigating away). Common patterns include users searching for information that should be prominent, clicking on images or text they expect to be links, struggling with navigation menus that don't match their mental model, and abandoning multi-step processes because they can't see how far along they are. Map these patterns onto your user flow diagram and flag each friction point.
Pay special attention to mobile user flows, where smaller screens and touch interactions create different friction points than desktop. A navigation structure that works with a mouse may be frustrating with a thumb.
Ideal vs. Actual Paths
Map the conversion path you designed, then compare it to the paths real users actually take using analytics flow data.
Funnel Drop-Off Analysis
Identify which funnel steps have the highest exit rates. Each drop-off point represents a UX problem costing you conversions.
Session Recording Patterns
Watch 30-50 sessions to identify hesitation, confusion, and abandonment patterns that analytics alone can't reveal.
Mobile Flow Testing
Analyze mobile user flows separately. Touch interactions and smaller screens create different friction points than desktop navigation.
Accessibility Review
Accessibility isn't a separate audit category; it's a core UX requirement. The WHO estimates that 16% of the global population lives with some form of disability, and temporary impairments (a broken arm, bright sunlight on a screen) affect everyone at some point. An inaccessible website excludes potential customers and increasingly creates legal liability.
Start with automated testing using tools like axe DevTools, WAVE, or Lighthouse's accessibility audit. These tools catch approximately 30-40% of WCAG 2.1 issues automatically: missing alt text, insufficient color contrast, missing form labels, incorrect heading hierarchy, and missing ARIA attributes. Run automated scans on every page in your audit scope and document every finding.
Manual testing catches what automated tools miss. Navigate your entire site using only a keyboard (Tab, Enter, Escape, arrow keys). Can you reach every interactive element? Is the focus indicator visible? Can you operate all menus, forms, and modals without a mouse? Test with a screen reader (VoiceOver on Mac, NVDA on Windows) to verify that your content is announced in a logical order and that interactive elements are properly labeled.
Check color contrast ratios using a contrast checker: WCAG 2.1 AA requires 4.5:1 for normal text and 3:1 for large text. Don't rely on color alone to convey information (like red text for errors without an icon or text label). Test with your browser zoomed to 200% to verify that content remains readable and functional for users with low vision. Every accessibility fix you make improves the experience for all users, not just those with disabilities. Clearer labels, better contrast, and keyboard navigability benefit everyone.
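The contrast ratio a checker reports is defined precisely in WCAG 2.1: compute each color's relative luminance from its sRGB channels, then divide the lighter by the darker with a 0.05 flare offset. A self-contained sketch of that calculation, taking colors as (R, G, B) tuples in 0-255:

```python
# Sketch of the WCAG 2.1 contrast-ratio calculation, implemented from the
# spec's relative-luminance formula.

def relative_luminance(rgb):
    """Relative luminance of an sRGB color, per WCAG 2.1."""
    def channel(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio (lighter + 0.05) / (darker + 0.05); 1:1 to 21:1."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)),
                    reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black on white is the maximum possible contrast, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

A ratio of at least 4.5 passes AA for normal text; at least 3.0 passes for large text.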
Automated Scanning
Run axe DevTools, WAVE, or Lighthouse accessibility audits on every page. Automated tools catch 30-40% of WCAG issues.
Keyboard Navigation
Navigate the entire site using only keyboard. Verify that every interactive element is reachable and operable without a mouse.
Screen Reader Testing
Test with VoiceOver or NVDA to confirm content is announced logically and all interactive elements are properly labeled.
Color & Zoom
Verify 4.5:1 contrast ratios for normal text, don't rely on color alone for information, and test at 200% zoom.
Prioritizing Fixes
A thorough UX audit typically produces 50-200 individual findings. Trying to fix everything at once is impractical and unnecessary. Effective prioritization ensures you fix the problems that have the biggest impact on conversion and user satisfaction first.
Use a prioritization matrix that scores each finding on two dimensions: impact (how much does this problem affect conversion rate, user satisfaction, or task completion?) and effort (how much design and development time does the fix require?). Plot findings into four quadrants: high impact / low effort (fix immediately), high impact / high effort (plan for next sprint), low impact / low effort (batch together for efficiency), and low impact / high effort (deprioritize or eliminate).
Weight impact based on traffic. A severity-3 usability problem on your homepage (50,000 monthly visitors) is far more urgent than a severity-4 problem on a blog post with 200 monthly visitors. Multiply severity by monthly traffic to create an "impact score" that reflects real-world business cost.
Group related fixes into themed sprints. Instead of jumping between unrelated issues, batch all form fixes together, all navigation fixes together, and all accessibility fixes together. This produces more coherent improvements and makes it easier to measure the cumulative effect of each batch.
Create a clear roadmap with three phases: quick wins (implement within 1-2 weeks), medium-term improvements (1-2 months), and strategic redesigns (next quarter). Quick wins build organizational momentum and demonstrate the value of UX investment, making it easier to secure resources for larger improvements later.
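The impact-score and quadrant logic can be sketched directly. The 10,000-point impact threshold, the 5-day effort threshold, and the sample findings below are all illustrative assumptions; tune them to your own traffic and team capacity.

```python
# Sketch: combine severity (0-4) with monthly traffic into an impact score,
# then bucket findings into the impact/effort quadrants described above.
# Thresholds and sample findings are illustrative assumptions.

def impact_score(severity, monthly_traffic):
    """Severity weighted by how many users hit the problem each month."""
    return severity * monthly_traffic

def quadrant(score, effort_days, impact_threshold=10000, effort_threshold=5):
    high_impact = score >= impact_threshold
    low_effort = effort_days < effort_threshold
    if high_impact and low_effort:
        return "fix immediately"
    if high_impact:
        return "plan for next sprint"
    if low_effort:
        return "batch with similar fixes"
    return "deprioritize"

findings = [
    ("unclear homepage CTA", 3, 50000, 2),    # severity, traffic, effort days
    ("broken blog layout", 4, 200, 1),
    ("checkout form redesign", 3, 8000, 12),
]

for name, sev, traffic, effort in findings:
    score = impact_score(sev, traffic)
    print(f"{name}: score {score}, {quadrant(score, effort)}")
```

Note that the severity-4 blog problem lands in the low-priority batch while the severity-3 homepage problem goes first, which is the traffic-weighting argument above in miniature.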
Impact/Effort Matrix
Score every finding on impact and effort. Fix high-impact, low-effort items first, then plan high-impact, high-effort items for later sprints.
Traffic-Weighted Severity
Multiply usability severity by monthly page traffic to calculate real-world business impact. Fix problems where the most users encounter them.
Themed Fix Sprints
Group related fixes (all form issues, all navigation issues) into batches for more coherent improvements and clearer measurement.
Three-Phase Roadmap
Organize fixes into quick wins (1-2 weeks), medium-term improvements (1-2 months), and strategic redesigns (next quarter).
Related Content
Brand Identity Design Guide
Build a cohesive brand identity from strategy to execution. Covers brand foundations, visual identity, voice and messaging, brand guidelines, and digital application.
Conversion-Centered Design Guide
Learn how to design websites that convert. Covers the psychology of conversion, above-the-fold strategy, CTA design, form optimization, and A/B testing.
Responsive Design Best Practices
Master responsive web design with mobile-first principles, breakpoint strategy, fluid layouts, responsive images, and cross-device testing techniques.
The Web Design Process Explained
Understand every phase of the web design process. Covers discovery, wireframing, visual design, development handoff, and QA through launch.