VRV - Streaming Platform Redesign for Enhanced Discoverability

UX Exploration · Research · Component Library · Usability Testing

Most of my professional work centers on design systems and operational infrastructure—work I'm skilled at and genuinely enjoy. But I wanted to develop deeper discovery and research capabilities. When a friend complained about VRV's poor usability, we saw an opportunity. We assembled a team of 4 designers and a project manager, and we treated it like a real client engagement—setting realistic constraints around timeline, scope, and research budget. We gave ourselves 7 weeks to research, prioritize, and design solutions.

Overview

VRV is an over-the-top (OTT) streaming service launched in November 2016, specializing in anime and geek culture content. The platform operates on a hybrid model where some content streams for free while premium content requires subscription. Users can purchase individual channel subscriptions or opt for premium bundles that provide access to multiple channels.

In a market where users already subscribed to 3-5 streaming services, VRV needed to prove its value immediately or lose the signup window entirely.

Redesigned VRV homepage

Project Type

Self-initiated exploration

Timeline

7 weeks

Role

UX Designer (5-person team including a UX researcher and a project manager)

Focus

Discovery research, prioritization framework, content discovery patterns

Research: Uncovering the Real Problem

We started with a hypothesis: VRV had usability problems. But research revealed something more fundamental.

Quantitative Validation: The Awareness Gap

We surveyed 51 participants across various demographics to validate the scale of issues we heard in initial conversations.

The numbers told a surprising story:

Only 2 out of 51 participants (4%) had heard of VRV

Yet 52.9% regularly watched anime content

And 68% struggled with content discovery on streaming platforms they currently used

2/51 knew VRV, 52.9% watch anime, 68% struggle with content discovery

This wasn't just a usability problem—VRV had a fundamental awareness challenge. Even if we perfected the experience, most potential users didn't know the platform existed.

Qualitative Depth: Understanding User Behavior

We conducted 16 in-depth interviews with users of different ages and streaming habits. The pattern was consistent: participants who watched anime were loyal to platforms they already used (Crunchyroll, Funimation, Netflix) and saw no reason to switch.

💡

Key insight: For VRV to convert users, the experience had to be noticeably better—not just comparable—from the first interaction.

Competitive Analysis: Learning From the Market

We analyzed 12 direct and indirect competitors to understand what successful streaming platforms did well—and where they failed.

Feature comparison of VRV with 12 direct and indirect competitors.

Critical patterns emerged

Successful platforms used familiar mental models (shopping carts, "Continue Watching") rather than forcing users to learn new paradigms

Failed platforms buried critical information or made subscription value unclear

Content discovery required multiple pathways: behavioral recommendations, genre browsing, and robust search

We synthesized competitive insights through card sorting—green cards for successful patterns to adopt, red cards for pitfalls to avoid. This collaborative exercise aligned the team on which patterns were worth borrowing vs. where VRV could differentiate.

Strategic Decision: Prioritizing What Mattered Most

With research revealing 20+ distinct pain points, we faced a common design challenge: too many problems, not enough resources to solve them all.

I introduced the Red Routes framework to prioritize ruthlessly. By plotting user needs on frequency vs. importance axes, we identified which problems—if solved—would create the highest value.

💡

Not all problems are equal. A rare edge case, even if frustrating, matters less than a frequent blocker. The Red Routes method gave us objective criteria to make tough prioritization decisions.
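
To show what the frequency-versus-importance plotting boils down to, here is a minimal sketch of the scoring logic in TypeScript. The pain points, scales, and thresholds are hypothetical placeholders for illustration; our actual exercise was done on a plotted matrix, not in code.

```typescript
// Minimal sketch of a Red Routes-style prioritization pass.
// Frequency: how often users hit the problem (0-1 share of sessions, hypothetical scale).
// Importance: how badly it blocks the core task (1-5, hypothetical scale).

interface PainPoint {
  name: string;
  frequency: number;  // 0..1
  importance: number; // 1..5
}

// Illustrative entries only; the real workshop plotted sticky notes on a 2x2 matrix.
const painPoints: PainPoint[] = [
  { name: "Search fails on non-exact titles", frequency: 0.8, importance: 5 },
  { name: "Subscription value unclear",       frequency: 0.6, importance: 5 },
  { name: "Navigation defies platform norms", frequency: 0.7, importance: 4 },
  { name: "Weak visual hierarchy",            frequency: 0.9, importance: 4 },
  { name: "Profile avatar options limited",   frequency: 0.1, importance: 1 },
];

// A pain point lands on a red route only when it is both frequent and important.
function isRedRoute(p: PainPoint, minFrequency = 0.5, minImportance = 4): boolean {
  return p.frequency >= minFrequency && p.importance >= minImportance;
}

const ranked = [...painPoints].sort(
  (a, b) => b.frequency * b.importance - a.frequency * a.importance
);

for (const p of ranked) {
  const tag = isRedRoute(p) ? "RED ROUTE" : "backlog";
  console.log(`${tag.padEnd(9)} | ${p.name} (score ${(p.frequency * p.importance).toFixed(1)})`);
}
```

The cutoff is the method in miniature: a problem earns a red route only when it is both frequent and important, and everything else goes to the backlog.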

The Red Routes method shows which user needs have the most value to solve

A closer look at the prioritized pain points we needed to solve

This analysis surfaced four critical problems worth solving. By focusing on these four, we could impact both new user acquisition and retention.

Content discovery was broken

Search only worked for exact titles, categories were disorganized, and no filtering existed

Subscription value was unclear

Users couldn't understand what they'd get or compare channel options easily

Navigation patterns defied expectations

Key features were buried or inconsistent with streaming platform norms

Visual hierarchy was weak

Users struggled to scan content and make decisions quickly

Design Approach: Building for Team Consistency

Before jumping to high-fidelity screens, I advocated for building a component library using Atomic Design methodology. With four designers working in parallel, this investment prevented the visual inconsistency that often plagues team projects.

Screenshot of Figma file featuring commonly used design elements

💡

The result: when one designer updated a button or card component, it propagated across all screens automatically. What could have been weeks of alignment discussions became seamless parallel work.
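
Our library lived in Figma rather than code, but the propagation idea maps cleanly onto a coded design system. Below is a rough TypeScript sketch of that idea; the tokens and component names are hypothetical and only stand in for the Figma components we actually built.

```typescript
// Sketch of the Atomic Design propagation idea: shared atoms feed every composite.
// Names and tokens are hypothetical; the real library was built as Figma components.

// Design tokens: edit once and every consumer picks up the change.
const tokens = {
  accent: "#FFDD57",
  radius: 8,
  spacing: 12,
};

// Atom: a button described entirely by the shared tokens.
function button(label: string): string {
  return `[Button "${label}" bg=${tokens.accent} radius=${tokens.radius}px]`;
}

// Molecule: a content card composed from atoms, never from hard-coded styles.
function mediaCard(title: string, channel: string): string {
  return [
    `Card(pad=${tokens.spacing}px)`,
    `  Poster: ${title}`,
    `  Channel: ${channel}`,
    `  ${button("Add to Watchlist")}`,
  ].join("\n");
}

// Organism: a homepage row is just a list of cards.
const row = ["Mob Psycho 100", "Hunter x Hunter"].map((t) => mediaCard(t, "Crunchyroll"));
console.log(row.join("\n\n"));

// Changing tokens.accent above restyles every button in every card and row,
// which is the same single-source behavior we got from Figma components.
```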

Key Design Solution: Multi-Layered Content Discovery

The biggest redesign focused on homepage discovery. We created multiple pathways for different user behaviors:

Behavioral recommendations for passive browsing ("Continue Watching," "Because you watched...")

Horizontal content rows for exploration by genre and channel

Advanced filtering for targeted search (genre, release year, channel, rating; see the filtering sketch after this list)

Clear visual hierarchy so users could scan dozens of options without cognitive overload
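
To illustrate the advanced-filtering pathway, here is a minimal TypeScript sketch of multi-criteria filtering over a catalog. The field names, sample titles, and ratings are hypothetical; the actual deliverable was a Figma prototype, not working code.

```typescript
// Sketch of the advanced-filtering pathway: combine genre, release year, channel,
// and rating into a single query over the catalog. All field names are hypothetical.

interface Title {
  name: string;
  genres: string[];
  releaseYear: number;
  channel: string;
  rating: number; // average user rating, 0-5
}

interface Filters {
  genre?: string;
  yearFrom?: number;
  yearTo?: number;
  channel?: string;
  minRating?: number;
}

// Each unset filter is a no-op, so any combination of criteria works the same way.
function filterCatalog(catalog: Title[], f: Filters): Title[] {
  return catalog.filter(
    (t) =>
      (f.genre === undefined || t.genres.includes(f.genre)) &&
      (f.yearFrom === undefined || t.releaseYear >= f.yearFrom) &&
      (f.yearTo === undefined || t.releaseYear <= f.yearTo) &&
      (f.channel === undefined || t.channel === f.channel) &&
      (f.minRating === undefined || t.rating >= f.minRating)
  );
}

// Hypothetical usage: action titles on Crunchyroll released since 2015, rated 4+.
const catalog: Title[] = [
  { name: "Mob Psycho 100", genres: ["action", "comedy"], releaseYear: 2016, channel: "Crunchyroll", rating: 4.6 },
  { name: "Hunter x Hunter", genres: ["action", "adventure"], releaseYear: 2011, channel: "Crunchyroll", rating: 4.8 },
];
console.log(filterCatalog(catalog, { genre: "action", yearFrom: 2015, channel: "Crunchyroll", minRating: 4 }));
```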

Key screens: Homepage with poster-based browsing and horizontal content rows | Watchlist using familiar save-for-later patterns | Play Screen optimized for desktop and mobile usability.

Usability Testing: Validating the Approach

We tested the prototype with 10 participants, including users with color blindness and age-related cognitive differences, to validate our design decisions across four key scenarios.

The results validated our core approach:

9 out of 10 users successfully completed content discovery tasks without guidance

Average task completion time met our target benchmarks

Users consistently responded positively: "Everything seemed to be where I would've anticipated it...didn't even have to think about it."

However, testing surfaced opportunities for refinement:

The subscription comparison flow, while functional, required more cognitive effort than ideal. Users could complete the task but needed to scan back and forth between options to make confident decisions. The information hierarchy could be optimized to make channel comparison more intuitive.

These insights became our recommended next steps: tightening information architecture in the subscription flow, adding side-by-side comparison views, and refining visual hierarchy to reduce comparison friction.

What This Project Taught Me

Quantitative research validates assumptions at scale, but qualitative research reveals why. The survey told us only 4% knew VRV, but interviews revealed the emotional loyalty users felt toward existing platforms. Both data types were necessary.

Competitive analysis isn't about copying—it's about understanding learned behaviors. Users bring expectations from every platform they've used. Fighting those expectations requires extraordinary justification.

Strategic prioritization matters more than comprehensive solutions. We could have designed improvements for all 20+ pain points. Red Routes helped us focus on the four that would create the most value.

Good testing validates your approach AND surfaces the next layer of problems. When users successfully complete tasks but you notice friction points, that's valuable data for future iterations. Perfect is the enemy of shipped.

Reflection

This self-directed project gave me hands-on experience with discovery methods my professional design systems work doesn't typically require. The research synthesis—taking 51 survey responses, 16 interviews, and 12 competitive analyses and distilling them into four prioritized problems—taught me how to find signal in noise.

While I'm proud of the visual designs, the real value was learning to structure research at scale, facilitate collaborative synthesis exercises, and make defensible prioritization decisions under constraints.

The research methods and strategic frameworks I practiced here now inform how I approach client projects—even when the deliverable is a component library, I'm asking better questions about which problems to solve first and why.

Note: This was a self-initiated project where I formed a team to practice end-to-end UX research and design. The designs did not ship, though the research methods and strategic frameworks have since informed my professional client work.
