RUFUS

Designing Amazon’s Conversational AI Shopping Experience

Executive Summary

Rufus is Amazon’s generative AI shopping assistant, launched in 2023 to help customers shop with confidence through conversational, AI-powered guidance. As lead UX visual designer, I established Rufus’s design system and visual language, ensuring it could scale across modalities and inspire trust in a new kind of customer interaction. My work focused on defining the visual identity, creating scalable UI components, and setting design principles that balanced innovation with accessibility and clarity—ultimately shaping the foundation for Amazon’s conversational commerce.

Context

Amazon set out to introduce conversational shopping to millions of customers, integrating AI into the shopping journey in a way that felt both novel and trustworthy. Unlike previous shopping experiences, Rufus allowed customers to interact in natural language, ask open-ended questions, and rely on AI to surface product guidance—a paradigm shift from traditional keyword search. Designing this assistant meant creating not just a new interface, but a new set of expectations for how customers engage with the Amazon app and website to accomplish their shopping goals.

Design Challenges

Trust in AI

Customers needed visual and interaction cues that reinforced credibility, transparency, and reliability.

Defining new patterns

No established blueprint existed for how questions, answers, and product suggestions should flow in a shopping-specific conversational interface.

System scalability

The design needed to support future extensions across Amazon's store without fragmentation or rework.

Innovation vs. consistency

Rufus had to feel distinct enough to stand out as “something new,” while remaining cohesive with Amazon’s broader design ecosystem.

My Role

I led UX visual design for Rufus from launch through scale, collaborating closely with UX Research, Product, Engineering, and Data Science. My contributions included:

  • Establishing the design system and visual language.

  • Designing scalable UI patterns for conversational interactions.

  • Partnering with UX Research to run usability studies and with Engineering to run A/B tests that guided key design decisions.

  • Translating research insights into visual and interaction principles around color, iconography, navigation, and UI efficiency.

Process

The design process began with foundational research into customer perceptions of AI-powered shopping. Working with UX Research, we tested brand and UI directions to understand how design choices affected trust and usability. Insights from these studies shaped the initial visual language and have allowed us to continually improve the interface as customer perceptions and expectations for AI continue to evolve.

From there, I built Rufus’s design system, focusing on scalability. The system defined how conversational input and responses appear, how product recommendations are embedded in chat, and how trust signals (like attribution and transparency markers) are displayed. Each component was designed with extensibility in mind, ensuring the same system could later support multimodal interactions.
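
To make that extensibility concrete, the sketch below shows one way such a component model might be typed in TypeScript. This is illustrative only, not Amazon’s implementation; every name in it (RufusMessage, ProductCard, TrustSignal, and so on) is hypothetical and exists solely to show how conversational turns, embedded recommendations, and trust signals can share a single, extensible message model.

    // Illustrative sketch only; all names are hypothetical and invented for this case study.

    /** A trust signal rendered alongside an AI response, e.g. source attribution. */
    interface TrustSignal {
      kind: "attribution" | "ai-disclosure" | "review-summary";
      label: string;   // short, customer-facing text
      href?: string;   // optional link to the underlying source
    }

    /** A product recommendation embedded inline in the conversation. */
    interface ProductCard {
      id: string;
      title: string;
      imageUrl: string;
      price: string;    // preformatted, locale-aware price string
      rating?: number;  // 0–5 star rating, if available
    }

    /** One turn in the conversation. The union keeps customer and assistant
        messages semantically distinct (e.g. right- vs. left-aligned in the UI). */
    type RufusMessage =
      | { role: "customer"; text: string }
      | {
          role: "assistant";
          text: string;
          products?: ProductCard[];  // recommendations embedded in the reply
          signals?: TrustSignal[];   // attribution / transparency markers
        };

    /** The same message model can later carry other modalities (voice, images)
        by extending the union, without reworking existing components. */
    interface Conversation {
      id: string;
      messages: RufusMessage[];
    }

Modeling customer and assistant turns as a discriminated union is one way to keep the two visually and semantically distinct while leaving room for new modalities later.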

After launch, I guided a series of refinements to Rufus’s interface through usability studies and A/B testing across desktop and mobile. Some were sweeping, like evolving the color palette to keep pace with Amazon’s broader UI refresh. Others were subtle yet powerful at Amazon’s scale. In the conversational interface, reducing margins and removing avatars increased usable text width by 13% and surfaced ~6–7% more content above the fold, while right-aligning customer input created clearer visual distinction from Rufus’s responses. Together, these adjustments gave customers faster access to information and made conversations easier to scan.

Outcome

Rufus launched to millions of U.S. customers in the Amazon Shopping app and on desktop in 2023, and is now available in nine locales worldwide: the U.S., U.K., India, Canada, Germany, France, Italy, Spain, and Japan. It has become the foundation for conversational shopping within Amazon, with its design system guiding future multimodal AI experiences, like Lens Live. By shaping Rufus’s identity and design principles, I helped establish not just a new interface, but a new interaction model for how customers shop with AI—one grounded in accessibility, clarity, and trust.
