Intel's Next-Generation Enterprise Search

From fragmented experiences to 20M+ unified queries annually.


Overview

Intel’s search was a mess, but not for the reason you’d think. The technology was fine; we had Coveo, a robust AI-powered search platform. The problem was that over a decade, different parts of Intel.com had implemented search differently: marketing had one approach, support had another, product pages had a third. Same underlying technology, completely inconsistent user experience.

I led the redesign to create a unified search experience across Intel.com. The real challenge wasn’t designing a better search interface; it was getting dozens of content owners to agree on what “better” meant.

My Role

Lead Product Designer on DX Team

I owned the UX strategy and design execution. But honestly, half my time was spent mediating between teams who all thought their implementation was the correct one.

The Problem

Intel.com serves vastly different audiences: consumers buying laptops, IT professionals researching data center solutions, developers looking for technical specs, investors checking financials. Each group’s content was managed by a different team, and each team had optimized search for its corner of the site.

This created:

  • Inconsistent result layouts depending on where you searched from

  • Filters that appeared or disappeared based on which section you were in

  • Wildly different ranking algorithms (support prioritized recency, marketing prioritized campaigns, technical docs prioritized exact matches)

  • Gated content appearing in results even when users didn’t have access

  • Mobile experiences that were essentially broken: filters were hidden and pagination didn’t work

The data was stark: the average time from search to finding content was 3-4 minutes, against an industry benchmark of under 60 seconds. And that’s just for users who found what they needed; we had no reliable way to measure how many people gave up.

What actually happened

  1. Misaligned Teams, Misaligned Search

Each team optimized search for its own goals: marketing prioritized campaigns, support favored recency, and product emphasized specifications. Users experienced inconsistent relevance and navigation patterns across experiences, eroding trust.

Search wasn’t just a usability issue; it was an organizational alignment problem disguised as UX.
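One way to picture the alignment fix: instead of each team tuning its own ranking rules, every surface blends the same signals with shared weights. The sketch below is purely illustrative; the signal names and weights are my assumptions, not Intel's or Coveo's actual ranking model.

```typescript
// Hypothetical unified ranking: one weighted blend of the signals each
// team previously optimized in isolation. All signal values are in [0, 1].
interface Signals {
  textMatch: number;     // exact-match strength (what technical docs favored)
  recency: number;       // freshness (what support favored)
  campaignBoost: number; // editorial/campaign priority (what marketing favored)
}

// Shared weights applied everywhere, replacing per-team algorithms.
const WEIGHTS = { textMatch: 0.6, recency: 0.25, campaignBoost: 0.15 };

function score(s: Signals): number {
  return (
    s.textMatch * WEIGHTS.textMatch +
    s.recency * WEIGHTS.recency +
    s.campaignBoost * WEIGHTS.campaignBoost
  );
}

// A strong exact match with moderate freshness and no campaign boost:
const exampleScore = score({ textMatch: 1, recency: 0.5, campaignBoost: 0 }); // 0.725
```

The point of a shared formula is not the specific weights but that every section argues about one tunable model instead of maintaining private ones.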

  2. One Search, Many Needs

“Search” meant different things depending on the context. Internal tools, marketing pages and support articles all surfaced content differently, leading to confusion and inconsistent results.

I facilitated a shared definition of what “search” should mean across the ecosystem and introduced a unified design language for results, filters and metadata.
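A unified design language for results can be made concrete as a shared result schema that every content source maps into, so the UI renders one consistent card layout regardless of origin. This is a minimal sketch under assumed field names, not Intel's actual data model.

```typescript
// Illustrative shared result-card schema (field names are hypothetical).
interface UnifiedResult {
  title: string;
  url: string;
  snippet: string;
  contentType: "product" | "support" | "marketing" | "technical";
  lastUpdated?: string; // ISO date; optional because not every source tracks it
  gated: boolean;       // gated items are hidden from users without access
}

// Each content source supplies an adapter into the shared schema.
function fromSupportArticle(raw: {
  heading: string;
  link: string;
  body: string;
  updated: string;
}): UnifiedResult {
  return {
    title: raw.heading,
    url: raw.link,
    snippet: raw.body.slice(0, 160), // consistent snippet length everywhere
    contentType: "support",
    lastUpdated: raw.updated,
    gated: false,
  };
}

const mapped = fromSupportArticle({
  heading: "How to update drivers",
  link: "https://example.com/support/drivers",
  body: "Step-by-step instructions for updating graphics drivers on Windows.",
  updated: "2024-01-15",
});
```

With adapters like this, "one search experience" becomes a contract each team implements rather than a layout each team reinvents.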

  3. Filters Without the Chaos

Through iterative testing, we proved that standardizing relevance models and filters improved comprehension and reduced drop-offs. Centralized search templates brought predictability without limiting flexibility.

Final categories were grouped by user intent, not internal org charts, improving clarity and reducing drop-offs post-launch.
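Intent-based grouping can be sketched as a simple mapping from inferred intent to the facets shown, with a minimal shared fallback when intent is ambiguous. The intent and facet names below are illustrative assumptions, not the shipped taxonomy.

```typescript
// Hypothetical mapping: facets grouped by what the user is trying to do,
// not by which team owns the content.
type Intent = "shop" | "get support" | "develop" | "invest";

const facetsByIntent: Record<Intent, string[]> = {
  "shop": ["Product line", "Price range", "Availability"],
  "get support": ["Product", "Issue type", "Last updated"],
  "develop": ["Documentation type", "Platform", "Version"],
  "invest": ["Report type", "Fiscal year"],
};

// When intent can't be inferred, fall back to a small shared set rather
// than showing every team's filters at once.
function facetsFor(intent: Intent | null): string[] {
  return intent ? facetsByIntent[intent] : ["Content type", "Last updated"];
}

const devFacets = facetsFor("develop");
const fallbackFacets = facetsFor(null);
```

The design choice worth noting is the fallback: a predictable minimal set keeps the interface stable even when intent detection fails.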

  4. Accessibility Took Real Work

An audit revealed 47 accessibility issues in Coveo’s default components. I pushed to extend, not replace, their library, partnering directly with Coveo engineers to fix issues.

We shipped an accessible experience that reached everyone.
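The "extend, don't replace" approach can be sketched as a patch layer that adds missing accessibility attributes on top of vendor-rendered markup instead of forking the component library. The specific attributes below are typical examples of such fixes, not Coveo's actual defect list, and the attribute map stands in for a DOM element so the sketch runs anywhere.

```typescript
// Hypothetical accessibility patch layer: describe the missing attributes
// per component, then merge them onto the rendered element's attributes.
interface A11yPatch {
  target: string; // human-readable description of the patched element
  attributes: Record<string, string>;
}

const patches: A11yPatch[] = [
  {
    target: "facet toggle button",
    attributes: { "aria-expanded": "false", "aria-controls": "facet-panel" },
  },
  {
    target: "results region",
    attributes: { role: "region", "aria-live": "polite", "aria-label": "Search results" },
  },
];

// Merge a patch into an attribute map (a stand-in for element.setAttribute
// calls, kept as plain data so the sketch is runnable outside a browser).
function applyPatch(
  el: Record<string, string>,
  patch: A11yPatch
): Record<string, string> {
  return { ...el, ...patch.attributes };
}

const patched = applyPatch({ class: "results" }, patches[1]);
```

Keeping fixes as a thin layer means vendor upgrades don't require re-forking; the patch list also doubles as a record of what to retire as the vendor fixes issues upstream.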

  5. Documentation That Saved the Project

I fully spec’d responsive states, filter behaviors, and expected search results in Figma so another member of the team could produce highly detailed, developer-focused documentation. At first it seemed like overkill, until edge cases hit. The dev team later said that level of detail “saved us weeks.”

Results

  • 20M+ annual queries through the unified experience

  • Search-to-click time dropped to under 90 seconds (still working on getting it lower)

  • Mobile search usage increased 40% after launch (people had been avoiding it before because it was so broken)

  • Achieved WCAG 2.1 AA compliance across full experience

  • Ranking is better but not perfect. Some high-value content still gets buried because it lacks the metadata Coveo needs. This is an ongoing content operations issue, not a design issue, but it limits the experience.

What I learned

  • Users don’t care about your internal org structure. They don’t know that marketing and support have different content management systems. They don’t know that your company reorganized three times and nobody unified the taxonomy. They just want answers.

  • I also learned that “AI-powered search” means nothing if the fundamentals are broken. Coveo gave us sophisticated ranking algorithms, but they couldn’t fix inconsistent metadata, unclear taxonomies or interfaces that excluded screen reader users. Technology is an amplifier, not a solution.

  • This project succeeded because we fixed the foundation first: clear information architecture, consistent interface patterns, accessible interactions. The AI makes it better, but the design makes it work.

Have a design challenge? I'd love to help solve it.

© Kevin Shalkowsky 2025 - All rights reserved

