Himanshu Shukla | June 8, 2022 | 5 min read
Rethinking Zomato Search (Part Two)

Before we dive deeper into more features that make Zomato Search your favourite friend, the one who always listens and is there for all your (food) queries, here is a quick recap of what we discussed earlier –

  • Restaurant search and discovery form the core of Zomato’s platform, and our hyperlocal delivery experience is built on customers being able to navigate Search easily
  • Key features like Auto-suggest tabs, Learn to Rank, and Autocorrect help address search queries faster and more conveniently. Remember Barger autocorrecting to Burger?

Search is a responsibility we have built with care and attention to detail. After all, we power millions of search queries daily and, as we said earlier, with great numbers comes great responsibility.

So let’s dig into more features that make our (rather, your) Search better and more customer-friendly.

(Features continued from Part One. Read here)

Mic testing – 1…2…3… Did you know about the Voice Search feature at Zomato?

What is it?

We introduced voice search to enable customers to explore restaurants, dishes, or cuisines through voice inputs. A customer can now tap the mic icon to activate listening mode. Voice search converts the spoken input into text and displays the relevant results on the Search screen.

Why did we build it?

  • To give customers the option to search by speaking instead of typing
  • To ease the onboarding process, as many people find voice search easier and faster than typing
  • To cater to the current trend in India, where people are shifting from text to voice for search [1]
  • To make the Zomato app more inclusive. Did you know that Indians use voice search twice as much as the rest of the world? [2]

How did we build it?

To enable voice search, we deployed a variety of technical capabilities, such as voice transcription (powered by Google on Android and the Speech framework on iOS devices) and a Natural Language Understanding (NLU) layer built in-house by the Engineering and Data Sciences teams.

How it works

  • Once a customer gives a command, automatic speech recognition converts the voice input into text
  • Our Natural Language Understanding layer then identifies the intent and entities from the keywords in the command
  • These keywords are picked up by our Search engine, which fetches results for the customer in a fraction of a second
  • Finally, voice search merges these technologies with our existing text-based search capability, and voila – your voice-enabled search query is ready to be served and serviced (a simplified sketch of this flow follows the list)
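
To make the flow above a little more concrete, here is a minimal sketch in Python of how the pieces could fit together once the platform has transcribed the audio. Everything here is illustrative: the function names (extract_intent_and_entities, search) and the keyword lists are assumptions for the sake of the example, not Zomato’s actual services.

```python
# Illustrative sketch of the voice search flow: transcribed text -> NLU -> search.
# Function names and keyword lists are hypothetical, not Zomato's actual services.

from dataclasses import dataclass, field

@dataclass
class ParsedQuery:
    intent: str                                    # e.g. "dish_search" or "restaurant_search"
    entities: list = field(default_factory=list)   # keywords from the command

DISH_HINTS = {"pizza", "burger", "chai", "samosa", "biryani"}  # toy vocabulary

def extract_intent_and_entities(transcript: str) -> ParsedQuery:
    """Rough stand-in for the in-house NLU step."""
    tokens = transcript.lower().split()
    dishes = [t for t in tokens if t in DISH_HINTS]
    intent = "dish_search" if dishes else "restaurant_search"
    return ParsedQuery(intent=intent, entities=dishes or tokens)

def search(query: str, intent: str) -> list:
    """Placeholder for the existing text-based search engine."""
    return [f"results for '{query}' ({intent})"]

def voice_search(transcript: str) -> list:
    """Glue: the platform (Google ASR on Android, the Speech framework on iOS)
    gives us text; we parse it and hand the keywords to the search engine."""
    parsed = extract_intent_and_entities(transcript)
    return search(" ".join(parsed.entities), parsed.intent)

print(voice_search("order chai and samosa near me"))
# -> ["results for 'chai samosa' (dish_search)"]
```

The real NLU is a learned model rather than a keyword lookup; the sketch only shows the shape of the pipeline, with the existing text-based search doing the heavy lifting at the end.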

Why write tea and evening snacks when you can say Chai and Samosa? Introducing Natural Language Search

What is it?

Natural Language Search allows customers to speak or type in their everyday language, rather than in common keywords. Customers can now use slang, phrases, or partial sentences in their native language, just like in a casual conversation, while the system in the backend transforms these sentences into searchable queries.


A few examples of Natural Language Search:

  • ‘Pizza outlets near me’
  • ‘XYZ* ka burger’
  • ‘Chai and samosa’

How did we build it?

We have built a model that can predict whether a given sentence is a dish-dish combination (chai and samosa) or a dish-restaurant combination (chai from XYZ). Once we identify these entities, we provide relevant results just like a normal search.

This is one of the first yet pivotal steps in the direction of Natural Language Search, wherein we solve for some common use cases, such as the ones below (a small illustrative sketch follows the notes):

  • Restaurant + Dish queries – Jalebi from XYZ
  • Dish + Dish queries – Roti sabzi, Chai samosa
  • Restaurant + Near me queries – XYZ near me
  • Dish + Near me queries – Pizza outlets near me

Please note: XYZ is a blanket term used instead of specific restaurant names. 

Do note: There will still be cases where we will not be able to gauge the intent of certain natural language searches.
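
As a rough illustration of the idea, here is a rule-based stand-in for that model in Python. The production system uses a learned model; the vocabularies and thresholds below are toy assumptions, and “xyz” stands in for a real restaurant name, as in the note above.

```python
# Rule-based stand-in for the query-type model; vocabularies are toy examples.
import re

KNOWN_DISHES = {"chai", "samosa", "roti", "sabzi", "jalebi", "pizza", "burger"}
KNOWN_RESTAURANTS = {"xyz"}   # placeholder for real restaurant names

def classify_query(query: str) -> dict:
    """Split a free-form query into entities and label the combination type."""
    tokens = re.findall(r"[a-z]+", query.lower())
    dishes = [t for t in tokens if t in KNOWN_DISHES]
    restaurants = [t for t in tokens if t in KNOWN_RESTAURANTS]
    near_me = "near" in tokens and "me" in tokens

    if dishes and restaurants:
        label = "dish + restaurant"   # "Jalebi from XYZ", "XYZ ka burger"
    elif len(dishes) >= 2:
        label = "dish + dish"         # "chai and samosa", "roti sabzi"
    elif near_me:
        label = "near me"             # "pizza outlets near me", "XYZ near me"
    else:
        label = "unknown"             # intent could not be gauged
    return {"label": label, "dishes": dishes, "restaurants": restaurants}

print(classify_query("XYZ ka burger"))          # dish + restaurant
print(classify_query("chai and samosa"))        # dish + dish
print(classify_query("pizza outlets near me"))  # near me
```

The “unknown” branch corresponds to the cases mentioned above where we still cannot gauge the intent of a natural language search.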

Did you say payasa or kheer? Global counterparts and their regional dish aliases

What is it?

We added regional names of dishes and merged them with their global names, so customers can use either name and get the relevant search results. For instance, search for Upit or Upma and all results for Upma will show up; similarly, search for Payasa or Kheer and you will see all results for Kheer.

Why did we build it?

  • The old search experience was quite broken for regional dishes, because of which customers could not find what they were looking for.
  • Additionally, restaurants with large assortments of local dishes were unable to highlight their menus well, which had a cascading effect on outlets serving these dishes.

For example, if an outlet has listed a dish as Upit on its menu (the regional name for its global counterpart, Upma), that outlet would not be searchable for Upma. Similarly, while searching for Upit, outlets serving Upma would not show up in the old search ecosystem.

How did we build it?

  • We started by creating a repository of dishes with their regional names and linking them to their global counterparts, in a bid to increase the searchability of such dishes.
  • We also merged these dishes, so that restaurants serving a dish under either name (Upma or Upit) show up when a customer searches for it (see the short sketch after this list)
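
To illustrate the idea in Python (the names and data below are examples for this post, not our actual schema), a tiny alias table plus a canonicalisation step might look like this:

```python
# Toy alias table: regional dish name -> global counterpart.
REGIONAL_ALIASES = {
    "upit": "upma",
    "payasa": "kheer",
}

def canonical_dish(name: str) -> str:
    """Resolve a dish name to its global counterpart before indexing or searching."""
    key = name.strip().lower()
    return REGIONAL_ALIASES.get(key, key)

def display_name(name: str) -> str:
    """Show the regional alias with the global counterpart in brackets, as in the app."""
    key = name.strip().lower()
    if key in REGIONAL_ALIASES:
        return f"{name.strip().title()} ({REGIONAL_ALIASES[key].title()})"
    return name.strip().title()

# Searching for either name resolves to the same canonical dish,
# so both queries surface the same outlets.
assert canonical_dish("Upit") == canonical_dish("Upma") == "upma"
print(display_name("Payasa"))   # -> Payasa (Kheer)
```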

In the screenshot attached, the regional alias is shown with the global counterpart’s name in brackets.

Let’s search for impact, shall we? Learnings and key takeaways!

Through these features, we made it easier for our customers to locate the dishes, cuisines, or restaurants they are looking for. Our improvements also helped us reach new customers, and overall made search easier and faster for everyone.

That’s all, folks! On to solving the next problem.

This is a two-part series on how and why we introduced new ‘Search’ features on our app. If you are interested in solving similar problems with a customer-first eye and building features that impact millions, then connect with Himanshu Shukla on LinkedIn. We are always looking for builders at Zomato.

This blog was written by Himanshu Shukla and Saurav Singh in collaboration with Sandeep Veethu, Sonal Garg, Srinjay Kumar, and Shivansh Tamrakar.


All images/ videos are designed in-house.

-x-

Sources: 

  1. Year in Search 2021, about.google
  2. Google for India 2021 virtual event, youtube.com

