
Insights & Inspirations | Xiecheng Wendao – AI Mega Model Applied in Travel and Tourism Industry

2023-11-23

In a pivotal segment of the 2023 WTA • Xianghu Dialogue titled “Empowerment of Science and Technology: Tourism Promotes Industrial Integration”, Mr. WANG Qing, VP of Technology at Trip.com Group, shared his insights on “Xiecheng Wendao – AI Mega Model Applied in Travel and Tourism Industry”.

Here are excerpts from his speech:

In July this year, Trip.com Group unveiled “Xiecheng Wendao”, the first vertical large language model in the tourism sector. I would like to share some insights into the specifics of this model, our thoughts on AI, and the progress in its application.

Currently, while large language models (LLMs) can understand user inputs, the content they generate is not entirely reliable on its own. The industry is exploring various solutions to improve accuracy, but we still have a long way to go. This is what gives rise to vertically trained LLMs, which can be customized to produce more accurate responses. That is where we started, but how do we apply these LLMs in the Online Travel Agency (OTA) industry? We have identified three major areas: first, a vertical LLM can assist users with travel planning, drawing on industry and real-time data to make planning and booking travel products and services more efficient; second, in after-sales service, which is vital in service e-commerce, we aim to enhance the user experience by improving the model’s accuracy; third, for tourists already on their trip, we provide travel information through intelligent guides and en-route services.
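To make the three application areas concrete, the sketch below shows one simple way a travel assistant might route an incoming query to the pre-sale planning, after-sales service, or on-trip guide flow. This is purely illustrative: the enum values, keyword heuristics, and `route_query` function are assumptions for this example, not Trip.com’s implementation, which would use a trained intent classifier rather than keyword matching.

```python
from enum import Enum

class ServiceStage(Enum):
    """Three application areas mentioned in the talk (names are illustrative)."""
    TRIP_PLANNING = "pre-sale trip planning and booking"
    AFTER_SALES = "after-sales customer service"
    ON_TRIP_GUIDE = "en-route guides and travel information"

# Hypothetical keyword heuristics; a production system would rely on an
# intent classifier, not simple keyword matching.
_STAGE_KEYWORDS = {
    ServiceStage.TRIP_PLANNING: ["plan", "book", "itinerary", "recommend"],
    ServiceStage.AFTER_SALES: ["refund", "cancel", "complaint"],
    ServiceStage.ON_TRIP_GUIDE: ["nearby", "open now", "directions", "guide"],
}

def route_query(query: str) -> ServiceStage:
    """Route a user query to one of the three assistant areas."""
    text = query.lower()
    for stage, keywords in _STAGE_KEYWORDS.items():
        if any(k in text for k in keywords):
            return stage
    # Default to trip planning when nothing matches.
    return ServiceStage.TRIP_PLANNING

print(route_query("I want to cancel my order and get a refund"))  # ServiceStage.AFTER_SALES
```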

How do we ensure reliability in these three areas? Firstly, we turn to customization: we further train the LLM to better understand user needs, enhancing its comprehension through secondary training on our 30 billion tokens of training data. Secondly, “Xiecheng Wendao” integrates real-time data, such as hotel availability and flight status, when responding to user queries. Thirdly, we create various rankings based on user searches, orders, and reviews to ensure content accuracy. Lastly, we combine the LLM with our existing search and customer-service bot algorithms.
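The second point, grounding answers in real-time inventory data, is the most mechanical of the four, so here is a minimal sketch of what that flow could look like. It assumes hypothetical placeholders (`fetch_status`, `llm_complete`, `extract_flight_no`) standing in for internal services whose real interfaces are not public; it only illustrates the pattern of looking up live facts first and letting the fine-tuned model phrase the answer around them.

```python
import re
from dataclasses import dataclass
from typing import Callable

@dataclass
class FlightStatus:
    flight_no: str
    status: str          # e.g. "on time", "delayed"
    departure_time: str  # scheduled local departure time

def extract_flight_no(query: str) -> str:
    """Naive flight-number extraction for the sketch (e.g. 'MU5101')."""
    match = re.search(r"\b[A-Z]{2}\d{3,4}\b", query)
    return match.group(0) if match else ""

def answer_with_realtime_data(
    query: str,
    fetch_status: Callable[[str], FlightStatus],
    llm_complete: Callable[[str], str],
) -> str:
    """Ground the answer in live data: look up facts, put them in the
    prompt, and let the (fine-tuned) model compose the reply."""
    status = fetch_status(extract_flight_no(query))
    prompt = (
        "Answer the traveller's question using only the facts below.\n"
        f"Facts: flight {status.flight_no} is {status.status}, "
        f"scheduled departure {status.departure_time}.\n"
        f"Question: {query}"
    )
    return llm_complete(prompt)
```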

Why do we focus on rankings? A major challenge for all large models today is ensuring that responses about travel itineraries are accurate and usable. To address this, we focus on the common needs and themes of top domestic and international destinations, leveraging massive amounts of data to generate rankings, which are then meticulously verified by human experts and selected through scoring assessments covering more than 300 destinations worldwide. To ensure data accuracy, we primarily rely on searches over our internal data, summarized by the LLM, supplemented by search-engine responses where our own data does not provide coverage.
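The “internal data first, search engine as fallback” policy described above can be expressed as a short retrieval sketch. The callables below (`search_internal`, `summarize_with_llm`, `web_search`) are assumed placeholders; the actual pipeline, ranking signals, and coverage thresholds are internal to Trip.com.

```python
from typing import Callable, List

def answer_destination_query(
    query: str,
    search_internal: Callable[[str], List[str]],        # Trip.com data -> snippets
    summarize_with_llm: Callable[[str, List[str]], str],  # (query, snippets) -> answer
    web_search: Callable[[str], str],                    # fallback answer source
) -> str:
    """Prefer answers summarized from internal data; fall back to a
    search engine only where internal coverage is missing."""
    snippets = search_internal(query)
    if snippets:
        return summarize_with_llm(query, snippets)
    return web_search(query)
```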

Our current rankings fall into several categories: the Hotspot List, which keeps up with the latest trends; the Special Deals List, which offers real-time pricing to address time-sensitivity, and all discounts in our rankings are genuine; the Hotel List, focused mainly on themed hotels; the Attractions List; and the Itinerary List. We are continually expanding the Itinerary List, currently extending its coverage to Southeast Asia, Japan, Korea, and destinations in Europe and the Americas.
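For illustration, a single verified entry on any of these lists could be modelled with a small record like the one below. The field names and types are assumptions made for this sketch, not a published schema; they simply capture the signals mentioned in the talk (searches, orders, reviews, expert verification, and real-time pricing for deal entries).

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class RankingType(Enum):
    """Ranking categories mentioned in the talk."""
    HOTSPOT = "hotspot"
    SPECIAL_DEAL = "special_deal"
    THEMED_HOTEL = "themed_hotel"
    ATTRACTION = "attraction"
    ITINERARY = "itinerary"

@dataclass
class RankingEntry:
    """One verified item on a ranking; field names are illustrative."""
    ranking_type: RankingType
    destination: str                     # e.g. "Tokyo"
    title: str                           # e.g. "3-day food itinerary"
    score: float                         # aggregate of searches, orders, reviews
    expert_verified: bool                # rankings are checked by human experts
    realtime_price: Optional[float] = None  # populated only for special-deal entries
```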

We are committed to focusing on the tourism industry, providing intelligent assistant services before, during, and after sales. We are investing significant resources to ensure the quality of frequently used content, gradually building trust in AI services among our users. Our data shows that users’ greatest expectation from AI lies in travel content. Therefore, we will invest more in ensuring the quality of travel itineraries to better serve our users.