Wine selection can be daunting, especially when faced with a plethora of unfamiliar labels. Fortunately, wine apps have made the process easier by letting users scan bottle labels and access detailed information and reviews. These apps leverage artificial intelligence (AI) algorithms to enhance the wine-buying experience, but they cannot taste the wine for you.
However, a new development by a collaborative team from the Technical University of Denmark (DTU), the University of Copenhagen, and Caltech is set to revolutionize how these wine algorithms operate by incorporating a crucial new parameter: individual flavor and taste impressions.
Thoranna Bender, a graduate student at DTU, explained, “By feeding an algorithm with data consisting of people’s flavor impressions, the algorithm can make more accurate predictions of what kind of wine we individually prefer.” This innovative approach was developed under the Pioneer Centre for AI at the University of Copenhagen.
The research involved hosting wine tastings with 256 participants. These participants were asked to arrange shot-sized cups of different wines on a sheet of A3 paper based on perceived similarity in taste.
This spatial arrangement, a method common in consumer tests, was then digitized by photographing the papers. The resulting data, combined with a vast array of wine labels and user reviews provided by Vivino, formed the basis of the newly developed algorithm.
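The digitized sheets can be read as similarity judgments: cups placed close together were perceived as similar in taste. A minimal sketch of that idea, using invented coordinates and names (the study's actual data format and processing pipeline are not described here):

```python
import math

# Hypothetical digitized cup positions on one A3 sheet, as (x, y)
# coordinates in millimetres. All names and values are invented.
positions = {
    "wine_a": (40.0, 55.0),
    "wine_b": (45.0, 60.0),
    "wine_c": (210.0, 150.0),
}

def pairwise_distances(pos):
    """Euclidean distances between all wine pairs on the sheet.

    Wines placed close together were judged similar in taste,
    so a small distance encodes high perceived similarity."""
    names = sorted(pos)
    dists = {}
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            dists[(a, b)] = math.hypot(pos[a][0] - pos[b][0],
                                       pos[a][1] - pos[b][1])
    return dists

dists = pairwise_distances(positions)
```

Here wine_a and wine_b sit close together on the sheet, so their distance is small compared with either wine's distance to wine_c; judgments like these, pooled over many participants, are what the algorithm learns from.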
“The dimension of flavor that we created in the model provides us with information about which wines are similar in taste and which are not. So, for example, I can stand with my favorite bottle of wine and say: I would like to know which wine is most similar to it in taste – or both in taste and price,” says Thoranna Bender.
Professor and co-author Serge Belongie from the Department of Computer Science heads the Pioneer Centre for AI at the University of Copenhagen. He adds, “We can see that when the algorithm combines the data from wine labels and reviews with the data from the wine tastings, it makes more accurate predictions of people’s wine preferences than when it only uses the traditional types of data in the form of images and text. So, teaching machines to use human sensory experiences results in better algorithms that benefit the user.”
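The combination Belongie describes can be pictured as feature fusion: representations from label images and review text are joined with taste-derived features before training a preference model. A minimal sketch, with all vectors invented and the simplest possible fusion (concatenation) standing in for whatever the actual model does:

```python
# Hypothetical feature vectors; the encoders and values are invented.
image_feats = [0.2, 0.7]   # e.g. from a label-image encoder
text_feats  = [0.5, 0.1]   # e.g. from a review-text encoder
taste_feats = [0.9, 0.3]   # derived from tasting arrangements

# Concatenate into one joint feature vector. A preference model
# trained on `fused` can exploit taste signals that images and
# text alone do not carry.
fused = image_feats + text_feats + taste_feats
```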
Belongie further emphasized the growing trend of using multimodal data in machine learning, which typically includes a mix of images, text, and sound. The addition of taste and other sensory inputs as data sources is a novel approach with immense potential, particularly in the food sector.
“Understanding taste is essential for healthy, sustainable food production, and the use of AI in this context is still in its infancy,” he added.
Thoranna Bender points out that the researchers’ method can easily be transferred to other types of food and drink as well.
Bender expounded, “We’ve chosen wine as a case, but the same method can just as well be applied to beer and coffee. For example, the approach can be used to recommend products and perhaps even food recipes to people. And if we can better understand the taste similarities in food, we can also use it in the healthcare sector to put together meals that meet the tastes and nutritional needs of patients. It might even be used to develop foods tailored to different taste profiles.”
The researchers have made their data publicly available on an open server. Bender expressed enthusiasm about the potential collaborations and developments that could stem from this data. She closed by saying, “I’ve already fielded requests from people who have additional data that they would like to include in our dataset. I think that’s really cool.”
In summary, this development marks a significant step forward in the integration of AI and human sensory experiences, particularly in the realm of food and beverage. By better understanding individual taste preferences, these advanced algorithms can not only enhance personal enjoyment but also contribute to broader applications in sectors like healthcare and sustainable food production.