This week I tried an AR food iOS app called KaBaQ, which lets users see virtual 3D food on their table in-restaurant and when ordering online. As AR develops into a medium for storytelling, more AR applications are appearing for food brands and restaurants, such as KaBaQ, Menu 8, and Domino’s Pizza. The first two translate a menu into lifelike 3D food models; the third is an online pizza constructor that helps customers customize their own pizza and have fun with it.
The KaBaQ app is powered by Apple ARKit and made by QReal, a studio that earned its place in the AR industry by creating some of the most lifelike models of cuisine. Besides visualizing the food on the menu for customers while they order, they also want to make menu presentations available on Instagram, Snapchat, and Facebook to help restaurants promote their brands online. Keeping these intentions in mind, I tried the app on my iPhone to examine how well they realize them.
First I was asked to choose my location; then five restaurants in Brooklyn appeared for me to choose from. I knew none of them, nor their locations, so I picked one at random and came to its online menu. I was asked to scan my table to find the surface. I moved my camera to cover every corner of my table, though I don't know if that counts as scanning. After about 2-3 seconds, a button appeared: “tap to place your food.” I tapped on the table where I wanted to place my food, but it didn't seem to work. Then I realized that it only works if you tap exactly on that button. I assume the placement location is already calculated, so the food cannot go wherever I tap, but the UI design still confused me. Then the model of the chosen food loaded, quite lifelike, with lots of detail. The accuracy of placing the food model in real 3D space was fine but not stable, and it may depend on how completely the table was scanned beforehand. The food model kept rotating, independently of the phone's direction, showing every aspect of the food without having to deal with the camera-angle problem. (Compare Menu 8, in which the food model tries to adjust so that it always faces the camera but doesn't manage it well, reducing the fluency of the experience.) The price and name of the dish were shown alongside. Each restaurant offered around eight dishes on average.
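Out of curiosity about how this scan-and-tap-to-place flow might be built, here is a minimal sketch using Apple's ARKit and SceneKit: detect a horizontal plane, raycast from a screen tap onto it, place the model there, and spin it continuously. The view controller structure and the "dish.scn" asset name are my own assumptions for illustration, not taken from KaBaQ itself.

```swift
import ARKit
import SceneKit

class FoodPlacementViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        view.addSubview(sceneView)
        sceneView.frame = view.bounds

        // Ask ARKit to detect horizontal surfaces such as a table;
        // this is the "scan your table" step.
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal]
        sceneView.session.run(config)

        // "Tap to place your food": convert a screen tap into a point
        // on a detected plane via a raycast.
        let tap = UITapGestureRecognizer(target: self,
                                         action: #selector(handleTap(_:)))
        sceneView.addGestureRecognizer(tap)
    }

    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        let point = gesture.location(in: sceneView)
        guard let query = sceneView.raycastQuery(from: point,
                                                 allowing: .estimatedPlane,
                                                 alignment: .horizontal),
              let result = sceneView.session.raycast(query).first
        else { return }

        // "dish.scn" is a hypothetical bundled 3D asset.
        guard let dishScene = SCNScene(named: "dish.scn"),
              let dishNode = dishScene.rootNode.childNodes.first
        else { return }
        dishNode.simdTransform = result.worldTransform

        // Slow continuous spin, independent of the phone's direction,
        // so every side of the dish is shown.
        let spin = SCNAction.repeatForever(
            SCNAction.rotateBy(x: 0, y: .pi * 2, z: 0, duration: 8))
        dishNode.runAction(spin)
        sceneView.scene.rootNode.addChildNode(dishNode)
    }
}
```

This would also explain the behavior I noticed: if the raycast only succeeds on well-scanned parts of the plane, placement accuracy would depend on how completely the table was covered during scanning.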
What inspired me about this app is how AR can embed sensory experience into the food industry, and how that experience can be promoted through syndication and sharing on social media platforms. However, the problem that remains is how to make this AR-powered sensory experience more engaging and worthwhile. For me, the step from the menu to the real dishes is not far: since I am already in the restaurant, I will see my dishes soon after I place my order, with their smell, crunchy sound, vivid details, and hot, steaming look. All of these senses attract me, and sometimes I don't want to uncover them too early. And even if I do want to learn about a dish ahead of time, I can go to Yelp to see its ratings, with more reliable advice from people who have actually tried it. In other words, the lifelike 3D models are cool, but they neither help me make my choices from the menu nor arouse my interest in the food.
Then I began to think about scenarios in which this AR experience could be applied more effectively. Learning from the critique above, in an effective scenario people should have no easy access to the real food, yet still need rich information about it. That led me to the scenario of buying raw ingredients for cooking at a market. I always want to know how to buy the whole combination of ingredients for a specific dish: when I see some lovely ingredients I want to buy them, but I don't know what else I would need if I cooked them. I also enjoy going to markets with friends; they share a recipe I'm interested in, and I simply buy the same ingredients they do. When I later cook those ingredients, I feel more emotionally connected to them.

So I imagine an AR experience in which people buying raw or half-cooked ingredients in a supermarket can scan them to learn more about how to cook them and what else they need to buy. The possibilities could be shown as still pictures of the finished dishes, lifelike 3D models, or blog posts and social media moments introducing cooking methods. When people select the way they want to cook something, they can get more information about the other ingredients and their locations in the store. It could also solve a common problem: people save cooking tutorials to their wishlists, but when they finally want to try one, they find they forgot to buy the needed ingredients on their last trip to the supermarket. This experience helps bridge the gap between buying new ingredients and cooking. It makes sense to me because the distance from market shopping to cooking at home is big, at least bigger for me than the distance from menu to dishes. Market shopping is also a precious social experience for me, because it can and must still happen even when I am too busy for other social activities.
It can keep my menu fresh and continually arouse my interest in new food. It also creates opportunities for restaurants to promote their dishes and their raw ingredients; although the former seems a weak link, it may help with expanding their customer base.