Pinterest today introduced Complete the Look, a visual search tool being developed for the Home Decor and Fashion categories that makes style recommendations for multiple items in a photo, like shirts, dresses, shoes, or sunglasses. Complete the Look recommendations take into account the context of surrounding objects and elements in a photo, such as season, body type, design aesthetics, and even whether an image was taken indoors or outdoors.
Modeling fashion compatibility can be subjective, but early experiments by Pinterest engineers found Complete the Look to be more accurate in its recommendations than previous systems.
Pinterest's catalog of visual search tools started in 2017 with Lens, a service that went on to power hundreds of millions of searches a month. Complete the Look follows the introduction of full automation for Shop the Look, for buying Pinned items, in February, and personalized results for Lens Your Look in March.
Pinterest details the approach taken for Complete the Look in a paper accepted for publication at the Computer Vision and Pattern Recognition (CVPR) conference, which takes place next week in Long Beach, California. Pinterest shared no details about when Complete the Look visual searches will be made available for Pinterest users.
Complete the Look is a deep feed-forward convolutional neural network trained on the Shop the Look (STL) dataset. It begins by extracting features from scene and product images with the ResNet-50 network, from which it builds a map of style features.
“We learn global embeddings from scene and product images and local embeddings from local regions of the scenes, and measure scene-product compatibility in a unified style space with category-guided attention,” researchers said in the paper.
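The scoring described in that quote can be sketched in a few lines of NumPy. This is an illustrative reconstruction, not Pinterest's code: the embedding sizes, region grid, and equal weighting of the global and local terms are assumptions, and random vectors stand in for the ResNet-50 features.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 128           # embedding dimension (illustrative)
num_regions = 49  # e.g., a 7x7 grid of local scene regions (assumed)

def l2norm(x):
    # Project embeddings onto the unit sphere so squared distances are comparable.
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

scene_global = l2norm(rng.normal(size=d))                 # global scene embedding
product      = l2norm(rng.normal(size=d))                 # product embedding
scene_local  = l2norm(rng.normal(size=(num_regions, d)))  # local region embeddings
category     = l2norm(rng.normal(size=d))                 # product-category embedding

# Global compatibility: distance between scene and product in the style space.
d_global = np.sum((scene_global - product) ** 2)

# Category-guided attention: weight each scene region by its relevance
# to the product's category (softmax over region-category similarities).
logits = scene_local @ category
weights = np.exp(logits - logits.max())
weights /= weights.sum()

# Local compatibility: attention-weighted distances from regions to the product.
d_local = np.sum(weights * np.sum((scene_local - product) ** 2, axis=1))

# Lower hybrid distance means a more compatible scene-product pair
# (the 50/50 weighting here is an assumption).
compatibility_distance = 0.5 * d_global + 0.5 * d_local
print(compatibility_distance)
```

In a trained model, pairs with a smaller hybrid distance would be ranked higher as "complete the look" candidates; here the attention step is what lets, say, a shoe query focus on floor-level regions of the scene.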
Pinterest’s visual search tools compete with Google Lens, which can deliver real-time recommendations for similar clothing styles, identify objects, translate text, or pick the best items from a restaurant menu. Earlier this month, Amazon introduced StyleSnap for fashion recommendations based on photos from magazines, social media, or your camera roll.