HACKINTOSH.ORG | Macintosh discussion forums

Macintosh News => Apple News => Topic started by: HCK on December 14, 2023, 04:05:12 pm



Title: The Ray-Ban Meta smart glasses are getting AI-powered visual search features
Post by: HCK on December 14, 2023, 04:05:12 pm
The Ray-Ban Meta smart glasses are getting AI-powered visual search features

The Ray-Ban Meta smart glasses are about to get some powerful upgrades thanks to improvements to the social network's AI assistant. The company is finally adding support (https://www.threads.net/@boztank/post/C0w5Vz3r0h0) for real-time information to the onboard assistant, and it's starting to test new "multimodal" capabilities that allow it to answer questions based on your environment.
Up to now, Meta AI had a "knowledge cutoff" of December 2022, so it couldn't answer questions about current events or things like game scores, traffic conditions or other queries that would be especially useful while on the go. But that's now changing, according to Meta CTO Andrew Bosworth, who said that all Meta smart glasses in the United States will now be able to access real-time info. The change is powered "in part" by Bing, he added.
<span id="end-legacy-contents"></span><p>Separately, Meta is starting to test one of the more intriguing capabilities of its assistant, which it’s calling “multimodal AI.” The features, first <a data-i13n="elm:context_link;elmt:doNotAffiliate;cpos:2;pos:1" class="no-affiliate-link" href="https://www.engadget.com/metas-metaverse-is-getting-an-ai-makeover-194004996.html" data-original-link="https://www.engadget.com/metas-metaverse-is-getting-an-ai-makeover-194004996.html">previewed[/url] during Connect, allow Meta AI to answer contextual questions about your surroundings and other queries based on what your looking at through the glasses.</p>
(Image: https://s.yimg.com/os/creatr-uploaded-images/2023-12/ef595050-992d-11ee-bfe7-51b5611b6e64 | Credit: Meta)
The updates could go a long way toward making Meta AI feel less gimmicky and more useful, which was one of my top complaints in my initial review (https://www.engadget.com/ray-ban-meta-smart-glasses-review-instagram-worthy-shades-070010365.html) of the otherwise impressive smart glasses. Unfortunately, it will likely still be some time before most people with the smart glasses can access the new multimodal functionality. Bosworth said that the early access beta version will only be available in the US to a "small number of people who opt in" initially, with expanded access presumably coming sometime in 2024.
Both Mark Zuckerberg (https://www.instagram.com/p/C0w4Agjvq5_/?img_index=1) and Bosworth shared a few videos of the new capabilities that give an idea of what may be possible. Based on the clips, it appears users will be able to engage the feature with commands that begin with "Hey Meta, look and tell me." Zuckerberg, for example, asks Meta AI to look at a shirt he's holding and suggest pants that might match. He also shared screenshots showing Meta AI identifying an image of a piece of fruit and translating the text of a meme.
In a video (https://www.threads.net/@boztank/post/C0w5Vz3r0h0) posted on Threads, Bosworth said that users would also be able to ask Meta AI about their immediate surroundings, as well as more creative questions like writing captions for photos they just shot.

This article originally appeared on Engadget at https://www.engadget.com/the-ray-ban-meta-smart-glasses-are-getting-ai-powered-visual-search-features-204556255.html?src=rss

Source: The Ray-Ban Meta smart glasses are getting AI-powered visual search features (https://www.engadget.com/the-ray-ban-meta-smart-glasses-are-getting-ai-powered-visual-search-features-204556255.html?src=rss)