Samsung unveils the Galaxy S24 with Google’s AI

Samsung has unveiled its top-of-the-line Galaxy S24 series smartphones. Inside – a Snapdragon 8 Gen 3 (or Exynos 2400), 120 Hz AMOLED displays from 6.2-inch FHD+ to 6.8-inch Quad HD+, 8 to 12 GB of RAM, 128 GB to 1 TB of storage, and three or four cameras (at best, 200 MP f/1.7 with stabilization + 12 MP f/2.2 + 10 MP f/2.4 with 3x optical zoom and stabilization + 50 MP f/3.4 with 5x zoom and stabilization), all running Android 14 with One UI 6.1. Prices range from $800 for the Samsung Galaxy S24 to $1,300 for the Samsung Galaxy S24 Ultra.

Google’s AI in the Samsung Galaxy S24

More interestingly, Gemini, Google’s advanced family of generative AI models, will appear on the Galaxy S24. Google and Samsung made the announcement at the same time as the Galaxy S24 was unveiled.

On the Galaxy S24 phones, Gemini – specifically Gemini Pro, a mid-range model designed for a range of tasks – will run in Samsung’s Notes, Voice Recorder and Keyboard apps, providing, in Google’s words, “enhanced summarization features.” Users will be able, for example, to record a lecture using Voice Recorder and get a summary of its key points.

Meanwhile, Gemini Nano – a more efficient and compact Gemini model – will enable a new feature in Google Messages, Magic Compose, which lets you compose messages in styles such as “excited,” “formal” and “lyrical” right on the device, without needing an Internet connection.

The Galaxy S24 is only the second Android device to use Gemini Nano, following Google’s own Pixel 8 Pro. On the Pixel 8 Pro, Nano powers features like summarization in the Recorder app and suggested replies in Google’s Gboard keyboard app.

In addition to Gemini, the Galaxy S24 will use Google’s Imagen 2 text-to-image model, which underpins the photo editing features in the Galaxy S24 Gallery app (Imagen 2 was introduced at Google I/O last May and recently launched online in preview). One of those features, Generative Edit, can automatically fill in parts of an image based on the surrounding context.

Google also said Samsung will be one of the first partners to test Gemini Ultra, the largest and most powerful Gemini model, before it becomes widely available to developers and enterprises later this year. However, neither company has said exactly how Ultra might be used in the Galaxy S24 – or in other Samsung devices.

Circle to Search

In addition to the Gemini features, Google has also announced a new way to search on Android phones called “Circle to Search”. The feature lets users search from any app using gestures such as circling, highlighting, scribbling or tapping. Google says the addition is designed to make it more natural to interact with Google Search whenever a question comes up – for example, when you’re watching a video, viewing a photo in a social app, or chatting with a friend in a messaging app.

Say you want to identify something in a video or photo: if you’re watching a video of a Korean corn dog, you can circle it and ask “Why are they so popular?”. The feature can be triggered by other gestures as well. If you’re chatting with a friend in a messaging app about a restaurant, you can simply tap its name to see more information about it. Or you can swipe your finger over a string of words to turn it into a search.

If you’re interested in something visual on the screen, you can circle it. For example, Google suggests circling the sunglasses a video creator is wearing, or scribbling over their shoes, to find related items on Google without having to switch between apps. The scribble (or “shading”) gesture can be used for both images and words, Google notes.

Search results will vary depending on the user’s query and the Google Labs products they’ve selected. With a simple text search, you might see traditional search results, while a query that combines image and text – or “multisearch,” as Google calls it – uses generative AI. And if a user participates in the Google Search Generative Experience (SGE) experiment run by Google Labs, they will be presented with AI-driven answers, as with other SGE queries.

The company believes that being able to start a search from any app will make a big difference, as users will no longer have to interrupt what they’re doing to launch a search or take a screenshot to look something up later. Turning the entire Android platform into a search surface isn’t just a “meaningful” change for consumers; it’s also a recognition that Google’s search business needs to be strengthened by deeper integration with the smartphone OS itself.

The Circle to Search feature will launch on January 31 on the new Galaxy S24 series smartphones announced today at Samsung’s event, as well as on premium Android smartphones including the Pixel 8 and Pixel 8 Pro. It will be available in every language and region where these phones are sold, and Google says more Android smartphones will support the feature over time.
