Queryable is an open-source iOS app that uses the CLIP model to search the Photos album offline. Unlike the object-recognition-based search built into the iOS gallery, Queryable lets you search your gallery with natural-language phrases such as "a brown dog sitting on a bench". Because it runs entirely offline, your album never leaves the device and cannot be leaked to anyone, including Apple or Google.
How does Queryable work?
- Process all photos in your album through the CLIP Image Encoder to create a set of local image vectors.
- When a new text query is entered, convert the text into a text vector using the Text Encoder.
- Compare the text vector with all the stored image vectors to measure how similar the query is to each image.
- Sort and return the top K most similar results.
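The steps above can be sketched with a small NumPy example. The data here are toy values (the real app uses 512-dimensional CLIP embeddings from ViT-B/32), and the function name is mine, not from the Queryable codebase:

```python
import numpy as np

def top_k_similar(text_vec, image_vecs, k=3):
    """Return indices of the k image vectors most similar to the text vector.

    Both sides are L2-normalized first, so the dot product equals
    cosine similarity -- the metric CLIP is trained with.
    """
    text_vec = text_vec / np.linalg.norm(text_vec)
    image_vecs = image_vecs / np.linalg.norm(image_vecs, axis=1, keepdims=True)
    sims = image_vecs @ text_vec          # one cosine similarity per image
    return np.argsort(-sims)[:k]          # indices, descending similarity

# Toy example: 5 fake "image vectors" and one "text vector" of dimension 4.
rng = np.random.default_rng(0)
image_vecs = rng.normal(size=(5, 4))
text_vec = image_vecs[2] + 0.01 * rng.normal(size=4)  # query close to image 2
print(top_k_similar(text_vec, image_vecs, k=2))       # index 2 ranks first
```

In the app this comparison runs against the precomputed image vectors stored on device, so only the text query needs to be encoded at search time.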
For more details, please refer to my article Run CLIP on iPhone to Search Photos.
Run on Xcode
Download the TextEncoder_float32.mlmodelc and ImageEncoder_float32.mlmodelc models from Google Drive. Clone this repo, place the downloaded models under the CoreMLModels/ path, and run Xcode; it should work.
Core ML Export
If you only want to run Queryable, you can skip this step and directly use the exported models from Google Drive. If you wish to build a Queryable that supports your own native language, or to do some model quantization/acceleration work, here are some guidelines.
The trick is to separate the TextEncoder and ImageEncoder at the architecture level and then load the model weights individually. Queryable uses OpenAI's ViT-B/32 model, and I wrote a Jupyter notebook demonstrating how to separate, load, and export the Core ML models. The Core ML export of the ImageEncoder has a certain amount of precision error, and more appropriate normalization parameters may be needed.
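As a minimal sketch of the weight-splitting idea: in OpenAI's CLIP checkpoints, the image-encoder weights are stored under the `visual.` key prefix, while everything else (token embeddings, text transformer, text projection) belongs to the text side. The helper below splits a checkpoint dictionary on that prefix; the function name and the toy stand-in values are mine:

```python
def split_clip_state_dict(state_dict):
    """Split a CLIP checkpoint's weights into image- and text-encoder parts.

    In OpenAI CLIP checkpoints, all image-encoder (ViT) weights live under
    the "visual." prefix; the remaining keys belong to the text encoder.
    """
    image_weights = {k: v for k, v in state_dict.items()
                     if k.startswith("visual.")}
    text_weights = {k: v for k, v in state_dict.items()
                    if not k.startswith("visual.")}
    return image_weights, text_weights

# Toy checkpoint with stand-in values instead of real tensors.
ckpt = {
    "visual.conv1.weight": 1,
    "visual.transformer.resblocks.0.attn.in_proj_weight": 2,
    "token_embedding.weight": 3,
    "text_projection": 4,
}
img, txt = split_clip_state_dict(ckpt)
print(sorted(img))  # only the "visual." keys
print(sorted(txt))  # the text-side keys
```

Each half can then be loaded into its own standalone module and traced/exported to Core ML separately, which is what allows the app to run the text encoder alone at query time.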
Disclaimer: I am not a professional iOS engineer, so please forgive my poor Swift code. You may want to focus only on the loading, computation, storage, and sorting of the model.
You can apply Queryable to your own product, but I don't recommend simply modifying the appearance and listing it on the App Store. If you are interested in optimizing certain aspects (such as #3, #4, #5, #6), feel free to submit a PR (Pull Request).