MGIE, which stands for MLLM-Guided Image Editing, is a model developed by Apple and the University of California, Santa Barbara, that leverages multimodal large language models (MLLMs) to edit images based on natural language instructions.
The model can crop, resize, flip, and add filters to images entirely through text prompts... which suggests that iOS 18 (or something thereabouts) may ship with a bunch of sweet on-device AI editing functionality!
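To make the idea concrete, here's a toy sketch of instruction-driven editing: a dispatcher maps a text prompt to an image operation on a nested-list "image". This is purely illustrative and not MGIE's actual API; the real model uses an MLLM to interpret far richer, more expressive instructions.

```python
def apply_instruction(pixels, instruction):
    """Toy dispatcher: map a text prompt to a simple edit.

    `pixels` is a list of rows (a stand-in for a real image).
    A real MLLM-guided editor derives the edit from the language
    model's understanding of the prompt, not keyword matching.
    """
    text = instruction.lower()
    if "flip" in text:
        # Mirror each row horizontally
        return [row[::-1] for row in pixels]
    if "crop" in text:
        # Keep the central half of the image in both dimensions
        h, w = len(pixels), len(pixels[0])
        return [row[w // 4 : 3 * w // 4] for row in pixels[h // 4 : 3 * h // 4]]
    return pixels  # Unrecognized instruction: return the image unchanged

image = [[0, 1, 2, 3]] * 4  # 4x4 toy "image"
print(apply_instruction(image, "flip the image horizontally")[0])  # [3, 2, 1, 0]
```

The point is the interface, not the implementation: the user supplies plain language, and the system decides which pixel-level operation to run.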