Inference engine for Apple Silicon
We've launched our own inference engine, written from scratch for Apple Silicon. It's open source:
https://github.com/trymirai/uzu
With it you can easily run LLMs on your Mac. You can learn more on our website: https://trymirai.com/


Replies
Amazing work! What about iPhones?