Hawkeye

Conduct eye tracking tests with your iPhone

Hawkeye lets you conduct eye tracking tests using an iPhone or iPad, no extra hardware required. Understand where users look in your products, whether you're a designer, marketer, or PM.
Discussion
Hey PH, Matt from Hawkeye Labs here! Hawkeye makes it easy to conduct eye tracking tests using an iPhone or iPad without any extra hardware.

Traditionally, conducting eye tracking tests can be a real pain. You have to buy expensive eye tracking hardware and conduct all your tests in person. So even though eye tracking provides incredibly valuable insights into user behavior, the testing process could be way better.

By using the TrueDepth camera on newer iOS devices, Hawkeye tracks eye movements without any extra hardware. This makes the process way cheaper and allows far more people to participate in tests. The app is also flexible, so it works for UX researchers, marketers, and more!

Highlights:
- Conduct tests without any extra hardware 🎥👀
- View heatmaps, focus point diagrams, and screen recordings 📈
- Easily share results with your team 💬
- Requires an iPhone X, XS, XR, or new iPad Pro 📱

I'd love to hear your feedback and am happy to answer any questions :)
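For the technically curious: ARKit's face tracking on TrueDepth devices exposes per-eye transforms and a gaze estimate (ARFaceAnchor.lookAtPoint), which is the kind of data an approach like this can build on. Here's a rough, illustrative sketch of reading that data; the class is only a sketch, not necessarily how Hawkeye does it:

```swift
import ARKit

// Minimal sketch of reading gaze data from the TrueDepth camera via ARKit.
// Everything other than ARKit's own types and methods is illustrative.
final class GazeTracker: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        // Face tracking needs a TrueDepth camera (iPhone X-class devices, newer iPad Pros).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors where faceAnchor.isTracked {
            // Per-eye poses and an estimated gaze target, in face coordinate space.
            let leftEye = faceAnchor.leftEyeTransform
            let rightEye = faceAnchor.rightEyeTransform
            let lookAtPoint = faceAnchor.lookAtPoint

            // Mapping this estimate onto screen coordinates, smoothing it, and
            // calibrating it per user is where the real work lives; omitted here.
            _ = (leftEye, rightEye, lookAtPoint)
        }
    }
}
```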
awesome job matt!
Matt has built the future of user testing with Hawkeye and it is incredible to use! On top of that, he's a great founder who is iterating fast. Keep your eyes on this one 👀
Well done, @thefuturematt. Is the tracking as accurate as illustrated in the video? I'm curious if you have any data on this.
@rrhoover Thanks so much for checking out Hawkeye! Currently, accuracy definitely varies from person to person. It can get as good as in the video when the app is performing at its best; I actually recorded the test shown in the video using the app, although I'm a pretty experienced user, as you might expect :)

I've found there are a bunch of circumstantial factors that can influence accuracy, like the distance of the user's face from the phone, whether or not the user wears glasses, and lighting conditions. There's also just a lot of variation in how different people's eyes are shaped and move. During our beta, the eye tracking worked well enough that most testers were able to easily understand their results, while a few people had trouble getting high enough accuracy.

The app also collects some anonymized metrics during the calibration process that measure the accuracy of the calibration and all the circumstantial factors involved. This data should be useful both to measure accuracy and to help improve it going forward!
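As a concrete illustration of what "measuring the accuracy of the calibration" can mean, here is a hypothetical sketch (not Hawkeye's actual code) that summarizes calibration error as the average distance between the known calibration targets and the estimated gaze points:

```swift
import CoreGraphics

// Illustrative sketch of one way calibration accuracy could be summarized:
// compare estimated gaze points against the known on-screen calibration targets.
// The types and function here are hypothetical, not Hawkeye's actual metrics.
struct CalibrationSample {
    let target: CGPoint    // where the calibration dot was shown
    let estimate: CGPoint  // where the tracker placed the user's gaze
}

func meanCalibrationError(_ samples: [CalibrationSample]) -> CGFloat {
    guard !samples.isEmpty else { return 0 }
    let totalError = samples.reduce(CGFloat(0)) { sum, sample in
        let dx = sample.estimate.x - sample.target.x
        let dy = sample.estimate.y - sample.target.y
        return sum + (dx * dx + dy * dy).squareRoot()
    }
    return totalError / CGFloat(samples.count)  // average distance in points
}
```

A per-user number like this is also a reasonable way to correlate accuracy with the circumstantial factors mentioned above (distance from the phone, glasses, lighting).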
Good idea. There is also a use case for packaging this as an SDK that other app developers can embed in their own apps. You could then provide analytics on the backend for those developers.
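To make that suggestion concrete, an SDK along those lines might expose something like the following; this is an entirely hypothetical API surface sketched for illustration, not anything Hawkeye ships:

```swift
import UIKit

// Purely hypothetical sketch of an embeddable eye-tracking SDK surface for host apps.
// None of these names are real Hawkeye APIs.
protocol GazeAnalyticsSession {
    /// Begin tracking gaze within the host app's view.
    func start(in view: UIView)
    /// Stop tracking and finalize the recorded session.
    func stop()
    /// Send anonymized results to the analytics backend.
    func upload(completion: @escaping (Result<Void, Error>) -> Void)
}
```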