Hawkeye Access

Control your iOS device using your eyes

#4 Product of the Day · October 25, 2018

Hawkeye Access lets you control your iOS device using your eyes. Browse any website, hands-free, all through eye movements. For people with motor impairments, this makes browsing the web much easier!

Discussion

Matt Moss (Maker) @thefuturematt · CEO at Hawkeye Labs
Hey PH, Matt from Hawkeye Labs here! Hawkeye Access lets anyone control an iOS device using their eyes. After building a rough eye tracking demo while at WWDC as a student scholarship winner (https://twitter.com/thefuturemat...), I decided to spend the rest of my summer break from college launching Hawkeye Labs and building our first product, Hawkeye Access. The most common response I got on Twitter to my original demo was how impactful this technology could be for people with motor impairments. Super excited to launch a product that I think can have a real impact for people with disabilities!

Highlights:
- Browse any website, hands-free.
- Just look at any button or link and smile, blink, or hold your gaze to select.
- Look at the edges of the screen to scroll.
- Look in the bottom right corner to go home.
- Requires an iPhone X, XS, or XR.

Would love to get some feedback and am happy to answer any questions!
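For readers curious how gaze-plus-gesture selection like this can be built on the TrueDepth camera, here is a minimal sketch assuming ARKit's face-tracking API (ARFaceAnchor's lookAtPoint and blend shapes). The GazeSelector type, the thresholds, and the select(at:) hook are illustrative assumptions, not Hawkeye's actual code.

```swift
import ARKit

/// Minimal sketch of gaze-plus-gesture selection, assuming ARKit's
/// TrueDepth face tracking (ARFaceAnchor). The type name and threshold
/// values are illustrative only.
final class GazeSelector: NSObject, ARSessionDelegate {
    let session = ARSession()

    // Tunable thresholds (assumed values).
    private let smileThreshold: Float = 0.6
    private let blinkThreshold: Float = 0.8

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else {
            print("Face tracking requires a TrueDepth camera (iPhone X or later).")
            return
        }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }

        // Estimated gaze point in face-anchor space; a real app would
        // project this through the camera transform onto screen coordinates.
        let gaze = face.lookAtPoint

        // Facial gestures come from blend-shape coefficients (0...1).
        let smile = max(face.blendShapes[.mouthSmileLeft]?.floatValue ?? 0,
                        face.blendShapes[.mouthSmileRight]?.floatValue ?? 0)
        let blink = min(face.blendShapes[.eyeBlinkLeft]?.floatValue ?? 0,
                        face.blendShapes[.eyeBlinkRight]?.floatValue ?? 0)

        if smile > smileThreshold || blink > blinkThreshold {
            select(at: gaze)
        }
    }

    private func select(at gazePoint: simd_float3) {
        // Hit-test the projected point against on-screen links/buttons here.
    }
}
```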
Greg Hundermark @ghundermark · English Teacher, Pequea Valley SD
@thefuturematt Hi Matt. I just downloaded Hawkeye and I can't get past the calibration screen. I am looking at the dot that is centered, but it doesn't register my eye or progress to the next screen. How do I know if my face is "in the camera frame"? Any ideas?
Matt Moss (Maker) @thefuturematt · CEO at Hawkeye Labs
@ghundermark Hey Greg, really appreciate you trying out the app! A few things that can help calibration:
1) Use the app while sitting down with the phone in front of you.
2) Clean or remove any reflective glasses you may be wearing.
3) Keep your head and the phone as still as possible.
4) Make sure you're using an iPhone X, XS, or XR.

Let me know if this helps! If calibration continues to be a problem, you can tap the screen to manually advance, although eye tracking performance will likely be worse since it won't have gotten as good a read on your eyes.
Candice Richter @candice_richter · Assistive Technology & SLP-Assistant
@thefuturematt Hi Matt! Candice here, down in South Texas. I am an Assistive Technology specialist and SLP-Assistant. I've worked with the home health pediatric population and currently work in a school setting. AAC (Augmentative and Alternative Communication) is my passion, and I work with multiple kiddos who have complex communication deficits due to their complex disabilities. ACCESS to communication is one of the most challenging things for these kiddos.

I love the app. I have only dabbled with it, but take a second and pretend you can't speak with your natural voice. I went into Amazon right away, which is super cool, but when I went to search for an item it prompted me to speak into the phone using speech recognition. I looked in the settings to try to find an option to bring up the keyboard instead, but I didn't see anything. It would be nice if you could search using eye gaze with a keyboard: basically the ability to "type with your eyes," an iPhone keyboard with word prediction.

I think this is fantastic and has the potential to be a complete game changer for access for many people. I can't wait for the new model iPad to come out so that we will have eye-gaze access to multiple AAC apps. LAMP Words for Life... if you haven't heard of it, check it out. We reached out to the app developer to check out the app and I'm sure he will be contacting you soon.
Matt Moss (Maker) @thefuturematt · CEO at Hawkeye Labs
@candice_richter Hey Candice, really appreciate the feedback! Speech is currently the only method of text input for a few different reasons. Because of the way iOS is designed, it's nearly impossible for us to let the user control the standard iOS keyboard with their eyes. It's also very challenging to highlight buttons that small through eye movements, simply because the phone screen isn't all that big. As you mentioned, an iPad with a TrueDepth camera would open up a lot of possibilities for text input. With so much more screen real estate, it would be much easier to design an eye-controllable keyboard. It's definitely something we're looking into :) Also love the potential of AAC through eye control. I think it would be a life-changing feature for a lot of people!
@candice_richter @thefuturematt Hi Matt. If the eye gaze works in more or less the same fashion as our existing infrared systems (Tobii, Iris Bond, MyGaze, TM 5, LC Edge and Alea), then larger screens won't make targeting much easier: the movement of the eye remains the same but is amplified across larger changes in the position of the cursor. For people who can't achieve the movement resolution required for a full keyboard, we switch to a 2-step keyboard with letters grouped (a little like T-9). We don't have an iPhone X-series at Ace Centre so can't test your app yet, but thanks for the innovation and I'm really looking forward to trialing it. If it's practical, it'll be an amazing option for users. And certainly the iPad would be more practical than the iPhone. In the meantime you could try using your browser to visit coughdrop.com or cboard.io, which are both Augmentative/Alternative Communication apps that run online. Cheers, Charlie.
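For readers unfamiliar with the 2-step grouped keyboard Charlie describes, here is a minimal Swift sketch of the idea: a first dwell selects a large letter group, a second dwell selects a letter inside it. The TwoStepKeyboard type, the groupings, and the selectGroup/selectLetter calls are assumptions for illustration, not Hawkeye's or Ace Centre's actual implementation.

```swift
/// Minimal sketch of a T-9-style two-step keyboard for low-resolution gaze input.
struct TwoStepKeyboard {
    // Coarse groups keep targets large enough for imprecise eye pointing.
    let groups: [[Character]] = [
        ["a", "b", "c"], ["d", "e", "f"], ["g", "h", "i"],
        ["j", "k", "l"], ["m", "n", "o"], ["p", "q", "r", "s"],
        ["t", "u", "v"], ["w", "x", "y", "z"]
    ]

    private(set) var activeGroup: Int? = nil
    private(set) var text = ""

    /// Step 1: gaze dwell on one of the eight large group tiles.
    mutating func selectGroup(_ index: Int) {
        guard groups.indices.contains(index) else { return }
        activeGroup = index
    }

    /// Step 2: gaze dwell on one of the few letters in the expanded group.
    mutating func selectLetter(_ index: Int) {
        guard let g = activeGroup, groups[g].indices.contains(index) else { return }
        text.append(groups[g][index])
        activeGroup = nil   // collapse back to the group view
    }
}

// Typing "hi": group at index 2 ("g h i"), then letters at index 1 and 2.
var kb = TwoStepKeyboard()
kb.selectGroup(2); kb.selectLetter(1)   // "h"
kb.selectGroup(2); kb.selectLetter(2)   // "i"
print(kb.text)                          // "hi"
```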
Jordan Gonen @jrdngonen · trying my best.
Awesome!!
Riley Walz @w · High schooler who ships 💯
Hey Matt, this is awesome! It has huge potential as an accessibility tool. Congrats, and can't wait to see where this goes!
Matt Moss (Maker) @thefuturematt · CEO at Hawkeye Labs
@w Thanks Riley, really excited about how the app can help people with disabilities!
Lama Al Rajih @lamaalrajih
Hey, Matt. I'm with @w on the great potential for this as an accessibility tool. Why is it limited to the X, XS, or XR?
Matt Moss (Maker) @thefuturematt · CEO at Hawkeye Labs
@w @lamaalrajih Hey Lama, thanks for the comment! The app uses the TrueDepth camera to track where the user looks on the screen. This camera is currently only available on the iPhone X, XS, and XR. We hope to expand support as more TrueDepth-equipped devices are released!
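As a hedged illustration of how an app can gate on TrueDepth availability at runtime rather than hard-coding device models, the sketch below uses ARKit's and AVFoundation's public capability checks; this is an assumed approach, not necessarily how Hawkeye does it.

```swift
import ARKit
import AVFoundation

// Assumed capability check: TrueDepth camera present and ARKit face tracking supported.
let hasTrueDepthCamera = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                                 for: .video,
                                                 position: .front) != nil
let supportsFaceTracking = ARFaceTrackingConfiguration.isSupported

if hasTrueDepthCamera && supportsFaceTracking {
    // Eye tracking can run (iPhone X, XS, XR and later TrueDepth devices).
} else {
    // Show an explanatory message and disable eye control.
}
```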
Zenno Bruinsma @iamzb_ · Learning everyday
This is so cool! The possibilities with this technology are huge. For example, in hospitals, older people would no longer need to use a remote control to watch Netflix ;) Great job, keep going!
Matt Moss (Maker) @thefuturematt · CEO at Hawkeye Labs
@iamzb_ Thanks Zenno! Really excited about how the app can help people, especially those with disabilities!