Using your eyes to operate your smartphone might sound like something out of a science fiction film.
But for millions of iPhone users worldwide, it's about to happen.
Apple has confirmed that Eye Tracking is coming to the iPhone and iPad.
The tool will let users control their Apple devices with just their eyes, thanks to artificial intelligence (AI).
'Eye Tracking uses the front-facing camera to set up and calibrate in seconds, and with on-device machine learning, all data used to set up and control this feature is kept securely on the device, and isn't shared with Apple,' according to Apple.
Apple last week unveiled several new accessibility features, including the eye-tracking tool.
'We believe deeply in the transformative power of innovation to enrich lives,' said Tim Cook, Apple's chief executive.
'For this reason, Apple has supported inclusive design for almost 40 years, integrating accessibility into the foundation of both our hardware and software.
'We're continuously pushing the boundaries of technology, and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users.'
Eye Tracking works across iPadOS and iOS apps and needs no additional hardware or accessories.
Once it is set up, users can navigate through the elements of an app and use Dwell Control to activate each one, accessing functions such as physical buttons, swipes and other gestures with their eyes alone.
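Apple has not published a developer API for Eye Tracking, but the dwell idea itself is easy to illustrate: an on-screen target fires its action once the estimated gaze point has rested on it for a set time. The Swift sketch below is a generic, hypothetical illustration of that logic – the gaze input is assumed to come from elsewhere – and is not Apple's implementation.

```swift
import Foundation
import CoreGraphics

// Generic dwell-selection logic: report a "click" once the gaze point has
// stayed inside a target rectangle for the full dwell time. Hypothetical
// illustration only; Apple's Eye Tracking has no public API.
final class DwellSelector {
    private let dwellTime: TimeInterval
    private var gazeEnteredAt: Date?

    init(dwellTime: TimeInterval = 1.0) {
        self.dwellTime = dwellTime
    }

    /// Feed the latest gaze point; returns true when the dwell completes.
    func update(gaze: CGPoint, target: CGRect, now: Date = Date()) -> Bool {
        guard target.contains(gaze) else {
            gazeEnteredAt = nil          // gaze left the target, reset the timer
            return false
        }
        if let start = gazeEnteredAt {
            if now.timeIntervalSince(start) >= dwellTime {
                gazeEnteredAt = nil      // fire once, then re-arm
                return true
            }
        } else {
            gazeEnteredAt = now          // gaze just entered the target
        }
        return false
    }
}
```

In practice such a selector would be fed a stream of gaze estimates many times a second and paired with visual feedback, such as a shrinking ring, so the user can see the dwell progressing.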
Although the new feature won't be available until "later this year," it has already attracted a lot of interest on X, formerly known as Twitter.
"For those with specific disabilities, this is fantastic news; however, for everyone else, it simply means becoming more lazy than we already are," one user tweeted.
Another added: 'The black mirror episodes making more sense every day.'
And one joked: 'Oh this generation is about to be the laziest generation ever.'
As part of its suite of new features, the tech giant also unveiled a function that it claims could reduce motion sickness for passengers in moving vehicles.
Citing research, Apple says motion sickness is commonly caused by a sensory conflict between what a person sees and what they feel.
Its new Vehicle Motion Cues feature aims to reduce this by displaying animated dots at the edges of the screen that represent changes in the vehicle's motion.
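To make the idea concrete, here is a minimal SwiftUI sketch of the same concept: dots that shift with the device's measured acceleration, read through the public CoreMotion framework. It illustrates the principle only; Vehicle Motion Cues is a built-in system feature, and the layout, update rate and motion mapping here are assumptions.

```swift
import SwiftUI
import CoreMotion

// Dots near the screen edges that shift with device acceleration, so what the
// eyes see roughly matches what the body feels. Illustrative sketch only.
struct MotionCueDots: View {
    @State private var offset: CGFloat = 0
    private let motion = CMMotionManager()

    var body: some View {
        HStack {
            dotsColumn
            Spacer()
            dotsColumn
        }
        .padding()
        .onAppear(perform: startMotionUpdates)
    }

    private var dotsColumn: some View {
        VStack(spacing: 24) {
            ForEach(0..<6, id: \.self) { _ in
                Circle().fill(.secondary).frame(width: 8, height: 8)
            }
        }
        .offset(y: offset)
        .animation(.easeOut(duration: 0.2), value: offset)
    }

    private func startMotionUpdates() {
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 1.0 / 30.0
        motion.startDeviceMotionUpdates(to: .main) { data, _ in
            guard let accel = data?.userAcceleration else { return }
            // Map forward/backward acceleration to a small vertical shift
            // of the dots; the scale factor is an arbitrary assumption.
            offset = CGFloat(accel.z * 40)
        }
    }
}
```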
Another new feature is Music Haptics, which uses the haptic engine in the iPhone – the component that powers the device's vibrations – to let users who are deaf or hard of hearing experience music through vibrations synced to the audio.
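The haptic engine Apple refers to is exposed to developers through the Core Haptics framework, so the general idea can be sketched in a few lines: play a tap whose strength follows an audio level. This is a hedged illustration of the concept, assuming a normalised level value supplied by the caller, and not how Music Haptics itself is built.

```swift
import CoreHaptics

// Drive the iPhone's haptic engine from audio-style intensity values.
// A sketch of the idea behind Music Haptics, not Apple's implementation.
final class BeatHaptics {
    private var engine: CHHapticEngine?

    init() throws {
        // Haptics are unavailable on some devices (e.g. most iPads).
        guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }
        engine = try CHHapticEngine()
        try engine?.start()
    }

    /// Play one tap whose strength follows a normalised audio level (0...1).
    func tap(level: Float) throws {
        let intensity = CHHapticEventParameter(parameterID: .hapticIntensity, value: level)
        let sharpness = CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.5)
        let event = CHHapticEvent(eventType: .hapticTransient,
                                  parameters: [intensity, sharpness],
                                  relativeTime: 0)
        let pattern = try CHHapticPattern(events: [event], parameters: [])
        let player = try engine?.makePlayer(with: pattern)
        try player?.start(atTime: CHHapticTimeImmediate)
    }
}
```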
In addition, Apple said it will roll out new speech features for customers with conditions that affect speech. These will allow users to assign custom utterances that the virtual assistant Siri can understand in order to launch shortcuts in apps.
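Those custom utterances are set up by the user in Settings rather than in code, but the shortcuts they trigger are the kind an app can already expose through Apple's public App Intents framework. The sketch below shows a hypothetical 'Open Journal' intent and shortcut phrase for illustration only; the names are invented and this is not Apple's implementation of the new speech features.

```swift
import AppIntents

// A hypothetical app intent that Siri can trigger via an App Shortcut phrase.
struct OpenJournalIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Journal"

    func perform() async throws -> some IntentResult {
        // App-specific navigation would go here.
        return .result()
    }
}

// Registers a spoken phrase for the intent; "\(.applicationName)" is required
// in App Shortcut phrases and resolves to the app's name at runtime.
struct JournalShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(intent: OpenJournalIntent(),
                    phrases: ["Open my journal in \(.applicationName)"],
                    shortTitle: "Open Journal",
                    systemImageName: "book")
    }
}
```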