According to Apple, machine learning can make a difference for accessibility and for people’s health. This emerges from the speech given by Ge Yue, Apple vice president and managing director of Apple in China, at the World Artificial Intelligence Conference in Shanghai. The executive also explained how.
Apple: Machine learning can help with accessibility and health
NPR reports the executive’s speech, which, as Apple has accustomed us to, never separates ideas from products. So to talk about accessibility, Yue explains how Apple Watch and AirPods integrate features designed to ensure access to technology for everyone.
“We believe that the best products in the world must meet everyone’s needs. Accessibility is one of our founding values and an important part of our products. We are committed to producing products that are truly suitable for everyone.”
And she continues: “We know that machine learning can help provide independence and convenience for people with disabilities, including people with impaired vision or hearing, people with physical and motor disabilities, and people with cognitive impairments.”
For instance, AssistiveTouch on Apple Watch for those who have difficulty moving their upper limbs, or AI-powered eye tracking for iPad. And then there are also smaller innovations, such as the ability to say “Hey Siri, end the call” to hang up.
But beyond accessibility, machine learning can also become a resource for health. The executive explains: “Our exploration into the world of health has just begun.” That is because Apple believes that “machine learning and sensor technology have unlimited potential to provide insights and encourage healthy lifestyles.”
So it looks like Apple wants to use this technology for these two worthwhile goals. Indeed, it is already doing so. And perhaps with today’s announcement in Cupertino, other products dedicated to these purposes will also arrive.