New iOS 14 And macOS 11 Features Will Let Developers Track Hand And Body Movements

Apple’s annual WWDC event is underway and, while it is online-only for the first time, that hasn’t stopped the company from making some big announcements.

We saw iOS 14, iPadOS 14, watchOS 7, and macOS 11 Big Sur announced on Monday and now we’ve also heard about a new framework that will allow iPhones and Macs to track hand and body movements.

Detailed in a WWDC session that developers can watch online, the framework will allow apps to track movements and poses and then carry out actions based on what they see. Apple’s session description reads:

“Explore how the Vision framework can help your app detect body and hand poses in photos and video. With pose detection, your app can analyze the poses, movements, and gestures of people to offer new video editing possibilities, or to perform action classification when paired with an action classifier built in Create ML.”
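To give a sense of how the new API fits together, here is a minimal sketch of hand pose detection using the Vision classes Apple introduced with iOS 14 and macOS 11. The function name and the confidence threshold are illustrative, not from Apple's sample code.

```swift
import Vision

// A minimal sketch: detect hand poses in a still image and return the
// position of each detected hand's thumb tip. Assumes the iOS 14 /
// macOS 11 Vision APIs; the helper name and 0.3 threshold are arbitrary.
func detectThumbTips(in image: CGImage) throws -> [CGPoint] {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 2   // track up to two hands per frame

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])

    var thumbTips: [CGPoint] = []
    for observation in request.results ?? [] {
        // Each observation exposes named joints with a 0...1 confidence.
        let thumbTip = try observation.recognizedPoint(.thumbTip)
        if thumbTip.confidence > 0.3 {
            // Vision reports normalized coordinates (origin at bottom-left).
            thumbTips.append(thumbTip.location)
        }
    }
    return thumbTips
}
```

For live video, the same request would typically be run per-frame from an AVCaptureSession delegate rather than against a still image.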

Apple provided a few examples of how the new framework could be used. It suggested that an app could watch hand movements and overlay a hand emoji matching the pose being struck. Other ideas include a fitness app that tracks exercises, and even an app that could let people write in the air and have an iPhone turn the movement into text.
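The fitness example would build on the companion body pose request. The sketch below pulls the recognized body joints from a single frame, the kind of raw data an app could feed into a Create ML action classifier or use to count repetitions; the function name and threshold are assumptions for illustration.

```swift
import Vision

// A sketch of body pose detection: extract recognized joints from one
// frame so that, for example, a fitness app could track wrist height
// over time to count reps. Assumes iOS 14 / macOS 11 Vision APIs.
func bodyJoints(in frame: CGImage) throws
    -> [VNHumanBodyPoseObservation.JointName: CGPoint] {
    let request = VNDetectHumanBodyPoseRequest()
    let handler = VNImageRequestHandler(cgImage: frame, options: [:])
    try handler.perform([request])

    // Take the first detected person, if any.
    guard let observation = request.results?.first else { return [:] }

    var joints: [VNHumanBodyPoseObservation.JointName: CGPoint] = [:]
    for (name, point) in try observation.recognizedPoints(.all)
        where point.confidence > 0.3 {
        joints[name] = point.location   // normalized image coordinates
    }
    return joints
}
```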

While none of this will mean much until developers make use of it, the potential of this new framework is huge. Hopefully plenty of apps are already taking shape in developers’ heads right now.
