The Windows Apps Team has released a new set of APIs for developers to add contextual sensing to their Windows 10 apps.
These APIs use information from various sensors in your devices (sensors like a pedometer or a barometer) to add context-based functionality to Windows 10 apps. This is similar to health apps on Windows Phone, like Microsoft Health or Runtastic, that use sensors to track activity levels and step counts. However, it can go beyond just health and fitness tracking and apply to a wider range of activities.
New Activity Detection APIs allow apps to know the user's motion context, such as whether someone is walking, running, in a vehicle, biking, or stationary, or whether the device is idle and unused. Among the examples the Windows Apps Team provides are apps that change their behavior based on the user's motion context, for instance saving power when the device is idle or auto-adjusting camera focus when the user is walking or running.
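To make the idea concrete, here is a minimal sketch of app-side logic that reacts to a reported motion context. The activity names mirror the states listed above, but the function and mode names are illustrative assumptions, not part of the Windows API itself:

```python
# Illustrative policy only: map a reported motion context to an app mode.
# The activity names follow the states described in the article
# (walking, running, in a vehicle, biking, stationary, idle).

POWER_SAVING_ACTIVITIES = {"stationary", "idle"}
MOTION_ACTIVITIES = {"walking", "running"}

def choose_app_mode(activity: str) -> str:
    """Pick an app behavior for the current activity (hypothetical modes)."""
    if activity in POWER_SAVING_ACTIVITIES:
        return "power-saving"           # dim the screen, defer background work
    if activity in MOTION_ACTIVITIES:
        return "continuous-autofocus"   # keep the camera focused on the move
    if activity == "invehicle":
        return "driving-ui"             # larger controls, voice prompts
    return "default"
```

In a real app this dispatch would run inside the sensor's reading-changed event handler rather than being called directly.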
The new Barometer API returns information to the app on the user's elevation based on barometric pressure. This could enable indoor navigation, such as knowing which floor of the parking garage you parked your car on, as well as weather forecasting or more detailed health and fitness tracking.
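The sensor itself only reports air pressure; the elevation estimate is derived from it. A minimal sketch of that derivation, using the standard international barometric formula and an assumed floor height of about 3 meters for the parking-garage case:

```python
SEA_LEVEL_HPA = 1013.25  # standard reference pressure in hectopascals

def pressure_to_altitude(pressure_hpa: float,
                         reference_hpa: float = SEA_LEVEL_HPA) -> float:
    """Approximate altitude in meters relative to the reference pressure,
    via the international barometric formula."""
    return 44330.0 * (1.0 - (pressure_hpa / reference_hpa) ** (1.0 / 5.255))

def estimate_floor(pressure_hpa: float,
                   ground_floor_hpa: float,
                   floor_height_m: float = 3.0) -> int:
    """Guess which floor the device is on, given a pressure reading
    previously taken at ground level (floor heights vary by building)."""
    delta_m = pressure_to_altitude(pressure_hpa, reference_hpa=ground_floor_hpa)
    return round(delta_m / floor_height_m)
```

Because absolute pressure drifts with the weather, an app would calibrate against a recent ground-level reading (as `estimate_floor` does) rather than trust sea-level altitude alone.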
The Windows Apps Team also introduced a Proximity API that can enable a device to wake up or log off based on the user's proximity, turn off the display during a phone call, ignore accidental clicks made when the device is in the user's pocket, and detect gestures.
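A proximity sensor essentially reports whether an object is nearby; everything else is app-side policy. A small sketch of two of the behaviors mentioned above, with hypothetical function names:

```python
# Illustrative app-side policy built on a boolean "object detected" reading.
# These helpers are assumptions for the sketch, not Windows API calls.

def display_should_be_on(in_call: bool, is_near: bool) -> bool:
    """Turn the screen off while the phone is held to the ear during a call."""
    return not (in_call and is_near)

def accept_tap(is_near: bool) -> bool:
    """Ignore taps that arrive while the sensor is covered, e.g. in a pocket."""
    return not is_near
```

The same near/far signal, sampled over time, is also what gesture detection would build on.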
With these new APIs, developers can create apps that are more aware of the user and are more informed by, and more useful because of, the context of the user's activity. All of this ultimately depends on which sensors and hardware are installed on a given device. But as new devices come out, and with this week's release of Windows 10 IoT, these new contextual sensing APIs have the potential to enable smarter apps that are more useful as we go through our daily lives.