Today many blogs were talking about the purported May 11 Release to Manufacturing date for Windows Mobile 6.5, but I find another bit of news in the same ZDNET article more significant.
For a year or so many Windows Mobile devices have shipped with advanced sensors such as accelerometers, but each OEM implemented its own API to access them, and fragmentation of the platform became a significant problem: G-Sensor apps written for the HTC Touch Diamond, for example, do not work on the Samsung Omnia.
It seems Microsoft is finally addressing this serious issue, as can be seen from this Tech-Ed session excerpt.
“Make some magic! Shake, Flip and Flick Your Application for Windows Mobile 6.5!”:
“The world of mobility has evolved. While keypads, stylus, and keyboards are all good and fine for device input, newer input methods have been popularized in recent years, such as accelerometers, touch screen gestures, capacitive touch screens, light sensors, and such. More than just gadgets and gimmicks, these next-generation input methods allow you, the mobile developer, to offer the best interface possible to your users on the road, enhancing their device experience. This session explores various input methods available on some of the latest Windows Mobile 6.1 and 6.5 devices and how to programmatically leverage them using managed APIs from Microsoft .NET Compact Framework-based applications. Topics covered include working with the Windows Mobile Unified Sensor API to access hardware sensors, controlling device cameras using the Windows Mobile SDK, capturing stylus and finger gestures on touch screens, detecting ambient light, making your device vibrate and sound-off, and more.”
Of note is that the same passage mentions capacitive screens, which until recently were believed not to be supported by Windows Mobile 6.5 but are now expected to arrive on a Toshiba WM 6.5 device towards the end of the year.
It is gratifying to see this issue finally addressed, as with the unified Windows Mobile Marketplace on the way, the platform can ill afford further fragmentation.
Edit: It turns out that the Windows Mobile team has in fact NOT developed a unified sensor framework; the session will instead be discussing Koushik Dutta’s .NET CF API framework.
This I think is a real shame and an abdication by the Windows Mobile team of responsibility for the health of their platform. One simply cannot rely on third-party developers (no matter how gifted) to implement such an essential feature of the platform. For example, who will develop the framework further when new sensors, like the proximity sensor on the HTC Touch Pro 2, become available? Mr Dutta has now moved on to greener Android pastures, primarily because he found the APIs in Windows Mobile exceedingly challenging, and despite the API being open source there is no guarantee that someone of sufficient skill will be interested in updating it.
In closing, if it’s important enough for Microsoft to devote a Tech-Ed session to, it’s important enough for them to have developed the software themselves.
Thanks to Joel Johnson for setting me straight and to Loke Uei from Microsoft for confirming it.