
Mobile Apps, Cognitive Computing, & Wearables

Last week I was in good ol' Las Vegas for IBM InterConnect, IBM's largest conference of the year. With over 20,000 attendees, it was a fantastic event that covered everything from technical details for developers to forward-looking strategy and trends for C-level executives. IBM also made some big announcements for developers, including OpenWhisk serverless computing and bringing the Swift language to the server. Both of these are exciting new initiatives that offer radical changes & simplification to developer workflows.
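To give a sense of what these look like in practice, here's a minimal sketch of an OpenWhisk action written in Swift. It follows the standard hello-world pattern for Swift actions (a main function that accepts and returns a dictionary); the greeting logic is purely illustrative.

```swift
// A minimal OpenWhisk action written in Swift.
// OpenWhisk invokes main(), passing the action's parameters as a dictionary
// and expecting a dictionary back, which becomes the JSON result.
func main(args: [String: Any]) -> [String: Any] {
    if let name = args["name"] as? String {
        return ["greeting": "Hello, \(name)!"]
    }
    return ["greeting": "Hello, stranger!"]
}
```

You deploy and invoke it with the wsk CLI (for example, wsk action create hello hello.swift), with no server to provision or manage.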

It was a busy week to say the least – lots of presentations, a few labs, and even a role in the main stage Swift keynote. You can expect to find more detail on each of these here on the blog in the days/weeks to come.

For starters, here are two “lightning talks” I presented in the InterConnect Dev@ developer zone:

Smarter apps with Cognitive Computing

This session introduces the concept of cognitive computing, and demonstrates how you can use cognitive services in your own mobile apps.  If you aren’t familiar with cognitive computing, then I strongly recommend that you check out this post: The Future of Cognitive Computing.

In the presentation below, I show two apps leveraging services on Bluemix, IBM's cloud computing platform, and the iOS SDK for Watson.

Actually, I'm using two Watson SDKs: the older Speech SDK for iOS and the new Watson iOS SDK. I'm using the older Speech SDK in one example because it supports continuous listening for Watson Speech to Text, which is still in development for the new SDK.
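Under the covers, both SDKs wrap Watson's REST APIs on Bluemix, so you can also call a service like Language Translation with nothing more than a URLSession request. The sketch below is illustrative only: the endpoint, credentials, and parameter names are placeholders based on the service documentation at the time, and you would substitute the values from your own Bluemix service instance.

```swift
import Foundation

// Placeholder credentials from a Bluemix Language Translation service instance.
let username = "YOUR_SERVICE_USERNAME"
let password = "YOUR_SERVICE_PASSWORD"

// Translate endpoint (verify the URL against the current service docs).
var request = URLRequest(url: URL(string: "https://gateway.watsonplatform.net/language-translation/api/v2/translate")!)
request.httpMethod = "POST"
request.setValue("application/json", forHTTPHeaderField: "Content-Type")
request.setValue("application/json", forHTTPHeaderField: "Accept")

// Basic auth header built from the service credentials.
let auth = Data("\(username):\(password)".utf8).base64EncodedString()
request.setValue("Basic \(auth)", forHTTPHeaderField: "Authorization")

// Translate an English phrase to Spanish.
let body: [String: Any] = ["text": "Hello, world", "source": "en", "target": "es"]
request.httpBody = try? JSONSerialization.data(withJSONObject: body)

URLSession.shared.dataTask(with: request) { data, _, error in
    guard let data = data, error == nil else { return }
    // The response JSON contains a "translations" array with the translated text.
    if let json = (try? JSONSerialization.jsonObject(with: data)) as? [String: Any],
       let translations = json["translations"] as? [[String: Any]] {
        print(translations.first?["translation"] ?? "")
    }
}.resume()
```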

You can check out the source code for the translator app here.

Redefining your personal mobile expression with on-body computing

My second presentation highlighted how we can use on-body computing devices to change how we interact with systems and data. For example, we can use a luxury smart watch (ex: Apple Watch) to consume and engage with data in more efficient and more personal ways. Likewise, we can also use smart/wearable peripheral devices to access and act on data in ways that were never possible before.

For example, we can determine gestures or biometric status based upon patterns in the raw data transmitted by on-body devices. For this, I leveraged the new IBM Wearables SDK. The IBM Wearables SDK provides a consistent interface/abstraction layer for interacting with wearable sensors. This allows you to focus on building apps that interact with the data, rather than learning the ins & outs of a new device-specific SDK.

The wearables SDK also uses data interpretation algorithms to enable you to define gestures or patterns in the data, and to act on events when those patterns occur, without additional user interaction. For example, you can detect falls, raised hands, anomalies in heart rate or skin temperature, and much more. The system is capable of learning patterns for virtually any type of action or data submitted to it. Sound interesting? Then check it out here.

The wearables SDK is open source on Github, and contains a sample to help you get started.
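To make the workflow more concrete, here's a rough sketch in Swift. Important: the type and method names below are hypothetical stand-ins, not the actual Wearables SDK API; they only illustrate the learn-a-pattern, then react-on-match flow described above. Check the repo for the real interfaces.

```swift
import Foundation

// NOTE: Hypothetical stand-in types, NOT the actual IBM Wearables SDK API.

/// A single accelerometer reading from an on-body device.
struct SensorSample {
    let x: Double
    let y: Double
    let z: Double
    let timestamp: TimeInterval
}

/// Illustrative gesture/pattern recognizer.
final class GestureRecognizer {
    private var examples: [String: [[SensorSample]]] = [:]
    private var handlers: [String: () -> Void] = [:]

    /// Teach the recognizer a named gesture from a few recorded examples.
    func learn(gesture name: String, from recordings: [[SensorSample]]) {
        examples[name, default: []].append(contentsOf: recordings)
    }

    /// Register a callback that runs whenever the named gesture is detected.
    func on(gesture name: String, handler: @escaping () -> Void) {
        handlers[name] = handler
    }

    /// Feed a window of live sensor data; fire callbacks for any matches.
    /// A real SDK runs its data-interpretation algorithms here.
    func process(_ window: [SensorSample]) {
        for (name, recordings) in examples where matches(window, against: recordings) {
            handlers[name]?()
        }
    }

    private func matches(_ window: [SensorSample], against recordings: [[SensorSample]]) -> Bool {
        // Placeholder: the pattern matching itself is the SDK's job, not shown here.
        return false
    }
}

// Usage: react to a "raise hand" gesture without any additional user interaction.
let recognizer = GestureRecognizer()
recognizer.on(gesture: "raiseHand") { print("Hand raised!") }
```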

I also had some other sessions on integrating drones with cloud services, integrating weather services in your mobile apps, and more. I'll be sure to post updates for this content as I make it publicly available. I think you'll find the session on drones + cloud especially interesting; I know I did.

Wearables & IBM MobileFirst – Video & Sample Code

Last week I attended IBM Insight in Las Vegas. It was a great event, with tons of useful information for attendees. I had a few sessions on mobile applications. In particular, my dev@Insight session on Wearables powered by IBM MobileFirst was recorded. You can check it out here:

https://youtu.be/d4AEwCOmvug

Sorry it’s not in HD, but the content is still great! (Yes, I am biased.)

In this session I showed how you can power wearable apps, specifically those on smart watch devices, using either the MobileFirst Platform Foundation Server, or the MobileFirst offerings on IBM Bluemix (cloud).

Key takeaways from the session:

  1. Wearables are the most personal computing devices ever. Your users can use them to be notified of information, search/consume data, or even collect environmental data for reporting or actionable analysis.
  2. Regardless of whether you are developing for a peripheral device like the Apple Watch or Microsoft Band, or a standalone device like Android Wear, you are developing an app that runs in an environment that mirrors that of a native app. So, the fundamental development principles are exactly the same. You write native code that uses standard protocols and common conventions to interact with the back-end.
  3. Caveat to #2: Your user interface is much smaller. You should design the user interface and services to accommodate the reduced amount of information that can be displayed.
  4. You can share code across both the phone/tablet and watch/wearable experience (depending on the target device); a minimal sketch of this approach follows this list.
  5. Using IBM MobileFirst you can easily expose data, add authentication, and capture analytics for both the mobile and wearable solutions.
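To illustrate takeaways #2 and #4, here's a minimal sketch of a data-access class that can be compiled into both the iOS app target and the WatchKit extension target. In the session demos the data comes from MobileFirst (an adapter on Foundation Server, or an app on Bluemix); the URL and JSON shape below are placeholders, and I'm using plain URLSession here just to keep the sketch self-contained.

```swift
import Foundation

// A small shared data-access layer: the same native code, using standard
// protocols (HTTPS + JSON), can back both the phone app and the watch app.
struct Stock {
    let symbol: String
    let price: Double
}

final class StockService {
    // Placeholder endpoint -- point this at your MobileFirst adapter or Bluemix route.
    private let endpoint = URL(string: "https://my-mobilefirst-backend.example.com/stocks")!

    func fetchStocks(completion: @escaping ([Stock]) -> Void) {
        URLSession.shared.dataTask(with: endpoint) { data, _, error in
            guard let data = data, error == nil,
                  let json = (try? JSONSerialization.jsonObject(with: data)) as? [[String: Any]] else {
                completion([])
                return
            }
            // Map the JSON array (placeholder shape) into model objects for the UI.
            let stocks = json.compactMap { item -> Stock? in
                guard let symbol = item["symbol"] as? String,
                      let price = item["price"] as? Double else { return nil }
                return Stock(symbol: symbol, price: price)
            }
            completion(stocks)
        }.resume()
    }
}
```

On the watch side you would call fetchStocks from your WKInterfaceController (for example, in willActivate) and push the results into the interface; on the phone you reuse the exact same class.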

Demos/Code Samples:

In the session I showed 3 sample wearable apps.  Full source code and setup instructions for each app are available at: https://github.com/triceam/MobileFirst-Wearables/

Stocks

A sample WatchKit (Apple Watch) app powered by IBM MobileFirst Platform Foundation Server.


Contacts

A sample WatchKit (Apple Watch) app powered by IBM MobileFirst on Bluemix.


Heartrate

A simple heart rate monitor using the Microsoft Band, powered by MobileFirst on Bluemix and IBM Cloudant.
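Cloudant exposes a CouchDB-compatible HTTP API, so persisting a reading can be as simple as POSTing a JSON document to a database. The sketch below is illustrative only: the account, database name, and credentials are placeholders, and the actual sample's plumbing (MobileFirst on Bluemix plus Cloudant) may differ; see the repo for the real setup.

```swift
import Foundation

// Placeholder Cloudant account, database, and API key -- replace with your own.
let account = "YOUR_ACCOUNT"
let database = "heartrate"
let apiKey = "YOUR_API_KEY"
let apiSecret = "YOUR_API_SECRET"

/// Store one heart rate reading as a JSON document in Cloudant.
func record(heartRate bpm: Int) {
    // POSTing to the database URL creates a new document; Cloudant assigns the _id.
    var request = URLRequest(url: URL(string: "https://\(account).cloudant.com/\(database)")!)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")

    let auth = Data("\(apiKey):\(apiSecret)".utf8).base64EncodedString()
    request.setValue("Basic \(auth)", forHTTPHeaderField: "Authorization")

    let doc: [String: Any] = [
        "bpm": bpm,
        "timestamp": Date().timeIntervalSince1970
    ]
    request.httpBody = try? JSONSerialization.data(withJSONObject: doc)

    URLSession.shared.dataTask(with: request).resume()
}
```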
