Category Archives: open source

Mobile Apps, Cognitive Computing, & Wearables

Last week I was in good ol’ Las Vegas for IBM InterConnect – IBM’s largest conference of the year. With over 20,000 attendees, it was a fantastic event that covered everything from technical details for developers to forward-looking strategy and trends for C-level executives. IBM also made some big announcements for developers – OpenWhisk serverless computing and bringing the Swift language to the server, just to name a few. Both of these are exciting new initiatives that offer radical changes and simplifications to developer workflows.

It was a busy week to say the least – lots of presentations, a few labs, and even a role in the main stage Swift keynote. You can expect to find more detail on each of these here on the blog in the days/weeks to come.

For starters, here are two “lightning talks” I presented in the InterConnect Dev@ developer zone:

Smarter apps with Cognitive Computing

This session introduces the concept of cognitive computing, and demonstrates how you can use cognitive services in your own mobile apps.  If you aren’t familiar with cognitive computing, then I strongly recommend that you check out this post: The Future of Cognitive Computing.

In the presentation below, I show two apps leveraging services on Bluemix, IBM’s cloud computing platform, and the iOS SDK for Watson.

Actually, I’m using two Watson SDKs: the older Speech SDK for iOS and the new iOS SDK.  I’m using the older Speech SDK in one example because it supports continuous listening for Watson Speech To Text, which is currently still in development for the new SDK.
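The talks used the iOS SDKs, but the same Bluemix services are reachable from just about any language. As a rough sketch (the credentials and file name are placeholders), here’s what a basic Speech To Text call looked like from Node.js using the watson-developer-cloud package of that era:

```javascript
// A minimal sketch of calling Watson Speech To Text from Node.js
// (npm install watson-developer-cloud); credentials come from the
// service instance you create on Bluemix.
var watson = require('watson-developer-cloud');
var fs = require('fs');

var speechToText = watson.speech_to_text({
  username: 'YOUR_STT_USERNAME',   // placeholder credentials
  password: 'YOUR_STT_PASSWORD',
  version: 'v1'
});

// Transcribe a recorded audio file; continuous listening is a
// streaming variation of this same call.
speechToText.recognize({
  audio: fs.createReadStream('speech.wav'),
  content_type: 'audio/wav'
}, function (err, transcript) {
  if (err) {
    return console.error(err);
  }
  console.log(JSON.stringify(transcript, null, 2));
});
```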

You can check out the source code for the translator app here.

Redefining your personal mobile expression with on-body computing

My second presentation highlighted how we can use on-body computing devices to change how we interact with systems and data.  For example, we can use a luxury smart watch (e.g., the Apple Watch) to consume and engage with data in more efficient and more personal ways.  Likewise, we can use smart/wearable peripheral devices to access and act on data in ways that were never possible before.

For example, we can determine gestures or biometric status based upon patterns in the raw data transmitted by on-body devices.  For this, I leveraged the new IBM Wearables SDK, which provides a consistent interface/abstraction layer for interacting with wearable sensors.  This allows you to focus on building the apps that interact with the data, rather than learning the ins and outs of a new device-specific SDK.

The wearables SDK also uses data interpretation algorithms that enable you to define gestures or patterns in the data, and to act upon those patterns when they happen – without additional user interaction.  For example, you can determine if someone falls down, detect when someone raises their hand, spot anomalies in heart rate or skin temperature, and much more.  The system is capable of learning patterns for any type of action and virtually any data submitted to it.  Sound interesting?  Then check it out here.
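To make the idea concrete, here’s a purely illustrative JavaScript sketch of that train-then-listen pattern. None of these names come from the actual Wearables SDK (which is Swift-based); it just shows the general shape of matching live sensor data against learned patterns:

```javascript
// Illustrative only (NOT the Wearables SDK's real API): learn a signature
// from recorded samples, then fire callbacks when live data matches it.
function magnitude(r) {
  return Math.sqrt(r.x * r.x + r.y * r.y + r.z * r.z);
}

function GestureDetector() {
  this.signatures = {};  // gesture name -> learned average magnitude
  this.handlers = {};    // gesture name -> callback
}

// "Train" a named gesture from recorded accelerometer samples
GestureDetector.prototype.train = function (name, samples) {
  var total = samples.reduce(function (sum, s) { return sum + magnitude(s); }, 0);
  this.signatures[name] = total / samples.length;
};

GestureDetector.prototype.on = function (name, handler) {
  this.handlers[name] = handler;
};

// Feed live readings; fire a handler when a reading is close to a signature
GestureDetector.prototype.feed = function (reading) {
  var m = magnitude(reading);
  for (var name in this.signatures) {
    if (Math.abs(m - this.signatures[name]) < 0.1 && this.handlers[name]) {
      this.handlers[name](reading);
    }
  }
};

// Usage: learn a pattern once, then react to matching live data automatically
var detector = new GestureDetector();
detector.train('handRaise', [{ x: 0.0, y: 9.6, z: 0.4 }, { x: 0.1, y: 9.8, z: 0.3 }]);
detector.on('handRaise', function () { console.log('Hand raised!'); });
detector.feed({ x: 0.05, y: 9.7, z: 0.35 });  // logs "Hand raised!"
```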

The wearables SDK is open source on GitHub, and contains a sample to help you get started.

I also had some other sessions on integrating drones with cloud services, integrating weather services into your mobile apps, and more.  I’ll be sure to post updates for this content as I make it publicly available.  I think you’ll find the session on drones + cloud especially interesting – I know I did.

IBM Acquires StrongLoop – Leveling Up Node.js in the Enterprise

Today IBM announced the acquisition of StrongLoop, Inc., leaders in enterprise development on Node.js and major contributors to Express, LoopBack, and other Node.js tools and frameworks.


Node.js is an incredible tool for rapidly building highly performant and scalable back-end systems, and you develop with a core language that most front-end developers already know: JavaScript. This acquisition is positioned to greatly enhance Node.js in the enterprise, and StrongLoop’s offerings will be integrated into IBM Bluemix, IBM MobileFirst, and WebSphere.

Even though the acquisition is still “hot off the presses”, you can start using these tools together today.

You can read more about this acquisition and the shared vision for IBM and StrongLoop on the StrongLoop blog, IBM Bluemix Blog, and IBM MobileFirst Blog.

If you haven’t heard of StrongLoop’s LoopBack framework, it enables you to easily connect to and expose your data as REST services. It provides the ability to create data models in a graphical (or command-line) interface, which are used to automatically generate REST APIs – giving you full CRUD operations for your REST services tier without having to write any code.
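To give a flavor of how little code that means, here’s a minimal, self-contained sketch in the spirit of the LoopBack examples of that era (the in-memory data source and the Note model are illustrative):

```javascript
// A minimal LoopBack sketch (npm install loopback): define a model and get
// full CRUD REST endpoints without writing any handler code.
var loopback = require('loopback');
var app = loopback();

// An in-memory data source keeps the example self-contained
var ds = loopback.createDataSource({ connector: 'memory' });

// The model definition is all you write; the REST API is generated from it
var Note = ds.createModel('Note', {
  title: { type: String, required: true },
  content: String
});

app.model(Note);
app.use('/api', loopback.rest());

app.listen(3000, function () {
  // GET/POST /api/Notes and GET/PUT/DELETE /api/Notes/:id are now live
  console.log('LoopBack REST API running at http://localhost:3000/api/Notes');
});
```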

Why is this important?

It makes API development easier and drastically reduces time from concept to implementation.  If you haven’t yet looked at the LoopBack framework, you should definitely check it out.  You can build API layers for your apps literally in minutes.  Check out the video below for a quick introduction:

Again, be sure to check out the posts linked above that detail the integration steps, so you can start using these tools together today.

Complete Walkthrough and Source Code for “Building Offline Apps”

I recently put together some content on building “Apps that Work as Well Offline as they do Online” using IBM MobileFirst and Bluemix (cloud services).  There was the original blog post, then a presentation at ApacheCon, and now I’ve opened everything up for anyone to use or learn from.

The content now lives on the IBM Bluemix GitHub account, and includes code for the native iOS app, code for the web (Node.js) endpoint, a comprehensive script that walks through every step of configuring the application, and a video walkthrough of the entire process, from backend creation to a complete solution.

Key concepts demonstrated in these materials:

  • User authentication using the Bluemix Advanced Mobile Access service
  • Remote app logging and instrumentation using the Bluemix Advanced Mobile Access service
  • Using a local data store for offline data access
  • Data replication (synchronization) to a remote data store
  • Building a web based endpoint on the Node.js infrastructure

You can download or fork any of the code at:

The repo contains:

  • Complete step-by-step instructions to guide through the entire app configuration and deployment process
  • Client-side Objective-C code (you could do this with hybrid or other native platforms too, but I wrote it for iOS).  The “iOS-native” folder contains the source code for a complete sample application leveraging this workflow. The “GeoPix-complete” folder contains a completed project (you still need to walk through the backend configuration). The “GeoPix-starter” folder contains a starter application, with all MobileFirst/Bluemix code commented out. You can follow the steps inside the “Step By Step Instructions.pdf” file to set up the backend infrastructure on Bluemix and set up all code within the “GeoPix-starter” project.
  • Backend Node.js app for serving up the web experience (a sketch of what such an endpoint looks like follows below).
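For context, here is the assumed general shape of such an endpoint; this is not the repo’s actual code, and the directory and route names are illustrative:

```javascript
// A minimal Node.js web endpoint sketch (npm install express); the real
// implementation lives in the repo linked above.
var express = require('express');
var path = require('path');
var app = express();

// Serve the static web assets (directory name is illustrative)
app.use(express.static(path.join(__dirname, 'public')));

// A simple health-check style route
app.get('/ping', function (req, res) {
  res.json({ status: 'ok', time: new Date().toISOString() });
});

// Cloud Foundry platforms like Bluemix supply the port via the environment
var port = process.env.PORT || 3000;
app.listen(port, function () {
  console.log('Web endpoint listening on port ' + port);
});
```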

 

New UI Article and Architectural Presentation on PhoneGap

A couple of PhoneGap related activities I’ve been up to recently…

First, an article I recently wrote on Creating a Card-based UI in PhoneGap has been published on DZone.  Be sure to check it out for some tips on creating the user interface with just HTML, CSS, and JavaScript.  It covers both basic/static and dynamic use cases.
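As a taste of the dynamic case, a sketch along these lines builds cards from data at runtime (the element ID, class name, and data are illustrative; the article has the real implementation):

```javascript
// Illustrative only: render a list of "cards" into the DOM from a data array.
var cards = [
  { title: 'Card One', body: 'Plain HTML, CSS, and JavaScript.' },
  { title: 'Card Two', body: 'No native UI code required.' }
];

var container = document.getElementById('card-container');  // assumed element
cards.forEach(function (card) {
  var el = document.createElement('div');
  el.className = 'card';   // styled entirely in CSS

  var title = document.createElement('h2');
  title.textContent = card.title;

  var body = document.createElement('p');
  body.textContent = card.body;

  el.appendChild(title);
  el.appendChild(body);
  container.appendChild(el);
});
```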


Second, a video of my presentation on “Architectural Considerations for PhoneGap and Mobile Web Apps” has been published by the Atlanta HTML5 meetup group.  Check it out in the video below, and if you’re in the Atlanta area, be sure to check out the meetup group!  Here’s the presentation description:

Tired of Hello World? In this session, we explore best practices to build real-world PhoneGap applications. We investigate the Single Page Architecture, HTML templates, effective touch events, performance techniques, modularization, and more. We also compare and contrast the leading JavaScript and mobile frameworks. This session is a must if you plan to build a PhoneGap application that has more than a couple of screens.

… and here’s the actual presentation video:

If you’re reading this, and wondering what PhoneGap is:

PhoneGap is a free and open source framework that allows you to create mobile apps using standardized web APIs for the platforms you care about.

You should definitely check it out, and enjoy!

PhoneGap & Hardware Session Video now Available

The recording for my session “PhoneGap and Hardware” from PhoneGap Day back in July is now available! Be sure to check it out. There were apparently some issues with the audio, but you can still hear everything.

I’d like to express a huge Thank You to everyone who attended, and to everyone who watches this video!

Below are the sample projects I showed in the presentation, including source code. However, keep in mind that all of these examples were written before PhoneGap 3.0; the native plugin syntax and inclusion methods have changed.

Pressure Sensitive Sketching in PhoneGap

In this example, the pressure-sensitive Pogo Connect stylus uses a low energy Bluetooth 4 connection to relay touch/pressure information back to the PhoneGap application. This makes for a unique drawing and sketching experience powered by the HTML5 Canvas element. I’ve written about this example previously; check out the video below to see it in action, and read the blog post for technical details and source code.
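The plugin’s actual API is covered in that post; the sketch below just illustrates the core idea with a hypothetical callback, mapping stylus pressure to stroke width on a canvas:

```javascript
// Hypothetical callback shape (the real plugin API is in the linked post):
// the native stylus plugin delivers position plus normalized pressure (0..1).
var canvas = document.getElementById('sketchCanvas');  // assumed element
var ctx = canvas.getContext('2d');
var lastPoint = null;

function onPenMove(x, y, pressure) {
  if (lastPoint) {
    ctx.beginPath();
    ctx.lineWidth = 1 + pressure * 14;   // heavier pressure -> thicker line
    ctx.lineCap = 'round';
    ctx.moveTo(lastPoint.x, lastPoint.y);
    ctx.lineTo(x, y);
    ctx.stroke();
  }
  lastPoint = { x: x, y: y };
}

function onPenUp() {
  lastPoint = null;   // end the current stroke
}
```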

Moga Gamepad

The second example I explored is a PhoneGap native plugin that handles input from a Moga game controller inside a PhoneGap application on Android.

This implementation is intended to be a proof of concept demonstrating how you could integrate the gamepad within your application. It currently only supports input from the joysticks (axisX and axisY) and the A and B buttons, and it does not handle all possible input from the controller.

This implementation is adapted directly from the com.bda.controller.example.demo.listen example from the Moga developers SDK samples available for download at: http://www.mogaanywhere.com/developers/
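As a hypothetical illustration of what the plugin’s JavaScript surface might look like (the actual names live in the sample project), the idea is to keep the latest controller state and consume it each frame:

```javascript
// Hypothetical JavaScript surface for the gamepad plugin: the native side
// delivers controller events, and the game loop reads the latest state.
var state = { axisX: 0, axisY: 0, buttonA: false, buttonB: false };

// Invoked by the native plugin whenever controller input changes
function onControllerEvent(e) {
  state.axisX = e.axisX;       // -1..1, left/right on the stick
  state.axisY = e.axisY;       // -1..1, up/down on the stick
  state.buttonA = e.buttonA;   // true while pressed
  state.buttonB = e.buttonB;
}

// The game loop consumes the most recent state on each frame
function update(player) {
  player.x += state.axisX * player.speed;
  player.y += state.axisY * player.speed;
  if (state.buttonA) {
    player.fire();
  }
}
```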

Check out the video below to see it in action:

The game is based on the Universe prototype that was used as a sub-game inside of the MaxMe app for the recent Adobe MAX conference. I make no guarantees about the code for this game; it was written in a huge rush!
