Interview: Gathering & analyzing data with drones & IBM Bluemix

Here’s an interview I recently did with IBM DeveloperWorks TV at the World of Watson conference. In it I discuss a project I’ve been working on that analyzes drone imagery to perform automatic damage detection using the Watson Visual Recognition service, and generates 3D models from the drone images using photogrammetry processes. The best part – the entire thing runs in the cloud on IBM Bluemix.

It leverages the IBM Watson Visual Recognition service with custom classifiers to detect the presence of hail damage on shingled roofs, Cloudant for metadata/record storage, the IBM Cloud Object Storage cross-region S3 API for massively scalable & distributed image/model/asset storage, and Bare Metal servers for high performance computing.
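The detection side of that flow boils down to checking the custom classifier’s confidence scores. Here’s a minimal Python sketch of that decision logic – the class names, score format, and 0.6 threshold are illustrative assumptions, not the actual service response:

```python
# Sketch of the damage-flagging logic, assuming the custom classifier
# returns a list of (class_name, confidence) pairs for each image.
# The class names and the 0.6 threshold are illustrative, not real values.

def flag_hail_damage(classifier_scores, threshold=0.6):
    """Return True if the 'hail_damage' class scores above the threshold."""
    for class_name, confidence in classifier_scores:
        if class_name == "hail_damage" and confidence >= threshold:
            return True
    return False

scores = [("shingle_roof", 0.92), ("hail_damage", 0.78)]
print(flag_hail_damage(scores))  # a high-confidence hit flags the image
```

In the real app, images flagged this way would have their metadata recorded in Cloudant and the assets themselves stored in Cloud Object Storage.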

Bare Metal servers are dedicated machines in the cloud: not shared, and not virtualized. I’ve got mine set up as a Linux server with 24 cores (48 threads), 64 GB of RAM, an SSD RAID array, multiple GPUs, etc… and it cut my photogrammetry rendering from hours on my laptop down to merely 10 minutes (in my opinion the best part).

I’ve done all of my testing with DJI Phantom and DJI Inspire aircraft, but really, it could work with any images, from any camera that has embedded GPS information.
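Photogrammetry tools read that GPS data from the image EXIF tags, where it’s stored as degrees/minutes/seconds plus a hemisphere reference. Here’s a quick Python sketch of the conversion to the decimal degrees most pipelines expect (the tag layout is simplified – real EXIF parsing would use a library like Pillow or piexif):

```python
def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert EXIF-style degrees/minutes/seconds plus hemisphere
    reference ('N'/'S'/'E'/'W') to signed decimal degrees."""
    value = degrees + minutes / 60.0 + seconds / 3600.0
    return -value if ref in ("S", "W") else value

# e.g. a drone shot tagged 30 deg 16' 1.2" N, 97 deg 44' 34.8" W
lat = dms_to_decimal(30, 16, 1.2, "N")
lon = dms_to_decimal(97, 44, 34.8, "W")
print(round(lat, 4), round(lon, 4))  # 30.267 -97.743
```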

Check out the video to see it in action.

Drones, Bots, Cognitive Apps, Image Recognition, Motion Analysis, and Photogrammetry (or, what I’ve been up to lately)

It’s been a while since I’ve posted here on the blog…  In fact, I just did the math, and it’s been over 7 months. Lots of things have happened since: I’ve moved to a new team within IBM, built new developer tools, worked directly with clients on their solutions, worked on a few high profile keynotes, built apps for kinetic motion and activity tracking, built a mobile client for a chat bot, and even completed some new drone projects.  It’s been exciting to say the least, but the real reason I’m writing this post is to share a few of the public projects I’ve been involved with from recent conferences.

I recently returned from Gartner Symposium and IBM’s annual World of Watson conference, and it’s been one of the busiest, yet most exciting, two-week spans I’ve experienced in quite a while.

At both events, we showed a project I’ve been working on with IBM’s Global Business Services team that focuses on the use of small consumer drones and drone imagery to transform Insurance use cases. In particular, it leverages IBM Watson to automatically detect roof damage, in conjunction with photogrammetry to create 3D reconstructions and generate measurements of affected areas, expediting and automating claims processing.

This application leverages many of the services IBM Bluemix has to offer… on-demand CloudFoundry runtimes, a Cloudant NoSQL database, scalable Cloud Object Storage (S3 compatible storage), and BareMetal servers on Softlayer. Bare Metal servers are *awesome*… I have a dedicated server in the cloud that has 24 cores (48 threads), 64 GB RAM, RAID array of SSD drives, and 2 high end multi-core GPUs. It’s taken my analysis processes from 2-3 hours on my laptop down to 10 minutes for photogrammetric reconstruction with Watson analysis.
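The measurement piece hinges on knowing the ground sample distance (GSD) – how much real-world distance each pixel covers, given the camera and flight altitude. A sketch of the standard GSD formula in Python; the camera numbers below are roughly in a small-drone range and are illustrative, not the exact specs of any aircraft I flew:

```python
def ground_sample_distance(sensor_width_mm, focal_length_mm,
                           altitude_m, image_width_px):
    """Ground sample distance in centimeters per pixel."""
    return (sensor_width_mm * altitude_m * 100.0) / (focal_length_mm * image_width_px)

def pixel_area_to_m2(pixel_count, gsd_cm):
    """Convert a damaged-region pixel count into square meters."""
    return pixel_count * (gsd_cm / 100.0) ** 2

# Illustrative numbers: ~6.2 mm sensor, ~3.6 mm lens, 30 m altitude
gsd = ground_sample_distance(6.17, 3.61, 30.0, 4000)  # ~1.28 cm/px
print(pixel_area_to_m2(50000, gsd))  # area in square meters of a 50k-pixel region
```

Lower altitude means a smaller GSD and finer measurements, which is why capture altitude matters so much for claims-grade output.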

It’s been an incredibly interesting project, and you can check it out yourself in the links below.

World of Watson

World of Watson was a whirlwind of the best kind… I had the opportunity to join IBM SVP of Cloud, Robert LeBlanc, on stage as part of the Cloud keynote at T-Mobile Arena (a huge venue that seats over 20,000 people) to show off the drone/insurance demo, plus 2 more presentations, and an “ask me anything” session on the expo floor.


The official recording is available on IBM Go, but it’s easier to just see the YouTube videos. There are two segments for my presentation: the “set up” starts at 57:16 here: https://youtu.be/VrZMQZSB_UE?t=57m16s and the “end result” starts at 1:08:00 https://youtu.be/VrZMQZSB_UE?t=1h8m0s. I wasn’t allowed to fly inside the arena, but at least I was able to bring the Inspire up on stage as a prop!

You can also check out my session “Elevate Your apps with IBM Bluemix” on UStream to see an overview in much more detail:

.. and that’s not all. I also finally got to see a complete working version of the Olympic Cycling team’s training app on the expo floor, including cycling/biometric feedback, video, etc… I worked with an IBM JStart team and wrote the video integration layer for the mobile app using IBM Cloud Object Storage and Aspera for efficient network transmission.


This app was also showcased in Jason McGee’s general session “Trends & Directions: Digital Innovation in the Era of Cloud and Cognitive”: https://youtu.be/hgd3tbc2eKs?t=11m49s

Gartner Symposium

At the Gartner Symposium event, I showed the end to end workflow for the drone/insurance app…

Drones

On this project we’ve been working with a partner, DataWing, which provides drone image/data capture as a service. However, I’ve also been flying and capturing my own data. The app can process virtually any images with appropriate metadata, but I’ve been putting both the DJI Phantom and Inspire 1 to work, and they’re performing fantastically.

Here’s a sample point-cloud scan I did of my office. :)

  • Left-click and drag to rotate
  • Right-click and drag to pan
  • Scroll or pinch/pull to zoom

Or check it out fullscreen in a new window.

IBM MobileFirst Platform Foundation 8.0 Beta Now Available!

Back at the end of February, IBM announced an upcoming beta version of MobileFirst Platform Foundation version 8.0. Well, guess what? … As of last week, it is now available!


What is IBM MobileFirst Platform Foundation?

For those stumbling upon this and wondering “What is IBM MobileFirst Platform Foundation?”:

IBM MobileFirst Platform Foundation is an open, comprehensive platform to develop, test, secure, and manage mobile apps.

MobileFirst Platform Foundation provides a middleware solution and SDK that make it easier to expose data to mobile apps. It improves security through encryption, authentication, and handshaking to guarantee app authenticity; provides facilities to easily manage multiple versions of an app and to notify and engage users; and, on top of everything else, provides operational analytics so that you can monitor the health of your overall system at any point in time.

As a mobile developer catering to the enterprise, it makes your life significantly easier, and it supports any mobile development paradigm that you might want to target: Native platforms, hybrid Xamarin using C#, and hybrid Cordova platforms (HTML/JS).

What’s new in the IBM MobileFirst Platform Foundation 8.0 Beta?

The recently opened beta has some great new features, and it’s now available as a service on Bluemix (IBM’s cloud platform). The beta program is intended to deliver the next generation of an open, integrated, and comprehensive mobile app development platform, redesigned for cloud agility, speed, and productivity, that enables enterprises to accelerate delivery of their mobile strategy.

Those new features include (but are not limited to):

  • Use of NPM on Cordova apps
  • CocoaPods support for iOS apps
  • Gradle and NuGet support for native apps
  • Maven support for backend logic
  • Faster plug-in speed – faster MFPF performance for new and existing apps
  • New, better, sample code, documentation, and guides
  • Automation support and self-service features for faster ramp-up and tear-down of environments for testing and iterations
  • Ability to make changes to app runtime settings without redeployment
  • Middleware redesigned for DevOps efficiency
  • Custom notifications in MobileFirst Operations Analytics
  • New crash analysis tools
  • and more…

Getting involved in the Beta

This is a great opportunity to explore new features and drive business value.  We also want your feedback to make sure the MobileFirst Platform has what you need.

You can start using the Mobile Foundation service on Bluemix today, or join the Beta, and tell us what you think.

To join the Beta program, just head over to the MobileFirst Platform Beta home page, scroll down to the “Interested in the Beta Program?” heading, and follow the instructions to sign up.

You can also join the Slack community (channel: #MFPF8_beta) to engage directly with IBM.

New Swift Offerings from IBM

In my last post I mentioned some new announcements related to the Swift programming language at IBM.  Upon further thought, I guess it’s probably not a bad idea to re-post more detail here too…

If you didn’t see/hear it last week, IBM unveiled several projects to advance the Swift language for developers, which we think will have a profound impact on developers & developer productivity in the years to come. You can view a replay of the IBM announcement in the video embedded below, or just scroll down for direct links:

Here are quick links to each of the projects listed:

Kitura
A light-weight web framework written in Swift, that allows you to build web services with complex routes, easily. Learn more…

Swift Package Catalog
The IBM Swift Package Catalog enables the Swift.org developer community to leverage and share code across projects. Learn more…

Updated IBM Swift Sandbox
The Swift Sandbox enables developers to learn Swift, build prototypes and share code snippets. Whatever your Swift ambitions, join the over 100,000 community members using the Sandbox today. Learn more…

OpenWhisk
OpenWhisk is an event-driven compute platform that executes application logic in response to events or through direct invocations–from web/mobile apps or other endpoints. Learn more…
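OpenWhisk actions are essentially just functions that take and return dictionaries. A minimal Python action sketch – OpenWhisk’s Python runtime invokes a `main(params)` entry point and expects a JSON-serializable dict back; the greeting logic itself is just a placeholder:

```python
# Minimal OpenWhisk-style action: the platform calls main() with the
# event's parameters as a dict and expects a dict in return.
def main(params):
    name = params.get("name", "world")
    return {"greeting": "Hello, %s!" % name}

# Local invocation, simulating what the platform would do:
print(main({"name": "Swift@IBM"}))  # {'greeting': 'Hello, Swift@IBM!'}
```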

Or, you can read John Ponzo’s official announcement here.


This is more or less a re-share from my original post on the Swift@IBM Blog.

Mobile Apps, Cognitive Computing, & Wearables

Last week I was in good ol’ Las Vegas for IBM InterConnect – IBM’s largest conference of the year. With over 20,000 attendees, it was a fantastic event that covered everything from technical details for developers to forward-looking strategy and trends for C-level executives. IBM also made some big announcements for developers – OpenWhisk serverless computing and bringing the Swift language to the server – just to name a few. Both of these are exciting new initiatives that offer radical changes & simplification to developer workflows.

It was a busy week to say the least – lots of presentations, a few labs, and even a role in the main stage Swift keynote. You can expect to find more detail on each of these here on the blog in the days/weeks to come.

For starters, here are two “lightning talks” I presented in the InterConnect Dev@ developer zone:

Smarter apps with Cognitive Computing

This session introduces the concept of cognitive computing, and demonstrates how you can use cognitive services in your own mobile apps.  If you aren’t familiar with cognitive computing, then I strongly recommend that you check out this post: The Future of Cognitive Computing.

In the presentation below, I show two apps leveraging services on Bluemix, IBM’s Cloud computing platform, and the iOS SDK for Watson.

Actually, I’m using two Watson SDKs: the older Speech SDK for iOS, and the new iOS SDK.  I’m using the older speech SDK in one example because it supports continuous listening for Watson Speech To Text, which is currently still in development for the new SDK.

You can check out the source code for the translator app here.

Redefining your personal mobile expression with on-body computing

My second presentation highlighted how we can use on-body computing devices to change how we interact with systems and data.  For example, we can use a luxury smart watch (ex: Apple Watch) to consume and engage with data in more efficient and more personal ways.  Likewise, we can also use smart/wearable peripheral devices to access and act on data in ways that were never possible before.

For example, determining gestures or biometric status based upon patterns in raw data transmitted by the on-body devices.  For this, I leveraged the new IBM Wearables SDK.  The IBM Wearables SDK provides a consistent interface/abstraction layer for interacting with wearable sensors.  This allows you to focus on building your apps that interact with the data, rather than learning the ins & outs of a new device-specific SDK.

The wearables SDK also uses data interpretation algorithms to enable you to define gestures or patterns in the data, and use those patterns to act upon events when they happen – without additional user interaction.  For example: you can determine if someone falls down, you can determine when someone is raising their hand, you can determine anomalies in heart rate or skin temperature, and much more.  The system is capable of learning patterns for any type of action or virtually any data being submitted to the system.  Sound interesting?  Then check it out here.
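Under the covers, this kind of detection often starts with something as simple as thresholding the accelerometer magnitude. Here’s a toy Python sketch of the raw-data pattern the SDK abstracts away – this is not the Wearables SDK API, and the 2.5g threshold is an arbitrary illustrative value:

```python
import math

def accel_magnitude(x, y, z):
    """Magnitude of a 3-axis accelerometer sample, in g."""
    return math.sqrt(x * x + y * y + z * z)

def detect_spike(samples, threshold_g=2.5):
    """Return the index of the first sample whose magnitude exceeds the
    threshold -- a crude stand-in for an impact/fall event -- or None."""
    for i, (x, y, z) in enumerate(samples):
        if accel_magnitude(x, y, z) > threshold_g:
            return i
    return None

# At rest the magnitude hovers near 1g; the last sample is a hard jolt.
samples = [(0.0, 0.0, 1.0), (0.1, 0.2, 1.1), (2.4, 1.8, 0.5)]
print(detect_spike(samples))  # 2
```

Real gesture recognition layers learned patterns on top of raw streams like this, which is exactly the grunt work the SDK handles for you.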

The wearables SDK is open source on Github, and contains a sample to help you get started.

I also had some other sessions on integrating drones with cloud services, integrating weather services in your mobile apps, and more.  I’ll be sure to post updates for this content as I make it publicly available.  I think you’ll find the session on drones + cloud especially interesting – I know I did.