Category Archives: Video

New Swift Offerings from IBM

In my last post I mentioned some new announcements related to the Swift programming language at IBM.  Upon further thought, I guess it’s probably not a bad idea to re-post more detail here too…

If you didn’t see/hear it last week, IBM unveiled several projects to advance the Swift language for developers, which we think will have a profound impact on developers & developer productivity in the years to come. You can view a replay of the IBM announcement in the video embedded below, or just scroll down for direct links:

Here are quick links to each of the projects listed:

Kitura
A lightweight web framework written in Swift that allows you to easily build web services with complex routes. Learn more…
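To give you a sense of what that looks like, here's a minimal sketch of a Kitura route. The module and API names below reflect the early public releases (they consolidated over time), so treat the specifics as approximate:

```swift
import Kitura

// Create a router and register a simple GET route
let router = Router()

router.get("/hello") { request, response, next in
    response.send("Hello from Kitura!")
    next()
}

// Start an HTTP server on port 8080 and run the main loop
Kitura.addHTTPServer(onPort: 8080, with: router)
Kitura.run()
```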

Swift Package Catalog
The IBM Swift Package Catalog enables the Swift.org developer community to leverage and share code across projects. Learn more…
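For context, pulling in a package you find in the catalog is just a matter of declaring it in your Package.swift manifest. Here's a minimal sketch using the Swift 3-era manifest format (the version number is illustrative, and the manifest format has since changed):

```swift
import PackageDescription

// Swift 3-era manifest: declare Kitura as a dependency
let package = Package(
    name: "MyServerApp",
    dependencies: [
        .Package(url: "https://github.com/IBM-Swift/Kitura.git", majorVersion: 1)
    ]
)
```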

Updated IBM Swift Sandbox
The Swift Sandbox enables developers to learn Swift, build prototypes and share code snippets. Whatever your Swift ambitions, join the over 100,000 community members using the Sandbox today. Learn more…

OpenWhisk
OpenWhisk is an event-driven compute platform that executes application logic in response to events or through direct invocations from web/mobile apps or other endpoints. Learn more…
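OpenWhisk actions can be written in Swift, among other languages. Per the documented convention, an action is just a main function that accepts and returns a dictionary; here's a minimal sketch:

```swift
// A minimal OpenWhisk action in Swift: the platform invokes main()
// with the request parameters and expects a dictionary as the result.
func main(args: [String: Any]) -> [String: Any] {
    if let name = args["name"] as? String {
        return ["greeting": "Hello, \(name)!"]
    }
    return ["greeting": "Hello, stranger!"]
}
```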

Or, you can read John Ponzo’s official announcement here.


This is more or less a re-share from my original post on the Swift@IBM Blog.

Mobile Apps, Cognitive Computing, & Wearables

Last week I was in good ol' Las Vegas for IBM InterConnect – IBM's largest conference of the year. With over 20,000 attendees, it was a fantastic event that covered everything from technical details for developers to forward-looking strategy and trends for C-level executives. IBM also made some big announcements for developers – OpenWhisk serverless computing and bringing the Swift language to the server, just to name a few. Both of these are exciting new initiatives that offer radical changes & simplifications to developer workflows.

It was a busy week to say the least – lots of presentations, a few labs, and even a role in the main stage Swift keynote. You can expect to find more detail on each of these here on the blog in the days/weeks to come.

For starters, here are two “lightning talks” I presented in the InterConnect Dev@ developer zone:

Smarter apps with Cognitive Computing

This session introduces the concept of cognitive computing, and demonstrates how you can use cognitive services in your own mobile apps.  If you aren’t familiar with cognitive computing, then I strongly recommend that you check out this post: The Future of Cognitive Computing.

In the presentation below, I show two apps leveraging services on Bluemix, IBM’s Cloud computing platform, and the iOS SDK for Watson.

Actually, I'm using two Watson SDKs: the older Speech SDK for iOS and the new iOS SDK. I'm using the older Speech SDK in one example because it supports continuous listening for Watson Speech To Text, which is still in development for the new SDK.
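For reference, continuous microphone transcription with the Watson Swift SDK looked roughly like the sketch below. Treat the type and method names (SpeechToText, RecognitionSettings, recognizeMicrophone) as assumptions on my part; they varied across SDK releases:

```swift
import SpeechToTextV1

// Hedged sketch: names follow the Watson Swift SDK of that era and
// may not match any single release exactly.
let speechToText = SpeechToText(username: "your-username", password: "your-password")

var settings = RecognitionSettings(contentType: .opus)
settings.interimResults = true   // stream partial transcriptions as they arrive

speechToText.recognizeMicrophone(settings: settings, failure: { error in
    print("Transcription error: \(error)")
}) { results in
    print(results.bestTranscript)
}
```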

You can check out the source code for the translator app here.

Redefining your personal mobile expression with on-body computing

My second presentation highlighted how we can use on-body computing devices to change how we interact with systems and data.  For example, we can use a luxury smart watch (e.g., the Apple Watch) to consume and engage with data in more efficient and more personal ways.  Likewise, we can use smart/wearable peripheral devices to access and act on data in ways that were never possible before.

For example, we can determine gestures or biometric status based upon patterns in the raw data transmitted by on-body devices.  For this, I leveraged the new IBM Wearables SDK, which provides a consistent interface/abstraction layer for interacting with wearable sensors.  This allows you to focus on building apps that interact with the data, rather than learning the ins & outs of a new device-specific SDK.

The wearables SDK also uses data interpretation algorithms that enable you to define gestures or patterns in the data, and to act upon those patterns when they occur – without additional user interaction.  For example, you can determine if someone falls down, detect when someone raises their hand, flag anomalies in heart rate or skin temperature, and much more.  The system is capable of learning patterns for any type of action, or for virtually any data submitted to the system.  Sound interesting?  Then check it out here.

The wearables SDK is open source on Github, and contains a sample to help you get started.
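To illustrate the idea (and only the idea), here's a self-contained Swift sketch of the learn-a-pattern, match-the-stream model described above. Every type below is a hypothetical placeholder I made up for illustration; it is not the Wearables SDK's actual API, so refer to the GitHub sample for the real interface:

```swift
import Foundation

// Purely illustrative sketch of the concept: learn a pattern from labeled
// sensor samples, then flag matches in a live stream. These types are
// hypothetical placeholders, NOT the IBM Wearables SDK's API.
struct GestureDefinition {
    let name: String
    var template: [Double] = []
}

final class SensorStream {
    private var handlers: [(name: String, template: [Double], action: () -> Void)] = []

    // Register a callback to fire whenever the learned pattern recurs
    func onGesture(_ gesture: GestureDefinition, action: @escaping () -> Void) {
        handlers.append((name: gesture.name, template: gesture.template, action: action))
    }

    // Feed a window of raw sensor values; fire any matching handlers
    func ingest(_ window: [Double]) {
        for handler in handlers where distance(window, handler.template) < 0.5 {
            handler.action()
        }
    }

    // Naive Euclidean distance as a stand-in for the SDK's real algorithms
    private func distance(_ a: [Double], _ b: [Double]) -> Double {
        guard a.count == b.count else { return .infinity }
        return sqrt(zip(a, b).map { pow($0.0 - $0.1, 2) }.reduce(0, +))
    }
}

let handRaise = GestureDefinition(name: "handRaise", template: [0.1, 0.9, 0.8])
let stream = SensorStream()
stream.onGesture(handRaise) { print("Hand raised!") }
stream.ingest([0.1, 0.85, 0.8])   // close to the template, so the handler fires
```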

I also had some other sessions on integrating drones with cloud services, integrating weather services into your mobile apps, and more.  I'll be sure to post updates as I make this content publicly available.  I think you'll find the session on drones + cloud especially interesting – I know I did.

Say What? Live video chat between iOS & WebRTC with Twilio & IBM Watson Cognitive Computing in Real Time

What I’m about to show you might seem like science fiction from the future, but I can assure you it is not. Actually, every piece of this is available for you to use as a service.  Today.

Yesterday Twilio, an IBM partner whose services are available via IBM Bluemix, announced several new SDKs, including live video chat as a service.  This makes live video very easy to integrate into your native mobile or web-based applications, and gives you the power to do some very cool things. For example, what if you could add video chat capabilities between your mobile and web clients? Now, what if you could take things a step further and add IBM Watson cognitive computing capabilities for real-time transcription and analysis?

Check out this video from yesterday's Twilio Signal conference keynote, where fellow IBMers Damion Heredia and Jeff Sloyer demonstrate exactly this scenario: the integration of the new Twilio Video SDK between a native iOS client and a WebRTC client, with IBM Watson cognitive computing services providing real-time transcription and sentiment analysis.

If it doesn't automatically jump to the IBM Bluemix demo, skip ahead to 2 hours, 15 minutes, and 20 seconds.

Jeff and Damion did an awesome job showing off both the new video service and the power of IBM Watson. I can also say first-hand that the new Twilio video services are pretty easy to integrate into your own projects – I helped them integrate these services into the native iOS client (the physician's app) shown in the demo!  You just pull in the SDK, add your app tokens, and instantiate a video chat.  Jeff pulls the audio stream from the WebRTC client and pushes it up to Watson in real time for the transcription and sentiment analysis services.
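In code, that flow looks something like the sketch below. I'm using names from a later Twilio Video SDK surface (ConnectOptions, TwilioVideoSDK.connect) because those are the ones I can vouch for; the conversations-style SDK demoed at Signal differed, so treat the specifics (and the room name) as assumptions:

```swift
import TwilioVideo

// Hedged sketch based on a later Twilio Video SDK surface; the SDK shown
// at Signal used different (conversations-style) APIs.
class VideoChatController: NSObject, RoomDelegate {
    var room: Room?

    func connect(accessToken: String) {
        let options = ConnectOptions(token: accessToken) { builder in
            builder.roomName = "physician-consult"   // hypothetical room name
        }
        room = TwilioVideoSDK.connect(options: options, delegate: self)
    }

    // RoomDelegate callback: fired once the connection is established
    func roomDidConnect(room: Room) {
        print("Connected to room: \(room.name)")
    }
}
```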

Video: The Next Generation of Native Apps Built with IBM MobileFirst

Last month I had the opportunity to speak at the DevNexus developer conference in Atlanta on building native iOS apps with IBM MobileFirst. DevNexus is a great event, and it is always a privilege to attend – I highly recommend it for next year.   If you weren't able to make it, no worries!  Most of the sessions were recorded and are available for viewing online via DZone.

The recording of my session is embedded below.  It covers everything you need to know to get started building apps with the MobileFirst platform.

This session focuses mainly on native iOS, but the exact same concepts apply to MobileFirst apps built for other platforms, as well as hybrid apps.  It covers both the MobileFirst for Bluemix (cloud) and on-premises MobileFirst Platform Foundation Server solutions.
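As a taste of what the session covers, here's a hedged sketch of calling a MobileFirst adapter from native Swift code. The WLResourceRequest API is the part I'm confident about; the exact initializer labels, framework module name, and the adapter path are assumptions, so check the docs for your platform version (the sample in the talk may also have used Objective-C):

```swift
import IBMMobileFirstPlatformFoundation

// Hedged sketch: names follow the MobileFirst Foundation iOS API
// (WLResourceRequest) as I recall it; the adapter path is hypothetical.
let path = URL(string: "/adapters/employeeAdapter/getEmployees")!
let request = WLResourceRequest(url: path, method: WLHttpMethodGet)

request.send { response, error in
    if let error = error {
        print("Adapter call failed: \(error)")
    } else {
        print("Adapter response: \(response?.responseText ?? "")")
    }
}
```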

Here’s the “official” session description:

Once your app goes live in the app store, you will have entered an iterative cycle of updates, improvements, and releases, each successively building on features (and defects) from previous versions. IBM MobileFirst Foundation gives you the tools you need to manage every aspect of this cycle, so you can deliver the best possible product to your end user. In this session, we'll cover the process of integrating a native iOS application with IBM MobileFirst Foundation to leverage all of the capabilities the platform has to offer.

Video originally shared by @dzone.

Lens Correction For GoPro Video Footage Has Never Been Easier!

I love my GoPro camera.  It takes amazing pictures and captures incredible videos, and it can get into some extreme situations that other cameras probably would not survive – no wonder it is one of the best-selling cameras in the world.  I also love the fisheye lens, but there are times when the fisheye effect is too much. We've had lens correction in Photoshop and Lightroom, and optics compensation in After Effects, for a while, but now it is easier than ever to non-destructively remove the fisheye effect from GoPro video footage directly inside Adobe Premiere Pro.  Check out the video below to see it in action.

Applying lens correction (or lens distortion removal) is incredibly easy.  There are new effect presets in the Effects panel that let video editors simply drag an effect onto a clip to apply the lens correction.  Just select the preset for the resolution and field of view (FOV) that match what you used to capture your footage, and drag it right onto your clip.  The presets are located under Presets -> Lens Distortion Removal -> GoPro. Fellow quadcopter enthusiasts may also notice some presets for the DJI Vision cameras!

[Image: GoPro Lens Distortion Presets]

Once you've applied the preset to your footage, you can tweak it to customize the amount of correction.  You can under-correct, over-correct, or change the center/focal point of the correction.  I normally leave the default settings…

[Image: GoPro Lens Distortion Effect Controls]

Once you've applied the correct preset for your footage, you'll see that the lens distortion has been removed.  Lines that are straight in the real world will now appear straight, and everything will line up to scale.

[Image: Lens Distortion Removal in Action]

Now get out there and go capture some amazing footage of your own!