IBM Worklight Powered Native Objective-C iOS Apps

IBM MobileFirst Foundation (also known as IBM Worklight) is a middleware solution for developing mobile applications. Out of the box, Worklight enables security and authentication, device management, encrypted storage, operational analytics, simple cross platform push notifications, remote logging, data access, and more…

Historically, most people think that Worklight is just for creating hybrid mobile (HTML-powered) applications. While this is one of the powerful workflows that Worklight enables, it’s also the proverbial “tip of the iceberg”. Worklight does a lot more than just provide the foundation for secure hybrid applications. Worklight also provides a secure foundation for native applications, and can be easily incorporated into any existing (or new) native app.

In this post, I’ve put together a video series to take a look at just how easy it is to set up Worklight in an existing iOS native application to provide remote log collection, application management, and more. Read on for complete detail, or check out the complete multi-video playlist here.

The existing app that I’ll be integrating is based on an open source Hacker News client, which I downloaded from GitHub. Check out the video below for a quick introduction to what we’ll be building.

If you want, you can leverage the Worklight IDE – a set of developer tools built on top of the Eclipse development environment. If you don’t want to use Eclipse, that’s OK too – Worklight can be installed and configured using the Worklight command line interface (CLI), and you can leverage any developer tools that you’d like to build your applications. Want to tie into Xcode? No problem. I’ll be using the Worklight CLI to set up the development environment in this series.

Setting up the Worklight Server

The Worklight server is the backbone for providing Worklight features. App management, remote logging, operational and network analytics, and the rest of the Worklight feature set all require the server, so the first thing you need to do is set up the server for your environment. Check out the next video, which will guide you through setting up a Worklight server/project using the CLI.

First things first, you have to start the Worklight server. If you haven’t already configured a Worklight server, run the create-server command to perform the initial setup – this only has to be run once ever.

wl create-server

Now that the server is set up, we need to create a Worklight project. For this walkthrough, I’ll just call it “MyWorklightServer”:

wl create MyWorklightServer

Next, change into the newly created project directory and start the server.

cd MyWorklightServer
wl start

Once the server is started, add the iOS platform:

wl add api

You will be prompted for an API name. This can be anything, but you should probably give it a meaningful name that identifies what the API will be used for. In this walkthrough I specify the name “MyiOSNativeAPI”.

Next you will be prompted to select a platform; select “iOS”.
Then build the project and deploy it to the server:

wl build
wl deploy

Next, launch the Worklight console to verify that the project has been deployed and the native API has been created. The console will launch in the system web browser.

wl console
Worklight Console

Be sure to check out the Worklight CLI documentation for complete detail on the CLI commands.

Xcode Integration

Next we need to setup the Xcode project to connect to the newly created Worklight server. If you’re adding Worklight to a new Xcode project, or an existing Xcode project, the preparation steps are the same:

  1. Add Worklight files to your Xcode project
  2. Add framework dependencies
  3. Add the -ObjC linker flag

This next video walks through configuration of your Xcode project and connecting to the Worklight server (which we will cover next):

In the Xcode project navigator, create a folder/group for the Worklight files, then right click or CTRL-click and select “Add Files to {your project}”…

Next, navigate to the newly created MyiOSNativeAPI folder and select the worklight.plist file and WorklightAPI folder, and click the “Add” button.

Once we’ve added the files to our Xcode project, we need to link the required framework/library dependencies:

  • CoreData.framework
  • CoreLocation.framework
  • MobileCoreServices.framework
  • Security.framework
  • SystemConfiguration.framework
  • libstdc++.6.dylib
  • libz.dylib

Next, in the Xcode project’s Build Settings, search for “Other Linker Flags” and add the following linker flag: “-ObjC”.

If you’d like additional detail, don’t miss this tutorial/starter app by Carlos Santana: https://github.com/csantanapr/wl-starter-ios-app

Connecting to the Worklight Server

Connecting to the Worklight server takes just a few lines of code. You only have to implement the WLDelegate protocol and call wlConnectWithDelegate; the Worklight API handles the rest of the connection process. Check out the video below to walk through this process:

Implement the WLDelegate protocol:

//in the header, adopt the WLDelegate protocol
@interface MAMAppDelegate : UIResponder <UIApplicationDelegate,
   WLDelegate>
@end

//in the implementation, add the protocol methods
-(void)onSuccess:(WLResponse *)response {
    //handle a successful connection
}
-(void)onFailure:(WLFailResponse *)response {
    //handle a failed connection
}

Connect to the Worklight Server:

[[WLClient sharedInstance] wlConnectWithDelegate: self];

Next, go ahead and launch the app in the iOS Simulator.

You’re now connected to the Worklight server! At this point, you could leverage app management and analytics through the Worklight Console, or start introducing the OCLogger class to capture client side logging on the server.

App Administration via Worklight Console

Worklight Analytics Dashboard

Remote Collection of Client Logs

Once you’re connected to Worklight, you can start taking advantage of any features of the client or server side APIs. In this next video, we’ll walk through the process of adding remote collection of client app logs, which could be used for app instrumentation, or for debugging issues on remote devices.

On the server, you’ll need to add log profiles to enable the capture of information from the client devices.

Adding Log Profiles

On the client-side, we just need to use the OCLogger class to make logging statements. These statements will be output in the Xcode console for local debugging purposes. If a log profile has been configured on the server, these statements will also be sent to the Worklight server.

OCLogger *logger = [OCLogger getInstanceWithPackage:@"MyAppPackageName"];
[OCLogger setCapture:YES];
[OCLogger setAutoSendLogs:YES];

//now log something
[logger log:@"worklight connection success"];
[logger error:@"worklight connection failed %@", response.description];

For complete reference on client side logging, be sure to review the Client-side log capture API documentation.

App Management & Administration

Out of the box, Worklight also provides for hassle-free (and code-free) app management. This enables you to set notifications for Worklight client apps and disable apps (cutting off access to services, providing update URLs, etc.). This next video walks you through the basics of app management.

Be sure to check out the complete documentation for app management for complete details.

All done, right?

At this point, we’ve successfully integrated the Worklight server into a native iOS app. We have remote collection of client-side logs, we can leverage app management, and we can collect operational analytics (including platforms, active devices, and much more).

If you don’t want to leverage any more Worklight features, then by all means, ship it! However, Worklight still has a LOT more to offer.

Exposing Data Through Adapters

Worklight also has the ability to create adapters to expose your data to mobile clients. Adapters are written in JavaScript and run on the server. They help you speed up development, enhance security, transform data serialization formats, and more.

In the next two videos we will walk through the process of collecting information inside the native iOS application and pushing that into a Cloudant NoSQL database (hosted via IBM Bluemix services).

Cloudant databases have a REST API of their own, so why use a Worklight Adapter? For starters, Worklight becomes your mobile gateway.  By funneling requests through Worklight, you are able to capture analytic data for every server invocation, and Worklight gives us the ability to control access to the enterprise.  Worklight gives you the capability to cut off mobile access to the backend at any time, just by changing the API status.

Now let’s take a look at the process for exposing the Cloudant database through a Worklight Adapter:
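To make the server side concrete, here is a minimal sketch of what the adapter implementation could look like. The adapter and procedure names (CloudantAdapter, putData) match the invocation shown later in this post, but the database name and request shape below are assumptions for illustration; the actual Cloudant host, port, and credentials live in the adapter’s XML descriptor, not in the JavaScript.

```javascript
//Hypothetical CloudantAdapter-impl.js: the putData procedure receives the
//user profile dictionary sent from the iOS client and writes it to Cloudant.
//'userprofiles' is an assumed database name; the Cloudant host and
//credentials are configured in the adapter's XML descriptor, not here.
function putData(doc) {
    var input = {
        method: 'post',
        returnedContentType: 'json',
        path: 'userprofiles', //Cloudant creates a document via POST /{db}
        headers: { 'Content-Type': 'application/json' },
        body: {
            contentType: 'application/json',
            content: JSON.stringify(doc) //the profile data from the client
        }
    };
    //WL.Server.invokeHttp performs the HTTP request against the backend
    //defined in the adapter's connection policy
    return WL.Server.invokeHttp(input);
}
```

Because every client request funnels through a procedure like this, each invocation shows up in Worklight’s operational analytics, and access to the backend can be revoked from the console at any time.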

Once the data adapter has been configured, it is simple to invoke the adapter procedures to get data into or out of your applications.

This next video covers the process of pushing data into the Cloudant database adapter from the native mobile client application:

Once again, you will have to implement the WLDelegate protocol to handle success/error conditions, and the procedure invocation is implemented using the WLClient invokeProcedure method:

NSMutableDictionary *userProfile = [[NSMutableDictionary alloc] init];
[userProfile setValue:self.email.text forKey:@"email"];
[userProfile setValue:self.firstName.text forKey:@"firstName"];
[userProfile setValue:self.lastName.text forKey:@"lastName"];    

WLProcedureInvocationData *invocationData = [[WLProcedureInvocationData alloc] initWithAdapterName:@"CloudantAdapter" procedureName:@"putData"];
invocationData.parameters = @[userProfile];

[[OCLogger getInstanceWithPackage:@"UserProfile"] log:@"sending data to server"];
[[WLClient sharedInstance] invokeProcedure:invocationData withDelegate:self];

It is as simple as that.

IBM MobileFirst Foundation (aka Worklight) is for more than just hybrid app development. It is a secure platform to streamline your enterprise application development processes, including everything from encryption, security and authentication, to operational analytics and logging, to push notifications, and much more, regardless of whether you’re targeting hybrid app development paradigms, or native iOS, native Android, or native Windows Phone projects.

Lens Correction For GoPro Video Footage Has Never Been Easier!

I love my GoPro camera.  It takes amazing pictures and captures incredible videos, and can get into some extreme situations that other cameras probably would not survive – no wonder it is one of the best selling cameras in the world.  I also love the fisheye lens, but there are times when the fisheye effect is too much. We’ve had lens correction in Photoshop and Lightroom for a while, optics compensation in After Effects, but now it is easier than ever to non-destructively remove the fisheye effect from GoPro video footage directly inside of Adobe Premiere Pro.  Check out the video below to see it in action.

Applying lens correction (or lens distortion removal) is incredibly easy.  There are new effects presets in the effects panel that enable video editors to simply drag an effect onto their clip to have the lens correction applied.  Just select the preset for the resolution and field of view (FOV) that match what you used to capture your footage, and drag it right onto your clip.  They are located under Presets -> Lens Distortion Removal -> GoPro. For those fellow quadcopter enthusiasts, you may also notice some presets for the DJI Vision cameras!

GoPro Lens Distortion Presets

Once you’ve applied the preset to your footage, you can tweak it as you like to customize the amount of correction.  You can under-correct, over-correct, or change the center/focal point of the correction.  I normally tend to leave it with the default settings…

GoPro Lens Distortion Effect Controls

Once you’ve applied the correct preset for your footage, you’ll be able to see that the lens distortion has been removed.  The straight lines will now appear straight, and everything will line up to scale.

Lens Distortion Removal in Action

Now get out there and go capture some amazing footage of your own!

Salisbury Festival Time-lapse

Every year the town I live in has a weekend-long spring festival. There are rides for the kids, live music, beer, and lots of food. This year I have a great view overlooking the carnival area, so I decided to do a time-lapse video capturing all of the activity. The trucks pulled in before I got to the office on Thursday morning, but I managed to capture most of the set up, all the way until the trucks drove away on Sunday night.

I set up two GoPro cameras. One was a stock GoPro Hero 3+ Black edition capturing 7MP narrow FOV stills every 60 seconds. The other was a GoPro Hero 3 Black with a “flat” lens capturing 5MP stills every 60 seconds. Unfortunately the 3+ stopped recording after about 24 hours – I’m not sure if the camera overheated, hit a bug in the firmware (I realized I’m one version behind the latest), or if my memory card had a corrupt sector. The image sequence for Thursday is from this camera. The backup camera kept running all 4 days and captured the entire festival.

Assembling this was simple – I imported the images as image sequences in Adobe Premiere, arranged them on the timeline, cut out the night sequences (there was almost no activity during them), added some transitions, titles, and color correction (contrast and saturation), then added some background music.  I added slow zooming and panning to each of the shots to add drama, which helped make things a lot more interesting.

Improving The Quality Of Your Video Compositions With Creative Cloud

I’ve been spending a lot of time with Adobe video tools lately… everything from videos for the blog, to promotional videos, to help/technical videos.  Here are two topics that beginners in video production need to think about: audio processing and color correction.

First, you can make so-so video look great with a few simple color correction techniques. Second, a video is only as good as its audio, so you need solid audio to keep viewers engaged.  Hopefully this post helps you improve your videos with simple steps on both of these topics.

To give you an idea what I’m talking about, check out this before and after video. It’s the exact same clip played twice.  The first run through is just the raw video straight from the camera and mic.  Colors don’t “pop”, it’s a little grainy, and the audio is very quiet.  The second run through has color correction applied to enhance the visuals, and also has processed audio to enhance tone, increase volume, and clean up artifacts.

Let’s first look at color correction.  Below you can see a “before” and “after” still showing the effects of color correction.  The background is darker and has less grain, there is more contrast, and the colors are warmer.

Before and After – Color Correction

The visual treatment was achieved using two simple effects in Adobe Premiere Pro.  First I used the Fast Color Corrector to adjust the input levels.  By bringing up the black and gray input levels, the background became darker, and it reduced grain in the darker areas.  Then, I applied the “Warm Overall” Lumetri effect to make the video feel warmer – this enhances the reds to add warmth to the image.

Color Correction Effects in Adobe Premiere

You can enhance colors even further using color correction tools inside of Premiere Pro, or open the Premiere Pro project directly within SpeedGrade for fine tuning.

Next, let’s focus on audio…

You can get by with a mediocre video with good audio, but nobody wants to sit through a nice looking video with terrible audio. Here are three simple tips for Adobe Audition to help improve your audio, and hopefully keep viewers engaged.

In this case, I thought the audio was too quiet and could be difficult to understand.  My goal was to enhance audio volume and dynamics to make this easier to hear.

I first used Dynamics Processing to create a noise gate. This process removes quiet sounds from the audio, leaving us with the louder sounds, and generally cleaner audio.  You could also use Noise Reduction or the Sound Remover effects… the effect that works best will depend on your audio source.

Dynamics Processing (Noise Gate) in Adobe Audition

Next I used the 10-band graphic equalizer to enhance sounds in specific frequency ranges.  I brought up mid-range sounds to give more depth to the audio track.

10 Band EQ in Adobe Audition

Finally, I used the Multiband Compressor to enhance the dynamic range of the audio.  Quieter sounds were brought up and louder sounds were brought down to create more level audio that is easier to hear and understand.  However, be careful not to make your audio too loud when using the compressor!  If you’ve ever been watching TV and the advertisements practically blow out your eardrums, this is because of overly compressed audio.

Multi-band Compressor in Adobe Audition

Want to learn more?  Don’t miss the Creative Cloud Learn resources to learn more about all of the Creative Cloud tools – the learning resources are free for everyone! If you aren’t already a member, join Creative Cloud today to access all Adobe media production tools.

Aerial Videography with the GoPro Camera and Adobe Creative Cloud Tools

Interested in aerial videography with remote control helicopters? Well, you’re in luck! This month’s issue of Adobe Inspire magazine features my article, which introduces aerial videography with a DJI Phantom multirotor helicopter and a GoPro camera!

You can read it on the web or download the FREE digital publication version to learn more. I recommend the digital publication version, which was created with Adobe Digital Publishing Suite.

Interested in focusing on aerial photography instead of videography? Stay tuned for the March Adobe Inspire issue next month, which will feature a complementary article focusing on still images captured with the same helicopter configuration. Subscribe today to be notified automatically when the new issue is available.

Be warned – flying helicopters with cameras attached is highly addictive. You may easily become obsessed with the endless possibilities, as I have.

Here are a few videos I’ve captured with this setup, and processed with Creative Cloud.

Some scenic shots in and around San Francisco…

A digital short where I was playing around with After Effects…

The sky really is the limit!

OK, do I have your attention yet? To learn more you can read the full article online or download the FREE digital publication, and don’t forget to become a member of Creative Cloud to take advantage of all the creative tools that Adobe has to offer.