My New Obsession: Aerial Photography & Creative Cloud

I have a new “hobby”, and I must admit, I am completely addicted. Perhaps “obsessed” might be the proper term. Luckily for me, it is a great use of Creative Cloud, so I can get away with it! What is this new obsession, you ask? Aerial Photography.

However, I’m not flying airplanes or riding in helicopters. In fact, my feet aren’t even leaving the ground at all… just my camera. It all started a few weeks ago when my friend Tony approached me about getting some aerial photos for experimentation. He knew I already had a GoPro camera, and that I was completely obsessed with it… GoPros take incredible pictures, and are very durable. The only thing was that we needed a reliable way to get it off the ground.

Enter part 2 of the equation, and the greatest part of my obsession: the quadcopter. Basically, it is a multi-rotor, remote-controlled helicopter. Multi-rotor copters are mechanically very simple, very stable, and versatile. Check out this TED video to see just how agile they can be.

I had been wanting one of these for a long time, researching here and there and looking at both DIY and pre-built kits. I wanted something that would be easy to fly, but also capable of outdoor flight and a camera payload. Copters can range from inexpensive children’s toys to high-end hex/octo-rotor professional rigs costing many thousands of dollars, and I looked at nearly every one of them. I finally settled on the DJI Phantom.


I chose this copter mainly for two reasons:

  • It has a GPS guidance system that makes it easy to fly, corrects for wind drift, and even has a “return to home” feature if the battery gets low or if it loses contact with the controller. Yet, it can be very maneuverable and agile. 
  • It is easy to set up, and comes with a GoPro mounting system.
  • The videos that people are producing with this setup will make your jaw drop. (Yes, I said two… this one isn’t an “official” reason.)

DJI’s promise of “ease of use” definitely holds true. It was set up and running very quickly; the longest part of the setup process was waiting for the battery to charge. Once we finally got it off the ground, there was no going back… Now, with everything that I look at, I wonder “Wouldn’t that be cool to take a picture of from the air?”

A word of warning, once you start down this path, it is very easy to become obsessed – I will elaborate more on this subject in a bit. For now, check out some of my recent photos with this rig. You can see the full collection (and any future photos) on my Flickr photostream.

[Photo gallery: aerial shots of Ocean Pines (2013-06-17) and Pemberton Park (2013-06-14)]

In fact, we were so excited to fly it, that we didn’t even wait for daylight! The Phantom has a series of LEDs for navigation, so there was no problem flying at night, at all. Here’s a short video from my first quadcopter flight:

While this setup takes amazing still images out of the box, you may have noticed that the video isn’t all that smooth. Some of the shaking can be reduced by refining your piloting techniques (don’t ascend/descend too fast, etc.). However, the default configuration isn’t ideal for video capture. Much of that can be corrected with some add-ons (which I will be investing in) and with software-based video stabilization in After Effects. Again, I’ll cover this more in a bit…

Lessons Learned

My explorations into aerial photography have provided me with some valuable lessons…

  • Accept the fact that you will crash.  Even though the Phantom is very easy to fly, at some point you will, without a doubt, crash the copter. Luckily every piece of the copter can be replaced. If your battery is almost dead and you try to take off, the copter will just flip over onto the ground. If you are trying to land and there is a strong gust of wind, you will likely crash. If you descend too quickly, you will crash. I mention each of these, because I have done all of them.
  • Immediately replace the nuts that hold the propellers onto the copter.  Within the first week of owning the Phantom, I lost a propeller and the copter came plummeting to the ground. Luckily, I was only about ten feet off of the ground when this happened. The “dome” nuts that ship with the copter look pretty, but are not reliable. Get some new nuts with lock washers, Nylock nuts, and/or some thread lock to prevent the nuts from loosening due to vibration. If you use thread lock, just be sure NOT to get any on the plastic propellers. I have heard it will do bad things to them.
  • Be prepared to obsess over the ways that you can use and modify the copter. This leads me to my next section…


If you are like me, you will immediately want to explore all of the ways that you can use the Phantom once you get it. This will lead you down the road of watching countless YouTube videos that others have created, reading DIY instructions for modifications, and scouring the web for upgrade options.

Must haves:

  • Extra batteries. The Phantom comes with one battery that lasts 10-12 minutes. You will want more batteries for additional flight time.
  • Better nuts to secure the propellers. (See my lessons learned above)

Of course, not all modifications are “must haves”; here are the next few options that are on my list. If anyone is looking for a gift idea for me, look at this list! 🙂

  • Travel Case. Without a doubt, you will want to protect your copter. I have mine in a cardboard box lined with bubble-wrap to prevent damage from travel and toddlers. If you plan on travelling with your copter (which I do), you will want something like one of these.
  • Upgraded Propellers. Larger propellers provide additional lift. Balanced propellers provide less vibration. Stiffer propellers also provide additional lift. The Phantom ships with 8″ plastic props. I’m looking into 9″ wooden or carbon fiber, balanced propellers.
  • Camera Gimbal System. A gimbal system will keep your camera stable, regardless of the pitch or yaw of the copter. This will make a huge difference if you are producing video. There are lots of options, from DIY kits, to the official DJI/Zenmuse Gimbal system that was just released today. I am drooling over these options.
  • Vibration Isolation Mount. If you have a gimbal, you probably already have this, but if you don’t, you might want one. A vibration isolation mount will reduce the copter vibrations that are transferred to the camera. This will result in better quality images and videos, and less “jello effect”.
  • ND/Polarized filters for the GoPro camera. ND filters and polarized filters can help improve the captured image or video, and can minimize the “jello effect” due to vibrations.
  • FPV Video Navigation System. An FPV (First Person View) video system gives you the ability to see exactly what the copter sees when you are flying. This can be really useful for framing of photos, or flying beyond line-of-sight. Most of these will require you to modify your copter. You can use the GoPro app as a limited FPV system; although keep in mind that it operates over an ad hoc wifi connection. If you lose the connection, you lose the video feed.

I mentioned that people are creating amazing images and videos with this rig; here are just a few that blew me away…

… and there are many, many more.

Creative Cloud

So, I’ve written all about the copter so far, but what about Creative Cloud?  How does it tie into quadcopters? Creative Cloud enables you to do more with the aerial imagery that you capture from the quadcopter.

When taking still images, I set up the camera to automatically take two pictures every second. When you’re flying for 10 minutes, this ends up being a lot of images. Lightroom enables you to quickly import, tag, develop, and publish your images.
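The math adds up quickly. Here is a quick sketch of the arithmetic (these are my shooting settings, not a camera spec):

```javascript
// How many frames does a time-lapse flight produce?
// Assumes 2 photos/second over a 10-minute flight.
function photoCount(photosPerSecond, flightMinutes) {
  return photosPerSecond * flightMinutes * 60;
}

console.log(photoCount(2, 10)); // 1200 images from a single flight
```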

Import in Lightroom
Develop in Lightroom

Many of the images need additional editing before they are suitable for your final product. Whether you need to remove objects (such as landing gear, shadows, people, etc.), tweak clouds, or make other dramatic changes, Photoshop enables rich editing and retouching of your images.

Advanced Editing in Photoshop CC

Of course, it’s not all about still images. Premiere and After Effects enable editing and production of your captured videos. You can use Premiere to arrange multiple clips into a broader sequence, and you can add post-processing effects and image stabilization with After Effects.

Editing Video in Premiere Pro CC

…and what is a video without audio? Audition provides you with a rich environment for audio production. Whether it is editing the captured audio, or producing new music and soundtracks, Audition has you covered.

Produce Audio with Audition CC

Here’s another video that I produced using Premiere for edits, stabilization with After Effects, and Audition for audio tweaks. Granted, I do not have a gimbal or other stabilization system, so bear with the quality!

This is just the proverbial “tip of the iceberg”. Creative Cloud gives you the tools you need to create amazing multimedia content, and the means to make that content engaging and interactive – from hobbyist to die-hard professional.

Once you’re happy with what you’ve got, don’t forget you can share it on Behance via your Creative Cloud membership!


Live Editing HTML on Mobile Devices With Adobe Edge Code CC & Adobe Edge Inspect CC

Adobe Edge Inspect CC is an awesome tool for synchronized browsing experiences across both desktop and mobile devices. It is incredibly powerful, and it streamlines development and testing workflows by reducing iterations when testing HTML/JS experiences on multiple devices.

Adobe Edge Code CC is an awesome code editor for creating HTML/JS experiences. It has a live connection to the browser, so you can see your edits in the browser in real time as you are editing your code… all without having to leave the code editor.

Wouldn’t it be cool if Adobe Edge Code CC could push updates to mobile devices leveraging Adobe Edge Inspect CC? Oh wait, what? It already can? YES, it can! This is one of the coolest new features in this week’s Creative Cloud release. You can edit your code in Edge Code CC and preview your changes live, in real time, in both the desktop browser and on mobile devices. Take a look at the video below to see it in action:

OK, so how do you get it? Become a member of Creative Cloud… Membership does have its perks! This is available in all of the plans, even the free option. This plugin for Edge Code CC/Brackets is also open source. Know what else is cool about Adobe Edge Inspect CC? It has an open source JavaScript API that you can leverage to integrate your own apps into an Edge Inspect workflow or continuous integration environment. It’s definitely worth checking out.

My Favorite New Feature: Multi-Clip Sync in Premiere Pro CC

The latest version of Premiere Pro CC is seriously awesome. There are tons of new features and performance improvements… everything from cloud synchronization of settings across computers, to the new Mercury playback engine, to asset browsing, editing improvements, closed captioning, and more. However, I think my absolute favorite feature in Premiere Pro CC is the ability to synchronize multiple video clips based on their audio tracks.

You can simply select the clips that you want to have synchronized, right-click, select “Synchronize”, then select the “Audio” option.


This feature will temporally align the video clips by analyzing their audio tracks. You no longer have to compare waveforms manually to try and align everything to the precise frame. For me, this has been a tremendous time saver. Check it out, in action, in the video below!
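Under the hood, this kind of feature boils down to finding the time offset at which two audio tracks line up best. Here is a toy cross-correlation sketch that illustrates the idea (this is my simplified illustration, not Premiere Pro’s actual algorithm, which is far more robust):

```javascript
// Slide one waveform against the other and pick the lag with the
// highest correlation score. Real implementations work on normalized,
// downsampled audio; this toy version works on raw sample arrays.
function bestLag(refSamples, clipSamples, maxLag) {
  let best = { lag: 0, score: -Infinity };
  for (let lag = -maxLag; lag <= maxLag; lag++) {
    let score = 0;
    for (let i = 0; i < refSamples.length; i++) {
      const j = i + lag;
      if (j >= 0 && j < clipSamples.length) {
        score += refSamples[i] * clipSamples[j];
      }
    }
    if (score > best.score) best = { lag, score };
  }
  return best.lag; // shift the clip by -lag samples to align it with ref
}
```

Given the lag, the editor only has to offset the clip on the timeline by that many samples (converted to frames) so the waveforms coincide.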

This is one of the many features and product updates that were launched in Adobe Creative Cloud today. If you aren’t a member, you’re missing out on all of the latest and greatest features. Head on over and become a Creative Cloud member to get started today!

Adobe Creative Cloud CC Apps Now Available

The latest release of Adobe Creative Cloud tools is now available!  This includes updates for over 15 tools, including Photoshop CC, Illustrator CC, Premiere Pro CC, InDesign CC, Dreamweaver CC, Edge Animate CC, Edge Code CC, Edge Inspect CC, After Effects CC, and more!  This release is Adobe’s best yet, and it not only includes our best tools, but also greater integration with Adobe’s cloud services.  You can learn more about everything in today’s release from the Adobe Creative Cloud Team blog, or get all of these applications today!



PhoneGap Exploration – Realtime Hardware Communication

Recently I’ve undertaken some explorations with fellow evangelist Kevin Hoyt, trying to determine how far we can push PhoneGap applications with devices and physical computing. Turns out, you can push things really far, and now I’m delighted to share one of the experiments that we’ve been pursuing.

I’ve been asked on more than one occasion whether you can access Bluetooth devices in PhoneGap applications. The answer is YES, you can. There is not a specific “Bluetooth API” in PhoneGap; however, you can create native plugins to access any native library. Basically, with a native plugin, you create the native interface (written in the native language) and a JavaScript interface. The two interfaces can leverage the PhoneGap native-to-JavaScript bridge for bidirectional communication.
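The JavaScript half of such a plugin is usually just a thin wrapper around the bridge. Here is a hypothetical sketch (the plugin name, action name, and injectable `exec` function are my inventions for clarity; in a real PhoneGap/Cordova app you would call `cordova.exec` directly):

```javascript
// Hypothetical sketch of the JavaScript half of a native plugin.
// In a real app, `exec` would be cordova.exec, which marshals the call
// across the native/JS bridge; it is injectable here so the shape of
// the call is easy to see (and to test).
function makeBluetoothPlugin(exec) {
  return {
    // Ask the native side to start scanning for devices.
    // Discovered devices arrive through the success callback.
    startScan: function (onDevice, onError) {
      exec(onDevice, onError, "BluetoothPlugin", "startScan", []);
    }
  };
}
```

The native side registers a class under the same service name (“BluetoothPlugin” here) and maps the action string to a method, sending results back through the callback channel.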

In this exploration, we researched whether or not you can use a pressure sensitive stylus with a PhoneGap application. Again, the answer is YES, you can. Check out the video below to see a sample application in-action. This example demonstrates the use of a TenOne Pogo Connect Stylus inside of a PhoneGap application.

Note: This is not connected to Project Mighty in any way – Kevin and I started exploring completely separately from the big announcements at MAX.

The Pogo Connect stylus leverages Bluetooth 4 Smart (low energy) connectivity to communicate with the device, and provides pressure sensitivity and a physical button for the user to interact with. The JavaScript interface doesn’t interact directly with the Bluetooth connection. Instead, I leveraged TenOne’s Pogo Connect SDK and created a JavaScript bridge layer to delegate pen interaction from the SDK to the JavaScript layer.

There were definitely a few tricks to get this working. First, the SDK is designed to accept touch input at the native layer and determine whether or not that touch is from the pen. When using the SDK (in Objective-C), you are supposed to implement the touchesBegan, touchesMoved, touchesEnded, and touchesCancelled functions for a view:

[objc]- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event;[/objc]

In those functions, you check if any of the touches are pens, and if so, do something with that pen input.

[objc]- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        if ([pogoManager touchIsPen:touch]) {
            // do something with the pen
        }
    }
}[/objc]
The first catch with implementing this inside of a PhoneGap application is that you can’t override the touch handlers of a web view on iOS. Luckily, there is another way! I leveraged a UIGestureRecognizer instance to intercept the touch input that is received by the web view, determine if the touches were from the PogoConnect Stylus, and if so, then delegate that input back to the JS layer.

UIGestureRecognizer is normally used as a base class for creating custom gestures in iOS applications. It has everything you need to handle touch input, and it gives you that information without having to subclass an actual view. This means you can attach it to any UIView instance. Since you don’t have to subclass a view, the whole thing can be implemented as a PhoneGap native plugin without having to modify *any* code inside the PhoneGap framework.

So, here’s how I did it… Once the PhoneGap plugin is initialized on load, I create a gesture recognizer instance and attach it to the PhoneGap application’s web view. Whenever touch input is received by the gesture recognizer, that input is passed back to the PhoneGap plugin instance, which executes JavaScript on the web view to pass that information to the JavaScript layer in real time as it is received.

On the JavaScript layer, stylus/pen input is dispatched to the application as custom events on the window.document object. To subscribe to pen input in JavaScript, you just add event listeners for the Pogo Connect events that I defined in the native plugin’s JavaScript file.

[js]document.addEventListener( pogoConnect.PEN_TOUCH_BEGIN, app.penTouchBegin );
document.addEventListener( pogoConnect.PEN_TOUCH_MOVE, app.penTouchMove );
document.addEventListener( pogoConnect.PEN_TOUCH_END, app.penTouchEnd );[/js]

Information about the stylus will be contained in the event.detail attribute, and will be an instance of an object of the following shape (containing accurate values about the pen’s state, of course):

[js]{
    identifier: 0,
    …
}[/js]

In JavaScript, you can do whatever you want within your application once you receive this information.
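For example, a handler can read the pen state off of event.detail and derive something useful from it. A minimal sketch (the `pressure` field and its 0–1 range are assumptions for illustration; the exact fields come from the plugin):

```javascript
// Minimal sketch of a pen-event handler. A 0..1 `pressure` value on
// event.detail is assumed here for illustration.
function penTouchMove(event) {
  var pen = event.detail;
  // Map pressure to a stroke width between 1px and 10px.
  return 1 + pen.pressure * 9;
}
```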

I created two sample applications to test this functionality. The first is a very basic app that simply outputs the pressure as text which follows the pen tip. The intent with this app was really just to prove the concept and determine if it was actually possible to receive and respond to information from the stylus.

Basic Pressure Sensitivity Detection in PhoneGap

Once I proved it could be done, the next logical step was to create an app that actually takes advantage of the pressure sensitivity information. So, I made a sketching app.

Pressure Sensitive Sketching in PhoneGap

I started off by expanding on the drawing logic from my Lil’ Doodle PhoneGap application. This uses a requestAnimationFrame interval to render content in an HTML Canvas element using a “brush image” technique. Next, I added logic to vary the opacity and stroke size based on the pressure information received from the pen plugin, and a few other options to change the pen tip/brush shape and color.
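To make the technique concrete, here is a simplified sketch of that brush-stamp approach (the function names and the exact pressure-to-size/opacity mappings are mine, not the app’s actual code):

```javascript
// On each animation frame, stamp a "brush image" at every queued pen
// position, varying opacity and size with the recorded pressure.
function stampQueue(ctx, brush, queue) {
  for (const p of queue) {
    const size = 4 + p.pressure * 28;          // stroke size from pressure
    ctx.globalAlpha = 0.2 + p.pressure * 0.8;  // opacity from pressure
    ctx.drawImage(brush, p.x - size / 2, p.y - size / 2, size, size);
  }
  queue.length = 0; // clear the points we just rendered
}

function startRenderLoop(ctx, brush, queue) {
  // Pen events push {x, y, pressure} points onto `queue`;
  // the loop drains it once per animation frame.
  function frame() {
    stampQueue(ctx, brush, queue);
    requestAnimationFrame(frame);
  }
  requestAnimationFrame(frame);
}
```

Decoupling event capture from rendering this way keeps the drawing smooth even when pen events arrive faster than the display refreshes.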

The pressure-sensitive stylus enables a few interesting interactions that you wouldn’t get without the hardware:

  • First, the obvious: you know the pressure being applied to the pen tip, and the app can respond accordingly.
  • Next, you have an extra input method. The button on the pen allows you to interact with the device without having to actually touch the device. I used the button in two ways: first, if the button is pressed and held, the pen erases instead of draws. Second, if you double-tap the pen button, it brings up the drawing options. These options are where I placed controls to modify the pen color and stroke.
  • Third, the plugin provides bidirectional communication with the Stylus. When you change the pen color, the LED on the pen will display the selected color for a few seconds.

I used a modified version of DevGeeks’ Canvas2Image plugin to save the content of the HTML Canvas to the device’s photo library. I also had to leverage a variation of this technique for getting the data from the Canvas element without transparency.

All button and slider styling leverages Topcoat, Adobe’s brand new open source CSS framework designed to help developers build HTML-based apps with an emphasis on performance.

Full source code for this application is available on GitHub at

Note: This is iOS only – The third-party PogoConnect SDK is for iOS devices only. This example will also ONLY work if you have the PogoConnect Stylus. It does not support other stylus devices or finger-only drawing.