Stage3D & Flex Demo w/ Source

Back in the summer, I was lucky enough to get my hands on some early builds of Stage3D for mobile. I built some simple examples, including basic geometric shapes and simple 3D bubble charts inside of mobile Flex/AIR applications. I have been asked numerous times for the source code, and I’ve finally given in and am sharing it.

I am not posting the full mobile application source code, since Stage3D for mobile is not yet available. However, I have ported the 3D bubble chart example to run in a Flex application targeting the desktop (Flash Player 11). The bubble chart example extends the concepts explored in the basic geometric shapes example.

Before you say “shoot, he didn’t give us the mobile code”, let me explain… When I ported the code from the mobile project to the desktop Flex project, all I changed was code specific to the mobile Flex framework. I changed <s:ViewNavigatorApplication> to <s:Application> (along with the corresponding architectural changes that required), and I changed the list item renderers from mobile item renderers to Spark item renderers based on <s:Group>. In the mobile item renderers, all my drawing logic was done using the ActionScript drawing API; for simplicity in the port, I just used <s:Rect> to add the colored regions in the desktop variant.

That is all I changed!  

The Stage3D code between the desktop and mobile implementations is identical. You can see the desktop port in action in the video below:

Or, you can test it for yourself here:

The source code was intended to be exploratory at best… I was simply experimenting with hardware-accelerated content and how it can be used within your applications. There is one big “gotcha” that you will have to watch out for if you want Stage3D content within a Flex application: Stage3D content shows up behind Flex content on the display list. By default, Flex apps have a background color, which will hide the Stage3D content. If you want to display any Stage3D content within a Flex application (regardless of web, desktop AIR, or mobile), you must set the background alpha of the Flex application to zero (0). Otherwise you will pull out some hair trying to figure out why it doesn’t show up.
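A minimal sketch of that fix on the root application tag (Flex 4 Spark namespaces assumed; the backgroundAlpha style does the work):

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- backgroundAlpha="0" makes the Flex background transparent so
     Stage3D content behind the display list shows through -->
<s:Application xmlns:fx="http://ns.adobe.com/mxml/2009"
               xmlns:s="library://ns.adobe.com/flex/spark"
               backgroundAlpha="0">
    <!-- application content here -->
</s:Application>
```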

The source code for the web/Flex port of this example is available at:

This also requires inclusion of the Away3D library, available at:

You can check out my original posts, showing Stage3D on mobile here:

You can also check out a video of this code running on a mobile device (Samsung Galaxy Tab 10.1) below:


Low Latency & Polyphonic Audio in PhoneGap

UPDATE 10/07/2013: This plugin has been updated to support PhoneGap 3.0 method signatures and command line interface.  You can access the latest at:

If you have ever tried to develop any kind of widely supported application using HTML5 audio, then you have likely pulled all the hair from your head. In its current state, HTML5 audio is fraught with issues… inconsistent codec support across browsers & operating systems, no polyphony (a single audio clip cannot be played on top of itself), and no concurrency (some of the leading mobile browsers can only play one audio file at a time, if at all). Even the leading HTML5 games for desktop browsers don’t use HTML5 audio (they use Flash). Don’t believe me? Just take a look at Angry Birds, Cut the Rope, or Bejeweled in a proxy/resource monitor…

The Problem

You want fast & responsive audio for your mobile applications.   This is especially the case for multimedia intensive and/or gaming applications.

HTML5 audio is not *yet* ready for prime-time. There are some great libraries like SoundManager, which can help you try to use HTML5 audio with a failover to Flash, but you are still limited without polyphony or concurrency. In desktop browsers, Flash fixes these issues, and Flash is still vastly superior to HTML5 for audio programming.

If you are building mobile applications, you can have great audio capabilities by developing apps with AIR. However, what if you aren’t using AIR? In native applications, you can access the underlying audio APIs and have complete control.

If you are developing mobile applications with PhoneGap, you can use the Media class, which works great. If you want polyphony, then you will have to do some work managing audio files yourself, which can get tricky. You can also write native plugins that integrate with the audio APIs of the native operating systems, which is what I will be covering in this post.

Before continuing further, let’s take a minute to understand what I am talking about when I refer to concurrency, polyphony, and low-latency…


Concurrency

Concurrency in audio programming refers to the ability to play multiple audio resources simultaneously. HTML5 on most mobile devices does not support this – not on iOS, not on Android. In fact, HTML5 audio does not work *at all* on Android 2.x and earlier. Native APIs do support it, and so does PhoneGap’s Media class, which is based on Android’s MediaPlayer and iOS’s AVAudioPlayer.


Polyphony

Producing many sounds simultaneously; many-voiced.

In this case, polyphony is the production of multiple sounds simultaneously (I’m not referring to the concept of polyphony in music theory). In describing concurrency, I referred to the ability to play two separate sounds at the same time, whereas with polyphony I refer to the ability to play the same sound “on top” of itself – there can be multiple “voices” of the same sound. In the most literal of definitions, concurrency could be considered a part of polyphony, and polyphony a part of concurrency… hopefully you get what I’m trying to say. In its current state, HTML5 audio supports neither concurrency nor polyphony. The PhoneGap Media class does not support polyphony either; however, you can probably manage multiple Media instances via JavaScript to achieve polyphonic behavior – this requires additional work on the JavaScript side of things to juggle resources.
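To make that juggling concrete, here is a sketch of the JavaScript-side approach: a round-robin pool of player “voices”. The pool and the stand-in voice objects are my own illustration (in a real app each voice would be a PhoneGap Media instance):

```javascript
// Round-robin "voice" pool to fake polyphony on top of a
// non-polyphonic, single-instance player API.
function VoicePool(createVoice, voices) {
  this.voices = [];
  for (var i = 0; i < voices; i++) {
    this.voices.push(createVoice());
  }
  this.next = 0;
}

VoicePool.prototype.play = function () {
  var voice = this.voices[this.next];
  voice.stop();                 // restart this voice from the top
  voice.play();
  this.next = (this.next + 1) % this.voices.length;
  return voice;
};

// Example with stand-in voice objects (a real app would instead do
// something like: new Media('assets/explosion.mp3')):
var created = 0;
var pool = new VoicePool(function () {
  return {
    id: created++,
    playing: false,
    play: function () { this.playing = true; },
    stop: function () { this.playing = false; }
  };
}, 3);

pool.play();   // voice 0 starts
pool.play();   // voice 1 overlaps voice 0
```

Each call to play() grabs the next voice, so rapid triggers overlap instead of cutting each other off.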

Low Latency

Low latency refers to “human-unnoticeable delays between an input being processed and the corresponding output providing real time characteristics”, according to Wikipedia. In this case, I refer to low-latency audio, meaning that there is an imperceptible delay between when a sound is triggered and when it actually plays. Sounds play when expected, not after a wait: a bouncing ball sound should be heard as you see the ball bouncing on the screen, not after it has already bounced.

In HTML5, you can auto-load a sound so that it is ready when you need it, but don’t expect to play more than one at a time.  With the PhoneGap Media class, the audio file isn’t actually requested until you invoke “play”.   This occurs inside “startPlaying” on Android, and “play” on iOS.   What I wanted was a way to preload the audio so that it is immediately ready for use at the time it is needed.

The Solution

PhoneGap makes it really easy to build natively installed applications using a familiar paradigm: HTML & JavaScript. Luckily, PhoneGap also allows you to tie into native code using the native plugin model. This enables you to write your own native code and expose that code to your PhoneGap application via a JavaScript interface… and that is exactly what I did to enable low-latency, concurrent, and polyphonic audio in a PhoneGap experience.

I created PhoneGap native plugins for Android and iOS that allow you to preload audio and play back that audio quickly, with a very simple-to-use API. I’ll get into the details of how this works further in the post, but you can get a pretty good idea of what I mean by viewing the following two videos.

The first is a basic “Drum Machine”.  You just tap the pads to play an audio sample.

The second is a simple user interface that allows you to layer lots of complex audio, mimicking scenarios that may occur within a video gaming context.

Assets used in this example: see the README for specific links & attribution.

You may have noticed a slight delay in this second video between the tap and the actual sounds. This is because I am using “touchStart” events in the first example, and just using a normal <a href="javascript:foo()"> link in the second. There is always a delay for “normal” links on all multi-touch devices/environments because the device needs time to detect a gesture event. You can bypass this delay in mobile web browsers by using touch events for all input.

Side Note:  I have also noticed that touch events are slightly slower to be recognized on Android devices than iOS.   My assumption is that this is related to specific device capabilities – this is more noticeable on the Amazon Kindle Fire than the Motorola Atrix.   The delay does not appear to be a delay in the actual audio playback.

How it works

The native plugins expose a very simple API for hooking into native Audio capabilities.   The basic usage is:

  • Preload the audio asset
  • Play the audio asset
  • When done, unload the audio asset to conserve resources

The basic components of a PhoneGap native plugin are:

  • A JavaScript interface
  • Corresponding Native Code classes

You can learn more about getting started with native plugins on the PhoneGap wiki.

Let’s start by examining the native plugin’s JavaScript API.  You can see that it just hands off the JavaScript calls to the native layer via PhoneGap:

var PGLowLatencyAudio = {

    preloadFX: function (id, assetPath, success, fail) {
        return PhoneGap.exec(success, fail, "PGLowLatencyAudio", "preloadFX", [id, assetPath]);
    },

    preloadAudio: function (id, assetPath, voices, success, fail) {
        return PhoneGap.exec(success, fail, "PGLowLatencyAudio", "preloadAudio", [id, assetPath, voices]);
    },

    play: function (id, success, fail) {
        return PhoneGap.exec(success, fail, "PGLowLatencyAudio", "play", [id]);
    },

    stop: function (id, success, fail) {
        return PhoneGap.exec(success, fail, "PGLowLatencyAudio", "stop", [id]);
    },

    loop: function (id, success, fail) {
        return PhoneGap.exec(success, fail, "PGLowLatencyAudio", "loop", [id]);
    },

    unload: function (id, success, fail) {
        return PhoneGap.exec(success, fail, "PGLowLatencyAudio", "unload", [id]);
    }
};

You would invoke the native functionality by first preloading the audio files BEFORE you need them:

[js]PGLowLatencyAudio.preloadAudio('background', 'assets/background.mp3', 1);
PGLowLatencyAudio.preloadFX('explosion', 'assets/explosion.mp3');
PGLowLatencyAudio.preloadFX('machinegun', 'assets/machine gun.mp3');
PGLowLatencyAudio.preloadFX('missilestrike', 'assets/missle strike.mp3');
PGLowLatencyAudio.preloadAudio('thunder', 'assets/thunder.mp3', 1);[/js]

When you need to play an effect you just call either the play or loop functions, passing in the unique sound ID:
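For example (a sketch of mine – the PhoneGap bridge and plugin object are stubbed here so the snippet is self-contained; in a real app, phonegap.js provides PhoneGap.exec and the plugin interface shown earlier defines these methods):

```javascript
// Stand-in for the real PhoneGap bridge: records each native call
// instead of dispatching it to the device.
var calls = [];
var PhoneGap = {
  exec: function (success, fail, service, action, args) {
    calls.push({ service: service, action: action, args: args });
  }
};

// Subset of the plugin's JS interface from earlier in the post.
var PGLowLatencyAudio = {
  play: function (id, success, fail) {
    return PhoneGap.exec(success, fail, 'PGLowLatencyAudio', 'play', [id]);
  },
  loop: function (id, success, fail) {
    return PhoneGap.exec(success, fail, 'PGLowLatencyAudio', 'loop', [id]);
  }
};

PGLowLatencyAudio.play('explosion');    // one-shot effect, preloaded earlier
PGLowLatencyAudio.loop('background');   // looping track until stop() is called
```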


Next, let’s examine some intricacies of the plugin… One thing to keep in mind is that I do not have callbacks to the PhoneGap app once a media asset is loaded. If you need “loaded” callbacks, you will need to add those yourself.
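If you want a tidier async story while you are at it, one option is to wrap the callback-style methods in Promises. This is a generic sketch of mine, with the caveat above still in force: resolve() only means the native call succeeded, not that the asset finished loading – a true “loaded” signal would require changes on the native side.

```javascript
// Wrap a (args..., success, fail) callback-style method in a Promise.
function promisify(fn, thisArg) {
  return function () {
    var args = Array.prototype.slice.call(arguments);
    return new Promise(function (resolve, reject) {
      fn.apply(thisArg, args.concat([resolve, reject]));
    });
  };
}

// Example with a stand-in method shaped like preloadFX(id, path, success, fail);
// a real plugin invokes the callback asynchronously from native code.
var fakePlugin = {
  preloadFX: function (id, assetPath, success, fail) {
    success(id + ' ok');
  }
};

var preloadFX = promisify(fakePlugin.preloadFX, fakePlugin);
```

Usage: `preloadFX('explosion', 'assets/explosion.mp3').then(...)`.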

preloadFX: function ( id, assetPath, success, fail)

id – string unique ID for the audio file
assetPath – the relative path to the audio asset within the www directory
success – success callback function
fail – error/fail callback function


The preloadFX function loads an audio file into memory.  These are lower-level audio methods and have minimal overhead. These assets should be short (less than 5 seconds). These assets are fully concurrent and polyphonic.

On Android, assets that are loaded using preloadFX are managed/played using the Android SoundPool class. Sound files longer than 5 seconds may exhibit errors (not playing, clipped content, not looping) – all of which fail silently on the device (debug output will be visible if connected to a debugger).

On iOS, assets that are loaded using preloadFX are managed/played using System Sound Services from the AudioToolbox framework. Audio loaded using this function is played using AudioServicesPlaySystemSound. These assets should be short, and are not intended to be looped or stopped.

preloadAudio: function ( id, assetPath, voices, success, fail)

id – string unique ID for the audio file
assetPath – the relative path to the audio asset within the www directory
voices – the number of polyphonic voices available
success – success callback function
fail – error/fail callback function


The preloadAudio function loads an audio file into memory. These have more overhead than assets loaded via preloadFX, and can be looped/stopped. By default, there is a single “voice” – only one instance, which will be stopped & restarted when you hit play. If there are multiple voices (a number greater than 1), it will cycle through the voices to play overlapping audio. You must specify multiple voices to have polyphonic audio – keep in mind that this takes up more device resources.

On Android, assets that are loaded using preloadAudio are managed/played using the Android MediaPlayer.

On iOS, assets that are loaded using preloadAudio are managed/played using AVAudioPlayer.

play: function (id, success, fail)

id – string unique ID for the audio file
success – success callback function
fail – error/fail callback function


Plays an audio asset.  You only need to pass the audio ID, and the native plugin will determine the type of asset and play it.

loop: function (id, success, fail)

id – string unique ID for the audio file
success – success callback function
fail – error/fail callback function


Loops an audio asset infinitely.  On iOS, this only works for assets loaded via preloadAudio.  This works for all asset types for Android, however it is recommended to keep usage consistent between platforms.

stop: function (id, success, fail)

id – string unique ID for the audio file
success – success callback function
fail – error/fail callback function


Stops an audio file.  On iOS, this only works for assets loaded via preloadAudio.  This works for all asset types for Android, however it is recommended to keep usage consistent between platforms.

unload: function (id, success, fail)

id – string unique ID for the audio file
success – success callback function
fail – error/fail callback function


Unloads an audio file from memory.   DO NOT FORGET THIS!  Otherwise, you will cause memory leaks.
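One way to avoid forgetting: keep a small registry of everything you preload and tear it all down in one call. This is a sketch of mine (the plugin object here is a stand-in that records calls; in a real app you would pass PGLowLatencyAudio):

```javascript
// Track every preloaded ID so nothing is forgotten at teardown.
function AudioRegistry(plugin) {
  this.plugin = plugin;
  this.loaded = {};
}

AudioRegistry.prototype.preloadFX = function (id, path) {
  this.plugin.preloadFX(id, path);
  this.loaded[id] = true;
};

// Call this from your pause/exit handlers to release every asset.
AudioRegistry.prototype.unloadAll = function () {
  for (var id in this.loaded) {
    this.plugin.unload(id);
  }
  this.loaded = {};
};

// Stand-in plugin that just records unload calls, for illustration:
var unloaded = [];
var registry = new AudioRegistry({
  preloadFX: function () {},
  unload: function (id) { unloaded.push(id); }
});

registry.preloadFX('explosion', 'assets/explosion.mp3');
registry.preloadFX('thunder', 'assets/thunder.mp3');
registry.unloadAll();
```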

I’m not just doing this for myself – the code is completely open source for you to take advantage of as well. You can download the full code, as well as all of the examples, from GitHub at:



2012 Flex User Group Tour: North America Dates Announced

If you hadn’t heard already, Flex has been accepted by the Apache Software Foundation. If you are wondering what exactly this means, or you have asked yourself any of the following questions, then you might want to check out the Adobe Flex User Group Tour:

  • Do you want to learn more about the future of Flex?
  • Do you want to learn more about the Flex transition to the Apache Software Foundation?
  • How can you contribute and help make Flex thrive?
  • Do you have questions that you would like to voice to Adobe?

As promised, Adobe is kicking off the Flex User Group Tour to discuss recent events surrounding Flex and the Flash Platform. These meetings are intended to help you understand the changes happening with Flex and Flash and the impact on related tools, as well as to educate about the process & transition to Apache. You can learn more about the user group tour and get an up-to-date listing of dates & cities from the Flex Team blog – be sure to check back periodically for updates. Initial cities include New York, Boston, Denver, Seattle, Los Angeles, San Diego, and Dallas. Expect more cities & countries to be announced at a later date.

We hope to see you at one of the upcoming events. I’m scheduled to speak at the Dallas event in April, and I hope to see you there!

Mobile Web & PhoneGap HTML Dev Tips

Recently I’ve been spending a fair amount of time working on HTML-based applications – both mobile web and mobile applications using PhoneGap.   Regardless of whether you are targeting a mobile web browser or a mobile app using the PhoneGap container, you are still targeting a mobile web browser instance.  If you haven’t noticed, mobile web browsers can often have peculiarities with how content is rendered, or how you interact with that content.   This happens regardless of platform – iOS, Android, BlackBerry, etc…  All have quirks.  Here are a few tips that I have found useful for improving overall interaction and mobile HTML experiences.

Disclaimer: I’ve been targeting iOS and Android primarily, with BlackBerry support on some applications.  I don’t have a Windows Phone device to test with, so I can’t comment on support for the Windows platform.

AutoCorrect and AutoCapitalize

First things first: autocorrect and autocapitalize on Apple’s iOS can sometimes drive you to the brink of insanity.  This is especially the case if you have a text input where you are typing in a username, and it keeps “correcting” it for you (next thing you know, you are locked out of the app).   You can disable these features in web experiences by setting the “autocorrect” and “autocapitalize” attributes of an <input> instance.

Disabled AutoCorrect:
[html]<input type="text" autocorrect="off" autocapitalize="on" />[/html]

Disabled AutoCapitalize:
[html]<input type="text" autocorrect="on" autocapitalize="off" />[/html]

Managing the Keyboard

Have you ever used an app or web site on a mobile device where you have to enter numeric data, and the default keyboard pops up? Before entering any text, you have to switch to the numeric input. Repeat that for 100 form inputs and try to tell me that you aren’t frustrated… Luckily, you can manage the keyboard in mobile HTML experiences very easily using HTML5 form elements.

Default Keyboard: Supported Everywhere [html]<input style="width: 400px;" type="text" value="default" />[/html]

Numeric Keyboard: Supported on iOS, Android & BlackBerry (QNX) [html]<input style="width: 400px;" type="number" value="numeric" />[/html]

Numeric Keyboard: Supported on iOS [html]<input style="width: 400px;" type="text" pattern="[0-9]*" value="numeric" />[/html]

Phone Keyboard: Supported on iOS [html]<input style="width: 400px;" type="tel" value="telephone" />[/html]

URL Keyboard: Supported on iOS & BlackBerry (QNX) [html]<input style="width: 400px;" type="url" value="url" />[/html]

Email Keyboard: Supported on iOS & BlackBerry (QNX) [html]<input style="width: 400px;" type="email" value="email" />[/html]

Disable User Selection

One way to easily determine that an application is really HTML is that everything on the UI is selectable and can be copied/pasted – Every single piece of text, every image, every link, etc… Not only is this annoying in some scenarios (and very useful in others), but there may be instances where you explicitly don’t want the user to be able to easily copy/paste content. You can disable user selection by applying the following CSS styles. Note: This works on iOS, and partially works on BlackBerry/QNX for the PlayBook. It did not work on Android in my testing.

* {
    -webkit-touch-callout: none;
    -webkit-user-select: none;
}

The -webkit-touch-callout css rule disables the callout, and the -webkit-user-select rule disables the ability to select content within an element. More details on webkit css rules from the Mobile Safari CSS Reference. More detail about disabling copy/paste on iOS is available at

Disable Zoom

If you want your content to feel like an app instead of a web page, then I strongly suggest that you disable gestures for pinch/zoom and panning for all use cases where pinch/zoom is not required. The easiest way to do this is to set the viewport size to device-width and disable user scaling through the HTML metadata tag.

[html]<meta name="viewport" content="width=device-width, user-scalable=no" />[/html]

You can read further detail on the viewport metadata tag from the Apple Safari HTML Reference, or the Mozilla reference.

On a Phone? Integrate With It

Your application can dial phone numbers very easily. Just use a standard web location, but use the “tel:<phonenumber>” URI format.

Test it with Apple Customer Support: 800-275-2273 [html]<a href="tel:800-275-2273">800-275-2273</a>[/html]

This technique works on both Android and iOS devices, and I assume other platforms. However, I don’t have the devices to test all of them.
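If your data source stores human-formatted numbers, a tiny helper (my own, not from any library) can normalize them into tel: URIs by stripping everything except digits and the leading plus:

```javascript
// Build a tel: URI from a human-formatted phone number.
function telHref(number) {
  var cleaned = number.replace(/[^\d+]/g, ''); // keep digits and "+"
  return 'tel:' + cleaned;
}

telHref('800-275-2273');       // "tel:8002752273"
telHref('+1 (800) 275-2273');  // "tel:+18002752273"
```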

Touch Based Scrolling

Touch-based scrolling is critical to an application that feels native. I don’t mean that the whole page should be able to scroll – your browser can take care of that alone. Instead, I mean that you should be able to scroll individual elements so that they mimic clipped views, lists, or large blocks of content. You should be able to scroll content where it is, and not have to scroll an entire page to reveal something in only one area of the screen. Minimize scrolling wherever it may cause poor UX; this is especially the case in tablet-based applications, which have a larger UI than phone-based applications.

Luckily, this is also really easy. I personally prefer the open source iScroll JavaScript library. iScroll works really well on iOS, Android and BlackBerry – I haven’t tested other platforms, but you can test them out yourself:

Remove “click” Delays

“Click” events on HTML elements on mobile devices generally have a delay caused by the operating system logic used to capture gestural input from touch events. Depending on the device, this could be 300–500 ms. While this doesn’t sound like much, it is very noticeable. The workaround is to use touch events instead of mouse events: touchStart, touchMove, touchEnd. There’s also a great script from cubiq that adds touch events for you to optimize the experience for onClick event handlers on iOS devices.
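A minimal sketch of that workaround (names are mine; the element here is duck-typed so the sketch runs outside a browser – in a page you would pass a real DOM element):

```javascript
// Fire the handler on touchend (no gesture-detection wait), keep click
// as a fallback for non-touch input, and swallow the synthetic "ghost"
// click that browsers fire after a touch.
function fastClick(el, handler) {
  var usedTouch = false;
  el.addEventListener('touchend', function (e) {
    usedTouch = true;
    handler(e);                    // fires immediately
  });
  el.addEventListener('click', function (e) {
    if (usedTouch) { usedTouch = false; return; } // ignore ghost click
    handler(e);                    // mouse / non-touch fallback
  });
}

// Stand-in element that records listeners, for illustration:
function FakeElement() { this.listeners = {}; }
FakeElement.prototype.addEventListener = function (type, fn) {
  (this.listeners[type] = this.listeners[type] || []).push(fn);
};
FakeElement.prototype.dispatch = function (type, e) {
  (this.listeners[type] || []).forEach(function (fn) { fn(e); });
};

var taps = 0;
var button = new FakeElement();
fastClick(button, function () { taps++; });
button.dispatch('touchend', {}); // touch device: handler runs once
button.dispatch('click', {});    // the trailing ghost click is ignored
```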

Add To Home Screen

If you want your web app to feel like a real app and take up the full screen without using PhoneGap as an application container, then you can always add it to the device’s home screen. Although this can only be done manually through the mobile browser, there are a few open source scripts, such as mobile-bookmark-bubble, to guide the user through this process.

Use Hardware Acceleration

Animations will generally be smoother and faster if your content is hardware accelerated (and the device supports hardware acceleration). You can make HTML elements hardware accelerated just by adding the translate3d(x,y,z) CSS style to the element (be sure to set all three x, y, and z values, otherwise hardware acceleration may not be applied). If you don’t want any translation changes, you can use the translate3d CSS rule with all zero values: translate3d(0,0,0).
[css]transform: translate3d(0,0,0);
-webkit-transform: translate3d(0,0,0);[/css]

In your development/testing, you can even visualize which content is hardware accelerated in both desktop and mobile Safari using the technique shown at

Make Your Apps Fast

Last, but certainly not least, make your apps fast. Follow best practices, and be efficient in code execution and the loading of assets (both local and remote). Here are a few links to get you going in the right direction:

I hope these get you moving in the right direction! If you have read this, and aren’t sure what it all means, check out the Adobe Developer Connection to ramp up on HTML5, or to see what HTML5 & CSS3 can do.

Multi-Screen iOS Apps with PhoneGap

Did you know that apps built on top of iOS can have a multi-screen workflow? For example, in Keynote you can have an external screen show a presentation while you control it on your iOS device. In the Jimi Hendrix app, you can view the audio player on an external screen, and in Real Racing HD, you can view the game on an external screen while the iOS device becomes your controller – among others.

Real Racing HD

This is all made possible by the UIWindow and UIScreen APIs in iOS. Even better, on the iPad 2 and iPhone 4S, this can be done wirelessly using AirPlay with an Apple TV device. On other iOS devices, you can have a second screen using a VGA output.

One of the benefits of using a cross-platform solution like PhoneGap or Flex/AIR is that you can build apps with an easier-to-use, more familiar paradigm. However, cross-platform runtimes don’t always offer access to every API feature that native development enables.

Out of the box, PhoneGap apps are confined to a single screen.  You can use screen mirroring to mirror content on an external screen, but you can’t have a second screen experience.  It’s a good thing you can write native plugins/extensions to enable native functionality within your applications.

ExternalScreen Native Plugin For PhoneGap

I recently did exactly that… I created a PhoneGap native plugin that enables second screen capability for PhoneGap applications.   The plugin listens for external screen connection notifications, and if an additional screen is available, it creates a new UIWebView for HTML-based content in the external screen – complete with functions for injecting HTML, JavaScript, or URL locations.


You might be wondering why you would want this plugin within PhoneGap… It enables the multi-screen experiences described in the apps mentioned above, extending the interactions and capabilities of the mobile hardware. With this PhoneGap native plugin, you can create rich multi-screen experiences with the ease of HTML and JavaScript. Here are a few ideas of the types of apps that you can build with this approach (scroll down for source code):

Fleet Manager

Let’s first consider a simple fleet manager application which allows you to monitor vehicles in a mobile app. This is a concept similar to ones I’ve used in previous examples. The basic functionality allows you to see information about your fleet on the tablet. What if this app connected to a larger screen and was able to display information about your vehicles for everyone to see? Watch the video below to see this in real life.

This application example is powered by Google Maps, and all of the data is randomly generated on the client.

Law Enforcement

Let’s next consider a mobile law enforcement application which gives you details to aid in investigations and the apprehension of criminals. Let’s pretend that you are a detective searching for a fugitive, and you walk into a crowded bar near the last known location of that fugitive. You connect to the bar’s Apple TV on their big screen TV, pull up images and videos of the suspect, then say “Have you seen this person?”. This could be incredibly powerful. Check out the video below to see a prototype in real life.

This law enforcement demo scenario is a basic application powered by the FBI’s most wanted RSS data feeds.

Tip Of The Iceberg

There are lots of use cases where a second screen experience could be beneficial and create a superior product or application.   Using PhoneGap allows you to build those apps faster & with the ease of HTML and JavaScript, using traditional web development paradigms.

How It Works

Now, let’s review what makes this all work…   The client interfaces for both of these samples are written in HTML & JavaScript, and utilize jQuery, iScroll, and Modernizr, with a trick for removing link click delay on iOS devices.

The PhoneGap native plugin is written in Objective-C, with a JavaScript interface to integrate with the client application. PhoneGap plugins are actually very easy to develop. Basically, you have to write the native code class, write a corresponding JS interface, and add a mapping in your PhoneGap.plist file to expose the new functionality through PhoneGap. There is a great reference on the PhoneGap wiki for native plugins which includes architecture & structure, as well as platform-specific authoring and installation of those plugins. Here are quick links to the iOS-specific native plugin content authoring and installation.
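For reference, the PhoneGap.plist mapping is just one entry in the Plugins dictionary (a sketch of mine, assuming the class name PGExternalScreen used in this post):

```xml
<!-- PhoneGap.plist, inside the existing "Plugins" dictionary:
     maps the JS service name to the Objective-C class name -->
<key>PGExternalScreen</key>
<string>PGExternalScreen</string>
```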

The ExternalScreen plugin creates a UIWebView for the external screen, and exposes methods for interacting with that UIWebView. Note: this is just a normal UIWebView; it does not have support for all the PhoneGap libraries… just a standard HTML container.

You can read up on multi-screen programming at iOS from these useful tutorials:

Now let’s first examine the native code:


The header file shows the method signatures for the native functionality.  The corresponding PGExternalScreen.m contains all of the actual code to make it all work.   Note: If you are using ARC (Automatic Reference Counting), you will need to remove the retain/release calls in PGExternalScreen.m.

@interface PGExternalScreen : PGPlugin {
    NSString* callbackID;
    UIWindow* externalWindow;
    UIScreen* externalScreen;
    UIWebView* webView;
    NSString* baseURLAddress;
}

@property (nonatomic, copy) NSString* callbackID;

//Public Instance Methods (visible in PhoneGap API)
- (void) setupScreenConnectionNotificationHandlers:(NSMutableArray*)arguments withDict:(NSMutableDictionary*)options;
- (void) loadHTMLResource:(NSMutableArray*)arguments withDict:(NSMutableDictionary*)options;
- (void) loadHTML:(NSMutableArray*)arguments withDict:(NSMutableDictionary*)options;
- (void) invokeJavaScript:(NSMutableArray*)arguments withDict:(NSMutableDictionary*)options;
- (void) checkExternalScreenAvailable:(NSMutableArray*)arguments withDict:(NSMutableDictionary*)options;

//Instance Methods
- (void) attemptSecondScreenView;
- (void) handleScreenConnectNotification:(NSNotification*)aNotification;
- (void) handleScreenDisconnectNotification:(NSNotification*)aNotification;

@end


The PGExternalScreen.js file defines the native methods that are exposed through PhoneGap.   You invoke the function, and can add success/fail callback function references.

[js]var PGExternalScreen = {

    setupScreenConnectionNotificationHandlers: function (success, fail) {
        return PhoneGap.exec(success, fail, "PGExternalScreen", "setupScreenConnectionNotificationHandlers", []);
    },

    loadHTMLResource: function (url, success, fail) {
        return PhoneGap.exec(success, fail, "PGExternalScreen", "loadHTMLResource", [url]);
    },

    loadHTML: function (html, success, fail) {
        return PhoneGap.exec(success, fail, "PGExternalScreen", "loadHTML", [html]);
    },

    invokeJavaScript: function (scriptString, success, fail) {
        return PhoneGap.exec(success, fail, "PGExternalScreen", "invokeJavaScript", [scriptString]);
    },

    checkExternalScreenAvailable: function (success, fail) {
        return PhoneGap.exec(success, fail, "PGExternalScreen", "checkExternalScreenAvailable", []);
    }
};[/js]


The Client

You can call any of these functions from within your PhoneGap application’s JavaScript just by referencing the exposed method on the PGExternalScreen instance.

[js]// check if an external screen is available
PGExternalScreen.checkExternalScreenAvailable( resultHandler, errorHandler );

//load a local HTML resource
PGExternalScreen.loadHTMLResource( 'secondary.html', resultHandler, errorHandler );

//load a remote HTML resource (requires the URL to be white-listed in PhoneGap)
PGExternalScreen.loadHTMLResource( '', resultHandler, errorHandler );

//load an HTML string
PGExternalScreen.loadHTML( 'this is html content', resultHandler, errorHandler );

//invoke JavaScript (passed as a string)
PGExternalScreen.invokeJavaScript( 'document.write(\'hello world\')', resultHandler, errorHandler );[/js]

The full code for the ExternalScreen PhoneGap native plugin, as well as both client applications and a basic usage example is available on github at:

Be sure to read the README for additional setup information.

(Update: source code link changed)