
So, What is IBM MobileFirst?

I’m still “the new guy” on the MobileFirst team here at IBM, and right away I’ve been asked by peers outside of IBM: “So, what exactly is MobileFirst/Worklight?  Is it just for hybrid apps?”

In this post I’ll try to shed some light on IBM MobileFirst, and for starters, it is a lot more than just hybrid apps.


IBM MobileFirst Platform is a suite of products that enable you to efficiently build and deliver mobile applications for your enterprise, and is composed of three parts:

IBM MobileFirst Platform Foundation

IBM MobileFirst Platform Foundation (formerly known as Worklight Foundation) is a platform for building mobile applications for the enterprise.  It is a suite of tools and services available either on-premise or in the cloud, which enable you to rapidly build, administer, and monitor secure applications.

The MobileFirst Platform Foundation consists of:

  1. MobileFirst Server – the middleware tier that provides a gateway between back-end systems and services and the mobile client applications.  The server enables application authentication, data endpoints/services, data optimization and transformation, push notification management (streamlined API for all platforms), consolidated logging, and app/services analytics. For development purposes, the MobileFirst Server is available either as part of MobileFirst Studio (discussed below) or through the command-line tools.

  2. MobileFirst API - both client and server-side APIs for developing and managing your enterprise mobile applications.
    • The server-side API enables you to expose data adapters to your mobile applications – these adapters can consume data from SQL databases, REST or SOAP services, or JMS data sources (see the sketch after this list). The server-side API also provides a built-in security framework, unified push notifications (across multiple platforms), and data translation/transformation services. You can leverage the server-side API in JavaScript, or dig deeper and use the Java implementation.
    • The client-side API is available for native iOS (Objective-C), native Android (Java), J2ME, native Windows Phone (C#), and JavaScript for cross-platform hybrid or mobile-web applications. For the native implementations, this includes user authentication, encrypted storage, push notifications, logging, geo-notifications, data access, and more. For hybrid applications, it includes everything from the native API, plus cross-platform native UI components and platform-specific application skinning. With the hybrid development approach, you can even push updates to applications that are live, out on devices, without having to push an update through an app store. Does the hybrid approach leverage Apache Cordova? YES.

  3. MobileFirst Studio - an optional all-inclusive development environment for developing enterprise apps on the MobileFirst platform.  This is based on the Eclipse platform, and includes an integrated server, development environment, facilities to create and test all data adapters/services, a browser-based hybrid app simulator, and the ability to generate platform-specific applications for deployment.  However, using the studio is not required! Try to convince a native iOS (Xcode) developer that they have to use Eclipse, and tell me how that goes for you… :)  If you don’t want to use the all-inclusive studio, no problem.  You can use the command line tools (CLI).  The CLI provides a command line interface for managing the MobileFirst server, creating data adapters, creating the encrypted JSON store, and more.

  4. MobileFirst Console – the console provides a dashboard and management portal for everything happening within your MobileFirst applications.  You can view which APIs and adapters have been deployed, set app notifications, manage or disable your apps, report on connected devices and platforms, monitor push notifications, view analytics information for all services and adapters exposed through the MobileFirst server, and manage remote collection of client app logs.  All together, an extremely powerful set of features for monitoring and managing your applications.

  5. MobileFirst Application Center - a tool to make sharing mobile apps easier within an organization.  Basically, it’s an app store for your enterprise.
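To make the adapter concept from item 2 above a bit more concrete, here is a minimal sketch of what a Worklight-style HTTP adapter procedure and its client-side invocation can look like. The adapter name ("newsAdapter"), procedure name, and back-end path are hypothetical, and the exact project setup varies by product version, so treat this as an illustration rather than official documentation.

// Server-side adapter procedure (JavaScript) in a hypothetical "newsAdapter"
function getNews() {
    var input = {
        method: 'get',
        returnedContentType: 'json',
        path: 'api/news'   // hypothetical back-end endpoint
    };
    // Invoke the back-end HTTP service and return its response to the caller
    return WL.Server.invokeHttp(input);
}

… and the corresponding client-side call using the JavaScript client API:

// Client-side invocation (hybrid or mobile-web app)
var invocationData = {
    adapter: 'newsAdapter',
    procedure: 'getNews',
    parameters: []
};

WL.Client.invokeProcedure(invocationData, {
    onSuccess: function (response) {
        console.log('News items:', response.invocationResult);
    },
    onFailure: function (error) {
        console.log('Adapter call failed:', error);
    }
});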

MobileFirst Platform Application Scanning

MobileFirst Platform Application Scanning is set of tools that can scan your JavaScript, HTML, Objective-C, or Java code for security vulnerabilities and coding best practices.  Think of it as a security layer in your software development lifecycle.


MobileFirst Quality Assurance

MobileFirst Quality Assurance is a set of tools and features to help provide quality assurance to your mobile applications.  It includes automated crash analytics, user feedback and sentiment analysis, in-app bug reporting, over-the-air build distribution to testers, test/bug prioritization, and more.


So, is MobileFirst/Worklight just for hybrid (HTML/JS) apps? You tell me… if you need clarification or more information, please re-read this post and follow all the links.  ;)

 

DevNexus 2014 PhoneGap Presentations

I’ve just wrapped up my presentations for this year’s DevNexus event in Atlanta – it has been a great event, filled with tons of information on web, mobile, and back-end development. I had three sessions on PhoneGap: one intro, one advanced, and a mobile frameworks panel.

Below are my presentations.  I didn’t record them this time, since they were being recorded by the conference organizers, so expect to see a video once they’re released.

Just press the space bar, or use the arrow keys to view the presentation in your browser.

Getting Started with PhoneGap and Cross Platform Mobile Development

View Presentation …


(Lesson learned: never make changes to your presentation/environment after midnight when you have the first session of the day – it will always bite you.)

Designing & Architecting for PhoneGap & the Mobile Web

View Presentation …


Enjoy, and feel free to reach out with any questions!

PhoneGap Presentations from HTML5DevConf

I was searching the web earlier this week for an older presentation from a few months back, and just happened to stumble across my recent presentations from HTML5DevConf this past October. Looks like the videos were posted in November, but I’m just seeing them now. I had two sessions: Designing and Architecting PhoneGap and Mobile Web Apps and Getting Started with PhoneGap and Cross-Platform Mobile Development, and if you weren’t able to attend them, you’re still in luck! Here are the videos from those sessions:

Designing and Architecting PhoneGap and Mobile Web Apps

Tired of Hello World? In this session, we explore best practices to build real-world PhoneGap applications. We investigate the Single Page Architecture, HTML templates, effective touch events, performance techniques, modularization, and more. We also compare and contrast the leading JavaScript and mobile frameworks. This session is a must if you plan to build a PhoneGap application with more than a couple of screens.

Getting Started with PhoneGap and Cross-Platform Mobile Development

Unfortunately, I ran into network issues which prevented some of my samples from working in this one, but you’ll still be able to get the point.

HTML has emerged as a powerful alternative to “native” to enable cross-platform mobile application development. In this session, you learn how to leverage your existing HTML and JavaScript skills to build cross-platform mobile applications, how to access the device features (camera, accelerometer, contacts, file system, etc) using JavaScript APIs, and how to package your HTML application as a native app for distribution through the different app stores.
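As a small taste of the device API access covered in that session, here is a minimal sketch using the PhoneGap/Cordova camera plugin to capture a photo and display it. It assumes the camera plugin is installed and that the page contains an <img> element with the id "photo" (my own hypothetical markup).

// Wait for PhoneGap/Cordova to finish initializing before using device APIs
document.addEventListener('deviceready', function () {
    // Capture a photo and receive it as a base64-encoded string
    navigator.camera.getPicture(onPhotoSuccess, onPhotoFail, {
        quality: 50,
        destinationType: Camera.DestinationType.DATA_URL
    });
}, false);

function onPhotoSuccess(imageData) {
    // Show the captured photo in the hypothetical <img id="photo"> element
    document.getElementById('photo').src = 'data:image/jpeg;base64,' + imageData;
}

function onPhotoFail(message) {
    console.log('Camera failed: ' + message);
}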

You can also check out highlights from HTML5DevConf and find my presentation assets and materials online here.

More Device Motion Experiments with HTML & Adobe DPS

As a follow-up to my last post on 3D parallax effects in HTML or Adobe DPS, I’ve decided to release some of the other experiments that I’ve been exploring with device motion in DPS publications. Check out the video below to see two new samples, and a corrected version of the strawberries example from my last post (the plants were moving the wrong way in that post).

All three of these samples leverage the same basic technique for responding to device motion inside of a DPS publication. The motion-interactive components are implemented using HTML and JavaScript, and are included in publications as web content overlays. The JavaScript uses the ondeviceorientation event handler to respond to the physical orientation of the device.

In all three samples, the web content overlay is set to autoplay, with user interaction disabled. This way the HTML & JavaScript load automatically and the scripting is active, but it doesn’t block interaction or gestures for DPS navigation. I also enabled “Scale Content To Fit” so that the HTML content scales appropriately between retina and non-retina devices.

Web Content Overlay Options in Adobe InDesign

Strawberries

The strawberries sample is identical to the one from my previous post; this is just a capture of the updated motion. You can access the full source project for this sample at:
https://github.com/triceam/DPS-HTML-Samples/tree/master/strawberries


Adobe San Francisco

The Adobe/inline content example is implemented in the same manner as the strawberries example. The large city image is a two-layer composition created with Adobe Edge Animate: the foreground building and flag move independently from the background image. I used Photoshop to separate the content into layers and made them animate based on device orientation in exactly the same fashion as the strawberries sample. All of the text and image content surrounding the cityscape panorama is laid out with InDesign.
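The actual composition was built in Edge Animate, but the underlying idea is easy to sketch in plain JavaScript: two stacked layers that shift horizontally at different rates as the device tilts. The element ids and pixel ranges below are hypothetical and only illustrate the technique.

// Hypothetical markup: #background and #foreground are absolutely positioned,
// stacked elements that fill the viewport.
window.ondeviceorientation = function (event) {
    // gamma is the left/right tilt in degrees (-90 to 90); normalize to -1..1
    var tilt = event.gamma / 90;

    // The background drifts slightly while the foreground moves further in the
    // opposite direction, which creates the illusion of depth.
    var backgroundShift = tilt * 20;   // pixels
    var foregroundShift = -tilt * 60;  // pixels

    document.getElementById('background').style.transform =
        'translateX(' + backgroundShift + 'px)';
    document.getElementById('foreground').style.transform =
        'translateX(' + foregroundShift + 'px)';
};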


You can check out the Adobe Edge Animate project at:
https://github.com/triceam/DPS-HTML-Samples/tree/master/adobe%20roof

AT&T Park/San Francisco Giants

The AT&T Park/San Francisco Giants example is implemented with basic HTML and JavaScript; no additional tools were used to create this interactive scenario. The content on the left-hand side was all laid out with InDesign. The content on the right side is the interactive HTML.


The image used in this example is a vertical panorama captured from a remote-control helicopter. The image contains various perspectives that have been composited in Photoshop. The motion of the device is aligned to match the perspectives in the image/viewport: when the device is facing down, the image looks down, and when the device is vertical, the image faces forward. You can check out the vertical panorama image below. If you’re interested in creating a vertical panorama, be sure to check out this tutorial from Russell Brown.

Vertical Panorama over AT&T Park

The HTML and JavaScript used in this example are fairly minimal. The image is applied as the background of the root HTML <body> element, and the position of the background is shifted based upon the device orientation event. This approach keeps the HTML DOM as flat and simple as possible.

Here’s the HTML that makes up this example:

<html>
    <head>
        <meta charset="utf-8">
        <meta name="viewport" content="initial-scale=1, maximum-scale=1, user-scalable=no">
        <title>Parallax Vertical Background</title>
        <link rel="stylesheet" href="assets/styles.css">
        <script src="assets/zepto.js"></script>
    </head>
    <body></body>
</html>

… and the CSS styles used to apply the background image.

body {
    background-image:url('att_park_vert_panorama.jpg');
    background-position: center;
}

… and the JavaScript used to shift the position based on the device orientation event. Note: this also uses the Zepto JavaScript library.

window.ondeviceorientation = function(event) {
    var gamma = event.gamma / 90;
    var beta = event.beta / 180;
    var temp = 0;

    // shift values/motion based upon device orientation
    switch (window.orientation) {
        case 90:
            temp = gamma;
            gamma = beta;
            beta = temp;
            break;
        case -90:
            temp = -gamma;
            gamma = beta;
            beta = temp;
            break;
    }

    // update positions to be used for CSS
    var yPosition = 1200 - (beta * 2200);
    var xPosition = -200 + (gamma * 300);
    xPosition = Math.max(-300, Math.min(0, xPosition));
    yPosition = -Math.max(100, Math.min(1400, yPosition));
    //console.log(xPosition, yPosition);

    // apply css styles
    var css = xPosition + "px " + yPosition + "px";
    $("body").css("background-position", css);
};

Textual content used in this example is from: http://en.wikipedia.org/wiki/At%26t_park

Source code for the device motion in this example is available at: https://github.com/triceam/DPS-HTML-Samples/tree/master/ATT%20Park

All of the HTML, CSS and JavaScript code used for these examples is available in the GitHub repository at: https://github.com/triceam/DPS-HTML-Samples

Photorealistic 3D Parallax Effects in HTML or Adobe DPS with Adobe Photoshop and Adobe Edge Animate

A few weeks ago, a colleague at Adobe showed me a DPS publication that had an amazing design. All of the content looked great by itself, but what really made parts of it “pop” was that in certain areas there was a 3D parallax effect, which made it feel like you were looking into an image that had depth. You could rotate the device and see what’s hiding behind a person, or around the corner of a building.

Here’s what I mean… on the surface the image looked static, but as I rotated it, elements shifted to give the illusion of depth. The background and foreground elements all moved at different rates:

3D Parallax Effects on a Device

I thought this was an incredible example of added interactivity and immersive experiences, and it’s not really that difficult to implement. In fact, I put together this tutorial to show exactly how you can create these types of effects in your own compositions.

To create this kind of effect, the first thing you need to do is break apart an image into layers – note: you may need to synthesize edges so that there is overlap in all transparent areas. Then you need to add interactivity in HTML: align those images so that their default state looks just like the still image, then move the images based upon the device orientation. I move the foreground one way, keep the middle content more or less stationary, and move the background content in the opposite direction (all based upon which way you are rotating the mobile device). Since this is all HTML, you can take this content and use it on the web, or import it into Adobe InDesign to export a DPS digital publication.

Step 1: Create Layered Images

You can either create your own layers or break apart an existing image into layers, so that each individual layer can be placed on top of the others to form a seamless composition. In this case, I separated the strawberries, the rows of plants, my daughter, and the sky into separate layers.

Break Apart Layers in Photoshop

To achieve this, I used the following in Photoshop:

Yes, I did this quickly, and there are still some artifacts visible from the layering process.  :)

Step 2: Create Edge Animate Composition

Next, pull all of those images into an Edge Animate composition so you can create the parallax behavior on the timeline. I actually used the exact same technique that fellow Adobe evangelist Paul Trani uses in his parallax scrolling example.

Edge Animate Composition

The only difference in mine is that I added some simple HTML and JavaScript to handle device-specific behaviors:

An HTML meta tag to the root HTML file to prevent device scaling:

<meta name="viewport" content="user-scalable=no, width=device-width"/>

JavaScript to disable touch interactions (prevents touch scrolling):

document.addEventListener('touchstart', function(event){
   event.preventDefault();
   return false;
});

JavaScript to handle device orientation – this jumps to a specific point in time in the timeline animation based on the device orientation:

window.ondeviceorientation = function(event) {
    // default (landscape): use the front/back tilt angle
    var delta = Math.round(event.beta);

    // in portrait orientations (0 or 180), use the left/right tilt angle instead
    switch (window.orientation) {
        case 0:
            delta = Math.round(event.gamma);
            break;
        case 180:
            delta = -Math.round(event.gamma);
            break;
    }

    // map the tilt to a point on the timeline and jump to it
    var position = 15000 + (delta * 400);
    position = Math.floor(position);
    sym.stop(position);
    console.log(position);
};

Update 1/7/2014: I added logic to support both landscape and portrait orientation.

Be sure to add both of those JavaScript snippets inside of the creationComplete event for the Stage.  I also over-exaggerated the movement in the timeline.  I think it would look better with slightly less (more subtle) movement.

At this point, you could publish the composition and use it on the web – there’s nothing stopping you at all. In fact, you can check it out here: just load it on an iPad and rotate the device to see the effect. However, please keep in mind that 1) I haven’t added a preloader, 2) the assets are non-optimized and are all retina size, 3) I don’t have it auto-scaling for the viewport size, so it will only look right on a retina iPad, and 4) I have only tested this on an iPad – no other devices.

Note: You could also do this without using Edge Animate, but you’d have to hand code the HTML/JS for it.

You can download the source for the Edge Animate project here.

Step 3: Include in InDesign/DPS Composition

To include this in a DPS publication, all that you need to do is export an Animate Deployment Package (.oam file) from Adobe Edge Animate. You can then just drag and drop this into InDesign for inclusion in a DPS publication.

Including the Animation in an InDesign layout for DPS

Be sure to check out the DPS Getting Started Guide to learn more about DPS, and check out the docs on Web Content Overlays to learn about HTML usage inside of DPS publications.

If you aren’t already a member of Creative Cloud, join today to take advantage of all of our creative tools!

Update: After publishing this I realized that the movement of the plants should actually be reversed.  If you view this link, you’ll see the updated motion (which looks more realistic), but I can’t update the video that’s already been published.