Recently I’ve undertaken some explorations with fellow evangelist Kevin Hoyt, trying to determine how far we can push PhoneGap applications with devices and physical computing. It turns out you can push things really far, and now I’m delighted to share one of the experiments we’ve been pursuing.
I’ve been asked on more than one occasion: can you access Bluetooth devices in PhoneGap applications? The answer is yes, you can. There is no specific “Bluetooth API” in PhoneGap; however, you can create native plugins to access any native library. Basically, with a native plugin you create the native interface (written in the native language) and a JavaScript interface. The native and JavaScript interfaces can leverage PhoneGap’s native-to-JavaScript bridge for bidirectional communication.
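To make the plugin pattern concrete, here is a minimal sketch of what the JavaScript half of a PhoneGap native plugin looks like. The plugin name (“EchoPlugin”), action name, and callback wiring are illustrative assumptions, not code from this project:

```javascript
// Minimal sketch of the JavaScript half of a PhoneGap native plugin.
// "EchoPlugin" and the "echo" action are hypothetical names.
function echo(message, onSuccess, onError) {
    // cordova.exec marshals the call across the native-to-JavaScript
    // bridge to the native class registered under "EchoPlugin".
    cordova.exec(onSuccess, onError, "EchoPlugin", "echo", [message]);
}
```

The native class registered under the same name receives the action and arguments, does its work, and invokes the success or error callback back across the bridge.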
In this exploration, we researched whether or not you can use a pressure-sensitive stylus with a PhoneGap application. Again, the answer is yes, you can. Check out the video below to see a sample application in action. This example demonstrates the use of a TenOne Pogo Connect stylus inside of a PhoneGap application.
Note: This is not connected to Project Mighty in any way – Kevin and I started exploring completely separately from the big announcements at MAX.
The Pogo Connect stylus leverages Bluetooth 4 Smart (low energy) connectivity to communicate with the device, and provides pressure sensitivity and a physical button for the user to interact with. The JavaScript interface doesn’t interact directly with the Bluetooth connection. Instead, I leveraged TenOne’s Pogo Connect SDK and created a JavaScript bridge layer to delegate pen interaction from the SDK to the JavaScript layer.
There were definitely a few tricks to get this working. First, the SDK is designed to accept touch input at the native layer and determine whether or not that touch is from the pen. When using the SDK (in Objective-C), you are supposed to implement the touchesBegan, touchesMoved, touchesEnded, and touchesCancelled functions for a view:
[objc]- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event;[/objc]
In those functions, you check if any of the touches are pens, and if so, do something with that pen input.
[objc]- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        if ([pogoManager touchIsPen:touch]) {
            // do something with the pen
        }
    }
}[/objc]
The first catch with implementing this inside of a PhoneGap application is that you can’t override the touch handlers of a web view on iOS. Luckily, there is another way! I leveraged a UIGestureRecognizer instance to intercept the touch input received by the web view, determine if the touches were from the Pogo Connect stylus, and if so, delegate that input back to the JavaScript layer.
UIGestureRecognizer is normally used as a base class for creating custom gestures in iOS applications. It has everything you need to handle touch input, and it gives you that information without having to subclass an actual view. This means you can attach it to any UIView instance. Since you don’t have to subclass a view, this can be implemented as a PhoneGap native plugin without having to modify *any* code inside the PhoneGap framework.
So, here’s how I did it… Once the PhoneGap plugin is initialized on load, I create a gesture recognizer instance and attach it to the PhoneGap application’s web view. Whenever touch input is received by the gesture recognizer, that input is passed back to the PhoneGap plugin instance, which executes JavaScript on the web view to pass that information to the JavaScript layer in real time as it is received.
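The “executes JavaScript on the web view” step can be sketched from the JS side. A function like the one below could be invoked by the native layer (for example via UIWebView’s stringByEvaluatingJavaScriptFromString:); the function name is an assumption for illustration, and it uses the older createEvent/initCustomEvent API that UIWebView-era WebKit supports:

```javascript
// Hypothetical JS-side receiver that the native gesture recognizer
// calls with a pen event type and a pen-state detail object.
function dispatchPenEvent(type, detail) {
    // Older-style CustomEvent creation (pre-CustomEvent constructor),
    // appropriate for the UIWebView of this era.
    var event = document.createEvent("CustomEvent");
    event.initCustomEvent(type, false, false, detail);
    document.dispatchEvent(event);
}
```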
On the JavaScript layer, stylus/pen input is dispatched to the application as custom events on the window.document object. To subscribe to pen input in JavaScript, you just add event listeners for the Pogo Connect events that I defined in the native plugin’s JavaScript file.
[js]document.addEventListener( pogoConnect.PEN_TOUCH_BEGIN, app.penTouchBegin );
document.addEventListener( pogoConnect.PEN_TOUCH_MOVE, app.penTouchMove );
document.addEventListener( pogoConnect.PEN_TOUCH_END, app.penTouchEnd );[/js]
Information about the stylus will be contained in the event.detail attribute, and will be an instance of this object (containing accurate values about the pen’s state, of course):
[js]{
    identifier: 0,
    connected: false,
    x: NaN,
    y: NaN,
    pressure: 0,
    buttonDown: false,
    tipDown: false,
    lowBattery: false
}[/js]
In JavaScript, you can do whatever you want within your application once you receive this information.
I created two sample applications to test this functionality. The first is a very basic app that simply outputs the pressure as text which follows the pen tip. The intent with this app was really just to prove the concept and determine if it was actually possible to receive and respond to information from the stylus.
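A handler for that first demo could look roughly like this. The element id and formatting are my own assumptions; the detail fields match the pen-state object shown above:

```javascript
// Rough sketch of the first demo: write the pressure value into a
// text element and move it to follow the pen tip.
// "pressureLabel" is a hypothetical element id.
function penTouchMove(event) {
    var pen = event.detail;
    var label = document.getElementById("pressureLabel");
    label.textContent = pen.pressure.toFixed(2); // e.g. "0.42"
    label.style.left = pen.x + "px";             // absolutely-positioned label
    label.style.top = pen.y + "px";
}
```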
Once I proved it could be done, the next logical step was to create an app that actually takes advantage of the pressure sensitivity information. So, I made a sketching app.
I started off by expanding on the drawing logic from my Lil’ Doodle PhoneGap application. This uses a requestAnimationFrame interval to render content in an HTML Canvas element using a “brush image” technique. Next, I added logic to vary the opacity and stroke size based on the pressure information received from the pen plugin, and a few other options to change the pen tip/brush shape and color.
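The pressure-to-brush mapping can be sketched as a small pure function. The specific multipliers and ranges below are my own guesses, not the app’s actual values:

```javascript
// Hedged sketch: map normalized pen pressure to brush opacity and size.
// The 0.2 opacity floor and linear ramps are illustrative assumptions.
function brushForPressure(pressure, maxSize) {
    var p = Math.max(0, Math.min(1, pressure)); // clamp to [0, 1]
    return {
        opacity: 0.2 + 0.8 * p,      // light touch -> faint stroke
        size: 1 + (maxSize - 1) * p  // harder press -> wider stroke
    };
}
```

Each requestAnimationFrame tick would then stamp the brush image at the pen position using the computed opacity and size.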
The pressure sensitive stylus gives a few interesting interactions that you wouldn’t get without the hardware:
- First, the obvious one: the app knows the pressure being applied to the pen tip, and can respond accordingly.
- Next, you have an extra input method. The button on the pen allows you to interact with the device without having to actually touch the device. I used the button in two ways: first, if the button is pressed and held, the pen erases instead of draws. Second, if you double-tap the pen button, it brings up the drawing options. These options are where I placed controls to modify the pen color and stroke.
- Third, the plugin provides bidirectional communication with the Stylus. When you change the pen color, the LED on the pen will display the selected color for a few seconds.
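Telling a double-tap of the pen button apart from a single press is a simple timing problem. Here is one way it could be done; the 300 ms threshold and function names are assumptions, not values from the app:

```javascript
// Sketch: classify pen-button presses as single or double-tap based on
// the gap between consecutive button-down timestamps (milliseconds).
function makeButtonTapDetector(doubleTapMs) {
    var lastTap = -Infinity;
    return function onButtonDown(timestampMs) {
        var isDouble = (timestampMs - lastTap) <= doubleTapMs;
        lastTap = timestampMs;
        return isDouble ? "double-tap" : "single";
    };
}
```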
I used a modified version of DevGeeks’ Canvas2Image plugin to save the content of the HTML Canvas to the device’s photo library. I also had to leverage a variation of this technique for getting the data from the Canvas element without transparency.
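The “without transparency” step amounts to compositing the drawing onto an opaque background before reading the image data. A minimal sketch, assuming a white background and an offscreen canvas (function name is mine):

```javascript
// Sketch: flatten a transparent canvas onto white before export, so the
// saved PNG doesn't have a transparent (or black) background.
function flattenCanvas(sourceCanvas) {
    var flat = document.createElement("canvas");
    flat.width = sourceCanvas.width;
    flat.height = sourceCanvas.height;
    var ctx = flat.getContext("2d");
    ctx.fillStyle = "#ffffff";         // opaque background first
    ctx.fillRect(0, 0, flat.width, flat.height);
    ctx.drawImage(sourceCanvas, 0, 0); // drawing composited on top
    return flat.toDataURL("image/png");
}
```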
All button and slider styling leverages Topcoat, Adobe’s brand new open source CSS framework designed to help developers build HTML-based apps with an emphasis on performance.
Full source code for this application is available on GitHub at https://github.com/triceam/PGPogoConnect
Note: This is iOS only – The third-party PogoConnect SDK is for iOS devices only. This example will also ONLY work if you have the PogoConnect Stylus. It does not support other stylus devices or finger-only drawing.