AR rendering

The Augmented Reality (AR) engine in the SmartEyeglass renders application text and graphics on the real-world view seen through the SmartEyeglass. The AR Rendering API available through the SmartEyeglass SDK allows you to define graphic objects such that they appear to be overlaid on specific objects whose real-world location you know.

You can define a rendered graphic so that it automatically changes position on the display as the user turns to face toward or away from the known object, or tilts their head up or down with respect to it.

This is a powerful and exciting feature, but when you design an application for it, you must be particularly careful about safety considerations. You should assume that the application is active while the user is moving around. Take care that the UI you display is clear but unobtrusive, so that it does not compromise the user’s safety.

The basics

To start using the AR rendering API, set up a project using one of the AR rendering samples as a base. For more information, see the How to create your first SmartEyeglass project tutorial.

The SmartEyeglass SDK sample projects demonstrate API usage for AR rendering tasks: SampleARConvertCoordinateSystemExtension, SampleARCylindricalExtension, and SampleARAnimationExtension.

Coordinate systems

The AR rendering engine defines three different coordinate systems for describing positions:

  • The display area of the SmartEyeglass device is 419 x 138 pixels. The absolute position of standard text and graphics within this area is described in what we call the glasses coordinate system.
  • A cylindrical coordinate system defines positions in an imaginary cylinder surrounding the user.
  • A position in world coordinates has a latitude, longitude, and altitude.

The figure below shows how the position of an object in the real world is represented in the cylindrical coordinate system, and how the system maps that position onto the glasses coordinate system to determine whether a registered position falls within the user’s current view.

Cylindrical coordinate system.

The AR system keeps a database of objects that you register. Each object associates a graphic image or text with a set of either cylindrical or glasses coordinates. Use cylindrical coordinates to define dynamic objects, and glasses coordinates to define fixed-place objects.

  • A dynamic graphic object is rendered such that it appears to overlay a specific real-world location (latitude and longitude). To define a dynamic object, use the AR Rendering API to convert the real-world coordinates of the target object into cylindrical coordinates, then use the resulting position to register the graphic object.
  • A fixed-place graphic object is rendered at a specific position in the display, which does not vary with the user’s movements. Define a fixed-place graphic object using the glasses coordinate system.

Automatic rendering

You must activate AR rendering mode before you can register and render graphic objects. To activate the AR rendering mode, call this method:

utils.setRenderMode(SmartEyeglassControl.Intents.MODE_AR);
Make this call from your AR initialization routine. You could do it in your onResume handler, for example.

Once you have activated AR rendering mode, the system automatically renders the registered graphic objects at the correct position in the display, according to the user’s current orientation. The rotation vector is calculated from the data supplied by the device’s compass, accelerometer, and gyroscope sensors.

For example, if you register an object to appear at 60 degrees right and 30 degrees up, the object is rendered on the screen when the user moves to face that position. The image remains on screen until the registered position is no longer in view.
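The "in view" test amounts to comparing the registered angles against the user's current orientation, allowing for wrap-around at the 0/360 degree boundary. The sketch below illustrates the idea in plain Java; the field-of-view constant and helper names are illustrative assumptions, not SDK API:

```java
public class ViewCheck {
    // Approximate horizontal field of view of the display, in degrees.
    // Illustrative value only; the real engine uses the device's optics.
    static final double FOV_H = 19.0;

    // Smallest signed difference between two compass angles, in degrees,
    // handling wrap-around at the 0/360 boundary.
    static double angleDiff(double a, double b) {
        double d = (a - b) % 360.0;
        if (d > 180.0) d -= 360.0;
        if (d < -180.0) d += 360.0;
        return d;
    }

    // True when an object registered at objectH degrees is inside the
    // horizontal view centered on the user's current heading.
    static boolean inHorizontalView(double objectH, double heading) {
        return Math.abs(angleDiff(objectH, heading)) <= FOV_H / 2.0;
    }
}
```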

The screen rendering is performed locally on the SmartEyeglass device. This allows smooth operation, because no communication between devices is required to render screen images in the correct positions.

Convert real-world coordinates to cylindrical coordinates

Typically, you use AR rendering to display information about a real-world object whose position is known. You display that information when the user is near the object, according to the user’s own current position. You can use the Android location providers available in the host device to determine when the user is near such an object.

A position in world coordinates has a latitude, longitude, and altitude. In order to associate your graphic display object with a real-world object, you must translate the current world position of the user and the known position of the target object into a set of cylindrical coordinates, and use those to register the dynamic display object with the AR engine.

The AR API provides this method to perform the conversion:

PointF convertCoordinateSystemFromWorldToCylindrical (
               PointInWorldCoordinate viewingLocation, 
               PointInWorldCoordinate targetLocation)

The method combines the current real-world position of the user with the real-world position of the object of interest, and returns a position in cylindrical coordinates.

viewingLocation  The real-world position of the SmartEyeglass user, from which the target object can be seen.
targetLocation  The real-world position of the object of interest.
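The horizontal angle produced by such a conversion is essentially the compass bearing from the viewer to the target. The self-contained sketch below uses the standard great-circle bearing formula as an approximation for illustration; it is not the SDK's implementation:

```java
public class BearingSketch {
    // Compass bearing from the viewer to the target, in degrees,
    // where 0 is north and 180 is south (matching the cylindrical
    // coordinate convention used by the AR engine).
    static double horizontalAngle(double viewerLat, double viewerLon,
                                  double targetLat, double targetLon) {
        double lat1 = Math.toRadians(viewerLat);
        double lat2 = Math.toRadians(targetLat);
        double dLon = Math.toRadians(targetLon - viewerLon);
        double y = Math.sin(dLon) * Math.cos(lat2);
        double x = Math.cos(lat1) * Math.sin(lat2)
                 - Math.sin(lat1) * Math.cos(lat2) * Math.cos(dLon);
        double bearing = Math.toDegrees(Math.atan2(y, x));
        // Normalize to the [0, 360) range expected for the h angle.
        return (bearing + 360.0) % 360.0;
    }
}
```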

The sample SampleARConvertCoordinateSystemExtension, available in the SmartEyeglass SDK, shows how to use this conversion method, and how to use the resulting position to register the graphic object.

Register an object for AR rendering

The AR engine can render both dynamic and fixed-place objects at the same time.

  • A dynamically placed object is associated with a position in the cylindrical coordinate system, and appears when that position is in view. Its position on the display depends on where the user is looking, with respect to a real-world location.
  • A fixed-place object is associated with a specific position in the glasses coordinate system. It always appears at that position on the display, rather than at a position that varies with the user’s head position and attitude.

The SampleARCylindricalExtension demonstrates how to combine dynamic objects with fixed-place objects.

Any AR object can be static or animated. A static object is associated with a single image, while an animated object is associated with a set of images which are rendered in a repeating sequence. The SampleARAnimationExtension demonstrates how to register and display animations.

When registration succeeds, the AR rendering engine sends the unique ID of the registered object to your callback for the onARRegistrationResult event.

Add a dynamic object to the AR rendering system

A dynamic object is represented by a CylindricalRenderObject instance. When you create an object of this type, you provide a position in the cylindrical coordinate system, and a unique ID that associates it with a bitmap to be rendered at that position.

Use SmartEyeglassControlUtils.registerARObject(RenderObject) to register your object with the AR rendering engine. The object is rendered when the associated position is in the device view.

These are the functions you use to define and register a dynamic object:

object = CylindricalRenderObject(final int objectId, final Bitmap bitmap,
               final int order, final int objectType,
               final float h, final float v)

These are the arguments that you supply to define the dynamic object:

objectId A unique object ID that identifies the object in callbacks that are invoked after registration is completed.
bitmap  The object bitmap for a static object, or the initial bitmap for an animated object.
order The occlusion order of this object, relative to other objects being rendered. Objects with lower values are in front of those with higher values. 0 means this object is in the foreground, and appears on top of any object with a higher value.
objectType The type constant for the data associated with this object: SmartEyeglassControl.Intents.AR_OBJECT_TYPE_STATIC_IMAGE for a static object, or SmartEyeglassControl.Intents.AR_OBJECT_TYPE_ANIMATED_IMAGE for an animated object.

(See “Using AR animations” below for more information.)

h, v The horizontal and vertical angles in degrees of the object to be rendered, in the cylindrical coordinate system. The horizontal angle is 0.0 to 360.0 degrees, where 0 is north and 180 is south.
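The order parameter behaves like a z-index with lower values in front. If your application manages several overlays, you might keep them sorted so the foreground object comes first. A minimal sketch; the Overlay class and helper are hypothetical stand-ins, not SDK types:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class OcclusionOrder {
    // Illustrative stand-in for a registered object and its occlusion order.
    static class Overlay {
        final int objectId;
        final int order;  // lower value = closer to the foreground
        Overlay(int objectId, int order) {
            this.objectId = objectId;
            this.order = order;
        }
    }

    // Sorts overlays front-to-back, matching the engine's occlusion rule:
    // an object with order 0 appears on top of any object with a higher value.
    static List<Overlay> frontToBack(List<Overlay> overlays) {
        List<Overlay> sorted = new ArrayList<>(overlays);
        sorted.sort(Comparator.comparingInt((Overlay o) -> o.order));
        return sorted;
    }
}
```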

See the SampleARCylindricalExtension for a full example of how to define, register and display this type of object.

Add a fixed-place object to the AR rendering system

A fixed-place object is represented by a GlassesRenderObject instance. When you create an object of this type, you provide a position in the glasses coordinate system, and a unique ID that associates it with a bitmap to be rendered at that position.

The bitmap is rendered at the given XY position on the display, regardless of the viewing orientation. A fixed-place object might be, for example, a crosshair in the middle of the display, a border around the screen, or static text information. You might have a fixed-place text label that remains at the bottom of the display, describing a real-world object that is marked by a dynamic “X” overlaid on it wherever it appears in the user’s current view.
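For instance, centering a crosshair bitmap on the 419 x 138 display is a matter of offsetting the upper-left corner by half the bitmap size. The helper below is an illustrative sketch, not an SDK call:

```java
public class CenterOnDisplay {
    // SmartEyeglass display dimensions in pixels (glasses coordinate system).
    static final int DISPLAY_WIDTH = 419;
    static final int DISPLAY_HEIGHT = 138;

    // Returns {x, y} of the upper-left corner that centers a bitmap of the
    // given size on the display, suitable for a GlassesRenderObject position.
    static int[] centeredPosition(int bitmapWidth, int bitmapHeight) {
        int x = (DISPLAY_WIDTH - bitmapWidth) / 2;
        int y = (DISPLAY_HEIGHT - bitmapHeight) / 2;
        return new int[] { x, y };
    }
}
```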

These are the functions you use to define and register a fixed-place object:

object = GlassesRenderObject(final int objectId, final Bitmap bitmap,
               final int order, final int x, final int y,
               final int objectType)

The arguments are the same as for a dynamic object, except that the position arguments should provide the absolute glasses coordinates of the upper left corner of the image to be rendered. See the SampleARCylindricalExtension for an example of how to register and display this type of object.

Handle AR events

In your event listener, register event-handler callback methods for events that can be received when the AR rendering engine is displaying data.

  • When you register an object, the unique ID of the registered object is passed back to your app in the onARRegistrationResult event upon completion of the registration operation.
  • When the system determines that the position associated with a registered object is visible in the current device view, it sends the object ID in the onARObjectRequest event. Your handler for this event should retrieve the associated RenderObject and send it in a call to sendARObjectResponse().
  • After receiving the response, the AR system automatically renders the data associated with the object, until the position is no longer in the device view.

Here is an example of how the constructor for an extension’s Control class creates a listener for AR rendering events:

SmartEyeglassEventListener listener = new SmartEyeglassEventListener() {
    @Override
    public void onARRegistrationResult(
            final int result, final int objectId) {
        Log.d(Constants.LOG_TAG, "onARRegistrationResult() result=" + result
                + " objectId=" + objectId);
        if (result != SmartEyeglassControl.Intents.AR_RESULT_OK) {
            Log.e(Constants.LOG_TAG,
                    "AR register object failed! errorcode = " + result);
        }
    }

    @Override
    public void onARObjectRequest(final int objectId) {
        Log.d(Constants.LOG_TAG,
                "onLocalRenderingObjectRequest() " + " objectId=" + objectId);
        // Send the bitmap associated with this object ID
        utils.sendARObjectResponse(renderObj, 0);
    }
};

SmartEyeglassControlUtils utils = 
    new SmartEyeglassControlUtils(hostAppPackageName, listener);

Modify registered objects

After you have registered a static or dynamic object with the AR rendering engine, the AR rendering API provides methods that allow you to modify or delete the entry.

  • You can change the position associated with a registered object, then move the object. To change the position, use one of these calls, according to the type of render object:
    CylindricalRenderObject.setPosition(PointF point)
    GlassesRenderObject.setPosition(Point point)

    To draw the object at the new position, use:

    utils.moveARObject(RenderObject object)
  • You can change the rendering order, then draw the object using the new order:
    RenderObject.setOrder(int order)
    utils.moveARObject(RenderObject object)
  • You can unregister an object:
    utils.deleteARObject(RenderObject object)

Use AR animations

The AR rendering engine supports the rendering of animations; that is, a set of bitmaps displayed continuously in sequence. To define an AR animation, register an object with the type SmartEyeglassControl.Intents.AR_OBJECT_TYPE_ANIMATED_IMAGE. For example:

CylindricalRenderObject renderObj = new CylindricalRenderObject(OBJECT_ID,
        bitmap, 0, SmartEyeglassControl.Intents.AR_OBJECT_TYPE_ANIMATED_IMAGE,
        h, v);

Specify the bitmap for the initial animation state, along with the horizontal and vertical angle of object position in cylindrical coordinates.

See the SampleARAnimationExtension for an example of how to register and display the set of images associated with this type of object.

Send frames with a timer

You can send each frame of the animated object in a call to sendARAnimationObject(), using a loop with a delay that is long enough to allow for updating the bitmap. For example, you can use a Timer to create a loop in which you pass the next bitmap in the animation sequence after a given interval. This is shown in the following code snippet:

timer = new Timer();
timer.schedule(new TimerTask() {
    private Handler mHandler = new Handler();

    @Override
    public void run() {
        mHandler.post(new Runnable() {
            @Override
            public void run() {
                // Pass the next bitmap in the sequence;
                // OBJECT_ID identifies the registered animation object
                utils.sendARAnimationObject(OBJECT_ID, bitmap);
            }
        });
    }
}, 0, ANIMATION_INTERVAL_TIME);

In this example, ANIMATION_INTERVAL_TIME is the delay between two animation updates, in milliseconds. Set the value according to the connection speed and the size of the images you are sending.
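A rough way to pick ANIMATION_INTERVAL_TIME is from the frame payload size and the effective link throughput. The sketch below is a back-of-the-envelope estimate under that assumption, not an SDK formula:

```java
public class FrameInterval {
    // Minimum delay in milliseconds between animation frames, given the
    // frame size in bytes and the effective link throughput in bits/s.
    static long minIntervalMillis(long frameBytes, long linkBitsPerSecond) {
        long bits = frameBytes * 8L;
        // Round up so the interval never undershoots the transfer time.
        return (bits * 1000L + linkBitsPerSecond - 1) / linkBitsPerSecond;
    }
}
```

For example, a 10 KB frame over a 200 kbit/s link needs at least 400 ms between updates.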

Send frames with a callback

To achieve the best possible animation frame rate, you can send each frame of the animated object using this call, which notifies you when the asynchronous operation is completed:

sendARAnimationObjectWithCallback (int objectId, final Bitmap image, 
                                     int transactionNumber)

When the operation completes, the result is sent to your callback function:

smartEyeglassEventListener.onResultSendAnimationObject(int transactionNumber, 
                                                    int result)

Your callback function should use the same method to send the next frame in the animation, for as long as you intend the animation to run.

  • The transaction number that you supplied in the call is passed back to your callback, so that your handler can match a specific operation to its result.
  • The result value that is sent to your callback indicates the success or failure status of the operation; see SmartEyeglassControl.Intents.EXTRA_DISPLAY_DATA_RESULT in the API reference.
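The transaction-number bookkeeping can be handled with a small counter plus a pending map. The FramePump class below is a hypothetical sketch of that pattern in plain Java; its names are not SDK API:

```java
import java.util.HashMap;
import java.util.Map;

public class FramePump {
    private int nextTransaction = 0;
    // Maps an outstanding transaction number to the frame index it carried.
    private final Map<Integer, Integer> pending = new HashMap<>();

    // Record a frame send; returns the transaction number you would pass
    // to sendARAnimationObjectWithCallback().
    int sendFrame(int frameIndex) {
        int txn = nextTransaction++;
        pending.put(txn, frameIndex);
        return txn;
    }

    // Called from the onResultSendAnimationObject() handler; returns the
    // index of the next frame to send, or -1 to stop (unknown transaction
    // or a failed send).
    int onResult(int transactionNumber, boolean ok) {
        Integer frame = pending.remove(transactionNumber);
        if (frame == null || !ok) {
            return -1;
        }
        return frame + 1;  // advance to the next frame in the sequence
    }
}
```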

Limit vertical range for AR rendering

If you limit the range in which objects can be rendered, they are easier and faster to find. For some applications, ignoring the vertical tilt of the user’s head altogether provides a better user experience. For example, a photo panorama application might render a part of a picture according to the user’s horizontal head orientation, but ignore the vertical tilt entirely.

For other applications, you might prefer to limit consideration to some maximum angle. A points-of-interest application, for example, only needs a small vertical range for rendering distant objects.

To limit the vertical range of the AR engine, call changeARCylindricalVerticalRange(float angle). This sets the maximum angle of the viewing range to angle degrees and the minimum angle to –angle degrees. If the current vertical viewing angle is out of range, objects are rendered at the edge of the range.
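The edge-pinning behavior described above amounts to a plain clamp of the vertical angle. The helper below is an illustrative sketch of that rule, not part of the SDK:

```java
public class VerticalRange {
    // Clamps a vertical viewing angle to [-range, +range] degrees,
    // mirroring the limit set by changeARCylindricalVerticalRange(range).
    static float clampVertical(float angle, float range) {
        return Math.max(-range, Math.min(range, angle));
    }
}
```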

See the SampleARCylindricalExtension for an example of how to let the user set the vertical range with a swipe on the touch pad.

Learn from sample code

To see how the AR rendering API can be used, see the following samples available in the SmartEyeglass SDK:

  • SampleARConvertCoordinateSystemExtension: Shows how to use the augmented reality (AR) rendering feature to display graphics superimposed on objects whose position is defined by real-world coordinates (latitude, longitude). The sample shows how to convert these coordinates to cylindrical system coordinates based on the user’s current position.
  • SampleARCylindricalExtension: Shows how to register multiple bitmaps in the cylindrical coordinate system to be rendered by the AR engine, how to combine the rendered AR graphics with fixed-place graphics displayed in the glass coordinate system. Also shows how to set a limit on the vertical range by swiping on the controller touch pad.
  • SampleARAnimationExtension: Shows how to display animated images using AR rendering.
