The SmartEyeglass SDK includes a set of sample projects that demonstrate most features and API functionality. You can include these samples in your own projects, and extend or modify them to provide your own app’s behavior and appearance.
These samples are included:
|HelloEvents||Shows how to handle user-input events such as touch, tap, swipe, and Back key presses.|
|HelloLayouts||Shows all standard layout parts that can be used for a SmartEyeglass app, except Gallery. Demonstrates two different approaches for displaying and updating a UI: bitmap and layout. The bitmap approach is useful for compatibility with accessories that do not have layout support (such as SmartWatch), or for displaying special custom UI elements.|
|AdvancedLayouts||Shows how to define a gallery of cards that respond to swipe events (based on a string array), and how to create a view hierarchy that uses layer transitions.|
|HelloWidget||Shows how to implement a simple widget with a layout.|
|SampleDialogExtension||Shows how to use the three types of dialogs: a dialog with multiple action buttons, a message dialog with an OK button, and a pop-up dialog with a timeout. When a dialog is closed, the sample responds by displaying the selected button.|
|HelloNotification||Shows how to define notification sources, send notifications, clear notifications, and assign actions to notifications.|
|SampleCameraExtension||Shows how to access the SmartEyeglass camera to capture still pictures, JPEG streams, and videos, how to listen for camera events, and how to store image data to external storage.|
|HelloSensors||Shows how to get sensor data from all available sensors on a SmartEyeglass device: accelerometer, gyroscope, magnetic field sensor (compass), light sensor, and rotation vector.|
|SampleVoiceTextInputExtension||Shows how to use the voice input feature, which accepts voice input when the user presses the Talk button, automatically converts it to text and returns that text to your app.|
|SampleARConvertCoordinateSystemExtension||Shows how to use the augmented reality (AR) rendering feature to display graphics superimposed on objects whose positions are defined by real-world coordinates (latitude and longitude). The sample shows how to convert these coordinates to cylindrical coordinates based on the user’s current position.|
|SampleARCylindricalExtension||Shows how to register multiple bitmaps in the cylindrical coordinate system for rendering by the AR engine, and how to combine the rendered AR graphics with fixed-place graphics displayed in the glass coordinate system. Also shows how to limit the vertical rendering range by swiping on the controller touch pad.|
|SampleARAnimationExtension||Shows how to display animated images using AR rendering.|
|SampleDisplaySettingsExtension||Shows how to set the apparent screen depth and enable safe-display mode.|
|HelloStandbyMode||Shows how to request activation of standby mode to reduce power usage, and how to detect when standby mode is terminated by user action.|
|SamplePowerModeExtension||Shows how to toggle between high-performance power mode (which uses a WLAN connection) and normal power mode (which uses a Bluetooth connection).|
|SampleSoundEffectsSettingsExtension||Shows how to enable and disable sound effects that provide feedback for user input actions.|
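To give a feel for the event-handling pattern that samples such as HelloEvents demonstrate, here is a minimal, self-contained sketch in plain Java. All names in it are hypothetical stand-ins: the real samples subclass the Sony Add-on SDK's `ControlExtension` and receive callbacks for touch, swipe, and key events, whereas this sketch only mirrors the shape of that dispatch logic.

```java
// Hypothetical sketch of swipe/Back-key dispatch, loosely modeled on the
// event handling the HelloEvents sample demonstrates. It does not use the
// Sony Add-on SDK; every identifier here is an illustrative stand-in.
public class EventSketch {

    // Hypothetical swipe directions, mirroring the four directions a
    // controller touch pad can report.
    enum SwipeDirection { LEFT, RIGHT, UP, DOWN }

    // Maps a swipe direction to the action an app might take in response.
    static String handleSwipe(SwipeDirection direction) {
        switch (direction) {
            case LEFT:  return "previous card";
            case RIGHT: return "next card";
            case UP:    return "scroll up";
            default:    return "scroll down";
        }
    }

    // Hypothetical Back-key handler: a real extension would typically
    // close the current view or navigate back in its view hierarchy.
    static String handleBackKey() {
        return "close current view";
    }

    public static void main(String[] args) {
        System.out.println(handleSwipe(SwipeDirection.RIGHT));
        System.out.println(handleBackKey());
    }
}
```

In the actual samples, the equivalent logic lives in SDK callback methods rather than free-standing functions, but the pattern is the same: inspect the event, map it to an app action, then update the display.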