One of the unique features of the Pixel 4 series is the presence of the Soli radar chip. However, Google has added few new features that take advantage of the chip, apart from the air gesture to control music playback introduced in the March Pixel feature drop. Now, the company’s research team (Research at Google) has released an app named ‘Soli Sandbox’ for creating new Soli workflows.
Before we proceed further, it is worth keeping in mind that Soli Sandbox is not a tool for creating production apps. Rather, it is a developer-centric platform that acts as a bridge between a developer’s web-based prototypes and the Soli chip’s gesture interactions.
“Soli Sandbox is a way to connect your web prototypes to Soli interactions on Pixel 4. Create a prototype on your computer, and set it in motion with Soli gestures through the Soli Sandbox app. You’ll be able to use touchless swipes, taps, and more in your own prototypes!”, explains Google.
The app offers four example interactions, namely Presence, Reach, Swipe, and Tap. According to Google, these events are meant to help developers familiarize themselves with Soli interactions.
Soli Sandbox Interactions
- Presence Event – Triggers every time Soli detects a person within 0.7 meters (2.3ft) of the device.
- Reach Event – Detects movement resembling a hand reaching toward the device, within about 5-10 cm.
- Swipe Event – Detects motion that resembles a hand wave gesture above the device.
- Tap Event – Detects movement that resembles a single hand bounce above the center of the phone.
Google says that Soli Sandbox prototypes are HTML files that receive and respond to Soli events using JavaScript. The app uses Android System WebView to display prototypes, so all technologies supported by Android System WebView work in Soli Sandbox, with the exception of WebAR, WebGL, and WebVR.
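To give a rough idea of how such a prototype might respond to the four example interactions, here is a minimal sketch. The event names and the `window.addEventListener` wiring are assumptions for illustration; the actual event API is documented in the Soli Sandbox Manual.

```javascript
// Minimal sketch of a Soli Sandbox prototype handler.
// Assumption: the sandbox delivers gesture events named after the four
// example interactions; check the Soli Sandbox Manual for the real API.

// Map each Soli event type to a hypothetical prototype action.
function handleSoliEvent(type) {
  switch (type) {
    case "presence": return "wake-screen";   // person within ~0.7 m
    case "reach":    return "show-controls"; // hand approaching, ~5-10 cm
    case "swipe":    return "next-item";     // hand wave above the device
    case "tap":      return "select-item";   // single bounce over the center
    default:         return "ignore";        // unrecognized event
  }
}

// In the prototype's HTML page, you would wire this up to whatever
// events the sandbox dispatches, e.g.:
// ["presence", "reach", "swipe", "tap"].forEach((t) =>
//   window.addEventListener(t, () => handleSoliEvent(t)));
```

The point is simply that a prototype is ordinary web code: gesture input arrives as events, and the page updates its UI in response.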
If you’re interested in getting involved, you can check out the Soli Sandbox Manual and the starter project on Glitch.
from Beebom https://beebom.com/google-soli-sandbox-pixel-4/