Google Soli Chip in Smart Displays, Tablets
In a recent video shared on YouTube, Google’s Advanced Technology and Projects group (aka ATAP), the team that originally developed the Soli chip, showcased how the chip can be used in smart displays and tablets. The researchers explained how the radar-based chipset can power “socially intelligent” displays that surface relevant information by detecting the head and body movements of individuals.
The video shows off some of the applications of the Soli chip in smart displays and tablets. The team used a “combination of new sensing and machine learning techniques” to detect head orientation, body movements, and other gestures, and to trigger the appropriate actions. For instance, the video shows a weather forecast appearing on a Soli-based smart display when a person glances at it while heading out the main door.
This shows that the Soli chip can recognize social contexts and surface relevant information to help an individual make the right decision. In this case, the smart display recognized that the person was heading out and might want to check the weather. It is similar to how humans communicate with each other non-verbally by understanding social context.
Other examples included a video pausing automatically when the viewer walked away from the screen, and an incoming video call being answered automatically when the recipient glanced at the display. You can check out the video attached right below to see the Soli-based, “socially intelligent” displays and tablets in action.
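To make the idea concrete, the demos boil down to mapping a sensed context (where someone is looking, how far away they are, where they are heading) to a display action. Here is a minimal, purely illustrative sketch of that mapping in Python. Google has not published Soli's software interface, so every name and threshold below is an assumption, not the actual ATAP API.

```python
from dataclasses import dataclass

# Hypothetical sensed state. Field names and units are illustrative only;
# they do not reflect Soli's real sensor output.
@dataclass
class SensedContext:
    facing_screen: bool       # head oriented toward the display
    distance_m: float         # estimated distance from the display, in meters
    heading_to_door: bool     # coarse trajectory cue (walking toward the exit)

def choose_action(ctx: SensedContext, video_playing: bool, incoming_call: bool) -> str:
    """Map a sensed social context to a display action, mirroring the demos
    described above: auto-answering a glanced-at call, pausing video when the
    viewer leaves, and showing the weather on the way out. Thresholds are
    made up for illustration."""
    if incoming_call and ctx.facing_screen and ctx.distance_m < 2.0:
        return "answer_call"   # recipient glances at the screen -> auto-answer
    if video_playing and ctx.distance_m > 3.0:
        return "pause_video"   # viewer walked away -> pause playback
    if ctx.heading_to_door and ctx.facing_screen:
        return "show_weather"  # heading out and glancing -> show forecast
    return "idle"              # no recognized social context
```

A real system would of course replace these hand-written rules with the machine-learning models ATAP mentions, but the sketch shows the shape of the problem: turning continuous radar signals into discrete, socially meaningful actions.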
While these smart displays and tablets look pretty useful, there is no word on when, or if, Google will actually make them commercially available. And considering the issues surrounding the Soli chipset (implementation costs and more), it is unlikely that we will see these smart displays on the market in the near future. What are your thoughts on this? Let us know in the comments below!