LightBlog

Thursday, May 28, 2015

I/O Summary: Google Now on Tap

Google is all about organizing the world’s information to make it universally useful, and these developments help it understand that information better. Its latest advancements come from its “deep neural networks”: the first layers recognize low-level features such as shadows and depth, the middle layers detect parts such as legs, and the top layers understand the image holistically.

Their current networks are over 30 layers deep, and this is the technology that they will use to organize your pictures. These systems allow for deeper user assistance, which is what Google Now is all about:

Aparna Chennapragada, Director of Google Now, began by talking about the future of the virtual assistant platform. From the beginning, their goal was “to figure out how to assist users in the mobile world”. For this they have three steps: 1) understand the context, 2) bring you answers, and 3) help you with actions to get stuff done.

With a different context, you need different things. “Context is also about getting what you are saying,” and for this, Google built a powerful context engine that can understand over 100 million places — not just their geography, but also their interesting properties, activities, and history. Once the context is settled and understood, Google Now can help you get things done. And on mobile, things get done with apps.

This is why Google is working on a new capability to assist you when you need it, regardless of where you are on your phone. They call it “Now on Tap”, and it takes advantage of new functionality built for Android M. Listening to music, for example, will enrich the context of Google Now and allow you to ask questions about the song or artist without necessarily naming them. This way, Android understands the context of your use case and can simplify your questions. All you need to do is access it through the home button.
The new Google Now can also monitor the contents of your applications to build a context through which it can assist you. This allows you, for example, to hold the home button during an IM conversation to instantly get information about a discussed movie, or to get help researching a particular product. We will expand on this in future features, as it concerns sensitive topics such as privacy. It can also be used to select celebrity names or meaningful topics in a browser through text selection and get more information on the subject. Regardless of our stances on privacy, the machine learning and context awareness being applied here is astounding.

Will you be using Google Now on Tap? Let us know below!

The post I/O Summary: Google Now on Tap appeared first on xda-developers.



from xda-developers http://ift.tt/1Axezka
via IFTTT