Pay per emotional response

August 19, 2013

Chris Taylor at Mashable discusses how Google Glass (Google's internet-connected eyewear) will change advertising. According to the patent application, the technology will track the wearer's gaze, charging advertisers based on which ads the wearer looks at and for how long.

But that’s nothing:  These glasses are also looking back at the wearer.  The patent application includes a method for determining how much the wearer’s eyes dilate when they see an ad.  (Our pupils get bigger when we see something we like.)  So advertisers will be charged more when the ads create an emotional response.

After the jump, an excerpt from the patent application and some serious questions.

From the “Claims” section of United States Patent 8,510,166:

1. A method comprising: receiving scene images from a head mounted gaze tracking device capturing external scenes viewed by a user wearing the head mounted device, the scene images received at a server via a network; receiving gaze direction information from the head mounted gaze tracking device along with the scene images, the gaze direction information indicating where in the external scenes the user was gazing when viewing the external scenes, the gaze direction information received at the server via the network; executing an image recognition algorithm on the scene images to identify items within the external scenes viewed by the user; generating a gazing log tracking the identified items viewed by the user; performing latent pre-searches on at least a portion of the items viewed by the user to generate latent search results, wherein the latent pre-searches are automatically triggered while the associated items are in the user’s peripheral view and without affirmative requests on a per search basis by the user; and caching the latent search results.

2. The method of claim 1, wherein generating the gazing log further comprises storing an indication of whether the user looked directly at the identified items based at least in part upon the gaze direction information.

3. The method of claim 2, wherein the user is deemed to have looked directly at a given identified item if a position of the given identified item within a given scene image correlates to the gaze direction information associated with the given scene image and if the user looked directly at the given identified item for a threshold period of time.

4. The method of claim 2, wherein generating the gazing log further comprises storing an indication of how long and when the user looked at one or more of the identified items based upon the gaze direction information.

5. The method of claim 1, further comprising: receiving pupil dilation information from the head mounted gaze tracking device along with the scene images, the pupil dilation information indicating a pupil dilation of the user while viewing the external scenes, the pupil dilation information received at the server via the network.

6. The method of claim 5, further comprising: inferring an emotional state of the user while viewing the external scenes based at least in part upon the pupil dilation information; and storing an emotional state indication associated with one or more of the identified items.

7. The method of claim 1, further comprising: determining which, if any, of the identified items within the external scenes viewed by the user are advertisements; and charging advertisers associated with the advertisements based at least in part on a per gaze basis.

8. The method of claim 7, wherein charging the advertisers further comprises charging a given advertiser associated with a given advertisement based at least in part upon whether the user looked directly at the given advertisement as determined by the gaze direction information and how long the user looked at the given advertisement.

9. The method of claim 1, further comprising: determining which, if any, of the identified items within the external scenes viewed by the user are advertisements; and charging advertisers for analytical information generated based upon the gazing direction information.

10. The method of claim 9, further comprising: combining the gazing direction information acquired from the user wearing the head mounted device with other gazing direction information acquired from other users wearing other head mounted devices to generate batched information; and charging advertisers for the analytical information generated based upon the batched information.

11. The method of claim 10, wherein the analytical information includes the tendency of a given advertisement to draw user gazes or to hold the user gazes.

12. The method of claim 10, wherein the analytical information includes the tendency of a given advertisement to evoke an emotional response.
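To make the legalese a bit more concrete, here is a minimal sketch (in Python) of the kind of server-side bookkeeping claims 1 through 8 describe: a gazing log of items the wearer looked at, a dwell-time test for whether a gaze counts as a direct look, an emotional-response inference from pupil dilation, and a per-gaze charge to the advertiser. Every name, threshold, and pricing formula below is my own invention for illustration; none of it comes from Google or the patent itself.

```python
from dataclasses import dataclass, field

# Hypothetical illustration of the bookkeeping described in claims 1-8.
# Names, thresholds, and the pricing formula are invented for this sketch.

DIRECT_GAZE_THRESHOLD_S = 0.5   # claim 3: minimum dwell time for a "direct look"
DILATION_BASELINE = 1.0         # relative pupil size treated as neutral

@dataclass
class GazeEvent:
    item_id: str           # item identified by image recognition (claim 1)
    is_ad: bool            # claim 7: is this identified item an advertisement?
    looked_directly: bool  # claims 2-3: gaze direction matched the item's position
    dwell_seconds: float   # claim 4: how long the user looked
    pupil_dilation: float  # claim 5: relative pupil size while viewing

@dataclass
class GazingLog:
    events: list = field(default_factory=list)

    def record(self, event: GazeEvent) -> None:
        self.events.append(event)

    def emotional_response(self, event: GazeEvent) -> bool:
        # Claim 6: crudely infer an emotional response from pupil dilation.
        return event.pupil_dilation > DILATION_BASELINE

    def charge_for(self, event: GazeEvent, base_rate: float) -> float:
        # Claims 7-8: charge only for ads the user actually looked at,
        # scaled by dwell time, with a premium for an emotional response
        # (the "pay per emotional response" idea in the post title).
        if not (event.is_ad and event.looked_directly
                and event.dwell_seconds >= DIRECT_GAZE_THRESHOLD_S):
            return 0.0
        charge = base_rate * event.dwell_seconds
        if self.emotional_response(event):
            charge *= 1.5  # arbitrary premium, purely for illustration
        return charge

# Example: a wearer looks at a billboard for two seconds, pupils dilated.
log = GazingLog()
event = GazeEvent("billboard_42", is_ad=True, looked_directly=True,
                  dwell_seconds=2.0, pupil_dilation=1.3)
log.record(event)
print(log.charge_for(event, base_rate=0.01))  # 0.01 * 2.0 s * 1.5 premium = 0.03
```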

Keep reading for even more on Google’s plans to make advertising a part of normal human perception.

My question:  Who would willingly subject themselves to so much advertising?  Will people actually want to filter their very perception of the outside world through Google’s advertising system?

Would any of you wear these glasses?

Might this be over-reaching on the part of Google, reaching a reductio ad absurdum of advertising technology?

""At his current rate of growth, my three year old son will be ten feet ..."

Where Christianity Is Growing the Most
"There are no simple answers...That is true. But allow me to ask more specific questions. ..."

DISCUSS: Our Approach to Foreign Policy
"There are no simple answers, and I don't claim any expertise about foreign policy. But ..."

DISCUSS: Our Approach to Foreign Policy
"How can Madagascar have 44 million Christians when according to Wikipedia its population is only ..."

Where Christianity Is Growing the Most

Browse Our Archives