Before I begin, I'd like to do a little experiment with you. Look at the Arduino Forum logo at the top left of your screen, and specifically at the very last letter, the "M" of "Arduino Forum".
Without moving your eyes a millimetre off the letter "M", and without letting them drift anywhere else, try to read a topic heading or any other piece of text just a few inches away from it. You can't, because the eye only resolves fine detail in a small central region. So why not apply this principle to screens?
Turn off all the unused pixels outside the eye's processing capability. The eye won't notice it, but if we had, say, 60-70% of the screen off, surely some battery life could be saved, right? Potentially on any phone with a front-facing camera?
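To put a rough number on that "60-70%" guess, here is a back-of-envelope sketch. Everything in it is an assumption (a ~30 cm viewing distance, a roughly 70 x 150 mm panel, and a few candidate half-angles for the region kept lit); it only shows how the fraction of darkened pixels would follow from simple geometry.

```cpp
// Rough geometry check, not a measurement: what fraction of a phone screen
// falls inside a cone of "useful" vision centred on the gaze point?
// All figures below are assumptions chosen purely for illustration.
#include <cstdio>
#include <cmath>
#include <algorithm>

int main() {
    const double PI = 3.14159265358979323846;
    const double viewing_distance_mm = 300.0;   // assumed ~30 cm phone-holding distance
    const double screen_w_mm = 70.0;            // assumed panel size, roughly a 6" phone
    const double screen_h_mm = 150.0;
    const double screen_area = screen_w_mm * screen_h_mm;

    // Assumed half-angles for the region kept at full brightness:
    // ~1 degree is roughly the fovea, larger values add parafoveal vision.
    const double half_angles_deg[] = {1.0, 5.0, 10.0};

    for (double a : half_angles_deg) {
        double r   = viewing_distance_mm * std::tan(a * PI / 180.0);  // lit radius on screen
        double lit = std::min(PI * r * r, screen_area);               // ignores clipping at the edges
        double off = 1.0 - lit / screen_area;                         // share of pixels that could go dark
        std::printf("half-angle %4.1f deg -> lit radius %5.1f mm, up to %4.1f%% of pixels dark\n",
                    a, r, 100.0 * off);
    }
    return 0;
}
```

Under these assumed numbers, how close you get to "60-70% off" depends almost entirely on how generous a cone of vision you decide to keep lit.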
How can you tell where someone is looking from just an image of the face, anyway? I think most such systems have to bounce a beam off the eye. Also, the viewer's field of view relative to the screen changes depending on how far away they are, i.e. from a distance the area being "concentrated" on could in fact be the entire screen.
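Camera-only gaze trackers do exist; the usual trick is to locate the pupil relative to landmarks around the eye and map that offset to a screen position using a per-user calibration. Below is a deliberately simplified sketch of that idea. The landmark coordinates and calibration values are invented for illustration, and a real system would also have to handle head pose and the distance issue raised above.

```cpp
// Simplified illustration of camera-only gaze estimation (no IR beam):
// a face-landmark detector (assumed to exist, not shown) supplies the pupil
// centre and the eye's bounding box in camera pixels; the pupil's relative
// position is then mapped to a screen point with a per-user calibration.
#include <cstdio>

struct Point { double x, y; };

// Where the pupil sits inside the eye's bounding box, 0..1 on each axis.
Point relativePupil(Point pupil, Point eyeTopLeft, Point eyeBottomRight) {
    return { (pupil.x - eyeTopLeft.x) / (eyeBottomRight.x - eyeTopLeft.x),
             (pupil.y - eyeTopLeft.y) / (eyeBottomRight.y - eyeTopLeft.y) };
}

// Per-user calibration: the same relative pupil positions recorded while the
// user looked at two known screen corners; linear interpolation in between.
Point gazeOnScreen(Point rel, Point relAtTopLeft, Point relAtBottomRight,
                   double screenW, double screenH) {
    double gx = (rel.x - relAtTopLeft.x) / (relAtBottomRight.x - relAtTopLeft.x);
    double gy = (rel.y - relAtTopLeft.y) / (relAtBottomRight.y - relAtTopLeft.y);
    return { gx * screenW, gy * screenH };
}

int main() {
    // Made-up landmark coordinates standing in for a detector's output.
    Point pupil = {312.0, 205.0}, eyeTL = {300.0, 198.0}, eyeBR = {340.0, 216.0};
    Point rel = relativePupil(pupil, eyeTL, eyeBR);

    // Made-up calibration samples (pupil position while looking at screen corners).
    Point calTL = {0.25, 0.35}, calBR = {0.75, 0.70};
    Point gaze = gazeOnScreen(rel, calTL, calBR, 1080.0, 2340.0);
    std::printf("gaze estimate: (%.0f, %.0f) px\n", gaze.x, gaze.y);
    return 0;
}
```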
It largely depends on how good the imaging sensor is... but "hypothetically", all the OLED pixels that your eye can't make out could be turned off to save power...
But how much power it eats up processing the real-time tracking data is another question...
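That trade-off can at least be framed as simple arithmetic: power saved by darkening part of an OLED panel versus power spent on the camera and the gaze-tracking compute. The sketch below does exactly that; every number in it is a placeholder assumption rather than a measurement for any particular phone, so swap in real figures before drawing conclusions.

```cpp
// Back-of-envelope for the trade-off above: display power saved by darkening
// pixels vs. power spent on the camera and gaze processing.
// Every number here is a placeholder assumption, not a measured figure.
#include <cstdio>

int main() {
    const double display_power_mw  = 800.0;  // assumed full-screen OLED draw
    const double dark_fraction     = 0.60;   // assumed share of pixels turned off
    const double oled_linearity    = 1.0;    // assume power scales roughly with lit area
    const double camera_power_mw   = 250.0;  // assumed always-on front camera
    const double tracking_power_mw = 300.0;  // assumed gaze-tracking compute

    double saved = display_power_mw * dark_fraction * oled_linearity;
    double spent = camera_power_mw + tracking_power_mw;

    std::printf("display power saved : %.0f mW\n", saved);
    std::printf("tracking power spent: %.0f mW\n", spent);
    std::printf("net: %+.0f mW (%s)\n", saved - spent,
                saved > spent ? "worth it under these assumptions"
                              : "not worth it under these assumptions");
    return 0;
}
```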