Recent Posts

Project summary and conclusion

I’m really proud that I got as far as I did this summer. I learned a lot of new things, and even though the pupil detection algorithm is not particularly advanced, I implemented it myself from a paper. Also, the app actually works!

The ingredients are:

  • Apple iPad 10.5” (2017) for running the iOS app and providing high-resolution camera input at 30 fps,
  • Swift, Objective-C and C++ (and Python for experiments in OpenCV outside the app),
  • Apple’s SpriteKit for the UI,
  • Apple’s hardware-based AVCapture for face detection,
  • Dlib for five-point face landmark detection (eye corners) and face alignment normalization,
  • Pupil detection algorithm implemented with OpenCV from the paper Automatic Adaptive Center of Pupil Detection Using Face Detection and CDF Analysis (Asadifard and Shanbezadeh, 2010); a condensed sketch of the idea follows this list,
  • Pearson’s product-moment correlation coefficient for calculating the correlation between pupil and sprite movements (also sketched after this list).

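Two of the ingredients above are compact enough to sketch in code. First, the core idea of the CDF-based pupil detection as I read the paper: build the intensity CDF of the (aligned) eye region, keep only the darkest pixels via an adaptive threshold, and take their centroid as the pupil center. This is a minimal sketch under my reading of the paper, not the full algorithm: the 5 % cut-off stands in for the paper’s adaptive threshold, the refinement steps are omitted, and the nested array is a placeholder for the real camera buffer.

```swift
// Minimal sketch of CDF-based pupil detection on a grayscale eye region.
// `eyeRegion` holds row-major 8-bit intensities; the 0.05 CDF cut-off keeps
// only the darkest ~5 % of pixels, whose centroid approximates the pupil.
func pupilCenter(eyeRegion: [[UInt8]]) -> (x: Double, y: Double)? {
    let pixels = eyeRegion.flatMap { $0 }
    guard !pixels.isEmpty else { return nil }

    // Histogram and cumulative distribution function over intensities 0...255.
    var histogram = [Int](repeating: 0, count: 256)
    for p in pixels { histogram[Int(p)] += 1 }
    var cdf = [Double](repeating: 0, count: 256)
    var running = 0
    for i in 0..<256 {
        running += histogram[i]
        cdf[i] = Double(running) / Double(pixels.count)
    }

    // Adaptive threshold: the lowest intensity where the CDF reaches 5 %.
    guard let threshold = (0..<256).first(where: { cdf[$0] >= 0.05 }) else { return nil }

    // Centroid of all pixels at or below the threshold.
    var sumX = 0.0, sumY = 0.0, count = 0.0
    for (y, row) in eyeRegion.enumerated() {
        for (x, p) in row.enumerated() where Int(p) <= threshold {
            sumX += Double(x); sumY += Double(y); count += 1
        }
    }
    return count > 0 ? (sumX / count, sumY / count) : nil
}
```

Second, Pearson’s coefficient as a plain function over two equal-length series, for example per-frame pupil x-positions against sprite x-positions. The function name and types are illustrative, not the app’s actual code.

```swift
// Pearson's product-moment correlation between two equal-length series.
// Returns a value in [-1, 1], or nil if it is undefined (fewer than two
// samples, or zero variance in either series).
func pearson(_ xs: [Double], _ ys: [Double]) -> Double? {
    guard xs.count == ys.count, xs.count >= 2 else { return nil }
    let n = Double(xs.count)
    let meanX = xs.reduce(0, +) / n
    let meanY = ys.reduce(0, +) / n
    var cov = 0.0, varX = 0.0, varY = 0.0
    for (x, y) in zip(xs, ys) {
        cov += (x - meanX) * (y - meanY)
        varX += (x - meanX) * (x - meanX)
        varY += (y - meanY) * (y - meanY)
    }
    guard varX > 0, varY > 0 else { return nil }
    return cov / (varX * varY).squareRoot()
}
```
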
I would really have liked to do some testing with infants this summer, but the app was far from ready when people with infants were visiting us. I hope that I can make it happen during the autumn.

Here’s the project report.

Future enhancements

There are a lot of things that I would like to do going forward, a few of which are:

  • Release it on the iOS App Store,
  • Let parents use their own photos as moving objects, to investigate which images/things/people their child is drawn to,
  • Add small animations to the moving images, perhaps by slowly toggling between two or three photos,
  • Randomize the pairing of objects and keep a long-term score board, in our case finally settling what we’ve always expected: our cats would win over us, the parents,
  • Adjust for the head tilt angle by combining the corresponding proportions of horizontal and vertical pupil movement. That way, detection should be completely rotation invariant (although vertical smooth pursuit develops later in infants than horizontal pursuit),
  • Make the hearts move faster as the correlation increases. As it is now, the hearts only move when the average correlation over the last 30 frames is at least 0.7, yet I already use the correlation to control the opacity of the hearts, indicating that the user is following the moving objects, so it is unexpected that the hearts can stand still while the indicators are showing (see the sketch after this list),
  • Investigate how the pupil detection works with make-up, different skin tones, eye configurations, etc., starting with verifying that I get roughly the same results on the BioID dataset as the original paper did.
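
The heart-speed item could look roughly like this, assuming the heart runs an SKAction.follow along its arrow and that the rolling 30-frame correlation is already computed elsewhere. The names and the linear mapping are mine, not the app’s current code:

```swift
import SpriteKit

// Sketch: drive a heart's opacity and path-following speed from the rolling
// pupil-to-sprite correlation r (averaged over the last 30 frames).
// `heart` is assumed to be running an SKAction.follow(...) along its arrow.
func updateHeart(_ heart: SKNode, correlation r: Double, threshold: Double = 0.7) {
    // Opacity keeps indicating "the user is following the object", as today.
    heart.alpha = CGFloat(max(0, r))

    // Instead of a hard on/off at 0.7, scale speed continuously above the
    // threshold so the heart visibly moves faster with stronger correlation.
    let excess = max(0, r - threshold) / (1 - threshold)  // 0 at 0.7, 1 at 1.0
    heart.speed = CGFloat(excess)  // SKNode.speed scales the node's actions
}
```

With `speed` at 0 the follow action effectively stands still, so the hearts and the opacity indicator would always agree.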

Also, since I can produce correlations along the vertical axis as well, it would be really fun to pursue the other sketch in my project specification:

Sketch from project specification. Personal image by author. May 2018.

A game based on that could be really fun for toddlers.

A detection working indicator could be useful

For a parent, it is probably useful to know whether the detection is working or not, so I added an emoji-based indicator:

Cropped app screenshot. Personal image by author. August 2018.
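
The mechanics behind it can be sketched in a few lines, assuming an SKLabelNode whose emoji text flips with the detection state. The node, function, and emoji choices here are illustrative, not the app’s actual code:

```swift
import SpriteKit

// Sketch of an emoji-based detection indicator.
let indicator = SKLabelNode(text: "🙈")

// Call this whenever the face/pupil pipeline produces a new result.
func detectionStateChanged(faceFound: Bool, pupilsFound: Bool) {
    switch (faceFound, pupilsFound) {
    case (false, _):    indicator.text = "🙈"  // no face detected
    case (true, false): indicator.text = "🙉"  // face found, but no pupils
    case (true, true):  indicator.text = "🙂"  // detection working
    }
}
```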

Harry-testing the app. Not a complete success

It’s not easy getting a child already used to touch-controlled apps to try an eye-controlled app. Harry can start the app

but not keep his hands away from the interface while it is running

Notice that Harry switches to looking at the top object at the end of the video, and the app picks it up. The correlation is not strong enough to count, though.

There’s a heart indicator showing most of the time during the other part of the video, but Harry is probably too close for the camera to pick up his pupils properly. If the camera loses contact, I should…

TODO Hide heart indicators when face or eye detection fails

to avoid any confusion.

Had he played long enough, one of the hearts (I’m pretty sure which) would have reached the tip of its arrow and the win screen would have shown, e.g.

End screen. Personal illustration by author. August 2018.

Movement along a bezier curve

When looking at the moving objects, I want the hearts at the sides to move along their respective arrows until one heart reaches the tip and “wins”. That is, I want to map a value in [0, 1] to positions on the path, starting at the base and ending at the tip.

Arrow drawn in Sketch.app. Personal illustration by author. June 2018.

However, this proved more difficult than I thought. I figured I would draw a bezier path in Sketch.app and use the same coordinates in a Python program using the Bezier package, which can extract points along a path with arbitrary subdivisions. Only, it turned out that there was a disconnect between how bezier paths are constructed in GUI programs and how they are constructed programmatically. Perhaps I just didn’t look deep enough into the problem. I also tried guessing coordinates for a long time, but getting the loop right was too difficult.

SpriteKit can move sprites along bezier paths, but once started, they move at a set speed, i.e. one can’t step forward arbitrarily along the path. Perhaps I could pause when the pupil-to-object correlation is low and unpause when the correlation is high? Then only the problem of creating a nice bezier curve would remain.
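
The pause/unpause idea could look roughly like this; the control points below are placeholders rather than the actual looped arrow, and the names are mine:

```swift
import UIKit
import SpriteKit

// Placeholder path; the real looped-arrow curve came from PaintCode later.
let path = UIBezierPath()
path.move(to: CGPoint(x: 0, y: 0))
path.addCurve(to: CGPoint(x: 300, y: 0),
              controlPoint1: CGPoint(x: 100, y: 150),
              controlPoint2: CGPoint(x: 200, y: -150))

let heart = SKSpriteNode(imageNamed: "heart")  // asset name is illustrative

// Follow the path at a fixed speed (points per second)...
let follow = SKAction.follow(path.cgPath, asOffset: false,
                             orientToPath: false, speed: 120)
heart.run(follow)

// ...and only advance while the correlation is high enough, by pausing
// and unpausing the node's actions each frame.
func update(correlation r: Double) {
    heart.isPaused = r < 0.7
}
```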

After trying for far too long to recreate the curve with umpteen methods, I found PaintCode, which can give me the Swift code for any bezier path I draw. Easy!

Bezier path along looped arrow, drawn in [PaintCode](https://www.paintcodeapp.com). Personal illustration by author. August 2018.

Rotation invariant pupil detection (well, almost)

The advantage of first transforming the face so that it is level, and then performing eye corner and pupil detection, is that we get a high degree of rotation invariance:

With that said, once the head tilt becomes large enough, the horizontal pupil movements will be too small compared to the noise. Consider a 90 degree tilt: when tracking the moving objects, the pupils will only move vertically in the aligned face frame.

To fix this, one would probably have to adjust for the head tilt angle by combining the corresponding proportions of horizontal and vertical pupil movement.
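
A sketch of what that adjustment could look like: estimate the roll angle from the line between the eye corners, then rotate the raw pupil displacement back by that angle, so that the x-component always measures movement along the inter-eye axis. Function and parameter names are mine, not existing app code.

```swift
import CoreGraphics

// Rotate the raw pupil displacement back by the head's roll angle, so its
// x-component always measures movement along the line between the eyes,
// regardless of head tilt.
func tiltCompensatedMovement(leftEyeCorner: CGPoint,
                             rightEyeCorner: CGPoint,
                             pupilDelta: CGVector) -> CGVector {
    // Roll angle of the inter-eye line (0 when the face is level).
    let roll = atan2(rightEyeCorner.y - leftEyeCorner.y,
                     rightEyeCorner.x - leftEyeCorner.x)

    // Standard 2D rotation by -roll: mixes horizontal and vertical pupil
    // movement in exactly the proportions given by the tilt angle. At a
    // 90 degree tilt, dx picks up the purely vertical pupil movement.
    let dx = pupilDelta.dx * cos(roll) + pupilDelta.dy * sin(roll)
    let dy = -pupilDelta.dx * sin(roll) + pupilDelta.dy * cos(roll)
    return CGVector(dx: dx, dy: dy)
}
```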