gkaguirrelab / transparenttrack

Determine pupil size and gaze location in IR video of the eye

License: MIT License

MATLAB 100.00%
pupil ellipse perspective-projection ray-tracing biometrics eye-tracking

transparenttrack's Introduction

transparentTrack

Code to analyze pupil size and gaze location in IR videos of the eye.

These MATLAB routines are designed to operate upon monocular, infra-red videos of the eye and extract the elliptical boundary of the pupil in the image plane. Absolute pupil size and gaze position are estimated without the need for explicit calibration.

Notably, this software is computationally intensive and is designed to be run off-line upon videos collected during an experimental session. A particular design goal is to provide an accurate fit to the pupil boundary even when it is partially obscured by the eyelid. This circumstance is encountered when the pupil is large, as is seen in data collected under low-light conditions or in people with retinal disease.

The central computation is to fit an ellipse to the pupil boundary in each image frame. This fitting operation is performed iteratively, aided by ever more informed constraints upon the parameters that define the ellipse, including a projection model of the appearance of the pupil on the image plane. A perspective projection of the pupil is calculated using a model eye, which accounts for the refractive properties of the cornea and any lens distortion in the camera. The parameters of this model are informed by normal variation in eye biometrics from the literature, by subject ametropia (or measured axial length), by empirical calibration of the IR camera, by knowledge of corrective lenses worn by the subject during recording, and by an estimation of the scene geometry performed by the routines.
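
For orientation, the sketch below illustrates the standard pinhole mapping from world to image coordinates that underlies such a projection; the camera parameters are made-up placeholders, and this is not the repository's own projection code.

    % A minimal pinhole projection sketch; all values are illustrative
    K = [800 0 320; 0 800 240; 0 0 1];   % camera intrinsic matrix (pixels)
    R = eye(3);                          % camera rotation (identity here)
    t = [0; 0; 120];                     % camera translation (mm)
    worldPoint = [1; 2; 0];              % a point on the pupil boundary (mm)
    p = K * [R, t] * [worldPoint; 1];    % homogeneous image coordinates
    imagePoint = p(1:2) ./ p(3);         % pixel coordinates on the image plane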

At a high level of description, the fitting approach involves:

  • Intensity segmentation to extract the boundary of the pupil. A preliminary circle fit is obtained via the Hough transform within an adaptive size window. In principle, another algorithm (e.g., Starburst) could be substituted here.
  • Initial ellipse fit with minimal parameter constraints. During this initial stage, the pupil boundary is refined through the application of "cuts" of different angles and extent. The minimum cut that provides an acceptable ellipse fit is retained. This step addresses obscuration of the pupil border by the eyelid or by non-uniform IR illumination of the pupil.
  • Estimation of scene parameters. Assuming an underlying physical model for the data, an estimate is made of the extrinsic translation vector and torsion of the camera, and the centers of rotation of the eye. This estimation is guided by a forward, perspective projection model that best accounts for the observed ellipses on the image plane. This step is informed by an empirical measurement of the camera intrinsics and lens distortion parameters.
  • Repeat ellipse fit with scene geometry constraints. Given the scene geometry, the perimeter of the pupil on the image plane is re-fit in terms of explicit values for the rotation of the eye and the radius of the pupil aperture in millimeters.
  • Smooth pupil radius. Pupil radius is expected to change relatively slowly. This physiologic property motivates temporal smoothing of pupil radius using an empirical Bayes approach. A non-causal, empirical prior is constructed for each time point using the (exponentially weighted) pupil radius observed in adjacent time points. The posterior is calculated, and the pupil perimeter in the image is refit using this value as a constraint upon the radius. A sketch of this computation follows the list.
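
Below is a minimal sketch of that smoothing step, assuming a Gaussian prior and likelihood; all variable names and values are illustrative, not the identifiers used by the routines.

    % Empirical Bayes smoothing of pupil radius (illustrative sketch)
    radius   = 3 + 0.1*randn(1, 100);    % per-frame fitted radii (mm), placeholder
    radiusSD = 0.2*ones(1, 100);         % SD of each per-frame fit, placeholder
    tau = 5;                             % decay of the exponential weights (frames)
    nFrames = numel(radius);
    posterior = nan(1, nFrames);
    for tt = 1:nFrames
        % Non-causal, exponentially weighted prior from adjacent time points
        w = exp(-abs((1:nFrames) - tt) / tau);
        w(tt) = 0;                       % the prior excludes the current frame
        priorMean = sum(w .* radius) / sum(w);
        priorVar  = sum(w .* (radius - priorMean).^2) / sum(w);
        likeVar   = radiusSD(tt)^2;
        % Posterior mean of the product of the two Gaussians
        posterior(tt) = (priorMean/priorVar + radius(tt)/likeVar) / ...
            (1/priorVar + 1/likeVar);
    end
    % The pupil perimeter is then refit with posterior(tt) constraining the radius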

There are many parameters that control the behavior of the routines. While the default settings work well for some videos (including those that are part of the sandbox demo included in this repository), other parameter settings may be needed for videos with different qualities.

The MATLAB parpool may be used to speed the analysis on multi-core systems.

To install and configure transparentTrack, first install toolboxToolbox (tBtB), which provides declarative dependency management for MATLAB. Once tBtB is installed, transparentTrack and all of its dependencies will be installed and readied for use with the command tbUse('transparentTrack').

A good place to start is to run the routine DEMO_eyeTracking which is located in the validations directory. This routine will download and process an example eye tracking video. The script includes descriptions of the processing stages and the definition of analysis parameters.
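
In practice, a session might therefore begin with just these two commands (both names appear above; nothing else is assumed):

    tbUse('transparentTrack');   % resolve dependencies via toolboxToolbox
    DEMO_eyeTracking             % download and process an example video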

transparenttrack's People

Contributors: gfrazzetta, gkaguirre, harrison-mcadams, ozenctaskin


transparenttrack's Issues

sceneGeometry structure

  • Re-arrange the structure to place the camera parameters in their own field
  • Add a field to specify whether the model should use ray tracing or not (a sketch follows this list)
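
A sketch of what the re-arranged structure might look like; all field names and values are hypothetical proposals, apart from cameraIntrinsic.matrix, which appears (in its current location) in the error traces below.

    % Hypothetical re-arrangement of sceneGeometry; values are placeholders
    sceneGeometry.camera.intrinsic.matrix = eye(3);             % camera matrix
    sceneGeometry.camera.extrinsic.translation = [0; 0; 120];   % mm
    sceneGeometry.camera.extrinsic.torsion = 0;                 % degrees
    sceneGeometry.useRayTracing = true;                         % proposed on/off flag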

Custom pipeline for video demo

Let's build a custom pipeline option within the runPupilPipeline that creates and saves videos and pupil data files at each stage of processing. This can be used to demo the stages of the analysis.

attempt starburst algorithm

We currently find the initial perimeter of the pupil using an intensity threshold approach. Following this stage, we could insert a step in which the geometric center of the pupil boundary is found, and then used as the starting point for a starburst algorithm search. This might provide us with a better boundary before we start examining cuts.
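
A minimal sketch of the starburst idea, run on a synthetic frame; this is a generic illustration of the ray-casting step, not the algorithm as it would be integrated here, and all names and values are made up.

    % Starburst-style boundary search on a synthetic dark-pupil frame
    [xx, yy] = meshgrid(1:320, 1:240);
    img = uint8(255 * ((xx-160).^2 + (yy-120).^2 > 40^2));   % dark disk on light field
    x0 = 160; y0 = 120;                  % geometric center of the pupil region
    nRays = 36; maxLen = 100;
    boundaryX = nan(1, nRays); boundaryY = nan(1, nRays);
    for rr = 1:nRays
        phi = 2*pi*(rr-1)/nRays;
        steps = 1:maxLen;
        xs = round(x0 + steps*cos(phi));
        ys = round(y0 + steps*sin(phi));
        keep = xs>=1 & xs<=size(img,2) & ys>=1 & ys<=size(img,1);
        xs = xs(keep); ys = ys(keep);
        profile = double(img(sub2ind(size(img), ys, xs)));
        [~, idx] = max(diff(profile));   % strongest dark-to-light transition
        boundaryX(rr) = xs(idx+1);       % boundary point just past the transition
        boundaryY(rr) = ys(idx+1);
    end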

Add visual axis to model

The model eye is based around an optical axis coordinate frame. We need to add to the eye structure a field for the visual axis offset, giving the relative azimuth and elevation of the visual axis with respect to the optical axis.
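
A sketch of the proposed field; the name is hypothetical, and the values are merely illustrative of the magnitude of the human alpha angle.

    % Hypothetical field holding the visual axis offset, in degrees of
    % azimuth and elevation relative to the optical axis
    eye.axes.visual.degRelativeToOpticalAxis = [5, 2];   % illustrative values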

use transparent ellipse params in control file

The routine allows the specification of a pupil perimeter by ellipse parameters in the control file. The code currently reads and writes these in explicit ellipse form. We should change this to the transparent form to maintain consistency.
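
For reference, a sketch of the explicit-to-transparent conversion, assuming the transparent form is [center x, center y, area, eccentricity, theta]; the function name is hypothetical.

    function transParams = hypotheticalEx2Transparent(exParams)
    % Convert explicit ellipse parameters [x, y, semiMajor, semiMinor, theta]
    % to transparent form [x, y, area, eccentricity, theta]
    a = exParams(3); b = exParams(4);
    transParams = [exParams(1), exParams(2), pi*a*b, sqrt(1-(b/a)^2), exParams(5)];
    end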

Invalid eyePose value

Hi,

I have been running a video through the pipeline and have made it past the scene parameter estimation step, but am getting an error in the second fitPupilPerimeter step near the end of the Bayesian smoothing. I'm including a picture with the error. Let me know what other files I should send. Thanks!

[Screenshot: mdp014_smoothpupilradius_error]

Mapping toolbox

Hi,

I'm attempting to run the code but am getting an error that 'extractfield' requires the Mapping Toolbox. Will I have to purchase this toolbox to run the analysis?

Thanks!

Error: Objective Function Undefined at Initial Point

Hi. I'm trying to run the pipeline from beginning to end on one of the videos we recorded in our setup. I was following the code written in the DEMO_eyeTracking, changing paths and such. I didn't make the specific measurements that were in that script for creating the scene geometry, so I just used the default settings instead with the calls:

sceneGeometry = createSceneGeometry();
[cameraDepth, cameraDepthSD] = depthFromIrisDiameter(sceneGeometry, maxIrisDiamPixels);

The pipeline runs very well up until the estimateSceneParams call where the error Objective Function Undefined at Initial Point is thrown. I'm including a screenshot of the entire error output for more information.

[Screenshot: estimatesceneparams_error]

I can upload the eyetracking video file if that would help, but I was wondering if this error may be due to just using the defaults for creating the scene geometry or if there is something else I am missing. Thanks

harmonize convertRawToGray

This routine currently calls the two functions deinterlaceVideo and resizeAndCropVideo. Ideally, these three pieces of code (the wrapper and the two functions it calls) should be integrated into a single routine (with integrated parameters) that is used for video pre-processing. The routine (called perhaps preprocessVideo) could offer options to deinterlace, crop, rotate, etc.

Unable to use estimateSceneParamsGUI, Creating Poor Fit when Using estimateSceneParams

I am currently attempting to use estimateSceneParamsGUI on my own IR video of an eye. I would like to mention that I do not have access to the camera's intrinsic matrix and have been using the same matrix from DEMO_eyeTracking at the moment.

The code currently errors at line 140 of estimateSceneParamsGUI, where it attempts to display the frame to be analyzed; the error is shown below:

Error using  * 
Incorrect dimensions for matrix multiplication. Check that the number of columns in the first matrix matches the number of rows in
the second matrix. To perform elementwise multiplication, use '.*'.

Error in projectToImagePlane (line 58)
projectionMatrix = sceneGeometry.cameraIntrinsic.matrix * cameraExtrinsicMatrix;

Error in projectModelEye (line 289)
    projectToImagePlane(worldPoints,sceneGeometry);

Error in eyePoseEllipseFit (line 256)
    [~, probeCoord0] = projectModelEye(refPose, sceneGeometry, 'cameraTrans', [0;0;0]);

Error in estimateSceneParamsGUI (line 140)
    [eyePose, cameraTrans] = eyePoseEllipseFit(Xp, Yp, glintCoord,
    adjustedSceneGeometry,'eyePoseUB',[89,89,0,5],'cameraTransBounds',p.Results.cameraTransBounds);

Error in jorgeright_transparentTrack (line 107)
sceneObjects = estimateSceneParamsGUI('', 'frameSet', frameSet, 'eyeArgs', eyeArgs, 'sceneArgs', sceneArgs);

This error seems to be occurring because sceneGeometry.cameraIntrinsic.matrix is empty. However, I do not seem to have control over when this scene geometry file is created or when the intrinsic matrix is set. Is there something I'm missing?

I tried working around this error by using estimateSceneParams instead with the pre-made variables from the demo; however, this produces a poor fit, as those values are not matched to my video.

Axial length of the visual vs. optic axis of the eye

The IOL Master is used to measure the axial length of the eye. The subject is asked to fixate a red point, and the axial length is measured along the axis that connects the fovea to that fixation target. Therefore, the device reports the axial length of the eye along the visual, as opposed to the optic, axis of the eye. This will lead to a slight over-estimation of the length of the eye.

We need to understand this more completely and, if necessary, do some quick trigonometry in the modelEyeParameters routine to adjust.

Middle-out search strategy for pupil cuts

Creds to Rob Cooper: we could increase the efficiency of the search across eccentricities for pupil cuts in the makePreliminaryControlFile stage by first testing cuts at 1/2, then 1/4, etc., of the eccentricity distance. This would cut the time, relative to the steady outside-to-inside search, for those frames in which a large degree of cutting is necessary.
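
A sketch of the idea as a bisection over cut extent, assuming (as the steady search does) that larger cuts are more likely to yield an acceptable fit; every name here is a hypothetical stand-in for the existing fit machinery.

    % Middle-out (bisection) search for the minimal acceptable cut
    maxCut = 1; nBisections = 5;         % placeholder search bounds
    acceptableFit = @(c) c >= 0.3;       % stand-in for the ellipse fit test
    lo = 0;                              % no cut
    hi = maxCut;                         % maximal cut, assumed acceptable
    for ii = 1:nBisections
        mid = (lo + hi) / 2;             % test at 1/2, then 1/4 or 3/4, ...
        if acceptableFit(mid)
            hi = mid;                    % acceptable; try a smaller cut
        else
            lo = mid;                    % unacceptable; a larger cut is needed
        end
    end
    minimalCut = hi;                     % smallest cut found to be acceptable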

Cannot Run DEMO_eyeTracking to Completion

Hi,

I am attempting to run DEMO_eyeTracking with all toolboxes fully installed.

Right now, I am running into an error where the code tries to access the GazeCal01_sceneGeometry.mat file in the TOME_processing folder; however, the file does not exist within that folder.

I have been looking through the files to see where this file was supposed to be downloaded; however, I have been unable to locate where it is supposed to be created. Am I missing something?

The error is below:

Error using load
Unable to read file
'C:\Users\opt-omlab\Documents\MATLAB\DEMO_eyeTracking\TOME_processing\session2_spatialStimuli\TOME_3020\050517\EyeTracking\GazeCal01_sceneGeometry.mat'.
No such file or directory.

Error in fitPupilPerimeter (line 155)
    load(p.Results.sceneGeometryFileName,'sceneGeometry');

Error in runVideoPipeline (line 320)
            eval(funCalls{ff});

Error in DEMO_eyeTracking (line 226)
runVideoPipeline( pathParams, ...

Thanks,
Christopher Bain

controlFile is appended instead of over-written

When called with the overwriteControlFile flag, the makePrelimControlFile routine should completely over-write any existing control file. Instead, the newly calculated instructions are appended to the existing control file.
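
A likely shape for the fix is to select the file-open mode from the flag; a minimal sketch, with hypothetical variable names:

    % Honor the overwrite flag when opening the control file
    controlFileName = 'controlFile.csv';     % placeholder path
    overwriteControlFile = true;             % the flag from this issue
    if overwriteControlFile
        fid = fopen(controlFileName, 'w');   % 'w' truncates any existing file
    else
        fid = fopen(controlFileName, 'a');   % 'a' appends (the current behavior)
    end
    fclose(fid);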

Try bads

Try using the BADS optimizer instead of pattern search for estimating camera translation. We might also try it in place of fmincon in the eyePose search for the ellipse fit, or in the inverse pupil projection.
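
A sketch of the swap, assuming the interface [x,fval] = bads(fun,x0,lb,ub,plb,pub) from the BADS toolbox; the objective and bounds are placeholders for the existing camera translation search.

    % Hypothetical drop-in of BADS for the camera translation search
    objFun = @(x) sum(x.^2);                      % placeholder objective
    x0  = [0, 0, 120];                            % initial translation (mm)
    lb  = [-20, -20, 90];   ub  = [20, 20, 200];  % hard bounds
    plb = [-10, -10, 100];  pub = [10, 10, 160];  % plausible bounds
    [xBest, fVal] = bads(objFun, x0, lb, ub, plb, pub);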

Warning IDs

Make sure that all warnings have warning IDs.
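
For example (the identifier below is illustrative, not one used in the code):

    % Warnings with identifiers can be suppressed or trapped by ID
    warning('transparentTrack:noGlintFound', 'No glint was found in this frame.');
    warning('off', 'transparentTrack:noGlintFound');   % callers may then silence it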

camera rotation in z

Need to add the ability to adjust the camera rotation matrix in Z. This is because the eye model now has a specific horizontal and vertical orientation both in eye rotation and in exit pupil ellipticity.

The estimateSceneParams stage can adjust camera rotation in Z by the value theta, with negative values being counter-clockwise, and positive values clockwise. The extrinsicCameraRotation matrix would then take the form:

[cosd(theta) -sind(theta) 0; sind(theta) cosd(theta) 0; 0 0 1];

Save p.Results params for controlFile

When makeControlFile is called, it has a set of params in p.Results. These are not currently saved anywhere. We could write out each field of p.Results as a comment (% ...) at the end of the controlFile.
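
A sketch of the proposed behavior, assuming parameter values that mat2str can serialize; the path and the params struct are stand-ins for the real fid and p.Results.

    % Append each field of p.Results to the control file as a comment line
    fid = fopen('controlFile.csv', 'a');                           % placeholder path
    params = struct('glintThreshold', 0.8, 'radiusDivisions', 5);  % stand-in for p.Results
    fields = fieldnames(params);
    for ff = 1:numel(fields)
        fprintf(fid, '%% %s: %s\n', fields{ff}, mat2str(params.(fields{ff})));
    end
    fclose(fid);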
