Microsoft AR/VR Patent Explores Solution to Myopic Prescription Lenses' Refractive Interference with Eye Tracking
(XR Navigation Network, December 29, 2023) Eye tracking — gaze point detection, eye pose estimation, facial analysis and recognition — has a wide range of applications. The eye tracking feature in an XR headset uses a camera to acquire images from which the movement of the human eye is determined. The eye tracking camera measures the orientation of the eyes to determine the location of the wearer's gaze point in a real or virtual scene, and such measurements can be used to interact with XR applications.
The transparent display may be calibrated during manufacturing to ensure optimal measurement by the sensors, including the eye tracking sensor. A prescription lens can be used with the transparent display, but it can disturb the tracking camera's calibration state and affect the measurement of eye orientation, since light from the eye may be refracted by the prescription lens before reaching the camera.
In the patent application titled "Enhanced eye tracking using lens inverse transform," Microsoft therefore presents a method for correcting the lens's refraction of light by applying an inverse transformation associated with the optical parameters of the lens.
In one embodiment, the inverse transformation may be programmed into the eye tracking software through user input of the prescription settings, or it may be obtained by the eye tracking camera from a marking on the prescription lens.
The system can read and decode the marking to obtain the prescription parameters and use them to access the corresponding inverse transformation for each of the left and right lenses. In one example, the marking may be an infrared (IR) marking that is invisible to humans but readable by an IR-capable camera.
FIG. 1 is a cross-sectional block diagram of a portion of a head-mounted device 100. A frame 110 supports a transparent display 115 and a prescription or vision-correction lens 120 spaced apart from the transparent display 115. The display 115 may be a see-through display for MR or AR, and the head-mounted device 100 allows the user's eye 125 to view a scene through the prescription lens 120 and the see-through display 115.
The eye tracking camera 130 may be supported at the edge of the see-through display 115, such as in or near a nose pad that rests the device 100 on the nose of the wearer. The camera 130 is positioned between the see-through display 115 and the vision-correction lens 120. This placement puts the camera at the edge of the eye's 125 field of view, so that it does not obstruct the view through the lens 120 and the display 115.
In one embodiment, the camera has an objective lens 132 mounted less than 2 centimeters from the vision-correction lens. In one example, the camera 130 may have a field of view of at least 70 degrees.
In one embodiment, the lens 120 includes a marker 135. The marker 135 may be an infrared marker printed with infrared ink that fluoresces in the infrared range, visible to the camera 130 under illumination but not to the human eye. The fluorescence may be excited by infrared light provided by one or more infrared light-emitting diodes 140 located in the frame 110 between the lens 120 and the display 115.
In further examples, the markings may be silk-screened, printed, or otherwise applied to the lens. The location and size of the visible markings 135 may be configured to minimize obstruction of the user's view.
The marker 135 may be a QR code or other software-recognizable marking, and it may be used to identify and obtain an inverse transformation that corrects the camera 130's image of the eye 125, which is formed by light from the eye refracted through the lens 120. The corrected image may then be used for eye tracking.
The QR code may encode information containing the vision-correction parameters used to identify the inverse transformation. In one example, the QR code measures 4 mm x 4 mm.
In one example, the marker encodes the actual prescription, specifying optical correction parameters such as sphere, cylinder, and axis. Each of these parameters causes refraction that effectively shifts where light from the eye lands on the camera sensor. A corresponding inverse transformation may be applied to the sensed image so that it corresponds to an image of the eye as if the light from the eye had not been refracted by the lens.
These parameters may be used to determine the inverse transformation that most accurately converts the image sensed by the camera 130 into a corrected image.
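As background for why each prescription parameter shifts the eye image, Prentice's rule from basic ophthalmic optics relates the prismatic deviation of a ray to the lens power and the distance from the lens's optical center at which the ray crosses it. The sketch below is illustrative optics, not code from the patent:

```python
def prentice_displacement(decentration_cm, power_d):
    """Prismatic deviation (in prism diopters) of a ray crossing a lens
    of `power_d` diopters at `decentration_cm` from its optical center
    (Prentice's rule: P = c * F)."""
    return abs(decentration_cm * power_d)

# A gaze ray crossing a -3.00 D lens 0.5 cm off-center is deviated by
# 1.5 prism diopters, i.e., 1.5 cm of apparent shift per meter.
print(prentice_displacement(0.5, -3.00))  # 1.5
```

Even a modest prescription therefore produces a measurable displacement of the eye image on the sensor, which is what the inverse transformation undoes.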
In one example, each 0.25-diopter increment of sphere power may be associated with a different inverse transformation. Alternatively, each 0.25-diopter increment may be associated with multiple inverse transformations corresponding to different values of the cylinder and axis components. In another example, each 0.50-diopter increment of sphere power may be associated with a different inverse transformation.
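Such an increment-based scheme amounts to quantizing the prescription into discrete lookup keys. A minimal sketch, assuming 0.25 D steps; the function names and table layout are illustrative, not from the patent:

```python
def quantize_diopters(value, step=0.25):
    """Round a diopter value to the nearest `step` increment."""
    return round(value / step) * step

def transform_key(sphere, cylinder=0.0, axis=0):
    """Build a lookup key from quantized prescription parameters."""
    return (quantize_diopters(sphere), quantize_diopters(cylinder), axis)

# The headset would hold a table mapping keys to inverse transforms:
# inverse_transforms = {(-3.0, 0.0, 0): t1, (-3.0, -0.5, 90): t2, ...}
# chosen = inverse_transforms[transform_key(-2.9, -0.4, 90)]
```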
FIG. 2 is a perspective view of a portion of a head-mounted display 200 having a lens 210 and a camera 220 supported by a nose pad 225. The camera 220 may be partially embedded in the nose pad 225 and has a field of view covering the user's eye through the lens 210. The objective lens 222 of the camera 220 is visible.
In one example, the field of view of the camera 220 may be about 70 degrees. In further examples, the field of view is sufficient to capture the eye at all possible eye positions while the head-mounted display 200 is worn, as well as any markers carried by the lens 210. One or more light-emitting diodes 230 may be supported by the head-mounted display 200 to provide infrared illumination of the markers of the lens 210, shown at 135 in FIG. 1.
In another example, the lenses 120 and 210 may be part of separate eyewear worn by the user. In this case, the camera is positioned to capture an infrared marker on the eyewear lens, which may be illuminated by an infrared light-emitting diode.
FIG. 3 is a perspective view of a head-mounted display 300 in the form of eyeglasses. The head-mounted display 300 includes a frame 310 for housing one or more transparent displays and optional vision-correction lenses as described above. The frame 310 has openings 315 and 320 for the user's right eye and left eye, respectively, each configured to house a transparent display.
In further examples, a single continuous transparent display may be supported by the frame 310, rather than two separate displays.
The frame 310 supports one or more light-emitting diodes, indicated at 325, for illuminating the markings of the vision-correction lenses that the frame 310 is configured to hold. In one example, the light-emitting diodes emit infrared light to be reflected by infrared-reflective markings. Component 320 is configured to help support the frame 310 on the user's head; it extends toward the user's ears to hold the head-mounted display 300 in the proper position.
Component 320 may support circuitry 335 for processing an image obtained through the vision-correction lens, with the circuitry applying an inverse transform function to correct the image.
Circuitry 335 also performs eye tracking functions on the corrected image and interfaces with MR, AR, and VR applications. The associated applications may be hosted locally, remotely, or a combination of both. In one example, the circuitry 335 may include memory pre-programmed with the appropriate inverse transformation for the user's prescription.
FIG. 4 is a flowchart of a computer-implemented method 400 for correcting eye tracking images. The method 400 begins with operation 410: receiving, via a camera in a head-mounted display device, an image of the eye through the vision-correction lens.

In operation 420, the inverse transformation corresponding to the vision-correction lens is obtained.

In operation 430, the inverse transformation is applied to the received image to obtain an unrefracted image of the eye, that is, the image that would have been obtained without the vision-correction lens.

At operation 440, eye tracking is performed based on this unrefracted image of the eye.
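The steps of method 400 can be sketched in code, assuming the inverse transformation is represented as a per-pixel coordinate map — one plausible representation; the patent does not specify one, and all names here are illustrative:

```python
import numpy as np

def apply_inverse_transform(image, map_y, map_x):
    """Operation 430 sketch: for each output pixel, sample the refracted
    input image at the location the inverse transform points to."""
    h, w = image.shape
    ys = np.clip(np.rint(map_y).astype(int), 0, h - 1)
    xs = np.clip(np.rint(map_x).astype(int), 0, w - 1)
    return image[ys, xs]

# With an identity map (a plano 0.00 D "lens"), the image is unchanged:
img = np.arange(16, dtype=float).reshape(4, 4)
my, mx = np.meshgrid(np.arange(4), np.arange(4), indexing="ij")
corrected = apply_inverse_transform(img, my, mx)
```

A production implementation would use subpixel interpolation rather than nearest-neighbor rounding, but the structure is the same: the corrected image is then handed to the eye tracking stage.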
FIG. 5 is a flowchart of a computer-implemented method 500 for obtaining correction information from a marker in order to correct an eye tracking image. The correction information may be, or may identify, an inverse transformation function.
The method 500 begins at operation 510 by generating light incident on the vision correction lens. In one example, ambient visible light may be used to provide the incident light. In another example, the light may be infrared light.
In operation 520, information is read from the light-responsive marker of the vision-correction lens as captured in an image. In various examples, the marker may reflect visible or infrared light.

In operation 530, an inverse transformation is identified from the information read from the marker.

In one example, the information includes the prescription of the vision-correction lens. The inverse transformation may then be identified via a lookup table indexed by prescription.
In further examples, the inverse transformation may be obtained from memory associated with the head-mounted display; it may be programmed into the memory, based on the user's prescription, before the device ships to the user. Alternatively, the user may download the appropriate inverse transformation from a database of inverse transformations corresponding to various prescriptions.
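Putting method 500 together, here is a hypothetical sketch in which the decoded marker payload carries the prescription as a delimited string. The payload format, field names, and table layout are all assumptions, not details from the patent:

```python
def parse_prescription(payload):
    """Parse a decoded marker payload such as 'SPH:-3.00;CYL:-0.50;AXIS:90'.
    The field names and separators here are illustrative assumptions."""
    fields = dict(item.split(":") for item in payload.split(";"))
    return {
        "sphere": float(fields["SPH"]),
        "cylinder": float(fields["CYL"]),
        "axis": int(fields["AXIS"]),
    }

def lookup_inverse_transform(table, rx):
    """Operation 530 sketch: identify the inverse transform by prescription."""
    return table[(rx["sphere"], rx["cylinder"], rx["axis"])]

rx = parse_prescription("SPH:-3.00;CYL:-0.50;AXIS:90")
# chosen = lookup_inverse_transform(inverse_transforms, rx)
```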
The Microsoft patent application "Enhanced eye tracking using lens inverse transform" was originally filed in April 2022 and was recently published by the US Patent and Trademark Office.
Generally speaking, a U.S. patent application is automatically published 18 months after its filing date or priority date, or earlier at the applicant's request. Note that publication of a patent application does not mean the patent has been granted. After filing, the application still undergoes substantive examination by the USPTO, which can take anywhere from one to three years.