Excerpts from New AR/VR Patent Applications filed at the USPTO on March 16, 2024

(XR Navigation Network, March 16, 2024) The U.S. Patent and Trademark Office recently published a new batch of AR/VR patent applications. Below is XR Navigation Network's summary of 66 filings (click a patent title for details). For more patent disclosures, visit the patent section of XR Navigation Network at https://patent.nweon.com/, or join the XR Navigation Network AR/VR patent exchange WeChat group (see the end of the article for details).

1. "Meta Patent | Voltage regulator module with shared storage capacitor architecture for depth camera assembly

In one embodiment, the charging circuit includes a voltage regulation module (VRM) and a shared storage capacitor coupled to the VRM, with the VRM configured to generate a regulated voltage. The shared storage capacitor is charged with the regulated voltage during a non-exposure window of the depth camera assembly (DCA), which includes the illuminator and the sensor array. The shared storage capacitor then powers the illuminator and the sensor array during the exposure window of the DCA, in which the illuminator emits light into the local area and the sensor array detects the light.

2. "Meta Patent | Interactive avatars in artificial reality

In one embodiment, once placed, the avatar can interact with the environment based on scene cues and rules without active control by its owner. The interactive avatar system can be configured by the owner with action rules, visual elements, and settings. Once configured and fixed to a position by the owner, the centralized system can make the avatar (and its configuration) available to another XR device when that device is at that position, allowing the user of the other XR device to discover and interact with the avatar based on the configuration established by the avatar's owner.

3.《Meta Patent | Body pose estimation using self-tracked controllers

In one embodiment, the computing system may determine a pose of a device held by or attached to a user's hand based on sensor data captured by the device. The system may determine a pose of a headset worn by the user based on sensor data captured by the headset. The system may determine locations of a first set of key points associated with a first portion of the user's body based on one or more first images captured by one or more cameras of the device, the pose of the device, one or more second images captured by one or more cameras of the headset, and the pose of the headset. The system may then determine the pose of the user's body based on at least the locations of the first set of key points.
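
As a rough illustration of the fusion described above, the following Python sketch transforms key points observed in the controller and headset frames into a common world frame using the two device poses; the transform convention, the placeholder solver, and the example values are assumptions for illustration, not Meta's implementation.

```python
# Minimal sketch of fusing self-tracked controller and headset data for body pose.
# The "solver" here is a placeholder; the patent does not specify one.
import numpy as np

def to_world(points_local: np.ndarray, pose: np.ndarray) -> np.ndarray:
    """Transform Nx3 points from a device frame to the world frame.
    `pose` is a 4x4 rigid transform (device -> world)."""
    homogeneous = np.hstack([points_local, np.ones((points_local.shape[0], 1))])
    return (pose @ homogeneous.T).T[:, :3]

def estimate_body_pose(controller_pose, headset_pose,
                       keypoints_from_controller_cam, keypoints_from_headset_cam):
    """Merge key points observed by controller and headset cameras (given in each
    device's local frame) into one world-frame set, then fit a body pose."""
    world_keypoints = np.vstack([
        to_world(keypoints_from_controller_cam, controller_pose),
        to_world(keypoints_from_headset_cam, headset_pose),
    ])
    # Placeholder "solver": return the centroid and the raw key points.
    return {"root": world_keypoints.mean(axis=0), "keypoints": world_keypoints}

if __name__ == "__main__":
    identity = np.eye(4)
    controller = identity.copy(); controller[:3, 3] = [0.3, 1.0, 0.0]   # hand height
    headset = identity.copy();    headset[:3, 3] = [0.0, 1.7, 0.0]      # head height
    kps_ctrl = np.array([[0.0, 0.0, 0.1]])   # e.g. a wrist key point seen by the controller
    kps_head = np.array([[0.0, -0.5, 0.3]])  # e.g. a torso key point seen by the headset
    print(estimate_body_pose(controller, headset, kps_ctrl, kps_head))
```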

4.《Meta Patent | Gradient-index liquid crystal lens having lens segments with optical power gradient

In one embodiment, the lens system described in the patent may include a lens having a first electrode layer, a second electrode layer, and a liquid crystal layer disposed between the first electrode layer and the second electrode layer. The lens may be divided into a plurality of lens segments arranged concentrically from the center of the lens to its periphery. A transmittance of a first lens segment among the plurality of lens segments may be greater than a transmittance of a second lens segment disposed radially outward of the first lens segment.

5.《Meta Patent | Display backplanes with integrated electronics, photonics and color conversion

In one embodiment, the device described in the patent may include a backlight unit (BLU) comprising an electronic integrated circuit layer, a photonic integrated circuit layer, a color conversion module, and a display interface layer. The BLU may include at least one laser, or may be configured to receive laser light from at least one external laser source. The laser light may be transmitted toward a portion of the display interface layer using the photonic integrated circuit, and the color conversion module may be used to convert the laser light into one or more desired colors.

6.《Meta Patent | System and method for fabricating polarization holograms

In one embodiment, the patent describes a system for generating a polarized interference pattern. The system includes a light source configured to output a first beam having a predetermined wavelength; a transmission polarization volume hologram ("PVH") mask configured to provide a predetermined diffraction efficiency to a second beam having the predetermined wavelength, circular polarization, and a non-zero angle of incidence at the transmission PVH mask; and an optical deflection element provided between the light source and the transmission PVH mask and configured to deflect the first beam toward the transmission PVH mask as the second beam.

7.《Meta Patent | Optical quality pvdf having enhanced piezoelectric response

In one embodiment, a mechanically and piezoelectrically anisotropic polymer article is formed from a crystallizable fluoropolymer and a nucleating agent. The polymer article may be, for example, a film or a fiber. The crystalline phase may constitute at least about 50% of the polymer article. In one embodiment, the fluoropolymer may comprise vinylidene fluoride, vinylidene trifluoride, vinylidene chlorotrifluoride, hexafluoropropylene, and vinylidene fluoride. The polymer article may include up to about 10 wt% of the nucleating agent. Such polymer articles are optically transparent, with an elastic modulus of at least about 3 GPa and an electromechanical coupling factor of at least about 0.15.

8. "Meta Patent | Handheld controller with hand detection sensors

In one embodiment, the handheld controller described in the patent is configured to be held by a user's hand. The handheld controller includes a body and a handle extending from the body, the handle having a palm side and a finger side. A control button is positioned on the body or the handle, and a detection sensor is positioned in the handle. The detection sensor is positioned to detect whether a finger or palm of the user's hand engages the handle, and may be a capacitive touch sensor, a proximity sensor, or another sensor operable to detect the touch of the user's hand or fingers.

9. "Microsoft Patent | Optical array panel translation

In one embodiment, the head-mounted display device described in the patent includes a display panel for emitting display light. An optical array panel is positioned along the optical path of the display light emitted by the display panel and is configured to redirect the display light toward an eyebox. An eye-tracking system estimates the current pupil position of the user's eye relative to the wearable display device. An actuator translates the position of the optical array panel relative to the display panel to move the position of the eyebox toward the current pupil position of the user's eye.

10.《Microsoft Patent | Near-eye display systems utilizing an array of projectors

In one embodiment, the array of projectors may be arranged along a first dimension and may output image light toward an in-coupler within the waveguide, the waveguide providing one-dimensional exit-pupil expansion. The array of monochromatic projectors is arranged in offset columns. The in-coupler couples the image light from the array of projectors into a TIR path within the waveguide. Different optical elements, including diffractive optics and reflective optics, can be realized as couplers. The image light propagates within the waveguide until it interacts with an out-coupler, at which point the image light expands in a second dimension transverse to the first dimension and is coupled out of the waveguide.

11.《Apple Patent | Locating content in an environment

In one embodiment, the method described in the patent includes obtaining, using a camera of an electronic device, an image of a machine-readable data representation located on a physical object, the machine-readable data representation including an encoded form of a data value; decoding the machine-readable data representation to determine the data value; selecting a content source based on a content source identifier; obtaining a content item and content location information from the content source based on a content identifier; determining a content location and a content orientation of the content item with respect to the physical object based on the content location information; and displaying a representation of the content item using the electronic device based on the content location and the content orientation.
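
The decode, select source, fetch, and place pipeline summarized above can be sketched as follows; the helper names, the data-value format, and the content catalog are hypothetical placeholders rather than Apple's API.

```python
# Sketch of the decode -> select source -> fetch -> place pipeline.
# All names and the data-value layout below are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class ContentPlacement:
    position: tuple     # content location relative to the physical object
    orientation: tuple  # content orientation relative to the physical object

CONTENT_SOURCES = {
    "example-source": {
        "poster-42": {"item": "movie_poster_overlay",
                      "placement": ContentPlacement((0.0, 0.1, 0.0), (0, 0, 0, 1))},
    },
}

def locate_content(data_value: str):
    # Assume the decoded data value carries both identifiers, e.g. "source:content".
    source_id, content_id = data_value.split(":", 1)
    source = CONTENT_SOURCES[source_id]        # select a content source
    entry = source[content_id]                 # obtain content item + location info
    return entry["item"], entry["placement"]   # display at this pose on the object

if __name__ == "__main__":
    item, placement = locate_content("example-source:poster-42")
    print(item, placement)
```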

12.《Apple Patent | Head-mounted device with spatially aware camera adjustments

In one embodiment, the head-mounted device may include one or more sensors for capturing images of the real-world environment and for determining a location and facing direction of the head-mounted device within the real-world environment. The sensors may be used to capture images for creating an illumination map indicating where one or more light sources are located within the environment, and for creating a physical map representing the geometry of the environment. The sensors may also be used to predict a direction the head-mounted device will face in the real-world environment. One of the sensors may include an image sensor for capturing images to be displayed to a user of the head-mounted device. One or more settings of the image sensor may be adjusted based on the illumination map and on the predicted direction the head-mounted device will face when capturing future frames.

13.《Apple Patent | Notification of augmented reality content on an electronic device

In one embodiment, the electronic device detects that playback of the content has reached a respective playback position. In response to detecting that playback has reached the respective playback position, and in accordance with a determination that the respective playback position is associated with augmented reality content corresponding to the content, the electronic device provides a notification corresponding to that augmented reality content.
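
A minimal sketch of the trigger logic described above, assuming a simple table mapping playback positions to AR content identifiers (the table, tolerance, and identifiers are illustrative, not from the patent):

```python
# Minimal sketch: notify when playback reaches a position that has associated AR content.
AR_CONTENT_BY_POSITION = {  # playback position (seconds) -> AR content identifier
    125.0: "ar_scene_dragon",
    310.0: "ar_scene_map",
}
TOLERANCE = 0.5  # seconds

def on_playback_tick(position_s: float, notify):
    for trigger, content_id in AR_CONTENT_BY_POSITION.items():
        if abs(position_s - trigger) <= TOLERANCE:
            notify(f"Augmented reality content available: {content_id}")

if __name__ == "__main__":
    on_playback_tick(125.2, print)   # -> notification for ar_scene_dragon
    on_playback_tick(200.0, print)   # -> no notification
```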

14.《Apple Patent | Head-mounted electronic device with adjustable frame rate

In one embodiment, the head-mounted device described in the patent includes one or more displays configured to present media content, which may be presented in a user interface window. The head-mounted device may include a display controller configured to adjust a frame rate of the one or more displays, and frame rate management circuitry. The frame rate management circuitry is configured to determine whether to adjust the frame rate of the one or more displays based on the type of media content presented in the user interface window and on additional information such as a preferred frame rate associated with the media content, the size of the user interface window relative to the total display area of the one or more displays, gaze point data, gesture data, head pose data, data associated with other body parts, and audio information.

15.《Apple Patent | Methods for depth conflict mitigation in a three-dimensional environment

In one embodiment, a computer system facilitates depth conflict mitigation for a virtual object in contact with one or more physical objects in a three-dimensional environment by altering the visual characteristics of one or more portions of the virtual object.

16.《Apple Patent | Dynamic foveated pipeline

In one embodiment, the method described in the patent includes receiving a warped image representing simulated reality (SR) content. The warped image has a plurality of pixels at locations uniformly spaced in a grid pattern in warp space, and the pixels are associated with a plurality of corresponding pixel values and a plurality of corresponding scaling factors, the scaling factors indicating respective resolutions at corresponding locations of the SR content. The method further comprises processing the warped image in warp space based on the plurality of corresponding scaling factors to generate a processed warped image, and transmitting the processed warped image.
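
To make the role of the per-pixel scaling factors concrete, here is a minimal sketch that processes a warped image in warp space; the choice of a neighborhood average whose radius grows with the scaling factor is an illustrative assumption, not the patent's actual pipeline.

```python
# Sketch: process a warped image in warp space using per-pixel scaling factors.
# Here the "processing" is an illustrative averaging filter whose radius grows where
# the scaling factor indicates lower resolution.
import numpy as np

def process_warped(warped: np.ndarray, scale: np.ndarray) -> np.ndarray:
    """warped: HxW pixel values; scale: HxW scaling factors (1 = full resolution)."""
    out = np.empty_like(warped, dtype=float)
    h, w = warped.shape
    for y in range(h):
        for x in range(w):
            r = int(round(scale[y, x])) - 1          # larger factor -> larger neighborhood
            y0, y1 = max(0, y - r), min(h, y + r + 1)
            x0, x1 = max(0, x - r), min(w, x + r + 1)
            out[y, x] = warped[y0:y1, x0:x1].mean()  # average over the neighborhood
    return out

if __name__ == "__main__":
    image = np.arange(36, dtype=float).reshape(6, 6)
    scale = np.ones((6, 6)); scale[:, 3:] = 3        # right half rendered at 1/3 resolution
    print(process_warped(image, scale))
```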

17.《Apple Patent | Method of manipulating user interfaces in an environment

In one embodiment, a method for displaying and manipulating user interfaces in a computer-generated environment provides an efficient and intuitive user experience. In one embodiment, user interfaces may be grouped into a container, and a user interface that is a member of a container may be manipulated. In an embodiment, manipulating a user interface that is a member of a container can result in manipulating other user interfaces in the same container. In an embodiment, manipulating a user interface in a container can cause the user interface to move in one or more directions and/or rotate about one or more axes.

18.《Apple Patent | Method of grouping user interfaces in an environment

In one embodiment, the user interface may be grouped into a container. In an embodiment, the user interface may be added to the container, removed from the container, or moved from one location in the container to another location. In one embodiment, a visual indication is displayed prior to adding the user interface to the container. In one embodiment, the user interface may replace an existing user interface in the container. In one embodiment, a transparency of an obscured user interface may be modified when the user interface is moved in a computer-generated environment.

19.《Apple Patent | Environmentally aware gestures

In one embodiment, a method of presenting a scene is performed at a device comprising a display, one or more processors, and non-transitory memory. The method includes displaying, on the display, a virtual character associated with a physical environment, the virtual character being located at a character position in a three-dimensional coordinate system of the physical environment. The method further includes determining an object position of an object in the three-dimensional coordinate system of the physical environment and displaying, on the display, the virtual character at the character position performing a gesture based on the object position.

20.《Apple Patent | Face engaging structure

In one embodiment, the device described in the patent may include a display, a face interface, and a connector between the display and the face interface. The face interface may be at least one of translatable or rotatable relative to the display via the connector.

21.《Apple Patent | Optical systems with lens-based static foveation

In one embodiment, the electronic device described in the patent may include a display module that generates light containing an image, a lens that directs the light to a waveguide, and a waveguide that directs the light to a viewport. The lens may produce a foveated image in the light by applying a non-uniform magnification to the image, where the non-uniform magnification varies as a function of angle within the field of view of the lens. This allows the foveated image to have higher resolution in the central region than in the peripheral region. A control circuit of the device may apply pre-distortion to the image.

22.《Apple Patent | Electronic devices with gaze trackers

In one embodiment, the head-mounted device may include a housing having an opening for receiving a lens. A display may output an image, and a waveguide overlapping the lens may receive the image from the display and direct it to a viewport aligned with the lens. An infrared light source, such as an infrared light-emitting diode or a laser, may provide infrared light to the waveguide. Each waveguide may have a plurality of local couplers overlapping the lens, and the local couplers of each lens each direct a beam of infrared light from the waveguide toward an eye surface in the viewport associated with that lens to produce a glint. A gaze-tracking infrared camera may capture images of the glints to determine the user's gaze point.

23.《Apple Patent | System for determining position both indoor and outdoor

In one embodiment, sensors and captured images are used to construct an environmental characteristics map that provides reference information in the form of environmental characteristics associated with a location. The environmental characteristics map may be used online (e.g., in real time) to determine the location of the device. In one embodiment, the environmental characteristics map provides a coarse location.

24.《Apple Patent | Head-mountable device band with a latch movable by a tab

In one embodiment, the band described in the patent includes a latch body that is movable by a pull tab. The latch body extends through an opening in the band. The band may be detached from the head-mountable device by pulling the pull tab, which actuates the latch body and disengages it from a pin. A spring (or several springs) may bias the latch body toward the opening so that the latch body engages the pin. A force applied to the tab can overcome the biasing force provided by the spring, moving the latch body away from the pin and releasing the pin.

25.《Samsung Patent | Electronic device and method of providing content sharing based on object

In one embodiment, the patent describes a server for providing content sharing based on an object. The server is configured to establish communication between a first electronic device of a user and a second electronic device of another user, and to provide a virtual space and an object in the virtual space to the first electronic device and the second electronic device. Based on input from the user, the server selects at least one piece of content to be shared through the object in the virtual space with the other user when the second electronic device enters the same virtual space as the first electronic device. The server activates the object to output the selected content identically to each electronic device that enters the virtual space. Based on the second electronic device leaving the virtual space, the server stops providing the shared content.

26.《Samsung Patent | Method and system for optimizing virtual behavior of participant in metaverse

In one embodiment, the patent describes an electronic device that presents the virtual behavior of a participant in a metaverse. The electronic device determines a scenario of the metaverse, including a person meeting with the participant. The electronic device determines the real-world behavior of the participant while immersed in the metaverse, generates the virtual behavior of the participant based on the scenario of the metaverse and the participant's real-world behavior while immersed in the metaverse, and presents an avatar of the participant exhibiting the participant's virtual behavior.

27.《Samsung Patent | System and method for measuring depth of stereoscopic image

In one embodiment, a system for measuring a depth of a stereoscopic image comprises a display device; a holographic camera that generates an interference pattern image by sensing the wavelength and phase of the light of the stereoscopic image; and a control unit that calculates, based on the wavelength and phase of the light, a plurality of modulated image data containing image information of the stereoscopic image at each of a plurality of depths, detects edges in each of the plurality of modulated image data to obtain edge detection values, calculates a first maximum value of the modulation signal by arranging the edge detection values according to the depth of each of the plurality of modulated image data, and determines a first depth corresponding to the first maximum value as the depth of the stereoscopic image.

28.《Samsung Patent | Electronic device and method for driving display thereof

In one embodiment, the electronic device described in the patent may include a display module for displaying image data and a processor operably coupled to the display module to provide the image data to the display module. The display module may include a display panel and display driver circuitry, the display panel including a plurality of pixel lines each comprising a plurality of pixels, and the display driver circuitry driving the plurality of pixels of the display panel.

29.《Samsung Patent | Method and system for generating augmented reality content using ar/vr devices

In one embodiment, the patent describes a method comprising: receiving a plurality of image frames of at least one scene captured by a plurality of participant devices in an AR or VR environment; storing the plurality of image frames and associated metadata in a database; receiving an AR content generation request for generating a user's AR content view in the AR/VR environment; retrieving a set of image frames from the plurality of stored image frames in the database based on the user's ID, information about the at least one scene, and metadata associated with the set of image frames; and generating the user's AR content view by combining the set of image frames retrieved from the database.

30.《Samsung Patent | Method and apparatus for determining persona of avatar object in virtual space

In one embodiment, the example electronic device described in the patent may comprise a display module, a memory configured to store computer-executable instructions, and a processor configured to execute the instructions by accessing the memory. The processor may be configured to create a preliminary persona based on first information related to a history of a target user; create a first persona from the preliminary persona based on second information related to the time and space of the target user; apply the first persona to an avatar object that corresponds to the target user in a virtual space; and, upon detecting the occurrence of an event related to the target user, create a second persona from the first persona based on the event and apply the second persona to the avatar object in place of the first persona.

31.《Google Patent | Dynamic display alignment with left and right image overlay

In one embodiment, the method described in the patent comprises providing a coupling element at the nosepiece, the coupling element being able to superimpose a left image and a right image output from respective output couplers and send the superimposed image to a sensor. Based on at least a portion of the superimposed image, the left field of view, the right field of view, or both may be moved until the left image and the right image are aligned.

32.《Google Patent | Color and infra-red three-dimensional reconstruction using implicit radiance functions

In one embodiment, an image is rendered based on a neural radiance field (NeRF) volumetric representation of a scene, wherein the NeRF representation is based on captured frames of video data, each frame comprising a color image of the scene, a wide-field IR image, and a plurality of depth IR images. Each depth IR image is captured while the scene is illuminated by a different dot pattern of IR light, with each pattern's illumination occurring at a different time. The NeRF represents a mapping from a provided position and viewing direction to the color and optical density at each position in the scene, and also maps the provided position and viewing direction to the IR value for each of the different IR dot patterns from a new viewpoint.

33.《Google Patent | Residual layer thickness modulation in nanoimprint lithography

In one embodiment, the patent describes an improved nanoimprint lithography process. In one embodiment, the thickness of the residual layer of resin remaining after the nanoimprint mold is released from the resin layer is controlled by UV curing. Alternatively, the thickness of the residual layer can be controlled by a fill factor of the nanoimprint mold that transfers its pattern to the resin layer disposed on the substrate, or by the resin droplets in the resin layer.

34.《Sony Patent | Hologram recording medium, hologram optical element, optical device, optical component, and method for forming hologram diffraction grating

In one embodiment, the patent describes a hologram recording medium that includes a protective layer and a photosensitive layer. An initial maximum load on the protective layer measured in a tensile test is 3 N or greater and 1000 N or less. The photosensitive layer contains a polymerizable compound and a polymerization initiator, and the polymerization initiator contains an electron-donating initiator and an electron-accepting initiator.

35.《Sony Patent | Systems and methods of protecting personal space in multi-user virtual environment

A method for protecting personal space in a multi-user virtual environment comprises generating an avatar for a target user in the multi-user virtual environment; determining a relationship score between the target user and a peer user; creating a personal space around the avatar of the target user, wherein the dimensions of the personal space are computed on the basis of the relationship score with the peer user; detecting when the peer user's avatar crosses the boundary of the personal space; and applying rules to the peer user to restrict his or her interaction with the target user.
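
A minimal sketch of this flow, assuming a simple linear mapping from relationship score to personal-space radius and a placeholder restriction rule (both assumptions, not Sony's implementation):

```python
# Sketch of the personal-space flow: score -> radius -> boundary check -> restriction.
import math

def personal_space_radius(relationship_score: float,
                          min_radius: float = 0.5, max_radius: float = 3.0) -> float:
    """Higher relationship score (0..1) -> smaller personal space."""
    score = min(max(relationship_score, 0.0), 1.0)
    return max_radius - score * (max_radius - min_radius)

def restrict_if_intruding(target_pos, peer_pos, relationship_score, apply_rule):
    radius = personal_space_radius(relationship_score)
    distance = math.dist(target_pos, peer_pos)
    if distance < radius:                      # peer avatar crossed the boundary
        apply_rule("mute_peer_audio_and_block_contact")

if __name__ == "__main__":
    restrict_if_intruding((0, 0, 0), (0.8, 0, 0), relationship_score=0.2, apply_rule=print)
```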

36.《Sony Patent | Image displaying system, display apparatus, and image displaying method

In one embodiment, a state information acquisition portion of the image generation apparatus described in the patent acquires state information of a user's head. An image generation portion generates a display image corresponding to the field of view. A downsampling portion downsamples the image data, which is then transmitted from a transmission portion. After an upsampling portion of the head-mounted display upsamples the data, a distortion correction portion performs correction for each primary color according to the aberration of the eyepieces and causes the resulting data to be displayed on a display portion.

37.《Sony Patent | Server device and network control method

In one embodiment, the live entertainment agent filters, based on determined filtering conditions, first object data (object data of a first object) and second object data (object data of a second object), where the first object and the second object are to be displayed in the virtual space, and sends the filtered object data to the presentation server.

38.《Sony Patent | Tracking and processing historical data of a non-fungible token based digital asset

In one embodiment, the method described in the patent includes generating an NFT for a game asset, monitoring the use of the game asset in a video game during gameplay to identify qualifying events that occur while the game asset is in use, and updating the NFT with metadata associated with the qualifying events.
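
The monitor-and-update loop described above might look roughly like the following sketch; the event schema, the set of qualifying event types, and the metadata layout are assumed for illustration.

```python
# Sketch: watch gameplay events for a game asset and append qualifying events
# to the NFT's metadata.
from dataclasses import dataclass, field

@dataclass
class GameAssetNFT:
    token_id: str
    metadata: dict = field(default_factory=lambda: {"history": []})

QUALIFYING_EVENT_TYPES = {"boss_defeated", "tournament_win"}  # assumed examples

def track_gameplay(nft: GameAssetNFT, events):
    for event in events:
        if event["type"] in QUALIFYING_EVENT_TYPES:           # qualifying event
            nft.metadata["history"].append(event)             # update NFT metadata

if __name__ == "__main__":
    nft = GameAssetNFT(token_id="sword-001")
    track_gameplay(nft, [{"type": "boss_defeated", "at": "2024-03-16T10:00:00Z"},
                         {"type": "jump", "at": "2024-03-16T10:01:00Z"}])
    print(nft.metadata)
```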

39.《Qualcomm Patent | Connected mode discontinuous reception settings for periodic traffic with jitter

In one embodiment, a network entity may account for jitter in communications with a user equipment (UE) by adjusting a connected mode discontinuous reception (CDRX) configuration parameter of the UE based on an estimated downlink traffic arrival time. For a downlink traffic burst, the network entity may estimate a traffic arrival offset based on the traffic periodicity, an estimated arrival time associated with one or more packets of the traffic burst, and at least one jitter parameter, where the jitter parameter may represent uncertainty in the arrival time of the traffic burst. The network entity may select a CDRX offset value based on the estimated traffic arrival offset and may, for example, send a message indicating the CDRX offset value as part of the CDRX configuration.
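
As a rough illustration of offset selection under jitter, the sketch below estimates the arrival phase of periodic downlink traffic and backs it off by a jitter margin; the phase-averaging and margin policy are illustrative assumptions, not Qualcomm's algorithm.

```python
# Sketch: estimate a downlink traffic arrival offset and pick a CDRX "on duration"
# start offset that absorbs jitter.
from statistics import mean

def select_cdrx_offset(packet_arrivals_ms, period_ms, jitter_ms):
    """packet_arrivals_ms: observed arrival times of packets in recent bursts."""
    # Estimated arrival phase within the traffic period (simple mean; assumes the
    # observed phases do not straddle the period boundary).
    phases = [t % period_ms for t in packet_arrivals_ms]
    estimated_offset = mean(phases)
    # Start the CDRX on-duration one jitter bound early so early packets are caught.
    return (estimated_offset - jitter_ms) % period_ms

if __name__ == "__main__":
    arrivals = [8.2, 24.6, 41.5, 58.4]      # ms, roughly 16.7 ms periodic video traffic
    print(select_cdrx_offset(arrivals, period_ms=16.7, jitter_ms=2.0))
```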

40.《Qualcomm Patent | Predicting thermal states in connected devices to provide edge processing

In one embodiment, a method executed in a wearable device that receives data from an edge server may include obtaining a plurality of temperature measurements from a plurality of hardware components, sending the temperature measurements to the edge server, receiving instructions related to operations of an application executing at a processor of the wearable device, and adjusting parameters of those operations based on the received instructions. The corresponding method performed in the edge server includes receiving the temperature measurements from the wearable device, identifying one or more adjustments to the operations of the plurality of hardware components based on the plurality of temperature measurements, and sending instructions to the wearable device to make the adjustments.
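
A minimal sketch of the wearable/edge split described above, with assumed temperature thresholds and adjustment parameters standing in for whatever policy the patent contemplates:

```python
# Sketch: the wearable reports component temperatures, the edge server returns
# adjustments, and the wearable applies them to its application settings.
def edge_decide_adjustments(temperatures_c: dict) -> dict:
    """temperatures_c: e.g. {"soc": 41.0, "display": 38.5} -> adjustment instructions."""
    instructions = {}
    if temperatures_c.get("soc", 0.0) > 40.0:
        instructions["render_scale"] = 0.75        # reduce on-device rendering load
    if temperatures_c.get("display", 0.0) > 42.0:
        instructions["max_brightness"] = 0.6
    return instructions

def wearable_apply(instructions: dict, app_settings: dict) -> dict:
    app_settings.update(instructions)              # adjust application parameters
    return app_settings

if __name__ == "__main__":
    measured = {"soc": 41.2, "display": 39.0}
    print(wearable_apply(edge_decide_adjustments(measured), {"render_scale": 1.0}))
```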

41.《Qualcomm Patent | Systems and methods of three-dimensional modeling based on object tracking

In one embodiment, the imaging system determines a movement path of an indicator object relative to a target object based on image data of the scene received from an image sensor, with the target object represented in the image data from a first viewpoint. The imaging system identifies a contour of at least a portion of the target object based on the movement path of the indicator object and the first viewpoint, and generates a 3D mesh based on that contour. The imaging system may generate the mesh by adding a volume based on the contour to a prior mesh or by subtracting the volume from the prior mesh, and may generate a texture for the mesh based on the target object in the image data.

42.《HTC Patent | Remote-control system, remote-controller, and remote-control method

In one embodiment, the remote-control method described in the patent comprises: acquiring environmental image data with an image acquisition device of the remote controller; constructing a map from the environmental image data using a SLAM algorithm and acquiring, by the remote controller, first position information of the first display in the map based on the environmental image data; and receiving the first position information from the remote controller and controlling the first display by a computing device based on the first position information.

43.《Snap Patent | Augmented reality guidance that generates guidance markers

In one embodiment, the patent describes an eyewear device including a display system and a position detection system. The device generates augmented reality guidance by monitoring a current position of the eyewear device within the environment, identifying a marker position within a threshold of the current position, the marker position being defined relative to the environment and associated with a guide marker, registering the marker position, generating an overlay image including the guide marker, and presenting the overlay image on a display of the eyewear device.

44.《Snap Patent | Deforming real-world object using image warping

In one embodiment, the system described in the patent receives an image including a depiction of a real-world object, applies a machine learning model to the image to generate a distortion field and a segmentation mask, and applies the generated distortion field and segmentation mask to the image to deform the real-world object depicted in the image into a target shape.
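
The apply step can be illustrated with a short sketch that warps only the masked object pixels using a displacement field; the nearest-neighbour sampling and the toy flow field are simplifying assumptions, not Snap's model output.

```python
# Sketch: apply a predicted distortion (displacement) field only where the
# segmentation mask marks the object. A real system would interpolate samples.
import numpy as np

def warp_object(image: np.ndarray, flow: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """image: HxW, flow: HxWx2 (dy, dx) sampling offsets, mask: HxW bool."""
    h, w = image.shape
    ys, xs = np.mgrid[0:h, 0:w]
    src_y = np.clip(np.round(ys + flow[..., 0]).astype(int), 0, h - 1)
    src_x = np.clip(np.round(xs + flow[..., 1]).astype(int), 0, w - 1)
    warped = image[src_y, src_x]
    return np.where(mask, warped, image)   # only the masked object is deformed

if __name__ == "__main__":
    img = np.arange(25, dtype=float).reshape(5, 5)
    flow = np.zeros((5, 5, 2))
    flow[..., 1] = 1.0   # each masked pixel samples from one column to its right
    mask = np.zeros((5, 5), dtype=bool); mask[1:4, 1:4] = True
    print(warp_object(img, flow, mask))
```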

45.《Snap Patent | Context-sensitive remote eyewear controller

In one embodiment, the patent describes a context-sensitive remote control for use with an electronic device, such as an eyewear device, that is configured to perform activities. The context-sensitive remote control includes a display having a display area, a display driver coupled to the display, and a transceiver. The remote control also includes memory storing controller layout configurations for presentation by the display driver in the display area of the display. A processor in the context-sensitive remote control is configured to establish communication with the electronic device via the transceiver, detect an activity currently being performed by the electronic device, select one of the controller layout configurations in response to the detected activity, and present, via the display driver, the selected controller layout configuration in the display area of the display.

46.《Snap Patent | Carry case for rechargeable eyeglasses devices

In one embodiment, a carrying case for an electronic eyewear device, such as smart glasses, has a charging contact. The charging contact charges a battery of the eyewear device via contact coupling of the charging contact to a corresponding contact structure of the eyewear device. The contact structure of the eyewear device is provided by a hinge assembly that couples each side support to the frame of the eyewear device.

47.《Magic Leap Patent | Method and system for integration of refractive optics with a diffractive eyepiece waveguide display

In one embodiment, the patent describes an eyepiece waveguide comprising a group of waveguide layers having a world side and a user side. The eyepiece waveguide includes both a first cover plate having a first optical magnification and arranged near the world side of the set of waveguide layers, and a second cover plate having a second optical magnification and arranged near the user side of the set of waveguide layers.

48.《Magic Leap Patent | Eyepieces for use in wearable display systems

In one embodiment, the head-mounted display device comprises a light projector and an eyepiece. The eyepiece is arranged to receive light from the light projector during use of the wearable display system and direct the light to a user. The eyepiece includes a waveguide, an edge of which is positioned to receive light from the light projector and couple the light into the waveguide. The waveguide includes a first surface and a second surface opposite the first surface, and a number of different regions, each region having a different grating structure configured to diffract light according to a different set of grating vectors.

49.《Magic Leap Patent | Polychromatic Light Out-Coupling Apparatus, Near-Eye Displays Comprising The Same, And Method Of Out-Coupling Polychromatic Light

In one embodiment, the apparatus described in the patent includes a first out-coupling diffractive optical element and a second out-coupling diffractive optical element. Each of the first and second out-coupling diffractive optical elements includes a first region having a first repeating diffraction interval d1 and a second region having a second repeating diffraction interval d2. The first region of the first out-coupling diffractive optical element is superimposed on and aligned with the second region of the second out-coupling diffractive optical element, and the second region of the first out-coupling diffractive optical element is superimposed on and aligned with the first region of the second out-coupling diffractive optical element.

50.《Magic Leap Patent | Augmented and virtual reality display systems with correlated in-coupling and out-coupling optical regions for efficient light utilization

In one embodiment, augmented reality and virtual reality display systems and devices are configured to efficiently utilize projected light. In one aspect, the display system comprises a light projection system and a head-mounted display configured to project light into a user's eyes to display virtual image content. The head-mounted display comprises at least one waveguide comprising a plurality of in-coupling regions, each in-coupling region configured to receive light corresponding to a portion of the user's field of view from the light projection system and to couple that light into the waveguide. The waveguide further comprises a plurality of out-coupling regions configured to couple light out of the waveguide to display virtual content, with each out-coupling region configured to receive light from a different in-coupling region. In one implementation, each in-coupling region has a one-to-one correspondence with a uniquely corresponding out-coupling region.

51.《Magic Leap Patent | Bundle adjustment using epipolar constraints

In one embodiment, the method described in the patent includes receiving image data for a particular pose from a headset. The image data comprises a first image from a first camera of the headset and a second image from a second camera. The method comprises identifying at least one key point of a three-dimensional model of the environment represented at least partially in the first image and the second image, and performing bundle adjustment. The bundle adjustment is performed by jointly optimizing a reprojection error and an epipolar error of the at least one key point. The results of the bundle adjustment are used to update the three-dimensional model, determine the position of the headset at the particular pose, or determine at least one extrinsic parameter of the first camera and the second camera.
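
To make the joint objective concrete, the sketch below evaluates a combined cost of reprojection error in both cameras plus a squared epipolar residual x2^T E x1 for one key point; the intrinsics, poses, and weighting are illustrative assumptions, and a real system would feed this cost into a nonlinear optimizer.

```python
# Sketch of the joint cost: reprojection error of a key point in both cameras
# plus an epipolar term between the two views.
import numpy as np

def skew(t):
    return np.array([[0, -t[2], t[1]], [t[2], 0, -t[0]], [-t[1], t[0], 0]])

def project(K, R, t, X):
    x = K @ (R @ X + t)
    return x[:2] / x[2]

def joint_cost(X, obs1, obs2, K, R1, t1, R2, t2, epipolar_weight=1.0):
    # Reprojection error of the 3D key point X in both cameras.
    reproj = np.sum((project(K, R1, t1, X) - obs1) ** 2) \
           + np.sum((project(K, R2, t2, X) - obs2) ** 2)
    # Epipolar error between the two observations: x2^T E x1 with E = [t]x R
    # for the relative pose (camera 1 -> camera 2).
    R_rel = R2 @ R1.T
    t_rel = t2 - R_rel @ t1
    E = skew(t_rel) @ R_rel
    x1 = np.linalg.inv(K) @ np.array([*obs1, 1.0])
    x2 = np.linalg.inv(K) @ np.array([*obs2, 1.0])
    epi = float(x2 @ E @ x1) ** 2
    return reproj + epipolar_weight * epi

if __name__ == "__main__":
    K = np.diag([500.0, 500.0, 1.0])
    R1, t1 = np.eye(3), np.zeros(3)
    R2, t2 = np.eye(3), np.array([-0.064, 0.0, 0.0])   # ~64 mm stereo baseline
    X = np.array([0.1, 0.0, 1.0])
    print(joint_cost(X, project(K, R1, t1, X), project(K, R2, t2, X), K, R1, t1, R2, t2))
```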

52.《Magic Leap Patent | Eye tracking using alternate sampling

In one embodiment, the eye tracking system described in the patent may include a first camera configured to capture a first plurality of visual data of the right eye at a first sampling rate, and a second camera configured to capture a second plurality of visual data of the left eye at a second sampling rate. The second plurality of visual data may be captured at different sampling times than the first plurality of visual data. The system may estimate visual data for at least one of the right eye or the left eye at a given sampling time based on at least some of the first and second pluralities of visual data, and eye movements may be determined based on the estimated visual data.
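
A minimal sketch of filling in a missing sample for one eye at the other eye's sampling time, assuming simple linear interpolation of that eye's neighbouring samples (the interpolation choice and data layout are assumptions, not Magic Leap's estimator):

```python
# Sketch: the two eye cameras sample at offset times; a missing sample for one eye
# is estimated by interpolating that eye's neighbouring samples at the other eye's
# timestamp.
import numpy as np

def estimate_at(times, values, query_time):
    """Linearly interpolate gaze values (e.g. pupil x/y) at query_time."""
    return np.array([np.interp(query_time, times, values[:, i])
                     for i in range(values.shape[1])])

if __name__ == "__main__":
    # Right eye sampled at even milliseconds, left eye at odd milliseconds.
    right_t = np.array([0.0, 2.0, 4.0]);  right_xy = np.array([[0.0, 0.0], [1.0, 0.2], [2.0, 0.4]])
    left_t  = np.array([1.0, 3.0, 5.0]);  left_xy  = np.array([[0.1, 0.0], [1.1, 0.2], [2.1, 0.4]])
    # Estimate the right eye's gaze at a left-eye sampling time (t = 3.0 ms).
    print(estimate_at(right_t, right_xy, 3.0))
```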

53.《Magic Leap Patent | Systems and methods for artificial intelligence-based virtual and augmented reality

In one embodiment, the patent describes systems and methods for generating and displaying a virtual companion. In the example method, a first input from a user's environment is received via a first sensor of a head-mounted device, and an occurrence of an event in the environment is determined based on the first input. A second input from the user is received via a second sensor of the head-mounted device, and an emotional response of the user is identified based on the second input. An association between the emotional response and the event is determined. A view of the environment is presented via a transmissive display of the head-mounted device, and a stimulus is presented via the translucent virtual companion, wherein the stimulus is identified based on the identified association between the emotional response and the event.

54.《Magic Leap Patent | Cross reality system with prioritization of geolocation information for localization

In one embodiment, the cross reality system allows any of a plurality of devices to efficiently access previously stored maps. Both the stored maps and the tracking maps used by the portable devices may have any of multiple types of location metadata associated with them. The location metadata may be used to select a set of candidate maps for an operation, such as localization or map merging, that involves finding a match between a location defined by location information from the portable device and any of the previously stored maps. The types of location metadata may be prioritized when selecting the subset. To aid in selecting candidate maps, the stored maps may be indexed based on geolocation information. The cross reality platform may update the index when interacting with devices that provide geolocation information, and may propagate geolocation information to devices that do not provide it.

55.《Magic Leap Patent | Determining Input For Speech Processing Engine

In one embodiment, the patent describes a method of presenting a signal to a speech processing engine. Said method comprises receiving, via a microphone, an audio signal. A portion of the audio signal is recognized, and a probability is determined that the portion includes speech directed by a user of the speech processing engine as an input to the speech processing engine. Based on a determination that the probability exceeds a threshold, the portion of the audio signal is presented as an input to the speech processing engine. Based on the determination that the probability does not exceed the threshold, the portion of the audio signal is not presented as an input to the speech processing engine.
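
The thresholding step can be sketched as follows; the toy probability model (signal energy plus an assumed wake cue) is a stand-in for whatever classifier the patent contemplates.

```python
# Sketch of the gating step: estimate the probability that an audio portion is
# speech directed at the engine, and forward it only above a threshold.
import numpy as np

THRESHOLD = 0.7

def p_directed_speech(audio: np.ndarray, wake_word_detected: bool) -> float:
    """Toy probability that this audio portion is speech directed at the engine."""
    energy = float(np.sqrt(np.mean(audio ** 2)))          # crude voice-activity cue
    return min(energy / 0.1, 1.0) * (0.9 if wake_word_detected else 0.4)

def maybe_forward(audio: np.ndarray, wake_word_detected: bool, speech_engine):
    if p_directed_speech(audio, wake_word_detected) > THRESHOLD:
        speech_engine(audio)          # present the portion as input
    # otherwise the portion is dropped and never reaches the engine

if __name__ == "__main__":
    loud = 0.2 * np.ones(1600)        # 0.1 s of "loud" audio at 16 kHz
    maybe_forward(loud, wake_word_detected=True,
                  speech_engine=lambda a: print("forwarded", a.shape))
    maybe_forward(loud, wake_word_detected=False,
                  speech_engine=lambda a: print("forwarded", a.shape))
```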

56.《Magic Leap Patent | Wearable system speech processing

In one embodiment, the patent describes a method of processing an acoustic signal. According to one or more embodiments, a first acoustic signal is received via a first microphone. The first acoustic signal is associated with a first voice of a user of the wearable head unit. A first sensor input is received via the sensor, and a control parameter is determined based on the sensor input. The control parameter is applied to one or more of the first acoustic signal, the wearable head unit, and the first microphone. Determining the control parameter includes determining a relationship between the first voice and the first acoustic signal based on the first sensor input.

57.《Magic Leap Patent | Interaural Time Difference Crossfader For Binaural Audio Rendering

In one embodiment, the patent describes systems and methods for presenting audio signals to a user of a headset. According to an exemplary method, a first input audio signal is received, the first input audio signal corresponding to a source location in a virtual environment, and the first input audio signal is processed to produce a left output audio signal and a right output audio signal. The left output audio signal is presented to the user's left ear through a left speaker, and the right output audio signal is presented to the user's right ear through a right speaker. Processing the first input audio signal includes applying delay processing to the first input audio signal to produce a left audio signal and a right audio signal; adjusting a gain of the left audio signal; adjusting a gain of the right audio signal; applying a first head-related transfer function (HRTF) to the left audio signal to produce the left output audio signal; and applying a second HRTF to the right audio signal to produce the right output audio signal. Applying delay processing to the first input audio signal comprises applying an interaural time difference (ITD) to the first input audio signal, the ITD being determined based on the source location.
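
As a rough illustration of the ITD-then-gain-then-HRTF chain, here is a sketch using the Woodworth ITD approximation and trivial one-tap placeholder HRTFs; these choices are assumptions for illustration, not Magic Leap's filters.

```python
# Sketch of the processing chain: ITD delay -> per-ear gain -> per-ear HRTF.
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s
HEAD_RADIUS = 0.0875     # m

def itd_seconds(azimuth_rad: float) -> float:
    """Woodworth approximation of the interaural time difference."""
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (azimuth_rad + np.sin(azimuth_rad))

def render_binaural(mono: np.ndarray, azimuth_rad: float, sample_rate: int = 48000):
    delay = int(round(abs(itd_seconds(azimuth_rad)) * sample_rate))
    delayed = np.concatenate([np.zeros(delay), mono])[:mono.size]
    # Source on the right (azimuth > 0): left ear hears it later and slightly quieter.
    left, right = (delayed, mono) if azimuth_rad > 0 else (mono, delayed)
    left_gain, right_gain = (0.8, 1.0) if azimuth_rad > 0 else (1.0, 0.8)
    hrtf_left = np.array([1.0])     # placeholder one-tap "HRTFs"
    hrtf_right = np.array([1.0])
    return (np.convolve(left * left_gain, hrtf_left)[:mono.size],
            np.convolve(right * right_gain, hrtf_right)[:mono.size])

if __name__ == "__main__":
    tone = np.sin(2 * np.pi * 440 * np.arange(4800) / 48000)
    L, R = render_binaural(tone, azimuth_rad=np.pi / 4)
    print(L.shape, R.shape)
```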

58.《Magic Leap Patent | Session manager

In one embodiment, the patent describes systems and methods for implementing mixed reality collaboration. A method may include receiving persistent coordinate data; presenting a first virtual session handle to a first user at a first location via a transmissive display of a wearable device, wherein the first location is based on the persistent coordinate data; presenting a virtual object to the first user at a second location via the transmissive display, wherein the second location is based on the first location; receiving location data from a second user, wherein the location data relates the location of the second user to the location of a second virtual session handle; and presenting an avatar to the first user at a third location via the transmissive display, wherein the avatar corresponds to the second user and the third location is based on the location data and further based on the first location.
