Samsung XR patent shared: brain-computer interface provides complete sensory feedback stimulation (smell, taste, touch, etc.)


Source: XR Navigation Network

Providing complete sensory feedback stimulation through a brain-computer interface, including smell, taste, touch, and more.

(XR Navigation Network, November 27, 2023) According to earlier reports, Samsung has drawn up a complete metaverse strategy, aims to build an XR ecosystem centered on Samsung, and is actively recruiting talent. Judging from a recent patent application, the company has actually taken the ultimate brain-computer interface into consideration.

In the past, research combining XR with brain-computer interfaces was mostly seen at Meta and various startups; companies such as Microsoft have explored it as well. Now Samsung, which is actively moving into XR, is also beginning to explore the ultimate brain-computer interface. In a related patent application titled "Information generation method and device," Samsung proposes using brain-computer interfaces for sensory feedback stimulation in XR.

The company argues that in current XR technology, the various sensory simulations are implemented mainly through sensors. Because each sense requires its own sensor, the process is cumbersome, and the available sensors are not comprehensive enough, so many simulations simply cannot be implemented. In other words, the effects obtained with such sensors are far from reality, and users cannot get a realistic simulation.

Samsung therefore proposes using brain-computer interface devices to provide realistic sensory simulation. The system can obtain information about the user's position relative to a target object in the virtual environment, determine sensory information corresponding to that object based on the position information and the object's attribute information, and finally convert the sensory information into electrical signals that are delivered to the user through a brain-computer interface device.
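Taken together, this describes a three-step pipeline: obtain the user's position relative to a virtual object, derive sensory information from that position plus the object's attributes, and encode the result as electrical stimulation. The sketch below is a minimal, illustrative rendering of that idea; every class and function name (SensoryInfo, determine_sensory_info, stimulate_user) is a hypothetical stand-in, not anything from Samsung's filing.

```python
from dataclasses import dataclass

@dataclass
class SensoryInfo:
    modality: str          # e.g. "touch", "smell", "taste", "hearing"
    stimulus_params: dict  # parameters later encoded as electrical stimulation


def determine_sensory_info(relative_pos, attributes) -> SensoryInfo:
    """Placeholder for the middle step: map position + attributes to sensory info."""
    if "odor" in attributes:
        return SensoryInfo("smell", {"intensity": 0.4})
    return SensoryInfo("touch", {"pressure": 0.2})


def stimulate_user(info: SensoryInfo) -> None:
    """Placeholder for the final step: convert sensory info into electrical signals
    delivered through a brain-computer interface device."""
    print(f"Stimulating {info.modality} with {info.stimulus_params}")


# First step (placeholder): the user stands 1 m in front of a virtual flower.
relative_pos = (0.0, 1.0, 0.0)
stimulate_user(determine_sensory_info(relative_pos, {"odor": "floral"}))
```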


Figure 2 shows a related information generation flow chart 200. According to an example embodiment, the information generation method may include an operation 201 of obtaining relative position information of the user relative to a target object in the virtual environment.

In one embodiment, one or more devices working cooperatively may be called an execution entity. The execution entity can establish a spatial rectangular coordinate system, obtain in real time the user's location information in that coordinate system and the location information of the target object in the virtual environment, and determine the user's relative position with respect to the target object based on those two pieces of location information.

The execution entity may periodically obtain the user's position in the rectangular coordinate system and the position of the target object in the virtual environment. The target object can be any object in the virtual environment, such as a table, a glass of water, or bread.
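As a concrete illustration of this step (with invented coordinates, not values from the patent), the relative position in such a rectangular coordinate system is simply the vector difference between the periodically sampled user and object coordinates:

```python
import math

def relative_position(user_xyz, object_xyz):
    """Vector from the target object to the user in the shared coordinate system."""
    return tuple(u - o for u, o in zip(user_xyz, object_xyz))

def distance(user_xyz, object_xyz):
    return math.dist(user_xyz, object_xyz)

# User standing at (1.0, 2.0, 1.7) m; a virtual cup on a table at (1.5, 2.5, 1.0) m.
user, cup = (1.0, 2.0, 1.7), (1.5, 2.5, 1.0)
print(relative_position(user, cup))   # (-0.5, -0.5, 0.7)
print(round(distance(user, cup), 3))  # ~0.995 m
```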

In operation 202, sensory information corresponding to the target object is determined based on the relative position information and attribute information of the target object.

According to example embodiments, after acquiring the relative position information of the user relative to the target object in the virtual environment, the execution entity may further determine sensory information corresponding to the target object according to the correspondence table or the sensory information prediction model. For example, the correspondence table may store predetermined relative position information, attribute information of the target object, and interrelated sensory information. The sensory information prediction model may be a predetermined sensory information prediction model determined by performing a training operation.

According to example embodiments, the model may be trained on samples of the target object's relative position information and attribute information that are labeled with sensory information, yielding the predetermined sensory information prediction model.

The sensory information corresponding to the target object may include at least one of touch, hearing, smell, or taste. Sensory information can be represented by stimulation parameters, and those parameters can be determined from the sensory response the user actually experiences with the target object in the real environment.
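One minimal way to picture the correspondence-table variant (all keys, distance buckets, and stimulus values below are invented for illustration; a trained prediction model could replace the dictionary lookup) is a table keyed on an object attribute and a coarsely quantized distance:

```python
# Correspondence table: (attribute, distance bucket) -> stimulus parameters.
# Values here are invented purely for illustration.
CORRESPONDENCE = {
    ("floral_odor", "near"): {"modality": "smell", "intensity": 0.8},
    ("floral_odor", "far"):  {"modality": "smell", "intensity": 0.2},
    ("smooth_surface", "touching"): {"modality": "touch", "pressure": 0.5, "temperature_c": 30},
}

def bucket(distance_m: float) -> str:
    """Quantize the user-object distance into coarse lookup buckets."""
    if distance_m < 0.05:
        return "touching"
    return "near" if distance_m < 1.0 else "far"

def lookup_sensory_info(attribute: str, distance_m: float):
    return CORRESPONDENCE.get((attribute, bucket(distance_m)))

print(lookup_sensory_info("floral_odor", 0.4))
# {'modality': 'smell', 'intensity': 0.8}
```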

Interestingly, Samsung proposes that the stimulator could be CereStim, the fully programmable 96-channel neurostimulator from brain-computer interface company Blackrock. The company notes, of course, that this device is just one option and that other stimulators are possible.

In an example scenario, the user touches an object at 30 degrees Celsius while electrode activity is recorded with a NeuroPort neural signal acquisition system. The neural activity picked up by the electrodes is amplified, analog-to-digital (A/D) sampled at 30 kHz, and recorded by the NeuroPort Neural Signal Processor (NSP). Stimulation parameters are then written through CereStim's Matlab API, and the electrode signals are acquired again while varying the pulse stimulation and adjusting the current intensity. When the evoked electrode signals come arbitrarily close to the recorded ones, the corresponding stimulation parameters reproduce the sensation of touching a 30-degree-Celsius object.

Likewise, any stimulation parameters for touch, hearing or smell can be obtained.
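The calibration procedure described above, record neural activity during a real touch and then adjust stimulation parameters until the evoked signals match the recording, can be sketched as a simple parameter sweep. Nothing below uses the real CereStim or NeuroPort APIs; record_neural_activity and stimulate_and_record are hypothetical stand-ins that return made-up firing-rate features.

```python
import random

def record_neural_activity():
    """Hypothetical: firing-rate features recorded while touching a 30 C object."""
    return [0.62, 0.35, 0.80]

def stimulate_and_record(current_ua: float):
    """Hypothetical: evoked firing-rate features for a given stimulation current (uA)."""
    random.seed(int(current_ua))
    return [min(1.0, current_ua / 100.0 * f + random.uniform(-0.02, 0.02))
            for f in [0.7, 0.4, 0.9]]

def error(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

target = record_neural_activity()
best_current, best_err = None, float("inf")

# Sweep the current amplitude; keep the setting whose evoked signal
# is closest to the signal recorded during the real touch.
for current in range(10, 101, 5):
    err = error(stimulate_and_record(float(current)), target)
    if err < best_err:
        best_current, best_err = float(current), err

print(f"Best current: {best_current} uA (error {best_err:.4f})")
```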

In operation 203, the method may include converting sensory information into electrical signals to stimulate the user through the brain-computer interface device.

After determining the sensory information corresponding to the target object, the execution entity can use the brain-computer interface device to convert the sensory information into electrical signals to stimulate corresponding parts of the user's cerebral cortex to produce corresponding feelings.

According to another example embodiment, determining the sensory information corresponding to the target object based on the relative position information and the attribute information of the target object may include determining olfactory information corresponding to the target object based on the user's facial orientation and the distance between the user's head and the target object, and determining that olfactory information as the sensory information corresponding to the target object.

According to another example embodiment, the method may include determining that the target object has an odor attribute, and determining olfactory information corresponding to the target object according to a user's facial orientation and a distance between the user's head and the target object. In response to or based on determining that the target object has an odor attribute, olfactory information is determined as sensory information corresponding to the target object.

Odor attributes can include attribute information such as fruit aroma, floral aroma, food aroma, and so on.

In one embodiment, the sensory information corresponding to the target object is determined based on the relative position information and the attribute information of the target object. Based on determining that the target object has a sound attribute, the method may include determining auditory information corresponding to the target object from the user's facial orientation and the distance between the user's head and the target object, and then determining that auditory information as the sensory information corresponding to the target object.
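A compact way to express this attribute-based branching (the thresholds, intensity scaling, and function name sensory_info_for_attributes below are invented for illustration) is a dispatch on the object's attributes that scales the stimulus by how directly the user faces the object and how close the head is:

```python
def sensory_info_for_attributes(attributes: dict, face_angle_deg: float, head_distance_m: float):
    """Illustrative dispatch on object attributes (invented thresholds and scaling)."""
    # Facing the object more directly and standing closer yields a stronger stimulus.
    facing = max(0.0, 1.0 - abs(face_angle_deg) / 90.0)
    proximity = max(0.0, 1.0 - head_distance_m / 3.0)

    if "odor" in attributes:
        return {"modality": "smell", "odor": attributes["odor"],
                "intensity": round(facing * proximity, 2)}
    if "sound" in attributes:
        return {"modality": "hearing", "sound": attributes["sound"],
                "loudness": round(facing * proximity, 2)}
    return None


# Virtual flower 0.5 m away, with the user facing 20 degrees off-axis.
print(sensory_info_for_attributes({"odor": "floral"}, 20.0, 0.5))
# {'modality': 'smell', 'odor': 'floral', 'intensity': 0.65}
```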

The execution entity may use wearable positioning devices on the left and right sides of the user's head to determine the user's left and right head coordinates, determine the line connecting those two coordinates, and then determine the orientation of the user's face from the angle between that line and the direction of the plane in which the target object's coordinates lie.


As shown in Figure 3, the execution entity 301 can establish a spatial rectangular coordinate system with the UWB base station as the origin, and use the first wearable positioning device 303 on the left side of the user's head and the second wearable positioning device 304 on the right side of the user's head to obtain the coordinates A and B corresponding to the user's head.

According to example embodiments, wearable positioning devices 303 and 304 may be UWB chips. The UWB base station can be implemented by the execution entity 301. Coordinate A corresponds to chip 303, that is, the coordinate of the left side of the user's head, and coordinate B corresponds to chip 304, that is, the coordinate of the right side of the user's head.

At the same time, the coordinate M of the target object 305 is obtained, and the angle between the straight line AB and the direction of the plane in which M lies is taken as the user's facial orientation. Here, the target object 305 is a virtual flower. The execution entity can look up the olfactory information corresponding to the target object in the predetermined correspondence table according to the user's facial orientation and the distance between the user's head and the target object, and determine that olfactory information as the sensory information corresponding to the target object. The correspondence table records the user's facial orientation, the distance between the user's head and the target object, and the associated olfactory information.

In this embodiment, the coordinates of the left and right sides of the user's head are determined from the wearable positioning devices on each side of the head, and the angle between the line connecting those coordinates and the direction of the plane containing the target object is used to determine the direction of the user's face, effectively improving the accuracy of the facial-orientation estimate.
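One reasonable reading of this geometry (an interpretation only, with invented coordinates) is to take a perpendicular of the line AB in the horizontal plane as the facing direction and measure its angle to the vector pointing from the head's midpoint toward the target M:

```python
import math

def facing_angle_deg(a, b, m):
    """Angle between the head's forward direction (perpendicular to line AB
    in the horizontal plane) and the direction from the head midpoint to target M."""
    cx, cy = (a[0] + b[0]) / 2, (a[1] + b[1]) / 2   # midpoint of the head
    abx, aby = b[0] - a[0], b[1] - a[1]              # left-to-right head axis
    fwd = (-aby, abx)                                # one perpendicular: assumed forward
    to_m = (m[0] - cx, m[1] - cy)
    dot = fwd[0] * to_m[0] + fwd[1] * to_m[1]
    norm = math.hypot(*fwd) * math.hypot(*to_m)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# UWB coordinates (metres): A on the left of the head, B on the right, M a virtual flower.
A, B, M = (0.9, 2.0), (1.1, 2.0), (1.0, 3.0)
print(round(facing_angle_deg(A, B, M), 1))  # 0.0 -> the user faces the flower directly
```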

The execution entity may detect the relative position of the user's head with respect to the target object, and, based on determining that the target object is located at a predetermined position relative to the user's head and that the target object has an edible attribute, determine the taste information corresponding to that edible attribute as the sensory information corresponding to the target object.

Taste attributes may include attribute information such as banana taste, apple taste, etc.

It should be noted that, before the target object is determined to be at the predetermined position relative to the user's head, the user may or may not have touched the target object within a predetermined time range.

In one embodiment, whether there is an action of the user touching the virtual object may be determined by determining whether the distance between the user's hand and the target object meets a predetermined threshold.

In this embodiment, based on determining that the target object is located at the predetermined position relative to the user's head and that the target object has an edible attribute, the taste information corresponding to the edible attribute is determined as the sensory information corresponding to the target object. The sensory information is then converted into electrical signals that stimulate the user through the brain-computer interface device, helping the user experience the edible properties of the object in the virtual environment and improving the realism of the interaction.
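A minimal sketch of this eating scenario (the 5 cm mouth threshold, the attribute names, and the function taste_info are assumptions, not figures from the patent) reduces to checking whether an edible object has reached a predetermined mouth position:

```python
import math

def taste_info(object_pos, mouth_pos, attributes: dict, mouth_threshold_m: float = 0.05):
    """Return taste stimulus parameters when an edible object reaches the mouth region."""
    at_mouth = math.dist(object_pos, mouth_pos) <= mouth_threshold_m
    if at_mouth and "edible" in attributes:
        return {"modality": "taste", "flavour": attributes["edible"]}
    return None

# A virtual banana raised to within 2 cm of the mouth position.
print(taste_info((0.0, 1.0, 1.62), (0.0, 1.0, 1.60), {"edible": "banana"}))
# {'modality': 'taste', 'flavour': 'banana'}
```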


Figure 4 is a schematic diagram of an application scenario of the information generation method.

In the application scenario shown in Figure 4, the execution entity 401 can establish a spatial rectangular coordinate system with the UWB base station as the origin and obtain the location information of the user 402 in that coordinate system by means of wearable positioning devices on one or more parts of the user's body.

For example, wearable positioning devices can be placed on the user's hands and head. The UWB base station can be implemented by the execution entity 401. According to an example embodiment, the entity may obtain location information for a target object 403, such as a virtual football, and determine the user's position relative to the target object based on the user's location information and the object's location information.

The sensory information corresponding to the target object is then determined from the correspondence table, which records predetermined relative position information, attribute information of the target object (such as smoothness, softness, etc.), and the corresponding sensory information. The sensory information is then converted into electrical signals that stimulate the user 402 through the brain-computer interface device.

The information generation method may thus include obtaining the user's relative position with respect to a target object in the virtual environment, determining sensory information corresponding to the target object based on that relative position and the object's attribute information, and converting the sensory information into electrical signals that stimulate the user through the brain-computer interface device; this helps the user experience the properties of objects in the virtual environment and improves the realism of the interaction.

Figure 5 is a flowchart 500 of an information generation method.


In operation 501, the information generation method may obtain relative position information of the user relative to the target object in the virtual environment.

In operation 502, in response to determining that the distance between the user's hand and the target object satisfies a predetermined threshold and that the target object has a contour attribute, tactile information corresponding to the contour attribute of the target object is determined as the sensory information corresponding to the target object.

The relative position information may include relative position information of the user's hand relative to the target object, and the execution entity may detect the distance between the user's hand and the target object. In response to determining that the distance between the user's hand and the target object satisfies a predetermined threshold and the target object has a contour attribute, tactile information corresponding to the contour attribute of the target object is determined as sensory information corresponding to the target object.

Contour attributes can include attribute information such as contour, material, texture, smoothness, temperature, quality, etc.

The predetermined threshold can be set based on experience, actual needs, and the specific application scenario. For example, the distance between the fingers and/or palm and the target object may be required to be less than or equal to 1 cm, or less than or equal to 0.5 cm; the present disclosure does not limit this.

Specifically, the execution entity establishes a spatial rectangular coordinate system with the UWB base station as the origin, uses a data glove worn on the user's hand to obtain the position of the user's hand, and simultaneously obtains the position of the target object. In this example, the target object is a water cup, and the predetermined threshold is that the distance between the user's fingers and/or palm and the target object is 1 cm or less.

In response to determining that the distance between the user's fingers and/or palm and the water cup meets the predetermined threshold and that the water cup has a contour attribute, the tactile information corresponding to the water cup's contour attribute is determined as the sensory information corresponding to the water cup.
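The water-cup scenario reduces to a threshold check on the hand-object distance followed by a lookup of the contour attributes. The sketch below reuses the 1 cm threshold mentioned above; the attribute values and the function name tactile_info are illustrative assumptions.

```python
import math

TOUCH_THRESHOLD_M = 0.01  # 1 cm, as in the example above

def tactile_info(hand_pos, object_pos, attributes: dict):
    """Return tactile stimulus parameters when the hand is close enough to touch the object."""
    if math.dist(hand_pos, object_pos) <= TOUCH_THRESHOLD_M and "contour" in attributes:
        return {"modality": "touch", **attributes["contour"]}
    return None

cup_attributes = {"contour": {"material": "glass", "smoothness": 0.9, "temperature_c": 30}}
# Fingertip 8 mm from the virtual water cup.
print(tactile_info((0.508, 1.0, 1.2), (0.5, 1.0, 1.2), cup_attributes))
# {'modality': 'touch', 'material': 'glass', 'smoothness': 0.9, 'temperature_c': 30}
```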

In operation 503, the sensory information is converted into electrical signals through the brain-computer interface device to stimulate the user.

Compared with the method of the example embodiment of Figure 2, in the method shown in flowchart 500 of Figure 5, in response to determining that the distance between the user's hand and the target object satisfies the predetermined threshold and that the target object has a contour attribute, the tactile information corresponding to the contour attribute of the target object is determined as the sensory information corresponding to the target object.

Next, the brain-computer interface device converts the sensory information into electrical signals to stimulate the user, thereby helping the user experience the contour properties of the object in the virtual environment and improving the authenticity of the interaction.

Further reading: Samsung Patent | Information generation method and equipment

Samsung's patent application titled "Information generation method and device" was originally submitted in July 2023 and was recently published by the US Patent and Trademark Office.

Generally speaking, a U.S. patent application is automatically published 18 months from its filing date or priority date, or earlier at the applicant's request. Note that publication of a patent application does not mean the patent has been granted: after filing, the USPTO still conducts substantive examination, which can take anywhere from one to three years.
