Meta AR/VR Patent Describes Letting Users Travel Freely Between Different App Worlds Without Transiting Through the Home Lobby


Source: XR Navigation Network

(XR Navigation Network, February 20, 2024) The metaverse allows users to visit a series of different virtual worlds. A user who becomes interested in activities or things experienced in the current world may want to travel on to subsequent virtual worlds. There is therefore a need for an interface that allows the user to travel freely between different virtual worlds.

In a patent application titled "Virtual personal interface for control and travel between virtual worlds," Meta introduces a virtual personal interface that allows users to travel directly between multiple virtual worlds without having to switch in and out of the Home headspace lobby.


The personal interface can be decoupled from the current virtual world so that it appears consistently across multiple virtual worlds. It can display controls from the XR application controlling the current virtual world, or elements and controls from other virtual worlds, such as controls that enable virtual world selection and transfer.

For example, the XR system may define a platform for XR applications, where each XR application may include a controller, a system for providing output in a personal interface, and a system for providing a 3D world.
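As a rough illustration of that three-part split, the sketch below models each XR application as a controller plus two output paths: one builder that renders a 2D panel for the personal interface and one that builds the full 3D world. The class and method names are assumptions for illustration, not terms from the patent.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass, field

@dataclass
class Panel2D:
    """2D output shown inside the user's personal interface."""
    title: str
    items: list = field(default_factory=list)  # e.g. controls, destinations

@dataclass
class World3D:
    """Full 3D world output produced by the currently active application."""
    name: str

class XRApplication(ABC):
    """Hypothetical platform contract: a controller plus two builders."""

    @abstractmethod
    def build_personal_interface(self) -> Panel2D:
        """Return the 2D panel to display on the personal interface."""

    @abstractmethod
    def build_world(self, destination_id: str) -> World3D:
        """Construct this application's 3D world at the given destination."""
```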

In one example, when the user is in a first virtual world, she can open the personal interface and navigate within it either to the output of the XR application controlling the current world or to the output of another XR application.

In particular cases, the personal interface output from such another application may be referred to as a 2D interface, although it may include 3D content. The 2D interface allows the user to be transported directly to one or more locations in the 3D world controlled by that other application. When such a teleportation control is activated, the 3D-world-building portion of the corresponding XR application may be invoked to create the world into which the user is brought.

In other words, the personal interface may host a 2D interface from the application's personal interface builder, wherein the 2D interface may enable teleportation to a 3D world and/or control of 3D content of the corresponding virtual world.

The personal interface builder and/or 3D world builder may display 2D interfaces and/or 3D worlds through content orchestration, with the associated content either hosted on a server for the respective application or stored locally by the personal interface.

When a user wishes to travel from the current to a subsequent virtual world, she can simply select the desired application from her personal interface and choose the teleportation control in order to teleport to the appropriate virtual world.

In this case, the current virtual world can be displayed simultaneously with the personal interface, which can present a 2D interface from the selected application until the user changes worlds through that 2D interface via the other application's corresponding 3D world builder.

In one embodiment, the personal interface may host a portion of the application's 2D interface that presents various specific destinations in the corresponding virtual world, allowing the user to be transported directly to the relevant destination.

For example, these destinations may be designated by one or more travel cards, which are implemented as deep links to places, events, or people in the respective virtual worlds.
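A minimal sketch of how such a travel card might be represented, assuming a simple deep-link scheme; the URI format and field names here are illustrative and not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class TravelCard:
    """Illustrative travel card: a labeled deep link into another world."""
    label: str        # e.g. "Museum lobby", "Saturday concert", "Alex"
    app_id: str       # XR application controlling the target world
    destination: str  # place, event, or person identifier inside that world

    def to_deep_link(self) -> str:
        # Hypothetical URI scheme; the patent does not specify a format.
        return f"xrworld://{self.app_id}/{self.destination}"

card = TravelCard(label="Museum lobby", app_id="museum_app", destination="lobby")
print(card.to_deep_link())  # xrworld://museum_app/lobby
```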

In this way, the personal interface may facilitate direct travel between virtual worlds in the artificial reality. In response to selection of one of these links, the 3D-world-building portion of the selected application may responsively build the 3D world corresponding to the selected destination.

In one embodiment, the personal interface may define a variety of controls that can be applied to each of the virtual worlds corresponding to the application. For example, the controls may coordinate movement and/or appearance of characters, navigation to specific areas in the virtual world, personal content available in the virtual world, credit access required for transaction payments, and the like.

In the manner described, a personal interface can provide a complete set of controls that apply in the same way regardless of the virtual world in which the user operates. In other words, a personal interface can provide universal controls by making them available within each virtual world and allowing the user to use them without constraint.
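One minimal way to picture such a world-agnostic control set is as a fixed list of control categories that the personal interface exposes identically everywhere; the category names below are illustrative assumptions, not terms from the patent.

```python
from enum import Enum

class UniversalControl(Enum):
    """Hypothetical control categories applied the same way in every world."""
    AVATAR_MOVEMENT = "avatar_movement"
    AVATAR_APPEARANCE = "avatar_appearance"
    NAVIGATE_TO_AREA = "navigate_to_area"
    PERSONAL_CONTENT = "personal_content"
    CREDIT_PAYMENT = "credit_payment"
```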

In one embodiment, the personal interface may display corresponding content for items selected in the virtual world. For example, when the user selects a selectable item that is deep-linked to the item's controller, the personal interface may follow the deep link to obtain and display corresponding content from that controller. In this way, the user can explore the content, for example to learn additional information about the selected item.

In one embodiment, the personal interface may present 3D content that is not controlled by the application of the current virtual world in which the user is traveling. To this end, the 2D interface controlling the personal interface may respond to a 3D-content-triggering action by the user, causing the personal interface to access and display the 3D content associated with that action.

Example content-triggering actions may include the user approaching or selecting a particular item in the virtual world, gazing at an item in the virtual world, selecting an item on the personal interface, navigating to a site in the virtual world, programming one or more items of personal content that the user would like to include in the virtual world, and the like.

In response to the 3D-content-triggering action, the personal interface can display the corresponding 3D item content in its vicinity, and the personal interface can become a window into another virtual world. In this way, the personal interface may give the user an opportunity to preview the 3D content of the item that triggered the action.
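The dispatch just described might be sketched as follows; the trigger names and the fetch_3d_content/preview helpers are assumptions for illustration only.

```python
from enum import Enum, auto

class ContentTrigger(Enum):
    """Hypothetical 3D-content-triggering actions (names are illustrative)."""
    APPROACH_ITEM = auto()
    SELECT_WORLD_ITEM = auto()
    GAZE_AT_ITEM = auto()
    SELECT_PANEL_ITEM = auto()
    NAVIGATE_TO_SITE = auto()

def handle_content_trigger(personal_interface, trigger: ContentTrigger, item_id: str):
    """Fetch the 3D content tied to the triggering item and preview it near
    the personal interface (assumed helper methods, not the patent's code)."""
    content = personal_interface.fetch_3d_content(item_id)
    personal_interface.preview(content)
```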


FIG. 4 illustrates components 400. The components 400 include hardware 410, intermediaries 420, and specialized components 430. As noted above, systems implementing the disclosed technology may use a variety of hardware, including processing units 412, working memory 414, input and output devices 416, and storage memory 418.

The intermediary 420 may include components that mediate resources between the hardware 410 and the specialized components 430. For example, the intermediary 420 may include an operating system, a service, a driver, a basic input/output system, controller circuitry, or other hardware or software system.

The specialized components 430 may include software or hardware configured to perform operations for controlling the artificial reality environment, such as providing a user interface for interacting with the current XR application, providing a detailed view of selected items, navigating between multiple virtual worlds without having to pass through world main halls, executing a second XR application in a world controlled by a first XR application, and providing 3D content separate from the current world.

The specialized components 430 may include an information retrieval module 434, an information evaluation module 436, a travel execution module 438, a content mode generation module 440, a content facilitation module 442, a content selection module 444, a content transfer module 446, a content rendering module 448, and components and APIs that can be used for providing user interfaces, transferring data, and controlling the specialized components, such as interfaces 432.

The information retrieval module 434 may retrieve information that may be used to activate the personal interface. For example, relevant data may include gestures, words, and other activities of the user while traveling in the virtual world.

In one embodiment, the information retrieval module 434 may retrieve a user selection of an XR application corresponding to a virtual world to which the user wishes to travel. For that virtual world, the information retrieval module 434 may retrieve data corresponding to a user action or to a space that the user interacts with.

Alternatively, the information retrieval module 434 may retrieve data for a 3D-content-triggering action, where that action may cause the personal interface to display 3D content that is not part of the virtual world the user is traveling through.

In this case, a non-exhaustive list of triggering actions may include the user approaching or selecting a particular item in the virtual world, gazing at a particular item in the virtual world or personal interface, selecting an item in the personal interface, navigating to sites in the virtual world, and so forth.

The information evaluation module 436 may perform specific evaluations regarding travel to and behavior in the virtual world. For example, the information evaluation module 436 may evaluate which XR application, corresponding to a virtual world, the user has selected to travel to and whether the user has selected a particular travel destination designated for that virtual world.

In one embodiment, the information evaluation module 436 may evaluate the type of action triggered by the user. Additionally, for a virtual world, the information evaluation module 436 may evaluate the type of broadcast space within the virtual world that may be dedicated to displaying or otherwise presenting virtual world content.

The travel execution module 438 may execute travel to a particular destination in the virtual world. For example, travel may be executed based on a user's selection of a travel card provided by an XR application corresponding to the virtual world at a personal interface.

The content mode generation module 440 may determine a particular mode in which content should be generated. For example, where the personal interface serves to present 3D content unrelated to the world in which the user is traveling, the content mode generation module 440 may select how that 3D content should be presented to the user via the personal interface. In this regard, example modes may include representing the content according to particular display options.

In one embodiment, the content mode generation module 440 may identify a particular mode as a function of the type of item that is the subject of the 3D-content-triggering action, such as based on a mapping of item types to display modes. In other embodiments, the content mode generation module 440 may always use the same display mode.
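A minimal sketch of such an item-type-to-display-mode mapping, assuming hypothetical mode names; none of these identifiers come from the patent.

```python
# Hypothetical mapping from item type to the display mode the
# content mode generation module might choose.
ITEM_TYPE_TO_MODE = {
    "artwork": "pedestal_preview",    # small 3D model beside the panel
    "avatar":  "portal_window",       # personal interface acts as a window
    "venue":   "miniature_diorama",   # scaled-down scene preview
}

DEFAULT_MODE = "pedestal_preview"

def select_display_mode(item_type: str) -> str:
    """Pick a display mode for the triggering item; fall back to a default."""
    return ITEM_TYPE_TO_MODE.get(item_type, DEFAULT_MODE)
```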

The content facilitation module 442 may facilitate the processing of specific types of content that may enhance the virtual world in which the user is traveling. For example, for selected virtual spaces in the virtual world the user is traveling in, such a module may enable 2D and/or 3D content to be transmitted to the space.

In this regard, the content facilitation module 442 may detect a deep link associated with the space. Once detected, the content facilitation module 442 may then transmit the link to a personal interface, for example, so that the user may subsequently make a selection of an application corresponding to a content provider that may deliver content to the selected virtual space.

The content selection module 444 may perform selection of content processed in accordance with the operations of the content facilitation module 442. That is, the content selection module 444 may receive a selection of a 2D interface on the personal interface and, for an application on the personal interface, may provide content that can enhance the virtual world.

The content transfer module 446 may transfer 3D content to alter or enhance the virtual world the user is traveling to. One example transfer may be directed to content of an XR application corresponding to the virtual world the user wishes to travel to. Another example transfer may be directed to content from a content provider that populates a selected virtual space.

The content rendering module 448 may present 3D content for an XR application corresponding to a virtual world, or for a 2D application that can add content to that world, based on user selection.


FIG. 5 illustrates a conceptual block diagram of an exemplary artificial reality application 500. This application may be used to generate a 2D interface for display via a personal interface and to control 3D content of a corresponding virtual world in artificial reality. Among other things, the application 500 includes a controller 502, a personal interface builder 504, and a 3D world builder 506.

In operation, the XR system may include a plurality of runtimes: one controlled by the currently active application that defines the current virtual world, and another that runs the personal interface.

Although the personal interface can be a virtual object in the 3D world with attributes such as size, physics response, and display properties, the content it displays can run in a runtime separate from the virtual world. This separation between the virtual world and the personal interface lets the active application output the 3D world while the user accesses the functionality of other XR applications, such as using a travel card to transfer directly to another virtual world.

In various implementations, the personal interface execution environment may be hidden from the active application or may have access ports that can be controlled by the application executing on the personal interface in order to provide only the required information to the active application.

Thus, the XR application can be conceptually categorized into general control and access functions in the controller 502, a personal interface builder 504 module that executes under the personal interface runtime to provide a 2D interface on the personal interface, and a 3D world builder module 506 that executes under the world runtime to generate the current 3D world.

Thus, the 3D world builder 506 of an XR application is active only when that XR application generates the current 3D world, while the personal interface builder 504 may execute either when the XR application is the currently active application or when it is selected in the personal interface while another application is active.
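A hedged sketch of that activation rule follows; the function and parameter names (is_active checks, selected_in_panel) are assumptions, not language from the patent.

```python
def should_run_world_builder(app_id: str, active_app_id: str) -> bool:
    """The 3D world builder runs only for the app generating the current world."""
    return app_id == active_app_id

def should_run_panel_builder(app_id: str, active_app_id: str,
                             selected_in_panel: bool) -> bool:
    """The personal interface builder runs for the active app, or for any app
    the user has selected on the personal interface while another app is active."""
    return app_id == active_app_id or selected_in_panel
```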

The controller 502 may include all necessary coding and programming for coordinating the operation of the personal interface builder 504 and the 3D world builder 506. The personal interface builder 504 may respond to activation of an application on the personal interface received through the controller 502.

When a user of the XR control and navigation system 164 uses a 2D interface from the personal interface builder 504 to travel to a location in a corresponding world, the 3D world builder 506 may cause a virtual world to be loaded.

In various embodiments, the personal interface builder 504 and/or the 3D world builder 506 may build 2D interfaces and 3D worlds using local content and/or content retrieved from remote sources. For example, the 3D world may contain a number of 3D models. The 3D models may be stored locally or retrieved from a server, and other users' representations and their states may be synchronized with such a server.

In one embodiment, the personal interface builder 504 itself can generate 3D content to be displayed via the personal interface, based on display options for one or more items corresponding to that content. In this way, the personal interface can host and display content independently of the virtual world. Thus, the personal interface can be used as an output medium for an XR application in two different ways.

First, a personal interface can be a vehicle through which a 2D interface can be simply rendered for an application, for example enabling a user to travel between and interact with multiple virtual worlds corresponding to the application.

Second, the personal interface can be the vehicle for 3D content generated by the XR application and displayed by the personal interface itself. Such 3D content can be separate from the virtual world the user is traveling through. In this way, the user can "preview" one or more aspects of a given virtual world through the personal interface without actually entering that world.


FIG. 6 illustrates a process 600 for navigating a plurality of virtual worlds in an artificial reality using the personal interface of the XR control and navigation system 164.

At 602, the user of the XR control and navigation system 164 may activate a personal interface to access one or more controls while traveling in the current virtual world. For example, the personal interface may be activated and displayed when the user makes one or more gestures, performs a series of actions, speaks certain words, activates a UI element, and so on. The user may use any of these activation prompts to travel directly from the current virtual world to a subsequent virtual world.

When the personal interface is active, it can display a 2D interface for the currently active application. At 604, a selection of another XR application may be received from a plurality of available XR applications via the personal interface. In this regard, the selected XR application may represent a subsequent virtual world with which the user wishes to interact and/or a virtual world to which the user wishes to seamlessly travel from the current virtual world.

At 606, a 2D interface portion of the selected application is displayed via the personal interface. For example, the displayed 2D interface portion may be controlled by the 2D interface of the personal interface to output content corresponding to the selected application. During this control, the 2D interface portion may operate under the runtime of the personal interface, without any control by the current virtual world's controller. That is, the 2D interface portion can operate under its own authority when displaying the content of the selected application.

In one embodiment, the content displayed for the selected application may include, for example, information about the source of the selected application, the type of virtual world to which it corresponds, a list of travel cards that enable the user to travel to particular destinations within the corresponding virtual world, a list of occupants of the selected virtual world, and the avatar that will be provided to the user when she arrives at that world.

At 608, a selection of travel destinations may be received via the displayed 2D interface portion of the selected application. For example, the 2D interface portion may present the user with a plurality of travel cards for subsequent virtual worlds corresponding to the application, wherein the travel cards may indicate destinations, including places (museums, schools, resorts, etc.), events (sporting events, musical performances, parties, etc.), and people (friends, family members, supervisors, etc.) in the world. In this regard, the travel cards may have corresponding deep links that allow the user to travel directly to the corresponding associated destination.
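As a hedged sketch of how steps 608 and 610 fit together, the handler below resolves a selected travel card and asks the corresponding application's 3D world builder to construct the destination world. TravelCard and XRApplication refer to the earlier illustrative sketches; none of these names come from the patent.

```python
from typing import Dict

def on_travel_card_selected(card: "TravelCard",
                            apps: Dict[str, "XRApplication"]) -> "World3D":
    """Look up the application behind the card's deep link and have its
    3D world builder construct the selected destination (steps 608-610)."""
    target_app = apps[card.app_id]
    return target_app.build_world(destination_id=card.destination)
```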

At 610, the selected application may be caused to generate and display 3D content for a subsequent virtual world corresponding to the selected travel destination. For example, if the subsequent virtual world includes a building with several rooms and the selected travel destination corresponds to a particular room of those rooms, the process 600 may generate representative 3D depictions for the rooms.

At 612, process 600 may return to 602 to initiate travel to another subsequent virtual world, without requiring the user to complete travel between virtual worlds in the artificial reality. In addition, the XR control and navigation system 164 may configure the personal interface to enter a sleep mode after a predetermined period of inactivity to minimize distractions while the user is traveling through the virtual world.

The Meta patent application titled "Virtual personal interface for control and travel between virtual worlds" was originally filed in July 2022 and was recently published by the United States Patent and Trademark Office.

Generally speaking, a U.S. patent application is automatically published 18 months after its filing date or priority date, or it may be published earlier at the applicant's request. Note that publication of a patent application does not mean the patent has been granted. After a patent application is filed, the USPTO still conducts substantive examination, which can take anywhere from one to three years.

In addition, this is only a patent application, which does not necessarily mean it will be granted, and it remains uncertain whether it will actually be commercialized or what the practical results of any such application would be.
