DataMesh offers a BIM model authorization conversion service. After receiving user authorization, our team will use a plugin on a dedicated server to convert the BIM model into an architectural scene file. Once the conversion is complete, we will email the results to the user. The user can then use these results to build the architectural scene, which can be viewed and used in Inspector. The process is as follows:
1. The user sends the BIM model to DataMesh’s designated email address.
2. Upon receiving the model, DataMesh will convert it to the required format.
3. Once the conversion is complete, DataMesh will email the converted model to the user.
4. The user logs into the FactVerse platform and uploads the converted model to the resource library.
5. The user then builds the architectural scene and uses the Inspector to perform business operations.
SLA for BIM model authorization conversion service
The service provider referred to in this document is collectively known as “DataMesh” or “we.”
Service availability
Service Availability = (Total time of the service period – Unavailable time during the service period) / Total time of the service period * 100%. The service period is calculated from the date the customer applies for and activates the product or service in accordance with the agreement.
Definition of terms:
Total time of the service period: The duration of the service period for the Inspector application license you purchased.
Available time during the service period: For the BIM model authorization conversion service provided by DataMesh, this refers to the official service timeframe (which begins on the day DataMesh receives the client’s BIM model and lasts for three working days). During this period, the service is considered available.
Unavailable time during the service period: If the processing time exceeds the official service timeframe mentioned above, the time beyond that will be considered as unavailability.
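The availability formula above can be illustrated with a short sketch (our own illustration; the function name and sample figures are assumptions, not a DataMesh API):

```python
# Illustrative sketch of the SLA availability formula:
# Availability = (total - unavailable) / total * 100%

def service_availability(total_hours: float, unavailable_hours: float) -> float:
    """Return the service availability as a percentage."""
    return (total_hours - unavailable_hours) / total_hours * 100.0

# A 30-day service period with 24 hours of processing time beyond
# the official three-working-day timeframe:
availability = service_availability(30 * 24, 24)
print(f"{availability:.2f}%")  # 96.67%, above the 95% commitment
```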
DataMesh commits to the service timeframe for the BIM model authorization conversion service:
Processing will start on the day DataMesh receives your authorization and will be completed within three working days, after which we will send a notification of the successful scene construction to your email.
Service indicator commitment
1. Service Availability: DataMesh will use reasonable commercial and technical efforts to ensure that the normal service availability is above 95%.
2. Service Performance Standard: After DataMesh receives your authorization, we will process the BIM model, complete it within three working days, and send a notification of the successful scene construction to your email.
3. DataMesh assumes no responsibility to the customer for any unavailability, suspension, or termination of the Service that is due to any of the following:
a) System maintenance events that DataMesh has notified customers of in advance, including regular repairs and upgrades (in principle, no more than 8 hours per month, except for special circumstances);
b) Events that arise from equipment, software, and/or technology of yours or of any third party not under our direct control;
c) Events that result from improper maintenance, use, or confidentiality practices, leading to the loss or leakage of data, passwords, codes, etc.;
d) Events that result from the negligence of the customer or from operations authorized by the customer;
e) Events that result from your failure to adhere to any required configurations for the use of the Service;
f) Events that result from force majeure such as earthquakes, natural disasters, national and local policies, etc.;
g) Events that result from your illegal or unlawful use of the Service, or your breach of any of the terms and conditions of the DataMesh Product Terms;
h) Scheduled downtime.
Terms of service and notes
Ownership of your works
You retain ownership of the BIM models created by you (or your authorized users) and authorized or submitted to DataMesh by you (or your authorized users).
Privacy
DataMesh is committed to protecting your privacy and informing you about how it handles your personal data. DataMesh’s Privacy Policy outlines how DataMesh may collect, use, store, and process your personal data or data related to you, as well as how you can request access to or delete your personal data.
DataMesh will provide a Data Processing Addendum that specifies the obligations DataMesh must fulfill as a personal data processor under the General Data Protection Regulation (GDPR).
Use of your content
When you actively send your BIM model and relevant information to the designated email address of DataMesh, it represents your authorization for DataMesh and its designated personnel to process the BIM model you provided for the purpose of delivering the Inspector product service and building the architectural scene.
DataMesh staff will not use your content except in the following situations:
a) At your request or with your consent;
b) Related to the provision and improvement of products (including maintenance, protection, updates, or other modifications to the products);
c) Related to legal obligations, law enforcement, investigations, or litigation.
Location of your data processing and storage
The BIM model authorization conversion service provided by DataMesh, including its entire toolchain, service processing workflow, data processing, and storage, will be hosted on Azure servers located in Japan (pending confirmation).
DataMesh complies with applicable laws regarding the transfer of personal information between different countries/regions to ensure that your project data enjoys the same high level of privacy protection, regardless of where the data is processed.
Your responsibilities
a) You are responsible for the ownership, use, and disposal of the BIM models you send to the designated DataMesh email address, and you must ensure the legality of their source and content. DataMesh advises you to carefully assess the legality of your data’s source and content. You must ensure that the information you provide is authorized for upload to DataMesh FactVerse and does not infringe on any third-party rights (including third-party intellectual property rights, ownership, or trade secrets).
b) You will bear all consequences and liabilities arising from any BIM models you provide that violate laws, regulations, departmental rules, or national policies.
c) If the BIM model you provided is subject to confidentiality obligations or poses a risk of infringement, you must delete it, or contact DataMesh to request its deletion, within three working days.
d) Ensure that your content and its use in any product comply with all applicable laws and regulations, as well as these terms.
Reasons for online BIM model conversion failures
When users utilize BIM models to build architectural scenes on the DataMesh FactVerse platform, DataMesh offers an online model conversion service based on Autodesk Platform Services (APS) through an API wrapper. However, due to the inherent stability issues of the cloud services provided by Autodesk, occasional conversion failures may occur. The table below explains the failure messages, their corresponding reasons, and potential solutions:
Failure message: APS processing timed out
Reason: The model is too large and exceeds the maximum number of models/components allowed in a single APS service request.
Solution: There are two conversion options available:
a) Authorize DataMesh to perform the model conversion.
b) Purchase a Revit/3ds Max software license and download the plugin from the official website for local conversion.

Failure message: Failed to upload to APS
Reason: Network error.
Solution: Check your network environment to ensure a stable and smooth connection.

Failure message: Failed to download from APS
Reason: Network error.
Solution: Check your network environment to ensure a stable and smooth connection.

Failure message: FactVerse processing failed
Reason: An exception occurred in the scene construction service deployed on the FactVerse platform, or anomalies occurred in the service provided by APS, such as service failures or command errors.

Failure message: Scene build plugin version mismatch
Reason: The version of the Revit/3ds Max plugin used for executing the local processing workflow is outdated and needs to be updated.
Solution: Click xxxxxx to get the latest version of the plugin.
If your business scenario requires you to combine multiple high-detail models (LOD 300 or above) or if a single model file is large (over 200MB), we recommend using DataMesh’s BIM model authorization conversion service to ensure a stable and smooth scene-building process. By authorizing DataMesh to assist with processing your BIM models, you can avoid scene construction issues caused by APS service limitations, thereby facilitating a more seamless construction of architectural scenes and executing your business operations in the Inspector client.
This update further enhances Inspector’s functionality and user experience, featuring the following new capabilities and optimizations:
New features
View equipment in 3D and MR modes
3D Mode: Users can remotely check the real-time operating status of key equipment by clicking on the digital twin in the equipment list, reducing the frequency of site inspections and improving efficiency, while enabling data insights for remote operations management from the control room.
MR Mode: On-site, users can leverage augmented reality to display real-time equipment data, offering valuable operational insights. This mode enables engineers to make quick maintenance decisions, particularly in complex equipment management situations. It effectively resolves the “Equipment-data separation” problem seen in traditional methods, where data is viewed in the control room while equipment inspections happen on-site.
Visualize equipment business logic relationships
Users can intuitively view upstream and downstream connections between equipment in both 3D and MR modes, and understand the spatial impact of faults. When a piece of equipment fails, engineers can quickly locate and troubleshoot issues on-site using the augmented model and business logic, reducing downtime.
Optimizations
Improved architectural scene display efficiency: We’ve optimized the display efficiency for architectural scenes, resulting in improved texture rendering.
Model loading progress indicator: A new feature now shows real-time model loading progress, allowing users to track the status and reduce uncertainty during lengthy loading times.
Additional user experience enhancements: We’ve made several refinements to further enhance the overall user experience in Inspector.
A Behavior Tree is a decision-making structure used to control the behavior of virtual characters or systems. It consists of multiple nodes, each representing a behavior or decision step. Behavior Trees decompose complex behaviors into simpler, manageable sub-behaviors through a hierarchical approach.
In FactVerse Designer, Behavior Trees can be utilized to manage digital twin behaviors (such as starting and stopping equipment) and to handle state transitions (like switching an AGV from “standby” to “working” mode), among other applications.
Execution rules
1. The execution order of a Behavior Tree is from top to bottom and from left to right.
2. When the Behavior Tree reaches a terminal state, it will return to the root node and start executing again.
3. The root node can only have one child node.
4. When a node has child nodes, it executes its child nodes first, from left to right. After all the child nodes have been executed, it moves on to the next node.
5. The behavior tree runs in a continuous loop according to the specified sequence unless it is in idle mode or has been deleted.
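The execution rules above can be sketched in a few lines of Python (our own illustration, not the FactVerse Designer engine; class and node names are assumptions):

```python
# Minimal sketch of behavior-tree execution order:
# each node is visited first, then its children, left to right.

class Node:
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []

    def tick(self, trace):
        trace.append(self.name)       # visit this node first (top to bottom)
        for child in self.children:   # then its children, left to right
            child.tick(trace)

# The root may have only one child; siblings run left to right.
root = Node("root", [Node("sequence", [Node("move"), Node("wait")])])

trace = []
root.tick(trace)                      # one full pass of the tree
print(trace)                          # ['root', 'sequence', 'move', 'wait']
# In the engine, this pass repeats from the root in a continuous loop
# until the tree is idle or deleted.
```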
Node types
Root node
When creating a behavior tree, a root node is automatically generated, serving as the starting point for the execution of the behavior tree. The root node cannot be deleted and can only connect to one child node.
Composite Nodes
Composite nodes control the execution order and logical evaluation of their child nodes. They include parallel nodes, selector nodes, and sequence nodes. Composite nodes can have multiple child nodes, which can be either composite nodes or action nodes.
Parallel Node: All nodes under a parallel node execute simultaneously. The success criteria for parallel nodes can be set to:
Success if any one child node succeeds
Success only if all child nodes succeed
Selector Node: The child nodes under a selector node execute in left-to-right order. As soon as one child node returns success, the entire selector node returns success and the subsequent child nodes are not executed. If all child nodes fail, the selector node returns failure.
Sequence Node: The child nodes under a sequence node execute in left-to-right order. Sequence nodes offer two traversal strategies.
Return failure if any child node fails
Return failure only if all child nodes fail
Action nodes
Action Nodes represent specific behaviors or tasks of the digital twin, such as moving along a path or waiting. Action Nodes are the leaf nodes of the behavior tree, where complex logic can be implemented. When an action node is executed, it performs a specific action and returns one of the following three return values:
Success: It indicates that the node has been executed successfully.
Running: It indicates that the node is still running, and it will continue to run when the behavior tree is called again.
Failure: It indicates that the node has failed to execute.
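The composite-node semantics and the three action return values described above can be sketched as follows (an illustration under our own naming assumptions; the selector short-circuits on success, and the sequence shown uses the "fail if any child fails" strategy):

```python
# Sketch of selector/sequence semantics with the three return values.

SUCCESS, RUNNING, FAILURE = "success", "running", "failure"

class Action:
    """Leaf node returning a fixed status for illustration."""
    def __init__(self, result):
        self.result = result
    def tick(self):
        return self.result

class Selector:
    """Returns success as soon as one child succeeds."""
    def __init__(self, children):
        self.children = children
    def tick(self):
        for child in self.children:
            status = child.tick()
            if status != FAILURE:     # success or running short-circuits
                return status
        return FAILURE                # all children failed

class Sequence:
    """'Return failure if any child node fails' strategy."""
    def __init__(self, children):
        self.children = children
    def tick(self):
        for child in self.children:
            status = child.tick()
            if status != SUCCESS:     # failure or running stops the walk
                return status
        return SUCCESS                # all children succeeded

tree = Selector([Action(FAILURE), Sequence([Action(SUCCESS), Action(SUCCESS)])])
print(tree.tick())  # success: the first child fails, the sequence then succeeds
```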
Node configuration
Node operations
1. Add nodes: Drag and drop nodes from the Node Menu into the Editing area.
2. Delete nodes: Select the node to be deleted in the Editing area and click the delete button on the toolbar.
3. Rename nodes: Select the node to be renamed in the Editing area and enter the new name in the node name area above the Attribute Pane.
4. Connect nodes: To connect Node A with Node B, drag a yellow connection line from the bottom of Node A and connect it to the top of Node B. Release the mouse to complete the connection.
5. Delete node connections: Hold down the right mouse button and drag over the connection line you want to delete to cut off the link between the two nodes.
6. Organize nodes: Select the Root Node and press the “L” key to automatically organize the tree’s format.
Common configuration of nodes
In behavior trees, node conditions and attribute settings often require obtaining the attributes of digital twins, locating digital twins, and retrieving the positions of digital twins.
Get attributes
Methods to get attributes
Select attribute directly: Choose the required attribute directly from the digital twin template.
Find attribute by ID: Retrieve attributes by searching with the attribute ID.
Get position
Manual entry: Users manually input the attribute value or ID directly.
Attribute of digital twin: Select to retrieve attribute values from specific attributes within the digital twin template.
Behavior tree attributes: Obtain attribute values from the attributes of the current behavior tree.
Current attribute: During the traversal of attributes within a container in the behavior tree, the attribute being traversed is the current attribute.
Way of finding digital twins
Basic method to find digital twins
Self: Choose the digital twin that is currently executing the behavior tree.
Temporary digital twin: Select a digital twin that is temporarily created or used during execution.
Digital twin in the attribute: Locate and select a specific digital twin using the unique identifier stored in the attributes.
Get objects from attributes:
Attribute of digital twin
Behavior tree attribute
Current attribute
Use ID to find digital twin
Digital twin ID
Attribute ID
All digital twins in the scene: Search for digital twins throughout the entire scene.
Method to filter digital twins: When using “All digital twins in the scene” as the base method for locating digital twins, different filtering methods can be used to precisely locate the desired digital twin.
None: No filtering conditions are used; search all digital twins in the scene.
Find in scene by ID: Search based on the unique ID of the digital twin. Each digital twin has a unique identifier (ID) upon creation.
Find the digital twin by minimum distance: Filter based on the distance between the digital twin and the digital twin executing the behavior tree, finding the closest digital twin.
Advanced method to find digital twins
None: No specific advanced methods.
Parent digital twin: Locate the parent digital twin of the current digital twin. For example, the parent digital twin of cargo on a conveyor belt is the conveyor belt itself.
The last child digital twin: Find the last child digital twin of the current digital twin. For example, the last product in a batch of generated products.
The first child digital twin: Find the first child digital twin of the current digital twin. For example, the first product in a batch of generated products.
All child digital twins: Locate all child digital twins of the current digital twin. For example, all products on a production line.
Previous digital twin: Find the previous sibling digital twin of the current digital twin. For example, the product before a certain product on a conveyor belt.
Next digital twin: Find the next sibling digital twin of the current digital twin. For example, the product after a certain product on a conveyor belt.
Find the type of object: Choose the target object to find, which can be either the digital twin itself or a specific element within the digital twin.
Digital twin: Choose to find the entire digital twin object for overall operations or to retrieve its attributes. For example, locating and operating a robot, a production line device, or a shelf unit.
Role in digital twin: Choose to locate specific roles within the digital twin. This type is used for operations on specific parts within the digital twin, such as a raw material generator’s digital twin template containing roles like the generator model, output port, and progress bar.
Method to get the position
Method of getting the position
Manual entry
Directly input the 3D coordinates of a point in the format x, y, z.
Example: Entering “10,20,30” represents a specific spatial location.
Location of digital twin: Use the current location of the digital twin.
Attribute of digital twin: Obtain the location from the digital twin’s attributes.
Behavior tree attribute: Obtain the location from the attributes of the behavior tree.
Find the point in the path map by attribute: Use attribute values to find points in the path map.
Coordinate Types
Local coordinate: Coordinates relative to a reference point. For example, the local coordinates of goods on a conveyor belt are relative to the center point of the conveyor belt.
Coordinate: Global coordinate, or world coordinate.
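The difference between the two coordinate types can be sketched as follows (our own illustration; the function name and the simple no-rotation conversion are assumptions, not a FactVerse API):

```python
# A local coordinate is relative to a parent reference point; the
# (world) coordinate is absolute. Ignoring the parent's rotation,
# the conversion is a simple offset.

def local_to_world(parent_world, local):
    """World position = parent's world position + local offset."""
    return tuple(p + l for p, l in zip(parent_world, local))

belt_center = (10.0, 0.0, 5.0)   # conveyor belt's world position
goods_local = (2.0, 0.5, 0.0)    # goods relative to the belt center
print(local_to_world(belt_center, goods_local))  # (12.0, 0.5, 5.0)
```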
Digital twin nodes
Digital twin nodes are used for creating, deleting, and manipulating digital twins and digital twin attributes.
Create digital twins
Function: Create a digital twin corresponding to a specified digital twin template with a given pose (position, rotation angle).
Example: Automatically generate products on a factory production line. The following example configures the “Create digital twin” node so that Digital twin A creates a new digital twin at its own output port.
The template used to create the digital twin
Method to get attributes: Select “Select attribute directly”
Get location: Select “Attribute of digital twin”
Attribute of digital twin: Choose “Template of Digital twin A” > “Identifier for Generating digital twin” (an attribute defined by the user in the template of Digital twin A for template identification).
Way of finding digital twin
Basic method to find digital twins: Select “Self”
Create the location of the digital twin
Method to get attributes: Select “Select attribute directly”
Get location: Select “Attribute of digital twin”
Attribute of digital twin: Select “Template of Digital twin A” > “Output port” > “Position of the output port”
Delete digital twin
Function: Delete a specified digital twin. For example, automatically remove products from the production line that have expired or are no longer needed.
Set attributes
Function: Set any attributes of a digital twin or behavior tree.
Set digital twin pose
Function: Set the position and rotation of a digital twin or elements within the digital twin. For example, adjust the position and angle of a robot arm to perform a specific task.
Set parent object of digital twins
Function: Set a digital twin as a child object of another digital twin.
Get digital twin
Function: Set a digital twin as the current digital twin or save the digital twin identifier to a target digital twin attribute.
Display or hide digital twin
Function: Display or hide a digital twin or elements within a digital twin.
Attribute change to Vector3
Function: Convert a certain attribute value into a three-dimensional vector (Vector3) format.
Delete elements in the container
Function: Remove an element from a specified container (such as a list or dictionary). Containers are attributes that can store and manage multiple elements. These attributes can contain other objects or basic data types and provide methods to access and manipulate these elements. The currently supported container types are List and Dictionary. For example, remove a specific product from a virtual warehouse containing multiple products.
Path nodes
Path nodes are used for configuring and controlling the moving path of digital twins.
Set digital twin moving path
Function: Save a specified path into the digital twin’s attributes.
Example: The following example demonstrates how to use the “Set digital twin moving path” node to set a movement path for items on a conveyor belt.
Digital twin to be moved
Specify a digital twin
Basic method to find digital twins: Select “Temporary digital twin”
Find the type of object: Select “Digital twin”
Target path
Method to get attributes: Select “Select attribute directly”
Get location: Select “Attribute of digital twin”
Attribute of digital twin: Select “Conveyor template” > “Configure Path” attribute
Way of finding digital twin
Basic method to find digital twins: Self
The starting point of the move: Current point
Set the moving speed of digital twin
Method to get attributes: Select “Select attribute directly”
Get location: Select “Attribute of digital twin”
Attribute of digital twin: Select “Conveyor template” > “Convey Speed” attribute
Way of finding digital twins
Basic method to find digital twins: Select “Self”
Keep the original rotation angle when entering the path: yes
Move digital twins along a path
Function: Move a specified digital twin along a given path. For example, specify a path, starting point, and movement speed for an AGV (Automated Guided Vehicle) so that it can move according to the set path and speed.
Move one step
Function: Move the target digital twin one step at the configured speed. For example, move an object on a conveyor belt one step to simulate its gradual movement along the belt.
Generate path
Function: Set the current moving path attribute of a digital twin. For example, use the “Generate Path” node to configure the “Current moving path” attribute for a digital twin (such as a robot or AGV) and, in conjunction with the “Move digital twins along a path” node, allow the digital twin to move along the predefined path.
Track digital twins along the path
Function: Set the digital twin’s pursuit target, shortest path, and starting point to automatically navigate and chase the target digital twin. For example, a robotic arm grabbing a moving item on a conveyor belt.
Port nodes
Get digital twins from the input port
Function: Retrieve a digital twin from the input port and save it to a specified attribute.
Remove digital twins from the input port
Function: Delete the digital twin at the input port.
Set digital twins to the port
Function: Place the target digital twin at the specified port.
Storage node
Outbound sorting
Function: Move the specified digital twin out of the designated storage area.
Inbound sorting
Function: Place the specified digital twin into the designated storage area.
Role nodes
Play animation
Function: Play the specified animation of a role in the digital twin. By default, the animation plays once.
Example: The following example shows how to use the “Play Animation” node to play the “Pick and Move Object” animation for the “Six-Axis Robotic Arm.”
Target object
Basic method to find digital twins: Self
Find the type of object: Role in the digital twin
Target role ID: Six-Axis Robotic Arm
Animation control
Animation name: Pick and Move Object
Animation speed: 1 (normal speed)
Note:
The Target Role ID needs to be entered manually, which is the name of the model’s role; you can view it in the Template Editor.
The Animation Name needs to be entered manually; the model role must include this animation. You can view the specific animation name in DataMesh Importer.
Tool nodes
Idle
Function: Stop the execution of the behavior tree.
Example:
The following Idle node uses the condition “Check if digital twins already exist at the port” to determine when to stop the behavior tree. The behavior tree stops running when other digital twins are present at the port.
Log
Function: The log node is used to output specified text content, which is displayed in the “Output Information” panel in the scene playback interface.
Wait
Function: The wait node is used to pause the execution of the behavior tree until the waiting time has elapsed. Once the waiting time ends, the behavior tree will continue to execute the nodes following the wait node.
Behavior tree nodes
Add behavior tree
Function: Add the target behavior tree to the target digital twin.
Remove behavior tree
Function: Remove the target behavior tree from the target digital twin.
Math nodes
Math nodes are used to perform various mathematical operations and apply the results to digital twin attributes.
Sin Function
Function: Convert the input value to the corresponding sine value and output it.
Cos Function
Function: Convert the input value to the corresponding cosine value and output it.
Tan Function
Function: Convert the input value to the corresponding tangent value and output it.
Normal Distribution
Function: Generate output values based on a normal distribution using the input value.
Poisson Distribution
Function: Generate output values based on a Poisson distribution using the input value.
Random number
Function: Output a random number.
Addition
Function: Select two attribute values of the digital twin for addition, and either replace a specific attribute value of the digital twin with the result or add the result to it.
Subtraction
Function: Select two attribute values of the digital twin for subtraction, and either replace a specific attribute value of the digital twin with the result or add the result to it.
Multiplication
Function: Select two attribute values of the digital twin for multiplication, and either replace a specific attribute value of the digital twin with the result or add the result to it.
Division
Function: Select two attribute values of the digital twin for division, and either replace a specific attribute value of the digital twin with the result or add the result to it.
Event nodes
Receive event
Function: Used to listen for and respond to a specified event. Only one digital twin can listen for and respond to this event. For example, if two AGVs need to compete for the same cargo, the receive event node can be used to ensure that only one AGV responds to the cargo arrival event, avoiding conflicts.
Send event
Function: Send a specified event to trigger actions or behaviors in other digital twins. For example, on a production line, after one processing step is completed, an event can be sent to notify the next processing step’s digital twin to start working.
Node condition
Definition
Node conditions can be added to composite nodes and action nodes, referring to the conditions that need to be met for the node to execute. If the conditions are met, the node executes; if not, it does not execute and returns to the parent node.
Condition: A single condition.
Condition Group: Conditions within a condition group are combined with an “AND” relationship. This means that the node can only execute if all conditions within the group are met.
Multiple condition groups: A node can have multiple condition groups. These groups are combined with an “OR” relationship. This means that if any one of the condition groups is met, the node can execute.
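The AND/OR evaluation described above can be sketched in one line (an illustration only; the function name and boolean representation are our assumptions, not the Designer's internal model):

```python
# Conditions inside a group are ANDed; multiple groups are ORed.

def node_can_execute(condition_groups):
    """condition_groups: a list of groups, each a list of boolean
    condition results. The node runs if any group fully passes."""
    return any(all(group) for group in condition_groups)

# Group 1 fails (one condition is false), group 2 passes, so the node runs.
groups = [[True, False], [True, True]]
print(node_can_execute(groups))  # True
```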
Common conditions
Check if digital twins already exist at the port
This condition is used to determine whether other digital twins are present at the ports of a digital twin.
Configure parameters
Target port: Set the port to be checked.
Digital twin at the port: Specify which digital twin’s port needs to be checked.
Basic method to find digital twins: Select the basic method to find digital twins.
Advanced method to find digital twins: Select the advanced method to find digital twins.
Port name: Select the specific port to be checked.
Example
The following example checks whether there are other digital twins at the input port of the digital twin executing the behavior tree.
Compare attributes of two digital twins
This condition is used to compare specific attribute values of two digital twins to determine whether a particular condition is met, and to perform the corresponding behavior logic based on the comparison result.
Configure parameters
Source attribute
Method to get attributes: Select how to get the source attribute.
Get location: Set the method for getting the source attribute.
Way of finding digital twins: Choose how to find the digital twin.
Target attribute
Method to get attributes: Select how to get the target attribute.
Get location: Set the method for getting the attribute value.
Way of finding digital twins: Choose how to find the digital twin.
Comparison method:
Equal to: The source attribute value is equal to the target attribute value.
Greater than: The source attribute value is greater than the target attribute value.
Less than or equal to: The source attribute value is less than or equal to the target attribute value.
Greater than or equal to: The source attribute value is greater than or equal to the target attribute value.
Not equal to: The source attribute value is not equal to the target attribute value.
Example
Assuming we need to compare the “positions” of two digital twins, A and B, to see if they are the same, the following steps can be set up:
Source attribute
Method to get attributes: Select “Select attribute directly”
Get Location: Select “Attribute of digital twin”
Attribute of digital twin: Select “Position” attribute in the Digital Twin A template
Basic method to find digital twins: Select “Self”, which means the Digital Twin A executing the behavior tree.
Target attribute:
Method to get attributes: Select “Select attribute directly”
Get Location: Select “Attribute of digital twin”
Attribute of digital twin: Select “Position” attribute in the Digital Twin B template
Way of finding digital twins:
Basic method to find digital twins: Select “All the digital twins in the scene”
Method to filter digital twins: select “Find in scene by ID”
Use ID to find digital twin: Select “Digital twin ID”
Value: “Digital Twin B”
Comparison method: Select “Equal to”
Check the distance between two points
Compare the distance between two points against a reference value and perform corresponding actions based on the comparison result.
Configure parameters
First point
Method of getting the position: how to get the position of the first point.
Manual entry: Directly input the 3D coordinates of the point in the format x, y, z.
Location of digital twin: Use the position of the digital twin as the position of the first point.
Attribute of digital twin: Retrieve the position from the attributes of the digital twin.
Behavior tree attribute: Obtain the position from the attributes of the behavior tree.
Find the point in the path map by attribute: Use attribute values to find points in the path map.
Coordinate Type: Choose the type of coordinate to use.
Local coordinate: Coordinates relative to a reference point. For example, the local coordinates of goods on a conveyor belt are relative to the center point of the conveyor belt.
Coordinate: The global (world) coordinate.
Way of finding digital twins: Choose the method for finding the digital twin.
Second point
Method of getting the position: how to get the position of the second point.
Coordinate Type: Choose the type of coordinate to use, either Local coordinate or Coordinate.
Way of finding digital twins: Choose the method for finding the digital twin.
Reference distance value
Method to get attributes: Choose the method for obtaining the reference distance value.
Get location: Choose from where to obtain the reference distance value.
Comparison method: Select the comparison method to determine the relationship between the distance of two points and the reference value.
Equal to: The distance between the two points is equal to the reference value.
Less than: The distance between the two points is less than the reference value.
Greater than: The distance between the two points is greater than the reference value.
Less than or equal to: The distance between the two points is less than or equal to the reference value.
Greater than or equal to: The distance between the two points is greater than or equal to the reference value.
Not equal to: The distance between the two points is not equal to the reference value.
Example
Suppose we need to determine whether the distance between two temporary digital twins (products), A and B, on a conveyor belt is less than a specified safety distance. Follow these steps for configuration:
First point
Method of getting the position: Select “Location of digital twin”
Coordinate type: Select “Local coordinate”
Basic method to find digital twins: Select “Temporary digital twin”, i.e., Digital Twin A
Second point
Method of getting the position: Select “Location of digital twin”
Coordinate type: Select “Local coordinate”
Basic method to find digital twins: Select “Temporary digital twin”
Advanced method to find digital twins: Select “The last digital twin”, i.e., Digital Twin B.
Reference distance value
Method to get attributes: Select “Select attribute directly”
Get location: Select “Manual entry”
Manual entry: “1”
Comparison method
Select “Less than”
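The distance check configured above corresponds to the following sketch (illustrative only; the point coordinates are made-up local coordinates, and `math.dist` stands in for the engine’s distance computation):

```python
import math

def distance(p1, p2):
    """Euclidean distance between two 3D points given as (x, y, z)."""
    return math.dist(p1, p2)

# Local coordinates of the two temporary digital twins on the conveyor belt
point_a = (0.2, 0.0, 0.0)   # first point: position of Digital Twin A
point_b = (0.8, 0.0, 0.0)   # second point: position of Digital Twin B
reference = 1.0             # manually entered reference distance value

# Comparison method "Less than": is the distance below the safety distance?
print(distance(point_a, point_b) < reference)  # True (0.6 < 1.0)
```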
Check if there is a digital twin in the storage area
This condition is used to determine if there are any objects present in the specified storage area.
Configure parameters
Target storage: Specify the target storage area.
Digital twin which belongs to the storage: Used to specify the digital twin belonging to the storage area.
Basic method to find digital twins: Choose the basic method for locating the digital twin belonging to the storage area.
Advanced method to find digital twins: Further refine the search for the digital twin belonging to the storage area.
Example
The following example determines whether there are objects in the storage area of the digital twin executing the current behavior tree.
Check if the digital twin can be placed in the storage area
Determines whether a given digital twin can be placed in the specified storage area.
Configure parameters
Target Storage: Specify the target storage area.
Digital twin which belongs to the storage: Used to specify the digital twin belonging to the storage area.
Basic method to find digital twins: Choose the basic method for locating the digital twin belonging to the storage area.
Advanced method to find digital twins: Further refine the search for the digital twin belonging to the storage area.
Digital twin to be stored:
Basic method to find digital twins: Choose the basic method for locating the digital twin to be stored.
Advanced method to find digital twins: Further refine the search for the digital twin to be stored.
Example
Suppose we need to determine whether a given item can be placed in the storage area of Digital Twin A, the twin executing the behavior tree. The setup can be done as follows:
Target storage
Digital twin which belongs to the storage
Basic method to find digital twins: Select “Self”
Digital twin to be stored
Basic method to find digital twins: Select “Temporary digital twin”
Check if the digital twins are the same
This condition is used to determine whether two digital twins are the same.
Configure parameters
Compare digital twins
Basic method to find digital twins: Choose how to find the digital twin for comparison.
Advanced method to find digital twins: Further refine the search for the digital twin for comparison (e.g., by filtering based on specific conditions).
Specify a digital twin ID
Way of finding digital twins: Choose how to find the digital twins for comparison.
Edit behavior tree
Basic method
The basic process of editing a behavior tree can be divided into the following steps:
1. Function decomposition
Break down complex functions into multiple independent sub-functions. For example, a material generator can be decomposed into two sub-functions: “Production” and “Output.”
2. Prepare digital twin templates a) Create digital twin template b) Add resources: Based on the results of the function decomposition, add resources for each specific goal of the sub-functions. For example, the production function of the material generator requires an outlet to output the generated items. c) Set attributes: Configure the relevant attributes for each sub-function. For example, for the production function of the material generator, set the production interval and the template required for producing items.
3. Edit the behavior tree: Use the behavior tree editor to define the specific behavior logic for each sub-function. For instance, the production function of the material generator can be set to produce items at regular intervals (Wait node) and create digital twins (Create Digital Twin node). The output function can use a Set Digital Twins to the Port node to send goods to the outlet.
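The node types above can be sketched in plain Python to show how a behavior tree evaluates (a minimal illustration, not the FactVerse engine; all class and function names are hypothetical):

```python
# Minimal behavior-tree sketch: a Selector runs children until one succeeds;
# a Sequence runs children until one fails; leaves do the actual work.

class Sequence:
    def __init__(self, *children):
        self.children = children
    def tick(self):
        # Succeeds only if every child succeeds, in order.
        return all(child.tick() for child in self.children)

class Selector:
    def __init__(self, *children):
        self.children = children
    def tick(self):
        # Succeeds as soon as any child succeeds.
        return any(child.tick() for child in self.children)

class Action:
    """Leaf node wrapping a function that returns True (success) or False."""
    def __init__(self, fn):
        self.fn = fn
    def tick(self):
        return self.fn()

# "Production" sub-function: wait for the interval, then create an item.
log = []
produce = Sequence(Action(lambda: log.append("wait 3 s") or True),
                   Action(lambda: log.append("create digital twin") or True))
root = Selector(produce)
root.tick()
print(log)  # ['wait 3 s', 'create digital twin']
```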
Example
Basic functions of the Source
1. Function decomposition: “Production” and “Output.”
2. Prepare digital twin template: a) Create a digital twin template: Create a twin template and add a Source model to the template (“/Public Directory/FactVerseDLC/ChuangJianQi_DLC”). b) Add resources: Add an “Output” to the production function of the Source.
c) Set attributes: i. In the Digital Twin Template pane, click the ︙ next to Metadata and select Add structure to add “Part_1.” ii. Set production interval attribute: Under “Part_1,” add a “Production Interval” (Double type) attribute with the following settings:
Unit: “s”
Default value: “3”
Check the option “Show in the digital twin.” When checked, users can edit this attribute in the scene editor when using this template to generate digital twins.
iii. Set template ID attribute for production items: Under “Part_1,” add a “Generated Digital Twin Identifier” (String type) property with the following settings:
Purpose: Select “Digital twin template”
Check the option “Show in the digital twin.”
iv. Save template: Click the “Save” button on the toolbar to save the template.
3. Edit behavior tree: a) Create behavior tree: i. In the Digital Twin Template pane, click the ︙ next to the Behavior Tree section and select “Create Behavior Tree.”
ii. In the opened window, choose the storage path and enter the Behavior Tree name.
iii. Click the “New” button to complete the creation of the Behavior Tree. iv. Click the “Save” button on the toolbar to save the template.
b) Edit behavior tree: The newly created Behavior Tree “Behavior tree of Source” only contains a root node. The following steps will add nodes and running conditions to the “Behavior tree of Source.” i. In the Digital Twin Template pane, double-click the newly created “Behavior tree of Source” under the Behavior Tree section to open the Behavior Tree Editor. ii. Add a Selector node and connect the Root node with the Selector node. For connection methods, refer to the section Node operations.
iii. Add an Idle node and connect the Selector node with the Idle node. Set the running conditions of the Idle node to stop when production items are present at the output.
iv. Add a Sequence node and connect the Selector node with the Sequence node.
v. Add a Wait node and connect the Wait node with the Sequence node. Set the attributes of the Wait node:
Waiting time
Get location: Attribute of digital twin
Attribute of digital twin: Production interval
vi. Add a node for producing items: Add a Create Digital Twin node, connect the Sequence node with the Create Digital Twin node, and set the attributes of the Create Digital Twin node:
The template used to create the digital twin
Method to get attributes: Select attribute directly
Get location: Attribute of digital twin
Attribute of digital twin: Generated Digital Twin Identifier
Create the location of the digital twin
Method to get attributes: Select attribute directly
Get location: Attribute of digital twin
Attribute of digital twin: Source Template > Output port_1 > The location of the port
vii. Add a Set Digital Twins to the Port node: Connect the Sequence node with the Set Digital Twins to the Port node, and set the attributes of the node:
Target port
Digital twin at the port
Basic method to find digital twins: Self
Port name: Source Template > Output port_1
Set the digital twin to the port: “Temporary digital twin”
viii. Click the “Save” button on the toolbar to save the Behavior Tree.
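The cycle these nodes describe can be summarized in a short sketch (assumed names and values; in FactVerse this logic lives in the behavior tree itself, not in user code). Each cycle: if the output port is occupied, idle; otherwise wait the production interval, create an item, and set it to the output port.

```python
production_interval = 3.0   # the "Production Interval" attribute, in seconds
output_port = []            # contents of "Output port_1"
produced = []

for cycle in range(3):
    if output_port:                 # Idle node: an item is still at the port
        continue
    # Wait node: time.sleep(production_interval) in a real run
    item = f"item_{cycle}"          # Create Digital Twin node
    produced.append(item)
    output_port.append(item)        # Set Digital Twins to the Port node
    output_port.clear()             # stand-in for downstream pickup

print(produced)  # ['item_0', 'item_1', 'item_2']
```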
Moving ball
1. Create a Ball Twin Template: a) Click on New Template in the homepage to open the template editor. b) Drag the “Sphere” from Libraries into the scene area and adjust its position. c) Change the color of the ball to green.
d) Click the save button on the toolbar and save the template as “Moving Ball.”
2. Add main function structure to “Moving Ball” Template:
a) In the Digital Twin Template pane, click the ⋮ next to Metadata and select Add Structure. b) Scroll down to find the newly added structure “Part_1,” click on it, and change the structure name in the attributes area to “Main Function.” c) Under the “Main Function” structure, add a “Real-time Position” (Vector3) attribute.
3. Add behavior tree for moving ball: a) In the Digital twin template pane, click the ⋮ next to Behavior Tree and select Create Behavior Tree. b) Save the behavior tree and name it “Move the ball.”
4. Edit behavior tree logic
a) Double-click the “Move the ball” behavior tree to open the behavior tree editor. b) Add a Sequence node under the Root node. c) Under the Sequence node, add a Set Digital Twin Pose node “Set Digital Twin Pose 1” and configure the attributes as follows:
Specify a digital twin
Basic method to find digital twins: Self
Find the type of object: Digital twin
Set the position of the digital twin
Method to get attributes: Select attribute directly
Get location: Manual entry
Manual entry: 0.5,0,0
d) Add a Wait node “Wait 1” under the Sequence node and set the Manual entry value for wait time to “0.5.”
e) Add another Set Digital Twin Pose node: i. Copy “Set Digital Twin Pose 1” using CTRL+C and paste it with CTRL+V under the Sequence node to create a new Set Digital Twin Pose node “copy_Set Twin Pose 1.” ii. Rename it to “Set Digital Twin Pose 2.” iii. Change the Manual entry for Set the position of the digital twin to “0.5,0,-0.5.”
f) Copy “Wait 1” node using CTRL+C and paste it with CTRL+V to create a new wait node “copy_Wait 1,” renaming it to “Wait 2.”
g) Add a “Set Digital Twin Pose 3” node by copying “Set Digital Twin Pose 1,” and change the Manual entry for Set the position of the digital twin to “0,0,-0.5.”
h) Copy “Wait 1” node again using CTRL+C and paste it with CTRL+V to create a new wait node “copy_Wait 1,” renaming it to “Wait 3.”
i) Click the Save button on the toolbar to save the behavior tree. j) Click the < button to exit the behavior tree editor. k) Save the “Moving Ball” template and click the homepage button to exit the template editor.
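One pass of the sequence built above can be sketched as follows (illustrative only; in FactVerse this is built from Set Digital Twin Pose and Wait nodes, not user code):

```python
# The three target positions set by the Set Digital Twin Pose nodes,
# visited in order, with a pause between each (the Wait nodes).
waypoints = [(0.5, 0.0, 0.0), (0.5, 0.0, -0.5), (0.0, 0.0, -0.5)]
wait_time = 0.5  # seconds, the Wait nodes' Manual entry value

ball_position = (0.0, 0.0, 0.0)
for target in waypoints:
    ball_position = target   # Set Digital Twin Pose node
    # a real run would pause here for wait_time (Wait node)

print(ball_position)  # (0.0, 0.0, -0.5)
```

When the scene is played, the root loops this sequence, so the ball cycles through the three positions repeatedly.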
5. Create a scene and add a moving ball in the scene
a) Create a new scene named “Moving Ball.” b) Use the “Moving Ball” template to create a “Moving Ball 1” digital twin. c) Add “Moving Ball 1” to the scene.
d) Play the scene to preview the ball’s movement.
Response to the signal to change the color of the cube
In this example, we will use event nodes to listen for and respond to events, changing the color attribute of a cube.
1. Create a “Signal-Sending Sphere” template: a) Click on New Template in the homepage to open the template editor. b) Drag the “Sphere” from Libraries into the scene area and adjust its position.
c) Click the save button in the toolbar and save the template as “Signal-Sending Sphere.”
2. Add main function structure to “Signal-Sending Sphere” template: a) In the Digital Twin Template pane, click ︙ next to Metadata and select “Add Structure.” b) Scroll down to find the newly added structure “Part_1,” click on it, and change its name to “Main Function” in the properties area. c) Under the “Main Function” structure, add a “Signal” (Int) attribute.
3. Add Behavior Tree for “Signal-Sending Sphere”: a) In the Digital Twin Template pane, click ︙ next to the behavior tree section and select “Create Behavior Tree.”
b) Save the behavior tree and name it “Send Signal.”
4. Edit behavior tree logic: In this example, three signals “100,” “200,” and “300” will be sent. a) Double-click on the “Send Signal” behavior tree to open the behavior tree editor. b) Add a Sequence node under the Root node and use the default attributes. c) Under the Sequence node, add a Send Event node “Send Signal 100,” and set its attributes as follows:
Event type
Method to get attributes: Select attribute directly
Get location: Manual entry
Manual entry: 100
d) Add a Wait node “Wait 1” under the Sequence node and set the manual entry wait time to “2.”
e) Add the “Send Signal 200” node: i. Copy and paste the “Send Signal 100” node; it will automatically be added under the Sequence node. ii. Rename it to “Send Signal 200.” iii. Reconnect the Sequence node and the “Send Signal 200” node. iv. Change the manual entry value for the Event type in the “Send Signal 200” node to “200.”
f) Add a Wait node “Wait 2” and set the manual entry wait time to “2.”
g) Add the “Send Signal 300” node: i. Copy and paste the “Send Signal 100” node; it will automatically be added under the Sequence node. ii. Rename it to “Send Signal 300.” iii. Reconnect the Sequence node and the “Send Signal 300” node. iv. Change the manual entry value for the Event type in the “Send Signal 300” node to “300.”
h) Add a Wait node “Wait 3” and set the manual entry wait time to “2.”
The following image shows the complete behavior tree structure:
i) Save the behavior tree and exit the behavior tree editing interface. j) Save the template and return to the homepage.
5. Add a Behavior Tree for the “Color-Changing Cube” template to receive signals and change color:
a) Open the “Color-Changing Cube” template. b) Change the cube’s color to white. c) Create a new “Event Response” behavior tree in the template. d) Double-click the “Event Response” behavior tree to open the editing interface. e) Add a Parallel node under the Root node. f) Add a Sequence node “Sequence Node 1” under the Parallel node. g) Add a Receive event node “Receive Signal 100” under “Sequence Node 1” and set the attributes as follows:
Event type
Method to get attributes: Select attribute directly
Get location: Manual entry
Manual entry: 100
h) Add a Set attributes node “Set Signal” under “Sequence Node 1” and set the attributes as follows:
Target attribute
Method to get attributes: Select attribute directly
Attribute source: Attribute of digital twin
Strategy for setting value: Replace the original value
Attribute of digital twin: Signal (the “Signal” attribute of the “Color-Changing Cube” template)
Target attribute value
Method to get attributes: Select attribute directly
Get location: Manual entry
Manual entry: 100
i) Add a Set attributes node under “Sequence Node 1”, rename it to “Set Color”, and set the attributes as follows:
Target attribute
Method to get attributes: Select attribute directly
Attribute source: Attribute of digital twin
Strategy for setting value: Replace the original value
Attribute of digital twin: Color (the “Color” attribute of the “Color-Changing Cube” template)
Specify a digital twin
Basic method to find digital twins: Self
Target attribute value
Method to get attributes: Select attribute directly
Get location: Manual entry
Manual entry: 255,254,145,10
j) Add a Sequence node “Sequence Node 2” under the Parallel node. k) Add a Receive event node “Receive Signal 200” under “Sequence Node 2” and set the attributes as follows:
Event type
Method to get attributes: Select attribute directly
Get location: Manual entry
Manual entry: 200
l) Add a Set attributes node “Set Signal” under “Sequence Node 2” and set the attributes as follows:
Target attribute
Method to get attributes: Select attribute directly
Attribute source: Attribute of digital twin
Strategy for setting value: Replace the original value
Attribute of digital twin: Signal (the “Signal” attribute of the “Color-Changing Cube” template)
Target attribute value
Method to get attributes: Select attribute directly
Get location: Manual entry
Manual entry: 200
m) Add a Set attributes node under “Sequence Node 2”, rename it to “Set Color”, and set the attributes as follows:
Target attribute
Method to get attributes: Select attribute directly
Attribute source: Attribute of digital twin
Strategy for setting value: Replace the original value
Attribute of digital twin: Color (the “Color” attribute of the “Color-Changing Cube” template)
Specify a digital twin
Basic method to find digital twins: Self
Target attribute value
Method to get attributes: Select attribute directly
Get location: Manual entry
Manual entry: 100,100,200,200
n) Add a Sequence node “Sequence Node 3” under the Parallel node. o) Add a Receive event node “Receive Signal 300” under “Sequence Node 3” and set the attributes as follows:
Event type
Method to get attributes: Select attribute directly
Get location: Manual entry
Manual entry: 300
p) Add a Set attributes node “Set Signal” under “Sequence Node 3” and set the attributes as follows:
Target attribute
Method to get attributes: Select attribute directly
Attribute source: Attribute of digital twin
Strategy for setting value: Replace the original value
Attribute of digital twin: Signal (the “Signal” attribute of the “Color-Changing Cube” template)
Target attribute value
Method to get attributes: Select attribute directly
Get location: Manual entry
Manual entry: 300
q) Add a Set attributes node under “Sequence Node 3”, rename it to “Set Color”, and set the attributes as follows:
Target attribute
Method to get attributes: Select attribute directly
Attribute source: Attribute of digital twin
Strategy for setting value: Replace the original value
Attribute of digital twin: Color (the “Color” attribute of the “Color-Changing Cube” template)
Specify a digital twin
Basic method to find digital twins: Self
Target attribute value
Method to get attributes: Select attribute directly
Get location: Manual entry
Manual entry: 255,100,100,100
The following image shows the complete behavior tree structure:
r) Click the Save button in the toolbar to save the behavior tree. s) Click < to exit the behavior tree editor. t) Save the template and return to the homepage.
6. Create a New Scene Named “Event Response”.
7. Add a Sphere that can send events and two Cubes that can receive events in the scene: a) Create two cubes “Cube 1” and “Cube 2” using the “Color-Changing Cube” template. b) Create a “Signal Sending Sphere 1” using the “Signal-Sending Sphere” template. c) Add “Cube 1”, “Cube 2”, and “Signal Sending Sphere 1” to the scene.
8. Save the scene.
9. Play the scene.
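The signal flow built in this example can be sketched as follows (a hedged illustration; the class, the dispatch loop, and the mapping are hypothetical stand-ins for the Send Event and Receive event nodes):

```python
# The sphere sends signals 100/200/300; each cube maps the received signal
# to the "Set Signal" and "Set Color" values configured above.
signal_to_color = {
    100: (255, 254, 145, 10),
    200: (100, 100, 200, 200),
    300: (255, 100, 100, 100),
}

class Cube:
    def __init__(self):
        self.signal = 0
        self.color = (255, 255, 255, 255)  # starts white
    def receive(self, event):
        if event in signal_to_color:             # Receive event node
            self.signal = event                  # "Set Signal" node
            self.color = signal_to_color[event]  # "Set Color" node

cube_1, cube_2 = Cube(), Cube()
for signal in (100, 200, 300):   # the "Send Signal" sequence, 2 s apart
    for cube in (cube_1, cube_2):
        cube.receive(signal)

print(cube_1.color)  # (255, 100, 100, 100) after the last signal
```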
Chasing a moving ball
This example demonstrates how to create a cube digital twin in a virtual scene and implement behavior tree logic to have it automatically chase a moving ball.
1. Create a cube digital twin template: a) Click New Template on the homepage to enter the template editor. b) Drag the “Cube” from the tool pane into the scene area and adjust its position. c) Click the save button in the toolbar and name the template “Chasing Target Cube.”
2. Add Behavior Tree for Chasing Target Cube: a) In the Digital Twin Template pane, click ︙ next to the Behavior Tree section and select “Create Behavior Tree.” b) Save the behavior tree and name it “Chasing Target Cube.”
3. Edit behavior tree logic: a) Double-click the “Chasing Target Cube” behavior tree to open the behavior tree editor. b) Add a Sequence node under the Root node. c) Under the Sequence node, add a Track digital twins along the path node and configure the following attributes:
Save the path to
Method to get attributes: Select attribute directly
Source attribute: Attribute of digital twin
Strategy for setting value: Replace the original value
Attribute of digital twin: Current moving path (the “Current moving path” attribute of “Chasing Target Cube”)
Start point
Method to get attributes: Select attribute directly
Get location: Manual entry
Manual entry: point0
Target to be chased
Specify a digital twin
Basic method to find digital twins: All the digital twins in the scene
Method to filter digital twins: Find in scene by ID
Get objects from attributes: Attribute of digital twin
Use ID to find digital twin: Digital Twin ID
Value: 1599 (the ID of “Moving Ball 1” from the Moving ball example; you can check the ID in its digital twin attribute pane)
Find the type of object: Digital twin
e) Under the Sequence node, add a Move digital twins along a path node and configure the following attributes:
Set path
Method to get attributes: Select attribute directly
Get location: Attribute of digital twin
Attribute of digital twin: Current moving path (the “Current moving path” attribute of “Chasing Target Cube”)
Set speed:
Target speed: “10”
f) Click the save button in the toolbar to save the behavior tree. g) Click < to exit the behavior tree editor. h) Save the “Chasing Target Cube” template, then click the homepage button to exit the template editor.
4. Create a Scene: Create a new scene named “Chasing Moving Ball.”
5. Add a Chasing Target Cube and a Moving Ball to the Scene: a) Use the “Chasing Target Cube” template to create “Chasing Target Cube 1” and add it to the scene at position “0,0,0.”
b) Use the “Moving Ball” template to create “Moving Ball 1” and add it to the scene at position “0.5,0,-0.5.”
6. Create a path connecting four points: a) Hide “Chasing Target Cube 1” and “Moving Ball 1” to prevent obstruction while creating the path points, ensuring accurate placement.
b) Enter the path planning interface, use the build mode to create the following four points and connect them. These points will form the movement path for “Chasing Target Cube 1,” where “point0” is the starting point. The cube will calculate the shortest path to chase “Moving Ball 1.”
point0 (0,0,0)
point1 (0.5,0,0)
point2 (0.5,0,-0.5)
point3 (0,0,-0.5)
c) Switch to selection mode and change the connection direction to bidirectional.
d) Save the path and exit the path planning interface. e) Switch “Chasing Target Cube 1” and “Moving Ball 1” back to display status.
7. Save the Scene.
8. Play the Scene: “Chasing Target Cube 1” will chase “Moving Ball 1” along the shortest path.
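The shortest-path behavior in step 6 can be illustrated with a breadth-first search over the four bidirectional points (a sketch only; FactVerse’s actual path-finding algorithm is not documented here, and BFS suffices because all four edges happen to have equal length):

```python
from collections import deque

# Bidirectional connections of the four-point path map created above.
edges = {
    "point0": ["point1", "point3"],
    "point1": ["point0", "point2"],
    "point2": ["point1", "point3"],
    "point3": ["point0", "point2"],
}

def shortest_path(start, goal):
    """Breadth-first search; returns one shortest point sequence."""
    queue, visited = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in edges[path[-1]]:
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])

# Cube starting at point0, chasing the ball currently near point2:
print(shortest_path("point0", "point2"))  # ['point0', 'point1', 'point2']
```

Note that point0 → point3 → point2 is equally short; BFS simply returns whichever route it enqueues first.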
This version is an update from August 2024, with an internal version number of 7.1.0.
New Features
Scene object locking functionality:
By locking specific objects, accidental operations or movements during the editing process are prevented, ensuring the stability of the scene layout.
Scene debugging runtime mode:
Supports breakpoint debugging of behavior trees. Users can set breakpoints during the execution of the behavior tree, step through the process, and view and adjust condition configurations for more precise control over the behavior tree’s execution logic.
Batch creation and modification of digital twins:
Provides functionality for batch creation and modification of digital twins through tables. Users can efficiently input and import data for multiple digital twins, reducing manual repetitive tasks and improving work efficiency.
Optimizations
Optimization of conveyor input and output ports:
The input and output ports of the conveyor belt will automatically adjust according to the belt’s editing, ensuring they are always in the correct location, thereby enhancing convenience and accuracy.
Expose identifier for digital twin content
Allows users to view and modify the identifiers of digital twins, enhancing the flexibility and transparency of digital twin content management.
Behavior tree node library expansion
New node types:
Track digital twins along the path: Automatically navigates to and tracks target digital twins.
Event nodes: Send and receive custom events.
Multiplication and division nodes: Perform arithmetic operations using attribute values.
DataMesh Studio
1. Can multiple people edit a scenario simultaneously?
No, a scenario can only be edited by one account at a time. Users attempting to open a scenario that is being edited by someone else will not be allowed to do so and will see a prompt saying “Another user is editing the scenario.”
2. Can multiple scenarios be merged or split? Can you switch across scenarios?
No, the current version does not support scenario merging, splitting, or cross-scenario navigation.
3. Can you add a hot zone to image resources?
No, only models and their sub-objects can have hot zones added. For more information on the hot zone, please refer to the Link section in the DataMesh Studio User Manual.
4. Can the camera be deleted or reset?
No, the camera is the default element in the scene. There can only be one camera per scenario, which cannot be added or removed.
The current camera POV (Point of view) cannot be reset but can be adjusted and saved. The “Get Camera POV” function allows you to switch to the camera’s view.
For more information about the camera, refer to the Camera section in the DataMesh Studio User Manual.
5. Why do camera movements not work on MR devices?
Please check the Camera Attribute in the scenario within DataMesh Studio and make sure the “Apply Camera Movement” option is checked. The camera movement feature will only function properly if this option is selected.
6. Is lighting effective in MR mode?
Yes, it is. If you need lighting effects, you need to enable the light sources in the DataMesh Studio scenario. For more details on lighting, please refer to the Light chapter in the DataMesh Studio User Manual.
7. Can you copy and paste sub-objects of the model in Studio?
No. In DataMesh Studio, you can only copy the entire element as a whole.
8. Can you define the center point of an individual model in Studio?
No, Studio does not allow you to customize the center point for an individual element. However, when multiple elements are “grouped” or when multiple elements are selected, you can choose a public center or designate a specific element as the center point. For more information on center point settings, refer to the Multiple Selection and Group Elements section in the DataMesh Studio User Manual.
9. What are the differences between the three export options for scenarios?
Export Basic Scenario: The exported scenario file contains only the basic scenario file and does not include resource files. Note: To import and use this basic scenario file, you must have access to all the resources within the scenario.
Export Full Scenario: The exported scenario includes the basic scenario file and all related original resources (unaccelerated). Users can directly use the scenario after importing it, but resources need to be re-accelerated.
Export Accelerated Scenario: The exported scenario includes the basic scenario file, related original resources, and accelerated resource attachments. Users can directly use the scenario and accelerated resources after importing.
10. How to unlock a file when prompted “Another user is editing the scenario” upon opening the scenario?
This message indicates that the scenario is currently being edited by a user and has been locked. Only the scenario’s creator or the current editor can unlock the file. To unlock it, go to FactVerse -> Digital Assets -> Resources, find the corresponding scenario file, and click the padlock icon.
11. Why can’t certain elements be deleted in Studio?
In Studio, due to the default inheritance between scenes, elements can only be deleted on the scene page where they “appear.” You can add “disappear” actions to the element in subsequent scenes, which means the element will not be displayed in later scenes.
DataMesh One
1. What are the prerequisites for event collaboration?
a) The company needs to enable event collaboration services. b) Accounts participating in the event must be under the same enterprise account. c) Accounts participating in the event must all be granted One-end Standard Mode permissions in FactVerse -> License Management.
2. Does scanning a resource code to view a scenario or model involve permission issues?
Yes, it does. If your account does not have permission to view a specific resource in FactVerse -> Digital Assets, you will not be able to open that resource by scanning the resource code.
3. How to select child objects in editing mode?
In editing mode, you need to select objects layer by layer according to the model structure: Click once to select the parent level, click a second time to select the secondary level, and so on.
4. How many position codes can a scenario support at most?
A scenario supports up to 20 position codes.
5. Position code scanning error issues
Inconsistent placement direction: The placement direction of position codes is either horizontal or vertical. If the actual placement of the QR codes does not match the scenario settings, an error will be prompted.
Scanning QR codes not set in the scenario: Scanning position codes that are not set in the scenario will also trigger an error. For example, if the scenario is set with only 2 position codes, scanning a third position code in DataMesh One will result in an error.
DataMesh Importer
1. Why can’t I open my model?
Unsupported model format: This may be due to model format incompatibility. Currently, the Importer primarily supports FBX and glTF format files, and is also compatible with OBJ, STL, 3MF, and PLY formats.
Model data loss: Data loss may have occurred during model export, which can lead to errors when opening the model in the Importer.
2. What should I do if the model only shows one side or is missing polygons after opening?
This issue is due to incorrect model normals. You need to return to the modeling software and adjust the normal directions.
3. Can you delete or modify child objects of a model in the Importer?
No. The Importer can only adjust nodes within the model structure when uploading the model, but it cannot delete sub-objects or modify them individually.
4. Why do substructure names in the model structure appear as letters or garbled text?
This is due to Chinese character encoding issues that occur during model export or conversion.
5. What should I do if the model is lagging or loading too slowly?
Large numbers of vertices and polygons or complex substructures can make the model file very large, affecting performance and loading speed. The model needs to be optimized for better performance.
6. Why is data missing after opening the model, such as materials or textures, and how should I handle it?
When exporting from Blender:
Pack Data: Ensure you check the “Pack Data” option. When exporting to FBX, make sure to set the path to “Copy” to prevent texture loss.
Material/Texture Issues: If you notice that the exported model shows only one material or textures are misaligned, it’s likely because Blender defaults to naming UV maps as “UVMap,” which differs from C4D. You can resolve this by either unwrapping the UVs again or renaming the UV map.
When exporting FBX from 3ds Max:
Ensure you check the “Embed Media” option to make sure textures and materials are correctly included in the export.
When exporting models from C4D:
Models should be converted via Blender to export as FBX or GLB format.
7. What is the purpose of checking model structure nodes in the Importer?
By selecting model structure nodes, you can selectively control the components of the model. When uploading the model, selected nodes will be uploaded as separate objects, while unselected nodes will be merged with other nodes.
DataMesh FactVerse
1. How many admin accounts can a company have?
Each company can have only one admin account. Depending on the service package purchased by the company, multiple FactVerse accounts can be created.
2. Can different accounts within the same enterprise view each other’s edited/uploaded content?
a) Different accounts within the same department can share resources in their respective “department” folders. b) Resources in the “My Space” folder are only visible to the individual user. c) Users can create new folders and set the public scope of those folders themselves.
3. What does accelerated service mean?
The FactVerse platform’s model acceleration service automatically optimizes uploaded models to enhance loading and rendering performance across various platforms. The larger the model, the more significant the benefits of the acceleration service; smaller models may not show as much difference.
4. When logging into FactVerse for the first time, the message “License is not assigned. Please contact your administrator” appears.
The company admin must assign departments, roles, and licenses to accounts within the enterprise management module of FactVerse. Once the assignments are completed, users will be able to access the corresponding permissions.
For more information on permission allocation, please refer to the User Permission Management section in the DataMesh FactVerse User Manual.
5. What is the difference between uploading a model using Importer and directly uploading a model on FactVerse?
Importer Upload: Allows users to select and upload specific model structure nodes as needed.
FactVerse Upload: The cloud server cannot recognize the internal structure of the model, so the model is uploaded to the resource library as a single resource. In Studio, this resource does not have a model structure.
See section Login for login details. After a successful login, the token field in the returned result will be used in subsequent requests.
Add header
Add Authorization: Bearer {token} to the HTTP request header.
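For example, the header can be attached like this in JavaScript (the endpoint path in the usage comment is a placeholder, not an actual API route):

```javascript
// Build the headers for an authenticated API request.
// `token` is the value returned by the login interface (see Login).
function buildAuthHeaders(token) {
  return {
    "Content-Type": "application/json",
    // The required Authorization header: "Bearer " plus the login token.
    "Authorization": `Bearer ${token}`,
  };
}

// Usage sketch (replace the URL with a real endpoint):
// fetch("https://<server>/api/v6/example", { headers: buildAuthHeaders(token) });
```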
Encryption
Interfaces involving passwords must encrypt the transmitted strings. For technical support, please contact support@datamesh.com.
JS encryption example
Reference JSEncrypt:
import JSEncrypt from "jsencrypt";

// RSA_PUBLIC_KEY is the RSA public key provided by the server.
export function encryption(password) {
  let encryptor = new JSEncrypt();
  encryptor.setPublicKey(RSA_PUBLIC_KEY);
  // encrypt() returns a Base64-encoded string, or false if encryption fails.
  let result = encryptor.encrypt(password);
  return result;
}
Client-side login or cancellation after web-side approval
Endpoint
POST /api/v6/auth/scan/loginOrCancel
Description
In the QR code login process, after the scanning client sends its parameters and the user clicks “Agree” on the web page, the scanning client either completes the login or cancels it.
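As a hedged sketch, a JavaScript client might assemble the call to this endpoint as follows. The exact request body fields are defined by the API reference and are passed through unchanged here; `baseUrl` and `body` are illustrative assumptions:

```javascript
// Build the request descriptor for the scan-login endpoint.
// This function only assembles the URL, method, and auth header;
// the body fields themselves come from the API reference.
function buildLoginOrCancelRequest(baseUrl, token, body) {
  return {
    url: `${baseUrl}/api/v6/auth/scan/loginOrCancel`,
    options: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        "Authorization": `Bearer ${token}`,
      },
      body: JSON.stringify(body),
    },
  };
}

// Usage sketch:
// const { url, options } = buildLoginOrCancelRequest(serverUrl, token, params);
// const result = await fetch(url, options).then(r => r.json());
```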
1. Please use the latest version from the app store to access the latest features and fixes.
2. Keep the model’s texture resolution below 1024 and the number of textures below 50.
3. Models made with PBR (Physically Based Rendering) materials can have excellent display effects on Vision Pro.
Troubleshooting
1. After opening a scenario, the scenario operation menu is obscured by the model.
Solution: Hold down the left fist for 1.5 seconds to bring the scenario operation menu back to your hand.
2. The virtual keyboard disappears when entering account credentials.
Solution: Look around; the keyboard may be behind you.
3. In collaboration events, if a member exits the app improperly, they cannot rejoin.
Solution: The event creator can remove the member from the server or recreate the event to resolve this issue.
4. The app freezes.
Solution: Simultaneously press and hold the top button and Digital Crown until you see Force Quit Applications, tap the name of the app you want to close, then tap Force Quit.
5. When sharing the Vision Pro screen, Apple devices cannot be found in the device list.
Solution: Ensure that AirPlay is enabled on the Apple device and that everyone is allowed to AirPlay to the current device.
6. Internal sub-objects of the model cannot be selected.
Solution: Walk inside the model to accurately select sub-objects. For small models, zoom in first, then walk inside the model to select.
7. The app window is too far away or too close.
Solution: Look at the bottom window bar, pinch and drag the window bar to adjust to the appropriate distance. Note: The window shrinks when closer to the user and enlarges when farther away.
Look at content (like an app icon or button), then tap your index finger and thumb together to select it.
Touch
Type, interact in interactive experiences
Interact with certain elements directly with your fingers. For example, you can touch keys on the virtual keyboard, similar to typing on a physical keyboard.
Pinch and hold
Show additional options, zoom in and out
Pinch and hold your thumb and index finger together. For example, pinch and hold at the bottom of an app to see additional options (such as to close other apps).
You can also pinch and hold with both hands and pull apart to zoom in, or move them closer to zoom out.
Pinch and drag
Move windows, scroll
Pinch and hold to grab a window bar, photos, content or an object, then drag it wherever you like. For example, you can pinch and drag the window bar of an app or a shape in a Freeform board.
You can also pinch and drag to scroll. For example, in Photos, you can scroll up or down through your albums. Swipe to scroll quickly; tap to stop scrolling.
Swipe
Scroll quickly
Pinch and quickly flick your wrist.
Previous page of the scenario
Page navigation
Left palm facing up, pinch with index finger and thumb.
Next page of the scenario
Page navigation
Right palm facing up, pinch with index finger and thumb.
Simultaneously press and hold the top button and Digital Crown until you see Force Quit Applications, tap the name of the app you want to close, then tap Force Quit.
Take a capture
Simultaneously press the Digital Crown and the top button.
Redo eye and hand setup
Quickly redo eye and hand setup: Quadruple-click the top button, then follow the instructions.
Redo eye setup only: Go to Settings > Eyes & Hands > Redo Eye Setup, then follow the instructions.
Redo hand setup only: Go to Settings > Eyes & Hands > Redo Hand Setup, then follow the instructions.
Share your view
1. Open Control Centre, then tap .
2. Tap , then choose a compatible device from the list of available devices.
Open Photos, find the spatial photo or video you want to share, tap the More button, lightly tap the Share button, then choose an option such as “AirDrop,” “Mail,” or “Messages.”
Standard mode
DataMesh One has two browsing modes: Standard Mode and Training Mode. In Standard Mode, users can freely browse various resources and scene contents while also participating in multi-user collaboration events. You can switch browsing modes on the settings page.
Navigation menu
Resources
You can switch between browsing scenario-type resources or other types of files, such as 3D models, images, and PDF files, by clicking on the Scenario or Element tab in the top right corner of the page.
Scenes
Events
Training mode
In Training Mode, you can complete learning tasks and exam tasks, experiencing a more immersive learning environment through XR-based technology.
Navigation bar
Group list
Courseware list
Learning task interface
Exam interaction interface
Course record
Quick start
Preparation
1. Make sure the Apple Vision Pro device is connected to the internet. a) Go to Settings > Wi-Fi, then turn on Wi-Fi. b) Tap a network, then enter the password (if required).
2. Complete eye and hand setup.
3. Set up your space: make sure it is clear of any obstacles that you could bump into, trip over, or hit with your hands.
4. Install DataMesh One: a) Directly open the App Store to browse and install DataMesh One. b) Download and install DataMesh One through the browser from the following address: https://apps.apple.com/app/datamesh-one/id1514070248.
Login
Private Deployment: When logging in to an enterprise using a private deployment server, you need to set a private deployment service code. Users can click on this icon to set the private deployment service code.
Server List: Clicking on the server list allows you to switch server regions.
Login Steps:
1. Configure server: a) Select public server: If your enterprise is deployed on a public server, choose the server belonging to the enterprise account from the server list. b) Set up private deployment server: If your enterprise uses a private deployment server, you need to click on the private deployment icon to set the exclusive service code.
2. Enter your FactVerse account and password.
3. Check the box “I have read and agree to the DataMesh ‘Terms of Use’ and ‘Privacy Policy’”, then click the Sign in button.
4. Select your enterprise account: If you have multiple accounts, select the correct one from a list.
5. Select the browsing mode: If your account has licenses for two different browsing modes simultaneously, the mode selection interface will appear. Select the mode and click Enter to complete the login.
Scenario playback
To browse scenario resources on the resources page, look at the scenario file, and pinch with your index finger to open the scenario and enter the scenario playback interface.
Note: There is an issue in the current version (DataMesh One 7.0 VisionOS version) where scenarios cannot be opened if they contain PDF files, advanced resources, models without source files (only .ab), or models from versions earlier than 4.0.
In the scenario playback interface, you can use the scenario operation menu to position and edit the scenario. The scenario operation menu is as follows:
Position: Click to enter positioning mode.
Edit: Drag, move, rotate, etc., the model as a whole or in parts through gestures.
Stage List: Click to display a list of scenario scene playback steps, and click on the scene name to jump to the specified scene.
Exit: Click to exit scenario playback.
Back/Next: Page-turning keys.
Scan hexagonal grid to position
Unlike other platforms, the Vision Pro version of DataMesh One uses hexagonal grids for scenario scanning and positioning. The steps for using hexagonal grids for scanning and positioning are as follows:
1. On the Resources page of the Digital Assets module of the FactVerse platform, find the scenario you intend to use.
2. Open the Resources details page of the scenario and download the hexagonal grid positioning code.
3. Place the downloaded hexagonal grid positioning code at the target location.
4. On the standard mode interface of DataMesh One, open the scenario in the resource list.
5. Click the positioning button to enter the resource positioning mode.
6. Click the scan button to enter the scan positioning mode.
7. Scan the positioning code placed at the target location to complete the scenario positioning.
Collaboration
1. Click on the Events tab in the navigation menu to switch to the event list.
2. Click on the “+” button in the top right corner of the events page to enter the event creation interface.
3. Click on the Add button to open the scenario list, select a scenario for the event, and save.
4. Click on the Play button to start the event.
The collaboration features on Vision Pro are the same as those on the iOS end. For more details, please refer to the Events section in the DataMesh One user manual.
Example
Example 1: Presentation of Large Equipment Content
If you need to showcase large equipment in a hybrid online-offline meeting, you can use DataMesh One on Vision Pro to present the content in mixed reality (MR), allowing participants to clearly understand the equipment’s functions and structure. Here are the specific steps:
1. Equipment preparation: Prepare a Vision Pro, a display screen, and a Mac laptop (an iPhone or iPad can also be used).
2. Network connection: Ensure the Vision Pro and the Mac are connected to the same Wi-Fi network.
3. Log into meeting software: Log into the remote meeting software on the Mac.
4. Screen sharing: a) Mirror the Vision Pro screen to the Mac: Refer to the Share your view section for specific instructions. b) Present Mac content to both online and offline participants:
Use an HDMI cable to project the Mac’s content onto the display screen so offline attendees can view the Vision Pro content.
Use the screen-sharing feature of the remote meeting software to share the Mac screen with online participants, enabling them to see the Vision Pro content.
5. DataMesh One login: Log into DataMesh One standard mode on Vision Pro.
6. Open scenario: Locate and open the scenario under the Resources tab.
7. Scenario positioning: Adjust the scenario to the appropriate position using hexagonal grid positioning or gaze mode.
8. Play scenario: Click on buttons to flip pages or interact, initiating playback of the scenario while explaining relevant information and content about the equipment.
Example 2: Complete exam tasks
1. Log in to training mode: Members of the “Cabinet operation training” training group log in to DataMesh One training mode.
2. View exam tasks: Under the My Tasks tab, find “Cabinet operation training” in the Exam tab.
3. Open courseware list: Click on “Cabinet operation training” to open the courseware list for this training group.
4. Open courseware and complete scenario positioning: Click on the “Cabinet operation training” courseware to open it and complete the scenario positioning.
5. Play and complete interactive operations: In the courseware playback interface, watch the scene content. After watching:
Click the right arrow to play the next page
Click the left arrow to return to the previous page
If the scene includes page navigation, click the corresponding buttons, models, or sub-elements as required to complete the navigation operation.
Click the More button to view the real-time score and remaining time, or to position the scenario again.
Complete interaction position operations: When the scene includes interaction position tasks, the interface switches to the interaction position operation interface. a) Select the interaction element, and a blue highlight box will appear around the element model.
b) Based on the demo effect, move or rotate the interaction element. If the operation effect is not ideal, click the Reset button to restore the model to its pre-movement state and repeat the operation until the interaction element reaches the target position (both position and angle). c) After completing all interaction position operations in the scene, click the Confirm button to finish.
6. View current score: Click the scoreboard button to view the current score.
7. Complete task: After completing the exam tasks, the interface will show that the task has ended. Click the Next button to view the results of this exam.
8. Complete exam: Click the Exit button on the score page to complete the exam.
9. Exit exam: On the exam results page, click the Exit button to leave the exam.
Core basic package and core tool package (mandatory installation):
datamesh.xr.ux@1.2.7.tgz
datamesh.localization.toolkit@1.0.0.tgz
datamesh.toolkit@7.0.1.tgz
SDK documentation:
DataMesh FactVerse Cloud API v1.0 User Guide.docx
DataMesh FactVerse Unity SDK v7.0 User Guide.docx
SDK Dependency Package (Install based on needs): The package includes the following paid plugins. Please purchase them from Unity Asset Store to obtain the legitimate copyright.
datamesh.avprovideo@1.0.0.tgz: AVPro Video (Optional video element dependency package for Director)
datamesh.vuplexwebview@1.0.0.tgz: 3D WebView (Optional web element dependency package for Director)
URP Dependency Package: For SDK v7.0 and later versions, it is recommended to use the Universal Render Pipeline (URP).
Note: Due to version constraints of the plugins, the dependency packages include specific versions that have been tested. If you choose to use other versions, it may result in compilation errors. In such cases, you will need to resolve compatibility issues with the selected versions on your own.
SDK installation
Installation and environment requirements
Unity 2022.3 LTS or newer. (We recommend using the tested Unity 2022.3 LTS version for optimal performance and stability.)
Visual Studio 2019 or newer.
Download
Enter the following short links in your browser to download the respective SDK packages:
Project configuration
Importing common plugins
1. Create a new project.
2. Ensure necessary plugins are installed in the Package Manager before importing the SDK.
3. Open the project’s Packages directory and place the .tgz library files in it.
4. Add the following content to the manifest.json file. Handle duplicate entries and make necessary code adjustments based on the other versions used by the plugins.
5. In Edit > Project Settings, configure URP and TriLib.
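For the manifest.json step above, the dependencies section might look like the following sketch (based on the core packages listed earlier; the entries must match the .tgz file names you actually placed in the Packages directory):

```json
{
  "dependencies": {
    "datamesh.xr.ux": "file:datamesh.xr.ux@1.2.7.tgz",
    "datamesh.localization.toolkit": "file:datamesh.localization.toolkit@1.0.0.tgz",
    "datamesh.toolkit": "file:datamesh.toolkit@7.0.1.tgz"
  }
}
```

Unity’s Package Manager resolves `file:` references relative to the project’s Packages directory.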
Importing samples
1. After Unity finishes compiling, select Window > Package Manager from the Unity menu bar.
2. In the opened window, select DataMesh Toolkit, and expand Samples on the right.
3. Click Import next to Common Datas and DataMeshDirector respectively.
4. Wait for the project to finish compiling.
Examples
DataMesh FactVerse Unity SDK provides the following basic examples:
Log in to FactVerse: Demonstrates how to perform login operations for FactVerse using the SDK.
Use resource library: Shows how to access and manage the FactVerse resource library using the SDK.
Download and play Director scenarios: Demonstrates downloading and playing Director scenarios using the SDK.
Download and play FactVerse scenes: Shows how to download and play FactVerse scenes using the SDK.
Log in to FactVerse
Purpose
To utilize any feature of FactVerse, it is necessary to log in first. Through this example, you will understand:
How to initialize the function modules of the SDK.
Specific login process for FactVerse cloud services.
Common interfaces of AccountManager and their usage methods.
Usage example
Steps
1. Navigate to the “01 Account – Login and logout” directory and open the LoginSample scene.
2. Select the LoginSample object and check the information in the Inspector panel.
3. Open the DCS.json configuration file referenced by the DCS Profile property and modify the server configurations. Note: Adjust the settings based on your actual usage.
Note: Pay attention to the AccountManager object (Prefab) under the DataMeshModule object, which manages the main logic for the user module.
4. Start the scene.
If already logged in, the login success window will be displayed directly;
If not logged in, proceed to the account login window.
5. In the account login window, enter the account name and password, then click the login button. The system will verify the credentials. If the validation fails, it will prompt that the username or password is incorrect; if validation succeeds, it will proceed to tenant selection.
6. If there’s only one tenant on the account, it will be chosen by default. If there are multiple tenants, a selection window will open for you to choose from.
7. After selecting the tenant, the login process will start. Upon success, the login success window will display current account information.
Clicking on Scenarios redirects to view the scenarios.
Clicking on Scenes opens the digital twin scenes.
8. Click the logout button; after confirmation, you will return to the account login window.
Note: If previously logged in but facing token verification issues, manually delete the project’s token storage file to initiate a fresh login.
The token storage directory is located in [PersistencePath]/LoginData/. You can delete this directory directly.
To find [PersistencePath], refer to the Unity documentation on Application.persistencePath.
Access the resource library
Purpose
The resource library is one of the fundamental features of the FactVerse platform, which includes various types of resources such as 3D models, images, videos, audio and Director scenarios created using the DataMesh Studio.
This example aims to guide users to:
Initialize the resource library module.
Display resources in the resource library module by directory.
Learn about the common interfaces of LibraryManager and their usage methods.
Usage example
Steps
1. Navigate to the “02 Library – List resources” directory and open the ListResourceSample scene.
2. Select the ListResourceSample object and check the information in the Inspector panel.
3. Similar to the “Log in to FactVerse” example, this example uses the same DCS.json configuration file, so there is no need to modify the configuration again.
Pay attention to the objects under DataMeshModule:
AccountManager: Account module
LibraryManager: Resource module
4. Make sure you’ve successfully logged in as outlined in the previous example “Log in to FactVerse.”
5. Start the example. After loading, the content of the current library will be displayed.
6. Click the Enter button next to a directory to enter the directory and display the resources within it.
7. Click the Enter button next to a scenario to enter scenario playback.
8. Click the Back button in the upper right corner to return to the previous directory.
Download and play Director scenarios
Purpose
This example aims to demonstrate how to download and play Director scenarios in the FactVerse resource library. It includes the following:
Methods to locate and download a resource via its path.
How to read scenario resources.
How to download additional resources required by the scenario.
How to play the scenario.
Methods to control the scenario during playback.
Usage example
Steps
1. Navigate to the “03 Director – Play a scenario” directory and open the DirectorSample scene.
2. Select the DirectorSample object and check the information in the Inspector panel.
DirectorSample:
DCS.json: Similar to the example “Access the resource library”, this one uses the same DCS.json configuration file, so no further modifications are needed.
Root object: Serves as the root object for playback, referenced in DirectorSample. The Root object does not need to have any scripts attached.
BackStage (backend): Used to store objects that are currently not displayed; referenced in DirectorSample. The BackStage script needs to be attached to this object.
ScenarioController object: Used for scenario playback control, referenced in DirectorSample. This object needs to be attached with a Director playback script. In this example, it’s attached with SimpleScenarioController for standalone scenario playback.
DataMeshModule:
AccountManager: Account module
LibraryManager: Resource module
DirectorManager: Director-related definitions
AssetManager: Resource loading module
Table: It provides the display of the floor in the scene.
TablePlane object (Prefab): The actual floor object.
MixedRealityPlayspace object (Prefab): Carries camera control functionalities for the scene. The main camera in the scene should be replaced by this object.
CameraController script: The primary script for camera control, which references the floor object in Table.
Note: Due to the lighting information contained in the Director scenario, the example has disabled the default lighting in the Unity scene.
3. Make sure you’ve successfully logged in as outlined in the previous example “Log in to FactVerse.”
4. After starting the example, a download panel will be displayed after the loading process.
5. In the input box, enter the scenario’s address in the FactVerse resource library. For example: /test/Sample.dirpkg.
6. Click download. If the path is correct, the scenario will be downloaded and parsed, and the related resource download begins. Once everything is downloaded, the scenario will start playing automatically.
Note: Lights will be created during scenario playback based on the scenario settings.
7. Operation interface will be displayed at the bottom of the screen, where you can click “Prev” or “Next” to control the scenario playback. Note: If the scenario contains camera movement, the camera will be driven by the scenario during playback.
Download and play digital twin scene
Purpose
The digital twin scene is a type of resource created using FactVerse Designer and stored in the resource library.
This example aims to guide users to:
Locate and download the digital twin scene via the resource path.
Read digital twin scene resources.
Download additional resources required for the scene.
Play the digital twin scene.
Control the scene playback process.
Preparation: Creating Digital Twin scene
1. Log in to the FactVerse platform and create a working directory “SceneSample”.
2. On the Digital Twins page, click the New button, select Template, create a template named “TemplateSample”, and then enter the template details page.
3. Add new attributes Name (1|1), IsOpen (1|2), and Temperature (1|3) to the template “TemplateSample”.
4. On the Digital Twins page, click the New button, and use the “TemplateSample” template to create the digital twin “TwinObject1”.
5. On the Digital Twins page, click the New button, create the digital twin scene “TwinScene”, and enter the digital twin scene details page.
6. In the digital twin scene details page, reference the digital twin “TwinObject1”.
Note: Additional settings can also be made using FactVerse Designer to modify the display effects of templates and digital twin scenes.
7. Open the template in FactVerse Designer and add display models to the template.
Usage example
Steps
1. Navigate to the “04 FactVerse – Play a digitaltwin scene” directory and open the FactVerseSample scene.
2. Select the FactVerseSample object and check its information in the Inspector panel.
FactVerseSample:
DCS.json configuration file: Similar to the example “Access the resource library”, this example uses the same DCS.json configuration file; no further modifications are needed.
Root object: Serves as the root object for playback, referenced in FactVerseSample. The Root object does not need to have any scripts attached.
Used for scene playback control, referenced in FactVerseSample. This object needs to have the digital twin scene playback script attached. In this example, it is attached with SimulationSceneController, used for standalone scene playback.
DataMeshModule:
AccountManager: Account module
LibraryManager: Resource module
DirectorManager: Director-related definitions
AssetManager: Resource loading module
DigitalTwinManager: DigitalTwin scene module
Table: It provides the display of the floor in the scene.
TablePlane object (Prefab): The actual floor object.
MixedRealityPlayspace object (Prefab): Carries camera control functionalities related to the scene. The main camera in the scene needs to be replaced by this object.
CameraController script: The primary script for camera control, which references the floor object in the Table.
Note: Due to the lighting information contained in the Designer scene, the example has disabled the default lighting in the Unity scene.
3. Make sure you’ve successfully logged in as outlined in the previous example “Log in to FactVerse.”
4. Start this example. After loading, the download panel will appear.
5. In the input box, enter the scene’s address in the FactVerse library. For example: /test/Sample 10.
6. Click download. If the path is correct, the scene will be downloaded and parsed, and the related resource download begins. Once everything is downloaded, the scene will start playing automatically. Note: During scene playback, lights will be created based on the scene settings.
7. An operation interface will be displayed at the bottom of the screen. At the top is the tenant ID, followed by the digital twin ID and its attribute group. The attribute group consists of several key-value pairs (e.g., “1|3”: “30°C”). When the DFS server sends attribute changes to the specified digital twin for this tenant, the digital twin’s attributes will change accordingly.
Note: DFS (Data Fusion Service) is a key tool on the FactVerse platform used for accessing and processing data from external systems. It supports data storage, cleaning, and transformation, and also allows users to customize data sets. Through DFS, digital twins can obtain and present external data in real-time, enabling a more accurate simulation and reflection of the operational status and changes of real-world equipment. For more information about DFS, please refer to the information published on the official website. If you need to use DFS services, please contact support@datamesh.com for technical support.
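To make the attribute-group format concrete, here is a minimal JavaScript sketch of merging such an attribute change into a locally cached attribute map. The attribute IDs follow the template created above (Name 1|1, IsOpen 1|2, Temperature 1|3); the message shape itself is illustrative, not the actual DFS schema:

```javascript
// A cached attribute group for one digital twin: keys are attribute
// IDs such as "1|3" (Temperature), values are the current values.
const cached = { "1|1": "TwinObject1", "1|2": "true", "1|3": "25°C" };

// Merge an incoming attribute change into the cached group,
// leaving attributes not mentioned in the update untouched.
function applyAttributeUpdate(attributes, update) {
  return { ...attributes, ...update };
}

const next = applyAttributeUpdate(cached, { "1|3": "30°C" });
// next["1|3"] is now "30°C"; the other attributes are unchanged.
```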
To select something: When the cursor appears, point your hand at what you want to select. Then, pinch your thumb and finger together to select.
Scrolling up, down, left, or right: Pinch your fingers inward. While still pinched inward, move your hand up, down, left or right to scroll. When you’re done scrolling, release.
Brings you back to your Meta Home menu: Look at your palm at eye level, then hold your thumb and index finger together until the icon fills, then release.
Trigger (on the front of the controller): Used for all click (confirm) interaction operations.
Grip button (on the side of the controller): In the scenario, used to bring up the menu bar.
Thumbsticks: Used to control scenario flipping; swipe left for the previous page, swipe right for the next page.
Meta button: Exits the app and opens the system main interface (where you can choose to continue or exit the program). For detailed instructions on the Meta Quest Touch Plus controller, please refer to Meta Quest Touch Plus controllers.
Standard mode
DataMesh One has two browsing modes: Standard Mode and Training Mode. In Standard Mode, users can freely browse various resources and scene contents while also participating in multi-user collaboration events. You can switch browsing modes on the settings page.
Navigation menu
Resources
In Standard Mode, the “Resources” page contains all the folder directories and resource files accessible to the current account. You can switch between browsing scenario-type resources or other types of files, such as 3D models, images, and PDF files, by clicking on the Scenario or Element tab in the top right corner of the page.
Scenes
Events
Training mode
In Training Mode, you can complete learning tasks and exam tasks, experiencing a more immersive learning environment through XR-based technology.
Navigation menu
Group list
Courseware list
Course record interface
Quick start
Preparation
Connect Wi-Fi
Make sure the Meta Quest 3 device is connected to the internet.
1. Press on the right Touch controller to open the universal menu.
2. Hover over the clock on the left side of the universal menu. Once the Quick Settings appear, select it to open the Quick Settings window.
3. Choose Wi-Fi.
4. Turn on the Wi-Fi switch, then select the Wi-Fi network you want to connect to and enter the password.
Install DataMesh One
Three methods:
Select the Store from the universal menu, and find and install DataMesh One in the Meta Quest store.
Download and install DataMesh One via the short link: datame.sh/OneQuest3.
Set boundaries
Before starting to set boundaries, please make sure you have worn the Meta Quest device and ensure that the surrounding environment is open and free of obstacles that may threaten your safety.
Note that during the use of DataMesh One, when you step out of the boundary, you will not be able to view 3D models, files, and other resources, so please operate within the set boundaries.
The steps to set boundaries are as follows:
1. Press on your right Touch controller to pull up your universal menu.
2. Hover over the clock on the left side of the universal menu. When Quick Settings appears, select it to open the Quick Settings panel.
3. Select Boundary.
4. Select Stationary or Roomscale, then follow the on-screen instructions to set up your boundary.
Stationary: For using your headset while sitting or standing in place. Stationary Mode creates a default boundary area of 3 feet by 3 feet (1 meter by 1 meter) centered on yourself.
Roomscale: For using your headset while moving around inside your play area. Roomscale allows you to draw your boundaries in your physical space using your Touch controller. We recommend a safe and unobstructed space measuring at least 6.5 feet by 6.5 feet (2 meters by 2 meters).
Log in
Click the account input box to bring up the virtual keyboard, then enter your account name and password to log in.
Private Deployment: When logging in to an enterprise that uses a private deployment server, you need to set a private deployment service code. Click this icon to set the code.
Server List: Click the server list to switch server regions.
Demo Mode: Click to enter DataMesh One in guest mode.
Play scenario
In Standard Mode of DataMesh One, follow these steps to play a scenario:
1. On the Resources page, select the scenario file you want to open and press the trigger button on the front of the right Touch controller, or open the scenario with gestures.
2. After the scenario is loaded, start playing the scenario.
In the scenario playback interface, you can use the scenario operation menu to position and edit the scenario. The scenario operation menu is as follows:
Position: Click to enter positioning mode.
Edit: Use gestures to drag, move, or rotate the model as a whole or in parts.
Stage List: Click to display the list of scenario playback steps; click a scene name to jump to that scene.
Exit: Click to exit scenario playback.
Back/Next: Page-turning arrows.
Take a capture
Shortcut
Press the Meta button on your right Touch controller and press the trigger button once.
Alternative method
1. Press the Meta button on your right Touch controller to suspend the app.
2. In the suspension window, tap the camera icon to take a capture.
Record video
Start recording
1. Press the Meta button on your right Touch controller to pull up your universal menu.
2. Select Camera, then select Record Video.
Note: A red dot will appear in VR to indicate that recording has started. This video capture indicator can be controlled from Settings.
Stop recording
1. Press the Meta button on your right Touch controller to pull up your universal menu.
2. Select Camera, then select Recording to stop the recording.
Collaboration
1. Click on the Events tab in the navigation menu to switch to the event list.
2. Click on the “+” button in the top right corner of the events page to enter the event creation interface.
3. Click on the Add button to open the scenario list, select a scenario for the event, and save.
4. Click on the Play button to start the event.
The collaboration features on Meta Quest are the same as those on iOS. For more details, please refer to the Events section in the DataMesh One user manual.
Cast to a screen
Cast to a phone
1. Download the Meta Quest mobile app to your phone.
2. Make sure your phone and headset are on the same Wi-Fi network.
3. Turn on Bluetooth.
4. Make sure your headset is close to your phone.
5. Open and log in to the Meta Quest app and make sure the headset and your phone are logged into the same Meta account.
6. Tap Casting from the Meta Quest app.
Cast to a computer
1. Make sure your computer and headset are on the same Wi-Fi network.
2. Press the Meta button on your right controller to open the universal menu.
3. Select Camera, then select Cast.
4. Select Computer, then select Next to connect.
5. Take off the headset.
6. On your computer, open your browser then go to oculus.com/casting and log in. Note: Make sure the headset and your computer are logged into the same Meta account.