Task #235
Task Group #234: [260316] Software Weekly Meeting - 14:00
[260316] Sanghoon Lee, Weekly Report: 14:00
90%
Description
-------
Updates:
- Completed - Create ur5e_xahr2c_config3 MoveIt Pro configuration package for UR5e + allegro hand with xela tactile integration - Task #232: [Dev] Create ur5e_xahr2c_config3 MoveIt Pro configuration package for UR5e + allegro hand with xela tactile integration
- Completed - Create a Web UI Interface of uSkin for Allegro hand v4 curved visualization for the MoveIt Pro Env - Task #233: [Dev] Create a Web UI Interface of uSkin for Allegro hand v4 curved visualization for the MoveIt Pro Env
- Adds helper tools for objective analysis to the Allegro hand model web UI interface - Task #236: [Dev] Adds helper tools for objective analysis to the Allegro hand model web UI interface
- Drafting and Reviewing the ATAG Development Plan - Task #238: [Dev-ATAG plan] Drafting and Reviewing the ATAG Development Plan
- Architecture and Functional Design of ATAG Packages - Task #239: [Dev-ATAG] Architecture and Functional Design of ATAG Packages
--------------
Updates:
- MoveIt Pro Application development - Task Group #191: [Dev] MoveIt Pro Application development
- Completed - Create ur5e_xahr2c_config3 MoveIt Pro configuration package for UR5e + allegro hand with xela tactile integration - Task #232: [Dev] Create ur5e_xahr2c_config3 MoveIt Pro configuration package for UR5e + allegro hand with xela tactile integration
- Completed - Create a Web UI Interface of uSkin for Allegro hand v4 curved visualization for the MoveIt Pro Env - Task #233: [Dev] Create a Web UI Interface of uSkin for Allegro hand v4 curved visualization for the MoveIt Pro Env
The first task was to migrate the ROS2 visualization node for the Allegro Hand robot into a MoveIt Pro config package, mirroring the MoveIt Pro package previously developed for the 2f_140 gripper. This package controls a model combining the UR5e, Allegro Hand v4, and Xela uSkin for Robot Hands sensors.
The second task was to create a Xela-specific Allegro hand visualization web UI that maintains visual consistency with the rest of the environment. This was needed because the default MoveIt Pro Studio UI cannot display the custom visualization markers offered by the standalone visualization node. The web UI provides the same functionality that the standalone ROS2 visualization node provides via RViz2.
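The report does not include the visualization node's source, but the kind of logic it shares between the RViz2 marker view and the web UI can be sketched. Below is a minimal, hypothetical helper (all names, units, and the `max_force` scale are assumptions, not values from the real driver) that maps one uSkin taxel's 3-axis force sample to display attributes:

```python
import math

def taxel_display(fx: float, fy: float, fz: float,
                  max_force: float = 10.0) -> dict:
    """Map one taxel's 3-axis force sample to display attributes.

    Hypothetical sketch: the normal component (fz) drives an
    intensity in [0, 1]; the shear components (fx, fy) give the
    in-plane direction and magnitude of an arrow-style marker.
    """
    intensity = min(abs(fz) / max_force, 1.0)
    shear_angle = math.atan2(fy, fx)      # arrow direction, radians
    shear_mag = math.hypot(fx, fy)
    return {
        "intensity": intensity,           # e.g. marker color/scale
        "shear_angle": shear_angle,
        "shear_mag": min(shear_mag / max_force, 1.0),
    }
```

Keeping this mapping in one pure function is one way both renderers (RViz2 markers and the web canvas) could stay visually consistent.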
The video demonstrates the features developed last week:
The first scene shows the configuration of the built MoveIt Pro user Docker image. On the left is the ur5e_xahr2c_config3 package for robot control and the MoveIt Pro UI. On the right are the migrated standalone visualization node and the xela_taxel_sidecar_ah app, which provides the Xela Allegro hand web UI. Please note that since I do not possess the actual Allegro hand and uSkin sensors, the data used for development and testing was generated using the previously developed sim_xela_server.
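Since the real hand and sensors were unavailable, sim_xela_server supplied the data. Its code is not shown in the report; as a rough stand-in illustration only, here is a hypothetical generator of synthetic 3-axis taxel frames (the 18-module count follows the layout described later in the report; 16 taxels per module and the signal shape are assumptions):

```python
import math
import random

def sim_taxel_frame(t: float, modules: int = 18,
                    taxels: int = 16) -> list:
    """Generate one synthetic frame of 3-axis taxel readings.

    Hypothetical stand-in for the hardware stream: a slow sine wave
    on the normal axis plus Gaussian noise on all three axes.
    """
    frame = []
    for m in range(modules):
        mod = []
        for k in range(taxels):
            # Phase offset per module/taxel so contact "travels".
            base = 5.0 * (1 + math.sin(t + m * 0.3 + k * 0.1)) / 2
            mod.append((random.gauss(0, 0.1),
                        random.gauss(0, 0.1),
                        base + random.gauss(0, 0.1)))
        frame.append(mod)
    return frame
```

A driver like this lets the UI and config package be exercised end to end before the hardware arrives.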
The following scene shows the MoveIt Pro UI on the left and the Xela Model Mode of the Xela Web UI on the right. The existing RViz2 View can be seen in the bottom right corner. The Xela Web UI supports the same Grid Mode and Xela Model Mode developed for ROS2, and additionally supports Robot Model Mode, allowing for a total of three modes to view sensor data.
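The three view modes above (Grid, Xela Model, Robot Model) can be represented as a small state machine in the UI. A minimal sketch, with hypothetical names, of holding and cycling the mode:

```python
from enum import Enum

class ViewMode(Enum):
    GRID = "grid"              # flat per-module taxel grids
    XELA_MODEL = "xela"        # taxels on the curved sensor model
    ROBOT_MODEL = "robot"      # taxels placed on the full hand model

def next_mode(mode: ViewMode) -> ViewMode:
    """Cycle Grid -> Xela Model -> Robot Model -> Grid."""
    order = [ViewMode.GRID, ViewMode.XELA_MODEL, ViewMode.ROBOT_MODEL]
    return order[(order.index(mode) + 1) % len(order)]
```

An explicit enum keeps mode-specific rendering branches easy to audit as further modes (e.g. the planned Analysis View) are added.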
The following scene visualizes the uSkin sensor input in Robot Model Mode while MoveIt Pro controls the robot.
For reference, control of the robot hand based on uSkin sensor input has not yet been implemented; it is a larger task than the 2F model and requires more development time. A review of the development plan for this is scheduled for this week.
The next scene shows the 18 sensor modules mounted on the parts of the Allegro hand. By selecting specific parts and toggling their visualization on/off, you can inspect individual fingers and their phalanges. All sensor inputs are still received; this feature only filters what the Marker View displays, letting you focus on specific locations depending on the task objective. Here, to highlight the Xela sensors, the rest of the robot and the sensors in unselected areas are simplified to grayscale.
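The selective display described above can be sketched as a simple per-module style map. This is a hypothetical illustration (style names and the function are mine; only the 18-module count comes from the report): enabled modules render in full color, everything else falls back to grayscale so the selected sensors stand out.

```python
def display_styles(enabled: set, modules: int = 18) -> dict:
    """Per-module display style for the Marker View.

    All modules keep streaming data; this only decides how each one
    is drawn. Style names ("color"/"grayscale") are illustrative.
    """
    return {m: ("color" if m in enabled else "grayscale")
            for m in range(modules)}
```

For example, `display_styles({0, 4})` would highlight modules 0 and 4 and gray out the remaining 16.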
The following scene depicts a scenario in which the robot is controlled while the sensor input of interest is focused on the thumb nail and index fingernail, as when picking up an object with the thumb and index finger. As shown, the robot moves with all sensors disabled except those on the thumb nail and index fingernail.

To do:
- Task Group #191: [Dev] MoveIt Pro Application development
- Task #233: [Dev] Create a Web UI Interface of uSkin for Allegro hand v4 curved visualization for the MoveIt Pro Env
- Adds helper tools for objective analysis to the Allegro hand model web UI interface - Task #236: [Dev] Adds helper tools for objective analysis to the Allegro hand model web UI interface
- Task Group #237: [Dev-ATAG] Scalable Tactile Grasping System for Allegro Hand (ATAG)
This week, I plan to develop two additional features (FollowCam and Analysis View) for sensor visualization and monitoring in the Xela Allegro hand web UI.
I will also request a review meeting regarding the development plan for the Allegro Taxel Adaptive Grasping (ATAG) package to discuss decision-making matters.
Once the development plan is finalized, I plan to proceed with the package design. That's all for me today.
Files
Updated by Sanghoon Lee 20 days ago
- Subject changed from [260309] Sanghoon Lee, Weekly Report: 14:00 to [260316] Sanghoon Lee, Weekly Report: 14:00
- Due date changed from 03/09/2026 to 03/16/2026
- Status changed from New to In Progress
- Start date changed from 03/03/2026 to 03/10/2026
- % Done changed from 50 to 0
Updated by Sanghoon Lee 18 days ago
- File ur5e_xahr2c_config3_2_webUI.mp4 added
- Description updated (diff)
- Status changed from In Progress to Resolved
- % Done changed from 0 to 90