Task #241
Task Group #240: [260323] Software Weekly Meeting - 14:00
[260323] Sanghoon Lee, Weekly Report: 14:00
% Done: 90%
Description
-------
Updates:
- Completed - Task #236: [Dev] Adds helper tools for objective analysis to the Allegro hand model web UI interface
- Postponed (Review) - Task #238: [Dev-ATAG plan] Drafting and Reviewing the ATAG Development Plan
- Delayed - Task #239: [Dev-ATAG] Architecture and Functional Design of ATAG Packages
- Alternative Work - Task #242: [Dev] Create a 2F-140 package that recognizes, grasps, and moves an object (paper crane)
Updates:
- Task Group #191: [Dev] MoveIt Pro Application development
- Task #233: [Dev] Create a Web UI Interface of uSkin for Allegro hand v4 curved visualization for the MoveIt Pro Env
- Completed - Task #236: [Dev] Adds helper tools for objective analysis to the Allegro hand model web UI interface
This task involved adding helper functions to the AllegroHand Sidecar web UI developed the previous week, complementing its Taxel sensor monitoring and robot motion tracking capabilities. The FollowCam function follows the robot's movements while monitoring the detection status of the sensors attached to the Allegro Hand, behaving like a camera fixed in front of the robot that keeps the sensors in an optimal field of view. The Analysis function records events over time as the robot performs a task, enabling both real-time monitoring and retrospective review of the presence and timing of Taxel detections.
This video demonstrates the features developed last week.
The first scene shows the configuration of the built MoveIt Pro user Docker image. On the left are the ur5e_xahr2c_config3 package for robot control and the MoveIt Pro UI. On the right is the Web UI provided by the xela_taxel_sidecar_ah package. This Web UI displays the Analysis toolbar and FollowCam mode, which focuses on the sensors. Functions can be selected using the buttons at the top, allowing you to choose and use them as needed. Please note that the data used for development and testing was generated using the previously developed sim_xela_server.
The next scene visualizes the input from the uSkin sensor in Robot Model mode while MoveIt Pro controls the robot. Since the Hand uses the tips of the thumb and index finger to pick up objects, only the View for the thumb and index fingertip sensors is enabled, while sensor modules for other parts are disabled.
The robot's control sequence is simple: move to the Pick location in front, approach the object, lift it with the thumb and index finger, move to the Place location, release the object, and return to the Home pose.
Note that robot hand control based on uSkin sensor input has not yet been implemented. This implementation is scheduled to proceed later as a project called ATAG.
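The pick-and-place routine described above could be sketched as a plain action sequence. This is only a hypothetical illustration: the actual demo is driven by MoveIt Pro, the waypoint names and the `move_to`/`grasp`/`release` robot API are invented, and (as noted) sensor-driven hand control is not yet implemented:

```python
# Hypothetical sketch of the demo's pick-and-place sequence; the real
# motion is defined in MoveIt Pro, not in this Python code.
PICK_SEQUENCE = [
    ("move", "pick_approach"),   # move to the Pick location in front
    ("move", "pick"),            # approach the object
    ("grasp", "thumb_index"),    # lift with thumb and index fingertips
    ("move", "place"),           # carry to the Place location
    ("release", "thumb_index"),  # drop off the object
    ("move", "home"),            # return to the Home pose
]

def run_sequence(robot, sequence=PICK_SEQUENCE):
    """Dispatch each step to the corresponding robot method.
    `robot` is any object exposing move_to/grasp/release (assumed API)."""
    for action, target in sequence:
        if action == "move":
            robot.move_to(target)
        elif action == "grasp":
            robot.grasp(target)
        elif action == "release":
            robot.release(target)
```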
Issues: The focus of this demo is on visualization, to clearly demonstrate the operation of our sensor product, rather than on robot control.
- Postponed (Review) - Task #238: [Dev-ATAG plan] Drafting and Reviewing the ATAG Development Plan
- Delayed - Task #239: [Dev-ATAG] Architecture and Functional Design of ATAG Packages
To do: The ATAG Dev Plan review has been postponed until the company's software development strategy is established; I expect it to proceed this week. The ATAG design task has been rescheduled accordingly, due to the review postponement.
- Task Group #191: [Dev] MoveIt Pro Application development
- Task #233: [Dev] Create a Web UI Interface of uSkin for Allegro hand v4 curved visualization for the MoveIt Pro Env
- Task #242: [Dev] Create a 2F-140 package that recognizes, grasps, and moves an object (paper crane)
- Task Group #237: [Dev-ATAG] Scalable Tactile Grasping System for Allegro Hand (ATAG)
This week, I plan to proceed with the Dev Plan Review and ATAG design.
If the Dev Plan Review is delayed, as an alternative I will work on a Behavior Tree (BT) that adds object recognition capabilities to the 2F Gripper Config package by applying the MoveIt Pro CLIPSeg model. That's all for me today.
CLIPSeg (CLIP-based Segmentation) is an open-vocabulary model that uses OpenAI's CLIP model to segment objects within an image from only a text prompt.
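To make the open-vocabulary idea concrete, here is a minimal, stdlib-only sketch of the post-processing step such a model implies: converting a per-pixel logit map (the kind of output a CLIPSeg-style model produces for one prompt) into a binary segmentation mask via sigmoid thresholding. The model inference itself is not shown, and the logit values below are made up for illustration:

```python
import math

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def logits_to_mask(logits, threshold=0.5):
    """Turn a 2D grid of per-pixel logits into a binary mask:
    1 where the model is confident the prompt matches, else 0."""
    return [[1 if sigmoid(v) >= threshold else 0 for v in row]
            for row in logits]

# Made-up 3x3 logit map for a prompt like "paper crane";
# positive values mean "likely part of the object".
logits = [
    [-4.0, -2.0, -4.0],
    [-1.0,  3.0,  2.0],
    [-3.0,  1.5, -2.0],
]
mask = logits_to_mask(logits)
```

The resulting mask could then seed a grasp-target region for the gripper pipeline; the real integration would of course run the actual model inside MoveIt Pro.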
Files
Updated by Sanghoon Lee 11 days ago
- Subject changed from [260316] Sanghoon Lee, Weekly Report: 14:00 to [260323] Sanghoon Lee, Weekly Report: 14:00
- Due date changed from 03/16/2026 to 03/23/2026
- Start date changed from 03/10/2026 to 03/17/2026
- % Done changed from 90 to 0
Updated by Sanghoon Lee 11 days ago
- File ur5e_xahr2c_config3_2_webUI_followcam-analysis.mp4 added
- Description updated (diff)
- Status changed from New to Resolved
- % Done changed from 0 to 90