
Task #235

Updated by Sanghoon Lee 18 days ago

Summary: 
 ------- 
 *Updates:* 
 * Completed - Create ur5e_xahr2c_config3 MoveIt Pro configuration package for UR5e + allegro hand with xela tactile integration - {{issue(232)}} 
 * Completed - Create a Web UI Interface of uSkin for Allegro hand v4 curved visualization for the MoveIt Pro Env - {{issue(233)}} 

 *To do:*  
 * Add helper tools for objective analysis to the Allegro hand model web UI interface - {{issue(236)}} 
 * Drafting and Reviewing the ATAG Development Plan - {{issue(238)}} 
 * Architecture and Functional Design of ATAG Packages - {{issue(239)}} 
 -------------- 
 *Updates:* 
 * MoveIt Pro Application development - {{issue(191)}} 
 > * Completed - Create ur5e_xahr2c_config3 MoveIt Pro configuration package for UR5e + allegro hand with xela tactile integration - {{issue(232)}} 
 > * Completed - Create a Web UI Interface of uSkin for Allegro hand v4 curved visualization for the MoveIt Pro Env - {{issue(233)}} 
 > *_The first task was to migrate the Allegro Hand robot's standalone ROS2 visualization node into the MoveIt Pro config package, following the same structure as the MoveIt Pro package for 2f_140 developed previously. This package controls a model combining the UR5e, Allegro hand v4, and Xela uSkin for Robot Hands sensors._* 
 > *_The second task was to create a Xela-specific Allegro hand visualization web UI that maintains visual consistency, because the default MoveIt Pro Studio UI cannot display the custom visualization markers offered by the standalone visualization node. This UI provides the same functionality that the standalone ROS2 visualization node provides via RViz2._* 
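 As a rough sketch of how such a taxel marker display might map sensor readings to colors, the function below turns a single taxel's 3-axis force reading into an RGB value. All names, the force scale, and the force-to-color mapping are my own illustrative assumptions, not the actual visualization node's API.

```python
import math

def taxel_color(fx, fy, fz, f_max=5.0):
    """Map one taxel's (fx, fy, fz) force reading to an RGB tuple.

    Normal force (fz) drives the red channel, shear magnitude drives
    green, and idle taxels stay blue. f_max (the saturation force in
    newtons) is an assumed scale, not a value from the real package.
    """
    normal = min(abs(fz) / f_max, 1.0)
    shear = min(math.hypot(fx, fy) / f_max, 1.0)
    # Unloaded taxels render blue; loaded taxels shift toward red/green.
    return (normal, shear, 1.0 - max(normal, shear))
```

 In a real node, a color like this would be assigned per taxel in a visualization_msgs/MarkerArray (for RViz2) or its web UI equivalent.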
 {{video("ur5e_xahr2c_config3_2_webUI.mp4",800, 400)}} 
 > *_This video demonstrates the features developed last week:_* 
 > *_The first scene shows the configuration of the built MoveIt Pro user Docker image. On the left is the ur5e_xahr2c_config3 package for robot control and the MoveIt Pro UI. On the right are the migrated standalone visualization node and the xela_taxel_sidecar_ah app, which provides the Xela Allegro hand web UI. Please note that since I do not possess the actual Allegro hand and uSkin sensors, the data used for development and testing was generated using the previously developed sim_xela_server._* 
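 A minimal stand-in for that kind of simulated data source might look like the following. The module count, grid size, and value ranges are assumptions for illustration; this is not the actual sim_xela_server.

```python
import random

# Assumed layout: 18 uSkin modules on the Allegro hand, each a 4x4 grid
# of taxels, each taxel reporting (fx, fy, fz). These dimensions are
# illustrative, mirroring the setup described in the update.
NUM_MODULES, ROWS, COLS = 18, 4, 4

def simulated_frame(seed=None):
    """Return one frame of simulated taxel readings for every module.

    Seeding makes frames reproducible, which is convenient when testing
    visualization code without the physical sensors.
    """
    rng = random.Random(seed)
    return [
        [[(rng.uniform(-1, 1), rng.uniform(-1, 1), rng.uniform(0, 5))
          for _ in range(COLS)] for _ in range(ROWS)]
        for _ in range(NUM_MODULES)
    ]
```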

 > *_The following scene shows the MoveIt Pro UI on the left and the Xela Model Mode of the Xela Web UI on the right. The existing RViz2 View can be seen in the bottom right corner. The Xela Web UI supports the same Grid Mode and Xela Model Mode developed for ROS2, and additionally supports Robot Model Mode, allowing for a total of three modes to view sensor data._* 

 > *_The scenario in the following scene visualizes the input from the uSkin sensor in Robot Model Mode while MoveIt Pro controls the robot._* 
 > *_For reference, control of the robot hand based on uSkin sensor input has not yet been implemented; it is a larger task that requires more development time than the 2F model. A review of the development plan for this is scheduled for this week._* 

 > *_The next scene shows 18 sensor modules mounted on each part of the Allegro hand. By selecting specific parts and toggling visualization on/off, you can visualize the fingers and their phalanges. Although all actual sensor inputs are being received, this feature selectively displays data in the Marker View, allowing you to focus on specific locations depending on your task objectives. Here, to highlight the Xela sensor, other parts of the robot and sensors in unselected areas have been simplified in grayscale._* 
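 The selective Marker View display described above could be reduced to a small filter like the one below: modules on toggled-on parts keep their force-based color, everything else renders flat gray. The part names and gray value are illustrative, not taken from the actual web UI.

```python
# Flat gray used for de-emphasized (unselected) modules; an assumed value.
GRAY = (0.5, 0.5, 0.5)

def display_colors(module_parts, selected, colorize):
    """Pick a display color per sensor module.

    module_parts: one part name per module (e.g. 'index_tip'; hypothetical).
    selected: set of part names toggled on in the UI.
    colorize: fn(part) -> RGB for a selected module's live sensor data.
    """
    return [colorize(p) if p in selected else GRAY for p in module_parts]
```

 The design point is that all sensor data keeps flowing; only the rendering is filtered, so toggling a part on immediately shows its current state.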

 > *_The following scene depicts a scenario where the necessary sensor input is focused on the thumb and index fingertip sensors while picking up an object with the thumb and index finger. As you can see, the robot's movements are observed with all sensors disabled except those on the thumb and index fingertips._* 

 *To do:*  
 * {{issue(191)}} 
 > * {{issue(233)}} 
 >> * Add helper tools for objective analysis to the Allegro hand model web UI interface - {{issue(236)}} 
 > * {{issue(237)}} 
 >> * Drafting and Reviewing the ATAG Development Plan - {{issue(238)}} 
 >> * Architecture and Functional Design of ATAG Packages - {{issue(239)}} 

 > *_This week, I plan to develop two additional features (FollowCam and Analysis View) for sensor visualization and monitoring in the Xela Allegro hand web UI._* 
 > *I will also request a review meeting regarding the development plan for the Allegro Taxel Adaptive Grasping (ATAG) package to discuss decision-making matters._* 
 > *_Once the development plan is finalized, I plan to proceed with the package design. That's all for me today._* 
