Task #235
Updated by Sanghoon Lee 18 days ago
Summary:
-------
*Updates:*
* Completed - Create ur5e_xahr2c_config3 MoveIt Pro configuration package for UR5e + Allegro hand with Xela tactile integration - {{issue(232)}}
* Completed - Create a Web UI interface of uSkin for Allegro hand v4 curved visualization for the MoveIt Pro Env - {{issue(233)}}

*To do:*
* Add helper tools for objective analysis to the Allegro hand model web UI interface - {{issue(236)}}
* Drafting and Reviewing the ATAG Development Plan - {{issue(238)}}
* Architecture and Functional Design of ATAG Packages - {{issue(239)}}
--------------

*Updates:*
* MoveIt Pro Application development - {{issue(191)}}
> * Completed - Create ur5e_xahr2c_config3 MoveIt Pro configuration package for UR5e + Allegro hand with Xela tactile integration - {{issue(232)}}
> * Completed - Create a Web UI interface of uSkin for Allegro hand v4 curved visualization for the MoveIt Pro Env - {{issue(233)}}

> *_The first task was to create the ur5e_xahr2c_config3 MoveIt Pro configuration package, which integrates the UR5e arm and the Allegro hand with Xela tactile sensing ({{issue(232)}})._*
> *_The second task was to create a Xela-specific Allegro hand visualization web UI that maintains visual consistency. This is because the default MoveIt Pro Studio Sidecar Web UI does not provide the functionality to display the custom visualization markers offered by the standalone visualization node. This UI provides the same functionality as the standalone ROS2 visualization node via RViz2._*
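> *_For context, the following is a minimal, hypothetical sketch of how a standalone visualization node can publish taxel readings as RViz2-compatible markers for a sidecar web UI to mirror. The topic names, message layout, and TF frame are my illustrative assumptions, not the actual package code._*

<pre><code class="python">
# Hedged sketch: one sphere marker per taxel, colored by measured force.
# Topic names, message type, frame id, and taxel layout are assumptions.
import rclpy
from rclpy.node import Node
from std_msgs.msg import Float32MultiArray
from visualization_msgs.msg import Marker, MarkerArray


class TaxelMarkerNode(Node):
    """Turns raw taxel force readings into an RViz2 MarkerArray."""

    def __init__(self):
        super().__init__('taxel_marker_node')
        self.pub = self.create_publisher(MarkerArray, 'taxel_markers', 10)
        self.sub = self.create_subscription(
            Float32MultiArray, 'xela/taxel_forces', self.on_forces, 10)

    def on_forces(self, msg):
        markers = MarkerArray()
        for i, force in enumerate(msg.data):
            m = Marker()
            m.header.frame_id = 'allegro_palm'  # assumed TF frame
            m.header.stamp = self.get_clock().now().to_msg()
            m.ns, m.id = 'taxels', i
            m.type, m.action = Marker.SPHERE, Marker.ADD
            m.pose.position.x = 0.01 * i        # placeholder layout
            m.scale.x = m.scale.y = m.scale.z = 0.005
            m.color.r = max(0.0, min(force / 10.0, 1.0))  # red grows with force
            m.color.b = 1.0 - m.color.r
            m.color.a = 1.0
            markers.markers.append(m)
        self.pub.publish(markers)


def main():
    rclpy.init()
    rclpy.spin(TaxelMarkerNode())


if __name__ == '__main__':
    main()
</code></pre>

> *_In the real packages the taxel positions would of course come from the hand model geometry rather than a linear placeholder layout._*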
{{video("ur5e_xahr2c_config3_2_webUI.mp4",800, 400)}}

> *_The video shows the following:_*
> *_The first scene shows the configuration of the built MoveIt Pro user Docker image. On the left is the ur5e_xahr2c_config3 package for robot control and the MoveIt Pro UI. On the right are the migrated standalone visualization node and the xela_taxel_sidecar_ah app, which provides the Xela Allegro hand web UI. Please note that since I do not possess an actual Allegro hand and uSkin sensors, the data used for development and testing was generated using the previously developed sim_xela_server._*
> *_The following scene shows the MoveIt Pro UI on the left and the Xela Model Mode of the Xela sidecar Web UI I developed on the right. The existing RViz2 view can be seen in the bottom right corner. The Xela Web UI supports the same Grid Mode and Xela Model Mode developed for ROS2, and additionally supports Robot Model Mode, allowing a total of three modes for viewing sensor data._*
> *_The following scene visualizes the input from the uSkin sensor in Robot Model Mode while MoveIt Pro controls the robot. For reference, control of the robot hand based on uSkin sensor input has not yet been implemented; that implementation is a larger-scale task that requires more development time than the 2F model. A review of the development plan regarding this is scheduled for this week._*
> *_The next scene shows the 18 sensor modules mounted on each part of the Allegro hand. By selecting specific parts and toggling visualization on/off, you can visualize the fingers and their phalanges individually. Although all sensor inputs are still being received, this feature selectively displays data in the Marker View, allowing you to focus on specific locations depending on your task objectives. Here, to highlight the Xela sensor, other parts of the robot and sensors in unselected areas have been simplified in grayscale._*
> *_The following scene depicts a scenario where the robot is controlled by focusing the necessary sensor input on the data from the thumb nail and index fingernail when picking up an object with the thumb and index finger. As you can see, the robot's movements are observed with all sensors disabled except for the thumb nail and index fingernail._*
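> *_The selective Marker View described above can be thought of roughly as the filter below: markers for unselected hand parts are kept but desaturated, so the viewer can focus on chosen sensor locations. The part names and the namespace convention are illustrative assumptions, not the actual implementation._*

<pre><code class="python">
# Hedged sketch of selective marker display: highlight selected parts,
# gray out everything else (as in the grayscale scene above).
from visualization_msgs.msg import MarkerArray


def focus_markers(markers: MarkerArray, selected_parts: set) -> MarkerArray:
    """Keep markers whose namespace is in selected_parts in color; gray out the rest."""
    for m in markers.markers:
        if m.ns not in selected_parts:
            # Same geometry, neutral gray, slightly transparent.
            m.color.r = m.color.g = m.color.b = 0.5
            m.color.a = 0.4
    return markers


# Example: show only the thumb nail and index fingernail modules in color,
# mirroring the pick-up scenario in the video (names are hypothetical).
# filtered = focus_markers(all_markers, {'thumb_nail', 'index_fingernail'})
</code></pre>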
*To do:*
* {{issue(191)}}
> * {{issue(233)}}
>> * Add helper tools for objective analysis to the Allegro hand model web UI interface - {{issue(236)}}
> * {{issue(237)}}
>> * Drafting and Reviewing the ATAG Development Plan - {{issue(238)}}
>> * Architecture and Functional Design of ATAG Packages - {{issue(239)}}

> *_This week, I plan to develop two additional features (FollowCam and Analysis View) for sensor visualization and monitoring in the Xela Allegro hand web UI (see the rough sketch at the end of this note)._*
> *_I will also request a review meeting regarding the development plan for the Allegro Taxel Adaptive Grasping (ATAG) package to discuss decision-making matters._*
> *_Once the development plan is finalized, I plan to proceed with the package design. That's all for me today._*
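> *_As a postscript, here is a rough sketch of how the planned FollowCam view could work: look up the hand pose from TF and publish a camera pose that keeps a fixed offset so the sensors stay in view. This is purely my assumption of one possible approach; the frame names, topic, and offset are placeholders, not the planned design._*

<pre><code class="python">
# Hedged FollowCam sketch: a camera pose that tracks the hand via TF.
import rclpy
from rclpy.node import Node
from rclpy.time import Time
from geometry_msgs.msg import PoseStamped
from tf2_ros import Buffer, TransformListener


class FollowCam(Node):
    def __init__(self):
        super().__init__('follow_cam')
        self.buf = Buffer()
        self.listener = TransformListener(self.buf, self)
        self.pub = self.create_publisher(PoseStamped, 'follow_cam/pose', 10)
        self.timer = self.create_timer(0.1, self.update)  # 10 Hz

    def update(self):
        try:
            # Frame names are assumptions for illustration.
            t = self.buf.lookup_transform('world', 'allegro_palm', Time())
        except Exception:
            return  # TF not available yet
        pose = PoseStamped()
        pose.header.frame_id = 'world'
        pose.header.stamp = self.get_clock().now().to_msg()
        # Hover behind and above the hand so the sensors stay in view.
        pose.pose.position.x = t.transform.translation.x - 0.3
        pose.pose.position.y = t.transform.translation.y
        pose.pose.position.z = t.transform.translation.z + 0.3
        pose.pose.orientation = t.transform.rotation
        self.pub.publish(pose)


def main():
    rclpy.init()
    rclpy.spin(FollowCam())
</code></pre>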