Task #241
Updated by Sanghoon Lee 12 days ago
Summary:
-------

*Updates:*

* Completed - Create ur5e_xahr2c_config3 MoveIt Pro configuration package for UR5e + allegro hand with xela tactile integration - {{issue(232)}}
* Completed - Create a Web UI Interface of uSkin for Allegro hand v4 curved visualization for the MoveIt Pro Env - {{issue(233)}}

*Issues:*

* Postponed(Review) - Drafting and Reviewing the ATAG Development Plan - {{issue(238)}}
* Delayed - Architecture and Functional Design of ATAG Packages - {{issue(239)}}

*To do:*

* Adds helper tools for objective analysis to the Allegro hand model web UI interface - {{issue(236)}}
* Drafting and Reviewing the ATAG Development Plan - {{issue(238)}}
* Architecture and Functional Design of ATAG Packages - {{issue(239)}}
* Alternative Work - Create a package that recognizes an object (a paper crane), grasps it, and moves it.

--------------

*Updates:*

* MoveIt Pro Application development - {{issue(191)}}
> * Completed - Create ur5e_xahr2c_config3 MoveIt Pro configuration package for UR5e + allegro hand with xela tactile integration - {{issue(232)}}
> * Completed - Create a Web UI Interface of uSkin for Allegro hand v4 curved visualization for the MoveIt Pro Env - {{issue(233)}}
> * Completed - Adds helper tools for objective analysis to the Allegro hand model web UI interface - {{issue(236)}}

> *_The first task was to migrate the standalone ROS2 visualization node of the Allegro Hand robot into a MoveIt Pro Config package, structured identically to the MoveIt Pro package for the 2f_140 developed previously. This package controls a model combining the ur5e, Allegro hand v4, and Xela uSkin for Robot Hands sensors._*
> *_The second task was to create a Xela-specific Allegro hand visualization web UI that maintains visual consistency, because the default MoveIt Pro Studio UI does not provide the functionality to display the custom visualization markers offered by the standalone visualization node. This web UI provides the same functionality that the standalone ROS2 visualization node offers via RViz2._*

{{video("ur5e_xahr2c_config3_2_webUI.mp4",800, 400)}}

> *_The video shows the following:_*
> *_The first scene shows the configuration of the built MoveIt Pro user Docker image. On the left is the ur5e_xahr2c_config3 package for robot control and the MoveIt Pro UI. On the right are the Web UI provided by the migrated standalone visualization node and the xela_taxel_sidecar_ah app, which provides the Xela Allegro hand web UI. Please note that since I do not possess the actual Allegro hand and uSkin sensors, the data used for development and testing was generated using the previously developed sim_xela_server._*
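*_For reference, the sketch below shows one way a sim_xela_server-style fake data source could be structured as a small ROS2 node. The topic name, the Float32MultiArray message type, and the 18-module x 16-taxel layout are assumptions made for this illustration only, not the actual interface of that package._*

```python
# Minimal sketch of a simulated uSkin taxel publisher (assumed names and layout).
import math

import rclpy
from rclpy.node import Node
from std_msgs.msg import Float32MultiArray


class SimTaxelPublisher(Node):
    """Publishes fake taxel pressures so the web UI can be exercised
    without the physical Allegro hand or Xela uSkin sensors."""

    def __init__(self):
        super().__init__('sim_taxel_publisher')
        self.num_modules = 18        # assumed: one module per hand part
        self.taxels_per_module = 16  # assumed taxel count per module
        self.pub = self.create_publisher(Float32MultiArray, 'xela/taxel_data', 10)
        self.t = 0.0
        self.create_timer(0.05, self.tick)  # 20 Hz

    def tick(self):
        self.t += 0.05
        msg = Float32MultiArray()
        # Simple per-module sinusoidal "contact" pattern, just to drive the UI.
        msg.data = [
            max(0.0, math.sin(self.t + module * 0.3)) * 1.5
            for module in range(self.num_modules)
            for _ in range(self.taxels_per_module)
        ]
        self.pub.publish(msg)


def main():
    rclpy.init()
    node = SimTaxelPublisher()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```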
> *_The following scene shows the MoveIt Pro UI on the left and the Xela Web UI on the right. The existing RViz2 view can be seen in the bottom right corner. The Xela Web UI supports the same Grid Mode and Xela Model Mode developed for ROS2, and additionally supports Robot Model Mode, allowing a total of three modes for viewing sensor data._*
> *_The scenario in the next scene visualizes the input from the uSkin sensor in Robot Model Mode while MoveIt Pro controls the robot._*
> *_For reference, robot hand control based on uSkin sensor input has not yet been implemented; this implementation is scheduled to proceed later as a project called ATAG. A review of the ATAG development plan is scheduled for this week._*
> *_The next scene shows the 18 sensor modules mounted on each part of the Allegro hand. By selecting specific parts and toggling their visualization on and off, you can visualize individual fingers and their phalanges. Although all actual sensor inputs are still being received, this feature selectively displays data in the Marker View, allowing you to focus on specific locations depending on your task objectives. Here, to highlight the Xela sensors, the rest of the robot and the sensors in unselected areas have been simplified in grayscale._*
> *_The following scene depicts a scenario where the robot is controlled while the sensor input is focused on the data from the thumb nail and index fingernail as an object is picked up with the thumb and index finger. As you can see, the robot's movements are observed with all sensors disabled except for the thumb nail and index fingernail._*

> *_The third task added the two helper features planned for sensor visualization and monitoring in the Xela Allegro hand web UI, FollowCam and Analysis View, to the AllegroHand Sidecar web UI described above, complementing its Taxel sensor monitoring and robot motion tracking capabilities. The FollowCam function allows monitoring the detection status of the sensors attached to the Allegro Hand while following the robot's movements. This is similar to the fixed viewpoint of the robot's front camera, which provides the optimal field of view for the sensors. The Analysis function records events occurring over time as the robot performs specific tasks, enabling real-time monitoring and retrospective review of the presence and timing of Taxel detections._*
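*_As a rough illustration of the FollowCam idea on the ROS2 side, the sketch below republishes the hand pose as a camera target that a viewer could track. The frame names and topic are assumptions for this example; the actual FollowCam behavior is implemented in the web UI itself._*

```python
# Minimal FollowCam-style sketch: publish the hand pose as a camera target
# (frame names and topic are assumed for illustration).
import rclpy
from rclpy.node import Node
from rclpy.time import Time
from geometry_msgs.msg import PoseStamped
from tf2_ros import TransformException
from tf2_ros.buffer import Buffer
from tf2_ros.transform_listener import TransformListener


class FollowCamTarget(Node):
    """Looks up the Allegro hand frame and republishes it so a camera view
    can keep the hand (and its taxel markers) centered while the arm moves."""

    def __init__(self):
        super().__init__('followcam_target')
        self.tf_buffer = Buffer()
        self.tf_listener = TransformListener(self.tf_buffer, self)
        self.pub = self.create_publisher(PoseStamped, 'followcam/target', 10)
        self.create_timer(0.1, self.tick)  # 10 Hz

    def tick(self):
        try:
            # Assumption: 'world' and 'allegro_hand_base' are the relevant frames.
            tf = self.tf_buffer.lookup_transform('world', 'allegro_hand_base', Time())
        except TransformException:
            return  # TF not available yet
        target = PoseStamped()
        target.header = tf.header
        target.pose.position.x = tf.transform.translation.x
        target.pose.position.y = tf.transform.translation.y
        target.pose.position.z = tf.transform.translation.z
        target.pose.orientation = tf.transform.rotation
        self.pub.publish(target)


def main():
    rclpy.init()
    rclpy.spin(FollowCamTarget())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```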
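*_Similarly, the core of the Analysis helper can be pictured as a subscriber that turns taxel readings into timestamped contact events for later review. The topic, message type, and threshold below are placeholders rather than the real implementation._*

```python
# Minimal Analysis-style sketch: record when each taxel starts and stops
# detecting contact (topic, message type, and threshold are assumed).
import rclpy
from rclpy.node import Node
from std_msgs.msg import Float32MultiArray


class TaxelEventLogger(Node):
    """Keeps a timestamped list of taxel-detection events so a task run can be
    reviewed after the fact (which taxels fired, and when)."""

    CONTACT_THRESHOLD = 0.5  # assumed pressure threshold

    def __init__(self):
        super().__init__('taxel_event_logger')
        self.events = []     # (stamp_sec, taxel_index, 'start' | 'end')
        self.active = set()  # taxel indices currently above the threshold
        self.create_subscription(Float32MultiArray, 'xela/taxel_data',
                                 self.on_data, 10)

    def on_data(self, msg):
        now = self.get_clock().now().nanoseconds * 1e-9
        for i, value in enumerate(msg.data):
            if value >= self.CONTACT_THRESHOLD and i not in self.active:
                self.active.add(i)
                self.events.append((now, i, 'start'))
                self.get_logger().info(f'taxel {i} contact start ({value:.2f})')
            elif value < self.CONTACT_THRESHOLD and i in self.active:
                self.active.discard(i)
                self.events.append((now, i, 'end'))
                self.get_logger().info(f'taxel {i} contact end')


def main():
    rclpy.init()
    rclpy.spin(TaxelEventLogger())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```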
{{video("ur5e_xahr2c_config3_2_webUI_followcam-analysis.mp4",800, 400)}}

> *_This video demonstrates these features:_*
> *_The first scene shows the configuration. On the left are the ur5e_xahr2c_config3 package for robot control and the MoveIt Pro UI. On the right is the Web UI provided by the xela_taxel_sidecar_ah package, which displays the Analysis toolbar and the FollowCam mode that focuses on the sensors. Functions can be selected using the buttons at the top, allowing you to choose and use them as needed._*
> *_The next scene visualizes the input from the uSkin sensor in Robot Model Mode while MoveIt Pro controls the robot. Since the hand uses the tips of the thumb and index finger to pick up objects, only the thumb and index fingertip sensors are enabled, while the sensor modules for the other parts are disabled._*
> *_The robot's control action involves moving to the Pick location, approaching the object, and lifting it with the thumb and index finger, followed by the simple action of moving to the Place location, dropping off the object, and returning to the Home Pose._*
> *_Note that robot hand control based on uSkin sensor input has not yet been implemented. It is a task that is larger in scale and requires a bit more development time compared to the 2F model. The focus of this demo is to clearly demonstrate the operation of our product, rather than robot control._*

*Issues:*

* Postponed(Review) - Drafting and Reviewing the ATAG Development Plan - {{issue(238)}}
* Delayed - Architecture and Functional Design of ATAG Packages - {{issue(239)}}
> *_The ATAG Dev Plan review has been postponed until after the company's software development strategy is established. I expect it will likely proceed this week. Additionally, the ATAG design task has been rescheduled due to the review postponement._*

*To do:*

* {{issue(191)}}
> * {{issue(233)}}
>> * Adds helper tools for objective analysis to the Allegro hand model web UI interface - {{issue(236)}}
> * {{issue(237)}}
>> * Drafting and Reviewing the ATAG Development Plan - {{issue(238)}}
>> * Architecture and Functional Design of ATAG Packages - {{issue(239)}}

> *_This week, I plan to proceed with the Dev Plan Review and the ATAG design._*
> *_I will also request a review meeting regarding the development plan for the Allegro Taxel Adaptive Grasping (ATAG) package to discuss decision-making matters._*
> *_If the Dev Plan Review is delayed, I will instead proceed with an alternative plan: developing a BT (Behavior Tree) that adds object recognition capabilities to the 2F Gripper Config package by applying the MoveIt Pro CLIPSeg model._*
> *_Once the development plan is finalized, I plan to proceed with the package design. That's all for me today._*
> *_CLIPSeg (CLIP-based Segmentation) is an open-vocabulary model that utilizes OpenAI's CLIP model to segment objects within an image using only a text prompt._*
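*_For reference, the sketch below shows how CLIPSeg can be queried with a text prompt using the publicly available Hugging Face weights. The image path, prompt, and threshold are placeholders; this is a standalone illustration, not the MoveIt Pro-side integration._*

```python
# Minimal text-prompted segmentation sketch with CLIPSeg
# (Hugging Face weights "CIDAS/clipseg-rd64-refined"; placeholder inputs).
import torch
import torch.nn.functional as F
from PIL import Image
from transformers import CLIPSegProcessor, CLIPSegForImageSegmentation

processor = CLIPSegProcessor.from_pretrained("CIDAS/clipseg-rd64-refined")
model = CLIPSegForImageSegmentation.from_pretrained("CIDAS/clipseg-rd64-refined")

image = Image.open("workspace_camera.png")  # assumed RGB camera frame
prompts = ["a paper crane"]                 # open-vocabulary text prompt

inputs = processor(text=prompts, images=[image] * len(prompts),
                   return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(**inputs).logits         # low-resolution heat map per prompt

# Turn the heat map into a probability mask and resize it to the input image,
# so the result can be used to locate the grasp target in pixel coordinates.
mask = torch.sigmoid(logits).reshape(len(prompts), 1, *logits.shape[-2:])
mask = F.interpolate(mask, size=image.size[::-1], mode="bilinear",
                     align_corners=False)
binary = (mask[0, 0] > 0.4).numpy()         # assumed decision threshold
print("pixels matching the prompt:", int(binary.sum()))
```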