Task #246

Updated by Sanghoon Lee 4 days ago

Summary: 
 ------- 
 *Updates:* 
 * Completed - Install the bitnami package for Redmine and configure Redmine to use as an external issue tracker for GitLab - {{issue(244)}} 
 * Completed - Add helper tools and objective analysis to the Allegro hand model web UI interface - {{issue(236)}} 

 *Issues:* 
 * Postponed(Review) - Drafting and Reviewing the ATAG Development Plan - {{issue(238)}} 
 * Delayed - Architecture and Functional Design of ATAG Packages - {{issue(239)}} 

 *To do:*  
 * Drafting and Reviewing the ATAG Development Plan - {{issue(238)}} 
 * Architecture and Functional Design of ATAG Packages - {{issue(239)}} 
 * Alternative Work - Create a 2F-140 package that recognizes an object (paper crane), grasps it, and moves it. - {{issue(242)}} 

 -------------- 
 *Updates:* 
 * DevOps related task management - {{issue(243)}} 
 * MoveIt Pro Application development - {{issue(191)}} 
 > * {{issue(233)}} 
 >> * Completed - Add helper tools and objective analysis to the Allegro hand model web UI interface - {{issue(236)}} 
 >> *_This task involved adding helper functions to the AllegroHand Sidecar web UI developed the previous week to complement the Taxel sensor monitoring with robot motion tracking capabilities. The FollowCam function allows monitoring the detection status of sensors attached to the Allegro Hand while following the robot's movements. This is similar to a fixed viewpoint of the robot's front camera, which provides the optimal field of view for the sensors. The Analysis function records events occurring over time as the robot performs specific tasks, enabling real-time monitoring and retrospective review of the presence and timing of Taxel detections._* 
 {{video("ur5e_xahr2c_config3_2_webUI_followcam-analysis.mp4",800, 400)}} 

 >> *_This video demonstrates the features developed last week._* 

 >> *_The first scene shows the configuration of the built MoveIt Pro user Docker image. On the left are the ur5e_xahr2c_config3 package for robot control and the MoveIt Pro UI. On the right is the Web UI of the xela_taxel_sidecar_ah package. This Web UI displays the Analysis toolbar and FollowCam mode, which focuses on the sensors. Functions can be selected using the buttons at the top, allowing you to choose and use them as needed. Please note that the data used for development and testing was generated using the previously developed sim_xela_server._* 

 >> *_The next scene visualizes the input from the uSkin sensor in Robot Model mode while MoveIt Pro controls the robot. Since the Hand uses the tips of the thumb and index finger to pick up objects, only the thumb and index fingertip sensors are enabled, while the sensor modules for other parts are disabled._* 

 >> *_The robot's control action involves moving to the forward Pick location, approaching the object, and then lifting the object with the thumb and index finger. It is a simple action of moving to the Place location, dropping off the object, and returning to Home Pose._* 

 >> * Completed - Install the bitnami package for Redmine and configure Redmine to use as an external issue tracker for GitLab - {{issue(244)}} 
 >> *_This task involved installing Redmine and configuring the GitLab integration, as requested by the DevOps TF last week. The task became unnecessary, as the direction for DevOps tools was decided at last week's XELA SW team alignment follow-up meeting._* 
 >> *_However, I would like to share a few things I learned through the Redmine installation and integration process._* 

 >> *_1. Since the Bitnami Redmine image is no longer supported, I used the Debian Redmine image provided by Docker Hub. It appears that Bitnami has been focusing on solutions optimized for cloud-native environments since being acquired by VMware._* 
 >> *_2. The GitLab and Redmine integration was configured in three ways:_* 
 >> *_a. Push event triggers are sent to Redmine via Webhook settings in GitLab, and Redmine performs mirror repo updates using the gitlab_hook plugin._* 
 >> *_b. The URL view for Redmine was set up via External Issues Integration in GitLab._* 
 >> *_c. Issues are referenced in both GitLab and Redmine through commit messages._* 

 >> *_Although we will not be using it, since the SW team's standard was finalized last week, my personal opinion is that it would be good to maintain the GitLab and Redmine settings for experimental use if possible._* 
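The three GitLab-Redmine integration points described above (push webhooks, the external issue tracker URL, and commit-message references) can be sketched as follows. This is a hypothetical illustration, not configuration from this report: the host names, project ID, and the helper-function names are placeholders, while the payload shapes follow the public GitLab REST API and the Redmine gitlab_hook plugin conventions.

```python
# Hypothetical sketch of the three GitLab <-> Redmine hookups.
# Host names and the project id are placeholders, not real values.
GITLAB_API = "https://gitlab.example.com/api/v4"   # placeholder
PROJECT_ID = 42                                    # placeholder

def webhook_registration(redmine_url: str) -> dict:
    """(a) Body for POST /projects/:id/hooks so that push events
    reach the gitlab_hook plugin endpoint on the Redmine side."""
    return {
        "url": f"{redmine_url}/gitlab_hook",
        "push_events": True,
        "enable_ssl_verification": True,
    }

def external_tracker_urls(redmine_url: str, redmine_project: str) -> dict:
    """(b) GitLab external issue tracker settings; GitLab substitutes
    ':id' when it links #123-style references to Redmine issues."""
    return {
        "project_url": f"{redmine_url}/projects/{redmine_project}",
        "issues_url": f"{redmine_url}/issues/:id",
    }

def commit_message(summary: str, issue_id: int) -> str:
    """(c) Cross-reference a Redmine issue from a commit message;
    'refs' is Redmine's default referencing keyword."""
    return f"{summary}\n\nrefs #{issue_id}"
```

Registering the hook would then be a single authenticated POST of `webhook_registration(...)` to `{GITLAB_API}/projects/{PROJECT_ID}/hooks`.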

 >> *_Note that robot hand control based on uSkin sensor input has not yet been implemented. This implementation is scheduled to proceed later as a project called ATAG._* 

 >> *_The focus of this demo is on visualization to clearly demonstrate the operation of our sensor product, rather than on robot control._* 
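The Analysis behavior described above (time-stamped Taxel detections that can be reviewed after a run) can be illustrated with a small sketch. Everything here is hypothetical, assumed for illustration only; it is not the actual xela_taxel_sidecar_ah implementation, and the `TaxelEvent` record, sensor names, and threshold are invented.

```python
# Hypothetical sketch of an Analysis-style event log: record taxel
# detections with timestamps during a run, then review them afterwards.
# NOT the actual xela_taxel_sidecar_ah code; names are illustrative.
from dataclasses import dataclass

@dataclass
class TaxelEvent:
    t: float          # seconds since the start of the run
    sensor: str       # e.g. "thumb_tip", "index_tip" (assumed names)
    force: float      # measured normal force (arbitrary units)

class AnalysisLog:
    def __init__(self, threshold: float = 0.5):
        self.threshold = threshold   # detection threshold (assumed)
        self.events: list[TaxelEvent] = []

    def record(self, t: float, sensor: str, force: float) -> None:
        """Called on every sensor sample; keeps only detections."""
        if force >= self.threshold:
            self.events.append(TaxelEvent(t, sensor, force))

    def detections(self, sensor: str) -> list[float]:
        """Retrospective review: when did this sensor detect contact?"""
        return [e.t for e in self.events if e.sensor == sensor]

log = AnalysisLog()
log.record(0.1, "thumb_tip", 0.2)   # below threshold, ignored
log.record(1.4, "thumb_tip", 0.9)   # contact during the grasp
log.record(1.5, "index_tip", 0.8)
```

The same log serves both uses mentioned in the report: live monitoring (inspect `events` as they arrive) and after-the-fact review of when each fingertip made contact.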

 *Issues:* 
 * Postponed(Review) - Drafting and Reviewing the ATAG Development Plan - {{issue(238)}} 
 * Delayed - Architecture and Functional Design of ATAG Packages - {{issue(239)}} 
 > *_The ATAG Dev plan review has been postponed until after the company's software development strategy is established. I expect it will likely proceed this week. Additionally, the ATAG design task has been rescheduled due to the review postponement._* 

 *To do:*  
 * {{issue(191)}} 
 > * {{issue(233)}} 
 >> * Create a 2F-140 package that recognizes an object (paper crane), grasps it, and moves it. - {{issue(242)}} 
 > * {{issue(237)}} 
 >> * Drafting and Reviewing the ATAG Development Plan - {{issue(238)}} 
 >> * Architecture and Functional Design of ATAG Packages - {{issue(239)}} 

 > *_This week, I plan to proceed with the Dev Plan Review and ATAG design._*  
 > *_If this Dev Plan Review is delayed, I will review the development of a BT as an alternative plan, which involves adding object recognition capabilities to the 2F Gripper Config package by applying MoveIt Pro's CLIPSeg model. Additionally, I will create and deliver a brief explanatory document regarding the demo video to be shared with Picknik. That's all from me today._* 

 > *_CLIPSeg (CLIP-based Segmentation) is an open-vocabulary model that utilizes OpenAI's CLIP model to segment objects within an image using only a text prompt._* 
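As a toy illustration of that open-vocabulary idea (not the actual CLIPSeg architecture or weights): score each image-patch embedding against the text-prompt embedding and threshold the similarity map to get a segmentation mask. The tiny hand-made vectors below are stand-ins for real CLIP features, assumed purely for illustration.

```python
# Toy sketch of CLIP-style open-vocabulary segmentation: compare patch
# embeddings against a text-prompt embedding and keep high-similarity
# patches. Hand-made vectors stand in for real CLIP/CLIPSeg features.
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# 2x2 grid of patch embeddings (toy 3-D features) and a text embedding
# standing in for the prompt "paper crane".
patches = [
    [[1.0, 0.0, 0.0], [0.9, 0.1, 0.0]],   # top row: crane-like patches
    [[0.0, 1.0, 0.0], [0.0, 0.0, 1.0]],   # bottom row: background
]
text = [1.0, 0.05, 0.0]

# Per-patch similarity map, then a thresholded binary mask, analogous
# to CLIPSeg's per-pixel logits followed by a decision threshold.
sim_map = [[cosine(p, text) for p in row] for row in patches]
mask = [[s > 0.5 for s in row] for row in sim_map]
```

The real model produces a dense per-pixel logit map from a single text prompt, but the mechanism, text-image similarity in a shared embedding space, is the same.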
