Greptile Summary

This PR introduces a complete visual servoing system using eye-in-hand ArUco marker tracking.

Confidence Score: 4/5 — safe to merge for software/simulation paths; the real-hardware path in calibrate_hand_eye.py has unresolved P1 API concerns from prior review rounds. Existing P1 findings about XArmAdapter method names and mm/m unit handling in calibrate_hand_eye.py are still unresolved. New findings in this pass are P2 only (default marker_size mismatch). The score is capped at 4/5 by the outstanding P1.

dimos/manipulation/dynamic_tracking/calibrate_hand_eye.py — the XArmAdapter API surface (connect/disconnect/read_cartesian_position) and unit consistency need verification before real-hardware use.

Important Files Changed
Flowchart

```mermaid
%%{init: {'theme': 'neutral'}}%%
flowchart TD
    RS[RealSenseCamera\ncolor_image + camera_info] --> AT[ArucoTracker]
    AT -- solvePnP per marker --> PE[Marker Pose Estimation]
    PE -- avg position + quaternion --> TF[TF Publish\ncamera_optical_frame → aruco_avg]
    TF --> TFL[TF Lookup\nbase_link → aruco_avg]
    TFL -- PoseStamped\nframe_id=task_name --> CC[ControlCoordinator\nCartesianIKTask]
    CC -- Pinocchio IK --> JC[Joint Commands]
    JC --> HW{Hardware}
    HW --> MOCK[Mock Arm\naruco-servo-mock]
    HW --> XARM[XArm6\naruco-servo-xarm6]
    AT -- annotated_image --> VIZ[Rerun Viewer]
```
Reviews (4) — last reviewed commit: "Fix mypy errors"
Files Created
dimos/manipulation/dynamic_tracking/__init__.py — Package init.
dimos/manipulation/dynamic_tracking/aruco_tracker.py — ArUco marker tracker module, completely rewritten for main's architecture:
- Takes `In[Image]` (color) and `In[CameraInfo]` (intrinsics) from RealSense
- Publishes the `camera_optical_frame → aruco_avg` transform to the TF tree
- Emits `PoseStamped` on `Out[cartesian_command]`
- `PoseStamped.frame_id` is set to the `CartesianIKTask` name, so the coordinator routes it correctly

dimos/manipulation/dynamic_tracking/calibrate_hand_eye.py — Eye-in-hand calibration tool:
- Uses `cv2.calibrateHandEye()` (5 methods available)
- `load_calibration()` and `calibration_to_transform()` helpers for blueprint use

dimos/manipulation/dynamic_tracking/blueprints.py — Three visual servoing blueprints:
- `aruco_tracker_realsense` — `dimos run aruco-tracker`
- `aruco_servo_mock` — `dimos run aruco-servo-mock`
- `aruco_servo_xarm6` — `dimos run aruco-servo-xarm6`

Each wires:
`RealSenseCamera` → `ArucoTracker` → `PoseStamped` → `ControlCoordinator` (with `CartesianIKTask` using Pinocchio IK via `LfsPath("xarm_description/urdf/xarm6/xarm6.urdf")`)

Files Modified
dimos/robot/all_blueprints.py — Added 3 new blueprint entries:
`aruco-servo-mock`, `aruco-servo-xarm6`, `aruco-tracker`

TF Tree (Full Chain)
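The full chain implied by the flowchart composes three transforms: base_link → gripper (forward kinematics), gripper → camera_optical_frame (the hand-eye calibration result), and camera_optical_frame → aruco_avg (from solvePnP). A sketch of that composition with 4×4 homogeneous matrices; frame and function names are illustrative:

```python
import numpy as np

def compose(*transforms):
    """Left-to-right composition of 4x4 homogeneous transforms."""
    out = np.eye(4)
    for T in transforms:
        out = out @ T
    return out

def base_to_marker(T_base_gripper, T_gripper_camera, T_camera_marker):
    """base_link -> aruco_avg: the pose the TF lookup recovers for the IK task."""
    return compose(T_base_gripper, T_gripper_camera, T_camera_marker)
```

In the running system this composition is done implicitly by the TF lookup of `base_link → aruco_avg`.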
Architecture Decision
Rather than rebasing the 25-commit ruthwik_dynamic_tracking branch (which was 216 commits behind main and referenced a completely different ControlOrchestrator API), I ported the core concepts onto a fresh branch from main. Main has evolved significantly — ControlOrchestrator → ControlCoordinator, plus a new CartesianIKTask with built-in Pinocchio IK. The new implementation is ~50% less code because it leverages CartesianIKTask rather than doing IK internally.

One-line Test Command
Just verify the code loads without errors in your dimos environment:

```bash
python3 -c "from dimos.manipulation.dynamic_tracking.aruco_tracker import ArucoTracker; print('OK')"
python3 -c "from dimos.manipulation.dynamic_tracking.blueprints import aruco_servo_mock; print('OK')"
```

Gradual Testing

```bash
dimos run aruco-tracker
```
This runs detection only — it verifies that ArUco markers are detected and TF transforms are published. You'll need ArUco markers (DICT_4X4_50, 15 mm) visible to the camera. Check the Rerun viewer for annotated images and TF frames.
```bash
dimos run aruco-servo-mock
```

Real camera + mock arm. Verifies the full pipeline: detection → TF lookup → PoseStamped → CartesianIKTask. The mock arm won't move physically, but you can watch joint-state changes in Rerun or on the /coordinator/joint_state LCM topic.
```bash
dimos run aruco-servo-xarm6
```

Real camera + real arm. Per the review summary above, verify the unresolved XArmAdapter API and mm/m unit-handling findings in calibrate_hand_eye.py before running this on hardware.