A Bot That Optically Finds and Laser Targets
TargetBot showing the servos, laser, and camera
The main node gets the location of the fiducial (checkerboard) pattern and then translates that to an X (side to side) and Y (vertical) setting for the two servos.  X is the lower servo, and mounted on its shaft is a little horizontal HD-1440 servo for Y (up/down).   It is crude and there is no closed-loop feedback, which would require finding the red laser dot in the image; that is not done yet.

The little laser is rotated off of the Y servo shaft.  So we have crude X-Y control.
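The pointing math itself is just a linear mapping from the two angles reported for the target to the two servo positions.  Below is a minimal Python sketch of that idea; the center pulse width, degrees-to-microseconds scale, and limits are made-up numbers for illustration, not the values actually used on TargetBot.

```python
#!/usr/bin/env python
# Sketch: map the target's left/right and up/down angles (degrees)
# to positions for the X (pan) and Y (tilt) servos.
# All constants below are example values, not TargetBot's real ones.

SERVO_CENTER_US = 1500   # pulse width that points the laser straight ahead
US_PER_DEGREE   = 10.0   # assumed servo scale factor
SERVO_MIN_US    = 1000
SERVO_MAX_US    = 2000

def angles_to_servo_pulses(angle_x_deg, angle_y_deg):
    """Convert target bearing angles to (x_pulse, y_pulse) in microseconds."""
    x_pulse = SERVO_CENTER_US + angle_x_deg * US_PER_DEGREE
    y_pulse = SERVO_CENTER_US + angle_y_deg * US_PER_DEGREE
    # Clamp so we never drive a servo past its mechanical limits
    x_pulse = max(SERVO_MIN_US, min(SERVO_MAX_US, x_pulse))
    y_pulse = max(SERVO_MIN_US, min(SERVO_MAX_US, y_pulse))
    return int(x_pulse), int(y_pulse)

# Example: a tag seen 8.5 degrees to the left and 7.8 degrees up
print(angles_to_servo_pulses(-8.5, 7.8))   # -> (1415, 1578)
```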

This is a Bot that does not have wheels but is smart in its own way.   TargetBot will use image recognition to find a target and then point a Laser at that target.

This project is a true showcase of a variety of great subsystems so watch for links below!

A Raspberry Pi 3 is loaded up with a variety of components and software to form a nice system that finds and targets a pattern with a laser and shows info about the target on a display.   The project is based on the ROS platform running on the Ubiquity Robotics Ubuntu 16.04 image.

The Mark-Toys USB Multi-Port dongle, configured for servo control, has a simple serial port interface, so we control the X and Y servos from this little board as an easy add-on subsystem for the Pi or, for that matter, ANY computer.
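Since the dongle just takes commands over a serial port, driving it from Python takes only a few lines.  The sketch below uses pyserial; the port name and especially the ASCII command format are assumptions for illustration only, and the real dongle protocol may differ.

```python
#!/usr/bin/env python
# Minimal sketch of driving two servos over a simple serial interface.
# The port/baud and the "X 1415\n" command format are illustrative
# assumptions, not the actual Mark-Toys dongle protocol.
import serial

port = serial.Serial('/dev/ttyUSB0', 9600, timeout=1)  # assumed port and baud

def set_servo(axis, pulse_us):
    """Send one servo position command, e.g. 'X 1415\\n'."""
    port.write(('%s %d\n' % (axis, pulse_us)).encode('ascii'))

set_servo('X', 1415)   # pan
set_servo('Y', 1578)   # tilt
```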

The Raspberry Pi, in a nice thin transparent 3D-printed case, runs ROS and thus hosts other add-ons with minimal effort.

The UbiquityRobotics.com kernel image is used, as is the RaspiCam ROS node, to talk to the Raspberry Pi compatible wide-angle (160 degree) camera.

The Aruco target image recognition ROS node finds the target and publishes a ROS topic, which our main ROS node then picks up in a standard ROS callback.
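Receiving the detections is just a normal ROS subscriber.  The sketch below assumes the fiducial_msgs/FiducialTransformArray output and default topic name of the Ubiquity aruco_detect node; the sign conventions for the angles are also only an example, so adjust for whatever detector and camera frame you use.

```python
#!/usr/bin/env python
# Sketch of the receiving side: subscribe to the detector's output and
# turn each detection into pan/tilt angles.  Topic name, message type,
# and angle sign conventions here are assumptions for illustration.
import math
import rospy
from fiducial_msgs.msg import FiducialTransformArray

def fiducial_callback(msg):
    for fid in msg.transforms:
        t = fid.transform.translation
        # Bearing angles from the camera axis (camera-dependent signs)
        angle_x = math.degrees(math.atan2(t.x, t.z))
        angle_y = math.degrees(math.atan2(-t.y, t.z))
        rospy.loginfo("Tag %d: x %.1f deg, y %.1f deg",
                      fid.fiducial_id, angle_x, angle_y)
        # ... convert to servo pulses and send over the serial dongle ...

if __name__ == '__main__':
    rospy.init_node('targetbot_tracker')
    rospy.Subscriber('/fiducial_transforms', FiducialTransformArray,
                     fiducial_callback, queue_size=1)
    rospy.spin()
```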


I have a ROS node that drives the very popular 1.3 inch diagonal OLED display from the Raspberry Pi I2C lines.   This is not yet published on my GitHub (so much to do, so little time).   When it does appear it will likely be on my GitHub for ROS stuff here

Notice the display node shows the host name on the top line, the IP address of the bot on line 3, and then app-specific info below: in this case the tag number found (100), and below that the angle to the left as -8.5 degrees and the angle up as 7.8 degrees.  Fun stuff.
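For anyone who wants to try something similar before my node is published, here is a rough sketch of painting those status lines on a 1.3 inch I2C OLED.  It assumes an SH1106 controller and the luma.oled Python library, which is just one common way to drive these displays and not necessarily what my node uses.

```python
#!/usr/bin/env python
# Sketch of drawing the status lines on a 1.3" I2C OLED.  Assumes an
# SH1106 controller on the usual Pi I2C bus/address and the luma.oled
# library; not necessarily how my ROS display node does it.
import socket
from luma.core.interface.serial import i2c
from luma.core.render import canvas
from luma.oled.device import sh1106

device = sh1106(i2c(port=1, address=0x3C))   # typical Pi I2C bus and address

def show_status(tag_id, angle_x, angle_y):
    with canvas(device) as draw:
        draw.text((0, 0),  socket.gethostname(), fill="white")
        # Note: resolving the LAN IP properly can take a bit more work
        draw.text((0, 16), socket.gethostbyname(socket.gethostname()), fill="white")
        draw.text((0, 32), "Tag %d" % tag_id, fill="white")
        draw.text((0, 48), "X %.1f  Y %.1f deg" % (angle_x, angle_y), fill="white")

show_status(100, -8.5, 7.8)
```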

On many of my projects like this one I use my sys_monitor.py code, which I start in Linux from the rc.local file.   This allows 'auto start' of the ROS nodes and allows either a shutdown or just a restart from a simple switch on the Raspberry Pi GPIO.  It also has a single LED that gives basic feedback as to what the sys_monitor code is doing, like getting ready to shut down.    Please refer to the Mark-Toys sys_monitor.py code and directions here
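To give a feel for the pattern, here is a stripped-down illustration of the button-and-LED idea using RPi.GPIO.  The pin numbers, timings, and the service name being restarted are all made-up examples; see the real sys_monitor.py for the actual behavior.

```python
#!/usr/bin/env python
# Rough illustration of the sys_monitor idea: watch a pushbutton on a
# GPIO pin and drive a status LED; a short press restarts the app, a
# long press shuts the Pi down.  Pins, timings, and the service name
# are examples, not the ones in the real sys_monitor.py.
import os
import time
import RPi.GPIO as GPIO

BUTTON_PIN = 17   # example switch pin (BCM numbering)
LED_PIN    = 27   # example status LED pin

GPIO.setmode(GPIO.BCM)
GPIO.setup(BUTTON_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)
GPIO.setup(LED_PIN, GPIO.OUT)

while True:
    if GPIO.input(BUTTON_PIN) == GPIO.LOW:        # button pressed
        pressed_at = time.time()
        while GPIO.input(BUTTON_PIN) == GPIO.LOW:
            GPIO.output(LED_PIN, True)            # LED on while held
            time.sleep(0.1)
        GPIO.output(LED_PIN, False)
        if time.time() - pressed_at > 3.0:        # long press: shut down
            os.system("shutdown -h now")
        else:                                     # short press: restart app
            os.system("systemctl restart targetbot")  # hypothetical service name
    time.sleep(0.2)
```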