<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Photons, Electrons, and Dirt &#187; Robotics</title>
	<atom:link href="https://bikerglen.com/blog/category/robotics/feed/" rel="self" type="application/rss+xml" />
	<link>https://bikerglen.com/blog</link>
	<description>A blog by Glen Akins</description>
	<lastBuildDate>Mon, 16 Feb 2026 00:47:00 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=4.2.38</generator>
	<item>
		<title>Tracking People with the Googly Eyes and OpenCV</title>
		<link>https://bikerglen.com/blog/tracking-people-with-the-googly-eyes-and-opencv/</link>
		<comments>https://bikerglen.com/blog/tracking-people-with-the-googly-eyes-and-opencv/#comments</comments>
		<pubDate>Wed, 13 May 2015 12:38:07 +0000</pubDate>
		<dc:creator><![CDATA[Glen]]></dc:creator>
				<category><![CDATA[Arduino]]></category>
		<category><![CDATA[Machine Vision]]></category>
		<category><![CDATA[Robotics]]></category>

		<guid isPermaLink="false">http://bikerglen.com/blog/?p=438</guid>
		<description><![CDATA[In part 1 of this series of posts, we built a giant set of robotic googly eyes. In part 2, we brought the googly eyes to life using an Arduino. In this post, we’ll use OpenCV to make the googly &#8230; <a href="https://bikerglen.com/blog/tracking-people-with-the-googly-eyes-and-opencv/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p><iframe width="640" height="360" src="https://www.youtube.com/embed/ez7bBb92yZM?feature=oembed" frameborder="0" allowfullscreen></iframe></p>
<p>In part 1 of this series of posts, we built a <a href="http://bikerglen.com/blog/build-a-pair-of-robotic-googly-eyes/">giant set of robotic googly eyes</a>. In part 2, we brought the googly eyes to life <a href="http://bikerglen.com/blog/bringing-the-robotic-googly-eyes-to-life/">using an Arduino</a>. In this post, we’ll use OpenCV to make the googly eyes detect and follow people as they move around the room. More specifically, we’ll use OpenCV to detect faces on a webcam and move the googly eyes to look roughly in the direction of the largest face in view.</p>
<p><span id="more-438"></span><strong>Project Overview</strong></p>
<p>To detect faces, we’ll use a Python script and the OpenCV open source computer vision library. The Python script and OpenCV will run on a small computer running Ubuntu Linux. The Python script will grab an image from a webcam, feed the image into one of the OpenCV face detection algorithms, and, if any faces are detected, feed a scaled version of the x coordinate of the largest detected face to the Arduino over a serial port. Finally, the Arduino will run code to move the eyeballs back and forth based on the value received from the Python script.</p>
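<p>The &#8220;scaled version of the x coordinate&#8221; step can be captured in a few lines. Here&#8217;s a minimal sketch with a hypothetical helper name (the actual script is linked later in the post); the &#177;800 range matches the quarter-turn limits used elsewhere in this series:</p>

```python
def scale_x_to_position(x_center, frame_width, full_scale=800):
    """Map the x coordinate of a detected face (0..frame_width)
    to a motor position in the range -full_scale..+full_scale."""
    fraction = x_center / float(frame_width)   # 0.0 .. 1.0 across the frame
    return int(round(fraction * 2 * full_scale - full_scale))

# For a 1280-pixel-wide webcam frame:
#   x = 0    -> -800 (one edge of the frame)
#   x = 640  ->    0 (center)
#   x = 1280 -> +800 (other edge)
```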
<p><strong>Required Hardware</strong></p>
<div id="attachment_524" style="width: 650px" class="wp-caption alignnone"><a href="http://bikerglen.com/wp/wp-content/uploads/2015/05/googly-eyes-opencv-hardware.jpg"><img src="http://bikerglen.com/wp/wp-content/uploads/2015/05/googly-eyes-opencv-hardware-1024x737.jpg" alt="The hardware used for this project." width="640" height="461" class="size-large wp-image-524" /></a><p class="wp-caption-text">The hardware used for this project.</p></div>
<p>OpenCV and face detection require some raw compute power. I’ve successfully used both a March 2011 Core i7 MacBook Pro 13&#8243; running MacOS X and a May 2015 Intel Core i7 NUC running Ubuntu 14.10 to run OpenCV and the face detection algorithms for this project. I tried using a Raspberry Pi 2 with the Raspberry Pi camera as well but the latency and frame rate were just too slow to create a convincing tracking / following effect.</p>
<p>I’d recommend the following hardware:</p>
<ul>
<li>A computer with an Intel Core i5 or Core i7 laptop or desktop chip made within the last three or four years. I used an Intel Core i7 NUC with 16GB DDR and a 120GB M.2 SATA SSD.</li>
<li>An HD webcam with a resolution of 1280&#215;720 that is compatible with Linux such as the Logitech C270 that I used.</li>
<li>One free USB port for the webcam and another for the Arduino.</li>
<li>The <a href="http://bikerglen.com/blog/build-a-pair-of-robotic-googly-eyes/">googly eyes</a>, <a href="http://bikerglen.com/blog/bringing-the-robotic-googly-eyes-to-life/">Arduino, and stepper motor drivers</a> described in two earlier posts.</li>
</ul>
<p><strong>Required Software</strong></p>
<div id="attachment_525" style="width: 650px" class="wp-caption alignnone"><a href="http://bikerglen.com/wp/wp-content/uploads/2015/05/opencv-on-macbook-pro.jpg"><img src="http://bikerglen.com/wp/wp-content/uploads/2015/05/opencv-on-macbook-pro-1024x700.jpg" alt="An earlier version of the OpenCV face detection script running on a MacBook Pro." width="640" height="438" class="size-large wp-image-525" /></a><p class="wp-caption-text">An earlier version of the OpenCV face detection script running on a MacBook Pro.</p></div>
<p>The software stack required to run this project is complex and somewhat difficult to install. But once it is installed, the basic Python script that connects OpenCV to the googly eyes is only a few hundred lines of code—and most of that is from the sample code included with OpenCV. All the software required for this project is open source and available for download for free. </p>
<p>Here’s the required software:</p>
<ul>
<li>Ubuntu 14.10</li>
<li>NumPy and SciPy</li>
<li>OpenCV 2.4.9</li>
<li>PySerial</li>
<li>screen (Linux utility used as a terminal emulator)</li>
</ul>
<p><strong>Installing Linux</strong></p>
<p>Installing Linux never goes as easily as I think it should go—especially if trying to install an older, stable version of the OS on new hardware. I ran into issues with the Intel graphics drivers for my hardware not being a part of the Ubuntu 14.10 distribution. Here’s the general procedure to install the OS:</p>
<ul>
<li>Download the 64-bit PC (AMD64) desktop version of Ubuntu 14.10 from <a href="http://releases.ubuntu.com/14.10/">http://releases.ubuntu.com/14.10/</a>.</li>
<li>Follow the instructions <a href="http://computers.tutsplus.com/tutorials/how-to-create-a-bootable-ubuntu-usb-drive-for-pc-on-a-mac--cms-21187">here</a> to create a bootable USB thumbdrive from the downloaded .iso file. These instructions are for creating the USB thumbdrive from a Mac. Other sites have similar instructions for working from a PC or another Linux box.</li>
<li>Boot from the USB thumbdrive and follow the prompts to install the OS.</li>
</ul>
<p>Unfortunately, my graphics driver issue prevented me from installing the OS on the first try. To install the OS successfully, I had to boot the installer with the nomodeset option added to the boot options. This allowed the installer GUI to run. Once the OS was installed, I had to add the nomodeset option to the OS boot options as well in order to run the Intel graphics driver installer GUI. Once the drivers were installed, I had to remove the nomodeset option from the OS boot options to get the full performance of the Intel graphics on the NUC.</p>
<div id="attachment_527" style="width: 650px" class="wp-caption alignnone"><a href="http://bikerglen.com/wp/wp-content/uploads/2015/05/boot-nomodeset.jpg"><img src="http://bikerglen.com/wp/wp-content/uploads/2015/05/boot-nomodeset-1024x682.jpg" alt="nomodeset added to the boot options during Linux installation." width="640" height="426" class="size-large wp-image-527" /></a><p class="wp-caption-text">nomodeset added to the boot options during Linux installation.</p></div>
<p>To boot the installer with the nomodeset option, I followed <a href="http://askubuntu.com/questions/591992/intel-nuc-nuc5i5ryh-boot-only-mouse-pointer-appears-xubuntu-14-04-14-10">these instructions</a>. Basically, boot the installer from the USB stick, then edit the boot options before starting the installer so they include nomodeset in addition to the quiet and splash options.</p>
<div id="attachment_529" style="width: 494px" class="wp-caption alignnone"><a href="http://bikerglen.com/wp/wp-content/uploads/2015/05/grub-nomodeset.jpg"><img src="http://bikerglen.com/wp/wp-content/uploads/2015/05/grub-nomodeset.jpg" alt="Adding nomodeset to /etc/default/grub." width="484" height="344" class="size-full wp-image-529" /></a><p class="wp-caption-text">Adding nomodeset to /etc/default/grub.</p></div>
<p>Once the OS was installed and booted, I edited /etc/default/grub to have the nomodeset option as well then ran sudo update-grub to propagate the changes into /boot/grub/grub.cfg. More details on this are available <a href="http://askubuntu.com/questions/38780/how-do-i-set-nomodeset-after-ive-already-installed-ubuntu">here</a>. The solution BELOW the checked answer (not THE checked answer) is fairly detailed.</p>
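<p>For reference, after the edit the relevant line in /etc/default/grub looks something like this (the exact set of pre-existing options on your system may differ):</p>

```
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash nomodeset"
```

<p>Remember that the change only takes effect after running sudo update-grub and rebooting.</p>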
<p>At this point, I updated the kernel to version 3.18.3 using <a href="https://www.linux.com/community/blogs/133-general-linux/803937-installupgrade-linux-kernel-to-3183-stable-in-ubuntulinux-mintpeppermint">these instructions</a> and installed the <a href="https://download.01.org/gfx/ubuntu/14.10/main/pool/main/i/intel-linux-graphics-installer/intel-linux-graphics-installer_1.0.8-0intel1_amd64.deb">Intel graphics driver</a>. This required invoking dpkg to install the installer then running the installer itself.</p>
<p>Finally, I edited /etc/default/grub to remove the nomodeset option and rebuilt /boot/grub/grub.cfg with sudo update-grub. At this point my Linux box was running Ubuntu 14.10 with the 3.18.3 kernel and the 1.0.8 Intel graphics drivers. All was working well.</p>
<p>Now it was time to update the OS. Luckily that’s easy to do. Run these two commands:</p>
<ul>
<li>sudo apt-get update</li>
<li>sudo apt-get upgrade</li>
</ul>
<p><strong>Installing the Webcam</strong></p>
<div id="attachment_533" style="width: 610px" class="wp-caption alignnone"><a href="http://bikerglen.com/wp/wp-content/uploads/2015/05/testing-the-webcam.jpg"><img src="http://bikerglen.com/wp/wp-content/uploads/2015/05/testing-the-webcam.jpg" alt="Testing the webcam using the cheese utility." width="600" height="621" class="size-full wp-image-533" /></a><p class="wp-caption-text">Testing the webcam using the cheese utility.</p></div>
<p>I used a Logitech C270 webcam. It was cheap. It’s HD 720p. It works with Linux. I used the cheese application to test the webcam. It can be downloaded and installed with this command.</p>
<ul>
<li>sudo apt-get install cheese</li>
</ul>
<p>Once installed, type cheese at a command prompt to launch it, and a small window with an image from the webcam should appear on the desktop.</p>
<p><strong>Installing NumPy and SciPy</strong></p>
<p>The OpenCV face detection sample program requires NumPy. SciPy is useful too. Might as well install both. Instructions to install both are on <a href="http://www.scipy.org/install.html">this page</a>. Follow the instructions for Ubuntu. It’s one big long command: </p>
<ul>
<li>sudo apt-get install python-numpy python-scipy python-matplotlib ipython ipython-notebook python-pandas python-sympy python-nose</li>
</ul>
<p><strong>Installing OpenCV</strong></p>
<p>The next step is to download, compile, and install OpenCV version 2.4.9. The process is relatively straightforward except that FFMPEG is no longer part of the Ubuntu 14.10 distribution. Rather than try to download FFMPEG from a different source, I just compiled OpenCV without FFMPEG support.</p>
<p>I followed <a href="http://www.samontab.com/web/2014/06/installing-opencv-2-4-9-in-ubuntu-14-04-lts/">these instructions</a> to download, compile, and install OpenCV except that I used libtiff5 instead of libtiff4 and I ran cmake with the additional option -D WITH_FFMPEG=OFF to disable FFMPEG support. This option should come before the final &#8216;..&#8217; in the cmake command line. </p>
<p>To pick up libtiff5 instead of libtiff4, I used this command to install the opencv dependencies:</p>
<ul>
<li>sudo apt-get install build-essential libgtk2.0-dev libjpeg-dev libtiff5-dev libjasper-dev libopenexr-dev cmake python-dev python-numpy python-tk libtbb-dev libeigen3-dev yasm libfaac-dev libopencore-amrnb-dev libopencore-amrwb-dev libtheora-dev libvorbis-dev libxvidcore-dev libx264-dev libqt4-dev libqt4-opengl-dev sphinx-common texlive-latex-extra libv4l-dev libdc1394-22-dev libavcodec-dev libavformat-dev libswscale-dev default-jdk ant libvtk5-qt4-dev</li>
</ul>
<p>My arguments to cmake with FFMPEG support turned off were:</p>
<ul>
<li>cmake -D WITH_TBB=ON -D BUILD_NEW_PYTHON_SUPPORT=ON -D WITH_V4L=ON -D INSTALL_C_EXAMPLES=ON -D INSTALL_PYTHON_EXAMPLES=ON -D BUILD_EXAMPLES=ON -D WITH_QT=ON -D WITH_OPENGL=ON -D WITH_VTK=ON -D WITH_FFMPEG=OFF ..</li>
</ul>
<p><strong>Accessing the Serial Port</strong></p>
<p>PySerial is used to access the serial port from Python. PySerial was installed either with the rest of the OS or with one of the other packages. In any event, it was already present on my system after following all the above steps.</p>
<p>I use the Linux utility screen as a terminal emulator. It’s useful for communicating with the Arduino outside of the Arduino development environment too. Screen is not installed by default. The following command will install it:</p>
<ul>
<li>sudo apt-get install screen</li>
</ul>
<p>Finally, any user that wishes to access the serial port needs to be placed into the dialout group. This can be done using the command below where $USER is the username. Once the command is executed, the user needs to log off then log back into the system for it to take effect.</p>
<ul>
<li>sudo usermod -a -G dialout $USER</li>
</ul>
<p><strong>Woohoo! Software Installed!</strong></p>
<p>That concludes the installation of the OS and all the software packages required to run the Python face detection script and communicate with the Arduino. The next steps are to install a sketch on the Arduino to move the motors in response to commands on the serial port and to run the Python face detection script.</p>
<p><strong>The Arduino Sketch</strong></p>
<p>The Arduino sketch listens on the Arduino’s serial port for a position for each of the eyes and then moves the motors to that position. While the motors are running, it listens for the next position and will change the destination on the fly if the previous move has not completed.</p>
<p>Download the Arduino sketch from my github repository <a href="https://raw.githubusercontent.com/bikerglen/googly-eyes/master/software/arduino/googly_eyes_arduino_listen.ino">here</a>. Install the sketch on the Arduino using the Arduino IDE. If needed, <a href="http://bikerglen.com/blog/bringing-the-robotic-googly-eyes-to-life/">post two</a> of this series contains links to download the IDE and some tutorials.</p>
<p>Connect the Arduino to the Linux box and execute the following command:</p>
<ul>
<li>screen /dev/ttyUSB0 19200</li>
</ul>
<p>where /dev/ttyUSB0 is the Arduino’s serial port on the Linux box. Adafruit has a good <a href="http://www.ladyada.net/learn/arduino/lesson0-lin.html">tutorial</a> on finding the name of the serial port in case it is not /dev/ttyUSB0 on your machine. 19200 is the baud rate. It should match the baud rate set in the Arduino sketch.</p>
<p>If screen starts then immediately exits, your user account needs to be added to the dialout group using the usermod command a few paragraphs above.</p>
<div id="attachment_536" style="width: 494px" class="wp-caption alignnone"><a href="http://bikerglen.com/wp/wp-content/uploads/2015/05/arduino-screen-test.png"><img src="http://bikerglen.com/wp/wp-content/uploads/2015/05/arduino-screen-test.png" alt="Using screen to test connectivity to the Arduino and googly eyes." width="484" height="344" class="size-full wp-image-536" /></a><p class="wp-caption-text">Using screen to test connectivity to the Arduino and googly eyes.</p></div>
<p>Hit return a few times and you should get a prompt that says “EYES>.”  Connect the motor power supply and execute the following commands:</p>
<ul>
<li>mt -800 -800</li>
<li>mt 800 800</li>
<li>mt 0 0</li>
</ul>
<p>The motors should move a 1/4 turn CW, a half turn CCW, and finally a 1/4 turn CW back to their initial position. Type CTRL-A then k to exit screen.</p>
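<p>Where do 800 steps and a quarter turn come from? Assuming a standard 200-step (1.8&#176;) stepper motor and the Big Easy Driver&#8217;s default 1/16 microstepping, one revolution is 3200 microsteps:</p>

```python
# Assumptions: 200 full steps per revolution, 1/16 microstepping.
full_steps_per_rev = 200
microstep_factor = 16
microsteps_per_rev = full_steps_per_rev * microstep_factor   # 3200

quarter_turn = microsteps_per_rev // 4
print(quarter_turn)   # 800 -- matching the mt -800 and mt 800 commands
```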
<p><strong>The Python Face Detection Script</strong></p>
<p>In opencv-2.4.9/samples/python2, there’s a Python face detection script called facedetect.py. My script is based on this sample code. To run the sample as is, cd to opencv-2.4.9/samples/python2 then run:</p>
<ul>
<li>python facedetect.py</li>
</ul>
<p>A window should launch showing live video from the webcam. Green rectangles should appear around any faces in the video. Blue rectangles should appear around nested eyeballs. Hit escape to exit.</p>
<div id="attachment_543" style="width: 650px" class="wp-caption alignnone"><a href="http://bikerglen.com/wp/wp-content/uploads/2015/05/out_22.jpg"><img src="http://bikerglen.com/wp/wp-content/uploads/2015/05/out_22.jpg" alt="Output of the default facedetect.py script. Detected faces are highlighted in green. Detected nested eyeballs are highlighted in blue." width="640" height="480" class="size-full wp-image-543" /></a><p class="wp-caption-text">Output of the default facedetect.py script. Detected faces are highlighted in green. Detected nested eyeballs are highlighted in blue.</p></div>
<p>I made the following changes to the script to make it work with the Arduino and the googly eyes:</p>
<ol>
<li>Adjust video size — Capture the input video at the full resolution of the webcam.</li>
<li>Delete nested object detection — We don’t need to detect eyeballs.</li>
<li>Mirror the video — It’s a bit easier to walk around and watch what is happening with a mirror image vs a normal image so I used fliplr to flip the webcam image before displaying it.</li>
<li>Find the biggest face — What happens when multiple people are in the frame? I modified the code to calculate the areas of all the faces found and use the largest face. All the faces are highlighted in green and the face used for the position of the eyes is highlighted in red.</li>
<li>Send motor positions out the serial port — The final modification is to take the x value of the center of the red rectangle, scale it to the range -800 to +800, and send the scaled position out the serial port.</li>
</ol>
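<p>The &#8220;find the biggest face&#8221; change (item 4) reduces to comparing rectangle areas. Here&#8217;s a simplified, pure-Python version of that selection, assuming OpenCV-style (x1, y1, x2, y2) rectangles:</p>

```python
def biggest_face(rects):
    """Return the (x1, y1, x2, y2) rectangle with the largest area,
    or None if no faces were detected."""
    if not rects:
        return None
    return max(rects, key=lambda r: (r[2] - r[0]) * (r[3] - r[1]))

# Two detected faces; the second covers a larger area:
faces = [(10, 10, 50, 50), (100, 100, 300, 300)]
# biggest_face(faces) -> (100, 100, 300, 300)
```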
<p>Create a new directory, cd to it, and download my script to it from my github repository <a href="https://raw.githubusercontent.com/bikerglen/googly-eyes/master/software/opencv/fd2.py">here</a>. The script is dependent on several files that are in the python2 samples directory. Copy the following files so they are in the same directory as the file fd2.py:</p>
<ul>
<li>opencv-2.4.9/samples/python2/common.py</li>
<li>opencv-2.4.9/samples/python2/video.py</li>
<li>opencv-2.4.9/data/haarcascades/haarcascade_frontalface_alt.xml</li>
</ul>
<p>Connect the Arduino, adjust the pupils so they are looking straight down, and power up the motor drivers. Now launch the Python script using the following command. Be sure to replace /dev/ttyUSB0 with the actual name of the serial port connected to the Arduino.</p>
<ul>
<li>python fd2.py --serial /dev/ttyUSB0</li>
</ul>
<div id="attachment_546" style="width: 650px" class="wp-caption alignnone"><a href="http://bikerglen.com/wp/wp-content/uploads/2015/05/out_100.jpg"><img src="http://bikerglen.com/wp/wp-content/uploads/2015/05/out_100-1024x576.jpg" alt="Output of the modified face detection python script. The largest face is used to position the eyeballs and is highlighted in red with any smaller faces highlighted in green. The time to process the image and the position value sent to the Arduino are displayed in the upper left hand corner." width="640" height="360" class="size-large wp-image-546" /></a><p class="wp-caption-text">Output of the modified face detection python script. The largest face is used to position the eyeballs and is highlighted in red with any smaller faces highlighted in green. The time to process the image and the position value sent to the Arduino are displayed in the upper left hand corner.</p></div>
<p>With any luck, the webcam video is displayed on the screen, the largest face is highlighted with a red rectangle, any extra faces are highlighted with green rectangles, and the googly eyes move back and forth tracking the face highlighted in the red rectangle.</p>
<p><strong>The Results</strong></p>
<p>In case you missed the YouTube video at the top of the post, here it is again:</p>
<p><iframe width="640" height="360" src="https://www.youtube.com/embed/ez7bBb92yZM?feature=oembed" frameborder="0" allowfullscreen></iframe></p>
<p>The video shows me walking back and forth across the room while the eyes follow me. The monitor in the video shows a mirrored image from the webcam with my face highlighted.</p>
]]></content:encoded>
			<wfw:commentRss>https://bikerglen.com/blog/tracking-people-with-the-googly-eyes-and-opencv/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Bringing the Robotic Googly Eyes to Life</title>
		<link>https://bikerglen.com/blog/bringing-the-robotic-googly-eyes-to-life/</link>
		<comments>https://bikerglen.com/blog/bringing-the-robotic-googly-eyes-to-life/#comments</comments>
		<pubDate>Mon, 11 May 2015 00:35:53 +0000</pubDate>
		<dc:creator><![CDATA[Glen]]></dc:creator>
				<category><![CDATA[Arduino]]></category>
		<category><![CDATA[Robotics]]></category>

		<guid isPermaLink="false">http://bikerglen.com/blog/?p=433</guid>
		<description><![CDATA[In part 1 of this series of posts, we built a giant set of robotic googly eyes. Now it&#8217;s time to animate them! To keep the project simple, we’ll use an Arduino Uno and a pair of Big Easy Driver &#8230; <a href="https://bikerglen.com/blog/bringing-the-robotic-googly-eyes-to-life/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<div id="attachment_483" style="width: 650px" class="wp-caption alignnone"><a href="http://bikerglen.com/wp/wp-content/uploads/2015/05/animated-googly-eyes-15.gif"><img class="size-full wp-image-483" src="http://bikerglen.com/wp/wp-content/uploads/2015/05/animated-googly-eyes-15.gif" alt="In this post, we’ll add motion to the googly eyes using a microcontroller and some stepper motor drivers." width="640" height="321" /></a><p class="wp-caption-text">In this post, we’ll add motion to the googly eyes using a microcontroller and some stepper motor drivers.</p></div>
<p>In <a href="http://bikerglen.com/blog/build-a-pair-of-robotic-googly-eyes/">part 1</a> of this series of posts, we built <a href="http://bikerglen.com/blog/build-a-pair-of-robotic-googly-eyes/">a giant set of robotic googly eyes</a>. Now it&#8217;s time to animate them! To keep the project simple, we’ll use an Arduino Uno and a pair of Big Easy Driver stepper motor drivers. Let’s get started.</p>
<p><span id="more-433"></span><strong>Required Parts</strong></p>
<p>To animate the eyes, you’ll need the following parts:</p>
<ul>
<li>1 <a href="https://www.sparkfun.com/products/11021">Arduino Uno</a>, $20</li>
<li>2 <a href="https://www.sparkfun.com/products/12859">Big Easy Drivers</a>, $20 each</li>
<li>10 <a href="https://www.sparkfun.com/products/8084">3.5mm Pitch, 2-Pin Screw Terminals, $0.95 each</a></li>
<li>1 <a href="http://www.mouser.com/ProductDetail/Cincon/TRG36A24-11E03-Level-V/?qs=%2fha2pyFadugwUZObrL1oDu2cWa8Z9QFYH0e%252bBjaUZbIo3SusTud6LQ%3d%3d">DC Power Supply for the Motors, 12-24V, 1A</a>, $20</li>
</ul>
<p>Total cost for these parts is about $90 excluding shipping and handling.</p>
<p><strong>Prepare the Motor Drivers</strong></p>
<p>The first step is to prepare the stepper motor drivers.</p>
<div id="attachment_441" style="width: 650px" class="wp-caption alignnone"><a href="http://bikerglen.com/wp/wp-content/uploads/2015/05/powerconnections.jpg"><img class="size-large wp-image-441" src="http://bikerglen.com/wp/wp-content/uploads/2015/05/powerconnections-1024x705.jpg" alt="Big easy driver power connections, +3.3/5V selection pads (blue), Vref test point (green), and current setting pot (yellow)." width="640" height="441" /></a><p class="wp-caption-text">Big easy driver power connections, +3.3/5V selection pads (blue), Vref test point (green), and current setting pot (yellow).</p></div>
<p>Refer to the photo above as you complete the following steps:</p>
<ol>
<li>Connect the M+ screw terminals on both drivers to the (+) lead of the 24V motor power supply and the GND screw terminals immediately next to the M+ terminals to the (-) lead of the 24V power supply as shown in the above photo. Do <strong><em>NOT</em></strong> connect the 24V motor power supply to the Arduino! Doing so will likely damage the Arduino.</li>
<li>Verify the +3.3/5V selection pads are not shorted or soldered together. These are highlighted above in light blue. If the pads are bridged together, use a soldering iron and some solder wick to clear the solder between them.</li>
<li>Calculate the reference voltage for the motor drivers. The reference voltage sets the current through the motor windings. Our motors have a rating of 0.5A. The reference voltage to set the current to 0.5A is computed as follows:<br />
<blockquote><p>Vref = I*(8*Rref) where I = 0.5A and Rref = 0.11Ω.<br />
Vref = 0.5*8.0*0.11 = 0.44V</p></blockquote>
<p>If you used different motors than I specified in <a href="http://bikerglen.com/blog/build-a-pair-of-robotic-googly-eyes/">part 1</a>, be sure to use the current rating of your motors rather than 0.5A.</p></li>
<li>Next use a voltmeter and a small screwdriver to set the calculated reference voltage.
<ul>
<li>Connect the ground lead of the voltmeter to a convenient ground point in the circuit.</li>
<li>Place the positive lead of the voltmeter on the test point highlighted above in green.</li>
<li>Plug in and turn on the motor power supply.</li>
<li>Slowly adjust the current setting knob highlighted above in yellow until the voltage at the test point is just under the calculated reference voltage (0.44 volts).</li>
<li>Repeat this procedure for the 2nd stepper motor driver.</li>
</ul>
</li>
<li>Turn off and unplug the motor power supply before moving to the next section.</li>
</ol>
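<p>The reference voltage arithmetic in step 3 is easy to double-check (substitute your own motor&#8217;s current rating for 0.5A if it differs):</p>

```python
def big_easy_vref(current_amps, r_ref_ohms=0.11):
    """Big Easy Driver reference voltage: Vref = I * (8 * Rref)."""
    return current_amps * 8 * r_ref_ohms

print(big_easy_vref(0.5))    # 0.44 V for 0.5 A motors
print(big_easy_vref(0.25))   # 0.22 V at half current, for cooler running
```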
<p>My motors ran pretty hot (140°F) using the specified maximum current and calculated reference voltage. To make the motors run cooler, I adjusted the reference voltage down to half the calculated value and used 0.22 volts instead of 0.44 volts. The motors still had plenty of torque and ran significantly cooler (100°F).</p>
<p><strong>Connect the Stepper Drivers to the Arduino and Motors</strong></p>
<p>The next step is to connect the Arduino to the stepper drivers and the stepper drivers to the stepper motors. Below is a diagram of the connections between the stepper drivers, the Arduino, and the stepper motors. Below that are two checklists listing the signal names and their connections. Use the diagram and the checklists to connect the Arduino to the stepper drivers and then the stepper drivers to the stepper motors.</p>
<div id="attachment_442" style="width: 650px" class="wp-caption alignnone"><a href="http://bikerglen.com/wp/wp-content/uploads/2015/05/connections.jpg"><img class="size-large wp-image-442" src="http://bikerglen.com/wp/wp-content/uploads/2015/05/connections-967x1024.jpg" alt="Arduino and stepper motor connections." width="640" height="678" /></a><p class="wp-caption-text">Arduino and Stepper Motor Connections</p></div>
<p><strong>Connections Between the Arduino and the Stepper Drivers</strong></p>
<table>
<tbody>
<tr>
<th>Signal</th>
<th>Color</th>
<th>Arduino Pin</th>
<th>Driver Pin</th>
</tr>
<tr>
<td>X / Left Eye Ground</td>
<td>Black</td>
<td>GND</td>
<td>Driver 1 &#8211; GND</td>
</tr>
<tr>
<td>X / Left Eye Step</td>
<td>Blue</td>
<td>Digital 2</td>
<td>Driver 1 &#8211; ST</td>
</tr>
<tr>
<td>X / Left Eye Direction</td>
<td>Green</td>
<td>Digital 3</td>
<td>Driver 1 &#8211; DR</td>
</tr>
<tr>
<td>Y / Right Eye Ground</td>
<td>Black</td>
<td>GND</td>
<td>Driver 2 &#8211; GND</td>
</tr>
<tr>
<td>Y / Right Eye Step</td>
<td>Blue</td>
<td>Digital 4</td>
<td>Driver 2 &#8211; ST</td>
</tr>
<tr>
<td>Y / Right Eye Direction</td>
<td>Green</td>
<td>Digital 5</td>
<td>Driver 2 &#8211; DR</td>
</tr>
</tbody>
</table>
<p><strong>Connections Between the Stepper Drivers and the Stepper Motors</strong></p>
<table>
<tbody>
<tr>
<th>Signal</th>
<th>Motor Wire Color</th>
<th>Driver Pin</th>
</tr>
<tr>
<td>X / Left Motor A+</td>
<td>Black</td>
<td>Driver 1 &#8211; A+</td>
</tr>
<tr>
<td>X / Left Motor A-</td>
<td>Green</td>
<td>Driver 1 &#8211; A-</td>
</tr>
<tr>
<td>X / Left Motor B+</td>
<td>Red</td>
<td>Driver 1 &#8211; B+</td>
</tr>
<tr>
<td>X / Left Motor B-</td>
<td>Blue</td>
<td>Driver 1 &#8211; B-</td>
</tr>
<tr>
<td>Y / Right Motor A+</td>
<td>Black</td>
<td>Driver 2 &#8211; A+</td>
</tr>
<tr>
<td>Y / Right Motor A-</td>
<td>Green</td>
<td>Driver 2 &#8211; A-</td>
</tr>
<tr>
<td>Y / Right Motor B+</td>
<td>Red</td>
<td>Driver 2 &#8211; B+</td>
</tr>
<tr>
<td>Y / Right Motor B-</td>
<td>Blue</td>
<td>Driver 2 &#8211; B-</td>
</tr>
</tbody>
</table>
<p><strong>The Completed Electronics</strong></p>
<p>The photo below shows all the electronics completely wired.</p>
<div id="attachment_443" style="width: 650px" class="wp-caption alignnone"><a href="http://bikerglen.com/wp/wp-content/uploads/2015/05/wiringcompleted.jpg"><img class="size-large wp-image-443" src="http://bikerglen.com/wp/wp-content/uploads/2015/05/wiringcompleted-1024x735.jpg" alt="All connections completed!" width="640" height="459" /></a><p class="wp-caption-text">All connections completed!</p></div>
<p><strong>Preparing the Development Environment</strong></p>
<p>If you’ve never used an Arduino before, download and install the <a href="http://www.arduino.cc/en/Main/Software">Arduino Development Environment</a>. The Arduino website has several good tutorials on <a href="http://www.arduino.cc/en/Guide/Environment">how to use the development environment to write, compile, and download code</a>.</p>
<p>We’re going to use the AccelStepper library to control the stepper motors. It can be downloaded <a href="http://www.airspayce.com/mikem/arduino/AccelStepper/">here</a>. It is installed by unzipping the distribution zip file into the libraries subfolder of your sketchbook.</p>
<p><strong>The Arduino Sketch</strong></p>
<p>Create a new sketch and paste the following code into it:</p>
<pre><code>
// include stepper library header file
#include &lt;AccelStepper.h&gt;

// motor speed in steps per second
#define motorSpeed 2400

// motor acceleration in steps per second per second
#define motorAccel 32000

// left eyeball step and direction pins
#define xstep 2
#define xdir  3

// right eyeball step and direction pins
#define ystep 4
#define ydir  5

// left eyeball accelstepper instance
AccelStepper stepper1 (1, xstep, xdir);

// right eyeball accelstepper instance
AccelStepper stepper2 (1, ystep, ydir);


void setup()
{
    // set step and direction pins to outputs
    pinMode (xstep, OUTPUT);
    pinMode (xdir,  OUTPUT);
    pinMode (ystep, OUTPUT);
    pinMode (ydir,  OUTPUT);

    // left eyeball stepper setup
    stepper1.setMaxSpeed(motorSpeed);
    stepper1.setSpeed(motorSpeed);
    stepper1.setAcceleration(motorAccel);

    // right eyeball stepper setup
    stepper2.setMaxSpeed(motorSpeed);
    stepper2.setSpeed(motorSpeed);
    stepper2.setAcceleration(motorAccel);
}


void loop()
{
    // main demo loop
    LookLeft ();
    delay (1000);
    LookRight ();
    delay (1000);
    LookDown ();
    delay (1000);
    RollEyes ();
    delay (1000);
    LookLeft ();
    delay (1000);
    LookDown ();
    delay (1000);
    CrossEyed ();
    delay (1000);
    LookUp ();
    delay (1000);
    LookDown ();
    delay (1000);
}


void LookLeft (void)
{
    stepper1.moveTo(+800);
    stepper2.moveTo(+800);
    
    while ((stepper1.distanceToGo() != 0) || (stepper2.distanceToGo() != 0)) {
        stepper1.run();
        stepper2.run();
    }
}


void LookRight (void)
{
    stepper1.moveTo(-800);
    stepper2.moveTo(-800);
    
    while ((stepper1.distanceToGo() != 0) || (stepper2.distanceToGo() != 0)) {
        stepper1.run();
        stepper2.run();
    }
}


void LookDown (void)
{
    stepper1.moveTo(0);
    stepper2.moveTo(0);
    
    while ((stepper1.distanceToGo() != 0) || (stepper2.distanceToGo() != 0)) {
        stepper1.run();
        stepper2.run();
    }
}


void LookUp (void)
{
    stepper1.moveTo(+1600);
    stepper2.moveTo(-1600);
    
    while ((stepper1.distanceToGo() != 0) || (stepper2.distanceToGo() != 0)) {
        stepper1.run();
        stepper2.run();
    }
}


void RollEyes (void)
{
    stepper1.moveTo(+3200);
    stepper2.moveTo(+3200);
    
    while ((stepper1.distanceToGo() != 0) || (stepper2.distanceToGo() != 0)) {
        stepper1.run();
        stepper2.run();
    }
    
    stepper1.setCurrentPosition (0);
    stepper1.moveTo(0);
    stepper2.setCurrentPosition (0);
    stepper2.moveTo(0);
}


void CrossEyed (void)
{
    stepper1.moveTo(-800);
    stepper2.moveTo(+800);
    
    while ((stepper1.distanceToGo() != 0) || (stepper2.distanceToGo() != 0)) {
        stepper1.run();
        stepper2.run();
    }
}
</code></pre>
<p><strong>Running The Sketch</strong></p>
<p>To compile, download, and run the sketch, follow these steps:</p>
<ul>
<li>Plug the Arduino into the computer’s USB port.</li>
<li>Select the Arduino’s serial port from the Tools : Serial Port menu.</li>
<li>Compile, download, and run the code by clicking the upload button (the right arrow) in the code editing window.</li>
</ul>
<p>We want the eyeballs to be in a known position (looking straight down) when the Arduino starts. Follow these steps to set the eyeballs to the starting position before powering everything up:</p>
<ul>
<li>Unplug the Arduino from the computer’s USB port.</li>
<li>Rotate both pupils until they’re looking straight down.</li>
<li>Plug in the DC motor supply.</li>
<li>Plug the Arduino into the computer’s USB port.</li>
</ul>
<p>At this point, the eyeballs should start moving, following the sequence in the sketch’s loop function: left, right, down, roll eyes, left, down, cross-eyed, up, down.</p>
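<p>The target positions in the sketch can be read as rotation angles. The RollEyes() move of 3200 steps is one full revolution, which is consistent with a 200-step-per-revolution motor driven at 1/16 microstepping; that driver configuration is an assumption here, so check your driver's microstep jumpers. Under that assumption, a small helper converts degrees to steps:</p>

```cpp
#include <cmath>

// Assumed drive setup: 200 full steps/rev x 16 microsteps = 3200 steps/rev.
// This matches the sketch's RollEyes() move of 3200 steps per revolution.
const long kStepsPerRev = 3200;

// Convert an eyeball rotation in degrees to stepper steps.
long stepsForAngle(double degrees)
{
    return lround(degrees / 360.0 * kStepsPerRev);
}
```

<p>With these numbers, stepsForAngle(90) gives the 800-step quarter turn used by LookLeft() and LookRight(), and stepsForAngle(180) gives the 1600-step half turn used by LookUp().</p>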
<p><strong>Next Steps</strong></p>
<p>In part three of this series of posts, we’re going to use OpenCV to detect faces and make the googly eyes <a href="http://bikerglen.com/blog/tracking-people-with-the-googly-eyes-and-opencv/">track a person as they move around a room</a>. This will first require the installation of Python, NumPy, OpenCV, and PySerial. Then we’ll modify the example OpenCV Python face detection code to send the location of a face on a webcam to the Arduino. Lastly, we’ll modify the Arduino code to listen for the location of a face and move the steppers to point the eyeballs towards that location.</p>
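<p>As a preview of that last step, here is one plausible way to map the horizontal position of a detected face to a stepper target. The frame width, the ±800-step range, and the left/right orientation are all assumptions for illustration; the actual part-three code may map coordinates differently:</p>

```cpp
#include <cmath>

// Hypothetical mapping from a face's horizontal pixel position in the
// webcam frame to a stepper target. In the demo sketch above, +800 steps
// looks left and -800 steps looks right, so a face at the left edge of
// the (mirrored) frame maps to +800 and the right edge maps to -800.
long faceXToTarget(int x, int frameWidth)
{
    const long kRange = 800;                     // full-deflection step offset
    double norm = (double)x / frameWidth;        // 0.0 .. 1.0 across the frame
    return lround((0.5 - norm) * 2.0 * kRange);  // +800 .. -800
}
```

<p>On the Arduino side, a value like this received over the serial port could simply be passed to stepper1.moveTo() and stepper2.moveTo(), with the run() loop doing the rest.</p>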
]]></content:encoded>
			<wfw:commentRss>https://bikerglen.com/blog/bringing-the-robotic-googly-eyes-to-life/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Build a Pair of Robotic Googly Eyes</title>
		<link>https://bikerglen.com/blog/build-a-pair-of-robotic-googly-eyes/</link>
		<comments>https://bikerglen.com/blog/build-a-pair-of-robotic-googly-eyes/#comments</comments>
		<pubDate>Sun, 26 Apr 2015 03:06:38 +0000</pubDate>
		<dc:creator><![CDATA[Glen]]></dc:creator>
				<category><![CDATA[Fusion 360]]></category>
		<category><![CDATA[Robotics]]></category>

		<guid isPermaLink="false">http://bikerglen.com/blog/?p=392</guid>
		<description><![CDATA[Build your very own pair of giant robot googly eyes! These eyes are built from laser cut acrylic panels and motorized using a pair of NEMA-14 stepper motors. This post is the first in a series of three posts. In &#8230; <a href="https://bikerglen.com/blog/build-a-pair-of-robotic-googly-eyes/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<div id="attachment_400" style="width: 650px" class="wp-caption alignnone"><a href="http://bikerglen.com/wp/wp-content/uploads/2015/04/DSC00179.jpg"><img src="http://bikerglen.com/wp/wp-content/uploads/2015/04/DSC00179-1024x682.jpg" alt="Completed robotic googly eyes. The scleras (whites of the eyes) are about 8&quot; (200mm) in diameter with 5&quot; (125mm) pupils." width="640" height="426" class="size-large wp-image-400" /></a><p class="wp-caption-text">Completed robotic googly eyes. The scleras (whites of the eyes) are about 8&#8243; (200mm) in diameter with 5&#8243; (125mm) pupils.</p></div>
<p>Build your very own pair of giant robot googly eyes! These eyes are built from laser cut acrylic panels and motorized using a pair of NEMA-14 stepper motors. This post is the first in a series of three posts. In this post, we&#8217;ll build the googly eyes. In the 2nd post, <a href="http://bikerglen.com/blog/bringing-the-robotic-googly-eyes-to-life/">we&#8217;ll look at hardware and software to animate the googly eyes</a>. In the 3rd and final post, we&#8217;ll connect the eyes to OpenCV to make them <a href="http://bikerglen.com/blog/tracking-people-with-the-googly-eyes-and-opencv/">track motion and faces in a room</a>.</p>
<p><span id="more-392"></span></p>
<p><strong>3D Model</strong></p>
<p>Before buying any parts or doing any building, I designed the eyeballs in Autodesk Fusion 360. A 3D model of the eyeballs can be viewed online using Autodesk&#8217;s free 3D viewer <a href="http://a360.co/1CytiFw" title="Googly Eyes 3D Model Web Viewer" target="_blank">here</a>. The 3D model was useful for making sure all the parts would fit together, building the bill of materials for the project, and seeing where all the pieces go when it was time to assemble the googly eyes. A Fusion 360 archive file for the googly eyes can be downloaded <a href="https://github.com/bikerglen/googly-eyes/raw/master/mechanical/googly-eyes-3d-model-fusion360-archive.f3d" title="Googly Eyes 3D Model Fusion 360 Archive File">here</a>.</p>
<p><strong>Required Parts</strong></p>
<div id="attachment_398" style="width: 650px" class="wp-caption alignnone"><a href="http://bikerglen.com/wp/wp-content/uploads/2015/04/DSC00163.jpg"><img src="http://bikerglen.com/wp/wp-content/uploads/2015/04/DSC00163-1024x682.jpg" alt="The parts required to build a set of robotic googly eyes." width="640" height="426" class="size-large wp-image-398" /></a><p class="wp-caption-text">The parts required to build a set of robotic googly eyes.</p></div>
<p>The parts for this project are listed below. The total cost was about $85 excluding the shipping and handling charges.</p>
<ul>
<li>2 Pololu NEMA-14 35x28mm Stepper Motors, Item #1208, $12.95 ea</li>
<li>1 Pololu 2-Pack of Aluminum Mounting Hubs for 5mm Shafts, M3 Holes, Item #1998, $7.49 for 2</li>
<li>8 Keystone #6 Aluminum Round Spacers, 1/2&#8243;, Item #3466, $0.37 ea</li>
<li>6 Keystone 6-32 Aluminum Hex Standoffs, 1-3/4&#8243;, Item #1819, $1.00 ea</li>
<li>4 M3-0.50 x 8mm Black Oxide Button Head Socket Cap Screws, McMaster-Carr #91239A113, $6.43 for 100</li>
<li>12 6-32 x 0.375&#8243; Button Head Socket Cap Head Screws, McMaster-Carr #92949A146, $3.93 for 100</li>
<li>8 M3-0.50 x 18mm Button Head Socket Cap Screws, McMaster-Carr #92095A472, $6.00 for 100</li>
<li>2 Laser Cut Acrylic Eye Whites (scleras)*</li>
<li>2 Laser Cut Acrylic Eye Blacks (pupils)*</li>
<li>1 Laser Cut Acrylic Wall Mounting Plate*</li>
</ul>
<p>*DXF files for use at Ponoko are provided in the next section. The materials and cutting cost was $38.17.</p>
<p><strong>Cutting the Acrylic Pieces</strong></p>
<p>I had Ponoko cut the acrylic pieces for the googly eyes. Total cost excluding shipping was $38.17 at the time I built my pair of googly eyes. You&#8217;ll need the two design files below.</p>
<p><a href="https://github.com/bikerglen/googly-eyes/raw/master/mechanical/p2_nema_14_white_sclera_0.118.dxf">p2_nema_14_white_sclera_0.118.dxf</a><br />
<a href="https://github.com/bikerglen/googly-eyes/raw/master/mechanical/p2_nema_14_black_pupils_0.118.dxf">p2_nema_14_black_pupils_0.118.dxf</a></p>
<p>(Right click and choose save as if you have trouble downloading the files.)</p>
<p>Both of these files are sized to be cut from Ponoko&#8217;s P2-sized material. The file p2_nema_14_white_sclera_0.118.dxf contains two scleras (the white parts of the eyes). I cut these out of glossy white 3mm acrylic. The file p2_nema_14_black_pupils_0.118.dxf contains four pupils and two wall mounting brackets. I cut these out of glossy black 3mm acrylic.</p>
<p><strong>Assembly</strong></p>
<p>Start assembly by attaching the mounting hubs to the shafts of the stepper motors. The top of the hub should sit flush with the end of the shaft. Tighten the set screws to their final torque. The photo in the parts list above shows the hubs mounted to the stepper motors.</p>
<div id="attachment_414" style="width: 650px" class="wp-caption alignnone"><a href="http://bikerglen.com/wp/wp-content/uploads/2015/04/DSC00173.jpg"><img src="http://bikerglen.com/wp/wp-content/uploads/2015/04/DSC00173-1024x682.jpg" alt="Position of motor and spacers on the rear of the eye white. Notice the wires exit the stepper motor away from the 1.75&quot; spacers." width="640" height="426" class="size-large wp-image-414" /></a><p class="wp-caption-text">Position of motor and spacers on the rear of the eye white. Notice the wires exit the stepper motor away from the 1.75&#8243; spacers.</p></div>
<p>Next, connect the motors to the rear of the eye whites. Insert one of the M3-0.5 x 18mm screws through one of the four smaller holes on the front of the acrylic eye white. Place a spacer on the screw on the back side of the eye white, then insert the screw into the corresponding mounting hole on one of the stepper motors. Point the side of the stepper motor with the wires down and away from the three larger 6-32 holes so that the wires will clear the spacers installed in the next step. Repeat this procedure for the remaining three M3 screws, then repeat for the second eyeball.</p>
<div id="attachment_415" style="width: 650px" class="wp-caption alignnone"><a href="http://bikerglen.com/wp/wp-content/uploads/2015/04/DSC00168.jpg"><img src="http://bikerglen.com/wp/wp-content/uploads/2015/04/DSC00168-1024x682.jpg" alt="The motor and spacers mounted to the rear of the eye whites." width="640" height="426" class="size-large wp-image-415" /></a><p class="wp-caption-text">The motor and spacers mounted to the rear of the eye whites.</p></div>
<p>Now connect the eye whites to the spacers. The spacers will connect the eyeballs to the wall mounting plate. Insert a 6-32 x 0.375&#8243; screw through one of the three larger holes on the front side of the acrylic eye white. Tighten a spacer onto the screw on the rear of the eye white. Repeat this procedure for the two remaining 6-32 screw holes then repeat for the second eyeball.</p>
<div id="attachment_417" style="width: 650px" class="wp-caption alignnone"><a href="http://bikerglen.com/wp/wp-content/uploads/2015/04/DSC00181.jpg"><img src="http://bikerglen.com/wp/wp-content/uploads/2015/04/DSC00181-1024x432.jpg" alt="Side view of the googly eyes mounted on the wall hanging bracket." width="640" height="270" class="size-large wp-image-417" /></a><p class="wp-caption-text">Side view of the googly eyes mounted on the wall hanging bracket.</p></div>
<p>Acrylic scratches easily, so carefully flip the assembly over and place it on a soft surface such as a towel. Position the wall mounting bracket over one of the eyeballs. Insert a 6-32 x 0.375&#8243; screw through the wall mounting bracket and into the corresponding spacer on the eyeball. I mounted my wall mounting bracket to the eyeballs such that the motor wires exited the bottom when the wall mounting bracket is hung on the wall. Holes are provided for either orientation. Finish attaching the first eyeball to the wall mounting bracket then repeat for the second eyeball.</p>
<p>Flip the eyeballs over and use the M3-0.5 x 8mm black oxide screws to mount the pupils to the mounting hubs on the ends of the stepper motors.</p>
<p>Congratulations! You&#8217;ve built your very own pair of giant robotic googly eyes! </p>
<div id="attachment_400" style="width: 650px" class="wp-caption alignnone"><a href="http://bikerglen.com/wp/wp-content/uploads/2015/04/DSC00179.jpg"><img src="http://bikerglen.com/wp/wp-content/uploads/2015/04/DSC00179-1024x682.jpg" alt="Completed robotic googly eyes." width="640" height="426" class="size-large wp-image-400" /></a><p class="wp-caption-text">Completed robotic googly eyes</p></div>
<p><strong>Next Steps</strong></p>
<p>In the next post in this series, we&#8217;ll look at making the eyeballs move using <a href="http://bikerglen.com/blog/bringing-the-robotic-googly-eyes-to-life/">an Arduino and some simple stepper motor drivers</a>. In the final post, we’ll use OpenCV to make the googly eyes <a href="http://bikerglen.com/blog/tracking-people-with-the-googly-eyes-and-opencv/">track faces in the room</a>.</p>
]]></content:encoded>
			<wfw:commentRss>https://bikerglen.com/blog/build-a-pair-of-robotic-googly-eyes/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
	</channel>
</rss>
