Wayward Sons

Members: Sameer Ansari, Tommy Kazenstein, David Bernal

Technology Used

  • Rovio robot
  • pyrovio for Rovio control
  • Windows 7 64-bit laptop
  • Visual Studio 2010 + CMake, etc.
  • OpenCV 2 with Python bindings

Progress Updates

Rocking the world in progress.


Project 3 (Mar 3 - Mar 30)

Motion planning! This project deals with obstacle avoidance using the overhead camera, and configuration-space navigation for returning the lemon to the start.
First we need to detect obstacles. Using background subtraction with thresholding, we get binary images with the obstacles as white; here's a test case mocked up in Paint.
OpenCV 2's nice distrans demo even shows off a great function that generates a matrix of minimum distances to obstacles, essentially a Voronoi generator!

So, by using approximate cell decomposition, we can generate a path from the Rovio to the lemon.

But it cuts it pretty darn close to the obstacles, so we add another heuristic term to the cost, based on the minimum distance to obstacles. That yields some clean paths, which we then clean up further using line-shortening & smoothing techniques.
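The clearance-weighted search can be sketched on an occupancy grid like this (plain Dijkstra for simplicity, and the penalty weight `w` is a placeholder, not our tuned value):

```python
import heapq

def plan_path(grid, clearance, start, goal, w=5.0):
    """Dijkstra over a 2D occupancy grid (True = obstacle).  Each step
    costs 1 plus a penalty that grows as clearance to the nearest
    obstacle shrinks, pushing the path away from walls."""
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, cell = heapq.heappop(pq)
        if cell == goal:
            break
        if d > dist.get(cell, float("inf")):
            continue  # stale queue entry
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and not grid[nr][nc]:
                step = 1.0 + w / (1.0 + clearance[nr][nc])
                nd = d + step
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = cell
                    heapq.heappush(pq, (nd, (nr, nc)))
    if goal not in dist:
        return None  # no path exists
    path, cell = [goal], goal
    while cell != start:
        cell = prev[cell]
        path.append(cell)
    return path[::-1]
```

With `w = 0` this degenerates to shortest-path, which is exactly the "cuts it close to the obstacles" behavior; cranking `w` up trades path length for clearance.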

Here are some initial trials during pre-demo testing.

The generated paths work very well, consistently choosing safe and generally efficient routes; most of the error in path navigation comes from the Rovio control system.

Project 2 (Feb 9 - Feb 24)

Friday 2/24 8:00 PM

The Rovio logic is straightforward:

def run(self):
    lastCamTime = time() # Time since last step for camera
    lastBotTime = time() # ditto for robot
    print "Starting Search routine"
    while True:
        # Camera Update
        if (time() - lastCamTime > CAM_STEP_TIME):
            lastCamTime = time() # Step runs, so update lastTime to now
            # Get Face(s) existence and position
        # Logic Update
        if (time() - lastBotTime > BOT_STEP_TIME):
            lastBotTime = time() # Step runs, so update lastTime to now
            # Decide what Rovio should do & do it
        # User Interaction Check
        if cv2.waitKey(5) == 27: # Esc quits
            break

Overhead tracking worked like a charm, given known, consistent environment lighting.
Face tracking worked too; the Haar cascade is only a little racist.

Report finished; most problems were solved with straightforward solutions:

  • Lemon tracking -> search for shiny regions, and mask out the shiny Rovio with an AND block
  • Corrupt JPEG images from the Rovio's crappy download-then-read cycle -> skip JPEG file creation entirely and decode the image straight from the JPEG byte string
  • Rovio turning more methodical (slower) -> time-limited turns are more consistent and slower
  • Deal with a non-existent lemon without throwing a hissy fit -> basic sanity check implemented: head to the last known lemon position

Saturday 2/18 9:50 PM
Overhead tracking is pretty decent (orientation/position)

  • Lemon Tracking

Rovio tracking is pretty godawful, due to a crappy JPEG-update routine letting garbage images sneak in

  • Fix JPEG image creation
  • Make Rovio turning more methodical (slower)
  • Make Rovio movement cleaner and faster
  • Deal with a non-existent lemon without throwing a hissy fit
  • Detect red/yellow

Some basic face detection using Haar cascades is up for the Rovio, but in low light (like now) it's just piss poor.

Saturday 2/18 6:20 PM
Syntax is by far the worst timesink here.

For tracking and orientation using the overhead camera, the logic is to put two colors on the Rovio and track both of them.
The two points not only provide a much more accurate centerpoint for the Rovio, but also enable the angle between the two to be calculated.
It seems to work really well with the overhead camera at close range.


Project 1 (Jan 24 - Feb 8)

Report: CS3630_Project1_WaywardSons_Sam_Tommy_David.pdf


This is almost literally the entire routine for running the Rovio; the rest is just the ugly prep-work and interfacing stuff.

for i in range(0,4):

For tracking the Rovio, we chose optical flow since it's just plain better: we got sub-centimeter accuracy for multiple points on the Rovio, and in the future this will enable orientation tracking.

The basic optical flow routine:

def run(self):
    while True:
        img0, img1 = self.prev_gray, frame_gray
        # Track points forward, then backward; keep only points whose
        # round trip lands near where they started
        p1, st, err = cv2.calcOpticalFlowPyrLK(img0, img1, p0, None, **lk_params)
        p0r, st, err = cv2.calcOpticalFlowPyrLK(img1, img0, p1, None, **lk_params)
        good = abs(p0 - p0r).reshape(-1, 2).max(-1) < 1
        # ptsx, ptsy are points being tracked by camera
        # bbox is the area that points are tracked from
        self.setBBOX(int(np.mean(ptsx)), int(np.mean(ptsy)))
        mask[self.bbox1[1]:self.bbox2[1], self.bbox1[0]:self.bbox2[0]] = 255
        p = cv2.goodFeaturesToTrack(frame_gray, mask=mask, **feature_params)
        # ptsx, ptsy set from p essentially

The end.


Unless otherwise stated, the content of this page is licensed under Creative Commons Attribution-ShareAlike 3.0 License
