Following up my last post, I took another Introduction to Robotics class last semester, but this time it was the Mechanical Engineering department’s 2.12, which goes into more depth on the modelling and control of robotic mechanisms. See my previous post for a quick comparison of the classes.

Instead of exploring an environment and building a structure out of blocks, the final project for 2.12 was a little more constrained: the World Cup of robotics, a soccer competition played by the robots of the class. The class was divided into several groups, each representing a nation, and each group fielded a goalie robot and a kicker robot. A team of three other undergrads and me, dubbed “Team Kicking and Screaming,” built the kicker robot for our nation.

Here, you can see our final robot in action. The robot uses data from a camera system to predict the ball’s motion. With this model, it then calculates the point in time and space at which the ball will intersect the workspace of the robot. When the ball is close enough, the robot starts a kicking trajectory such that the shoe will meet the ball at this point in time.

In order to implement this behavior, we wrote several software modules. These included a module for accessing the camera system, a module for predicting the path of the ball, and a module for inverse kinematics and motor control, among others.

For vision, the 2.12 TAs set up a computer to extract the ball coordinates from a ceiling-mounted camera using techniques we studied in an earlier lab assignment. The machine published these coordinates to a TCP server with updates promised roughly 60 times a second. We first wrote software to poll this server and to perform calculations with the data.
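
For illustration, here is a minimal sketch of the kind of client we wrote. The host, port, and one-“x,y”-pair-per-line wire format are assumptions made for this example; the real server’s protocol may have differed.

```python
import socket

HOST, PORT = "192.168.1.100", 5000  # hypothetical address of the vision server

def ball_positions():
    """Yield (x, y) pixel coordinates as the server publishes them."""
    with socket.create_connection((HOST, PORT)) as sock:
        for line in sock.makefile("r"):     # read the feed line by line
            x_str, y_str = line.strip().split(",")[:2]
            yield float(x_str), float(y_str)

if __name__ == "__main__":
    for x, y in ball_positions():
        print("ball at (%.1f, %.1f) px" % (x, y))
```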

Next, we got to work on ball trajectory prediction. We were writing software at the same time as the TAs were setting up their vision software, so we couldn’t immediately test different models of rolling ball motion against real data. With this in mind, I deliberately left the model as a parameter that could be supplied by the user, and I used the curve_fit method of SciPy’s optimize module to fit the data to whatever model was supplied. This method uses the Levenberg–Marquardt algorithm, which handles nonlinear models as well as linear ones.
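
As a minimal sketch of that pluggable-model setup (the straight-line model and the sample data below are made up for illustration; any model with the same call signature could be swapped in):

```python
import numpy as np
from scipy.optimize import curve_fit

def linear_model(t, x0, v):
    """Placeholder model: constant-velocity motion along one axis.
    Any model with the signature f(t, *params) can be swapped in."""
    return x0 + v * t

# Hypothetical samples: time in seconds, position in pixels.
t_samples = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
x_samples = np.array([10.0, 24.8, 40.1, 55.2, 69.9])

params, _ = curve_fit(linear_model, t_samples, x_samples)
print("fit parameters:", params)                  # roughly [10, 30]
print("predicted x at t=3s:", linear_model(3.0, *params))
```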

Written before the camera system was set up, a piece of our software generated the following demo image. Upon being fed coordinates that represent sampled locations of the ball, plotted in blue, the software adjusted parameters to fit a trajectory model, and it plotted the calculated trajectory in green. The red line is the software’s prediction of the future path of the ball if it remains on this trajectory. The sample points were broadcast from a server hosted locally and were deliberately spaced far apart for ease of debugging; the actual data would be spatially denser.

Our software fit the above trajectory to the blue points as they were published. The axes are measured in pixels; the resolution of the field is roughly 2.5px/in. Once built, the actual field was a little smaller, and ball coordinates were published much more densely.

Once the field and camera system were set up and we observed the movement of the ball, it became apparent to us that a good model of ball motion could indeed be nonlinear. As the ball slowed, the general curvature of its trajectory often increased. More significantly, though, being a buckyball-shaped soccer ball, it would wobble from side to side as it rolled onto different faces of its surface. This motion was on the order of inches.

This plot of a ball’s motion over the field coordinates, captured from the field vision system, demonstrates the wobbliness of the ball’s motion as it slows down. The ball entered from the lower left and rolled to the upper right. Both axes are in pixels at roughly 3.5px/in. This plot is deliberately compressed to make the curvature more apparent.

Luckily, as the project progressed, it became apparent that the TAs intended to roll the ball at a higher speed than what we had initially been testing with. At higher speeds, the wobbling effect was diminished, and we were able to use a relatively simple model of a rolling ball with rolling resistance.
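
For illustration, here is one plausible form such a model could take; this is an assumption on my part rather than the exact equations we used. The ball decelerates at a constant rate along its line of travel until it stops, and the function is written so it can be passed straight to curve_fit.

```python
import numpy as np

def distance_along_path(t, s0, v0, a):
    """Distance traveled along the trajectory at time t for a ball that
    starts at s0 with speed v0 and decelerates at a constant rate a > 0
    (rolling resistance), staying put once it stops."""
    t = np.asarray(t, dtype=float)
    t_stop = v0 / a                   # moment the ball comes to rest
    t_move = np.minimum(t, t_stop)    # no further motion after stopping
    return s0 + v0 * t_move - 0.5 * a * t_move**2
```

A call like curve_fit(distance_along_path, t_samples, s_samples) would then recover s0, v0, and a from the vision data.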

Once we had this ball trajectory, we wanted our robot to predict the point at which the ball would enter its workspace. Still unsure of which trajectory model we would end up using, we were faced with the problem of finding the intersection of two arbitrary curves: the trajectory of the ball and the perimeter of the robot’s workspace.

Our solution was to discretize each curve as a finite number of line segments and then iterate through them until an intersection was found; this way, we could trade off the spatial resolution of the intersection point against the time needed to compute it. This strategy finds the intersection point, if there is one, as long as each curve diverges little from a straight line over the length of a single segment. In our case, the inertia of the ball limits how sharply its path can curve.

It turns out there aren’t many computational geometry modules available in Python, so I ended up writing my own to evaluate whether two line segments intersect. I’d be happy to hear if I overlooked something, so let me know.
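
To make the approach concrete, here is a reconstruction in that spirit (not the code from our repo): a cross-product-based test for whether two segments properly intersect, plus the scan over discretized curves described above.

```python
def _orientation(p, q, r):
    """Sign of the cross product (q - p) x (r - p):
    > 0 counterclockwise, < 0 clockwise, 0 collinear."""
    return (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])

def segments_intersect(a1, a2, b1, b2):
    """True if segment a1-a2 properly crosses segment b1-b2. Assumes
    general position: collinear and endpoint-touching cases are ignored."""
    d1 = _orientation(b1, b2, a1)
    d2 = _orientation(b1, b2, a2)
    d3 = _orientation(a1, a2, b1)
    d4 = _orientation(a1, a2, b2)
    return d1 * d2 < 0 and d3 * d4 < 0

def find_crossing(path, perimeter):
    """Scan two polylines (lists of (x, y) points) and return the indices
    of the first pair of segments that cross, or None."""
    for i in range(len(path) - 1):
        for j in range(len(perimeter) - 1):
            if segments_intersect(path[i], path[i + 1],
                                  perimeter[j], perimeter[j + 1]):
                return i, j
    return None
```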

The other members of my team developed the hardware for the robot, though I waterjetted some parts. The result of their design was this super kickass leg, as you can see from these renderings.

The frame of the robot attaches to a baseplate on the field while the leg terminates in a shoe to interface with the ball.

Once we had developed the design for our robotic manipulator, we wrote software to perform inverse kinematics with the parameters of our robot. With this kinematic model available, we wrote software so that our leg would follow a trajectory, ending at the time and place where the ball intersected our workspace.
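
As a simplified illustration of the kind of inverse kinematics involved, here is a planar two-link solver for the hip and knee; the actual leg also has ankle yaw and the side-to-side linkage, and the link lengths below are hypothetical.

```python
import math

# Hypothetical link lengths in meters; the real leg's parameters differed.
L_THIGH, L_SHIN = 0.30, 0.25

def leg_ik(x, y):
    """Return (hip, knee) joint angles in radians that place the end of
    the shin at (x, y) in the leg's plane, using the law of cosines.
    Picks one of the two elbow configurations."""
    r_sq = x * x + y * y
    cos_knee = (r_sq - L_THIGH**2 - L_SHIN**2) / (2 * L_THIGH * L_SHIN)
    if not -1.0 <= cos_knee <= 1.0:
        raise ValueError("target out of reach")
    knee = math.acos(cos_knee)
    hip = math.atan2(y, x) - math.atan2(L_SHIN * math.sin(knee),
                                        L_THIGH + L_SHIN * math.cos(knee))
    return hip, knee
```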

The lowest motor in this rendering adjusts the yaw of the ankle of the robot. Above this is a motor for controlling the angle of the knee, and above this is a motor for controlling the angle of the hip. The entire leg can move side-to-side, tracing out a semi-circle, using the driven parallel linkage between the hip and the frame.

Here’s the GitHub repo for our code, which might be helpful for future teams. We wrote the majority of our code in Python, using SciPy for curve fitting, and we used MATLAB and matplotlib for graphs.

Ok, this post is about catching up on an old project!

A semester ago, I took MIT’s Robotics: Science and Systems (6.141). I feel obligated to mention that, sadly, the instructor of this course, Prof. Seth Teller, has since passed away. The MIT news office has a well-written article on Prof. Teller and his work. However, the focus of this post is going to be on the structure of the course and the final project.

My team used a few techniques to build this robot quickly and flexibly for our final project for 6.141.

My team used a few techniques to build this robot quickly and flexibly for our final project for 6.141.

It turns out there are several introduction to robotics classes at MIT. 6.141 is offered by Course 6, the Electrical Engineering and Computer Science department. However, there’s also Introduction to Robotics (2.12), offered by Course 2, the Mechanical Engineering department. Last semester, I took 2.12, and it’s safe to say that both 6.141 and 2.12 offer a hands-on introduction to robotics, but from different perspectives. 6.141 has a focus on algorithms related to robotics, such as those for navigation, motion planning, and machine vision. By contrast, 2.12 has a focus on analyzing mechanisms, like multijointed robotic arms or legs. The two classes had some overlap in their discussion of techniques for sensor fusion and filtering. 6.141 also offers an introduction to the Robot Operating System (ROS) and has a technical speaking and writing component. As someone who is interested in improving my public speaking skills, I thought the speaking component was fun. For students interested in robotics, I recommend taking both.

That said, Course 6 also offers an EECS survey class, 6.01. Though I hear 6.01 has changed since I took it, I found it was also a great class in terms of introducing students to problems and algorithms in robotics, perhaps as good as 6.141. For example, 6.01 included a series of labs in which students wrote code snippets for filtering, state estimation, and path finding, which ultimately allowed robotic vehicles to race through a maze. Like 6.141, the labs in 6.01 seem to build up to the notion of SLAM, starting with labs on simple navigation techniques and gradually becoming more advanced. However, unlike 2.12 and 6.141, 6.01 lacks a final project component.

The final project component was also where 2.12 and 6.141 differed. Ironically, although 6.141 was a computer science class and 2.12 was a mechanical engineering class, I found I did mostly mechanical work for my 6.141 team and mostly software for my 2.12 team. I suppose this is because 6.141 attracted mostly computer science students, and 2.12 attracted mostly mechanical engineering students. For that reason, though, I’ll discuss the mechanical design of our 6.141 robot here.

The final challenge for 6.141 was to build, in a team of five, a robot to explore a room-sized arena, find colored blocks, and build a structure with them. We decided a neat way to further define the problem would be to build several towers of blocks, with each tower formed by only one color of block. We were also pressed for time, so I took a few steps to build this machine as fast as possible.

With this in mind, I decided the best way to design the hardware of the robot would be to break it up into physical modules. Each module could be designed independently, allowing people to design them in parallel, or at least reducing the cognitive load if one person worked on all modules. After consulting with the team, I settled on four: a chassis module, an arm/manipulator module, a block sorting module, and a storage module. The chassis would be the base of the robot and would include the frame, wheels, motors, battery, and laptop computer. The manipulator and arm would be mounted on the front of the chassis, able to pick up blocks from in front of the robot and drop them off on top of the robot. Atop the chassis would be mounted the block sorting module, which would sort and deliver blocks to the storage module, mounted to the back of the chassis. Finally, the storage module would feature a door or gate to release the blocks, stacked in tower formations.

This initial sketch of the robot included a fifth module: a color detection module attached to the front of the sorting module. However, we later decided to mount the color sensor on the claw itself, eliminating the need for color sensing while channeling blocks.

Another convenience of the modular design of our robot was that modules were really easy to take on and off. For example, to get at the electronics mounted on the chassis, we could simply remove two screws and lift off the sorting mechanism.

So that I could design each module separately, I first settled on the interfaces between the modules: each module would attach to its own horizontal bar of 80/20 aluminum extrusion on the chassis. I knew that I would want to adjust the exact heights and other dimensions of each module after I built the first prototype, so I deliberately designed these 80/20 bars to be adjustable in a few dimensions, such as height. Also, using the same technique I detailed in my post about a window fan, I created a dimensions.txt file in which I recorded these dimensions. I referenced these global parameters while designing each module, since the connection points they specified formed the interfaces between the modules.

At the start of creating CAD for the robot, I created a quick drawing to specify the meaning of dimensions A, B, C, and D in this drawing. I gave them names and specified them in dimensions.txt.

Also to cut down on construction time, I designed each module mostly out of 80/20 and laser-cut acrylic. Using the laser cutter made it easy and fast to cut improved versions of parts. I also used Jesus nuts for fast assembly.

The first two modules were straightforward. The chassis I designed was a slight adjustment from the squarebot chassis we had been using in labs earlier in the semester. Similarly, the arm and manipulator were re-used from earlier labs, but we added a color-detecting sensor to the manipulator to detect the color of the block being grasped.

For the sorting module, we came up with a few options but settled on one that minimized the number of moving parts. The module consisted of a slanted panel with guides for blocks. First, the manipulator drops blocks off at the top. Then, one guide funnels all the blocks to one side of the robot. An actuated flipper then guides each block into one of four channels depending on the block’s color, determined earlier by the color sensor.
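
The control side of this reduces to a lookup from detected color to flipper position. A hypothetical sketch (the angles and the servo interface are made up; the real values were tuned on the hardware):

```python
# Hypothetical flipper angles, in degrees, steering blocks into each of
# the four color channels.
CHANNEL_ANGLES = {"red": 15, "green": 40, "blue": 65, "yellow": 90}

def route_block(color, set_flipper_angle):
    """Aim the flipper at the channel matching this block's color.
    set_flipper_angle is a stand-in for the actual servo command."""
    set_flipper_angle(CHANNEL_ANGLES[color])
```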

This view, normal to the surface of the sorting mechanism, shows how blocks are first funneled to the flipper then channeled to a chute. Gravity pulls them down the mechanism.

The first version of the sorter mechanism was fast to assemble and convenient for testing things like the minimum slope required for cubes to slip.

Finally, the storage module consists of four containers just big enough for cubes to stack one atop another. I created a few prototype single-column containers to determine how much extra room was necessary to keep blocks from sticking to the walls. The containers were open on top to allow blocks to slide in from the sorting module. I designed one side of the module to open, allowing the robot to drive away, leaving four towers of stacked blocks.

The final storage mechanism included four columns and a double door at the back. Here, the doors are removed to see inside the chutes. The final sorter module is also attached atop.

As mentioned earlier, these techniques allowed me to quickly make improvements to parts. For example, I raised the heights of certain walls that blocks had been falling over, I increased the slope of the sorter to reduce the chance of blocks sticking, and I replaced the single door on the back of the storage mechanism with a double door mechanism because the weight of a single door was pushing the limit of the torque the servo could provide.

Unfortunately, I wasn’t able to collect much video footage before the disassembly of our robots, but the following two videos show our robot during testing and in action.