Carti-B, the semi-autonomous shopping cart

This was the Fall 2018 project for the first of the two senior design classes (ME366J and K) that mechanical engineering students in the Cockrell School must take: a semi-autonomous shopping cart that follows its user around without any hands-on interaction. This gives users more flexibility while shopping and frees their hands for other tasks, like holding a cellphone or pushing a stroller. This particular example was built within our studios and then donated back. The project was supervised by Dr. Carolyn Seepersad and created by Nicholas Fuselier, Amber Chen, Samuel Bar, Daniel Pousset, John McAnnally, and Emanuel Okeke.


Shot by Obinna Akahara

Narrated by Emily Cazes


Cart operation relies on a Pixy2 for target recognition and discrimination, plus four SRF05 ultrasonic sensors for dynamic obstacle avoidance. A Particle Photon is the microcontroller that integrates data from the sensors. The Pixy relies on hue data from its video feed to construct color blocks from the scene. The Pixy is a smart sensor: it integrates the data onboard and relays only the result, which leaves the microcontroller free to perform other tasks. The Pixy can process its video feed live because it uses only color data and does not try to discriminate more complex visual features like edges or other physical structure. Specific hues can be tagged, and when the Pixy finds them adjacent to one another in color-connected components, it relays the height, width, and relative location of each block to the microcontroller. In a commercial setting, each user would wear a unique color code in the form of a belt or armband so that they could be tracked. The primary downside of this method is that the Pixy is sensitive to lighting: we found that bright, matte colors worked best, though bike-style reflectors or self-illuminated codes could work more reliably.
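
For concreteness, here is a minimal Wiring-style sketch of the block-reading step. It uses the color-connected-components (CCC) interface from the stock Pixy2 Arduino library; the signature number and the assumption that this library runs as-is on a Particle Photon are ours, not details from the team's firmware.

```cpp
#include <Pixy2.h>

Pixy2 pixy;
const int USER_SIGNATURE = 1;  // hypothetical: the color code tagged for this user

void setup() {
  Serial.begin(115200);
  pixy.init();
}

void loop() {
  pixy.ccc.getBlocks();  // fetch the latest set of color-connected components

  // Pick the largest block matching the user's signature. The Pixy has
  // already done the color segmentation onboard, so this loop is cheap.
  int best = -1;
  for (int i = 0; i < pixy.ccc.numBlocks; i++) {
    if (pixy.ccc.blocks[i].m_signature != USER_SIGNATURE) continue;
    if (best < 0 || pixy.ccc.blocks[i].m_height > pixy.ccc.blocks[best].m_height)
      best = i;
  }

  if (best >= 0) {
    // Position in the frame plus apparent size, relayed by the smart sensor.
    Serial.print("x=");  Serial.print(pixy.ccc.blocks[best].m_x);
    Serial.print(" w="); Serial.print(pixy.ccc.blocks[best].m_width);
    Serial.print(" h="); Serial.println(pixy.ccc.blocks[best].m_height);
  }
}
```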

To keep track of the user, the Pixy and the cart act analogously to a head and a body. The Pixy is mounted on two servos for pan-tilt action and can quickly track changes in target movement. The cart is significantly heavier and so is slower to update its orientation relative to the target. The pan-tilt mechanism is driven by a PID loop, and its pan data is shunted to the drive controller and weighted to set the speed of the wheel motor for turning. The cart rotates itself until the body and the camera are aligned again at rest. Forward-backward motion is controlled by the height of the tracked block (in pixels) relative to a reference; by changing this reference, the cart can maintain a set distance from the user.
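
A rough sketch of this head/body split is below; the PID gains, frame constants, and shunting weights are illustrative placeholders rather than values from the team's code.

```cpp
// Simple PID controller; gains below are placeholders.
struct PID {
  float kp, ki, kd;
  float integral = 0, prevErr = 0;
  float update(float err, float dt) {
    integral += err * dt;
    float deriv = (err - prevErr) / dt;
    prevErr = err;
    return kp * err + ki * integral + kd * deriv;
  }
};

PID panPid = {0.4f, 0.0f, 0.05f};  // fast loop: keeps the camera on target
const float FRAME_CENTER_X = 158;  // Pixy2's CCC frame is 316 px wide
const float REF_HEIGHT = 60;       // block height (px) at the desired distance

float panAngle = 0;  // current camera pan, degrees from straight ahead

void track(float blockX, float blockHeight, float dt,
           float &turnSpeed, float &forwardSpeed) {
  // Head: the servo PID chases the target across the frame.
  panAngle += panPid.update(FRAME_CENTER_X - blockX, dt);

  // Body: the cart turns so as to zero out the pan angle, slowly
  // re-aligning chassis and camera (the shunted, weighted pan data).
  turnSpeed = 0.1f * panAngle;

  // Distance: apparent height against a reference drives forward-backward
  // motion; raising REF_HEIGHT makes the cart follow more closely.
  forwardSpeed = 0.02f * (REF_HEIGHT - blockHeight);
}
```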

The size information and shunted pan data are then superimposed to create the base tracking motion. On top of this, data from the ultrasonics is also weighted and superimposed to create the final motion. All sensor contributions are combined in an exponentially weighted moving average that smooths the motion, and the smoothed result is fed into another PID loop. At short distances, the readings from the front and rear ultrasonics override the contribution from the Pixy to prevent the cart from running into the user or backing into a wall. The side ultrasonics have a much lower weight and so gently coax the cart away from surfaces while it is in motion. The front ultrasonic is mounted on a servo so that the forward sensor reacts to the user specifically, not just whatever is directly in front of the cart. Without this protection, the cart would repeatedly run into the user, because the marker begins to clip off-screen when the user is closer than the Pixy can track; this clipping reads as a reduction in height, and without an independent check on the user's position, the cart would interpret the user as moving farther away and try to approach them. When sight of the user is lost, the ultrasonic contribution remains active for several seconds to prevent collisions and then deactivates so the user can access the cart.
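
The fusion stage might look something like the sketch below; the weights, override threshold, EWMA constant, and timeout are placeholder values, since the writeup describes only the structure.

```cpp
// Sketch of the sensor-fusion stage. All numeric constants are assumptions.
struct Ultrasonics { float front, rear, left, right; };  // distances in cm

float ewmaForward = 0, ewmaTurn = 0;
unsigned long lastSeenMs = 0;
const float ALPHA = 0.3f;        // EWMA smoothing factor
const float OVERRIDE_CM = 30.0f; // short-range cutoff for front/rear sensors

void fuse(float pixyForward, float pixyTurn, const Ultrasonics &us,
          bool userVisible, float &forwardCmd, float &turnCmd) {
  if (userVisible) lastSeenMs = millis();

  float fwd = pixyForward;
  float trn = pixyTurn;

  // Front/rear override: at short range the ultrasonics win outright, so a
  // clipped (apparently shrinking) marker cannot drive the cart into the user.
  if (us.front < OVERRIDE_CM && fwd > 0) fwd = 0;
  if (us.rear  < OVERRIDE_CM && fwd < 0) fwd = 0;

  // Side sensors carry a much lower weight, gently nudging the heading
  // away from nearby surfaces while the cart is in motion.
  trn += 0.05f * (us.left - us.right);

  // Once the user has been out of sight for a few seconds, everything
  // deactivates so the user can walk up and access the cart.
  if (millis() - lastSeenMs > 3000) { fwd = 0; trn = 0; }

  // The superimposed commands pass through an exponentially weighted moving
  // average before feeding the drive PID, smoothing abrupt transitions.
  ewmaForward = ALPHA * fwd + (1 - ALPHA) * ewmaForward;
  ewmaTurn    = ALPHA * trn + (1 - ALPHA) * ewmaTurn;

  forwardCmd = ewmaForward;
  turnCmd    = ewmaTurn;
}
```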