Week 17 – Final Project Presentation and Demo

We have finally completed our project, the Hybrid Robotic Shopping Cart. The system provides indoor guidance with speech recognition and is designed to be friendly to a broad segment of the population, including older adults and people with various kinds of disability. Despite the challenges we faced while working on this project, we successfully implemented many of the planned features. Below is a sample of the slides from our final presentation.

 

 

(Slideshow: final presentation slides)

 

By the end of the project, we had realized the iRobot's line-following feature using IR sensors and built a server-client web system that gives the manager a view of the system's operation and the robot's movement. The network connects the user, the robot, and the server, allowing wireless communication between the server and its clients. The user interface, developed on Android, converts speech to text, giving users a convenient alternative way to interact with our product. Here are our demo videos of the system and of the speech recognition application.

(Final Demo)

 

(Speech recognition application)
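To make the networking part concrete, here is a minimal sketch of the kind of JSON position message the robot-cart client could send to the server for the manager view. The field names ("type", "cart", "x", "y", "stop") are illustrative assumptions, not our exact protocol.

```python
import json

def encode_position_update(cart_id, x, y, stop_name=None):
    """Build a JSON status message the robot-cart client could send to the server."""
    msg = {"type": "position", "cart": cart_id, "x": x, "y": y}
    if stop_name is not None:
        msg["stop"] = stop_name
    return json.dumps(msg)

def decode_message(raw):
    """Parse a message on the server side, validating the expected type field."""
    msg = json.loads(raw)
    if msg.get("type") != "position":
        raise ValueError("unexpected message type")
    return msg

# The manager view can append each decoded position to a history list
# to redraw the path the robot-cart has travelled.
history = []
history.append(decode_message(encode_position_update("cart-1", 3, 5)))
```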

 

 

LAST WORDS..

Conclusion

 

The team developed the concept of using an iRobot as a moving guide to help people shop in a market, by building a hybrid robot-cart with a line-following function. The original idea focused on assistive technology, so a speech-to-text feature was added to the interactive platform. Furthermore, the system not only guides customers toward items, but also provides a real-time view that lets a manager track the history of locations the robot-cart has passed. The system has great potential because it can make shopping more convenient for people with or without disabilities.

 

Future Works

 

For future work, we could include more items on the map and involve more robot-carts to expand the scope of the system. The line-following algorithm could be improved so that the robot-cart runs more stably. We would also like to add an obstacle-avoidance feature, since collisions would be a critical issue when the robot operates in a real shopping market. Lastly, we could improve the user interfaces for both the customer-side and manager-side views.

Thank you, Dr. Min, for your help throughout this semester!

— Hamsters

Week 15 (04/17/2019)

 

For the last few days we have been building our GUI and creating the path for our Roomba, using the contrast between a black tape line and the white table as the background, so the robot can follow the line and stop once it loses the black. The Roomba detects the black line using its four cliff sensors, which are infrared (IR) sensors located underneath, along the front bumper (as shown in the following picture). We are using Python to connect to the robot, send commands, and read the signals from the four sensors.

(Figure: locations of the cliff sensors along the Roomba's front bumper)

Source: https://cs.fit.edu/Projects/robotics/Projects2010/Timmaia/roboproj.html
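A minimal sketch of how the cliff-signal readings could drive the line follower is below. It is based on the pycreate2 interface as we understand it; the dark threshold, speeds, and the drive_direct parameter order are assumptions that would need tuning on the real robot. Cliff signals drop on dark surfaces, so a reading below the threshold means that sensor is over the black tape.

```python
def steer(front_left_sig, front_right_sig, dark_threshold=1200, speed=100):
    """Decide (left, right) wheel speeds from the two front cliff-signal readings."""
    left_on = front_left_sig < dark_threshold
    right_on = front_right_sig < dark_threshold
    if left_on and right_on:   # centered on the line: go straight
        return speed, speed
    if left_on:                # line drifting to the left: curve left
        return speed // 3, speed
    if right_on:               # line drifting to the right: curve right
        return speed, speed // 3
    return 0, 0                # line lost: stop

def follow_line(port="/dev/ttyUSB0"):
    """Hardware loop (requires the pycreate2 package and a connected robot)."""
    from pycreate2 import Create2  # imported here so the sketch loads without hardware
    bot = Create2(port)
    bot.start()
    bot.safe()
    while True:
        s = bot.get_sensors()
        left_vel, right_vel = steer(s.cliff_front_left_signal,
                                    s.cliff_front_right_signal)
        if (left_vel, right_vel) == (0, 0):
            bot.drive_stop()
            break
        bot.drive_direct(left_vel, right_vel)
```

Keeping the steering decision in a pure function like steer() lets us test the logic on a laptop before putting the robot on the table.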

 

Also, today we are working on adding multiple stops for our items along the path. To achieve this, we need to design a distinct black-area marker for each destination, so that the Roomba stops once it detects that stop's signature.
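One candidate signature, sketched below, is a full-width black bar that darkens all four cliff sensors at once, with the number of consecutive bars encoding which stop it is. This encoding is an assumption for illustration, not a finalized design.

```python
def is_stop_marker(signals, dark_threshold=1200):
    """A full-width black bar darkens all four cliff sensors at the same time."""
    return all(sig < dark_threshold for sig in signals)

def count_bars(readings, dark_threshold=1200):
    """Count consecutive full-width bars in a sequence of 4-sensor readings.

    For example, two bars separated by a white gap could mean "stop #2".
    """
    bars = 0
    on_bar = False
    for signals in readings:
        dark = is_stop_marker(signals, dark_threshold)
        if dark and not on_bar:  # rising edge: entered a new bar
            bars += 1
        on_bar = dark
    return bars
```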

Here is a rough draft of our overall idea for the project and the GUI:

(Figure: draft sketch of the project and GUI)

 

Below is the environment that we created to test our robot with the line follower code.

(Photo: the test environment we built for the line-follower code)

 

 

Week 13 (04/04/2019)

This week, the team benefited a lot from the lecture: we tested the Raspberry Pi (RasPi) controlling an Arduino LED, and used Geany to perform basic control of the iRobot, including setting the baud rate, switching modes (passive/safe), and sending serial packets (move forward/backward). The team reviewed the process and decided to program the iRobot in Python on the Raspberry Pi, using Geany. We managed to perform these basic tasks from the Raspberry Pi's terminal and worked out the decimal-to-hex conversion needed for serial communication. The Python library pycreate2, written specifically for the iRobot, has been installed on the Raspberry Pi.

(Photo: basic iRobot control from the Raspberry Pi)
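The decimal-to-hex step comes from the iRobot Open Interface, where 16-bit values such as velocity are sent as two's-complement byte pairs. A small sketch: the Drive opcode (137) and byte layout follow the Create 2 Open Interface spec, while the helper name is our own.

```python
import struct

DRIVE_OPCODE = 137  # iRobot Open Interface "Drive" command

def drive_packet(velocity_mm_s, radius_mm):
    """Build the 5-byte Drive packet: opcode + two signed 16-bit big-endian values."""
    return bytes([DRIVE_OPCODE]) + struct.pack(">hh", velocity_mm_s, radius_mm)

# Driving backward at 200 mm/s in a straight line (radius 0x8000 means
# "straight" in the spec; -32768 is its signed representation):
pkt = drive_packet(-200, -32768)
print(pkt.hex())  # the negative values show why the decimal-to-hex step matters
```

struct.pack(">h", ...) does the two's-complement conversion for us, so we never have to compute the hex bytes by hand.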

The team will meet more often than the regular meeting time in order to shape the project better. By the next meeting, the team will research the IR sensors on the iRobot, so that we can read their signals and command the iRobot based on that feedback (e.g., a line-follower sample project). We need to figure out whether the iRobot's control interface supports a closed feedback loop or only open-loop commands.

 

Week 11 (03/21/2019)

This week we wrapped up some unfinished work to catch up with the schedule, and we set up a more detailed plan. Between today's meeting and a possible second meeting over the weekend, the team aims to program the iRobot using the Raspberry Pi and a laptop, and then figure out how to realize the line-following feature.

Possible resources are at the following links:

Raspberry Pi Control iRobot

Raspberry Pi Setup Guide with Mac

 

Week 9 (03/07/2019)

This week we tested the iRobot Create by connecting it to the laptop, turning it on, and letting it drive around to check its functions.

(Photo: testing the iRobot Create)

 

For the next meeting, we will try to program the iRobot. Since we have not yet received the equipment for this week, we plan to do more research on iRobot implementation and hopefully start development after returning to school.

Week 8 (03/02/19) Proposal

This Monday we presented our proposal for the project. Here are sample slides showing our motivation, existing robots, testing ideas, and plan.

(Slideshow: proposal slides)

Peer Reviews:

1. Some thought the goal was good; others thought the scope was too vague.
2. Clear structure and approach; great motivation.
3. What is item search? What about capability, speed, and path planning? How much noise is there during communication?

After summarizing our peers' reviews, we realized that to make our study purpose solid and convincing, we need to specify the project objective, for example by narrowing down the scope and functions. That way we are more likely to focus on the deeper questions.

Another after-thought concerns pre-testing the sensors. It is very important to test the capabilities of the IR sensors, such as range, speed, and functionality. As for communication between the robot and the tablet, the range limit and perhaps the efficiency of information passing should also be tested.

Week 7 (02/20/2019)

In our first meeting, we discussed our ideas in order to narrow down the scope. The list below contains papers with ideas similar to our project's aim, along with some similar existing robots.

Literature Review:

1. Ergonomics-for-One in a Robotic Shopping Cart for the Blind

http://delivery.acm.org.ezproxy.lib.purdue.edu/10.1145/1130000/1121267/p142-kulyukin.pdf?ip=128.210.126.199&id=1121267&acc=ACTIVE%20SERVICE&key=A79D83B43E50B5B8%2E2BA8E8EA4DBC4DB7%2E4D4702B0C3E38B35%2E4D4702B0C3E38B35&__acm__=1550286906_b9df17c9627caafa90d40e3eac8683a5

  • People with vision impairment can use a robotic shopping cart.
  • Line following; laser sensor (navigation worked in small spaces).

2. Intelligent shopping cart

https://pdfs.semanticscholar.org/57f0/89176515f628a47cafeeb0ea6d0e0bbfa7dc.pdf

  • The cart has an inbuilt automatic billing system and displays products, their prices, and the best deals.
  • Radio-frequency identification (RFID); wireless network.

3. Development and Applications of Line Following Robot Based Health Care Management System

https://id-static.z-dn.net/files/d4d/22044b50e16a2d6efc4c724d350058ef.pdf

  • The user calls the robot, and the robot fetches/delivers the medicine to them; the same approach could be used for fetching items in grocery shopping.
  • Line following; IR; 89C51 microcontroller; LDR.

4. Design of a Control System for Robot Shopping Carts https://link.springer.com/chapter/10.1007/978-3-642-23851-2_29

  • To assist elderly or disabled users, the cart follows them accurately and automatically.
  • Infrared laser and laser range sensor for user position and distance; an evasion system to prevent collisions; the control system can be attached to any normal shopping cart.

5. CompaRob: The Shopping Cart Assistance Robot https://journals.sagepub.com/doi/pdf/10.1155/2016/4781280

  • A person-following shopping cart assistance robot that carries things.
  • The review of similar systems on page 4 is very informative.
  • Uses a smartphone/tablet; RFID; RF/sonar.

6. An affective guide robot in a shopping mall https://www.researchgate.net/publication/221473101_An_affective_guide_robot_in_a_shopping_mall

  • More interaction involved.
  • Position estimation, person identification, and speech recognition; RFID.

7. Robot-assisted wayfinding for the visually impaired in structured indoor environments.

https://link.springer.com/content/pdf/10.1007%2Fs10514-006-7223-8.pdf

8. Passive Radio Frequency Exteroception in Robot Assisted Shopping for the Blind.

https://link.springer.com/content/pdf/10.1007%2F11833529_6.pdf

 

Existing similar robots:

1. LG Robot Shopping cart

https://www.techspot.com/news/77273-lg-putting-robot-shopping-carts-retail-stores.html

2. dash robotic shopping cart

http://5elementsrobotics.com/dash-robotic-shopping-cart/

3. Budgee robot

https://www.theverge.com/2015/1/5/7498155/budgee-the-rolling-luggage-robot-five-elements-robotics