T.P.R (Tele-Presence Robot) is a unique robot that can be controlled remotely over Skype from any location. It is capable of surveillance, communication, and mobility: getting into small areas and carrying various equipment and items. With a couple of attachments, our robot could perform almost any task.
Our robot will help solve the problem of limited surveillance. We chose this problem because a solution would be useful and effective: it could keep houses and stores safer from being robbed, and it could help keep streets safer by placing these robots in small areas where other cameras can't see or be installed.
T.P.R will let you attend meetings even if you're overseas, or quickly warn someone in your home that the cake is burning.
T.P.R will give people who are bedridden the ability to explore anywhere from the comfort of their beds. This will not only boost their morale but also make them feel included in family and friends' adventures.
Our first system design was to use Python to read key presses and display a colour on the computer screen. Using Skype's screen share, we could show that colour on the phone's screen. The phone would be attached to the robot with a colour sensor facing the screen, so the EV3 could read the colour of the screen and drive the motors depending on the colour.
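The computer side of that design can be sketched as a simple key-to-colour lookup. This is our own illustration, not the actual project script: the key names and RGB values are assumptions, and the real script would paint the whole screen with the chosen colour so Skype screen share carries it to the phone.

```python
# Map each control key to a solid colour; the colour is the "command"
# that travels over Skype screen share to the phone on the robot.
KEY_TO_COLOUR = {
    "up": (255, 0, 0),     # red   -> drive forward
    "left": (0, 255, 0),   # green -> turn left on the spot
    "down": (0, 0, 255),   # blue  -> reverse
}

def colour_for_key(key):
    """Return the RGB colour to display for a key; black means stop."""
    return KEY_TO_COLOUR.get(key, (0, 0, 0))
```

In practice the script would fill the screen with `colour_for_key(...)` on every key event, and release of all keys would fall through to black (stop).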
In the first test we realised the EV3 colour sensor did not work with ambient colour: it senses colour by flashing a red light and reading the reflected light value, then green, then blue, and from the light bouncing back it determines the colour of the surface it is looking at. So we switched to different shades of grey and used the colour sensor's ambient light mode instead. Running the phone at full brightness, we could read approximately 7 different values without error. We used this to make our first working prototype.
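Decoding those grey shades amounts to quantising the ambient-light reading into 7 bands, one per command. The sketch below is a reconstruction under assumptions: the 0–100 reading range, the cut-offs, and the command names are illustrative, and the real thresholds would be calibrated against the phone at full brightness.

```python
# One command per band of ambient-light reading (names are illustrative).
COMMANDS = ["stop", "forward", "backward", "left", "right",
            "slow-left", "slow-right"]

def command_from_light(reading, levels=7, max_reading=100):
    """Quantise a 0..max_reading ambient reading into one of `levels`
    evenly spaced bands and return the matching command."""
    band = min(int(reading * levels / (max_reading + 1)), levels - 1)
    return COMMANDS[band]
```

Evenly spaced bands keep the grey shades as far apart as possible, which is what made 7 values readable without error.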
The 'A' frame design of the robot was too hard to build, so we made a cube-like design with the top of the phone sticking vertically out of the top. We had 3 states the robot could be in: forward, turn left on the spot, and backwards. This prototype gave us a lot of information. Firstly, the turning was far too fast for the one-second delay we were getting from the computer to the robot and back, but a quick software change fixed the problem. Despite the limited controls and the weak design of the robot, we managed to drive it 10 to 20 metres down the hallway. The one-second delay was a problem, but we didn't have enough time to lower it. The next problem was the slow speed. We tried adding gears, and while the speed increased we gained other problems, probably caused by the large gear ratio. Instead of lowering the gear ratio, we changed the gears back to a 1 to 1 ratio.
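The gearing tradeoff above can be shown with rough arithmetic (the numbers are illustrative, not measured from our robot): gearing for speed divides torque by the same factor it multiplies speed, which is the likely source of the problems we saw before going back to 1 to 1.

```python
def wheel_output(motor_rpm, motor_torque, ratio):
    """Return (wheel_rpm, wheel_torque) for a gear ratio expressed as
    driven teeth / driver teeth. ratio < 1 gears up for speed, trading
    away torque; ratio == 1 passes the motor output straight through."""
    return motor_rpm / ratio, motor_torque * ratio
```

For example, a 0.5 ratio doubles wheel speed but halves the torque available to move the robot, while 1 to 1 leaves both unchanged.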
The second prototype had more controls: we could now drive forward, turn right and left, and stop, and for our final design we added 2 colour sensors. We tried to have variable motor speed, but time was running short and we didn't have a variable input into Scratch apart from the mouse position. Instead we focused on usability with arrow-key controls. For example, when only one motor was going forward, that motor would run slower to give precise movement, but when both motors were in forward they would go at 100% speed.
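That arrow-key scheme can be sketched as a mapping from held keys to left/right motor power. This is our own reconstruction, not the actual Scratch project; the 40% turning speed is an assumed value standing in for whatever "slower" setting we tuned.

```python
TURN_SPEED = 40  # illustrative % power for the single motor while turning

def motor_speeds(keys):
    """Map the set of held arrow keys to (left, right) motor power in %.
    Straight driving uses full power on both motors; turning runs only
    one motor, at reduced power, for precise movement."""
    if "up" in keys:
        return (100, 100)
    if "down" in keys:
        return (-100, -100)
    if "left" in keys:
        return (0, TURN_SPEED)   # only the right motor runs, slowly
    if "right" in keys:
        return (TURN_SPEED, 0)   # only the left motor runs, slowly
    return (0, 0)
```

Checking forward/backward before the turn keys means straight driving wins when several arrows are held at once.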
Our presentation went well and many people enjoyed controlling the robot, though a lot of people also liked driving into other people.
We used Scratch for the basic movement controls and keyboard input; Scratch ran only on the computer end of the system. Our robot was built with the Lego Mindstorms building system and handled the robot's movement and colour detection from the phone. Joey's phone, linked with Skype, let us visually see everything the robot does. Skype, together with the phone, allowed us to see, hear and respond to everything, including the robot controls!