The Remote-Controlled Claw is an assistive device designed for people with limited mobility or who find complex machines difficult to operate, such as the elderly. The robot uses a simplified remote control for ease of use, allowing the user to drive the robot into places that would be out of reach for an unaided human hand.
The robot has a motor fixed to each side, and a touch sensor on the remote control operates the claw on the front of the robot. All of these functions are made possible by connecting the two bricks over Bluetooth, with extensive testing and development to enhance the abilities of the machine. In doing so we believe that a robot such as this could be a success on the commercial market, as simple and lightweight assistive devices are lacking for those who need them.
As the controls are very basic, anyone who uses the robot should be able to operate it without much trouble. The only downside is that there is no guide to tell the operator whether the claw is lined up correctly to pick up the object in front of it. A sound or flash of light to signify this might have made the claw easier to use, although there was a clear plan not to over-complicate it, given the target audience if it were put up for commercial sale.
T.P.R (Tele-Presence Robot) is a unique robot that can be controlled remotely over Skype from any location. It is capable of almost anything: surveillance, communication, and mobility – reaching smaller areas and carrying various equipment and items. Our robot (with a couple of attachments) could perform any task.
Our robot will be able to solve the problem of limited surveillance. We chose this problem because a solution could be both useful and effective: it could help keep houses and stores safer from robbery, and could also help keep streets safer by placing these robots in small areas where other cameras can't see or be placed.
T.P.R will allow you to attend meetings even if you're overseas, or to quickly tell/warn someone in your home that the cake is burning.
T.P.R will give people who are bedridden the ability to go exploring anywhere from the safe comfort of their beds. This will not only boost morale, but help them feel included in family and friends' adventures.
Our first system design was to use Python to read key presses and display a colour on screen. Using Skype’s screen share we could display that colour on the screen of the phone. The phone would be attached to the robot with a colour sensor looking at the screen. The EV3 could then read the colour of the screen and drive the motors depending on the colour.
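The computer-side half of this first design can be sketched as a simple key-to-colour lookup: each held key picks one full-screen colour for the phone's colour sensor to read over Skype screen share. The key names and the specific colours below are illustrative assumptions, not the team's actual values.

```python
# Sketch of the computer-side controller from the first design.
# Each held key selects one full-screen colour; Skype screen share
# shows it on the phone for the EV3 colour sensor to read.
# Key names and colour choices are illustrative assumptions.
COMMAND_COLOURS = {
    "up": (255, 255, 255),    # forward
    "left": (0, 255, 0),      # turn left
    "down": (255, 0, 0),      # backward
}
IDLE_COLOUR = (0, 0, 255)     # no key held: "stop" colour

def colour_for_key(pressed_key):
    """Return the RGB colour to fill the screen with."""
    return COMMAND_COLOURS.get(pressed_key, IDLE_COLOUR)
```

In practice the colour-fill would be drawn with any windowing library; only the key-to-colour mapping matters to the scheme.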
In the first test we realised the EV3 colour sensor does not work with ambient colour, because it senses colour by flashing a red light and reading the reflected light value, then green, then blue; from the light bouncing back it determines the colour of the surface it is looking at. So we switched to different shades of grey, using the ambient-light mode of the colour sensor instead. Running the phone at full brightness we could read approximately 7 different values without error. We used this to make our first working prototype.
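On the EV3 side, the ambient-light reading then has to be quantised back into one of the roughly 7 usable grey bands. A minimal sketch of that thresholding, assuming the EV3 reports ambient light as a 0–100 value and that the bands are evenly spaced (the real cut-offs would depend on screen brightness and room lighting):

```python
def decode_band(ambient_reading, n_bands=7):
    """Quantise an ambient-light reading (0-100) into one of n_bands
    evenly spaced bands (0 .. n_bands-1). Even spacing is an assumption;
    real thresholds would be tuned against the phone screen."""
    clamped = max(0, min(100, ambient_reading))
    return min(clamped * n_bands // 101, n_bands - 1)
```

Each band number would then be mapped to one drive command (forward, turn, stop, and so on).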
The 'A'-frame design of the robot was too hard to build, so we made a cube-like design with the top of the phone sticking vertically out the top. The robot had 3 states: forward, turn left on the spot, and backwards. This prototype gave us a lot of information. Firstly, the turning was far too fast for the one-second delay we were getting from the computer to the robot and back, but a quick software change fixed the problem. Despite the limited controls and the weak design of the robot, we managed to drive it 10 to 20 metres down the hallway. The one-second delay was a problem, but we didn't have enough time to lower it. The next problem was the slow speed. We tried adding gears, and while the speed increased we gained other problems, probably caused by the large gear ratio; instead of lowering the gear ratio, we changed the gears so it would be a 1:1 ratio.
The second prototype had more controls: we could now drive forward, turn right and left, and stop, and for our final design we added 2 colour sensors. We tried to have variable motor speed, but time was running short and we didn't have a variable input into Scratch apart from the mouse position. Instead we focused on usability with arrow-key controls. For example, when only one motor was going forward, that motor would run slower to give precise movement, but when both motors were in forward they would go at 100% speed.
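The arrow-key mixing described above can be sketched as a small mapping from held keys to motor powers. The reduced turning speed below is an assumed value; the original program lived in Scratch, so this is only a Python paraphrase of the logic:

```python
TURN_SPEED = 40  # assumed slower power for one-motor turns

def motor_speeds(held_keys):
    """Map the set of held arrow keys to (left, right) motor powers (%).
    Both motors forward run at 100%; single-motor turns run slower so
    the robot can be positioned precisely."""
    if "up" in held_keys:
        return (100, 100)
    if "down" in held_keys:
        return (-100, -100)
    if "left" in held_keys:
        return (0, TURN_SPEED)    # right motor only: pivot left
    if "right" in held_keys:
        return (TURN_SPEED, 0)    # left motor only: pivot right
    return (0, 0)                 # no key held: stop
```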
Our presentation went well and many people enjoyed controlling the robot however a lot of people also liked driving into other people.
We used Scratch for basic movement controls and keyboard input; Scratch was only used on the computer end of the system. Our robot was made with the LEGO Mindstorms building system and handles movement and colour detection from the phone. Joey's phone (linked with Skype) allowed us to visually see everything the robot does. Skype, together with the phone, allowed us to see, hear and respond to everything, including robot controls!
We chose to make a prototype robotic hand designed to assist people suffering from Parkinson's disease and other debilitating conditions that affect the motor function – in other words, the strength and dexterity – of the fingers and hand. An exoskeleton arm, or ExoArm.
And finally: while Parkinson’s is a terrible disease, and a functioning exoskeleton arm could be greatly beneficial, what finally pushed us to choose this project was how cool robotic hands looked!
Our final product was a basic exoskeleton hand with articulating fingers operated by the press of a button. However, as stated in the intro, this was a prototype; our device was clumsy and awkward, and it weighed a ton – it would require more strength and precision to operate the arm than it would just to pick something up. Whilst our prototype may not have any useful real world application, it is a good indication that with more time and better equipment a useful exoskeleton arm could be created.
Our robot is designed to help teachers, and other whiteboard users, to automatically clear a whiteboard of writing. Using magnets to hold the EV3 brick up, the robot is able to effortlessly glide across vertical surfaces and uses felt along the bottom to erase the whiteboard marker.
Xavier said, “Ooh… maybe we should do a whiteboard wiper.”
“No, that’s dumb,” said Jack, “we should do a Whiteboard Wischer™”
“Alright,” said Matthew.
We started the project by brainstorming ideas and solutions on a provided document. We designed a robot and got to work recreating it with LEGO and the EV3 brick.
Our initial prototype was a brick atop wheels with a bumper on the bottom which pushed the wiping device along the whiteboard.
We ran into a problem when trying to get the robot to climb up the vertical whiteboard. Using duct tape and magnets brought in by Xavier, we created a blanket that would stick the robot to the surface. Some complications arose from this design: we used so many magnets that the robot couldn't be moved using the wheels, and the magnets were so tall that they stopped the wheels from reaching the whiteboard.
After realising our last design suffered from an over-abundance of magnets, we decided to scrap the blanket idea and start over. This time, we chose to strategically place the magnets in places that would allow the robot to not only stay on the whiteboard, but move around as well.
Placing two of the stronger magnets next to the wheels kept the robot's wheels attached to the whiteboard while still letting it move around. Everything was working swell, until the magnets decided to disrupt the peace again. One of the magnets was getting stuck to the whiteboard instead of hovering over it, and this caused the robot to climb at an angle instead of driving sideways.
Next, we had to come up with a way to control the robot. There were two options: create a remote control, or automate the robot to move around the whiteboard. Obviously the second option was way cooler, so we chose to try to make an autonomous cleaning robot. We started by creating a simple program that moved the robot forward (creatively named ‘FORWARD’) and tested it on the whiteboard. The test was a success, so we moved on to harder programming.
Using white reflective tape, we could use a light sensor to… sense the change of light. If taped around the outside of the whiteboard, the robot could tell where the edge is. We placed light sensors on the back and front.
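The edge check itself reduces to comparing each sensor's reading against a threshold. The threshold value and the "reverse when the front sensor hits the border" behaviour below are assumptions about how such a program could react, not the team's exact logic:

```python
TAPE_THRESHOLD = 60  # assumed reading above which a sensor is over the tape

def next_move(front_light, back_light):
    """Decide the wiper's next move from its two light sensors.
    A reading above TAPE_THRESHOLD means that sensor has reached the
    reflective tape bordering the whiteboard."""
    if front_light > TAPE_THRESHOLD:
        return "reverse"    # front edge reached: back away
    if back_light > TAPE_THRESHOLD:
        return "forward"    # rear edge reached: drive forward
    return "continue"       # still on the open whiteboard
```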
Finally, we readjusted the magnets (again!), in order to place a proper wiper. We moved the magnets to the centre of the robot, and put felt over the top. The magnets pushed the felt on to the whiteboard allowing it to wipe instead of glide over the top.
Here is our final program.
What do you do when your alarm clock rips you out of your slumber in the morning? Hit the snooze button? Well, if you snooze you lose, and so we have put together an alarm that will get you out of bed whether you like it or not.
Our alarm will pierce your ears and then proceed to drive across the room out of your reach so that you must exit your bed and make it shut its mouth. We took inspiration from our personal experiences of falling into the temptation of smashing that snooze button.
From this project, we learned that it is foolish to have any alarm other than ours. If we had more time we would work on the programming in order to make the alarm easier to use for longer periods of time.
Our assistive device is a wearable robot that helps you navigate around an area if you are unable to see, whatever the reason. The device uses an ultrasonic sensor to tell how far away something is in front of you, and beeps when you get too close. The device also has a sensor at the end of a wand that you can point around, which produces a different sound when it's too close to something.
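The two-sensor warning logic can be sketched as below. The distance limits, the priority given to the wand, and the tone names are all assumed values for illustration; on the real device the readings would come from the EV3 ultrasonic sensors:

```python
# Assumed distance limits (cm) for the two ultrasonic sensors.
BODY_LIMIT = 50   # forward-facing sensor
WAND_LIMIT = 30   # sensor at the end of the wand

def warning_tone(body_cm, wand_cm):
    """Pick which tone to play, if any. The wand gets a distinct,
    higher-priority tone so the user can tell the two sensors apart."""
    if wand_cm < WAND_LIMIT:
        return "high_beep"
    if body_cm < BODY_LIMIT:
        return "low_beep"
    return None   # nothing nearby: stay silent
```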
The reason we chose to make this device is that we believe blindness is one of the worst disabilities you can have, and anything that can help, even slightly, is an amazing thing.
The device was very easy to build and simple to use, but with the limitations of the hardware it sometimes struggles in cluttered areas. However, in places like hallways the device works very well. Testing was done by navigating through the hallways at the college with eyes fully closed, and it was successful. An issue with the device is that a blind person might have trouble putting it on and operating it; if it were a real product, it would need to be very simple, and a lot smaller.
As it is such a simple device, I don't think there is much more we could have done, given more time, aside from tweaking some of the values that trigger the sounds and making it easier to operate blind.
This project is essentially a development of a robot we made last year, which started off as a game where you would be blindfolded, and had to aim a gun around until it started to shake. This was not possible, due to the limitations of the colour sensor, so we instead made it into something of an assistive device, that would shake when you got too close to things you aimed it at. Due to its size and that you had to hold it out in front of you, it wasn’t particularly practical.
The book holder eliminates the inconvenience of holding a book, allowing for automation in a previously unexplored area. This book stand allows readers to take drinks, read the selected page and take notes. It does this through a clamp system controlled via two motors. The clamps are adjustable, and obscure as little of the text on the pages as possible. It uses the back-board to hold the book at a thirty-five-degree angle, allowing for reading while sitting or standing.
We found inspiration from book stands on Amazon, allowing us to create an effective model prior to programming.
Large and heavy books, such as many maths or science textbooks, will close on themselves, and require constant attention from the reader to keep them open whilst taking notes or a drink. A book closing on its own is quite an inconvenience – one we wished to eliminate. The stand is particularly useful for cooking, as it keeps the book open on the recipe, saving you from accidentally overcooking something whilst scrambling to find how long it's meant to be cooked for.
Must allow for usability – Cannot get in the way of users.
Must hold the pages down without damaging the book (i.e. without applying too much pressure).
Must allow for other tasks to be performed whilst the book is being held by the stand.
We attempted to create a page-turning system through the use of wheels. It seemed to turn either no pages or too many, as well as getting in the way of the reader. It also appeared it would damage the pages if too much pressure was applied.
We also attempted to create an over-page clamp. The two sides could not work at the same time, due to the difference in motor types on the top and bottom: the two top sides were out of sync, and there was nothing we could change in the program to correct such a fault.
Have you ever just said “I don’t have enough hands to carry all this stuff”? Instead of walking back and forth carrying things, Butler Bot can hold and transport items for you.
Butler Bot is a device that gives an extra hand when moving around. From carrying a bottle to a bag across the room or through the car park, Butler Bot can help.
The inspiration behind our project was to reduce risk in the armed forces (a larger version of the robot would be required). Having this device in a dangerous situation could be life-saving, whether it be hiding behind the robot, the robot carrying your gear so you can tend to another person, or even carrying an injured person. This robot would also be useful in a household (smaller version required) to carry food, drink, clothing and other household items.
After conducting some research, we found a company named ‘Roboteam‘ based in the US that makes a large all-terrain vehicle, the ‘PROBOT‘, that can carry large amounts of cargo. The company also has other smaller and larger robots such as the ‘MTGR‘, ‘IRIS‘, ‘ROCU-7‘, ‘AI-CU‘, and the ‘TIGR‘. The TIGR also has a larger version coming in the near future.
The Butler Bot worked well until we tried to add a dispensing system to unload cargo, as the infrared beacon interfered. In the early stages we also wanted to run four independently controlled wheels, but this fell through, as the EV3 brick and the LEGO Mindstorms programming software would not let us run all four motors at the same time. There was a large learning curve when we started using the infrared sensor and beacon, as we had never used either before.
If we had another chance to redesign the robot, we would likely build a better wheel base with two motors instead of four; having four motors would be ideal, but with the restrictions of the EV3 it was not possible. If we had the materials we would also make it bigger.
Jackson Foggo – programmer/designer/presenter
Steve Reynolds – designer/builder/destroyer