Sumo | Colin

‘Colin’ is our final prototype sumo robot, built for combat. Colin uses a colour sensor to detect the black line around the edge of the ring, preventing it from catapulting itself off the arena. In addition to the colour sensor, the robot uses an ultrasonic sensor, which let us make minor adjustments in the programming to tweak tactics between matches. A touch sensor was the final addition to the frame built around the brick, purely to give us a simpler way to start the robot when facing a time penalty, all while keeping the machine as lightweight as possible.

The restrictions imposed on the sumos affected our robot greatly. Weighing in as one of the heaviest competitors at 726 grams, we started each match with at least a one-second deficit, which derailed our plans entirely; we weren’t able to fight back in any of our games.

In regard to programming, the design was kept simple, focusing only on the ultrasonic and colour sensors to guide the robot around the arena. Despite our poor results (0–5, without winning a match), I believe we were not as far from success as the score suggests: the threshold for the colour sensor’s reflected-light reading was not set to the correct value, which fed wayward directions to the brick.
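As a sketch of what went wrong, the edge check reduces to a single reflected-light comparison. The readings and thresholds below are hypothetical stand-ins (we never recorded the real values, and the actual program was built from graphical EV3 blocks, not Python):

```python
# Illustrative sketch of the reflected-light edge check. The sensor values
# here are assumed, not measured: black tape reads low, the grey mat higher.

BLACK_LINE = 8     # hypothetical reflected-light % over the black boundary
GREY_MAT = 45      # hypothetical reading over the mat surface

def over_edge(reflected_light, threshold):
    """Return True when the reading indicates the boundary line."""
    return reflected_light < threshold

# A threshold between the two surfaces classifies both correctly...
assert over_edge(BLACK_LINE, threshold=25)
assert not over_edge(GREY_MAT, threshold=25)

# ...but a threshold set too high treats the mat itself as "edge",
# sending wayward directions to the brick, as described above.
assert over_edge(GREY_MAT, threshold=50)
```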



Disabled German & Mock 1 (Lucifer)

Remote controlled: We initially wanted to make an attacking-defending robot. The design resembled a ramp for a shield and a motor-powered arm for a sword, both mounted on a standard car base. The single arm later became four on an axle, resembling a Buddhist swastika. During the tournament this was mounted on the side, then changed to a large arm on the front. Both versions spun in an attempt to flip the other robots, and both worked to varying degrees. The Bluetooth remote controls were a large part of people’s horrendous matches, as the steering was imprecise, with minimal control once a robot had more than just movement controls.

Our initial Swiss Cheese

The automated robot was a heavily stripped-back version of the “Disabled German”: we initially wanted to study the winning entries of the competition. So we looked at the “Swiss Cheese”, Harry’s robot. It delivered on the fantasy of a large skirt that stopped other robots from moving and let it charge them. This was our initial take, before we realised we needed to rework the wheels further. So instead we looked to the true winner, and to Google Images. A flat design appeared to be the best idea, with the brick turned upside-down to take advantage of the slight tilt this gives the joints off the side.

Our original “Mock 1” – A version basically copying Joey’s design

The “Mock 1”, as dubbed by Steve, allowed us to elaborate on our own design whilst simultaneously keeping to a ‘winning’ layout. The sensor at the front let us detect robots ahead. We had issues detecting the ‘ring’ we fought on, but it worked after some time spent adjusting the range. We moved the light sensor to the front to detect the edge of the ring, letting the robot check whether it was about to go over the edge: drive forward, detect, turn, back off, repeat.

We did relatively well in the standings, strategically outsmarting many an opponent, falling short only against the robot we had attempted to remake. We sub-named it Lucifer because it weighed 666 grams.

Andrew & Joe

Lil’ Timmy

Lil’ Timmy is our automated sumo bot. It used an ultrasonic sensor, a colour sensor and a touch sensor: the ultrasonic sensor detected other robots, the colour sensor detected the edge of the ring, and the touch sensor started the robot.

How it works

Our robot would spin on the spot until the ultrasonic sensor detected another robot, then charge at it. If our robot was being pushed from the front towards the edge, it would detect the black line that marked the boundary and turn away. We then added a short section at the start of the code that made the robot turn 90 degrees, drive for one second, then turn back 90 degrees. We did this to try to stop other robots charging ours at the start, especially since ours was fairly heavy, meaning the others got a head start.
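The loop above can be sketched as one decision per pass, with the edge check always taking priority. This Python is only an illustration of the logic (the real program was EV3 blocks), and the distances and thresholds are assumed values:

```python
# Illustrative sketch of Lil' Timmy's control logic. The opening dodge and
# the per-pass decision are shown separately; all numbers are assumptions.

# Turn 90 degrees, drive for one second, turn back: the anti-charge opener.
OPENING_MOVE = [("turn_deg", 90), ("drive_s", 1.0), ("turn_deg", -90)]

def decide(distance_cm, reflected_light, seen_cm=40, line_threshold=20):
    """One pass of the main loop: edge safety first, then seek or charge."""
    if reflected_light < line_threshold:   # black boundary under the sensor
        return "turn_away"
    if distance_cm <= seen_cm:             # ultrasonic sees an opponent
        return "charge"
    return "spin"                          # keep scanning on the spot

assert decide(100, 50) == "spin"       # nothing in range: keep spinning
assert decide(25, 50) == "charge"      # opponent detected: attack
assert decide(25, 5) == "turn_away"    # the edge check beats everything
```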


This was the end design of our robot. The little arms at the front were there to hopefully get stuck in our opponent’s robot, preferably on the side, and push it off the edge.


This is how we started our code. It didn’t work because it was trying to do everything at once.


We then changed it to use a switch instead of just a loop, because we needed behaviour that was dependent on other inputs (sensor readings, etc.).


This was our final code. The part before the loop was put there so that our robot would turn out of the way at the start and avoid being charged at straight away.


Because our colour sensor was at the back, our robot ended up driving itself off the ring a few times. If we did this again we would put the sensor at the front or side, because there weren’t many times our robot was actually being pushed backwards.

Our robot was one of the heaviest, which meant our opponents always got a head start on it. This was one of the reasons we added the turn at the start of our code.

The ‘arms’ on our robot were more of a hindrance than a help most of the time. They came off fairly easily and didn’t really help push.

The range on our robot’s ultrasonic sensor was set too far, so the robot often drove itself off the edge after detecting people standing around the ring.

Will + Billy

The Robots of Death (ft. The Antenna of Death, and The Shovel of Death)


OVERVIEW: This project was to create a robot that fought and defeated other robots. We made a robot (The Antenna of Death) that would drive under the other contestants and lift their wheels off the ground. Our robot was designed to annihilate and humiliate the opponents’ robots in a battle to the death inside the sumo arena. Using wings to create a ramp, we were able to lift the other robots off the ground and run them out of the ring.

DESIGN PROCESS: We initially identified that it would be efficient to have the brick upside down, making the robot lower to the ground and providing more attachment options. We considered our options and went with the lowest design we could manage. We also wanted it relatively strong, so it could withstand a bit of bashing up. So we made a ramp at the front using the wings, as a leverage device to lift the opponents’ robots and push them out of the arena. We didn’t really have any prototypes; we just had fun building the robot from scratch and seeing it fight to the death.

THE SHOVEL OF DEATH

BACKSTORY: After suffering humiliating defeat at the hands of the Death Triangle, we swore to never be outdone by such a weak opponent EVER again!

CHANGES: Some new rule changes altered our design and prototyping process. A weight handicap was instituted, which resulted in the removal of the Antenna of Death’s antenna of death, and its really dope back panel. Also, the robot had to be AUTONOMOUS!

DESIGN PROCESS: Due to its weight problems (and its inability to perform in the ring), we stripped the entire robot back down to the EV3 brick. We brainstormed ways to defeat our inferior opponents, and came up with TORQUE. Using gears, we implemented a… gearing system onto the wheels to create extra power as we pushed the weak enemies out of the ring. Also, after realising the failings of the previous robot, we flipped the wings upside down to ensure that they went as low as they could go. We needed a way of stopping the robot from going off the edge, so we placed a sensor on the front. Once the sensor detected a change of light, the robot would go backwards a bit, turn, and go forward again. We went into the next battle with high hopes and dreams of a better future for us all.
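The TORQUE idea is just gear-ratio arithmetic: a reduction trades speed for pushing force in the same proportion. The tooth counts below are hypothetical examples, not the gears we actually fitted:

```python
# Gear reduction sketch: driving a larger gear from a smaller gear on the
# motor multiplies output torque by the tooth ratio and divides speed by
# the same factor. Tooth counts are illustrative; friction is ignored.

driver_teeth = 12   # gear on the motor shaft (assumed)
driven_teeth = 36   # gear on the wheel axle (assumed)

ratio = driven_teeth / driver_teeth   # 3:1 reduction

motor_speed_rpm = 120
motor_torque = 1.0                    # arbitrary units

wheel_speed_rpm = motor_speed_rpm / ratio
wheel_torque = motor_torque * ratio

assert ratio == 3.0
assert wheel_speed_rpm == 40.0   # three times slower...
assert wheel_torque == 3.0       # ...but three times the pushing force
```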

THE FIGHT: The Shovel of Death failed. It didn’t fall off the ring, but due to structural deficiencies we were defeated once again. The upside-down wings had eliminated any angle the Antenna of Death had, so it didn’t lift the opponents’ wheels off the ground at all. Instead, it gave the other robots an extended piece to lift in order to take OUR wheels off the ground. Also, the wings weren’t very well attached to the robot and were sometimes ripped off, causing us grievous pain. Once the wings were removed, our robot was susceptible to a front-on push that resulted in us being out of the ring. Additionally, when the wings were ripped from our existence, the sensor was knocked out of place and stopped working properly. The gearing system didn’t do as well as we planned and hoped it would: our robot was still less powerful than the others in a pushing battle. Overall, a fail in the name of SUMO ROBOT WRESTLING. We were so disgraced that we didn’t take any photos of the robot.


A terrible design by Matthew, Xavier, and Jack.

Jerry the Automated Sumo Bot

Prototype 1

Our original design was going to be a continuation of the remote-controlled bot from the first week of the challenge, with minimal changes to allow automation. The design was a medium-sized, relatively tall robot with a high centre of gravity and a remote-controlled scoop on the front – the idea being to drive under an enemy robot and lift their wheels off the ground, forklift style. Unfortunately, due to the one major constraint of the challenge – that we were limited to the components of a single EV3 kit – we were forced to use the small motor and a few cogs to actuate the scoop, in order to save the larger motors for the drive train, which rendered it ineffective due to its limited power. The other drawback of this design was made clear when pitted against the short yet wide Death Triangle, with its low centre of gravity and simplistic design: our robot was too unstable.


  • High centre of gravity
  • Weak scoop motor
  • Unstable


Prototype 1.5 was an intermediate design between prototypes 1 and 2. It never actually went to war, however it allowed us to gain insight into an alternative style of design.

Prototype 2

Whilst Prototype 1 was effective in contrast to many of the other robots in the competition, thanks to its sturdy construction and frame, the prospect of automation would open up a lot of opportunities for the other teams to improve their Sumo Bots, so we decided to redesign our robot from the ground up. We did away with the overcomplicated and ineffective scoop, and built a structurally sturdy frame onto a wide wheelbase, lowering the centre of gravity in order to replicate one of the more successful robots from the first round. The automation process involved the attachment of two sensors: an ultrasonic and a colour. The idea was that our robot would swivel until the ultrasonic sensor located an opponent, and then it would charge. The safety protocol to stop the robot driving out of the arena was the colour sensor: when the robot identified the black line that signified the edge of the ring, an algorithm in the program had the robot back up and turn around, ready for combat.
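The safety protocol can be sketched as a short timed sequence that pre-empts normal seeking. The durations and angle below are assumptions for illustration, not our recorded settings, and the Python is a stand-in for the graphical program:

```python
# Sketch of the edge-recovery algorithm: on seeing the black line, back up,
# swing around, then resume seeking. All timings/angles are assumed values.

def edge_recovery():
    """The move sequence run when the colour sensor sees the line."""
    return [
        ("stop", 0),           # kill the charge immediately
        ("backward_s", 0.5),   # reverse away from the edge (assumed 0.5 s)
        ("turn_deg", 160),     # swing back toward the arena (assumed angle)
    ]

def step(sees_line, seeking_moves=(("swivel", 0),)):
    """Pick the next moves: edge recovery pre-empts normal seeking."""
    return edge_recovery() if sees_line else list(seeking_moves)

assert step(True)[0] == ("stop", 0)     # line seen: recovery runs first
assert step(False) == [("swivel", 0)]   # otherwise keep swivelling
```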


  • Poor programming resulted in poor performance against other robots.
  • The off-centre placement of the light sensor resulted in the robot running itself out of the arena on numerous occasions, as it would not register the black line in time when approaching from one side.
  • The scoop was a little flimsy, which meant we could apply less immediate force to our opponents, as the scoop would simply flex.
  • The turning circle of the robot was a bit ridiculous, as its wheels were at the very back and it had a long scoop. This made it somewhat unwieldy to program effectively.

Prototype 3

We concluded that prototype two’s light sensor was in a sub-optimal position, off-centre on an arm. This meant our robot’s edge-of-ring detection only worked effectively when the robot approached the black line from one side. Approaching the line from the other side resulted in the robot falling off the table, as by the time the colour sensor was over the line, a critical mass of robot was already hanging out over the edge. To rectify this, we redesigned our scoop to incorporate the light sensor, centralising it. This was prototype three’s only major hardware addition; however, we completely overhauled the program to create more effective enemy detection. Whilst it still had the shoddy turning circle of prototype 2, its edge detection, scoop construction and programming were vastly superior.


Unfortunately our robot did not compete very well against the others, for a variety of reasons, but its most obvious flaw was the untested programming. Because we spent so much time troubleshooting and building the hardware, our first trial of the new program was in our first round of the tournament. With a bit more time we could have vastly improved it.

Mr Robot Sumo Competition 2018

The goal of this assignment was to create a robot out of Lego Mindstorms; the robot had to compete in a class sumo competition. The first tournament was for user-controlled robots and the second was for autonomous robots. Before we started building we decided to research what other designs looked like – no need to reinvent the wheel. We found this video, from which we quickly realised that a ramp was the way to go, and that the lower the centre of mass, the better a robot seemed to perform.

With our first robot we made sure to build a good ramp, getting the angle as low as possible while still keeping it secure. Overall we were quite happy with the quality, but in the end it was a bit tall. In terms of programming, we didn’t need to do any with the normal software, but we did experiment a lot with the controls in the EV3 app. We would have preferred to use simple buttons to control it, but for some reason the program didn’t allow this, so we had to use a joystick. It had a few delay issues, was sometimes unresponsive, and would lock in a position even when our hands were not touching the screen, which would cause the robot to drive out of bounds. If it weren’t for a bug in the semi-finals we would have had a chance at winning.

For our second robot we took inspiration from other designs that had the EV3 brick turned upside down. This gave access to the buttons on the brick, which we needed, and also allowed us to make the robot smaller overall and lower to the ground. We removed the small ball we had on the bottom of the first version, as we no longer needed it to reduce friction. We also didn’t need to put the ramp on an angle; instead, we just used the bottom of the ramp as the contact point for the front of the robot. This increased the overall build quality and stability, and it lowered the centre of mass. Throughout the competition we did modify the ramp slightly, but it didn’t make a difference, so we reverted to the original design in the end.

In the second competition we did need to make the robot autonomous, so we added sensors and a program. We used an ultrasonic sensor for sensing the other robots, a colour sensor for sensing the black line, and a touch sensor to make the program easier to start. For the program, I started off by just using lots of IF and Loop blocks. Overall it was quite messy, and it had issues: it would not turn smoothly, and once it had seen another robot it wouldn’t actually drive forward, because of the order in which I had set up the sensor blocks in the program.
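The block-ordering bug can be reproduced in a few lines: if branches are checked in the wrong order, a catch-all branch can shadow the "robot seen" branch so the charge never runs. This is a Python stand-in for the graphical blocks, with the branch names invented for illustration:

```python
# Why block order mattered: branches are tried in order and the first match
# wins, so a catch-all "keep turning" branch placed first makes the
# ultrasonic "charge" branch unreachable.

def run(branches, sees_robot):
    """Evaluate (condition, action) pairs in order; first match wins."""
    for cond, action in branches:
        if cond(sees_robot):
            return action

always = lambda _: True   # the catch-all "keep turning" branch
sees = lambda s: s        # the ultrasonic "robot ahead" branch

buggy_order = [(always, "turn"), (sees, "charge")]   # charge unreachable
fixed_order = [(sees, "charge"), (always, "turn")]

assert run(buggy_order, True) == "turn"    # saw a robot, still just turns
assert run(fixed_order, True) == "charge"  # reorder the blocks: fixed
assert run(fixed_order, False) == "turn"
```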


For the second version of the program, I found out that I didn’t need so many loops, and overall I cleaned up the program. I also added a charge to make sure it would push the other robots off the edge.

By Micah, Tom and Harry




Remote-Control Claw

The final prototype remote and claw

The Remote-Controlled Claw is an assistive device designed for people who may have trouble with mobility or with operating complex machines, such as the elderly. The robot uses a simplified remote control for ease of use, allowing the user to drive the robot into places that would not be reachable with a regular human hand.

The robot has a motor fixed to each side, with a touch sensor on the remote control to work the claw on the front. All of these functions are made possible by connecting the two bricks via Bluetooth, with extensive testing and development to enhance the machine’s abilities. We believe a robot such as this could be a success on the commercial market, as simple, lightweight assistive devices are lacking for those who need them.

As the controls are very basic, anyone who uses the robot should be able to operate it without much trouble. The only downside is that there are no guides to tell the operator whether the claw is lined up correctly to pick up the surrounding object. A sound or flash of light to signify this might have helped the owner use the claw more easily, although there was a clear plan not to over-complicate it, given what the target audience would be if it were put up for commercial sale.

Top- Remote / Bottom- Claw

Group Members – Calen & Daniel

T.P.R (Tele-Presence Robot)

First Hardware Design

T.P.R (Tele-Presence Robot) is a unique robot that can be controlled remotely over Skype from any location. It is capable of almost anything, from surveillance and communication to mobility – reaching smaller areas – and carrying various equipment and items. Our robot (with a couple of attachments) could perform almost any task.

Our robot will be able to solve the problem of limited surveillance. We chose this problem because solving it could be useful and effective: it could keep houses and stores safer from being robbed, and could also help keep streets safer by placing these robots in small areas where other cameras can’t see or can’t be installed.

T.P.R will allow you to attend meetings even if you’re overseas, or to quickly tell or warn someone in your home that the cake is burning.

T.P.R will give people who are bedridden the ability to go exploring anywhere from the safe comfort of their beds. This will not only boost morale but also help them feel included in the adventures of family and friends.

First Prototype

Our first system design was to use Python to read key presses and display a colour on screen. Using Skype’s screen share we could display that colour on the screen of the phone. The phone would be attached to the robot with a colour sensor looking at the screen. The EV3 could then read the colour of the screen and drive the motors depending on the colour.

In the first test we realised the EV3 colour sensor did not work with ambient colour, because it senses colour by flashing a red light and reading the reflected light value, then green, then blue; from the light bouncing back it determines the colour of the surface it is looking at. So we switched to different shades of grey, using the ambient-light mode of the colour sensor. Running the phone at full brightness, we could read approximately 7 different values without error. We used this to make our first working prototype.
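The grey-level signalling amounts to simple quantisation: the computer shows one of seven grey levels, and the EV3 maps its ambient-light reading to the nearest band. The command names and the 0–100 scale below are assumptions about how such a scheme could be divided, not our actual mapping:

```python
# Encoding ~7 commands as shades of grey. The screen shows one of 7 evenly
# spaced grey levels; the EV3 decodes a (noisy) ambient reading by picking
# the nearest band. Command names and levels here are illustrative.

COMMANDS = ["stop", "forward", "backward", "left", "right", "claw", "spare"]
LEVELS = [round(i * 100 / (len(COMMANDS) - 1)) for i in range(len(COMMANDS))]
# LEVELS == [0, 17, 33, 50, 67, 83, 100]

def decode(ambient_reading):
    """Map a noisy light reading to the nearest grey band's command."""
    nearest = min(range(len(LEVELS)),
                  key=lambda i: abs(LEVELS[i] - ambient_reading))
    return COMMANDS[nearest]

assert decode(0) == "stop"
assert decode(52) == "left"    # tolerant of a couple of points of noise
assert decode(98) == "spare"
```

Spacing the bands evenly maximises the noise margin between adjacent levels, which is why roughly seven values was the practical limit at full screen brightness.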

First System design

The ‘A’-frame design of the robot was too hard to build, so we made a cube-like design with the top of the phone sticking vertically out of the top. The robot had three states: forward, turn left on the spot, and backwards. This prototype gave us a lot of information. Firstly, the turning was way too fast for the one-second delay we were getting from the computer to the robot and back, but a quick software change fixed the problem. Despite the limited controls and the weak design of the robot, we managed to drive it 10 to 20 metres down the hallway. The one-second delay was a problem, but we didn’t have enough time to lower it. The next problem was the slow speed. We tried adding gears, and while the speed increased we gained other problems, probably caused by the large gear ratio; but instead of lowering the gear ratio we changed the gears back to a 1:1 ratio.
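The turning problem was really a latency problem, which one line of arithmetic makes clear: with a one-second round trip, the robot keeps turning for the whole delay after a command, so heading overshoot is roughly turn rate times delay. The turn rates below are assumed figures for illustration:

```python
# Latency vs turn rate: heading overshoot ~= turn_rate * round_trip_delay.
# The delay was observed (~1 s over Skype); the turn rates are assumptions.

delay_s = 1.0            # observed command round-trip delay

fast_turn_dps = 180      # assumed original turn rate, degrees per second
slow_turn_dps = 30       # assumed rate after the quick software change

assert fast_turn_dps * delay_s == 180.0   # half a revolution of overshoot
assert slow_turn_dps * delay_s == 30.0    # small enough to correct for
```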

Second Prototype

Second Prototype

The second prototype had more controls: we could now drive forward, turn right and left, and stop, and for our final design we added two colour sensors. We tried to implement variable motor speed, but time was running short and we didn’t have a variable input into Scratch apart from the mouse position. Instead we focused on usability with arrow-key controls. For example, when only one motor was driving forward, that motor would run slower to give precise movement, but when both motors were in forward they would go at 100% speed.

Our presentation went well and many people enjoyed controlling the robot however a lot of people also liked driving into other people.

Final System

We used Scratch for basic movement controls and keyboard input; Scratch ran only on the computer end of the system. Our robot was built with the Lego Mindstorms system and handles movement and colour detection from the phone. Joey’s phone (linked with Skype) allowed us to see everything the robot does; Skype let us see, hear and respond to everything, including the robot controls!


By Josh F, Caleb G, Joey N



We chose to make a prototype robotic hand designed to assist people suffering from Parkinson’s disease and other debilitating conditions that affect the motor function – in other words the strength and dexterity – of the fingers and hand. An exo-skeleton arm, or ExoArm.

The completed ExoArm, ready to assist in hydration.

We chose this disease and solution because:

1) Parkinson’s disease inhibits movement. It is caused by nerve cells in the brain malfunctioning.

2) Parkinson’s can affect the hands to the point of them not being useful for everyday tasks, which can have all sorts of negative repercussions throughout the remainder of one’s life.

3) A potential solution to this is the creation of an artificial ‘exoskeleton arm’, to assist with hand mobility and stop the disease interfering with the everyday lives of patients.

4) This concept could also be used to assist with other motion-inhibiting diseases, conditions and injuries. For example, a quadriplegic person who still has some control of their fingers would greatly benefit from such a device.

And finally: while Parkinson’s is a terrible disease, and a functioning exoskeleton arm could be greatly beneficial, what finally pushed us to choose this project was how cool robotic hands looked!


Our final product was a basic exoskeleton hand with articulating fingers operated by the press of a button. However, as stated in the intro, this was a prototype; our device was clumsy and awkward, and it weighed a ton – operating the arm would require more strength and precision than simply picking something up. Whilst our prototype may not have any useful real-world application, it is a good indication that with more time and better equipment a useful exoskeleton arm could be created.


1) Diavo Voltaggio – a LEGO enthusiast who built a robotic arm for Brick Fair. We were inspired by his three-finger design, and by just how good his hand looked. His YouTube channel is available online.

2) Builderdude35 – a LEGO YouTuber whose anti-stalling program we implemented in our device.
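The anti-stalling idea we borrowed can be sketched as comparing motor encoder readings over time: if the motor is powered but the shaft has stopped turning, the finger has met an obstruction and should stop driving. This Python is a stand-in for the graphical EV3 program, and the thresholds are assumptions:

```python
# Anti-stall sketch: a motor that is powered but whose encoder count has
# barely changed since the last check is stalled against an object, so
# power should be cut rather than grinding the motor. Values are assumed.

def stalled(prev_degrees, curr_degrees, power, min_delta=2):
    """True when power is applied but the shaft has effectively stopped."""
    return power != 0 and abs(curr_degrees - prev_degrees) < min_delta

assert not stalled(100, 110, power=50)   # turning freely: keep going
assert stalled(110, 111, power=50)       # powered but stuck: back off
assert not stalled(110, 110, power=0)    # an idle motor isn't a stall
```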


Given more time and resources, there are a number of upgrades we would like to make:

1) It would be cool to give the fingers joints (using complex pivoting mechanisms as opposed to additional motors), more accurately replicating the human hand.

2) We would add a fourth finger, once again more accurately replicating the human hand; however, for this we would need a second brick, greatly increasing weight and complexity.

3) We would implement pressure sensors instead of touch sensors, allowing finer control of the fingers: the harder the button is pressed, the faster the finger closes.

4) We would refine our attachment mechanism, resulting in a tighter, more comfortable fit on the arm, and a reduction in weight and bulk.

5) We would refine the program to speed up the finger closing action, and to upgrade the anti-stalling function of it, allowing the hand to grip objects with more strength.
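The pressure-sensor upgrade (item 3 above) amounts to simple proportional control, which can be sketched in a few lines; the 0–100 pressure scale and linear mapping are illustrative assumptions, not a measured design:

```python
# Proportional finger control sketch: motor speed scales with how hard the
# pressure sensor is pressed, replacing the all-or-nothing touch sensor.
# The 0-100 pressure scale and linear gain are assumptions.

def finger_speed(pressure_pct, max_speed=100):
    """Map pressure (0-100) linearly onto motor speed (0-max_speed)."""
    pressure_pct = max(0, min(100, pressure_pct))   # clamp noisy readings
    return max_speed * pressure_pct / 100

assert finger_speed(0) == 0        # no press, finger stays put
assert finger_speed(50) == 50.0    # half pressure, half speed
assert finger_speed(200) == 100.0  # out-of-range reading clamped to full
```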



Program for one finger (This was repeated 4 times)


Program for custom block ‘Finger’ (used within the program for each finger movement)

Team: Barney Russell, Benjamin Bruce and Kindilan Hayes.