This year’s North American IoT Tech Expo Hackathon, hosted by InvenSense, was a mix of physics, popular culture, and coding, with a healthy dab of cutting-edge sensor hardware. The objective was to capture a real-world gesture and translate it into throwing a ball to catch a Pokémon in the internationally popular Pokémon Go game. Contestants used InvenSense’s new SensorStudio starter kit to capture real-world gestures or readings.
Participants then had to translate the captured gestures, via processing and code, into the different aspects of a Pokémon Go “throw”. Pokémon Go, launched on July 6th of this year by Niantic, is the first popular mainstream augmented reality game. In the game, players flick the screen to throw a ball at a creature; the throw has a speed, spin, and force, represented by the movement of the ball and the player’s ability to capture the Pokémon. The 2016 IoT Tech Expo Hackathon participants, however, set out to replace the “screen flick” with a real-world 3D gesture of their choosing.
Participants in the IoT Tech Expo Hackathon were challenged to do three things: capture sensory input from a real-world gesture, translate that input in software, and produce a virtual output, the Pokémon Go “throw”.
The participants were expected to fulfill all three objectives, as creatively or as simply as they liked. Furthermore, judge and InvenSense Senior Software Engineer Helene Wiazemsky was quick to point out that any extra components (for example: extra spins in the Pokémon Go world, jumps, or capturing other gestures from reality) would be worth extra points and help determine the winner.
Figure 1. InvenSense Senior Software Engineer and Hackathon Judge Helene Wiazemsky presenting the Hackathon rules and win conditions
The Hackathon was a sprint of an IoT coding challenge. Participants had less than 48 hours to apply their processing and advanced mathematics skills to an embedded target (a Cortex-M4). Contestants approached the problem in teams of one or more and came from a variety of backgrounds, including students, IoT professionals, hobbyists, and those transitioning into IoT careers. The judges were four developers from the French office of InvenSense, ready to evaluate how the contestants applied their creation: InvenSense’s new SensorStudio starter kit.
Figure 2. Instructions and requirements presented to the contestants at the start of Day 1
The judges awarded four prizes, three finalist prizes and one first place prize, based on ability to complete the challenge, creativity, and approach. The finalists received Pokémon Go-themed prizes: either a Poké ball-shaped battery charger or a Pikachu-shaped charger. The first place winner received a Traxx drone with a 2-axis gimbal for aerial photo/video assistance, with Film, Sport, and Expert modes, valued at $399.
A new Poké throw for Pokémon Go
The problem at this year’s North American IoT Tech Expo Hackathon was trendy and surprisingly tricky. The challenge: to code a program that translates the real-world activity of moving or “throwing” the device provided by InvenSense into the augmented reality world of Pokémon Go. The participant’s throw should allow a Pokémon Go player to catch a Pokémon in the virtual world by moving in the real world.
The player’s in-game throw had to be captured by responding only to discrete, intentional real-world actions; unintended stimuli should not result in an unexpected “throw” or faulty motion. Furthermore, the contestants were encouraged to add spins, twists, or force to their throws to replicate the game-world moves necessary to successfully capture Pokémon.
Figure 3. Participants, developers, judges and bystanders looking over code on the morning of Day 2
The theme was by no means unusual or unheard of in this year of augmented reality, especially since Pokémon Go became an international hit earlier this summer. Translating real-world movements into a virtually augmented world could be the next step for the newest trend of augmented reality games and tools, built on IoT programs and devices. Michael Aquino, a first-time Hackathon contestant, approached the challenge as a physics problem. “It’s a cool challenge because it’s about Pokémon, but I think you have to be classical physicists to do these things… Here we are [thinking] about acceleration and gravity.”
An adaptable, “throwable” sensor
The InvenSense SensorStudio kit is made up of four components: the Nucleo-F411RE, the Carrier Board, the Sensors Daughter Board, and the ICM-20690 Daughter Board. Billed on the InvenSense website as the “sensor development platform” for the “Internet of Sensors,” the kit contains five sensors, a magnetometer, humidity sensor, proximity sensor, gyroscope, and accelerometer, plus a derived linear-acceleration output. All of the sensors support recording and replaying data and algorithm signals, which makes for repeatable development and test scenarios. The software is aimed at C/C++ developers and provides an extension API for creating sensor drivers and data fusion.
Figure 4. InvenSense GenericHub Development Kit composed of three boards, the ICM-20690 Daughter Board (blue), the Sensors Daughter Board (small, white) and the Carrier Daughter Board (large, white), mounted on a Nucleo-F411RE.
The hardware is designed to let users easily visualize sensor outputs in order to program the SensorHub MPU. The software, in turn, lets the user easily visualize the captured data run through the developer’s own algorithms, or in this instance the Pokémon in-game “throw,” including spins, twists, or special surprises. SensorStudio is designed for rapid development. The kit offered five potential sensors, and participants were free to use as many or as few as they wished. Most contestants used only two or three, generally limiting themselves to the most obviously applicable components: the accelerometer, gyroscope, and proximity sensor. The development software currently runs only on Windows.
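For readers curious what that record-and-replay workflow looks like in practice, here is a minimal, self-contained C++ sketch of the idea: log timestamped samples, then replay them through any processing routine so a gesture algorithm can be tuned without re-performing the throw. The types and names are illustrative placeholders, not the actual SensorStudio extension API.

```cpp
// Hypothetical sketch of a record-and-replay loop for repeatable testing.
// The types and function names below are illustrative, not the actual
// SensorStudio extension API.
#include <cstdint>
#include <iostream>
#include <vector>

struct SensorSample {
    uint64_t timestamp_us;   // sample time in microseconds
    float x, y, z;           // 3-axis reading (e.g., accelerometer, in g)
};

// Records a stream of samples so the same gesture can be replayed later.
class SampleRecorder {
public:
    void record(const SensorSample& s) { log_.push_back(s); }

    // Replays the recorded gesture through any processing callback, which
    // makes algorithm tuning repeatable without re-throwing the board.
    template <typename Callback>
    void replay(Callback process) const {
        for (const SensorSample& s : log_) process(s);
    }

private:
    std::vector<SensorSample> log_;
};

int main() {
    SampleRecorder recorder;
    // In a real session these samples would come from the accelerometer;
    // here they are hard-coded stand-ins.
    recorder.record({0,      0.0f, 0.0f, 1.0f});
    recorder.record({10'000, 0.2f, 0.1f, 1.4f});
    recorder.record({20'000, 1.8f, 0.3f, 2.6f});

    recorder.replay([](const SensorSample& s) {
        std::cout << s.timestamp_us << " us: " << s.x << ", " << s.y
                  << ", " << s.z << "\n";
    });
}
```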
Catching IoT from professionals to students
One contestant, Madhavi Chodankar, described the problem in simple terms: “We are using the integrated environment they have provided in the form of this SensorStudio software, and we are basically writing, using the visual aid tools that we have, to create.”
Chodankar, an independent IoT consultant, mom, and youth robotics class instructor, broke the problem down into an analysis of data mining. She said, “I think the creativity [in this challenge] is data mining the sensors, or the combination of which sensors that you use, to make [the project] possible. And each one of us thinks differently. So each one of us [participants] is using different sensors to make the final output.”
While presenting her final project, Chodankar described her method in a few steps. First, she used the proximity sensor as input so the program could establish whether the hardware was being held. Next, she used input from the gyroscope to identify the angle of rotation of the physical motion, which produced the rotation or spin of the Poké ball in her virtual “throw”. Finally, the accelerometer captured the motion, or speed, in the real world; its readings determined the force and trajectory of the Poké ball “throw”.
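As a rough illustration of that three-step pipeline, the sketch below gates everything on a proximity reading, maps gyroscope rate to spin, and maps accelerometer magnitude to throw force. It is a minimal C++ sketch under assumed units and thresholds, not Chodankar’s actual code or the SensorStudio API.

```cpp
// Minimal sketch of a proximity-gated throw pipeline, loosely following the
// three steps described above. Thresholds, units and names are assumptions,
// not values from the contestant's project.
#include <cmath>
#include <iostream>
#include <optional>

struct ThrowParams {
    float spin;   // derived from gyroscope rate around the throw axis
    float force;  // derived from accelerometer magnitude
};

// Step 1: only treat motion as a throw while the proximity sensor reports a
// small distance, i.e. the board is actually being held.
bool isHeld(float proximity_mm) { return proximity_mm < 50.0f; }

// Steps 2 and 3: turn gyro rate (deg/s) and acceleration (g) into a throw.
std::optional<ThrowParams> processSample(float proximity_mm,
                                         float gyro_z_dps,
                                         float ax, float ay, float az) {
    if (!isHeld(proximity_mm)) return std::nullopt;   // ignore stray motion

    float accel_mag = std::sqrt(ax * ax + ay * ay + az * az);
    if (accel_mag < 1.5f) return std::nullopt;        // below throw threshold

    ThrowParams t;
    t.spin  = gyro_z_dps / 360.0f;      // rough spins-per-second figure
    t.force = accel_mag - 1.0f;         // subtract roughly 1 g of gravity
    return t;
}

int main() {
    // A held, vigorous flick...
    if (auto t = processSample(20.0f, 540.0f, 1.2f, 0.4f, 2.3f))
        std::cout << "throw: spin=" << t->spin << " force=" << t->force << "\n";
    // ...and the same motion with nothing in front of the proximity sensor.
    if (!processSample(400.0f, 540.0f, 1.2f, 0.4f, 2.3f))
        std::cout << "not held, no throw generated\n";
}
```

Gating on the proximity sensor is one simple way to meet the requirement that unintended stimuli should not trigger a “throw”.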
Chodankar’s understanding of the hardware capabilities proved to be more than adequate as she went on to be awarded third place at the end of the competition.
Figure 5: Hackathon contestant and finalist Chodankar testing her project and demonstrating the proximity sensor input
Other contestants approached the problem in different ways. Michael Aquino, a recent coding bootcamp graduate, was encouraged to attend the Hackathon by his teacher. “We had a team at my bootcamp but I was the only one who showed up, I don’t know if that was smart or not.”
Using the frontend coding tool CodePen, he approached the Hackathon as an opportunity to apply his new skills. “I thought it would be mostly code… but it’s not about just writing code, [IoT] is working with hardware.” Aquino did find a teammate at the event, a recent IoT professional and mechanical engineering graduate, Brooke Nichols. Like Aquino, Nichols came to the event to learn and play with a new technology from the leading sensor hardware producer InvenSense.
The two did not present their findings in the final review of the projects, as they were, as Nichols put it, “stuck on the detection part.” However, they had a good time, and as a team the two were able to appreciate the hardware and software provided by InvenSense. Both contestants mentioned being excited to play with the system, and when questioned about his favorite part of the challenge, Nichols answered without hesitation: it was the visualization. “You can visually see your sensor data really fast, but there is a learning curve, because you have to know how to configure your inputs and outputs. So yes, it was very interesting.”
Open the Door to Go Play
Contestants measured input from different sensors to produce the motions of the Poké ball “throw” output, and not everyone made it to the finish. Multiple contestants did not present their findings: some were unable to isolate the input from different sensors cleanly enough to record the motion of the device consistently, and a few others arrived late and approached the Hackathon as a learning experience instead of a competition.
One contestant who entered the competition in the last hour used a unique sensor input. Instead of capturing the input of moving the hardware through space, mimicking a throw or tilt, Morte used a combination of sensors to recognize the input of human breath. His project captured input from a person breathing on the hardware, rather than the physical motion through space that every other presenter used. Morte was awarded the fourth prize for his creativity. But no matter how creative the other contestants were, there could only be one winner.
First place was awarded to a young developer and student, Fenil Parekh. Parekh describes IoT as his passion; he lives in a world filled with little problems and mechanisms to hack and understand.
Figure 6: Hackathon contestant and 1st place winner Fenil Parekh
“Whenever I see some kinds of mobiles or something, I just think about how these things work.” Laughing a little, he continued: “Actually I just want to get into this, and [figure out] how it works: the operating system and how it works.”
When questioned about his chances before the final presentations, Parekh answered without hesitation: “Yeah, 100 percent, I think I will win. I am so confident.”
Parekh’s winning project completed the primary goals set out by the judges in a clear manner. He described his project very simply, in three parts: curve, distance, and spin rate. He used the accelerometer, gyroscope, and linear accelerometer to capture the data. With the data captured in real time from the three sensors, he was able to create separate outputs for the trajectory, or curve, of the object through virtual space, as well as the distance and spin of the “ball”.
While presenting his final results, Parekh’s project delivered consistent measurements for the distance the Poké ball “throw” would travel in virtual reality, corresponding to the strength of acceleration detected by the linear accelerometer. Like finalist Chodankar, Parekh used input from the gyroscope to isolate the “spin” of his virtual Poké ball. He therefore fulfilled all the requirements set forth by the judges. But he didn’t stop there.
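One plausible way to produce such a mapping, sketched below in plain C++, is to integrate the linear acceleration over the throw window to estimate a release speed for distance, take spin from the peak gyroscope rate, and derive curve from the lateral component of the motion. The axes, scale factors, and names are assumptions for illustration; they are not details of Parekh’s winning implementation.

```cpp
// Hedged sketch of mapping linear-accelerometer and gyroscope data onto a
// virtual throw's distance, spin and curve. The integration scheme, axes and
// scale factors are assumptions, not details from the winning project.
#include <algorithm>
#include <cmath>
#include <iostream>
#include <vector>

struct MotionSample {
    float dt_s;        // time since previous sample, seconds
    float lin_ax;      // linear (gravity-removed) acceleration, m/s^2, forward
    float lin_ay;      // linear acceleration, m/s^2, lateral
    float gyro_z_dps;  // rotation rate around the vertical axis, deg/s
};

struct VirtualThrow {
    float distance;  // how far the ball travels in-game (arbitrary units)
    float spin;      // spin rate handed to the game (revolutions/s)
    float curve;     // signed lateral curve
};

VirtualThrow evaluateThrow(const std::vector<MotionSample>& window) {
    float v_forward = 0.0f, v_lateral = 0.0f, peak_rate = 0.0f;
    for (const MotionSample& s : window) {
        v_forward += s.lin_ax * s.dt_s;                 // integrate to speed
        v_lateral += s.lin_ay * s.dt_s;
        peak_rate  = std::max(peak_rate, std::fabs(s.gyro_z_dps));
    }
    VirtualThrow t;
    t.distance = 10.0f * v_forward;      // arbitrary speed-to-distance scale
    t.spin     = peak_rate / 360.0f;     // deg/s -> revolutions per second
    t.curve    = v_lateral / std::max(v_forward, 0.1f);  // relative sideways push
    return t;
}

int main() {
    // A short, mostly forward flick with a little sideways motion and twist.
    std::vector<MotionSample> flick = {
        {0.01f, 4.0f, 0.5f, 200.0f},
        {0.01f, 9.0f, 1.0f, 520.0f},
        {0.01f, 6.0f, 0.5f, 300.0f},
    };
    VirtualThrow t = evaluateThrow(flick);
    std::cout << "distance=" << t.distance << " spin=" << t.spin
              << " curve=" << t.curve << "\n";
}
```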
Figure 7: Parekh holding the hardware and showing it responding to his “closing the door” feature (bottom right)
Parekh designed a separate recognition system that identified a discrete twist of the device, to the left or right, to turn the application on or off. Or, as he put it, opening or closing the door on the game. “I completed the primary goal and I completed one bonus point too- open the door. The thing which we need to develop is a Pokémon [game]. So I have a program called open the door and close the door. So if you are going to play the game just open the door and go!”
When awarding the final prizes, the judges acknowledged that Parekh fulfilled the initial requirements of 1) capturing sensory input, 2) translating the sensory input, and 3) producing virtual output (the Pokémon Go “throw”), and did one more thing. Parekh used the SensorStudio hardware to capture a fourth discrete motion, the pivot of the sensor in his hand as pictured above, and produced a different output: starting or stopping the Pokémon Go application. In his final demo, Parekh showed that the primary input/output functions did not record data when he turned the hardware to “close the door”; with a turn of his wrist and an “open door,” the functions started again. Undoubtedly, the “open the door/close the door” feature won the day.
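A gesture gate along those lines could be sketched as follows: accumulate the twist reported by the gyroscope and, once a deliberate turn past a threshold is detected, flip a flag that enables or disables the rest of the throw pipeline. The axis, threshold, and class names below are assumptions, not details of Parekh’s code.

```cpp
// Rough sketch of an "open the door / close the door" toggle: a deliberate
// twist of the wrist past a threshold enables or disables gesture capture.
// Axis choice, thresholds and names are assumptions for illustration only.
#include <iostream>

class DoorGate {
public:
    // Feed gyroscope roll rate (deg/s) each sample; dt_s is the sample period.
    void update(float roll_rate_dps, float dt_s) {
        accumulated_deg_ += roll_rate_dps * dt_s;
        if (accumulated_deg_ > kTwistDeg) {          // twist right: open
            open_ = true;
            accumulated_deg_ = 0.0f;
        } else if (accumulated_deg_ < -kTwistDeg) {  // twist left: close
            open_ = false;
            accumulated_deg_ = 0.0f;
        }
    }

    // Throw processing only runs while the "door" is open.
    bool isOpen() const { return open_; }

private:
    static constexpr float kTwistDeg = 90.0f;  // how far the wrist must turn
    float accumulated_deg_ = 0.0f;
    bool open_ = false;
};

int main() {
    DoorGate gate;
    // Simulate a quick twist to the right (about 100 degrees over 0.2 s)...
    for (int i = 0; i < 20; ++i) gate.update(500.0f, 0.01f);
    std::cout << "after right twist, open = " << gate.isOpen() << "\n";  // 1
    // ...then a twist back to the left to close the door again.
    for (int i = 0; i < 20; ++i) gate.update(-500.0f, 0.01f);
    std::cout << "after left twist, open = " << gate.isOpen() << "\n";   // 0
}
```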
After the contestants shared their projects, judges made their final deliberations and the prizes were awarded. No matter who won or didn’t win, all the contestants succeeded in the one thing that they all came ready to do: learn. Well, learn and hack!