Design + Innovation Day Award-winners: Department of Electrical and Computer Engineering

Smart lithium battery charger
A smart lithium battery charger design, one of five winning Design + Innovation Day projects out of the UBC Department of Electrical and Computer Engineering.

Design + Innovation Day is UBC Engineering's annual showcase for student engineering design projects. Carried out in small groups over the previous academic year, the projects enable students to improve their technical, teamwork and management skills while designing solutions to real-world problems.

The most outstanding projects of the year are recognized with UBC Applied Science Faculty Awards. We are pleased to feature the winning teams, selected by their program instructors, on the UBC Applied Science website in the coming weeks.

This week we are spotlighting the top five teams from the UBC Department of Electrical and Computer Engineering. Each demonstrated "overall project excellence" based on "technical achievements and the engineering process".


Automated AI Photogrammetry Apparatus

Team members: Estalin Alvarez, Robert Bradley, Sam Bedry, Seth Whalen, Tarryn MacPherson 
Community partner: UBC Studios


Our project

Our project automated the photogrammetry process, taking a series of standard photos from a camera and combining them into a 3D model.

UBC Studios' current photogrammetry system can create 3D models; however, it is limited to small objects that can be placed on a turntable. To improve this system, we designed and simulated a mobile photogrammetry robot. Utilizing its free ranging wheels and vertical chain drive, the robot can scan large objects such as furniture. Additionally, the system is automated to reduce workload and time spent creating the 3D scans. The robot autonomously determines photo locations, uses VR tracking to orient the camera and takes a photo of the object at each location.
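The photo-location planning described above can be sketched as evenly spaced viewpoints on a cylinder around the object, with the wheels supplying the horizontal circle and the chain drive supplying each height. The function name and numbers below are a hypothetical illustration, not the team's actual algorithm:

```python
import math

def photo_positions(radius, n_angles, heights):
    """Evenly spaced camera viewpoints on a cylinder around the object.

    The robot's free-ranging wheels cover the horizontal circle;
    the vertical chain drive moves the camera to each height.
    """
    positions = []
    for z in heights:
        for i in range(n_angles):
            theta = 2 * math.pi * i / n_angles
            positions.append((radius * math.cos(theta),
                              radius * math.sin(theta),
                              z))
    return positions

# e.g. 8 angles at 3 heights -> 24 photo locations around a piece of furniture
poses = photo_positions(radius=1.5, n_angles=8, heights=[0.3, 0.8, 1.3])
```

In practice the spacing would be tuned so that adjacent photos overlap enough for the photogrammetry software to match features between them.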

Our inspiration

We chose this project because we were all interested in some aspect of robotics, either the hardware or the software that controls them. We were also moved by the real-world problem that this project was attempting to solve: enabling more digital learning methods using 3D models to aid faculties and students that rely on analyzing physical objects, such as bones or antiques, for academia.

Our biggest challenge

As the COVID emergency limited our access to manufacturing equipment and materials, we soon realized that we would be unable to construct a physical prototype of our robot. We therefore decided on the more unconventional approach of designing our project entirely in simulation software. The project's scope was also a continuous concern, since the work spanned multiple distinct sections, but we successfully delegated each section according to our individual aptitudes.

What excited us most

Broadly speaking, it was just an awesome idea! We got to design a robot with a wide range of functionality that proved to be a rewarding technical challenge. It was awesome to take a lot of things we have seen in classes and apply them to something, as well as learn a lot along the way. More specifically, what continuously motivated us was the opportunity to develop a technology that will be able to materialize virtually any object into a digital model and the possibilities this opens for online learning and future VR technologies.

The most interesting/surprising thing we learned

The most surprising thing we came across was that we were able to create a fully functional virtual model, one that met our client's requirements and expectations, solely through simulation software.

Another aspect we found interesting, compared to the other project courses we've had, was the flexibility. These projects offer so many options, nearly infinite possibilities, that it's easy to get locked into interdependent decision-making loops. It was nice to see the group move, flow and adjust based on decisions we made together.

Our project's future

The next steps for this project would be to build a physical model and work on integration and testing of sections of the robot that we only tested through simulation. More work could also be done for the body, in particular working on the placement of all the components within the frame and expanding the wheel control system and communication protocols. The image capture process can also be improved by adding more user inputs. Doing so would make the image capture process more customizable depending on the object being scanned. 


Hand Gesture-Controlled Robot Pet

Team members: Charley Johnston, Dianna Kan, Steven Song, Danni Zhao, Sunny He
Community partner: Huawei


Our project

Consider having a pet that is smart, responds to your commands without training and even takes pictures for you! You don't even have to pick up after or buy kibble for this pet! This is a dream come true for those craving an animal companion but who can't afford the time, money or maintenance a real animal requires. Robot pets are one example of how technology can assist in scenarios where companionship and fun are needed but the handling and upkeep of real pets isn't possible, especially in health care or seniors' homes.

Apart from doing normal pet things such as moving around and performing a few tricks, the robot can also perform tasks like capturing photos with the camera in its nose. It can also communicate its mood through animated eyes on a colour LCD screen mounted on its head.

Through its camera "eyes", this robot pet uses a combination of machine learning and computer vision to see and respond to commands given through its owner's hand gestures. The robot pet can track its user, recognize a gesture and carry out a task such as following the user while capturing video. The user can thus direct the pet from a distance while the robot executes each task autonomously.
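At the software level, turning a recognized gesture into a pet action amounts to a lookup gated by the model's confidence. A minimal sketch; the gesture labels, actions and threshold here are hypothetical, not the team's actual command set:

```python
# Hypothetical gesture labels (as a vision model might emit them)
# mapped to pet actions.
ACTIONS = {
    "open_palm": "stop",
    "thumbs_up": "take_photo",
    "point_left": "turn_left",
    "point_right": "turn_right",
    "fist": "follow_user",
}

def dispatch(gesture, confidence, threshold=0.8):
    """Ignore low-confidence predictions; otherwise look up the action."""
    if confidence < threshold:
        return None
    return ACTIONS.get(gesture)

dispatch("thumbs_up", 0.93)  # -> "take_photo"
dispatch("thumbs_up", 0.40)  # -> None (below threshold)
```

Gating on confidence keeps the robot from twitching on every noisy frame, at the cost of a slight delay before it reacts.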

Our inspiration

We were inspired to pursue this project largely because of the machine learning the robot pet uses to recognize human faces and hand gestures, as well as the fun, entertaining nature of the project. With machine learning applications on the rise, we were excited to gain hands-on experience by integrating the technology into an ECE project. Many machine learning developments serve critical applications: algorithms in health care that detect cancer cells, for example, or machines in waste-management facilities that sort recyclables to protect the environment and keep landfills from filling up. We were drawn to this project partly because it was a chance to bring machine learning into something that could be part of the average person's everyday life. We hoped to show that machine learning can be used by anybody with a little programming experience, not just in crucial applications or by big companies.

Our biggest challenge

The biggest challenge we faced was integrating the robot's many subsystems while still having it respond quickly to hand-gesture commands. Within just a couple of seconds, the robot needed to recognize a hand gesture, update its LCD screen with the appropriate animation, move forwards, backwards or spin around using the motors attached to its wheels, or take pictures with its camera. We changed the robot's design several times and made many optimizations to achieve this response time.

What excited us most

For us, the most exciting thing about this project was the experience we gained utilizing machine learning. We included several machine learning algorithms which allowed the robot to recognize different hand gestures, as well as adjust the robot’s camera position to track the user’s face by moving the robot’s camera “eyes” using servo motors. We were thrilled to gain some experience using machine learning, as it’s not taught in the core ECE curriculum.

The most interesting/surprising thing we learned

One of the most surprising things we learned through this project is that Huawei has its very own deep learning framework, similar to TensorFlow, called MindSpore. Most people probably know Huawei for their smartphones, but it turns out they are developing a lot of projects using machine learning and AI. One of Huawei’s goals is to redefine user experience with AI by making it more personalized for people in all aspects of their life.

Our project's future

We hope that our robot pet project inspires people to think about how machine learning and robots can solve problems in ways that haven’t been considered before, as well as inspire people to create projects of their own. For example, we hope that our robot pet can help provide companionship to people who are unable to have pets. As well, we hope that smart toys such as our robot pet may be able to spark some passion in technology, electronics and machine learning in kids (or adults!) who play with them. Our project is open source on Huawei’s GitHub, so anyone who wants to is able to improve our robot by making it more pet-like or adding features.


Lab-in-a-Pack

Team members: Ray Allen, Richard Chang, Renz Patrick Angeles, Matthew Schwab, Victor Sira
Community partners: ECE professor Dr. Sudip Shekhar and Intel Labs' Rajesh Inti (individuals not representing their respective organizations)


Our project

Undergraduate ECE lab equipment has remained virtually unchanged over the last few decades, yet its cost has grown prohibitive for many. In addition, the COVID-19 pandemic led to the global curtailment of on-site lab activities. Both issues have made hands-on electronics work difficult.

To combat both these problems, the team designed the Lab-in-a-Pack: a small, affordable device that replaces four common pieces of lab equipment: an oscilloscope, multimeter, signal generator and variable power supply. The capable hardware array in tandem with the flexible software stack makes the Lab-in-a-Pack an indispensable tool for anyone interested in electronics. Its extensibility combined with its low price will allow for important problems to be solved at a fraction of the cost compared to traditional equipment.

Our inspiration

We look back on our ECE education and remember the many late nights spent in the labs — if only the Lab-in-a-Pack had existed back then, we could have been home much earlier. Reflecting on our own experiences, we were motivated by an earnest desire to make an impact on future ECE students.

Our biggest challenge

Developing and testing hardware prototypes without access to a lab is a challenging endeavour. It is also especially challenging to debug circuits individually without the in-person help of your team.

What excited us most

Despite the challenges, our team was motivated by the fact that the Lab-in-a-Pack is a device that can solve many of the problems we faced working on our project remotely. We are also quite excited to use it ourselves!

The most interesting/surprising thing we learned

We were surprised by the number of companies working on portable lab equipment, and the extremely different approaches each one took.

Our project's future

We’re looking into further developing the product, to bring the goal of affordable, portable lab equipment one step closer.


Smart Lithium Battery Charger

Team members: Grant Andersen, Arslan Bhatti, Will Ries, Igor Vuckovic, Yingrui Yang
Community partner: GlüxKind Technologies


Our project

Our smart lithium battery charger is designed to cater to a self-driving stroller application, where the end user will want to be able to select fast charging if they are in a rush, or slow charging when they want to keep the batteries healthy.

Our battery charger is designed to function alongside a wireless user interface that can be used on any laptop or desktop capable of a Bluetooth connection. This lets a GlüxKind engineer retain full control over how the batteries are charged, which depends on battery type, capacity and voltage.

In the future, once the design of the self-driving stroller has been finalized, GlüxKind will modify our interface for the end user so that they can simply choose if they want fast charging or slow charging.
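The fast/slow choice essentially selects the C-rate for the charger's constant-current phase. A minimal sketch; the C-rates below are illustrative assumptions, not GlüxKind's actual values, which would come from the cell manufacturer's datasheet:

```python
def charge_current(capacity_ah, mode):
    """Constant-current setpoint for a pack of the given capacity.

    Assumed C-rates for illustration: fast ~1C, slow ~0.3C
    (gentler on the cells, extending their cycle life).
    """
    rates = {"fast": 1.0, "slow": 0.3}
    return capacity_ah * rates[mode]

charge_current(5.0, "fast")  # 5.0 A for a 5 Ah pack
charge_current(5.0, "slow")  # 1.5 A
```

A real lithium charger would follow this constant-current phase with a constant-voltage phase, tapering the current as the pack approaches full charge.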

Our inspiration

The opportunity to work on a power electronics project with a large emphasis on adaptability, especially regarding the end-user market, and adding a large interactive component over a wireless interface.

Our biggest challenge

Due to COVID we found it especially hard to coordinate as a team in terms of the hardware. Testing our printed circuit boards (PCBs) was especially difficult, as safety restrictions and procedures had to be created, approved and followed. Additionally, we had to gather a lot of information from our client about the facility where we would be working, develop safety procedures to follow, and get approval from our instructor and TA.

What excited us most

In our power electronics course, ELEC 451, we learned about the basics of DC-DC converters and the theory behind them. This project allowed us to explore the practical applications of what we learned in that course and to verify it for ourselves.
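One of those basics is easy to state: an ideal buck (step-down) converter in continuous conduction satisfies D = Vout / Vin, where D is the switching duty cycle. A minimal sketch of that relation; the 12 V supply and two-cell pack voltage below are illustrative numbers, not the team's actual design:

```python
def buck_duty_cycle(v_in, v_out):
    """Ideal (lossless, continuous-conduction) buck converter: D = Vout / Vin."""
    if not 0 < v_out <= v_in:
        raise ValueError("a buck converter can only step the voltage down")
    return v_out / v_in

# e.g. regulating a 12 V supply down to 8.4 V (a full 2-cell lithium pack)
buck_duty_cycle(12.0, 8.4)  # 0.7
```

Real converters run at a slightly higher duty cycle than the ideal value to cover switch and inductor losses, which is part of what the project let us verify in practice.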

The most interesting/surprising thing we learned

At the beginning of our project, our client had a good idea of what they wanted in terms of overall design in order to meet all requirements. In the end, we met the criteria for the project using an altogether different design.

Our project's future

Additional features and finer control could be offered through the wireless user interface, such as the ability to save charging settings for later; these were outside our project scope.

A few aspects of our hardware could be optimized to increase efficiency and/or reduce the size of the product.


Xbox One Arm Cycle Adaptive Controller

Team members: Scott Beaulieu, Edward Luo, Keith Consolacion, Fabian Lozano, Nicholas Winship 
Community partners: Physical Activity Research Centre (PARC), UBC School of Kinesiology, Makers Making Change, Microsoft


Our project

Individuals with physical disabilities often have limited options to be physically active. A popular form of cardio exercise for people with Spinal Cord Injuries (SCIs) is "hand cycling". At the Physical Activity Research Centre (PARC), users with SCIs can engage in many physical exercise activities while building social connections within their community. 

With the ongoing COVID-19 situation and the initial closure of PARC, it was clear that there was a need for a new home-based exercise device for individuals with SCIs. The goal of this capstone project was to develop an Arm Cycle Controller compatible with the Xbox Adaptive Controller for Xbox One and Series consoles and Windows 10 PCs. The final product aims to bring an inexpensive and engaging exercise experience into the end user's home, one that can foster community through online engagement.

Our solution is an electromechanical adapter that adds controller functionality to a commercial mini-exercise bicycle. The device achieves this by sending control signals derived from encoder and inertial measurements to an Xbox Adaptive Controller through audio cables. These signals are interpreted as joystick and trigger inputs which allows for racing games to be played using an Xbox One Series console or Windows 10 PC. 
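Conceptually, the signal chain maps pedalling cadence onto a normalized joystick axis before it is emitted as an analog level on the audio cable. A minimal sketch; the function name and 90 rpm ceiling are hypothetical, not the team's actual calibration:

```python
def cadence_to_axis(rpm, max_rpm=90.0):
    """Map encoder-measured cadence to a joystick axis value in [0, 1].

    Values are clipped, so pedalling past max_rpm simply holds full
    throttle; negative readings (e.g. encoder noise) clamp to zero.
    """
    return max(0.0, min(1.0, rpm / max_rpm))

cadence_to_axis(45.0)   # 0.5 -> half throttle
cadence_to_axis(120.0)  # 1.0 -> clipped to full throttle
```

Clipping at the top end keeps a sprint from overdriving the input, while the inertial measurements mentioned above could feed a second axis for steering in the same fashion.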

Our inspiration

The common drive amongst our team members would have to be a desire to give back to the community at large. It felt like a project where we could realistically create a product that would brighten lives.

Our biggest challenge

Our biggest challenge would definitely be how much in-person work our project required. We had to design the entire project before meeting in person at our client's facilities, in the last several weeks of the course, to properly assemble it.

What excited us most

It would have to be the first time we saw everything working. It involved lots of jumping in excitement and laughing when it first worked. I think it is a fond memory for all of us on the team.

The most interesting/surprising thing we learned

That you can quite quickly re-wire a TRRS audio jack into two TRS audio jacks!

Our project's future

The project will continue to be iterated on and upgraded by teams working with PARC, who will work towards a finalized product that is affordable, easy to build and available to persons with spinal cord injuries.
