Hi, I'm Rithinteja Aechan, an undergraduate student at the Georgia Institute of Technology.

Below are some projects I have worked on over the past few years; feel free to check them out! (Click the arrow at the top right of each box to open the project.)

Inexpensive Glasses for the Blind and Visually Impaired Using AI and OpenCV


Research Paper

Final Product

Project Video

About:

Led a research team to develop a pair of inexpensive glasses (~$100) that could audibly describe the world around the user by detecting real-world objects and text using artificial intelligence, OpenCV (Open Computer Vision), and PyTesseract.

Abstract:

Being blind can have a significant impact on a person's quality of life, since it impairs their ability to navigate the world and participate in activities they once loved. Many people experience a loss of independence and mobility as a result of their inability to move around their environment confidently and complete everyday tasks without assistance. What if there were a device to help the visually impaired? This engineering project aims to create a device for the visually impaired that uses a Raspberry Pi and a Raspberry Pi camera to assist with reading text and recognizing objects. The device, in the form of glasses, reads text or detects objects for the user and audibly outputs the information through personal headphones. The project's goal is to enhance the independence and mobility of the visually impaired by providing them with an additional sense to supplement their visual deficiencies. The research focuses on OpenCV and Tesseract OCR, as well as the integration of the Raspberry Pi camera and Raspberry Pi into the glasses design. In the end, the device is able to accurately detect objects and audibly describe them to the user, and it can also capture text and read it aloud in the user's chosen language.
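
The text-reading pipeline can be illustrated with a minimal Python sketch: capture a frame with OpenCV, run Tesseract OCR on it, and speak the result. The camera index, Otsu preprocessing step, and the pyttsx3 text-to-speech library are assumptions for illustration, not necessarily the exact stack used in the paper.

```python
# Minimal capture -> OCR -> speech sketch (illustrative only).
import cv2            # OpenCV for camera capture and preprocessing
import pytesseract    # wrapper around the Tesseract OCR engine
import pyttsx3        # offline text-to-speech (assumed library)

engine = pyttsx3.init()

camera = cv2.VideoCapture(0)     # assumed: Pi camera exposed at index 0
ok, frame = camera.read()
if ok:
    # Grayscale + Otsu threshold generally improves OCR accuracy
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    text = pytesseract.image_to_string(binary, lang="eng")  # chosen language
    if text.strip():
        engine.say(text)         # read the detected text aloud
        engine.runAndWait()
camera.release()
```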

Awards:

Translating American Sign Language Using Flex Sensors and Arduino


Research Paper

Final Product

Project Video

About: 

Led a research team to develop an affordable (~$70) American Sign Language glove that translates signs into audible output, improving communication between deaf or hard-of-hearing people and people who don't know ASL.

Abstract:

As of 2019, around one million people use American Sign Language (ASL) as their method of communication. However, most Americans do not know sign language, which creates communication barriers for people who are deaf. According to newsweek.com, in 2019 70% of deaf people were underemployed or did not work, and about a quarter of deaf people had left a job due to discrimination; these problems all stem from communication barriers. What if there were a way for Americans to understand sign language without having to learn it? In this project, we used an Arduino and flex sensors to build a glove that translates ASL signs into letters and transmits those letters to a phone app. The flex sensors record how much each finger is bending, and the Arduino calculates which hand sign the person is making. The project also includes a calibration system so users can calibrate how far their fingers bend for each letter, since not everyone's fingers are the same. The glove mostly works, but there are some inconsistencies in the output: it sometimes computes the wrong letter because the flex sensors used in this project were not as accurate as they should have been, which also causes occasional delays. Even so, the glove is able to translate hand signs into the letters of the alphabet.
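
The calibration-and-matching logic described above can be sketched in Python (the real firmware runs in C++ on the Arduino, and the sensor values below are hypothetical). During calibration, each letter gets a per-finger reference bend value; a live reading is then matched to the closest reference.

```python
# Illustrative sketch of the glove's calibrate-then-match logic.
# Values are hypothetical 10-bit ADC readings (0-1023) per finger.
calibration = {
    "A": [820, 310, 305, 300, 295],   # thumb extended, fingers curled
    "B": [300, 850, 840, 845, 830],   # fingers extended, thumb tucked
    "L": [830, 860, 300, 295, 290],   # thumb and index extended
}

def classify(reading, calibration):
    """Return the calibrated letter whose reference bend values are
    closest to the current reading (sum of absolute differences)."""
    def distance(reference):
        return sum(abs(r - c) for r, c in zip(reading, reference))
    return min(calibration, key=lambda letter: distance(calibration[letter]))

# A live reading from the five flex sensors:
print(classify([825, 855, 305, 290, 288], calibration))  # -> "L"
```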

Awards:

Inexpensive EEG-Controlled Prosthetic Arm Using AI and ML


Research Plan

Prosthetic Arm Circuit diagram

Circuit diagram of the NeuroSky EEG headset connection with the fingers

About: 

In the Duke Ignite Makers Program, my partner (Mehdi Hussain) and I made an affordable prosthetic arm controlled by AI and electroencephalogram (EEG) sensors, with haptic feedback.
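
As a rough, hypothetical sketch of the control idea, the loop below maps an EEG "attention" value to an open/close grip command. Both helper functions are placeholders: the real project parses the NeuroSky headset's serial output and drives the finger servos from a microcontroller.

```python
# Hypothetical attention -> grip mapping (placeholders throughout).
import random
import time

def read_attention():
    """Placeholder: the real project reads the NeuroSky headset's
    serial stream; here we simulate a 0-100 attention value."""
    return random.randint(0, 100)

def set_grip(state):
    """Placeholder: the real arm drives finger servos; here we print."""
    print("grip:", state)

GRIP_THRESHOLD = 60   # assumed: focus above this level closes the hand

for _ in range(5):    # a few iterations of the control loop
    attention = read_attention()
    set_grip("close" if attention >= GRIP_THRESHOLD else "open")
    time.sleep(0.1)
```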

Rationale: 

A study on the global prevalence of traumatic non-fatal limb amputation found that, worldwide, over 57.7 million people live with a limb amputation caused by trauma such as falls, road injuries, and other transportation injuries (McDonald CL, Westcott-McCoy S, Weaver MR, Haagsma J, Kartin D., 2021). Furthermore, a study on the mental health of individuals with post-traumatic lower limb amputation concluded that many amputees develop severe mental health issues later in life, including feelings of helplessness, lower self-esteem, and a much more negative outlook on the future (Şimsek, N., Öztürk, G. K., & Nahya, Z. N., 2020). Many current prosthetic arms on the market suffer from a high cost of entry, a lack of sensory and haptic feedback, and being generally uncomfortable and heavy over long periods of use, especially when the prosthesis is the user's primary means of mobility. Current prosthetic arms available to users with a wide range of disabilities cost upwards of $50,000, making them unattainable for most Americans and nearly impossible to obtain in developing countries.

Awards:

AgriRover

Final Product

Website Link

Project Video

About: 

Working with Dr. Delagrammatikas during the Duke Summer STEM Academy, my team and I created an autonomous robot that can detect diseased plants and crops and pinpoint their coordinates for farmers via GPS.

Abstract:

Every year, $220 billion worth of crops are lost to plant diseases, and existing solutions pose additional threats to human and natural life. There are few solutions for this issue, and the ones that exist require a lot of human involvement (such as an app that can diagnose a plant from a picture of it). We wanted to create a solution that tackled the problem of fast, reliable crop diagnosis while supporting our goal of feeding people safely and sustainably. Our solution, AgriRover, is an autonomous system that detects diseased or dying crops and pinpoints their locations for farmers. We used infrared cameras to measure the infrared light reflected off plants as a gauge of their health, and regular cameras feeding an AI model to determine whether plants had diseases. AgriRover then marks the location of diseased or unhealthy plants using GPS. These coordinates are mapped onto a web app that farmers can access to see where there are diseased or dying crops in their fields and take action. We hope that this solution can automate the detection of diseased plants for farmers around the world and prevent the spread of diseases that destroy their harvests, helping fight world hunger.
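
One standard way to turn infrared measurements into a plant-health score is the Normalized Difference Vegetation Index (NDVI). The sketch below assumes aligned near-infrared and red images and a hypothetical health threshold; it illustrates the general technique rather than AgriRover's exact math.

```python
# NDVI sketch: healthy vegetation reflects much more near-infrared
# light than red light, so NDVI is high for healthy plants.
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red), per pixel."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / np.maximum(nir + red, 1e-6)  # avoid divide-by-zero

# Hypothetical 2x2 pixel patches (0-255 intensities):
nir = np.array([[200, 180], [60, 220]])
red = np.array([[40, 50], [90, 35]])

health = ndvi(nir, red)
unhealthy = health < 0.3    # assumed threshold for "flag this plant"
print(health.round(2))
print("flag for GPS marking:", unhealthy)
```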

Awards:

Green Level Cross Country Website 

(The website is currently down as of 7/6/24. Will be back up again soon.)

About: 

Working with Mr. Fox, my CSA teacher, and Ms. Barish, a cross country coach at Green Level, my partner and I built a website that tracks the progress of ~88 Green Level cross country athletes over the course of their athletic careers. I worked as the backend developer, using MySQL and Flask to build a system that ingests data from Excel files and serves it to the website. From that data, I calculated each athlete's personal records, stored their progress across meets, and displayed the information visually.
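
A minimal sketch of that Excel-to-personal-record pipeline in Flask and pandas is below; the file name, column names, and endpoint path are assumptions, since the real schema lived in MySQL.

```python
# Sketch: read meet results from Excel, compute each athlete's PR,
# and serve the records as JSON from a Flask endpoint.
import pandas as pd
from flask import Flask, jsonify

app = Flask(__name__)

def load_results(path="meet_results.xlsx"):
    # Assumed columns: athlete, meet_date, time_seconds
    return pd.read_excel(path)

@app.route("/api/personal-records")
def personal_records():
    results = load_results()
    # An athlete's PR is their fastest (minimum) race time
    prs = results.groupby("athlete")["time_seconds"].min()
    return jsonify(prs.to_dict())

if __name__ == "__main__":
    app.run(debug=True)
```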


Link: https://www.greenlevelxctrackfield.org/

Lessons for students

Arduino Lessons

Hands-on Electrical Circuit/Arduino Lessons

VEXcode training lessons

VEXcode lessons

Teacher DOC 2022 Syllabus and Resources

Python Lessons

About: 

Over the past couple of years, I've enjoyed teaching students and peers the skills I have learned in engineering. During quarantine, I learned Arduino from my brother and from YouTube videos, and I turned those skills into lessons for peers during Robotics Club meetings. When the Robotics Club became a VEX team, I trained peers to program with VEXcode. I also created a week-long Python course on the platform Deepnote for young students who have never coded before. I am currently creating lessons on how to write research effectively for the STEM Research Club at my high school.

I focus primarily on hands-on lessons, because I want to teach skills that I'm interested in and that students might pursue as careers in the future.

Gator Scheduler


Presentation


Demo Video

About: 

Gator Scheduler is a React Native (Expo) application for Green Level High School that helps keep parents and the community up to date. The application is designed for phones but can also be used on tablets and in web browsers, and because it is built with React Native it runs on multiple mobile operating systems, including Android and iOS. A Firebase database powers the login system and stores information about activities. Please check GitHub for more information.
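
As a rough illustration of the activity data model, the Python sketch below writes and reads a Firestore collection with the firebase-admin SDK. The actual app talks to Firebase through the JavaScript SDK inside React Native, and the collection name, fields, and key file here are assumptions.

```python
# Illustrative sketch of storing and listing school activities in Firebase.
import firebase_admin
from firebase_admin import credentials, firestore

cred = credentials.Certificate("service-account.json")  # hypothetical key file
firebase_admin.initialize_app(cred)
db = firestore.client()

# Add an upcoming school activity (assumed fields)
db.collection("activities").add({
    "title": "Home Football Game",
    "date": "2024-09-13",
    "location": "Green Level Stadium",
})

# List all stored activities
for doc in db.collection("activities").stream():
    print(doc.id, doc.to_dict())
```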

Awards:

Varying lenses and the effect on blue light intensity


Research Paper

About: 

Working with Dr. Sprague and Dr. Kenny in the Department of Physics at East Carolina University during SVSM, I created a research project to test the efficacy of blue-light-blocking glasses.

Abstract:

Blue light emitted from computer screens has been widely researched, and many studies have shown that blue light from screens has adverse effects on the eyes. The wavelength range these studies found to impact the eye is between 415 nm and 455 nm, and it is associated with eye strain and photochemical damage. The purpose of this experiment is to test how effectively blue-light-blocking glasses block light between 415 and 455 nm, and to see whether blue light filter apps are more effective. A Dell XPS 13, a Vernier UV-VIS spectrophotometer, Eyezen® blue-light-blocking glasses, Kenzou blue-light-blocking glasses, and the f.lux blue light filter app were used to conduct the experiment. The control group used no glasses; the experimental groups were the two pairs of blue-light-blocking glasses and the blue light filter app. The experiment took place in a light-blocking container to ensure no other light sources affected the results. Data was quantified with Python, and the percentage of blue light blocked by each experimental group was calculated. The expensive glasses blocked 14.3% of blue light, the cheap glasses blocked 7%, and the f.lux app at 3500 K blocked 60.1% of blue light at 100% screen brightness. Based on these results, it can be concluded that using an app is the more effective way to reduce blue light coming from screens and the resulting eye strain. However, both a blue light filtering application and blue-light-blocking glasses proved effective compared to the control, which used no glasses or filtering app. This research can showcase the effectiveness of blue light filtering technologies that people can use to protect their eyes from screens.
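
The blocked-percentage calculation can be sketched as follows: sum the spectrometer's intensity over the 415-455 nm band for each condition and compare it against the no-glasses control. The CSV file names and column names below are assumptions about the export format.

```python
# Sketch of the blue-light "percent blocked" calculation.
import pandas as pd

BLUE = (415, 455)   # wavelength band of interest, in nm

def blue_light_power(csv_path):
    """Sum spectrometer intensity over the 415-455 nm band."""
    spectrum = pd.read_csv(csv_path)   # assumed columns: wavelength_nm, intensity
    band = spectrum[spectrum["wavelength_nm"].between(*BLUE)]
    return band["intensity"].sum()

control = blue_light_power("no_glasses.csv")   # control: no filtering
for name in ["eyezen.csv", "kenzou.csv", "flux_3500k.csv"]:
    blocked = 100 * (1 - blue_light_power(name) / control)
    print(f"{name}: {blocked:.1f}% of blue light blocked")
```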

Get in touch at rithinteja.aechan@gmail.com