My name is Tomás Vega, and I am a Senior studying Computer and Cognitive Science at the University of California, Berkeley. I was born in Lima, Perú.
In the last few years, my interest in Human-Computer Interfaces has grown in proportion to my desire to pioneer the way we interact with technology. In particular, I've been unceasingly mind-blown by recent developments in Brain-Computer Interfacing. This has changed my life path, steering my passions toward deciphering the power of the mind.
I want to create the "human of the future" - that is, to develop technology that extends our existing sensory and cognitive capabilities and creates new ones.
Projects
⇜
WheelSense
August 2016
{
Arduino
Fusion 360
}
Coordinated a team of fellow hackers in a weeklong makeathon for my friend Daniel Stickney, who has cerebral palsy and a cortical vision impairment. We immersed ourselves in his home, identifying daily challenges and coming up with creative solutions aimed at preventing injury and promoting independence. For the first three days we ran experiments to quantify his cognitive, motor, and sensory capabilities. We discovered that his biggest challenge is navigation, especially in new environments of which he has no spatial memory. We hacked his wheelchair to add ramp lateral-edge detection, frontal drop-off detection, and backing assistance through multi-modal feedback.
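Below is a minimal sketch of how the frontal drop-off detection could look in firmware, assuming a downward-facing HC-SR04-style ultrasonic sensor and a coin vibration motor; the pin numbers, nominal floor distance, and drop margin are illustrative, not the actual build:

// Hypothetical sketch: frontal drop-off detection for WheelSense.
// Assumes a downward-facing HC-SR04 ultrasonic sensor and a coin
// vibration motor on a PWM pin; pins and thresholds are illustrative.

const int TRIG_PIN = 9;
const int ECHO_PIN = 10;
const int MOTOR_PIN = 5;            // coin vibration motor (PWM)
const float FLOOR_CM = 25.0;        // nominal sensor-to-floor distance
const float DROP_MARGIN_CM = 10.0;  // extra distance that signals a drop-off

float readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);     // 10 us trigger pulse
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long us = pulseIn(ECHO_PIN, HIGH, 30000); // echo time, 30 ms timeout
  return us * 0.0343 / 2.0;                 // microseconds -> centimeters
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  pinMode(MOTOR_PIN, OUTPUT);
}

void loop() {
  float d = readDistanceCm();
  // Floor reading suddenly farther than expected (or no echo at all)
  // means an edge or drop-off may be ahead: buzz the motor.
  bool dropAhead = (d == 0.0) || (d > FLOOR_CM + DROP_MARGIN_CM);
  analogWrite(MOTOR_PIN, dropAhead ? 255 : 0);
  delay(50);
}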
Team
Pierre Karashchuk
•
Stephanie Valencia
•
Oscar Segovia
•
Ryan Sterlich
•
Kelly Peng
•
Tomás Vega
⇜
SmarTee
January - September 2016
{
Eagle
LPKF CircuitPro
Arduino
}
Developed hardware and software to control color-changing textiles in response to variable biometrics. A circuit board drives conductive threads in response to GSR (Galvanic Skin Response); the threads heat up when driven and pass under a layer of thermochromic paint, changing its color. Published in DIS ’16.
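A minimal sketch of the control loop, assuming the GSR electrodes feed an analog pin through a voltage divider and the thread is gated by a logic-level MOSFET on a PWM pin; the pins, smoothing factor, and mapping range are illustrative:

// Hypothetical sketch of the SmarTee control loop: map a GSR reading
// to the drive level of a conductive thread heated through a MOSFET.
// Pin choices, smoothing factor, and thresholds are illustrative.

const int GSR_PIN = A0;     // GSR electrodes via voltage divider
const int THREAD_PIN = 6;   // PWM pin gating a logic-level MOSFET
float smoothed = 0;

void setup() {
  pinMode(THREAD_PIN, OUTPUT);
}

void loop() {
  int raw = analogRead(GSR_PIN);           // 0..1023
  smoothed = 0.95 * smoothed + 0.05 * raw; // low-pass filter the signal
  // Higher arousal (higher skin conductance) -> more heat -> color change.
  int duty = map((int)smoothed, 200, 800, 0, 255);
  analogWrite(THREAD_PIN, constrain(duty, 0, 255));
  delay(20);
}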
⇜
La Sinapsis Colectiva
Inspired by the games of subconscious creativity made famous by the Surrealists, La Sinapsis Colectiva (The Collective Synapse) is a platform for creating community-based narratives and poetry using Twitter and SMS messaging. It seeks to take advantage of modern-day technological capabilities to foster a new sense of creative collaboration, with stories that develop line by line, written by an open community of participants who have chosen to join the project. [Read full statement here (in Spanish)]
Developed the website for the Electronic Literature exhibition in which La Sinapsis Colectiva was displayed.
From classic novels to banal social media profiles, it seems reasonable to suspect that any body of text arises from some human mastermind. As this work demonstrates, however, the wizard behind the curtain may not be human at all: the piece sources its material by soliciting user inputs and compiling them on a single Twitter account with surprising coherence.
Team
Tomás Vega
⇜
AliviaRÁ
March 2016
{
Arduino
Eagle
iOS
Python
}
Won first place at UC Berkeley’s Hack for Humanity. We developed a smart rehabilitation system that helps people with arthritis improve their joint flexibility and reduce pain.
An iOS app runs the patient through a series of exercises. The glove measures flexion at every finger and connects to the app via Bluetooth. The app provides real-time visual and haptic feedback on performance, letting the patient know whether they are doing each exercise correctly.
By tapping a button in the app, the patient can tag the moments when pain is experienced. At the end of each exercise session, the data is relayed to a server in the cloud for analysis. Optimal exercises that target the patient’s specific positions of pain are computed and suggested to the user.
Moreover, regular progress reports are automatically generated and sent to the patient and their doctor or physical therapist. These include information on finger-flexibility improvement, recurring positions of pain, and suggested exercises.
The whole system was developed in 40 hours.
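A minimal sketch of what the glove-side firmware could look like, assuming one flex sensor per finger wired as a voltage divider and a UART-bridged BLE module; the pin mapping, calibration constants, and packet format are assumptions, not the actual design:

// Hypothetical sketch of the AliviaRÁ glove firmware: read one flex
// sensor per finger and stream approximate flexion angles over a
// serial-bridged BLE module. Calibration constants are assumptions.

const int FLEX_PINS[5] = {A0, A1, A2, A3, A4}; // thumb..pinky
const int RAW_STRAIGHT = 300; // ADC reading with finger straight
const int RAW_BENT = 700;     // ADC reading with finger fully bent

void setup() {
  Serial.begin(9600); // BLE module (e.g. HM-10 style) on the UART
}

void loop() {
  // Send one line per sample: five comma-separated flexion angles.
  for (int i = 0; i < 5; i++) {
    int raw = analogRead(FLEX_PINS[i]);
    int angle = map(raw, RAW_STRAIGHT, RAW_BENT, 0, 90); // degrees, approx.
    Serial.print(constrain(angle, 0, 90));
    Serial.print(i < 4 ? ',' : '\n');
  }
  delay(100); // ~10 Hz is plenty for exercise feedback
}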
Team
Ghassan Makhoul
•
Dino Digma
•
Alejandro Castillejo
•
Corten Singer
•
Tomás Vega
⇜
SynthSense
October - December 2016
{
Arduino
Eagle
iOS
Fusion 360
}
Enabling independence for the visually impaired: an augmented white cane that provides both obstacle avoidance and navigation assistance.
Obstacle Avoidance: We designed an Augmented White Cane (AWC) with ultrasonic range finders for detecting obstacles within a 2-meter range, along with sensor mounts that are adjustable in both direction and height. When obstacles are detected at body or head level, these sensors deliver haptic feedback through the handle.
Navigation: To provide navigation instructions to our users, we developed an iOS app that determines when the user needs to turn and then transmits a command to the AWC’s strap to provide directional haptic feedback.
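A minimal sketch of the strap-side firmware, assuming the app’s turn commands arrive as single bytes over a UART-bridged BLE module; the pins and one-byte protocol are assumptions:

// Hypothetical sketch of the SynthSense strap firmware: receive one-byte
// turn commands from the iOS app (via a serial-bridged BLE module) and
// pulse the left or right vibration motor. Pins and protocol are assumptions.

const int LEFT_MOTOR_PIN = 5;
const int RIGHT_MOTOR_PIN = 6;

void setup() {
  pinMode(LEFT_MOTOR_PIN, OUTPUT);
  pinMode(RIGHT_MOTOR_PIN, OUTPUT);
  Serial.begin(9600); // BLE module on the UART
}

void pulse(int pin) {
  analogWrite(pin, 255);
  delay(400);          // short, unambiguous buzz
  analogWrite(pin, 0);
}

void loop() {
  if (Serial.available()) {
    char cmd = Serial.read();
    if (cmd == 'L') pulse(LEFT_MOTOR_PIN);   // turn left
    if (cmd == 'R') pulse(RIGHT_MOTOR_PIN);  // turn right
  }
}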
⇜
Won first place at UC Berkeley’s 3D Printing Designathon. We modeled and created a Bluetooth-enabled prosthetic foot that helps users distribute their weight properly on their prosthetic leg, in order to prevent muscle atrophy, bone-density loss, and other health issues. The foot interfaced with an iOS app that gave real-time feedback to the user via vibrations, telling them whether or not they were putting enough pressure on their prosthetic leg. The prototype was developed in 24 hours.
Team
Harshil Goel
•
James Musk
•
Anthony Baiey
•
Tomás Vega
⇜
SmartAss
September 2015
{
Arduino
SolidWorks
iOS
}
Designed and implemented a pressure-sore prevention device for wheelchair users. The system collects data from force sensors and transmits it via BLE to a mobile app, which notifies the user when an area should be relieved of pressure and where that area is. You may imagine how valuable this information is to users who have no feeling below the waist. It is just as valuable to those who can feel but often forget to move due to busy work schedules or intense exercise routines.
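A minimal sketch of the sensing loop, assuming a handful of force-sensitive resistors read directly on analog pins and a UART-bridged BLE module; the sensor count, threshold, and timing are illustrative:

// Hypothetical sketch of the SmartAss sensing loop: sample a small set
// of force sensors and flag any cell loaded past a threshold for too long.
// Sensor count, threshold, and timing are illustrative; the real system
// relayed readings over BLE to the phone app for notification.

const int N = 4;
const int FSR_PINS[N] = {A0, A1, A2, A3};
const int LOAD_THRESHOLD = 600;                         // ADC counts
const unsigned long MAX_LOAD_MS = 15UL * 60UL * 1000UL; // 15 minutes
unsigned long loadedSince[N];

void setup() {
  Serial.begin(9600); // BLE module on the UART
  for (int i = 0; i < N; i++) loadedSince[i] = 0;
}

void loop() {
  unsigned long now = millis();
  for (int i = 0; i < N; i++) {
    if (analogRead(FSR_PINS[i]) > LOAD_THRESHOLD) {
      if (loadedSince[i] == 0) loadedSince[i] = now; // load just started
      if (now - loadedSince[i] > MAX_LOAD_MS) {
        Serial.print("RELIEVE ");  // tell the app which area to relieve
        Serial.println(i);
        loadedSince[i] = now;      // re-arm the timer for this cell
      }
    } else {
      loadedSince[i] = 0;          // pressure relieved
    }
  }
  delay(500);
}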
A proof of concept for SmartAss v2.0 was also prototyped, in which the same device is actuated with linear motors to redistribute weight away from a pressure-prone area. The next step is to improve the spatial resolution of pressure sensing by increasing the number of sensors in the design and integrating the actuation into a small form factor.
The whole system was built in 72 hours at Google’s Bay Area Makeathon, which focused on assistive technology and the needs of people with disabilities. We won the Google.org Innovation Award and the TechShop Self-Manufacturing Award.
Team
Oscar Segovia
•
Yaskhu Madaan
•
Pierre Karashchuk
•
Shaun Giudici Bhargav
•
Hagit Alon
•
Jonathan Bank
•
Tomás Vega
⇜
my.Flow
April 2015 - Now
{
Arduino
Eagle
Fusion 360
}
The period is silent. A smart tampon that senses saturation and sends a notification to your phone via Bluetooth that it’s time to change.
Designed and developed the circuit boards, 3D-printed casings, and sensors to make any tampon smart. Features include homebrew biocompatible stainless-steel probes, a magnetic connector that prevents the tampon from being pulled out of the vagina when the user pulls down her pants or underwear, and an RFduino (Bluetooth-enabled) shield built with SMD components.
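A minimal sketch of the sensing idea, assuming saturation is estimated from conductivity between the two probes and reported with the RFduino BLE library; the divider wiring, pin choice, and threshold are assumptions:

// Hypothetical sketch of the my.Flow sensing idea: estimate saturation
// from the conductivity between the two stainless-steel probes and report
// it over the RFduino's BLE link. Divider values and threshold are assumptions.

#include <RFduinoBLE.h>

const int PROBE_PIN = 2;        // analog pin reading the probe divider
const int FULL_THRESHOLD = 700; // higher reading = wetter = more conductive

void setup() {
  RFduinoBLE.advertisementData = "myflow";
  RFduinoBLE.begin();
}

void loop() {
  int level = analogRead(PROBE_PIN);
  char saturated = (level > FULL_THRESHOLD) ? 1 : 0;
  RFduinoBLE.send(&saturated, 1);  // notify the phone app
  RFduino_ULPDelay(SECONDS(30));   // sleep between samples to save power
}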
⇜
Developed a booth for the Cognitive Technology exhibit at the Exploratorium that let visitors move a robotic arm left and right with their thoughts. Built a co-adaptive motor-imagery brain-computer interface training system that provided one degree of freedom. EEG data was recorded using a homebrew EEG headset with dry electrodes.
⇜
I was able to control a homebrew quadcopter with my mind. We used an OpenBCI board to obtain raw EEG data from 6 channels placed 5 cm apart in a 2x3 matrix over the motor cortex: two on the left, two on the right, and two over the longitudinal fissure.
Signal processing was applied to remove noise and unwanted features, subtract the activity recorded by the center electrodes, and concentrate on frequencies between 14 and 20 Hz. Pierre developed a k-nearest-neighbors classifier trained to distinguish baseline activity from motor imagery of me "thinking" about moving my left or right hand.
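A minimal sketch of the classification step in C++, assuming 14-20 Hz band-power features have already been extracted per channel; the labels, feature layout, and choice of k are illustrative:

// Hypothetical sketch of the kNN step: classify a 6-channel band-power
// feature vector as baseline, left-hand, or right-hand motor imagery.
#include <algorithm>
#include <array>
#include <vector>

constexpr int CHANNELS = 6;
using Features = std::array<float, CHANNELS>; // 14-20 Hz band power per channel

struct Example { Features x; int label; };    // 0=base, 1=left, 2=right

int knnClassify(const std::vector<Example>& train, const Features& q, int k) {
  // Rank training examples by squared Euclidean distance to the query.
  std::vector<std::pair<float, int>> dist;
  for (const auto& ex : train) {
    float d = 0;
    for (int i = 0; i < CHANNELS; i++) {
      float diff = ex.x[i] - q[i];
      d += diff * diff;
    }
    dist.push_back({d, ex.label});
  }
  std::partial_sort(dist.begin(), dist.begin() + k, dist.end());
  // Majority vote among the k nearest neighbors.
  int votes[3] = {0, 0, 0};
  for (int i = 0; i < k; i++) votes[dist[i].second]++;
  return std::max_element(votes, votes + 3) - votes;
}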
The entire quadcopter control system was built in 36 hours at CalHacks, where we won Most Technically Challenging and First Overall Hack!
⇜
Designed and built a web application for learning braille through haptics: tactile feedback that recreates the sense of touch using complex vibrations.
Characters are sent over serial to the Vibratto Board, which controls a 3x2 array of 6 g coin vibrators. {Node.js}
Wrote an Arduino sketch to translate text into its haptic braille representation.
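A minimal sketch of that translation, assuming one motor per braille dot; the pin mapping and the few dot patterns shown are illustrative:

// Hypothetical sketch of the text-to-haptic-braille translation: look up
// each incoming character's 6-dot braille cell and buzz the matching
// coin vibrators in the 3x2 array. The bitmask encodes dots 1-6
// (bit 0 = dot 1); only a few letters are shown.

const int DOT_PINS[6] = {3, 5, 6, 9, 10, 11}; // dots 1-6 -> motor pins

byte brailleCell(char c) {
  switch (c) {
    case 'a': return 0b000001; // dot 1
    case 'b': return 0b000011; // dots 1,2
    case 'c': return 0b001001; // dots 1,4
    default:  return 0;        // unsupported character: no vibration
  }
}

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < 6; i++) pinMode(DOT_PINS[i], OUTPUT);
}

void loop() {
  if (Serial.available()) {
    byte cell = brailleCell(Serial.read());
    for (int i = 0; i < 6; i++)            // raise the active dots
      digitalWrite(DOT_PINS[i], (cell >> i) & 1);
    delay(800);                            // hold the character
    for (int i = 0; i < 6; i++) digitalWrite(DOT_PINS[i], LOW);
  }
}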
Interfaces for learning and practicing.
The Learn application pairs visual and tactile information (double encoding) to ensure efficient learning of haptic braille.
The Practice application sends the haptic representation of a random character; the user must select the correct choice to receive the next character.