UPDATE: I haven't updated this website since the end of my undergraduate studies at Berkeley. Nevertheless, the following information is still relevant to my current vision and line of work. Here is what I did during my master's at the Fluid Interfaces Group of the MIT Media Lab. For the rest, check my LinkedIn.
My name is Tomás Vega, and I am a Senior studying Computer and Cognitive Science at the University of California, Berkeley. I was born in Lima, Perú.
In the last few years, my interest in Human-Computer Interfaces has grown in proportion to my desire to pioneer the way we interact with technology. In particular, I’ve been unceasingly mind-blown by recent developments in Brain-Computer Interfacing. This has changed my life path, sculpting my passions towards deciphering the power of the mind.
I want to create the "human of the future" - that is, to develop technology that creates new human sensory and cognitive capabilities and extends existing ones.
Projects
⇜
MindSweeper: Toward Haptic Cognitive Prostheses
Fall 2016 - Spring 2017
{
Fusion 360
Eagle
Arduino
OpenCV
C
}
Research project in Human-Computer Interaction at UC Berkeley's Department of EECS. We investigated an intelligent wearable feedback system capable of offloading cognitive demand during task execution. We used the classic computer game Minesweeper as our task because of its challenging arithmetic nature. We also studied the effects of novel haptic techniques that stimulate slowly-adapting mechanoreceptors in the dermis/subdermis. The overall goal was to explore new human-computer interfaces that better utilize our natural sensing capabilities in order to convey information more effectively and over longer periods of time.
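A minimal sketch of the kind of mapping such a system might use, not the project code: it converts the adjacent-mine count under the cursor into a sustained actuator intensity, the sort of slow stimulus aimed at slowly-adapting mechanoreceptors. The function names and the 8-bit PWM scale are illustrative assumptions.

# Illustrative sketch only (not the MindSweeper code): map the adjacent-mine
# count under the cursor to a sustained haptic intensity.

def mine_count_to_intensity(adjacent_mines, max_mines=8):
    """Return a normalized actuator intensity in [0.0, 1.0]."""
    clamped = max(0, min(adjacent_mines, max_mines))
    return clamped / max_mines

def intensity_to_pwm(intensity, pwm_max=255):
    """Convert a normalized intensity to an 8-bit PWM value for the motor driver."""
    return int(round(intensity * pwm_max))

if __name__ == "__main__":
    for count in range(9):
        print(count, "mines ->", intensity_to_pwm(mine_count_to_intensity(count)))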
Team
Corten Singer
•
Tomás Vega
⇜
SmartWheels
Fall 2016
{
Eagle
LPKF CircuitPro
Laser Cutter
Raspberry Pi 3
Python
}
My team and I developed a self-driving, target-following, obstacle-avoiding motorized wheelchair for our friend Stephen Chavez. Using Stephen's research in reverse-engineering his motorized wheelchair's control system, we were able to communicate with and mount sensors onto the wheelchair to enable dynamic navigation. We used a Raspberry Pi 3 Model B with a PiCAN 2 shield to send commands to the wheelchair via the R-net protocol. An iPhone application tracks AprilTags (2D barcodes developed for robotics applications) with its camera and sends data to the Raspberry Pi via UDP. Finally, a custom-made PCB shield sandwiched between the Raspberry Pi and the PiCAN shield interfaces with three ultrasonic sensors to detect obstacles.
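A minimal sketch of the Pi-side glue code, assuming the iPhone app sends the tag's horizontal offset and distance over UDP and that commands go out over SocketCAN via the python-can library. The UDP port, payload layout, gains, and especially the CAN arbitration ID and command bytes are placeholders; the real R-net frames came from Stephen's reverse-engineering and are not reproduced here.

# Illustrative sketch, not the actual SmartWheels source.
import socket
import struct
import can  # python-can, talking to the PiCAN 2 via SocketCAN

UDP_PORT = 9000            # assumed port the iPhone app sends AprilTag data to
RNET_CMD_ID = 0x02000000   # placeholder arbitration ID, not the real R-net one

def offset_to_turn(x_offset, gain=40):
    """Map the tag's horizontal offset (-1..1) to a signed turn value."""
    return max(-100, min(100, int(x_offset * gain)))

def main():
    bus = can.interface.Bus(channel="can0", bustype="socketcan")
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", UDP_PORT))
    while True:
        packet, _ = sock.recvfrom(64)
        x_offset, distance = struct.unpack("!ff", packet[:8])
        speed = 30 if distance > 0.5 else 0   # stop when close to the target
        turn = offset_to_turn(x_offset)
        frame = can.Message(arbitration_id=RNET_CMD_ID,
                            data=[speed & 0xFF, turn & 0xFF],
                            is_extended_id=True)
        bus.send(frame)

if __name__ == "__main__":
    main()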
Team
Yash Shah
•
James Musk
•
Corten Singer
•
Tomás Vega
⇜
Robowen: Smart House
March 2017
{
Arduino
Eagle
Electromyography (EMG)
Particle Photon
Fusion 360
}
This project won the Autodesk Design Award at the TOM:Berkeley Make-a-Thon held at UC Berkeley in March 2017. For this project, we worked with our friend Owen Kent who has very limited ability to physically interact with the world. He currently has a single joystick attached to his wheelchair through which he uses his computer and other devices that can take a mouse as input. Prior to this Make-a-Thon, we observed a few key problems that we aimed to solve with our invention.
Specifically, Owen previously had to interface with a very buggy RFID system that was synchronized with a wall-mounted button in order to open his door. The button can open the door from the outside, but he has to time it exactly with his buggy RFID transmitter, and it often takes him multiple tries to open his door. Moreover, Owen is severely constrained by the mouse mounted on his wheelchair. Unless he is seated in his wheelchair, he has no way to connect with any device. He has been wanting to design a system to allow him to use his tablet while lying down in bed.
We created a new joystick USB mouse that allows Owen to talk to the various devices in his home that he previously had no means of controlling (his door, lights, heater, speaker, etc.). This new mouse enables a more efficient interaction with his devices while also promoting bodily movement he previously had little motivation to practice. The mouse was built with an Arduino-like microcontroller, Electromyographic (EMG) sensors that detect and process slight muscle movements in each of his hands, and a 3-axis accelerometer that detects stomach inflation/deflation. The device was programmed to left-click and drag when Owen flexed his left hand and to right-click when he flexed his right hand. His stomach movement could easily be mapped to small functions, such as taking a screenshot. This solution works around his existing problematic setup, where all clicks are sensed in software that is difficult to operate and much slower to use (he must hover over an object for a certain time period in order to click, or nudge the joystick in a particular way). The EMG clicking mechanism also gives Owen a reason to physically train his existing hand muscles.
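A minimal sketch of the click-mapping logic just described, not the device firmware. The EMG values are assumed to be normalized envelopes in 0..1, and the thresholds and event names are placeholders.

# Illustrative sketch of the EMG/accelerometer -> mouse-event mapping.
LEFT_EMG_THRESHOLD = 0.6    # flex of the left hand -> left click / drag
RIGHT_EMG_THRESHOLD = 0.6   # flex of the right hand -> right click
BREATH_THRESHOLD = 0.8      # large stomach inflation -> shortcut (e.g. screenshot)

def classify_inputs(left_emg, right_emg, accel_z):
    """Return the list of mouse/shortcut events for one sensor sample."""
    events = []
    if left_emg > LEFT_EMG_THRESHOLD:
        events.append("LEFT_BUTTON_DOWN")   # held while flexed -> drag
    else:
        events.append("LEFT_BUTTON_UP")
    if right_emg > RIGHT_EMG_THRESHOLD:
        events.append("RIGHT_CLICK")
    if accel_z > BREATH_THRESHOLD:
        events.append("SCREENSHOT")
    return events

print(classify_inputs(0.7, 0.1, 0.2))  # -> ['LEFT_BUTTON_DOWN']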
This solution provides Owen with a device that not only allows him to control his door with ease (not to mention various appliances like his bedroom lights), but lets him do so from his bed. We designed multiple mounts that can be placed near his bed so that his tablet and our mouse can be operated without ever having to get up. Our invention gives Owen the freedom to interact with his house, especially when opening his door, while also giving him the freedom to connect to the world when he is not in his wheelchair. The mouse works with virtually any PC or tablet.
The whole system was developed in 40 hours.
Team
Daniel Lim
•
Adam Hutz
•
Alejandro García Sala
•
Facundo Severi
•
Pierre Karashchuk
•
Pierluigi Mantovani
•
Corten Singer
•
Tomás Vega
⇜
DoomSense
Fall 2016
{
Stanza
C
Eagle
LPKF CircuitPro
}
Final project for the class “Software Defined PCB Design.” The goal was to exploit neuroplasticity and explore how additional sense modalities affect a player's performance. We made DoomSense, a device for playing the first-person shooter DOOM. In this game there are many resources the player needs to track visually, which can be overwhelming at times.
DoomSense is composed of a jacket that enhances the immersiveness of the gaming experience. The jacket provides two information streams: health, and the proximity and angle of enemies outside the player's field of view (FOV). We used an Eccentric Rotating Mass (ERM) DC vibration motor to create vibrations that convey information about remaining health. The lower the player's health, the faster and more intense the "heartbeat."
We used five vibration coin motors attached to the neck of the jacket to deliver the position of the closest enemy outside the FOV. The closer the enemy, the stronger the vibration. The system activates specific coin vibration motors according to the angle of the enemy with respect to the player. We found that most experienced video game players would adapt quickly and triangulate within the first two minutes of game play, while more novice players would take around five.
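A minimal sketch of the two mappings described above, not the shipped firmware. The heartbeat-period bounds, the distance scale, and the simplification of spreading the full 360° of bearing across the five neck motors are all assumptions.

# Illustrative sketch of the DoomSense mappings.
def health_to_heartbeat_period(health, min_period=0.3, max_period=1.2):
    """Lower health -> shorter period between ERM 'heartbeat' pulses (seconds)."""
    health = max(0, min(health, 100))
    return min_period + (max_period - min_period) * (health / 100.0)

def enemy_to_neck_motor(angle_deg, distance, n_motors=5, max_distance=20.0):
    """Pick which of the five neck motors fires and how strongly (0..1).
    angle_deg is the enemy bearing relative to the view direction."""
    arc = (angle_deg % 360.0) / 360.0
    motor = min(int(arc * n_motors), n_motors - 1)
    strength = max(0.0, 1.0 - distance / max_distance)   # closer -> stronger
    return motor, strength

print(health_to_heartbeat_period(25))    # low health -> fast heartbeat
print(enemy_to_neck_motor(200.0, 5.0))   # enemy roughly behind, fairly close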
Team
Mitchell Karchemsky
•
Aleks Kamko
•
Tomás Vega
⇜
WheelSense
August 2016
{
Arduino
Fusion 360
}
Coordinated a team of fellow hackers in a weeklong makeathon for my friend Daniel Stickney, who has cerebral palsy and a cortical vision impairment. We immersed ourselves in his home, identifying daily challenges and coming up with creative solutions, with the aim of preventing injury and promoting independence. For the first three days we ran experiments to quantify his cognitive, motor and sensory capabilities. We discovered his biggest challenge is navigation, especially in new environments of which he has no spatial memory. We hacked his wheelchair to add ramp lateral-edge detection, frontal drop-off detection and backing assistance through multi-modal feedback.
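A minimal sketch of the drop-off and ramp-edge logic, not the deployed code. The sensor layout (downward-facing rangefinders at the front and on both sides of the chair) and the distance thresholds are placeholder assumptions.

# Illustrative sketch of the WheelSense hazard checks.
FLOOR_DISTANCE_CM = 12.0     # assumed nominal distance from sensor to floor
DROPOFF_MARGIN_CM = 8.0      # anything farther than this reads as a drop-off

def detect_hazards(front_cm, left_cm, right_cm):
    """Return warning strings to be rendered as multi-modal (audio + haptic) feedback."""
    warnings = []
    if front_cm > FLOOR_DISTANCE_CM + DROPOFF_MARGIN_CM:
        warnings.append("FRONTAL_DROPOFF")
    if left_cm > FLOOR_DISTANCE_CM + DROPOFF_MARGIN_CM:
        warnings.append("RAMP_EDGE_LEFT")
    if right_cm > FLOOR_DISTANCE_CM + DROPOFF_MARGIN_CM:
        warnings.append("RAMP_EDGE_RIGHT")
    return warnings

print(detect_hazards(35.0, 13.0, 12.5))  # -> ['FRONTAL_DROPOFF']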
Team
Pierre Karashchuk
•
Stephanie Valencia
•
Oscar Segovia
•
Ryan Sterlich
•
Kelly Peng
•
Tomás Vega
⇜
AlterNail
September 2016
{
Eagle
LPKF CircuitPro
Arduino
}
I built a fingernail-shaped, low-power, stateful, wireless, dynamic display with vibrational sensing. The hardware was part of a publication in CHI 2017.
⇜
SmarTee
January - September 2016
{
Eagle
LPKF CircuitPro
Arduino
}
Developed hardware and software to control color-changing textiles in response to variable biometrics. The circuit board drives conductive threads in response to GSR (Galvanic Skin Response). The threads, which heat up when driven, pass under a layer of thermochromic paint, changing its color. Published in DIS ’16.
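A minimal sketch of the GSR-to-thread-drive mapping, not the published firmware. The raw ADC calibration range and the PWM scale are placeholder assumptions.

# Illustrative sketch of the SmarTee control loop's core mapping.
GSR_BASELINE = 300    # assumed raw ADC reading at rest
GSR_AROUSED = 700     # assumed raw ADC reading under high arousal

def gsr_to_duty_cycle(gsr_raw, pwm_max=255):
    """Map a raw GSR reading to the PWM duty cycle driving a conductive thread.
    Higher arousal -> more current -> more heat -> thermochromic color change."""
    norm = (gsr_raw - GSR_BASELINE) / float(GSR_AROUSED - GSR_BASELINE)
    norm = max(0.0, min(1.0, norm))
    return int(round(norm * pwm_max))

print(gsr_to_duty_cycle(500))  # mid-range arousal -> roughly half duty cycle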
⇜
La Sinapsis Colectiva
Inspired by the games of subconscious creativity made famous by the Surrealists, La Sinapsis Colectiva (The Collective Synapse) is a platform for creating community-based narratives and poetry using Twitter and SMS messaging. La Sinapsis Colectiva seeks to take advantage of modern day technological capabilities to foster a new sense of creative collaboration with stories that develop line by line, written by an open community of participants that have chosen to join the project. [Read full statement here (in Spanish)]
Developed website for the Electronic Literature exhibition in which La Sinapsis Colectiva was displayed.
From classic novels to banal social media profiles, it seems reasonable to suspect that any body of text arises from some human mastermind. However, as the following work demonstrates, the wizard behind the curtain may not be human at all. The following work sources its data by soliciting user inputs and compiling them on a single Twitter account with surprising coherence.
Team
Tomás Vega
⇜
AliviaRÁ
March 2016
{
Arduino
Eagle
iOS
Python
}
Won first place at UC Berkeley’s Hack for Humanity. We developed a smart rehabilitation system that helps people with arthritis improve their joint flexibility and reduce pain.
An iOS app runs the patient through a series of exercises. The glove measures flexion at each finger and connects to the app via Bluetooth. The app provides real-time visual and haptic feedback on performance, letting the patient know whether they’re doing the exercise correctly.
By tapping a button in the app, the patient can tag when pain is experienced. At the end of each exercise session, data is relayed to a server in the cloud, which then performs data analytics. Optimal exercises that target the patient’s specific positions of pain are computed and suggested to the user.
Moreover, regular progress reports are automatically generated and sent to the patient and their doctor/physical therapist. This includes information on finger flexibility improvement, recurring positions of pain and suggested exercises.
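A minimal sketch of the per-exercise feedback check, not the app's code. The target angles, tolerance, and exercise names are hypothetical; the per-finger flexion readings stand in for what the glove streams over Bluetooth.

# Illustrative sketch of the AliviaRÁ exercise feedback.
EXERCISE_TARGETS = {            # degrees of flexion per finger (hypothetical)
    "fist_close": [80, 85, 85, 80, 60],
    "finger_spread": [10, 10, 10, 10, 10],
}
TOLERANCE_DEG = 15

def check_exercise(exercise, flexion_deg):
    """Return per-finger booleans: is each finger within tolerance of the target?"""
    targets = EXERCISE_TARGETS[exercise]
    return [abs(measured - target) <= TOLERANCE_DEG
            for measured, target in zip(flexion_deg, targets)]

print(check_exercise("fist_close", [78, 60, 84, 81, 58]))
# -> [True, False, True, True, True]  (second finger is out of range)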
The whole system was developed in 40 hours.
Team
Ghassan Makhoul
•
Dino Digma
•
Alejandro Castillejo
•
Corten Singer
•
Tomás Vega
⇜
SynthSense
October - December 2016
{
Arduino
Eagle
iOS
Fusion 360
}
Enabling independence for the visually impaired: An augmented white-cane that provides both obstacle-avoidance and navigation assistance.
Obstacle Avoidance: We designed an Augmented White Cane (AWC) with ultrasonic range finders for detecting obstacles within a 2-meter range, as well as mounts for the sensors that are both direction- and height-adjustable. Feedback from these sensors delivers a haptic response on the handle when obstacles are detected at body or head level.
Navigation: To provide navigation instructions to our users, we developed an iOS app that determines when the user needs to turn and then transmits a command to the AWC’s strap to provide directional haptic feedback.
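A minimal sketch of the cane's feedback decisions, not the firmware or the iOS app. The channel names, the intensity scaling, and the turn-command format are placeholder assumptions; the 2-meter obstacle range comes from the description above.

# Illustrative sketch of SynthSense's two feedback paths.
OBSTACLE_RANGE_CM = 200   # obstacles are reported within a 2-meter range

def obstacle_feedback(body_level_cm, head_level_cm):
    """Pick handle haptic channels (and strengths 0..1) for body- or head-level obstacles."""
    channels = []
    if 0 < body_level_cm < OBSTACLE_RANGE_CM:
        channels.append(("HANDLE_BODY", 1.0 - body_level_cm / OBSTACLE_RANGE_CM))
    if 0 < head_level_cm < OBSTACLE_RANGE_CM:
        channels.append(("HANDLE_HEAD", 1.0 - head_level_cm / OBSTACLE_RANGE_CM))
    return channels

def navigation_feedback(turn_command):
    """Map a turn command from the iOS app to a strap-side haptic cue."""
    return {"LEFT": "STRAP_LEFT", "RIGHT": "STRAP_RIGHT"}.get(turn_command)

print(obstacle_feedback(120, 0))   # body-level obstacle at 1.2 m
print(navigation_feedback("LEFT"))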
Won first place at UC Berkeley’s 3D Printing Designathon. We modeled and created a Bluetooth-enabled prosthetic foot that helps users distribute their weight properly on their prosthetic leg, in order to prevent muscle atrophy, bone density loss and other health issues. The foot interfaced with an iOS app, giving real-time feedback to the user via vibrations telling them whether or not they were putting enough pressure on their prosthetic leg. The prototype was developed in 24 hours.
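A minimal sketch of the weight-distribution check, not the prototype's firmware. The target load share, tolerance, and feedback string are placeholder assumptions.

# Illustrative sketch of the prosthetic-foot load feedback.
TARGET_PROSTHETIC_SHARE = 0.40   # assumed healthy share of body weight
TOLERANCE = 0.10

def prosthetic_load_feedback(prosthetic_force_n, intact_force_n):
    """Vibrate when too little weight is placed on the prosthetic leg."""
    total = prosthetic_force_n + intact_force_n
    if total == 0:
        return None
    share = prosthetic_force_n / total
    if share < TARGET_PROSTHETIC_SHARE - TOLERANCE:
        return "VIBRATE: shift more weight onto the prosthetic leg"
    return None

print(prosthetic_load_feedback(150.0, 550.0))  # ~21% share -> vibrate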
Team
Harshil Goel
•
James Musk
•
Anthony Baiey
•
Tomás Vega
⇜
SmartAss
September 2015
{
Arduino
SolidWorks
iOS
}
Designed and implemented a pressure-sore prevention device for wheelchair users. The system collects data from force sensors, which is then transmitted via BLE to a mobile app. The app notifies the user when an area should be relieved of pressure and where that area is. You can imagine how valuable this information is to users who have no feeling below their waist. It is just as valuable to those who can feel but often forget to move due to busy work schedules or intense exercise routines.
A proof of concept for SmartAss v2.0 was also prototyped, in which the same device is actuated with linear motors to redistribute weight away from a pressure-prone area. The next step is to improve the resolution of pressure sensing by increasing the number of sensors in the design and integrating the actuation into a small form factor.
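A minimal sketch of the pressure-relief logic, not the makeathon code. The zone names, force threshold, and relief interval are placeholder assumptions.

# Illustrative sketch of the SmartAss notification decision.
import time

PRESSURE_THRESHOLD = 0.7      # normalized force reading considered "loaded"
RELIEF_INTERVAL_S = 15 * 60   # notify if a zone stays loaded this long

def zones_needing_relief(loaded_since, readings, now=None):
    """Track how long each seat zone has been loaded and return overdue zones.
    `readings` maps zone name -> normalized force; `loaded_since` is mutated."""
    now = time.time() if now is None else now
    overdue = []
    for zone, force in readings.items():
        if force >= PRESSURE_THRESHOLD:
            loaded_since.setdefault(zone, now)
            if now - loaded_since[zone] >= RELIEF_INTERVAL_S:
                overdue.append(zone)
        else:
            loaded_since.pop(zone, None)   # pressure relieved, reset the timer
    return overdue

state = {}
print(zones_needing_relief(state, {"left_ischial": 0.9, "right_ischial": 0.3}, now=0))
print(zones_needing_relief(state, {"left_ischial": 0.9, "right_ischial": 0.3}, now=1000))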
The whole system was built in 72 hours at Google’s Bay Area Makeathon, focused on assistive technology and the needs of people with disabilities. We won the Google.org Innovation Award and the TechShop Self-Manufacturing Award.
Team
Oscar Segovia
•
Yaskhu Madaan
•
Pierre Karashchuk
•
Shaun Giudici Bhargav
•
Hagit Alon
•
Jonathan Bank
•
Tomás Vega
⇜
my.Flow
April 2015 - Now
{
Arduino
Eagle
Fusion 360
}
The period is silent. A smart tampon that senses saturation and sends a notification to your phone via Bluetooth that it’s time to change.
Designed and developed circuit boards, 3D casings and sensors to make any tampon smart. Features include homebrew biocompatible stainless-steel probes, a magnetic connector to prevent the tampon from being pulled out of the vagina when the user pulls down her pants/underwear, and an RFduino (Bluetooth-enabled) shield using SMD components.
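A minimal sketch of the saturation check, not my.Flow's firmware. The resistance threshold and the notification string are placeholder assumptions; on the real device the RFduino pushes the alert to the phone over Bluetooth.

# Illustrative sketch of the saturation decision.
SATURATED_RESISTANCE_OHM = 50_000   # assumed: a wet tampon conducts better
                                    # between the stainless-steel probes

def is_saturated(probe_resistance_ohm):
    """A low resistance between the probes indicates a saturated tampon."""
    return probe_resistance_ohm <= SATURATED_RESISTANCE_OHM

def maybe_notify(probe_resistance_ohm):
    if is_saturated(probe_resistance_ohm):
        return "NOTIFY_PHONE: time to change"
    return None

print(maybe_notify(30_000))   # -> 'NOTIFY_PHONE: time to change'
print(maybe_notify(900_000))  # -> None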
⇜
Developed a booth for the Cognitive Technology exhibit at the Exploratorium. It enabled visitors to move a robotic arm left and right with their thoughts. Developed a co-adaptive motor-imagery brain-computer interface training system that provided one degree of freedom. EEG data was recorded using a home-brewed EEG headset with dry electrodes.
⇜
I was able to control a homebrew quadcopter with my mind. We used an OpenBCI board to obtain raw EEG data from six channels, placed 5 cm from each other and arranged in a 2x3 matrix over the motor cortex: two on the left, two on the right, and two over the longitudinal fissure.
Signal processing was applied to remove noise and unwanted features, subtract activity recorded from the center electrodes, and concentrate on frequencies between 14 and 20 Hz. Pierre developed a k-nearest neighbors classifier trained to distinguish baseline activity from motor imagery of me "thinking" about moving my left or right hand.
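A minimal sketch of that signal chain, not Pierre's code. The channel ordering (lateral channels at indices 0, 1, 4, 5 and midline channels at 2, 3), the sampling rate, the window length, and the synthetic training data are all assumptions; the 14-20 Hz band, the center-electrode subtraction, and the k-NN classifier come from the description above.

# Illustrative sketch of the EEG feature pipeline and classifier.
import numpy as np
from scipy.signal import butter, lfilter
from sklearn.neighbors import KNeighborsClassifier

FS = 250.0   # assumed OpenBCI sampling rate in Hz

def bandpass_14_20(eeg, fs=FS, order=4):
    """Keep the 14-20 Hz band the classifier looks at."""
    b, a = butter(order, [14.0 / (fs / 2), 20.0 / (fs / 2)], btype="band")
    return lfilter(b, a, eeg, axis=-1)

def features(window):
    """window: (6 channels, n samples). Subtract the two midline electrodes
    from the lateral ones, then use per-channel log band power as features."""
    lateral = window[[0, 1, 4, 5], :] - window[[2, 3], :].mean(axis=0)
    filtered = bandpass_14_20(lateral)
    return np.log(np.mean(filtered ** 2, axis=-1) + 1e-12)

# Train on labeled windows: 0 = rest, 1 = left-hand imagery, 2 = right-hand imagery.
rng = np.random.default_rng(0)
X = np.array([features(rng.normal(size=(6, 500))) for _ in range(60)])
y = np.repeat([0, 1, 2], 20)
clf = KNeighborsClassifier(n_neighbors=5).fit(X, y)
print(clf.predict([features(rng.normal(size=(6, 500)))]))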
The entire control of the quadcopter was built during 36 hours at CalHacks, and we won the Most Technically Challenging category and First Overall Hack!
⇜
Designed and built a web application for learning to read braille through haptics: tactile feedback that recreates the sense of touch using complex vibrations.
Characters are sent over serial to the Vibratto Board, which controls a 3x2 array of 6g coin vibrators. {Node.js}
Wrote an Arduino sketch to translate text into its haptic braille representation (a minimal sketch of this mapping appears at the end of this section).
Interfaces for learning and practicing.
The Learn interface uses visual and tactile information to ensure efficient learning of haptic braille (double encoding).
The Practice interface sends the haptic representation of a random character, and the user needs to select the correct choice to receive the next character.
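A minimal sketch of the text-to-haptic-braille mapping referenced above, not the project's Arduino sketch or Node.js server. Only a few letters are included, and the motor ordering (dots 1-3 in the left column, 4-6 in the right) is an assumption about how the 3x2 vibrator array was wired.

# Illustrative sketch of the braille encoding.
BRAILLE_DOTS = {        # standard braille dot numbers for a few letters
    "a": {1}, "b": {1, 2}, "c": {1, 4}, "d": {1, 4, 5}, "e": {1, 5},
}

def char_to_motor_pattern(ch):
    """Return six booleans, one per coin vibrator, in dot order 1..6."""
    dots = BRAILLE_DOTS.get(ch.lower(), set())
    return [dot in dots for dot in range(1, 7)]

def text_to_frames(text):
    """One motor frame per character, ready to be streamed over serial."""
    return [char_to_motor_pattern(ch) for ch in text]

for ch, frame in zip("bed", text_to_frames("bed")):
    print(ch, frame)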