![Research Assistant (Part-Time)](/assets/images/org/vcl.png)
Research Assistant (Part-Time)
September 2018 - May 2020 | Vancouver, B.C.

From September 2018 to May 2020 I worked sporadically as a research assistant for the Visual Cognition Lab, a vision science lab at the University of British Columbia. To investigate visual intelligence, the lab uses psychophysical methods grounded in the Weber-Fechner law and Stevens's power law to measure the relationship between an observer's perception and the physical visual stimulus. The hope is that such behavioral experimentation will yield findings about human attention and perception, which will in turn drive the design of more effective visual displays and visualization techniques. During my time with the team I used technologies like Node.js, C++, and Python across a variety of tasks and projects.
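For a rough feel of the two laws just mentioned, here is a minimal Python sketch. The constants (`k`, `a`, and the exponent) are illustrative example values, not numbers from the lab's experiments.

```python
import math

def weber_fechner(intensity, i0=1.0, k=1.0):
    """Perceived magnitude grows logarithmically with stimulus intensity,
    measured relative to a detection threshold i0."""
    return k * math.log(intensity / i0)

def stevens_power(intensity, a=1.0, exponent=0.67):
    """Perceived magnitude is a power function of intensity; exponents
    below 1 compress differences between stimuli."""
    return a * intensity ** exponent

# Under either law, doubling the physical intensity does not double the percept:
print(weber_fechner(2.0))   # ~0.693
print(stevens_power(2.0))   # ~1.59
```

Both functions map physical intensity to perceived magnitude; the experiments measure which mapping (and which parameters) best describes human observers.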
Project: Flowshow
Screenshot of the virtual optic flowfield (snowfield-like dots that fill the screen). In experiments, the flowfield is superimposed onto a driving simulation that fills the observer's entire peripheral vision.
During my first year with the lab, I worked mainly on Flowshow, a decade-long project (in its twilight year) investigating the direct effects of flowfields on an observer's perception of speed. In each experiment, participants sit before a driving simulator and complete a series of speed-estimation and counting tasks. I also served on the lab's tech team across a number of projects. My contributions included:
- Improved visual obstacle detection and introduced parameter logging in the flowfield driving simulator. The simulator, created by VCL alumnus Lewis Johnson, is written in C++; the added logging helped the team track down and fix bugs more efficiently.
- Crunched and analyzed participants' training and performance data from 2008-2018, using Python for automation alongside countless Excel spreadsheets. Would not do again.
- Helped develop the lab's website using HTML, CSS, and JavaScript.
- Hosted a series of bi-weekly Jupyter workshops for lab members.
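The data crunching above amounted to a lot of grouping and summarizing of per-trial records. A minimal sketch of that kind of Python automation follows; the record fields (`"participant"`, `"correct"`) are hypothetical, for illustration only.

```python
from collections import defaultdict

def mean_accuracy_by_participant(trials):
    """Group trial records by participant and compute each mean accuracy."""
    totals = defaultdict(lambda: [0, 0])  # participant -> [n_correct, n_trials]
    for t in trials:
        totals[t["participant"]][0] += int(t["correct"])
        totals[t["participant"]][1] += 1
    return {p: n_correct / n for p, (n_correct, n) in totals.items()}

# Hypothetical trial records of the shape exported from the simulator logs:
trials = [
    {"participant": "p01", "correct": True},
    {"participant": "p01", "correct": False},
    {"participant": "p02", "correct": True},
]
print(mean_accuracy_by_participant(trials))  # {'p01': 0.5, 'p02': 1.0}
```

The same group-then-summarize pattern scales from a handful of records to a decade of spreadsheets.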
Project: Visual Perception of Correlation
Menu page of VCLWebFramework, an active open-source visualization tool we built, designed to let anyone easily run their own correlation-related experiments in a web browser.
From January to May 2020, I moved to the lab's Correlation team, in part to produce my final thesis on the nature of correlation perception. The onus was on me to take the wealth of knowledge I had picked up over the previous year and lead my own investigation into graphical representations of datasets. Specifically, my study examined how the size of the dots in a scatterplot affects an observer's perception of correlation. I developed my experimental procedures with VCLWebFramework, an in-house open-source visualization tool (a web application) that I helped develop alongside other lab members using Node.js, D3.js, and jsPsych. Completing the research and submitting my thesis involved:
- Designing a series of experimental conditions to explore dot size effects in single-population scatterplots.
- Realizing each size condition in VCLWebFramework, developing unique timelines (series of trials) and features for each condition (blog post).
- Parsing and analyzing the 40 sets of participant data collected.
- Establishing a peer review process to onboard and train new contributors on an ongoing basis.
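The stimuli in experiments like these are scatterplots whose points have a controlled correlation. One standard way to generate such a sample is to mix independent Gaussian noise into one variable; the sketch below shows the idea in Python under that assumption (it is not the actual VCLWebFramework code, which is JavaScript).

```python
import math
import random

def correlated_points(n, r, seed=None):
    """Sample n (x, y) points whose expected Pearson correlation is r,
    by blending independent Gaussian noise into y."""
    rng = random.Random(seed)
    pts = []
    for _ in range(n):
        x = rng.gauss(0, 1)
        noise = rng.gauss(0, 1)
        y = r * x + math.sqrt(1 - r * r) * noise
        pts.append((x, y))
    return pts

def pearson(pts):
    """Sample Pearson correlation of a list of (x, y) pairs."""
    n = len(pts)
    mx = sum(x for x, _ in pts) / n
    my = sum(y for _, y in pts) / n
    cov = sum((x - mx) * (y - my) for x, y in pts)
    sx = math.sqrt(sum((x - mx) ** 2 for x, _ in pts))
    sy = math.sqrt(sum((y - my) ** 2 for _, y in pts))
    return cov / (sx * sy)

pts = correlated_points(5000, 0.8, seed=1)
print(round(pearson(pts), 2))  # close to 0.8
```

Each experimental condition then only varies how the points are drawn (e.g. dot size) while holding the underlying correlation constant.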
An example of a JND task, a method we use to assess a participant's precision in judging the linear correlation between two graphs.
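A JND (just-noticeable difference) is typically estimated with an adaptive staircase: the difference between the two stimuli shrinks when the participant responds correctly and grows when they err, so it converges near the discrimination threshold. Below is a simplified one-up/one-down sketch of that generic textbook procedure, not the lab's exact implementation; the step size and floor are made-up values.

```python
def staircase_step(delta, correct, step=0.01, floor=0.001):
    """Update the correlation difference shown on the next trial."""
    if correct:
        return max(floor, delta - step)  # harder after a correct response
    return delta + step                  # easier after an error

# Simulated run against an observer who is reliable only above delta = 0.05:
delta = 0.20
history = []
for _ in range(30):
    correct = delta > 0.05   # stand-in for a real participant response
    delta = staircase_step(delta, correct)
    history.append(delta)
print(round(delta, 2))  # delta ends up hovering near the 0.05 threshold
```

The level at which the staircase oscillates is taken as the observer's JND for that condition.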
Overall, the process was a nice blend of different technologies for conducting behavioral research, from data collection to analysis. If this is something that interests you, see my blog post for a brief writeup on why and how we built VCLWebFramework!
About the UBC Visual Cognition Lab
The UBC Visual Cognition Lab is a vision science lab in the Psychology Department of the University of British Columbia. Dr. Ronald A. Rensink is the principal investigator. The VCL primarily investigates visual intelligence (the way in which the human visual system uses the light entering the eyes to create a variety of perceptual experiences) to help with the design of effective visual displays. There are currently five active research projects, each piloted by teams of undergraduate, graduate, and PhD students from the university.