Human Factors/Applied Cognition: Visual Attention and Search, Perception, Visuo-spatial Cognition, Augmented/ Virtual Reality
Rachel is a doctoral student in the Human Factors and Applied Cognition (HFAC) program, working with Dr. Matt Peterson in the Visual Attention and Cognition Lab. She received her B.A. in Psychological Science from California State University San Marcos in 2017 and her M.A. from GMU's HFAC program in 2019. As an undergraduate, she conducted research in a Visual Cognition and Tracking Laboratory that focused on the processing of objects in visual long-term memory.
Rachel’s research interests currently include visual attention and search, perception and object recognition, trust in automated systems, and perception of different constructs in VR. In 2018, she had the pleasure of interning at the U.S. Army CCDC C5ISR Center Night Vision and Electronic Sensors Directorate (NVESD), employed through KINEX Inc. (now with Planning Systems Inc.). In 2019, she was a behavioral science intern at The Johns Hopkins University Applied Physics Laboratory in the Integrated Adaptive Cyber Defense (IACD) program, where she received the Asymmetric Cyber Operations Branch Intern Positive Influence Award; her main project focused on applying the IACD Trust Framework to a cybersecurity use case on trust in automation. During this internship, she was also selected to collaborate with interns from other disciplines on creating a prototype of an internal navigation app that incorporated an indoor positioning system.
In her free time, she likes to paint, cook, grow herbs, play simulation and rhythm games, and teach her cat new tricks.
Graybeal, J. J., Nguyen, R. T. T., & Du Bosq, T. W. (2019). Simulating augmented reality spatial accuracy requirements for target acquisition tasks. Paper for the Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC).
Graybeal, J. J., Nguyen, R. T. T., & Du Bosq, T. (2019). Simulating human vehicle identification performance with infrared imagery and augmented reality assistance. Paper for the SPIE Defense and Security conference.
Graybeal, J. J., Nguyen, R. T. T., & Du Bosq, T. (2019). Using simulation to define augmented reality accuracy requirements for target acquisition with the LRAS3 sensor. Military Sensing Symposium.
Teaching Assistant, Lab Instructor
MA Psychology - Human Factors and Applied Cognition, George Mason University, Fairfax, VA, 2019
BA Psychological Science, California State University San Marcos, San Marcos, CA, 2017
Nguyen, R. T. T., & Peterson, M. S. (2019). Occlusion and object specific effects on visual search for complex objects. Poster presented at the 19th Annual Meeting of the Vision Sciences Society. St. Pete Beach, FL.
Nguyen, R. T. T. (2019). Applying trust framework in cyber security automation: To a use case and trust culture survey. Talk presented at the Johns Hopkins Applied Physics Lab. Laurel, MD.
Nguyen, R. T. T. et al. (2019). SABER: Specialized APL beacon-enable routing, Asymmetric Operations Sector Intern Project. Talk presented at the Johns Hopkins Applied Physics Lab. Laurel, MD.
Nguyen, R. T. T. (2019). Effects of occlusion on visual search are complex. Brown bag talk given at George Mason University.
Nguyen, R. T. T. (2018). Design for navigational performance under subtle and drastic augmented reality waypoint errors. Talk (out-briefing) presented at the U.S. Army Research, Development and Engineering Command Night Vision and Electronic Sensors Directorate. Fort Belvoir, VA.
Nguyen, R. T. T., & Williams, C. C. (2017). Thick and thin: Occluding elements’ contents and widths affect visual memory. Poster presented at the 58th Annual Meeting of the Psychonomic Society. Vancouver, British Columbia, Canada.
Nguyen, R. T. T., & Williams, C. C. (2017). Behind the picket fence: Visual representations of occluded objects in long term memory. Talk presented at the 24th Annual CSUSM Psychology Research Fair. San Marcos, CA.