Transportation Simulation

Related Publication:

Sun, C., Qing, Z., Edara, P., Balakrishnan, B., & Hopfenblatt, J. (2017). Driving Simulator Study of J-Turn Acceleration–Deceleration Lane and U-Turn Spacing Configurations. Transportation Research Record, 2638(1), 26-34.

In collaboration with the Transportation Engineering Department (ZouSim lab), various studies were conducted in virtual driving simulators. My primary role was to develop virtual reality software for a variety of transportation research projects.

My duties included: collecting detailed specifications for driving parameters and study variables; leading undergraduate and first-year graduate students in developing specific components and accurately representing traffic control devices; leading groups, as well as independently developing environments, via 2D graphics, 3D modeling, and software programming; incorporating various hardware devices into the virtual simulations; and implementing data collection procedures so that driver behavior could be replayed and analyzed after each session.
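The data collection procedure described above can be illustrated with a minimal sketch. The field names, sampling rate, and CSV layout here are assumptions for illustration, not the format actually used in ZouSim: each simulation frame appends a timestamped vehicle state to a log, which can later be read back for replay and analysis of driver behavior.

```python
import csv
import io

def record_frame(writer, t, x, y, heading, speed):
    """Append one timestamped vehicle state sample to the log."""
    writer.writerow([f"{t:.3f}", f"{x:.2f}", f"{y:.2f}",
                     f"{heading:.1f}", f"{speed:.2f}"])

def replay(log_text):
    """Read a log back as a list of state dicts for playback or analysis."""
    reader = csv.reader(io.StringIO(log_text))
    frames = []
    for t, x, y, heading, speed in reader:
        frames.append({"t": float(t), "x": float(x), "y": float(y),
                       "heading": float(heading), "speed": float(speed)})
    return frames

# Record a short run (3 samples at 10 Hz), then replay it.
buf = io.StringIO()
writer = csv.writer(buf)
for i in range(3):
    record_frame(writer, t=i * 0.1, x=i * 1.5, y=0.0, heading=90.0, speed=15.0)

frames = replay(buf.getvalue())
```

In practice the same pattern extends to steering, throttle, and brake inputs, letting a recorded session drive the simulation deterministically during review.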

TRAINING SIMULATOR FOR WORK ZONE SAFETY INSPECTORS


Work zones account for a significant share of the fatal traffic accidents that occur each year. Work zone safety inspectors play a crucial role in ensuring that every work zone in the country meets the criteria established for the safety of drivers and workers, and they go through rigorous training that improves their observational skills in a variety of conditions. However, it is challenging to provide adequate training in real-world work zones: they are geographically and temporally dispersed, and it is difficult to demonstrate examples of errors on site. This virtual reality training simulation loads a variety of scenarios and provides an immersive environment for training inspectors on roadside safety standards for work zones.

DEVELOPMENT WORKFLOW

The road and work zone environments were modeled using Rhino 3D and 3D Studio Max, then imported into Unity 3D to develop the interactive training environment for use with the Oculus Rift.

Development: James Hopfenblatt and Daeyeol Chang


1. Mapping motion data to a rigged digital avatar
2. Using the Unity game engine, the Oculus SDK, and all assets to create the two scenarios
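The first step, mapping motion data onto a rigged avatar, can be sketched in outline. The joint names, bone names, and identity calibration offsets below are hypothetical placeholders, not the actual rig used in the project: each capture joint's rotation is looked up, composed with a per-bone calibration quaternion, and assigned to the corresponding avatar bone.

```python
def qmul(a, b):
    """Hamilton product of two quaternions in (w, x, y, z) order."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

# Capture joint -> avatar bone (hypothetical names for illustration).
BONE_MAP = {"RightHand": "hand_r", "Head": "head"}
# Per-bone calibration offsets; identity quaternion here.
OFFSETS = {"hand_r": (1.0, 0.0, 0.0, 0.0)}

def retarget(frame):
    """Map one frame of capture-joint rotations onto avatar bone names."""
    pose = {}
    for joint, rot in frame.items():
        bone = BONE_MAP.get(joint)
        if bone is None:
            continue  # skip joints the avatar rig does not use
        offset = OFFSETS.get(bone, (1.0, 0.0, 0.0, 0.0))
        pose[bone] = qmul(offset, rot)
    return pose

pose = retarget({"RightHand": (0.7071, 0.0, 0.7071, 0.0),
                 "Hips": (1.0, 0.0, 0.0, 0.0)})
```

In a Unity project this per-bone assignment would typically be done in C# by writing the resulting quaternions to each bone's Transform; the sketch above only shows the retargeting logic itself.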