• GamingChairModel@lemmy.world
    1 year ago

    Our heads are loaded with sensory capabilities beyond just the two eyes. Our proprioception, balance, and mental mapping allow us to move our heads around, take in visual data from almost any direction at a glance, and then internally model that three-dimensional space as the universe around us. Meanwhile, our ears handle direction finding for sounds and synthesize that information with our visual processing.

    On top of that, the tactile feedback of the steering wheel and the vibration of the car itself (felt by the body and heard by the ears) give us plenty of sensory information for understanding our speed, our acceleration, and the mechanical condition of the car. The squeal of tires, the screech of brakes, and the indicators on our dash are all part of the information we use to understand how we’re driving.

    Much of it is trained through experience. But the fact is, I can tell when I have a flat tire or when I’m hydroplaning even if I can’t see the tires. I can feel inclines or declines that affect my speed or lateral movement even when there aren’t easy visual indicators, like at night.

    • xavier666@lemm.ee
      1 year ago

      Just adding to your point: when F1 drivers were asked to play a racing sim, they couldn’t perform the way they do in real life. They said that no matter how good the sim is, it doesn’t provide the feedback of a real car.