Perceptual enhancement of neural and behavioral responses due to combinations of multisensory stimuli is found in many animal species across different sensory modalities. By mimicking the multisensory integration of ocular-vestibular cues for enhanced spatial perception in macaques, a bioinspired motion-cognition nerve based on a flexible multisensory neuromorphic device is demonstrated. A fast, scalable, and solution-processed fabrication strategy is developed to prepare a nanoparticle-doped two-dimensional (2D) nanoflake thin film exhibiting superior electrostatic gating capability and charge-carrier mobility. The multi-input neuromorphic device fabricated using this thin film shows history-dependent plasticity, stable linear modulation, and spatiotemporal integration capability. These characteristics ensure parallel, efficient processing of bimodal motion signals that are encoded as spikes and assigned different perceptual weights. The motion-cognition function is realized by classifying motion types using the mean firing rates of the encoded spikes and the postsynaptic current of the device. Demonstrations of recognizing human activity types and drone flight modes reveal that the motion-cognition performance matches the biologically plausible principles of perceptual enhancement by multisensory integration. The system can potentially be applied in sensory robotics and smart wearables.
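
To make the described pipeline concrete, the following is a minimal conceptual sketch, not the authors' implementation: two motion channels are rate-coded into spike trains, combined with assumed perceptual weights into a leaky-integrator proxy for the device's postsynaptic current, and a toy rule classifies motion from the mean firing rate and current level. All function names, weights, thresholds, and class labels are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode_rate_spikes(signal, max_rate=100.0, dt=1e-3):
    """Rate-code a normalized signal (0..1) into a binary spike train.
    Poisson-like encoding is assumed here; the paper's encoder may differ."""
    p = np.clip(signal, 0.0, 1.0) * max_rate * dt
    return (rng.random(signal.shape) < p).astype(float)

def weighted_psc(spike_trains, weights, tau=0.05, dt=1e-3):
    """Sum perceptually weighted spike trains into a postsynaptic-current
    proxy using a simple exponential (leaky) integrator."""
    drive = sum(w * s for w, s in zip(weights, spike_trains))
    psc = np.zeros_like(drive)
    decay = np.exp(-dt / tau)
    for t in range(1, len(drive)):
        psc[t] = psc[t - 1] * decay + drive[t]
    return psc

def classify_motion(spike_trains, psc, rate_threshold=30.0, psc_threshold=5.0, dt=1e-3):
    """Toy classifier: mean firing rate plus mean PSC level pick a motion class.
    Thresholds and labels are placeholders, not values from the paper."""
    mean_rate = np.mean([s.mean() / dt for s in spike_trains])  # Hz
    if mean_rate > rate_threshold and psc.mean() > psc_threshold:
        return "vigorous motion"
    elif mean_rate > rate_threshold:
        return "moderate motion"
    return "static / slow motion"

# Example: bimodal inputs (e.g., visual and vestibular channels), 1 s at 1 kHz
t = np.linspace(0, 1, 1000)
visual = 0.5 + 0.4 * np.sin(2 * np.pi * 2 * t)       # hypothetical optical-flow feature
vestibular = 0.6 + 0.3 * np.sin(2 * np.pi * 2 * t)    # hypothetical acceleration feature

spikes = [encode_rate_spikes(visual), encode_rate_spikes(vestibular)]
psc = weighted_psc(spikes, weights=[0.6, 0.4])        # perceptual weights are assumptions
print(classify_motion(spikes, psc))
```

In the reported system this weighting and temporal integration are carried out physically by the multi-input neuromorphic device rather than in software; the sketch only illustrates the signal flow of spike encoding, weighted integration, and rate-based classification.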