Mobile Brain/Body Imaging Data Heading Computation

dc.contributor.author: Gramann, Klaus
dc.contributor.author: Hohlefeld, Friederike
dc.contributor.author: Gehrke, Lukas
dc.contributor.author: Klug, Marius
dc.date.accessioned: 2021-08-17T12:20:46Z
dc.date.available: 2021-08-17T12:20:46Z
dc.date.issued: 2020-08-31
dc.description.abstract: This dataset contains Mobile Brain/Body Imaging (MoBI) data from 20 healthy adult participants in a heading computation experiment. Participants performed a spatial orientation task in a sparse virtual environment (WorldViz Vizard, Santa Barbara, USA) consisting of an infinite floor granulated in green and black. The experiment was self-paced, and participants advanced it by starting and ending each trial with a button press using the index finger of the dominant hand. A trial started with the onset of a red pole, which participants had to face and align with. Once the button was pressed, the pole disappeared and was immediately replaced by a red sphere floating at eye level. The sphere automatically started to move around the participant along a circular trajectory at a fixed distance (30 m) with one of two velocity profiles. Participants were asked to rotate on the spot and follow the sphere, keeping it in the center of their visual field (outward rotation). The sphere stopped unpredictably at varying eccentricities between 30° and 150° and turned blue, indicating that participants had to rotate back to their initial heading (backward rotation). When participants had reproduced their estimated initial heading, they confirmed it with a button press, and the red pole reappeared for reorientation. To ensure that the floor could not be used as an external landmark during the trials, it was faded out, turned randomly, and faded back in after each outward and backward rotation.

The participants completed the experimental task twice: i) with a traditional desktop 2D setup (visual flow controlled through joystick movement; “joyR”), and ii) equipped with a MoBI setup (visual flow controlled through active physical rotation of the whole body; “physR”). The condition order was balanced across participants. To ensure the comparability of both rotation conditions, participants carried the full motion capture system at all times. In the joyR condition, participants stood in the dimly lit experimental hall in front of a standard TV monitor (1.5 m viewing distance, HD resolution, 60 Hz refresh rate, 40″ diagonal size) and were instructed to move as little as possible. They followed the sphere by tilting the joystick and were thus only able to use visual flow information to complete the task. In the physical rotation condition, participants were situated in a 3D virtual reality environment using a head-mounted display (HTC Vive; 2 × 1080 × 1200 resolution, 90 Hz refresh rate, 110° field of view). Participants’ movements were unconstrained: to follow the sphere, they physically rotated on the spot, enabling them to use motor and kinesthetic information (i.e., vestibular input and proprioception) in addition to visual flow. If participants diverged from the center position, as determined through motion capture of the head position, the task automatically halted, and participants were asked to regain the center position, indicated by a yellow floating sphere, before continuing. Each movement condition was preceded by a three-minute baseline recording, during which participants were instructed to stand still and look straight ahead.
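For illustration: the backward rotation above amounts to reproducing a stored heading, so any error analysis needs an angular difference that wraps correctly at 360°. A minimal Python sketch of such a wrap-safe signed difference (the error measure, sign convention, and function name are assumptions, not part of this record):

    def signed_heading_error(initial_deg, reproduced_deg):
        # Signed angular difference in degrees, wrapped to (-180, 180].
        diff = (reproduced_deg - initial_deg) % 360.0
        return diff - 360.0 if diff > 180.0 else diff

    # Example: initial heading 350 deg, reproduced heading 2 deg -> +12 deg overshoot
    print(signed_heading_error(350.0, 2.0))  # 12.0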
The starting condition (visual flow only or physical rotation) was also counterbalanced for participants with different reference frame proclivities, such that five egocentric, four allocentric, and two mixed-strategy participants started with the joyR condition, and four egocentric and five allocentric participants started with the physR condition. In each rotation condition, participants practiced the task in three learning trials with instructions presented on screen. Subsequently, the main experiment started, comprising 140 experimental trials per rotation condition. The experimental trials in each condition were randomized and split into five blocks of 28 trials each. Breaks were self-paced, and the next block was initiated with a button press. The sphere moved either clockwise or anticlockwise around the participant, following one of two velocity profiles (slow or fast, randomized). The eccentricities of the sphere’s end positions were clustered from -15° to +15° around the mean eccentric end positions of 45°, 90°, and 135° in steps of 3° (e.g., the 45° eccentricity cluster ranged from 30° to 60°, with 11 trials covering all eccentricities). In addition, eccentricities of 67° and 112° (2 x 8 trials) were used to achieve a near-continuous distribution of end positions for the outward rotation in both rotation directions; a sketch of this expansion follows the recording details below.

Mobile Brain/Body Imaging (MoBI) setup. To allow for a meaningful interpretation of the data modalities and to preserve their temporal context, the EEG data, motion capture data from different sources, and experiment event marker data were time-stamped, streamed, recorded, and synchronized using the Lab Streaming Layer.

Data Recordings: EEG. EEG data were recorded from 157 active electrodes at a sampling rate of 1000 Hz and band-pass filtered from 0.016 Hz to 500 Hz (BrainAmp Move System, Brain Products, Gilching, Germany). Using an elastic cap with an equidistant design (EASYCAP, Herrsching, Germany), 129 electrodes were placed on the scalp, and 28 electrodes were placed around the neck using a custom neckband (EASYCAP, Herrsching, Germany) to record neck muscle activity. Data were referenced to an electrode located closest to the standard position FCz. Impedances were kept below 10 kΩ for standard locations on the scalp and below 50 kΩ for the neckband. Electrode locations were digitized using an optical tracking system (Polaris Vicra, NDI, Waterloo, ON, Canada).

Data Recordings: Motion Capture. Two motion capture data sources were used. First, 19 red active light-emitting diodes (LEDs) were captured by 31 cameras of the Impulse X2 System (PhaseSpace Inc., San Leandro, CA, USA) at a sampling rate of 90 Hz. The LEDs were placed on the feet (2 x 4 LEDs), around the hips (5 LEDs), on the shoulders (4 LEDs), and on the HTC Vive (2 LEDs; to account for an offset in yaw angle between the PhaseSpace and HTC Vive tracking). Except for the two LEDs on the HTC Vive, the LEDs were grouped into rigid bodies for the feet, hips, and shoulders, enabling tracking with six degrees of freedom (x, y, and z position and roll, yaw, and pitch orientation) per body part. Second, head motion capture data (position and orientation) were acquired using the HTC Lighthouse tracking system at a 90 Hz sampling rate, since it was also used for the positional tracking of the virtual reality view.
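As referenced above, the clustered end positions expand to a concrete set of target eccentricities per rotation direction. A minimal Python sketch of that expansion (illustrative only; trial repetitions and randomization are omitted):

    import numpy as np

    # +/-15 deg around 45, 90, and 135 deg in 3 deg steps: 11 values per cluster
    # (e.g., 30, 33, ..., 60 for the 45 deg cluster), plus 67 and 112 deg.
    clusters = [np.arange(c - 15, c + 16, 3) for c in (45, 90, 135)]
    eccentricities = np.concatenate(clusters + [np.array([67, 112])])
    print(sorted(eccentricities))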
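Because all streams were recorded and synchronized via the Lab Streaming Layer, a recording can be inspected stream by stream once loaded. A minimal Python sketch using pyxdf, assuming the recordings are available as XDF files (the native Lab Streaming Layer container); the file name is hypothetical, and the actual layout inside raw_EEG_data.zip may differ:

    import pyxdf

    # Load all synchronized streams (EEG, motion capture, event markers)
    # from one recording; time stamps share a common clock.
    streams, header = pyxdf.load_xdf("sub-01_physR.xdf")  # hypothetical file name

    for s in streams:
        name = s["info"]["name"][0]
        stype = s["info"]["type"][0]
        print(name, stype, len(s["time_stamps"]), "samples")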
Because the main focus of the study concerned the head movement-related modulation of neural dynamics in the retrosplenial complex (RSC), only the head motion capture data streams were used for the analysis.
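Since only the head stream enters the analysis, the quantity of interest is head yaw over time. A minimal Python sketch of extracting yaw from head-orientation quaternions with SciPy; the orientation encoding, quaternion order, and axis convention are assumptions (a y-up frame is assumed, as is common for SteamVR/Unity-based tracking):

    import numpy as np
    from scipy.spatial.transform import Rotation as R

    def head_yaw_deg(quats_xyzw):
        # Yaw = rotation about the vertical (here: y) axis, in degrees,
        # from an (N, 4) array of quaternions in x, y, z, w order.
        return R.from_quat(quats_xyzw).as_euler("yxz", degrees=True)[:, 0]

    # Example: a pure 90 deg rotation about the vertical axis
    q = R.from_euler("y", 90, degrees=True).as_quat()
    print(head_yaw_deg(q[np.newaxis, :]))  # [90.]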
dc.description.sponsorship: DFG, 240600905, Mobiles Bildgebungssystemen
dc.identifier.uri: https://depositonce.tu-berlin.de/handle/11303/11607
dc.identifier.uri: http://dx.doi.org/10.14279/depositonce-10493
dc.language.iso: en
dc.relation.issupplementto: https://doi.org/10.1101/417972
dc.relation.issupplementto: https://doi.org/10.1111/ejn.14992
dc.rights.uri: https://creativecommons.org/licenses/by-nc/4.0/
dc.subject.ddc: 612 Human physiology
dc.subject.ddc: 500 Natural sciences and mathematics
dc.subject.other: mobile brain/body imaging
dc.subject.other: electroencephalography
dc.subject.other: motion capture
dc.subject.other: virtual reality
dc.title: Mobile Brain/Body Imaging Data Heading Computation
dc.type: Generic Research Data
tub.accessrights.dnb: free*
tub.affiliation: Fak. 5 Verkehrs- und Maschinensysteme::Inst. Psychologie und Arbeitswissenschaft::FG Biopsychologie und Neuroergonomie
tub.affiliation.faculty: Fak. 5 Verkehrs- und Maschinensysteme
tub.affiliation.group: FG Biopsychologie und Neuroergonomie
tub.affiliation.institute: Inst. Psychologie und Arbeitswissenschaft

Files

Original bundle
Name: raw_EEG_data.zip
Size: 27.8 GB
Format: ZIP archive
License bundle
Name: license.txt
Size: 2.71 KB
Format: Item-specific license agreed upon at submission