Multimodal Data Tree

This tree is open for contributions. It was compiled for the paper: Di Mitri, D., Drachsler, H., & Specht, M. (2017). From signals to knowledge: A conceptual model for multimodal learning analytics. Journal of Computer Assisted Learning (in press).


1. Behaviour

1.1. Motoric

1.1.1. Body: Legs (Movements), Torso (Movements), Arms (EMG), Hands (Strokes)

1.1.2. Head: Mouth (Voice, Prosody), Face (Traits), Eyes (EOG)

1.2. Physiological

1.2.1. Heart: ECG/PPG, Heart Rate, Heart Rate Variability

1.2.2. Brain: EEG, Focus, Attention

1.2.3. Skin: GSR/EDA, Sweat, Temperature

1.2.4. Respiration

2. Context

2.1. Situational

2.1.1. Activity

2.1.2. Day

2.1.3. Hour

2.1.4. Month

2.2. Environmental

2.2.1. Weather

2.2.2. Location

2.2.3. Noise

2.2.4. Clutter

2.2.5. Architecture

2.2.6. Pollution

2.2.7. Season

2.2.8. Light
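For programmatic use (e.g. tagging multimodal sensor streams with their place in this taxonomy), the tree above can be encoded as a nested mapping. The sketch below is one possible encoding, not part of the original paper; the constant name `MULTIMODAL_DATA_TREE` and the helper `leaf_paths` are hypothetical, and the nesting follows the outline above.

```python
# Multimodal Data Tree encoded as a nested dict.
# Dicts are inner nodes; lists hold leaf measures (an empty list
# means the key itself is a leaf, e.g. "Respiration").
MULTIMODAL_DATA_TREE = {
    "Behaviour": {
        "Motoric": {
            "Body": {"Legs": ["Movements"], "Torso": ["Movements"],
                     "Arms": ["EMG"], "Hands": ["Strokes"]},
            "Head": {"Mouth": ["Voice", "Prosody"], "Face": ["Traits"],
                     "Eyes": ["EOG"]},
        },
        "Physiological": {
            "Heart": ["ECG/PPG", "Heart Rate", "Heart Rate Variability"],
            "Brain": ["EEG", "Focus", "Attention"],
            "Skin": ["GSR/EDA", "Sweat", "Temperature"],
            "Respiration": [],
        },
    },
    "Context": {
        "Situational": {"Activity": [], "Day": [], "Hour": [], "Month": []},
        "Environmental": {"Weather": [], "Location": [], "Noise": [],
                          "Clutter": [], "Architecture": [], "Pollution": [],
                          "Season": [], "Light": []},
    },
}

def leaf_paths(node, prefix=()):
    """Yield every root-to-leaf path in the tree as a tuple of labels."""
    if isinstance(node, dict):
        for key, child in node.items():
            yield from leaf_paths(child, prefix + (key,))
    elif node:  # non-empty list of leaf measures
        for leaf in node:
            yield prefix + (leaf,)
    else:  # empty list: the node name itself is the leaf
        yield prefix
```

For example, `("Behaviour", "Physiological", "Heart", "ECG/PPG")` is one of the paths yielded, which could serve directly as a hierarchical label for an ECG data channel.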