Multimodal Data Tree

This tree is open for contributions. It was compiled for the paper: Di Mitri, D., Drachsler, H., & Specht, M. (2017). From signals to knowledge: A conceptual model for multimodal learning analytics. Journal of Computer Assisted Learning (in press).

1. Behaviour

1.1. Motoric

1.1.1. Body

1.1.1.1. Legs

1.1.1.1.1. Movements

1.1.1.2. Torso

1.1.1.2.1. Movements

1.1.1.3. Arms

1.1.1.3.1. EMG

1.1.1.4. Hands

1.1.1.4.1. Strokes

1.1.2. Head

1.1.2.1. Mouth

1.1.2.1.1. Voice

1.1.2.1.2. Prosody

1.1.2.2. Face

1.1.2.2.1. Traits

1.1.2.3. Eyes

1.1.2.3.1. EOG

1.2. Physiological

1.2.1. Heart

1.2.1.1. ECG/PPG

1.2.1.1.1. Heart Rate

1.2.1.1.2. Heart Rate Variability

1.2.2. Brain

1.2.2.1. EEG

1.2.2.1.1. Focus

1.2.2.1.2. Attention

1.2.3. Skin

1.2.3.1. GSR/EDA

1.2.3.1.1. Sweat

1.2.3.1.2. Temperature

1.2.4. Respiration

2. Context

2.1. Situational

2.1.1. Activity

2.1.2. Day

2.1.3. Hour

2.1.4. Month

2.2. Environmental

2.2.1. Weather

2.2.2. Location

2.2.3. Noise

2.2.4. Clutter

2.2.5. Architecture

2.2.6. Pollution

2.2.7. Season

2.2.8. Light
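
Because the tree is a plain hierarchy, it maps naturally onto a nested data structure. Below is a minimal sketch in Python of the taxonomy above, with a small helper that walks it; the dict/list layout and the `leaves` helper are illustrative choices, not part of the original paper.

```python
# A sketch of the Multimodal Data Tree as a nested Python dict.
# Inner dicts are branches; leaf lists hold the signals/attributes
# named in the outline above.
multimodal_data_tree = {
    "Behaviour": {
        "Motoric": {
            "Body": {
                "Legs": ["Movements"],
                "Torso": ["Movements"],
                "Arms": ["EMG"],
                "Hands": ["Strokes"],
            },
            "Head": {
                "Mouth": ["Voice", "Prosody"],
                "Face": ["Traits"],
                "Eyes": ["EOG"],
            },
        },
        "Physiological": {
            "Heart": {"ECG/PPG": ["Heart Rate", "Heart Rate Variability"]},
            "Brain": {"EEG": ["Focus", "Attention"]},
            "Skin": {"GSR/EDA": ["Sweat", "Temperature"]},
            "Respiration": [],
        },
    },
    "Context": {
        "Situational": ["Activity", "Day", "Hour", "Month"],
        "Environmental": [
            "Weather", "Location", "Noise", "Clutter",
            "Architecture", "Pollution", "Season", "Light",
        ],
    },
}

def leaves(node, path=()):
    """Yield (path, leaf) pairs, e.g.
    (('Behaviour', 'Physiological', 'Heart', 'ECG/PPG'), 'Heart Rate')."""
    if isinstance(node, dict):
        for key, child in node.items():
            yield from leaves(child, path + (key,))
    else:  # a list of leaf labels
        for leaf in node:
            yield path, leaf

for path, leaf in leaves(multimodal_data_tree):
    print(" > ".join(path), "->", leaf)
```

A flat traversal like this is handy when mapping each leaf to a concrete sensor stream or to a column in a multimodal dataset.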
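The ECG/PPG branch lists Heart Rate and Heart Rate Variability as derivable measures. As a minimal sketch, assuming RR intervals (in milliseconds) have already been extracted from the raw ECG/PPG signal, both can be computed as below; RMSSD is one common time-domain HRV metric, and the function name and sample values are hypothetical.

```python
import math

def heart_rate_and_rmssd(rr_intervals_ms):
    """Derive mean heart rate (bpm) and RMSSD (ms), a common
    time-domain heart-rate-variability measure, from RR intervals."""
    mean_rr = sum(rr_intervals_ms) / len(rr_intervals_ms)
    heart_rate = 60_000 / mean_rr  # 60,000 ms per minute / mean beat interval
    # RMSSD: root mean square of successive differences between beats
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return heart_rate, rmssd

# Hypothetical RR intervals (ms), as a wearable ECG/PPG sensor might report.
rr = [812, 790, 845, 801, 776, 820]
hr, rmssd = heart_rate_and_rmssd(rr)
print(f"Heart rate: {hr:.1f} bpm, RMSSD: {rmssd:.1f} ms")
```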