DBR (Design-Based Research)

1. Why

1.1. Limitations of experimental design

1.1.1. Isolated variables (Barab 2014)

1.1.1.1. Lead to incomplete theories

1.1.2. Artificial contexts (Barab 2014)

1.1.3. Narrow measures of learning (Collins 2004)

1.2. Practice impact

1.2.1. "move beyond simply understanding the world as it is, and involves working to change it" (Barab, 2014)

1.3. Exposes mechanisms (Barab, 2014)

1.4. Articulates conditions (vs. controlling variables)

1.4.1. Offers "storied truths" (Gee, 2013)

1.4.1.1. Beyond descriptive accounts

1.4.1.2. Others get process insights used to inform their work

1.4.1.3. Researcher considers how their process interacted with their warrants

1.4.2. Design narratives

1.4.3. Offers "petite generalizations" (Stake, 1995)

1.4.3.1. Provides insights into challenges and opportunities, as well as strategies for overcoming them

1.5. Real world conditions

1.5.1. "so called confounding variables necessarily occur and must be taken into account (not controlled) if the findings are to be relevant to practitioners" (Barab, 2014, p. 153)

1.5.2. Context matters. The effectiveness of a design (i.e., an implementation) in one setting does not guarantee effectiveness in another setting (Collins et al., 2004)

1.6. Transformative learning of researchers and teachers (Yael, ISLS)

1.6.1. Boundary Crossing (Yael) between researchers and teachers

1.6.2. Combining analytical and creative mindsets

2. What is it

2.1. Goals of DBR

2.1.1. Develop new theories, artifacts, and practices that can be generalized to others (Barab, 2014)

2.1.2. Dual goals of refining both theory and practice (Collins et al., 2004)

2.2. Assumptions

2.2.1. In complex learning environments it is difficult to test causal impact of variables using experimental design (Barab, 2014)

2.2.2. The "factoring assumption" is not valid in learning environments: cognitive processes cannot be analyzed apart from their context because the two are mutually determined (Brown, Collins, & Duguid, 1989; Greeno; Barab, 2014)

2.2.3. Test interventions early and frequently (M&R, p. 135)

2.3. Definitions

2.3.1. Barab 2014

2.3.1.1. Collection of approaches

2.3.1.2. Study of learning environments designed and systematically changed by the researcher

2.3.1.3. Naturalistic settings

2.3.2. Collins et al 2004

2.3.2.1. "a way to carry out formative research to test and refine educational designs based on theoretical principles derived from prior research" (p. 18)

2.4. Characteristics (Collins et al. 2004)

2.4.1. multiple dependent variables

2.4.2. flexible design revision

2.4.3. social interaction

2.4.4. developing a profile

2.4.5. characterizing the situation

2.4.6. messy situations

2.4.7. co-participant design and analysis

3. Methodology

3.1. Models

3.1.1. McKenney and Reeves (2014)

3.1.1.1. Analysis

3.1.1.2. Exploration

3.1.1.3. Design

3.1.1.4. Construction

3.1.1.5. Evaluation

3.1.1.6. Reflection

3.1.2. Easterday 2014

3.1.2.1. Focus

3.1.2.2. Understand

3.1.2.3. Define

3.1.2.4. Conceive

3.1.2.5. Build

3.1.2.6. Test

3.1.3. Cobb 2004

3.2. Procedure

3.2.1. Pre-Build/Construction

3.2.1.1. Focus

3.2.1.1.1. Craft a Design Procedure (Edelson 2009)

3.2.1.2. Understand

3.2.1.2.1. Analysis (M&R)

3.2.1.2.2. Explore (M&R)

3.2.1.3. Define

3.2.1.3.1. Write a problem definition (Edelson 2009)

3.2.1.3.2. Only then do we ask "How might we design this thing?" Easterday 2014

3.2.1.4. Design / Conceive

3.2.1.4.1. Design solution (Edelson 2009)

3.2.1.4.2. Theoretical design argument (Easterday 2014)

3.2.2. Construction/Build

3.2.3. Testing/Evaluation

3.2.3.1. Reflection

3.2.3.1.1. Retrospective consideration of findings/observations (M&R, p. 133)

3.2.3.1.2. Organic

3.2.3.1.3. Structured

3.2.3.1.4. Products

3.2.3.2. Testing

3.2.3.2.1. M&R's 9 steps

3.2.3.3. DBR mindset

3.2.3.3.1. "design researchers know where they want to go and have faith that the research process will get them there, even though they do not always know how the journey will play out" (M&R, p. 135)

3.2.3.3.2. "Intervention development is best served by early and frequent evaluation and reflection" (M&R, p. 135)

4. Exemplars

4.1. Origin

4.1.1. Brown (1992); Collins (1992): Fostering a Community of Learners (FCL)

4.1.1.1. Problem:

4.1.1.2. Context: elementary schools; biology and ecology

4.1.1.3. Theory: Used community of practice theory, discourse, metacognition, distributed expertise, etc.; developed a community-of-learners theory, including the notions of "diverse expertise" and a "developmental corridor"

4.1.1.4. Methods: The first cycle ran experiments on reciprocal teaching of reading; finding that conversations were not deep, they created FCL. In round 2 they wanted fewer misconceptions, so they added more lessons and activities. In round 3 they integrated the curriculum across grades.

4.2. Joseph's Compassion Curriculum

4.2.1. Theory: Used cognitive apprenticeship and goal-based scenarios; developed a broader theory of student engagement

4.2.2. Design: Students arranged in classrooms based on their interests; active engagement; diverse expertise; films produced as artifacts

4.2.3. Methods: field notes, videotapes, interviews

4.2.4. Revisions: In the 1st version, students weren't focused and there was no assessment of learning. In the 2nd, a badge system was added, but problems remained with student interest in activities and no expertise developed. In the 3rd, the design changed to an after-school program, with more explicit training of expert students to teach other students.

4.3. Danish BeeSign

5. Use of Theory

5.1. "Theories are judged not by their claims of truth, but by their ability to do work in this world (Dewey, 1938)" (Barab, 2014)

5.2. Levels of theory

5.2.1. Local theory

5.2.1.1. Tied to the specifics of the investigation; relatively "humble"

5.2.2. Middle range theory

5.2.2.1. Intermediate between the working hypotheses of day-to-day research and all-inclusive speculations

5.2.3. High range theory

5.2.3.1. Based on paradigms, sets of assumptions, epistemologies

5.2.4. Grand theory

5.2.4.1. Not testable (e.g., Piaget, Skinner)

5.3. How to test theories (Cobb 2011)

5.3.1. 1. Develop theory

5.3.2. 2. Derive principles for design from the theory

5.3.3. 3. Translate the principles into concrete designs

5.3.4. 4. Assess designs to test whether they work as anticipated

5.4. Edelson 2009

5.4.1. Domain theories

5.4.1.1. Descriptive

5.4.1.2. Two types

5.4.1.2.1. Context theories

5.4.1.2.2. Outcomes theories

5.4.1.3. A generalization of the problem analysis

5.4.2. Design Frameworks

5.4.2.1. generalized design solution

5.4.2.2. Prescriptive

5.4.2.3. Characteristics the designed artifact must have to achieve a particular set of goals in a particular context

5.4.3. Design methodologies

5.4.3.1. Prescriptive of the design process

5.4.3.2. Process for achieving a class of designs, expertise required, and roles for individuals

5.5. van den Akker (1999, 2010)

5.5.1. Design principles

5.5.2. Substantive design principles

5.5.3. Procedural design principles

5.6. diSessa and Cobb

5.6.1. Ontological innovations

5.6.1.1. Invention of new scientific categories that generate, select, and assess design alternatives

6. Critiques/Limitations

6.1. Challenges (Collins, 2004)

6.1.1. Complexity of real-world environments and resistance to experimental control

6.1.2. Large amounts of data

6.1.2.1. Not enough time or resources to analyze much of the data collected

6.1.3. Comparing across designs

6.1.4. "enacted design is often quite different from what designers intended" (p. 17)

6.2. Representing data with a few cases (Brown 1992)

6.3. Hawthorne effect

6.3.1. People behave differently when they are observed

7. Reporting

7.1. Five report sections (Collins et al., 2004)

7.1.1. Goals and elements of design

7.1.2. Settings where implemented

7.1.3. Description of each phase

7.1.4. Outcomes found

7.1.5. Lessons learned