Instructional Design + Learning Analytics
Those familiar with instructional design will recognize that the Dick and Carey model brings agile methodology to the process of designing learning products and experiences. Like the ADDIE model, the Dick and Carey model still focuses on identifying instructional goals and evaluating whether participants meet those goals, but with an iterative approach.
The Dick and Carey model is adaptive and flexible, and consists of nine parts, outlined below.
With the emergence of Learning Analytics (LA) and data mining, the 1970s Dick and Carey model has an opportunity to transform how enterprise learning is measured, conducted, and democratized. The D&C model is a natural fit with learning analytics since it operates as an interconnected system, lending itself to interoperable data analysis. D&C is inter-relational, meaning context, content, learning, and instruction are reliant on each other.
LA tools, tactics, and principles can be used in most instructional design models. This inter-relation allows for quicker adoption of analysis findings when pulling in LA, ultimately facilitating and informing how instruction is revised, how objectives are clarified, and how people learn in our environment.
Stages 1–2:
Early on, LA does not necessarily impact D&C. We still have to address the questions:
- What does one hope to achieve in a particular lesson?
- What is expected of learners at the end of a lesson? (e.g., the learner visualized the data they pulled from Twitter)
In stage 3, LA can be used to inform learning pathways based on survey data, demographic information, and platform-use habits. This data can help address questions like:
- What do our learners already know?
- How do they use our current self-service offerings?
- When do they use it?
- What do they look at next?
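As a minimal sketch of how platform-use habits can answer "what do they look at next?", the snippet below counts resource-to-resource transitions in an event log. The learner IDs, resource names, and log format are all illustrative assumptions, not a real schema:

```python
from collections import Counter, defaultdict

# Hypothetical platform event log: (learner_id, resource_viewed), in order.
events = [
    ("ana", "intro-video"), ("ana", "quiz-1"), ("ana", "case-study"),
    ("rob", "intro-video"), ("rob", "case-study"),
    ("mei", "intro-video"), ("mei", "quiz-1"),
]

# Rebuild each learner's viewing path, then count "what came next"
# for every resource across all learners.
by_learner = defaultdict(list)
for learner, resource in events:
    by_learner[learner].append(resource)

next_views = defaultdict(Counter)
for path in by_learner.values():
    for current, nxt in zip(path, path[1:]):
        next_views[current][nxt] += 1

# The most common follow-up to the intro video suggests a default pathway.
most_common = next_views["intro-video"].most_common(1)[0]
print(most_common)  # ('quiz-1', 2)
```

The same transition counts can seed a simple recommendation ("learners who viewed this typically went here next") before any heavier modeling is justified.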
During stage 4, learning objectives are clarified, informing our strategy and practice. We can better understand our learners:
- strengths & weaknesses
- what they hope to gain from this instruction
- applicable experiences that support learning
- who the champions and disengaged learners may be
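One hedged way to surface likely champions and disengaged learners is a simple weighted score over platform signals. The signals, weights, and thresholds below are illustrative assumptions to be tuned against your own data, not a validated model:

```python
def engagement_score(logins, posts, downloads):
    # Weights are illustrative assumptions; calibrate against real cohorts.
    return 1.0 * logins + 3.0 * posts + 2.0 * downloads

# Hypothetical per-learner activity counts.
learners = {
    "ana": engagement_score(logins=12, posts=5, downloads=4),
    "rob": engagement_score(logins=2, posts=0, downloads=1),
}

# Assumed cutoffs: flag high scorers as champions, low scorers for outreach.
champions = [name for name, score in learners.items() if score >= 20]
disengaged = [name for name, score in learners.items() if score < 5]
print(champions, disengaged)  # ['ana'] ['rob']
```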
When addressing learners directly or via the platform, it's critical to tie our metrics to the mission. Presenting learners with personalized data is a tactic to increase their understanding and intrinsic investment; it's important to emphasize growth toward a goal rather than a proficiency perspective.
For other learners, we can use the data to articulate how their path to learning may differ and how their strengths will show up versus their weaknesses, based on our research on other folks who share similar profiles. The platform and data become a supporting tool during conversations about learners and learning-objective achievement.
- Examples: formal video calls, profile dashboard walk-throughs, recommendation systems, and mentor-mentee programs (using a "Tinder"-like matching system) where people with similar demographics but different levels of expertise are matched, and who have consented to the program.
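A toy sketch of the consent-aware mentor-mentee matching idea: pair people with similar profiles but a meaningful expertise gap, only when both have opted in. The profile fields, the similarity rule (shared role and region), and the gap threshold are all assumptions for illustration:

```python
from itertools import combinations

# Hypothetical learner profiles; fields and values are illustrative.
profiles = [
    {"name": "ana", "role": "analyst", "region": "EMEA", "expertise": 4, "consented": True},
    {"name": "rob", "role": "analyst", "region": "EMEA", "expertise": 1, "consented": True},
    {"name": "mei", "role": "designer", "region": "APAC", "expertise": 5, "consented": False},
]

def is_match(a, b):
    # Similar profile: same role and region (an assumed similarity rule).
    similar = a["role"] == b["role"] and a["region"] == b["region"]
    # Different expertise: gap of at least 2 levels (assumed threshold).
    gap = abs(a["expertise"] - b["expertise"]) >= 2
    # Consent from both sides is non-negotiable.
    return similar and gap and a["consented"] and b["consented"]

pairs = [(a["name"], b["name"])
         for a, b in combinations(profiles, 2) if is_match(a, b)]
print(pairs)  # [('ana', 'rob')]
```

In a real program the similarity rule would come from the survey and demographic data gathered earlier, and matches would be offered as suggestions the learners can accept or decline, not assigned.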
Stages 5–8:
During the assessment phase, both performance data (a learner scored 85% on our pop quiz) and behavioral data (learner Rob posts in the discussion blogs and has downloaded course material) are analyzed.
Here we're looking for integration between the instructional content and assessment interest. We also get to assess and adjust our materials and assessments to best support learning.
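One way the performance and behavioral data might be joined into a single per-learner view is sketched below. The field names and the 80% threshold are assumptions; the point is that high behavioral engagement paired with a low score can flag content or assessment misalignment worth revisiting:

```python
# Hypothetical data: quiz scores and platform behavior per learner.
performance = {"ana": 85, "rob": 62}
behavior = {
    "ana": {"posts": 3, "downloads": 2},
    "rob": {"posts": 4, "downloads": 5},
}

# Join both sources into one assessment record per learner,
# defaulting to zero activity when no behavioral data exists.
assessment = {
    name: {"score": score, **behavior.get(name, {"posts": 0, "downloads": 0})}
    for name, score in performance.items()
}

# Engaged but low-scoring learners may signal a materials/assessment
# mismatch rather than a learner problem (thresholds are assumptions).
flagged = [n for n, r in assessment.items() if r["score"] < 80 and r["posts"] >= 3]
print(flagged)  # ['rob']
```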
After the instructional goals and scope are determined, the teaching strategy is formed, considering factors like feasibility and engagement:
- teacher-centered vs. student-centered
- distributed learning (remote) vs. in-person
- instructor-paced vs. student-paced
After determining the strategy and delivery method, instructional materials have to be chosen. Do we use a purely online modality with a distributed group setting?
At this stage, LA can show, at a more granular level, which materials and modalities have been popular in the past, and help determine which materials should be retired, updated, merged, etc. Materials include:
- podcasts and discussions on readings and case studies
- a 3-month project
- curated Q&A offerings (chatbot, with escalation to customer success)
- developing digital communities of practice
- inter-organizational knowledge exchanges, via virtual meetings or workshops, around our framework and their challenges
Through this process there should be formative assessment points, much like in agile methodology, that inform how each stage of the instructional design process is handled. This formative data is incorporated into the more summative analysis. In our final stage, we use D&C and the LA-driven metrics to gather summative data, giving us an opportunity to:
- Assess the effectiveness of our system:
- How many learners demonstrated growth according to our measures?
- How effective were we at categorizing learners, building future learning pathways, and measuring outcomes?
- Which materials, lessons, and modalities elicit more engagement?
- How did our instruction impact our clients' broader strategy?
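The growth and engagement questions above can be sketched with assumed pre/post scores per learner and engagement counts per material (all data illustrative). Note the growth-to-goal framing from earlier: we count learners who improved against their own baseline, not who cleared a fixed proficiency bar:

```python
# Hypothetical pre- and post-instruction scores per learner.
pre = {"ana": 55, "rob": 70, "mei": 60}
post = {"ana": 75, "rob": 72, "mei": 58}

# "How many learners demonstrated growth according to our measures?"
grew = sum(1 for name in pre if post[name] > pre[name])
growth_rate = grew / len(pre)

# "Which materials, lessons, and modalities elicit more engagement?"
# Assumed engagement event counts per material.
engagement = {"podcast": 40, "3-month project": 65, "Q&A chatbot": 25}
top_material = max(engagement, key=engagement.get)

print(grew, round(growth_rate, 2), top_material)  # 2 0.67 3-month project
```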