Embodied and Embedded Theory in Practice: The Student-Owned Learning-Engagement (SOLE) Model

February – 2011

Simon Atkinson
London School of Economics and Political Science, UK

Abstract

The demands on academic staff in all sectors to adopt best ODL practices to create effective and efficient models of learning in the face of increasing external pressures show no signs of abating. The massification of higher education, diversified access, and pressures to meet institutional visions and research objectives demand of teaching staff an increasingly public design process subject to peer review in numerous forms. Expectations of systematized pedagogical planners and embedded templates of learning within the institutional virtual learning environments (VLEs) have, so far, failed to deliver the institutional efficiencies anticipated. In response, a new model of learning design is proposed with a practical, accessible, and freely available “toolkit” that embodies and embeds pedagogical theories and practices. The student-owned learning-engagement (SOLE) model aims to support professional development within practice, constructive alignment, and holistic visualisations, as well as enable the sharing of learning design processes with the learners themselves.

Keywords: Learning design; constructive alignment; pedagogical planners; toolkit

Why a Learning Design Model?

The rich traditions of ODL, particularly those in distance and more recently online provision, are being drawn upon by an increasing number of institutions as they engage with some form of online support for learning. Academic staff are increasingly encouraged, or coerced, to move towards blended modes of delivery, in which the same instance of a course is delivered to several cohorts simultaneously in both contact and distance modes, or towards wholly online delivery. Staff unfamiliar with the intricacies and complexities of ODL provision find themselves, often for the first time, in an environment where the learning design process is more transparent, no longer conducted behind closed doors, and often more complex, as the contextual parameters prove new and unpredictable.

The conceptual work presented here, the SOLE model and its associated toolkit, is based on ten years of heuristic development in an academic professional context in explicitly ODL institutions (Open University, UK), majority campus-based provision (University of Hull, UK) and emerging blended institutional models (Massey University, NZ). The aim of the SOLE work has been to make pedagogical theory accessible to staff, to support their ability to visualise novel and effective ways for learners to engage remotely, and to avoid the danger of having staff develop materials in addition to those they have traditionally delivered in a face-to-face context, resulting in workload management difficulties for staff and students. The resulting SOLE model is grounded in professional practice and has produced a practical toolkit currently being implemented and evaluated. The aim of the toolkit is to support staff in designing learning and to share ownership of learning with the learner.

It is the author’s intention, as an educational developer, to support both individual and collective professional development processes. In order to do so, one must recognise that place, time, and motivation will be significant factors for academic staff. As Knight, Tait, and Yorke suggest,

How do we make workplaces evoke learning? Firstly, spaces need to be found for this activity, for the creation of shared meaning. Secondly, power relationships within activity systems need to encourage collegiality and participation. Thirdly, appropriate procedures and practices are needed; in higher education this is often represented by the capricious notion of reflection. (2006, p. 332)

It is suggested that professional development must be situated within practice to be truly effective, must fit within the practical activity of staff in designing support for their practice, must encourage collegial participation, and must establish opportunities for negotiated and shared procedures.

The response to the challenge of the massification of higher education has, in part, been to raise the profile of scholarship of teaching and learning (SoTL) as a recognisable and reward-worthy activity for academic staff. Beyond specialism within a knowledge domain, the ability to support learning in that domain becomes a pursuit in its own right. Shulman’s oft-quoted suggestion that “we develop a scholarship of teaching when our work as teachers becomes public, peer-reviewed and critiqued [and] exchanged with members of our professional communities so they, in turn, can build on our work” (Shulman, 2000, p. 50), still suggests the primacy of fellow teaching academics as peers, rather than as the majority of those actively engaged in the learning experience, namely the learners. Nonetheless, Shulman invites us to consider the dynamic nature of the teaching experience, suggesting, “We can treat our courses and classrooms as laboratories or field sites in the best sense of the term, and can contribute through our scholarship to the improvement and understanding of learning and teaching in our field” (Shulman, 2000, p. 50). We might also consider the advantages of engaging the learners themselves in this scholarship of teaching and learning process, making the learning design process and implementation a transparent and reflective experience for the teacher and learner.

Supportive Models

Models, frameworks, and toolkits serve to support staff in a myriad of ways. Much of the work behind the development of learning design models has sought to support course developers, teaching academics, and instructional designers alike in producing well-structured, balanced, and effective learning opportunities for students. A model serves to instil in the design process agreed parameters for a course of study such as the total student workload, shape of assessment strategy, and range, or nature, of learning activities. Recent work in the United Kingdom has invested significantly in the creation of pedagogical planners such as the work derived from JISC Design for Learning (JISC, 2006) and the successor to the Phoebe and London Pedagogical Planner projects, the Learning Design Support Environment (LDSE, 2010). These initiatives have produced largely web-based applications to assist staff in creating and structuring learning activities, which are then shareable and reusable. Much of this work has emphasised the cost-effectiveness and efficacy of technology-enhanced learning in meeting the increasing demands on higher education for expanding and widening participation with static or diminishing financial resources. Meeting this challenge will require innovative approaches to teaching and learning, to the use of technology enhanced learning, and to the means that ensure that institutions are able to support staff in their practice (Laurillard & Masterman, 2010).

Despite the promise of the learning objects movement, the availability of open educational resources (OERs) that have flourished in recent years, and the commercial responsiveness of many publishers, the bulk of development still rests with individual academic staff working in their own contexts. Whilst there is undoubtedly value in the sharing of templates of activity and patterns of learning,

Learning design is a complex activity that is influenced by a wide range of factors such as: the prior experience and background of the designer (or design team); the nature of the target group for which the learning product is being designed; the designer’s understanding of cognition, pedagogy and epistemology; and, of course, various technological factors relating to the use of media and the properties they possess. (Barker, 2008, p. 128)

Funding agencies are often attracted to the notion of a definable product, but staff themselves, and their institutional support structures, have a very limited capacity to engage meaningfully with new tools. Academic staff frequently cite a lack of time as well as a lack of fundamental support as reasons why they do not use the myriad of tools provided.

Laurillard makes it clear that technology is potentially a solution to much of the inconsistency in the quality of learning design, but also represents part of the problem facing higher education (Laurillard, 2008). Its constant evolution and disruptive impact require a level of institutional preparedness that few have been able to live up to. There is much that is exciting about the recent development of individual learning design tools and pedagogical planners, but the emphasis remains on tools to retain levels of academic control; effective use of technology is represented as “essential if the academic community is to both maintain control of the new pedagogies, and find the most creative and effective ways of exploiting what the technology offers” (Laurillard, 2008, p. 149).

Learning Engagement: Constructive Alignment

The United States National Survey of Student Engagement (NSSE) defined student engagement as “the time and energy students devote to educationally sound activities inside and outside the classroom, and the policies and practices that institutions use to induce students to take part in these activities” (NSSE, 2007, p. 3).

This definition is significant in taking into account the world outside the classroom as one that academic staff would be wise to account for in their models of learning engagement. When this is considered alongside Professor John Biggs’ seminal work on the constructive alignment of learning, a powerful notion of the whole life-cycle of learning emerges. Biggs takes Bloom’s taxonomy of educational objectives (Bloom, 1964) and constructs a neat model of integrated and interdependent processes for curriculum designers that seek to align learning outcomes and assessment, as well as associated learning activity (Biggs & Collis, 1982). Biggs argues that, having decided on well-articulated verbs for the learning outcome in question (for example, “learners will be able to evaluate the underlying social prejudices influencing media criticality in contemporary news programming”), associated assessment might be expected to enable students to demonstrate they have met that outcome (“learners will be assessed on their ability to evaluate the underlying social prejudices influencing media criticality in contemporary news programming”). As a consequence, the teaching method, which Biggs clearly prefers to call teaching & learning activity (TLA), will also activate that verb, so one could expect to see the teaching and learning itself evaluating, examining social prejudices, exploring notions of influence, unpacking the notion of “media criticality,” and establishing a context for news programming (Figure 1). If the teaching and learning activities do not activate the verb, one cannot be sure learners will experience that which they are expected to evidence to demonstrate attainment of the learning outcomes.

Figure 1

Also significant in any consideration of a learning design model is the ability to express a range of existing theoretical models and their evidenced practice. To this end, Laurillard’s conversational framework (Laurillard, 2002) is significant in that it illustrates very effectively the way in which emerging 20th-century learning theories and approaches have subsumed, rather than crudely displaced, each preceding one. Socio-cultural learning does not displace notions of social constructivism, but simply absorbs and extends its reach. The NSSE definition cited above represents a culmination of that subsuming approach to theory.

Why a Student-Owned Learning-Engagement Model?

SOLE stands for student-owned learning-engagement. All these terms are contested within education, but the emphasis is clearly on students being conscious of the learning design and learning processes and on the desire to optimise appropriate and effective learning engagement opportunities.

The SOLE model’s (Atkinson, 2010) original development goals were threefold:

  1. to embed pedagogical guidance regarding constructive alignment (Biggs & Tang, 2007) inside a learning design tool easily accessible to staff;
  2. to produce a practical model that captured the lessons to be learnt from Laurillard’s representations of conversational learning processes (Laurillard, 2002);
  3. to enable the development of a practical toolkit which would make patterns of learning design shareable and transparent to students and colleagues (Conole & Fill, 2005).

The SOLE model was borne out of a desire to make the learning design process transparent to students, to encourage staff to share patterns of learning with each other, and to provide a basis for self-evaluation and development of specific learning designs.

It is no coincidence that the SOLE model places the intended learning outcomes (ILOs) at the centre. In each constructively aligned course or unit of learning, the resulting pattern of activity will be different because the learning outcomes, the assessment designed to elicit evidence of attainment, and the patterns of teaching required to support that process will each be different. The SOLE model is, therefore, explicitly a model, not a template. The model can, and should, be adapted by staff to suit the particular approach to learning required by their students in any given context. The resulting pedagogical patterns should reflect the nature of their discipline, students’ existing context, and the specific teaching environment.

The model seeks not to restrict, but rather to illuminate, the practices of staff, and so encourage effective practices. The model is not concerned with the design of specific learning activities, although it provides references to effective resources; rather, the model advocates, as appropriate, a balance between the different modes of student engagement. The model is not prescriptive, and its associated toolkit is therefore open and flexible. It is possible for course design or teaching teams to change and modify any aspect of the toolkit, a simple spreadsheet, to suit their needs. The priority, however, is to provide staff with a model of effective practice so that one might be justifiably concerned about the quality of the student learning experience if the toolkit illustrated a consistently unbalanced approach. As Dick et al. suggest, “Instructional design models are based, in part, on many years of research on the learning process. Each component of the model is based on theory and, in most instances, on research that demonstrates the effectiveness of that component” (2004, p. 14). An imbalance in the elements of the model requires attention.

The SOLE model is, then, a visual representation of the different modes of learning engagement that one might be expected to promote for a holistic learning experience. The model provides a conceptual map of learning engagement aligned to learning outcomes and assessment. The associated toolkit produces a visual representation of these elements of learning engagement for diagnostic, developmental, descriptive, and evaluative purposes.

At the heart of each unit of learning is the graduate profile wrapped in the articulated programme outcomes, and subsequent course outcomes – all of which should be able to demonstrate some form of alignment.

The model illustrates nine elements of learning engagement. These are visually represented in a uniform way and reflect the underlying premise that a balanced approach to learning engagement is preferable. However, it is recognised that each instance of learning design will produce a different representation of the learning experience. The associated toolkit illustrates this notion of balance further.

It is also worth noting that students traditionally tend to focus on assessment: “Assessment defines what students regard as important, how they spend their time and how they come to see themselves as students and then as graduates” (Brown, Bull, & Pendlebury, 1997, p. 7). They therefore effectively work their way counter-clockwise through this model’s elements, starting from assessment. This is significant because the traditional staff design approach has often been content-focussed, beginning with learning materials and working clockwise through a comparable design process. Understanding this divergence matters for the means by which students engage with a conceptualisation of learning as presented to them (in a model or a toolkit). To secure learners’ engagement with the learning process itself, some degree of transparency is necessary, but so too is a transfer of ownership. It is certainly true that students demand evidence of value and that staff engagement is deemed evidence of value, or at least of institutional commitment to the learning process; however, we should be aware of the evidence on surface and deep learning from Gibbs and others. This research suggests that relatively high contact hours, excessive course material, and lack of choice (amongst other factors) promote surface learning (Gibbs, 1992, p. 9). Conversely, the intrinsic motivation of wanting to know, learner activity, interaction, and well-structured content (related to actual experience and logically consistent within itself) encourage deeper learning (Gibbs, 1992, p. 11).

This anomaly, of students’ counter-clockwise conceptualisation versus staff’s clockwise process, is one of the things the model seeks to expose and mitigate. The transparency of the design process and the clear delineation of the learning experience as one supported by the academic but owned by the student are intended to promote deeper learning. The SOLE model attempts to capture the intrinsic motivation of assessment and encourage its effective use in the learning design by highlighting feedback as an identifiable category to which staff assign student time and commitment. Whilst the toolkit does not enforce a model of negotiated assessment, marking activity, or peer-designed rubrics, it does encourage, and support through annotation, a greater degree of consideration of this important aspect of students’ motivation (Rust, 2002, p. 153).

The identification of feedback as a distinct element is significant. Rather than subsuming feedback within assessment or reflection, the SOLE model aims to raise students’ metacognition by promoting self-measurement of achievement and articulating at each opportunity the drivers and constructive alignment of the learning experience.

Figure 2

Developing a model of engagement from both a teacher and a learner perspective is challenging. For each of the nine elements, the SOLE model provides a description (a) and, in the associated toolkit, pedagogic guidance (b) supported with references to resources and literature (not detailed here) for design purposes, as follows:

1. Feedback

a) Supportive guidance on quality and level of evidence being demonstrated in achievement of the learning outcomes.

b) Feedback could be self-generated, peer-generated, or teacher-focused. What opportunities exist for feedback within your given teaching context? Will students see you each week, for how long, and are class sizes such that feedback will necessarily be peer provision? Would learning sets or group strategies support more effective feedback? If you are teaching online, or supporting the learning online, is there an opportunity for personalised feedback?

2. Assessment

a) Both formative and summative assessment.

b) Assessment could be for the purposes of evaluating progress against achievement of the learning outcomes (formative) or for demonstrating that progress for evaluative and credit purposes (summative); what is the balance within your course? Have you provided opportunities for engagement with the marking rubrics? Have you built optionality or negotiated assessment possibilities into your course? Are there opportunities for students to relate assessment tasks to prior learning or to other pre-requisite courses? Does the assessment design give students anything to take away of practical benefit to their future learning career or life-work?

3. Reflection

a) Identified as a reflection-on-action to reflection-in-action process through the course life-cycle.

b) What opportunities exist to capture reflection on feedback and assessment? What artefacts might be stored for later consideration? What occasions exist to engage with the individual’s social context and with peers to evaluate the learning in progress?

4. Personal Context

a) The individual life context, which the learner occupies, is a source of real-world activity we can build on in our learning design.

b) Is the learner face-to-face or online? Are they working part-time or full-time, studying for a professional degree, trade or craft, or some life-work as yet ill-defined? Is this something that can be developed as a theme for personal reflection? What prior-learning, pre-requisites, or co-requisites might be drawn on in the learning design?

5. Social Context

a) The non-course context in which the learner lives is a source of real-world activity we can build on in our course design.

b) Is the cohort a homogeneous or heterogeneous group? What external social contexts can we reference in our learning design; are students working, and could those contexts be cited? Are there diversities in life contexts which afford opportunities to encourage contextual learning, and can learners be asked to share social differences? What learning might occur with non-peers, such as elders or siblings, or in social or leisure contexts?

6. Peer Moderation

a) The direct engagement with fellow students on the same learning cycle that can be reasonably directed.

b) What opportunities exist for in-class, or online, exchange of views, co-construction, and co-resolution? What opportunities for negotiation, sharing, joint inquiry, or critical-friends exist within the course? Is collaboration, critique, or inquiry an identified learning outcome? Are there reasons why group work would contribute to the ILO; are there skills to be learnt through particular forms of collaboration?

7. Tutor Facilitation

a) Time and activity allocated to asynchronous engagement involving the teacher.

b) What level of direct engagement with learner activity is required of you to support and progress student learning? What degree of online intervention is commensurate with your learning design; are students online and requiring your guidance? To what extent is your presence required and motivational? What periodic interventions might you make to contemporise the learning context, drawing on current literature or social contexts to make the learning real-world relevant?

8. Tutor Contact Time

a) Time and activity allocated for real-time synchronous engagement.

b) What balance of face-to-face, or virtual contact time, is appropriate throughout the course? Does institutional timetabling allow variance throughout the course; might you choose to engage to a greater extent at the outset of the learning process and again for summative purposes? If learning materials are supporting domain knowledge acquisition, what is the most effective use of your time?

9. Learning Materials

a) The materials provided, usually in advance, to support domain knowledge acquisition.

b) What material exists? Have you explored existing open educational resources (OER) that could be adapted to suit your learners’ needs? Would a single set-reading be a helpful reference point? What capacity for deep engagement with resources exists? Are seminal texts identified to students as such, and if not, are they truly necessary? What opportunities exist for learners to assist in developing and refining the creation of learning materials, for example in the joint creation of an online glossary or a shared annotated bibliography?

These nine elements of the SOLE model are designed to reflect a comprehensive consideration of the students’ learning experience which, if properly populated, would afford an effective balance of activity, learning ownership, and opportunities for higher-order thinking and deep learning.

From Model to Toolkit: Supporting Theory-In-Action

As the early iterations of the SOLE model were explored with academic colleagues, the original goals were revised in response to the demand to actualise the model in some meaningful way. The embodying of theory within a model became a quest to embed the theoretical principles within a practical manifestation of the model.

Revised development goals were therefore fourfold:

  1. to embody pedagogical guidance and learning theory within an accessible and transparent model shared by students and teachers;
  2. to embody best practices regarding constructive alignment (Biggs & Tang, 2007) inside a learning design model easily accessible to, and shared by, staff and students;
  3. to produce a practical model that captured the lessons to be learnt from Laurillard’s representations of conversational learning processes (Laurillard, 2002), whilst taking an inclusive approach to alternative conceptualisations of learning;
  4. to enable the development of a practical toolkit that would make patterns of learning design shareable and transparent to students and colleagues (Conole & Fill, 2005).

While the theoretical debate around conversational, or dialogic, learning and constructive alignment is of interest to many in the education disciplines, the model is intended for use across HE subject areas. A visual representation alone, however, is not sufficient to bring the model to life for most academic staff. The opportunity certainly exists for professional development engagements around a presentation of the model itself, and these have been successfully undertaken, but the intention has been to make the model as accessible as possible. As Conole suggests, “the development of toolkits provides a way for non-specialists to engage with such theories in a manner which supports careful design and prompts productive reflection and engagement” (Conole, Dyke, Oliver, & Seale, 2004, p. 18).

The model has a number of underlying theoretical constructs informing its design, but it is not intended to enforce a rigid pedagogical theoretical framework. A teacher may choose to continue to teach in exactly the same way s/he always has; the model simply illustrates that process to colleagues and, more importantly, to students. Indeed, as Conole and colleagues identify,

Toolkits are designed to facilitate the identification of implications or recommend suitable approaches based on the information and assumptions elicited from the user. They provide a structured guiding framework, whilst also enabling flexibility and local contextualisation. Therefore rather than the toolkit deciding on the best approach on behalf of the user, the practitioner uses these inferences to make informed, professional decisions about whether certain changes would be appropriate. (Conole et al., 2004, p. 22)

The model, without the associated toolkit, is in itself a team discussion tool, a course-based instrument for planning and development, and a means of visualising one’s practice and assumptions about that practice. The toolkit provides much the same opportunities but also allows the academic, and ultimately the student, to work within a learning design, diagnosing expected activity, adjusting the balance of engagement through the development process, describing (as an advanced organiser) what the learning might look like, and providing opportunities for ongoing evaluation. The first version of the toolkit, as an Excel 2007 spreadsheet, was shared with staff in a series of workshops in May and June 2010 at Massey University, New Zealand (see Figure 3).

Figure 3

The focus of the toolkit developed for the workshops was to support staff in student workload planning, seeking to make transparent the activities in which students were being encouraged to engage (Atkinson, 2010). Version 1.1 of the toolkit, released online in May 2010, had a number of features and benefits to academic staff and students:

  • an initial overview sheet contains summary data which need be entered only once (total hours, number of weeks, course descriptions, learning outcomes) and which is then populated across subsequent unit views;
  • a summary table on each unit view pulls data from the overview and displays calculations of student time in each engagement area;
  • the time allocations are summed and displayed clearly, including whether, in that unit, time is ahead or behind the norm or allocated amount;
  • an automatically generated pie chart provides quick visual information to a student to remind them that there is a balance of activity with which to engage.
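The summary arithmetic behind these features is simple enough to sketch outside the spreadsheet. The following Python fragment is illustrative only: the actual toolkit is an Excel workbook built on cell formulas, and the function name, element hours, and course figures below are invented. It sums a unit’s planned hours across the nine engagement elements, compares the total against a weekly norm, and derives the proportional shares that would feed the pie chart.

```python
# Illustrative sketch (not the actual Excel toolkit) of the SOLE
# toolkit's per-unit workload summary. All hours below are invented.

SOLE_ELEMENTS = [
    "Feedback", "Assessment", "Reflection", "Personal Context",
    "Social Context", "Peer Moderation", "Tutor Facilitation",
    "Tutor Contact Time", "Learning Materials",
]

def summarise_unit(hours_by_element, weekly_norm):
    """Sum planned student hours for one unit (week) and report
    whether the unit is ahead of or behind the allocated norm."""
    total = sum(hours_by_element.get(e, 0.0) for e in SOLE_ELEMENTS)
    balance = total - weekly_norm  # positive = over-allocated
    # Proportional shares feed the pie chart shown to students.
    shares = {e: hours_by_element.get(e, 0.0) / total
              for e in SOLE_ELEMENTS if hours_by_element.get(e, 0.0) > 0}
    return total, balance, shares

# Example: a 150-hour, 12-week course gives a 12.5-hour weekly norm.
week1 = {"Tutor Contact Time": 3, "Learning Materials": 5,
         "Peer Moderation": 2, "Reflection": 1.5}
total, balance, shares = summarise_unit(week1, weekly_norm=150 / 12)
print(f"{total} h planned; {balance:+.1f} h vs norm")
```

In the spreadsheet these calculations are cell formulas rather than functions, but the effect is the same: the designer (and the student) sees at a glance whether a unit over- or under-allocates student time, and how that time is distributed across the nine elements.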

It is hoped that staff will share their resulting patterns as models of pedagogical approaches. It is also anticipated that staff would, in many cases, leave the spreadsheets open for students to complete with actual details of activity and time recorded. In both cases, this offers the prospect of ongoing evaluation and development of learning designs through “shareable representations of beliefs and of practice” (Conole et al., 2004, p. 18). The intention is that the spreadsheet toolkit will produce a clear visual representation that is given to the student to form an advance organiser. A review of the completed spreadsheets would then act as a useful evaluation exercise, identifying activities that were particularly beneficial or making clear a learning designer’s unrealistic expectations.

Further iterations of the toolkit have followed, and in September 2010 version 1.2 was released on the Internet, expanding the toolset for staff and students. Alongside a revised explanatory worksheet detailing the nine elements of the model with extensive guidance (see Figure 4), together with questions and prompts towards effective practice, an option was included to allow students to record the actual time spent on a distributed version of the toolkit as a spreadsheet rather than as a PDF for printing. The pedagogical guidance embedded in the toolkit is intended to be layered: in column C of the spreadsheet, each cell with a detailed description of an element has an embedded comment associated with it; in column D, each guiding question to support reflection also has an embedded comment; and individual resources in column E can support institutionally contextualised guidance (as illustrated in Figure 4).

Staff are also provided with the opportunity to detail the assessment requirements and provide students with a repeated sense of the effective alignment of learning outcomes, assessment, and teaching and learning activities.

Figure 4

One of the reasons for adopting a familiar desktop spreadsheet application as the basis for the toolkit was to avoid any need for students to download additional software, and to present the toolkit as something adaptable, personalised, and shareable. For the majority of students the software will be familiar, and for those who have not engaged with spreadsheets before, it can be argued that doing so would itself prove beneficial. Spreadsheets, it is suggested, can still provide a rich and meaningful environment in which students can take ownership of information (Conole, de Laat, Dillon, & Darby, 2008).

Diagnostic, Descriptive, Developmental, and Evaluative

The SOLE toolkit has evolved to serve four particular functions in the full life-cycle learning design process. The toolkit has a diagnostic function in asking the academic to envisage a full course cycle, with planned or predicted learning engagement on the part of the learner. The academic is recognised as being constrained by institutional guidelines, and each instance will be different; however, in assigning a time allocation across the duration of the course, the academic as designer is constantly reminded of the holistic nature of the learner experience, core learning outcomes, and assessment within realistic time frames. The result is a descriptive visual representation of the learning experience (and expectations), one that moves beyond the immediate relationship between teacher and student. The toolkit might be shared with learners as a print-out so they have a visual representation of what is anticipated of them, holistically. Early evidence suggests this act alone has an impact on learners’ perceptions of their role and ownership of the learning process. The notion that they draw directly on personal context and use earlier feedback as an articulated learning activity is a novel concept for many. The toolkit is also developmental: the academic may choose to invite feedback from learners on their engagement with the model and adjust the balance of activities appropriately.

The learner might be given the toolkit in its spreadsheet form and can then interact with it, recording (in version 1.2) the actual balance of time they applied to each designated learning activity. In addition to the developmental feedback this makes available to teachers, it provides evidence for the final dimension of the model: evaluation. Evidence from students of the actual time spent and the degrees of engagement achieved will provide useful re-design opportunities for academic staff. In contexts where the development and maintenance of learning portfolios is appropriate, students might be asked to make their engagement with the toolkit an artefact for submission.
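The planned-versus-actual record at the heart of the developmental and evaluative functions can be sketched in code. The following fragment is a minimal illustration only: the engagement modes, hours, and function names are hypothetical, and the actual toolkit is a spreadsheet rather than a program.

```python
# Illustrative sketch (not the published SOLE toolkit): a week-by-week
# record of planned versus actual learner hours per engagement mode.
planned = {
    1: {"teacher-led": 4.0, "individual study": 2.0, "group work": 1.0},
    2: {"teacher-led": 3.0, "individual study": 3.0, "group work": 2.0},
    3: {"teacher-led": 1.0, "individual study": 4.0, "group work": 3.0},
}
actual = {
    1: {"teacher-led": 4.0, "individual study": 1.0, "group work": 1.0},
    2: {"teacher-led": 3.0, "individual study": 2.5, "group work": 2.0},
    3: {"teacher-led": 1.0, "individual study": 5.0, "group work": 3.0},
}

def weekly_totals(allocation):
    """Total hours per week across all engagement modes."""
    return {week: sum(modes.values()) for week, modes in allocation.items()}

def divergence(planned, actual):
    """Actual minus planned hours, per week and mode: the kind of
    evidence the evaluative dimension could draw on at re-design time."""
    return {
        week: {mode: actual[week][mode] - hours for mode, hours in modes.items()}
        for week, modes in planned.items()
    }

print(weekly_totals(planned))  # {1: 7.0, 2: 8.0, 3: 8.0}
print(divergence(planned, actual))
```

Summing and differencing of this kind is exactly what the spreadsheet cells perform; expressing it as code simply makes the underlying bookkeeping explicit.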

Both the developmental and evaluative aspects of the toolkit provide opportunities for peer support and extensive sharing. Engagement with the toolkit quickly demonstrates that no single model of practice is being enforced. One would anticipate that the visualisation generated by the toolkit would reflect a pattern of learning that differs from course to course. One also quickly identifies how the ideal pattern of learning modulates from week to week within a course. In the first week of an undergraduate course, one might expect to see significantly more teacher-centredness than in the twelfth week of the same course; a course based around an inquiry model of learning at postgraduate level would be expected to show a different pattern again. The visualisations will differ, and the patterns can be expected to reflect different levels of engagement.

It is a curious fact of higher education that, despite all we know about learning styles and dependencies, we continue to timetable activity (often determined by contact hours) evenly and uniformly through a semester. By making clear that not all learning is teacher-dependent, it becomes easier to visualise how the teacher, reflecting on their changing inputs and recognising institutional systems and constraints, might profitably adjust the balance of activity.

Conclusion

The SOLE model has emerged from ten years of academic development practice in the United Kingdom and New Zealand. Development workshops with practitioners have provided support for the effectiveness of the model as a visualisation of interrelated learning theory and for the role of the toolkit as an embodiment and embedding of theory in practice. The model and nascent toolkit were presented in April 2010 at the Distance Education Association of New Zealand (DEANZ) 2010 conference in Wellington and again in July 2010 at the European LAMS Learning Design Conference in Oxford. Several academic staff are now applying the SOLE toolkit to their own learning design processes, and documenting that experience, to enable the author to validate the model and toolkit and to make enhancements where appropriate. One notable aspect of the feedback from internal staff presentations between March and June 2010 was how warmly staff welcomed the degree of holistic visualisation the toolkit provides. This transparency of practice is central to the notion of staff’s responsibility for creating a full life-cycle experience of learning for the student.

As these early action research projects come to fruition, it is anticipated that there will be further refinements based on the practical implementation of the toolkit in contexts in which learners can take ownership of the learning process. Students’ recognition of their metacognitive development, and the consequent self-adjustments they make for effective learning through engagement with the toolkit’s representations, will demonstrate the SOLE model’s ultimate value.

After the success of the DiAL-e framework in encouraging a transformative learning experience for academic staff in the deployment of digitised resources for learning (Burden & Atkinson, 2009), the need for a more generic learning design tool, one that went beyond engaging content, became evident. Subsequent personal experience with many educational technology design tools, and academic colleagues’ resistance to learning new tricks, suggested that a solution relying on familiar desktop applications, with no need for additional software installations, specialised training, or support, had real potential. The SOLE model is an attempt to give course designers the benefit of embodied pedagogical theory, reflected directly in practice through a freely available and accessible toolkit that is diagnostic, descriptive, developmental, and evaluative.

Acknowledgements

Whilst a decade in gestation, the development work for the SOLE model was undertaken primarily in 2009–2010 whilst the author was employed at the College of Education at Massey University, New Zealand. The work was influenced, though not directly supported, by the award in 2009 of a Massey FIET grant, and benefitted from presentations to, and professional conversations with, staff in the College of Education and the Centre for Academic Development and eLearning (CADeL). The SOLE model also draws on work undertaken jointly with Kevin Burden (University of Hull) in support of the DiAL-e Project funded by JISC in 2006.

References

Atkinson, S. (2010). SOLE learning design. Educational technologies to enable social change [Web log post]. Retrieved from http://spatkinson.wordpress.com/sole-learning-design/.

Barker, P. (2008). Re-evaluating a model of learning design. Innovations in Education and Teaching International, 45(2), 127. doi:10.1080/14703290801950294

Biggs, J., & Collis, K. (1982). Evaluating the quality of learning: The SOLO taxonomy (structure of observed learning outcome). New York: Academic Press.

Biggs, J., & Tang, C. (2007). Teaching for quality learning at university (3rd ed.). Milton Keynes: Open University Press.

Bloom, B. (1964). Taxonomy of educational objectives. New York: McKay.

Brown, G. A., Bull, J., & Pendlebury, M. (1997). Assessing student learning in higher education (1st ed.). London and New York: Routledge.

Burden, K., & Atkinson, S. (2009). Personalising teaching and learning with digital resources: DiAL-e Framework case studies. In J. O'Donoghue (Ed.), Technology supported environment for personalised learning: Methods and case studies. Hershey, PA: IGI Global.

Conole, G., Dyke, M., Oliver, M., & Seale, J. (2004). Mapping pedagogy and tools for effective learning design. Computers & Education, 43(1–2), 17–33.

Conole, G., & Fill, K. (2005). A learning design toolkit to create pedagogically effective learning activities. Retrieved from http://oro.open.ac.uk/11725/.

Conole, G., de Laat, M., Dillon, T., & Darby, J. (2008). Disruptive technologies, pedagogical innovation: What’s new? Findings from an in-depth study of students’ use and perception of technology. Computers & Education, 50, 511–524.

Dick, W. O., Carey, L., & Carey, J. O. (2004). The systematic design of instruction (6th ed.). Boston: Allyn & Bacon.

Gibbs, G. (1992). Improving the quality of student learning. Bristol, UK: Technical & Educational Services Ltd.

JISC. (2006). Design for learning. Retrieved from http://www.jisc.ac.uk/whatwedo/programmes/elearningpedagogy/designlearn.aspx.

Knight, P., Tait, J., & Yorke, M. (2006). The professional learning of teachers in higher education. Studies in Higher Education, 31(3), 319–339.

Laurillard, D. (2002). Rethinking university teaching: A conversational framework for the effective use of learning technologies (2nd ed.). London: RoutledgeFalmer.

Laurillard, D. (2008). The teacher as action researcher: Using technology to capture pedagogic form. Studies in Higher Education, 33(2), 139–154.

Laurillard, D., & Masterman, E. (2010). TPD as online collaborative learning for innovation in teaching. In J. Ola Lindberg & A. D. Olofsson (Eds.), Online learning communities and teacher professional development (pp. 230–246). Hershey, PA: IGI Global.

LDSE. (2010). LDSE. Learning design support environment. Retrieved from https://sites.google.com/a/lkl.ac.uk/ldse/.

NSSE. (2007). Experiences that matter: Enhancing student learning and success (Annual Report). Bloomington, IN: Center for Postsecondary Research.

Rust, C. (2002). The impact of assessment on student learning. Active Learning in Higher Education, 3(2), 145–158. doi:10.1177/1469787402003002004

Shulman, L. (2000). From Minsk to Pinsk: Why a scholarship of teaching and learning. Journal of the Scholarship of Teaching and Learning, 1(1), 48–53.






ISSN: 1492-3831