Wednesday, June 15, 2016

The Role of Self-Assessment in Online Learning

When creating an online course, the instructional designer and/or instructor must consider which types of assessment (formative or summative; instructor-to-individual-student, instructor-to-group, peer, or self-assessment) will be incorporated in the course's overall assessment plan. Several sources from our course readings (Lock & Johnson, 2015; Laureate, Assessing Interaction, 2011; Kerton & Cervato, 2014; Mao & Peck, 2013; Vonderwell & Boboc, 2013; Kelly, 2009) have indicated the importance of self-assessment in the overall course assessment plan. There are no definitive answers, however, on how and when self-assessment should be incorporated, or on whether it should count toward the overall grade, though much of the research recommends using self-assessment mainly for formative purposes. Using the sources from this course, any outside sources you find (including the linked videos below), and your own learning experiences as a basis, make a case for the appropriate level of inclusion of self-assessment in online learning.

Consider these questions in preparing your response:

  • What are the benefits (for both the learner and the instructor) of including self-assessment in an online course?  
  • What might be some of the challenges or drawbacks of self-assessment for the learners and instructor? How could those challenges be addressed?
  • If self-assessment is to be included in an online course, should that self-assessment be used formatively only, or is it appropriate to use elements of self-assessments for summative grading?

By Friday:
Respond to this Blog Post with your thoughts on the advisability of using self-assessment for online learning. Explain the benefits and challenges for both learners and instructors, and propose a model for what you believe would be appropriate inclusion of self-assessment.  Be sure to cite information from the Learning Resources to support your thinking. The attached rubric will be used to assess your responses to the discussion board. You may copy it to your Google Drive or download it: Discussion Rubric

By Saturday:
Continue the conversation by responding to any replies to your initial response below.

HERE ARE SOME ADDITIONAL VIDEO RESOURCES THAT MIGHT HELP:

 
The Power of Student Self-Assessment (Elliot Haspel, Director of Education Matters and former editor of Best Practices Weekly, discusses the important considerations for student self-assessment.)

 
Peer and Self Assessment: How and Why (Former Stanford and current UC San Diego Professor Scott Klemmer discusses the reasons for self/peer assessment and includes some strategies for using self and peer assessment as a summative measure.)

 
Self and Peer Assessment - Dylan Wiliam (The Deputy Director/Professor of Educational Assessment at the Institute of Education, University of London discusses the benefits and formative nature of self and peer assessment.)

Self-Assessments: Reflections from Students and Teachers (Great video of a K-12 classroom, with strategies that would need to be adapted for Higher Education/Online Learning)

Dr. Heidi Andrade, Ed.D. Reflects on Self- and Peer Assessment (Researcher from SUNY-Albany discussing the self-reflection project featured in the Reflections video above.)

Reference Materials for Your Use:

From EIDT 6511: Assessments in Online Environments:

Week 1:
Kelly, R. (2009). Assessing online learning: Strategies, challenges and opportunities. Faculty Focus. Retrieved on November 24, 2015 from http://www.facultyfocus.com/free-reports/assessing-online-learning-strategies-challenges-and-opportunities/.

Week 2:
Vonderwell, S.K. & Boboc, M. (2013). Promoting formative assessment in online teaching and learning. TechTrends, 57(4), 22-27. Retrieved from the Walden Library.

Week 4:
Kerton, C. & Cervato, C. (2014). Assessment in online learning – it’s a matter of time. Journal of College Science Teaching, 43(4), 20-25.  Retrieved from the Walden Library.
Mao, J. & Peck, K. (2013). Assessment strategies, self-regulated learning skills, perceptions of assessment in online learning. Quarterly Review of Distance Education, 14(2), 75-95.  Retrieved from the Walden Library.

Week 7:
Laureate Education (Producer). (2011). Assessing interaction and collaboration in online environments [Video file]. Retrieved from https://class.waldenu.edu
Lock, J. & Johnson, C. (2015). Triangulating Assessment of Online Collaborative Learning. Quarterly Review of Distance Education, 16(4), 61-70. Available from the Walden Library.

Video References

BestPracticesWeekly. (2011). The Power of Student Self-Assessment. Retrieved from https://www.youtube.com/watch?v=-XJ8f9yLteQ
HCIonline. (2013). Peer and Self Assessment: How and Why. Retrieved from https://www.youtube.com/watch?v=20XYA-T2qms
Jobs for the Future. (2013). Dr. Heidi Andrade, Ed.D. Reflects on Self- and Peer Assessment. Retrieved from https://www.youtube.com/watch?v=8OkPW_mX7Vw
Jobs for the Future. (2013). Self-Assessment: Reflections from Students and Teachers. Retrieved from https://www.youtube.com/watch?v=CkFWbC91PXQ
Land, M. (2014). Self and Peer Assessment Dylan Wiliam. Retrieved from https://www.youtube.com/watch?v=5P7VQxPqqTQ


Thursday, February 11, 2016

Preventing Plagiarism in Online Learning

As a middle school teacher for the past twenty-one years, I have gotten used to plagiarized assignments and attempted cheating of all types.  Since I transitioned to a completely paperless learning environment, the ways to cheat have greatly increased, but the ability to use technology to detect plagiarism has helped level the playing field.  I was surprised when I started the Instructional Design and Technology program at Walden University to discover that plagiarism and cheating were just as prevalent among adult learners. Thankfully, with some planning and preparation, instructors can prevent some instances of plagiarism and cheating and detect even more.

There are a few options for online plagiarism-detection software.  The most commonly used online option is Turnitin.com, which, according to its own website, is used by more than 10,000 institutions in 135 countries to manage the submission, tracking, and evaluation of student work online (Turnitin, 2016).  Another common option is EVE (Essay Verification Engine), downloadable software that claims almost 150 million searches since 2007 (Canexus.com, 2016).  Finally, one of the newer options on the market is iThenticate, which compares submitted documents not only to 10 million web pages a day through the use of its own web crawler, but also to a database of millions of published research articles, abstracts, and citations from a cache of journals and publishers larger than most university libraries (Ithenticate, 2016).  All of these programs detect plagiarism by flagging copied text, but they also have limitations.  For example, as Jocoy and DiBiase (2006) explain, Turnitin.com doesn't distinguish between copied text and cited text, leaving it up to the instructor/facilitator to make that determination.  Others don't compare against printed sources that might not be openly available via the internet, although iThenticate seems to be changing the industry standard in this area. 
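At their core, these tools all do large-scale text matching. The sketch below is a hypothetical, minimal illustration of that idea (not any vendor's actual algorithm): it flags overlap between a submission and a source using word n-gram "shingles." Real services like Turnitin layer enormous source databases, fuzzy matching, and reporting on top of something like this, and, as noted above, even they cannot tell quoted text from plagiarized text.

```python
def shingles(text, n=5):
    """Split text into a set of lowercase word n-grams ('shingles')."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_score(submission, source, n=5):
    """Fraction of the submission's n-grams that also appear in the source.

    A high score suggests copied passages, but it cannot distinguish
    properly cited quotations from plagiarism -- a human instructor
    still has to make that judgment.
    """
    sub = shingles(submission, n)
    if not sub:
        return 0.0
    return len(sub & shingles(source, n)) / len(sub)
```

In practice a score near 1.0 means the submission is nearly identical to the source, while small nonzero scores usually point to a copied sentence or two that merits a closer look.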
As an alternative or addition to detection software, some sources suggest that intentional assessment design can deter plagiarism.  For instance, the requirements for a written assessment could be so specific that it would be difficult to find or purchase a paper meeting them.  Students could be asked to complete the assessment in groups with each person contributing a portion, to submit copies of their references along with their paper, or to write their paper in stages with required revisions along the way (Simonson, Smaldino, & Zvacek, 2015, p. 160).  These strategies would help with instances of buying or copying a whole paper at once, but might not work as well to prevent smaller cases of copy-and-paste plagiarism.  Dr. Keith Pratt suggests that assessments be designed so that collaboration and use of resource material are encouraged; in other words, design the assessment so that students don't need to cheat, because the assessment mirrors real-life expectations - in the work world, we would readily access experts and resources to find and deliver the necessary information. By removing the 'test' mindset, we could help eliminate the pressure to cheat in order to succeed (Laureate, 2010). Boettcher and Conrad (2010) recommend using ongoing assessments, because the more familiar an instructor is with the expression and perspectives of a given student, the more likely they are to notice changes in that style which could indicate plagiarism (p. 162).  Finally, several researchers (Braumoeller & Gaines, 2001; Jocoy & DiBiase, 2006) tested the strategy (granted, with mixed results) of using explicit instruction and guidelines regarding plagiarism coupled with follow-up such as an academic integrity quiz or implementation of Turnitin scans and grading penalties for cheating.  
This expectation-management strategy, though it didn't produce overwhelming results, seems promising if used consistently in combination with some of the other techniques mentioned earlier.
In my own online learning environment, I would use a multifaceted approach, leading with clear establishment of the understanding of plagiarism and its consequences.  Once those understandings were established, I would make a point to pay close attention to the participation and communication styles of my learners so that when unusual patterns did appear, it would be more obvious and I could address it immediately.  This has worked for me in the past with adolescent learners, but with adult learners, even closer observation would be required. Finally, I know from experience how important consistency is in establishing the desired behaviors, so I would use Turnitin (or whichever plagiarism detection software my institution chose to use) for each assignment/assessment, so that learners could quickly come to expect that check to occur, which could hopefully deter at least some of the plagiarism.   Prior to the course readings and research for this blog post, I would have assumed that adult learners would be more responsible and organized and much less likely to plagiarize.  Knowing now that this is not the case has helped me understand that effective online facilitators are prepared and intentional about educating their adult learners regarding plagiarism and its consequences.

References
Boettcher, J. V., & Conrad, R. (2010). The online teaching survival guide: Simple and practical pedagogical tips. San Francisco, CA: Jossey-Bass.
Braumoeller, B. F., & Gaines, B. J. (2001). Actions do speak louder than words: Deterring plagiarism with the use of plagiarism detection software. Political Science & Politics, 34(4), 835-839.
Canexus.com. (2016). EVE2 plagiarism detection for teachers. Retrieved February 11, 2016, from http://www.canexus.com/
Ithenticate.com. (2016). FAQs | Plagiarism software. Retrieved February 12, 2016, from http://www.ithenticate.com/products/faqs
Jocoy, C., & DiBiase, D. (2006). Plagiarism by adult learners online: A case study in detection and remediation. International Review of Research in Open & Distance Learning, 7(1), 1-15.
Laureate Education (Producer). (2010). Plagiarism and cheating [Video file]. Retrieved from https://class.waldenu.edu
Simonson, M., Smaldino, S., & Zvacek, S. (2015). Teaching and learning at a distance. (6th ed.) Charlotte, NC: Information Age Publishing.
Turnitin. (2016). Turnitin - Products: FAQs. Retrieved February 11, 2016, from http://turnitin.com/en_us/what-we-offer/faqs

Thursday, February 4, 2016

Using Technology and Multimedia Tools Online

When implemented correctly, technology and multimedia can have a very positive impact on an online learning environment.  The benefits of incorporating these technologies include increased learner interest and engagement (which could lead to greater retention), creation of a greater sense of community in what could otherwise be a somewhat lonely learning space, and access to a richer, wider variety of learning resources.  In the Instructional Design and Technology coursework at Walden University, many of the cited researchers have commented on the importance of using appropriate technology and multimedia tools for online learning.  Dr. Keith Pratt and Dr. Rena Palloff (Laureate, 2010) extolled the virtues of using Web 2.0 technologies such as blogs and wikis that allow for greater collaboration and encourage feelings of community, which in turn decrease the sense of isolation, particularly in remote or high-pressure work/learning environments.  In The Online Teaching Survival Guide, Boettcher and Conrad (2010) suggest that "Investing time getting familiar with and using audio and video tools creates a richer, more interesting, and more satisfying course experience for you and your students" (p. 140), and Simonson, Smaldino, and Zvacek (2015) explain that "When the professor who is designing online instruction selects the correct media, it maximizes efficiency and makes available more resources for other learning experiences" (p. 99). Clearly, there are a multitude of benefits to incorporating technology and multimedia in the online learning environment.  However, though much of the research supports the use of these tools, there are some important considerations to be made before implementation.

In any instruction, it makes sense to keep the learning objectives and the needs of the learners first and foremost when making instructional decisions.  The importance of these factors doesn't change when the course is delivered online.  In discussing instructional best practices for use of technology in an online environment, Simonson, Smaldino, and Zvacek (2015) indicate that the first two steps for an instructor deciding to use technology should be to assess the lowest level of technology available to the learners, instructor, and institution, and then to determine the learning outcomes for the course to see how the available technologies would meet those objectives (pp. 98-99).  By knowing the technology skill level of the learners, the instructor will be able to do as Boettcher and Conrad advise by making sure that the learners either know how to use the technology or know how and where to seek the help they will need in implementing it (2010, p. 53).  As they go on to explain, "The goal is to communicate regularly and meaningfully and to cultivate a sense of curiosity and search for truth and wisdom.  Tools are just tools.  The goal is communicating with and providing guidance to students, while being as accessible as appropriate.  The form of communication is not as important as using the set of tools that work smoothly for members of the course community" (pp. 112-114).  This sentiment echoes something that Palloff and Pratt stressed to their workshop attendees (Laureate, 2010) and that I have experienced firsthand in my own technology-enhanced classroom - that technology should be used when it truly supports and advances the learning objectives, and not just because it is available.  
If an instructor can keep the learners’ needs/skill levels and the learning objectives clearly in mind when deciding which technology and multimedia to use, the implementation of these is much more likely to go smoothly and enhance the learning experience.
In addition to considering the course objectives and tech savvy of the participants, instructors must take issues of accessibility and usability into account when deciding which types of technology and multimedia to incorporate into their course.  The needs of specific learners cannot be ignored simply because those learners might be in the minority.  By assessing the usability and accessibility of the technology tools that will be put to use, the instructor likely benefits all learners, not just those who may need special accommodations:
Accessibility is thus determined by the flexibility of the e-learning system or learning resource to meet the needs and preferences of all users. These needs and preferences may arise from their environment (e.g. working in a noisy environment), the tools they use (e.g. assistive technologies such as screen-readers, voice-recognition tools or alternative keyboards, etc.) or a disability in the conventional sense...improved accessibility for disabled users promotes usability for all...Accessibility and usability impact directly on the pedagogical effectiveness of e-learning systems or resources for all learners, but particularly for disabled learners (Cooper, Colwell, & Jelfs, 2007).
Just as instructors should not consider using technology tools that don’t directly relate to the course learning objectives, neither should they consider using technology tools that would exclude some of the learners from the learning activities.  Any tool that would create barriers for some probably shouldn’t be considered for any.

Going forward into the field of instructional design, there are many technology tools that hold great appeal and promise for me when I think of the online courses I might develop and/or instruct.  First of all, I am a daily user of the Google Apps for Education suite in my middle school classroom, so I am looking forward to seeing how those tools can be applied in a fully online learning environment to facilitate collaboration.  Also, the tools that I find most useful as a learner, such as blogs and wikis, will be put to good use when I am designing my own courses.  Finally, though I have little to no experience with synchronous collaboration tools such as Elluminate and Wimba beyond what I have researched and read about through my coursework, I am eager to give those tools a try because they seem like they could greatly increase the interactivity of learning in an online environment.  Through all of the research and reading about the use of these technology tools (and others that I may come to learn about as I continue my studies), one thing is very clear:  the objectives of the course and the needs of the learners must be the first measure of whether I will use a technology tool or not.  The thing to avoid is using a cool new technology just for the technology's sake.  The learners and their takeaway from the course must be supported by the implementation of any technology or multimedia in order for the implementation to be worth the time and effort.
References
Boettcher, J. V., & Conrad, R. (2010). The online teaching survival guide: Simple and practical pedagogical tips. San Francisco, CA: Jossey-Bass.
Cooper, M., Colwell, C., & Jelfs, A. (2007). Embedding accessibility and usability: Considerations for e-learning research and development projects. ALT-J: Research in Learning Technology, 15(3), 231-245.
Laureate Education (Producer). (2010). Enhancing the online experience [Video file].

Simonson, M., Smaldino, S., & Zvacek, S. (2015). Definitions, history, and theories of distance education. In Teaching and learning at a distance (6th ed., pp. 40-57). Charlotte, NC: Information Age Publishing.

Thursday, January 21, 2016

Setting Up an Online Learning Experience

In order to set up a successful online learning experience, several things must happen.  One of the first crucial elements is that the facilitator/instructor must be familiar with the technology that will be used (or could be incorporated later, as needed) in the coursework.  Boettcher and Conrad (2010) list having the tools for the online environment in place, including making sure that learners know how to use them (or where to seek help), as one of the first four things that should happen during the first part of the course.  It would be very difficult (and quite uncomfortable, I would think) for an instructor to expect learners to use technology to interact with them and the content when the instructor didn't at least have a working understanding of how that technology functioned.  In my own experience transitioning my traditional paper-and-pencil classroom into a paperless, blended learning environment, I spent time making sure that I understood how the technology worked before I ever introduced it into my classroom.  Of course, in that case I was going to be responsible for all the troubleshooting and tech support myself, which isn't often the case with online university courses.   Though instructors don't have the time to handle all the tech issues that will come up with their learners during the course (nor should they try), they will still field their fair share of tech questions, so having a working knowledge of the technology can make things easier for everyone.  Also, as Conrad and Donaldson point out, "We are living in a rapidly changing techie world and need to know what our students use and their skill levels" (2011, p. 42).   Being up-to-date with the latest technology allows instructors to offer learners more flexibility in product/assignment creation and delivery and can establish rapport between them and the learners as well.
One important aspect of technology familiarity for instructors is that we shouldn't overwhelm ourselves by trying to learn all of the possible tools at once - it's good to start small and focus on just a few in the first cycle of teaching the course (Boettcher & Conrad, 2010, pp. 57-60).  That was a lesson I learned somewhat the hard way, by trying to introduce too many tools at once.  Now that I have learned to slow down and incorporate new technology strategies as they are needed, and not just for the sake of the technology, the workload and learning curve have become much more manageable.
Along with being familiar with the technology for the online course, it is essential for facilitators to clearly communicate their expectations to learners, for several reasons.  First of all, clarity of expectations is one of the four Course Beginnings Themes (Boettcher & Conrad, 2010, p. 55): "Clear and unambiguous guidelines about what is expected of learners and what they should expect from an instructor make a significant contribution to ensuring understanding and satisfaction in an online course...can help create a smooth and trusting learning environment".  By beginning the course with clear expectations and establishing that trusting learning environment, a community is much more likely to develop, which in turn is more likely to help with student success and retention.  Establishing these expectations from the beginning can also help students plan their personal and work lives around the responsibilities for the course.   In a study of factors influencing adult learners to drop out of or persist in online learning, Park and Choi (2009) found that external factors such as time, family issues, and job workload were likely to affect learners' ability to continue with online learning, and that while these factors were largely beyond the control of the institution and the instructor, certain things could be done to plan for and offset these challenges. This is where early and clear communication of expectations comes in, so that learners will know well in advance and can plan around major course deadlines.  Also, by making sure learners understand the expectations in advance, instructors are less likely to face a barrage of frustrated, last-minute questions as major deadlines loom (just as the instructors are facing their heaviest workloads as well).  Clear communication of expectations can benefit both learners and instructors in online courses.
There are a few additional considerations that instructors should keep in mind when establishing an online learning experience.  In the theme of presence, instructors must be aware of the subtleties of communication within the course and know when to take the lead as the Social Negotiator in the first phase of engagement, and, in the phases that follow, when to transition into the other roles of Structural Engineer, Facilitator, and finally Community Member or Challenger (Boettcher & Conrad, 2010, p. 97).  It can be a delicate dance of when, as the instructor, to be in the foreground and when to step back and let the learners lead.  This requires attention to and knowledge of the learners, which also falls under the theme of community.  This is especially true for international courses, with learners from cultures other than the one where the course is 'home'.  In this case, instructors must find ways to bridge any cultural gaps in understanding of expectations and procedures and to create close connections of presence and community, though the physical distance might be very great (Boettcher & Conrad, 2010, p. 76). This will take awareness of the other course beginnings themes of patience and clear expectations, often communicated in a variety of ways, to meet the needs of culturally diverse learners.
In reflecting on all the suggestions and tips for a successful course beginning, it seems to me that it is vital to plan well in advance and be thoroughly prepared before the learners can even access the course materials.  By having everything well planned and clearly laid out, the course can start on a strong, positive note, which will set the tone for the rest of the weeks together as a learning community.  Without that positive start, the crucial work of establishing a cooperative, trusting learning community would be much more difficult, if not impossible.  Much as with the start of each school year, where I intentionally plan the opening weeks around establishing routines and reinforcing expectations in order to get the most out of the following months together, I think it's just as essential, if not more so, to start an online course with lots of upfront preparation to get the absolute most out of the limited time we will have together.  The lessons I've learned about planning, preparation, and community building will serve me well as I implement online learning experiences in the future.
References
Boettcher, J. V., & Conrad, R. (2010). The online teaching survival guide: Simple and practical pedagogical tips. San Francisco, CA: Jossey-Bass.
Conrad, R., & Donaldson, J. A. (2011). Engaging the online learner: Activities and resources for creative instruction (Updated ed.). San Francisco, CA: Jossey-Bass.
Park, J. H., & Choi, H. J. (2009). Factors influencing adult learners' decision to drop out or persist in online learning. Journal of Educational Technology & Society, 12(4), 207-217.

Thursday, January 7, 2016

Learning about Learning Communities from Palloff and Pratt

Having been a part of an online learning community now for the past year and a half, I can attest to the positive impact that community has had on my learning and satisfaction with my courses.  As Dr. Rena Palloff and Dr. Keith Pratt explained (Laureate, 2010), establishing a learning community can have benefits such as increased student satisfaction, a positive perception of the learning that is taking place, a feeling for learners that they are a part of something larger, and a social presence that produces a positive social pressure to succeed.  I have definitely felt that as a learner in my courses at Walden University.  I find myself wanting to interact with my peers and as I’m working independently on an assignment, I’ll catch myself wondering what certain class members will think of my product when I share it.  That constant linking back to the learning community has made me as a learner feel more connected to the university and has made it much less likely that I would ever consider walking away from one of my courses without completing it.  As part of the learning community, I feel accountable to the others in the group - not just to myself or the professor of the particular course.  When Palloff and Pratt describe the role of community members as challenging each other, drawing things out of each other during discussions, professionally supporting each other, and giving feedback, I can say that I have felt the benefit and satisfaction of having my learning community do these things for me (and hopefully I have been able to return the favor sometimes as well).   I have felt the empowerment to create my own meaning and I think I have been transformed much more into a scholar-practitioner, which they mention is one of the greatest powers of a learning community.  Feeling a part of the community has made me rise to the challenge to produce my best work.
Without the essential elements of community building, however, I don't believe the learning communities in each of my recent courses would have been as successful.  One of the most important aspects of Palloff's and Pratt's presentation was when they stressed that none of the elements of community building - people, purpose, process, and method/social presence - could exist in isolation.  The interplay of these elements is key to creating an effective online learning community.  Without people motivated and willing to be there, even the best-designed course would fail.  Without a real purpose and motivation, the learners aren't likely to stick around long enough for a sense of community to develop.  Without an organized process for delivering the material, even the most motivated learners can become distracted, which would interfere with the proper functioning of the learning community.  Finally, if the learners can't develop a sense of social presence, the trust needed to develop the sense of community would likely be lacking.  Looking back on my experiences, I can say with confidence that the courses with the strongest learning communities for me were the ones that had all the elements firmly in place.
Of course, once a learning community is established, it takes work to keep it on a positive, productive track.  While the facilitator might be largely responsible for getting the community up and running at the very beginning (or by reaching out to learners even before a course starts), I believe the learners are ultimately just as responsible for keeping things running smoothly by being active, positive participants in the community.  Throughout their presentation, Palloff and Pratt mentioned several things that can and should be done to set up and sustain a successful learning community. These include strategies such as reaching out and getting learners 'hooked' in the first two crucial weeks of the course, providing a thorough new-student orientation (to allow learners to get to know one another and become familiar with the learning management system and the philosophy of the online learning environment), and setting up the course in a user-friendly way by making it easy to navigate and making it feel warm and inviting through creative personal touches.  These are all strategies a facilitator can use early in the life of the learning community, along with making an effort to relate to students personally and responding to their bio posts thoughtfully, which can model effective communication for other members of the community.  As the community continues to work together and establish itself, it's everyone's responsibility to participate, meet deadlines, and support each other.  That includes the facilitator, then, as an equal - not superior - participant in the community discussions and activities.  That, I think, is one of the most effective but also most challenging tasks in sustaining an effective learning community, and is one of the things I learned that will particularly help me become a better facilitator. 
Knowing how to operate in the learning community as an equal, what to do early on to set the community up for success, and how to ease technology into the coursework for adult learners who might be digital immigrants are all strategies that will allow me to improve my online instruction and facilitation skills.
Palloff and Pratt’s research supports a direct relationship between building an online learning community and creating effective instruction.  The creation of that community, with the learner at the center and the facilitator as an equal member, leads to more effective instruction, allowing for more learner-to-learner engagement and the co-creation of knowledge and meaning that produces the deepest learning in the online environment.  Also, since one of the easiest ways to measure effective instruction is through learner satisfaction and retention, I can say personally that I have a much more positive perception of the learning that occurred, and of the course as a whole, when I have positive feelings about my learning community.  A study by Shea (2006) reiterated the correlation between online learning communities and effective instruction that I noticed in my own experiences here at Walden: students who reported more effective instructional design and organization also reported higher levels of learning community, greater connectedness, and more learning (p. 41).  In short, the more a part of the community the learners feel, the more likely they are to learn what they need from instruction that meets their needs.

References
Laureate Education (Producer). (2010). Online learning communities [Video file]. Retrieved from https://class.waldenu.edu
Shea, P. (2006). A study of students’ sense of learning community in online environments. Journal of Asynchronous Learning Networks, 10(1), 35-44.


Thursday, December 3, 2015

Abandon Scope All Ye Who Enter Here...

For the past year and a half, I have been involved in a grant-funded project sponsored by the largest labor union in the country.  The small group of teachers from my school district who wrote the grant proposal had a very clear vision of our goals, and I feel our passion came through in the proposal, which won us funding for the two-year project.  The scope of the project was to ease the transition to our new teacher-effectiveness legislation and new evaluation process, which placed 50% of a teacher’s evaluation score in the area of student assessment scores.  Our idea was to support teachers (those brand-new to the profession and veterans who might be struggling) by creating a Peer Assistance and Review (PAR) program that would offer master teachers time out of the classroom to serve as mentors and supports to those groups in need.  The two other teachers on the committee and I felt (and still do feel) strongly that this program would be a genuine asset to our district.  

            The scope creep began (and got out of control) in our ‘kick-off’ meeting.  In addition to those from the original grant committee, others were invited to join the group at this stage, including our Assistant Superintendent of Schools, the Director of Teaching and Learning, the Director of Professional Development, and a few other upper-level district administrators.  The conversation went off track almost immediately into a tangent about how the rubric administrators had to use to evaluate teachers was too long and complicated, so that no consistency in application of the rubric could be found from building to building across our district.  That led to a discussion of how to best communicate common understandings (the word calibration was used over and over) and make the actual evaluation process easier.  I could see that we were getting away from our original idea and asked a few questions to get us back on track, but to no avail.  The administrators on the committee were used to being in charge, and the teachers on the committee (including our teacher-leader) were used to taking orders from the administration.  In the course of one afternoon, the project scope completely changed: instead of creating a PAR program, we would spend our two years writing a ‘Best Practices Users’ Guide’ to the rubric...essentially a study guide so that every administrator in every building would interpret and apply the rubric consistently.  While I see this as a noble goal, I didn’t see it supporting teachers nearly as well as the original scope, so I was, and am, a bit disappointed every time I go to a meeting.  When I spoke to our teacher-leader afterwards about how we would accomplish our original PAR goal, she tried to link the two items together, but over time, that has become less and less possible.  Now, the PAR idea is rarely mentioned.
Why did I stick with it and keep attending meetings?  I suppose I kept hoping that we would eventually be able to circle back around to our original focus, but I can see now that this is highly unlikely (unless we write another two-year grant and start over).  Looking back on it, I can see that our scenario was similar to the one described in our course text, where a PM attempts to avoid bureaucracy (or, in our case, to make the relationship between labor and leadership feel more comfortable) by informally handling suggestions for change during the kick-off meeting.  And, just like in the text, before the PM (our teacher-leader) could do anything, we were already committed to extending (and, as it turned out, changing) the scope (Portny, Mantel, Meredith, Shafer, Sutton, & Kramer, 2008, p. 346).  If I were the leader of the group, knowing what I know now, I would have communicated more fully (prior to the kick-off) with the ‘new’ members from leadership who would be joining us, to make sure they were clear on exactly what a PAR program is and understood the benefits and goals we were initially aiming for with the grant.  I honestly believe that several of them simply didn’t realize or understand what Peer Assistance and Review meant.  I also think they didn’t know that a scope had already been determined; they came to the first meeting thinking it was a brainstorming session to generate ideas for the project, not realizing that we had already mapped that out.  They spoke up early and often in the meeting, as many of their positions require, and those of us on the committee didn’t speak up soon enough.  In addition to communicating ahead of time, I would have dealt with any later suggestions for change by following the processes suggested by Greer (2010) and Portny et al. (2008): pinpointing the change, focusing clearly on the impacts the change would have on our projected goals, and putting everything into a form, in writing.  Knowing that our administrators respond well to goals spelled out and plotted on official documents, I think this would have made managing the change (and likely reaching some compromise) much easier.  Though it now seems too late to get back to a semblance of our original scope, I am hopeful that the idea of PAR isn’t gone forever - that we’ll figure out a way to revive the project in the near future for the good of the teachers in our district, and ultimately for our students, who will benefit most from having a well-prepared teacher in front of them every day.
References
Greer, M. (2010). The project management minimalist: Just enough PM to rock your projects! (Laureate custom ed.). Baltimore: Laureate Education, Inc.
Portny, S. E., Mantel, S. J., Meredith, J. R., Shafer, S. M., Sutton, M. M., & Kramer, B. E. (2008). Project management: Planning, scheduling, and controlling projects. Hoboken, NJ: John Wiley & Sons, Inc.

Thursday, November 12, 2015

Before viewing or listening to any of the messages, the preconceived ideas I had about communication had more to do with the timing and urgency of the message.  For instance, I would have expected that communicating something via email meant the message was somewhat less urgent, but also that the person delivering it wanted ‘something in writing’ - a paper trail, in essence.  Following this line of thinking, I would have expected the phone message to seem somewhat more time-sensitive, with the face-to-face delivery seeming the most urgent to the person delivering it.  It is difficult for me to pinpoint why I held these notions, but the method of delivery, in my mind, was somehow linked to the need for and timing of the task being discussed.
Once I viewed the email message about the missing report, a few things came immediately to mind.  The author of the email came across to me as a little desperate and worried that she was going to miss her deadline, and I perceived a slight indication of blame towards the receiver of the email.  The email somehow gave the impression that this was not the first time this request for the missing report had been delivered, and it even implied that the person receiving it was at fault for the report being missing in the first place.
Hearing the same message, this time via voicemail, changed my perception only slightly.  I noticed that the voice of the caller didn’t sound as desperate or urgent as I had imagined when reading the email version.  Her tone didn’t place the emphasis on certain words that would convey the mild distress or impatience I had imagined from the email.
Finally, after watching the message being delivered face-to-face, my perceptions changed even more.  The subtleties of body language and facial expression, such as her smile before she began talking, softened the ‘blow’ of the message considerably and negated the accusatory tone that, in my mind, I had injected while reading the initial email.  This version of the message seemed to me the most likely to garner the desired result: completion of the report.
One takeaway from this exercise is that it is much better to deliver a sensitive message (especially one that could be taken as an indictment or a criticism) face-to-face whenever possible to avoid misunderstandings.  Delivering such messages this way allows the speaker to use a variety of communication tools (facial expressions, hand gestures, tone of voice, eye contact, etc.) to make sure the message is understood clearly.  In Communicating with Stakeholders, Dr. Stolovich stressed that important communications are best delivered live and with all members present, but that these oral communications should be documented for project records (Laureate, n.d.).  This idea of confirming in writing any important information shared during informal face-to-face discussions was echoed by Portny, Mantel, Meredith, Shafer, Sutton, and Kramer (2008); while they suggest that discussions should be avoided if only some team members are present, that is sometimes unavoidable, so a follow-up in writing (an email, for instance) can reinforce what was said and keep those who were absent up to speed as well.  
I like the idea of having scheduled project meetings that are both preceded and followed by brief written communications.  Prior to each meeting, a short agenda would help keep the discussion focused and honor everyone’s time.  Following the meeting, a concise set of notes/minutes, along with a task or to-do list geared towards specific individuals or groups, would help ensure that the meeting resulted in progress on the project.  I would hope that by using this combination of information delivery modes, I could tailor the message, verbally, in writing, or both, to specific stakeholders, which Vince Budrovich pointed out as a necessity (Laureate, Practitioner, n.d.).  For projects that involve teams separated by geography, video-conferencing and online chat tools such as RingCentral, GlobalMeet, and Google Hangouts (Best, n.d.) could be used to simulate face-to-face meetings and conversations.  Wherever the project team members may be located, using a combination of message delivery modes early and often during the project timeline will allow for a greater chance of success.

References

Best Web Conferencing Software | 2015 Reviews of the Most Popular Systems. (n.d.). Retrieved November 12, 2015, from http://www.capterra.com/web-conferencing-software/
Laureate Education (Producer). (n.d.). Communicating with stakeholders [Video file]. Retrieved from https://class.waldenu.edu
Laureate Education (Producer). (n.d.). Practitioner voices: Strategies for working with stakeholders [Video file]. Retrieved from https://class.waldenu.edu
Portny, S. E., Mantel, S. J., Meredith, J. R., Shafer, S. M., Sutton, M. M., & Kramer, B. E. (2008). Project management: Planning, scheduling, and controlling projects. Hoboken, NJ: John Wiley & Sons, Inc.