Reflections

Reflections from Discussion Board Postings

**Week 1 ~ Writing Course Goals and Learning Objectives Discussion**

As a special education teacher, I am constantly writing goals and objectives for my students' annual Individualized Education Plans, or IEPs. Planning an online course is very similar to writing an IEP: you need to develop clear, concise goals, and you need to make sure those goals are measurable. If they are not measurable, it is impossible to tell whether your learners are meeting the goals you expect them to achieve.


 * **Civil War Unit Course Goals:** See Week One Weekly Activity for the specific course goals.

**Week 2 ~ Survey Tool Discussion: Sentinel Survey**

 * //Survey Tool Used://

I chose Survey Monkey to conduct a mid-program student satisfaction survey to see how my online learners were doing in the class. Survey Monkey is an excellent tool for these informal surveys for several reasons. First, it is a free, user-friendly site that lets you create a survey in minutes, with a variety of question formats to match different learning targets. Second, it offers several options for administering your survey, including email, pop-ups, and a link on a website, so as the creator you have the flexibility to decide how your survey is delivered. The last feature I consider a definite pro is the data results feature: it lets you analyze the results very easily, providing important data about how your learners are doing in the online setting.

 * //Use in my Online Course://

Since assessment in many forms is critical in the traditional learning setting, it is even more important in my online course. A survey assessment like the one I created on Survey Monkey would be used in multiple ways in my online unit. Prior to the course starting, I would give a pre-assessment survey to get a baseline of where my learners are academically. I would use a mid-course assessment to make sure my learners feel engaged and not overwhelmed by the content being presented. Finally, upon completion of the course, I would give a final survey to gather data to inform my instruction for future online courses. These three informal surveys would be administered as a link the students would follow on my class website. They would provide me not only with their satisfaction with the course, but also with important information about how they are handling its content.

See Weekly Activity 2 for the complete Sentinel Survey.

**Week 3 ~ Creating Scoring Guides for Discussion Forum Prompts**

 * **Creating an assessment rubric to assess online participation in discussion boards:**

Since this is my first time creating a "mock" online course, my learning curve is steep. Although I don't have any ELL students presently, I tried to keep them in mind when designing this rubric for assessment. I deliberately did not incorporate any grammatical rating categories, to accommodate ELL students. I do feel that organization of the post is key, as a post that lacks organization often lacks clear meaning. I designed my rubric to assess not only the timeliness of postings, to ensure a real "discussion," but also whether the messages posted were relevant to the topic and included some personal reflection by the author.

 * **Applying an assessment rubric to sample case studies:** (case studies can be viewed at: [|Sample Postings.doc] )

See Weekly Activity 3 for the case studies assessed using the rubric (case studies assessed using rubric.pdf). For the most part, I felt that my assessment rubric targeted the areas I wanted to see in the discussion posts. The only area I marked as a "4" for everyone was promptness in posting, as I had to assume that all posts were made on time to allow others to respond. Here are my thoughts on the individual postings:
 * George: 18/20 Overall, I felt that George's responses to the four questions were adequate. On the first question, he got off topic a bit, and I had to refer back to the question to see what he was supposed to be answering. Second, his organization was hard to follow at times.
 * Alice: 19/20 I felt that Alice had one of the best posts. She stayed on topic and used multiple references from the reading to support her claims. I only marked her down on organization because she started answering in an essay format and changed to a question-and-answer format halfway through the post. It would have been better to use consistent formatting.
 * Dwight: 19/20 Dwight's post was very clear and concise. He directly answered the questions posed, and his post was easy to follow. I only wished that he had included more support for his claims by making more connections to the text.
 * Fred: 17/20 Although Fred made good points in his post, its organization made it very hard to read. The reader continually needs to go back to the questions to see what message Fred is trying to convey. He could also have made more connections to the text to support his ideas.
 * Nathaniel: 16/20 Nathaniel's post was the most difficult to read and understand. Although he includes a variety of details, he often strays from the topic, and his organization is poor, with a few sentence fragments.

**Reflections on how my rubric met the needs of ELL students:** Overall, I liked how my rubric assessed the posts. I also liked viewing others' rubrics and seeing how they assessed the posts; for some reason, it is comforting that we are on the same path. I didn't feel that ELL students were at a disadvantage under my scoring criteria, since I made sure to focus on content and relevance rather than grammar. However, since I did include an organization category that looked at how the post "flowed," so to speak, I was in essence looking at some grammatical items. When scoring, if poor flow didn't impede the statement, I did not mark it down. The only modification I might make to future rubrics is to qualify what some of my words, such as "multiple," "many," or "few," actually mean. Although I wasn't set on putting a specific number on them, doing so might make the rubric more objective. My hesitancy, however, is that responses would then focus solely on reaching that number of references.

**Week 4 ~ Constructing a Test Blueprint**

 * **Test Blueprint Reasoning:**

When creating this test blueprint, I tried to build a test that would match achievement targets to a variety of assessment methods. I valued the higher-level thinking questions or tasks at 3 points each and the lower-level tasks at 2 points each. Since the essay question required comparing 3 different topics for 2 sides, I gave it the most points. I also included a variety of selected-response formats, such as multiple choice, fill-in, and matching, to give the students a range of formats. To accommodate special needs students, I could provide a word bank or reduce the number of multiple choice options. The total value of these 3 objectives is 50 points.

See Weekly Activity Week 4 for the test blueprint.

**Week 5 ~ Online versus Paper/Pencil Assessment Reflection**

**What are the strengths/weaknesses in the different presentation formats?**

**Online Assessment ~ PROS:**
 * Can administer to a large group of students
 * Can "customize" for special needs students without others knowing (allowing for use of a word bank, reducing the number of options for multiple choice questions)
 * Easy to grade
 * Gives students instant feedback on their performance
 * Incorporates technology into the curriculum

**Online Assessment ~ CONS:**
 * Test results may be invalid due to technical difficulties (does the student not know the content, or did they make a computing error?)
 * Students are not familiar with the process, which may cause test anxiety and frustration
 * Cheating
 * Takes additional time on the teacher's part to design and create the test

**Paper and Pencil Assessment ~ PROS:**
 * Students are familiar with the process
 * Can administer to a large group of students
 * Test results reflect students' understanding of the content

**Paper and Pencil Assessment ~ CONS:**
 * Does not incorporate any technology
 * Takes time to grade
 * Students don't always get "instant" feedback due to the amount of time needed for scoring and recording

**Do you have a preference for online or paper/pencil assessments? Why or why not?**

Although I have never administered an online assessment to my students, I do feel they would be a wonderful assessment format to use in the classroom. However, training would need to be offered first: I would want to make sure my students had the technology competencies to take such an assessment. To check this, I would give them opportunities to take informal quizzes, such as "check-in" surveys, to make sure they were comfortable with the format. Second, I would need to look at how I could administer the assessment while decreasing the opportunities for students to "cheat" from one another's monitors. Once these two areas were addressed, I feel I would love using online assessments. Like most technologies, once you invest the time in setting them up, they are very efficient to use and will save time in the end.

**What ideas/suggestions do you have for preparing students for online assessments?**

As I mentioned above, I think students need "practice" taking an online test. I would give them opportunities to practice this new format by incorporating several "check-ins" or surveys that do not count for grading purposes, to make sure they can use the format.

**What design considerations, if any, come out of the differences between paper/pencil and online assessments?**

I didn't notice any significant design differences between the two formats. Both include a variety of question types (such as multiple choice, true/false, matching, and essay), and with either format you need to keep the question types grouped together. I did like the ease of adding visuals in the online format, which I think is an excellent design feature. It was also nice not to have to worry about page breaks splitting up questions, as I did with the paper and pencil format.

**Weeks 6 and 7 ~ Essay Assessment and Plagiarism in the Online Environment**

After making my test blueprint, I decided I wanted to include some higher-level thinking and explanation, so I incorporated an essay question into my test. This week there were some great resources on scaffolding an essay question, which at first I thought was simply "giving them the answer." However, with careful wording, I can see how students will produce a much more detailed response when given clear expectations of what should be included in the essay. ~ See weekly activities for weeks 6 and 7 to view my completed essay questions.

For week 7, I explored plagiarism in the online environment. I collaborated with a small group of teachers from my hometown, and together we researched and designed a Prezi presentation aimed not only at informing elementary students about what plagiarism is, but also at teaching them how to cite work properly. The statistics on plagiarism in the traditional setting are astounding, so I can't imagine how troublesome the problem becomes when students move to the online setting. Like anything, I think that if the rules and expectations of what is and is not acceptable are clear from the beginning, you can minimize some of the problems that may arise. To see the Prezi, see weeks 6 and 7 under weekly activities.

I would definitely use some of the plagiarism detection tools available online. Doccop and Turnitin are just two resources on the internet that help teachers detect plagiarism. Although I quickly learn what type of workers my students are in the classroom, I may sometimes need help with a "hunch" about a piece I feel was plagiarized, and I would use a detection tool to help confirm those "hunches."

**Week 8 ~ Portfolio Assessment in an Online Format**

As Dr. Barnett explains, Web 2.0 tools are instrumental in developing and showcasing student learning through portfolio assessment. Unlike the traditional letter grade, portfolio assessment is a very visual, summative assessment format that clearly shows where the learner was "intellectually" when he/she started the class and where he/she is after the course objectives are met. The assessment is much more student-centered and meaningful to students and teachers alike. With a variety of Web 2.0 tools, students can now easily showcase their learning through wikis or blogs, presentations with audio and video, and much more. All 5 of Dr. Barnett's "stages" of portfolio development can be easily achieved using Web 2.0 tools for designing, collecting, and presenting learning artifacts.

Back to Kacey's home page