Discourse Analysis

The discussion board is an integral part of any online course.  But how should participation be assessed?  Quantitative measures alone just don’t seem appropriate.  It is important that discussion posts be meaningful and demonstrate that the student is interacting with course content at all levels of Bloom’s Taxonomy.  So I’ve created a construct for discussion board prompts that incorporates Bloom’s levels.  This is a mere beginning and is in no way complete.  With some time, thought, and effort, I would like to see this evolve into an assessment tool for discussion board participation.

Bloom’s for Prompts


TeacherStream, LLC.  (2009).  Mastering Online Discussion Board Facilitation:  Resource Guide.  Edutopia.  Retrieved August 1, 2012 from http://www.edutopia.org/pdfs/stw/edutopia-onlinelearning-mastering-online-discussion-board-facilitation.pdf


Summative Assessment

Summative assessment:  bringing all the pieces together to build a whole!  (Khalsa, 2012)

This week I experienced this in a very authentic manner:  creating a website that required me to draw on all of my course experiences and learning in addition to creating something useful and relevant to my teaching.  So, in the midst of all my reflecting, refining and processing, I was learning how to create a webpage.

As I reflect on the process of creating my project, I gain a renewed appreciation for project-based learning, but I’ll hold fast to my belief that project-based learning must be used judiciously and at an appropriate level.  It is not a K-12 panacea, but it can be added to a teacher’s repertoire to help ensure that students actually reach the higher levels of Bloom’s Taxonomy.

One staggering realization from this experience that will guide my teaching is the number of revisions I made to my “pieces” as I produced the “whole.”  Allowing for revision will be important to students, too, as they produce culminating projects of this magnitude.  Encouraging revision helps students understand that their knowledge is continually growing, setting the stage for them to realize the power of lifelong learning.


Khalsa, D.K.  (2012, July 14).  Pieces build the whole [Msg Bloom’s Taxonomy].  Message posted to https://uwstout.courses.wisconsin.edu/d2l/lms/discussions/messageLists/frame.d2l?isShared=False&fid=457162&tid=1564565&ou=1762164

Sandler, S.  (2012).  Assessment in E-Learning:  Culminating Project.  Available at http://assessmentinelearning.weebly.com

Cybercoaching (and Constructivism)

I suppose I’ve had exceptional online learning experiences, because to me, cybercoaching and online learning are one and the same.  Perhaps this is a direct result of the fact that I have only encountered online education in graduate level courses, where intrinsic motivation and self-discipline both play valuable roles.  My performance in such courses was dependent upon the feedback and formative assessment inherent in cybercoaching.  In essence, doesn’t all teaching (and learning, for that matter) involve some form of coaching?

The coaching model is well-suited to constructivist pedagogy, but in a public K-12 environment (online or face to face), the constructivist approach must be viewed as a tool.  I support a more balanced approach to math instruction – a delicate balance between direct instruction and constructivism – with coaching serving as the glue that bonds these two, seemingly contradictory, approaches.  The video Math Instruction: An Inconvenient Truth highlights some of my concerns with a purely constructivist approach to math instruction, particularly at the K-12 level.  Nancy Pon’s egallery of research on constructivism (2001) led me to this enlightening quote:  “when you practice listening to students – as opposed to listening for THE response to your questions – you will begin to understand that constructivism isn’t about a method of teaching, but rather a way of thinking about teaching and learning” (Mikusa, 1999).

A pre-course survey for an online course provides one example of a constructivist approach:  completing the survey encourages self-analysis and gives the instructor information that allows him/her to better serve as a coach and facilitator.  This is a pre-course survey for an online Analytic Geometry elective that I created using Polldaddy.  It will be included in my final project.


McDermott, M.J.  (2007).  Math Instruction:  An Inconvenient Truth.  Retrieved July 20, 2012 from http://www.youtube.com/watch?v=Tr1qee-bTZI

Polldaddy.  (2006).  Available at http://polldaddy.com

Pon, N.  (2001).  Constructivism in the Secondary Mathematics Classroom.  University of Calgary.  Retrieved July 22, 2012 from http://people.ucalgary.ca/~egallery/volume3/pon.html

Sandler, S.  (2012).  Analytic Geometry Pre-Course Survey.  Retrieved July 21, 2012 from http://ssandler.polldaddy.com/s/analytic-geometry-pre-course-survey

Taxonomy of Assessment

Even before the advent of Bloom’s Revised Taxonomy and Bloom’s Digital Taxonomy, I viewed Bloom’s Taxonomy, particularly for math instruction, as a process.  Unfortunately, many math students started and ended a course at the lower-order thinking skills (LOTS) level of Knowledge and, once in a while, at the Comprehension level.  Hence the never-ending math teacher’s cry:  “These students have no retention and can’t use what I’ve taught them!”

As a result, my approach has always been to guide my students through each level of Bloom’s Taxonomy, but not always in the same order or using the same methods.  In terms of content, I have always expected my students to understand the big math picture graphically, algebraically, numerically, verbally, and contextually.  Together, these two constructs make for powerful math instruction.  But… can I make that happen in an online class?

Churches’ article is, for me, nothing short of a miracle.  It provides the much-needed explanation of how things that occur in my face-to-face classroom can migrate to the online classroom.  By pairing his suggestions with the tools we’ve been examining, I am starting to see this come together.

Hand in hand with Churches’ article is the Digital Bloom’s Visual (Fisher, 2009), which matches digital tools with levels of Bloom’s Taxonomy.

I have typically assessed students at every level of Bloom’s Taxonomy as well.  The only way all of this can happen in an online class is to create projects and activities that combine assessments and levels of understanding.  So… the building process begins.

Thus far I have a concept map, and this week I created the assessment taxonomy shown below.


Bloom categories (with their learning objective verbs), each followed by a sample objective:

Knowledge (recall, list, define, identify, collect, label)

• define:  Students will use a digital class glossary to define and summarize new terminology related to conic sections. (blog/wiki/Moodle)

Comprehension (summarize, describe, interpret, predict, discuss)

• describe:  Students will describe their cutting-conics theory in a digital reflection/response journal entry. (journal/Moodle)

Application (apply, demonstrate, illustrate, classify, experiment, discover)

• experiment:  Students will experiment with an applet to formulate a theory on cutting conics.

Analysis (analyze, classify, connect, explain, infer)

• formulate:  Students will experiment with an applet to formulate a theory on cutting conics.

Synthesis (combine, integrate, plan, create, design, formulate)

• create:  Students will create a digital multimedia presentation demonstrating real-world application of conic sections.

Evaluation (assess, recommend, convince, compare, conclude, summarize)

• assess:  Student presentations (from the synthesis activity) will be assessed in a workshop in which students assess their own and others’ presentations. (Blue Harvest)

• compare & contrast:  Students will compare and contrast conic sections and their equations by completing a template.

Armed with these two resources, writing learning objectives became a clear process.  I was able to write the following objectives for an Analytic Geometry unit on conic sections:

  • Students will formulate a Cutting Conics theory by conducting an experiment using an interactive applet and will describe their theory with 100% accuracy using a video communication tool.
  • Students will compile, refine, and revise a digital class glossary of conic section terminology by contributing a minimum of x entries to the class wiki.
  • Using a blog, students will maintain an online journal for reflection on learning and/or response to teacher-created prompts at a minimum 80% participation rate.
  • Students will compare and contrast conic sections and their equations by completing an individual template and by contributing to the class template posted on the class wiki.
  • Students will design and create a digital multi-media presentation demonstrating real-world application of conic sections according to the rubric at a minimum 80% performance level.
  • Students will assess their digital multi-media presentation according to the rubric.
  • Students will use the rubric to evaluate and provide feedback for at least x peer multi-media presentations.

Assuming these components are acceptable, I feel I am well on my way to completing my final project.  A lot of detailed writing and planning remains, but the basics are in place and I’ve set the stage for completion.

Additional Resources

Churches, A. (2009).  Bloom’s Digital Taxonomy.  edorigami.  Retrieved July 10, 2012 from http://edorigami.wikispaces.com/file/view/bloom%27s+Digital+taxonomy+v3.01.pdf

Fisher, M. (2009).  Digital Bloom’s Visual.  Digigogy.  Retrieved July 10, 2012 from http://digigogy.blogspot.com/2009/02/digital-blooms-visual.html

The Differentiator.  Byrdseed.  Retrieved July 11, 2012 from http://www.byrdseed.com/differentiator/

Variety of Assessment Tools

This week I had the opportunity to investigate and evaluate assessment tools from four categories:  quiz and test builders, collaboration tools, course tracking tools, and Web 2.0 assessment tools, via a group jigsaw activity.  Although I was responsible for evaluating only one tool, I’ve learned an awful lot about the tools our team members chose through the collaborative editing and unifying process.  Our final project is in the form of a wiki:  http://assessmenttoolssu2012.wikispaces.com/.

Included in the wiki is a concept map which I hope to expand upon by incorporating the research of the entire class (with permission, of course).  Additionally, I’d like to find time to investigate suggested tools that were not evaluated by classmates.

Assessment Toolbox Concept Map


On Collaboration

This jigsaw project gave me a new perspective on, and appreciation for, collaboration, particularly online collaboration with a final product as the learning goal.  I found it interesting that even though the collaboration occurred online, the roles that group members assumed were the same as if the collaboration had occurred face to face.  I suppose, then, that the role one plays in a group is determined more by one’s personality traits than by the constructs of the learning situation.  The fact that we teamed online amplified our diversity, particularly in terms of location (internet access and time zone) and availability (time of day and other planned or unplanned commitments).  These are considerations an online instructor must take into account when designing group projects.

Perfect e-Storm

How exciting to be at the confluence of rapidly advancing technologies, increased learner demand, a move toward learner-centered pedagogy, and severe budget cuts and/or restraints!  Dr. Curtis J. Bonk describes this as the perfect e-storm for the education community.  We are on the cusp of a shift in the teaching and learning paradigm — for all learners everywhere!  What makes this doubly exciting is that anyone, anywhere can collaborate and contribute in a meaningful way to this movement.

Daphne Koller (photo by James Duncan Davidson)

At last week’s TEDGlobal 2012, Daphne Koller touched on many of these same topics, but from the perspective of educational equality, as described in Ben Lillie’s guest blog.  She pointed out that with massive online education, “amazing talent could be found anywhere” (Koller, 2012).

So how can I, as an online instructor, be sure that I am fostering higher level thinking skills and learning for learning?

This week I took two baby steps in designing an online instructional module:  writing concise learning objectives and creating a concept map for student learning.  I found both tasks challenging.  While the learning objectives are still a work in progress, the concept map pushed my innate linear mathematician thinking in new directions.  Here is my attempt at a concept map for teaching conic sections in an online environment… again, a work in progress…

Conic Sections Concept Map

Emerging Practices of Online Assessment

This week we focused on the differences between teacher-centered and learner-centered class structures, as well as tools that promote a learner-centered environment, including performance tasks, interactive inventories, rubrics, blogs, and other authentic assessments.


This is my first experience creating a blog, although I’ve been intrigued by their classroom uses.  I am particularly fond of the class-scribe concept as well as the “before the test” reflective post.  Both are learner-centered and encourage students to self-assess.  The class-scribe use of a blog guarantees student participation and gives each student the opportunity to contribute in a meaningful way.  The “before the test” reflective post could also be used to guide valuable class review time.  A class blog would fit well into my current teaching situation, but I’m not so sure about using a blog in an online course.  “5 Questions about Classroom Blogging” is a VoiceThread that demonstrates another useful tool that could be incorporated into a blog.

Interactive Inventory (a reusable learning object survey)

Did I miss the point, or was this just a self-test… a check for understanding?  The most valuable part of this survey was its reusable nature.  Although extremely non-threatening, it was a quiz and a review rolled into one.

Authentic Assessments

“It is better to solve one problem five different ways, than to solve five problems one way.” — George Polya

I have always been interested in problem solving:  specifically, how students learn problem-solving techniques and how I can facilitate that growth.  I have incorporated explanations of problem solving into my classes but have been looking for a way to expand on that.  While reading about authentic assessments, it occurred to me that I could have my students create a digital problem-solving portfolio, which would, I hope, demonstrate their growth as problem-solvers in mathematics and in my class.  I see this beginning with a basic inventory (perhaps a survey would be the tool of choice) and then allowing students to add artifacts and, more importantly, their reflections on problem solving throughout the school year.  Some sort of closing inventory/reflection would also be necessary.  I’ve got to do some research to expand on this idea.

Additional Selected Readings/Links

Shawn Cornally’s TEDx talk and Formative Assessment/Feedback/Grading Tool