Instructors endorsement and rating scheme

#41
Okay cool, yeah. This is why I wanted to define this: I want complete tracking on that particular category, and I figured there may be confusion elsewhere for others.

Capability: the instructor's ability to effectively apply the skills and techniques they are teaching during the class.


I think we may want to have a "training aids" category for this system as well. I'm imagining a TCCC style course without having wound packing simulation equipment or an analog for needle decompression.

Yes, there should be a skills application and training aids part to a TCCC type course. These will also get a little interesting when it comes to target audience. I don't think anyone out there is still teaching decompression to civilian classes, for example. One way to cover that is by categorizing classes as either open enrollment or closed enrollment. We can expect LE/Mil only classes to cover content that would not be acceptable in open enrollment classes.
 

Matt Landfair

Matt Six Actual
Administrator
#42
I think a combination of end-user reviews from people who provide their background, coupled with a review board to provide final judgements, would cover a lot of bases and fill in the blanks in areas where the board may not be overly familiar.
 

#43
I received this via email too:


"I am a secondary education teacher (U.S. & World History). Within my district, and within most of my state, we use Robert J. Marzano's system both for student lesson evaluation and for teacher evaluations. I could definitely see a modified version of Marzano's method used to evaluate instructors. For example: when a teacher is being evaluated, they are evaluated on a very specific item, such as "Engages students in learning tasks that develop understandings and/or skills relevant to the learning objective." This skill is then rated on a 1-5 scale, ranging from 1 ("not using") to 5 ("innovating"). There is a VERY specific rubric that details what each level should look like when observed (I am attaching an example image of a rubric for the above component).

I feel that the use of a similar system would address or eliminate many concerns, for example:

Bias can, to a large degree, be removed from the equation. If level 3 performance ("developing") in time management were described in the rubric as "Instructor does not take excessive breaks, and spends time both using direct instruction and engaging students in active participatory learning," then an instructor doing all of those things rates at least a 3, and so on. Having very specific descriptions of what each level looks like ensures everyone observing and rating instructors is on the same page, despite personal preferences.

The system is also highly flexible. In the Marzano teacher evaluation model, the teacher and administrator come to an agreement on which elements will be evaluated. In a similar system, the individual taking the class could select the elements they will evaluate before they take the class. Those elements could be reasonably similar across instructors teaching similar classes, while allowing irrelevant categories not to affect the apparent score of an otherwise high-quality instructor. Bobby Instructor may be a great teacher and have all his ducks in a row, but if he has no Mil/LEO background, why should he be evaluated on that? (I think a Mil/LEO acknowledgement would make a good addition to an overall rating outside of the score; for example, a single sentence stating whether or not the individual has such experience and whether it was relevant to the material being taught.) It also allows the items being evaluated to be differentiated based on the type of class: pistol class? Precision marksmanship? Medical? All have different areas that would be important to evaluate, with little crossover. A more flexible system allows for a more accurate and relevant score."
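The element-selection idea described above can be sketched as a small data model. This is purely an illustrative sketch: only levels 1 ("not using"), 3 ("developing"), and 5 ("innovating") are named in the thread, so the other level labels here are placeholders, and all names are made up.

```python
from dataclasses import dataclass, field

# 1-5 scale from the email above; only "not using" (1), "developing" (3),
# and "innovating" (5) are named in the thread -- 2 and 4 are placeholders.
LEVELS = {1: "not using", 2: "beginning", 3: "developing",
          4: "applying", 5: "innovating"}

@dataclass
class Evaluation:
    """Scores only the elements agreed on before the class."""
    scores: dict = field(default_factory=dict)  # element name -> level (1-5)

    def rate(self, element: str, level: int) -> None:
        if level not in LEVELS:
            raise ValueError(f"level must be 1-5, got {level}")
        self.scores[element] = level

    def overall(self) -> float:
        # Irrelevant categories were never selected for this class,
        # so they cannot drag down the average.
        return sum(self.scores.values()) / len(self.scores)

# Example: a pistol class evaluated on three pre-selected elements.
ev = Evaluation()
ev.rate("time management", 3)        # "developing"
ev.rate("safety briefing", 5)        # "innovating"
ev.rate("demonstration quality", 4)
print(ev.overall())  # 4.0
```

The key design point is that the denominator in `overall()` is the number of *selected* elements, so an instructor is never penalized for categories that were agreed to be irrelevant to their class.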
 
#52
Sorry for the delay; I wanted to make sure it was what I wanted it to be before I posted it.

Class title/duration of class:
Location:
Total cost for class (ammo requirement, travel, lodging, food, etc.):
Class specific/required gear/what gear did you use:
Why I took this class:
What my expectations were and if they were met:
How well did the instructor (and the AIs) convey and facilitate the learning of the topic material (including their personal knowledge and experience on the topic material):
What was the most positive/remarkable part of the class:
What was the most negatively noteworthy part of the class (if any):
Were there any prerequisites and/or do you believe there should have been any:
Would you recommend this class to others and why:
Will you take this class again:
Will you take a class with this instructor again:
Overall rating:


A little on the long side, but it covers everything I'd like to know from an AAR.
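The AAR template above could be captured as a simple record so reviews stay uniform across posters. A minimal sketch, assuming fields that mirror the template; the example values are made up:

```python
from dataclasses import dataclass

@dataclass
class ClassAAR:
    # Fields mirror the AAR template above.
    title_and_duration: str
    location: str
    total_cost: str              # ammo, travel, lodging, food, etc.
    gear_used: str
    why_taken: str
    expectations_met: str
    instruction_quality: str     # instructor and AIs
    most_positive: str
    most_negative: str
    prerequisites: str
    recommend_and_why: str
    take_class_again: bool
    take_instructor_again: bool
    overall_rating: int          # assuming the 1-5 scale under discussion

    def validate(self) -> None:
        if not 1 <= self.overall_rating <= 5:
            raise ValueError("overall rating must be 1-5")

# Example AAR with made-up values.
aar = ClassAAR(
    title_and_duration="2-day pistol fundamentals",
    location="Example Range",
    total_cost="$450 tuition + 800 rds + lodging",
    gear_used="duty belt, OWB holster",
    why_taken="build fundamentals",
    expectations_met="yes, met",
    instruction_quality="clear demos, good AI-to-student ratio",
    most_positive="one-on-one coaching",
    most_negative="long admin breaks",
    prerequisites="none; basic handling recommended",
    recommend_and_why="yes, strong fundamentals focus",
    take_class_again=False,
    take_instructor_again=True,
    overall_rating=4,
)
aar.validate()
print(aar.overall_rating)  # 4
```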
 
#53
Are we still going to be breaking down the 1-5 rating numbers into what each stands for or have we decided to move away from that?

I am the kitfox
 

RustyM92

Amateur
WARLORD
#57
I'm the guy who sent the email suggesting the Marzano approach. I'm happy to answer any questions as you move forward on this. One thing I would mention is that the rubric descriptions should be as objective as possible. I'm attaching an example of a rubric.
That looks exactly like what the Marine Corps uses on their Instructional Rating Forms (IRFs).
 
#58
Do you have that rubric as well? I'd be interested to see them both and pull from them.

 

#59
This is a screenshot from the digital edition of MCO 1553.2B. The full version is available here:
http://www.t3s.marines.mil/Portals/64/MCO_1553.2B.pdf

[Attachment: IMG_0941.PNG]
These would be used to rate instructors based on simple random sampling of the student population after a period of instruction. It's not exact, but seems similar to the Marzano approach to scoring. There's also another template for instructor peer reviews called the Instructor Evaluation Checklist, here:
[Attachments: IMG_0954.PNG, IMG_0939.PNG, IMG_0940.PNG]
Hope that helps!
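The "simple random sampling of the student population" mentioned above is easy to sketch. A minimal example; the roster names, class size, and 20% sample rate are all assumptions for illustration:

```python
import random

# Made-up 30-student roster.
roster = ["student%02d" % i for i in range(1, 31)]

random.seed(7)  # fixed seed only so the draw is repeatable in this example
sampled = random.sample(roster, k=6)  # survey roughly 20% of the class

# Each sampled student would then fill out a rating form
# after the period of instruction.
print(len(sampled))  # 6
```

`random.sample` draws without replacement, so no student is surveyed twice and every student has an equal chance of selection, which is what makes the sampling "simple random."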