Content:
The content was based on the TeamSTEPPS framework. Item content for the observational tools was drawn from previously validated instruments. Experts in teamwork and assessment reviewed the tools and items for alignment with, and coverage of, the TeamSTEPPS constructs. The expert panel identified communication as an underrepresented construct, and additional items were developed to assess it.
Response Process:
The expert observational tool was judged too long by raters and was reduced to the final 18-rating form. The video observational tool was found too time consuming when individual behavior timestamps were required, so this feature was dropped. Scale midpoints (i.e., option 3 on a 5-option scale) were eliminated, and a "not enough information" option was added to better capture rater judgments.
Internal Structure:
The overall inter-rater reliability of the novice, expert, and video observational tools was good (Intraclass Correlation Coefficient (ICC) = 0.85, 0.76, and 0.90, respectively). The inter-rater reliability for four of the five domains in the expert observational tool was acceptable (ICC = 0.44-0.66); the exception was the team structure domain (ICC = 0.21). The inter-rater reliability for all five domains in the video observational tool was acceptable (ICC = 0.54-0.84).
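For readers who want to see how such reliability figures can be obtained, the sketch below shows one way an ICC might be computed from rater data in Python using the pingouin library. The data, column names, and choice of ICC form are illustrative assumptions only and do not reproduce the authors' analysis.

    # Minimal sketch of an ICC calculation for one observational tool,
    # assuming ratings are stored in long format with one row per
    # (scenario, rater) pair. All data here are hypothetical.
    import pandas as pd
    import pingouin as pg

    # Hypothetical ratings: 3 raters scoring 4 scenarios on a 1-5 scale.
    ratings = pd.DataFrame({
        "scenario": [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4],
        "rater":    ["A", "B", "C"] * 4,
        "score":    [4, 5, 4, 2, 2, 1, 5, 4, 5, 3, 3, 4],
    })

    # pingouin reports single- and average-measure variants (ICC1-ICC3k);
    # which form matches the published values is an assumption here.
    icc = pg.intraclass_corr(
        data=ratings, targets="scenario", raters="rater", ratings="score"
    )
    print(icc[["Type", "ICC", "CI95%"]])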
Relation to Other Variables: