TY  - JOUR
AU  - Valle, Matthew
AU  - Davis, Kirk
AB  - Inter-rater agreement in a peer performance evaluation system was analyzed using a sample of 44 individuals who rated focal persons in seven teams. Objective information concerning individual performance on multiple-choice tests, as well as information gleaned from individual contributions to team testing and team graded exercises, resulted in high inter-rater reliabilities (assessed via ICCs) and strong criterion-related validity for the performance evaluation instrument. The discussion centers on the effect of providing objective job performance information to evaluation participants.
TI  - Teams and performance appraisal: using metrics to increase reliability and validity
JF  - Team Performance Management
DO  - 10.1108/13527599910304912
DA  - 1999-12-01
UR  - https://www.deepdyve.com/lp/emerald-publishing/teams-and-performance-appraisal-using-metrics-to-increase-reliability-50KUKONvsm
SP  - 238
EP  - 244
VL  - 5
IS  - 8
DP  - DeepDyve
ER  - 