This study compares expert assessment of virtual surgical procedures, using the Objective Structured Assessment of Technical Skills (OSATS), with the automatic assessment and feedback provided by a surgical simulator for hysteroscopic procedures. The simulator's existing multi-metric scoring system was extended to cover hysteroscopic myomectomy, and the original OSATS was adapted to the examined surgical procedure. OSATS reliability, expert coherence, and interrater agreement with simulator feedback were investigated in a study with eight experts, each of whom rated the same six videos of virtual procedures performed at a hysteroscopy training course. For the task-specific checklist, the reliability of the simulator was significantly higher than that of the individual human raters (p = 0.006). In addition, the simulator and the expert consensus opinion produced the same rank order of the overall scores across all videos. This is a first step toward simulator feedback with the same reliability as an expert panel, facilitating competency-based surgical education and assessment in the near future.