All observers agreed that none of the scans were normal and that there was no epidural or subdural hematoma (kappa value of 1.0). The inter-observer agreement values are summarized in Table 1. The provision of optimal radiology services may require continuous vigilance and perhaps quality assurance interventions [1-3]. The optimal content of such interventions is not obvious; in addition, how error, discrepancy, and differences of opinion should be managed, in both theory and clinical practice, remains unclear [4]. There was moderate agreement (kappa 0.525; 95% CI 0.055-0.995) between NG and NR. Agreement was substantially higher in the other group pairings, reaching near-perfect agreement between NG and RR (kappa 0.842; 95% CI 0.540-1.000). When the side and location of the SAH were considered, there was only fair to moderate agreement in most group pairings, except between RG and RR, where agreement remained substantial to almost perfect. Several previous studies have examined the degree of agreement between experienced radiologists and radiology residents interpreting CTPA in the emergency setting [23-25]. Our study shows good IOA between on-call resident and staff radiologist interpretations of CTPA for PE, with an overall agreement of 91.4% (kappa 0.81). This is consistent with the study by Shaham et al., which found kappa statistics of 0.7 and 0.8, indicating that residents' preliminary on-call interpretations of CTPA studies for PE are reasonably accurate [26].
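For readers who wish to reproduce this kind of pairwise analysis, the following is a minimal sketch of how an unweighted Cohen's kappa with an approximate 95% CI can be computed from a reader-versus-reader table. The cell counts and the simple large-sample standard error are illustrative assumptions, not values or methods taken from this study.

```python
import numpy as np

def cohens_kappa(table):
    """Unweighted Cohen's kappa for a square reader-vs-reader table.

    Returns kappa and an approximate 95% CI based on the simple
    large-sample standard error sqrt(po*(1-po) / (n*(1-pe)^2));
    the exact asymptotic variance formula is more involved.
    """
    table = np.asarray(table, dtype=float)
    n = table.sum()
    po = np.trace(table) / n                 # observed agreement
    rows = table.sum(axis=1) / n             # reader-1 marginal proportions
    cols = table.sum(axis=0) / n             # reader-2 marginal proportions
    pe = (rows * cols).sum()                 # chance-expected agreement
    kappa = (po - pe) / (1 - pe)
    se = np.sqrt(po * (1 - po) / (n * (1 - pe) ** 2))
    return kappa, (kappa - 1.96 * se, kappa + 1.96 * se)

# Hypothetical 2x2 table (finding present / absent) for two reader groups;
# the per-pair tables behind the kappas quoted above are not reproduced here.
table = [[12, 2],
         [1, 15]]
k, (lo, hi) = cohens_kappa(table)
print(f"kappa = {k:.3f}, 95% CI {lo:.3f}-{hi:.3f}")
```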

Similarly, Gimberg et al. reported an overall agreement of 93% (kappa 0.8) between radiology fellows and radiology faculty in the interpretation of CTPA [27]. Yavas et al., although reporting good but slightly lower agreement (kappa statistic 0.7) between residents and experienced radiologists, suggest that definitive long-term treatment should not be based solely on the preliminary resident reading [28]. Our results also extend to the location of PE: when a study was interpreted as positive, there was good agreement between the staff radiologist and the on-call resident on the location of the pulmonary embolism. While some studies have found great variability in inter-observer agreement in the assessment of hydrocephalus, we found substantial to almost-perfect agreement. The wide variability of these studies has been attributed to the subjective interpretation of the presence or absence of hydrocephalus [27]. There are many quantitative methods for assessing hydrocephalus, and ratio measurements such as the VCR are increasingly used in place of subjective assessment. Van Zagten et al. found high inter-observer agreement for the VCR in the assessment of brain atrophy (ICC 0.82; 95% CI 0.75-0.94) [14].
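As a rough illustration of how an ICC such as the one reported by van Zagten et al. can be computed, here is a minimal sketch of ICC(2,1) (two-way random effects, absolute agreement, single measurement) from a subjects-by-raters matrix. The VCR values and the two-rater design are illustrative assumptions; the original study's data and exact ICC form are not stated in the text above.

```python
import numpy as np

def icc2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    ratings: (n_subjects, k_raters) array of continuous measurements
    (e.g., VCR values). Point estimate only; for confidence intervals
    and other ICC forms, a dedicated package would normally be used.
    """
    Y = np.asarray(ratings, dtype=float)
    n, k = Y.shape
    grand = Y.mean()
    subj = Y.mean(axis=1)      # per-subject means
    rater = Y.mean(axis=0)     # per-rater means
    # Mean squares from the two-way ANOVA decomposition
    msr = k * ((subj - grand) ** 2).sum() / (n - 1)    # between subjects
    msc = n * ((rater - grand) ** 2).sum() / (k - 1)   # between raters
    sse = ((Y - subj[:, None] - rater[None, :] + grand) ** 2).sum()
    mse = sse / ((n - 1) * (k - 1))                    # residual
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Illustrative VCR measurements for 5 subjects rated by 2 observers
ratings = [[0.30, 0.32], [0.45, 0.44], [0.28, 0.30],
           [0.51, 0.49], [0.38, 0.40]]
print(f"ICC(2,1) = {icc2_1(ratings):.2f}")
```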

This is supported by the low agreement found when a similar brain atrophy assessment study used a subjective estimate based on categorical scales (kappa 0.11-0.59) [8]. Of the 696 CTPAs assessed by residents, 128 (18.4%) were positive, 486 (70%) negative, and 82 (12%) indeterminate. Staff radiologists, in turn, reported on 694 CTPAs, of which 126 (18%) were positive, 493 (71%) negative, and 75 (11%) indeterminate. Table 1 presents the cross-classification of CTPA evaluation results for patients with interpretations available from both residents and staff radiologists. Of the 694 CTPAs read by both residents and staff radiologists, the overall agreement rate was 91.4% (634 of 694), with a total of 60 discordant interpretations between residents and staff radiologists, a discordance rate of 8.6%. Agreement for positive, negative, and indeterminate readings was 0.89, 0.95, and 0.75, respectively. The Stuart-Maxwell test indicated marginal homogeneity between readers (χ² (2 df) = 1.47, p = 0.48). Good agreement between residents and staff radiologists was reached, with a kappa of 0.81 (95% CI 0.77-0.86). Excluding indeterminate cases and considering only the 596 studies read as either positive or negative, 21 (3.5%) discordant interpretations ("frank" discordances) were identified.
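To make the arithmetic behind these figures concrete, the sketch below recomputes the overall agreement, the per-category agreement (here assumed to be the proportion of specific agreement, 2*n_ii / (n_i. + n_.i), which reproduces the reported 0.89/0.95/0.75), Cohen's kappa, and the Stuart-Maxwell statistic from a 3x3 resident-versus-staff table. The cell counts are hypothetical: they were chosen to match the reported marginals, the 634/694 overall agreement, and kappa of roughly 0.81, but they are not the actual Table 1 counts, so the Stuart-Maxwell statistic will not exactly reproduce the reported χ² = 1.47.

```python
import numpy as np
from scipy.stats import chi2

# Hypothetical 3x3 cross-classification (residents = rows, staff = columns;
# categories: positive / negative / indeterminate). NOT the paper's Table 1.
table = np.array([[113,  13,   2],
                  [  8, 463,  15],
                  [  5,  17,  58]], dtype=float)

n = table.sum()
po = np.trace(table) / n                          # overall agreement
rows, cols = table.sum(axis=1), table.sum(axis=0)
pe = (rows * cols).sum() / n**2                   # chance-expected agreement
kappa = (po - pe) / (1 - pe)

# Proportion of specific agreement per category: 2*n_ii / (n_i. + n_.i)
specific = 2 * np.diag(table) / (rows + cols)

# Stuart-Maxwell test of marginal homogeneity (drop the last category):
# chi2 = d' V^{-1} d with d_i = n_i. - n_.i and df = k - 1
d = (rows - cols)[:-1]
V = np.empty((2, 2))
for i in range(2):
    for j in range(2):
        if i == j:
            V[i, i] = rows[i] + cols[i] - 2 * table[i, i]
        else:
            V[i, j] = -(table[i, j] + table[j, i])
stat = d @ np.linalg.inv(V) @ d
p = chi2.sf(stat, df=2)

print(f"overall agreement {po:.3f}, kappa {kappa:.2f}")
print("specific agreement (pos/neg/ind):", np.round(specific, 2))
print(f"Stuart-Maxwell chi2(2 df) = {stat:.2f}, p = {p:.2f}")
```

Because the off-diagonal cells are invented, only the marginals, overall agreement, specific agreement, and kappa are guaranteed to match the values quoted above.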