Agreement Coding

Krippendorff's alpha generalizes several known statistics, often called measures of inter-coder agreement, inter-rater reliability, or reliability of coding for given sets of units (as distinct from unitizing), but it also differs from statistics that are called reliability coefficients yet are not suited to the particulars of coding data generated for subsequent analysis.

Agreement on the assignment of a code to a particular segment is indicated by a green symbol in the first column; a red symbol in this column indicates that there is no agreement for that segment. Here, too, two results tables are generated: the code-specific results table and the detailed agreement table. In qualitative analysis, the analysis of intercoder agreement is mainly aimed at improving coding guidelines and individual codes. Nevertheless, it is often desirable to calculate the percentage of agreement, particularly for the research report to be written later. This percentage of agreement can be viewed in the code-specific results table mentioned above, which reports it both for individual codes and across all codes. In the "Kappa (RK)" column, the results table shows a chance-corrected value for the percentage of agreement. It takes into account the probability that two people would assign the same codes in a document purely by chance, that is, if they simply chose codes at random without considering the data.
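To illustrate the chance correction described above, the following sketch computes the simple percentage of agreement and a chance-corrected coefficient for two coders over a list of segments. It uses Cohen's kappa as a stand-in for the chance-correction idea; the exact formula behind the software's "Kappa (RK)" column is not specified here, so this is an assumption for illustration, not a reproduction of that column. The coder data are toy values.

```python
from collections import Counter

def percentage_agreement(codes_a, codes_b):
    """Share of segments to which both coders assigned the same code."""
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return matches / len(codes_a)

def cohens_kappa(codes_a, codes_b):
    """Chance-corrected agreement for two coders (Cohen's kappa).

    Expected agreement is the probability that both coders pick the same
    code by chance, estimated from each coder's marginal code frequencies.
    """
    n = len(codes_a)
    p_observed = percentage_agreement(codes_a, codes_b)
    freq_a = Counter(codes_a)
    freq_b = Counter(codes_b)
    p_expected = sum(
        (freq_a[c] / n) * (freq_b[c] / n) for c in set(codes_a) | set(codes_b)
    )
    return (p_observed - p_expected) / (1 - p_expected)

# Two coders' code assignments for the same ten segments (toy data).
coder_1 = ["A", "A", "B", "B", "C", "A", "B", "C", "A", "B"]
coder_2 = ["A", "B", "B", "B", "C", "A", "A", "C", "A", "B"]

print(f"Percentage agreement: {percentage_agreement(coder_1, coder_2):.2f}")
print(f"Cohen's kappa:        {cohens_kappa(coder_1, coder_2):.2f}")
```

For these toy data the raw agreement is 0.80, while the chance-corrected value drops to about 0.69, because two coders using three codes would already agree fairly often by chance alone.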

The calculation is only meaningful if the option to count unassigned codes as matches is selected, and it is therefore only shown when this option is enabled. The criterion is the frequency with which the code occurs in the document, or more precisely, the frequency with which its assignment agrees between the coders.

Merely describing a statistic as a measure of consistency, reproducibility, or reliability does not make it a valid indicator of whether coded data can be relied upon for subsequent decisions. Its mathematical structure must fit the process of coding units into a system of analyzable terms. Krippendorff's alpha coefficient[1], named after the academic Klaus Krippendorff, is a statistical measure of the agreement achieved when coding a set of units of analysis in terms of the values of a variable. Since the 1970s, alpha has been used in content analysis, where textual units are categorized by trained readers; in counseling and survey research, where experts code open-ended interview data into analyzable terms; in psychological testing, where alternative tests of the same phenomena must be compared; and in observational studies, where unstructured events are recorded for subsequent analysis.
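To make the definition concrete, here is a minimal sketch of Krippendorff's alpha restricted to nominal data, two coders, and no missing values; in this special case the disagreement metric reduces to a simple match/mismatch comparison, and alpha is 1 minus the ratio of observed to expected disagreement computed from the coincidence matrix. The function name and data are illustrative, not part of any particular software's interface.

```python
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(codes_a, codes_b):
    """Krippendorff's alpha for two coders, nominal data, no missing values.

    alpha = 1 - D_o / D_e, where D_o is the observed disagreement and D_e
    the disagreement expected by chance, both derived from the coincidence
    matrix of paired code values.
    """
    # Every unit coded by both coders contributes the ordered value pairs
    # (a, b) and (b, a) to the coincidence matrix.
    coincidences = Counter()
    for a, b in zip(codes_a, codes_b):
        coincidences[(a, b)] += 1
        coincidences[(b, a)] += 1

    values = set(codes_a) | set(codes_b)
    n_c = {c: sum(coincidences[(c, k)] for k in values) for c in values}
    n = sum(n_c.values())  # equals twice the number of units

    # Nominal disagreement: 0 if the codes match, 1 otherwise, so only the
    # off-diagonal cells of the coincidence matrix count.
    d_observed = sum(coincidences[(c, k)] for c, k in permutations(values, 2)) / n
    d_expected = sum(n_c[c] * n_c[k] for c, k in permutations(values, 2)) / (n * (n - 1))
    return 1 - d_observed / d_expected

coder_1 = ["A", "A", "B", "B", "C", "A", "B", "C", "A", "B"]
coder_2 = ["A", "B", "B", "B", "C", "A", "A", "C", "A", "B"]
print(f"Krippendorff's alpha (nominal): {krippendorff_alpha_nominal(coder_1, coder_2):.3f}")
```

The general coefficient additionally handles more than two coders, missing values, and other data levels (ordinal, interval, ratio) by substituting a different disagreement metric; this sketch covers only the simplest nominal case.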