Source: frameworks/kit-interview-scorecard-design/02-terminology.md

# Terminology — Interview Scorecard Design Kit

This file is the canonical source for all vocabulary used in scorecard production. When any other file in this kit, in a deployment kit, or in a client deliverable uses a term defined here, this definition applies. If a term appears differently elsewhere, this file wins.


## Core Deliverable Terms

| Term | Definition | Do Not Use |
| --- | --- | --- |
| Scorecard | The structured evaluation tool that interviewers use to assess a candidate. Contains focus areas, scoring criteria, behavior-based questions, and a recommendation framework. Produced as a client-specific document customized to the role, organization, and interview team. | "evaluation form," "interview sheet," "rating form," "feedback form" |
| Rubric | The evaluation criteria and standards that define what "good" looks like for each focus area. The rubric is the intellectual framework; the scorecard is the document that contains it. In some practices these are the same artifact. During extraction, confirm whether the practitioner treats them as one document or two. | "grading criteria," "standards matrix" |
| Focus area | A specific competency domain assigned to an interviewer for deep evaluation. Each focus area has a description, evaluation criteria, and 3-5 behavior-based questions. Examples: strategic vision, financial acumen, team development, stakeholder management. | "category," "section," "topic," "area of interest" |
| Behavior-based question | An interview question that asks the candidate to describe a specific past experience — what they did, how they did it, and what happened. Structured to elicit evidence of competency rather than hypothetical intent. | "interview question" (too generic), "behavioral question" (acceptable but less precise) |
| Recommendation | The interviewer's overall assessment of the candidate after completing their evaluation. Expressed on a defined scale (e.g., Strong Yes / Yes / No / Strong No) with required written justification. | "rating," "grade," "verdict," "vote" |
| Version | The iteration number of the scorecard design. v1 = first build. Increment when evaluation criteria, focus areas, or scoring methodology change. Do not increment for formatting corrections. | "draft number," "revision" |

## Design Process Terms

| Term | Definition | Do Not Use |
| --- | --- | --- |
| Extraction interview | A structured interview conducted by the consultant with the recruiting practitioner. The primary source of all scorecard design decisions — focus areas, scoring methodology, question approach, interviewer preparation methods. | "discovery session," "intake interview," "requirements gathering" |
| Position profile | The document that defines the role — responsibilities, must-haves, nice-to-haves, competency expectations, organizational context. A required upstream input to scorecard design. The scorecard evaluates what the position profile defines. | "job posting," "job ad" (these are different deliverables with different purposes) |
| Must-have | A non-negotiable requirement for the role. If a candidate does not meet a must-have, they do not advance regardless of other strengths. Must-haves flow directly from the position profile into scorecard focus areas. | "requirement" (too broad), "minimum qualification" (compliance language, not evaluation language) |
| Nice-to-have | A preferred but not required qualification. May inform focus area questions but does not create a gate in the evaluation. | "preferred qualification," "bonus" |
| Kickoff meeting | The meeting where the recruiting engagement is launched with the client. Relevant to the scorecard because interview team composition, decision-maker designation, and process structure are determined here. The scorecard cannot be designed until these inputs exist. | "project launch," "intake meeting" |
| Alignment meeting | The meeting where the full interview team is prepared for their role in the evaluation process. The scorecard and its focus area assignments are presented, questions are reviewed, scoring methodology is explained, and interviewers are trained on behavior-based evaluation. | "prep session," "interviewer briefing," "training session" |
| Gap | A required piece of content that is not present in the source material. Gaps are flagged and escalated — never filled by inference or assumption. | "unknown," "TBD," "placeholder" |
| Gap report | A structured list of every gap identified before a build begins, with resolution status tracked for each item. The build does not proceed until every gap is resolved. | "open items list," "questions list" |
| Golden example | A complete, production-quality scorecard used as a design reference. Used to extract structural patterns, format, and methodology. Not a content source. | "template," "sample," "reference scorecard" |

## Evaluation Methodology Terms

| Term | Definition | Do Not Use |
| --- | --- | --- |
| Competency domain | A broad area of capability relevant to the role. Focus areas are derived from competency domains. A single competency domain (e.g., "leadership") may produce multiple focus areas (e.g., "team development," "strategic decision-making," "cross-functional influence"). | "skill area," "capability" (acceptable in general discussion but imprecise for scorecard design) |
| Scoring scale | The defined range of values an interviewer can assign to a focus area. May be numerical (1-5), descriptive (Exceeds / Meets / Below), or hybrid. Every point on the scale must have a written definition. | "rating system," "grading scale" |
| Written justification | The evidence-based explanation an interviewer provides for every score. Must reference specific candidate statements, behaviors, or demonstrated competencies — not impressions or feelings. | "comments," "notes," "feedback" (these imply optional; justification is required) |
| Anchoring bias | The cognitive distortion that occurs when an interviewer's evaluation is influenced by seeing another interviewer's scorecard before the debrief. Prevented by requiring independent submission — no cross-visibility until the debrief. | n/a (internal methodology term) |
| STAR framework | Situation, Task, Action, Result. One common structure for behavior-based questions and candidate responses. Not the only valid framework, but the most widely recognized. During extraction, confirm whether the practitioner uses STAR, a variation, or their own structure. | n/a (reference term — do not impose if the practitioner uses a different framework) |
| Defensibility | The quality of a scorecard that allows the organization to demonstrate fairness and consistency if a hiring decision is challenged. Achieved through: same questions for every candidate, documented scoring criteria, written justification, and consistent process across all candidates. | "compliance," "legal protection" (defensibility is broader than legal compliance) |

## Interview Structure Terms

| Term | Definition | Do Not Use |
| --- | --- | --- |
| Paired interview | An interview conducted by two interviewers together, evaluating the same candidate in the same session. Each interviewer has their own focus area and completes their own scorecard independently. Pairing provides two perspectives on the same interaction and a witness if process issues arise. | "panel interview" (implies more than two), "joint interview" |
| Presentation | A structured candidate presentation that is part of the evaluation process. Typically includes a resume walkthrough and a topic-based presentation. Used for director-level and above roles. Evaluation criteria are defined in the scorecard. | "pitch," "talk," "show and tell" |
| Focus area assignment | The mapping of a specific focus area to a specific interviewer. Assignments are based on the interviewer's expertise, role, and what the evaluation needs. Every focus area must be assigned to at least one interviewer; none may be left unassigned. | "topic assignment," "question assignment" |
| Interviewer preparation | The process of equipping interviewers with the scorecard, their focus area assignment, sample questions, scoring methodology, and expectations for documentation. Occurs at the alignment meeting. | "interviewer training" (acceptable for longer sessions, but "preparation" is the standard term for the alignment meeting context) |

## Debrief Terms

| Term | Definition | Do Not Use |
| --- | --- | --- |
| Debrief | The structured discussion where all interviewers share their evaluations, discuss areas of agreement and disagreement, and surface evidence that informs the hiring decision. Facilitated by the recruiting lead. The scorecard is the primary input. | "post-interview meeting," "candidate review," "roundtable" |
| Round robin | The debrief format where each interviewer presents their evaluation in turn, within a defined time limit, before open discussion begins. Ensures every voice is heard before group dynamics take over. | "go-around," "each person shares" |
| Scoring summary | The aggregated view of all interviewers' scores and recommendations for a candidate, prepared by the facilitator before the debrief. Presented at a high level (e.g., "3 Strong Yes, 2 Yes, 1 No") without attribution to specific interviewers. | "scorecard summary," "results overview" |
| Challenge | A facilitation technique where the recruiting lead pushes an interviewer to provide evidence for a vague or unsupported evaluation. "You said you liked his answer — what was the answer, and what specifically did you like about it?" Challenges are expected, not adversarial. | "pushback," "confrontation" |

## Quality Terms

| Term | Definition | Do Not Use |
| --- | --- | --- |
| Blocking failure | A QC finding that must be fixed before the scorecard is deployed. No exceptions. | "critical issue," "must-fix" |
| Warning | A QC finding that should be addressed if time permits, but does not block deployment. Must be noted for the next revision. | "minor issue," "nice to have" |
| Design integrity | The quality of a scorecard where every focus area traces back to a role requirement, every question maps to a focus area, every scoring criterion is defined, and nothing is orphaned or duplicated. | "completeness," "quality" (too vague) |