Control 1-10-5 of the Essential Cybersecurity Controls (ECC – 2 : 2024) requires organizations to conduct periodic reviews of their cybersecurity awareness program using well-defined metrics and KPIs that demonstrate effectiveness, continuous improvement, and alignment with risk. This post shows how to design those metrics practically, implement them in a small-business environment, and produce audit-grade evidence of compliance with the Compliance Framework.
What to measure: metrics vs KPIs
Start by distinguishing metrics (raw measurements) from KPIs (targeted indicators tied to business objectives). Useful metrics for an awareness program include training completion counts, phishing campaign click rates, phishing report rates, post-training test scores, remediation times, and behavior-based indicators (e.g., credential sharing incidents). KPIs are a subset with targets and timeframes — for example, “Reduce the simulated phishing click-rate to under 5% within 12 months” or “Maintain a training completion rate of at least 95% within 30 days of enrollment.” Defining this distinction up front keeps reviews focused on effectiveness rather than measurement for its own sake.
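The metric/KPI distinction can be captured directly in a metric catalog. Here is a minimal sketch in Python: the class names, fields, and example values are illustrative assumptions, not part of any particular GRC tool.

```python
from dataclasses import dataclass

@dataclass
class Metric:
    """A raw measurement: what it is, its unit, and where the data comes from."""
    name: str    # e.g. "phishing_click_rate"
    unit: str    # e.g. "%", "hours", "count"
    source: str  # e.g. "phishing platform logs", "LMS export"

@dataclass
class KPI:
    """A metric plus a target, direction, and timeframe."""
    metric: Metric
    target: float
    direction: str        # "below" = lower is better, "above" = higher is better
    deadline_months: int

    def met(self, current: float) -> bool:
        # Compare the latest measurement against the target in the right direction.
        if self.direction == "below":
            return current < self.target
        return current >= self.target

# Example: the click-rate KPI from the text (under 5% within 12 months).
click_rate = Metric("phishing_click_rate", "%", "phishing platform logs")
kpi = KPI(click_rate, target=5.0, direction="below", deadline_months=12)
```

A catalog like this keeps each KPI's target and timeframe next to the underlying metric definition, which is exactly what reviewers need to see.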
Practical implementation details for Compliance Framework
Map each metric to the Compliance Framework evidence requirements and specify data sources: LMS reports for training completion, phishing platform logs for click/report rates and timestamps, SIEM/incident tracker for user-reported suspicious emails, HR for role mapping and enrollment records, and GRC/issue tracker for corrective actions. Automate data pulls where possible (API-based exports from your LMS and phishing platform into a reporting database or BI tool) to create a repeatable record for periodic reviews. For small businesses, lightweight approaches (CSV exports, a shared spreadsheet with scripted validation, or a simple dashboard in Power BI/Looker Studio) are acceptable if they produce consistent, time-stamped artifacts.
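Even the lightweight CSV-export approach benefits from consistent, time-stamped artifacts. One way to get them is a small ingestion script that copies each raw export into an evidence folder along with a hash and a manifest. This is a sketch under assumptions: the function name, folder layout, and manifest fields are illustrative, not a prescribed format.

```python
import csv
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def ingest_export(csv_path: str, out_dir: str = "evidence") -> dict:
    """Copy a raw CSV export into an evidence folder with a UTC timestamp
    and a SHA-256 hash, so each periodic review has a verifiable artifact."""
    raw = Path(csv_path).read_bytes()
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    dest = Path(out_dir)
    dest.mkdir(parents=True, exist_ok=True)
    # Keep the untouched raw export alongside a manifest describing it.
    (dest / f"{stamp}_{Path(csv_path).name}").write_bytes(raw)
    manifest = {
        "source": csv_path,
        "collected_utc": stamp,
        "sha256": hashlib.sha256(raw).hexdigest(),
        "rows": sum(1 for _ in csv.reader(raw.decode().splitlines())) - 1,  # minus header
    }
    (dest / f"{stamp}_manifest.json").write_text(json.dumps(manifest, indent=2))
    return manifest
```

The hash and timestamp let an auditor confirm that the numbers in a dashboard trace back to an unmodified export collected at a known time.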
Example metric definitions and formulas
Provide explicit formulas so auditors and reviewers get consistent numbers: Phishing Click Rate = (Number of recipients who clicked a simulated phish / Total recipients) * 100. Phish Report Rate = (Number who reported the simulation using the report tool / Total recipients) * 100. Training Completion Rate = (Number who completed required modules / Number enrolled) * 100. Mean Time to Report (MTR) = average(time_reported - time_sent) across reported simulations. Pre/post-training Knowledge Gain = (Average post-test score - average pre-test score). Document these formulas in your metric catalog and always store raw logs used to compute them.
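The formulas above translate directly into code, which is one way to guarantee that every review cycle computes them identically. A minimal sketch (function names are illustrative; durations are reported in hours as an assumption):

```python
from statistics import mean

def phishing_click_rate(clicked: int, total: int) -> float:
    """(recipients who clicked / total recipients) * 100"""
    return clicked / total * 100

def phish_report_rate(reported: int, total: int) -> float:
    """(recipients who reported via the report tool / total recipients) * 100"""
    return reported / total * 100

def training_completion_rate(completed: int, enrolled: int) -> float:
    """(completed required modules / enrolled) * 100"""
    return completed / enrolled * 100

def mean_time_to_report(times_sent, times_reported) -> float:
    """Average of (time_reported - time_sent) across reported simulations, in hours."""
    return mean((r - s).total_seconds() / 3600
                for s, r in zip(times_sent, times_reported))

def knowledge_gain(pre_scores, post_scores) -> float:
    """Average post-test score minus average pre-test score."""
    return mean(post_scores) - mean(pre_scores)
```

Keeping these in one version-controlled module (rather than ad-hoc spreadsheet formulas) gives auditors a single definition to verify against the stored raw logs.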
Setting targets, baselines and statistical considerations
Establish a baseline by running 2–3 initial measurement cycles (e.g., three monthly phishing campaigns) and calculate the mean and standard deviation. For proportions (click rates), use sample-size math if you test subsets of the population: n = (Z^2 * p * (1-p)) / E^2 where Z = 1.96 for 95% confidence, p is estimated proportion, and E is desired margin of error. Use control charts (p-charts for proportions) to track variation and detect meaningful shifts versus noise. For small businesses (under ~200 users), report confidence intervals alongside KPIs to show whether changes are statistically significant or within expected variability.
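The sample-size formula, the confidence intervals, and the p-chart limits mentioned above can all be computed with a few lines of standard-library Python. This is a sketch using the normal approximation for proportions and conventional 3-sigma p-chart limits; the function names are illustrative.

```python
from math import ceil, sqrt

def sample_size(p: float, margin: float, z: float = 1.96) -> int:
    """n = Z^2 * p * (1-p) / E^2, rounded up. z=1.96 gives 95% confidence."""
    return ceil(z ** 2 * p * (1 - p) / margin ** 2)

def proportion_ci(successes: int, n: int, z: float = 1.96):
    """Normal-approximation confidence interval for a proportion (e.g. click rate)."""
    p = successes / n
    half = z * sqrt(p * (1 - p) / n)
    return max(0.0, p - half), min(1.0, p + half)

def p_chart_limits(p_bar: float, n: int):
    """3-sigma control limits for a p-chart with subgroup size n,
    clamped to [0, 1]. Points outside these limits signal a real shift."""
    sigma = sqrt(p_bar * (1 - p_bar) / n)
    return max(0.0, p_bar - 3 * sigma), min(1.0, p_bar + 3 * sigma)
```

For example, with an estimated 14% click rate and a 5-point margin of error, `sample_size(0.14, 0.05)` asks for 186 recipients — more than many small businesses employ, which is why reporting the confidence interval around the whole-population rate is the pragmatic choice at that scale.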
Small-business scenario: a practical roadmap
Example: AcmeCo (150 employees) runs quarterly phishing campaigns and monthly short micro-training. Baseline after the first quarter: click-rate 14%, report-rate 6%, training completion 80%. KPIs: reduce the click-rate to <6% in 12 months, raise the report-rate to >25% in 6 months, and maintain completion ≥95% within 30 days. Implementation: schedule automated quarterly simulations, require remedial micro-training for clickers within 7 days, and track completion via the LMS. Produce a quarterly compliance review package including CSV exports, dashboard screenshots, remediation logs, meeting minutes, and an action plan for missed KPIs. That package meets Control 1-10-5 evidence requirements in the Compliance Framework.
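The quarterly review step for a scenario like AcmeCo's can be reduced to a simple pass/fail check of each KPI against its target. A minimal sketch, assuming the targets from the scenario (the `targets` table and `review` function are hypothetical names, not a standard API):

```python
import operator

# KPI targets from the scenario: comparison operator plus threshold.
targets = {
    "click_rate": ("<", 6.0),        # reduce click-rate to under 6%
    "report_rate": (">", 25.0),      # raise report-rate above 25%
    "completion_rate": (">=", 95.0), # maintain completion at 95% or more
}

OPS = {"<": operator.lt, ">": operator.gt, ">=": operator.ge}

def review(quarter_metrics: dict) -> dict:
    """Return per-KPI pass/fail for the quarterly compliance package."""
    return {
        name: OPS[op](quarter_metrics[name], threshold)
        for name, (op, threshold) in targets.items()
    }

# Mid-year check: click-rate and report-rate still missed, completion met,
# so the review package needs an action plan for the first two.
status = review({"click_rate": 9.0, "report_rate": 18.0, "completion_rate": 96.0})
```

Running this each quarter and filing the output with the CSV exports gives the review package an explicit, repeatable statement of which KPIs were met.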
Reporting cadence, roles and audit evidence
Define a reporting cadence aligned with risk and the Compliance Framework: monthly operational metrics for program owners, quarterly management reviews for leadership and the compliance team, and an annual effectiveness review tied to policy updates. Assign roles: Program Owner (owns KPI targets and reviews), Data Steward (maintains data integrity), IT Security (runs phishing simulations and collects logs), HR (enforces training enrollment), and GRC/Audit (stores evidence). For audits, keep the raw logs (emails sent, click timestamps, LMS completion records), the KPI calculation workbook or script, meeting minutes, and remediation plans for at least the period required by the Compliance Framework retention rules.
Risks and consequences of not implementing metrics/KPIs
Without defined metrics and periodic reviews you risk blind spots: persistent high susceptibility to phishing, delayed remediation, and training that doesn't change behavior. For compliance, the absence of measurable evidence can lead to failing Control 1-10-5 reviews, regulatory findings, fines (depending on jurisdiction), and increased probability of a successful breach that causes financial and reputational damage. Operationally, you also lose the ability to justify investments in awareness initiatives because leadership lacks objective measures of return on security awareness.
In summary, meeting ECC – 2 : 2024 Control 1-10-5 requires a documented metrics catalog, automated and repeatable data collection, SMART KPIs with baselines and statistical context, defined roles and cadence for periodic review, and retention of raw evidence for audits. For small businesses, use pragmatic tooling (LMS exports, phishing platforms, spreadsheets or simple BI dashboards) and focus on a few high-impact KPIs — phishing click-rate, report-rate, training completion, and remediation time — to demonstrate continuous improvement and satisfy the Compliance Framework.