Behind the Scenes: What’s New in ASTM E2659-24, Part 3
- Mickie Rops
In the previous two posts, I covered 5 of the 8 areas of substantive change in the newly released 2024 edition of ASTM E2659: Standard Practice for Certificate Programs.
Here, I’ll describe the remaining 3.
What’s Changed?
As a reminder, there are eight areas of substantive change. By that, I mean a requirement has been changed or added in a way that would likely change what an existing certificate issuer in compliance with the standard does. These are distinguished from clarifying changes, which don’t actually change the requirements; instead, they clarify the intent.
Clarified Instructional Design Consistency and Alignment (Clause 6.1.2)
The standard has always emphasized consistency and alignment. In this update, we’ve consolidated the clauses and made a few tweaks.
First, the instructional design plan must now document how the program’s design is—and remains—consistent with the program’s stated purpose, its target audience and its scope.
This is covered in 6.1.2.
“6.1.2 The certificate program instructional design plan shall include, at minimum:
(3) How the certificate issuer initially ensured and will continue to ensure certificate program design is consistent with its stated purpose, target audience, and scope;”
Here are practical examples of how instructional design might be inconsistent.
1. Mismatch Between Target Audience Experience and Instructional Level
A certificate program aimed at entry-level data analysts includes advanced statistical modeling and machine learning techniques with minimal foundational instruction.
Issue: This content is NOT consistent with the stated purpose of building core data analysis competencies for beginners.
2. Misaligned Program Purpose and Type of Assessment
A certificate program that claims to prepare participants to perform as home energy auditors but has no practical activities or performance-based assessments.
Issue: The instructional design is NOT consistent with the applied, skill-based nature of the stated program purpose.
3. Scope Creep in Content Coverage
A certificate program intended to train frontline healthcare workers in basic infection control procedures includes detailed instruction on hospital infection reporting systems and epidemiology.
Issue: This exceeds the intended scope and may confuse or overwhelm the target audience, which consists of non-clinical staff with limited medical background.
And we deleted the references to making the course aligned (formerly in 6.1.5.2) and the assessment aligned (formerly in 6.1.6.2 and 6.1.6.4), covering alignment once in the retained 6.1.2:
“6.1.2 The program instructional design plan shall include
(4) How the certificate issuer initially ensured and will continue to ensure the alignment of the intended learning outcomes, course learning activities, and the summative assessment.”
Here are practical examples of how instructional design might not be aligned.
1. Learning Outcomes vs. Passive Activities vs. Unrelated Assessment
Intended Learning Outcome: “Demonstrate the ability to conduct a workplace safety inspection.”
Learning Activity: Participants watch a 30-minute lecture on OSHA regulations.
Summative Assessment: A multiple-choice quiz focused on regulatory acronyms and dates.
Issue: The learning outcome requires demonstration of a practical skill, but the activity is passive, and the assessment does not evaluate the ability to conduct an inspection. None of the components reinforce or measure the intended performance.
2. Outcome Requires Critical Thinking; Assessment Only Tests Recall
Intended Learning Outcome: “Evaluate the ethical implications of decisions made in a clinical research setting.”
Learning Activities: Case study discussions and guided analysis exercises.
Summative Assessment: 10-item quiz with true/false questions on definitions of ethics terms.
Issue: The activities partially support the learning outcome, but the assessment does not. It measures only basic recall rather than the evaluative judgment called for in the learning outcome.
3. Learning Activities and Assessment Cover Extra-Outcome Content
Intended Learning Outcome: “Apply basic Excel formulas to analyze small data sets.”
Learning Activities: Instruction on Excel macros, pivot tables, and VBA scripting.
Summative Assessment: Project requiring creation of a macro-enabled dashboard.
Issue: The course activities and assessment exceed the intended learning outcome, making it difficult to determine if the basic objective was met.
Clarified Certificate Requisites (Clause 6.1.3)
The standard specifies that a certificate must have requirements in two areas:
- Minimum participation in the certificate program learning event or events, and
- A passing score on the summative assessment(s) covering the certificate program’s intended learning outcomes.
Organizations may have other requirements, such as for eligibility to participate. But all requirements must be consistent with program purpose, target population, and intended learning outcomes (clause 6.1.3.2). The middle one, “target population,” is new.
Here’s an example where requirements are not consistent with the purpose and target audience.
1. Unnecessary Degree Requirement for a Non-Academic Skills Program
Program Purpose: To provide practical, on-the-job training for individuals seeking roles as entry-level HVAC technicians.
Target Audience: High school graduates or adult learners entering the skilled trades.
Eligibility Requirement: Must have a bachelor’s degree in engineering or physical sciences.
Issue: The degree requirement excludes the target audience and contradicts the program’s purpose of developing entry-level trade skills, which do not require academic degrees.
Deeper Program Evaluation Requirements (Clause 6.1.10)
While the standard has always required an evaluation, the previous version required certificate issuers to measure the “quality, effectiveness, and value” of their programs. What sometimes resulted was certificate issuers making sure their survey questions related to those three attributes, which is nice but could be better. An important piece was missing: what does a quality, effective, or valued program mean to the organization? In other words, what’s the target? And did you meet it?
Now, clause 6.1.10 requires certificate issuers to establish performance targets and to conduct and report on evaluations that show their progress toward those targets. The actual targets are up to the organization, but they need to be established for the course(s) and the summative assessment(s) (6.1.10.2).
Specifically, course targets need to be set, at minimum, for learners’ satisfaction related to course content and delivery (6.1.10.3). And assessment targets need to be set for item and form performance, evaluator performance (if applicable), and learner feedback (6.1.10.4).
Remember that the standard does not require that organizations meet the targets. That’s not the point. The point is that certificate issuers establish meaningful targets, measure their progress towards them, and adjust the certificate program, if/as needed.
So, what might these performance targets look like? Here are examples:
For the course:
| Target Area | Example Performance Target |
| --- | --- |
| Learner Satisfaction (Content) | “Average rating of 4.2 or higher on a 5-point scale for survey item: ‘The course content was relevant to my job or role.’” |
| Learner Satisfaction (Delivery) | “Average rating of 4.5 or higher for: ‘The instructor communicated effectively.’” |
| Completion Rate | “At least 90% of enrolled learners complete the course within 30 days.” |
| Feedback Participation | “Survey response rate of at least 60% from course completers.” |
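To make the measurement step concrete, here is a minimal sketch of how an issuer might check evaluation results against course targets like those above. All the numbers and variable names are invented for illustration; nothing here is prescribed by the standard.

```python
# Hypothetical evaluation data (made up for illustration only).
content_ratings = [5, 4, 4, 5, 3, 4, 5, 4]    # 1-5 scale: "content was relevant"
delivery_ratings = [5, 5, 4, 5, 5, 4, 5, 5]   # "instructor communicated effectively"
enrolled, completed_in_30_days = 50, 46
completers, survey_responses = 46, 30

# Each entry pairs the measured value with its illustrative target.
results = {
    "content_satisfaction": (sum(content_ratings) / len(content_ratings), 4.2),
    "delivery_satisfaction": (sum(delivery_ratings) / len(delivery_ratings), 4.5),
    "completion_rate": (completed_in_30_days / enrolled, 0.90),
    "response_rate": (survey_responses / completers, 0.60),
}

for name, (actual, target) in results.items():
    status = "met" if actual >= target else "not met -- review the program"
    print(f"{name}: {actual:.2f} (target {target}) -> {status}")
```

The point is not the arithmetic, which is trivial, but the habit: every target is written down, measured, and compared, so "did we meet it?" always has an answer.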
For the summative assessment:
| Target Area | Example Performance Target |
| --- | --- |
| Item Difficulty | “Item p-values (percent correct) between 0.30 and 0.90 for all items.” |
| Item Discrimination | “Item discrimination index positive for all items and >0.20 for 85% of items.” |
| Form Reliability | “KR-20 of at least 0.80 for each exam form.” |
| Learner Feedback on Assessment | “At least 80% of respondents rate the fairness and clarity of the assessment items at 4 or higher on a 5-point scale.” “No more than 5% of learners express negative feedback on any one item on the summative assessment.” |
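The statistical targets in this table can be computed directly from a scored response matrix. Below is an illustrative sketch using the standard classical test theory formulas (item p-value, upper-lower group discrimination, and KR-20); the data and function names are invented, and none of this is mandated by E2659.

```python
from statistics import pvariance

# Hypothetical response matrix: rows = learners, columns = items
# (1 = correct, 0 = incorrect). Real forms would have far more of both.
responses = [
    [1, 1, 1, 0],
    [1, 1, 0, 1],
    [1, 0, 1, 1],
    [1, 1, 1, 1],
    [0, 1, 0, 0],
    [1, 0, 0, 0],
]

def item_p_values(matrix):
    """Proportion of learners answering each item correctly (item difficulty)."""
    n = len(matrix)
    return [sum(row[i] for row in matrix) / n for i in range(len(matrix[0]))]

def discrimination_indices(matrix, fraction=0.27):
    """Upper-lower group discrimination: p(upper group) - p(lower group) per item."""
    ranked = sorted(matrix, key=sum, reverse=True)
    g = max(1, round(len(matrix) * fraction))
    upper, lower = ranked[:g], ranked[-g:]
    return [
        sum(r[i] for r in upper) / g - sum(r[i] for r in lower) / g
        for i in range(len(matrix[0]))
    ]

def kr20(matrix):
    """Kuder-Richardson formula 20: internal consistency for dichotomous items."""
    k = len(matrix[0])
    pq_sum = sum(p * (1 - p) for p in item_p_values(matrix))
    total_var = pvariance([sum(row) for row in matrix])
    return (k / (k - 1)) * (1 - pq_sum / total_var)
```

An issuer could run these after each administration and flag any item or form that misses its target, which is exactly the "measure progress, then adjust" loop clause 6.1.10 is after.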
Of course these performance targets are illustrative only and may not be ideal or even appropriate for your program.
This third post wraps up the series on the substantive changes in ASTM E2659-24: Standard Practice for Certificate Programs. Stay tuned for more posts to help you apply the standard.