The Mapping Assessments to Competencies (MAC) Process, developed by CNA and the U.S. Army Research Institute for the Behavioral and Social Sciences, is a systematic tool for evaluating whether existing assessment data can support talent management efforts. It can help talent managers, Soldiers, and officers evaluate whether existing competency assessments are appropriate for new purposes.
Military services educate and develop enlisted personnel and officers in a career-long continuum, using various forms of assessment to measure how selected knowledge, skills, and behaviors are acquired, developed, and maintained over time. Professional military education (PME) and operational exercises are two key sources of performance data. Moreover, these assessments may produce additional information about Servicemembers’ competencies that could be used to support current and future career-related decisions.
For example, a writing assessment that scores multiple competencies related to written communication could help Servicemembers identify self-development opportunities as they prepare for resident senior service colleges. Likewise, an assessment of oral communication conducted via a structured exercise might support Servicemembers in refining their skills on the job for staff positions that require frequent oral communication with external audiences (e.g., liaison to Congress). The military is seeking opportunities like these to leverage the assessment information it already collects on its personnel to inform talent management.
Many existing assessments focus on grading knowledge or task performance rather than measuring specific competencies, and some may lack established reliability and validity evidence. Some assessments are designed to indicate how much learning took place in a course, some provide developmental feedback to instructors and students, and still others inform selection or promotion decisions and predict future performance. When designing assessments for a particular purpose, developers must apply the level of rigor needed to meet the reliability and validity standards that justify the intended use.
To meet the objectives of the Army Talent Management Strategy and the Army People Strategy, the U.S. Army Research Institute for the Behavioral and Social Sciences (ARI) partnered with CNA to determine how to better leverage existing assessments that measure the unique knowledge, skills, behaviors, and preferences (KSB-P) of Army Soldiers and officers.
The MAC Process
Building on the literature on competency assessments, as well as the Standards for Educational and Psychological Testing, CNA and ARI created the MAC Process to identify and define competencies and related subcompetencies that Army Soldiers and officers are expected to acquire, and then to evaluate the extent to which existing assessments measure those competencies. This systematic tool can be used for a variety of purposes, from developmental through predictive (see figure 1). Not all assessments require the same level of evidence to justify their use. Assessments used for developmental purposes require less evidence because the stakes are lower: no administrative decisions are based on them. Assessments used for predictive purposes, such as those that inform critical career path decisions, require more rigorous evidence.
In addition to this primary use, the MAC Process has alternative uses. It can support curriculum development: by first defining and mapping the competencies and subcompetencies that a Service wants a course to address and then identifying the types of evidence needed, developers can create new assessments for that curriculum. Outside of curriculum development, the MAC Process can also help a Service consider the evidence needed to appropriately use a newly created assessment or to evaluate commercially available assessments (e.g., Myers-Briggs, Hogan Personality Inventory, Watson-Glaser Critical Thinking Appraisal) for a specific use.
Steps in the MAC Process
To execute the MAC Process, users must gather evidence and information about an assessment’s development, use, and validation. The process is best conducted in small teams. Users complete the following steps:
- Define the competencies. Describe the assessment’s purpose and intended use. List the relevant competencies and subcompetencies. Indicate which are measurable and the depth of knowledge required for each. Questions to consider include: (1) Who is being assessed? (2) Who will use the resulting scores? (3) How will the assessment scores be used? (4) What decisions will be based on the results?
- Evaluate the assessments. Evaluate each assessment for all measurable subcompetencies, including evidence of validity, reliability/precision, and fairness.
- Map assessments to competencies. Use the evidence acquired in step 2 to document the appropriate use level for each subcompetency and map the battery of usable assessments to a competency (see figure 2). A notional sketch of how such a mapping might be recorded follows this list.
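To make the outputs of these steps concrete, the following is a minimal sketch, in Python, of one way the resulting documentation might be organized. The field names, use levels, and example entries are hypothetical illustrations for this summary, not part of the MAC guidebook itself.

```python
from dataclasses import dataclass, field
from enum import Enum


class UseLevel(Enum):
    """Hypothetical continuum of assessment uses, from lower to higher stakes."""
    DEVELOPMENTAL = 1   # feedback only; no administrative decisions
    EVALUATIVE = 2      # informs grades or instructor feedback
    PREDICTIVE = 3      # supports selection, promotion, or career-path decisions


@dataclass
class AssessmentEvidence:
    """Evidence gathered in step 2 for one assessment against one subcompetency."""
    assessment_name: str
    validity_evidence: str        # e.g., "content alignment review by SMEs"
    reliability_evidence: str     # e.g., "inter-rater reliability study"
    fairness_evidence: str        # e.g., "reviewed for construct-irrelevant barriers"
    supported_use: UseLevel       # highest use level the evidence justifies


@dataclass
class SubcompetencyMapping:
    """Step 3 output: assessments mapped to a measurable subcompetency."""
    competency: str
    subcompetency: str
    depth_of_knowledge: str       # e.g., "apply", "analyze"
    assessments: list[AssessmentEvidence] = field(default_factory=list)


# Illustrative entry only; the names and evidence below are invented.
writing_map = SubcompetencyMapping(
    competency="Written communication",
    subcompetency="Organizes an argument logically",
    depth_of_knowledge="apply",
    assessments=[
        AssessmentEvidence(
            assessment_name="PME argumentative essay rubric",
            validity_evidence="rubric items aligned to subcompetency by SMEs",
            reliability_evidence="two-rater agreement study",
            fairness_evidence="reviewed for construct-irrelevant barriers",
            supported_use=UseLevel.DEVELOPMENTAL,
        )
    ],
)
```

Recording the mapping this way keeps each assessment's evidence next to the use level it supports, mirroring the MAC Process principle that higher-stakes uses demand more rigorous validity, reliability, and fairness evidence.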
To support the Army in this endeavor, CNA created a guidebook for course developers, assessment developers, curriculum designers, and social science researchers that provides detailed instructions and tips for executing the MAC Process.
The MAC Process cannot tell users which competencies are important for a given job or curriculum to address. It also does not teach users about competency frameworks or the fundamentals of assessment development and evaluation. Rather, it assumes that users are familiar with competency frameworks and assessment design, and it guides them through collecting and organizing data to determine whether an established assessment is appropriate for an intended use.
Although developed for the Army, the MAC Process may have broader relevance to the military services when determining how to leverage existing training assessments to make data-informed talent management decisions.
DISTRIBUTION A: Approved for public release. Unlimited distribution.
Details
- Document Number: DMM-2023-U-035131-Final
- Publication Date: 3/17/2023