Evaluating Human Resources Programs: A 6-Phase Approach for Optimizing Performance

Jack E. Edwards, John C. Scott, Nambury S. Raju

ISBN: 978-0-787-99670-3

May 2007, Pfeiffer

288 pages

$43.99

Description

Evaluating Human Resources Programs is a groundbreaking book that offers readers a systematic method for enhancing the value and impact of HR and supporting HR's emerging role as a strategic leader within the organization. It provides a practical framework for adjusting and realigning strategies across all types of HR programs. The authors outline a proven six-phase process that maximizes the likelihood of a successful HR program evaluation, with real-world techniques, strategies, and examples to illustrate each recommended step and action.
Table of Contents

List of Tables, Figures, and Exhibits.

Preface.

A Few Words of Thanks.

Overview: Human Resources (HR) Program Evaluation.

Chapter Objectives.

Use the Approach That Best Addresses Your HR Program Evaluation’s Objectives.

Use Goal-Based Evaluations to Focus on Program Objectives.

Use Process-Based Evaluations to Focus on Workflow and Procedures.

Use Outcome-Based Evaluations to Focus on Results.

Integrate Ongoing and Periodic Program Evaluations into the Operation of HR Programs.

Enhance HR Program Performance Through Ongoing Formative Program Evaluation.

Enhance HR Program Performance Through Periodic Summative Program Evaluation.

Consider Our General Philosophy of HR Program Evaluation.

Be Prepared to Address Potential Excuses for Not Conducting an HR Program Evaluation.

Potential Excuse 1: The Resources Required to Conduct a Program Evaluation Are Better Spent on Administering the Program.

Potential Excuse 2: Program Effectiveness Is Impossible to Measure.

Potential Excuse 3: There Are Too Many Variables to Do a Good Study.

Potential Excuse 4: No One Is Asking for an Evaluation, So Why Bother?

Potential Excuse 5: “Negative” Results Will Hurt My Program.

A Look at How the Remainder of the Book Is Organized.

Phase 1: Identify Stakeholders, Evaluators, and Evaluation Questions.

Phase 2: Plan the Evaluation.

Phase 3: Collect Data.

Phase 4: Analyze and Interpret Data.

Phase 5: Communicate Findings and Insights.

Phase 6: Utilize the Results.

Deviate from Our Approach When It Makes Sense for Your Evaluation.

Phase 1: Identify Stakeholders, Evaluators, and Evaluation Questions.

Chapter Objectives.

Identify Stakeholder Groups.

Decide on Internal Stakeholders First.

Consider the Perspectives of Unions and Their Members.

Don’t Forget That There Are External Stakeholder Groups.

Identify the Evaluation Team.

Ask, “How Big Should the Team Be?”

Ask, “Who Should Be on the Team?”

Ask, “Who Should Lead the Evaluation Team?”

Ask, “Should the Evaluation Team Write a Charter?”

Identify Evaluation Questions.

Determine the Types of Evaluation Questions That Match the Evaluation Objectives.

Develop and Refine Evaluation Questions.

Attend to Desirable Characteristics When Selecting Criterion Measures.

Conclusions.

Phase 2: Plan the Evaluation.

Chapter Objectives.

Determine the Resources Needed to Conduct the Evaluation.

Develop a Preliminary Budget.

Set Milestones with Dates—Making a Commitment Is Hard to Do.

Lay Out Plans for Data Collection.

Determine Desirable Attributes for the Data That Will Be Collected.

Remind the Team of All the Sources and Methods They Might Use.

Decide Whether Pledges of Anonymity or Confidentiality Will Be Needed.

Avoid or Minimize Common Data Collection Errors.

Decide When a Census or Sample Should Be Used.

Identify the Data Analyses Before the Data Are Collected.

Plan the Procedures for Supplying Feedback.

Enhance Buy-In from Top Management.

Provide an Overview of the Program Evaluation Plan.

Prepare to Defend the Budget.

Conclusions.

Phase 3: Collect Data.

Chapter Objectives.

Select Optimum Data Collection Methods and Data Sources.

Use Internal Documents and Files—Current and Historical.

Gather Internal and External Perceptual Data.

Assess Internal Processes and Procedural Information.

Utilize External Documents and Environmental Scans.

Don’t Forget Other Types of Evaluation Data.

Use Evaluation Research Designs That Address Practical Constraints.

Subgroup Comparisons.

Before-and-After Comparisons.

Time-Series Designs.

Case Studies.

Enhance Data Quality During Data Collection.

Check for Potential Vested Interests or Biases.

Document Procedures and Data.

Match Evaluators’ Skill Sets to Types of Assignments.

Pilot-Test Procedures and Instruments.

Train the Data Collectors.

Obtain the Same Data with More Than One Method When Resources Permit.

Verify the Data.

Beware of Becoming Sidetracked During Data Collection.

Avoid Excessive Data Collection.

Monitor Data Collection Schedules Closely.

Conclusions.

Phase 4: Analyze and Interpret Data.

Chapter Objectives.

Create and Modify a Database.

Design Data Codes.

Design the Database.

Decide What, If Anything, Needs to Be Done About Missing Data.

Take Full Advantage of Descriptive Statistics.

Consider the Many Types of Descriptive Statistics Available to the Team.

Look for Opportunities to Use Descriptive Statistics with Qualitative Data.

Address Additional Concerns in Deciding Whether Inferential Statistics Are Appropriate.

Balance the Concerns for Type I vs. Type II Error Rates When Using Statistical Tests.

Determine Whether You Are Really Using the Alpha Level That You Said You Would Use.

Be Clear in Understanding What Statistical Significance Is and Is Not.

Use Inferential Analyses If Warranted and Underlying Assumptions Can Be Met.

Look for Areas in Which Findings Support and Conflict with Other Findings.

Conclusions.

Phase 5: Communicate Findings and Insights.

Chapter Objectives.

Stick to the Basics Found in Any Good Communication Strategy.

Maintain Confidentiality When It Was Promised.

Adapt Communications to the Audience’s Skills and Needs.

Get to the Bottom Line Early.

Determine What to Do with Findings That Do Not Fit.

Tie It All Together: Findings→Conclusions→Recommendations.

Depict Findings and Recommendations Visually.

Picture the Situation to Let Stakeholders See How It Really Is or Might Be.

Show Stakeholders a Path Through the Process.

Clarify Numerical Findings with Graphs.

Use a Table to Convey Easily Misunderstood Information.

Deliver the Product Orally and in Writing.

Share Findings When the Promise Has Been Made.

Use Briefings to Put the Word Out Quickly and Answer Questions.

Write a Report to Preserve Information and Organize Documentation for Storage.

Conclusions.

Phase 6: Utilize the Results.

Chapter Objectives.

Adjust, Replace In-House, or Outsource the HR Program.

Adjust the Existing Program.

Replace the Existing Program.

Outsource the Existing Program.

Leverage Insights Relevant to Evaluation Use.

Build Team Accountability and Skill.

Involve Stakeholders Early and Often to Increase the Odds That Results Are Used.

Incorporate Proven Strategies for Implementing Results.

Build Expertise to Engage Stakeholders.

Leverage Politics.

Manage Resistance.

Establish Follow-Up Responsibilities.

Be Timely and Communicate.

Follow Up with Hard Data.

Conclusions.

References.

Author Index.

Subject Index.

About the Authors.

  • Contains a six-phase process that features real-world techniques and strategies for evaluating and improving HR programs
  • Outlines the major steps and primary considerations when conducting an HR program evaluation