GOT RESULTS? 
  

ISPI’s GOT RESULTS? archive is the product of a campaign managed from 1997 to 2005 by ISPI members Carl Binder, Timm Esque, and Julie Capsambelis. The project’s purpose was to demonstrate how practitioners can and do use meaningful measures of performance outcomes to evaluate and make decisions about performance interventions and ongoing performance systems.

Focus on Results is the first of the 10 Standards of Performance Technology incorporated in the Certified Performance Technologist (CPT) program, representing the essentials of human performance improvement methodology. While many in the field of performance improvement speak about producing measured results, most of us have enormous room to improve our skills and knowledge in collecting and using performance output measures in HPT practice. This archive is dedicated to improving practitioners’ skills and knowledge related to measuring results.

  



Case Selection Criteria

GOT RESULTS? project managers used the following criteria in their solicitation and review of case examples:

  • Cases should illustrate practical interventions and measurement approaches that HPT practitioners (not only researchers) can actually use: The purpose of the GOT RESULTS? campaign was to inspire and inform practitioners, not to stimulate research.

  • Measure performance outputs (accomplishments), not only behavior, whenever possible: Thomas F. Gilbert and other pioneers in the field of HPT emphasized that it is the products or accomplishments produced by behavior, not the behavior itself, that contribute value toward organizational results.

  • Present counts of accomplishments or behavior, not merely percentages or ratios: As emphasized in the long-running Measurement Counts! column in ISPI’s PerformanceXpress online newsletter, percentages or ratios without the original counts from which they are derived are “dimensionless quantities” with relatively little value in the measurement of performance (see the worked example following this list).

  • Rating scales are not measures of performance outcomes: While rating scales and “smile sheets” can provide valuable feedback for program managers and implementers, they are not measures of performance output that most accountants, engineers, or business managers would consider objective. It’s better to present counts of “people who gave a rating of X” than to sum and average values on a rating scale (also shown in the example below).

  • It’s better to evaluate performance over time than to compare single before/after measures: Since performance can “bounce” from hour to hour, day to day, or month to month, repeated measures before, during, and after interventions provide a more reliable basis for decision making than simple before/after measures (also shown in the example below).

While not all cases in the archive conform to all of the above criteria, GOT RESULTS? managers sought out cases that conformed as much as possible to this “gold standard.”
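
To make three of these criteria concrete (counts rather than bare percentages, counting ratings rather than averaging them, and repeated measures rather than single before/after snapshots), here is a minimal sketch in Python. Every number in it is invented for illustration; none comes from the cases in this archive.

```python
# A minimal sketch with invented numbers (not drawn from any archived
# case) illustrating three of the criteria above.
from collections import Counter

# Hypothetical counts of completed work orders per day.
baseline = [12, 9, 14, 8, 11]        # five days before the intervention
intervention = [13, 15, 17, 19, 22]  # five days during/after it

# 1. Report counts, not just percentages. A percentage alone is a
# dimensionless quantity: "+59%" could describe large daily counts or
# tiny ones, so always report the counts it was derived from.
mean_before = sum(baseline) / len(baseline)         # 10.8 per day
mean_after = sum(intervention) / len(intervention)  # 17.2 per day
pct = 100 * (mean_after - mean_before) / mean_before
print(f"Mean per day: {mean_before:.1f} -> {mean_after:.1f} ({pct:+.0f}%)")

# 2. Count ratings instead of averaging them. The 3.7 average below
# hides a split audience: six people rated 4-5, three rated 1-2.
ratings = [5, 5, 4, 1, 5, 2, 5, 1, 5]  # hypothetical "smile sheet" data
print("Ratings by count:", dict(sorted(Counter(ratings).items())))
print("Average rating:", round(sum(ratings) / len(ratings), 1))

# 3. Measure repeatedly over time. Performance "bounces": comparing the
# best baseline day (14) with the first intervention day (13) suggests
# no improvement, while the repeated measures show a steady upward trend.
print("Single before/after:", baseline[2], "->", intervention[0])
print("Baseline counts:    ", baseline)
print("Intervention counts:", intervention)
```

Note that a reader can always recompute the percentage and the average from the counts, but never the counts from the percentage or average alone.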



Historical Background

In the early years of ISPI (formerly NSPI), it was simply assumed that articles and presentations would include measures of behavior, job outputs (accomplishments), or organizational results. The evolving technology was clearly science-based, as were its evaluation methods. Over the years, however, there has been some slippage in this expectation.

In the early 1990s, Ogden Lindsley observed that performance data appeared in fewer than 5% of tables and displays in ISPI journals and in only 4 of 60 chapters in ISPI’s Handbook of Human Performance Technology. [Lindsley, O.R. (1997). Performance is easy to monitor and hard to measure. In R. Kaufman, S. Thiagarajan, & P. MacGillis (Eds.), The Guidebook for Performance Improvement: Working with Individuals and Organizations. San Francisco: Jossey-Bass/Pfeiffer, pp. 519-559.]

In 1995, inspired by Lindsley’s reports, Carl Binder argued that HPT should return to its data-based, natural science roots in order to spur innovation and ensure continuous improvement. [Binder, C. (1995). Promoting HPT Innovation: A Return to Our Natural Science Roots. Performance Improvement Quarterly 8(2).]

Inspired by Binder, Timm Esque and Pat Patterson sought out and co-published a collection of 22 performance improvement cases, all containing measured results, to encourage the minority of practitioners who were still collecting and presenting such data. Four of the book’s case authors participated in a panel presentation at the 1999 conference, where panel members nearly outnumbered attendees! [Esque, T., & Patterson, P. (1998). Getting Results: Case Studies in Performance Improvement. Amherst, MA: HRD Press.]

In 1999, ISPI held a Think Tank conference of thought leaders in San Diego, where a working group chaired by Richard Clark submitted recommendations to the ISPI Board. The group advocated monitoring and setting goals for the sharing of results data in publications and conferences, and suggested specific strategies and tactics for accelerating its inclusion. Among the recommendations was that examples and practical methods for measurement and data presentation be made more widely visible and available at conferences.

At the 2000 conference, a special session entitled “Show Me the Money” featured over a dozen 5-minute presentations, all centered on results data. There were more than 100 attendees. Most of that year’s participants (and many more) returned to “Show Me the Money II” in 2001. It was an all-day Showcase event, sponsored by ISPI President John Swinney, dedicated to encouraging and supporting the use of results data. It resulted in a mailing list of more than 100 interested participants plus many requests to continue the “campaign.”

The campaign continued in exhibits at the 2002, 2003, and 2004 ISPI International Conferences under the name “GOT RESULTS?” Those events, organized by GOT RESULTS? managers Binder, Esque, and Capsambelis, produced most of the case examples captured in this archive.

We encourage you to download the PDF files from this archive for educational purposes and to refer to them, when appropriate, in presentations and publications. We hope that they help to increase the practical collection and use of performance results data for evaluation and decision making in years to come. 

   



Please note: The articles below are large Adobe PDF files (100 to 500 KB), so allow adequate time for them to download to your computer.


   
2005 GOT RESULTS?

Global Lean Manufacturing System
Hogan, Linda M., MS, CPT, Johnson Controls, Inc.

Journey to Zero Harm
Zaffron, Steve, and Loffredi, Olga, Landmark Education Business Development, Inc.
Sector: Natural Resources
Business Function: Mining
Problem/Opportunity: Transforming a Workforce’s Relationship to Safety
Intervention Type(s): Transforming Organizational Culture
  
2004 GOT RESULTS?

Operations Center Customer Service Initiative
Armstrong, Phil
Sector: Information Mgmt (Private)
Business Function: Customer Service Management
Problem/Opportunity: Raising the bar on current successful performance
Intervention Type(s): Feedback system, Training, Process flow model, Better use of tools

Strategic Performance Measurement: The Case of Mississauga Transit
Plant, Thomas, MBA, MPA
Sector: Transportation (Public)
Business Function: Operations
Problem/Opportunity: Inefficiency, Threat of Outsourcing or loss of funding
Intervention Type(s): Performance Dashboards, Clear Goals and Feedback

Improving a Job Placement Service for Students and Professionals
Sasson, Joseph, MA
Sector: Professional Org. (Public Nonprofit)
Business Function: Job Placement Services
Problem/Opportunity: Services Underutilized
Intervention Type(s): Process mapping, Customer Feedback System
  
2003 GOT RESULTS?

Effects of Marketing Collateral on Customer Volume in a Small Business
Binder, Carl, PhD, CPT
Sector: Services (Private Small Business)
Business Function: Marketing
Problem/Opportunity: Decline in Customers
Intervention Type(s): Marketing Communications

CEP’s Performance Intervention Produces the "Smoothest Cutover to a New System in Budget Rent a Car’s History"
The Center for Effective Performance, Inc.
Sector: Transportation (Private)
Business Function: Customer Service
Problem/Opportunity: New Computer System Implementation
Intervention Type(s): Instructional Design

New Hire Training for Seasonal Call Center Staff: Meeting Increased Demand with Fewer Resources
Capsambelis, Julie, Ceridian Benefits Services
Sector: Human Resource Mgmt (Private)
Business Function: Customer Service
Problem/Opportunity: Maintain Quality while adding new staff
Intervention Type(s): Instructional Design

Sales Performance Improvement: Getting Results through a Franchise Sales Organization
Swinney, John, Performance Consultant, and Couch, Bruce, Sales Training Specialist, Bandag, Incorporated
Sector: Durable Goods (Private)
Business Function: Sales
Problem/Opportunity: Franchisee Salespersons Ineffective
Intervention Type(s): Process Changes and Clear Expectations, Training Program and Follow Up, Feedback of Results

If You Don't Have Data, You Can't Show Your Value!
Hao, Ruhe, Bank of America
Sector: Banking (Private)
Business Function: Call Center
Problem/Opportunity: Perceived Ineffectiveness at Collections
Intervention Type(s): Instructional Design and Assessment, On-the-Job Training, Process Changes, Job Aids

Systematic Productivity Improvement
Hynna, Shauna, Ontario Lottery and Gaming Corporation
Sector: Gaming (Private)
Business Function: Service Delivery
Problem/Opportunity: Operational Excellence
Intervention Type(s): Clear Expectations, Frequent Feedback, Group Process for Improvements

Affective Training for Bottom Line Results
Lane, Miki, Senior Partner, MVM Communications
Sector: Durable Goods (Private)
Business Function: Sales
Problem/Opportunity: Lack of Product Knowledge and Attitudinal Issues
Intervention Type(s): Needs Assessment, Instructional Design

Visible Value With Performance Improvement Strategies
The Managers’ Mentors, Inc.
Sector: High Tech Manuf. (Private)
Business Function: Manufacturing
Problem/Opportunity: Productivity (including Quality)
Intervention Type(s): Mentoring, Instructional Design, Performance Feedback

Achieving Business Results Through Human Performance Technology (3.5 MB PDF)
King, Stephen B., Management Concepts, Inc.
Sector: Durable Goods (Private)
Business Function: Manufacturing
Problem/Opportunity: Changing Environment, Changeover times too slow
Intervention Type(s): Job Redesign, Clear Expectations, Performance Feedback, Training

Building Collaborative Relationships In SCREW YOU Environments! (Reducing the COST of CONFLICT)
Tamm, James (Judge), Managing Director, BCon Will Schutz Associates
Sector: Education (Public)
Business Function: All
Problem/Opportunity: Employee Conflicts costly
Intervention Type(s): Needs Assessment, Instructional Design, Follow-up Consulting
  
2002 GOT RESULTS?

Measurable Patient Performance
Agilis Consulting Group, LLC
Sector: Medical Equipment (Private)
Business Function: Product Development
Problem/Opportunity: End users struggle to use equipment properly
Intervention Type(s): Human Factors Analysis, Job Aids

Put Your Money Where It Matters!
Burkett, Alison, Lexington Associates
Sector: Energy (Private)
Business Function: Manufacturing
Problem/Opportunity: Plant Operations Unreliable
Intervention Type(s): Process flow documentation, Performance Logic Analysis

Hilton Hotels Corporation
The Center for Effective Performance, Inc.
Sector: Hospitality (Private)
Business Function: Management
Problem/Opportunity: Inconsistent Revenue Management
Intervention Type(s): Instructional Design

Examples From a Self-sustaining Performance System (SPS)
Esque, Timm J.
Sector: Education (Private)
Business Function: Administration
Problem/Opportunity: Accounts Receivables out of control
Intervention Type(s): Process clarification, Clear Expectations, Frequent Feedback, Group Process for Improvements

Fluency Coaching™ Accelerates Learning and Productivity Ramp-up
Binder, Carl, and Sweeney, Lee
Sector: Unknown
Business Function: Call Center
Problem/Opportunity: Productivity (including accuracy)
Intervention Type(s): Fluency Development, Clear Expectations, Supportive Incentives, Job Aids

Dropout Recovery & Prevention at North Lake College
PLATO Learning, Inc.
Sector: Education (Public)
Business Function: Instruction
Problem/Opportunity: Dropouts struggle to recover and achieve GED
Intervention Type(s): Individualized, self-paced, mastery instruction

Example of Business Process Improvement
Smith, Martin
Sector: Telecom (Private)
Business Function: Accounting
Problem/Opportunity: Cost Reduction Desired
Intervention Type(s): Process Reengineering

RE$ULT$ in the Public Service Sector
James, Randolph I.
Sector: Power Utility (Public)
Business Function: Operations
Problem/Opportunity: Respond to increase in demand
Intervention Type(s): Identify proper accomplishment, Establish and use performance metrics

Raising the Bar… Improving the Master/Model Performers in a Computer Help Desk (Call Center)
Panza, Carol M., CMP Associates
Sector: Telecom (Private)
Business Function: Call Center
Problem/Opportunity: Customer Satisfaction, Productivity (including Quality)
Intervention Type(s): Process Flow, Performance Analysis, Various Interventions as defined by Analysis

Calculating The Return on Investment in Training: A Critical Analysis and a Case Study
Stolovitch, Harold D., and Maurice, Jean-Gabriel
Sector: Banking (Private)
Business Function: Lending
Problem/Opportunity: Perceived Potential for Increased Volume of Business
Intervention Type(s): Training, ROI Analysis

WANT RESULTS? Link your projects to a CBI!
Rummler, Geary A., PhD
Sector: High Tech (Private)
Business Function: Product Development
Problem/Opportunity: Opportunity for Cycle time reduction
Intervention Type(s): Critical Business Issue Identified, Process Mapping, Various interventions as Identified

Self-Directed Work Teams Drive Performance in a Printing Plant
Levick, Michael, Dynamic Performance Systems
Sector: Printing (Private)
Business Function: Operations
Problem/Opportunity: Decision to implement self-managed teams and resulting issues
Intervention Type(s): SMT Training, Clarification of Roles, Quality Metrics and Feedback

Structured On-the-Job Training in Developing Nations
Stolovitch, Harold D., and Ngoa-Nguele, Daniel
Sector: Developing Country
Problem/Opportunity: Training Centers perceived as ineffective
Intervention Type(s): Structured On-the-Job Training

UNLEASHING INNOVATION: A successful approach to process improvement!
Preston, Karen, Corporate Manager, Walgreens Company
Sector: Retail (Private)
Business Function: Distribution
Problem/Opportunity: Continuous Improvement
Intervention Type(s): Training and Tools to help teams get started, Posting of results, Various forms of support
   



Measurement Counts!

Measurement Counts! was a series of articles published in ISPI’s PerformanceXpress online newsletter from March 2002 to November 2004. The titles, with links to the archived articles, follow.

March 2002 - Introduction to Measurement Counts!

April 2002 - The Dangers of Percent

May 2002 - Deciding What Worked

June 2002 - Got Results?

July 2002 - Measurement and HPT Research

August 2002 - Making an Impact

September 2002 - Back to Basics

October 2002 - Things to Count in a Customer Call Center

November 2002 - Got Results? at ISPI 2003 and A Great Article

December 2002 - Ch…Ch…Ch…Ch…Ch…Changes

January 2003 - Graphing Results: Solution or Deeper Trouble?

February 2003 - Using the Right Graphs to Make Better Decisions

March 2003 - Using Surveys and Questionnaires

April 2003 - Learning is a Trend in Performance

May 2003 - Building in Results Measurement from the Start

June 2003 - Projecting Trends, or How I Got a New Consulting Gig

July 2003 - Why Do We Measure? And How?

October 2003 - A Challenge to Present Measurements to Colleagues

November 2003 - Measurement and “Research-based” Methods

January 2004 - Counting One’s Own Behavior and Accomplishments

February 2004 - Why to Avoid Statistics

March 2004 - Metrics, ROI and Accomplishments (the missing element)

April 2004 - The OBM Network: A Resource for Data-Based Performance Improvement

June 2004 - How Often Can You Make a Decision?

July 2004 - The Power of Count Per Minute Measurement

August 2004 - Units of Analysis and Units of Measurement

September 2004 - The Dangers of Percent: An Example

October 2004 - Using High Fidelity Simulations to Certify Performance

November 2004 - Remembering a Measurement Giant: Ogden R. Lindsley (1922-2004)

   



www.ispi.org  |  info@ispi.org