Tales from the Field:
Case Study Evaluation of the U.S. Coast Guard’s Electronic Performance Support System for the Fast Response Cutter

By Benjamin Lyons, Matthew Romano, and Lauren Weisberg

Tales from the Field, a monthly column, consists of reports of evidence-based performance improvement practice and advice, presented by graduate students, alumni, and faculty of Boise State University’s Organizational Performance and Workplace Learning Department.

Background
The U.S. Coast Guard (USCG) is one of the five armed forces of the United States, and the only one under the Department of Homeland Security. Since 1790, the USCG has been responsible for myriad civil and military missions, both domestic and international. By law, the USCG is responsible for executing 11 statutory missions, specifically: ports, waterways, and coastal security; drug interdiction; aids to navigation; search and rescue; living marine resources; marine safety; defense readiness; migrant interdiction; marine environmental protection; ice operations; and other law enforcement operations.

To carry out these missions effectively, the USCG operates various types of “cutters,” which are commissioned vessels of at least 65 feet in length with a permanent crew aboard. Currently, the USCG has 25 different classes of cutters in service. In 2012, the USCG commissioned its newest class, the 154-foot Sentinel-class fast response cutter (FRC). The organization developed a self-sustaining electronic performance support system (EPSS) specifically for these new vessels to provide crews with an asynchronous, self-guided, independent learning and support tool. An EPSS is a “packaged (self-contained) digital task support resource. An EPSS unifies relevant support and reference information, media, and guidance at a single, accessible point, organized in a logical and consistent structure” (CG Training System Office, 2011). Ideally, an EPSS is a self-learning program that members can use without additional training or instruction. Implementing an EPSS can help an organization reduce or eliminate training costs while increasing productivity and performance; however, an EPSS is not intended as a method of training but as a performance support tool for use on the job.

The USCG had previously designed, developed, and implemented an EPSS in 2004 to address gaps in knowledge and performance related to a newly commissioned 87-foot patrol boat fleet. In 2007, the results of an evaluation conducted by the organization revealed that the crew not only enjoyed using the EPSS, but that the EPSS significantly improved user knowledge and performance and provided an estimated 1100% return on investment (Davis & Williams, 2007). The USCG hoped to achieve similar organizational outcomes by creating another EPSS made exclusively for the new FRCs.

The USCG has the following long-term goals and expected outcomes for the new EPSS:

  • Provide crews an industry-proven tool to support their performance.
  • Minimize potential future training costs. (The manufacturer of the FRCs, Bollinger Shipyards, provides training for all crews during the cutter’s two-year manufacturer warranty period; once the warranty expires, the USCG must cover the training costs for additional crewmembers.)
  • Provide crews with access to support content anytime, anywhere, and at their preferred pace.
  • Improve organizational outcomes, such as preventing casualties, mishaps, and loss of underway hours.

Evaluation Focus
Our evaluation team consisted of two USCG officers and one civilian. The organization’s Performance Technology Center (PTC), our client, is responsible for designing and maintaining the EPSS with the assistance of personnel from the manufacturers of each piece of equipment associated with the FRCs (e.g., John Deere, Caterpillar). The PTC requested a summative evaluation with a formative aspect to determine both the merit of the EPSS in its current form (content-based) and the overall worth of the EPSS as a performance support tool. The evaluation was goal-based, with the main goals being to determine the organizational impact of the EPSS and to identify potential areas of improvement.

We followed Scriven’s (2007) key evaluation checklist to conduct the evaluation, which provided us with a list of the essential elements to include and served as a framework for structuring the evaluation systematically and reporting results (Davidson, 2005). We began by developing a training impact model (Brinkerhoff, 2005) with input from the client, outlining the means and ends of the program (see Table 1). This tool enabled us to establish familiarity with the inputs and resources dedicated to the EPSS and the outputs and potential outcomes for the organization.

Table 1. Training Impact Model for EPSS

Inputs or Resources

Requirements:

  • Basic requirements and/or standards of an EPSS system

Equipment:

  • USCG-standard workstation with Internet connectivity
  • KVH satellite system that allows Internet connectivity while the cutter is underway

Users:

  • FRC crews
  • Port engineers
  • Land support teams

Support:

  • USCG IT support
  • USCG ET support
  • PTC designers
  • Equipment manufacturers serving as subject matter experts to designers

Platform:

  • Fast-response cutter (FRC), the only platform on which the program applies

Activities

User Activity:

  • Logging into a USCG-standard workstation
  • Logging into the CG portal website
  • Logging into the EPSS system
  • Printing module sections for use away from the computer

Equipment Activity:

  • Cutters successfully use the KVH system while underway

Support Activity:

  • Training of PTC designers in maintaining and updating the EPSS
  • Coordination of PTC designers with equipment manufacturers to ensure EPSS accuracy
  • Training of USCG ITs to maintain software, hardware, and network
  • Training of ETs to service the KVH system
  • Dissemination of the EPSS’s existence by the PTC and USCG HQ

Program Capabilities

  • Improvement of knowledge and confidence of users using FRC equipment
  • Improvement of skills and competence of users when using FRC equipment

Critical Actions

  • Proper operation of equipment on FRCs
  • Proper completion of preventive maintenance on FRC equipment
  • Proper troubleshooting, adjustments, and repairs of FRC equipment after a casualty
  • Preparation and training of pre-commissioning crews or follow-on crews before reporting to FRCs
  • Proper troubleshooting and facilitation of casualties while FRCs are underway (i.e., remote assistance)

Key Results

  • Zero preventable casualties
  • Timely preventive maintenance (within pre-set daily, weekly, monthly, quarterly, and annual schedules) and zero overdue maintenance
  • Timely troubleshooting, adjustments, and repairs underway (ASAP) and while in port (before the scheduled underway period)

Business Goals

  • Quicker qualifications (before established deadlines)
  • Continuity of knowledge during and after transfer season
  • Improved remote support during casualties
  • Closing the gap of knowledge and skills for follow-on crews
  • Decreased number of casualties on EPSS-supported equipment (ideally zero casualties)
  • Decreased amount and impact of mishaps due to EPSS-supported equipment
  • Decreased number of underway hours lost due to EPSS-supported equipment (ideally cutters meet the 2,500 annual hours requirement)
  • Decreased training costs for FRC crews and land support units

Evaluation Methodology

Our team met with the client and upstream stakeholders to develop a list of specific dimensions to investigate, focusing on the evaluand’s processes and outcomes, and to determine the appropriate importance weighting for each dimension (see Table 2). The sole process-based dimension of merit enabled us to address how thorough and up to date the EPSS’s content was, while the outcome-based dimensions of worth enabled us to investigate whether the EPSS’s capabilities and critical actions led to key results and the achievement of the organization’s goals and missions.

Table 2. Dimensions and Importance Weighting

Category, Dimension / Specific Evaluation Question, and Importance Weighting

Process

1. EPSS Content: How well does the specific information in the EPSS support the daily operational needs of the FRC crews, land-based support units, and port engineers? (Very Important)

Outcome

2. Performance of Users While Using the EPSS: How successful are the EPSS’s users in operating, conducting maintenance on, troubleshooting, adjusting, and repairing FRC equipment? (Extremely Important)

3. Cost Analysis of the EPSS: How cost-effective is the EPSS, and what is the USCG’s return on investment in comparison to the previously instituted Bollinger Shipyards training program? (Very Important)

4. Organizational Outcomes: To what degree has the implementation of the EPSS prevented casualties, mishaps, and loss of underway hours? (Somewhat Important)

We selected the following data collection methods to evaluate each dimension:

Web-based Survey. We administered an anonymous, web-based survey to 210 FRC crew members and land-based support unit personnel, a method suited to their large population and high frequency of EPSS use. The survey consisted of three sections: demographic information (Part A), a Likert-scale self-evaluation of program content understanding (Part B), and a timed performance assessment (Part C). Part C consisted of eight timed multiple-choice questions that pertained specifically to Dimension 2: Performance of Users. We divided participants into two groups: Group A was asked to complete the survey without referencing the EPSS; Group B was asked to complete the survey using the EPSS. Of the 95 surveys distributed to Group A, we received 17 responses, but eight were discarded due to incomplete information and obvious disregard for the survey instructions. Of the 115 surveys distributed to Group B, we received 43 responses, but 15 were deemed invalid for the same reasons, and six more participants reported distractions or disruptions severe enough for us to discard their data. We ran an independent samples t-test on the remaining responses to compare the two groups’ assessment results.
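The group comparison described above can be sketched as follows. This is a minimal illustration using Welch’s form of the independent samples t-test (which does not assume equal variances); the score lists are hypothetical placeholders sized to the valid response counts, not the evaluation’s actual data.

```python
# Welch's independent samples t-test, sketched with the standard library.
# The score lists below are hypothetical placeholders, not the study's data.
from statistics import mean, variance

def welch_t(sample1, sample2):
    """Return the Welch t statistic for two independent samples."""
    m1, m2 = mean(sample1), mean(sample2)
    v1, v2 = variance(sample1), variance(sample2)  # sample variances (n - 1)
    n1, n2 = len(sample1), len(sample2)
    return (m1 - m2) / (v1 / n1 + v2 / n2) ** 0.5

# Hypothetical Part C scores (correct answers out of 8 timed questions)
group_a = [4, 5, 3, 6, 5, 4, 5, 6, 4]                # without the EPSS (9 valid)
group_b = [6, 7, 5, 7, 6, 8, 6, 7, 5, 6, 7,
           6, 8, 7, 6, 5, 7, 6, 7, 8, 6, 7]          # with the EPSS (22 valid)

t = welch_t(group_a, group_b)
print(f"t = {t:.2f}")  # a large negative t here suggests Group B scored higher
```

In practice the t statistic would be compared against the t distribution (or computed with a statistics package) to obtain a p-value before drawing conclusions.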

Email Questionnaire. We distributed an email questionnaire to 42 immediate recipients including port engineers (even though they do not often use the EPSS, their feedback is still critical as they oversee the chief engineer of the FRCs and serve a supervisory/mentoring role), commanding officers, and engineering petty officers (because they are important to the success of an FRC’s operation). Nine participants responded to the questionnaire (a 21% response rate). Eleven of 18 engineering petty officers elected to participate in the survey in lieu of the questionnaire, which could help explain the low response rate.

On-site Assessment. The organization sent one team member, a USCG officer (referred to as “the observer” in this paragraph), to ports in Miami and San Juan to conduct on-site practical assessments of crew members and engineers performing operational tasks onboard two different FRCs with and without the EPSS. The participants’ experience levels with the resources they were asked to use and the tasks they were asked to perform ranged from “very familiar” to “not at all familiar.” The observer divided participants into two groups: Group A was not allowed to use the EPSS to locate information during tasks but could use any other resources at its disposal; conversely, Group B was instructed to use only the EPSS and no other resources. The observer also assessed a separate and independent control group (Group C) composed of two non-engineers who had no experience with an FRC’s systems, the EPSS, other available reference materials, or the tasks they were required to perform. The observer graded all participants with a performance criterion checklist.

Extant Data Review. We analyzed internal documents from the PTC and Headquarters Training Division to compare the total costs (initial and recurring) of the EPSS with those of the Bollinger factory training program, including employee travel and the facilities and instructors provided by Bollinger. From this cost comparison we calculated how long it will take for the EPSS to become the better fiscal option. We also reviewed data in coordination with the USCG Office of Naval Engineering to identify trends since the implementation of the EPSS in the number of casualties, amount of equipment downtime, loss of underway hours, and mishaps associated with EPSS-supported equipment. However, this data was deemed inconclusive due to the very recent implementation of the EPSS, the low usage of the program thus far, and the multiple competing factors that affect casualty and mishap data.
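The break-even calculation described above can be sketched as follows. All dollar figures are hypothetical assumptions for illustration only, not the USCG’s actual costs, which remain internal to the PTC documentation.

```python
# Sketch of the EPSS vs. factory-training cost comparison described above.
# All figures are hypothetical assumptions, not the USCG's actual costs.

def breakeven_year(initial, annual, alternative_annual):
    """First full year in which cumulative EPSS cost falls below the
    cumulative cost of the alternative (factory training) program."""
    if annual >= alternative_annual:
        raise ValueError("EPSS never breaks even unless its recurring cost is lower")
    year = 0
    while initial + annual * year >= alternative_annual * year:
        year += 1
    return year

epss_initial = 500_000     # one-time design and development (hypothetical)
epss_annual = 50_000       # recurring maintenance and updates (hypothetical)
training_annual = 250_000  # travel, facilities, and instructors (hypothetical)

print(breakeven_year(epss_initial, epss_annual, training_annual))  # prints 3
```

With these placeholder figures, the EPSS’s large up-front cost is recovered in the third year, after which each avoided training cycle is net savings.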

We applied the principles of critical multiplism and triangulation by gathering data from a variety of sources and methods to reflect an accurate measure of all immediate recipients’ experiences with the EPSS. We also used Kirkpatrick’s (1994) four-level training evaluation model, focusing on Level 1 (reaction), Level 3 (behavior), and Level 4 (results). The team did not address Level 2 (learning), as no training activities took place during the evaluation period. See Table 3 for a more detailed overview of the methodology plan.

Table 3. Data Collection Plan

Dimension, Data Collection Method, and Kirkpatrick Level

1. EPSS Content

  • 1.1 Web-based survey of crew members (Part B: Likert scale, Questions 13-15): Level 1
  • 1.2 Email questionnaire to port engineers, commanding officers, and engineering petty officers (Question 6): Level 1

2. Performance of EPSS Users

  • 2.1 Web-based survey of crew members (Part B: User Self-Evaluation, Likert scale, Question 19): Level 1
  • 2.2 Web-based survey of crew members (Part C: User Performance): Level 3
  • 2.3 Practical assessment measuring the accuracy and time-on-task of the operation and maintenance sub-modules: Level 3
  • 2.4 Email questionnaire to port engineers, commanding officers, and engineering petty officers (Questions 7-8): Level 1

3. Cost Analysis of EPSS

  • 3.1 Documentation review of EPSS total costs: Level 4
  • 3.2 Documentation review of Bollinger training total costs: Level 4

4. Organizational Outcomes

  • 4.1 Review of casualties, equipment downtime, loss of underway hours, and mishaps per equipment type from the CG’s engineering documentation database: Level 4
  • 4.2 Web-based survey of crew members (Part B: User Self-Evaluation, Likert scale, Question 20): Level 1
  • 4.3 Email questionnaire to port engineers, commanding officers, and engineering petty officers (Question 9): Level 1


Results and Conclusions

After analyzing the data from each method, we used a rubric to determine each dimension’s quality score, then applied a synthesis rubric to combine the results of all four dimensions (see Table 4 to compare individual dimension scores). Based on the dimensional scores in Table 4, weighted by the importance of each dimension, we concluded that the overall quality of the EPSS program was “Good.” We were able to make this determination because the most important dimension, Performance of EPSS Users (Dimension 2), received a score of “Good,” and none of the dimensions received a score of “Poor.”
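The synthesis logic just described, in which any “Poor” score caps the overall rating and the most heavily weighted dimension otherwise drives it, can be sketched as follows. The rule set is a simplified illustration, not our exact rubric, and all scores except Dimension 2’s “Good” (reported above) are hypothetical placeholders.

```python
# Simplified sketch of a weighted qualitative synthesis rubric.
# Scores run Poor < Mediocre < Good < Excellent; the rules here are an
# illustrative assumption, not the evaluation's exact synthesis rubric.

WEIGHT_ORDER = ["Somewhat Important", "Very Important", "Extremely Important"]

def synthesize(dimension_scores):
    """dimension_scores maps dimension name -> (score, importance weighting)."""
    scores = [score for score, _ in dimension_scores.values()]
    if "Poor" in scores:
        return "Poor"  # any failing dimension caps the overall rating
    # Otherwise the overall rating follows the most heavily weighted dimension
    top = max(dimension_scores.values(), key=lambda sw: WEIGHT_ORDER.index(sw[1]))
    return top[0]

# Only Dimension 2's "Good" comes from the text; other scores are hypothetical.
results = {
    "EPSS Content": ("Good", "Very Important"),
    "Performance of EPSS Users": ("Good", "Extremely Important"),
    "Cost Analysis of EPSS": ("Mediocre", "Very Important"),
    "Organizational Outcomes": ("Mediocre", "Somewhat Important"),
}
print(synthesize(results))  # prints "Good", following Dimension 2's rating
```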

Table 4. Dimensional Results

Dimension (scored Poor, Mediocre, Good, or Excellent) and Importance Weighting

1. EPSS Content (Very Important)

2. Performance of EPSS Users (Extremely Important)

3. Cost Analysis of EPSS (Very Important)

4. Organizational Outcomes (Somewhat Important)

The data suggest that the EPSS can adequately support the performance of the FRC crews and land-based support units in successfully locating information and completing engineering-related tasks, regardless of prior knowledge or training. Due to factors such as the low survey and questionnaire response rates and the lack of availability of FRCs for on-site observations during the data collection period, our team could not definitively determine whether the EPSS could replace Bollinger’s factory training program. However, we concluded from the evaluation that the EPSS met the goal the client considered most important: supporting user performance.

The evaluation was a positive experience that yielded impactful results for the client. The USCG enthusiastically assisted our team in each step of the process. However, we experienced roadblocks to gathering the most comprehensive data possible. The data highlighted a systemic lack of awareness of the EPSS’s existence among immediate recipients, and those who were aware of its existence did not self-report as frequent users. Therefore, we recommended that the USCG conduct a follow-up evaluation structured with similar key dimensions once the organization has taken steps to more widely implement the EPSS and encourage greater usage.

References

Brinkerhoff, R. O. (2005). The success case method: A strategic evaluation approach to increasing the value and effect of training. Advances in Developing Human Resources, 7(1), 86-101.

CG Training System Office. (2011). Advanced distributed learning (Volume 7). Retrieved from USCG’s intranet portal site.

Davidson, J. (2005). Evaluation methodology basics: The nuts and bolts of sound evaluation. Thousand Oaks, CA: Sage.

Davis, B. D., & Williams, T. (2007). 87' CPB crews benefit from award winning EPSS. U.S. Coast Guard Engineering, Electronics and Logistics Quarterly, 13(47), 38-39.

Kirkpatrick, D. (1994). Evaluating training programs: The four levels. San Francisco, CA: Berrett-Koehler.

Scriven, M. (2007). The key evaluation checklist. Retrieved from http://evaluation.wmich.edu/checklists/

About the Authors

Benjamin Lyons is a lieutenant in the U.S. Coast Guard. A graduate of the U.S. Merchant Marine Academy with a degree in logistics and intermodal transportation, Ben is working toward earning his MS in organizational performance and workplace learning through Boise State University by the fall of 2016. Ben can be reached at benlyons@u.boisestate.edu.



Matthew Romano is a lieutenant in the U.S. Coast Guard. A graduate of the U.S. Coast Guard Academy with a degree in civil engineering, Matt is working toward earning his MS in organizational performance and workplace learning through Boise State University by the fall of 2016. Matt can be reached at romanomatthew@yahoo.com.

Lauren Weisberg has spent the past decade teaching fine art and technology education at a public high school in the Philadelphia suburbs. She recently graduated from Boise State University with an MS degree in organizational performance and workplace learning and a certificate in workplace e-learning and performance support (WELPS).  Lauren can be reached at laurenweisberg7@gmail.com.