2. Our Technical Approach
The team for your project will be assembled from our staff of experienced and talented professionals. Our skill sets span the full spectrum of labor categories, including project managers, risk mitigation specialists, configuration managers, quality control specialists, instructional systems designers, computer-based training specialists, programmers, graphic artists, and subject matter experts. Each individual provides an integral component of the process and is fully engaged throughout the lifecycle of each project, ensuring continuity.
2.1. Our Team
Team Composition. We are committed to providing training products faithfully supported by the Science of Learning. To this end, over two thirds of our team members have advanced degrees in Instructional Systems Design, Industrial and Organizational (I/O) Psychology, or a related field. A full complement of supporting staff, including Computer Based Training Specialists (CBTS), Programmers, Graphic Artists, and Quality Control Specialists, completes a unified effort toward the successful development, testing, and implementation of your training solution.
To ensure sufficient capacity for completing the project within the period of performance (PoP), we may partner with one of the many companies with which we have established relationships. As the primary contractor in this partnership, we perform at least 51% of the work and all management activities. The partnering company, as the subcontractor, provides additional skilled labor, including I/O psychologists, ISDs, programmers, and graphic artists.
Should the need arise, our standing partnership with [Staffing Agency] provides us with reach back capabilities to implement cost-effective surge capacity.
2.2. Gamma Model
From the project management perspective, QuantumLeap Instructionals employs our “Gamma” model (γ) methodology, as shown in Figure 2. This model marries Agile development methods with the industry-proven PADDIE model for instructional development. The approach gains customer guidance and buy-in early and often, while maintaining PADDIE’s quality focus and distinct, gated phases.
2.2.1. Quality Assurance
We integrate Quality Assurance practices using processes and tools ranging from job aids and forms to Kanban and 5S principles. Our internal built-in quality process is supplemented with an additional Quality Assurance (QA) process overseen by a dedicated team of Quality Assurance Specialists led by a Configuration Manager. Our Gamma model offers full transparency of our progress, with frequent opportunities for customer guidance and clarification to steer the final product toward your vision and requirements.
While our processes “bake” quality into all our products, all deliverables undergo a rigorous, phased draft and internal review process, followed by final review with tailored punch lists to ensure they are free of errors or defects before they are sent for our customer’s final review and acceptance.
2.2.2. Risk Management
Risk management is integral to our operations, with established communication channels across all levels of the organization. Every team member is empowered to identify potential risks at any time.
A dedicated Risk Management Specialist works with the Project Manager to evaluate potential risks for likelihood and impact. All risks are added to the project’s risk register, mitigation strategies are developed, and risk owners are assigned. Each project undergoes monthly risk reviews to monitor risks and adjust mitigations.
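The likelihood-and-impact evaluation described above can be sketched as a simple scoring model. This is a minimal illustration; the field names, the 1–5 scales, and the sample entries are assumptions for demonstration, not our actual register format:

```python
from dataclasses import dataclass

@dataclass
class Risk:
    """One entry in a project risk register (illustrative fields)."""
    description: str
    likelihood: int  # 1 (rare) .. 5 (almost certain) -- assumed scale
    impact: int      # 1 (negligible) .. 5 (severe)   -- assumed scale
    owner: str
    mitigation: str = ""

    @property
    def score(self) -> int:
        # Common risk-matrix convention: exposure = likelihood x impact
        return self.likelihood * self.impact

def prioritize(register: list[Risk]) -> list[Risk]:
    """Order risks for the monthly review, highest exposure first."""
    return sorted(register, key=lambda r: r.score, reverse=True)

# Placeholder entries, not drawn from any real project
register = [
    Risk("SME availability slips", likelihood=4, impact=3, owner="PM"),
    Risk("LMS test environment delayed", likelihood=2, impact=5, owner="Sys Eng"),
]
for r in prioritize(register):
    print(r.score, r.description)
```

The multiplicative score is one common convention; a real register would also track status and review dates alongside the assigned owner and mitigation.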
2.3. Project Kick-Off
The project formally begins with a Kick-Off Meeting (KOM), conducted either in person or via teleconference, during which our team and customer stakeholders align on requirements, roles, and schedule. The KOM sets the foundation for the planning activities that follow.
2.4. Project Management Plan
The KOM is followed by the development of the Project Management Plan (PMP). This living document presents the processes and procedures, communications channels, Integrated Master Schedule (IMS), and other information detailing the who, how, and when of the activities in executing the project. Once created and cleared through our internal Quality Assurance review, it is provided for the customer’s review. Upon receipt of the customer’s comments, we schedule the Planning Gate Review and implement revisions to the PMP. An agenda is distributed three days prior to the Gate Review.
During the Gate Review, performed either in person or via teleconference, the PMP is examined and discussed. The event culminates with customer acceptance of the document. Minutes of the event are distributed within three working days of its conclusion.
2.5. Analysis
Once the Planning phase has concluded with the customer’s acceptance of the PMP, the Analysis phase of the project begins.
This phase includes a training situation analysis and a task analysis. These activities begin with a review of Government or Customer Furnished Information (GFI/CFI): user manuals, technical manuals, and previously developed courses of instruction. This is followed by interviews with customer-provided Subject Matter Experts (SMEs) to clarify our understanding of the GFI/CFI.
In accordance with our Gamma model, analysis activities are conducted in sprints, with regular customer meetings to provide status updates, share sample materials, obtain clarification or guidance, and ensure stakeholder buy-in.
At the end of the Analysis phase, the team provides the analysis reports to the customer for their review and satisfies any other requirements to enter the Analysis gate.
An agenda is distributed three days prior to the Gate Review. During the Gate Review, a summary brief of the analysis findings is presented and discussed. The gate review concludes with customer acceptance of the findings. Minutes of the event are distributed within three working days of its conclusion. Completion of the Analysis Gate Review marks the start of the Design Phase.
2.5.1. Innovation Process
The Design phase begins with our team conducting a five-step innovation workshop as shown in Figure 3.
The first step, challenge definition, identifies barriers to project goals. Then during the brainstorming step, team members suggest operationalized innovations or novel ideas for addressing the barriers.
For the option down-selection step, the customer helps to evaluate the suggested innovations. Each proposed option is evaluated on its potential to support the project’s goals, its implementation feasibility, its advantages, and its challenges. Customer representatives may discard any option from consideration. The remaining options are then prioritized, with the top three to five advanced to the next step.
Within the option refinement step, project team members detail each option’s integration into the project, including required inputs, outputs, resources, and time; how to measure its benefits; and potential risks and mitigations.
The final step, selecting a Course of Action (COA), selects one of the options for implementation. Customer representatives may opt to implement multiple options if they feel the benefits outweigh the increased risk. Alternatively, they may opt to forego implementing any innovative options in the interest of minimizing risk. The Course of Action is documented with customer sign-off before continuing with the Design phase.
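The down-selection and prioritization described above can be illustrated with a simple weighted scoring matrix. The criteria, weights, option names, and ratings below are entirely hypothetical and exist only to show the mechanics:

```python
# Hypothetical evaluation criteria and weights (illustrative only)
CRITERIA = {"goal_support": 0.4, "feasibility": 0.3, "advantages": 0.2, "risk": 0.1}

# Placeholder options with 1-5 ratings; a higher "risk" rating here
# means lower risk exposure, so higher is always better.
options = {
    "Threaded storyline": {"goal_support": 5, "feasibility": 4, "advantages": 4, "risk": 3},
    "Virtual mentor":     {"goal_support": 4, "feasibility": 3, "advantages": 5, "risk": 2},
}

def weighted_score(ratings: dict) -> float:
    """Combine the per-criterion ratings into a single weighted score."""
    return sum(CRITERIA[c] * ratings[c] for c in CRITERIA)

# Rank options by score; the top three to five would advance to refinement
ranked = sorted(options, key=lambda o: weighted_score(options[o]), reverse=True)
print(ranked)
```

In practice the customer representatives supply the ratings during the workshop, so the matrix simply makes the group’s priorities explicit before the COA decision.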
2.6. Design
Using the selected COA, our team moves forward with the design, holding periodic meetings with customer stakeholders to obtain buy-in and guidance. An instructional analysis is conducted during the Design phase to develop Terminal and Enabling Learning Objectives (TLOs/ELOs). The Learning Objectives (LOs) are assembled into Learning Objective Hierarchies and organized into a course flow diagram. We also identify the media formats and instructional strategies.
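A Learning Objective Hierarchy of the kind described above can be pictured as a tree of TLOs with their supporting ELOs, flattened into a sequenced course flow. The objective statements below are placeholders, not content from any actual course:

```python
# Placeholder TLO/ELO hierarchy (objective text is illustrative only)
hierarchy = {
    "TLO 1: Operate the system": [
        "ELO 1.1: Identify system components",
        "ELO 1.2: Perform startup procedures",
    ],
    "TLO 2: Maintain the system": [
        "ELO 2.1: Perform preventive maintenance checks",
    ],
}

def course_flow(h: dict) -> list[str]:
    """Flatten the hierarchy into a sequenced course flow:
    each TLO followed by its enabling objectives."""
    flow = []
    for tlo, elos in h.items():
        flow.append(tlo)
        flow.extend(elos)
    return flow

for item in course_flow(hierarchy):
    print(item)
```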
While the actual design will depend upon our analysis findings, designs may include some of the following instructional strategies.
- Threaded storylines to provide relevance, context, and motivation for the learner.
- Scenario-based practice and assessments providing relevance and context.
- Practical exercises using object models within a simulated environment.
- Virtual mentor to provide guidance and coaching to the learner.
- Virtual peers to offer collaboration and peer teaching opportunities within the IMI.
The team submits all Design deliverables (i.e., the IMDP, including a sequenced list of TLOs and ELOs, Learning Objective Hierarchies, course flow diagrams, an assessment plan with sample test package, sample storyboards, and an IMI prototype) for the customer’s review prior to entering the Design gate review.
Like the Analysis gate, the Design gate review is performed either in person or via teleconference to ensure all gate exit requirements are achieved. If any discrepancies are identified, these are corrected before moving into the Development phase.
2.7. Development
In the Development phase, our team constructs the training solution defined within the Design phase deliverables. The work is performed in sprints using small batch, staggered, iterative processing in accordance with Lean Manufacturing concepts. This allows specialized teams to focus on implementing features and standards with consistency across the entire training product.
In practice, you might observe a team of our I/O Psychologists focused on drafting assessments around well-structured Learning Objectives, while the work they completed in the previous sprint is incorporated into storyboards by a team of Instructional Systems Designers; the storyboards then pass to the graphic artists in the next sprint, and to the audio engineers the sprint after that. With this staggered approach, and regular customer feedback during each sprint cycle, we can implement changes or course corrections with minimal rework.
Our systems engineering team develops the IMI, the user interface, and installation packages, including the SCORM 1.2 meta-tagging and wrapper, within a test environment that mirrors the LMS architecture and specifications.
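The SCORM 1.2 wrapper mentioned above centers on the package’s `imsmanifest.xml`, which declares the course structure and launchable content for the LMS. The skeleton below follows the SCORM 1.2 content packaging conventions; the identifiers, titles, and file names are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal illustrative SCORM 1.2 manifest; all identifiers
     and file names are placeholders. -->
<manifest identifier="com.example.course" version="1.2"
          xmlns="http://www.imsproject.org/xsd/imscp_rootv1p1p2"
          xmlns:adlcp="http://www.adlnet.org/xsd/adlcp_rootv1p2">
  <metadata>
    <schema>ADL SCORM</schema>
    <schemaversion>1.2</schemaversion>
  </metadata>
  <organizations default="ORG-1">
    <organization identifier="ORG-1">
      <title>Example Course</title>
      <item identifier="ITEM-1" identifierref="RES-1">
        <title>Lesson 1</title>
      </item>
    </organization>
  </organizations>
  <resources>
    <resource identifier="RES-1" type="webcontent"
              adlcp:scormtype="sco" href="lesson1/index.html">
      <file href="lesson1/index.html"/>
    </resource>
  </resources>
</manifest>
```

Verifying this manifest and the SCO’s runtime API calls in the mirrored test environment is what allows SCORM testing on the schoolhouse LMS to proceed smoothly.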
Upon completion and submission of all Development deliverables (Instructor Guide, IMI, and Electronic Performance Support System (EPSS)), the customer conducts a review of the materials. The Development gate is then initiated. As part of the gate events, a team of I/O Psychologists and ISDs conducts a verification event in which SMEs experience the courseware and capture any errors or issues identified. Any discrepancies are corrected to ensure all gate exit requirements are met.
2.8. Implementation and Evaluation
The Implementation phase includes SCORM testing, installing courseware on the schoolhouse’s LMS, and conducting a course pilot. Our system engineers coordinate with schoolhouse technical staff to ensure successful SCORM testing and installation.
The course pilot involves a sample population from the target audience experiencing the IMI for the first time. Our team of I/O Psychologists and ISDs captures pretest and posttest data to validate the effectiveness of the courseware (Kirkpatrick Level 2). The courseware includes a survey to measure student reaction and satisfaction with the experience (Kirkpatrick Level 1). The Kirkpatrick Level 1 results and appended data shall be included within the Pilot Report. We shall also include the Level 2 results within the report as a best practice and added value to the government.
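The Level 2 validation above amounts to comparing paired pretest and posttest scores for the pilot population. A minimal sketch of that summary follows; the scores shown are placeholder data, not results from any pilot:

```python
from statistics import mean, stdev

def learning_gain(pre: list[float], post: list[float]) -> dict:
    """Summarize paired pretest/posttest scores (Kirkpatrick Level 2)."""
    gains = [b - a for a, b in zip(pre, post)]
    return {
        "mean_pre": mean(pre),
        "mean_post": mean(post),
        "mean_gain": mean(gains),
        # Effect size of the paired gain (Cohen's d for paired samples);
        # a full pilot report would add a significance test as well.
        "effect_size": mean(gains) / stdev(gains) if len(gains) > 1 else None,
    }

# Placeholder pilot scores (illustrative only)
pretest  = [55, 60, 48, 70, 62]
posttest = [80, 85, 75, 88, 84]
print(learning_gain(pretest, posttest))
```

Reporting both the mean gain and an effect size gives the government a quick read on whether the courseware moved performance, independent of the raw score scale.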