Canada Border Services Agency

ARCHIVED - Internal Audit Report of IT Systems under Development - Phase 3
Advance Passenger Information/Passenger Name Record Risk Scoring


Archived Content

Information identified as archived is provided for reference, research or recordkeeping purposes. It is not subject to the Government of Canada Web Standards and has not been altered or updated since it was archived. Please contact us to request a format other than those available.

October 2007

Table of Contents

  • Executive Summary
  • 1. Introduction
    • 1.1 Background
    • 1.2 Objectives and Scope
    • 1.3 Audit Criteria
    • 1.4 Approach and Methodology
    • 1.5 Results from Phase 1
    • 1.6 Purpose
  • 2. Overview of API/PNR Risk Scoring
  • 3. Audit Findings
    • 3.1 Technical Solution Development
    • 3.2 Business Transformation
    • 3.3 Authority, Responsibility and Accountability
    • 3.4 Project Management Framework
    • 3.5 Project Risk Management
    • 3.6 Security Assessment
  • Appendix A - Audit Criteria
  • Appendix B - List of Acronyms

Executive Summary

The audit of Canada Border Services Agency (CBSA) systems under development was identified as a priority in the CBSA’s Risk-Based Multi-Year Audit Plan. This Plan was approved by the Internal Audit and Evaluation Committee on March 17, 2005.

The principal objective of this audit was to provide assurance to the CBSA that its integrated project management and system development practices for automated business systems under development adhere to the internal policies and procedures established by the CBSA. A further objective was to identify opportunities to improve the CBSA’s existing internal policies and procedures for managing its systems under development.

The audit was divided into three phases. Phase 1, reported in February 2007, provided an assessment of the adequacy and effectiveness of the CBSA’s current management control framework. Phases 2 and 3 provide an assessment of the degree to which two selected system development projects are effectively and appropriately applying the control framework. This report presents the findings from Phase 3 of the audit that reviewed the management framework for the development of a selected system development project: Advance Passenger Information/Passenger Name Record (API/PNR) Risk Scoring. The audit was conducted by Interis Consulting.

API/PNR Risk Scoring is part of the Passenger Information System, a multi-year, multi-million-dollar project launched under the Smart Border Declaration to improve the handling of air passengers. API/PNR is intended to provide the CBSA with more effective risk-management processes and tools to assess the risk of air travellers (particularly those posing a high or unknown risk) before they arrive at Canada’s international airports. Development work on API/PNR began before the formation of the Agency as a joint initiative between the former Canada Customs and Revenue Agency (Customs Branch), Citizenship and Immigration Canada and U.S. Customs and Border Protection (CBP). The audit addressed the risk-scoring and data-sharing components that were implemented in June 2006.

Audit Findings

Based on the audit work conducted in Phase 3, it was found that API/PNR Risk Scoring was developed in accordance with the CBSA policies, procedures and methodology in place at the time, and that the functionality of the application is working as intended. Many of the observations made in Phase 1 of the audit were supported by the audit work conducted in Phase 3. In addition, the audit noted that since the time of the API/PNR Risk Scoring project development, many new processes and governance practices have been developed and introduced for more recent projects.

The audit noted a number of strengths in the API/PNR Risk Scoring development process, including a strong governance framework and the use of a sound system development approach. Because API/PNR is a joint development program with U.S. CBP, the governance framework and structure, joint user requirements and a memorandum of understanding were formally developed and signed off by senior representatives of U.S. CBP and the CBSA. A privacy impact assessment was completed and shared with the Office of the Privacy Commissioner of Canada. Recommendations regarding privacy concerns were implemented. The system development approach included the following activities that would be expected in any system development project:

  • The feasibility and viability of the solution were assessed.
  • User requirements were defined with involvement from stakeholders.
  • Testing was appropriately performed.
  • The implementation of the API/PNR Risk Scoring was well managed.

Although many of the findings from the audit of API/PNR Risk Scoring are the same as those reported in the Phase 1 audit report, the following additional areas of control were identified in Phase 3 that could be strengthened at the project level: 

  • Governance -- Authority, responsibility and accountability for this project were defined within a project charter and a project governance structure; however, the commitment from sponsors was not well understood, secured and documented at the project’s outset.
  • Business Benefits -- High-level business benefits were defined and clearly understood for the project; however, detailed operational benefits and results were not defined well enough for their realization to be fully assessed.
  • Project Management Framework -- Two key audit observations were made:
    • Planning and scheduling were not sufficiently integrated across all directorates of the Innovation, Science and Technology Branch (ISTB) to ensure that interdependencies and key dates were clearly understood and planned to meet scheduled release dates and improve efficiencies; and
    • The change management process had not been fully defined and communicated to sponsors, including the expectations and authorities for approvals.
  • Security Assessment -- Security requirements were considered for the broader API/PNR program; however, at the time of the system development the IT Security Group in the ISTB was still establishing itself, and the audit could not determine who in the Agency was involved in assessing, reviewing and approving the security requirements.

Management’s Response

Management has carefully reviewed the report and has provided management action plans to address the audit recommendations, along with the completion date for each action.

1. Introduction

1.1 Background

The audit of Canada Border Services Agency (CBSA) systems under development has been identified as a priority in the CBSA’s Risk-Based Multi-Year Audit Plan. This Plan was approved by the Internal Audit and Evaluation Committee on March 17, 2005.


1.2 Objectives and Scope

The principal objective of this audit was to provide assurance to the CBSA that its integrated project management and system development practices for automated business systems under development adhere to the internal policies and procedures established by the CBSA. A further objective was to identify opportunities to improve the CBSA’s existing internal policies and procedures for managing its systems under development.

To achieve these objectives, the internal audit plan was divided into the following three phases:

  • Phase 1 -- A review of the CBSA’s management control framework (MCF) for the development of automated business systems.
  • Phase 2 -- Audit of a selected system under development project.
  • Phase 3 -- Audit of a second selected system under development project.

The audit deliverables for Phase 1 were the following:

  • Provide an assessment of the adequacy and effectiveness of the CBSA’s current MCF; and
  • Select two of the CBSA’s current system development projects for a more comprehensive audit review of their application of current controls and governance processes. The following two systems under development were selected:
    • Phase 2 -- Advance Commercial Information/Electronic Data Interchange Reporting for Air, and
    • Phase 3 -- Advance Passenger Information/Passenger Name Record (API/PNR) Risk Scoring.

The audit deliverables for Phases 2 and 3 were the following:

  • Provide an assessment of the degree to which the two selected system development projects effectively and appropriately apply CBSA policies, procedures and methodology for system development projects; and
  • Provide recommendations for improving policies and procedures at the CBSA for future system development projects.

The scope of this audit was to assess the development of API/PNR Risk Scoring Phase II and not the effectiveness of the application itself. Given the multi-phase approach taken for the API/PNR program, much of the planning, scoping and identification of requirements for the entire program (including Risk Scoring Phase II) had been completed in Phase I. The scope of the audit included all development activity that affected Risk Scoring Phase II regardless of the phase when it occurred.


1.3 Audit Criteria

Audit criteria were developed based on risk elements inherent to system development projects, as well as on public sector trends for effective governance of system development project lifecycles. A detailed set of audit criteria was established that factored in the primary influences that can significantly affect system development projects in the Government of Canada (GC): management structure and processes; system development lifecycle policies and standards; and planning and acquisition processes and procedures. The detailed audit criteria were used to assess CBSA practices.

It was noted during the initial audit planning that business system development projects at the CBSA are inherently exposed to risk given the following conditions:

  • The CBSA has undergone significant reorganization;
  • The CBSA has had to adopt and adapt separate system development processes, including separate IT infrastructures that were established in its predecessor departments (primarily the former Canada Customs and Revenue Agency and Citizenship and Immigration Canada); and
  • There are aggressive delivery timelines imposed on the CBSA from outside sources.

Key control areas addressing inherent risks were identified based on the Treasury Board of Canada Secretariat’s (TBS) Enhanced Management Framework and the Control Objectives for Information and related Technology (CobiT) Framework issued by the IT Governance Institute. The audit criteria used to initially assess the CBSA’s overall business practices, general controls and governance processes for their systems under development projects have been organized into the following six categories:

  • Governance
  • Business Benefits
  • User Requirements
  • Project Management
  • Technical Solution
  • Business Transformation

The audit criteria are presented in Appendix A.


1.4 Approach and Methodology

The approach and methodology were risk-based and compliant with applicable TBS internal audit policies. The audit was conducted in accordance with an audit program that defined audit tasks to assess each criterion. Through interviews and documentation review, the audit assessed the current practices against the criteria and formally assessed the effectiveness of each practice.

Interviews were conducted with representatives of the Innovation, Science and Technology, Enforcement, Comptrollership and Operations branches.

The fieldwork for Phase 3 was conducted between July and October 2006.


1.5 Results from Phase 1

The results of the audit from Phase 1 were reported in February 2007. A number of strengths were noted in the controls over systems under development that can provide the foundation for building a strong MCF over systems under development, namely a governance structure and a project management framework. Given how recently the Agency had been reorganized, many of the processes were still in early development and had not yet matured in the organization. New committees were being formed, roles and responsibilities were being clarified, and specific deliverables were being developed. These activities are indicators that the Agency is making progress toward enhancing its control over systems under development.

The following key areas of control that should be strengthened were noted in the Phase 1 audit report:

  • Project Management Framework -- The project management framework was not fully defined and did not include processes such as the following:
    • Gate approvals where predefined gates are established for management to formally evaluate the continued realization of benefits and provide a go/no-go decision.
    • Project status reporting on project cost tracking against budget.
    • Processes to consolidate individual project schedules into one master schedule.
  • Project Prioritization -- The process to realign priorities and resources in-year for the portfolio of projects when new initiatives are introduced had not been sufficient, given the capacity limitations of the Agency. Prioritization of projects continues to be a particular challenge for the CBSA. Given external influences and pressures, the Agency does not have full control to drop or realign its existing priorities.
  • End User Involvement -- End users were not always sufficiently engaged in the system development process, particularly in the acceptance of the functional design, final testing and approvals of the system before it is put into production, and the formal review and acceptance of scope changes.

Management reviewed the Phase 1 audit report and provided management action plans to address the audit recommendations and the expected date of implementation for each action.


1.6 Purpose

The purpose of this document is to present the findings from Phase 3 of the audit.


2. Overview of API/PNR Risk Scoring

Following the terrorist attacks of September 11, 2001, Canada and the United States signed the Smart Border Declaration, also known as the Manley-Ridge plan, in December 2001. The Smart Border Declaration outlines an Action Plan, initially 30 points and later expanded to 32, that provides for ongoing collaboration to identify and address security risks while efficiently expediting the legitimate flow of people and goods across the Canada-U.S. border. U.S. Customs and Border Protection (CBP) has had the lead responsibility for 11 of the 32 Action Plan initiatives. The CBSA had responsibility to develop an API/PNR program in Canada parallel to the one being developed in the United States.

The Smart Border Declaration identified a commitment to using innovative technology to more effectively identify high-risk individuals (the High-Risk Traveller Identification Initiative). In addressing this commitment, Canada and the United States agreed to share API and PNR data as well as to explore a means of identifying risks posed by travellers on international flights destined to either country. Managed jointly by the former Canada Customs and Revenue Agency (Customs Branch), Citizenship and Immigration Canada and U.S. CBP, the initiative was composed of two parts:

  • Lookout sharing, which was implemented in two phases: bulk transfer of lookouts from the United States implemented on June 29, 2004, and automated exchange of lookouts with the United States on May 17, 2005; and
  • API/PNR risk scoring and data sharing, which were implemented in June 2006.

This audit looked at the system development project that addressed the second component, API/PNR Risk Scoring.

API/PNR was one of the highest priorities for the CBSA and a GC commitment to help protect the border. Although conceived before the events of September 11, 2001, the API/PNR Risk Scoring project became a priority for the GC as a result of those events. 

API/PNR is intended to provide the CBSA with more effective risk-management processes and tools to assess the risk of air travellers (particularly those posing a high or unknown risk) before they arrive at Canada’s international airports. This initiative provides CBSA officers with pre-arrival data regarding air travellers destined for Canada. Air carriers provide PNR information (in electronic form) for all flights destined for Canada at the time of departure and, for the United States, within 30 minutes of departure. API, together with other relevant information, permits officers to analyze the information and, with the assistance of the automated targeting and risk-assessment tool, to identify high-risk air travellers requiring further scrutiny on arrival. 
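
To illustrate the kind of automated pre-arrival assessment described above, the following is a minimal, purely illustrative sketch of rule-based risk scoring of passenger records, written in Python. The record fields, lookout list, indicators, weights and threshold are hypothetical; they are not drawn from the CBSA targeting and risk-assessment tool.

```python
from __future__ import annotations
from dataclasses import dataclass, field

# Hypothetical passenger record assembled from API/PNR data elements.
# Field names are illustrative only.
@dataclass
class PassengerRecord:
    name: str
    document_number: str
    routing: list[str] = field(default_factory=list)  # itinerary airport codes
    payment_method: str = "unknown"
    booking_days_before_departure: int = 0

# Hypothetical lookout list and scoring threshold.
LOOKOUT_DOCUMENTS = {"X1234567"}
HIGH_RISK_THRESHOLD = 50

def risk_score(p: PassengerRecord) -> int:
    """Return an illustrative risk score for one passenger record."""
    score = 0
    if p.document_number in LOOKOUT_DOCUMENTS:
        score += 100   # record matches an existing lookout
    if p.payment_method == "cash":
        score += 20    # example risk indicator
    if p.booking_days_before_departure <= 1:
        score += 15    # last-minute booking, another example indicator
    return score

def flag_high_risk(manifest: list[PassengerRecord]) -> list[PassengerRecord]:
    """Select records whose score meets the threshold for further scrutiny on arrival."""
    return [p for p in manifest if risk_score(p) >= HIGH_RISK_THRESHOLD]

if __name__ == "__main__":
    manifest = [
        PassengerRecord("A. Traveller", "C7654321", ["YYZ"], "credit", 30),
        PassengerRecord("B. Traveller", "X1234567", ["LHR", "YYZ"], "cash", 0),
    ]
    for p in flag_high_risk(manifest):
        print(f"Refer for further scrutiny: {p.name} (score {risk_score(p)})")
```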

API/PNR is part of the Passenger Information System, a multi-year, multi-million-dollar project launched under the Smart Border Declaration to improve the flow of air passengers by providing for the receipt and analysis of travel information on individuals before their arrival in Canada. 

The CBSA worked on API/PNR in parallel with U.S. CBP, based on a joint requirements document that outlined the requirements that had to be implemented and followed by each border agency. 

At the time that API/PNR Risk Scoring was being developed, the organizational structure of the CBSA and the Innovation, Science and Technology Branch (ISTB) [ 1 ] was as presented below.

Figure: Organizational structure of the CBSA and the Innovation, Science and Technology Branch (ISTB)

Under this structure, the API/PNR Risk Scoring project was managed by one project manager from the Major Projects Design and Development (MPDD) Directorate and by one information technology (IT) project manager from the Border Systems Directorate. The project was sponsored by the Enforcement Branch. Through the development process, there were personnel changes to the ISTB team, as well as numerous changes in the other branches. 

At the time when API/PNR Risk Scoring was being developed, there were many pressures on the Agency and the ISTB. The operations of the Agency are highly dependent on information systems and technology, which are developed, implemented and managed by the ISTB. The scope of system development activity at the CBSA is significant, with annual spending on development and maintenance of systems, infrastructure and related technology estimated to be in the $150-250 million range over the next few years. In response to increased North American security concerns following the terrorist attacks of September 2001, the GC introduced a considerable number of security-related initiatives under the umbrella of the Public Security and Anti-Terrorism (PSAT) Initiative and the Smart Border Declaration, resulting in expedited initiative development. This put pressure on a number of system development projects, including API/PNR Risk Scoring.

The API/PNR program was funded from the Manley-Ridge funding envelope. Although a budget was not officially established for the API/PNR Risk Scoring project, development costs were estimated to have been approximately $4.87 million. [ 2 ] This cost estimate includes all CBSA costs related to the development of API/PNR Risk Scoring and data sharing from 2003-2004 to 2006-2007.

API/PNR Risk Scoring Phase II was implemented in June 2006 and is in active use by its primary user, the National Risk Assessment Centre (NRAC). NRAC issues lookouts to ports of entry in both Canada and the United States, which facilitates the expeditious flow of information and intelligence between and within partnering agencies to help identify high-risk travellers before they arrive at the border. Further development of API/PNR will involve building interfaces to improve immigration data exchange and to enhance data analysis functionality.


3. Audit Findings

Based on the audit work conducted in Phase 3, it is concluded that API/PNR Risk Scoring was developed in accordance with CBSA policies, procedures and methodology in place at the time. Furthermore, the application is working as intended. However, opportunities exist to strengthen the system development processes used for API/PNR Risk Scoring to ensure adequate governance, risk management and control over future systems under development.

Since the time of API/PNR Risk Scoring Phase II development, many new processes and governance practices have been developed and introduced for more recent projects. In addition, many of the observations made in Phase 1 were supported by this Phase 3 audit. Management action plans for Phase 1 audit recommendations indicated that many audit areas needing strengthening have recently been addressed or are in the process of being addressed.

The audit findings are described in detail below.

3.1 Technical Solution Development

A sound system development approach was followed.

The audit found that the CBSA system development methodology was used across all phases of the API/PNR Risk Scoring project. The following key activities were carried out:

  • The feasibility and viability of the solution were assessed. 
  • User requirements were defined with involvement from stakeholders including U.S. CBP. 
  • Testing was appropriately performed. 

The API/PNR Risk Scoring project was developed using an industry-recognized methodology, the Rational Unified Process (RUP), [ 3 ] to achieve the final design and build.  The audit examined various documents that demonstrated good development practices for the technical solution, as expected in any system development process. The system design was described in a project scope document, an initiation phase process document and a high-level business requirements document. User requirements were clearly described and documented in business use cases. The requirements were then translated into a system component design document and system use cases for the developers. Many of the documents prepared for API/PNR were developed using templates that have since evolved and been replaced by new templates, but the key objectives of the documents remain the same. Key individuals from the ISTB were involved in the initial design and development phases. The Enterprise Architecture Group interfaced with the Business Architecture Group in the MPDD Directorate to define the requirements and review the feasibility of the technical solution.

User requirements were defined with involvement from key stakeholders including U.S. CBP. The MPDD Directorate took a large role in determining the high-level requirements and priorities; however, joint application development sessions were held with representatives of the Enforcement Branch and U.S. CBP to develop the business use cases. The Intelligence Sector and Contraband Program in the Enforcement Branch provided functional guidance and subject matter expertise to the API/PNR Risk Scoring project. The audit found evidence of ongoing refinement of the business requirements, largely through e-mail communications and updates to project documents. To improve its process of developing user requirements, the ISTB has begun using "usability architects" who look at the interface to see how people use the system. User requirements can then be further enhanced to make the system more "usable" for the users.

Testing of API/PNR Risk Scoring was performed in a number of different test environments before implementation. Development testing was performed by the IT group and included data and database integrity testing, function testing, business cycle testing and user interface testing. Test cases were mapped out, schedules were defined, resources were assigned and test results were documented and summarized. After the development testing was completed, the MPDD Directorate conducted systems integration testing, user acceptance (client) testing and operational testing. Individuals outside the MPDD Directorate, including those from NRAC, were invited to participate in user acceptance testing.

Typically, once the development and implementation are complete, the system application is moved from development to production and is transferred to the Systems Operations group in the MPDD Directorate for ongoing support. The move is well controlled through release management procedures. At the time of the audit, the API/PNR Risk Scoring project had not yet been transferred to Systems Operations. It is a normal practice within the ISTB to keep a system under development project under the control of the development group for about a year after implementation to monitor initial operational problems before it is moved to Systems Operations.


3.2 Business Transformation

The implementation of API/PNR Risk Scoring was well managed.

Implementation plans were developed and managed by the MPDD Directorate, in collaboration with NRAC, to support internal users (primarily NRAC) and external air carriers. Processes and resources were put in place to record and track systems issues and specific API/PNR problems. End users were trained in the use of API/PNR Risk Scoring based on formal training documentation prepared during the development process. Training and maintenance of training materials are carried out by NRAC as the primary user of API/PNR Risk Scoring.

Air carriers were required by law to provide the data elements defined jointly by the CBSA and U.S. CBP. These data elements were reviewed with the Office of the Privacy Commissioner of Canada and the European Union to ensure that they complied with Canadian and European law. Communications with the external air carrier community were maintained through the development process of the data acquisition component to ensure that necessary data was provided in the correct format for use in API/PNR Risk Scoring. A formal process was in place for carriers to test their interfaces to send data electronically.

A formal structure and processes were developed to support internal users and external air carriers. A call centre approach is used to document and resolve problems from air carriers. Carriers are provided with toll-free support during regular business hours. Carriers reporting API/PNR data electronically are supported by the ISTB call centre as the first line of support for data transmission problems. If required, the call centre escalates problems to the technology services group responsible for monitoring the external interface.


3.3 Authority, Responsibility and Accountability

Authority, responsibility and accountability for the API/PNR Risk Scoring project were defined within a project charter and a project governance structure; however, the commitment from sponsors was not well understood, secured and documented at the project’s outset. 

Authorities, responsibilities and accountabilities for API/PNR Risk Scoring were identified in the project charter for the API/PNR High-Risk Traveller Identification Initiative. The project charter was developed jointly with U.S. CBP over the period from November 2002 to April 21, 2004. The project charter was initially signed off by the CBSA and U.S. CBP in May 2003 with subsequent revisions made to reflect changes that were required as the project evolved in the bilateral development arrangement.

In addition to the project charter, a governance structure was defined for API/PNR in the governance structure for the High-Risk Traveller Identification Initiative, which evolved over the period from July 30, 2003, to March 3, 2004. The governance structure was created based on input from the various relevant organizations including Customs Branch and Citizenship and Immigration Canada (before the creation of the CBSA), Contraband and Intelligence Directorate and the MPDD Directorate. The structure included clear definitions of the executive sponsors, the API/PNR working group (steering committee role), the project management group and the project team members. It also outlined the respective roles and responsibilities of these individuals, decision-making responsibilities, the lead U.S. counterpart for bilateral communications and liaison, and the dispute resolution process. The governance structure was approved by representatives of the following organizations:

  • Enforcement Program Development, Enforcement Branch, CBSA
  • Strategic Intelligence, Immigration Services, CBSA
  • Risk Assessment Systems Division, MPDD Directorate, CBSA
  • Intelligence Services Division, Contraband and Intelligence Directorate, CBSA
  • Office of Border Security and Facilitation, U.S. CBP

The Enforcement Branch was identified as the lead sponsor for the project with the MPDD Directorate having the lead role for development. In the early stages, the Enforcement Branch was heavily involved in the definition of user requirements in business use cases. However, as the project progressed, sponsorship and engagement of the business units decreased.

With multiple conflicting demands on the Enforcement Branch, engaging the sponsor was seen as a constant struggle. As with all development projects, the ISTB and the sponsor have authorities and accountabilities to make decisions affecting the development project. On account of the demands on the sponsor, the MPDD Directorate drove the project to completion with authority for overall development and the resources to complete it. Under the previous structure, the MPDD Directorate was part of, and represented, the business group and held overall systems authority for all major system development projects. Interviewees indicated that the MPDD Directorate assumed authority for some development decisions out of necessity to meet the timelines for project delivery. Some senior managers from the sponsor branches interviewed indicated that they did not involve themselves sufficiently throughout the development process, largely due to competing priorities and other operational commitments. Decisions were made quickly and sponsors did not always have time to respond. This pace was driven by an environment in which numerous projects were underway at the same time and many incremental demands were being placed on the Agency to deliver its mandate.

As the project evolved, sponsor branches became more involved in providing policy direction (e.g. the policy on privacy concerns and the process for sharing sensitive information with U.S. CBP). As the project progressed, decisions were made in bi-weekly project team meetings and recorded in minutes. Apart from the formal project charter, governance structure and scope documents, sponsors’ approvals of decisions at key milestones in the project development were not formally documented. It is important to obtain and document sponsors’ support to ensure that the development work continues to meet sponsors’ requirements and is expected to achieve projected benefits.

In the Phase 1 audit report, it was recommended that the ISTB should continue with its plans to develop processes to better engage sponsoring branches in systems development. Management had indicated that processes would be developed, as well as a change management strategy, which would include obtaining appropriate approvals of the sponsoring branch.


3.4 Project Management Framework

The project management framework currently in development for systems under development projects was not in place at the time of the API/PNR Risk Scoring project development. As such, weaknesses were identified with the project management practices used for this project.

As noted in the Phase 1 audit report, a project management framework (or lifecycle) to manage systems under development is in development and being refined with supporting tools and templates for implementation. This framework was not in place at the time of the API/PNR Risk Scoring project development. The project did, however, follow a development methodology with clear phases closely aligned to the framework being developed. 

The Phase 1 audit report noted areas where the project management framework could be enhanced or improved. The audit work done in Phase 3 supports those earlier conclusions. The following observations were made on the project management practices for the development of the API/PNR Risk Scoring project:

  • Documentation of Gate Approvals -- The audit noted that formal approval decisions at key milestones were not well documented; no documentation supporting approvals at key milestones was found. Interviewees indicated that although a formal sign-off process was not in place at the time of the development, consensus for proceeding was obtained within the development team. Although there were several forums that discussed the progress of all projects, there was no steering committee to oversee project progress and to make decisions specific to the API/PNR Risk Scoring project, including approvals to proceed at predefined gates. This process weakness may be a result of the authorities and accountabilities for projects not being well defined. An effective oversight body that formally documents and communicates approvals will ensure a common understanding of and transparency in the development process, and ensure that approvals are received when needed.
  • Cost Monitoring -- A single budget estimate for this project could not be determined since allocations for API/PNR Risk Scoring were highly distributed and not easily separated from other API/PNR sub-projects and accumulated. The CBSA’s cost accounting system tracks costs by cost centre, i.e. it tracks costs by funding source, not by project. As such, costs associated with the API/PNR Risk Scoring project were incurred through many different cost centres and were not accumulated or reported by project. Within the cost centres, regular budgeting and cost comparisons are prepared to show monthly progress against the plan, which provides a rough indicator of whether budgets are being achieved. As was the practice in the IT directorates for many years, all CBSA employees now capture their time by project code, as of April 1, 2007. This should permit the accumulation and tracking of project costs (see the illustrative sketch after this list), thereby enabling appropriate oversight of project spending.
  • Integrated Planning and Scheduling -- The audit found that project plans and schedules existed within the MPDD and IT directorates but were not integrated to coordinate all the various development tasks and interdependencies of the project. As with many ISTB system development projects, this project involved many individuals including one project manager in the MPDD Directorate, one IT project manager and many other contributors including architects and data management and release management teams. Ongoing oral communications between various teams to monitor interdependencies were considered by the project teams to be an effective way to coordinate activities. An integrated plan and schedule will ensure that interdependencies are clearly understood and key dates are planned to meet scheduled release dates and improve efficiencies. The recent ISTB reorganization that consolidated the IT developers with MPDD project managers may improve the integration issues between the MPDD and IT directorates. 
  • Benefits Realization -- As noted in the Phase 1 audit report, a benefits realization framework was not in place to clearly define benefits at the outset of the API/PNR Risk Scoring project and then to monitor and track their realization upon completion. For this project, the high-level business benefits for the overall API/PNR program were clearly understood and defined in various documents, including the Smart Border Action Plan, a Treasury Board submission dated October 27, 2003, the API/PNR project proposal dated November 14, 2003, and the API/PNR project charter dated April 21, 2004. However, the operational benefits of API/PNR Risk Scoring were not clearly defined and have not been measured. It was not known whether API/PNR Risk Scoring has improved the “hit rate” in identifying high-risk travellers and contributed to better enforcement. A baseline measure and measurement targets were not identified at the outset of the project, and “hit rate” results were not reported from the system. Thus, it was difficult to measure, track and report on the realization of the operational benefits of API/PNR Risk Scoring.
  • Change Management -- In its review of the API/PNR Risk Scoring project, the audit noted that although the project scope was defined, the process and authorities for approving change requests were not well defined. In reviewing sample change requests, the audit did not find formal sign-offs by the sponsors and the ISTB. The audit revealed that approvals of scope change requests from the sponsors were usually voiced at project team meetings; however, it was difficult to assess from a review of project team meeting minutes whether the appropriate person approved the change. Several sponsor representatives interviewed also indicated that the required turnaround time was sometimes too short to fully review the change or to involve more senior staff. Once a change was considered approved by the sponsor group, a formal change control process was in place within IT to review the change and determine whether it could be added to the release schedule. Change management is a critical component of project management, ensuring that sufficient rigour is in place to avoid scope creep. Sponsors would value a clearer understanding of the change management process and of the expectations and authorities for approving changes. This process weakness could have been the result of the authorities and accountabilities for projects not being well defined.
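
As referenced in the Cost Monitoring observation above, the following is a minimal sketch, in Python, of how time or expense entries captured by project code could be rolled up by project rather than by cost centre. The record layout, codes and amounts are hypothetical and are not taken from the CBSA’s cost accounting system.

```python
from collections import defaultdict

# Hypothetical entries: (cost_centre, project_code, amount_in_dollars).
entries = [
    ("CC-1010", "API-PNR-RS", 12_500.00),
    ("CC-2040", "API-PNR-RS", 8_200.00),
    ("CC-1010", "OTHER-PROJ", 5_000.00),
]

def totals_by(key_index: int, rows) -> dict:
    """Accumulate amounts by the chosen key (0 = cost centre, 1 = project code)."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[key_index]] += row[2]
    return dict(totals)

# Cost-centre view (how costs are reported today) versus the project view
# that capturing time by project code makes possible.
print("By cost centre:", totals_by(0, entries))
print("By project:    ", totals_by(1, entries))
```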

In the Phase 1 audit report, it was recommended that the ISTB continue with its plans to implement a process to ensure that gates were clearly defined and that benefits realization is reviewed and documented at each gate. Management had indicated that such processes would be implemented. It was also recommended that a business case process be developed and management had indicated that such a process would be developed.

Recommendation:

  1. The ISTB should further enhance its project management framework to include the following:
    • A formal process to integrate planning and scheduling within individual projects across all directorates.
    • Definition and communication to sponsors of the change management process, including the expectations and authorities for approving changes.
Management Action Plan
  • Management is in the process of further refining the integrated master project schedule and is working in consultation with project teams and other stakeholders. (Completion date: November 2007)
  • Management is developing a formal change management strategy that includes establishing a formal change management control board. (Completion date: December 2007)

3.5 Project Risk Management

Project risks were managed informally. Risk-management processes could be strengthened through improved analysis and tracking of project risks.

A formal project risk-management process involves the identification, analysis, mitigation and monitoring of risks that could jeopardize the success of a project with respect to product quality, schedule and cost. The potential consequences associated with the project risks and the tolerance for the risk exposure would be discussed to determine the formal risk response (avoid, mitigate, assume or transfer/escalate). The project team would determine whether the risks are acceptable and what can be done to remedy or better manage the risk. A formal escalation process would ensure that senior management is engaged in the risk discussions at the appropriate times.

For the API/PNR Risk Scoring project, the audit found that risks were informally addressed on a day-to-day basis but they were not well documented. Interviewees indicated that when risks were identified, they were reported to the relevant project manager during regular meetings. Minutes of meetings were kept for the regular project meetings to support the documentation of the risks and actions taken. A project issues log was maintained for this project to monitor project management issues; however, a standard risk log was not maintained for the project to monitor the risk exposure across the Agency as a whole. Since the time of the development process, a risk log template has been developed to document project risks. 
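
As an illustration of what a standard risk log could capture, the following sketch shows one possible record layout and a simple escalation rule, written in Python. The fields, values and sample entry are hypothetical; they are not taken from the ISTB risk log template.

```python
from __future__ import annotations
from dataclasses import dataclass

# Hypothetical risk log entry; fields are illustrative only.
@dataclass
class RiskLogEntry:
    risk_id: str
    description: str
    likelihood: str   # e.g. low / medium / high
    impact: str       # e.g. low / medium / high
    response: str     # avoid, mitigate, assume or transfer/escalate
    owner: str
    status: str = "open"

risk_log = [
    RiskLogEntry(
        risk_id="R-001",
        description="Key subject-matter expert unavailable during user acceptance testing",
        likelihood="medium",
        impact="high",
        response="mitigate",
        owner="Project manager",
    ),
]

# Simple escalation rule: open, high-impact risks are surfaced to senior management.
for risk in risk_log:
    if risk.status == "open" and risk.impact == "high":
        print(f"Escalate {risk.risk_id}: {risk.description} (response: {risk.response})")
```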

In the Phase 1 audit report, it was recommended that the ISTB should continue with its efforts to implement a risk-management strategy to support the Major Project Governance and Project Lifecycle Framework. Management had indicated that such a strategy would be implemented.


3.6 Security Assessment

Security requirements were considered for the broader API/PNR program but the level of assessment was unclear.

A security assessment for the API/PNR program was prepared in the appropriate template to ensure key aspects of IT security are consistently addressed for all projects. From the documentation, it was unclear who reviewed and approved the security assessment. Through interviews, it was noted that the process for conducting security assessments was being reviewed at the CBSA by the IT Security Group in the ISTB and the Departmental Security Office in the Comptrollership Branch.

Recommendation:

  1. The Comptrollership Branch, in coordination with the ISTB, should continue its review of the processes to conduct security assessments and to clarify the role of the IT Security Group and the Departmental Security Office in assessing security requirements for individual projects, including technical, information and physical security requirements.
Management Action Plan
  • The Comptrollership Branch and the ISTB will develop a comprehensive process to ensure that threat and risk assessments of new IT systems/projects include a coordinated security assessment that takes into account the technical systems requirements as well as information and physical security considerations. Once developed, this process will be incorporated into the project requirements and monitored via the project management process. (Completion date: first quarter of 2008-2009)

Appendix A - Audit Criteria

The audit criteria used for Phase 3 of the audit, organized by control category, were the following:
Governance
  • Authority and accountability defined.
  • Projects prioritized to achieve CBSA objectives.
  • Stakeholders engaged in and committed to the project.
  • Project performance is regularly measured, reported and monitored.
  • Effective oversight bodies established, including their roles with respect to governance, risk management and control.
  • Approval decisions made at key milestones.
Business Benefits
  • Business benefits identified and there are quantifiable and/or qualitative metrics to track and measure their actual realization.
User Requirements
  • User requirements identified, verified, validated and properly documented.
  • Management controls exist to manage changes and conflicts across business systems.
Project Management
  • An integrated project plan is prepared that clearly guides the project execution and control.
  • Scope management processes in place to ensure that the project scope includes all the work required and only the work required to be completed.
  • Cost-management processes established to ensure that the project is completed within the approved budget.
  • Schedule management processes properly established and used effectively to ensure the project will be completed on time.
  • Quality management system is in place to ensure that the needs for which the project was undertaken will be satisfied.
  • Risk-management processes in place to regularly identify, analyze and respond to project risks.
  • The project status is regularly monitored and reported.
  • Problem- and issue-management processes in place to identify, action and monitor problems.
  • Communication processes in place to ensure optimal coordination within and outside the project team.
  • The project makes the most effective use of the people involved.
  • Processes in place to manage partner and supplier relationships.
Technical Solution
The technical solution is viable in terms of implementing the new technology:
  • System development standards and procedures established;
  • Feasibility of the technical solution is assessed;
  • Application software developed in accordance with design specifications, development and documentation standards and quality requirements;
  • Information security requirements met by all components;
  • Testing is sufficiently planned, performed and documented and covers all components of the information system (e.g. application software, facilities, technology and user procedures);
  • Final test results reviewed and approved by user management and the IT function; and
  • Formal procedures in place to promote the system from development to testing to production.
Business Transformation
Appropriate governance processes and procedures are in place to address the impacts on users of changes generated by the implementation of new system development projects. For example:
  • An implementation plan is developed and approved by management from all stakeholder groups to guide the rollout and releases;
  • Users sufficiently trained in a timely way;
  • End user support is clearly established and communicated to users;
  • Calls from users tracked and actioned; and
  • Escalation procedures established.

Appendix B - List of Acronyms

API
Advance Passenger Information

API/PNR
Advance Passenger Information/Passenger Name Record

CBSA
Canada Border Services Agency

CFIA
Canadian Food Inspection Agency

CIC
Citizenship and Immigration Canada

CRA
Canada Revenue Agency

DSO
Departmental Security Office

ICAB
Integrated Change Advisory Board

ICES
Integrated Customs Enforcement System

ICS
Integrated Customs System

ISTB
Innovation, Science and Technology Branch

Lookout
The identification of a person, goods, business, conveyance or document that relates to a potentially serious violation of Canadian or U.S. customs or immigration laws

MCF
Management Control Framework

MPDD
Major Projects Design and Development

NRAC
National Risk Assessment Centre

PAXIS
Passenger Information System

PNR
Passenger Name Record

PSAT
Public Security and Anti-Terrorism

RUP
Rational Unified Process

SOS
Statement of Sensitivity

TRA
Threat and Risk Assessment

U.S. CBP
U.S. Customs and Border Protection