Evaluation of the International Legal Programs Section
3. Methodology
The evaluation of the Section draws on four lines of evidence: document review, review of administrative data, key informant interviews with departmental officials and other stakeholders, and case studies. Each of these methods is described more fully below. This section also includes a brief description of the methodological challenges.
The evaluation matrix, which identifies the evaluation questions, indicators and lines of evidence used to guide the study, can be found in Appendix B. The data collection instruments developed to respond to the evaluation matrix are in Appendix C.
3.1. Document Review
An extensive document review was conducted both to inform the development of the data collection instruments and to address a majority of the evaluation questions. The review also provided insight into the operations of the Section. Documents reviewed included Departmental Performance Reports; Reports on Plans and Priorities; the Policy Sector’s Business Plans; legal technical assistance semi-annual and annual project reports to GAC; Government of Canada documents including documents provided for partner countries; media reports relating to the projects; finance and human resources information; foreign policy plans; project performance management frameworks; Administrative Arrangements for the legal technical assistance projects; strategic advice documents; and other supporting documents.
3.2. Administrative Data Review
The evaluation included a review of administrative data from the Department’s iCaseFootnote 1 database for fiscal years 2009-10 to 2013-14, which provided descriptive information about the types of files for which the Section is responsible and the associated level of effort (number of hours).
3.3. Key Informant Interviews
The key informant interviews conducted for this evaluation addressed the majority of the evaluation questions and were a key line of evidence for gathering information about both the relevance and the performance of the Section.
A list of potential key informants was prepared, and interview guides tailored to each key informant group were developed in consultation with the EAC. Potential interviewees received an invitation to participate in an interview. Those who agreed to participate were provided with a copy of the interview guide in the official language of their choice prior to the interview. Each interview was conducted in the respondents’ preferred official language, and key informants were assured of the anonymity of their responses.
Interviews were conducted with a total of 33 key informants, including representatives from the Department, GAC, other federal government departments, beneficiaries of the funded projects, and other partners.
Table 3 below provides a breakdown of the number of key informant interviews by ILPS activity and the response rate, and includes those interviewed as part of the case studies.
ILPS Activities | Suggested # | Participated # | Response Rate (%) | Additional Interviews (Snowball Effect) | Total # Interviewed (Participated + Additional)
---|---|---|---|---|---
Jamaica Project | 7 | 7 | 100% | - | 7
Mexico Project | 5 | 3 | 60% | 5 | 8 [Table note i]
Palestinian Authority Project | 9 | 7 | 78% | 2 | 9 [Table note i]
Strategic Advice | 6 | 4 | 67% | - | 4
Outreach | 6 | 5 | 83% | - | 5
TOTAL | 33 | 26 | 78% | 7 | 33

Table note i: Includes case study interviews.
3.4. Case Studies
The MexicoFootnote 2 and the Palestinian Authority legal technical assistance projects were used as case studies. The purpose of the case studies was to illustrate the factors that either contributed to or constrained the projects' success, and to allow a more in-depth assessment of the projects. The choice of projects for the case studies was made in consultation with the EAC.
The methodology for the case study approach included a detailed document review of project reports, media reports and other supporting documents, followed by key informant interviews with beneficiaries; representatives from GAC associated with the projects selected as case studies; temporary staff who had worked on one of the projects; departmental staff; ILPS staff working on the projects; and other partners involved with the projects, such as the Canadian experts delivering the training and the Canadian embassy. See Table 3 for a detailed breakdown of the key informants. A total of 17 case study interviews were conducted to supplement the documented information; these interviews are included in the total number of interviews reported above. A 'snowball' approach was used, whereby some of the case study interviewees recommended additional interview sources. Seven (7) additional interviews were conducted across the two (2) projects.
A majority (67%) of the interviews, including both key informant and case study interviews, were conducted by telephone or in person; the remaining 33% were conducted in writing, with interviewees completing the interview questionnaire. There were several reasons for this latter approach. For example, as some interviewees spoke neither English nor French, the interview guides were translated into Spanish or Arabic, and the completed responses were then translated into English. In addition, differences in time zones posed scheduling challenges.
3.5. Methodological Limitations
The evaluation faced some methodological limitations, which are discussed by line of evidence below.
Interviews and Case Studies.
The interviews with key informants were subject to possible self-reported response bias and strategic response bias. Self-reported response bias occurs when individuals report on their own activities and may therefore want to portray themselves in the best light. Strategic response bias occurs when participants answer questions with the desire to affect outcomes. The interviews were also subject to possible selection bias, in that the potential key informants were identified by the EAC, which included some ILPS staff members.
For the Mexico – Education and Training of Judges technical assistance sub-project, only judges and magistrates were interviewed. Other recipients (lawyers and prosecutors) from the other two sub-projects of the Mexico project were not interviewed. It was not possible to identify or locate such individuals as the project had ended before the evaluation began. Similarly, it was not possible to identify interviewees from the Ukraine and Turks and Caicos Islands technical assistance projects so follow-up was limited. The Ukraine and Turks and Caicos Islands’ projects had ended in 2012, and the Mexico project had ended in 2013.
The projects for the case studies were not chosen by random selection since the sample of projects within the evaluation period was not large. The two projects that were identified as case studies were considered to provide a good representation of the diversity of the Section’s work and were intended to be illustrative of the Section’s work in technical assistance projects.
iCase Data Review.
Overall, iCase was a useful source of information for the evaluation by providing some insight into the type of activities provided by the Section. However, there were some limitations.
Using the iCase data, it was difficult to accurately determine the amount of technical assistance project implementation work, pure advisory work, and outreach work done by the Section, as staff had entered data inconsistently. For instance, a large portion of the Section's technical assistance project work was entered under more than one iCase category (i.e., corporate, advisory, general, policy), which did not distinguish the nature of the service provided. The same was true of the strategic advisory work, which was also entered under more than one category that did not fully describe the nature of the advisory work. It was therefore difficult to determine with full accuracy what constituted pure advisory work.
In addition, the iCase timekeeping records of temporary employees who came to ILPS on secondment were not distinguished from those of permanent employees. It was therefore difficult to determine the level of effort contributed by this flexible group of temporary employees.
3.6. Mitigation Strategy
The mitigation strategy for the above methodological limitations was to use multiple lines of evidence, including both quantitative and qualitative data collection methods, to answer the evaluation questions. The evaluation gathered information from the Section, the Department, representatives from GAC, other federal government departments, and the beneficiaries and partners of the funded projects, as well as from project-related and other relevant documents and a comprehensive administrative data review (iCase). TriangulatingFootnote 3 the findings from these different sources countered the concern that the study's findings were the result of a single method or source and, at the same time, strengthened the conclusions of the evaluation.