TRANSPARENCY INTERNATIONAL UK 
DEFENCE AND SECURITY PROGRAMME 
 
GOVERNMENT DEFENCE ANTI-CORRUPTION INDEX – METHODOLOGY (LONG) 
 
 
 
Contents 
 
1. Introduction and Definitions 
 
2. Index Scope 
Country Selection 
Stages of the Index 
 
3. Methodology 
The Research Process 
Corruption Risk Typology 
Questions and Model Answers 
Scoring Scheme 
Presentation of Results 
Methodological Reflections 
 
Annex 1: List of Countries Analysed 
 
1. INTRODUCTION AND DEFINITIONS 
 
Transparency International UK’s Defence and Security Programme’s Government Defence Anti-
Corruption Index aims to measure the degree of corruption risk in defence and security ministries and 
armed forces worldwide, and to further Transparency International's innovative work in producing indices that raise awareness of corruption and act as groundbreaking advocacy tools.
 
The definitions we used were: 
 
Corruption: 
For the purpose of this index, we use Transparency International’s definition of corruption: “Corruption 
is the abuse of entrusted power for private gain. It hurts everyone whose life, livelihood or happiness 
depends on the integrity of people in a position of authority.” Corruption in the defence and security 
industries can be conceived in a more nuanced and specific fashion. In section 3 of this document, 
reference is made to the main areas of corruption risk listed in the 2011 TI Defence Handbook, and these categories were used to drive question development.i These categories are grounded in solid research, are specific to the defence and security sector, and allow consideration of various types of corruption, including embezzlement, fraud, abuse of political position, nepotism, bribery, and misuse of budgets.
 
Corruption Risk: 
This refers to the probability that defence and security corruption might occur, together with the potential cost associated with that corruption. It thus reflects both the likelihood that such loss, whether monetary, social, or political, will arise, and the magnitude of that cost when it occurs.
Increased risk means higher potential for corruption or higher potential cost or both; decreased risk 
means lower potential for corruption or lower potential cost or both. Governments, though not entirely 
able to stem corruption, are important actors with the ability to influence levels of corruption risk through 
policy, institutions, and approach. 
 
Country: 
In this index, the term “country” refers to the defence and security ministries and armed forces associated with that country – the key focus of the index. It also extends to other government institutions if they have the potential to influence levels of corruption risk in the sector. Ministries of finance, presidential institutions, parliaments, auditors general, and public procurement authorities may often be considered to harbour this potential.
 
2. INDEX SCOPE 
 
Country Selection 
 
82 countries were assessed, selected based on various criteria: 
1. Aggregate arms exports volumes, 2001-2010.ii 
2. Aggregate arms imports volumes, 2001-2010.iii 
3. Size of military (absolute terms), 2010.iv 
4. Size of military (per capita), 2010.v 
5. Size of police force (absolute terms), 2003-2008.vi 
6. Size of police force (per capita), 2003-2008.vii 
7. Geographical diversity across the selection. 
 
Three subjective criteria were also applied to determine the final country selection: the selection had to exhibit geographic diversity, the selected countries had to be likely to yield substantive findings, and sufficient research resources had to be available to analyse each country selected.
 
A list of the countries analysed can be found in Annex 1. 
 
Stages of the Index 
 
Preliminary Test Period: 
The preliminary test period ran between May and July 2011. During this stage: 
1. Expert internal and external feedback was collated on the evolving question and model answer document (QMA) used to inform the index. Feedback was sought from
domain experts worldwide, to ensure the relevance of questions to countries with diverse 
political and cultural characteristics. 
2. ‘Live’ trial assessments were commissioned, where the evolving QMA was sent to trial 
assessors to provide an initial indication of how well the questions stood up to research in 
seven sample countries. Outputs were highly instructive in informing the developing methodology, though results were not published.
3. Discussions with potential partners were initiated and Global Integrity was chosen as a key 
partner. They provided expert advice on the evolving methodology and we chose to use 
Indaba, their software tool designed to automate field work workflows and to enable collection 
and analysis of results. We were also granted exclusive access to Global Integrity’s network of 
contacts, which proved highly useful in accessing potential country assessors. 
 
Stage 1: 
Stage 1 ran from July to November 2011. Its purpose was to prepare for the assessment of the wider set of countries that would form the output of the index. During this stage:
1. A sample of six countries was assessed in order to stress-test and, where necessary, improve the QMA and methodology. Results were not initially intended to be published, but the quality of the assessments, and the fact that they could be adjusted to be consistent with the subsequent stages, later enabled us to include them.viii
2. The software programme Indaba was used to manage the workflow, and associated problems 
or concerns were identified and solved in preparation for Stage 2. 
3. The QMA was further exposed to independent review and comment by interested and informed 
parties, and was appropriately modified and finalised. A four-strong Academic Advisory Committee, comprising specialists with either defence or methodological expertise, was recruited and engaged to help refine and finalise the methodology.ix
 
Stage 2: 
This stage, which ran from October 2011 to February 2012, saw the assessment of a set of 25 
countries using the finalised methodology and QMA. 
 
Stage 3: 
This stage, which ran from February to October 2012, saw the assessment of the remaining 51 
countries using the finalised methodology and QMA. 
 
 
3. METHODOLOGY 
 
The Research Process 
 
For each country, the QMA was answered by a lead Country Assessor: an expert with proven experience in field work and comprehensive knowledge of the country being analysed, together with a solid grasp of the issue of corruption, excellent knowledge of the country's defence and security establishment, or both. Where possible, we sought in-country experts well placed to assess the tone of the country and arrange in-country interviews.
 
Two peer reviewers, similarly qualified, were recruited to examine, critique, and comment on the 
Country Assessor’s scores and the justifications for them. A Government Reviewer was, where 
possible, appointed to comment on the assessor's responses. Any promises or commitments made by the government during this review can be followed up in repeated waves of research, contributing to the potential for the index to be used as a tool for reform.x Finally, the relevant TI National Chapter was
given the opportunity to comment on the Country Assessor’s research, and on particularly instructive 
peer reviewer comments and Government Reviewer comments.xi The research process is summarised 
diagrammatically in Figure A. 
 
Figure A: The Research Process 
 
 
 
 
A panel of experts centred on the TI-DSP team was convened to review the output of the assessors'
and reviewers’ research on an ongoing basis, with particular regard to the consistency, accuracy, clarity 
and completeness of the responses across the nations being analysed. This panel helped standardise 
the numerical scores assigned to each country and consolidate the outputs of the research undertaken.
It should be emphasised, therefore, that ultimate ownership and responsibility for the finalised scores 
and bands rests with TI-DSP. 
 
Corruption Risk Typology 
 
The framework for the index QMA is taken from the corruption typology shown in TI’s 2011 Defence 
Handbook.xii Questions were therefore structured around the five key areas identified in the typology, 
and the corruption risk associated with the various sub-topics clustered in each area: 
 
1) Political 
a. Defence and Security Policy 
b. Defence Budgets 
c. Nexus of Defence and National Assets 
d. Organised Crime 
e. Control of Intelligence Services 
f. Export Controls 
2) Finance 
a. Asset Disposals 
b. Secret Budgets 
c. Military-Owned Businesses 
d. Illegal Private Enterprises 
3) Personnel 
a. Leadership Behaviour 
b. Payroll, Promotions, Appointments, Rewards 
c. Conscription 
d. Salary Chain 
e. Values and Standards 
f. Small Bribes 
4) Operations 
a. Disregard of Corruption In Country 
b. Corruption within Mission 
c. Contracting 
d. Private Security Companies 
5) Procurement 
a. Technical Requirements / Specifications 
b. Single Sourcing 
c. Agents / Brokers 
d. Collusive Bidders 
e. Financing Package 
f. Offsets 
g. Contract Award, Delivery 
h. Subcontractors 
i. Seller Influence 
 
Questions and Model Answers 
 
The question and model answer (QMA) document for the index, separately published, was structured 
around the framework outlined above. The reach of the questions was purposefully holistic, not centred 
on any one type of activity that might help stem corruption in the area, but on a range of government 
activities, spanning public commitment to anti-corruption, institutional deterrents, and transparent and 
effective legislation / regulation. 
 
The questions and model answers were drawn up to enhance the defensibility and credibility of 
particular scores, and to promote reliability. Where value-laden terms were used, assessors were 
requested to define them in the context of the country they were analysing. 
 
Where assessors were unable to answer a question because the available information was insufficient to match a model answer, they were encouraged to make a reasoned assumption in order to provide a score. In
such circumstances, assessors were asked to consider the wider social and political context of the 
country, and any broader corruption problem that may exist, in order to defend a score in preference to 
leaving a question blank. The justifications and references sections for such answers were, of course, 
subject to peer review, as were all other responses. 
 
Scoring Scheme 
 
We used a five-point scoring scale. The model answers pertaining to each score are highly specific to each question, but the principles underlying each score are, generally, as follows:
4 = High transparency; strong, institutionalised activity to address corruption risks in defence and 
security. 
3 = Generally high transparency; activity to address corruption risks, but with shortcomings. 
2 = Moderate transparency; some activity to address corruption risk, but with significant 
shortcomings. 
1 = Generally low transparency; weak activity to address corruption risk. 
0 = Low transparency; very weak or no activity to address corruption risk. 
 
Presentation of Results 
 
On our website, the results for each country and question are presented in the following forms, beginning with a detailed 'country assessment':
 
Country Assessments: 
Full publication, for each question, of: 
1. A country’s score for each question asked. 
2. The reasoning and narrative justification for the score. 
3. The sources of information from which the score has been derived. 
4. Peer reviewers’ comments, the government reviewer’s comments, and the TI National 
Chapter reviewers’ comments, as appropriate. 
 
Country Summaries: 
Adjoining the detailed country assessment described above is a short qualitative analysis of the
country’s results, making reference to areas in which corruption risk is handled particularly well, and 
where it can be improved. Themes that help explain consistency or deviation in scores are identified 
and discussed. 
 
Country bands: 
The aggregate band associated with the country's overall score. The banding runs from A to F, according to the percentage of marks awarded across the questionnaire. The six bands split the range of marks equally. The table below summarises the banding schema and reiterates the question scoring principles.
 
Scoring Principles and Banding Brackets

QUESTION SCORING PRINCIPLES:
4 = High transparency; strong, institutionalised activity to address corruption risks.
3 = Generally high transparency; activity to address corruption risks, but with shortcomings.
2 = Moderate transparency; activity to address corruption risk with significant shortcomings.
1 = Generally low transparency; weak activity to address corruption risk.
0 = Low transparency; very weak or no activity to address corruption risk.

BANDING BRACKETS:
BAND   LOWER SCORE (%)   HIGHER SCORE (%)   CORRUPTION RISK
A      83.3              100                Very Low
B      66.7              83.2               Low
C      50.0              66.6               Moderate
D      33.3              49.9               High
E      16.7              33.2               Very High
F      0                 16.6               Critical
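To make the banding calculation concrete, the sketch below (in Python; the function and variable names are illustrative assumptions rather than part of the published methodology) converts a country's 0-4 question scores into an overall percentage of marks and looks up the corresponding band and corruption risk level from the brackets above.

# Minimal sketch of the banding logic described above; names are illustrative only.
BANDS = [
    # (band, lower score %, higher score %, corruption risk)
    ("A", 83.3, 100.0, "Very Low"),
    ("B", 66.7, 83.2, "Low"),
    ("C", 50.0, 66.6, "Moderate"),
    ("D", 33.3, 49.9, "High"),
    ("E", 16.7, 33.2, "Very High"),
    ("F", 0.0, 16.6, "Critical"),
]

def overall_percentage(question_scores):
    # Each question is scored 0-4, so the maximum is 4 marks per question.
    return 100.0 * sum(question_scores) / (4 * len(question_scores))

def band_for(percentage):
    # The brackets are contiguous, so the first band whose lower bound does not
    # exceed the percentage applies (exact boundary handling is an assumption).
    for band, lower, _, risk in BANDS:
        if percentage >= lower:
            return band, risk
    return "F", "Critical"

# Example: a country scoring 3 on every one of the 77 questions earns 75% of
# the available marks and falls into band B (low corruption risk).
print(band_for(overall_percentage([3] * 77)))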
 
Scores by risk area: 
For each of the five key corruption risk areas in defence and security, 'integrity scores' are presented. These show more precisely where countries are doing particularly well, and where their reform efforts might best be targeted. They are calculated simply, as the percentage of marks awarded in the questions pertaining to each risk area.
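As a worked illustration of that calculation, the short sketch below (Python again; the grouping of scores by area is hypothetical, since the actual question-to-area mapping is defined in the separately published QMA) computes an integrity score for a single risk area.

def integrity_score(area_question_scores):
    # Each question is marked 0-4; the integrity score is the share of the
    # marks available within the risk area that were actually awarded.
    return 100.0 * sum(area_question_scores) / (4 * len(area_question_scores))

# Hypothetical example: a risk area covered by nine questions.
print(round(integrity_score([4, 3, 3, 2, 4, 3, 2, 3, 4]), 1))  # 77.8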
 
Hardcopy reports: 
An overall hardcopy report summarising the key index results is available, and a hardcopy report
specific to the Middle East and North Africa (MENA) will be available in due course. 
 
Methodological Reflections 
 
A Hybrid Index: 
1. The Government Defence Anti-Corruption Index is a hybrid index that spans the numerical 
indicator – descriptive indicator dichotomy identified by authors who classify corruption 
indices.xiii By combining quantitative scoring with qualitative justification for scores, the 
Government Defence Anti-Corruption Index provides a data repository for statistical analysis 
and comparison, alongside qualitative evidence enabling in-depth study of corruption risk by 
country, region, and risk area. 
2. The index is also hybrid in that it invites analysis of both the de jure and de facto circumstances
simultaneously, so long as the subtlety of the reasoning is explained. 
3. Finally, the index is hybrid because it combines independent, critical analysis with the
potential for dialogue and discussion with government reviewers. It aims to provide objective 
assessment while allowing for government input and argument. 
 
Meeting Diagnostic Criteria: 
Three criteria have been proposed to help judge the measurement quality of a corruption index.xiv The 
Government Defence Anti-Corruption Index has methodological characteristics in place that ensure 
each criterion is addressed: 
1. Reliability: is the index likely to return consistent results across different assessors and in 
repeated waves? The use of a QMA with detailed model answers ensures that assessors look 
for the same characteristics across countries when they assess a government’s ability to tackle 
each particular corruption risk associated with a question. This is further enhanced by the use 
of objective questions and answers where appropriate. The use of a set of reviewers, finally, 
helps ensure the research is robust and the scoring is standardised. Such defences help 
ensure that any dramatic change in scores over time—if the index is repeated—signals a 
genuine transformation in circumstances and / or efforts. 
2. Validity: does the index tap what we want it to tap? The question set is underpinned by a
theoretical typology that demonstrates our view of where corruption risk in the sector lies. The 
use of model answers ensures that all researchers are attuned to the specific high risk 
categories as defined by TI-DSP. Furthermore, in having to explain their scoring, assessors 
make clear how their reasoning helps shed light on the potential corruption risk explicitly or 
implicitly identified by the question. 
3. Precision: does the index produce usefully detailed results? The index produces nuanced scores relating to the five key risk areas, and each of the 77 questions for each of the 82 countries is
accompanied by explanation and a list of sources. Practitioners and analysts have access to a 
wealth of data and accompanying explanatory material that can be analysed and studied in 
depth by region, country, cluster of countries, corruption risk area, and individual question. 
 
 
For further questions on the methodology, feel free to contact the team at Transparency 
International UK’s Defence and Security Programme who worked on the Government Defence 
Anti-Corruption Index. 
 
Email: gi@transparency.org.uk 
 
 
 
 
 
 
 
i Transparency International Defence & Security Programme (2011) Building Integrity and Countering Corruption in Defence and Security: 
20 Practical Reforms. Available online from: http://www.defenceagainstcorruption.org/publications, p. 10. 
ii Source: SIPRI Arms Transfers Database. http://www.sipri.org/databases/armstransfers. Accessed 23/11/11. 
iii Source: SIPRI Arms Transfers Database. http://www.sipri.org/databases/armstransfers. Accessed 23/11/11. 
iv Source: International Institute for Strategic Studies; Hackett, James (ed.) (2010-02-03). The Military Balance 2010. London: Routledge. 
v Source: International Institute for Strategic Studies; Hackett, James (ed.) (2010-02-03). The Military Balance 2010. London: Routledge. 
vi Source: United Nations Office on Drugs and Crime. http://www.unodc.org/unodc/en/data-and-analysis/statistics/data.html. Accessed 
24/11/11. 
vii Source: United Nations Office on Drugs and Crime. http://www.unodc.org/unodc/en/data-and-analysis/statistics/data.html. Accessed 
24/11/11. 
viii To facilitate this possibility, and to ensure comparability with the country assessments in future waves, the Stage 1 scoring and 
comments were reviewed and edited so they directly answered the questions of the slightly adjusted Stage 2 QMA, yet remained entirely 
consistent with the researchers’ judgements. We found that the changes to comments and scoring were minimal. One notable difference, 
however, is that Stage 1 assessments include the deleted question 33, and omit questions 12A and 12B. From extensive analysis, we 
were satisfied that the information relating to the questions omitted and included in future stages was effectively captured in the Stage 1
responses, and therefore the stages were compatible. Furthermore, to understand the potential effect of the slight difference in weighting 
between the two versions of the QMA, testing for each country was carried out to ensure the banding and integrity scores would not have 
been affected. We briefed all researchers, chapters, and government contacts in February 2012 with confirmation that we would be 
publishing these results, and as a courtesy allowed them the opportunity to view the finalised assessments. As the governments of Stage 
1 were initially under the impression that the assessments would not be published, we re-invited those governments who did not contribute to the research to submit a report in response, and suggested to those governments who did contribute to the research that we would be
willing to publish on our website a weblink to a response on their own website. 
ix The committee comprised: Indira Carr, University of Surrey; Peter Foot, Geneva Centre for Security Policy; Michael Macaulay, Teesside 
University; Pat Harned, Ethics Resource Center. 
x Unfortunately, obtaining government nominees is a time-consuming exercise, if a response comes at all. We therefore attempted to brief governments at very early stages of the research process. Where we did not have a response in time, and had reached the point where it was logistically necessary to move on to avoid a serious delay, we proceeded without government input. If, after this point, details of a
government nominee are obtained, we provide them with access to the finalised assessment and give them the opportunity to submit a 
written report by way of reply to be published in tandem with the research. Where governments have submitted a review or a report but 
wish to provide further evidence, we will provide a weblink on our own website to their website. 
xi Transparency International National Chapters contain much expertise on countries' corruption profiles, and were often best positioned
and appropriately resourced to conduct the full assessment themselves. We consequently asked National Chapters if they would like to 
conduct the assessment themselves, and 35% of chapters whose countries we examined took up this offer. Added to the chapters who 
agreed to conduct a review, we had an overall chapter participation rate of 83% for those countries included who have a chapter. 
xii Transparency International Defence & Security Programme (2011) Building Integrity and Countering Corruption in Defence and 
Security: 20 Practical Reforms. Available online from: http://www.defenceagainstcorruption.org/publications, p. 10. 
xiii See, for example: Doig, A., McIvor, S. and Theobald, R. (2006) Numbers, nuances, and moving targets: converging the use of
corruption indicators or descriptors in assessing state development, International Review of Administrative Sciences, 72(2). 
xiv Johnston, M. (2009) Components of Integrity: Data and Benchmarks for Tracking Trends in Government, OECD Paper 
GOV/PGC/GF(2009)2, accessed 16 March 2012 from: 
http://www.oecd.org/officialdocuments/displaydocumentpdf?cote=GOV/PGC/GF(2009)2&doclanguage=en 
 