CIPT® Sample Questions
An IAPP Publication v5.1

About the IAPP CIPT Sample Questions

The IAPP CIPT Sample Questions are designed to support your preparation for the CIPT certification exam. Developed using IAPP study resources as well as subject matter experts' practical knowledge of the topics set forth in the IAPP's CIPT Body of Knowledge, the sample questions can help identify your relative strengths and weaknesses in the major domains of the CIPT Body of Knowledge. All items on the IAPP CIPT Sample Questions were reviewed for accuracy at the time of publication.

The IAPP CIPT Sample Questions were developed independently of the CIPT certification exam and are not intended to represent actual CIPT certification exam content. Your performance on the IAPP CIPT Sample Questions is not a predictor of your performance on the CIPT certification exam.

Do you have questions or comments? Please contact us at training@iapp.org.

The CIPT Sample Questions and references may not be reproduced in any manner other than for use by the original purchaser. CIPP, CIPP/US, CIPP/C, CIPP/E, CIPP/G, CIPM, CIPT and IAPP are registered trademarks of the International Association of Privacy Professionals, Inc. registered in the U.S. CIPP, CIPP/E, CIPM, CIPT and IAPP are also registered in the EU as Community Trademarks (CTM). IAPP is a registered trademark in Brazil.

© 2022 by the International Association of Privacy Professionals (IAPP). All rights reserved. No part of this publication may be reproduced, stored in a retrieval system or transmitted in any form or by any means, mechanical, photocopying, recording or otherwise, without the prior, written permission of the publisher, International Association of Privacy Professionals, Pease International Tradeport, 75 Rochester Ave., Portsmouth, NH 03801, United States of America.

Instructions

1.
Remove/Print a copy of the Answer Sheet.
2. To simulate a timed test, set a timer for 40 minutes.
3. Complete the test without referring to the Answer Key or References.
4. Check your answers against the Answer Key.
5. For each correct response, write a "1" in the corresponding domain column of the Answer Key.
6. Add up the number of correct answers under each domain column.
7. To compare how you did in each domain, calculate your scores as a percent:
   a. Divide the number of correct answers by the total number of questions in that domain.
   b. Multiply that number by 100.
8. Consult the References for detailed explanations of each answer and the section of the Body of Knowledge to which the question relates.

CIPT Sample Questions

1. Which of the following may pose a "client side" privacy risk?
A. An employee loading personal data on a company laptop.
B. Failure of a firewall that is protecting the company's network.
C. A distributed denial of service (DDoS) attack on the organization.
D. A remote employee placing communication software on a company server.

2. You are browsing the web and shopping for new patio furniture. You then open your favorite social media app and begin to scroll through the posts. While doing so, you start noticing ads for patio furniture. This is an example of what?
A. Direct marketing.
B. Individual advertising.
C. Behavioral advertising.
D. Indirect marketing.

3. Which of the following privacy practices would be most useful to users who are not knowledgeable about protecting their personal information?
A. Choice.
B. Control.
C. Notice.
D. Consent.

4. Which of the following privacy-related principles would be the main concern during the data usage stage of the data life cycle?
A. Transparency.
B. Data minimization.
C. Storage limitation.
D. Purpose limitation.

5.
Under the EU's General Data Protection Regulation (GDPR), which of the following types of information would NOT require notification to a supervisory authority in the event of a personal data breach?
A. Pseudonymized data.
B. Anonymized data.
C. Reidentified data.
D. Deidentified data.

6. Authentication can be accomplished by a variety of mechanisms. Which are the four main categories?
A. What you know, when you know, where you are, what you are.
B. What you know, what you have, when you know, where you are.
C. What you know, what you have, where you are, what you are.
D. What you know, what you have, where you are, when you know.

7. The acronym PGP stands for:
A. Protect Good People.
B. Planned Group Protocol.
C. Pretty Good Privacy.
D. Parsed Group Protocol.

8. Julie needs to securely transfer a file containing personal data to Katlyn. They decide to use asymmetric encryption. Select the correct steps that Julie and Katlyn should use.
A. Julie encrypts the file using her own private key and Katlyn decrypts the file using her own public key.
B. Julie encrypts the file using Katlyn's public key and Katlyn decrypts the file using her own private key.
C. Julie encrypts the file using her own public key and Katlyn decrypts the file using Julie's public key.
D. Julie encrypts the file using Katlyn's private key and Katlyn decrypts the file using Julie's public key.

9. When purchasing a product from TripeType's website, a customer must enter basic information into a purchase form. A link to TripeType's privacy statement is provided on the purchase form. However, it does not disclose that it will use personal information for other purposes. The statement provides that TripeType will store the customer information in its database. A month later, TripeType's sales team wants to generate new leads and decides to use the information collected from customers.
This is an example of what?
A. Secondary use.
B. Involuntary use.
C. Disapproved use.
D. Selective use.

10. Which of the following explains why it is difficult to regulate what individually identifiable data is?
A. Many people mistakenly expose personal information online.
B. Personal information means different things to different people.
C. Most legislative bodies are hesitant to enact laws about identifiable data.
D. Data that is not overtly identifiable can be combined to identify individuals.

11. Ubiquitous computing can raise significant concerns about the sheer volume of data that can be collected by a system. Each of the following are necessary considerations when utilizing a data collection process that falls into this category EXCEPT which?
A. The system should provide end-users with both feedback and control.
B. The system should have obvious value.
C. The retention of data by the system should be limited.
D. The data collected by the system should be aggregated and made available to all users.

12. In creating a registration form for a mobile app directed at grade school children, what privacy engineering objective is addressed by asking for grade level instead of date of birth?
A. Disassociability.
B. Manageability.
C. Security.
D. Predictability.

13. Which of the following is NOT an example of automated decision-making?
A. Receiving an answer to a support question utilizing a chatbot.
B. Obtaining approval for insurance through an online application.
C. Requesting an emailed catalog from an online retailer.
D. Setting airfare based on browser history and date of purchase.

14. Which of the following circumstances would best be addressed by utilizing radio frequency identification (RFID) technology?
A. An organization has a high error rate for entering credit card data into its POS system.
B.
An organization requires two-way communication between its discoverable devices.
C. An organization needs to develop an encryption-supported network.
D. An organization's inventory process is taking too long.

15. What type of interference occurs when false or inaccurate information on a credit application results in denial of credit?
A. Decisional.
B. Intrusion.
C. Disclosure.
D. Appropriation.

16. Which of the following is an objective for privacy engineering?
A. Encryption.
B. Anonymization.
C. Manageability.
D. Audit.

17. Which of the following technologies allows individuals to participate in a salary survey without revealing the specific salary or personal information of any of the participants?
A. Secure multiparty computation.
B. Digital rights management.
C. Ciphertext.
D. Homomorphic encryption.

18. An organization wants to enter into a contract with a third-party cloud provider for storage of client personal information. The business head is entering into this agreement to eliminate risk associated with a data breach by transferring the information to the third-party processor. She asks you if this is a good way to eliminate breach risk. Please choose the best response from the choices below.
A. Third party processors have sole liability for the data they process, because the data is in their possession. We can rely on the security program of the third party since they did not report a data breach in the previous 12 months.
B. Under most privacy and data protection laws, following a data breach, an organization retains liability for personal data that it has collected and transferred to third party processors. Third party processors may share liability for the breach as well. We should routinely validate data protection controls of third parties we are doing business with to make sure our client data is protected properly.
C.
Organizations can transfer data to a third party to avert all liability for damages resulting from a data breach. We can use contract language to eliminate the need for third party due diligence.
D. Organizations can only be liable for data breaches if an individual brings a lawsuit. A government agency is able to investigate the organization but can only issue orders to the organization to correct deficiencies in the information security program. Money penalties are only available to individual plaintiffs, or as a result of a class action.

19. When creating a data inventory, it is important to include a range of detailed information on the company's data assets. This information should include how the data is accessed and by whom, how the data is managed, who owns it, where the data is stored, and the ____________ that defines the individual data records and what they contain.
A. Structured data.
B. Schema.
C. Metadata.
D. Dictionary.

20. Testing during software development generally consists of which two sets of activities?
A. Implementation and deployment.
B. Alpha and beta testing.
C. Validation and verification.
D. Runtime monitoring and auditing.

21. A marketing lead has collected a large data set of personal information and stored it in a shared folder. The marketing lead controls who has access to the shared folder. The type of access control being used is:
A. Discretionary.
B. Mandatory.
C. Attribute-Based.
D. Rule-Based.

22. Vulnerability is determined by what two factors?
A. Probability and confidentiality.
B. Capability and portability.
C. Confidentiality and integrity.
D. Capability and probability.

23. Low-level design concerns the details of the overall design of the system and focuses on improving the quality of programming practices through each of the following mechanisms EXCEPT:
A. Information hiding.
B. Threat modeling.
C.
Reusing existing standard API libraries.
D. Loose coupling.

SCENARIO
Use the following to answer questions 24-25:

You have been tasked with developing an incident response process for your employer, BrandEnt Company, a media entertainment company. As the senior manager of information privacy, you have been creating privacy-related procedures for the company. There has been an uptick in the number of privacy-related questions being sent to customer service through the website's generic portal, and the customer service reps are unsure of what to do with the questions. This has led to the director of privacy asking that you work with the IT department to identify, track and resolve privacy-related incidents, as well as with the Information Security team to leverage their existing incident-management process.

As you review the questions, you notice that many customers are asking what personal information BrandEnt has collected about them. You grow concerned as you notice that customer service representatives are not always responding to these inquiries. The website doesn't have a portal dedicated to asking privacy-related questions; instead, a general customer service portal form is being used. This form only requests the customer's name and email address. The site does not require authentication to get to this portal. For inquiries that have been processed, the customer service representatives compressed all data collected regarding the individual into a file and sent it to the email address provided.

You reach out to the Information Security team to request access to their incident ticketing system to determine if the existing process can be leveraged. As you review the incident tickets, you notice several security incidents related to data breaches.
After speaking with the Information Security team lead, you learn that the tickets were closed after the vulnerabilities were patched and the system owners were notified.

24. Which common privacy principle is missing at BrandEnt?
A. Use limitation.
B. Collection limitation.
C. Security safeguard.
D. Data quality.

25. What follow-up should be done regarding the data breaches that have already occurred?
A. Review the fixes applied for the vulnerability and verify that they were applied on all affected systems.
B. Review the information that was breached and determine what levels of notification are required.
C. Send a notification to all customers notifying them of the breach.
D. Nothing is required, as the security team reviewed and closed the incident.

(End of Sample Questions)

References

1. The correct answer is A. Body of Knowledge Domain V(C): Privacy Engineering (Privacy Design Patterns)
Client-server architecture describes the relationship between the client, which is typically a program that runs on a local computer, and the server, which is the program that runs on a remote computer. One advantage of this architectural paradigm is that it allows the service to store computer data on the client side for the purpose of completing transactions. Storing data on the client may introduce privacy risks if the client data is insecure, or if the storage of this data is not clear to the user and serves the purpose of surveillance or identification when the user would otherwise prefer to remain anonymous.

2. The correct answer is C. Body of Knowledge Domain III(D): Privacy Threats and Violations (Intrusion, Decisional Interference and Self Representation)
Behavioral advertising is typically utilized online and is the practice of targeting advertisements using a profile of an individual based on the websites they visit over time.
Online advertisers collect information about a user's browsing behavior and then present ads based on that data. Concerns around the potential for discriminatory practices and other misuse of the data contained in the users' profiles have caused behavioral advertising to be the subject of significant scrutiny and debate.

3. The correct answer is C. Body of Knowledge Domain I(A): Foundation Principles (Privacy Risk Models and Frameworks)
Too often privacy notices are not readable, people do not understand what they consent to, and people are not aware of certain data practices or the privacy settings or controls available to them. The challenge is that an emphasis on meeting legal and regulatory obligations is not sufficient to create privacy interfaces that are usable and useful for users. Usable means that people can find, understand and successfully use provided privacy information and controls. Useful means that privacy information and controls align with users' needs with respect to making privacy-related decisions and managing their privacy.

4. The correct answer is D. Body of Knowledge Domain I(D): Foundation Principles (The Data Life Cycle)
The principle of purpose limitation states that personal information should not be disclosed, made available or otherwise used for purposes other than those disclosed at the time of collection (unless necessary to comply with the law) without the express consent of the individual. For example, when an employee provides their home address for payroll purposes, it should not be disclosed to others for a holiday card exchange without the employee's express consent. Implementing data governance policies to manage the flow of information throughout the life cycle can help to mitigate the risk of improper use of personal information.

5. The correct answer is B. Body of Knowledge Domain II(B): The Role of IT in Privacy (Information Security)
Anonymous information, under the GDPR, is personal data that does not relate to an identified or identifiable natural person or which has been rendered anonymous in such a manner that the data subject is not or is no longer identifiable. This type of data is not subject to the GDPR; therefore, the breach notification requirements would not apply. However, when anonymizing data, data technologists must test the result to ensure that the individual cannot be reidentified through any means; otherwise, the data is considered pseudonymized.

6. The correct answer is C. Body of Knowledge Domain IV(B): Technical Measures and Privacy Enhancing Technologies (Techniques)
Authentication mechanisms fall into four main categories:
1. What you know—secret knowledge held only by the individual corresponding to the identity;
2. What you have—authentication requires an object possessed by the individual;
3. Where you are—the location matches the expected location; and
4. What you are—biometric data from the individual.
These mechanisms vary in terms of their ability to be defeated, ease of use, cost and privacy implications.

7. The correct answer is C. Body of Knowledge Domain IV(B): Technical Measures and Privacy Enhancing Technologies (Techniques)
Pretty Good Privacy (PGP) is an alternative to the S/MIME protocol for encrypting email. PGP uses a model called the Web of Trust for its public key infrastructure. In this model, individuals create their own public keys and then publish them either on a web page or by uploading the key to the PGP key server.

8. The correct answer is B. Body of Knowledge Domain IV(B): Technical Measures and Privacy Enhancing Technologies (Techniques)
Asymmetric encryption algorithms use one key to encrypt data and a second key to decrypt the data.
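This two-key relationship, and the exchange described in question 8, can be sketched with a textbook RSA example in Python. This is an illustrative toy with tiny primes, not part of the IAPP materials; real systems use vetted cryptographic libraries, large keys and padding.

```python
# Toy RSA-style illustration (NOT secure: real systems use 2048-bit
# keys and padding via a vetted library).
# Katlyn's key pair, built from two small primes.
p, q = 61, 53
n = p * q                      # modulus, part of both keys
e = 17                         # public exponent -> public key (n, e)
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)            # private exponent -> private key (n, d)

message = 65                   # one byte of Julie's file, as an integer < n

# Julie encrypts with Katlyn's PUBLIC key...
ciphertext = pow(message, e, n)
# ...and Katlyn decrypts with her matching PRIVATE key.
recovered = pow(ciphertext, d, n)

assert recovered == message
print(ciphertext, recovered)   # prints 2790 65
```

Encrypting with the recipient's public key (option B) works because only the matching private key can invert the operation; encrypting with Julie's own keys, as in the other options, would let anyone holding the public key read the file.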
These keys are typically called the public and private keys, and as a result asymmetric cryptography is frequently called public key cryptography because one key is made publicly available and the other key is kept private. Since Katlyn will be receiving the file, she will need to be able to decrypt it. Therefore, she provides Julie with her public key to encrypt the file. Once Julie sends it to Katlyn, Katlyn will use her matching private key to decrypt the file.

9. The correct answer is A. Body of Knowledge Domain III(B): Privacy Threats and Violations (During Use)
Secondary use involves using an individual's information without consent for purposes unrelated to the original reasons for which it was collected. This is in violation of some privacy laws and regulations, including the European General Data Protection Regulation (GDPR). Additionally, it conflicts with the principle of purpose specification, which states, in part, that use of personal information should be limited to the fulfillment of the purpose for which that information is collected.

10. The correct answer is D. Body of Knowledge Domain III(B): Privacy Threats and Violations (During Use)
While data that is not about an individual is obviously not covered, and data that is individually identified clearly is, there is a large middle area of data that is not overtly identified but may be identifiable. Little guidance has been given as to exactly how much information is needed to cross the threshold into individually identifiable data.

11. The correct answer is D. Body of Knowledge Domain VII(D): Technology Challenges for Privacy (Ubiquitous Computing)
Ubiquitous computing refers to the transition of computing from purpose-built, independent computing devices, such as a traditional laptop computer, to computing that occurs at all times and in all places, such as that seen in smart cities and internet-of-things devices.
As computing occurs ubiquitously, the types and amount of data that can be collected at scale raise important concerns about privacy, tracking and surveillance. Important requirements for end-user acceptance of this paradigm from a privacy perspective include ensuring: the system provides end-users with both feedback and control, the system has obvious value, and the retention of data is limited.

12. The correct answer is A. Body of Knowledge Domain V(B): Privacy Engineering (Privacy Engineering Objectives)
Disassociability is the minimization of connections between data and individuals to the extent compatible with system operational requirements. This minimization can take many forms, from maximally disassociated data in the form of high-level aggregated data, for example, to de-identified records pertaining to distinct individuals. Disassociability can also take the form of architectural data separation, in which identifiable personal information is kept segregated from, but still linkable to, transactional data.

13. The correct answer is C. Body of Knowledge Domain VII(A): Technology Challenges for Privacy (Automated Decision-Making)
Automated decision-making is the process of making a decision by automated means without any human involvement, usually done by a computer algorithm. These decisions can be based on factual data, as well as on digitally created profiles or inferred data. Automated decision-making can be very useful for organizations and may also benefit individuals. It can lead to quicker and more consistent decisions, particularly in cases where a very large volume of data needs to be analyzed and decisions made very quickly. However, there are risks with automated decision-making. Examples of some of the risks:
• Automated decision-making uses collected data, in most cases large amounts of collected data; the more data an organization has, the higher the risk.
• People may not understand how automated decision-making works or how it affects them.
• Automated decision-making may result in unintentional profiling by an organization, which individuals may not expect and which the organization has not disclosed.

14. The correct answer is D. Body of Knowledge Domain VII(B): Technology Challenges for Privacy (Tracking and Surveillance)
RFID chips are tiny microchips that can be as small as a fraction of a millimeter. Each microchip is identified by a unique serial number and contains an antenna with which it transmits information, such as its serial number, to an RFID reader. RFID chips can be placed on products or cards…for tracking purposes. They are commonly used in supply chain management to allow companies to track inventory.

15. The correct answer is A. Body of Knowledge Domain III(D): Privacy Threats and Violations (Intrusion, Decisional Interference and Self Representation)
Decisional interference is any action by an external party, such as a government or commercial entity, that interferes with decisions that affect a person's daily life, such as whether to borrow money or to obtain lawful employment. Since this form of interference can be concealed in complex business or IT processes, it may not be easily recognizable by the person about whom the decision is made, as they may be unaware of the decision-making process, or the decision itself may be hidden from them. Steps the privacy technologist can take:
• Include cross-checks for accuracy when information is transferred from a manual form into an electronic form.
• Ensure that backup storage mechanisms allow for updating information.
• Include individuals in the review of their information.

16. The correct answer is C.
Body of Knowledge Domain V(B): Privacy Engineering (Privacy Engineering Objectives)
NIST's Privacy Engineering Program has proposed three privacy engineering objectives intended to be the privacy version of the traditional security objectives of confidentiality, integrity and availability (C-I-A). Predictability aims to enable reliable assumptions about a system, particularly its data and the processing of that data, by all stakeholders. Manageability refers to the ability to granularly administer personal information, including modification, disclosure and deletion. Disassociability is the minimization of connections between data and individuals to the extent compatible with system operational requirements.

17. The correct answer is A. Body of Knowledge Domain IV(B): Technical Measures and Privacy Enhancing Technologies (Techniques)
Secure multiparty computation allows two or more computers to participate in a computation and compute a mathematical result without otherwise revealing private information. For example, five people can get together and compute their average salaries without revealing individual salaries to anyone; not to each other and not to a third party.

18. The correct answer is B. Body of Knowledge Domain II(B): The Role of IT in Privacy (Information Security)
Transfer the risk—if there are other entities that can do a better job managing the risk, transferring the risk may be the best option. For example, using third-party services that can manage payroll, payment and other financial services using high privacy and security standards may be preferable to developing an equivalent system from the ground up. This is especially important when using a compliance model, where these third-party services have been engineered to conform to specific privacy laws, such as the variety of data breach notification laws.
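The salary survey in reference 17 can be illustrated with additive secret sharing, one simple building block behind secure multiparty computation. This is a sketch under simplified assumptions (honest participants, private channels for exchanging shares), not a production protocol; the salary figures are invented.

```python
import secrets

# Additive secret sharing: each participant splits their salary into
# random shares that sum to it modulo MOD, so no single share (or any
# incomplete set of shares) reveals anything about the salary.
MOD = 2**61 - 1

def share(value, n_parties):
    """Split `value` into n_parties random shares summing to it mod MOD."""
    shares = [secrets.randbelow(MOD) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % MOD)
    return shares

salaries = [72_000, 98_000, 55_000, 110_000, 65_000]  # made-up inputs
n = len(salaries)

# Each person deals one share to every party (column j = party j's holdings).
dealt = [share(s, n) for s in salaries]
held = [sum(dealt[i][j] for i in range(n)) % MOD for j in range(n)]

# Parties publish only their share-sums; adding those reconstructs the
# total, and thus the average, without exposing any individual salary.
total = sum(held) % MOD
print(total / n)   # average salary
```

Because every column of shares is uniformly random on its own, a party learns nothing from the shares it holds; only the published share-sums, combined, reveal the aggregate.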
While the original organization may still bear ultimate legal responsibility, it has nonetheless transferred relevant risks in part to the third-party service, as it is the service that must contend with those risks at an operational level.

19. The correct answer is C. Body of Knowledge Domain II(A): The Role of IT in Privacy (Fundamentals of Privacy-Related IT)
"Meta" is a prefix meaning "an underlying description" in information technology usage. Metadata is information that is automatically or manually added to content and that can be later accessed and used during processing or by applications. It describes either a data set or individual data records and provides information about that data.

20. The correct answer is C. Body of Knowledge Domain VI(A): Privacy by Design Methodology (The Privacy by Design Process)
Testing may be considered the most crucial phase of software development with regard to implementing a privacy-friendly design. Testing consists of two parts: verification and validation. Verification ensures the resultant system performs the way it is supposed to perform. Validation ensures the requirements satisfy the needs of the intended user base.

21. The correct answer is A. Body of Knowledge Domain IV(B): Technical Measures and Privacy Enhancing Technologies (Techniques)
Access-control systems differ in their approaches to specifying policy. Access-control models are typically based around subjects, the users or programs who may be granted or denied access, and objects, the resources to which access is being regulated. Access-control systems differ by who is authorized to configure the access-control policy. In mandatory access control (MAC), policy is set by administrators; users have no ability to change the policy, even for their own data. In contrast, discretionary access control (DAC) systems allow users to change the access-control policies for the data they own.

22. The correct answer is D.
Body of Knowledge Domain III(E): Technical Measures and Privacy Enhancing Technologies (Software Security)
Vulnerability (an issue that could be exploited, e.g., a weakness in an information system or poor procedures) is determined by capability and probability: what skills and resources are available to a threat actor, and what may impede the threat actor from violating privacy?
• Introducing controls serves as an impediment that helps increase difficulty and reduce vulnerability.
• Creating a cyber defense infrastructure that uses firewalls to prevent malicious network traffic helps to manage a system's vulnerabilities.
• An organization's incident response plan may be used in the event of an attack.

23. The correct answer is B. Body of Knowledge Domain VI(A)(f): Privacy by Design Methodology (The Privacy by Design Process)
In low-level design, privacy technologists engage in improving the quality of programming practices, including how well they meet privacy standards and requirements. Opportunities for improving programming are done via coding practices, such as information hiding and loose coupling, and the reuse of standard libraries and frameworks. Information hiding identifies data that has been assigned to specific levels of classification and restricts access to that data via limited class functions. Loose coupling reduces objects' dependency on other objects and helps to control the flow of information. Reusing existing libraries of standard application programming interfaces (APIs) reduces the risk of defects in source code and can be used to improve confidentiality and integrity in support of privacy. Threat modeling is used to identify risks to a system based on concrete scenarios and considers the negative outcomes enabled by a particular threat agent or type of agent.

24. The correct answer is C.
Body of Knowledge Domain I(A): Foundation Principles (Privacy Risk Models and Frameworks)
The 1980 OECD Guidelines provide a foundational and international standard for privacy. The guidelines contain principles that are not found in the FTC's FIPPs, such as the collection limitation principle. The OECD security safeguards principle states that reasonable measures must be taken to protect data from unauthorized use, destruction, modification or disclosure of personal information. BrandEnt does not have appropriate methods for verifying customers prior to releasing personal information, which could result in unauthorized disclosure of that information to the wrong party.

25. The correct answer is B. Body of Knowledge Domain II(B): The Role of IT in Privacy (Information Security)
There may be circumstances where a previously undisclosed use of data is occurring, or where, through circumstances beyond the control of an organization (a data breach), data is used in a manner inconsistent with the original disclosure. In these cases, it is imperative that the organization make the individual aware of such new use of data, possibly giving the individual the opportunity to take corrective or remedial efforts to avoid adverse consequences.

ANSWER SHEET

1 A B C D    2 A B C D    3 A B C D    4 A B C D    5 A B C D
6 A B C D    7 A B C D    8 A B C D    9 A B C D   10 A B C D
11 A B C D   12 A B C D   13 A B C D   14 A B C D   15 A B C D
16 A B C D   17 A B C D   18 A B C D   19 A B C D   20 A B C D
21 A B C D   22 A B C D   23 A B C D   24 A B C D   25 A B C D

END

This page may be reproduced.
ANSWER KEY

Item  Answer  Domain
1     A       Privacy Engineering
2     C       Privacy Threats and Violations
3     C       Foundational Principles
4     D       Foundational Principles
5     B       The Role of IT in Privacy
6     C       Technical Measures and Privacy Enhancing Technologies
7     C       Technical Measures and Privacy Enhancing Technologies
8     B       Technical Measures and Privacy Enhancing Technologies
9     A       Privacy Threats and Violations
10    D       Privacy Threats and Violations
11    D       Technology Challenges for Privacy
12    A       Privacy Engineering
13    C       Technology Challenges for Privacy
14    D       Technology Challenges for Privacy
15    A       Privacy Threats and Violations
16    C       Privacy Engineering
17    A       Technical Measures and Privacy Enhancing Technologies
18    B       The Role of IT in Privacy
19    C       The Role of IT in Privacy
20    C       Privacy by Design Methodology
21    A       Technical Measures and Privacy Enhancing Technologies
22    D       Privacy Threats and Violations
23    B       Privacy by Design Methodology
24    C       Foundational Principles
25    B       The Role of IT in Privacy

SUMMARY
Foundational Principles: ___ of 3 correct
The Role of IT in Privacy: ___ of 4 correct
Privacy Threats and Violations: ___ of 5 correct
Technical Measures and Privacy Enhancing Technologies: ___ of 5 correct
Privacy Engineering: ___ of 3 correct
Privacy by Design Methodology: ___ of 2 correct
Technology Challenges for Privacy: ___ of 3 correct

PERCENTAGE: (# correct / # total) x 100

This page may be reproduced.