Individuals who interact with human subjects for research purposes are ethically obligated to protect the privacy, safety, welfare, and rights of those human research subjects. Methods used to assure these protections include appropriate recruitment procedures, informed consent processes, and analyses of the risks to the subjects relative to the benefits of the research. The Belmont Report (1979) remains the foundational statement of human research ethics and safety in the United States.
The National Research Act, enacted in 1974, established the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research in the United States. The Commission was charged with identifying basic ethical principles that should underlie the conduct of biomedical and behavioral research involving human subjects and with developing guidelines for conducting research within those principles. A human subject is a living individual about whom an investigator (whether professional or student) conducting research obtains (1) data through intervention or interaction with the individual, or (2) identifiable private information (45 CFR 46.102(f)).
One outcome of the Commission’s work was The Belmont Report (1979), which responds to concerns across the medical community about research studies where subjects—many of whom were poor, uneducated, and/or developmentally disabled—had been placed at serious risk and sometimes seriously harmed. The researchers conducting these studies may have intended to benefit society; however, the ways in which participating subjects were treated created a public outcry.
The Belmont Report describes and expands on the three ethical principles identified by the Commission: respect for persons, beneficence, and justice.
After The Belmont Report was issued, the U.S. Department of Health and Human Services (HHS), the National Institutes of Health (NIH), and the Office for Protection from Research Risks (now known as the Office for Human Research Protections, or OHRP), introduced Part 46, “Protection of Human Subjects,” to Title 45, “Public Welfare,” of the Code of Federal Regulations (CFR), providing specific guidelines and definitions for researchers. Revised Part 46, “Protection of Human Subjects,” became effective June 23, 2005.
Researchers should carefully and thoroughly read The Belmont Report and Part 46 (revised) of Title 45 of the CFR before beginning to design a research project involving human subjects. Both documents inform the policies and procedures of the IRB. The IRB Web site provides links to 45CFR46 and to The Belmont Report.
The Belmont Report describes two key aspects of the principle of respect for persons: individuals should be treated as autonomous agents, and persons with diminished autonomy are entitled to protection. This principle leads directly to the requirement for informed consent and to the important concept that some individuals (members of protected populations) require additional considerations from researchers. Respect for persons thus requires that a researcher acknowledge the autonomy of each potential subject and protect those whose autonomy is diminished.
The Belmont Report’s principle of beneficence requires researchers to ensure and secure the well-being of their subjects; it speaks to the concept of benefits and risks. The principle of beneficence stems from two principles of medical ethics associated with the Hippocratic Oath: (1) do not harm, and (2) maximize possible benefits. It is equally applicable to studies in other fields. Researchers may not intentionally injure any person during the conduct of a study, no matter what benefits might be realized as a result.
The principle of beneficence was and continues to be controversial in the research community. Some researchers argue that research that does not directly benefit participants should never be permitted, while others believe that anticipated future benefits to the larger community (through knowledge gained from a study) can be an acceptable justification for some risk to participating subjects. These opposing interpretations of the principle of beneficence can force researchers to make difficult choices.
Union Institute & University’s IRB has chosen to follow the most commonly held interpretation of the principle of beneficence: participants in a research project do not always have to benefit directly from their participation, as long as a strong likelihood exists that the study will benefit others.
Minimal risk means that the probability and magnitude of harm or discomfort anticipated in the research are not greater in and of themselves than those ordinarily encountered in daily life or during the performance of routine physical or psychological examinations or tests. (45 CFR 46.102(i))
As with the principle of respect for persons, the researcher’s expertise is an important factor in analyzing risks and benefits. A researcher needs to be thoroughly familiar with similar research studies before a risks versus benefits analysis of her or his proposed study is possible. If a researcher is unaware of what other researchers have done, he or she may needlessly place subjects at risk. As an example, to place subjects at risk in the investigation of a drug or medical procedure that has already been thoroughly tested is ethically and professionally unacceptable.
The third ethical principle established by The Belmont Report—justice—governs how benefits and burdens of research are shared. Injustice occurs when benefits of research are denied to participating subjects or when burdens of research are imposed unduly. Recruitment methods must be designed to assure fair selection of subjects based on clearly identified and justifiable inclusion and exclusion criteria.
The principle of justice was the subject of some of the earliest reflections on the ethics of research. Considerable concern focused on medical research that all too often used subjects who were “charity” patients or prisoners when the benefits of the studies, such as improved treatment methods or more effective medications, went—at least initially—only to the more fortunate members of society.
Following World War II, the Nuremberg trials brought to the world’s attention many cases where biomedical experimentation and research had been conducted without regard or care for the well-being or safety of the human subjects of the experiments. The Nuremberg Code was developed after those trials. Intended to set international standards for the ethical conduct of research involving human subjects, the Code became the prototype for later codes. Often difficult to interpret, these codes produced conflict and differing opinions, thus failing to guard against the problems they were designed to prevent. Many were also limited in scope, failing to address human subject safety issues in psychological and other types of social science research.
Even after the Nuremberg Code was published in 1949, a number of studies clearly violated the rights of human participants, put them unnecessarily at risk, or were otherwise questionable. These studies include some surprisingly recent ones as well as a number of studies in the behavioral sciences that were highly controversial. The following examples are provided as illustrations of the importance of holding closely to the ethical principles of The Belmont Report during the design and over the duration of a study.
Perhaps the best-known medical research study involving human subjects, the Tuskegee syphilis study, spanned forty years, from the 1930s to 1972. Conducted by the U.S. Public Health Service, it ultimately involved 400 African American men, most of whom were poor and uneducated. All participants had been diagnosed with syphilis prior to their participation in the study; none were informed of the potential risks of participation; and some were misled about the study’s potential benefits.
Even worse, when penicillin (an effective treatment for syphilis) became available in the 1940s, subjects were not treated, nor were they informed of the availability of treatment. As a result, many subjects died when their deaths might have been prevented by treatment with penicillin.
Conducted by social psychologist Stanley Milgram in 1961, this study on obedience to authority raised concern about deception. Milgram wanted to investigate whether subjects would do something that an authority figure told them to do, even if it conflicted with their personal beliefs and morals. Participants thought that they would be participating in an experiment about memory and learning. Prior to agreeing to participate, potential subjects were told that the purpose of the study was to investigate the effect of electric shock on learning. After consenting to participate, subjects drew their roles from a hat—either “teacher” or “learner.” The “learner” was hooked up to electrodes, and the “teacher” was told to administer an electric shock for every incorrect response that the “learner” gave in a word-pairing exercise.
The purpose of the study was misrepresented, and the selection process was rigged. All participants drew the role of “teacher”; all “learners” were researchers with the project. The real purpose of the study was to look at the reaction of the “teacher” when he or she was told to administer what appeared to be increasingly dangerous and painful electric shocks. The “learners” were acting—no electric shocks were ever administered.
Some participants stopped and left the study; others suffered emotional/psychological harm—especially if the “teachers” believed that they had seriously hurt other participants. Participants were debriefed and told that the “learners” were actors and that they did not receive any shocks.
Milgram’s experiment showed that in stressful situations some people tend to conform to the requests of an authority figure, believing that obedience to commands relieves them of responsibility for their own actions. Many researchers believe that the Milgram study was unethical because subjects were not told about the true nature of the study or its potential risks and benefits and consequently could not have given true informed consent. Other researchers argue that the pretense and risk were justified because the knowledge gained from the study was potentially beneficial to society.
This 1970 study, conducted by sociologist Laud Humphreys, involved investigation of illegal homosexual acts in public restrooms. The stated purpose of the study was to compare demographic characteristics of subjects with the general, nonhomosexual, population, but the study was deceptive on multiple levels.
In this deceptive study, the researcher pretended to be acting as a lookout to help individuals avoid detection or interruption. Instead, Humphreys took advantage of this situation to record license plate numbers. He then used a false cover story to obtain names and addresses from the police. Humphreys visited each of the men, posing as a market researcher conducting an innocuous survey, which allowed him to collect information about occupation, marital status, socioeconomic characteristics, and so on. The first phase of the Humphreys study was an example of clandestine observational research, in which the researcher misrepresents himself or herself and observes activities under false pretenses. The second phase used deception to obtain subjects’ participation, hiding the true purpose of the visits.
No informed consent process was used, and the researcher placed the unknowing subjects doubly at risk—first by creating a false feeling of safety during the observational phase and later by identifying the subjects, creating a link to behavior that could have resulted in considerable psychological, physical, legal, and/or financial harm.
In the 1963 Jewish Chronic Disease Hospital study, subjects were injected with live cancer cells but were not informed of the risks. This act was a clear-cut violation of research ethics. Individuals involved with the study were subsequently found guilty of fraud, deceit, and unprofessional conduct.
The Willowbrook study, conducted between 1963 and 1966, used residents of a state school for “mentally defective children” as subjects. The children were deliberately infected with the hepatitis virus; the children did not consent to participate. Parents (who did consent to their child’s participation) were not told of the likely long-term risks and were not advised about available treatments.
Union Institute & University values the integration of theory and practice in its academic programs. Researchers who are unsure whether IRB approval is required should contact the IRB Director to discuss planned interactions with other people undertaken as part of an academic program or of work at the university. Regardless of the research methodology chosen, all research projects involving human subjects conducted for an internship, seminar, culminating project (thesis, dissertation, or capstone), or class project must be reviewed and approved by the UI&U IRB prior to implementation. Failure to contact the IRB for review and approval before a research project begins may result in the data not being accepted for grading purposes.
In regulations governing research with human subjects, the U.S. government defines research as follows:
A systematic investigation, including research development, testing and evaluation, designed to develop or contribute to generalizable knowledge. Activities which meet this definition constitute research… whether or not they are conducted or supported under a program which is considered research for other purposes. For example, some demonstration and service programs may include research activities.
Designed to develop or contribute to generalizable knowledge typically means that results or conclusions are intended to extend beyond one person (participant) or an internal program. Examples include activities designed with the intent to publish the results in a peer-reviewed journal, to present them at a professional meeting, or to publish them in a culminating document to meet the requirements of a degree program.
Practice is an intervention or action designed to enhance the well-being of an individual, having a reasonable expectation of achieving that goal. Practice—teaching, student teaching, testing, diagnosis, preventive treatment, therapy, and so on—does not require IRB approval. However, a study may be conducted as a component of practice, such as an investigation in which the researcher will evaluate the safety of a therapeutic method or a study of the efficacy of a teaching model. In cases such as these, IRB approval is required.
Researchers working with human subjects must ensure a favorable balance between potential risks and likely benefits—a condition stemming from the principle of beneficence. One of a researcher’s greatest challenges is to assess realistically and objectively the potential risks and anticipated benefits of a project. To do so, the researcher must be thoroughly knowledgeable about what other researchers have discovered and must present both risks and benefits to potential subjects as part of the informed consent process.
Risk to participants in research is the possibility that harm may occur to them as a result of their participation. Risk can be a known, probable, or possible outcome, and it occurs at various levels.
In the context of research, benefit is something of positive value to the health or well-being of people. Like risks, benefits may be known, probable, or possible outcomes. Benefits can accrue directly to subjects or to larger populations. Because researchers are unable to guarantee potential benefits to participants or a broader community, researchers are cautioned to use tentative terminology, such as may rather than will, when assessing the benefits of their research projects.
The purpose of a study is not the same as its benefit. Differentiating between the two in the application to the IRB and in the informed consent process is important.
Generally speaking, the purpose of the study is to find an answer to a research question or to illuminate a topic of inquiry. For students, fulfillment of degree requirements is also a purpose. The purpose statement should also include a statement of the general potential benefits of the study such as a contribution to knowledge.
The benefit of a study, as noted above, is its potential value—direct or indirect—to participants and/or others. The statement of potential benefit—in the research proposal and informed consent process—may include a reiteration of the study’s purpose, but it should also focus on how the study is likely to directly benefit study participants and/or the larger population.
Achieving a favorable balance between benefits and risks is likely to be one of a researcher’s greatest challenges. All applications to the IRB should include a risk/benefit assessment—IRB approval will depend on a favorable balance. A thorough risk/benefit assessment will look at both the probabilities and magnitudes of possible harm and anticipated benefits, taking into account many kinds of possible harms and benefits. Risks and benefits may be psychological, physical, legal, social, or economic.
The researcher’s assessment should be systematic, realistic, and nonarbitrary, and it should consider all aspects of the proposed study, including alternative research designs. A systematic assessment of risks and benefits will address the following ten questions:
Although potential risks are those risks that directly or indirectly affect individual subjects, anticipated benefits may accrue to individual subjects, their families, special groups of subjects in society, and/or society at large. Risks must be outweighed by the sum of both the anticipated benefits to the subjects, if any, and the anticipated benefits to society, typically in the form of knowledge to be gained from the research. In balancing these elements, the risks and benefits that directly affect the study participants typically carry greater weight than do the indirect risks and benefits.
This chapter borrows extensively from The Belmont Report, using both direct quotations and interpretive language.
Title 45, Code of Federal Regulations, Part 46, “Protection of Human Subjects.”