via RH Reality Check, by Anna Forbes and Kate Ryan

People who participate in clinical trials take the enormous step of volunteering to test a product that, if it turns out to be effective, may prove useful and sometimes life-saving. They play an irreplaceable role in research to prevent, treat, and sometimes cure illness – as well as to find other ways to improve people’s health and lives.

Trial participants make a profoundly personal contribution and accept potential medical, social, and personal risks on behalf of others. An ethical trial is one that eliminates or minimizes participants’ risks as much as possible, invests in making sure that participants understand clearly what they are volunteering for, and protects their rights at every step.

For example, without clinical trials, we would not have seen recent advances in antiretroviral drugs to treat HIV, long-acting contraceptive choices that allow women greater control over their use, or microbicides that may be able to protect women from HIV.

The United States government has rules to protect people who participate in federally funded biomedical and behavioral research. The rules vary depending on which agency is supporting the research, but they all share a starting point known as the Common Rule, a set of regulations governing all federally funded research involving human participants, whether it is conducted inside or outside the U.S.

But those rules have not always been in place, and the history of medical research supported by the United States includes some shameful chapters in which the most basic standards of ethical behavior were violated. This history has left some people deeply suspicious of clinical trials and the motives of those who conduct them. Many explain their suspicion with one word: “Tuskegee.”

Read the rest.

[If an item is not written by an IRMA member, it should not be construed that IRMA has taken a position on the article’s content, whether in support or in opposition.]