Principles of Marketing
Informed consent is the ethical and legal requirement that a person be fully informed about, and voluntarily agree to, a medical, research, or marketing activity before it takes place. It ensures that individuals understand the potential risks, benefits, and alternatives, and that their decision to participate is made freely.