Informed consent is both an ethical and a legal obligation in which healthcare providers educate patients about the risks, benefits, and alternatives of a proposed procedure or intervention. It requires that the patient be competent to make the decision and that the decision be made voluntarily, on the basis of that education.