Do Doctors Have To Tell Patients The Truth?
Health professionals are expected to always tell the truth, on the grounds that lying is wrong and that deceiving a patient disrespects their autonomy. However, this may not necessarily be the case, as the 'right not to know' the truth should also be