Definition - What does Legal Positivism mean?
Legal positivism is a philosophy of law holding that laws are valid because they are posited, that is, enacted and put into place by a governing body, and because society accepts that governing body as legitimate.
Legal positivism denies that laws derive their validity from general concepts of morality or natural principles.
Justipedia explains Legal Positivism
Essentially, the idea behind legal positivism is that laws are legitimate only when they are enacted by the government or agency that has authority over a jurisdiction. For example, even if many adherents of a certain religion view it as natural law that no work shall be done on Sundays, that rule is not an official law unless the government enacts a law declaring that no work shall be done on Sundays.