Law of the Land
Definition - What does Law of the Land mean?
The law of the land refers to the general body of laws and rules to which all members of a community, state, or country are subject.
In the United States, much of the law of the land derives from the Constitution. This includes the rights of citizens, such as the right to free speech, the right to keep and bear arms, and the right to due process of law.
Justipedia explains Law of the Land
The laws of the U.S. Constitution apply within the borders of the United States. However, once a person leaves the country and crosses into another country, they become subject to that country's laws.
The judicial branch of the United States government is responsible for upholding the laws of the land and carrying out justice. The legislative branch is responsible for enacting new laws and amending existing ones.