In short, do you believe the traditional rights recognized by the US Constitution's Bill of Rights to be natural or not?
My personal view is that the only natural right is the right to try to accomplish your goals, whether by force or not. In essence, might makes right. This is evident in how nature operates, and it seems to be how society operates when government is removed.
Anything beyond that, such as free speech or the right to own a gun, is a legal construct that we, as a society, largely agree to. While these things are a good idea, there is nothing inherent about them.
What is your view?