
View Full Version : Leverage and the politics of the future




Chase
07-10-2008, 01:35 AM
I want to start this post out by identifying my existing political stances. I am a big believer in small government, natural rights (including gun rights), free markets, etc.

People who think the world is a dangerous place have a solid factual foundation for doing so: nuclear proliferation, whatever its cause, gives an enormous amount of leverage to foreign entities, not all of whom are our friends. Thankfully, mutually assured destruction - the promise of fatal retaliation - has been enough to keep governments from launching those bombs, with the sole exception of the USA, which remains the only nation to have actually used them.

One of the bombs lobbed by gun control activists at their gun rights counterparts is the question of whether a private individual should be allowed to own a nuclear bomb. Some people see this question as absurd, but there is an interesting challenge in drawing the line. I tend to think a nuclear bomb qualifies as military munitions, and most would agree. But what about an AK-47? It's a fully automatic rifle. What is it about being automatic that limits its use to the military? The same gun in semi-automatic form is legal.

I think the real question here is leverage: how much should any one individual be capable of possessing? Many years in the future, we might develop a matter compiler. The free market could make such technology as commonplace as the microwave, if we let it, and matter compilers could provide a tremendous benefit to mankind.

But how do we stop a single, crazed individual from using their matter compiler to assemble bio-weapons? We do have to consider that individuals today can already build fully automatic weapons in their garages. It isn't lawful, but it is technically achievable.

Nevertheless, the leverage automatic weapons give an individual is nowhere near the leverage a super-virus would. A gunman with an automatic weapon could kill dozens, but since mass shootings are rare, we might reasonably conclude that it was okay for people to have machine tools. With a virus, a single mad scientist could kill millions. Such a mad scientist is probably rarer than the mass shooter, but it only takes one before you truly face a crisis. Ah, you say, a mad scientist could do the same today. Sure, but since engineering a virus isn't something you can do in your home kitchen, we don't run the same risk we might in the future.

Matter compilers and mad scientists may seem silly, but there is a legitimate question in all of this. As technology advances and gives individuals more leverage than ever before, should our policies change? Will they change? How will they change, and how should they?