
Thread: How “Deep-Fakes” Could Actually Protect Privacy

  1. #1

    How “Deep-Fakes” Could Actually Protect Privacy

    A deep fake is a photo or video, typically manipulated with machine-learning tools, that cannot be distinguished from authentic footage.
    Faking photos has been easy for some time. Practically every image of a sexy female you see in a magazine is fake, edited to the point where it bears little resemblance to the original photo.
    Video fakes, though, have been easier to detect. Until now, you needed expensive software and a Hollywood budget to really manipulate footage. Now amateurs fill requests for fakes online.
    What kind of requests? Usually putting someone's face onto a porn star's body, so the viewer can imagine they are watching whoever the face belongs to.
    Of course, any emerging technology that can be used for sex will be; just look at how quickly pornography proliferated on the internet.
    But imagine all the other nefarious uses for this type of fake.
    You could blackmail political opponents or tip an election with compromising videos.
    You could hold judges hostage by threatening to destroy their careers.
    You could start a war.
    Or suppose it is just the same old-fashioned police corruption, framing suspects, but with a new technological twist: they simply doctor the surveillance footage or the interrogation video, and boom, easy convictions.
    And just think about all the data companies like Google (and therefore agencies like the NSA) have on all of us. They could tailor a fake video to something that people would believe, and bolster it with real audio…
    Or doctor surveillance footage based on where they know you were… try coming up with an alibi for that.
    For all the sketchy crap the government has done before, I wouldn’t put it past them to easily dispatch some political dissidents or troublesome reporters.
    So seeing it with your own eyes will no longer be enough. We will have to rely on the dreaded… experts.
    Already our court system is inundated with experts who tell us what’s what. And these people are easily corrupted, and sometimes just wrong.
    Take the case of the Massachusetts lab technician who falsified thousands of drug test results in order to advance her own career. (That was a different Massachusetts lab worker from the one who got high on confiscated drugs while testing samples at work.)
    Or the man facing conviction based on DNA tested by a machine and interpreted with a computer algorithm. And the court won’t allow his defense to examine the algorithm.
    This was after his first conviction was thrown out because of a faulty algorithm in the DNA testing machine!
    So with deep fakes, will it all come down to an expert sitting behind a keyboard, testifying to the authenticity of the video that depicts you cackling with glee while dumping toxic waste on an endangered sea turtle breeding ground?
    It is another arms race scenario: every time a new method of detection is invented, the fakes get a little bit better at avoiding it. Already, deep-fake technology is outpacing the security countermeasures.
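    To picture that dynamic, here is a minimal, purely illustrative sketch; it is not an actual deep-fake or detection pipeline, and the scores, thresholds, and update rule are all invented for illustration. A "faker" keeps nudging its output toward whatever currently slips past a simple threshold detector, and the detector keeps re-fitting its threshold on the latest batch of fakes:

        # Toy illustration of the detection arms race (not a real deep-fake pipeline).
        # "fake_center" is a made-up quality score the faker controls; the detector just
        # picks a threshold halfway between the average real and average fake scores.
        import random

        random.seed(0)
        real_samples = [random.gauss(1.0, 0.1) for _ in range(200)]  # stand-in "real" statistic
        fake_center = 0.0                                            # fakes start out obvious

        def fit_detector(reals, fakes):
            # Midpoint threshold between the two means: anything below it is flagged as fake.
            return (sum(reals) / len(reals) + sum(fakes) / len(fakes)) / 2

        for generation in range(8):
            fakes = [random.gauss(fake_center, 0.1) for _ in range(200)]
            threshold = fit_detector(real_samples, fakes)
            caught = sum(f < threshold for f in fakes) / len(fakes)
            print(f"gen {generation}: fake quality {fake_center:.2f}, "
                  f"threshold {threshold:.2f}, caught {caught:.0%}")
            # The faker adapts: close half of the remaining gap to the real distribution.
            fake_center += 0.5 * (1.0 - fake_center)

    As the fakes close in on the real distribution, the detector has less and less room to draw a line, and the catch rate drifts toward a coin flip, which is exactly the arms-race problem.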
    And it's not just pictures and videos, either. People are already coming up with ways to fake biometric data like fingerprints. So safeguards like fingerprint and iris scans might not be so secure after all.
    Could faking DNA be next? Or fooling the facial recognition systems the TSA is already starting to use for boarding flights?
    Basically, when we have to rely on scientists to test, experts to interpret, and programmers to build the algorithms, we are in the same position we have always been in. Only now, it might be easier to hide the fact that no one really knows what they are talking about.
    All the benefits of the advanced technology can easily be undone by a little corruption or basic human error.
    But what if deep fakes truly outpace the technology to detect them? And suppose everyone agrees that this fact is true.
    Then it is basically like a reset.
    Anyone could realistically claim that a video, picture, or biometric access was faked. And we would be back to square one, doing the typical gumshoe detective work that can really pin down a suspect to a time and place.
    People could believe whatever narrative they want by simply assuming every video they don’t like is faked, and all the faked videos they want to believe are real.
    So in that sense, it seems like nothing would change… people already believe whatever they want to believe.
    Perhaps instead we would avoid character assassinations and trial by publicity. But we also couldn’t hold people accountable for their actual transgressions.
    But would that be the worst thing in the world, if the reset button got hit?
    Privacy could return. But criminals could get away with more.
    Innocent people could no longer be framed. And guilty people could believably claim they were framed.
    We would have to actually investigate claims of guilt or innocence. No more relying on experts or algorithms.
    No more relying on our own eyes, either, which even before deep fakes weren't especially reliable. After all, we've all seen the videos of cops shooting people, and we still don't agree on what happened.


    More at: https://www.zerohedge.com/news/2019-...rotect-privacy
    Never attempt to teach a pig to sing; it wastes your time and annoys the pig.

    Robert Heinlein

    Give a man an inch and right away he thinks he's a ruler

    Groucho Marx

    I love mankind…it’s people I can’t stand.

    Linus, from the Peanuts comic

    You cannot have liberty without morality, nor morality without faith.

    Alexis de Tocqueville

    Those who fail to learn from the past are condemned to repeat it.
    Those who learn from the past are condemned to watch everybody else repeat it

    A Zero Hedge comment



  2. #2
    all that is visual is becoming an illusion, via fractal mathematics in our digital age....

  3. #3
    This article is pie in the sky bull$#@!. If technology and propaganda encouraged human beings to think critically about their validity, we would already be growing in that direction. But the amount of bull$#@! fodder, and the number of people willing to believe it and then post it as fact, is enormous.


    Here's an attempt to make Jordan Peterson sound like an idiot.
    The wisdom of Swordy:

    On bringing the troops home
    Quote Originally Posted by Swordsmyth View Post
    They are coming home, all the naysayers said they would never leave Syria and then they said they were going to stay in Iraq forever.

    It won't take very long to get them home but it won't be overnight either but Iraq says they can't stay and they are coming home just like Trump said.

    On fighting corruption:
    Quote Originally Posted by Swordsmyth View Post
    Trump had to donate the "right way" and hang out with the "right people" in order to do business in NYC and Hollyweird and in order to investigate and expose them.
    Fascism Defined

  4. #4


    "He's talkin' to his gut like it's a person!!" -me
    "dumpster diving isn't professional." - angelatc
    "You don't need a medical degree to spot obvious bullshit, that's actually a separate skill." -Scott Adams
    "When you are divided, and angry, and controlled, you target those 'different' from you, not those responsible [controllers]" -Q

    "Each of us must choose which course of action we should take: education, conventional political action, or even peaceful civil disobedience to bring about necessary changes. But let it not be said that we did nothing." - Ron Paul

    "Paul said "the wave of the future" is a coalition of anti-authoritarian progressive Democrats and libertarian Republicans in Congress opposed to domestic surveillance, opposed to starting new wars and in favor of ending the so-called War on Drugs."

  5. #5
    "He's talkin' to his gut like it's a person!!" -me
    "dumpster diving isn't professional." - angelatc
    "You don't need a medical degree to spot obvious bullshit, that's actually a separate skill." -Scott Adams
    "When you are divided, and angry, and controlled, you target those 'different' from you, not those responsible [controllers]" -Q

    "Each of us must choose which course of action we should take: education, conventional political action, or even peaceful civil disobedience to bring about necessary changes. But let it not be said that we did nothing." - Ron Paul

    "Paul said "the wave of the future" is a coalition of anti-authoritarian progressive Democrats and libertarian Republicans in Congress opposed to domestic surveillance, opposed to starting new wars and in favor of ending the so-called War on Drugs."



