
'Deepfakes' easy to make, hard to detect, lawmakers learn in Congressional hearing

TU Curmudgeon

B.A. (Sarc), LLb. (Lex Sarcasus), PhD (Sarc.)
From The Deseret News

'Deepfakes' easy to make, hard to detect, lawmakers learn in Congressional hearing


SALT LAKE CITY — Rep. Chris Stewart isn't sure there's a way to control the proliferation of digital technology that makes it easier to create synthetic video, images, audio or text known as "deepfakes."

The Utah Republican said that some of the suggestions industry experts offered during a congressional hearing Thursday might be helpful in an ideal world, but in the real world they would be nearly impossible to implement.

Policies could be adopted to control government and to some extent legitimate businesses, he said.

"But we can't control everyone. This is going to be so pervasive and so available that virtually anyone could create this. It’s easy to control the U.S. government and say you can’t use it, you can’t create it for political manipulations or whatever it might be," Stewart said.

"But you can’t control the other 6 billion people on the earth."

COMMENT:-

Why does Rep. Stewart's comment to the effect that it won't be possible to stop 'Deepfakes' (and, by implication, that no one should bother to try) bother me?

Does it bother you?
 
Why does Rep. Stewart's comment to the effect that it won't be possible to stop 'Deepfakes' (and, by implication, that no one should bother to try) bother me?

Does it bother you?

With a media that has lost all integrity, and a portion of the public willing to buy into everything it attempts to sell to them, what the Congressman said is very true.

As the technology continues to improve, how do you stop it?

As it is, slander and libel laws allow broad fake news attacks on public figures. There are no ramifications.

Why would a propaganda peddler care about what is real or not, as long as the objective of harm is achieved, even short term?

It's a real head scratcher and an area that needs careful consideration.
 
Why does Rep. Stewart's comment to the effect that it won't be possible to stop 'Deepfakes' (and, by implication, that no one should bother to try) bother me?

Does it bother you?

Oh, good! Last time the electorate was only swayed by the written word; giving them visuals is brilliant!
 
It doesn't bother me one bit that someone makes an observation.
 
Why does Rep. Stewart's comment to the effect that it won't be possible to stop 'Deepfakes' (and, by implication, that no one should bother to try) bother me?

Does it bother you?
It's concerning, but if they can create technology to make these videos, they can probably make software to detect them as well.

 
Why does Rep. Stewart's comment to the effect that it won't be possible to stop 'Deepfakes' (and, by implication, that no one should bother to try) bother me?

Does it bother you?
Why does it bother you? Don't you think it's true? Trying to flat-out prevent this kind of technology from being abused would be an exercise in futility; a much more rational approach is to accept the possibility and do the best we can to minimise and mitigate the harms. Do you try to stop the rain, or do you invest in an umbrella and some towels? :cool:
 
Deeply troubling that so many Americans will inevitably fall for this utter BS because of partisanship, on both sides of the aisle.
 
Why does Rep. Stewart's comment to the effect that it won't be possible to stop 'Deepfakes' (and, by implication, that no one should bother to try) bother me?

Does it bother you?
It bothers me in the sense that it's troubling how little can be done, but not in the sense that he acknowledged how little can be done. The software is already out there, the processing needed to run it keeps getting cheaper, and once these things are made, injecting them into the social media stream anonymously is easy. I suppose there are algorithms that can detect likely deepfakes and mark them as such when they pass through Google, Facebook, etc. Such algorithms could key on metadata, known digital "fingerprints" left by common fake software, or just the pattern of when and how a video first pops up. But just as lower-tech fake news continues to make the rounds and take in people who want to be taken in, so will these things.
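Just to illustrate the kind of screening pass I mean, here is a rough Python sketch; the "suspect encoder" strings are invented for illustration, and the only real tool it leans on is ffprobe from FFmpeg:

```python
import json
import subprocess

# Hypothetical encoder tags that known face-swap pipelines might leave
# in container metadata (illustrative only, not a real blocklist).
SUSPECT_ENCODER_TAGS = {"faceswap-export", "df-merge", "unknown-gan-writer"}

def probe_metadata(path):
    """Read container-level metadata with ffprobe (ships with FFmpeg)."""
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_format", "-show_streams", path],
        capture_output=True, text=True, check=True,
    )
    return json.loads(out.stdout)

def screen(path):
    """Return a list of reasons this file deserves a closer look."""
    info = probe_metadata(path)
    reasons = []
    tags = info.get("format", {}).get("tags", {})
    encoder = tags.get("encoder", "").lower()
    if not tags:
        reasons.append("no container metadata at all (often stripped on re-encode)")
    if any(sig in encoder for sig in SUSPECT_ENCODER_TAGS):
        reasons.append("encoder tag matches a known synthesis tool: " + encoder)
    return reasons

if __name__ == "__main__":
    for reason in screen("clip.mp4"):
        print("flag:", reason)
```

None of that proves anything on its own, of course; it only sorts the pile into "probably fine" and "worth a human look."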
 
DNA can be faked, and the tests otherwise abused and cheated, but I don't lose sleep over it.

The real concern - if any - here should be over any politician attempting to interfere with expression, especially on the dubious grounds of "public safety."
 
Why does Rep. Stewart's comment to the effect that it won't be possible to stop 'Deepfakes' (and, by implication, that no one should bother to try) bother me?

Does it bother you?

XML metadata can address this issue. Apply metadata to the various video formats in use that cannot be copied over without altering the metadata, thus ensuring that one is viewing the original and unaltered video file.

It would require changes to all of the video and audio codecs currently in use, but the changes would be very small.

The best bet would be a device with a locked clock and a hidden key store that the user doesn't have access to, which creates a signature based on that key so they couldn't fake their own signature. Since the key must still reside in the device, it is still possible for someone to break it, as all the necessary information is in their possession, even if it is hard to get to; but the act of breaking the key would also be an indicator.
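To make the signing part concrete, here is a minimal sketch of the principle (sign inside the device, verify anywhere) using an Ed25519 key pair from the Python cryptography package; the key is generated in the script only so the example runs, whereas in the proposal it would sit in the camera's hidden key store, and none of this describes how any actual camera works today:

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# In the proposal this key never leaves the capture device;
# generating it here just lets the sketch run end to end.
device_key = Ed25519PrivateKey.generate()
device_public_key = device_key.public_key()

def sign_footage(raw_bytes: bytes) -> bytes:
    """What the capture device would do at record time."""
    return device_key.sign(raw_bytes)

def footage_is_authentic(raw_bytes: bytes, signature: bytes) -> bool:
    """What anyone holding the device's public key can check later."""
    try:
        device_public_key.verify(signature, raw_bytes)
        return True
    except InvalidSignature:
        return False

original = b"...raw video frames..."
sig = sign_footage(original)
print(footage_is_authentic(original, sig))            # True
print(footage_is_authentic(original + b"edit", sig))  # False: any alteration breaks the signature
```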
 
DNA can be faked, and the tests otherwise abused and cheated, but I don't lose sleep over it.

The real concern - if any - here should be over any politician attempting to interfere with expression, especially on the dubious grounds of "public safety."

Would it bother you so much if what was required was actual, positive, verifiable identification of the person/organization posting the material, but with absolutely no vetting of the material's content?

Or is it your position that everyone has the constitutional right to say whatever they want to say, and that that constitutional right comes with a constitutional immunity for saying things that they knew, or ought reasonably to have known, were false AND for saying them for a "nefarious purpose" (such as skewing electoral results AGAINST a candidate [rather than FOR a candidate])?
 
XML metadata can address this issue. Apply metadata to the various video formats in use that cannot be copied over without altering the metadata, thus ensuring that one is viewing the original and unaltered video file.


I'm going to assume that you have a good idea, because I don't have the foggiest idea what you mean.

Could you please explain for me? Feel free to PM the explanation.
 
This video was done a few years ago. I would imagine the technology has gotten a little better even since then. But man, it's a good one! It's called "Barack Obama Thug Life" :lamo

 
Thing is, you can attempt to clamp down on new technology. You can also try to bail the ocean out with a teaspoon.

You'll have just as much luck, and wind up looking just as silly.
 
I'm going to assume that you have a good idea, because I don't have the foggiest idea what you mean.

Could you please explain for me? Feel free to PM the explanation.
More or less, give every video file its own DNA that can't be copied without someone else noticing.

Sent from the Oval Office using Putin's MacBook, and Barr's Wi-Fi password.
 
I'm going to assume that you have a good idea, because I don't have the foggiest idea what you mean.

Could you please explain for me? Feel free to PM the explanation.

I wish I could do a deep dive but I'll try to do the easy explanation in the form of an analogy.

In the early 2000s all the camera manufacturers were wrestling with the HD issue, and MPEG (the Moving Picture Experts Group) decided that one of the useful pro/prosumer formats would be based on the MPEG-2 (.mpg file extension - have you ever seen that?) codec.

(A codec is a device or program that compresses data to enable faster transmission and decompresses received data. Hence the term "codec", which stands for "compressor-decompressor".)

Sony came up with theirs, Panasonic came up with theirs, Canon came up with theirs and JVC came up with theirs...and they were all "somewhat" compatible with each other but not compatible ENOUGH. Nightmare.

In fact, and here's the analogy part...JVC altered the HEADER INFORMATION in EVERY FRAME of video by adding FOUR BITS to the frame headers on every frame.
They then called it their OWN format and gave it the .TOD file extension and thus avoided having to PAY the MPEG group any royalties for every device they sold, since it was not .mpg anymore.
The funny part is, you could take a .TOD file, rename it to .mpg and it would work just fine anyway.
There was NO DIFFERENCE between a .TOD file and a .mpg file except for those four metadata bits at the header of every frame.


So...much the same way JVC inserted their little 4-bit frame header on every frame of video, I am proposing that ALL video and audio codecs should now HAVE to HAVE a set of bits for every frame of audio and video that defines what was used to RECORD the audio or video, and if you copy that file, those bits do NOT migrate and copy over to the copy.

If all news outlets agree to maintain their archives of unedited original audio and video footage, and that footage contains all that unique header information, then one can just go back and check the identifying header metadata to see if it is authentic.
If you GET a piece of footage and it DOES NOT CONTAIN the metadata, then it cannot be guaranteed to be authentic.

It's a bit of work but I don't imagine people will have to check every time an elected official sneezes or farts, just when something is questionable. It's not a guarantee or a 100% solution but it helps. The encrypted KEY is the essential part.
It forever marks that unedited footage as originating with a specific device that recorded it.

It helps because news organizations can then say that what they put out on the air MATCHES the original or people can contest footage and demand to see if it is genuine by asking if the metadata tags are intact on the file.

Make sense?
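If it helps, here is a toy version of the per-frame tag idea in Python; the "frames" are plain byte strings and the tag is a keyed HMAC standing in for the encrypted key I mentioned, not a field in any existing codec:

```python
import hmac
import hashlib

# Stand-ins for the secret held by the recording device / news archive.
DEVICE_KEY = b"held-by-the-recording-device"
DEVICE_ID = b"CAM-0042"

def tag_frame(frame: bytes) -> bytes:
    """Build the per-frame header: device ID plus a MAC over the frame payload."""
    mac = hmac.new(DEVICE_KEY, DEVICE_ID + frame, hashlib.sha256).digest()
    return DEVICE_ID + mac

def frame_checks_out(frame: bytes, header: bytes) -> bool:
    """Re-derive the expected header and compare in constant time."""
    return hmac.compare_digest(header, tag_frame(frame))

frames = [b"frame-1-pixels", b"frame-2-pixels"]
headers = [tag_frame(f) for f in frames]

print(frame_checks_out(frames[0], headers[0]))            # True: matches the archived original
print(frame_checks_out(b"frame-1-doctored", headers[0]))  # False: an edited frame no longer matches its tag
```

The real thing would have to live inside the codec's frame headers rather than alongside the file, but the check itself is that simple.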
 
More or less, give every video file its own DNA that can't be copied without someone else noticing.

Sent from the Oval Office using Putin's MacBook, and Barr's Wi-Fi password.

Thank you. I guess I am too verbose. You said it better than I did LOL.
 
I am thinking that all of this would dovetail with something called PGP (Pretty Good Privacy).

Pretty Good Privacy (PGP) is an encryption program that provides cryptographic privacy and authentication for data communication. PGP is used for signing, encrypting, and decrypting texts, e-mails, files, directories, and whole disk partitions and to increase the security of e-mail communications.

So, each frame of video and audio would have something that is similar to PGP; call it "Video PGP" if you want, I guess.

I don't think that the data overhead would "cost" very much in binary terms and it would be very easy to update any professional editing software to accommodate it.
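The standard detached-signature workflow would probably be the model. The sketch below just drives the gpg command line from Python and assumes a signing key for "newsroom@example.org" already exists in the local keyring; both the file name and the key name are made up:

```python
import subprocess

VIDEO = "interview_original.mp4"   # hypothetical archived original
SIGNER = "newsroom@example.org"    # hypothetical key ID in the local GnuPG keyring

# Publisher side: write a detached signature next to the original footage.
subprocess.run(
    ["gpg", "--local-user", SIGNER, "--output", VIDEO + ".sig",
     "--detach-sign", VIDEO],
    check=True,
)

# Anyone with the publisher's public key can later confirm the file is unaltered;
# gpg exits non-zero (raising CalledProcessError here) if the footage was modified.
subprocess.run(["gpg", "--verify", VIDEO + ".sig", VIDEO], check=True)
print("signature verifies: footage matches what the newsroom signed")
```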
 
Last time it was only bogus FB posts and Tweets; now the electorate will have visuals and won’t even have to read....
 
So...much the same way JVC inserted their little 4-bit frame header on every frame of video, I am proposing that ALL video and audio codecs should now HAVE to HAVE a set of bits for every frame of audio and video that defines what was used to RECORD the audio or video, and if you copy that file, those bits do NOT migrate and copy over to the copy.

Make sense?

More or less.

Sort of like serializing kit, and if someone used that serialized kit to make a copy, then the copy doesn't have the serial number.

Unless, of course, the person making the copy also copies the serial number.

This would appear to work fine, and "originals" would appear to be easily identifiable, unless the copy had the same "serial number" as the "original"; then the originators of the "copy" would be proclaiming that THEIRS was the "original" and we'd be right back where we started.
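For what it's worth, the two schemes fail in different ways, and a few lines of Python show it (the secret key is made up): a bare serial number proves nothing because it rides along with any copy, while a tag computed over the content catches alterations but still can't say which byte-identical copy came first.

```python
import hmac
import hashlib

SECRET = b"key-that-only-the-original-device-holds"

def serial_tag() -> bytes:
    """A bare serial number: nothing ties it to the footage it travels with."""
    return b"SN-12345"

def bound_tag(footage: bytes) -> bytes:
    """A tag derived from the footage itself plus a secret key."""
    return hmac.new(SECRET, footage, hashlib.sha256).digest()

original = b"original footage bytes"
doctored = b"doctored footage bytes"

# Copy the serial onto doctored footage: nobody can tell the difference.
print(serial_tag() == serial_tag())  # True either way; the serial proves nothing.

# Copy the bound tag onto doctored footage: verification fails,
# because the tag was computed over the original bytes.
stolen_tag = bound_tag(original)
print(hmac.compare_digest(stolen_tag, bound_tag(doctored)))  # False

# But a byte-for-byte copy of the original still verifies,
# so "which file came first" still needs the archive, as I said.
exact_copy = bytes(original)
print(hmac.compare_digest(stolen_tag, bound_tag(exact_copy)))  # True
```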
 
Last time it was only bogus FB posts and Tweets; now the electorate will have visuals and won’t even have to read....

Considering the growing number of people who are "functionally illiterate in practice" (that means BOTH those who are functionally illiterate and those who are capable of being functionally literate but just don't bother to expend the effort because they already "know the truth"), that is going to be seen as a "Good Thing" by certain segments of the election management industry.
 