NetEase Technology News, July 2. At a hearing of the Senate Intelligence Committee on June 8, former FBI Director James Comey said, "I hope there are tapes." Comey wanted the recordings as proof because our society treats audio and video files as evidence. But such files are not completely reliable, and they are not equivalent to the truth.
Today, when people see videos of violence or crime, especially footage of reasonable quality with no obvious signs of editing, they tend to assume that the events depicted actually occurred.
However, artificial intelligence is about to change how much video can be trusted.
Our analysis predicts that technologies such as big data and machine learning can help monitor reality and identify facts. On the other hand, the same technologies can also help us lie. Alongside the rapid development of artificial intelligence, audio and video forgery is making astonishing progress, driven largely by continuing improvements in AI itself. In the future, people will face far more material of uncertain authenticity; whether it is audio, video, images or text, it will be hard to tell whether it is genuine.
Lyrebird is a Montreal-based deep learning startup. The company is developing technology that lets anyone create surprisingly realistic speech in any person's voice, and it has released demos imitating celebrities such as Donald Trump, Barack Obama and Hillary Clinton. Although Lyrebird's imitations are impressive, their quality is still low and the signs of machine synthesis are obvious; even an ordinary listener can easily tell the speech is synthetic. Such technology is advancing rapidly, however. Creative software giant Adobe is working on similar technology and has announced its goal of creating a "Photoshop for audio."
Researchers at Stanford University and other institutions have made startling progress in video forgery. Using nothing more than an off-the-shelf webcam, researchers can use AI software to alter a person's facial expressions and the mouth movements associated with speech in YouTube videos. One researcher re-edited a video of former U.S. President George W. Bush, inserting new facial expressions and spoken words in real time.
Other artificial intelligence research groups have demonstrated how to run image recognition in reverse, a technique that can generate synthetic images from text descriptions. Jeff Clune, the researcher who led the work, noted: "People send me real images and I start to wonder whether they are fake. When they send me fake images, the quality is so good that I think they are real."
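For readers curious what "running recognition in reverse" means in practice, here is a minimal sketch of one simplified version of the idea: starting from noise and using gradient ascent to push a pretrained classifier's score for a chosen class higher, so the input gradually turns into an image the network recognizes. The model, class index and hyperparameters are illustrative assumptions, not the researchers' actual system.

```python
# Illustrative sketch only: synthesize an input that a pretrained classifier
# scores highly for one class, i.e. run the recognizer "in reverse".
import torch
import torchvision.models as models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.eval()

target_class = 954  # arbitrary ImageNet class chosen for illustration
image = torch.randn(1, 3, 224, 224, requires_grad=True)  # start from noise
optimizer = torch.optim.Adam([image], lr=0.05)

for step in range(200):
    optimizer.zero_grad()
    logits = model(image)
    loss = -logits[0, target_class]  # maximize the target-class score
    loss.backward()
    optimizer.step()
```

Real text-to-image systems are far more sophisticated, but the core intuition is the same: the network's judgment of what an image "should" look like is used to create the image itself.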
In general, the quality of counterfeit audio and video will keep rising while the cost keeps falling; this seems unavoidable. At the current rate of progress, fake audio could fool the human ear within two to three years, and within five to ten years counterfeits could fool even forensic experts. Once tools for producing fake video can generate high-quality footage and become widely available, such fake audio and video could affect the entire information ecosystem. The technology will change the definition of evidence and truth in journalism, government communications, criminal justice and national security.
According to reports, Russian intelligence agencies employ thousands of full-time workers to produce fake news articles, social media posts and comments on mainstream websites. These agents in turn control millions of social media bot accounts. A study by the Oxford Internet Institute's Computational Propaganda Research Project found that roughly half of the Russian Twitter accounts it reviewed were bots. These operations are not confined to Russia: in the United States, Russian social media bots have shown they can push fake news into mainstream coverage and even move US stock prices.
What happens when these agents and botnets begin sharing fake high-definition video and audio? The technology industry and governments cannot afford to be indifferent. The threat is manifold, and the solutions will need to be as well.
Some solutions will be technical in nature. Just as image-editing software such as Photoshop includes safeguards against being used to counterfeit money, there will be technical measures to mitigate the worst effects of AI forgery. Blockchain technology offers one possibility: the same approach that provides reliable ordering of Bitcoin transactions could be built into cameras and microphones to create recordings that cannot be tampered with unnoticed. This would not prevent re-editing or forgery, but it would at least provide cryptographically secure evidence that a particular file existed on a particular date.
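To make the timestamping idea concrete, here is a minimal sketch under simple assumptions: hash a recording as it is saved and keep the hash alongside a timestamp, so any later edit to the file can be detected. The file name is hypothetical, and a real system would publish the hash to a public blockchain or timestamping service rather than just printing it.

```python
# Sketch of a tamper-evident fingerprint for a recording: a SHA-256 hash plus a
# timestamp. Publishing the hash (e.g. to a blockchain) proves the file existed
# unmodified at that date; it does not prove the content itself is authentic.
import hashlib
import json
import time

def fingerprint_recording(path: str) -> dict:
    sha256 = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # read in 1 MiB chunks
            sha256.update(chunk)
    return {"file": path, "sha256": sha256.hexdigest(), "timestamp": int(time.time())}

record = fingerprint_recording("interview.mp4")  # hypothetical recording
print(json.dumps(record, indent=2))
```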
Other solutions will be regulatory and procedural. Police and prosecutors will have to establish evidentiary standards that prove the chain of custody from a particular camera or microphone; a video file attached to an anonymous email may no longer count as valid evidence. And since telephone and video calls can be not only digitally intercepted but also digitally simulated, people may fall back on face-to-face communication for high-stakes meetings.
Since the invention of photography and the phonograph at the end of the 19th century, people have found answers in important audio and video records. President Richard Nixon said he knew nothing about the Watergate break-in, but the tapes proved he was lying. As audio and video forgery develops, society must confront it squarely; otherwise we will have to live in a world where the truth can no longer be established.