April 30, 2021
Adnan Alattar, Ph.D., is Digimarc’s principal R&D engineer and has been with the company for 21 years. He has made extensive contributions to the development of Digimarc’s digital watermarking technology. Adnan is also a co-chair of the “Media Watermarking, Security & Forensics” conference and an expert in the field of digital watermarking.
We spoke with Adnan to discuss media security and his industry expertise in a rapidly evolving digital environment.
Digimarc: You are heavily involved in the “Media Watermarking, Security & Forensics” conference. When did you begin your involvement and what was your inspiration?
Adnan: My involvement with MWSF began shortly after its founding by Prof. Ed Delp in 1997. Initially, I served as a reviewer and session chair, then I became a member of the technical program committee. In 2012, I became a conference co-chair, and by 2016 I held an at-large conference chair position in the Electronic Imaging Symposium, the parent symposium of MWSF. My goal with this conference is to establish a premier venue where researchers and practitioners in the field of media watermarking, security and forensics can collaborate on research to advance the field, publish cutting-edge research results in a timely manner, and keep abreast of the latest developments.
Digimarc: What is the relationship between watermarking and media security?
Adnan: Watermarking is a critical component of media security. A watermark is discreetly embedded in the file without distorting the content the media carries, which makes it extremely useful for copyright protection and media authentication. Watermarking can therefore be used to verify a media file's integrity and detect tampering in video, audio or imagery. This deters illegal distribution and can aid in the identification of Deepfake videos.
Digimarc: Speaking of Deepfakes, can you give us a working definition? Given the extensive recent media coverage, there may be some misunderstanding of what they are.
Adnan: Deepfakes are fake videos that look convincingly real and can be generated from real videos using deep learning algorithms. These algorithms make it easy to alter how people look in the original video, or what they say or do.
Digimarc: What industries are focused on Deepfake issues right now?
Adnan: A wide assortment of industries is focused on Deepfakes: traditional social media giants such as Google, Facebook and Twitter; various national government agencies; universities and research centers; and mainstream news media outlets.
Digimarc: What are some of the hot topics these days in the manipulation of digital media? Is this the same as Deepfakes?
Adnan: Individuals use deep learning AI (Artificial Intelligence) techniques, such as GANs (Generative Adversarial Networks), to easily manipulate existing video into fabricated media pieces that appear real. For example, replacing the face or speech of one person with someone else's for impersonation purposes is a signature Deepfake alteration. If Deepfakes are disseminated during political campaigns, they could spread misinformation among voters and change the outcome of an election. Another recent Deepfake targeted a German bank's subsidiary and defrauded the institution's CEO: the Deepfake convinced the bank to transfer a large sum of money into a fraudulent bank account, resulting in a startling security breach.
Digimarc: Your conference is soliciting papers on several hot topics, including blockchain. What is the relationship between media security and blockchain, and how does it relate to Deepfakes?
Adnan: A blockchain is an open ledger in which various media owners can share media, but no alterations to the content can be made unless all parties consent. The ledger also documents the history of changes the media undergoes throughout its distribution. To use blockchain for video security, the hardware capturing the video should automatically create a timestamp, calculate a hash of the content, and store both in the video's metadata. The capture device could also watermark the video to indicate its source and provenance. All subsequent manipulation events should be written into the blockchain, creating an audit trail that can verify the integrity of the captured video. This accurate record keeping protects the media from tampering by outside perpetrators, which makes blockchain a viable tool for combating Deepfakes.
Although detecting Deepfakes is tricky and often challenging, there is strong evidence that pairing watermarking with blockchain technology can empower media platforms to prevent unwarranted attacks.
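The audit trail Adnan describes can be illustrated with a minimal hash-chained ledger. This is only a sketch of the core idea (each event records a content hash and the hash of the previous entry, so any later tampering breaks the chain); a real blockchain deployment would add digital signatures, consensus, and distributed storage, and all names below are illustrative:

```python
import hashlib
import json

def _block_hash(body: dict) -> str:
    """Hash a block's canonical JSON serialization."""
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def append_event(chain: list, event: str, content_hash: str, timestamp: str) -> None:
    """Append a media event (capture, edit, transcode) to the audit trail."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    body = {"event": event, "content_hash": content_hash,
            "timestamp": timestamp, "prev": prev}
    chain.append({**body, "hash": _block_hash(body)})

def verify_chain(chain: list) -> bool:
    """Re-derive every hash link; any tampering invalidates the trail."""
    prev = "0" * 64
    for block in chain:
        body = {k: v for k, v in block.items() if k != "hash"}
        if block["prev"] != prev or block["hash"] != _block_hash(body):
            return False
        prev = block["hash"]
    return True

# A capture device records the original content hash, then an edit event.
chain = []
append_event(chain, "capture",
             hashlib.sha256(b"raw video bytes").hexdigest(),
             "2021-04-30T10:00:00Z")
append_event(chain, "trim",
             hashlib.sha256(b"trimmed video bytes").hexdigest(),
             "2021-04-30T11:00:00Z")
print(verify_chain(chain))  # True: the trail is intact

# Forging a recorded hash after the fact breaks the chain.
chain[0]["content_hash"] = "forged"
print(verify_chain(chain))  # False: tampering detected
```

In this toy model the integrity check is purely local; in practice the chain would be replicated across parties so no single owner could rewrite history.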
Learn more about how Digimarc’s digital watermarking technology provides solutions for the Media & Entertainment industry.