The development of artificial intelligence (AI) has made image generation and editing dramatically easier and more accessible. This has heightened concerns about deliberately manipulated images causing political turmoil, especially when AI-powered deepfake tools are involved. These tools can quickly create new images and videos that spread misinformation or discredit political leaders, calling into question the authenticity of genuine photographs and eyewitness accounts.
However, the history of manipulating photographs for political purposes goes back to the 19th and 20th centuries, long before the terms “deepfake” or “AI” existed. Photographs were often altered to shape the image of world leaders, whether to comfort mourning families, demonstrate patriotism, or flatter portrait sitters by hiding their flaws. In some cases, manipulation even served to rewrite history according to a leader’s political agenda, as during the regime of Josef Stalin in the Soviet Union, when people he had murdered were sometimes erased from important “historical” photographs.
Today, organizations such as the Content Authenticity Initiative work to verify the authenticity of digital images and to detect AI-driven alterations. Provenance, the documented history of a photograph, is crucial in determining its authenticity. Expert conservators can also examine a photograph’s physical properties to identify anomalies. Sometimes, however, all it takes is a sense that a photo doesn’t look right, as when a library reading room assistant grew suspicious of a photograph featuring Ulysses S. Grant during the Civil War.
Although the methods of manipulating photographs have evolved, the overall goal remains the same: to shape or reshape the image of political leaders through careful editing. This underscores the ongoing significance and risks of using AI to create and alter photographs, videos, and audio for political purposes.
The whytry.ai article you just read is a brief synopsis; the original article can be found here: Read the Full Article…