BBC: How do we spot a "doctored" image

She apologized today (11.03.2024) after the publication of a photo in which she is pictured with her three children and which major news outlets, such as Reuters, AFP, and AP, were forced to withdraw because they considered it had been manipulated. "Like many amateur photographers, I sometimes experiment with editing. I apologize for any confusion caused by the family photo shared yesterday," Kate Middleton said in a post on X. It is noted that this was the first official photo released to the public by Kensington Palace since the operation Kate underwent, because of which she has withdrawn from her duties.

Following the publication of the photograph and the Princess of Wales's apology, the BBC offered some pointers that can help us work out whether an image has been tampered with, or "doctored" as we say.

In a world where images can be digitally altered with just a few clicks, or even created entirely from scratch by artificial intelligence (AI), it is becoming increasingly difficult to trust what our eyes see. Editing techniques have become so sophisticated that we have entered the era of hyper-realistic fakes. Such images can spread misinformation and could even sway public opinion on important events such as elections. So is there anything we can do to spot an image that may have been modified or created with artificial intelligence? The BBC wonders…

Reflections and shadows

Unnatural lighting is often a sign that a photograph has been tampered with. Check people's eyes: the light source is often reflected in them. If the size and color of the reflections do not match the position of the light, or if the two eyes look different, you have every reason to be suspicious. The way subjects and objects appear on reflective surfaces in an image can also give it away. The shadows of objects in the picture may not line up if it has been "assembled" from several images, although bear in mind that some photos are lit by more than one light source. It is also worth considering how light falls on the subject's face: if the sun is behind them, for example, their ears may look red. Artificial intelligence can also produce inconsistent lighting and shadows, but as the algorithms improve, AI-generated faces are often perceived as more real than actual human faces.

Hands and ears

Another revealing approach is to look for features that are difficult to reproduce. Artificial intelligence is bad, at least for the moment, at rendering hands and ears naturally, distorting their shapes, their proportions, and even the number of fingers.

Look at the metadata

Digital image files carry snippets of information that can help detect a fake. Every time a digital camera captures an image, metadata is recorded in the image file (a simple way to inspect it is sketched at the end of this article).

The "noise" of the image

Every digital camera sensor has tiny manufacturing imperfections that leave a kind of unique "fingerprint" of noise on its images. This fingerprint can be matched to a specific camera and can help identify which areas of a photo have been edited (a rough illustration of the idea also appears at the end).

Verification tools

Google and other companies have launched image verification tools that can help people identify images created with the help of artificial intelligence. Facebook and Instagram have begun labeling images that were created through artificial intelligence and come from Meta's systems. They even plan to do the same for images created by other companies' AI tools.
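To illustrate the metadata tip above: the snippet below is a minimal Python sketch (not part of the BBC piece) that reads EXIF metadata with the Pillow library from a hypothetical file named photo.jpg. Camera model, capture time and the software field can hint at how an image was produced, and editing tools often rewrite or strip these fields, which can itself be a clue.

```python
from PIL import Image
from PIL.ExifTags import TAGS

# Hypothetical example file; any JPEG straight from a digital camera will do.
image = Image.open("photo.jpg")
exif = image.getexif()

# Print each EXIF tag by name: camera model, capture time, editing software, etc.
for tag_id, value in exif.items():
    tag_name = TAGS.get(tag_id, tag_id)
    print(f"{tag_name}: {value}")
```

An image with no metadata at all, or with a software field naming an editor, is not proof of tampering, but it is a reason to look more closely.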
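On the sensor-"noise" point, the rough sketch below (again only an illustration, not the forensic method itself) subtracts a denoised copy of an image from the original to expose its noise residual. Real camera-fingerprint analysis is far more sophisticated, but the underlying idea of comparing residual noise across regions is the same; the file name is again hypothetical.

```python
import numpy as np
from PIL import Image, ImageFilter

# Load the image in grayscale and build a denoised copy with a median filter.
original = Image.open("photo.jpg").convert("L")
denoised = original.filter(ImageFilter.MedianFilter(size=3))

# The residual (original minus denoised) is a crude stand-in for the sensor's noise pattern.
residual = np.asarray(original, dtype=np.float32) - np.asarray(denoised, dtype=np.float32)

# Regions whose residual statistics differ sharply from the rest of the frame
# are candidates for having been pasted in or retouched.
print("Residual mean:", residual.mean(), "Residual std:", residual.std())
```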