Human Rights Watch recently conducted an extensive review of AI training materials and found that images of children taken from the Internet were used to train AI models without the consent of their families. These images included personal information that made it easy to identify the children, and many were obtained from social media accounts with privacy settings. This practice raises serious concerns about the violation of children’s privacy and the potential long-term implications of posting information about them online.
The report also highlights the issue of “sharenting,” where parents share information and photos of their children on social media. While this may seem harmless, the report notes that children cannot consent to the sharing of their personal information, and parents may not fully understand the consequences of their actions. Moreover, once this data is swept into machine-learning training sets, it raises ethical questions about AI companies profiting from a minor’s information without permission.
The recent U.S. Supreme Court ruling overturning the Chevron doctrine adds another layer of complexity, as it limits the power of federal agencies to regulate these practices. This leaves the regulation of privacy and data protection largely in the hands of the states, creating a disjointed and potentially inconsistent approach to addressing these issues.
In the absence of meaningful protections and ethical guidelines for technology, the responsibility falls on individuals to consider the implications of sharing personal information about their children online. Until comprehensive privacy legislation is in place, it is important to be cautious about sharing children’s data and to advocate for data dignity and ethical technology practices. The lack of federal oversight, and the ability of powerful technology and AI companies to set their own rules, further underscore the urgency of addressing these issues at the state level.
The whytry.ai article you just read is a brief synopsis; the original article can be found here: Read the Full Article…