
Deepfake of principal's voice is the latest case of AI being used for harm

Global Gist news portal | 2024-05-21 16:55:38

The most recent criminal case involving artificial intelligence emerged last week from a Maryland high school, where police say a principal was framed as racist by a fake recording of his voice.

The case is yet another reason why everyone — not just politicians and celebrities — should be concerned about this increasingly powerful deepfake technology, experts say.

“Everybody is vulnerable to attack, and anyone can do the attacking,” said Hany Farid, a professor at the University of California, Berkeley, who focuses on digital forensics and misinformation.

Here’s what to know about some of the latest uses of AI to cause harm:

AI HAS BECOME VERY ACCESSIBLE

Manipulating recorded sounds and images isn’t new. But the ease with which someone can alter information is a recent phenomenon. So is the speed with which it can spread on social media.

The fake audio clip that impersonated the principal is an example of a subset of artificial intelligence known as generative AI. It can create hyper-realistic new images, videos and audio clips, and it has become cheaper and easier to use in recent years, lowering the barrier for anyone with an internet connection.
