
California’s Attorney General says AI and ‘deepfake’ scams are widespread. Here’s how to avoid them



In this photo illustration an Artificial Intelligence (AI) symbol is displayed on a smartphone with stock market percentages in the background. (Omar Marques/SOPA Images/LightRocket via Getty Images)

Artificial intelligence and “deepfake” content, meaning fabricated visual or audio media, are only becoming more widespread as the technology advances, and California officials are looking to inform residents on how to spot them.

“Scammers are often quite literally in our pockets, just a phone call, social media message, or text away,” Attorney General Rob Bonta said in a statement. “AI and other novel and evolving technologies can make scams harder to spot. Knowing what to look for is an important way to keep consumers safe against these tactics. I urge Californians to take practical steps to guard against being victimized by scammers, including talking to friends and family who may be unaware of these dangers.”


AI-related scandals have already rocked the U.S. in recent months.

FILE – President Joe Biden signs an executive order on artificial intelligence in the East Room of the White House, Oct. 30, 2023, in Washington. Vice President Kamala Harris looks on at right. The White House said Wednesday, Feb. 21, 2024, that it is seeking public comment on the risks and benefits of having an AI system’s key components publicly available for anyone to use and modify. (AP Photo/Evan Vucci, File)

The proliferation of AI-generated pornographic photos of young students at a Beverly Hills middle school led to five expulsions in February. In April, an educator at a Maryland high school created a fake audio recording that made his boss, the principal, appear to make racist and antisemitic remarks.

According to the California AG, an AI scam targeting parents is making the rounds in which a cloned version of their child’s voice begs for money after a supposed car accident or run-in with the law. In January, robocalls in New Hampshire allegedly used an AI-cloned voice of President Joe Biden to discourage voters from participating in the state’s primary election.

Similar incidents have played out on a larger scale: even Taylor Swift has been a victim of AI-generated pornography, and misinformation can spread like wildfire online, especially in an election year.

Taylor Swift wears a Kansas City Chiefs tight end Travis Kelce jacket as she arrives before an NFL wild-card playoff football game between the Chiefs and the Miami Dolphins, Saturday, Jan. 13, 2024, in Kansas City, Mo. A scourge of pornographic deepfake images generated by artificial intelligence and sexualizing people without their consent has hit its most famous victim, singer Taylor Swift, drawing attention to a problem that tech platforms and anti-abuse groups have struggled to solve. (AP Photo/Ed Zurga, File)

Legislation to combat these “deepfakes” is in the works at the state and national levels, but in the meantime, how can regular people avoid falling victim to these attacks and misinformation?

Here are the tips from the California AG’s office:

  • Develop family code words: Develop simple ways of verifying if a family member truly is in trouble before responding to phone calls for financial help or sharing personal information. Talk with family about designating “safe words” or asking a question that only that person would know the answer to. When creating a question, be mindful that scammers might have access to information from social media and other online sources.
  • Minimize personal audio/video content on social media accounts: Consider removing personal phone numbers and audio and video clips from your and your children’s social media profiles. AI scammers can use these clips to create clone voices and videos of loved ones.
  • Check privacy settings: Strengthen privacy settings on social media so that strangers don’t know facts about your life and your current whereabouts, including whether you or a family member is out of town.
  • Don’t answer the phone: Let phone calls from unfamiliar numbers go to voicemail. They often are illegal robocalls.
  • Don’t trust caller ID: Phone numbers can be “spoofed” to look like a familiar number from friends, family, a school district, or a government agency. Don’t assume the caller ID is accurate and be wary if anything seems different about the caller or if they ask for financial or personal information.
  • Hang up the phone: If you suspect a scam call, immediately hang up. Don’t automatically trust automated messages: often pressing “1” to indicate you don’t want to receive future calls just notifies bad actors that they should continue calling this active phone number.
  • Take advantage of call-blocking technology: Many cellular providers offer enhanced call-blocking technology that can assist in preventing robocalls from reaching you.
  • Don’t click on suspicious links: Scammers will try to get you to click on links that are sent to you in texts, emails, or social media. Text messaging is particularly dangerous because you might hurriedly click on a link and begin entering a password, not realizing that the link is phony, and your password is being recorded.
  • Go directly to websites: Go directly to the website of a company you are familiar with rather than clicking on a link that has been sent to you. Some fraudulent links are made to look very similar to the actual website address. You should never click on links that are texted to you – for example, by what seems like a bank. Instead, go to the bank’s website in your own web browser (see the first sketch after this list for one way look-alike links give themselves away).
  • Use strong passwords: Protect yourself by using a different, unique password for each of your online accounts. Make sure the passwords you use are at least eight characters long and include a mix of letters, numbers, and symbols. Consider using a password manager to suggest and store strong passwords (see the second sketch after this list).
  • Protect your Social Security number (SSN) and other sensitive information: Keep your Social Security card at home in a safe place instead of carrying it around in your wallet. Only provide your SSN when absolutely necessary, such as on tax forms or employment records. If a business asks you for your SSN, see if there is another number that can be used instead.
  • Beware of government impersonations and other common scams: Some scammers are sophisticated. They may offer to provide “documentation” or “evidence,” or use the name of a real government official or agency to make you think that their calls are legitimate. If a government agency calls you and asks for financial or personal information, hang up and go to the agency’s official website (which should be a .gov website) and call them directly. Government officials will not threaten you with arrest or legal action in exchange for immediate payment. They will not promise to increase your benefits or resolve an issue in exchange for a fee or transfer of funds to a protected account. And they will not ask for payment in the form of gift cards, prepaid debit cards, wire transfer, internet currency, or by mailing cash.
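To illustrate the “go directly to websites” advice, here is a minimal Python sketch of a domain check, assuming a hypothetical list of trusted sites; the TRUSTED_DOMAINS entries and the looks_trustworthy helper are made-up examples for illustration, not a tool from the AG’s office.

```python
from urllib.parse import urlparse

# Hypothetical list of domains you actually trust (assumption for illustration).
TRUSTED_DOMAINS = {"bankofexample.com", "irs.gov"}

def looks_trustworthy(url: str) -> bool:
    """Return True only if the link's hostname is a trusted domain
    or a subdomain of one. Look-alike hosts such as
    'bankofexample.com.secure-login.net' fail this check."""
    host = (urlparse(url).hostname or "").lower()
    return any(host == d or host.endswith("." + d) for d in TRUSTED_DOMAINS)

print(looks_trustworthy("https://www.bankofexample.com/login"))         # True
print(looks_trustworthy("https://bankofexample.com.secure-login.net"))  # False
```

The point of the check is that a fraudulent address can contain a familiar name without actually belonging to that site, which is why typing the address yourself is safer than tapping a texted link.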
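Similarly, as a rough sketch of the password guidance, the following Python snippet uses the standard library’s secrets module to generate a random password mixing letters, numbers, and symbols; the make_password helper and its 16-character default are illustrative assumptions, not an official recommendation.

```python
import secrets
import string

def make_password(length: int = 16) -> str:
    """Generate a random password mixing letters, digits, and symbols.
    16 characters comfortably exceeds the eight-character minimum above."""
    alphabet = string.ascii_letters + string.digits + "!@#$%^&*"
    while True:
        pw = "".join(secrets.choice(alphabet) for _ in range(length))
        # Keep only candidates that contain all three character classes.
        if (any(c.isalpha() for c in pw)
                and any(c.isdigit() for c in pw)
                and any(c in "!@#$%^&*" for c in pw)):
            return pw

print(make_password())
```

A reputable password manager does essentially this for you, along with storing the result so you never have to reuse a password across accounts.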


