Most Americans do not believe artificial intelligence (AI) is trustworthy for election information.

A poll released Thursday by The Associated Press-NORC Center for Public Affairs Research and USAFacts found that nearly two-thirds of Americans do not trust election information produced by generative AI.

Approximately 64% of respondents said they are not confident that election information generated by AI chatbots is reliably factual.

In fact, 43% of survey respondents said they believe AI programs will make finding factual information about the presidential election more difficult. Only 16% of respondents said AI programs will make it easier.

AI chatbots are programs built on large language models that allow users to request information through conversational prompts. Users can ask questions via text input, and the bot returns an answer composed in a similarly conversational format.

Some of the most successful chatbots are trained on thousands of terabytes of collected data to formulate their answers, but these programs can only sort, remix and regurgitate information scraped from somewhere else. AI is unable to think or reason like a human.

In addition to factual errors regularly made by chatbots, AI programs can be used by malicious actors in a variety of ways to spread disinformation.

About 52% of respondents in the AP-NORC poll expressed concern that AI will compromise their access to verifiable information, compared to just 9% who said they are excited about AI’s expanding role in the dissemination of information.

While far from perfect, AI programs are becoming increasingly capable of generating realistic images of real-world individuals. Manufactured images of former President Donald Trump, Vice President Kamala Harris and others have become common on social media.

The AP-NORC poll was conducted between July 29 and Aug. 8. Its self-reported margin of error is +/- 4%.

This post appeared first on FOX NEWS