Israel attack reveals the dangers of reliance on artificial intelligence

Students and people of all ages have been dumbed down intentionally.

They have been indoctrinated to believe certain things, whether about climate change or anything else. And instead of reading books, they now do Google searches.

We see the results of this all over society now as these students graduate and enter the adult world.
 
Students, and the adults they become, are programmed instead of being taught to do research and ask questions. This is especially a problem with the media. 
 
Now we see one result of that, one that for Israel had lethal consequences, according to ZeroHedge:
 
Hamas’s massacre of Israeli civilians, known as ‘Black Sabbath,’ caught virtually everyone by surprise, even though the group had a long history of violence. One reason for this situation is the lack of information on several aspects of Hamas’s modus operandi. The resulting lacuna has biased the algorithms underpinning search engines that drive artificial intelligence (AI) on the subject.
 
The AI Challenge

The prominence of AI has profoundly and irrevocably changed the human discourse. From its inception on Google and other search engines to the most recent iteration of chatbots such as ChatGPT or Bard, complex algorithms have increasingly driven this process.

A large literature, mostly highly specialized, has analyzed numerous possible biases of the AI discursive products. Bias is created when one idea, topic, or concept is disproportionately weighted against another. Faulty algorithms can introduce bias and need to be adjusted. But other issues are also at play.
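The "disproportionate weighting" mechanism can be sketched with a toy example. Here a naive frequency-based ranker surfaces whichever framing dominates the data, regardless of accuracy. The corpus, phrases, and scoring function are all hypothetical, illustrating the general idea rather than any real search engine's algorithm:

```python
from collections import Counter

# Hypothetical mini-corpus: one framing is heavily overrepresented.
documents = (
    ["national resistance group"] * 8 +  # conventional framing, many copies
    ["savage terror group"] * 2          # minority framing, few copies
)

def rank_by_frequency(docs):
    """Naive relevance: rank each framing by how often it appears.

    This mimics a frequency-weighted ranker: the narrative that
    dominates the training data dominates the results, whether
    or not it is the accurate one.
    """
    counts = Counter(docs)
    return [phrase for phrase, _ in counts.most_common()]

ranking = rank_by_frequency(documents)
print(ranking[0])  # the overrepresented framing ranks first
```

No correction for representativeness is applied here, which is exactly the point: with imbalanced inputs, the most common take wins by construction.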

  • Choosing representative data to correct for bias is also recommended, but in cases where voluminous data is generated on a daily basis over extended periods of time, such remedies are not practical. Experts point out biases which occur when there is imbalance in available data, in the sense that certain topics are overrepresented, whereas information on others hardly exists.
  • Quality of data in the discourse varies from rigorous research appearing in respectable academic publications to conspiracy theories found in niche outlets and social media. The sheer magnitude of ideas/topics/concepts in the discursive universe makes it hard to evaluate their quality. As a rule, discerning players in the discourse shy away from outlandish conspiracy theories, but evaluation of the in-between narratives is exceedingly hard.
  • Relations and causations between variables, two distinctive concepts, are regularly confused in discursive practices, creating a host of fallacies and biases in the narrative. When correlation is mistaken for or misrepresented as causation, it generates a “reality” that does not exist.

These three sources of bias helped to mask Hamas’s true character as a savage terror group, with many adopting the narrative of a national resistance group fighting to liberate Palestinians from “Israel’s oppression.”

 
It appears that people throughout the Biden administration, desperately wanting a deal with Iran, which pledges death to Israel and death to America, were lulled to sleep while Hamas was training and preparing to attack Israel. They Googled instead of studying history.
 
It appears that the stock market, social media outlets, search engines, and others also rely too much on algorithms and too little on critical thinking. Some of them must have gotten burned for lack of actionable information based on conditions on the ground.
 
It was all there, but buried in the algorithms, which push the most conventional takes to the top of searches. That dampens thinking: critical thinking is not operative here, and nobody develops his brain by doing just Google searches.
 
It is sad that it took a massacre for that information finally to come out.
 
Image:  Pixabay / Pixabay License

 

 
