Ex-Google CEO says tech sector faces A.I. ‘reckoning’

Artificial-intelligence-powered tools have quickly gone from sci-fi experiments to a big part of people’s daily lives—for better or for worse. While some people are using A.I. to plan vacations and illustrate comic books, others have been using it to generate fake images of the pope in a puffer jacket or write fraudulent essays. 

Eric Schmidt, the former Google CEO and chairman, says he can hardly keep up with how quickly A.I. technology like ChatGPT is being deployed, and he thinks the tech sector must confront how to use the technology so that it does more good than bad. 

“We, collectively, in our industry face a reckoning of, how do we want to make sure this stuff doesn’t harm but just helps?” Schmidt said in an interview with ABC News on Sunday. He gave the example of social media influencing election outcomes and, in some extreme cases, even leading to deaths. 

“No one meant that as [the] goal, and yet it happened. How do we prevent that with this [A.I.] technology?” Schmidt told ABC News. 

The former Google executive said that A.I. had the potential to address big problems that plague society. He highlighted how certain industries like health and education will benefit from the advancement of A.I. by improving access to resources. 

“Imagine a world where you have an A.I. tutor that increases the educational capability of everyone in every language globally. These are remarkable,” Schmidt said. 

“But, at the same time, they face extraordinary—we face extraordinary new challenges from these things, whether it’s the deepfakes that you’ve discussed, or what happens when people fall in love with their A.I. tutor?” Schmidt said. He added that he was concerned about A.I.’s “use in biology or in cyberattacks,” and said the real problem with A.I. was when it was “used to manipulate people’s day-to-day lives, literally the way they think, what they choose and so forth, it affects how democracies work.” 

Although A.I. has long been a subject of discussion, it broke into the public consciousness when OpenAI’s ChatGPT was introduced last November. Within two months, ChatGPT had gained over 100 million monthly active users, making it the fastest-growing internet application in history. Since then, Google and Microsoft have introduced their own A.I. chatbots. 

As these bots become more widely adopted, concerns about the safety, bias, and accuracy of what they say continue to pose challenges. In one instance, Microsoft’s chatbot advised a reporter to leave his wife, while Google’s chatbot Bard made a factual error in a public demo that resulted in Google parent Alphabet shedding $100 billion in market value.

It’s all about balance

The ex-CEO of Google has spoken about the potential of A.I. in the past—including in military and warfare. Earlier this year, he said that the Pentagon could capitalize on the technological advancements of A.I. to modernize weapons and circumvent the slow pace of defense upgrades in the U.S.

“Every once in a while, a new weapon, a new technology comes along that changes things,” Schmidt said in an interview with Wired in February. 

While A.I.’s strides are remarkable, Schmidt also had some dire forecasts about what inappropriate use of A.I. could mean for society. 

In an interview with Amanpour & Company last month, he said that A.I. could damage politics by enabling the spread of false information and “outrageous” messaging, and suggested that an agreement between government and private players on A.I. regulation was critical to its smooth functioning.

Schmidt, who coauthored a book called The Age of A.I., released in 2021, said that guardrails need to be put in place if A.I. is to be used for the betterment of society. First, the government should find the right way to address and communicate about A.I., according to the former Google chief. Next, he suggested, the tech industry must have an organization to set controls on how the technology is used. 
