At Fortune magazine’s Most Powerful Women Summit on Tuesday, AI leaders from Accenture, Salesforce, and Bloomberg Beta discussed why more women aren’t taking advantage of the technology and how it can either eliminate bias in data or make it worse. Karin Klein, founding partner at venture capital firm Bloomberg Beta, said she has read that women are 20% less likely than men to use ChatGPT at work, and that the gap could be even larger. Hesitant to adopt the technology, perceiving it as biased against them, and distrustful of its impact, many women are jumping off the AI train.
But Klein said women shouldn’t be so quick to abandon AI. At its current pace of scaling and consolidation, the technology is constantly changing. “You can’t try it once and say, ‘Oh, I get it,’ or ‘It didn’t work for me,’ or ‘I don’t know, the results were bad,’” she said. “Well, try it again in six months. You might get better results.”
Klein wants women to experiment with AI on their own time and bring its most useful applications into their work, such as composing emails and scheduling meetings. She acknowledged that while the tools carry real risks, there are also plenty of ways to take advantage of them. And if women don’t get on the AI bandwagon, they won’t be able to keep up with their male colleagues.
“I don’t want women or any community to be left behind because, instead of hearing about opportunities, we always hear about risks,” Klein said.
Lan Guan, chief AI officer at Accenture, echoed her sentiments.
“All women need to be part of this AI movement by being early adopters,” Guan said. “There’s a lot of fear at first. Seeing is believing, so it’s the responsibility of every business leader to drive this grassroots enablement so that everyone in the company can use safe and trusted AI tools.”
But beyond having executives take the reins, Guan said women need to seize the moment on their own. She encouraged women to be not only early adopters of AI but also champions of the generative AI movement.
Staying on the sidelines carries real risk. Guan recalled an example in which a chatbot, asked about a woman taking on a new role, assumed she would be a housewife, while it assumed a man in the same situation would become a financial leader. The dataset the AI was trained on was biased, so it produced sexist results. That could change as more women take the lead in building and testing AI.
“Something is wrong here. If we don’t take an active role in eliminating bias in AI, starting with each of us, these kinds of problems will never go away,” Guan said.
Paula Goldman, Salesforce’s chief ethical and humane use officer, agreed. She said underrepresented groups need to be part of the AI process, and that her company hires a diverse workforce to challenge and shape its technology models. Those employees test the AI, coach it, and provide feedback, which helps identify biases and weaknesses in the tools. Without their input, the AI will continue down the same path.
“Feedback when using an AI system can significantly change the trajectory of the AI system,” Goldman said.