‘Every copilot pilot gets stuck in pilot’ unless companies balance data security and innovation, say experts
“It’s very hard to innovate unless the underlying data that you’re innovating on is properly protected,” said Brian Vecci, field CTO at data security firm Varonis. “We’re trying to make people more productive, we’re trying to use AI and other new technologies, but in order to realize these benefits, it has to be done safely.”
Scott Holcomb, U.S. enterprise trust AI leader at Deloitte, agreed that both internally and for his clients, “we’ve absolutely had to put guardrails in place” in terms of what people can and cannot do when using AI tools. For example, the amount of data that Microsoft Copilot has on individuals and organizations is “immense,” he explained. “We were not comfortable with that, so we had to work our way through that with Microsoft, but we absolutely had to do a lot of training for our staff in terms of what you can and can’t do with client data, too.”
Yet leaders like Keith Na, SVP of technology and data at Cargill, cautioned that swinging too far the other way—shutting down experimentation altogether—can be just as dangerous. What organizations need, he said, is a culture of curiosity: a willingness to let engineers break, test, and learn in safe spaces.
“I think a lot of technologists go into our profession to solve badass problems together,” he said. “And I think over time we’re isolating our [teams].”
For the past 18 months, he explained, the company has worked to break down those barriers and embed engineers in product teams. “Not only does it solve the hard problems in a more simple way, it’s actually created a culture and an environment where people are having fun coming to work, they’re solving problems that we haven’t been able to solve and the morale has just skyrocketed,” he said. Over time, “this creates an environment of proactive innovation while still putting guardrails in place.”