NuEnergy.ai continues to gain traction with a platform that supports the ethical and transparent governance of artificial intelligence (AI) systems.
The company has announced its third implementation of a large-scale AI governance project with the federal government – a deployment of its Machine Trust Platform (MTP) with Public Services and Procurement Canada (PSPC).
NuEnergy.ai is already working with the RCMP and with Innovation, Science and Economic Development Canada. All three contracts are part of an Innovative Solutions Canada (ISC) program to test-drive Canadian-made innovation.
“NuEnergy’s MTP is being continuously enhanced based on feedback from the Government of Canada and their wide variety of AI use cases,” said NuEnergy CEO and co-founder Niraj Bhargava.
NuEnergy.ai – which has staff in Ottawa, Waterloo, Toronto, Montreal and Vancouver – was launched four years ago to provide AI management software and consulting services that help clients set up “guardrails” that mitigate risk and protect trust in an organization. Those guardrails consist of governance plans and the software needed to measure essential AI “trust parameters” such as privacy, ethics, transparency and biases.
Artificial intelligence is increasingly used in commercial and government systems, from Siri and online chatbots to customer service, health care, transportation, video streaming, self-driving cars, facial recognition, law enforcement and more.
Given AI’s dependence on data sets, its growing use has sparked a global discussion about privacy, security, ethics, biases, cultural sensitivities and human rights. The goal, according to governance specialists, is to ensure that AI technologies are understandable, transparent and ethical.
The European Union is in the midst of refining a proposed Artificial Intelligence Act. In Canada, the federal government recently tabled Bill C-27, a revamping of privacy legislation that sets out AI and data protection requirements. And earlier this year, the Ontario government issued a list of six “beta principles” to guide the use of AI and data-enhanced technologies in Ontario.
Meanwhile, Communitech is developing a proposed “Good AI” coalition to help guide founders and shape discussions around the ethical and transparent use of artificial intelligence.
The Good AI project is an extension of Communitech’s “tech for good” mantra and the notion that Canada’s reputation as a trusted nation is a competitive advantage for Canadian tech companies.
“In a world where technology, and especially big tech, is not trusted coming out of the United States and not trusted coming out of China, there’s an opportunity for Canada to build technology that has trust inherently built into it and for us to be able to bring that to the rest of the world,” Communitech CEO Chris Albinson told a CityAge event last November.
Bhargava, who has been working with Communitech on the Good AI initiative, agrees. Not only is Canada a trusted nation, it’s also a leader in AI and machine learning.
“We see Canada as potentially leaders in this, and we’re getting calls from international settings to share what we’ve learned,” Bhargava said.
While numerous countries and economic jurisdictions are developing ethical AI frameworks, NuEnergy.ai aims to provide practical assistance to companies and public organizations that want to mitigate the risks around AI, data and privacy.
“As the Artificial Intelligence and Data Act and Bill C-27 come into effect, all organizations bound by the act who are procuring AI will need to take action to ensure ethical governance frameworks and monitoring are in place,” Christian Siregar, a NuEnergy.ai faculty member and AI measurement expert, said in a news release. “Our platform is evolving to also address the need for conformity assessments and compliance with these important regulations.”
With AI embedded in so many technologies, Bhargava said organizations need to be more inquisitive about what’s in the products they’re buying. They also need the practical governance and measurement tools to mitigate unintended consequences from using AI, a technology that, by its very nature, is meant to be dynamic and evolving.
“AI can be embedded in a lot of other procurements, so you may or may not always know that what you're procuring includes machine learning models and algorithms,” he said. “So there has to be a lens on when do you need to have certain checks and balances beyond what may have been more traditional (in the past).”
NuEnergy.ai has been hearing from a wide range of organizations that collect and use data, such as law enforcement. The company has also been hearing more from boards of directors who are responsible for ensuring that risk is properly managed within an organization.
“Public trust matters so much in so many industries,” said Bhargava. “You can’t just move towards technology solutions; you’ve got to balance it with public trust.”