Datametrex AI (DM.V) gets a Department of National Defence contract for its fake news filter

Does Canada have a fake news problem?

A national survey conducted by Nanos Research for the organization Canadian Journalists for Free Expression (CJFE) found that “more than eight in ten Canadians agree or somewhat agree that search engines like Google should be forced to remove search results related to a person’s name when they are inaccurate, incomplete, or outdated and that fake news is making it more difficult to find accurate sources of information. More than seven in ten Canadians agree or somewhat agree that government regulation is needed to prevent the proliferation of fake news.”

We don’t have the same free speech protections as the United States. Instead, the Canadian Charter of Rights and Freedoms affords us a reasonable expectation of free expression, with carve-outs for hate speech, sedition and the like. But there’s a lingering question about how far the government can reach in curtailing its citizens’ access to information, and that question hinges on what is and is not fake news.

The government doesn’t have an answer for us yet, except to say that it believes fake news could threaten Canada’s democratic institutions at a time when traditional news outlets are facing cutbacks and financial challenges, and that there’s not much it can do to stop it.

So naturally they downloaded the problem to the private sector, and that’s (probably) where the IDEaS program came from.

Datametrex AI (DM.V) was awarded the second contract of a multiphase research and development program through the Department of National Defence’s Innovation for Defence Excellence and Security (IDEaS) program today.

The company will receive CAD$945,094 for Component 1b, which will further develop Nexalogy’s Social Media Automated Reporting Technologies (SMART) while expanding its fake news and narrative detection technologies. The new cash infusion brings the total value of this contract to $1.1 million, and if DM can finish the product, it will be eligible for extra funding.

“This is a solid validation of the work Nexalogy is doing to help secure the safety of Canadians from cyber threats. Working with the Canadian government to build new tools that will be utilised by the military and then be applied to the corporate environment is a key driver for technical advancement in our country.” says Andrew Ryu, Chairman and CEO of the Company.

Fake news detection

Originally announced in February as part of the government’s defence policy, Strong, Secure, Engaged, the program is a 20-year commitment to $1.6 billion of investment in innovations for defence and security. Its mandate is the ongoing search for solutions to support the development of defence and security capabilities.

The IDEaS program supports the development of solutions from the conceptual stage through prototype testing and capability development. The product being developed under IDEaS is Nexalogy, which leverages algorithm-based machine learning to analyze data pulled from social media, helping the government (and other agencies) filter out fake news, malicious bots and other erroneous messages.

Here’s what the project offers:

The following three technologies — topic, viewpoint and narrative identification — track false narratives in the discussions on social media for a region or in relation to an event, such as an election. This is key to deception detection and message shaping:

  • Topic detection — automatic detection of topics within social media, using unsupervised machine learning (ML) and natural language processing (NLP);
  • Viewpoint clustering — automatic detection of viewpoints circulating within social media, using the same unsupervised ML and NLP techniques;
  • Narrative identification — automatic detection of narratives circulating within social media, again using unsupervised ML and NLP;
  • Topic/viewpoint/narrative user interface — topics are a subcategory of viewpoints, which are a subcategory of narratives; these layers are automatically detected and displayed so that a NexaIntelligence user can select and filter different summarized elements from the data set.
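Nexalogy hasn’t published the internals of its detection stack, but the unsupervised-clustering idea behind “topic detection” can be sketched in a few lines. The toy below is purely illustrative (the posts, the similarity threshold and the greedy single-pass strategy are all assumptions, not anything from the company): it turns each post into a bag-of-words vector and groups posts whose cosine similarity to an existing cluster centroid clears a threshold.

```python
from collections import Counter
import math

def tokenize(text):
    # Lowercase and split on non-alphanumeric characters.
    cleaned = ''.join(c if c.isalnum() else ' ' for c in text.lower())
    return cleaned.split()

def cosine(a, b):
    # Cosine similarity between two bag-of-words Counters.
    num = sum(a[t] * b[t] for t in set(a) & set(b))
    denom = (math.sqrt(sum(v * v for v in a.values()))
             * math.sqrt(sum(v * v for v in b.values())))
    return num / denom if denom else 0.0

def cluster_posts(posts, threshold=0.3):
    """Greedy single-pass clustering: assign each post to the first
    cluster whose centroid is similar enough, else start a new cluster."""
    clusters = []  # list of (centroid Counter, member posts)
    for post in posts:
        vec = Counter(tokenize(post))
        for centroid, members in clusters:
            if cosine(vec, centroid) >= threshold:
                centroid.update(vec)  # fold the post into the centroid
                members.append(post)
                break
        else:
            clusters.append((Counter(vec), [post]))
    return [members for _, members in clusters]

posts = [
    "Election results delayed in three districts",
    "Officials say election results will be delayed",
    "New phone released with better camera",
    "Camera on the new phone impresses reviewers",
]
topics = cluster_posts(posts)  # two clusters: election posts, phone posts
```

A production system would swap the bag-of-words vectors for learned embeddings and a proper clustering algorithm, but the shape of the problem — group posts by content with no labelled training data, then surface the groups to an analyst — is the same.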

—Joseph Morton