Igor Borisov told state news agency RIA Novosti that his team had concerns over the vote, which had not conformed to “international standards”.
In the febrile atmosphere that followed the referendum, Borisov’s intervention merely confirmed what many already believed. Fuelled by a conspiracy theory, a campaign for a re-vote was launched. Particularly influential in drawing supporters to the cause was a YouTube video alleging vote rigging, which featured the “smoking gun” of an election counter apparently moving votes from the Yes pile to the No pile.
The video has been viewed nearly 900,000 times and is likely to have shared an audience with a Change.org petition launched by Rally for a Revote, which attracted more than 100,000 signatures. Research published last week by Ben Nimmo, an analyst at US think tank the Atlantic Council, claimed pro-Kremlin internet trolls helped bolster the success of both the video and the petition, despite the ultimate failure of the overall campaign.
Mr Nimmo said that while there had been concerns among genuine Scottish voters about the fairness of the referendum, these had been amplified and disseminated by the Russian trolls.
His work builds on a growing body of evidence showing Russian attempts to influence Western democracies via social media.
Last month the US Congress published a list of more than 2,700 Twitter accounts with links to the St Petersburg-based Internet Research Agency, sometimes referred to as the “troll factory”.
Researchers at Edinburgh University have identified more than 400 accounts operating out of the agency which have commented on UK politics.
While there is limited evidence of links between the Kremlin and the shadowy troll factory, there has in the past few months been a growing awareness of Russian efforts at subversion, something Prime Minister Theresa May has described as an attempt to “weaponise” information.
The difficulty for law enforcement and the security services is that much of this agitating is being done under the auspices of free speech and isn’t actually illegal.
On occasion, however, things have become more sinister, such as when pro-Russian Twitter accounts attempted to foment anti-Muslim sentiment following the Westminster Bridge terror attack.
Trolls have also been accused of setting up Facebook accounts which have orchestrated demonstrations (and counter-demonstrations) in the real world, raising the possibility of remote actors creating potential flashpoints from thousands of miles away.
After years of burying their heads in the sand, the social media companies do now appear to be attempting to take responsibility.
Twitter this week suspended the accounts of far-right Britain First leaders Paul Golding and Jayda Fransen after tightening its rules on hate speech.
Tracking down anonymous trolls hidden within the apparatus of one of the world’s most secretive states may be an altogether more difficult task.