The move follows the death of British 14-year-old Molly Russell, who killed herself in 2017 after viewing graphic content on the platform.
Her father Ian said he believed Instagram was partly responsible for her death.
Speaking at the NSPCC's How Safe Are Our Children? conference in June, Mr Russell said: "It is important to acknowledge that they (technology firms) do a lot of good, but sadly their platforms are being used by people to do harm and they have not done enough to prevent that.
"Unless change happens, their platforms will become toxic."
Mr Russell has described Instagram's new commitment as "sincere" but said the company needed to act more swiftly.
The Facebook-owned app's latest promise covers explicit drawings, cartoons and memes about suicide, along with any other material "promoting" self-harm.
Instagram chief Adam Mosseri told BBC News: "It will take time to fully implement... but it's not going to be the last step we take."
Last February, Instagram banned "graphic images of self-harm" and restricted content with suicidal themes, covering both pictures and videos.
When Molly died, her father found graphic content about self-harm and suicide on her Instagram account, and similar material on her Pinterest account.
Instagram says it has doubled the amount of self-harm and suicide-related material it has removed since the start of this year.
Between April and June 2019, it said it had removed 834,000 pieces of content, 77% of which had not been reported by users.
"There is still very clearly more work to do, this work never ends," Mr Mosseri said.
Mr Russell responded: "I just hope he delivers."