While there is an option to report an account as a whole, all it does is flag the account itself for the team to review and check whether they're creating ToS-breaking content. However, most of the time an account is reported, it isn't for the content they're making — it's for what they're saying as a user on the app (their comments and replies).
As of right now, reporting an account does nothing to hold people accountable for toxic behavior in comments — whether that's regular use of hate speech (racist, homophobic, etc.) or telling others to kill themselves and harassing them. That needs to change ASAP.
To change that, let us report specific problematic comments/replies on people's bytes AND type out a small blurb explaining why they're being reported (sexual harassment, hate speech, homophobic slurs, etc.). That would make getting this hate off the app quicker and easier, so we can all continue to have a safe space to create.