Facebook recently announced it is launching enhanced suicide prevention efforts based on intelligence it gathers automatically from users' content, such as posts and videos uploaded to the social media platform.

A week ago, Facebook’s vice president of product management, Guy Rosen, published a blog post offering some insight into the program’s goals.

The move comes after several high-profile cases in which people used Facebook Live to stream suicide attempts.

“Facebook is a place where friends and family are already connected and we are able to help connect a person in distress with people who can support them,” read the post. “It’s part of our ongoing effort to help build a safe community on and off Facebook.”

Reaction to the announcement has been mixed. Though some people are praising the company's proactive efforts to prevent suicide attempts (as well as other types of self-harm), many others have worries about privacy.

Misha MacIntyre is someone familiar with the suicide-related dangers that surface on social media, particularly among its younger users. MacIntyre is a former EMT and paramedic employed as a health technician in a school system near Hidden Valley Lake, California. “I’ve been on scene where people have attempted and completed suicides,” she told LifeZette.

Regarding Facebook’s new program, she believes it could be helpful, “if it’s done right.”

She deals with suicidal students within the school system on a regular basis, she explained. Some of them come to her attention via fellow students, who show her the Facebook and Instagram posts of friends they're concerned about.

One of MacIntyre’s concerns about social media in general is related to cyberbullying — which has played a role in a number of tragic and widely reported suicides among youth. Though Facebook has talked about this issue in the past, cyberbullying remains a significant problem on the platform.

Facebook says its implementation of the suicide prevention protocol will include elements that go beyond the algorithm used to flag worrisome content. In addition to pattern recognition technology designed to key in on certain phrases, the company is dedicating additional staff from its community operations team to review reports that users may harm themselves.
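Facebook has not published technical details of its system, but the general shape described here (automated matching of concerning phrases, with flagged posts routed to a human review queue) can be illustrated with a minimal sketch. Everything below, including the WATCH_PHRASES list, the ReviewQueue class, and the flag_post function, is a hypothetical stand-in rather than Facebook's actual code; the company's real pipeline reportedly relies on trained pattern recognition models, not a hand-written phrase list.

# Illustrative sketch only; not Facebook's actual system. It assumes a simple
# phrase-matching approach in which posts containing any phrase from a
# hypothetical watch list are queued for human review.
import re
from dataclasses import dataclass, field

# Hypothetical watch list; a real system would use trained classifiers,
# not a hand-written phrase list.
WATCH_PHRASES = [
    "want to end it all",
    "no reason to go on",
    "better off without me",
]

@dataclass
class ReviewQueue:
    """Collects posts flagged for a human reviewer to evaluate."""
    items: list = field(default_factory=list)

    def submit(self, post_id: str, text: str, matched: str) -> None:
        self.items.append({"post_id": post_id, "text": text, "matched": matched})

def flag_post(post_id: str, text: str, queue: ReviewQueue) -> bool:
    """Return True and enqueue the post if any watch phrase appears in it."""
    lowered = text.lower()
    for phrase in WATCH_PHRASES:
        if re.search(r"\b" + re.escape(phrase) + r"\b", lowered):
            queue.submit(post_id, text, phrase)
            return True
    return False

if __name__ == "__main__":
    queue = ReviewQueue()
    flag_post("p1", "Honestly I feel like there's no reason to go on.", queue)
    flag_post("p2", "Great hike today, feeling refreshed!", queue)
    print(f"{len(queue.items)} post(s) queued for human review")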

If Facebook's reviewers determine someone may be at imminent risk of self-harm, they will contact emergency services to intervene directly. In some cases, that means Facebook's plans include dispatching police, ambulance crews, or other emergency personnel right to users' front doors.

Most folks agree suicide prevention is a laudable goal. Some users' concerns, however, focus on the scope of the program: that the data Facebook gathers could easily be leveraged against people. The data, for example, could conceivably be used by employers or insurance companies to wrongly pin mental health problems on certain individuals.

Social media’s ubiquity is a double-edged sword. As MacIntyre noted, “Social media are a huge part of people’s lives.” On platforms like Facebook — which has more than a billion active users — the sheer volume of data generated is staggering.

Many young users have never known a world without social media. Some users, young and old, have become comfortable, perhaps too comfortable, sharing intimate details of their lives that would be better kept private.

Related: Teens and Social Media: Time to Dial It Down

None of us can “un-ring” a bell. If Facebook’s well-intentioned suicide prevention efforts inappropriately tag a user as having mental health issues, the fallout could be devastating. While suicide prevention is clearly an important goal, caution is in order here — for both users and Facebook itself.

Michele Blood is a freelance writer with a passion for children’s literature. Based in Flemington, New Jersey, she leverages her background in psychology in her work for publishers, businesses and NPOs.