Safety and Crisis Support

1. Overview

Creed includes AI features that may feel personal or conversational, but users are always interacting with an artificial intelligence system, not a human being. This page describes our current, high-level approach to safety around self-harm and crisis situations.

2. Unsafe Self-Harm Content

Creed is not intended to encourage, assist, normalize, or provide instructions for self-harm, suicide, or violent behavior. We may limit, interrupt, or block content or account activity that appears to seek or promote such material.

3. Crisis Support Notices

If a user's message appears to indicate suicidal ideation, self-harm, or a crisis situation, Creed may display a notice encouraging the user to seek immediate support from qualified crisis resources or emergency services.

4. Emergency Situations

Creed is not an emergency service, mental health provider, or crisis hotline. If you are in immediate danger or think you may harm yourself or someone else, call 911 or your local emergency services immediately. If you are in the United States or Canada, call or text 988 to reach the Suicide and Crisis Lifeline.

5. Minors

Companion-style AI features may not be suitable for some minors. We may apply additional notices, restrictions, or other safety interventions where appropriate.

6. Changes to This Page

We may update this page from time to time as our product, safety practices, and legal obligations evolve.

7. Contact Us

If you have questions about this page or our safety practices, contact us at [email protected].