While it’s true that careers can be a source of pride, purpose, and stability, they can also become overwhelming, draining, and even harmful. When work starts to chip away at your mental health, it’s easy to ignore the warning signs or brush them off as “just part of the job.”
But here’s the truth: no job is worth sacrificing your well-being. Unfortunately, most employers still aren’t doing enough to support their employees’ mental health. The problem is increasingly being acknowledged, but acknowledgment isn’t the same as action.
Recognizing when your career is causing more harm than good is an act of self-care, not failure. If any of the following signs resonate with you, it might be time to re-evaluate your path. Your mental health, and your physical health, need to come first; you need to come first, and this job (or career) just might not be worth the cost.
Stress Has Become Your Constant “Companion”
Every job comes with its share of stress, but when stress starts to feel like your baseline, that’s a problem. If you’re waking up anxious, dreading the day ahead, or lying awake at night replaying work situations, that isn’t just “part of the grind.” Chronic stress seeps into every part of your life: your mood, your relationships, and even your physical health. If you’re constantly on edge or emotionally exhausted, your body and mind are sending you a clear message: you deserve better.
You’re Turning to Unhealthy Coping Mechanisms
When the pressure of work becomes unbearable, it’s easy to turn to quick fixes like alcohol or substances to escape or keep going. This is especially common in high-pressure careers like nursing and healthcare, where long shifts, emotional intensity, and unrealistic expectations can take a huge toll.
It’s sad to say, but some nurses find themselves relying on substances to get through a shift or to numb the stress once it’s over. The problem has become common enough that there is now drug rehab for nurses specifically, and across healthcare more broadly, this pattern is becoming disturbingly normal.
The same is true for those who work in finance or the entertainment industry; the jokes made about it on social media point to a very real problem. If this sounds familiar, it’s a sign that the demands of your job are pushing you into dangerous territory, and the sooner you step away, the better, because this kind of coping only spirals.
Your Relationships Are Feeling the Strain
When work starts to bleed into your personal life, it can have a devastating ripple effect. An occasional exhausting day is normal, but if you routinely come home so drained that you have nothing left to give to the people who matter most, that strain will eventually damage your relationships.
You’ve Lost the Spark for What You Do
Even in the most fulfilling careers, not every day is going to be a good one. But if the bad days far outweigh the good, and the passion you once felt for your work has completely faded, something isn’t right, and it might be time to move on.