Do people just pretend?

Do people actually like work and the company culture, or do they just pretend? Props either way, but do you just gaslight yourself into loving work? What's the secret?

I can't imagine it, but I feel like people have to actually love company culture, given the way they embrace it

I feel like it varies from company to company. Like, I worked an entry-level job at this one company once where everyone treated me like I was beneath them. I definitely didn't stay there long, and I'm a lot happier where I am now.


I used to really like my company/job, but it's gone downhill recently. I feel like most of us know it's not great, but at the end of the day it's a check. My job doesn't define my happiness, and it's not going to give me the most fulfillment in life. I'm fine with just having a job for now.

This is a good way to look at it; you can always have hobbies on the side