I’m a dev with 10+ years of (cumulative) experience. While I’ve never used GitHub Copilot specifically, I’ve been using LLMs (as well as AI image generators) daily, mostly for non-dev things, such as analyzing my human-written poetry to get insights for my own writing. I’ve done the same for code I wrote, asking LLMs to “analyze and comment” on it for the sake of insights. There were moments when I asked for code snippets, and almost every snippet generated actually worked or needed only a few fixes.
They’ve been getting good at this, but not good enough to really replace my own coding and analysis. Instead, they’re getting much better at poetry (maybe because their training data is mostly books and poetic works) and sentiment analysis. I use many LLMs simultaneously in order to compare them:
- The free version of Google Gemini is becoming lazy (short answers, superficial analysis, problems keeping context, drafts not as diverse as they used to be, among other issues)
- The free version of ChatGPT is a bit better (it can keep context and give detailed answers) but not good enough (it does hallucinate sometimes: fine for surrealist poetry, but bad for code and other technical matters where precision and coherence matter)
- Claude is laughably hypersensitive, self-censoring over certain words regardless of context (got code or text that remotely mentions the word “explode”, as in PHP’s `explode` function? “Sorry, can’t comment on texts alluding to dangerous practices such as involving explosives.” I mean, WHAT?!)
- Bing Copilot has web searching, but also a context limit of 5 messages, so it’s only usable for quick, short exchanges
- The same that goes for Bing Copilot goes for Perplexity
- Mixtral is very hallucination-prone (i.e. it often fails to stay coherent)
- LLama has been the best of all (via DDG’s “AI Chat” feature), although it sometimes glitches (i.e. starts outputting repeated strings ad æternum)
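As an aside, for anyone not into PHP: the `explode` that tripped Claude’s filter is just the standard library’s string-splitting function, with nothing remotely dangerous about it. A trivial sketch (the variable names and sample data here are made up for illustration):

```php
<?php
// explode(separator, string) splits a string on a delimiter
// and returns an array of the resulting pieces.
$csvRow = "name,email,age";        // hypothetical sample input
$fields = explode(",", $csvRow);   // → ["name", "email", "age"]
print_r($fields);
```

This is exactly the kind of context an LLM should pick up on before flagging a word.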
As you can see, I’ve tried almost all of them. In summary, while it’s good to have such tools, they should never replace human intelligence… or at least, they shouldn’t…
The problem is that dev companies generally focus on “efficiency” over “efficacy”, wanting the shortest deadlines while also expecting near-perfection. Understandable demands, but humans are humans, not robots. We need time to deliver; we need to walk carefully through all the steps before finally deploying something (especially big things), or it turns into XGH programming (Extreme Go Horse). And machines can’t do that well yet, either. For now, LLM-driven development is XGH: really fast, but far from coherent about the big picture (be it a platform, a module, a website, etc.).
My comment is meant to bring the perspective of someone facing depression, so as to try to answer the main question (“does a warning with a suicide hotline really make a positive difference?”) through that lens. It’s not to seek mental help for myself.
For context, I’m a person facing depression, and my depression has broad and multifaceted causes: from unemployment, through family miscommunication (my parents can’t really understand my way of thinking), all the way to my awareness of climate change and of transcendental concepts that lead me into existential crisis. Being unemployed, I can’t afford therapy (it’s a paid thing), and I don’t really have anyone face-to-face capable of understanding the multitude of concepts and ideas I carry in my mind (even I can’t understand myself sometimes).
That said, every depressed person copes with depression differently. While some really need someone to talk to (and talking genuinely helps in those situations), it’s naive to think a conversation will suffice for every single case. I mean, no suicide hotline will get me employed, nor will it magically solve the climate change we’re facing.
So how do I try to deal with my own depression? With two things: occult spirituality (worshiping The Dark Mother Goddess) and writing poetry and prose. I use creative writing as “catharsis” for my suffering, to “cope” with a state of things I can’t really control (I can’t “employ myself” or “sell my services to myself”, I can’t “befriend myself”, I can’t stop temperatures from rising to scorching levels, nor the other already-ongoing consequences of climate change; I try to make some difference, but I’m just a hermit weirdo nerdy nobody among 8 billion people, and I have no choice but to accept it).
I’m no professional writer (just a software developer), but thanks to The Goddess, I can kind of access my unconscious (dark) mind and let it speak freely (it’s called stream-of-consciousness writing). Sometimes I even write funny surrealist prose/stories, but sometimes it takes a darker turn: dark humor, nihilism, memento mori. Doing this relieves the internal pressure in my unconscious mind. After writing, I sometimes decide to publish it on the fediverse, but when I do, I constantly feel the need to self-censor: the stream of consciousness can lead to texts that people could interpret as some “glorification of suicide/self-harm” (especially when my texts take a nihilistic/memento mori turn), so I often censor myself and rework what I wrote. It’s kind of frustrating not being able to fully express it, but I do understand how those texts could trigger other people also facing depression.
The fact is: when I write, it’s really relieving, far more than talking to people, because with poetry/prose I can express symbolic things, work in multiple layers of depth, use creative literary devices such as acrostics and rhymes, learn new English words as a Brazilian, and blend scientific concepts with esoteric and philosophical ones (my mind really works this way, mixing STEM, philosophy, and belief/esoteric/occult/religious concepts) without needing to fully explain them (because that would take several hours and bore anybody other than me).
So, in summary (TL;DR): it depends on how multifaceted the depressive situation is. It won’t work for me, but it surely can work for others who just need someone to talk to. Not exactly my case.