In one of the newsletters I follow there is this quote. What do you think about it?
Chatbots comply with the user's wish to solve the problem on their own, even when this is impossible and may make matters worse. Chatbots, in fact, are not built to help, but to please. If you feel flattered when your LLM tells you how smart your question was (I certainly do), you are not alone: a pre-print from 2025 found that all major LLMs were highly sycophantic.
A Fortune piece from late last year had the CEOs of Apple, Airbnb, and PepsiCo all agreeing on something that surprised me: being at the top is one of the loneliest places to be.
That resonated. But I think it starts way before you become a Fortune 500 CEO. It starts the moment you go all in on your own thing.
There is a strange irony I have been sitting with lately.
I am building an app specifically to help young people feel less alone. And yet, some of the loneliest moments I have experienced have been in the past year, building it.
There's a stat that keeps haunting me: 61% of young adults report feeling seriously lonely. Not occasionally lonely. Seriously lonely.
I know this number isn't abstract. When I started building Murror (an AI companion app for young people battling isolation), I was living it. Working from a tiny apartment, going days without a real conversation, completely absorbed in a product designed to solve the very problem I was drowning in. The irony wasn't lost on me.
Let me start from the creator's perspective: I personally don't have a product (apart from being hired for creative work or offering personal consultations).
But as a creator, I constantly share content, insights, and information: value that helps me build trust (for free). Based on that perceived expertise, people eventually decide to work with me (a paid service).
I've been noticing something lately. We went from using AI as a tool to letting AI become the default for almost everything: writing, deciding, planning, even reflecting.
Need to write an email? AI. Need to make a decision? Ask AI. Need to understand how you feel about something? Believe it or not, AI.
The problem isn't the technology. The problem is that we're quietly outsourcing the one thing that makes us valuable: our ability to think for ourselves.
Hey, I'm James, a software developer from Australia with 20+ years of building things professionally.
Most of my career I've been the person behind the scenes solving hard technical problems, shipping reliable software, making other people's ideas work. Unravl is the first thing I've built entirely for myself, and now I'm figuring out the part they don't teach developers: how to actually get it in front of people who might find it useful.
No funding. No growth team. No playbook. Just me, the product, and a lot of learning in public.
If you've been down this road as a builder trying to find an audience, I'd genuinely love to hear what worked for you. And if Unravl sounds like something you'd use, even better.
I am a Computer Science student doing research into how solopreneurs and small startups create new apps and what their stack looks like. Particularly, I'm interested in how you handle things like authentication, billing, and permissions/authorization in your apps.
Let me know what you're working on below and how you're going about it -- I'd love to connect for some quick calls to learn about your product and talk about your process in building it!
Years ago, depression wasn't a topic I studied. It was something I lived with quietly. Some days, just functioning felt like a win. Reflection became the only tool that helped me make sense of my own thoughts and emotions.
In the same year I left Google's Mountain View HQ, where I was working on subscription experiences used by billions, to pursue Murror, two engineers also left Google to build Character AI. Their early prototype raised safety concerns, but the idea evolved into a platform where people could create virtual characters such as AI companions, assistants, or friends.

We started from similar places, but chose very different paths. Character AI focused on moving fast to meet market demand and scale quickly, and then kids died by suicide while using the product. Murror chose to move slowly, prioritizing research, ethics, and user safety. We intentionally designed our AI around a butterfly symbol, as a reminder that it is a tool for reflection, not a replacement for real human relationships.

This approach takes more time, and it doesn't always show immediate financial results. Over time, the contrast between speed and responsibility has become clearer. At Murror, money is not the starting point. It is a result that comes after doing the work carefully and responsibly.

This is a long journey. If you are someone (an investor, partner, or builder) who values patience, resilience, and long-term impact, I believe this path matters. As the world becomes more complex and emotionally fragile, the need for thoughtful, ethical technology will only grow. This is just the beginning.
Happy New Year, everyone. How did you spend the first day of 2026?
For me, the first day of a new year feels like the opening step of a long journey. So instead of rushing into productivity, I chose to begin 2026 by taking care of both my body and my inner world.
Here's how my Day One looked:
- An early morning run, pushed myself 1km further to reach 7km
- Wrote down all my goals for the year, both personal and professional
- Repotted my flowering plant into a new pot
- Cooked a nutritious meal for myself while watching Stranger Things
- Started reading a new book
- Cleaned and reset my living space