We’ve just read through a pretty interesting predictions report from You.com, written by two of the world’s top AI researchers – Richard Socher and Bryan McCann. These are the folks who literally helped create the AI we’re all using today, so when they reckon they know what’s coming next, it’s worth a listen.
While most of their predictions are aimed at tech companies and big business, we’ve had a good look through and pulled out what we think actually matters for small nonprofits and community services here in Aotearoa.
Fair warning: some of this might sound a bit full-on, but stick with us. We’ll break it down into plain English and talk about what it means for your mahi.
Your Team’s Going to Need New Skills (like, properly soon)
Everyone Needs to Get Comfortable with AI
Here’s one of the predictions that made us sit up: Socher reckons that by the end of this year, we’ll start seeing people lose their jobs if they can’t work with AI. He compares it to how nobody can say “I’m not good with computers” anymore if they want a professional job.
He puts it pretty bluntly: “There are going to be two-to-three jobs where that is already going to happen this year, starting with software engineering, and eventually it’ll be any job.”
What this means for you: Remember when everyone had to learn how to use email? Or when we all had to get our heads around Zoom during COVID? This is that, but bigger. Your team – whether they’re youth workers, case managers, or admin staff – need to start getting comfortable with AI tools now, not next year.
We’re All Becoming Supervisors of AI (whether we like it or not)
McCann and Socher reckon that everyone who works in an office or does professional work will need to learn how to boss AI around. Even if you’re not managing people, you’ll be managing AI – learning how to delegate tasks to it, knowing when to trust it, and spotting when it’s talking rubbish.
Socher describes it as working “at higher levels of abstraction” – which basically means you’ll be telling AI what you need in plain English, and it’ll do the technical stuff. But you still need to know how to ask properly and check the work.
What this means for you: Your job descriptions need a rethink. That case worker position? Whoever fills it will need to know how to get AI to help with case notes, research, and reports – while still doing the human stuff that actually matters. It’s a new skill set on top of the existing ones.
The Good News: Small Teams Can Punch Way Above Their Weight
Doing More With Less (actually, for real this time)
Here’s a prediction that got us excited: McCann reckons we’ll see companies with just 10 people achieving what used to take hundreds. They call them “10-person unicorns” (unicorn = billion-dollar company, in case you’re wondering).
Now, we’re not talking about billion-dollar nonprofits (wouldn’t that be nice?), but the principle’s the same. The report suggests that with AI, a small team can achieve way more than they ever could before.
What this means for you: Your three-person organisation might be able to deliver services at a level that used to require ten staff. Or you might be able to take on new programmes you never had the capacity for before. The catch? You need people who know how to work with AI effectively.
More Jobs, Not Fewer (eventually)
Before you panic about job losses, both researchers are keen to point out that every time technology has automated work, it’s created more jobs overall. The worry that automation simply destroys jobs is what Socher calls “The Lump of Labor Fallacy” – the mistaken idea that there’s only a fixed amount of work to go around.
He notes: “150 years ago, 90% worked in agriculture. Today, 5% feed the world.” All those farm workers didn’t just disappear – new industries emerged.
What this means for you: Instead of replacing staff, think about how AI might let your current team expand what they do. Maybe your housing support workers can help more families. Maybe your youth workers can run more programmes. The work changes, but there’s still plenty of it.
The Tools Are Going to Get Way Better (and weirder)
Forget Apps – Software’s Going to Be Made On Demand
McCann predicts that the whole idea of separate apps will start breaking down by the end of 2026. Instead of choosing between different case management systems, you’ll be able to describe what you need and have it created for you on the spot.
He reckons: “As it becomes increasingly easy to speak an app into existence from scratch, we’ll start to see the entire idea of separation between apps break down.”
What this means for you: No more settling for software that “sort of” fits your needs because it’s what’s available. No more expensive custom builds. You’ll be able to create exactly what you need for your specific community. The flip side? Your staff need to know how to clearly describe what they actually need.
You Might Not Need Developers Anymore
Here’s a big one: McCann predicts that traditional hand-written coding will basically be gone by the end of this year. People will still code to learn, and researchers will still code to make AI itself better, but for building regular software, AI will write the code.
“Software development will be entirely AI-assisted – it only makes sense to develop software with AI writing most, if not all, of the code.”
What this means for you: That expensive database or reporting system you need? You might not need to hire a developer or pay for custom software. But someone on your team needs to know enough to explain what you want and check that what you get actually works properly.
Specialised Tools for Social Services Are Coming
Both researchers predict that industry-specific AI tools will emerge, going deep into areas like healthcare, legal services, and (we’d expect) social services. These won’t be general chatbots – they’ll understand your sector’s specific language, regulations, and needs.
The report notes these tools will use “specialized search over niche datasets not easily accessible on the public web.”
What this means for you: Keep an eye out for AI tools built specifically for community services, case management, or whatever your specialty is. They’ll be miles better than trying to make general AI tools fit your needs.
The Challenges (because it’s not all smooth sailing)
People Are Going to Push Back – Hard
Here’s a reality check from the predictions: there’s going to be serious resistance to AI. Socher reckons we’ll see “multiple Luddite waves” (Luddites being people who reject new technology) and lots of “we’re doing well enough” attitudes.
He also predicts “continued pushback against agents on the web” because the economics don’t always line up. For example, if lawyers bill by the hour, they’re not exactly keen on AI that makes them 10 times faster.
What this means for you: Expect resistance from your team, your communities, and maybe even your funders. Some of the people you support might actively not want AI-mediated services. You’ll need to bring people along gently and keep the human connection where it really matters.
AI Will Keep Making Mistakes at a Steady Rate
Here’s an interesting prediction: Socher reckons AI missteps will stay constant in absolute numbers. AI’s getting better, but we’re using it way more, so the number of stuff-ups stays about the same.
What this means for you: When you’re working with vulnerable people, you absolutely cannot just trust AI blindly. A mistake in a case note or benefit application could seriously harm someone. AI needs human oversight, especially for anything high-stakes.
The Privacy Stuff (this is important)
Everything’s Going to Be Recorded (and that’s a problem for us)
McCann predicts that robots and passive data collection will become increasingly normal. He notes how quickly people have accepted things like Meta’s recording glasses and devices that sit in rooms recording everything.
“It will leak into the personal world too and it’s already happening.”
What this means for you: This is massive for social services. You work with people who are often vulnerable, who might not be able to give meaningful consent to being recorded or having their data collected. You’ll need really strong policies about AI and privacy that go beyond what’s legally required.
The opportunities for better data on outcomes are real, but so are the risks. Tread carefully here.
The Healthcare and Research Bits (if that’s your area)
New Treatments Coming Faster
Both researchers predict drugs will get to market more quickly through AI-accelerated development. Regulatory bodies like the FDA (the US drug regulator) are already supporting this.
What this means for you: If you work in health or disability services, the people you support might get access to new treatments faster. Though remember, we’re in New Zealand, so Pharmac and our system will affect how quickly that flows through to us.
Biology Is About to Get Interesting
Socher makes a prediction that’s a bit more out there but worth knowing about: AI will transform biology the way calculus transformed physics. He reckons AI can handle the complexity of biological systems in ways we never could before.
What this means for you: If you’re in health, disability, or aged care, keep an eye on this space. The prediction is that we’ll see progress on things like curing cancer and extending healthy lifespans – not next year, but the beginnings will start to show.
So What Should You Actually Do About All This?
Based on what these top researchers are predicting, here’s what we reckon small Kiwi nonprofits and community services should be thinking about:
1. Get Your Team AI-Ready (like, now)
Don’t wait. Start getting everyone comfortable with basic AI tools. This doesn’t mean everyone needs to become tech experts – just like everyone learned email, everyone needs to learn AI basics.
Action: Run some workshops, find some free training, play around with ChatGPT or similar tools as a team.
2. Rethink What Jobs Look Like
Stop hiring for 2020 and start hiring for 2026. Look for people who can work effectively with AI, not just people with traditional skills.
Action: Update your position descriptions. Add “ability to work effectively with AI tools” to your essential criteria.
3. Sort Out Your Privacy and Ethics
Before you dive into AI tools, get your policies sorted. Especially if you work with vulnerable people.
Action: Write clear policies about where AI can and can’t be used, particularly around consent and high-stakes decisions.
4. Look for the Right Tools
Don’t just grab whatever’s popular. Wait for (or advocate for) tools built specifically for community services.
Action: Join sector networks. When someone builds AI tools for social services in NZ, you want to know about it early.
5. Keep Humans Where They Matter
AI can do your admin, your data entry, your report writing. Let it. But keep humans doing the relationship work, the complex judgements, the empathy stuff.
Action: Map out which parts of your work are “human essential” and which could be AI-assisted. Be really honest about this.
6. Prepare for Pushback
Not everyone’s going to be keen. Some of your team, some of your community, definitely some of the people you support.
Action: Develop your change management plan now. Think about how you’ll bring people along and what choices you’ll give them.
7. Think Bigger
If a three-person team can do what used to take ten people, what could you achieve? What communities could you serve? What programmes could you run?
Action: Do some blue-sky thinking. If capacity wasn’t the limiting factor, what would you do?
The Bottom Line
Look, we know this all sounds a bit intense. The researchers who wrote this report (Socher and McCann, remember – literally some of the smartest AI people on the planet) are predicting some pretty big changes, and they’re predicting them for this year, not some distant future.
But here’s the thing: every one of their predictions points to the same conclusion. AI is going to change how we work. Not if, not maybe – it’s happening.
For Kiwi nonprofits and community services, that’s actually good news if we get ahead of it. These tools could mean your small team can help more people, deliver better services, and have more impact than ever before.
The trick is to start now, bring your people along with you, and stay grounded in what matters: serving your communities well.
AI is just a tool. A powerful one, sure, but still a tool. What matters is how we use it, who we use it for, and making sure we keep the human connection at the heart of everything we do.
So, shall we give it a go?
______________________________________________________________
About the Source: This analysis is based on the “2026 AI Predictions” whitepaper from You.com, written by Richard Socher (Co-Founder & CEO) and Bryan McCann (Co-Founder & CTO). Richard has a PhD from Stanford and is one of the most-cited researchers in natural language processing worldwide. Bryan authored groundbreaking early work on how AI understands language. Between them, they’ve basically helped build the AI we’re all using today, so their predictions carry some serious weight.
All predictions and quotes in this article come directly from their whitepaper – we haven’t added our own speculation, just translated what they’re saying into what it means for Kiwi community organisations.