This was the keynote written for the Nepal Open Source Klub (NOSK) and the worldwide Software Freedom community.
Good morning, friends,
and welcome to Software Freedom Day 2025.
Before we begin, I want to acknowledge something that weighs on all our hearts. Around the world - in Ukraine, in Palestine and Israel, in Myanmar, in Nepal, and in countless communities touched by conflict - people have lost loved ones simply for being in the wrong place at the wrong time, or for standing up for what they believe in.
Whether they were students, teachers, parents, or neighbors, each loss ripples through families and communities in ways that numbers cannot capture. Today, we hold space for that grief while celebrating those who continue working - often quietly, often at great personal cost - to make their corners of the world more just, more transparent, more connected.
Software Freedom Day isn't just about code. It's about the belief that when we can inspect, modify, and share the tools that shape our lives, we create conditions for understanding rather than suspicion,
for collaboration rather than control. NOSK has been nurturing these values for over twenty years in Nepal - teaching, building, and connecting communities around the simple idea that knowledge belongs to everyone. This work takes on deeper meaning when we recognize that the tools we build together can help us see each other more clearly, especially when the world seems determined to divide us.
Let me begin with what we mean by software freedom and federated systems, because these aren't just technical concepts - they reflect something deeper about how we relate to one another.
Software freedom means the right to run, study, share, and improve the programs that increasingly govern our lives. When code is open, it can be examined by many eyes, understood by many minds,
and improved by many hands. This isn't just about having access to source code - it's about having agency in shaping the digital systems that shape us.
Federated communication takes this principle into our most intimate digital spaces - how we talk to each other. Instead of one company controlling how billions of people communicate, federated systems allow communities to run their own servers while remaining connected to a larger network. When combined with strong end-to-end encryption, these systems ensure that even the operators of the servers cannot read the messages passing through them.
These principles become essential during times of upheaval. In Ukraine, we've seen how quickly communication infrastructure can become contested territory - both physically and digitally. When governments restrict platforms, when companies change policies overnight, when infrastructure fails under bombardment - that's precisely when communities need communication tools they can verify, host independently,
and trust completely. Not because technology will prevent or stop the human suffering, but because transparent, accountable tools give us better chances to maintain connection and coordinate care when traditional systems fail.
The power of software freedom lies not just in what it enables, but in how it enables verification of trust. When communication tools are built on open protocols with published code, communities can audit them for backdoors, governments can verify their security properties, and users can understand exactly what happens to their data. Consider the Signal Protocol - the cryptographic foundation that secures billions of messages daily. Because its code is open source, researchers worldwide have studied its security properties, found and fixed vulnerabilities, and implemented it in various applications. And while the Signal app is the flagship implementation, other applications also rely on its cryptographic strength.
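For those who like to see the idea in code, here is a small Python sketch of the building blocks such protocols compose: an X25519 key agreement followed by authenticated encryption, using the open source cryptography library. It illustrates the principle that the shared secret never travels over the network - it is not the Signal Protocol itself, which layers X3DH and the Double Ratchet on top of these primitives.

```python
# Illustration only: the raw Diffie-Hellman + AEAD building blocks that
# end-to-end encrypted messengers rely on. The real Signal Protocol adds
# identity keys, prekeys (X3DH) and the Double Ratchet on top of these ideas.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305

# Each party generates a key pair; only the public halves are ever exchanged.
alice_private = X25519PrivateKey.generate()
bob_private = X25519PrivateKey.generate()

# Both sides derive the same secret without it ever crossing the network.
alice_secret = alice_private.exchange(bob_private.public_key())
bob_secret = bob_private.exchange(alice_private.public_key())
assert alice_secret == bob_secret

# Stretch the shared secret into a symmetric message key.
key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
           info=b"sfd-2025-demo").derive(alice_secret)

# Encrypt a message; a server relaying this ciphertext cannot read it.
nonce = os.urandom(12)
ciphertext = ChaCha20Poly1305(key).encrypt(nonce, b"Happy Software Freedom Day!", None)
print(ChaCha20Poly1305(key).decrypt(nonce, ciphertext, None).decode())
```

The server sitting between Alice and Bob only ever sees ciphertext - which is exactly the property that open, auditable code lets communities verify for themselves.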
Federated systems extend this principle to infrastructure. Matrix, for instance, allows organizations to run their own communication servers while maintaining interoperability with others. A university in Nepal can host its own Matrix server, maintaining full control over its data and policies, while still allowing its students to communicate with peers using Matrix servers anywhere in the world. The encryption ensures privacy; the federation ensures sovereignty; the open source code ensures transparency.

This becomes crucial in conflict situations. In Ukraine, traditional communication infrastructure has been targeted, but federated systems can route around damage. When one server goes offline, the network adapts. When one company changes its policies, communities aren't trapped. The resilience comes not from any single provider, but from the network's distributed nature and the community's ability to understand and maintain it themselves.
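To give a concrete flavour of what federation looks like in practice: matrix-nio is one of the open source Python client libraries for Matrix, and a few lines are enough to send a message through any homeserver that speaks the open protocol. The homeserver URL, user, room, and password below are hypothetical placeholders, not real accounts.

```python
# Minimal sketch using the open source matrix-nio client library.
# The homeserver URL, user ID, room ID and password are hypothetical.
import asyncio
from nio import AsyncClient

async def main():
    client = AsyncClient("https://matrix.example-university.edu.np",
                         "@student:example-university.edu.np")
    await client.login("correct-horse-battery-staple")
    await client.room_send(
        room_id="!studygroup:example-university.edu.np",
        message_type="m.room.message",
        content={"msgtype": "m.text", "body": "Hello from our own homeserver!"},
    )
    await client.close()

asyncio.run(main())
```

Point the same few lines at a different homeserver and nothing else changes - that interchangeability is what keeps communities from being trapped by any single provider.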
We must also acknowledge the growing role of artificial intelligence in mediating our communications and shaping our understanding of the world. This presents new challenges that our community cannot ignore.
AI systems - from content recommendation algorithms to automated translation services - increasingly influence what information we see, how we interpret it, and whom we connect with. Yet most of these systems operate as black boxes, their training data and decision processes hidden from the very communities they affect. When these systems carry biases - and research shows they often do - those biases get amplified at unprecedented scale. An AI trained primarily on text from certain regions or demographics will embed those perspectives in its outputs, potentially marginalizing voices that were underrepresented in its training data. A content moderation system trained without sufficient context about local conflicts might suppress legitimate political discourse while allowing harmful content to spread.
Open source AI development offers some hope for addressing these challenges. When training code, datasets, and model architectures are transparent, communities can identify biases, researchers can study failure modes, and developers can work collectively to improve fairness and accuracy. But we must be honest: even "open weights" models - where only the final parameters are shared - provide limited transparency compared to fully open development processes.
The complexity is daunting, and I won't pretend to have simple answers. This is why I address you - the community of aspiring developers and tech experts. The principles we've always advocated - transparency, community governance, and the right to understand and modify the tools that affect us - remain relevant in this algorithmic age.
But we must speak honestly about the limitations and dangers, because wisdom requires acknowledging what we don't know and cannot control. Even the most carefully designed federated systems carry risks. When anyone can run their own server, how do you handle harassment that crosses server boundaries? When communities can defederate at will, how do you prevent the kind of fragmentation
that makes meaningful dialogue impossible? These aren't technical problems with technical solutions - they're questions about how we want to live together, and they require ongoing community engagement and wisdom.
Strong encryption, while essential for privacy and security, can also complicate efforts to address harmful content. Law enforcement agencies worldwide are grappling with how to investigate serious crimes
when communications are genuinely private. There are no easy answers here - attempts to weaken encryption for law enforcement inevitably create vulnerabilities that hostile actors can exploit.
Even open source software carries risks. The Matrix protocol, despite its open development and skilled community, has had serious security vulnerabilities that required careful patching. Transparency enables rapid fixes, but it also means that harmful actors can study the same code that people seeking protection rely on.
Perhaps most importantly, we must resist the illusion that any tool - no matter how open, how federated, how carefully designed - can substitute for the patient, difficult work of building trust between human beings. Technology amplifies existing human qualities: if we approach it with wisdom and compassion, it can enhance those qualities. If we approach it with fear and suspicion, it will amplify those instead.
So what can student communities - here in Nepal and around the world - actually do with these insights?
First, experiment with federated infrastructure thoughtfully. Set up small Matrix or Mastodon instances for your study groups or student organizations. Learn to configure end-to-end encryption properly.
Write clear codes of conduct. Practice making difficult moderation decisions collectively. These skills - community governance, conflict resolution, secure communication practices - are as valuable as any programming language.
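To show how small that first technical step can be, here is a sketch - again with hypothetical server and account names - of creating a study-group room with end-to-end encryption switched on from the start. In Matrix, room encryption is declared by a single m.room.encryption state event, which a client library such as matrix-nio can set when the room is created.

```python
# Sketch: create a study-group room with end-to-end encryption enabled from
# the start. Server, user and password are hypothetical; actually sending
# encrypted messages with matrix-nio also needs its end-to-end (olm) extras.
import asyncio
from nio import AsyncClient

ENABLE_ENCRYPTION = {
    "type": "m.room.encryption",
    "state_key": "",
    "content": {"algorithm": "m.megolm.v1.aes-sha2"},
}

async def main():
    client = AsyncClient("https://matrix.example-university.edu.np",
                         "@organizer:example-university.edu.np")
    await client.login("correct-horse-battery-staple")
    await client.room_create(name="SFD study group",
                             initial_state=[ENABLE_ENCRYPTION])
    await client.close()

asyncio.run(main())
```

Everything that follows - the code of conduct, the moderation decisions, the trust - is human work, which is exactly the point.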
Second, contribute to open source communication tools. Your local knowledge and language skills are irreplaceable assets in making these tools useful to your communities. Help translate interfaces,
test software on local networks, document setup procedures for your region's specific challenges.
Third, advocate for open standards and transparent algorithms where appropriate. When your university adopts new learning management systems, ask about data portability and encryption standards.
When AI systems are deployed in education or governance, ask about training data sources and bias testing. These may seem like small technical questions, but they shape whether future communities
will have the tools they need to govern themselves.
Fourth, develop literacy about AI systems and their limitations. Learn to recognize when you're interacting with automated systems. Understand that AI outputs reflect their training data, not objective truth.
Practice critical thinking about algorithmic recommendations and automated translations.
But do all of this with humility and caution. Don't build tools that could put vulnerable people at greater risk. Don't promise more than technology can deliver. Don't let the excitement of technical possibility
override the wisdom of lived experience.
Most importantly, remember that your primary contribution isn't necessarily technical - it's bringing wisdom, local knowledge, and ethical reflection to communities that are often so focused on what they can build that they forget to ask why they should build it.
As we close, I return to the recognition we began with: that our work takes place in a world where too many people are suffering, too many communities are divided, and too many institutions have lost the trust
they need to function effectively.
This is not a problem that software can solve. But the values embedded in software freedom - transparency, accountability, collaborative improvement, distributed rather than concentrated power - these are seeds of the kind of world that many of us hope to see.
Long-standing wisdom all around the world teaches us that all phenomena arise in interdependence - that nothing exists in isolation, that every action ripples through networks of cause and condition we cannot fully perceive. The code we write, the communities we build, the tools we choose to trust or reject - these all participate in shaping the conditions under which future generations will seek truth, build relationships, and navigate conflict.
The students protesting in Nepal, the families maintaining connections across war zones in Ukraine, the developers building secure communication tools, the community organizers fighting for digital rights in your own cities - they all depend, in ways large and small, on having access to communication tools they can trust, software systems they can verify, and digital infrastructure they can govern. We cannot give them perfect tools, because perfect tools don't exist. But we can offer them transparent ones, accountable ones, improvable ones - tools built with the intention of serving human dignity rather than concentrating power.
But perhaps most importantly, we gather. In a world increasingly mediated by screens and algorithms, the strongest networks remain those built through physical presence - through workshops where students learn to configure servers together, through conferences where developers from different countries share meals and stories, through local meetups where community members help each other solve real problems. Through explaining to your parents how technology works and learning from them how the world has affected them.
NOSK has been creating these spaces in Nepal for over two decades now - not just teaching technical skills, but fostering the relationships and trust that make collaborative development possible. When we meet face to face, we remember that behind every username is a human being with hopes, fears, and stories worth hearing. We build the social infrastructure that makes technical infrastructure meaningful.
This gathering - Software Freedom Day 2025 - is itself an act of faith in that vision. By bringing together people who might never meet otherwise, by creating space for learning and questioning and building together, NOSK and organizations like it worldwide are doing the essential work of weaving human connection into our technological future. So thank you, NOSK, for twenty years of patient cultivation. Thank you to every local FOSS community that creates spaces for learning and sharing. Thank you to everyone who shows up - online and offline - to build tools that serve human flourishing rather than mere efficiency.
The tools we build together today will carry forward the hopes of everyone who believes that a more transparent, connected, and compassionate world is possible. And the connections we make in building them - these human networks of trust and collaboration - may prove to be our most important creation of all.
May that work bring benefit to all beings.
Happy Software Freedom Day to every single one of you.