We need to shape native paths, not recreate #fashionista ones with shinier branding

We’ve got a new bunch of #mainstreaming tech devs flooding into the #Fediverse. Some from burned-out Big Tech, some from the academic funding circuits, some just looking for the next shiny project after the #AI hype wore thin.

Now, this could be good. IF even a few of them started working on native, grassroots tech – tools built for and by the communities who actually use them, not just more #dotcons platform clones. Right now, we’re at a turning point. The first wave of the Fediverse was all about copying the #dotcons:

#Mastodon as “ethical Twitter”

#PeerTube mimicking YouTube

#Mobilizon as a Facebook Events replacement

#Lemmy doing Reddit but federated

All of this was necessary, it helped people jump ship and start imagining life beyond the #dotcons. But that wave is peaking, and the second step is overdue. That next step? It’s about original, grassroots infrastructure. A federated trust graph instead of reinventing karma points or like-buttons. Protocols for local-first publishing, like the #p2p side of the #OMN, or radical #4opens-inspired news and tools for community trust flows, moderation and accountability, rooted in values, not corporate TOS and #PR management. Infrastructure for interoperability and redundancy, so projects don’t die when a maintainer burns out or a server goes down.

But here’s the risk: if the new #devs only copy the #dotcons AGAIN, it’s a fail. Worse still, if they get sucked into the #NGO vampire nests, the slow, bureaucratic funding black holes of the worst paths of #nlnet and #NGI, we’ll see more “safe” projects that burn grant money building tools nobody uses.

Let’s be clear, these institutions do some small good, on basic infrastructure, but their #NGO sides are hoovering up resources by pushing for risk-free deliverables and ignoring the actual messy needs of grassroots groups. This funding is too often shaped by #mainstreaming politics and careerism, not lived practice. We’ve seen it before, and we’re seeing it again.

What we need now are tools that grow from compost, not code sprints. Tools built from social use, not tech fashion. We need radical simplicity, transparency, and flexibility, tech that can’t be easily co-opted by the forces we’re trying to move beyond.

So if you’re a dev stepping into this space, welcome. But please don’t make another Mastodon with more “privacy” or #AI features. Instead, work with those who’ve been composting here for years. Build with the messy, weird, and beautiful people who are on new paths, not boringly recreating the old ones with shinier branding.

Thinking about data and metadata

On a positive note, the progressive world of technology has transformed our lives, making things easier and enhancing our health and well-being. Yet within this change there are many challenges; one is the sustainable management of digital data. In an era of rapid technological advancement, addressing lifelong data redundancy, storage, and preservation is needed, especially as decentralized systems reshape the way we share and store information.

Recent discussions have highlighted the complexities of data management, particularly within peer-to-peer (P2P) networks and federated platforms. While self-hosting data offers autonomy, it remains a niche path accessible to only a small fraction of people. To truly democratize data storage and distribution, we need alternative solutions like “blackbox” #P2P on community-run federated servers that balance people and community control with collective responsibility.

The challenge of redundancy is a critical hurdle we are yet to solve. People need simple ways to maintain multiple copies of their data, while selectively choosing what subsets of others’ data to store, and to integrate these choices seamlessly. A hybrid approach combining #P2P and client-server models would offer the best of both worlds, allowing people to control their data while ensuring resilience and availability across the wider “commons” network.
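To make that concrete, here is a minimal sketch (in Python, and not OMN code) of the hybrid redundancy idea: each node keeps a rough, “lossy” view of how many copies of an item exist across the peers and servers it can see, and chooses to mirror the under-replicated subsets it actually cares about. All names and thresholds here are illustrative assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Item:
    item_id: str
    tags: set[str]

@dataclass
class Node:
    node_id: str
    interests: set[str]                               # hashtags/subjects this node cares about
    local_store: dict[str, Item] = field(default_factory=dict)

    def replication_count(self, item: Item, network_view: dict[str, set[str]]) -> int:
        """How many nodes (that we know of) already hold a copy of this item."""
        return len(network_view.get(item.item_id, set()))

    def should_mirror(self, item: Item, network_view: dict[str, set[str]],
                      min_copies: int = 3) -> bool:
        """Mirror items we care about that are under-replicated on the network we can see."""
        relevant = bool(item.tags & self.interests)
        under_replicated = self.replication_count(item, network_view) < min_copies
        return relevant and under_replicated

# Usage: a node decides whether to pull a copy from a federated server or a peer.
network_view = {"video-42": {"server-a"}, "report-7": {"server-a", "node-b", "node-c"}}
node = Node("node-x", interests={"climatechaos"})
print(node.should_mirror(Item("video-42", {"climatechaos"}), network_view))   # True: only 1 known copy
```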

Managing the data lifecycle: these solutions require clear mechanisms for data retention, filtering, and lifecycle management. Defining how data is preserved, what subsets are stored, and when data can expire is crucial for balancing sustainability with functionality. Lossy processes can be acceptable, even desirable, as long as we establish thoughtful guidelines to maintain system integrity.
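As a rough illustration of what “thoughtful guidelines” could look like in practice, lifecycle rules can stay #KISS and human-readable: a named category, a maximum age, and a simple check. The categories and time limits below are assumptions for the sake of the sketch, not a fixed policy.

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

# Illustrative retention rules: what is kept forever, kept long-term, or allowed to expire.
RETENTION_RULES = {
    "pinned":  {"max_age": None},                    # never expires
    "subject": {"max_age": timedelta(days=5 * 365)}, # long-term subject archive
    "flow":    {"max_age": timedelta(days=90)},      # ephemeral timeline material
}

def is_expired(kind: str, created: datetime, now: Optional[datetime] = None) -> bool:
    now = now or datetime.now(timezone.utc)
    max_age = RETENTION_RULES[kind]["max_age"]
    return max_age is not None and (now - created) > max_age

# A "flow" item from 120 days ago is allowed to expire; a "pinned" one never is.
old = datetime.now(timezone.utc) - timedelta(days=120)
print(is_expired("flow", old), is_expired("pinned", old))   # True False
```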

The growing volume of high-definition media intensifies the storage burden, making efficient data management more pressing. One practical solution could be transferring files at lower resolutions within P2P networks, while archiving high-resolution versions locally. Similarly, client-server setups could store original data on servers while buffering lighter versions on clients, reducing server load without sacrificing accessibility.
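A hedged sketch of that tiering, assuming ffmpeg is available on the node: generate a lighter rendition for network transfer and keep the original in the local/cold archive. The command line, sizes and paths are illustrative, not a spec.

```python
import subprocess
from pathlib import Path

def make_light_rendition(original: Path, out_dir: Path, height: int = 480) -> Path:
    """Create a lower-resolution copy suitable for P2P distribution; keep the original locally."""
    out_dir.mkdir(parents=True, exist_ok=True)
    light = out_dir / f"{original.stem}_{height}p.mp4"
    subprocess.run(
        ["ffmpeg", "-y", "-i", str(original),
         "-vf", f"scale=-2:{height}",   # scale down, keep aspect ratio
         "-c:a", "copy", str(light)],
        check=True,
    )
    return light

# Usage (hypothetical): share the light copy on the network, archive the original offline.
# light = make_light_rendition(Path("report.mp4"), Path("hot/renditions"))
```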

There is a role for institutions and collective responsibility in preserving valuable content. Projects like the Internet Archive offer centralized backups, but in decentralized systems we need a reimagining of traditional backup strategies: a social solution grounded in collective responsibility, where communities and institutions share the task of safeguarding data. This would mitigate the risk of loss and create a more resilient archive and working network.

For a decentralised, sustainable digital future, we need a better balance of the technological and the social, and a step towards rethinking how we manage data. By seeding hybrid architectures, growing community-driven autonomy, and promoting collective care, we can navigate the complexities of digital preservation.

With the current state of much of our tech, we need to do better in the #ActivityPub, #Fediverse, and #openweb reboot. Projects like #makeinghistory from the #OMN outline how we can pave the way. It’s time to pick up the shovels, there’s a lot of composting to do. And perhaps it’s time to revive the term #openweb, because that’s exactly what the #Fediverse is: a reboot of the internet’s original promise.

Let’s keep it #KISS and focused, the future depends on it.

https://hamishcampbell.com/tag/makeinghistory

In tech, what matters and what is dangerous

The influx of #mainstreaming brings many non-native focuses into our growing shared alt spaces. When we embed content, most of these will be better handled as external resources. Let’s keep the core simple: #KISS and #4opens. One of the strongest of these is money; it is a dangerous subject for #openweb projects. It’s way too often the root of corruption and co-option, so it’s best to keep financial aspects as external applications and simply link to them. And remember that words are wind, look at the ground. We live in a closed world, and we should not add to this mess.

  • There is no security in CLOSED systems – security comes from OPEN and social processes.
  • There is no security in individualism – true security lies in community.
  • There is no security in “trustless” models – real security emerges from social trust.

Over the last 10 years, we’ve been fed many, many lies. This is especially clear now in tech. Look at #opensource: can you find any lasting value in CLOSED within that? Over the last 20 years, we’ve seen an ongoing battle between OPEN and CLOSED, but the last decade has been dominated by the #dotcons and their shadow puppet, the #encryptionists. Both are CLOSED, both wear the cloth of OPEN, and both recite the right words. But words are wind, look at the ground. The #4opens is the reality check.

I’ll be truly optimistic when the closed paths of #encryptionist apps cross standards with the #open of #ActivityPub apps, when that bridging of human nature brings into view the “feeling” that fuels this gap. But right now, there’s a strong, unspoken push not to address these issues.

Nearly everything I do today revolves around #4opens, addressing the unspoken issues head-on. I’ve been doing this full-time for more than 30 years, and I’ve watched hundreds of alt-tech projects wither on the vine, with only a handful flowering. Reflecting on this, I’ve developed a #KISS, universally reliable way of judging which projects might flourish and which will collapse: the #4opens framework. Others might call it open-source development or radical transparency, but it’s all the same core path. #nothingnew

To move this forward, we need to address the core problems:

  • The #geekproblem – a teenage mix of arrogance and ignorance that spreads like wildfire.
  • The #dotcons – a relentless push of greed over human need.

These are fundamental issues, and it’s good, necessary, to have strong opinions on them. Because not having an opinion? That’s a path straight back to the #deathcult. We don’t need more of that. What we need is compost, patience, and the courage to keep pushing for openness and social solidarity, no matter how messy it gets.


There’s an unspoken #geekproblem lurking at the heart of the #openweb, and it’s past time we bring it into the light. If we frame #p2p as #human2human, scaling becomes a virtue, an organic process of communities growing, evolving, and finding balance through social trust. But if we view #p2p as #data2data, scaling becomes a purely technical challenge, one that strips the human element away.

The first path embraces the messy beauty of human connections. Scaling isn’t a failure, it’s a reminder that growth needs care, cooperation, and thoughtful design. The second path, the data-centric approach, treats humans as nodes, reducing complex social interactions to packets of information to optimize and control.

The issue is the latter view is pushed by the #dotcons. The systems we’ve grown up with, the platforms we’ve relied on, all reinforce this anti-human perspective. And whether we like it or not, we’ve internalized some of this thinking, even within our #openweb projects. That’s the uncomfortable truth.

The question is, do we actually collectively want to solve this? Because the solution isn’t technical, it’s social. It means rejecting the idea that tech should replace or dominate human processes. It means making space for friction, for inefficiency, for the unpredictability of people working together.

Talk to a geek today. Start the conversation. Ask how they see #p2p – as people connecting, or as machines exchanging data? Their answer might tell you a lot about where their compass is pointing, and whether we can navigate back toward a web that is human.

Let’s compost the #geekproblem, nourish the soil, and grow something better, please.

The #OMN path is about building the activist #openweb infrastructure

The #OpenMediaNetwork (#OMN) offers a clear, practical path to building the #openweb, grounded in #4opens. It does this by leveraging open protocols like #ActivityPub (#AP) and #RSS, alongside #FOSS software, to create a distributed network of media platforms where people and groups can join, participate, and contribute. This, like the #Fediverse, is a direct challenge to the centralised, corporate-dominated structures that define so much of the current internet landscape.

Step-by-Step Building Blocks: The #OMN is about simplicity and humanistic coding, rather than the over-engineered complexity we often see in tech today (a rough sketch of how these layers could fit together follows the list).

  • Start with the client-server model. The initial focus is on building a robust client-server architecture to create a stable foundation for media sharing and participation. This forms the “hot” storage layer, data that is live, accessible, and regularly used.
  • Introduce an offline cold store: Once the client-server infrastructure is operational, a secondary layer of offline cold storage is added. This acts as a backup system, providing high redundancy to safeguard against data loss. Cold storage is cheap, offline, and relies on human interaction for maintenance and retrieval, ensuring resilience and sustainability.
  • P2P connections to cold storage: The final stage introduces peer-to-peer (#P2P) connections to integrate the offline cold storage with the broader network. This allows people to share and retrieve data across the network, even in decentralised or disconnected environments.
  • Iterative learning and improvement: The process is intentionally iterative, encouraging learning from practical experience. The system path is designed to evolve and improve over time, informed by real-world use rather than theoretical perfection.

The success of the #OMN depends on its commitment to #4opens. These principles allow for the free sharing and reuse of content, breaking down barriers to collaboration and growing innovation. By storing most data unencrypted (as the majority of it is not private), the system reduces overhead and complexity, keeping the project aligned with the “Keep It Simple, Stupid” (#KISS) philosophy.

Separating privacy from the #openweb: One critical aspect of the #OMN approach is recognising that encrypted privacy tools are a separate project. Mixing these with the development of the #openweb and #Fediverse leads to unnecessary complexity and division. Privacy tools are vital, but are developed in parallel rather than tangled with the foundational infrastructure. This separation allows each project to focus on its strengths while maintaining a clear, streamlined design philosophy.

At its core, the #OMN empowers “normal” people to store and manage their own data. By using a mix of hot and cold storage, people gain control over their digital lives without relying on corporate platforms. The focus on redundancy, backed by tools to search and reimport old data into hot storage, ensures resilience and accessibility.

This human-centric approach contrasts sharply with the corporate and #geekproblem obsession with control and perfection. It’s a more humane vision of technology, based on trust and collaboration rather than surveillance and control.

This builds from a history rooted in activism: the #OMN isn’t just a theoretical project; it’s grounded in decades of real-world activism. From the work of Undercurrents in the 1990s (http://www.undercurrents.org/about.html) to the global mobilisation of the Carnival Against Capitalism (https://en.wikipedia.org/wiki/Carnival_Against_Capital), this draws on over 30 years of direct, on-the-ground experience. The lessons from this history inform every aspect of the OMN, ensuring it stays true to its activist roots.

The current #block on this needed project is dealing with the #geekproblem and #fashernistas: One of the biggest challenges in progressive tech is the dominance of the #geekproblem, projects driven by technologists who prioritise complexity and self-interest over usability and impact. Coupled with the influence of #fashernistas, who chase trends without substance, many projects are doomed from the start.

The #OMN cuts through this. Yes, we can’t solve this mess by pushing alone, but it is a critical step in the right direction to mediate it, by encouraging us to get out the shovels and compost these failures. The goal is to build a system that works, not one that dazzles investors with hype while failing to deliver.

The #openweb won’t (re)build itself. It requires us to reject the endless noise of pointless projects and focus on practical, sustainable solutions. By supporting and growing the #OMN path, grounded in #KISS simplicity, #4opens principles, and decades of activism, we create a resilient infrastructure that empowers people and communities.

The future of the #openweb is in our hands. Dig deep, embrace trust, and start building.

#OMN #openweb #OGB #Indymediaback #makehistory

Fuck Off to the #Bitcoin Bros and Their Cult of Scarcity

Let me say it loud and clear—again—for the ones in the back: P2P systems that tether their tech to an encryptionist/blockchain coin economy are a dead end. Full stop. Tying this native #openweb path of distributed technology to the idea of selling “resources” doesn’t just miss the point; it’s like engineering a system that’s designed to fail from the start. It’s self-sabotage on a systemic level, shooting yourself in the foot while you’re still lacing up your boots.

Why? Because these systems, heralded by the #Bitcoinbros and their ilk, are about enforcing artificial scarcity into spaces that could—and should—be models of abundance. Instead of embracing the revolutionary potential of #P2P networks to unlock and distribute resources equitably, they double down on the same tired “deathcult” economics of scarcity that brought us to the current mess in the first place.

Coding scarcity into abundance is the fatal flaw. The beauty of distributed systems lies in their ability to facilitate abundance, bypassing the bottlenecks and hoarding inherent in centralized paths. Yet what do these “geniuses” do? They take this fertile ground for innovation and graft onto it the same broken logic of capitalism that created the problem. Artificial scarcity: instead of using resources efficiently and equitably, they introduce a transactional economy that prioritizes profit and competition over collaboration and sharing. Death-by-design paths embed scarcity into their structure, ensuring they will eventually choke out their own potential. What could and needs to be a fertile cooperative garden becomes a battlefield of extraction and exploitation.

The Bitcoin and crypto crew, with their get-rich-quick schemes, aren’t building the future—they’re pushing us all back into the past, rehashing old hierarchies in a new digital wrapper. Their vision of the world isn’t radical or liberating; it’s just #techshit wearing a suit made of gold leaf and bad ideas.

Then we have the #encryptionistas and their “Common Sense” cult. The mantra of 90% closed, 10% open might sound like “common sense” to those steeped in fear and control, but what they’re really peddling is the same #deathcult ideology: lock down innovation, stifle collaboration, and strangle the potential of the #openweb path.

Both are enforcing scarcity as though it’s inevitable, despite all evidence to the contrary.
They frame their closed systems as “security,” but what they’re really doing is hoarding power and excluding voices. This isn’t progress; it’s regression. It’s the equivalent of building a massive wall in the middle of the commons and selling tickets to access what was already there for everyone.

The radical alternative is abundance by design, where we don’t need scarcity baked into our systems, we need abundance. We need tools and networks designed to share resources, knowledge, and opportunities without the artificial barriers of token economies and closed ecosystems.

  • P2P systems should empower cooperation, not competition
  • Decentralization should facilitate access, not introduce new forms of gatekeeping.
  • Abundance is the point: The beauty of distributed networks lies in their ability to amplify sharing, not enforce scarcity.

This is where the Open Media Network (#OMN) comes in—a vision rooted in the values of the #4opens: Open Data, Open Source, Open Process, and Open Standards. This isn’t about creating a new “elite” made up of the nasty few or another #dotcons “marketplace” policed by the #geekproblem. It’s about building #DIY networks, radically inclusive and genuinely liberatory.

What are we to do with the Bitcoin bros, the #encryptionistas, and their #deathcult economics? Compost them. Take their #techshit, strip it of its toxic scarcity mindset, and use it to fertilize better systems. Systems that prioritize people over profit, collaboration over competition, and abundance over fear.

To those still clinging to the Bitcoin fantasy: Grab a shovel. You’re going to need it—not to mine more tokens, but to bury the bloated corpse of your scarcity-driven ideology. It’s dead weight, and it’s holding us all back. The future belongs to those who can imagine abundance, build it, and share it. Let’s stop walking down the “common sense” dead-end paths and start digging our way out of this mess, composting matters, you likely need a shovel #OMN

How can we mediate the #NGO blocking?

The #NGO world has been both ally and obstacle for decades. Too often, NGOs smother movements with paperwork, reporting cycles, and status-quo compromises. They professionalize struggle into careers, replacing urgency with strategy documents, and radicalism with caring workshops. Survival of the institution becomes more important than the fight itself.

But if we are serious about an #openweb reboot, we cannot just reject the #NGO crew outright. They have resources, networks, legitimacy in the eyes of institutions, and people who genuinely want change. The task is to make them more functional – to mediate them into alignment with grassroots, horizontal, #4opens values.

Transparency vs. the black box. Most NGOs operate like closed castles. Decisions are opaque, wrapped in “internal processes” no one can see. This is poison for trust. The antidote: embed radical transparency. Decisions must be documented, accessible, and open to input. When governance is open, collaboration becomes possible. When it’s closed, suspicion festers and movements fracture.

Flexibility vs. Rigidity. NGOs love five-year plans, KPIs, and strategy frameworks that collapse on contact with reality. In a world spinning into #climatechaos and political instability, rigidity is suicide. The fix: embrace iterative, adaptive paths. Think agile. Test, fail, learn, pivot. If grassroots crews can adapt in the streets and on the fly, NGOs can damn well learn to adapt in their boardrooms.

Tech as Social, Not Specialist. One of the worst NGO habits is treating technology as a “separate department.” IT staff build tools no one uses while the campaigners rely on #dotcons because “that’s where people are.” This deepens dependency and undermines any autonomy. The answer: hard code social understandings into tech frameworks. Train staff in digital literacy. Break the barrier between “techies” and “non-techies.” Build tools with grassroots values at the core, not bolted on as an afterthought.

Decentralization vs. Dependence. NGOs instinctively centralize, but resilience comes from decentralization. #Fediverse and #P2P networks show the way: messy, federated, harder to control, but alive. NGOs need to step off the corporate #dotcons treadmill and start investing in distributed infrastructures that empower communities instead of platforms.

Funding without shackles. Follow the money, and you find the leash. #NGO agendas bend to donors, governments, and foundations. If your funding is tied to maintaining the status quo, radical change is impossible. Solution: diversify funding. Community crowdfunding. Partnerships with projects that share #openweb values. Build independence rather than dependency. Stop mistaking survival for success.

Beyond tokenism. Diversity statements, inclusion workshops, and endless identity branding have become the fig leaves of #NGO culture. It’s just box-ticking while real grassroots voices are sidelined. True inclusivity means messy organizing: bringing in voices you don’t control, valuing experience over credentials, connecting with movements like #XR and #OMN not to manage them but to amplify them. Tokenism builds silos; real inclusivity builds bridges.

The polemic. The NGO crew must choose: remain bureaucratic husks feeding on donor cycles, or transform into allies that enable radical grassroots change. We do not need their brands. We do not need their logos on banners. We need their structures to stop blocking and start enabling. That means adopting the #4opens, embracing federation, composting control culture, and learning from messy grassroots organizing.

The truth is simple:

  • NGOs that cling to their black boxes, their rigidity, their donor-driven agendas, will collapse into irrelevance.
  • NGOs that embrace openness, decentralization, and collaboration can play a real role in rebooting the #openweb.

This isn’t about saving NGOs. It’s about saving movements from being smothered by them.

#KISS #OMN #4opens

A Positive View Of Current Trends

The challenges of today: #climatechaos, inequality, and the social impacts of #dotcons technology are creating a very nasty social mess. However, there is some potential for a positive transformation if we push the power of #openweb and #4opens technology and align it with progressive and radical “native” grassroots politics.

Addressing Climate Change with Technology and Revolutionary change

  • Renewable Energy: Solar, wind, and other renewable energy sources are becoming ever more cost-effective and widespread. With strong political will, we can transition to a carbon-neutral economy. By reducing consumption and shifting this energy balance, we take a step towards mitigating some of the effects of climate change.
  • Climate Resilience: Investment in both physical and social climate-resilience infrastructure, flood defences and mitigation, sustainable agriculture, can shape and protect vulnerable communities and ecosystems as we weather this transition. On the digital side, federation is a big step towards more #p2p native infrastructure, which will help to mediate the failings of our overly centralised #dotcons world.

Leveraging Automation for Social Good

  • Reducing Work Hours: Automation reduces the need for human labour, allowing for shorter work weeks and more leisure time without reducing productivity. This leads to improved quality of life and wider social and mental health benefits.
  • Universal Basic Income: #UBI provides a financial base for building sustainable alternatives, ensuring that wider groups benefit from increased productivity and technological advancements, rather than the usual #nastyfew.

Ensuring Equitable Access to Resources and Services

  • Universal Basic Services: By providing free and universal access to essential services such as healthcare, education, housing, and public transport, we create a more equitable society where people have the opportunity to thrive and build social good.
  • Socialized Finance: Redirecting financial resources from speculative markets to socially beneficial projects ensures that investments are made in areas that improve public well-being and infrastructure.

Fostering a Culture of Innovation and Inclusion

  • Inclusive Policy Making: Ensuring that marginalized communities have a voice in policymaking leads to more equitable and just outcomes. Participatory democracy and community-led tech initiatives like the #OGB drive inclusive development and the needed social challenge to become the change we need.
  • Education and Retraining: As the job paths shift, providing education and retraining opportunities helps people transition to new roles, ensuring that fewer people are left behind.

Utilizing Technology for Global Collaboration and Problem-Solving

  • Global Cooperation: Harnessing #4opens technology for international collaboration can address global challenges more effectively. Federated platforms for knowledge sharing and linking joint initiatives leads to real solutions for climate change, health, and economic development.
  • Data for Good: Using #openweb and #4opens data and metadata analytics to address social issues leads to more effective public planning, policies and resource allocation.

Conclusion: A vision of hope, in tech

There is a potential for a positive future when we combine technological innovation with radical progressive politics and a commitment to social equity. By addressing #climatechange, leveraging automation, ensuring food security, and providing universal access to essential services, we build a wider world of opportunity and basic justice.

This vision needs us to reimagine our current paths to prioritize humanistic well-being over profit. With the right policies and collective action, we can turn today’s challenges into opportunities for basic survival and a better global society.

You can support a technological project, https://opencollective.com/open-media-network, it’s a small step.

Funding application for the #OMN

Funding application for the #OMN (Open Media Network) project, an innovative initiative to revolutionize the landscape of media and communication. The project addresses the limitations and challenges posed by centralized social networks by developing an interconnected network that empowers people, fosters innovation, and promotes openness and decentralization.

What do you think about/Have you heard about project X? We are always interested in learning about other projects that aim to address similar challenges in the media landscape. Collaboration and cooperation are crucial in achieving the collective goal of creating a better internet and society.

Who are your competitors? While established networks like Facebook, Youtube, and Twitter are perceived as competitors, we view them as irrelevant #techshit to be composted. Other decentralization efforts, such as #ActivityPub-based projects, are cooperation partners we aim to reach compatibility with.

How will you attract your first users? We plan to attract our first crew through various strategies, including leveraging the advantages of our system, collaborating with “content creators” and “influencers”, fostering change and linking, and leveraging our network of contacts.

Which programming language do you use? Our team has primarily engaged with the XXXX framework. However, we plan to explore existing open-source solutions in social networking to ensure compatibility with various technologies.

Who are potential users? Potential users of #OMN include social activists, frustrated users overwhelmed with managing multiple accounts, power users seeking greater control over their online presence, content creators and journalists, users with specific needs, decentralization enthusiasts, and anyone interested in an alternative to centralized networks.

How does #OMN make the internet more awesome? #OMN empowers people by offering them the freedom to choose their networks and applications freely, fostering fairness, promoting independent media, fostering creativity, and enhancing people’s experience.

What are you building? We are building a new media experience that allows people to interact with different networks and applications seamlessly, offering greater flexibility and control over their society and local communities.

Why do you want to bring micropayments to social media? Microgifts are essential for supporting community creators and networks, empowering people to support those they trust and enjoy with minimal effort.

What are the goals of #OMN? The goals of #OMN are to empower people and communities, foster effectiveness in competition with #mainstreaming “common sense”, promote independent media, and enhance change and challenge in the communication space.

What does success look like? Success for #OMN includes the development of a working prototype, collaboration with various networks and applications, and widespread adoption of the #openweb “native” #OMN protocol and working practices as an internet/social standard.

What are the key deliverables of the prototyping phase? The key deliverables of the prototyping phase include the development of the #OMN #p2p client, user self-hosting, and Networks & Network Server prototypes, along with detailed documentation for developers and communities.

Who will do the work? Our team, consisting of dedicated people committed to the vision of the #openweb, will primarily handle the work. With funding available, we plan to expand the team to expedite the prototyping phase.

What needs to be done now? We need funding support to commence the development of the prototype and advance the #OMN project to the next stage. This includes development, coordination, collaboration, and public outreach efforts.

How are you licensing any software or documentation you produce? We intend to make all our software openly available, encouraging collaboration and innovation in the open-source community.

How do you communicate publicly about your work? We communicate publicly through various channels, including videos, direct outreach to journalists and content creators, and engagement on media platforms like Mastodon and the #dotcons.

What do you hope to learn during the project? Throughout the project, we hope to learn about community project coordination, software collaboration, public outreach, software technologies, and other relevant fields, ultimately contributing to people’s growth and success.

What happens to #OMN if it does not get funded? If #OMN does not receive funding, we will continue our efforts to raise awareness and support for the project, confident in its value and potential impact on the communication landscape.

Thank you for considering our funding application for the #OMN project. We are excited about the opportunity to bring this “native” #openweb vision to life and look forward to the possibility of collaborating with you.

Talking about p2p as a tool to use today

#P2P projects keep failing socially because adoption is tiny. The #Fediverse succeeds socially because it keeps social #UX familiar. The path forward is a half-step strategy: bridge #fediverse + #p2p in real, usable ways until decentralised clients are socially relevant.

We need: Bridges & killer apps, seamless UX that makes federated + p2p content feel like one stream. A server that reads from both channels without making the user care about protocols.
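As a hedged sketch of that half-step, here is a small merge layer that pulls an ActivityPub outbox and items from some p2p client into one stream, so the reader never needs to know which protocol delivered a post. The p2p client is a hypothetical stand-in; the outbox fetch follows the general #ActivityPub shape but is simplified (no paging, no auth).

```python
import requests

def fetch_activitypub_items(outbox_url: str) -> list:
    """Pull items from an ActivityPub outbox and normalise them (simplified, unpaged)."""
    headers = {"Accept": "application/activity+json"}
    outbox = requests.get(outbox_url, headers=headers, timeout=10).json()
    return [{"id": a.get("id"), "published": a.get("published", ""), "source": "fediverse"}
            for a in outbox.get("orderedItems", [])]

def fetch_p2p_items(p2p_client) -> list:
    """p2p_client is hypothetical: anything exposing .list() returning dicts with id/published."""
    return [{"id": i["id"], "published": i.get("published", ""), "source": "p2p"}
            for i in p2p_client.list()]

def merged_stream(outbox_url: str, p2p_client) -> list:
    """One stream, newest first, regardless of which channel an item came from."""
    items = fetch_activitypub_items(outbox_url) + fetch_p2p_items(p2p_client)
    return sorted(items, key=lambda i: i["published"], reverse=True)
```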

A. What is happening with protocols:

* The #nostr crew are the children of #web3 mess, they are a bit reformed, let’s see.
* Then the #BlueSky are the reformed children of the #dotcons
* The #fediverse is the child of the #openweb
* #dat is a child of the #geekproblem if it is reformed or not, you can maybe tell me?
* #SSB was a wild child, now sickly/lonely with the #fashionable kids gathering round #nostr
* #p2p was the poster child of the era of the #openweb. It was caught in the quicksand of legal issues, and the shadow that was left was eclipsed by “free to use” #dotcons. It now finds it hard to come back, due to mobile devices not having an IP address, and thus most people are not actually able to use p2p reliably.

Q. ssb has technical shortcomings. It can’t sparsely replicate data and verify it. It needs to download all data ever created by a user to verify, which makes it infeasible for many use cases. The main underlying data format is also hard to fix and leads to performance bottlenecks. The main founder moved on, and it seems most ssb people are also looking for a new home.
dat’s time has not yet started as it approached things from a much more fundamental perspective. The initial vision was “git for any kind of data”, which means “version control for any kind of data” (peer to peer). The stack only now reached maturity to build proper tools on top of it. You have the dat-ecosystem with 2-3 dozen projects.
You have the holepunch/pears project which built a phenomenal “never on a server” desktop/mobile p2p video conferencing messenger with built in file sharing.
The app works flawlessly on mobile and is called https://keet.io
Also https://dat-ecosystem.org just now released its new website.
The https://pears.com runtime will be live in 5 days from now, on the 14th of February, for anyone to start hacking on p2p apps, and some time later the plan is to integrate it into the dat-ecosystem website, so anyone can start using p2p from within the dat-ecosystem page (which is an open source static website anyone can fork to get to the same) …no back ends required.
pears 🍐 will only start working on the 14th of February. You can set a reminder.
The revolution starts then 🙂

A. Will have a look, there are a few new #p2p projects reaching use at the mo – the issue is none of them link to each other, and thus likely don’t inter-op. This is the #geekproblem

Q. I don’t think there are any mature projects out there other than dat and ipfs. The former made by open source devs, self funded with a bit of help from public funding bodies, while the latter is the poster child of venture capitalists and got gazillions from investors. It’s the “big tech” of p2p.
Then you have a few less general purpose p2p projects which popped into existence in the last few years, but both dat and ipfs go back all the way to 2013 and it takes a lot to get things smooth and stable and support all use cases and get enough critical adoption and nodes to make the p2p network work.
That is why dat-ecosystem has a lot of existing projects that work and why it is reliable to build on top of it.
I do think the new more recent p2p projects in research state might become mature as well, but it will easily take them a few more years.
Many of those newer projects have people working on them part time only or focus on really special use cases and only time will tell if their approaches will bring something new to the table or not.
2024 will definitely be the year of dat, especially after February 14th, when pears.com goes live. This has been years in the making.
What started in 2013 as (git for data) will now finally become its own independent p2p runtime. Goodbye nodejs & co. …and soon goodbye github & npm 🙂

A. https://holepunch.to/ is a very sparse website with no company info or #4opens process – it looks and feels like many #dotcons. If these projects do not link to each other or inter-op, then they will fail like the hundreds I have seen fail over the last 20 years of this mess-making. It’s a problem, we can’t keep doing this shit, but we do. #4opens is a shovel to help compost this, can you do a write-up for these projects please.

Q. dat-ecosystem is a 501c3
It’s Code for Science and Society
And it is https://opencollective.com/dat
And it is governed by a Manifesto.
It is all on the website next to the “Info” button in the upper left corner.
If you mean pears.com ….that will change on February 14th
I didn’t mention holepunch.
Holepunch is just one of the many dat-ecosystem projects.
It is special, because one of the core developers of dat started it after he got a lot of funding and is currently maintaining much of the important code that powers dat and the dat-ecosystem projects.
But it doesn’t matter too much. The stack is open source under MIT and Apache 2.0 licenses for anyone to use. If holepunch were ever to decide to stop maintaining the stack (which we do not think will happen), dat-ecosystem can find other maintainers.

A. They are the owners of https://keet.io – always look for ownership in #dotcons 🙂 A few of the ones I have been looking at over the last few years: https://www.eff.org/deep…/2023/12/meet-spritely-and-veilid and there was another one funded by NLNET that recently went live, but I can’t find the link. None of them link or interop, not even bridges. This is the #geekproblem

Q. Spritely is a great project.
It embraces the ocap security model (Object Capabilities).
It does apply it in lisp/scheme, which is a great fit with GNU Guix.
Their foundation is led by Randy Farmer.
Randy Farmer co-created Habitat with Chip Morningstar (an MMORPG) in the 1980s.
Chip Morningstar works with Mark Miller (Mentor of Christine Lemmer Webber).
Their project is called “Agoric”, which is a blockchain project funded by Salesforce.
They have their own Token and build a “Market Place”.
They as well work with ocap security model (but in JavaScript).
The JavaScript ocap version is what is known as SES and Endojs.
They regularly talk to make sure things are interoperable.
Ocap security is also what dat-ecosystem is embracing to pair it with peer to peer and bring it to the post-web. A version of the web not dominated anymore by big tech and big standard bodies.

#Veilid is a young and interesting project as well, with a focus on anonymity over performance. This is a great use case that needs support, but dat was always about performance and any size of data, rather than anonymity and privacy at all costs.
I’m not saying that is an unimportant use case, but there are plenty of solutions for extreme cases where anonymity and privacy are at utmost importance.
What is vastly more important imho is to have a p2p technology able to replace mainstream big tech services such as youtube, facebook, instagram, tiktok, google & co., because it won’t help us if we have a special niche technology that can’t actually tackle big tech and surveillance capitalism but gives people some way to hide from it. …we need it too, but we also need a foundation on which to actually outcompete big tech imho.

Keet is a closed source peer to peer messenger & video conferencing app (might be open source in the future) and it is built on top of the dat stack.
The dat stack is very modular and at its core consists of a few main modules.
– hypercore, hyperbee & hyperdrive
– hyperdht & hyperswarm
– autobase
Those modules are maintained by holepunch, an organisation started by one of the core dat developers after receiving a lot of funding to develop keet and now the pear runtime, which will be open source and public under https://pears.com after February 14th 2024 (Valentine’s Day ❤)
Keet itself is one of many apps, all part of the dat-ecosystem.
Most projects are open source, but not all, but they are all built on top of the MIT/Apache-licensed p2p stack, which started as `dat` in 2013 and matured many years ago. The stack is battle tested and really works.
Of course – we all want everything open source and one day we might find a model, but if some closed source apps help bring in funding, it benefits the open source core.
Basically, you can think of “keet” as some fancy UI/UX on top of the open source software stack. Now sure – would be sweet if the UI/UX was open source as well, but then again, it’s not essential and until we transition into fully automated luxury Communism or whatever else works, something pays the bills and enables the open source core to be maintained 🙂
At least it works without any “Cloud Landlords”.
No servers, never on a server. No more cloud lords, a.k.a. Big Tech or #dotcons

A. The best we have currently is #ActivityPub DIY federation – this is community based (but fails in code to actually be this), which in many ways is complementary to #p2p based approaches – they are better together, and if they can bridge or interop this is MUCH better. The #OMN is native to this.

Q. Yes. dat is very low level.
It would be cool to see somebody implement an activity pub based tool on top of it.
One dat-ecosystem project did it for nostr, but no activity pub yet.
I’m personally more interested in desktop, terminal, version-controlled data and software packages. “Social” tools are just one type of tool to build on top of the more fundamental p2p network and p2p system infrastructure.
I do think dat is good for laying these foundations, but “social” tools are a layer that dat as a stack will probably never focus on, but instead dat-ecosystem projects will hopefully take on that challenge 🙂

A. Some people are community-based federated (the start of this conversation), others are individual, the #p2p world you talk about. This is not a fight, they are both valid. As you say, what we don’t want is more #dotcons 🙂 Good conversation on the state of #p2p. I used to be much more involved in this side, but it failed with the move to #dotcons, so I got re-engaged when ActivityPub came along, the rebooting of web 1.5 😉 Are you happy for me to copy this to my blog? I can credit you or just use the A/Q anonymous format?

Q. Any way you want. I don’t think p2p has failed.
the p2p of the past was naive kids playing and it took a decade of adults and all the law enforcement they had at their disposal to bring it down and despite that torrents still run and even the piratebay continues to operate, although heavily censored.
Back then it was a few devs and a majority of users.
This time p2p is back and will enter mainstream open source developers after February 14th 2024 (5 days now).
This empowers an entire generation and anyone who wants to dive into p2p to build any kind of tool.
What was once hard and reserved to a few will be available to everyone.
We might see another nodejs/npm movement.
It loads a bit slow, but load this and check “all time”
This is the largest open source ecosystem humanity has ever experienced. http://www.modulecounts.com/
And while npm/github have been hijacked by microsoft, we will claw it all back soon
Btw. regarding Spritely and the backstory behind OCap, even though extremely technical in description, here is a summary of the work by Mark Miller et al.
https://erights.org/history/index.html
https://en.wikipedia.org/wiki/Mark_S._Miller
> Miller has returned to this issue repeatedly since the Agoric Open Systems Papers from 1988
Mark Miller is Christine Lemmer Webber’s mentor.
He works with Chip Morningstar (who with Randy Farmer did Habitat in the 80s)
Randy Farmer is Executive Director of the Spritely Institute.
Agoric is the Cosmos Framework based Blockchain now.
https://agoric.com/team

A. Interesting to look back at all this stuff, reminds me I had dinner with https://en.wikipedia.org/wiki/Ted_Nelson in Oxford 20 years ago, he was a little eccentric with a clip-on digital recording device, every conversation had to be recorded. Good to catch up with history, https://www.youtube.com/watch?v=h-t405_JAJA is more relevant today.

Q. Yes – peer to peer is hard. Not as a user, it is actually easy enough, but as a developer. Building p2p is not taught anywhere and there aren’t online learning resources the same way you can learn how to set up your react app, etc…
This will change after February 14th 2024 when the pears.com runtime is released. It is powered by the same p2p stack that developed with dat since 2013.
If anyone of you is a developer or has friends who are, you are all invited to dip your toes into the dat water 😛 …and start a new p2p project and join the dat-ecosystem 🙂 It will get quite easy in 4 days from now, and it will again get a lot easier in the coming weeks when more examples and docs are published and others build as well.
The Storyline around Mark Miller, Randy Farmer & Chip Morningstar is totally separate from it, but it is also important, because it is what powers
1. the Spritely project and Christine Lemmer Webber
2. the Agoric Blockchain Project backed by Salesforce
3. the Ethereum Metamask Wallet and Co.
It also influences the big standards bodies and I see it two fold.
It’s a story about philosophy, values and vision driven by the specific people in it.
It is also a story about “object capabilities”, which is a powerful perspective on security and will enable and inform a lot of p2p interaction which, without it, would require some sort of centralized servers, but with ocap can do it on its own, p2p.

A lightly edited conversation between Hamish Campbell (A) and Alexander Praetorius (Q)

The new and old #openweb protocols

A.

The #nostr crew are the children of #web3 mess, they are a bit reformed, let’s see.
Then the #BlueSky are the reformed children of the #dotcons
The #fediverse is the child of the #openweb

Q. Where would you put #dat or #ssb and in general the #p2p post-web tools?

A.

#dat is a child of the #geekproblem if it is reformed or not, you can maybe tell me?
#SSB was a wild child, now sickly/lonely with the #fashionable kids gathering round #nostr
#p2p was the poster child of the era of the #openweb. It was caught in the quicksand of legal issues, and the shadow that was left was eclipsed by “free to use” #dotcons. It now finds it hard to come back, due to mobile devices not having an IP address, and thus most people are not actually able to use p2p reliably.

Archiving the “commons”: the #makinghistory project

The #OMN is building this into the archiving nodes (see the #makinghistory project https://unite.openworlds.info/Open-Media-Network/MakingHistory). You can choose to archive hashtags, and you get a “lossy” view of what is backed up across the network you can see, so you can choose what to focus your archiving on. Then we rely on institutions like libraries, archive.org, universities etc. to back this up in more structured ways. We just do the messy part 🙂 so again it is a balance with no right path.

Messy is good enough for most people, and it’s good to have traditional institutions as a backup to this backup. Remember, all our data is #4opens so nothing is private. Privacy is done #p2p encrypted elsewhere; the best we offer is pseudo-anonymity through Tor.

You can get a “lossy” view of what is backed up across the network you can see, so you can choose what to focus your archiving on. An archiving node is simply a normal node with a different template, #KISS simplicity is where the value is.

This is central to the #OMN as it builds many subject hubs, as you can’t scale storing everything. So federation is a natural outcome, and anyone can run one. When your database gets too full, you look at your “lossy” local network view of what is backed up and start throwing stuff away. If you are nice, you throw away stuff that has a wide distribution of backups; if you are nasty, you throw away stuff only you have.
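A rough sketch of that “nice” pruning rule, with illustrative structures and thresholds rather than anything OMN-specific: when the node needs space, drop the items the rest of the visible network already holds plenty of copies of, and never drop the rare stuff only you have.

```python
def prune(local_ids: list, network_copies: dict, item_sizes: dict, bytes_to_free: int) -> list:
    """Return item ids to delete, most-widely-backed-up first; keep anything rare."""
    candidates = sorted(local_ids, key=lambda i: network_copies.get(i, 0), reverse=True)
    to_delete, freed = [], 0
    for item_id in candidates:
        if freed >= bytes_to_free:
            break
        if network_copies.get(item_id, 0) >= 3:   # only drop items with wide backup elsewhere
            to_delete.append(item_id)
            freed += item_sizes[item_id]
    return to_delete

# Usage: the node sees 5 copies of "old-video" elsewhere but is the only holder of "rare-report".
print(prune(["old-video", "rare-report"],
            {"old-video": 5, "rare-report": 0},
            {"old-video": 700_000_000, "rare-report": 2_000_000},
            500_000_000))   # ['old-video']
```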

Talking about trust and power in networks

A. On the subject of “security”, we have an #open policy of not trusting ANY client-server security at all, so this should only be done #4opens as far as possible, and of having limited trust in #p2p security, even though we use this, because of the insecurity of the underlying systems it runs on, mostly old outdated phones, built as blobs by #dotcons. This simple approach gets round much of the current thinking on technical “security”, i.e. there is almost none at a normal use level and little real security at the paranoid level, as you will be talking to the normal level so their security will fail even if yours is solid. Good to keep this in mind 🙂

The #OMN is all about people messing around with each other’s data 😉 but yes, we need good basic security: (pseudo-anonymous) accounts, public audit trails (#openprocess) everywhere. We will need digital hashes/sigs for media items etc., but the data itself just sloshes around and gets hacked at and added to. It’s a commons; the rules are social, based on trust flows, they are mostly not hard-coded or encrypted. But we add a smidgen of hard-coding and encryption ONLY where it’s needed. So 90% trust flows, 5% social norms, 4% hard-coded, 1% encryption is my thinking.

A. The data has the value; the instance itself is transitory. Yes, the instance is needed and stores the data, but if it vanishes it has little impact on the value (the data), and we build this into the network.

Q. I am talking about the machines

A. We want the instance to stay up and be secure, BUT we build the network so it keeps working when instances are hacked and poisoned by bad actors.

Q. Yes, but that doesn’t mean we make things easy for bad actors

A. Yes, the code and instances have to be secure, but the network flows and the data soup have to keep working when individual instances are hacked and poisoned. No security is foolproof, and the #OMN is focused on building trust, so it is inherently more open to fools; we build with this in mind. We are building a #KISS semantic internet of data/flows. For example, the idea of rollback as a core security model, rather than more traditional hard (control) security, is a good fit: due to the #4opens approach, the missing few days of data will (mostly) roll back into the instance, so the cost of being hacked or of trust failing is less of a block to being open and (socially) trusting when bringing in actors/sysadmins/moderators etc. On the tin, we are clear that our network is a trust-based “lossy” network.

You can still run the #OMN as a hard, control-based secure network if you want, BUT it will not scale to the social change/challenge if this second option is the only one; this is the current #geekproblem we need to work our way out of. The first path, trust-based “lossy”, is where the real horizontal “power” comes from.
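A rough sketch of the rollback idea, with nothing OMN-specific in it: keep cheap periodic snapshots of the data soup, roll back past a poisoned period when a hack or trust failure is spotted, and let the open network refill most of what was lost from peers.

```python
import copy

class RollbackStore:
    """Trust-based 'lossy' store: snapshots plus resync instead of hard control security."""
    def __init__(self):
        self.items: dict = {}
        self.snapshots: list = []

    def snapshot(self) -> None:
        self.snapshots.append(copy.deepcopy(self.items))

    def rollback(self) -> None:
        """Discard everything since the last snapshot (the hacked/poisoned period)."""
        if self.snapshots:
            self.items = copy.deepcopy(self.snapshots[-1])

    def resync(self, peer_items: dict) -> None:
        """#4opens open data means the missing days mostly flow back in from peers."""
        for item_id, item in peer_items.items():
            self.items.setdefault(item_id, item)
```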

Q. We sometimes need to think/talk about “security”.

A. I can only repeat that I don’t have a solution to this, but I have a path to one: make the user-facing side “trust” based, then from this, “trust” people to fix the next “problem”, the #geekproblem of the hard-coded #feudalism of all our networks and code. Or, in other words, head in sand and pray someone else will fix it, am busy 😉

On the #OMN projects, maybe we need to list what needs to be secure: the account, the activity feed, the data credit. There might be more, but I can’t think of much else off the top of my head. And yes, to secure the account the instance has to be secure, to secure the activity feed the flows need to be secure, and to secure the credit there likely needs to be some hashing done on the media objects.
We likely end up back close to the place we started, but we come to this from a very different place, if that makes sense. This path we take matters.
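A minimal sketch of that media hashing, assuming a plain SHA-256 content hash (an illustrative choice, not a spec): the fingerprint travels with each media object so credit and integrity can be checked as the data sloshes around the network.

```python
import hashlib

def media_fingerprint(payload: bytes, creator: str) -> dict:
    """Attach a content hash and a creator tag to a media object."""
    return {"sha256": hashlib.sha256(payload).hexdigest(), "creator": creator}

def verify(payload: bytes, fingerprint: dict) -> bool:
    """Check that the bytes we received match the fingerprint that travelled with them."""
    return hashlib.sha256(payload).hexdigest() == fingerprint["sha256"]

original = b"raw video bytes ..."
tag = media_fingerprint(original, creator="node-x")
print(verify(original, tag), verify(b"tampered bytes", tag))   # True False
```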