Market Failure: Green Energy, Capitalism, and the Path We’re Not Taking

Professor Brett Christophers (Uppsala University)

This lecture will explore the shortcomings of market-driven solutions to the climate crisis, the role of green energy, and the structural limits of capitalism in addressing environmental challenges.

The climate crisis is getting worse, not better. We are burning more fossil fuels, not less. Even with the massive expansion of renewables, energy use is still rising, because green growth adds to consumption rather than replacing it.

So, what’s blocking real change? Professor Brett Christophers lays it out: It’s not economics—it’s politics. The cost of renewables is dropping, largely thanks to China’s command economy driving down manufacturing costs. But the real problem is deployment, not production. Governments in the rich world still rely on the private sector to make the energy transition, using subsidies, tax incentives, and market nudges.

But capitalism is not built to save us, the market won’t solve this. The profit motive is a #blocking force. The oil and energy sectors are oligarchic, meaning investment only flows where market control guarantees profit. Renewable energy doesn’t work this way. Once solar panels or wind farms are built, everyone benefits, so investors can’t “capture” the value in the same way fossil fuel companies can.

This is why China is leading the transition. In 2023, 65% of global renewable investment was happening in China; before that, it was 90%. In contrast, the for-profit world is barely moving. The left is starting to rethink public ownership, but decades of privatization and #neoliberal dogma make this difficult, especially in the Global South, where many countries lost their public energy sectors over the last 40 years.

One small but key issue is that we are trapped in a modernist mindset, where the lights must come on when you flick the switch. The market logic of energy scarcity (storage = control = profit) is at odds with the need to stabilize and expand access. When energy storage becomes widespread, its market value drops, meaning investment dries up before it even begins.

Public ownership has a bad history, but so does privatization. Without cultural change, we are stuck with broken systems that won’t save us. The coming storm: in the next 10–20 years, shit is going to hit the fan. #climatechaos is not a distant threat, it’s already disrupting global energy grids. Look at China, where hydropower is failing due to extreme drought, and where record heat waves are driving air conditioning demand through the roof. These are feedback loops that increase carbon emissions, pushing us closer to tipping points.

Governments aren’t prepared for the chaos that’s coming. If history is any guide, they’ll do what they always do: double down on control, repression, and violence. As the crisis deepens, we could see a return to 20th-century authoritarian solutions, forced migration, resource wars, and military crackdowns. If you’re young today, ask yourself: What future are you walking into? What careers will put you on the wrong side of history? Which paths will put a gun in your hands, or leave you standing in front of one? These are grim questions, but they are real.

The #Deathcult has failed, what comes next? For 40 years, neoliberal capitalism has blocked systemic change. Market redesign might be possible, but power and politics shape the system, and the #deathcult that built this mess won’t give it up easily.

The #dotcons are stepping into the void. Big Tech is now playing the role governments used to play, guaranteeing long-term energy contracts to fund #datacenters and #AI infrastructure. But this is a narrow and unstable path, it’s more noise than signal.

We need alternatives, we need #publicownership, #commons-based solutions, and governance. We need to mediate our overconsumption, compost the #mainstreaming, and reclaim progressive paths before capitalism drives us into collapse.

If we don’t, the market’s failure will become our failure, and the planet won’t care whether we survive or not.


Market Failure: Climate Crisis, Green Energy and the Limits of Capitalism

Professor Brett Christophers (Uppsala University)

This lecture will explore the shortcomings of market-driven solutions to the climate crisis, the role of green energy, and the structural limits of capitalism in addressing environmental challenges.

My notes:

We are using more carbon-based energy, adding to energy use with “green growth”. This varies regionally, but the numbers are going up, not down.

What is #blocking this? It’s political and policy, he argues, the NIMBYs. The economics are not a problem, the costs are going down. The costs coming down is due to China with its central command economy, and this is a useful view of the path we need to take. What’s #blocking it has to do with profitability, not generating costs. What does this mean? Deployment is the hidden “cost”, the hidden restraint. Governments in most parts of the world are relying on the private sector to make this energy change, using nudges, subsidy etc. The motivation is profit, and “confidence” in this profit.

Can capitalism save us?

The oil industry is full of oligarchies, and this shapes investment. Electricity is the same, but how it’s generated has its own market value. Your ability to make a profit is only based on you capturing the market sector. The tech change helps everyone, so there is no profit, no value, if the investment can’t “capture” a sector.

He slags off the Labour Party’s understanding of this in the UK. One answer is market redesign: what we have is not “natural” but planned, shaped by power and politics and for the agenda of this power. Then we have the artifice of “price”; we have not planned this well enough yet, externalities. In the UK the carbon tax could be argued to have worked, with the phase-out of coal power complete now the last coal plant has shut. But the cost of a real carbon tax is too high for our “democracy” to implement. This is likely true.

More subsidy is another example; the Inflation Reduction Act in the US is one, to incentivise the private sector to make the change in energy production.

The left criticises this as anti-market. It’s still not working, and this argument is likely true, look at China. In 2023, 65% of global renewables investment was in China; before this it was 90%. There is almost nothing happening in the for-profit world; for-profit is obviously not working. The left are starting to rethink public ownership as a path.

In China there are contradictions, it’s a mix of clean and dirty. Energy demand is growing very fast, and climate change is driving this in part, with the disruption of hydropower and the heat waves driving air conditioning, it’s a feedback loop. But it’s instructive that with a very different political economy you can have very different outcomes in the energy transition.

This path might happen in the rich north, but will be hard to do in the weak south. They just don’t have the public budgets; some of these countries only lost their public energy sectors to privatization over the last 40 years.

We are stuck in the modernist mindset: the lights must come on when you flick the switch. This is still a core #blocking force. Storage is to tame the market, to stabilize the price. The business model is based on the scarcity of storage, so when we implement it, it can easily lose its market value, so investment will not flow in the first place.

Culture change is needed, as public ownership has as bad a history as for-profit ownership; without this cultural change we don’t solve any of the mess.

One path is blended finance, but there is very little of this existing, so it’s not going to happen in a meaningful way, despite the fluffy propaganda people spread.

The question of responsibility?

In the next 10–20 years shit is hitting the fan with #climatechaos, and we are likely to go back to the 20th-century tradition of shooting people. I am wondering, for this generation’s job prospects, what careers are likely to lead to you being shot when this history repeats, and what careers will leave you with the metaphorical gun in your hands; both of course are bad outcomes. But it would be useful for young people to think about this to help choose a path after #Oxford

The question of cross-discipline work for the students comes up, but he says this is really hard: narrow areas, grants, and culture. His answer is pessimistic: play the game till you have the power not to play the game. Mess. He does not like it, but advises young people to play.
Market redesign, the #deathcult fucked over this path over the last 40 years.

AI and distributed energy: the #dotcons are pushing this. They perform the same role governments used to play, by guaranteeing prices in long-term contracts for their new data centres; they promise long-term fixed prices, which lets the banks fund projects. This is a very limited funding flow, so more noise than signal.

Worshipping at the Temple of the #Deathcult: The Business Class and Its Myths

At the Oxford Arms Dealer School, in the room with the “enemy“, the business class, we gathered to hear Rain Newton-Smith, Chief Economist and CEO of the Confederation of British Industry, preach to the “faithful”. But at the drinks after, I find the audience is actually a mix of locals and academics, less enemy than frenemy. The wine and nibbles are good.

The message of the talk? Confidence is the mythical glue that holds together the #deathcult of #neoliberalism. The sermon? A familiar tale: business must be given free rein, deregulation is the key to prosperity, and any redistribution is a sin against the gods of capital. If only we believe hard enough, the market will save us. The myths of confidence and growth: Newton-Smith speaks of investment, but not for public good; this is about private wealth. Her concern is business confidence, the great phantom that, if disturbed, will cause the economy to crumble. The solution? Keep to the path, no change, no challenge. Keep worshipping the deathcult, and perhaps the gods of profit will smile upon us.

A nod to #climatechaos, but only as an economic opportunity. No mention of the wreckage it has already caused, only that with the right “leadership” (read: the same leadership that led us here) we can turn catastrophe into a marketplace. Innovation will save us, more mythology.

China? She’s pragmatic, trade first, morality later. The UK? She hopes for “stability”, a stable continuation of 40 years of destruction, a sweeping away of the mess, not to fix it, but to make the temple of capital more presentable, more safe for capital.

Fear and the business priesthood is the overriding theme of the event. Fear of uncertainty, fear of change, fear that the high priests of capital in the current government might lose faith and deviate from doctrine. The business class wants certainty, certainty that their power remains untouched, their profits unchallenged, their control intact.

The EU? Negotiation, to reduce fear. Trade? More important than people, the fear of disruption. Regulation? Only if it removes uncertainty, fear is the real enemy.

The Q&A touches on AI. A bubble of nonsense inflates and then bursts, but somehow the same mythology survives. #AI will fix capitalism’s problems, we are told. A few #climatechaos activists push back; capitalism will heal itself through “innovation” and faith, she says. At every turn, she circles back to the cult, unwilling, or unable, to step outside the narrow doctrine of the worship of capital.

Conclusion: the mythology in this space remains intact. This event, like the building it’s held in, is a temple to the #deathcult. Nothing changes, because they fear change more than they fear collapse. The business class doesn’t seek solutions, it seeks certainty. It doesn’t want to fix the mess, it just wants to ensure its own survival as the world burns. Regulation is acceptable, but only if it protects them from risk. Innovation is holy, but only when it upholds the status quo.

Yes, this is the same 40 years of mess, we do need to break free from #KISS

The #dotcons share an ideology

There is a tech ideology that masks corporate power, and this view of #mainstreaming cyberlibertarianism is a bizarre ideological mishmash: a combination of hippie flower power, economic neoliberalism, and a heavy dose of technological determinism. It’s the credo of Silicon Valley, so much so that for years it was known as the “Californian Ideology.” This “thinking” shapes the tech bros and their billionaire overlords, who for the last ten years have pushed #cryptocurrencys and now claim that technologies like #AI hold the key to solving all human problems and offer “endless opportunities” for wealth, power, and pleasure. Naturally, anything that stands in the way of this vision, government regulation, public oversight, and most importantly collective action, must be swept aside. For many years, this sounded like a progressive path to some, but it’s riddled with obvious contradictions and dangers.

Many of the problems we face are inherently political, requiring systemic solutions that involve collective governance. Yet, the CEOs, executives, and vulture capitalists would rather you believe that the solutions lie in the “free market”, which is then conveniently funnelled through their platforms and products. This serves their interests in maintaining power and wealth while pushing aside meaningful public accountability and any possibility of an alternative.

This fusion of #geekproblem libertarian engineers and anti-government #fashionistas gave rise to the foundational myths of this #geekproblem flow: that technology empowers individuals to create a better world. In the 1990s, cyberlibertarianism became the dominant ideology in Silicon Valley. Yet, as this ideology flourished, it should have been clear that its vision of “freedom” was fundamentally flawed.

The rhetoric of the #techbro claims to be about freedom—freedom from government oversight, freedom of speech, and freedom to innovate. But in practice, this freedom is selective. It serves the powerful and nasty few while ignoring or exploiting the vast majority. This omission is central to the current #dotcons and parts of our #openweb reboot. By focusing exclusively on the dangers of government tyranny, it ignores how corporations can wield just as much, if not more, power over people. This isn’t an accident—it’s the entire point. Silicon Valley’s billionaires don’t want less power for themselves; they want less oversight from governments and the public.

Neoliberalism becomes the new normal to justify policies that benefit the nasty rich. This path of our current #dotcons oligarchs is no accident. The vague anti-government ethos provides the perfect cover for neoliberal policies. By dressing up deregulation, tax cuts for the wealthy, and the dismantling of public services in the language of “freedom,” both tech billionaires and neoliberal politicians can push their agendas without ever addressing the systemic issues of capitalism, inequality and exploitation.

The Musk empire is a prime example, while he rails against government interference, he eagerly accepts billions in subsidies, pushes for deregulation that benefits his companies, and weaponises his platforms to amplify far-right ideologies. Since taking over Twitter, Musk has turned it into a haven for white supremacists and conspiracy theorists, throttled links to media outlets he dislikes, and boosted his own tweets to ensure maximum visibility.

This is the logical conclusion of the path we have all walked down with our embrace of the #dotcons. By rejecting democratic oversight and embracing a narrow, individualistic definition of freedom, we have consolidated power in the hands of the wealthy, nasty few #techbros and their acolytes. For all the rhetoric about empowering individuals, this path has always been about protecting the privileges of the nasty few.

We see in the USA this Silicon Valley influence growing. Now more than ever, it’s crucial to challenge these paths and step away from the #dotcons these inadequate and nasty people control. We need to understand that freedom isn’t about the absence of government oversight, it’s about creating a humanistic society where power is accountable, resources are shared more equitably, and everyone has the opportunity to grow. The spreading fascism hiding behind the ideology of cyberlibertarianism offers none of this. Instead, it offers us a neo-feudalism: tech kings, knights and priests who claim to liberate us while consolidating their control. It’s time to see through the shiny algorithm-driven façade and make the effort and focus to build something better. With the native #openweb reboot we have the tools to do this, with #OMN there is a different technological path we can take.

Metaphors matter, composting the current paths in #AI

This #AI-meets-copyright consultation is another wave of opportunistic grafting, much like the #crypto mess before it. The rhetoric about leveraging AI to “grow the economy” and “improve public services” is justification for a “commons” grab by nasty interests. It’s the normal pushing in the ongoing path of #deathcult worship, 40 years of #neoliberalism, digging us deeper into a hole we desperately need to climb out of.

The metaphor of composting captures the urgent need for discernment: what cultural and technological artefacts still serve us in the onrushing era of #climatechaos, and what is toxic and must urgently be composted. People ask what we mean by this; in its cultural sense, composting is about adapting the remnants of the deathcult into something fertile for a radically different way of life. This is achievable only if we act swiftly to embrace radical change while there’s still time for the metaphor to remain metaphorical. Delay, and #climatechaos will render the metaphor physical—turning our cities, infrastructure, and economies into literal waste piles, where the nasty few will be left to fight over the scraps.

This urgent need for sorting what’s salvageable from what’s dead weight requires critical thinking and collaborative effort. We need projects like the #OGB to build affinity groups of action, to balance radical action with consensus-building. While consensus about the failures of the last 40 years is important, we need to avoid falling into the trap of endless sterile deliberation. The urgency of the moment demands bold, practical action to balance the needed intellectual and rhetorical critique.

The metaphorical shovel is right there, let’s use it. What we need is a clear framework (#OMN) to identify what is compostable (ideas, tools, and systems that can support a degrowth future) and what must be discarded to the compost heap. A part of this is cultural agitation to shake people out of their complacency, as the economy of thinking must shift radically. This has to be on a positive path to community resilience, building networks of mutual support, trust and regenerative paths, not the default #deathcult control/fear paradigm we are currently walking.

#AI could play a role if it’s on the path, but the current #dotcons push to #AI is part of a “last binge” of neoliberal exploitation; it’s largely irrelevant to the path we need to take, and we urgently need to ignore it and shift #mainstreaming conversations to focus on what we actually need. The challenge is to redirect the narrative: how can we use our technology to empower grassroots alternatives to build a post-deathcult world? We need to do this in tandem with radical action for fertile new growth. Delay, and we’ll find ourselves buried under the non-compostable remnants of a civilization too slow to adapt. It is well past time to grab that shovel. #OMN

Blavatnik Book Talks: The Forever Crisis

This is my reaction to the talk; I have not read the book.

In The Forever Crisis, the author presents complex systems thinking as a framework for addressing the world’s intractable challenges, particularly at the level of global governance. The book critiques the traditional top-down approaches that are pushed by powerful institutions like the #UN, highlighting how these solutions are mismatched to complex, interwoven issues like #climatechange, security, finance, and digital governance.

One of the core issues raised is that global governance structures are failing to keep pace with the crises they are supposed to address. Traditional approaches “silo” issues, handling them in isolation, which makes it hard for messy interconnected challenges to be addressed in a holistic way. For example, while climate change is universally recognized as a priority, the complex “network of governance” is fragmented, leaving institutions like the UN and #IPCC struggling to effectively drive change. These traditional, siloed paths reflect a short-term vision, prioritizing superficial “silver bullet” solutions over systemic, transformative approaches.

A complex systems approach likens effective governance to networks such as the “mushrooms under the forest floor”—resilient, interconnected, and adaptable. Rather than rigid, top-down mandates, this metaphor supports creating flexible, networked governance structures that can adapt to shifting crises. The notion of cascading solutions is key here: solutions should ripple across systems in a way that amplifies positive outcomes, rather than relying solely on isolated, large-scale interventions.

The talk highlights how unready our institutions are, stressing the importance of adaptability in governance, particularly in preparing for shocks, both anticipated and unanticipated. Using COVID-19 as an example, he critiques the over-reliance on “luck” rather than robust structures, suggesting that governance systems must be nimble and interconnected enough to absorb shocks without collapsing. Currently, we have a farce: the UN and other agencies are trying to act as “confidence boosters,” convincing themselves of their own effectiveness.

Challenges to implementing complexity in governance, despite the potential of complexity theory, the talk raises significant questions about implementation. Power structures are deeply entrenched in traditional governance systems, making it difficult to shift away from rigid, reactive models. Further, financial systems tend to funnel resources into quick-fix solutions rather than funding long-term, adaptive responses.

My thought about the talk, on mainstream solutions: it touches on an essential question, can the existing structures within the “#deathcult” of neoliberalism actually provide the transformation we need? This perspective aligns with the book’s critique, questioning whether today’s dominant structures can truly embrace a complexity-oriented approach to governance. To solve this, I focus on #Indymediaback, #OMN, and #OGB as grassroots projects, which underlines an alternative that prioritizes local, networked, and community-driven solutions—a departure from the centralized and out-of-touch responses typical of global governance.

The book’s focus on complexity theory as a tool to facilitate self-organizing, resilient systems could be a powerful argument for the decentralized path I advocate. This framework validates the idea that change might be more effectively driven from the grassroots, where diverse actors work in networked patterns that reflect the natural resilience seen in ecosystems.

The talk:

Join Thomas Hale, Professor in Public Policy at the Blavatnik School of Government, and Adam Day, Head of UN University Centre for Policy Research in Geneva, as they discuss Day’s newest book The Forever Crisis.

The Forever Crisis is an introduction to complex systems thinking at the global governance level. It offers concepts, tools, and ways of thinking about how systems change that can be applied to the most wicked problems facing the world today. More than an abstract argument for complexity theory, the book offers a targeted critique of today’s highest-profile proposals for improving the governance of our environment, security, finance, health, and digital space. It suggests that we should spend less effort and resources on upgrading existing institutions, and more on understanding how they (and we) relate to each other.

My thinking and notes.

It’s the #NGO crew talking about my subject; this is a professor and the #UN Secretary-General’s adviser. Start with basic complexity, telling a normal story.

Globalisation drives complexity, the nudge theory, the network of governance which we have to manage. Use the IPCC as a tool, but this is a mess. The argument for big solutions, top down, is a bad fit for complexity thinking. The solution is tentacles? Or the mushrooms under the forest floor, network metaphor.

Shifting tipping point, to shift change

Long problems demand complexity, current risk is undervalued

Transformative global governance, or our current global governance could go extinct.

We have enough data for AI to be used as early-warning “advising” governance.

So this is mainstreaming looking at change and mediating the challenge. Whether it works at all is an open question, and it’s looking unlikely, looking around the room.

He says we can’t co-operate, and in his terms this is correct. The solution is to try and “trick” the current systems into working together; I don’t think he gets beyond this.

UN Women calls the current path a failure, and says this is ongoing, but MUCH more urgent now.

In the report, the silos were knitted together, but nobody understood this, so then it was unpacked into silos so that people could accept it.

The conference that did this report was in large part a confidence booster that the current systems could actually work. This is a very small step. No war was won.

There is a consensus that the current process is failing and needs to change to challenge the current structures. The problem of re-siloing, the crumbling of bridges as they are being built; the outcome is that the establishment is still blocking the needed bridging.

For him, the ideas don’t create transformation. They spent a year going over old agreements, the new issues were not focused on. This was a problem of trust and transparency. So the whole process was knocked back a year.

Is this change easier or harder during crises? We tend to think that crises create flexibility, but he argues they hold together stronger when change might be happening? She points to the defence crunch, that change is being blocked by the crises, it’s complex.

Are any of the current institutions fit to govern #AI?

Finance funds silver-bullet solutions rather than long-term solutions. A quick fix fixes nothing; it’s funding poured down the drain. His solution is a real cost on carbon, if we can get the spyware command and control right to make this work.

On chip verification, hardcoded spy and control in our chips… now this is a very #geekproblem idea.

Can the states rise to the work? She says we hope so 🙂 as there is no alternative 🙁 We want states to work, in partnership with the private sector… we need the UN to perform its function, partnering with other actors: private sector, civil society etc.

Capacity building is 10% of the climate budget, and this is about writing PDFs; the people doing the change are simply not there.

Q. on the time to act, with the example of Gorbachev and the collapse of the Soviet Union.

Resilience is not a good thing if the things that are resilient are paths that are not working.

Can we bake in a long term path into current decisions?

How can we change the existing system so that it balances?

The word leadership, the idea that individuals playing a role can be the change, is a subject that excites them.

My question would have been: the #deathcult – are there any actors or forces outside this cult that you see could be the change we need?

He: cascading solutions across the system fast enough to be the change we need?

She: better preparedness for the shocks, so we can pull together, to deal with issues we have not anticipated. We are not there yet.

COVID was an example of luck not structures.

#oxford

The current “debate” about AI is a distraction #KISS

The debate over AI’s energy consumption is one piece of a larger mess about technological paths in the face of current existential risks. Yes, #AI’s energy demands are a huge #dotcons waste, but focusing only on this distracts us from a more needed discussion about the underlying ideology and assumptions driving the #geekproblem technological paths. An example is the ideas of #longtermism, let’s look at this:

#Longtermism is a philosophy that prioritizes the far future, arguing that we should make decisions today that benefit humanity hundreds or thousands of years from now. Proponents of longtermism advocate for technological advancements like AI and space colonization, pushing the idea that these will ultimately secure humanity’s future, that is, after many of us have been killed and displaced by #climatechaos and the resulting social breakdown of mass migration. The outcome of the last 40 years of worshipping the #deathcult is this sleight of hand of changing the subject; yes, it’s a mess.

This mindset is a ridiculous and obviously stupid path we should not take; some of the issues:

  • Overconfidence in predicting the future: Longtermists assume that we can reliably predict the long-term outcomes of our actions. History has shown that even short-term predictions are fraught with uncertainty. The idea that we can accurately forecast the impact of technologies like AI or space colonization centuries from now is, at best, speculative and, at worst, dangerously hubristic.
  • The danger of #geekproblem mentality, the idea that we should “tech harder” to solve our problems, that is, to invest more heavily in advanced technologies with the hope that they will eventually pull us out of our current crises, mirrors longtermist thinking. It assumes that the resource consumption, environmental degradation, and social upheaval caused by these technologies will be justified by the benefits they might bring in the future.

This path is the current mess, and it is flawed for many reasons:

  • Resource Consumption: The development of AI, space technologies, and other technological “solutions” requires vast amounts of energy and resources. If these technologies do not deliver the expected returns, the initial resource consumption itself exacerbates the crises we are trying to solve, such as the onrushing catastrophe of climate change.
  • Opportunity Costs: By focusing on speculative technologies, we neglect immediate and practical solutions, like transitioning away from fossil fuels, which mitigates some of the worst effects of climate change. These simpler, more grounded paths may not be as glamorous as AI or space travel, but they cannot backfire catastrophically.
  • Moral and Ethical Implications: Is it right to invest heavily in speculative technologies when there are pressing issues today that need addressing—issues that affect billions of lives? The idea that a few future lives might be more valuable than current ones is a dangerous and ethically questionable stance.

There is always a strong case for caution and pragmatism in technology. Instead of betting our future on high-stakes #geekproblem technological gambles, a pragmatic approach that focuses on solutions offering benefits today while reducing the risks of tomorrow is almost always a good path. For example, changing our social relations and economic systems away from the current #deathcult, by using social tools to invest in renewable energy, rethink urban planning, and restore ecosystems, would be actions that can have immediate positive effects while also contributing to a humanistic future. This #KISS path carries far fewer risks if it turns out to be less impactful than hoped. The worst-case scenario with renewable energy is that it doesn’t solve every problem—but it won’t make them worse. In contrast, if AI or space colonization doesn’t deliver on its pie-in-the-sky promises, the consequences are simply disastrous.

A #mainstreaming view of this mess

A call for grounded action: the challenge of our time is not to “tech harder” in the hope that advanced technologies will save us, but to consider the balance between “native” humanistic innovation and #dotcons caution. The example here is #Longtermism, which, with its emphasis on far-off futures, leads us down a dangerous path by neglecting the immediate, tangible actions we can take now, not in a thousand years. We need to focus on paths that address our most pressing problems without risking everything on pie-in-the-sky, self-serving mess making. This means actions like reducing fossil fuel dependence, preserving biodiversity, and building social systems of change and challenge like the #OMN and #OGB, steps that will help us build a resilient and humanist world for both the present and the future #KISS

The media noise about the current #AI is mostly noise https://www.bath.ac.uk/announcements/ai-poses-no-existential-threat-to-humanity-new-study-finds/ and money mess; it’s the normal #deathcult with a bit of kinda-working tech.

A Critique of “fluffy” Leftist and Progressive #AI Paths

In our conversations on #AI there is a copyright trap, pushed in the #mainstreaming. The #fashionista conversation around protecting producers and cultural industries is growing hysterical. Some policymakers and activists are pushing to shield creators from the very real threats posed by these new technologies. However, in their haste to act, the leftist and progressive crew are advocating the use of copyright law as a defensive path. This approach is a mess, fraught with contradictions and risks, a real “Copyright Trap”.

The Copyright Trap is the “common sense” belief that copyright law can be used as a tool to support and protect producers of our culture. This path is problematic:

  • Feudal Nature of Copyright: Copyright, along with patents and trademarks, is a form of intellectual property that comes from feudal rights. It grants semi-eternal rents to those who did not contribute to the production of the work, much like the way land was historically controlled by a few powerful lords.
  • Restriction of the Commons: Copyright takes works out of the public domain and locks them into walled gardens, thus restricting the commons. This runs counter to the principles of access and communal sharing that activists and progressives champion.
  • Injustice to Future Creators: By extending and expanding copyright protections, we make it harder for future producers to build upon the shoulders of giants. This stifles creativity, trapping future generations in a cycle of restricted access and limited freedom.

This mess underpins the current debates around AI and copyright:

  • “If Value, Then (Property) Right” Fallacy: This is the ideological belief that if something has value, it must be protected as property. This ignores the complex ways in which value is created and shared, particularly through communal and collaborative efforts, that do not fit into property rights dogma.
  • Unauthorized Copying as Inherently Wrongful: The idea that copying is wrong ignores the realities of how culture and knowledge have always developed, through imitation, adaptation, and remixing. This perspective is particularly ill-suited to the #openweb era, where information is shared and transformed.
  • The Starving Artist Trope: This trope is resurrected to justify the expansion of copyright protections, suggesting that without such protections, artists will starve. This story fails to address the systemic issues that actually lead to the impoverishment of producers, such as inequitable distribution of wealth and the monopolistic practices in the #dotcons.

Using copyright as a weapon against AI companies is counterproductive and hypocritical for those who advocate for the rights of authors, creators, and intellectual workers:

  • Counter to Progressive Values: Copyright as it stands is a tool of capital that entrenches inequality and restricts access to knowledge and culture. Using it to protect producers from #AI companies simply reinforces a system that many leftists and progressives have long criticized.
  • Locking Up the Commons: Stronger copyright protections risk enclosing the cultural commons, making it difficult for producers to share content freely so it can be built upon.
  • Hindering change and challenge: Stricter copyright laws stifle social activism, as new producers find it harder to access and build on existing works. This is detrimental in an era where collaborative and iterative creation is key to technological and cultural progress.

Alternative Approaches, to effectively address the risks and harms posed by generative AI, we need to move past the “copyright trap” and look towards more appropriate “native” paths:

  • Promote Open Access and Open Source: Encourage the use of open access and open source licenses and traditions that allow for the free sharing and modification of works. This helps knowledge and culture remain accessible for social use.
  • Equitable Funding Models: Develop new models for supporting creators that do not rely on restrictive copyright laws. This could include systems of public funding, grants, and cooperative ownership that ensure people are fairly compensated for their work without repressively restricting access.
  • Regulation of #AI Companies: Rather than using copyright as a blunt instrument, on the vertical path, we can regulate AI companies directly. This includes measures to ensure transparency, accountability, and fair compensation for the use of creative works.

The call to use copyright law to protect producers from the threats of #AI is not a useful path for leftist and progressive movements. Instead of reinforcing a flawed and restrictive system, we need to seek “native” paths that align with our values. By doing so, we build a future where both humane creativity and resulting technology can thrive in balance, and not just #techchun the current mess.

Why #AI is more #techshit

The #stupidindividualism of Silicon Valley’s ideology, built around tech-driven libertarianism and, as our chattering classes say, “hyper-individualism”, is spreading social mess and #techshit that we need shovels to compost. It’s now clear that these anti-#mainstreaming ‘solutions’ create more problems than they attempt to solve, particularly in terms of social breakdown and environmental damage. The utopian dreams of tech billionaires collapse under the weight of onrushing real-world challenges. This should make the #geekproblem, the limits of technocratic fixes, visible to more of us. The once-promised technologically mediated future of freedom and innovation has been shown to be one of control and chaos, which should make it obvious that we need to take different paths, away from Silicon Valley’s delusion.

A podcast from our weak liberals on the subject of #AI https://flex.acast.com/audio.guim.co.uk/2024/07/15-61610-gnl.sci.20240715.eb.ai_climate.mp3 gives a #mainstreaming view of the mess we are making on this path. The big issue is not the actual “nature” of AI, though that is not without issues. What I am covering here is that #AI is reinforcing existing power structures and socioeconomic realities, #neoliberal ideology, and historical bias. This is driven by the goals of enhancing efficiency, reducing costs, and maximizing profits through increased surveillance, which in itself should raise ethical concerns about privacy and freedoms, concerns that the #geekproblem so often justifies under the guise of security.

We need to think about this: AI systems trained on data from the past 40 years are inherently biased by the socio-political context of that period, perpetuating what are now outdated and obsolete beliefs. This historical bias locks in narrow ideological paths, particularly those associated with #neoliberalism and our 40 years worshipping at this #deathcult. This is not only a problem with AI, it’s a wider issue: we continue to prioritize economic growth over social and environmental paths. With the recent election victory in the UK, the Labour Party is pushing the normal #mainstreaming established during the #Thatcher era. In this we see past ideologies continue to shape current #mainstreaming political paths; the tech simply reinforces this.

It’s hard to know what path to take with this mess. Ethical frameworks and regulatory oversight to guide the responsible use of AI might help. By addressing the current mess and challenges, we might be able to work towards an AI path that reflects diverse perspectives and serves a more common good, rather than reinforcing the narrow #deathcult litany and the hard-right ideological paths it grows by default. Recognizing and addressing the challenges in AI development is the first step towards the change we need, to challenge us to compost this social mess and the heaps of #techshit we have created, which shape us.

UPDATE: An academic paper talking about this has just come out https://arxiv.org/pdf/2410.18417

The Mess of Web3: Why #openweb natives question the Blockchain Narrative

In the ongoing discourse surrounding #openweb and its relation to failing technologies like #web3 and #blockchain, a critical question emerges: why do we readily accept solutions without first defining the problem at hand?

“… it’s not secure, it’s not safe, it’s not reliable, it’s not trustworthy, it’s not even decentralized, it’s not anonymous, it’s helping destroy the planet. I haven’t found one positive use for blockchain. It has nothing that couldn’t be done better without it.”

— Bruce Schneier, *Bruce Schneier on the Crypto/Blockchain Disaster*

The allure of decentralized autonomous organizations (DAOs) and blockchain technology for the last ten years has overshadowed the necessity of understanding the fundamental issues within our communities. Instead of exploring how we want to govern, decide, and interact within our communities, we find ourselves seduced by the promises of #DAO pitches.

The core of the matter lies in the conflation of culture with technology. Every time a DAO or blockchain solution is proposed, the culture and organization of communities become intertwined with the #geekproblem tools being offered. This bundling tactic obscures the essence of the technology and stifles meaningful discourse. By presenting technology as a fait accompli, we are robbed of the opportunity to critically assess its implications.

In the realm of the #openweb, technology is envisioned as a manifestation of communal decisions and conscious choices. It is the crystallization of community values, traditions, and needs. Blockchain and DAOs represent the antithesis of this vision: they dictate choices rather than empower communities to determine their own paths.

One of the most concerning aspects of blockchain technology is its enforced financialization within communities. The implementation of ledger systems and tokens mirrors the #dotcons capitalist market traditions, where wealth equates to power. In stark contrast to the principles of “native” gift economies and communalism, blockchain perpetuates a system where those with the most resources wield influence.

In this, even in #mainstreaming dialogue, these ten years of the blinkered move to blockchain threaten to undermine centuries of liberal evolution by replacing established legal systems with #web3 engineers acting as arbiters of justice. This shift from #mainstreaming transparent and “equitable” legal frameworks to opaque and centralized technological solutions is deeply troubling.

As proponents of these ideals, we should question the last ten years of blockchain and DAO narratives. We must resist the allure of #geekproblem technological solutions that obscure the essence of community governance and autonomy. Instead, let’s engage in meaningful dialogue, grounded in a clear understanding of the problems we address and the values we hold, to forge a “native” #openweb path.

We now face another wasted ten years of #AI hype with the same issues and agenda. We have to stop feeding this mess.

#OGB #OMN #makeinghistory

Algorithms of War: The Use of AI in Armed Conflict



Joel H. Rosenthal (Carnegie Council for Ethics in International Affairs), Janina Dill (University of Oxford), Professor Ciaran Martin (Blavatnik School of Government, Oxford), Tom Simpson (Blavatnik School of Government), Brianna Rosen (Blavatnik School of Government, University of Oxford)

Algorithms of war

Arriving early, the panel and audience are ugly broken people, priests and worshippers of the #deathcult

Near the start the young and energetic start to flood in, eager and chatty yet to be broken by service of the dark side of #mainstreaming

The ritual of making killing “humane” and “responsible”, ticking the boxes on this new use of technology in war, repression and death.

Touching on the “privatisation” that this technology pushes to shift traditional military command.

The acceptable rate of collateral damage: 15 to 1 in the case of the IDF Gaza conflict.

Introducing human “friction” into the process, the means to the end, is the question. Public confidence and trust is key to this shift, policy is in part about this process.

The establishment policy response to AI in war, this is already live, so these people are catching up. They are at the stage of “definition” in this academic flow.

The issue again is that none of this technology actually works. We wasted ten years on blockchain and cryptocurrency, which had little value and did a lot of harm; we are now going to spend ten years on #AI. Yes, this will affect society, but is there anything positive in this? Or is it another wasted ten years of #fashernista thinking, in this case death.


Bringing artificial intelligence (#AI) into warfare raises ethical, practical, and strategic considerations.

Technological Advancements and Warfare: The use of AI in war introduces new algorithms and technologies that potentially reshape military strategies and tactics. AI is used for tasks like autonomous targeting, decision-making, or logistics optimization.

Ethical Concerns: there are ethical dilemmas associated with AI-driven warfare. Making killing more “humane” and “responsible” through technological advancements can lead to a perception of sanitizing violence.

Privatization of Military Command: The shift towards AI in warfare leads to a privatization of military functions, as technology companies play a role in developing and implementing AI systems.

Collateral Damage and Public Perception: Collateral damage ratios like 15 to 1 raise questions about the acceptability of casualties in conflicts where AI is employed. Public confidence and trust in AI-driven warfare become critical issues.

Policy and Governance: Establishing policies and regulations around AI in warfare is crucial. Defining the roles of humans in decision-making processes involving AI and ensuring accountability for actions taken by autonomous systems.

Challenges and Risks: The effectiveness of AI technology in warfare draws parallels with previous tech trends like blockchain and cryptocurrency. There’s concern that investing heavily in AI for military purposes will yield little value while causing harm.

Broader Societal Impact: Using AI in warfare will have broader societal implications beyond the battlefield. It will influence public attitudes towards technology, privacy concerns, and the militarization of AI in civilian contexts.

Balance of Innovation and Responsibility: Whether the pursuit of AI in warfare represents progress or merely another trend driven by superficial, misguided #fashernista thinking, with potentially dire consequences.

In summary, the integration of AI into warfare demands attention to its ethical, legal, and societal implications. The goal should be to leverage technological advancements responsibly, ensuring that human values and principles guide the development and deployment of AI systems in any context.

#Oxford

Cambridge Analytica, 5 years on

I think we face the usual problem of working on and implementing policy for yesterday’s issues.

* We are coming out of ten years of Blockchain mess

* Now we are into the #AI mess; there is no intelligence in the current round, only artificial writing.

Let’s look at what actually matters

The original #openweb had, in this context, #opendata, which is the issue we are talking about.

We then had 20 years of the #dotcons with #closeddata, which you have talked about.

Coming out of this, we have an active #openweb reboot happening with federation and #opendata.

For example, #Mastodon, the #Fediverse, #bluesky, and #Nostr have grown from half a million to 10 to 15 million users over the last year, with #WordPress building #ActivityPub support for a quarter of the internet, and #Failbook‘s #threads.

We are seeing a different world, back to #opendata: if you run a Mastodon instance, you will have a large part of the content of the Fediverse sitting in your database in plain text…
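This openness is easy to see in the data itself: under the hood, a Fediverse post is a plain ActivityStreams JSON object that any receiving server can read and store. A minimal sketch in Python (the object below is a made-up illustrative example, not a real post):

```python
import json

# A minimal ActivityStreams "Note", the kind of object Mastodon and
# other Fediverse servers federate. This is a made-up example for
# illustration, not a real post.
note_json = """
{
  "@context": "https://www.w3.org/ns/activitystreams",
  "type": "Note",
  "attributedTo": "https://example.social/users/alice",
  "to": ["https://www.w3.org/ns/activitystreams#Public"],
  "content": "<p>Hello from the #openweb</p>"
}
"""

note = json.loads(note_json)

# The post body is plain, readable data: any instance that receives
# this object ends up holding it in its own database.
print(note["type"])     # prints "Note"
print(note["content"])
```

The point for policy is simply that this content is #opendata by design: federation works by copying these plain objects between servers, so regulation written for #closeddata silos does not map onto it.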

Take this into account with policy and regulation please

#Oxford