Content with tag: geekproblem

Alt-geek culture is broken - indymedia

An introduction to an "unspoken" problem. Everything is "pointless" until you do something "that is not". If we keep repeating the pointless stuff, where/when is the "that is not" going to happen?

 

An example of the geek problem can be found in the flowing and fading of radical alt/grassroots media at the peak of the #openweb.

 

The basis of any new media is the technology it is transmitted/mediated by. In the case of newspapers this is the printing press, and for radio and TV it is access to the transmission spectrum. The open internet changed this "traditional" media which was based on a world of (vertical) analogue scarcity. As the accessing technology improved, it created a radically (horizontal) digital media space.

 

This space was quickly filled with (naive in a good sense) alt-media such as the Indymedia project (IMC). In this post I am looking at how this was killed off by internal geek/process dogmatism at the same time as its space was colonised by new/mainstream media such as blogging and social media.

 

We are now coming full circle to where we started, with closed client/server, algorithm-determined, gatekeeper, for-profit networks dominating media production and consumption. The corporate, gatekeeping, venture-capital-driven (and invisibly ideological) algorithm is the new printing press/broadcast spectrum that we started the century with.

 

What part did radical geeks play in this?

 

Let's look at the successful global indymedia project, which was based on open publishing and open process through a centralised server network. Before this, the radical video project undercurrents, while not so open, was again based on a technical hub. They had the only free digital editing suite for the production of grassroots video, thus anyone wanting to produce radical content was funnelled through this grassroots gatekeeper. With IMC, it was publishing to their hosted servers.

 

The indymedia network was set up in the very avant-gardist open model that was to dominate the internet for a time. Like undercurrents, it succeeded because of its technical centralisation – the server was the ONLY place citizen-journalist content could be published without hard technical knowledge. This monopoly was later lost to the growth of individualistic blogging platforms and later corporate social media. But what I want to argue here is that it died before this due to internal (process) pressures.

 

Indymedia was set up on the open, open, open, open, pseudonymous model.

 

* Open source (free software)

 

* Open publishing (post-publishing moderation)

 

* Open licence content (non commercial re-use)

 

* Open process (everything was organised on public e-mail lists, open meetings)

 

* Pseudo-anonymous (you didn’t have to provide an e-mail address or a real name to publish)

 

Let's look at some of the pragmatism that allowed the project to take off:

 

* The project was initially pragmatic about open source, as it used the closed RealMedia (RM) video streaming codec and servers. But the core project was committed to the free software path where technically possible.

 

* Open publishing was the basis of the project: content could only be hidden (not removed), and only if it broke the broad public editorial guidelines. Even then it was moved to a background page, so it was still public. In this the publishing process was naïvely open.

 

* Open licence stayed with the project to the end.

 

* Open process was gradually abandoned: a clique formed, then fought and split. This was the main reason the project ossified and could not adapt to keep its relevance in the changing world of blogs and social media.

 

* (Pseudo) anonymity was part of the abandonment of open process and led down many of the technical dead ends that finally killed the relevance of the project to most users.

 

Let's look at this final point in more depth.

 

Firstly, it's important to realise that any attempt at anonymous publishing in a client/server relationship, even at its most restrictive and paranoid, would produce only pseudo-anonymity. I.e. you might be able to hide from your mates and your employer, but you cannot hide from the "powers that be" if they are interested in subverting your server and its internet connection.

 

The internet is inherently naïvely open; it's built that way, and this is why it works. The recent Edward Snowden leaks brought this to wider public view.

 

- the integrity of the ISP and hosting was always based on trusting a tiny anonymous minority of geeks

 

- the physical security of the server could never be guaranteed.

 

- as the project process closed, the identity of these core geeks became tenuous/invisible.

 

In activism, just as the man driving the white van repeatedly turned out to be the police/corporate spy, the invisible server admin is the obvious opening for the same role. I am not saying this is what existed, rather just trying to highlight how you cannot build a network based on the closed client/server infrastructure/culture that IMC became. Given the open nature of the internet, it became dangerous to push IMC as an anonymous project.

 

There were four fatal blocks:

 

- the repeated blocking, failure and delay of decentralising the servers to the regions.

 

- the blocks on aggregation, then the closed subculture aggregation that finally happened as a parallel project.

 

- the focus on encrypted web hosting and self-signed certificates, which put a block on new non-technical users that proved terminally off-putting.

 

- the failed "security theatre" of not logging IP addresses locally on the server, a limited security fig leaf: they could simply be logged at the ISP/open web instead.

 

These, together with a shrinking of the core group, led to the project becoming irrelevant in the face of the growth of more openly accessible blogging and then social media.

 

Let's get positive and suggest some ways the IMC project could have flourished and still be a dominant grassroots project:

 

* The base level of the project should have actively decentralised as the technology matured to make this feasible. Every town needed its own DIY-run server.

 

* Then regional aggregation using RSS (Really Simple Syndication) would have made this grassroots media presentable as outreach media.
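The regional-aggregation idea above can be sketched in a few lines. This is a toy illustration, not code from the IMC project: the feed contents are inline samples, and a real aggregator would fetch each town's feed URL with urllib on a schedule.

```python
# Toy sketch of regional RSS aggregation: merge items from several
# local site feeds into one stream, newest first.
import xml.etree.ElementTree as ET
from email.utils import parsedate_to_datetime

# Inline sample feeds; real ones would be fetched from each town's server.
FEEDS = {
    "townA": """<rss><channel>
        <item><title>Housing protest report</title>
              <pubDate>Mon, 02 Mar 2015 10:00:00 +0000</pubDate></item>
    </channel></rss>""",
    "townB": """<rss><channel>
        <item><title>Community radio launch</title>
              <pubDate>Tue, 03 Mar 2015 09:30:00 +0000</pubDate></item>
    </channel></rss>""",
}

def aggregate(feeds):
    """Merge items from many RSS feeds, sorted newest first."""
    items = []
    for source, xml_text in feeds.items():
        root = ET.fromstring(xml_text)
        for item in root.iter("item"):
            items.append({
                "source": source,
                "title": item.findtext("title"),
                "date": parsedate_to_datetime(item.findtext("pubDate")),
            })
    return sorted(items, key=lambda i: i["date"], reverse=True)

for entry in aggregate(FEEDS):
    print(entry["date"].date(), entry["source"], "-", entry["title"])
```

The point of the sketch is that the "regional" layer needs no central database, only each town publishing a standard feed that anyone can pull and re-present.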

 

* A national aggregation site could then have competed directly with the (then) declining traditional media outlets.

 

* Recognising that the IMC project was pseudo-anonymous at best, IMC could have built a parallel encrypted peer-to-peer gateway app/network feeding into this ongoing open media project, to provide true(ish) anonymity for publishers.

 

* The decentralisation would have been a force to keep the process open by feeding through new people/energy – this would have naturally balanced the activist clique forming/closing in the centre.

 

* As blogging became popular and matured these could have been “ethically” aggregated into the network to build a truly federated global open media network such as http://openworlds.info is working to be.

 

* Social networking could have been added as an organic part of this flourishing federated network.

 

If this had happened, it's not too much to say that the internet would have been a different place to where it is now. The IMC project highlights some of the failures of activist/geek culture. If we are to (re)build the open web we need to learn from this and move on.

 

(find photo of indymedia Sheffield masked up photo)

 

This is sadly not a metaphor for an open media project

 

It should be obvious to people now that even the most paranoid centralised closed internet is only pseudo-anonymous at best. We need to learn how to live with "open" to build the world we want to see. And our geeks fighting for closed are actually a problem for us, just as much as "them".




The problem with alt-media

Looking at the media (or lack of media) coming out of the media democracy festival kindled an old train of thought:

Most alt-geeks are trying to solve a pointless problem, "privacy online". Anything online is in a "photocopying system"; privacy is an illusion. You can get a shallow privacy by going encrypted P2P, but this relies on your device (an Android or Apple phone) being secure, and they aren't. To move on we have to move past this dominating geek viewpoint.

The alt-media producers are building 20th-century silos; this is such a failed strategy that it doesn't even need to be talked about anymore.

The solutions are KISS and not complex. Where are the geek affinity groups to make these happen?

One example: OMN.

 




Real Media gathering, how not to re-boot grassroots media

Draft

Firstly, I don’t have any ill will towards the people I know organising this event, and would love it to succeed in being part of the kindling to (re)light the fire of alt-media.

But we have 3 main problem groupings/failures blocking grassroots media (culture) from re-booting.

Let's look at how they manifest as negatives (I can do a positive post on this subject, just ask or look back on my blog):

NGO “culture”

Geek “culture”

Activist “culture”

They all manifest in the upcoming Real Media Gathering; let's use this as an example and look at each in turn.

NGO thinking is a malaise that is filling the vacuum left by the catastrophic failings of alt-geeks and the activist spiky/fluffy debate separation. What is NGO thinking? Well, in short, it's the way you HAVE to think to have a continuing paid career in an NGO. It embodies bureaucracy, (respect for) hierarchy, endemic narrow liberal thinking or, at its most radical, rigid utopian process – leading to deadening bureaucracy.

Geek culture is dealt with here

Activist culture at its worst is bound by lifestyle, the things you do to be an activist: looking/sounding/acting like the change becomes more important than being the change. Some of the people involved understand this; they are just too lost to take a way out of this malaise. This can manifest as a division between spiky/fluffy and a ritualistic on/off spitting contest between these two mindsets. Change is often lost in this.

How do these manifest in the upcoming Real Media gathering/movement?

The headline main day of the event is made up of top-down speakers repeating all the things people already know. For a grassroots gathering this is clearly problematic – think about it for a minute ;) This is how an NGO would organise a “grassroots” event.

Geek culture: there is a tech project going on in parallel with the gathering – it's happening in darkness, with no outside knowledge, input or interest. The outcome is likely a black box designed by geeks, and we know for a fact that this NEVER ends well. This is how geeks like to work: just trust us.

Activist grassroots culture is high on the banner header image but has no presence in the headline speakers and only a shadow in the workshops. They do have a little-documented day after for this. No sparks and no rocking the boat; this comes full circle to the NGO thinking where we started.

There have been a series of these “NGO” re-booting-activism conferences and gatherings over the last few years. I helped to organise some of them, for my sins.

Much of the content of the event is fine, the workshops have content, what it lacks is any spark to light the needed media fire. Rubbing the damp sticks of NGO together isn’t going to do it, we need to break out of this malaise, and it's easy to do.




Dangerous thoughts - anonymity on the internet

For the last 10 years, activist technology and its supporting NGOs have been pushing the encrypted web as a secure form of communication: from the Indymedia network "not logging IPs", to Wikileaks' "secure whistleblowing", to numerous encrypted chat and social networks. Not to mention all the corporate dotcom "solutions" jumbling up the space.

This naive work has driven alt-tech into oblivion through complexity and obfuscation. Has it in any way been worthwhile? I would have liked to write this up properly, but you will have to make do with the notes. This is a good summing-up of the issue (from SN-493-Notes.pdf):

TOR: Not so Anonymous after all

Our previous coverage:
● SN#70 (Internet Anonymity) - seven years ago, March 28th, 2008
● SN#394 (TOR Hidden Services) - nearly two years ago, March 8th, 2013
● In our earlier "what is TOR" coverage, we primarily focused upon the cleverness of
TOR's ONION layering cryptography.
http://thestack.com/chakravarty-tor-traffic-analysis-141114
● "81% of Tor users can be de-anonymised by analysing router information, research
indicates."
● Using weak but pervasive built-in Cisco "NetFlow" tech and deliberate traffic perturbation.
● Perturb the traffic from the server a user is connecting to, and watch the exit nodes' traffic.
● The point was that even very weak "NetFlow" aggregation was enough. More expensive
"per packet" monitoring and analysis was not needed.
Did feds mount a sustained attack on Tor to decloak crime suspects?
● http://arstechnica.com/tech-policy/2015/01/did-feds-mount-a-sustained-attack-on-tor-to-decloak-crime-suspects/
● <quote> Despite the use of Tor, FBI investigators were able to identify IP addresses
that allegedly hosted and accessed the servers, including the Comcast-provided IP
address of one Brian Farrell, who prosecutors said helped manage SilkRoad2. In the
affidavit, DHS special agent Michael Larson wrote:
○ From January 2014 to July 2014, a FBI NY Source of Information (SOI) provided
reliable IP addresses for TOR and hidden services such as SilkRoad2, which
included its main marketplace URL, its vendor URL, its forum URL, and its support
interface (uz434sei7arqunp6.onion). The SOI's information ultimately led to the
identification of SilkRoad2 servers, which led to the identification of at least
another seventeen black markets on TOR.
○ The SOI also identified approximately 78 IP addresses that accessed a vendor
.onion address. A user cannot accidentally end up on the vendor site. The site is
for vendors only, and access is only given to the site by the SilkRoad2
administrators/moderators after confirmation of a significant number of successful
transactions. If a user visits the vendor URL, he or she is asked for a user name
and password. Without a user name and password, the vendor website cannot be
viewed.
The Internet was never designed to provide anonymity... and it doesn't.
● True anonymity is extremely difficult to achieve.
● In a high-latency store & forward system it's somewhat feasible...
● But in any low-latency near real time network, it's arguably impossible.
Review... What is TOR?
● TOR is a LOW LATENCY anonymity-enhancing network service.
● The original designers of TOR made some assumptions and compromises that are
coming back to haunt us now...
● One academic paper put it this way: "Tor aims to protect against a peculiar threat
model, that is unusual within the anonymous communications community. It is
conventional to attempt to guarantee the anonymity of users against a global passive
adversary, who has the ability to observe all network links. It is also customary to
assume that transiting network messages can be injected, deleted or modified and that
the attacker controls a subset of the network nodes. This models a very powerful
adversary, and systems that protect against it can be assumed to be secure in a very
wide range of real world conditions.
Tor, on the other hand, assumes a much weaker threat model. It protects against a
(weaker) non-global adversary, who can only observe a fraction of the network, modify
the traffic only on this fraction, and control a fraction of the Tor nodes.
Furthermore, Tor does not attempt to protect against traffic confirmation attacks, where an adversary observes two parties that he suspects to be communicating with each
other, to either confirm or reject this suspicion. Instead, Tor aims to make it difficult for
an adversary with a very poor a priori suspicion of who is communicating with whom, to
gain more information."
The Crypto Model:
● Choose a "circuit", default is three nodes.
● Negotiate keys with the 1st node.
● Using the first node, get keys for a randomly chosen second node.
● Using the first and second nodes, get keys for the randomly chosen third node.
● Wrap outgoing traffic in an onion from node 3 to node 2 to node 1.
● The onion model nailed it. No one is attacking that. But...
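The layering steps above can be illustrated with a toy sketch. This is not Tor's real cryptography (Tor uses AES in counter mode with keys negotiated per hop via a handshake); here each "layer" is just a SHA-256-derived XOR keystream, enough to show how the client wraps the message once per node and each relay can peel exactly one layer.

```python
# Toy onion layering: wrap a message in one XOR "layer" per hop,
# then peel the layers off in path order. Illustration only --
# real onion routing uses proper authenticated ciphers.
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream from a key (SHA-256 in counter mode)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def xor_layer(data: bytes, key: bytes) -> bytes:
    """Apply (or remove) one encryption layer; XOR is its own inverse."""
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

# Hypothetical per-hop keys the client negotiated with nodes 1..3.
hop_keys = [b"key-node1", b"key-node2", b"key-node3"]
message = b"hello open web"

# Client wraps for node 3 first (innermost), node 1 last (outermost).
onion = message
for key in reversed(hop_keys):
    onion = xor_layer(onion, key)

# Each relay in path order peels only its own layer.
for key in hop_keys:
    onion = xor_layer(onion, key)

assert onion == message
```

As the notes say, this layering part of the design holds up; the attacks described next target traffic patterns around the circuit, not the onion itself.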
The Traffic Flow Model: (and the Achilles' heel)
● Deliberate obfuscation of individual packets with random length padding.
● TCP flows are divided into 512 byte cells... And are sent round robin out of the node.
● The power of the global observer
● Much like metadata... traffic pattern analysis is a POWERFUL tool.
● The power of active vs passive attacks
● Being able to "perturb" the flow makes attacks far more powerful.
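The fixed-cell framing mentioned above is simple to sketch. This is a minimal illustration of the framing idea only; real Tor cells also carry circuit IDs and command headers inside the 512 bytes.

```python
# Toy sketch of the cell model: chop a flow into fixed 512-byte
# cells, padding the last one, so every unit on the wire has the
# same length regardless of the payload.
CELL_SIZE = 512

def to_cells(payload: bytes, cell_size: int = CELL_SIZE) -> list:
    """Split a byte stream into equal-length cells, zero-padding the tail."""
    cells = []
    for offset in range(0, len(payload), cell_size):
        chunk = payload[offset:offset + cell_size]
        cells.append(chunk.ljust(cell_size, b"\x00"))  # pad to full size
    return cells

cells = to_cells(b"x" * 1300)
assert all(len(c) == CELL_SIZE for c in cells)
```

Uniform cell sizes hide message lengths from a casual observer, but, as the notes argue, they do not hide the *timing and volume* of cells, which is exactly what flow-level attacks like NetFlow analysis exploit.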
The extreme power of active assumption confirmation attacks.
● One academic paper: <quote> "Tor does not attempt to protect against traffic
confirmation attacks, where an adversary observes two parties that he suspects to be
communicating with each other, to either confirm or reject this suspicion."
● IOW -- In any near real time network, traffic confirmation is a killer.
Bottom line... *I* would never rely upon TOR alone.
● Consider it, itself, another layer of a more full "Defense in Depth."
● The dream is that someone can sit at home and be fully anonymous. But that's not the
reality.
Defense in depth:
● First of all... DO NOT do anything illegal. Do not do anything that you wouldn't want the
Federal Government to know about.
● Traditional old school & new school.
● Go somewhere as far away as convenient.
● Be anonymous there... Pay with cash.
● Don't go anywhere familiar, don't stay long, don't know anyone, don't talk to anyone.
● Plan ahead to get in and out. Rehearse for speed. Get it done and leave.
● Don't do ANYTHING having to do with your own identity.
● Perhaps purchase a cheap laptop just for this. Pay with cash.
● Override your laptop's default MAC address.
● Use TOR and sacrifice real time performance
● Use widely dispersed global nodes.
● Use many nodes.
● In other words... Tor IS useful, but it's not perfect. So always act as though it's not.



What is needed for the next 5 years to build an open media web

* We have to discredit the domination of corporate social networking such as FB and twitter as solutions. (this should be easy – but needs to be put in a centralised place that is easy to send people to). WHY

* Deal with the “geek question”: how to get user-friendly, user-relevant free tools (open source) as a focus of geek development. Currently 99% of geek development time is wasted here; this is a HUGE untapped opportunity for open tools. (This is a hard one, as the problem is invisible/irrelevant to most people.)

* The need for open industrial standards rather than fashionable standards. This is a “geek chattering class” issue and is simply solved by a critical mass of chattering about open industrial standards.

* Co-operation is an issue for left/progressive contemporary media projects; it's a steep climb to get even simple linking between sites to happen. We are working on a number of projects to address this.

* Traditional journalism is an obstacle to the building of open media, as it still has a gatekeeping role on what is seen as news, and many contemporary media people are sucked into the traditional media world as a career option due to the continuing failure of contemporary media business models. We currently don’t have a good solution to this problem.

If we have a strategy for dealing with each of these issues, then we can realistically move on to building real infrastructure. http://visionon.tv

UPDATE

It's interesting that people say to me that these things aren't needed or are too vague. My answer is simple and straightforward: if you don't deal with them then your project WILL LIKELY fail, so you should at minimum have them as a highlight in your project description, and at best have continuing documented experiments on overcoming them, for your project to succeed.


